Time  Nick        Message
00:43 Genji       hiya.. whats the best way to log in a user so their session gets created, and then force them to make a selection, before they reach the search page?
00:45 Genji       and that selection gets saved into their session?
01:05 chris       no idea Genji
01:05 chris       nengard++
01:06 Genji       my idea of doing it inside opac-main.pl .. doesn't seem to have worked.
01:16 bob         hi, is there a fix/patch for the broken 'new' patron button in koha 3.2?
01:18 bob         or is it not working for me due to certain needed admin settings?
01:19 brendan     have you created patron categories ?
01:20 bob         yep
01:20 bob         there are 3 categories
01:21 brendan     hmm...  what browser are you using?
01:21 bob         ff 3.6.7
01:22 bob         there are existing patrons set up as this worked fine when the koha was 3.0
01:22 bob         but has stopped working on upgrade to 3.2
01:23 * bob       tries with a diff browser
01:23 chris       yeah that doesnt happen normally
01:23 chris       what user are you logged in as?
01:24 bob         the main one - same login as db
01:24 bob         hmm, doesn't seem to work for me in chrome either
01:25 chris       yeah you should never do that (except to make a real user and make that a superlibrarian) to start
01:25 chris       but i dont think that is causing your problem
01:25 bob         i noted in the koha list someone else had the same issue
01:26 bob         and owen mentioned a patch - but the link was to anther issue
01:26 chris       im pretty sure jo would be ringing me constantly if they couldnt add borrowers :)
01:26 bob         indeed :)
01:27 bob         so it is odd - cause the koha i upgraded was a clean 3.0 (no hacks)
01:27 bob         to a 3.2
01:27 bob         so it has me stumped
01:28 bob         is the 'new' button driven by javascript?
01:29 chris       it should be a dropdown
01:29 chris       when you click on it
01:29 bob         ah nothing drops down
01:29 chris       you should get a list of patron categories
01:29 chris       you might want to just check them
01:29 chris       /cgi-bin/koha/admin/categorie.pl
01:30 bob         i have 3
01:30 bob         and they look ok
01:30 chris       click on one of them
01:30 chris       what category type is it?
01:31 bob         one is adult
01:31 bob         one is child and 3rd is staff
01:31 chris       right, try just saving it
01:31 chris       for kicks
01:31 chris       see if that makes any difference
01:31 bob         nope
01:34 brendan     the new button IIRC uses the YUI package - is that right chris?
01:35 ray         good day!
01:35 brendan     bob what do you have set for your system preference -->  yuipath
01:36 ray         is there a possiblity to add an image link inside the link section of the opac? how?
01:36 bob         brendan - where will i find that?
01:36 brendan     look in your global system preferences and you can search those too - search for yui
01:37 bob         ta
01:37 bob         says 'included with koha'
01:40 brendan     sorry out of thoughts
01:45 bob         chatting to chris in another window - he reckons my setup is choking when returning the list of patron categories for the 'new' button for some reason
01:48 chris       yeah
01:48 chris       ray: and have it show as an image?
01:48 chris       like in the 856u field of the MARC?
02:15 * jcamins_a looks at the scrollback and gives up
02:17 ray         good day!
02:17 jcamins_a   Good evening.
02:18 thd         chris: I had asked the wrong question of hdl earlier
02:18 ray         is koha already capable of sending circulation notices upon installation?
02:19 thd         chris: Had hdl described the conditions under which an ordinary search produces no result set from Zebra?
02:19 ray         or do I need to configure it first to send notices?
02:20 jcamins     ray: I don't really know anything about this, but I do know you have to have an smtp server set up.
02:21 jcamins     Something like Postfix.
02:21 chris       ray: yeah and some cronjobs
02:21 chris       thd: not that im aware of
02:21 ray         oops... a bit tricky
02:22 thd         chris: there is a general failure case of no result set when using ICU, but I think it is rare or we would all have quit Zebra long ago
02:23 richard     thd: anthony mao might have some info on that. i saw his instructions for setting up koha for chinese = nozebra plus a change to the search script
02:23 ray         could you give sites which i can use as references?
02:24 thd         richard: he was avoiding Zebra for Chinese?
02:24 richard     yep
02:24 chris       ray: you know how to set up a mailserver under linux?
02:24 richard     thd: i think i've got a url to his set up in an email somewhere
02:24 richard     i'll see if i can hunt it up
02:25 ray         i don't know.. can i use apt-get in command line? or synaptic?
02:25 jcamins     ray: https://help.ubuntu.com/community/Postfix
02:25 jcamins     It doesn't matter if you're on Debian or Ubuntu. The instructions will help. :)
02:26 ray         ok
02:26 thd         richard: I have had problems searching some Japanese Z39.50 servers using Yaz but I presumed a problem at the server and not with Yaz as other Japanese Z39.50 servers searched using Yaz had no problem.
02:26 chris       and then in misc/cronjobs/
02:26 chris       there is a crontab.example file
02:26 chris       (in koha)
02:27 chris       the SEND EMAILS line
02:27 ray         ok thank you very much!
02:27 chris       and the environment variables... plus the rebuild_zebra one are probably the ones you care about the most
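
A minimal sketch of the kind of crontab entries chris is describing; the paths, schedule, and flags below are assumptions for illustration, not the contents of Koha's misc/cronjobs/crontab.example:

    # assumed install locations -- point these at the real Koha checkout
    KOHA_CONF=/etc/koha/koha-conf.xml
    PERL5LIB=/usr/share/koha/lib

    # SEND EMAILS: flush the queued circulation notices once an hour
    0 * * * *     /usr/share/koha/bin/cronjobs/process_message_queue.pl

    # keep the Zebra index in sync with new and changed records
    */10 * * * *  /usr/share/koha/bin/migration_tools/rebuild_zebra.pl -b -a -z
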
02:28 richard     thd: here's his instructions for installing koha for chinese - http://koha.wikispaces.com/koha-301-git
02:28 richard     i think changes to Search.pm as well
02:29 richard     http://koha.wikispaces.com/file/view/Search.pm
02:32 jcamins     Well that's absurd.
02:32 * jcamins   just finished catching up on Koha e-mail
02:32 chris       which bit?
02:33 chris       it was a bit of an absurd day
02:33 thd         richard: I think the core problem which Anthony Mao had been addressing is the form lack of any Unicode support in Zebra.  Anthony Mao could not use Zebra originally and may never have tried properly later.
02:33 jcamins     The bit by our favorite, and highly active...
02:33 jcamins     wait...
02:33 jcamins     our favorite...
02:33 thd         s/form/former/
02:34 chris       ahh yeah, that was absurd, ... both of the mails
02:34 richard     thd: that could well be right.
02:35 thd         richard: I had contacted him about the need to add Unicode support to Zebra when fredericd had raised it as an issue.
02:36 thd         richard: Anthony Mao had moved on without Zebra and was less interested than he might have been otherwise in a Unicode fix for Zebra.
02:38 richard     right. i guess if you have got a system that works for you, it takes the pressure/need off for changing it.
02:38 chris       chris_n++
02:38 thd         The bugs in the ICU implementation may be in Yaz, which would be much worse than merely a problem in Zebra.
02:39 thd         We need Yaz as a Z39.50/SRU client because we have no other such client with Perl bindings.
02:40 chris_n     [off]ARGHHH! # for the second time today
02:41 * chris_n   will now go and bite his tongue
02:53 chris       heh
02:54 chris       thd: i dont think they are
02:54 chris       but thats just conjecture
02:54 chris       a LOT of libraries, including library of congress use yaz
02:55 chris       (yaz-proxy sits in front of their voyager to make the z3950 actually work)
02:55 thd         chris: yes of course, and I have tested Yaz extensively for character set support, or so I thought
02:56 thd         chris: I merely saw something about the ICU code being in Yaz
02:56 thd         Maybe Yaz never uses the ICU code on its own.
02:56 chris       i still feel like most of our problems with zebra
02:56 chris       are actually problems with C4/Search
02:57 thd         That could be.
02:57 thd         It would be very expensive to hire Index Data to fix a problem which was actually in Koha and not in Index Data code.
02:57 chris       heya college boy
02:57 chris       :)
02:57 pianohack   Hey chris!
02:58 chris       indeed
02:58 sekjal      evening, pianohack!
02:58 chris       thd: and C4/Search is broken in numerous other ways, id still love to see that rewritten for 3.4
02:58 pianohack   Tomorrow's lecture is on not getting "structure" and "version" confused
02:59 chris       ohh good timing :)
02:59 * chris_n   thought gmcharlt had taken on the C4::Search rewrite task :)
02:59 thd         pianohack: where have you matriculated?
02:59 pianohack   thd: School of Mines here in colorado. Fun school, though it keeps me pretty busy :)
03:00 pianohack   Lots of other geeky people, so much more fun than a normal college
03:00 chris       chris_n: he volunteered, he may have stopped with the crazy pills and reconsidered though :)
03:00 chris       apparently ppl pour salt on desks there a lot
03:00 pianohack   chris: right, I was going to say, maybe he decided to join sisyphus at a different rock
03:00 pianohack   Hahahahaha
03:00 pianohack   Yes
03:00 pianohack   One girl did the salt, the other did the... scenery
03:01 chris       hehe
03:02 brendan     heya pianohack
03:02 thd         pianohack: KohaCon 2006 was in École Nationale Supérieure des Mines de Paris
03:02 pianohack   Hey brendan
03:02 brendan     woohoo - I got it right this time too
03:02 pianohack   thd: Oh, wow
03:02 chris       yeah i gots pictures to proove it
03:02 pianohack   brendan: You can't get it quite right, because of the IRC name length gestapo, but that's close enough ;)
03:02 chris       -o
03:03 brendan     ok pianohacker (I liked that one too)
03:03 pianohack   brendan: How's it going?
03:03 brendan     going great
03:03 brendan     not sure if you know or knew we've got a baby girl on the way in december
03:03 thd         pianohack: ENSMP was one of the earliest customers of paul_p
03:04 pianohack   Our school of mines isn't quite so lucky, stuck on exlibris
03:04 chris       http://opensource.califa.org/node/266
03:04 chris       brendan: you might want to comment
03:04 pianohack   Academic library, though, they tend to be a bit conservative
03:04 chris       So 3.03.00.02  == 3.3.0.2 == Koha 3.2 (with some additional patches that ByWater has uploaded).
03:05 chris       isnt quite right :)
03:05 chris       that would be 3.3.0
03:05 brendan     doesn't seem to be - will have to go and read it
03:05 thd         ENSMP has a head librarian with very broad views
03:05 chris       yeah at the mo its a confusinating post :)
03:05 sekjal      dang it
03:06 chris       i think she means 3.02.00.02
03:06 chris       master is running 3.03.00.02
03:06 chris       which is 3.3 .. i may even do a 3.3.x release
03:07 chris       probably will at 5 months
03:07 chris       3.3.99.xxx
03:07 sekjal      I was the one who explained versioning to her
03:07 sekjal      she was curious about what version the ByWater demo is on
03:07 chris       leading up to 3.4 (instead of alphas and betas)
03:07 sekjal      we've got it on HEAD currently
03:07 chris       yeah
03:07 sekjal      so 3.03.00.02
03:07 chris       so 3.3
03:07 brendan     heh - yeah it's 3.3
03:08 chris       i mean its currently pretty darn close to 3.2
03:08 chris       but it does have a couple of new features already
03:08 sekjal      we have NOT loaded any additional patches
03:08 sekjal      it's just pure Koha
03:08 brendan     heh - Linux debian.bywater-gallagher.net 2.6.24-24-xen
03:08 chris       *sigh*
03:08 chris       its even more confusinating then
03:08 brendan     I guess a long time ago I screwed with the hostnames
03:09 chris       id ask her to take it down i think
03:09 brendan     gallagher.net (wonder if that goes anywhere)
03:09 chris       or fix it ;)
03:09 sekjal      I'll message her
03:09 chris       http://gallagher.net/
03:09 chris       hehe
03:10 chris_n     g'night
03:10 brendan     I feel that my father should buy such a domain
03:10 brendan     night chris_n
03:10 chris       night chris_n
03:11 chris       its wizzyrea_ !
03:11 wizzyrea_   hi :)
03:11 brendan     heya wizzyrea_
03:11 wizzyrea_   how is everybody
03:12 jcamins     Hello wizzyrea_.
03:12 richard     hi wizzyrea
03:12 jcamins     Why did I say "assiduously"?
03:12 wizzyrea_   "follow the directions assiduously and you should have few issues"
03:13 chris       heh
03:13 robin       well, it's correct use of the word
03:13 jcamins     Ah.
03:13 jcamins     Right.
03:13 jcamins     I knew I had used the word recently, I just couldn't remember when.
03:14 wizzyrea_   I just appreciated its usage
03:14 wizzyrea_   less lol and more yay
03:14 jcamins     Your remark sounded complimentary, I just had no idea when I'd said it.
03:15 wizzyrea_   < english major :P
03:15 * jcamins   too
03:18 chris       well although the day started retarded
03:18 chris       i think that the work on commenting on rfcs,  and the proposed rules redeemed it somewhat
03:19 wizzyrea_   woooot!
03:19 chris       im certainly less depressed now than i was during the meeting
03:19 jcamins     chris: you should use my strategy for depressing meetings... get lost about ten minutes in, and just give up on the scrollback.
03:19 chris       and because i have an instinctual urge never to be too serious
03:20 chris       http://wiki.koha-community.org/wiki/Talk:Add_support_for_NORMARC
03:20 thd         chris: There are too many other things about which to be depressed :)
03:21 wizzyrea_   lol
03:21 jcamins     chris: my cat says if you want to make the world a happier place, you could send him a Kiwi sheep to play with. He's sure it would be a great toy. :)
03:21 chris       heh
03:22 chris       i suspect that soon people will be saying, why did we want chris c to comment, hes a dick
03:22 chris       http://wiki.koha-community.org/wiki/Talk:EAN_reading_RFC
03:22 wizzyrea_   lol
03:22 wizzyrea_   ugh what is wrong with stupid yahoo's openid's lately
03:23 wizzyrea_   half the time it doesn't work
03:23 * wizzyrea_ stomps and pouts
03:23 chris       oh yeah, i noticed that
03:23 chris       we should make a koha-community openid server
03:23 wizzyrea_   ...best idea ever
03:23 ray         where can i adjust the rental charge in koha?
03:23 * jcamins   has an even better idea!
03:24 wizzyrea_   follow the recipe?
03:24 chris       ray: on the itemtypes page
03:24 jcamins     Let's find out who has the time to set up the openid server, and make them fix bugs in Koha. :D
03:24 chris       in administration
03:24 chris       or, it might be the circulation rules
03:24 chris       its on the admin page anyway :)
03:24 ray         ok tnx!
03:24 jcamins     wizzyrea_: follow the recipe? What fun is that?
03:25 wizzyrea_   have you seen ratatouille?
03:25 jcamins     wizzyrea_: nope.
03:25 chris       http://ownopenidserver.com/
03:25 wizzyrea_   itypes
03:25 wizzyrea_   http://koha-community.org/documentation/3-2-manual/?ch=x2959#AEN3047
03:25 chris       easy peasy
03:25 jcamins     I was told I'd enjoy it, though.
03:25 wizzyrea_   you would it's cute
03:26 wizzyrea_   not just a movie for kids, though kids seem to love it
03:26 wizzyrea_   anyway
03:29 wizzyrea_   ph! <big hug> missed you at kohacon
03:48 pianohack   Hallo wizzyrea
03:49 jcamins     Hey, there's a new version of umlaut out!
03:49 jcamins     Shiny!
04:01 Oak         \o
04:03 wizzyrea_   hi oak
04:03 Oak         hiya wizzyrea_
04:05 sekjal      goodnight, #koha
04:18 jcamins     Good night, #koha
04:24 chris       Its the twilight area, europe still asleep, north america about to sleep
04:27 wizzyrea_   oooeeeoooo
04:28 chris       Hehe
04:28 chris       India should be on soon
04:29 chris       Amit kmkale indradg etc
04:58 brendan     ah time to engage vpn catch you all in a little bit
05:01 brendan     blocked - so I'm still here ;)
05:36 indradg     good morning #kohs
05:36 indradg     oops
05:36 indradg     #koha
05:37 * indradg   can't believe he actually fell asleep when gmcharlt was about to move to agenda item #4 :(
05:39 chris       hehe
05:39 chris       you can read the logs
05:39 chris       what was #4?
05:39 indradg     chris, TC
05:40 * indradg   reading logs
05:50 chris       ah yeah, you didnt miss much :)
06:07 cait        hi #koha
06:07 chris       hiya cait
06:09 indradg     hi cait
06:11 cait        hi indradg
06:14 brendan     heya cait and indradg
06:14 cait        hi brendan
06:14 indradg     hi brendan
06:14 cait        isn't this a strange time for you?
06:15 brendan     not really
06:15 brendan     it's 10:30 ;)
06:15 cait        oh
06:21 cait        7:30 am here
06:21 cait        hm 7:37
06:22 brendan     heh
06:22 chris       hmm speaking of strange
06:22 brendan     yeah I'm in a little bit late tonight
06:22 chris       i dont really understand this
06:22 brendan     I should be here for a long time more
06:22 chris       http://wiki.koha-community.org/w/index.php?title=Talk:KohaCon2011_Proposals
06:23 cait        aq
06:23 cait        aw
06:24 cait        going without my koha friends for another year?
06:24 chris       yeah
06:24 chris       i think if you cant make it, it shouldnt be a reason not to have one
06:25 brendan     agreed or maybe others who didn't make will be able to make that one
06:25 chris       exactly
06:25 cait        I think the costs will vary each year, nz flight costs were probably the most expensive from europe this time, but england would be much cheaper for us
06:25 chris       yeah, usually we lose
06:25 chris       lost to france, lost to usa
06:25 cait        you said you will not organise another kohacon!
06:25 chris       its cheaper for pretty much everyone else
06:25 chris       yeah i dont mind travelling
06:26 cait        perhaps australia 2012? :)
06:26 chris       :)
06:26 chris       i was hoping india 2012
06:26 cait        indian food... ok, I will work on that
06:26 cait        ;)
06:27 chris       :)
06:28 brendan     yeah I love india
06:46 indradg     yeah india wont be a bad place for 2012
06:47 indradg     around late nov / early dec when its cooler
06:47 indradg     cait, brendan chris ^^^
06:47 cait        cooler sounds good :)
06:48 indradg     @wunder new delhi
06:48 munin       indradg: The current temperature in New Delhi, India is 26.0°C (12:00 PM IST on November 11, 2010). Conditions: Smoke. Humidity: 42%. Dew Point: 12.0°C. Pressure: 29.95 in 1014 hPa (Falling).
06:48 indradg     @wunder bangalore
06:48 munin       indradg: The current temperature in Bangalore, India is 26.0°C (11:30 AM IST on November 11, 2010). Conditions: Partly Cloudy. Humidity: 62%. Dew Point: 20.0°C.
06:48 brendan     @wunder 93117
06:48 munin       brendan: The current temperature in Northwest Goleta, Goleta, California is 14.3°C (11:01 PM PST on November 10, 2010). Conditions: Clear. Humidity: 47%. Dew Point: 3.0°C. Pressure: 30.00 in 1015.8 hPa (Steady).  Wind Advisory in effect until 3 am PST Thursday...
06:48 indradg     @wunder kolkata
06:48 munin       indradg: The current temperature in Kolkata, India is 31.0°C (12:20 PM IST on November 11, 2010). Conditions: Haze. Humidity: 46%. Dew Point: 18.0°C. Pressure: 29.83 in 1010 hPa (Falling).
06:48 indradg     @wunder chennai
06:48 munin       indradg: The current temperature in Chennai, India is 29.0°C (11:40 AM IST on November 11, 2010). Conditions: Mostly Cloudy. Humidity: 79%. Dew Point: 25.0°C. Pressure: 29.92 in 1013 hPa (Steady).
06:49 indradg     @wunder hyderabad
06:49 munin       indradg: Error: No such location could be found.
06:49 indradg     how nice :P
06:57 fredericd   If I add a regression test on switch use, in which test file should I add it?
07:01 chris       fredericd: you might want to look at this
07:01 chris       http://git.catalyst.net.nz/gw?p=koha.git;a=shortlog;h=refs/heads/git-hooks
07:02 chris       the top 2 commits
07:03 chris       thats what im running, the git-hook will run all tests that start with 00
07:04 fredericd   So a file named 00deprecated.t would do it?
07:04 chris       yeah that would be great
07:05 chris       i should submit my 00-load.t
07:05 chris       it works like this
07:07 fredericd   yes, it would help to already have this file... thanks
07:10 pastebot    "chris" at 203.97.214.51 pasted "rorohiko:[git/test-]:~/git/koh" (3 lines) at http://paste.koha-community.org/91
07:12 pastebot    "chris" at 203.97.214.51 pasted "# Failed test 'use C4::Account" (6 lines) at http://paste.koha-community.org/92
07:13 chris       so basic syntax errors get caught :) left over conflict markers etc :)
07:17 chris       patch sent
07:17 chris       if you wanted to signoff on it, that would be great :)
07:18 chris       i think with your patch + colins patch, we have killed all the switch statements, so a test to stop them coming back would be great
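
A rough sketch of the sort of 00-level regression test being discussed, which would fail the build if the deprecated Switch module creeps back in; the file name, the directories scanned, and the grep-style check are assumptions for illustration, not fredericd's actual patch:

    #!/usr/bin/perl
    # t/00deprecated.t - fail if any script or module still uses Switch
    # (assumed to be run from the root of a Koha checkout)
    use strict;
    use warnings;
    use File::Find;
    use Test::More tests => 1;

    my @offenders;
    find(
        sub {
            return unless /\.p[lm]$/;
            open my $fh, '<', $_ or return;
            while ( my $line = <$fh> ) {
                if ( $line =~ /^\s*use\s+Switch\b/ ) {
                    push @offenders, $File::Find::name;
                    last;
                }
            }
            close $fh;
        },
        'C4', 'misc'
    );

    ok( !@offenders, 'no code uses the deprecated Switch module' )
        or diag( join "\n", @offenders );
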
07:22 mihafan     hllo
07:22 mihafan     hello
07:25 Elwell      what versions of Perl does koha tolerate?
07:26 Elwell      cos I guess changing switch => given / when is probably a bit new
07:27 chris       yeah given/when is too new
07:27 chris       3.2 was 5.8
07:28 chris       the places we were using switch were trivially rewritten is  if elsif anyway
07:28 chris       is=as
07:32 chris       personally i think we should be using quantum superpositions :)
07:34 chris       http://search.cpan.org/~dconway/Quantum-Superpositions-1.03/lib/Quantum/Superpositions.pm
07:34 chris       i love damian conway :)
07:38 Elwell      heh
08:12 chris       fredericd: thanks for the sign off and the new patch :)
08:16 munin       New commit(s) kohagit: Test modules compile <http://git.koha-community.org/gitweb/?p=koha.git;a=commitdiff;h=7ecf2c2dc42d443d819ed04e3a11065f2802d51a>
08:18 chris       hi magnus
08:18 magnus      kia ora chris & #koha
08:19 chris       wanna learn a new greeting?
08:19 chris       when its morning for you, you can say
08:19 chris       ata marie
08:20 chris       which is 'peaceful morning'
08:21 magnus      cool, i'll try and remember that! ;-)
08:21 chris       po marie = good night (or peaceful night literal translation)
08:21 chris       so now you know the words for morning and night too :)
08:22 magnus      thanks
08:22 magnus      and kia ora is all around the clock?
08:22 chris       yup
08:23 magnus      col
08:23 chris       ora = life
08:23 magnus      cool, even
08:23 chris       kia = have/ to have
08:23 chris       so you are saying 'have life'
08:24 chris       you can use it for hello, or thank you
08:25 magnus      ah, yas i was wondering if it was just hello
08:25 magnus      s/yas/yes/
08:26 munin       New commit(s) kohagit: Merge remote branch 'kc/new/bug_5105' into kcmaster <http://git.koha-community.org/gitweb/?p=koha.git;a=commitdiff;h=235cf872a5c40a0baae554bdb28ec98c4520c528> / Bug 5105 Regression test for switch statement <http://git.koha-community.org/gitweb/?p=koha.git;a=commitdiff;h=5ebb5d4f4cd7709751e5f201fada5351d0e9c29e>
08:26 ivanc       hi #koha
08:26 chris       hi ivanc
08:26 ivanc       hi chris
08:30 hudsonbot   Starting build 139 for job Koha_Master (previous build: SUCCESS)
08:33 magnus      hi ivanc
08:49 ivanc       hi magnus
08:50 hudsonbot   Project Koha_Master build #139: UNSTABLE in 20 min: http://bugs.koha-community.org:8080/job/Koha_Master/139/
08:50 hudsonbot   * Chris Cormack: Test modules compile
08:50 hudsonbot   * Frédéric Demians: Bug 5105 Regression test for switch statement
08:50 munin       04Bug http://bugs.koha-community.org/bugzilla3/show_bug.cgi?id=5105 enhancement, PATCH-Sent, ---, colin.campbell, ASSIGNED, Switch Module Depreciated in perl 12
08:50 chris       !hudson build koha_master now
08:50 hudsonbot   chris: job koha_master build scheduled now
08:51 hudsonbot   Starting build 140 for job Koha_Master (previous build: UNSTABLE -- last SUCCESS #138 10 hr ago)
08:56 chris       colin++
09:05 kmkale      hi all
09:10 hudsonbot   Yippie, build fixed!
09:10 hudsonbot   Project Koha_Master build #140: FIXED in 19 min: http://bugs.koha-community.org:8080/job/Koha_Master/140/
09:12 magnus      yay!
09:12 magnus      hi kmkale
09:14 chris       1822 tests now
09:17 larsw       yay for tests
09:18 kmkale      hi magnus chris
09:26 magnus      could have been cool to have a graph showing the increase in number of tests over time
09:34 chris       hudson has one
09:34 magnus      hudson++
09:34 chris       but it only goes back the last 7 builds
09:34 magnus      ah
09:39 * Elwell    starts a pile of POD cleanups
09:39 magnus      Elwell++
09:44 chris       yay, see if you can add to the coverage
09:44 chris       http://bugs.koha-community.org:8080/job/Koha_Master/HTML_Report/?
09:46 Elwell      what's the metric it uses?
09:47 chris       http://search.cpan.org/~pjcj/Devel-Cover-0.73/lib/Devel/Cover.pm
09:47 chris       and
09:47 chris       http://search.cpan.org/~rclamp/Pod-Coverage-0.21/lib/Pod/Coverage.pm
09:48 Elwell      right now I'm just getting podchecker to run without errors / warnings
09:48 chris       that would help too
09:49 chris       if you could give some lines of bash
09:49 chris       i can make hudson run the podchecker too
09:50 chris       magnus: did you see my comment on normarc?
09:51 magnus      chris: "I hate MARC so let's add more"? i agree completely! ;-)
09:52 chris       :)
09:52 Elwell      interesting. 'file' claims a lot of the .pm are 'awk script text'
09:53 magnus      chris: in my frustration i even started marc-must-die.info...
09:54 chris       yep, i saw that :)
09:59 * braedon|h hides marc
10:00 braedon|h   leave the poor guy alone! It's not his fault his parents couldn't spell.
10:05 magnus      :-)
10:06 * magnus    has to go write some boring docs for a non-koha related project. sigh...
11:04 Elwell      chris: bash to call podchecker:
11:04 Elwell      find . -type f | grep -v \.git/ | egrep -i '\.p[lm]$' | xargs podchecker 2>/tmp/pod.err
11:05 Elwell      well, 's what I'm using
12:15 gmcharlt    good morning
12:30 nengard     morning
12:32 cait_a      hi gmcharlt and nengard
12:33 Elwell      right, once again I've forgotten the syntax for mailing patches. git commit -a ;????? ; git send email
12:33 Elwell      ah format patch
12:41 gmcharlt    Elwell: is there a bug for POD correction?
12:41 Elwell      uhm, prob not, want me to open one?
12:42 * cait_a    waves to druthb
12:42 gmcharlt    Elwell: yeah, please do
12:42 * druthb    waves to cait_a.  :)
12:42 gmcharlt    I'll add the bug number to your patch's commit messages when I test and sign off on them
12:43 Elwell      meh, what was my bugzilla login
12:46 nengard     your email address is your username
12:47 Elwell      nengard: yeah I remembered password. gmcharlt: 5385
12:47 gmcharlt    Elwell: thanks
12:49 gmcharlt    what are you using to check the POD?  podchecker?
12:49 Elwell      yup
12:50 Elwell      and mk1 eyeball to see if perldoc ...... looks reasonable after
12:51 gmcharlt    Elwell: http://perldoc.koha-community.org/
12:51 Elwell      ah ok - didn't know about that
12:53 pastebot    "gmcharlt" at 68.101.78.67 pasted "for Elwell - shouldn't the line of code be indented two spaces if you're removing the =over?" (11 lines) at http://paste.koha-community.org/94
12:57 Elwell      =head2 get_heading_type_from_marc
12:57 Elwell      meh ooops
12:58 Elwell      it is -- are we looking at ./C4/AuthoritiesMarc/MARC21.pm ?
12:59 gmcharlt    yep, and I noticed other examples like that in your patch
13:00 gmcharlt    doh
13:00 gmcharlt    yeah, of course I wouldn't see it if I'm using git show -w
13:00 gmcharlt    never mind me
13:01 Elwell      phew. Thought I'd done a schoolboy error. Right gotta go AFK for a bit
13:15 Oak         http://www.npccomic.com/2010/11/10/hardware-check/
13:29 magnus      hiya jwagner, sekjal et al
13:29 druthb      :)
13:29 sekjal      morning, magnus, druthb!
13:31 jwagner     Hi all
13:31 * druthb    waves to jwagner and magnus and sekjal, serially.
13:32 sekjal      morning, jwagner
13:33 * magnus    waves to druthb
13:36 kmkale      hi jwagner & druthb
13:36 druthb      hi, kmkale! :)
13:40 jwagner     hi kmkale & everyone
13:44 hudsonbot   Starting build 28 for job Koha_3.2.x (previous build: SUCCESS)
14:03 hudsonbot   Project Koha_3.2.x build #28: SUCCESS in 18 min: http://bugs.koha-community.org:8080/job/Koha_3.2.x/28/
14:03 hudsonbot   Chris Cormack: Merge remote branch 'kc/new/bug_5105' into kcmaster
14:10 magnus      yay!
14:12 cait_a      hm?
14:14 magnus      just cheering for hudson ;-)
14:19 cait_a      ah, he deserves it :)
14:36 * jcamins   curses the MTA roundly
14:36 * jcamins   also devoutly hopes that everyone else on #koha had a better trip into work than he did
14:36 jcamins     Good morning, #koha
14:37 jwagner     jcamins, I avoided that problem by working from home today :-)
14:37 jcamins     jwagner: smart.
14:37 * druthb    had a fine commute, if a little longer than usual.
14:37 jcamins     I was actually in a pretty good mood when I left home almost two hours ago.
14:38 jcamins     Then it took me an hour to get to Times Square.
14:39 * cait_a    hands jcamins a cookie
14:39 jwagner     jcamins, not so much smart as just dealing with the horrible head cold I've had since the flight home.  No energy to go in to the office....
14:39 jcamins     cait_a++ # for providing cheering cookies :)
14:40 cait_a      lol
14:40 slef        nengard: I think it would be more help if you updated any non-template-using RFCs to use the template, and put requests for clarification in the page itself, rather than on Talk: - I'm not sure you get notified about the Talk: page if you're Watch-ing the main page.
14:41 slef        nengard: what's the submission date for the newsletter?  I'll write the RFC corner if you want if we can get that done and submission date is far enough away.
14:41 nengard     not hanging up on you - just about to head into webinar - will talk about how we can do it better when i return in a few hours
14:41 nengard     slef the 13th  for newsletter -but this one is all conference stuff
14:41 nengard     ttyl
14:42 ivanc       bye all
14:49 jcamins     slef: what does tech-ctte mean?
14:49 jcamins     Oh, that's the Debian abbreviation for Technical Committee?
14:53 magnus      and POC = Proof of Concept?
14:53 cait_a      I think so
14:54 jcamins     cait_a: what's the _a?
14:54 jcamins     With me it means "away," but you're clearly here. ;)
14:54 cait_a      hmpf
14:54 cait        changed my name when I went back to sleep earlier
14:55 jcamins     cait: oh no! Has your cold gotten worse?
14:56 cait        it's ok, had a terrible headache when I woke up, but it's much better now
15:14 * chris_n   hands cait a cup of hot chocolate
15:14 wizzyrea_   ooooo cocoa
15:15 cait        yay
15:15 cait        thx chris_n :)
15:15 trea        man now i want cocoa, afk
15:15 wizzyrea_   ok, I may be weird, but I loved the cocoa in NZ
15:15 cait        omg, I didn't try it there
15:15 cait        ah, no, I did
15:15 cait        the marshmallows were a bit strange
15:37 wizzyrea_   I don't think I ever got any marshmallows
15:37 wizzyrea_   spud did, he seemed to like them
15:39 cait        :)
15:39 jcamins     wizzyrea_: and did you ever worry that he wouldn't like a marshmallow?
15:39 jcamins     Strange or otherwise? ;)
15:40 wizzyrea_   ^.^ he is like a tiny food vacuum, if it's tasty, and put in front of him, he'll probably eat it
15:40 magnus      good on him!
15:40 magnus      wizzyrea_: how do you make that face?
15:41 wizzyrea_   ^ . ^
15:41 magnus      ^.^
15:41 wizzyrea_   caret dot caret
15:41 magnus      a ha
15:41 druthb      ^.^ o.O
15:41 cait        ok, time to shop for some dinner - bbl!
15:43 magnus      colloquy displays it as a face with lots of black hair - hard to tell what it's supposed to convey - hardly lives up to the name of "emoticon"...
15:44 wizzyrea_   supposedly it's the japanese version of :)
15:46 jcamins     Any UNIMARC users around?
15:47 slef        no, they don't really exist ;-)
15:48 slef        They are an imaginary thing sent to torture us.
15:49 jcamins     Hm.
15:50 jcamins     I'm looking at hdl's patch for UNIMARC support with the analytic code.
15:51 jcamins     Is there any implied relationship between the 001 field and the biblionumber?
15:52 jcamins     @marc 001
15:52 munin       jcamins: The control number assigned by the organization creating, using, or distributing the record. The MARC code for the organization is contained in field 003 (Control Number Identifier). []
15:54 nengard     self i'm back ... what was it you wanted me to do? :)
15:54 nengard     or slef (not self)
15:55 wizzyrea_   i routinely greet myself
15:57 jcamins     cait_a: when you get back from dinner, I have questions about analytics.
15:57 jcamins     I'm writing part 1 of my magnum opus on "why you don't *really* want comments from jcamins."
15:58 wizzyrea_   (teehee)
15:58 jcamins     (the subtitle is "we can do better!")
16:03 nengard     there was something i was gonna do when i finished training - but my brain is still shot and i have no idea what it was .....
16:04 jcamins     nengard: something about template-izing non-template RFCs.
16:05 jcamins     I think.
16:05 jcamins     Didn't cait's analytic code get pushed into a branch on git.koha-community.org?
16:05 nengard     hehe -no i mean something for a customer - not the thing slef wanted me to do
16:05 jcamins     nengard: oh, okay. In that case, I have no idea.
16:05 nengard     hehe
16:05 nengard     why not? can't read my mind?
16:06 * jcamins   puts on his mindreading hat
16:06 * jcamins   tears it off immediately
16:06 jcamins     There's too much in your mind! ;)
16:07 sekjal      okay, anyone around got experience working with MARC::Record?
16:08 jcamins     sekjal: not much, but I'm always happy to offer guesses. ;)
16:08 sekjal      I want to get at a specific instance of a repeatable field in a MARC record
16:09 sekjal      or, rather, I want to go through a for loop of all the instances
16:09 sekjal      I've got to mix together info that's in the first 852 with info that's in the first 961
16:09 sekjal      and second with second, and so on
16:09 sekjal      no other linkage, other than order
16:10 jcamins     Hm.
16:10 sekjal      using field(852) is supposed to return all the matching fields, I think in an array
16:11 chris_n     sekjal: I've messed with it, but it has been quite a while ago
16:11 chris_n     that sounds right iirc
16:11 jcamins     $field852s = $marc->field('852'); $field961s = $marc->field('961')
16:11 jcamins     Then use a regular for loop, not a foreach loop.
16:12 * jcamins   realizes that he's never used a non-foreach loop in Perl
16:13 sekjal      trying that... but getting Can't call method "subfield" on an undefined value
16:13 jcamins     Oh, yeah, you need to do some weird dereferencing.
16:13 jcamins     Let me check that for you.
16:13 sekjal      when I try to access the first subfield I want in the first iteration of the 853
16:13 sekjal      ~852
16:15 chris_n     I think you have to $field852s->subfield('foo')
16:16 chris_n     iirc  $marc->field('852') returns an object
16:16 sekjal      a hash ref, yes
16:16 * chris_n   was looking for the code he did using that
16:18 jcamins     sekjal: you tried $field852s->[$ii]->subfield('a');?
16:18 * jcamins   doesn't know why that works, but he does know that it worked for him
16:19 sekjal      get warned it's not an array reference
16:19 chris_n     sekjal: fwiw: http://git.koha-community.org/gitweb/?p=wip/koha-fbc.git;a=blob;f=C4/Biblio.pm;h=35b3588c906a8977f9bd6ec0d0de3c6275c0d65a;hb=65ae1a98282ea2d5decceedbbbe857ecb7804d8a#l1276
16:21 cait        I am back
16:21 cait        but I am scared of jcamins
16:21 * cait      hides
16:21 cait        ;)
16:21 sekjal      wish I could do this with a foreach...
16:22 chris_n     it seems you can
16:22 sekjal      if it was just the one field, yes
16:22 sekjal      but I need subfields from the first 852 and also from the first 961
16:22 sekjal      then second and second, etc etc
16:23 sekjal      no linkage between the fields except their order in the record
16:23 sekjal      so perhaps do a foreach to dump them into an array?
16:24 jcamins     sekjal: you could do it with a foreach and a counter, but you'd still have to deal with the 961.
16:24 jcamins     cait: how do your libraries populate the 003 field?
16:25 chris_n     sekjal: load them into arrays and then cmp?
16:26 jcamins     sekjal: try changing the declaration of $field852s and $field961s to @field852s and @field961s?
16:26 sekjal      jcamins: declaring as @ gives undefined value (that was my first plan)
16:27 jcamins     Ah. Right.
16:27 cait        jcamins: they don't - it gets populated
16:27 cait        DE-576
16:27 cait        the isil of our union catalog
16:27 cait        @marc 003
16:27 munin       cait: The MARC code for the organization whose control number is contained in field 001 (Control Number). []
16:27 cait        right
16:27 cait        but I think it will get populated automatically with the value in the sys pref
16:27 cait        or it should
16:27 jcamins     cait: okay, so every library that uses your union catalog has the same 003 field?
16:27 cait        yes
16:28 cait        this is correct
16:28 chris_n     sekjal: you'd have to @subfields = $record->field(852)->subfields();
16:28 * chris_n   thinks
16:28 jcamins     If a new library were to join the union catalog, would they have to start using DE-576?
16:28 cait        they have their own isil too
16:28 jcamins     chris_n: that won't work right if there are more than one 852 fields.
16:28 cait        but the source of the 001 is de-576
16:29 cait        doesn't count for local titles that they catalog themselves
16:29 chris_n     jcamins: in which case you are stuck with foreach
16:29 sekjal      $record->field(852) returns something you can use foreach with, but not for
16:29 sekjal      so, apparently not an array
16:29 jcamins     cait: okay, just checking.
16:29 chris_n     you can also pass in a scope of fields
16:29 cait        not sure I can explain it well - so keep asking please
16:29 chris_n     but that doesn't help here
16:30 chris_n     sekjal: it returns an field object
16:30 jcamins     cait: no, that's what I thought. I just wanted to make sure I had it right before I said that's how you did it.
16:30 cait        ok :)
16:30 sekjal      chris_n: It's supposed to be a list of matching field objects
16:30 jcamins     sekjal: I don't suppose pop works with $field852s?
16:30 sekjal      or just the first one if used in a scalar context
16:31 sekjal      jcamins: good question
16:31 cait        I think if you download a record from worldcat it will have the worldcat number in 001 and the org code in 003?
16:32 jcamins     cait: yes, I think so.
16:32 cait        the $w fields look different when we get them out of our union catalog: $w(DE-576)idnumber
16:33 cait        zebra is not so happy about that, so we kill the org code
16:33 jcamins     cait: yeah, that's what I'm responding about.
16:33 cait        not sure how you can make zebra work here
16:33 cait        and it will get more difficult in the xslt to treat the repeatable $w right
16:34 sekjal      okay, going to try it with two foreach and a helper array
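
A small sketch of the approach sekjal is converging on: calling field() in list context returns every matching MARC::Field object in record order, so the Nth 852 can be paired with the Nth 961 by index. The subfield codes pulled out below ($b and $a) are placeholders, not the real mapping:

    use MARC::Record;

    # $record is assumed to be a MARC::Record object already loaded
    my @f852 = $record->field('852');   # all 852 fields, in record order
    my @f961 = $record->field('961');   # all 961 fields, in record order

    for my $i ( 0 .. $#f852 ) {
        my $location = $f852[$i]->subfield('b');                       # placeholder subfield
        my $extra    = $f961[$i] ? $f961[$i]->subfield('a') : undef;   # placeholder subfield
        # ... combine $location and $extra for this pair of fields ...
    }
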
16:34 jcamins     Amit_G's analytic code currently makes the assumption that everything in the $w is valid, and that's not a safe assumption.
16:34 jcamins     It seems to me that we can adjust the index in Zebra to handle that.
16:35 jcamins     And, if not, that's a very serious problem.
16:35 jcamins     thd: around?
16:36 cait        I am not aware that we can do it
16:36 cait        but will be happy if we can
16:36 slef        wizzyrea brendan chris - would you fix the topic with the new meeting date, please?
16:37 jcamins     cait: I don't know how to, but the analytic support can't be added unless that's resolved.
16:37 cait        you could make it work with biblionumbers... urgs
16:37 cait        nah
16:37 jcamins     Well, I mean, it could, but I will agitate against it, because it will completely break our catalog.
16:38 jcamins     cait: the analytic support that you implemented is safe.
16:38 cait        yeah, it has some problems, I think 001 is not populated for local cataloged records :(
16:38 cait        oh
16:38 cait        jcamins: perhaps you can guide me a bit, when I am looking at the branch
16:39 cait        but have to cook some dinner first
16:39 cait        jcamins++
16:39 jcamins     cait: I thought you already ate.
16:39 cait        no, bought food
16:39 cait        now I have to cook it
16:40 jcamins     When you get back, I have more questions about how you do things.
16:41 jcamins     But go eat first.
16:41 cait        I have the laptop in the kitchen with me
16:41 cait        advantages of a very small apartment
16:41 jcamins     Heh.
16:42 jcamins     Each of your libraries has a separate catalog, and they download their records from the union catalog, right?
16:42 cait        why is my solution safe compared to Amit's?
16:42 cait        yes, download + using staged marc import every night
16:42 cait        using the 001 as field for matching
16:42 jcamins     cait: because Amit's shows the items based on 773s.
16:43 cait        ?
16:43 jcamins     So if something doesn't match with yours, that's an erroneous search.
16:43 jcamins     I mean, item as in 952.
16:43 jcamins     With his, the patron will be told to look in the completely wrong place.
16:44 cait        sorry, I don't get it
16:44 jcamins     You know that table with items in the OPAC?
16:44 cait        you mean we can't merge both branches because we do things entirely different?
16:44 cait        yep
16:44 cait        by heart ;)
16:44 jcamins     Amit's adds an item to that table based on the 773.
16:44 cait        uh
16:44 cait        why would you do that?
16:45 jcamins     Rather, Amit's takes the item from the host record based on the information in the 773.
16:45 cait        then I will have items on the serial record?
16:45 cait        than I will have items on the serial record?
16:45 jcamins     Right.
16:45 cait        but I can't check them out
16:46 jcamins     With Amit's solution, you can.
16:46 cait        and where will I store the important information? like pages, title, author etc.?
16:46 jcamins     One moment and I'll try and draw a diagram.
16:46 jcamins     It's very confusing.
16:46 cait        thx jcamins - really glad you are involved in this analytics thing with me :)
16:47 thd         jcamins: yes I am here.
16:48 jcamins     thd: any idea if it would be possible to index this in Zebra: $w(DE-576)12345 in such a way that Zebra is able to search on just the 12345?
16:48 cait        jcamins: I have a feeling we will perhaps need some sys prefs if people have different ideas about analytics support
16:49 gmcharlt    cait: I would think so
16:49 gmcharlt    it's a broader question, too
16:49 cait        gmcharlt: yeah, I have implemented something in the xslt, so that you can add a syspref
16:49 thd         jcamins: We should be doing many things like that by creating special records for indexing.
16:49 cait        but I have still to programm it
16:49 cait        program...
16:49 gmcharlt    analytics using the 773 is a subset of more general links among bibs via 7xx fields
16:50 cait        at the moment it's hardcoded to 1
16:50 cait        it was on my list, but I wanted to have it out there for discussion now
16:50 cait        gmcharlt: would you perhaps look at my xslt changes? I know that you know a lot about marc
16:51 thd         jcamins: Using XPath based indexing in Zebra would help.
16:51 gmcharlt    cait: that's the 4056 pull requests, right?
16:51 jcamins     thd: okay, so it's *possible*, at least.
16:51 cait        thd: but we ould need to change it to dom indexing for that, is that correct?
16:51 cait        gmcharlt: right
16:51 gmcharlt    cait: yes, that can be done
16:52 * jcamins   gives up on a diagram.
16:52 thd         cait: Yes that is correct
16:52 cait        it's my first bigger project, so be gentle ;)
16:52 cait        or nice
16:52 cait        whatever is the better word
16:52 thd         :)
16:52 * jcamins   will write an explanation, and get back to cait in a moment
16:52 thd         nice and gentle
16:52 gmcharlt    cait: heh.  you make it sound like I have a reputation for eating new committers for lunch
16:52 gmcharlt    ...
16:52 gmcharlt    don't answer :)
16:53 trea        nomnomnom
16:53 cait        no, you don't :)
16:53 cait        ok, ignore me... I am just a bit nervous about it and want to get it right...
16:53 cait        and our libraries really need it to work with our data - so I would be really happy to get it into koha
16:54 * chris_n   hopes he tastes bad...
16:55 cait        or help build a solution that works for them
16:58 jcamins     cait: I'm going to use a conference proceeding (we'll call it Biblionumber 100) as an example. This conference proceeding is one physical volume, and has ten articles (biblionumbers 101-110). The record for the conference proceeding (biblionumber 100) has an item record (and a 952 field, we'll call this itemnumber 1) with a barcode so that it can be checked out. With Amit's proposed feature, the ten article records (biblionumbers 101-110) do not have item records
16:59 cait        do not have item records?
16:59 jcamins     Right.
16:59 slef        oops I hate it when I email lists from work by mistake
16:59 cait        hi slef
17:00 cait        I like your mails - and I can hear you reading them to me now :)
17:00 jcamins     So the record for an article has these variable fields: 100, 245, 650, 773 (and whatever others are relevant).
17:01 thd         cait: I read back a little of your discussion.  If you want the Koha biblionumber in 001 you can modify the frameworks accordingly or add a script for record creation which also puts the Koha biblionumber in 001.
17:02 jcamins     thd: but biblionumber isn't robust.
17:02 thd         cait: Exactly, which is why I was uncertain if we should really put it in 001.
17:03 thd         s/cait/jamins/
17:03 thd         s/cait/jcamins/
17:03 cait        thd: we have a good identifier there
17:03 jcamins     thd: I would tend to think we should not, but I was a little confused by hdl's patch.
17:03 cait        but I was thinking about other libraries
17:04 thd         001 would be standard but this is Koha which is something short of standard.
17:04 gmcharlt    cait: $w for sharing, $0 (or $9) with the Koha biblionumber within the database
17:04 jcamins     gmcharlt: okay, that would help.
17:05 gmcharlt    e.g., $0 (my_libs_symbol)biblionumber
17:05 gmcharlt    would be one way
17:05 cait        gmcharlt: that's a nice idea
17:05 thd         jcamins: Which patch of hdl's do you mean?
17:05 gmcharlt    though with that approach, some additional code to help the catalogers maintain the $0 would probably be good
17:06 jcamins     thd: http://git.koha-community.org/gitweb/?p=koha.git;a=commit;h=0cdbfeea635eb535e05551a64dbd633cd5a5c4db
17:06 cait        at the moment I test for $w and use a text link if it doesn't exist
17:06 jcamins     gmcharlt: yes, there would need to be some sort of authority control-type dialog.
17:07 cait        we could add $0 there too
17:07 cait        but I am not sure I am able to do the cataloging part :(
17:09 jcamins     Well, hopefully that was coherent.
17:09 thd         jcamins: That patch has no explicit reference to 001.
17:10 jcamins     thd: exactly.
17:10 jcamins     The standard says "this should be used for 001," but it's used for the biblionumber.
17:10 jcamins     That's why I was confused.
17:11 cait        why don't make it an option?
17:11 thd         jcamins: For Koha, the appropriate field reference is wherever biblionumber is stored.
17:11 jcamins     thd: http://www.unimarc.info/bibliographic/2.3/en/461
17:11 cait        to use biblionumbers or 001?
17:11 jcamins     $0 is defined in the standard.
17:11 cait        thd: I am not happy about that, I think we should stick with the standard - to make data migration easier too
17:12 thd         cait: The problem is in the Koha MARC frameworks and not the patch from hdl.
17:13 jcamins     thd: the problem is generally in the way that the analytic support is coded.
17:13 jcamins     But that's what code review is for. :)
17:13 cait        I think I will start looking at the code - but perhaps we should discuss that at a meeting or something
17:14 cait        to learn more about how Amit's feature works
17:14 thd         jcamins: Aside from whether the biblionumber is in 001 or not does the patch from hdl work?
17:14 jcamins     thd: I do not know.
17:14 jcamins     My concern is with the analytic code in general. hdl's patch was just the one that made me ask the question about the 001.
17:15 thd         jcamins: It looks right to me but I am not looking hard for any bug.
17:15 thd         biblionumber should be in 001.
17:16 thd         The problem with changing it is a need for wide consultation about the need to change every record when updating to a new version of the Koha MARC frameworks.
17:16 jcamins     thd: I disagree, but I'm willing to be convinced. Why?
17:16 jcamins     thd: oh, are you saying that the frameworks for UNIMARC specify that?
17:17 jcamins     I didn't realize that.
17:17 thd         jcamins: I think that the Koha UNIMARC frameworks may now use 001 correctly.
17:17 jcamins     standard_compliance++
17:17 cait        thd: would the 001 field be overwritten with the biblio number?
17:18 cait        that would break a lot of things for us
17:18 thd         jcamins: If something fundamental is moved in the Koha MARC frameworks everyone's records have to be modified to match the new Koha MARC framework.
17:19 jcamins     thd: I think you are responding to an orthogonal issue.
17:19 jcamins     thd: http://git.koha-community.org/gitweb/?p=koha.git;a=commitdiff;h=10b1ffcce964a44f01a38ae09d4789f76a516b72
17:20 jcamins     Actually, see my e-mail where I expressed concern. :)
17:20 thd         cait: Nothing would be overwritten in old records without a script to overwrite them except if you change the Koha MARC frameworks and load a record into the record editor and save it.
17:20 cait        thd: ok, thx
17:21 cait        so if there is no plugin defined for 001 in the frameworks everything is ok?
17:21 jwagner     chiming in late here -- I thought biblionumber went into the 999c by default.  Or am I just looking at an old system?  Putting things into the 001 would screw things up for a lot of sites.
17:21 gmcharlt    jwagner: it varies depending on the framework
17:21 jcamins     jwagner: that's what I thought too.
17:22 cait        would break the whole import for us, but it seems to be ok for now
17:22 jwagner     I'm looking at the Koha to MARC mapping.
17:22 gmcharlt    for MARC21 users, it is the 999$c
17:22 jwagner     Yeah, I don't have a UNIMARC system so I don't know what it uses.
17:22 gmcharlt    IIRC, it's different for UNIMARC users
17:22 jcamins     To clarify my concern, I am worried that this branch which has not been merged into master yet makes an unwarranted and dangerous assumption about 001 and biblionumber being the same.
17:22 jwagner     OCLC, for example, puts the OCLC control number in the 001, and a lot of our sites use that for matching.
17:23 gmcharlt    the theoretical standard approach would be move the 001+003 to the 035 during bib import
17:24 jwagner     But there are other numbers in the 035.  Would it matter if there are multiples?
17:24 gmcharlt    and do dedupe matching with 001/003 from the incoming record to the 035 of records already in the database
17:24 gmcharlt    again in theory, the value in each 035 is qualified by who assigned the ID
17:24 gmcharlt    e.g.,
17:24 gmcharlt    035 $a(OCoLC)123
17:24 gmcharlt    035 $a(some_random_library)ABC444
17:25 gmcharlt    of course, practice is more complicated
17:25 jcamins     This would require a switch to DOM indexing, per thd's earlier explanation of how indexing that would work.
17:25 * jcamins   thinks he'll quickly eat lunch, since a patron is expected in a few minutes
17:25 gmcharlt    so I'd be chary of any proposal to change the default mapping of the biblionumber to the 001 for MARC21 users
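
A hedged sketch of the import-time move gmcharlt outlines, shifting an incoming record's 001/003 pair into an 035 qualified by the assigning agency; the helper name and the choice to drop the original control fields are assumptions about one possible implementation, not existing Koha code:

    use MARC::Record;
    use MARC::Field;

    # Turn an incoming record's 001 + 003 into e.g. 035 $a(OCoLC)123,
    # leaving the 001 free for whatever the local system stores there.
    sub control_number_to_035 {
        my ($record) = @_;
        my $f001 = $record->field('001') or return;
        my $f003 = $record->field('003');

        my $qualifier = $f003 ? '(' . $f003->data() . ')' : '';
        $record->insert_fields_ordered(
            MARC::Field->new( '035', ' ', ' ', a => $qualifier . $f001->data() )
        );
        $record->delete_field($f001);
        $record->delete_field($f003) if $f003;
        return $record;
    }
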
17:25 munin       New commit(s) kohagit: Merge remote branch 'kc/new/bug_5386' into kcmaster <http://git.koha-community.org/gitweb/?p=koha.git;a=commitdiff;h=af1d8290317d7602616d7269fd40109ccf8a3f8d> / bug 5386: remove dep on Path::Class from t/00-load.t <http://git.koha-community.org/gitweb/?p=koha.git;a=commitdiff;h=e3322f732e5fe22d5972421d9a3087738aa7cab2>
17:26 slef        is chary a word?
17:26 cait        jwagner: same here for the number from our union catalog - it's the key for matching imports
17:26 gmcharlt    slef: yes
17:26 trea        slef: char·y/ˈCHe(ə)rē/Adjective
17:26 trea        1. Cautious; wary.
17:26 trea        2. Cautious about the amount one gives or reveals.
17:26 trea        according to teh googlez
17:26 slef        trea: don't swear at me ;-)
17:26 slef        but live and learn...
17:27 * trea      holds his head and weeps
17:27 * cait      hands trea a tissue
17:27 trea        thx cait
17:28 slef        I've got a library whose mysql used only for Koha 3.0.x keeps deadlocking. Anyone seen that?
17:29 hudsonbot   Starting build 141 for job Koha_Master (previous build: FIXED)
17:32 thd         gmcharlt++ about what should happen when importing records
17:33 cait        lunch already eaten?
17:33 jcamins     I eat quickly, yes.
17:33 * cait      is still cooking
17:33 thd         jwagner: Yes, as I explained moving from 999 $c to 001 would require updating all the records on many systems.
17:34 * jcamins   thinks he must have missed something
17:34 cait        it's druthb's fault - she made me hungry for spaghetti bolognese
17:34 thd         jwagner: However, I had understood that some LibLime customers were using 001.
17:34 jcamins     Did someone propose that it was a *good* idea to put the biblionumber in 001?
17:34 cait        jcamins: and store the union catalog/source number in 035
17:34 thd         jcamins: I discussed it with kados long ago.
17:35 cait        I think we actually have the information doubled up and in 035 too, but have to check some sample records
17:35 jcamins     cait: I think storing the union catalog/source in the 035 is a great idea, but that doesn't make putting the biblionumber in the 001 anything other than a terrible idea.
17:35 jcamins     IMHO
17:35 jwagner     thd, I don't know what some of the inherited sites might be doing -- have to ask cfouts or druthb about that.
17:35 cait        I think we can discuss that - but it should not break anything for existing catalogs
17:35 jwagner     jcamins, agreed.  I don't want anything to touch the 001.
17:36 cait        jcamins: I am with you on that - it would make things more difficult for us too
17:36 thd         jcamins: My biggest concern at the time was the easiest way to preserve the previous 001 fields in records without needing to ensure people were using a script to populate 035.
17:36 cait        but as long as we can build consensus on a way that does not break my import and make it possible to display the linkings in our records, I will be happy
17:37 cait        I think using biblionumber requires a lot more work to do that
17:37 cait        or it is even impossible for us - because of our union catalog situation - so it should be an option
17:38 jcamins     thd: I think a bigger concern is that the 001 field has a meaning according to the standard. The biblionumber is unstable and does not.
17:38 thd         If all the Koha import scripts were written to use 035 correctly and we supplied an appropriate migration script then we could use 001 following the standard.
17:39 thd         I only hesitated when I expected that too many users would be caught by surprise with broken systems for not following some appropriate record migration procedure.
17:40 thd         We should fix the issue for Koha 3.4.
17:40 gmcharlt    thd: which is exactly the reason why we're stuck - we would have to make an upgrade bulletproof
17:40 thd         gmcharlt: We changed it once before.
17:41 jcamins     Can someone please clarify for me why we would want to make a change like this?
17:41 gmcharlt    and, frankly, making matching more flexible would be more achievable for 3.4
17:41 thd         There were many fewer users then.
17:41 gmcharlt    jcamins: it is theoretically more correct - I agree with thd on that
17:41 thd         jcamins: Following the standard is good especially if you are sharing your records.
17:41 gmcharlt    I don't think it would be easy to do in practice without annoying a great many current users
17:42 gmcharlt    thd: as far as following the standard is concerned, code to munge the 999 and 001 on *export* would achieve that and be easier
17:42 gmcharlt    after all, MARC is an interchange format
17:42 gmcharlt    nothing says that the version that gets exported from a Koha catalog has to be 100% identical to what's stored in the database
17:42 gmcharlt    so, e.g., if you want the Koha bib number in the 001 on export
17:42 thd         gmcharlt: That is exactly how I conceived of addressing the issue when considering the problem with kados.
17:42 gmcharlt    a simple output filter would achieve that
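
The "simple output filter" gmcharlt describes could look roughly like the sketch below. This is illustrative only, not Koha's actual export code: it assumes a MARC::Record object with the Koha bib number in 999 $c, shifts any existing 001/003 pair into an 035, and writes the biblionumber into 001. The sub name and the $org_code parameter are made up for the example.

    # Illustrative sketch of an export-time filter (not Koha's real export code).
    use MARC::Record;
    use MARC::Field;

    sub filter_for_export {
        my ( $record, $org_code ) = @_;    # $org_code: this library's MARC organisation code

        my $biblionumber = $record->subfield( '999', 'c' );
        return $record unless defined $biblionumber;

        # Preserve any existing control number as 035 $a "(SOURCE)number"
        my $old001 = $record->field('001');
        my $old003 = $record->field('003');
        if ($old001) {
            my $prefix = $old003 ? '(' . $old003->data . ')' : '';
            $record->insert_fields_ordered(
                MARC::Field->new( '035', ' ', ' ', a => $prefix . $old001->data ) );
            $record->delete_field($old001);
        }
        $record->delete_field($old003) if $old003;

        # Put the Koha biblionumber in 001 and this library's code in 003
        $record->insert_fields_ordered( MARC::Field->new( '001', $biblionumber ) );
        $record->insert_fields_ordered( MARC::Field->new( '003', $org_code ) );
        return $record;
    }
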
17:43 slef        "DBD::mysql::st execute failed: Lock wait timeout exceeded; try restarting transaction" hmmm
17:43 thd         gmcharlt: How do we get Zebra to do that for a Z39.50/SRU server serving the world?
17:44 jcamins     gmcharlt: I'm not convinced that storing the biblionumber in the 001 is theoretically more correct. Where does that leave accession numbers from previous systems?
17:44 gmcharlt    interpose an XSLT stylesheet to do the transform before emitting the MARC or MARCXML ... pretty sure that would be possible
17:44 gmcharlt    jcamins: all legacy IDs should be moved to 035s with appropriate library symbol
17:45 hudsonbot   Starting build 29 for job Koha_3.2.x (previous build: SUCCESS)
17:45 cait        gmcharlt: ok, this would work for matching, but what about records with identifiers in $w?
17:45 thd         gmcharlt: That would be the way.  Does Zebra have support for such a filter on output?
17:45 cait        how do they find each other? $w is supposed to link to 001
17:45 cait        same on export
17:45 jcamins     gmcharlt: but libraries don't change their Organizational Codes when they change ILSes.
17:45 munin       New commit(s) kohagit32: Merge remote branch 'kc/new/bug_5386' into kcmaster <http://git.koha-community.org/gitweb/?p=koha.git;a=commitdiff;h=77c39f6cfc3b2155ce93096f94aeb7e45e7311a5> / Test modules compile <http://git.koha-community.org/gitweb/?p=koha.git;a=commitdiff;h=3b4cf45f9b67ab777ac47ceebb447321791799f9>
17:46 jcamins     So you'd end up with 035$a(NNAN)12345 035$a(NNAN)43218 and 035$a(NNAN)smp1234567 all in the same record.
17:46 jcamins     And the first two could theoretically both be valid at once.
17:47 gmcharlt    jcamins: slightly different thing - the legacy ID during a migration is more commonly not in the 001 in the first place
17:47 thd         As long as everything is valid that should not be a problem.
17:47 thd         Preserve your history and you can recover what you need.
17:49 jcamins     gmcharlt: oh, is that how properly-configured systems do it? ;)
17:49 gmcharlt    jcamins: heh
17:49 * jcamins   likes that idea
17:49 thd         cfouts: Where are LibLime customers generally storing the Koha biblionumber in MARC 21?
17:50 jcamins     (properly configured systems that do things right, that is)
17:50 gmcharlt    IMO, one would hang on to the ID from the past ILS for a year, then purge it
17:50 hudsonbot   Project Koha_Master build #141: SUCCESS in 20 min: http://bugs.koha-community.org:8080/job/Koha_Master/141/
17:50 hudsonbot   Galen Charlton: bug 5386: remove dep on Path::Class from t/00-load.t
17:50 munin       04Bug http://bugs.koha-community.org/bugzilla3/show_bug.cgi?id=5386 minor, PATCH-Sent, ---, gmcharlt, ASSIGNED, t/00-load.t adds unnecessary dependency to Path::Class
17:51 cfouts      thd: 999$b, iirc
17:52 thd         cfouts: Are any using 001?
17:52 chris       morning peeps
17:52 cfouts      I don't think so. however, MARC is not my forte, and there could be exceptions.
17:54 gmcharlt    @admin ignore add hudsonbot
17:54 munin       gmcharlt: The operation succeeded.
17:54 jwagner     cfouts, we were talking earlier about the Koha to MARC mapping, and thd thought LL had used something else.  What I'm familiar with is 999c
17:54 * chris     sees some more signed off patches to check and push this morning :)
17:55 thd         I think ryan and kados may have both told me in late 2007 that new customers of some type would be using 001.
17:55 cait        hi chris
17:55 jcamins     I'm not sure I see what the benefit to changing the biblionumber mapping would be.
17:58 cfouts      right, 999$c. I don't think it's different in any LL customers. like jcamins, I'm not seeing an advantage to moving it.
17:58 thd         jcamins: There would be no benefit other than the better appearance of standards compliance.  However, if we can have Zebra add a filter then everything would be fine for external Z39.50/SRU users.
18:00 jcamins     thd: I guess I can see the argument for external Z39.50 users, but I'm not even convinced that matters.
18:00 wizzyrea_   mornin chris
18:00 thd         jcamins: We would also need to know if Solr/Lucene would support such filtering for the prospect of replacing Zebra even as Z39.50/SRU server for external users.
18:01 thd         jcamins: Are you against sharing your records or do you think that external users do not want to be able to recognise your record numbers?
18:02 jcamins     thd: I am pro sharing, and think that external users should have access to our accession numbers.
18:02 jcamins     As at most libraries I've been at, our accession numbers are external.
18:02 gmcharlt    thd: that was a bit rude
18:03 thd         jcamins: External users are not going to know the significance of 999 $c.
18:03 gmcharlt    thd: they also don't necessarily have any reason to care
18:03 thd         Sorry if I seemed rude.
18:03 cait        thd: for us the union catalog number is always more interesting
18:03 jcamins     thd: no, but they're also not going to be able to do anything useful with it in the 001 field.
18:03 cfouts      what's the utility of knowing the internal reference number at any rate?
18:03 cait        than the internal number, for sharing too
18:03 gmcharlt    certainly anybody copy-cataloging just wants the record, and does not care about any local identifiers
18:03 thd         My question was asked in good humour.
18:03 jcamins     No offense was taken.
18:04 gmcharlt    no worries, then
18:04 jcamins     :)
18:04 thd         My facial expression and vocal inflection are not communicated well on IRC :)
18:05 hudsonbot   Project Koha_3.2.x build #29: SUCCESS in 20 min: http://bugs.koha-community.org:8080/job/Koha_3.2.x/29/
18:05 hudsonbot   * Chris Cormack: Test modules compile
18:05 hudsonbot   * Chris Cormack: Merge remote branch 'kc/new/bug_5386' into kcmaster
18:05 chris       kapow!, collaboration ftw
18:05 jcamins     My argument against the biblionumber in the 001 is that it is specific to the library's *current* installation of its *current* ILS. 001 should have semantic significance, which ideally means reference to a guaranteed-unique and guaranteed-permanent identifier. Like accession number.
18:06 thd         I am rarely ever rude with intent.
18:06 jcamins     Now, granted, the ANS has done a terrible job with their accession numbers and 001s, but that just gives me 12 months worth of experience to tell you that you don't want to experience that particular error.
18:07 thd         gmcharlt: Copy cataloguers should be copying the previous 001 into 035.
18:07 jcamins     (yes, it took me twelve months of data clean up to give up and conclude that reliable identifiers were a lost cause)
18:08 jcamins     (I see this less as an argument for biblionumbers and more as an argument against them, in case that's unclear)
18:08 thd         gmcharlt: If we had really good automation support for cataloguing we could check the originating source and other matches for possible updates or more complete record information.
18:08 jcamins     *biblionumbers as the 001, I mean
18:09 gmcharlt    thd: right, but good automation support would have to entail not making assumptions about whether any particular source of records was actually following the ideal convention for 001/003/035 usage
18:09 thd         The real world has no reliable identifiers but having identifiers is better than not having them.
18:10 thd         gmcharlt: I would not presume but hope and verify with good automation support for record matching.
18:16 thd         003 is often missing and should be checked against knowledge of the organisation code for the originating library.
18:17 gmcharlt    which of course can be tricky
18:18 gmcharlt    as not every library necessarily has (or knows) what their code is
18:21 gmcharlt    chris_n: when you cherry-pick, please pick the base commits, not the merge commits
18:21 gmcharlt    otherwise, it squashes history and attribution
18:22 thd         I would create clearly labelled locally created codes for records from libraries which had not taken the opportunity to register for an organisation code.
18:23 gmcharlt    chris_n: what you can do if there are too many patches in a branch to cherry-pick individually is to merge in the branch, then sign off on the merge commit before pushing
18:24 chris       you do need to be careful with that tho
18:24 thd         I would add a note field to the record explaining a locally created code.
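
On the import side, the check thd describes (preserve the source system's 001 as an 035, using the 003 when present and a clearly labelled local code when it is not) could be sketched as below. This is not an existing Koha routine; the fallback code, the duplicate check, and the 500 note are assumptions for illustration.

    # Illustrative sketch, not an existing Koha routine: keep the incoming
    # record's control number as an 035 before it is saved locally.
    use MARC::Record;
    use MARC::Field;

    sub preserve_source_control_number {
        my ( $record, $fallback_code ) = @_;    # $fallback_code: locally assigned, clearly labelled

        my $f001 = $record->field('001') or return;    # nothing to preserve
        my $f003 = $record->field('003');
        my $code = $f003 ? $f003->data : $fallback_code;

        my $identifier = sprintf '(%s)%s', $code, $f001->data;

        # Don't add a duplicate if an identical 035 is already present
        return if grep { ( $_->subfield('a') // '' ) eq $identifier }
                  $record->field('035');

        $record->insert_fields_ordered(
            MARC::Field->new( '035', ' ', ' ', a => $identifier ) );

        # Explain a locally created code in a note, per the suggestion above
        $record->insert_fields_ordered(
            MARC::Field->new( '500', ' ', ' ',
                a => "035 prefix $code was assigned locally; the source library has no registered organisation code." ) )
            unless $f003;
        return;
    }
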
18:24 chris       because the branches are based on master
18:24 chris       you may merge a feature by accident
18:24 gmcharlt    good point
18:24 chris       i will try to leave the branches up
18:25 chris       with the commits at the top, so that cherry picking from them is easier
18:25 chris       rather than from master where they are sometimes not grouped together anymore
18:28 chris       okie dokie bus time
18:28 chris       cya's soon from the back seat of the bus, with the other naughty kids
18:29 nengard     hehe
18:29 * druthb    chuckles.
18:30 chris_n     gmcharlt: tnx, I missed that point
18:31 chris_n     I have a script written that will cherry-pick ranges of commits, I'll just go back to using it to pick from branches
18:44 chris       Back
18:46 cait        fast :)
18:46 chris       Well the bus stops 7 metres from my front door
18:47 chris       So its not far to walk:)
18:47 chris       Bout 20 mins now til I get to work
18:48 chris       Except I will stop at neo for a coffee
18:48 wizzyrea_   she'll have those pastry things you like ;)
18:48 chris       Its funny cos now u know where that is
18:49 chris       I hope so !
18:49 jwagner     chris, is that the place I went for lunch the first day of the hackfest?
18:50 chris       Yup just down the street from catalyst
18:50 jwagner     They had good French toast and eggs, too.
18:50 brendan     heya all
18:51 * druthb    waves to brendan
18:51 chris       They are good all round, they do me a good deal
18:51 chris       12 large flat white coffees for 37
18:51 jwagner     So we should have told them that you sent us, and they'd have given us a special rate???
18:51 chris       I'm not sure they like me that much :)
18:51 chris       Worth a try tho :)
18:53 * chris     has played a few akoha.com missions there
18:53 trea        man i miss flat whites
18:53 * wizzyrea_ too
18:53 * gmcharlt  googles
18:54 chris       There's a good wikipedia page on them gmcharlt
18:54 chris       Close to a latte
18:55 chris       trea: jcamins was saying you can get them in nyc
18:55 chris       Slightly closer :)
18:55 trea        i'd sooner fly to wellington
18:55 chris       Hehe
18:56 jwagner     I've been threatening almost daily since I got back to move to NZ.  I'm sure someone must be hiring down there :-)
18:56 chris       I like ny, without it id never have met laurel
18:56 jwagner     (In other words, chris, you have a really great country to live in -- me like!)
18:57 chris       Windy though :)
18:57 cait        the only negative thing is that it is so far away!
18:57 jwagner     Yes, I could do without the wind :-(  On the whole, though....
18:57 chris       And crazy roads eh nengard?
18:59 chris       Ata marie Brooke
18:59 Brooke      kia ora
19:00 jcamins     chris: I didn't say you can get a flat white coffee in NYC.
19:00 jcamins     Someone told me you could on this channel, though.
19:00 chris       Ohhh
19:00 Brooke      mr dukleth perhaps?
19:01 chris       Cripes I'm on the pda bus
19:01 chris       Get a room!
19:01 jcamins     chris: I thought you meant everyone had an Android at first.
19:01 Brooke      you're in polynesia. Toughen up :P
19:01 trea        ok chopper
19:02 chris       Htfu
19:02 * chris     assumes that's the chopper you mean :)
19:02 trea        yes
19:03 chris       Now I have to watch that clip again when I get to work
19:05 jcamins     trea: http://tmagazine.blogs.nytimes.com/2010/05/06/ristretto-flat-white/
19:06 nengard     huh? what? was on a call ... whatcha saying chris?
19:07 nengard     oh - yes the roads can be a bit insane
19:07 nengard     deadly even
19:09 chris       K my stop bbiab with coffee
19:10 chris       Single biggest killer in nz, cars
19:11 trea        i think our time in nz was so enjoyable precisely because we didn't drive
19:14 nengard     it wasn't driving that was the problem
19:15 nengard     it was driving up a mountain with no guardrail or shoulder to speak of - i think my tire hit dirt a few times and if it was just a little further over i would have been at the bottom of that mountain on the wrong side of the car :) hehe
19:15 druthb      hehehe....driving in DC isn't a problem....it's the fact that everyone *else* is driving at the same time!  If we could all take turns, would be no problem.
19:16 trea        yea, my original comment still stands
19:17 Brooke      it's cause you live in MD :P
19:18 Brooke      the choice was betwixt bad merging in MD
19:18 Brooke      and bad parking in VA
19:18 Brooke      clear choice to me :)
19:19 druthb      Brooke:  I had a car for about two weeks after I moved here.  When I called my insurance company and they wanted to *triple* what I was paying in Texas, I told them to scram, and sold the chariot.  And lowered my stress level.
19:19 Brooke      :D
19:19 Brooke      way to go
19:20 druthb      I figured that $300/month for insurance, plus fuel, plus inspections, plus $100/month for blood pressure pills was a bit much.
19:20 jcamins     Yay!
19:20 jcamins     cars--
19:21 jcamins     @karma cars
19:21 munin       jcamins: Karma for "cars" has been increased 0 times and decreased 1 time for a total karma of -1.
19:21 druthb      cars--
19:21 chris       back
19:24 cait        cars--
19:24 cait        I can't drive them and get car sick
19:24 cait        cars--
19:25 druthb      @karma druthb
19:25 munin       druthb: Karma for "druthb" has been increased 37 times and decreased 0 times for a total karma of 37.
19:25 druthb      @karma jwagner
19:25 munin       druthb: Karma for "jwagner" has been increased 49 times and decreased 0 times for a total karma of 49.
19:25 druthb      harrrumph!
19:25 brendan     @karma sekjal
19:25 munin       brendan: Karma for "sekjal" has been increased 39 times and decreased 1 time for a total karma of 38.
19:25 gmcharlt    @quote add <druthb> harrrumph!
19:25 munin       gmcharlt: The operation succeeded.  Quote #102 added.
19:26 jwagner     Hey, that's MY line!!!
19:26 * druthb    giggles like a maniac.
19:26 cait        lol
19:26 cait        druthb++
19:26 trea        that's a creative commons line. it's in the public domain now
19:26 jwagner     me thinks druthb had better watch it or I'll come sneeze and cough on her!
19:27 * druthb    may wear a hazmat suit to work tomorrow, just in case.  Probably get funny looks on the bus--but is used to that.
19:27 jcamins     druthb: you'd get funny looks?
19:27 jcamins     Here in NYC, I don't think anyone would blink.
19:28 jwagner     a hazmat suit would probably provoke a terror alert, sigh.
19:28 druthb      Nearly every day, yup.  I get it less in the District proper, but out here in the 'burbs yah.
19:28 * chris_n   is amazed at all of the author/owner-less RFCs for 3.4
19:28 chris       mostly come over from the old wiki i bet
19:28 trea        druthb: http://bit.ly/9Jn07W
19:29 nengard     chris_n -- hence all my comments on not following the template
19:29 jcamins     jwagner: yeah, I guess you're right.
19:29 Brooke      move over to the People's Republik of Takoma Park, comrade.
19:30 druthb      trea++
19:31 * chris     wanders downstairs to chat with anitsirk
19:31 richard     hi
19:32 druthb      Brooke:  The only place where I've been concerned about immediate violence was on the Ft. Totten metro platform.  Had a scary moment there.
19:34 jcamins     Brooke: I thought you worked at a rural library? I didn't realize there were rural libraries in DC suburbs.
19:40 druthb      off to catch the bus.
19:40 druthb      public_transit++
19:40 * druthb    waves
19:49 * chris_n   notes 620 open enh reqs in bugzilla
19:52 jcamins     Was bug 5040 patched in master yet?
19:52 munin       04Bug http://bugs.koha-community.org/bugzilla3/show_bug.cgi?id=5040 enhancement, PATCH-Sent, ---, nengard, ASSIGNED, "Distance" misspelled in default framework
19:54 jcamins     Nope.
19:54 jcamins     Sorry chris_n, I can't close any of my bugs just now.
19:55 jcamins     I did change that one from enhancement to trivial.
19:56 cait        chris_n: I'll try and sign-off the patch for fines and notices tomorrow
19:56 chris_n     cait: tnx
20:00 cait        closing some of my bugs right now :)
20:12 chris       back
20:14 cait        :)
20:14 magnus      g'night #koha
20:15 cait        chris: will you push my analytics on a branch? we had lively discussion on linking records earlier
20:15 cait        I think it will probably not be easy to merge the different concepts
20:16 chris       right
20:16 chris       hmm, i can push them on an awaiting_qa branch
20:16 cait        that would be ok for me - to let people test
20:16 chris       new/awaiting_qa/bsz_analytics
20:16 cait        and look at it?
20:16 chris       people could test from your repo also
20:17 chris       you could email koha-devel, and give a link to your public repo
20:17 chris       and the branch
20:17 cait        hm right
20:17 chris       and ask people to test there
20:17 chris       i pushed osslabs, because theirs came in in patches, not in a nice branch already :)
20:17 cait        when I asked you yesterday if a branch would help you said yes!
20:17 * cait      grumbles a little ;)
20:18 chris       yes, you having a branch helps me a lot
20:18 chris       because it means i dont have to take a bunch of patches and make a branch
20:18 chris       and i can spend the time learning how to type instead
20:18 * jcamins   wails and gnashes his teeth
20:19 cait        huh
20:19 * cait      looks scared
20:19 cait        you type wonderfully, much better than me ;)
20:19 jcamins     Apparently the only way to make searches for accented characters work is to open each record in the Koha record editor and save it.
20:19 cait        ?
20:19 cait        that doesn't sound good
20:20 chris       cait: i can make a branch under git.koha-community.org and then you can ask people to test it from there, or you can ask them to test it from your repo .. i dont mind either way, what do you prefer?
20:20 cait        we use icu... but that has its own problems
20:20 cait        chris: that's a good question
20:20 cait        I was thinking about writing my own rfc
20:20 chris       now thats a good idea
20:20 cait        jcamins: what do you think?
20:20 jcamins     I have 175363 records in this database. That's an awful lot of records to open and save. :(
20:20 jcamins     cait: I'd definitely like to see your RFC.
20:21 chris       jcamins: are you changing anything?
20:21 chris       and does the save change anything?
20:21 jcamins     And as you probably gathered from our lively discussion earlier, I am liable to comment. :)
20:21 chris       or does it just make zebra reindex?
20:21 cait        and I had hoped you would agree to proofread it - because I hate writing longer things in english
20:21 chris       cos if its just zebra, rebuild_zebra.pl -r
20:21 jcamins     chris: no, I'm not changing anything, I think it's just changing the unicode normalization.
20:21 cait        better learn typing first...
20:21 jcamins     The records are in the index.
20:22 chris       jcamins: but wrongly if they arent being searched eh?
20:22 braedon|h   jcamins: can't you write a script to batch process the normalisation?
20:22 chris       jcamins: what im driving at, is the save changing the marcxml in biblioitems
20:22 chris       cos if its not, a full reindex is all you need, if it is, then you need to do what braedon|h is suggesting :)
20:24 braedon|h   scripts are always the answer :)
20:26 jcamins     chris: I'm guessing that braedon|h is right.
20:26 braedon|h   you could probably actually write a simple firefox plugin to edit and save all your records
20:27 braedon|h   (or depending on the save mechanism, do it via a command line script)
20:28 jcamins     I'll do a bit more investigation.
20:30 jcamins     And maybe wail and gnash my teeth a bit more.
20:30 chris       if you arent changing anything
20:30 chris       the script could be as simple as
20:30 chris       select biblionumber from biblio;
20:30 chris       iterate over that
20:30 chris       get record, save record
20:30 chris       done
20:31 braedon|h   so there are get/save functions you can easily call externally?
20:33 braedon|h   no benefit in a higher level firefox plugin then :P
20:33 chris       all in C4/Biblio.pm
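
Spelled out, the loop chris sketches could look like the script below. It is a rough sketch only: GetMarcBiblio, GetFrameworkCode and ModBiblio are the C4::Biblio calls as they existed around Koha 3.x, so check the signatures against the installed version, and back up the database before running anything like this. (If the stored marcxml turned out to be fine and only the index were stale, the rebuild_zebra.pl -r full reindex mentioned earlier would be enough on its own.)

    #!/usr/bin/perl
    # Rough sketch of the "select biblionumbers, get record, save record" loop
    # outlined above. Re-saving each record runs it back through Koha's own
    # encoding handling, which normalises the Unicode as a side effect.
    use strict;
    use warnings;
    use C4::Context;
    use C4::Biblio;

    my $dbh = C4::Context->dbh;
    my $sth = $dbh->prepare('SELECT biblionumber FROM biblio');
    $sth->execute;

    while ( my ($biblionumber) = $sth->fetchrow_array ) {
        my $record = GetMarcBiblio($biblionumber);
        next unless $record;    # skip anything that fails to parse
        my $frameworkcode = GetFrameworkCode($biblionumber);
        ModBiblio( $record, $biblionumber, $frameworkcode );
    }
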
20:38 munin       New commit(s) kohagit: Bug 5112: Organisation does not show links to professionals <http://git.koha-community.org/gitweb/?p=koha.git;a=commitdiff;h=63fdd8768acc42cf1b02a655506d6ea97c11fe14>
20:38 * jcamins   wails and gnashes his teeth again - MarcEdit has broken.
20:39 * braedon|h feels sorry for jcamins' teeth
20:39 cait        yay!
20:39 * cait      gives jcamins another cookie
20:39 nengard     jcamins, i had that problem and i had to downgrade to an older version
20:39 cait        better chew on that
20:39 jcamins     cait++
20:39 nengard     and then after a few weeks tried upgrading back to the new one and it worked
20:40 jcamins     nengard: probably there was a broken version. I haven't used MarcEdit in a while.
20:43 jcamins     Hey, can I just pull a single record out of a binary MARC file?
20:43 nengard     um
20:43 jcamins     I seem to recall that it's safe to just cat a bunch of MARC records together.
20:43 nengard     i don't know
20:44 jcamins     Well, we'll find out soon. :)
20:44 hudsonbot   Starting build 142 for job Koha_Master (previous build: SUCCESS)
20:45 gmcharlt    jcamins: yes, it's safe
20:45 brendan     for marcedit - I just uninstalled and reinstalled with the new one and it worked great
20:48 jcamins     Well, if MarcEdit finishes downloading before I extract this record with my handy text editor, I'll do that. :)
20:50 jcamins     Hm. I don't think this record is 136kb.
20:52 jcamins     As it turns out, I finished at the same time as the download.
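
gmcharlt's "yes, it's safe" holds because ISO 2709 records are self-delimiting (each one ends with a record terminator, 0x1D), so a binary MARC file is just records laid end to end and concatenating them with cat is fine. Pulling one back out is usually easier with a throwaway script than a text editor; the sketch below uses MARC::Batch, with the file names and the 245 filter invented for the example.

    # Throwaway sketch: copy one matching record out of a binary MARC file,
    # writing it back out as raw ISO 2709. File names and the filter are made up.
    use strict;
    use warnings;
    use MARC::Batch;

    my $batch = MARC::Batch->new( 'USMARC', 'records.mrc' );
    $batch->strict_off;    # keep going past records with minor structural errors

    open my $out, '>:raw', 'one-record.mrc' or die $!;
    while ( my $record = $batch->next ) {
        next unless ( $record->subfield( '245', 'a' ) // '' ) =~ /cyrillic/i;
        print {$out} $record->as_usmarc;    # raw ISO 2709, safe to cat onto other records
        last;
    }
    close $out;
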
20:58 munin       New commit(s) kohagit: Bug 5385: POD Cleanups (part 2) <http://git.koha-community.org/gitweb/?p=koha.git;a=commitdiff;h=efa66f1f556dcff71779c9b89148f2bb99149e51> / Bug 5385: POD Cleanups (part 1) <http://git.koha-community.org/gitweb/?p=koha.git;a=commitdiff;h=9fa574f6097b8fc1eb9efc5321141ec0d23d3268>
21:04 hudsonbot   Project Koha_Master build #142: SUCCESS in 19 min: http://bugs.koha-community.org:8080/job/Koha_Master/142/
21:04 hudsonbot   Katrin Fischer: Bug 5112: Organisation does not show links to professionals
21:04 hudsonbot   Starting build 143 for job Koha_Master (previous build: SUCCESS)
21:07 hudsonbot   Project Koha_Master build #143: FAILURE in 2 min 37 sec: http://bugs.koha-community.org:8080/job/Koha_Master/143/
21:07 hudsonbot   * Andrew Elwell: Bug 5385: POD Cleanups (part 1)
21:07 hudsonbot   * Andrew Elwell: Bug 5385: POD Cleanups (part 2)
21:07 chris       ohh, best i go  check that out
21:08 chris       ah ha
21:08 chris       yay for unit tests
21:08 chris       Error:  Global symbol "@positionsForX" requires explicit package name at /var/lib/hudson/jobs/Koha_Master/workspace/C4/Barcodes/PrinterConfig.pm line 86.
21:08 chris       doing exactly what they are supposed to do
21:09 * chris     goes to fixinate
21:10 * larsw     waves a banner saying "unit tests ftw"
21:10 gmcharlt    bah
21:11 * gmcharlt  should have been more paranoid
21:11 chris       well i should run the tests locally first
21:11 * chris_n   does 'prove' before pushing
21:11 chris_n     always
21:11 chris_n     snap
21:11 chris       yup, my git hook does that for me, if i commit
21:11 chris       but i was lazy and didnt sign the merge, no commit
21:12 * chris     will make them run on merge too
21:13 cait        going to switch laptops
21:14 chris       interestingly enough
21:14 chris       that line wasnt changed by the patch
21:15 chris       i cant see how that file ever worked
21:16 chris       ahh
21:16 chris       yes i do
21:17 chris       or not
21:17 chris       hehe
21:18 chris       ahhh yeah
21:18 chris       -my @positionsForX; # Takes all the X positions of the pdf file.
21:18 chris       =head2 my @positionsForX;
21:18 jcamins     I was right.
21:18 chris       thats not the same thing :)
21:19 chris       mind you theres no reason those variables should be global anyway
21:19 jcamins     Saving the record results in the Unicode being normalized.
21:19 * chris     will fix the pod and the variables
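
In miniature, what the failing change did and what the fix looks like (the real file is C4/Barcodes/PrinterConfig.pm; this is only an illustration of the pattern, and as chris says the variables arguably should not be file-scoped globals at all):

    # The POD cleanup effectively replaced the declaration with documentation:
    #
    #   -my @positionsForX; # Takes all the X positions of the pdf file.
    #   +=head2 my @positionsForX;
    #
    # which leaves every later use of @positionsForX undeclared under
    # "use strict", hence the "Global symbol ... requires explicit package
    # name" failure above. The fix keeps documentation and declaration as two
    # separate things (POD directives must start in column 0 in the real file):
    #
    #   =head2 @positionsForX
    #
    #   Holds all the X positions on the PDF page.
    #
    #   =cut
    #
    my @positionsForX;    # the declaration itself, restored outside the POD
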
21:21 cait        re
21:21 chris       heya cait
21:22 cait        cleaned the fan today - seems to work better now
21:22 chris       excellent
21:23 nengard     chris_n
21:23 nengard     new label maker issue
21:23 nengard     i'm on head and when i select multiple titles from the search results on the add to batch
21:23 nengard     only one is added :(
21:23 chris       !hudson build koha_master now
21:23 hudsonbot   chris: job koha_master build scheduled now
21:23 hudsonbot   Starting build 144 for job Koha_Master (previous build: FAILURE -- last SUCCESS #142 39 min ago)
21:25 jcamins     Should we have a script included with Koha, say in the misc/migration_tools directory, that will resave records in order to correct Unicode normalization?
21:26 * chris_n   swats vainly at the horde of bugs swarming about
21:28 munin       New commit(s) kohagit: Bug 5385 - Fixing an error that crept in with the POD cleanup <http://git.koha-community.org/gitweb/?p=koha.git;a=commitdiff;h=bd0e9e92a9d272f21b6627b1ff106595ebd55224>
21:31 jcamins     Good night, #koha
21:38 munin       New commit(s) kohagit: Merge remote branch 'kc/new/bug_4305' into kcmaster <http://git.koha-community.org/gitweb/?p=koha.git;a=commitdiff;h=061e05ca97e58009753c95c57706c48ce28e86aa> / Bug 5385 - Fixing an error that crept in with the POD cleanup <http://git.koha-community.org/gitweb/?p=koha.git;a=commitdiff;h=9c6db56a3f807c819be5e724f423a212fa4b362c> / bug 4305: add a couple test cases for _isbn_cleanup <http://git.koha-community.org/gitweb/?p=koh
21:42 hudsonbot   Yippie, build fixed!
21:42 hudsonbot   Project Koha_Master build #144: FIXED in 19 min: http://bugs.koha-community.org:8080/job/Koha_Master/144/
21:42 hudsonbot   Chris Cormack: Bug 5385 - Fixing an error that crept in with the POD cleanup
21:43 hudsonbot   Starting build 145 for job Koha_Master (previous build: FIXED)
21:50 chris_n     robin++ # for reminding us that words have more than one meaning
21:59 robin       chris_n: :)
22:02 hudsonbot   Project Koha_Master build #145: SUCCESS in 19 min: http://bugs.koha-community.org:8080/job/Koha_Master/145/
22:02 hudsonbot   * Colin Campbell: Bug 4305 Improve code flow
22:02 hudsonbot   * Galen Charlton: bug 4305: add a couple test cases for _isbn_cleanup
22:02 hudsonbot   * Chris Cormack: Bug 5385 - Fixing an error that crept in with the POD cleanup
22:08 cait        re
22:33 chris       cfouts: the "expat is bad" advice came out of liblime a while back, and may no longer be true
22:33 chris       lemme find the emails
22:33 cfouts      Thanks. I would love to see!
22:35 chris       heh
22:36 chris       i can find me telling people to use it in 2006
22:36 chris       As for this error, I think installing
22:36 chris       XML::SAX::Expat will fix it
22:36 chris       so sometime after then, and between now, it became bad :)
22:39 chris       apparently an issue with diacritics
22:40 cfouts      do you remember a test case?
22:41 chris       do you have an arabic, or a chinese record (or even french)
22:43 chris       ah ha!!
22:43 chris       http://www.nntp.perl.org/group/perl.perl4lib/2006/05/msg2369.html
22:43 chris       my google fu is weak, it took me longer to find that than it should have :)
22:44 chris       i imagine those files no longer exist tho ;(
22:47 cfouts      no, surely not. I don't even know where "liblime.com" was hosted at that time.
22:47 chris       somewhere in dallas i think
22:47 chris       spry is ringing a bell
22:48 cfouts      yes, those are long gone
23:25 jcamins_a   Was someone asking for an Arabic record?
23:26 jcamins     Grr. ln:ara isn't working on this version. Forgot about that.
23:28 cfouts      well, I can't break it in the way Josh described, though his description was very vague regarding the Expat failure.
23:28 jcamins     cfouts: would you like a record in Cyrillic?
23:28 cfouts      thanks, yes
23:29 cfouts      I fudged one on my own, but a more authentic test case would be helpful
23:29 jcamins     cfouts: http://donum.numismatics.org/cgi-bin/koha/opac-detail.pl?biblionumber=175519
23:30 jcamins     If the download from the OPAC will do for you, there's that. If you need it exported, I can get that for you in five minutes.
23:30 gmcharlt    cfouts: I've confirmed that PurePerl is broken with respect to MARC::File::XML (not that that's relevant)
23:30 gmcharlt    cfouts: I've not thus far found any problems with using Expat
23:31 jcamins     You would think after we sent out a press release showing off the Arabic capabilities of DONUM that I would have some idea what record it was that had Arabic in it...
23:32 jcamins     This record uses macrons: http://donum.numismatics.org/cgi-bin/koha/opac-detail.pl?biblionumber=1081
23:36 cfouts      what's a macron?
23:37 jcamins     The bar over vowels used in Arabic transliteration.
23:37 chris       and Māori
23:37 jcamins     Yes, that too.
23:37 * jcamins   doesn't know how to type macrons, actually
23:38 chris       its a double vowel, or a long sound at least in te reo it is
23:42 jcamins     Okay, this is just absurd.
23:42 cfouts      http://treebeard.liblime.com/ctf/expat-marc-test.pl
23:43 jcamins     I cannot find the record with the Arabic.
23:43 cfouts      test script. contains diacritics, hiragana, arabic, hebrew, and accented latin chars
23:45 * jcamins   gives up and goes to figure out what to do about dinner.
23:45 chris       have u tried with plain Expat?
23:46 cfouts      no
23:46 chris       seems to work ok too, i wish we had that several.mrc file
23:47 chris       without it, its pretty hard to know exactly what was causing the error
23:47 cfouts      indeed
23:48 chris       AH HA
23:48 gmcharlt    found it?
23:48 chris       i can make Expat barf
23:48 chris       if i change the leader
23:48 chris       to say the record is marc8
23:48 chris       but give it utf8
23:49 chris       i wonder if that was the issue
23:49 cfouts      isn't that a feature?
23:49 chris       well it only sorta barfs
23:49 jcamins     cfouts: I would've thought so, but apparently Z39.50 doesn't reliably report encoding.
23:49 chris       Wide character in print at ./expat-marc-test.pl line 123.
23:49 chris       Wide character in subroutine entry at /usr/share/perl5/MARC/Charset/Table.pm line 96.
23:50 chris       prints the marc, asplodes trying to print the xml
23:50 cfouts      yeah, the ISO still comes out fine.
23:50 chris       you should be able to replicate if you change the 9th char from an a to space
23:50 chris       yeah
23:50 jcamins     chris: why can't we catch that and fall back to assuming it's MARC-8?
23:51 chris       cos it might be exploding for lots of reasons
23:51 gmcharlt    jcamins: MARC-8 is not the only game in town
23:51 chris       what he said
23:51 robin       chris: that can be fixed by telling Perl to output as UTF8, no?
23:51 cfouts      LibXML barfs then, too
23:51 robin       open my $fh, '>:utf8', $file
23:52 jcamins     gmcharlt: but those are the only two options for a MARC record, aren't they?
23:52 gmcharlt    jcamins: I wish
23:52 chris       robin: not really no, its silently dying on $record->as_xml();
23:52 robin       ah ok
23:52 gmcharlt    the reality is that there are at least half a dozen character encodings that one can run into
23:53 gmcharlt    just within the borders of USA
23:53 gmcharlt    and more legacy encodings if you look across the ponds
23:53 robin       I've started using Encoding::FixLatin to get things as UTF8-y as I can
23:54 * jcamins   clearly doesn't want to think about this
23:54 * jcamins   also needs to actually leave his computer so he can eat dinner
23:54 chris       but yeah if this is dying on LibXML it might not have been the original problem
23:55 gmcharlt    chris: I'm pretty sure it isn't
23:55 gmcharlt    yet another glitch with MARC::Charset, not the one that kados was running into back then
23:55 chris       it is one that causes the error
23:55 chris       that was reported on the mailing list
23:56 chris       yesterday i think
23:56 gmcharlt    yeah, but needing to make sure that the Leader/09 for MARC21 UTF-8 records is set correctly is a known issue
23:57 chris       2. When I importing a record using z39.50, after clicking the Save and
23:57 chris       view record button, I get the message "Tag "" is not a valid tag. at
23:57 chris       /usr/share/koha/lib/C4/Biblio.pm line 1863"
23:57 chris       those kinda errors are the ones you run into when you tell z3950 that you are expecting utf8 and it gives you marc8
23:58 chris       gmcharlt: yep