IRC log for #koha, 2014-05-09

All times shown according to UTC.

Time S Nick Message
00:07 dcook @$#%^#
00:07 huginn dcook: I'll give you the answer just as soon as RDA is ready
00:13 tcohen joined #koha
00:13 dcook o/ tcohen!
00:14 tcohen \o
00:14 dcook Back home?
00:15 tcohen work + meetings + tennis -> home
00:15 dcook Sounds like an all right day
00:15 dcook I was wondering if you were back from your honeymoon ;)
00:15 dcook I only saw some photos but it looked interesting!
00:16 tcohen yes, i got back two weeks ago
00:17 tcohen it was a 28 day trip in europe
00:18 tcohen rangi: i don't think the PPAs should replace our repo, I was thinking more of the way they work and expect us to send our stuff
00:19 tcohen i might set my own just for testing purposes
00:19 rangi its a lot faster to just build your own packages using build-git-snapshot and pbuilder
00:19 rangi with a ppa you have to wait for it to get built, and it can be in the queue for hours
00:20 dcook tcohen: Well, welcome home belatedly!
00:20 dcook Is this Debian's introduction of PPAs?
00:20 rangi nope
00:21 tcohen i just noticed that the ppa's expect us to provide patches specific for each dist build
00:21 rangi only where needed
00:21 tcohen so for the same codebase, one provides the patches for (for example) a specific apache version
00:22 rangi yeah thats the same with any package
00:22 tcohen i already have the patches for it to work with either version
00:22 tcohen so i don't need that
00:22 rangi thats not specific to ppa
00:22 tcohen exactly
00:22 rangi if you are building for unstable, you should make it work with the unstable apache
00:22 tcohen i was just commenting that another approach (rather than supporting everything in the same code) is used too
00:23 rangi yep
00:23 rangi however that doesnt help the people using tar
00:23 * dcook perks up
00:23 tcohen i just didn't know it before i read the ppa docs
00:23 rangi so if we can do it, we should
00:23 rangi ahh right :)
00:23 dcook As a non-package user, I'm very interested
00:23 rangi you can do it with quilt quite easily
00:24 tcohen quilt?
00:24 rangi i was fixing debian bugs last weekend with debian/patches for stuff, using quilt
00:25 rangi http://pkg-perl.alioth.debian.[…]/howto/quilt.html
00:25 tcohen damn rangi, more stuff to read before bed!
00:25 tcohen heh
00:25 rangi you usually do this
00:25 rangi when you arent the upstream
00:25 rangi say if i was packaging koha, but wasnt a developer of koha
00:26 rangi i would add the changes i need as a debian patch ..
00:26 rangi since we are upstream as well, we dont have to do that, we can edit the file ourselves etc
00:27 rangi but it doesnt hurt to know about it
00:27 rangi maybe i could do a packaging tutorial or something at kohacon
00:28 tcohen :-D
00:28 rangi since we want/need to package upstream perl modules sometimes too
00:28 tcohen btw, how do i ask Debian perl maintainers to update some lib I maintain?
00:29 rangi the best thing to do
00:29 wahanui it has been said that the best thing to do is make the password VERY secure
00:29 rangi
00:29 tcohen thanks rangi
00:29 rangi its a very low barrier to entry, so you could even join the debian perl group and update the modules yourself
00:30 * tcohen hopes he finds the time to publish Memoize::Memcached v1.00 soon
00:35 tcohen rangi, if you have some time would you take a look at[…]g_11404_apache2.4 ?
00:36 tcohen i've been focused on the packages
00:36 rangi yep ill have a look after lunch
00:36 tcohen thanks!
00:36 tcohen there might be some glitches
00:38 t1 joined #koha
00:40 tcohen dcook, as a tar user, do u think you could benefit from bug 11962?
00:40 huginn Bug[…]_bug.cgi?id=11962 enhancement, P5 - low, ---, tomascohen, NEW , New 'cluster' install mode
00:49 rocio left #koha
00:51 bshum joined #koha
01:03 * dcook is having a mega rant on Bugzilla it seems
01:03 dcook Sorry for spamming everyone
01:04 dcook @later tell tcohen Possibly, although we use a non-standard FS layout, so probably not for us. Seems like a worthwhile goal overall though!
01:04 huginn dcook: The operation succeeded.
01:08 tcohen joined #koha
01:09 tcohen networking issues
01:09 tcohen night #koha, pizza time :-D
01:09 dcook Mmm pizza
01:09 dcook night tcohen!
01:10 rangi you've ranted more than it would take to write a patch :)
01:11 dcook Hehe. I don't know about that.
01:11 dcook Maybe to write the patch. Probably not the test plan though.
01:12 dcook I don't really see the purpose of that syspref anyway.
01:12 dcook I think Owen was bang on about using the existing pref from the start
01:12 * dcook hasn't had breakfast so he might be a bit ornery
01:12 dcook I have a few re-writes to the XSLT that will come up eventually anyway..
01:13 dcook Thought I'd make folks aware in any case :p
01:14 dcook Actually, while you're around rangi, can you shed some light on the link tracking?
01:14 rangi yep
01:14 dcook So you set the pref, and then you get the data using your own SQL queries from scratch?
01:14 rangi yep
01:15 dcook Cool. Nothing else involved?
01:15 rangi nope
01:15 dcook Sweet as. Thanks :)
01:15 rangi just select * from linktracker
01:15 rangi and whatever conditions you want
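[Editor's note: rangi's point is that the link tracker has no reporting UI; you query the table yourself. A minimal sketch of that kind of query, using an in-memory SQLite stand-in — Koha's real linktracker table lives in MySQL, and the column names below are assumptions for illustration only.]

```python
# Sketch of "select * from linktracker ... and whatever conditions you want".
# SQLite stand-in; Koha's actual table is MySQL and these columns are assumed.
import sqlite3

conn = sqlite3.connect(":memory:")
conn.execute("""
    CREATE TABLE linktracker (
        borrowernumber INTEGER,  -- hypothetical column names
        biblionumber   INTEGER,
        url            TEXT,
        timeclicked    TEXT
    )
""")
rows = [
    (1, 42, "", "2014-05-01"),
    (2, 42, "", "2014-05-08"),
    (1, 77, "", "2014-05-09"),
]
conn.executemany("INSERT INTO linktracker VALUES (?, ?, ?, ?)", rows)

# A plain SELECT with a condition: click counts for one bib record.
clicks = conn.execute(
    "SELECT url, COUNT(*) FROM linktracker WHERE biblionumber = ? GROUP BY url",
    (42,),
).fetchall()
print(clicks)  # [('', 2)]
```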
01:16 dcook Sounds good. I'm working on our new Koha, so trying to combine all the changes..
01:16 rangi sweet
01:17 dcook Now I'm remembering why we made a change :p
01:17 dcook It groups and deduplicates bib level and item level links, so that you can get them in the results as well..
01:17 dcook All the patches...
01:17 * dcook is pretty sure he doesn't even sound coherent anymore
01:21 dcook Actually, I guess those first few posts would be an easy patch..
01:22 dcook Well, maybe they all are.
01:22 * dcook takes a cupcake break.
01:34 dcook joined #koha
02:01 dcook rangi++
02:01 dcook That link tracker is a nice touch
03:27 eythian hi
03:28 * dcook waves
03:33 * eythian once again has a valid first aid qualification
03:34 eythian (actually, I don't know that my old one expired, but it was getting close.)
03:34 wizzyrea I feel ok having a heart attack now.
03:34 wizzyrea j/k
03:34 dcook Ack!
03:34 eythian yeah, we talked about rangi :)
03:34 dcook Gotta make sure someone is around to keep rangi in working order :p
03:35 rangi heh
03:45 eythian the koha list definitely seems to be playing up.
03:45 eythian Something I sent on wednesday appeared to only arrive this morning.
03:51 eythian[…]s-smart-must-see/ <-- honey badgers
04:55 cait joined #koha
05:07 cait hi #koha
05:50 dcook yo cait
05:58 cait hi dcook
05:59 cait @wunder Konstanz
05:59 huginn cait: The current temperature in Taegerwilen, Taegerwilen, Germany is 12.7°C (7:55 AM CEST on May 09, 2014). Conditions: Light Drizzle. Humidity: 90%. Dew Point: 11.0°C. Pressure: 30.06 in 1018 hPa (Steady).
06:01 laurence joined #koha
06:41 magnuse joined #koha
06:42 * magnuse waves
06:51 sophie_m joined #koha
06:52 reiveune joined #koha
06:52 reiveune hello
06:52 wahanui bonjour, reiveune
06:58 magnuse hiya sophie_m and reiveune
06:58 reiveune hi magnuse
06:58 sophie_m hello magnuse and #koha
07:01 alex_a joined #koha
07:02 alex_a bonjour
07:06 sophie_m khall_away: I only received your notification on bug 9593 now, sorry for not answering before, but I see that it is pushed now
07:06 huginn Bug[…]w_bug.cgi?id=9593 normal, P5 - low, ---, sophie.meynieux, Pushed to Master , Prices not imported correctly from a staged file
07:21 papa joined #koha
07:48 paxed magnuse: re. trimming placeholders from po-files, see bug 12221
07:48 huginn Bug[…]_bug.cgi?id=12221 enhancement, P5 - low, ---, pasi.kallinen, NEW , Remove TT statement placeholders from translatable strings
07:54 cait joined #koha
07:54 cait ping gmcharlt
07:57 ashimema morning cait
07:57 cait morning
08:00 cait ashimema: read your pm
08:15 papa joined #koha
08:22 papa joined #koha
08:24 papa joined #koha
08:32 ashimema :)
10:25 cait quiet freiday
10:26 cait hm friday
10:42 atheia joined #koha
10:52 ashimema super quiet
10:58 * magnuse hums "it's oh so quiet" almost inaudibly
11:00 barton ping ashimema
11:03 ashimema hi
11:05 barton morning khall!
11:05 khall mornin all!
11:06 khall left #koha
11:06 khall joined #koha
11:09 barton @later tell ashimema there's a broken link on your 'adding a link to zebra' page: 'Updating Koha Install' does not exist on the koha wiki.
11:09 huginn barton: The operation succeeded.
11:10 ashimema I'm here barton ;)
11:11 * barton shakes his head...
11:11 barton not enough coffee yet.
11:12 ashimema just in a call.. will reply in a minute
11:14 barton so... I've been working through your instructions for updating zebra indexes... you suggest working in kohaclone, then running the 'koha update procedure'... but the link to that is broken.
11:20 * barton has coffee now... Coffee: |==>|................| 20 % <== this means that I'm functional ;-)
11:22 magnuse yay! :-)
11:22 barton coffee++
11:23 barton morning magnuse :-)
11:25 magnuse howdy barton
11:26 barton @wunder oslo
11:26 huginn barton: The current temperature in Kikut, Oslo, Norway is 6.9°C (1:16 PM CEST on May 09, 2014). Conditions: Clear. Humidity: 69%. Dew Point: 2.0°C. Windchill: 6.0°C. Pressure: 29.62 in 1003 hPa (Steady).
11:27 barton heh.
11:29 barton I don't think that my irssi 'ring the bell when my name shows up' plugin likes unicode.
11:29 barton hence the dbus message above.
11:30 barton Ironically, it's never actually notified me of anything, either.
11:31 ashimema ok barton, I've lifted the relevant page from our internal wiki now.
11:31 barton aha!
11:32 * barton refreshes wiki/MRenvoize/zebra ...
11:32 ashimema The command set in there is based on how we install koha though..
11:32 barton et voila!
11:33 barton nod.
11:33 ashimema which is a dev install using git etc.. so you may need to alter some paths etc..
11:33 ashimema the important bit is running through the makefile, make and make upgrade steps.. then rebuilding zebra.
11:33 barton right... I think that we do basically the same...
11:33 ashimema else your changes won't take effect
11:33 ashimema :)
11:34 barton ... but we're also moving toward a gitified package install.
11:34 ashimema yeah, we've been doing that..
11:34 barton but this gives me somewhere to start.
11:34 ashimema in fact, we're now building our own repositories and going the whole package approach
11:35 ashimema gitified could be interesting.. as managing the zebra config will be a tad different in terms of all the locations..
11:35 barton right.
11:36 ashimema may be worth looking at:[…]_bug.cgi?id=12216
11:36 huginn Bug 12216: enhancement, P4, ---, martin.renvoize, Needs Signoff , One should be able to override zebra configuration on a per instance basis
11:37 ashimema and of course.. your bug 11910
11:37 huginn Bug[…]_bug.cgi?id=11910 enhancement, P5 - low, ---, gmcharlt, NEW , Adding a zebra index should be a configuration change.
11:37 ashimema it's the last missing step really ;)
11:37 barton exactly.
11:38 barton well, thanks for the update... that will give larrby, khall and me something to talk aobut today :-)
11:39 ashimema should be sort of straightforward to add a syspref or config option to enable simple addition of indexes to koha's array..
11:39 ashimema but I don't really have the time to do it at the moment.
11:39 barton who does?
11:39 barton ;-)
11:39 ashimema lol
11:40 barton ... although frankly, I've got a pile of zebra index tickets that I'm going to be able to close real soon now...
11:40 barton oh, one more question.
11:40 ashimema My approach would be to append an array (based upon a directive in koha-conf.xml) to the indexes array.. I'm sure khall could knock that up in minutes ;)
11:40 ashimema go ahead
11:40 khall indeed!
11:41 ashimema haha.. sorry khall.. volunteering you ;)
11:41 barton oh, he's used to it.
11:41 khall np! ; )
11:41 ashimema (I'll sign it off though.. ;) )
11:41 khall that is exactly what we've discussed doing as well
11:42 barton your 'Additional steps for DOM indexing...' section -- you've got a couple of lines that get added to '~/kohaclone/etc/zebradb/marc_defs/marc21/biblios/biblio-koha-indexdefs.xml'
11:42 ashimema along with the packages patch I've put together it'll allow proper customisation of zebra config all over, I believe
11:42 ashimema mmm.
11:42 khall it would also be nice to have a system to add new indexes to the advanced search pulldown, but that's a different ball of wax. We just do it with jquery at the moment
11:43 ashimema yeah, that's true.
11:43 cait ping gmcharlt
11:43 ashimema zebra is a complex beast
11:44 barton I've been testing your process against 3.14.05 -- I don't think that my biblio-koha-indexdefs.xml ends up looking like yours...
11:44 ashimema barton: those additional steps are highly frowned upon by Jared.. you have been warned ;)
11:45 ashimema basically, I run through step 2 (using scripts to create new dom configs) having taken backups of the existing ones..
11:45 barton ok, I think I'll skip them...
11:45 ashimema then I just diff the backup and the newly created ones.. and make sure I've not accidentally dropped anything from the backup that I didn't want to
11:45 * cait waves
11:46 * barton waves at cait.
11:46 ashimema problem is.. those scripts a) create a massive diff due to the comments with line numbers, and b) grs1 and dom haven't been kept in sync well generally.. so they are often more different than you may expect
11:47 barton hm.
11:47 cait barton: we also go the full package approach, i think there were some security concerns about git installs in the past
11:47 khall hi cait!
11:47 cait hi khall
11:48 ashimema Are you suing Dom barton?
11:48 cait why would he sue the dom?
11:48 barton using yes, suing, no ;-)
11:48 ashimema if you are, you'll deffo need a hacked up version of those additional steps ;)
11:48 meliss joined #koha
11:48 ashimema haha.. classic typo
11:48 cait ... it was just too tempting :)
11:52 barton ok. I've got my spear and magic helmet ...
11:52 cait good luck :)
11:52 barton kill the zebra, kill the, kill the ZEEEE-BRA!!!
11:53 barton (apologies to Elmer Fudd)
11:53 ashimema ok, barton, I've updated the instructions a bit to be more generic
11:53 * cait prefers taming it
11:53 ashimema Jared still won't like them, as he goes on about never ever using those scripts to generate configs again..
11:54 ashimema but I'm no pro in writing the DOM config manually.. so I use them and deal with the consequences
11:54 barton ashimema++
11:54 ashimema jcamins, surprised you've not shot me down in flames yet ;)
11:55 jcamins ashimema: if you want to do something stupid after you *know* it's stupid, I don't really care... it's just when there's a possibility that you're doing something dumb because you don't know any better that I'll point it out.
11:55 barton maybe jcamins hasn't had his coffee yet, either.
11:55 ashimema hehe.. thanks jcamins.. that's reasonable..
11:55 jcamins Also, I've already pointed out to Barton that doing that is a Bad Idea, so there's no one involved in the conversation who might leave with incorrect ideas.
11:56 * barton stands corrected.
11:56 ashimema I'm still stuck on getting QueryParser working with OpacSuppression though..
11:56 ashimema eythian was having similar issues with getting elasticsearch working with suppression
11:56 ashimema suppression is plain badly implemented
11:56 ashimema hehe :)
11:57 jcamins You said it!
11:57 ashimema I really ought to take another look at that bug.. we simply can't use QP until that's fixed :'(
11:57 barton um, jcamins ... actually I *don't* remember you pointing out to me that this was a bad idea...
11:58 ashimema he did.. when I first added that page to the wiki for you ;)
11:58 ashimema I remember ;)
11:58 jcamins barton: really? I could've sworn you were part of the conversation when ashimema added the page and I said "STOP!!!! DON'T FOLLOW THOSE DIRECTIONS UNTIL YOU KNOW THAT YOU SHOULDN'T REGENERATE THE DOM CONFIGS!!!"
11:59 ashimema hehe.. yup.. that's how I remember it
12:00 barton hm. that slightly rings a bell ...
12:00 ashimema which reminds me.. I'd like a patch to ditch all the comments in those dom files at some point..  if we don't want them relating to grs anymore.. then they don't need to be there.. they're just confusing
12:00 jcamins Agreed.
12:00 ashimema :)
12:03 barton so ... is there someplace that I could read up on regenerating DOM configs?
12:03 jcamins gmcharlt did write some documentation, but I don't remember where it is. It's a pretty straightforward format.
12:05 ashimema I'm sure it's not too bad.. but I've been trying to keep the configs in sync whilst we support both.. and i've found the easiest way was to regenerate and then diff across the 'special' bits for dom
12:06 francharb joined #koha
12:08 barton ok, that makes sense, more or less.
12:09 ashimema joined #koha
12:10 ashimema ack.. my IRC just crashed..
12:10 ashimema did I miss anything
12:10 jcamins ashimema: yeah, there was a Transcendent Moment of Understanding.
12:12 ashimema :)
12:12 ashimema glad to hear it ;)
12:14 collum joined #koha
12:15 oleonard joined #koha
12:17 oleonard Hi #koha
12:18 cait hi oleonard :)
12:22 kmlussier joined #koha
12:24 ashimema joined #koha
12:25 tcohen joined #koha
12:28 tcohen morning #koha
12:28 ashimema morning tcohen
12:29 tcohen hi dcook
12:29 tcohen hi ashimema
12:37 markvandenborre joined #koha
12:38 francharb joined #koha
12:46 tcohen hi magnuse
12:59 barton ashimema: If you fix the ~/kohaclone/etc/zebradb/marc_defs/marc21/biblios/biblio-koha-indexdefs.xml before you run xsltproc, that basically takes care of ~/kohaclone/etc/zebradb/marc_defs/marc21/biblios/biblio-zebra-indexdefs.xsl ... the only changes that I had there were the ones that I wanted.
13:00 barton I think that it's worth doing a diff after the fact, just to make sure...
13:00 ashimema sounds fair to me barton
13:01 barton i'll update your wiki page.
13:01 ashimema it's a while since i've actually followed the instructions to be honest.. I understand it just about well enough to fly by the seat of my pants now ;)
13:01 ashimema :)
13:01 ashimema feel free to 'publish' it somewhere more sensible if you like.
13:02 ashimema I left it under my username whilst it was a bit crude.. but it sounds like you've got a lot of it tidied up nicely now..
13:03 barton Yeah, I will, once I've been through it once or twice. I'm still learning...
13:06 markvandenborre I have a library of 2702 pieces of sheet music
13:07 markvandenborre right now, the only metadata we have is in iTunes (don't ask)
13:07 markvandenborre the only fields are: title, code, level (string), various (string), instrument, composer
13:07 markvandenborre and pdf document
13:08 markvandenborre is there an easy and clean way to import these into koha?
13:08 markvandenborre without wading through piles and piles of metadata fields?
13:09 markvandenborre the goal would be to upgrade this data set in the long term with some extra fields (publisher, ...)
13:10 markvandenborre but to keep things as simple as possible in the short term, and to have something up and running
13:10 markvandenborre asap
13:10 jcamins markvandenborre: you'll need to convert your records into MARC format. You'll probably need to write a script to do that, though if you can get a CSV out of iTunes, you might be able to make use of that.
13:11 jcamins There is already a script for converting CSVs to MARC.
13:12 markvandenborre jcamins: I have pulled some csv out of iTunes (don't ask how, it was horrible)
13:12 jcamins markvandenborre: in that case...
13:12 wahanui in that case is preferable thar each text be inside some tag, then it will be picked alone, and not with all %s mess, they are left and  removed by Bug 11631
13:12 huginn Bug[…]_bug.cgi?id=11631 enhancement, P5 - low, ---, pasi.kallinen, Pushed to Master , Make translation toolchain ignore useless translatable strings
13:12 jcamins koha tool box?
13:12 jcamins koha toolbox?
13:12 jcamins toolbox?
13:12 jcamins migration toolbox?
13:12 wahanui well, migration toolbox is
13:13 jcamins ^^ there's a csv to marc script in the migration toolbox.
13:13 oleonard wahanui: forget in that case
13:13 wahanui oleonard: I forgot in that case
13:14 markvandenborre but don't I need to define an item type for this kind of thing first?
13:16 Dyrcona joined #koha
13:16 jcamins Possibly. That's a policy decision. How you want to handle item types, etc.
13:17 markvandenborre jcamins: I was hoping to get a better idea of minimal steps to take
13:18 markvandenborre there seem to be quite a few layers of metadata in koha (probably rightly so for many situations)
13:18 jcamins If I had scanned sheet music, all I would do is convert the CSV into MARC and load it into Koha.
13:18 jcamins I would not add any items.
13:19 markvandenborre ok, I'll try and do that first then
13:19 markvandenborre the point of course being that all of these will need to be edited (pdf url field needs to be customised)
13:19 jcamins But whether that's what you want to do is a policy decision. If you have multiple branches, or the sheet music exists somewhere physically as well, you might want items.
13:20 markvandenborre jcamins: no, we don't want that (for the moment)
13:20 markvandenborre there is only one branch
13:20 markvandenborre the main one
13:20 markvandenborre and we don't want the physical copies of the sheet music to be used
13:21 SherryS joined #koha
13:21 markvandenborre jcamins: /me is looking at the conversion scripts
13:25 SherryS left #koha
13:30 tcohen joined #koha
13:31 markvandenborre jcamins: so I run the script, and I get a text file that I can import somehow into koha...
13:31 jcamins It generates MARC records which you can load through the Stage MARC Record tool.
13:32 jcamins You'll have to read the documentation for that script, because you have to do all the mapping on the command line.
13:35 mveron joined #koha
13:35 mveron Hi #koha
13:36 barton ashimema: I just made those changes; also I added a 'REGENERATE THE DOM CONFIGS' step -- there should be a link there to any documentation that we have about that.
13:36 barton morning mveron
13:38 mveron Good afternoon barton :-)
13:38 mveron @wunder allschwil
13:38 huginn mveron: The current temperature in Wetter Allschwil, Allschwil, Switzerland is 22.1°C (3:38 PM CEST on May 09, 2014). Conditions: Mostly Cloudy. Humidity: 55%. Dew Point: 13.0°C. Pressure: 30.06 in 1018 hPa (Steady).
13:38 markvandenborre jcamins: thank you for the hints
13:51 JesseM joined #koha
13:52 rocio joined #koha
13:55 francharb joined #koha
14:05 khall I'm getting bad search results on master:
14:13 mveron khall: See Bug 9612 - SRU Response is different when DOM indexing is enabled . A patch  that fixes this issue is pushed.
14:13 huginn Bug[…]w_bug.cgi?id=9612 minor, P5 - low, ---,, Needs Signoff , SRU Response is different when DOM indexing is enabled
14:14 khall thanks!
14:16 mveron Sorry, my mistake - it has "Needs Signoff" as status.
14:17 oleonard khall: And then you can sign off :)
14:19 * mveron ...thinks that it could be set to critical or higher...
14:24 Callender_ joined #koha
14:25 logbot joined #koha
14:25 Topic for #koha is now Koha 3.16-beta is now available. Next general meeting is 3 and 4 June 2014 at 22:00 and 15:00 UTC. Welcome to the IRC home of Koha Please use for pastes.
14:27 halcyonCorsair joined #koha
14:42 tgoat joined #koha
14:45 rambutan @wunder 64507
14:45 huginn rambutan: The current temperature in Wyatt Park, St Joseph, Missouri is 13.4°C (9:45 AM CDT on May 09, 2014). Conditions: Clear. Humidity: 64%. Dew Point: 7.0°C. Pressure: 29.91 in 1013 hPa (Rising).
14:54 huginn New commit(s) kohagit: Bug 12214: (follow-up) Clean up, show Edit link when SQL has... <[…]07dfc24beab0b88dd> / Bug 12214: (follow-up) correct POD of C4::Reports::Guided::execute_query() <[…]d3976ec0f865bb370> / Bug 12214: display SQL errors in reports to users
14:58 laurence left #koha
15:01 jenkins_koha Starting build #1750 for job Koha_master (previous build: SUCCESS)
15:02 cait nice
15:05 tcohen hi cait
15:05 reiveune bye
15:05 reiveune left #koha
15:06 cait hi tcohen
15:14 markvandenborre hm, anyone used before?
15:14 markvandenborre I have all dependencies installed, and I am quite certain I am calling it in the right way
15:14 markvandenborre but for some reason it complains about
15:15 markvandenborre Missing column specified in the mapping: titel doesn't exist in migrate.csv
15:15 markvandenborre while the first line of that file says:
15:15 markvandenborre code;titel;graad;stuk/studie/methode (string);instrument;componist (string);
15:16 markvandenborre jcamins: any idea?
15:16 markvandenborre I'm not a perfect perl reader
15:16 jcamins markvandenborre: I can't really troubleshoot, but if I had to guess, I'd say you might not have changed the separator character.
15:17 markvandenborre it doesn't seem to tell what separator it expects, but indeed...
15:17 markvandenborre jcamins: I was hoping someone would be able to tell me what separator it was expecting
15:17 jcamins Comma.
15:17 markvandenborre _ah_
15:17 markvandenborre brilliant, thank you
15:18 jcamins CSV is generally pronounced "comma-separated values," even though technically it stands for "character-separated values."
15:20 markvandenborre I tried to avoid using commas since some book titles tend to contain them...
15:21 jcamins Yeah, that's why the best thing to do is change the separator character the script is using.
15:23 markvandenborre just did that
15:23 markvandenborre thx for the hint
15:23 markvandenborre it seems to be working
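[Editor's note: the failure markvandenborre hit is easy to reproduce with Python's csv module — with the default comma delimiter, his semicolon-separated header comes back as a single field, so no "titel" column exists to map. A sketch only; the actual migration-toolbox script is Perl, and the channel's fix was to change the separator it uses.]

```python
import csv
import io

# The first line of markvandenborre's migrate.csv, as pasted above.
header = "code;titel;graad;stuk/studie/methode (string);instrument;componist (string);"

# Default delimiter is a comma: the whole line parses as one field,
# which is why the mapping for column "titel" couldn't be found.
comma = next(csv.reader(io.StringIO(header)))
print(len(comma))  # 1

# Telling the parser the data is semicolon-separated splits it properly.
semi = next(csv.reader(io.StringIO(header), delimiter=";"))
print(semi[:3])  # ['code', 'titel', 'graad']
```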
15:24 huginn New commit(s) kohagit: Bug 12065: use encode_qp consistently when sending basket/shelf <[…]5ad604c446a50f751>
15:41 NateC joined #koha
16:02 thd-away joined #koha
16:09 markvandenborre I'm looking through all kinds of metadata fields and I'm not really sure
16:09 markvandenborre how to map these very basic fields from csv to marc
16:10 markvandenborre title, code, level (string), various (string), instrument, composer and (internal) url
16:10 markvandenborre the tool gives the example of title as something to map
16:10 markvandenborre but I'm not 100% where to find this
16:11 talljoy markvandenborre i've used it before and eythian is the one who wrote it, if i'm not mistaken.
16:11 markvandenborre I understand the syntax, and it's working correctly
16:11 markvandenborre but I don't know the exact name of the marc side of mappings of the fields
16:11 markvandenborre so I have a composer
16:11 markvandenborre header in the csv file
16:12 markvandenborre but I don't know the name of the marc field to map this to
16:12 markvandenborre I have a url field and I have been looking into koha
16:12 talljoy oh, that is cataloging question.
16:12 markvandenborre and found something like 856u there
16:12 markvandenborre that stands for url
16:13 gmcharlt are your records all works of musics?
16:13 talljoy yes your url would go in the 856u
16:13 talljoy   is a great resource for marc tags
16:13 markvandenborre gmcharlt: all sheet music, correct
16:14 gmcharlt then for the easy ones
16:14 gmcharlt title = 245a
16:14 gmcharlt composer = 100a
16:14 talljoy additional composers/lyricists 700a
16:16 markvandenborre so I just use these numbers that I can find in the url that talljoy just gave
16:16 markvandenborre ?
16:16 markvandenborre as mappings in
16:17 markvandenborre ok
16:18 markvandenborre 100a doesn't appear to be a valid Koha field name.
16:18 markvandenborre is what I get...
16:19 markvandenborre so I'm not entirely sure yet where I can find a list of destination mappings
16:19 oleonard khall: So online registration requires that  autoMemberNum be turned on, otherwise patrons who register online will be created with no barcode number
16:19 markvandenborre for use with this koha migration toolbox script
16:19 oleonard khall: I think this is incorrect.
16:21 markvandenborre somehow, this script "knows" what "title" means
16:22 markvandenborre and running with only the 'titel=title' mapping results in something like
16:23 gmcharlt hmm; there are a few scripts of that name running around
16:23 gmcharlt could you point to the one you're using specifically?
16:23 cait i think you need to tell it the field it goes to - 245a somewhere
16:23 markvandenborre correct, just found information about that in the source of the script
16:23 markvandenborre marc:100a
16:24 cait german records? :)
16:24 markvandenborre German sheet music
16:24 cait nice
16:24 markvandenborre or, to be correct, German sheet music records
16:24 markvandenborre :-)
16:24 talljoy this is an example of one of my mapping lines in the csvtomarc script
16:24 talljoy --mapping=SHOW=marc:245_a?
16:25 talljoy SHOW is the column heading and it is mapped to the marc tag 245 a.
16:40 markvandenborre talljoy: thx, just found that out too
16:40 markvandenborre by RTFS (reading the ....... source)
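[Editor's note: talljoy's --mapping=SHOW=marc:245_a? example pairs a CSV column heading with a MARC tag and subfield. A rough Python illustration of what applying such mappings amounts to, using the field advice from earlier in the channel (title → 245$a, composer → 100$a, url → 856$u). The dict-of-dicts "record" is a simplified stand-in, not real ISO 2709 MARC, and the sample data is invented.]

```python
import csv
import io

# Column -> (tag, subfield) mappings, in the spirit of
# --mapping=titel=marc:245_a; tags per the channel's advice.
mappings = {
    "titel": ("245", "a"),               # title
    "componist (string)": ("100", "a"),  # composer
    "url": ("856", "u"),                 # link to the PDF
}

# Invented sample data in markvandenborre's semicolon-separated layout.
data = "titel;componist (string);url\nSonate Nr. 1;Bach;\n"
reader = csv.DictReader(io.StringIO(data), delimiter=";")

records = []
for row in reader:
    record = {}  # simplified stand-in for a MARC record
    for column, (tag, subfield) in mappings.items():
        record.setdefault(tag, {})[subfield] = row[column]
    records.append(record)

print(records[0]["245"]["a"])  # Sonate Nr. 1
print(records[0]["856"]["u"])  #
```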
16:44 sophie_m left #koha
16:46 markvandenborre how would you map an existing unique code to marc?
16:47 jcamins markvandenborre: there's pretty extensive Perldoc, as I recall. But the actual fields you'll have to figure out by talking to your catalogers (or if you're the cataloger, by looking at the standard).
16:47 markvandenborre the code is just a label that was (inconsistently)
16:47 markvandenborre applied locally to the book
16:47 markvandenborre it wasn't applied to every piece of sheet music
16:47 jcamins Well, that could be an argument for creating items, so you could use the call number field.
16:48 markvandenborre ah, an item is a local "instance" of a particular book?
16:48 jcamins Exactly.
16:49 jcamins Well... usually.
16:50 markvandenborre ...and presumably, there is no way to keep that local information
16:50 markvandenborre unless I create items then, right?
16:50 jcamins You can use any fields for local information, if you're not sharing your records.
16:51 markvandenborre ...but that would rather be abusing fields
16:51 markvandenborre ?
16:51 jcamins If you're not a library cataloger, you won't care, so why would anyone else?
16:52 * jcamins takes a very cavalier attitude toward these things.
16:52 jcamins The MARC standard should have been retired >20 years ago. Just making it work is enough trouble.
16:52 markvandenborre so let me take an example... instrument
16:52 oleonard We should really stop letting people believe that Koha can restrict the number of holds a patron can place.
16:52 bag_away oleonard++
16:53 jcamins markvandenborre: there may be a field for that sort of thing. There probably is, in the 5xx fields.
16:53 markvandenborre jcamins: I had already figured that out more or less
16:54 jcamins Okay.
16:55 markvandenborre koha really feels like a very friendly community around really ancient technology
16:55 jcamins That's pretty accurate.
16:55 oleonard ...assuming you're talking about MARC and the holds system.
16:56 markvandenborre ...and perl
16:56 * markvandenborre ducks
16:56 markvandenborre also, about really nice people like jcamins or cait or talljoy or ...
16:56 oleonard Then computers are ancient too, but that doesn't mean you have to use the oldest one.
16:57 * markvandenborre has looked at 5xx and not really found anything useful there
16:58 talljoy happy to help markvandenborre
16:59 markvandenborre jcamins: or do you mean like that is a place where one could use an unrelated field
16:59 markvandenborre (or abuse an unrelated field)
17:00 gmcharlt the 500a (general note) is a good catchall
17:00 ashimema perl's still alive and kicking.. but feel free to re-write koha in python if you have the inclination ;)
17:00 markvandenborre cool
17:00 ashimema :)
17:01 jcamins markvandenborre: I meant there was a good chance a field would already exist, and if not, you can use a related field, or, if there isn't even one that's similar, 500 is good for everything.
17:01 markvandenborre I just dug in more deeply
17:01 markvandenborre and saw what you meant
17:01 markvandenborre thx
17:03 markvandenborre gmcharlt: you mean 500n (misc info) instead of 500a (uniform title) I presume?
17:03 gmcharlt 500a
17:03 jcamins markvandenborre: where are you located?
17:03 gmcharlt @marc 500$a
17:03 huginn gmcharlt: unknown tag 500$a
17:03 gmcharlt @marc 500 a
17:03 huginn gmcharlt: General note
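The 500 $a "general note" catchall that gmcharlt and huginn point to looks like this in MARCXML (the format cait later calls easier to read). A minimal sketch using only the Python stdlib; the note text and helper name are made up for illustration, not Koha code:

```python
# Minimal sketch: appending a 500 $a (general note) datafield to a
# MARCXML record using only xml.etree. Hypothetical helper, not Koha code.
import xml.etree.ElementTree as ET

MARCXML_NS = "http://www.loc.gov/MARC21/slim"

def add_general_note(record: ET.Element, note: str) -> ET.Element:
    """Append a 500 $a (general note) datafield to a MARCXML record."""
    field = ET.SubElement(record, f"{{{MARCXML_NS}}}datafield",
                          tag="500", ind1=" ", ind2=" ")
    sub = ET.SubElement(field, f"{{{MARCXML_NS}}}subfield", code="a")
    sub.text = note
    return field

record = ET.Element(f"{{{MARCXML_NS}}}record")
add_general_note(record, "Instrument: violin; grade 3")
xml = ET.tostring(record, encoding="unicode")
```

In real exports a MARC library would handle the namespace and record wrapper; this only shows where the note lands.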
17:04 markvandenborre ah, EU
17:04 markvandenborre versus US
17:04 markvandenborre ?
17:04 markvandenborre different Marc dialect?
17:04 gmcharlt yes
17:04 gmcharlt it's not a strict US vs EU distinction
17:04 jcamins France/Portugal/Spain vs. the rest of the world.
17:04 gmcharlt but the two main flavors are UNIMARC and MARC21
17:05 markvandenborre I read that Belgium was more into the French flavor...
17:05 markvandenborre but if most people here use the other standard, that's worth a lot to me
17:05 jcamins Ah, yeah, that might be. You're in Belgium?
17:05 * chris_n begins to think in terms of pastry and icecream
17:06 markvandenborre chris_n: and chocolate, and witloof
17:06 jcamins chris_n: ooh, I have paneer saag for lunch. I think I'll go eat.
17:07 markvandenborre and strawberries, this time of the year
17:07 markvandenborre and cheese
17:07 markvandenborre and beer
17:07 markvandenborre and (very few people seem to know this)
17:07 markvandenborre some really good wine too!
17:08 chris_n had strawberrys, biscuit, and whipped cream for breakfast
17:08 chris_n strawberries, even
17:14 NateC joined #koha
17:16 jenkins_koha Project Koha_master build #1750: SUCCESS in 2 hr 17 min: http://jenkins.koha-community.[…]Koha_master/1750/
17:16 jenkins_koha * Julian Maurice: Bug 11843: prevent manual history from being overwritten if subscription switched to automatic history
17:16 jenkins_koha * Julian Maurice: Bug 11843: (follow-up) fix unit test in t/db_dependent/Serials.t
17:16 jenkins_koha * Jonathan Druart: Bug 11975: improve the batch patron deletion code
17:16 jenkins_koha * Galen Charlton: Bug 11975: (follow-up) simplify construction of params for GetBorrowersToExpunge()
17:16 huginn Bug[…]_bug.cgi?id=11843 blocker, P5 - low, ---, julian.maurice, Pushed to Master , Manual subscription history doesn't seem to work as expected
17:16 jenkins_koha * Galen Charlton: Bug 12214: add regression test for reporting error when running report with SQL error
17:16 jenkins_koha * Pasi Kallinen: Bug 12214: display SQL errors in reports to users
17:16 jenkins_koha * Galen Charlton: Bug 12214: (follow-up) correct POD of C4::Reports::Guided::execute_query()
17:16 jenkins_koha * Kyle M Hall: Bug 12214: (follow-up) Clean up, show Edit link when SQL has errors
17:16 huginn Bug[…]_bug.cgi?id=11975 enhancement, P5 - low, ---, jonathan.druart, Pushed to Master , Tools: Batch patron deletion code should be improved
17:16 huginn Bug[…]_bug.cgi?id=12214 minor, P5 - low, ---, pasi.kallinen, Pushed to Master , SQL errors in reports are not shown to user
17:18 kmlussier1 joined #koha
17:20 jenkins_koha Starting build #1751 for job Koha_master (previous build: SUCCESS)
17:23 tcohen joined #koha
17:24 NateC joined #koha
17:28 khall @later tell cait I've left a comment/question for you on bug 9303
17:28 huginn khall: The operation succeeded.
17:29 markvandenborre I tried to import what I had reformatted into a running koha instance
17:29 markvandenborre it only imported one record
17:30 markvandenborre I wonder if that has something to do with the marc dialect
17:30 markvandenborre this is what the manpage says:
17:30 markvandenborre This is the file format that will be output. Valid options are B<usmarc>
17:30 markvandenborre and B<marcxml>. B<marcxml> is the default.
17:30 jcamins Koha requires usmarc.
17:30 markvandenborre ok
17:30 jcamins (which is the name of the binary format that is used for both MARC21 and UNIMARC)
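The "usmarc" binary format jcamins refers to is ISO 2709: a 24-byte leader, a directory of (tag, length, offset) entries, then the field data. A rough pure-stdlib sketch of the layout, assuming a simplified leader; real exports should use a proper MARC library rather than this hypothetical encoder:

```python
# Illustration of the ISO 2709 ("usmarc") record layout:
# leader + directory + data fields. Hypothetical encoder, for layout only.
FT, RT, SF = b"\x1e", b"\x1d", b"\x1f"   # field/record terminators, subfield delimiter

def encode_iso2709(fields):
    """fields: list of (tag, field_bytes_without_terminator)."""
    directory = b""
    data = b""
    for tag, content in fields:
        field = content + FT
        # 3-char tag, 4-digit field length, 5-digit start offset
        directory += f"{tag}{len(field):04d}{len(data):05d}".encode()
        data += field
    directory += FT
    base = 24 + len(directory)              # base address of data
    total = base + len(data) + 1            # +1 for the record terminator
    leader = f"{total:05d}nam a22{base:05d} a 4500".encode()
    return leader[:24].ljust(24, b" ") + directory + data + RT

rec = encode_iso2709([
    ("245", b"10" + SF + b"aExample title"),
])
```

The same container carries MARC21 and UNIMARC alike; only the tag semantics differ between the two flavors.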
17:31 jcamins However, if you're using UNIMARC, you'll want to consult the documentation for UNIMARC to figure out what fields to use. I don't know UNIMARC.
17:31 cait marcxml is way easier to read for testing
17:31 jcamins Right, unless you want to load it into Koha.
17:32 cait left #koha
17:32 tcohen bulkmarcimport accepts MARCXML, doesn't it?
17:32 jcamins tcohen: yes, but I will never ever suggest someone use bulkmarcimport because I don't want to be responsible for making someone use bulkmarcimport. :P
17:33 tcohen heh
17:37 edveal joined #koha
17:37 markvandenborre ok, so now I have staged 2702 records for import
17:37 markvandenborre and added them to the catalog (I think)
17:38 markvandenborre so the first thing I try to do is search for them of course
17:38 markvandenborre which fails... obviously, I'm missing something here
17:38 barton joined #koha
17:40 markvandenborre status for every record says "imported"
17:41 markvandenborre is there anything obvious I am missing?
17:41 markvandenborre do I need to trigger a kind of indexing in any way?
17:44 markvandenborre I feel like I am missing something vitally important here, and I have no idea what
17:44 markvandenborre ah, a restart of the catalog helped
17:47 markvandenborre I was too quick thinking that
17:47 markvandenborre "cataloging search" works
17:47 markvandenborre but search not
17:51 * markvandenborre reading the manual again
17:54 gmcharlt markvandenborre: you need to index; if you installed from packages, the relevant cronjob should already be set up
17:55 gmcharlt if not, you'd need to run rebuild_zebra
17:55 markvandenborre gmcharlt: I installed from debian packages
17:55 markvandenborre it seems to be indexing in some way
17:56 gmcharlt then let's see if the search engine is running -- is there at least one process called zebrasrv running?
17:56 markvandenborre yes
17:57 markvandenborre as I said, I can search the catalog using
17:57 markvandenborre "cataloging search"
17:57 markvandenborre that yields some results
17:57 markvandenborre but using the normal "search" not
17:57 gmcharlt yeah, that's a separate search mode
17:57 markvandenborre ah, it doesn't rely on zebra?
17:57 markvandenborre ok
17:57 gmcharlt right
17:58 gmcharlt try a "sudo koha-rebuild-zebra -b -f -v instancename"
18:01 markvandenborre
18:01 markvandenborre it seems to find all records
18:02 markvandenborre on the rebuild zebra command line
18:02 markvandenborre but nothing yet in the web frontend
18:02 jcamins markvandenborre: what field did you put the title in?
18:02 markvandenborre ./ --input=/root/metadata_itunes_geraardsbergen.csv --output=/root/bib_gbergen.marc --mapping='titel=title' --mapping='componist=marc:100_a?' --mapping='code=marc:500_2?' --mapping='instrument=marc:500_r?' --mapping='graad=marc:500_n?' --mapping='type=marc:500_x?' --kohaconf=/etc/koha/sites/gbergen/koha-conf.xml --kohalibs=/usr/share/koha/lib --fieldsep=';'  --format=usmarc
18:02 markvandenborre was my exact line
18:03 jcamins Yeah, those are MARC21 fields.
18:03 jcamins You'll need to identify appropriate UNIMARC fields to replace the fields.
18:04 markvandenborre you mean for the title field?
18:04 jcamins No, all the MARC fields.
18:05 jcamins unimarc?
18:05 wahanui unimarc is[…]ted-documentation
18:08 mveron joined #koha
18:11 markvandenborre I'm sorry, I'm a bit confused now
18:12 cait joined #koha
18:12 jcamins Everyone here uses MARC21, so those are the fields we suggested.
18:12 markvandenborre ok, but I checked the equivalent unimarc fields
18:13 markvandenborre and I used those
18:13 markvandenborre looking at my koha test instance for reference
18:13 jcamins 100$a is a required UNIMARC field, and it doesn't have composer in it.
18:13 markvandenborre ah, that is the one before indeed
18:14 markvandenborre that one was the only one that I took without double checking
18:14 markvandenborre so that may be the issue
18:14 markvandenborre together with the title one maybe
18:14 jcamins Title should be fine.
18:15 jcamins Actually, just use the MARC field.
18:15 jcamins Then you don't have to worry about it connecting to the database.
18:15 markvandenborre yup
18:15 jcamins Though I guess you weren't getting permissions issues.
18:18 markvandenborre aargh, this is such a long list of meta-metadata
18:19 lolo joined #koha
18:19 Guest10046 joined #koha
18:19 markvandenborre if only there were a way to find out the number for the ten most important bits of information
18:19 markvandenborre like title, author, ...
18:22 edveal joined #koha
18:24 markvandenborre the intelligent search in the koha interface does help though
18:26 markvandenborre what is the cleanest way to remove all existing records from koha?
18:26 markvandenborre (so that I can test while certain)
18:31 markvandenborre sorry about all these questions, all...
18:32 markvandenborre I'd hoped to have something running by now
18:33 markvandenborre koha is so much more heavyweight than what we actually need, but I _will_ kick it into submission
18:41 markvandenborre ah, it only becomes searchable when the koha mapping biblio.title and are filled in maybe?
18:41 markvandenborre let's hope that is the case...
18:45 markvandenborre I mapped correctly now for unimarc, but still no cigar
18:45 markvandenborre have to run
18:45 markvandenborre thx all for the help
18:45 markvandenborre bye
18:46 tcohen is field 019 of any use in MARC21?
18:46 tcohen i've just noticed we don't have it, but have '01e' instead
18:57 tcohen bye #koha
19:21 jenkins_koha Project Koha_master build #1751: SUCCESS in 2 hr 2 min: http://jenkins.koha-community.[…]Koha_master/1751/
19:21 jenkins_koha Marcel de Rooy: Bug 12065: use encode_qp consistently when sending basket/shelf
19:21 huginn Bug[…]_bug.cgi?id=12065 minor, P5 - low, ---,, Pushed to Master , Consistent use of encode_qp when sending basket/shelf
19:22 JesseM joined #koha
19:37 talljoy_tickets i know it's friday, but is anyone still in that knows if there are any 'rules' for converting marc21 to Bibtex citations?  i.e. what tag goes into what output field
20:10 jcamins talljoy_tickets: you might take a look at marc2bibtex.
20:11 talljoy_tickets yup i'm digging around and it seems 'loosey goosey'
20:11 jcamins I believe the code is somewhere in C4.
20:11 talljoy_tickets i've found the koha code.  but before i add more, i thought it would be a good idea to see if anyone cared what i shoved into Alternate Name.  heh
20:12 talljoy_tickets mainly my issue is with host items (analytics).  and i think this is going to be a custom export using the C1, C2 and C3 RIS fields.
20:12 talljoy_tickets and shoving 773 tags into those.
20:13 jcamins Ah, yeah, you're SOL when it comes to 773s.
20:13 talljoy_tickets HA
20:14 jcamins That's basically just a general rule.
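The custom export talljoy describes, pushing 773 (host item) data into the free-form RIS C1-C3 tags, could be sketched like this. The choice of which 773 subfields go where is exactly the "loosey goosey" part: the table below is an assumption, not an established rule:

```python
# Sketch: mapping MARC 773 (host item) subfields into RIS C1-C3 custom tags.
# Which subfield feeds which tag is an assumption; there is no standard mapping.
HOST_TO_RIS = {"t": "C1", "g": "C2", "w": "C3"}  # host title / part info / record id

def ris_lines_for_773(subfields):
    """subfields: dict of 773 subfield code -> value; returns RIS lines."""
    return [f"{ris}  - {subfields[code]}"
            for code, ris in HOST_TO_RIS.items() if code in subfields]

lines = ris_lines_for_773({"t": "Host journal", "g": "Vol. 2, p. 5"})
```

RIS readers treat C1-C8 as opaque custom fields, so whatever convention is chosen has to be agreed with whoever consumes the export.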
20:14 tcohen joined #koha
20:50 NateC joined #koha
21:03 NateC joined #koha
22:00 cait left #koha
22:09 rambutan left #koha
22:12 JesseM joined #koha
22:32 thd-away joined #koha
22:50 thd-away joined #koha
22:50 thd-away joined #koha
22:52 thd-away` joined #koha
22:52 thd-away` joined #koha
