Time  Nick          Message
23:53 brendan       but that looks the same as the patch - which is why I think that never was pushed
23:52 brendan       type="audio/wav" autostart=true hidden=true loop=1 height=60 width=128 </embed>
23:52 brendan       <EMBED src="<!-- TMPL_VAR NAME='themelang' -->/includes/sounds/error.wav"
23:52 brendan       <!-- Play a sound if the item is bad-->
23:52 brendan       The following code is added at line 143 of circulation.tmpl and line 241 of returns.tmpl
23:52 brendan       A wave file will play when a bad barcode is entered in checkin or checkout.
23:52 brendan       some old notes that I had -
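[For reference, brendan's pasted snippet reassembled in reading order, with the Bugzilla "remove highlighting" page residue stripped and attribute values quoted so the HTML is valid — a tidied sketch of the quoted notes, not a tested patch:]

    <!-- Play a sound if the item is bad -->
    <EMBED src="<!-- TMPL_VAR NAME='themelang' -->/includes/sounds/error.wav"
           type="audio/wav" autostart="true" hidden="true" loop="1"
           height="60" width="128"></embed>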
23:50 chris         someone would need to tidy them up and resubmit them
23:49 chris         there are patches there
23:49 munin         Bug 1080: enhancement, P2, ---, oleonard@myacpl.org, NEW, How about telling the template to make a noise on circulation errors?
23:49 chris         http://bugs.koha.org/cgi-bin/bugzilla3/show_bug.cgi?id=1080
23:46 chris         do you have a bug number?
23:46 chris         you would have to ask galen that, ill do a quick search tho
23:46 chris         s/dropped/withheld/
23:45 schuster      Question - with all the things that have been dropped...  Are the sounds in circ still there for 3.2?
23:34 thd           ftherese:  Mention that you would be happy to convert a script designed for MARC 21.
23:33 ftherese      ok... I'll keep that in mind
23:32 thd           ftherese:  Request anything which might help; if you ask for UNIMARC, that may scare away most responses.
23:26 thd           ftherese:  If you have no success on the users' list after a few days, then try the devel list.
23:26 ftherese      ok
23:25 thd           ftherese:  More people see that one.
23:25 thd           ftherese:  I would send a message to the user's or general list first.
23:24 ftherese      should I mail the regular list or the dev. list?
23:23 thd           ftherese:  I have a script for capturing Koha MARC frameworks which could help if you find someone who wants to share his Koha MARC frameworks but does not know how.
23:20 thd           ftherese:  Requests for UNIMARC defaults such as better configured Koha UNIMARC frameworks from someone who would share might be best asked on the French Koha mailing list.
23:19 thd           ftherese:  There are some configuration issues which you may need to address for UNIMARC Koha because some of the defaults have left too much work for the librarian.  You could ask for people to share those also.
23:17 ftherese      ok
23:17 thd           ftherese:  Ask on the Koha mailing list.  Other people have had the same problem and it is likely that someone would share a solution.
23:12 ftherese      unless, for example, I could use the script to move - in one fell swoop - all the data in my filemaker databases directly into the koha catalog... I am afraid I would be backpedaling
23:11 thd           ftherese:  I think that the script did something useful with the exported data which is what you need.
23:10 ftherese      no... I don't have a deadline
23:10 ftherese      well... unless the filemaker script is REALLY REALLY good... I've already gone beyond needing help moving my data out of filemaker
23:10 thd           ftherese:  Are you referring to some deadline of yours?
23:09 thd           ftherese:  Why too late?  What is too late?
23:09 ftherese      too late for the filemaker script
23:09 thd           ftherese:  What do you mean by late?
23:08 ftherese      unless the script was REALLY REALLY good
23:08 thd           ftherese:  I know all about Filemaker and its limitations.  You would definitely be better off with Koha.
23:08 ftherese      a bit late for that one...
23:07 ftherese      hmmm...
23:07 thd           ftherese:  There had been an old script for importing from Filemaker for Koha 2.
23:06 ftherese      but once I started I wouldn't be able to stop until I had it... so if there was a script for Marc 21 I could probably change it for UniMarc
23:05 ftherese      so... in theory... yes... it might take me a few days... and frustration
23:05 ftherese      I believe that is not a complete overestimation of my capacities
23:04 thd           ftherese:  If someone has a script which does what you want for MARC 21 you could fix it to work for UNIMARC.
23:04 pianohacker   sounds rather like my understanding of spanish...
23:04 ftherese      but, do something from scratch?!?  especially when I don't know the data structures!?!?!?
23:04 thd           ftherese:  Exactly.
23:03 ftherese      and even tweak it
23:03 ftherese      I mean... I can usually understand what a script does by looking at it
23:03 ftherese      wow... laundry list of languages I don't know
23:03 ftherese      nope
23:03 chris         ruby?
23:02 ftherese      <groan> no </groan>
23:02 thd           ftherese:  Do you know PHP?
23:02 ftherese      pianohacker: just french and spanish
23:02 ftherese      thd: no I don't know Python either... I have always hoped I could avoid those two...
23:02 slef          heh http://ec.europa.eu/eurostat/ramon/nuts/introannex_regions_en.html
23:01 thd           ftherese:  Do you know Python?
23:01 thd           ftherese:  If someone already has a script, then changing it would not be especially difficult even without real Perl knowledge.
22:57 pianohacker   ftherese: What human languages do you speak, out of curiosity?
22:56 ftherese      I speak many languages... but not perl :(
22:53 thd           ftherese:  Most likely their script would be for MARC 21 so you would simply need to make adjustments for UNIMARC.
22:52 thd           ftherese:  If you ask on the koha mailing list someone may already have a script for you.
22:51 thd           ftherese:  People use programs such as MARCEdit for many things but a GUI program such as MARCEdit may not give the level of control which you need.
22:50 thd           ftherese:  Writing your own Perl script to create the UNIMARC records from your CSV files using MARC::Record would be a very effective way to accomplish the task.
22:47 thd           ftherese:  You should write a script to take each item for a record and create a bibliographic record and within each bibliographic record a 995 for each item which has at least 995 $b $c $k and maybe $f if you have barcodes.
22:46 thd           ftherese:  You should write a script to take each item for a record and create a 995 which has at least 995 $b $c $k and maybe $f if you have barcodes.
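[A minimal Perl sketch of the approach thd describes: one bibliographic record per title, with a repeated 995 field per physical copy. The @items data and the subfield mapping (995 $b/$c owning and holding branch, $k call number, $f barcode) are illustrative assumptions; adjust them to your own CSV export and frameworks:]

    use strict;
    use warnings;
    use MARC::Record;
    use MARC::Field;

    # Two copies of the same title, e.g. grouped from your CSV rows.
    my @items = (
        { branch => 'MAIN', callnumber => '823 LAW', barcode => '0001' },
        { branch => 'MAIN', callnumber => '823 LAW', barcode => '0002' },
    );

    my $record = MARC::Record->new();
    $record->append_fields(
        MARC::Field->new('200', '1', ' ',
            a => 'Sons and lovers', f => 'by D H Lawrence'),
    );
    for my $item (@items) {
        # One 995 per copy: repeat the field, not the record.
        $record->append_fields(
            MARC::Field->new('995', ' ', ' ',
                b => $item->{branch},        # owning library
                c => $item->{branch},        # holding library
                k => $item->{callnumber},    # call number
                f => $item->{barcode},       # barcode, if you have them
            )
        );
    }

    open my $out, '>', 'records.mrc' or die $!;
    print {$out} $record->as_usmarc();   # ISO 2709, ready for bulkmarcimport.pl
                                         # (in misc/migration_tools; options vary by version)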
22:42 ftherese      I would need to filter it using the bulkmarcimport.pl script
22:42 ftherese      right... but when I do the import how do I deal with that
22:42 thd           ftherese:  You can have them all in the bibliographic record by repeating the 995 field for holdings.
22:41 thd           ftherese:  Yes, do you not want to record the multiple copies?
22:40 ftherese      so how do I deal with multiple copies of the same book?
22:40 thd           ftherese:  Or some large number last time I counted
22:40 thd           ftherese:  There are 16 different Thomas Manns in the Harvard libraries' collection.
22:38 ftherese      lol
22:38 thd           ftherese:  The 700 authority form of the name disambiguates for all the other D H Lawrences of the world.
22:38 ftherese      if I add the marc file... how can I tell it not to add duplicate item information
22:38 ftherese      because I have several duplicates copies of books
22:37 ftherese      ok... my problem is also for adding items correctly
22:37 thd           ftherese:  The above shows the use of both 200 $f and 700 in the same UNIMARC record.
22:36 thd           ftherese:  200 1#$aSons and lovers $fby D H Lawrence.  700 #1$aLawrence$bD.H.$gDavid Herbert
22:36 thd           ftherese: yes, bulkmarcimport.pl is in misc/migration_tools in the code repository and should also be in a similar subdirectory of wherever you installed Koha.
22:33 thd           ftherese:  bulkmarcimport.pl is in misc/migration_tools I believe
22:32 thd           ftherese:  I am not exactly certain of the procedure for taking material out of the reservoir because I avoid it.
22:32 Colin         thd: I think that will be more marketing than reality.
22:32 ftherese      where do I get that?
22:32 thd           ftherese:  If you use the bulkmarcimport.pl script then you can bypass all that reservoir nonsense.
22:31 ftherese      I want to add them all... what do I do thd?
22:31 thd           ftherese:  You have to add records from the reservoir to the catalogue.
22:30 thd           Colin: they could be different and yet use an API which had a unified dialogue on the front end with divergent code for the back end.
22:29 ftherese      why is it that I have results in a reservoir, and none in the catalog?
22:29 Colin         The two systems are totally different in structure
22:26 thd           There is a very recent script which may cover both systems.
22:25 thd           What I had read about the API is that it covered both Unicorn and Horizon.
22:25 chris         awesome schuster
22:25 schuster      I'll try and answer him tonight when I have a chance to read the post - as we migrated with LL help in Nov of 08 and went live in Jan 09
22:24 schuster      Then API scripts won't matter as those are Unicorn.
22:24 Colin         Horizon
22:24 schuster      I have not read the post yet..
22:24 schuster      Are they Horizon or Unicorn?
22:23 thd           chris: I could point him to some recent migration work and refer him to a SirsiDynix API script for migrating data out of the system.
22:23 chris         the horizon libraries i have been involved in, were running old old versions
22:22 chris         i was hoping someone who had recently migrated would speak up
22:22 chris         nope, i just didnt have a right direction to point him
22:21 thd           chris: There is a koha-devel question about migrating from Sirsi which has gone unanswered.  Are you worried about painting a target on ourselves for non-library respondents to point him in the correct direction?
22:20 brendan       :)
22:20 chris         the one for getting into the gig yeah
22:20 brendan       did you have a bracelet on?
22:19 chris         was an awesome concert that
22:19 chris         :)
22:19 brendan       nice was thinking 5:56
22:19 chris         brendan: somewhere round the 5.50 mark
22:18 chris         :)
22:18 chris_n2      come and get it... ;-)
22:14 Nate          k thats all bye for real
22:14 Nate          By the way Chris that video is SWEET!
22:11 chris         lemme find it again hehe
22:10 brendan       the famous hand :)
22:10 chris         not that one, but my hand is in this one http://www.youtube.com/watch?v=SblTKIZoInI
22:08 Nate          gnite #koha
22:08 brendan       so the question is -- does chris appear anywhere in the video
22:05 brendan       ahhh NZ - can't wait :)
22:02 chris         heres the video im gonna use for the kohacon10 invites
22:02 chris         http://www.youtube.com/watch?v=7wKhrEFzLfM
22:02 chris         yeah
22:02 chris         guess they must be just evaluating
22:02 Colin         As a horizon user they are probably thinking about change
22:01 Colin         It says powered by dynix in big letters
21:59 chris         id look, but their website doesnt want to load
21:59 * chris_n2    cooks up breakfast
21:58 chris         *nod*
21:58 brendan       no idea -- but I have seen them poking around on the mailing list
21:57 chris         so does the new mexico state library use koha?
21:56 * wizzyrea    goes to check for rotten tomatoes
21:54 pianohacker   dangit, and I just used up the last of the whipped cream
21:54 chris         h
21:54 chris         he
21:53 wizzyrea      oh OH a FUD fight!
21:52 cait          ah
21:52 chris         cait: fighting fud with fud
21:51 wizzyrea      that is win all over the place
21:51 cait          huh?
21:51 wizzyrea      LOL
21:51 chris         LEK doesnt support titles for books
21:51 chris         oh i just noticed
21:51 wizzyrea      much better
21:51 * wizzyrea    demands precision!
21:51 * wizzyrea    succumbs to a spelling fail
21:51 ricardo       Bye pianohacker, chris, slef and wizzyrea! :)
21:50 * wizzyrea    demands precisioin!
21:50 chris         yeah, labelling with their full numbers is much more useful
21:50 wizzyrea      (sorry, that's one I get on about because it's easy to be misled)
21:50 wizzyrea      bye, ttyl :)
21:50 Colin         boa noite
21:50 wizzyrea      I think it's right, he labels them 3.0.1 "official release" or 3.01.00.32
21:49 pianohacker   bye, ricardo, sleep well
21:49 pianohacker   slef: hey, whatever works :P shell scripts ain't supposed to be pretty
21:49 ricardo       Well, thanks for replying. Leaving now. Take care!  :)
21:49 ricardo       wizzyrea: OK
21:49 chris         right
21:49 slef          pianohacker: kill $(ps --ppid $$ --no-heading opid) inside the pipeline works. ewww ;-)
21:49 wizzyrea      there's a download for 3.0.1 and for 3.01
21:48 wizzyrea      and not a release
21:48 wizzyrea      it must be based off of master
21:48 chris         or its based off master branch, and not a release
21:48 wizzyrea      or it's right and what he intended
21:48 ricardo       pianohacker: LOL!
21:48 chris         wizzyrea: so thats either a typo
21:48 pianohacker   *phew*
21:48 * pianohacker breathes out
21:48 pianohacker   3.01 != 3.0.1
21:48 wizzyrea      :)
21:48 chris         3.0.x = stable 3.2.x = stable (they can be written as 3.00.x and 3.02.x .. 3.1 = 3.01)
21:48 wizzyrea      that said, the download says 3.01
21:48 ricardo       pianohacker: *nod*
21:47 wizzyrea      so
21:47 ricardo       chris: Like in Linux Kernel tradition, right? OK
21:47 wizzyrea      pianohacker right
21:47 pianohacker   3.01 is the perl-version-number way of saying 3.1, which is the unstable of 3.2 by linux kernel standards
21:47 chris         ricardo: odd second numbers are unstable
21:47 wizzyrea      there's no official release for that
21:47 wizzyrea      no sir, I mean 3.01
21:47 pianohacker   wizzyrea: There's a possibility for confusion: current git version numbers are like 3.01.00.061
21:47 ricardo       wizzyrea: I think you mean "3.1"... and it seems that we are moving directly towards "3.2". But I admit I'm paying more attention to 3.0.x
21:47 chris         but there has never ever been an 3.01.x release
21:46 wizzyrea      ok, yes
21:46 chris         3.01.01 would be an upgrade
21:46 pianohacker   chris: yeah
21:46 nengard       are you?
21:46 wizzyrea      am I high?
21:46 chris         much more salient thing to remember than guy fawkes :)
21:46 wizzyrea      ricardo: i'm not sure, 3.01 is claimed to be an upgrade from 3.0.4, whereas 3.0.1 would be a downgrade from 3.0.4
21:46 chris         Then in 1881 it was the scene of one of the worst infringements of civil and human rights ever committed and witnessed in this country.
21:45 ricardo       pianohacker: I think I know a good joke about that, but it's in cartoon form (difficult to search)
21:45 ricardo       pianohacker: LOL!
21:45 pianohacker   I guess that's why they don't pay me the big bucks
21:45 ricardo       wizzyrea: 3.01 = 3.0.1 I believe. But thanks :)
21:45 chris         "The invasion of the settlement on the 5th of November 1881 by 1500 militia and armed members of the constabulary was the result of greed for Māori owned land and the quest for power by politicians and settlers. Parihaka had become a haven for the dispossessed from throughout the country."
21:45 pianohacker   never have quite understood the funding model of my library
21:45 wizzyrea        3.01
21:45 wizzyrea      sorry
21:45 wizzyrea      er
21:45 wizzyrea      ricardo: actually it appears that the virtual appliances are 3.0.1
21:44 chris         http://www.parihaka.com/About.aspx
21:44 pianohacker   ricardo: Hey, so am I (technically)
21:44 chris         personally i use the 5th to remember parihaka instead
21:44 ricardo       pianohacker: Hey, I'm a public servant, you insensitive clod!  ;-)
21:43 ricardo       wizzyrea: :)
21:43 pianohacker   s/parliament/any government building/g
21:43 * wizzyrea    makes a mental note to break out V for Vendetta tomorrow
21:43 chris         :)
21:43 Colin         The only man ever to enter parliament with honest intentions
21:43 wizzyrea      ooh yes
21:43 ricardo       chris: Nice... Always a good day to (re)watch "V for Vendetta"  ;-)
21:43 brendan       oh right cool
21:42 chris         guy fawkes day
21:42 brendan       fireworks?
21:42 chris         remember remember the 5th of november
21:42 ricardo       chris: LOL!
21:42 chris         in between fireworks
21:42 chris         yeah, i have a backlog of mail i have to reply to, will try to tonight
21:42 ricardo       OK. Leaving towards home now... Take care everyone  :)
21:42 chris         righto
21:41 ricardo       chris: I have to use that information for replying to Jaqueline (she replied to me off-list, meaning off the "Koha Translate" list). I won't be able to reply today, though
21:41 ricardo       chris: yeah.
21:41 chris         probably, it is a little bit old now
21:40 ricardo       http://sourceforge.net/projects/koha-tools/files/Koha%20Virtual%20Appliance/
21:40 ricardo       It seems to be 3.0.1
21:40 chris         the about page would tell ya
21:40 ricardo       Does anyone know what Koha version is installed in Kyle Hall's Virtual Machines?
21:37 chris         night hdl
21:36 Colin         gnight hdl
21:34 pianohacker   good night, hdl
21:34 hdl           good night
21:27 cait          thx Colin :)
21:27 Colin         I will. I added your bug number to the commit as well
21:26 cait          we should mark one of the bugs as duplicate
21:26 Colin         Yes or if you put a barcode on a book but it never got added to the db
21:23 cait          Colin: I think its not only about misreads, but could also happen when an item is accidentally deleted from koha
21:20 Colin         some units validate the barcode some dont and pass the misreads
21:19 cait          wizzyrea: dont know :) I think it does not happen very often, we discovered in testing, I had a test case for that. did not happen in the library so far.
21:17 wizzyrea      (which either means all of the barcodes are in the system or we've just been really lucky)
21:17 wizzyrea      they work very well, we haven't even heard reports of the invalid barcode problem
21:16 wizzyrea      ours are 3M RFID machines
21:15 cait          we did a lot of testing, we provided test cases and got feedback from easycheck.
21:13 cait          its a german vendor - easycheck
21:13 cait          they have one self check out station for check out and check in, no sorting, just a shelf for check ins, we dont allow renewals, this seemed to be problematic
21:13 Colin         Who supplies the self check units
21:13 cait          sorry, dont know if I understand your question correctly
21:12 * ricardo     sighs because of data migration project work (to get data from proprietary system, using non-standard format, into Koha)
21:12 Colin         cait: what units are connected?
21:11 cait          wizzyrea: self check works really good for our library - no phone calls so far and 750 checkouts and 500+ check ins - only 2 small bugs I know of, this is one of them
21:11 Colin         thats patch not path
21:10 Colin         cait: yes it is the same. path coming
21:09 cait          yes, I think they are
21:08 wizzyrea      cait: those bugs look extremely similar to me
21:08 pianohacker   if the context isn't super-crucial, probably good enough
21:07 pianohacker   bit of a race condition, tho
21:07 ricardo       wizzyrea: :)
21:07 pianohacker   slef: A quick test says that they _usually_ do
21:07 wizzyrea      HEHEHE
21:07 cait          dont know if the humor was intended
21:07 ricardo       wizzyrea: I think it should be "checking in invalid item causes break of comm NO CARRIER"  ;-)
21:06 sekjal        good <localtime>, #koha
21:06 slef          pianohacker: do pipelines always have consecutive PIDs?
21:06 wizzyrea      humor in bug reports. I like it
21:06 sekjal        hmmm, time to grab some groceries and head home to cook them.
21:06 ricardo       wizzyrea: LOL
21:06 pianohacker   slef: It looks like your trick would probably be the easiest
21:06 wizzyrea      ooh, I like the description of that second one
21:06 chris_n       bbl
21:05 munin         Bug http://bugs.koha.org/cgi-bin/bugzilla3/show_bug.cgi?id=3696 major, P3, ---, joe.atzberger@liblime.com, NEW, checking in invalid item causes break of comm
21:05 munin         Bug http://bugs.koha.org/cgi-bin/bugzilla3/show_bug.cgi?id=3767 enhancement, P5, ---, joe.atzberger@liblime.com, NEW, Invalid barcodes in checkin cause sip connection to terminate
21:05 cait          Colin: I just saw your bug 3767, is it possible its the same bug reported by easycheck (vendor from our koha library) bug 3696)?
21:05 slef          no trickery, just & on end of line
21:04 slef          pondering doing wait $! ; kill $(expr $! - 1)
21:04 pianohacker   slef: If you moved the backgrounded pipe into its own process, either by making a separate script or by some other trickery, you could simply background the separate process, then only have one thing to kill
21:04 chris_n       sekjal: the present method of storing barcodes is rather problematic
21:03 sekjal        the two barcodes got concat'ed into the barcode field (separated by " | "), and cut off after 30 char
21:03 cait          Colin?
21:02 thd           Multiple barcodes per item could be useful and often exist in the real world
21:02 sekjal        thd: agreed.  my old ILS allowed multiple barcodes, and now some of our items aren't scanning right
21:02 slef          pianohacker: this is a scripted thing
21:01 SelfishMan    I've worked with our director to figure out an option that should work fine though
21:01 pianohacker   slef: Hmm. is this a one-time thing? If not, you could use pgrep -f
21:01 wizzyrea      i mean, we make our libraries issue new cards for exactly this reason
21:01 SelfishMan    brendan: Yeah, the MT state library is pretty bad for that
21:01 thd           We should have a way of tracking multiple barcodes
21:01 slef          pianohacker: complication: other_pid is the first process in a backgrounded pipe. How to get its pid?
21:01 SelfishMan    As a library that is being pressured to renumber our collection I can say that we aren't in a rush to switch as it will cost us about $75k
21:00 brendan       yup that is correct if I understand correctly
21:00 chris         yeah i basically replied and said, i dont think this is a koha issue, koha doesnt force you to use any particular barcode regime
21:00 chris_n       chris: sounds like a branch issue to me
21:04 pianohacker   slef: I'd go with a while ps ax | grep pid; do sleep reasonable_amount_of_time; done; kill other_pid
20:59 brendan       ie. a global database for montana
20:59 wizzyrea      sekjal: yes, it requires rebarcoding, and they only do it in extreme circumstances
20:59 brendan       I think lee is asking whether she should listen to the state library and rebarcode her collection for the future
20:59 Colin         its not a project to undertake lightly (seen it done, wrote some support software)
20:58 slef          a barely koha-related question for all the sysadmins: what's the best way of killing process A when process B exits?
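[slef's case above is shell-specific, but the general answer to "kill process A when process B exits" is to wait on B and then signal A. A minimal Perl sketch of that pattern — process_a and process_b are placeholder commands, not anything from the discussion:]

    use strict;
    use warnings;

    defined(my $pid_a = fork()) or die "fork: $!";
    if ($pid_a == 0) { exec 'process_a' or die "exec: $!" }   # placeholder command

    defined(my $pid_b = fork()) or die "fork: $!";
    if ($pid_b == 0) { exec 'process_b' or die "exec: $!" }   # placeholder command

    waitpid($pid_b, 0);    # block until process B exits
    kill 'TERM', $pid_a;   # then terminate process A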
20:58 chris         yeah it doesnt sound like a koha problem
20:58 sekjal        wouldn't most libraries refuse to do that unless forced?
20:58 sekjal        wouldn't changing barcodes in a library require reprocessing all the items?
20:58 chris         yes
20:58 sekjal        Aren't barcodes usually physically printed and stuck to the material?
20:58 wizzyrea      probably more like what ricardo is saying
20:57 chris         ill reply saying huh?
20:57 wizzyrea      oh right
20:57 ricardo       chris: [Koha-Patches] s/he/she    ;-)
20:57 wizzyrea      :P
20:57 chris         lee from montana, wizzyrea  :)
20:57 wizzyrea      s/he
20:57 chris         its a she :)
20:57 slef          (current internet link fun is wholesaler claiming that my router is consistently interpreting electrical interference as LCP TermReq, which looks pretty impossible in the source code)
20:57 ricardo       wizzyrea: Maybe he wants something like a GUID (Globally Unique Identifiers)
20:56 wizzyrea      s
20:56 wizzyrea      so every koha install has unique barcode
20:56 slef          wizzyrea: I've got something like that with my internet link too.  Benefit of FOSS is you can go "here is the source code. Can you explain how it's broken because it passes the test suite and I just can't see it?"
20:56 wizzyrea      I think he's wanting perhaps there to be random generated barcode seed?
20:55 Colin         I was tempted to reply. I thought this was fixed 20/30 years ago
20:55 thd           Standards documents could never contradict one another, especially not within the same document. :)
20:55 wizzyrea      we ended up fixing koha
20:55 chris         off topic, did anyone understand lee's question about barcodes on the mailing list?
20:54 wizzyrea      we had a deal where one side said "Koha is broken" and the other said "the product is broken"
20:54 ColinC        Some of the documents have been lost in corporate offices too
20:54 slef          I've already had two unresponsive supplier developments with RFID and EDI. Share and enjoy.
20:54 wizzyrea      slef: oi, no kidding
20:54 slef          I'm happy ColinC is working on SIP. Some of the docs contradict each other, some machines seem not to follow them anyway and hardware suppliers are not very responsive.
20:52 chris_n       ColinC++
20:52 thd           ftherese:  are you there?
20:52 chris         wizzyrea: ah well, it's probably best she knows
20:52 chris_n       slef:lol
20:52 pianohacker   fek sounds like a canadian curse word
20:52 chris         ColinC++
20:52 wizzyrea      I think HLT is pursuing private talks with the asset owner.
20:52 ColinC        I know the protocol well. I'm gathering & posting some patches for oops in it
20:51 slef          chris_n: FORK or FEK?
20:51 davi          good
20:51 pianohacker   davi: the official jargon is "restore ownership of koha.org to the community"
20:51 chris         hehe
20:51 * sekjal      stops again, breathes again, ratchets it back more forcefully
20:50 wizzyrea      colin: I didnt realize sip2 was partially your baby too
20:50 davi          What will be the steps we will follow to take over koha.org?
20:50 wizzyrea      she has been having nuts problems with her sip2 and magnetic media wiping
20:50 sekjal        but all the spin, misrepresentation, and outright lies... that's why I get so worked up over this
20:50 wizzyrea      melanie was here
20:50 wizzyrea      oh drat
20:50 pianohacker   and then there's "offline circulation", but that's been there for a while
20:49 * brendan     *sigh*
20:49 chris_n       fork vs FORK maybe
20:49 ricardo       chris: *nod*
20:49 chris         sekjal: yes, its less a fork, more an attempted hostile takeover
20:49 sekjal        codebases fork; its part of life, and a fundamental part of our version control system
20:49 chris         heh
20:49 sekjal        that's what this whole fork issue is really all about; who gets to control what people know to be "Koha"
20:48 ColinC        by coincidence I've just posted a bug in sip2 fix coming
20:48 chris         yes
20:48 chris_n       we are effectively suffering damages due to misrepresentation
20:48 chris         sekjal: its in main koha
20:47 wizzyrea      not to mention that somewhere, someone is saying koha isn't sip2 compliant, which is laughably false
20:47 * chris_n     agrees
20:47 * sekjal      stops, breathes, dials it back down
20:47 chris         its important we know that this kind of thing is happening
20:47 chris_n       bye owen
20:47 wizzyrea      I just felt the need to share my irritation about almost being denied a grant over misleading marketing
20:47 pianohacker   bye, owen
20:46 owen          See y'all tomorrow
20:46 pianohacker   wizzyrea: we're self-stirring
20:46 chris         im glad you brought it up
20:46 * owen        had better quit before he gets too worked up ;)
20:46 sekjal        what the heck is "Patron loading"?
20:46 wizzyrea      :(
20:46 wizzyrea      sorry I didn't mean to stir that up
20:46 pianohacker   of course!
20:45 owen          pianohacker: Yes, but then they rewrote every line of code to eliminate every bug.
20:45 chris         basically we need control of www.koha.org back
20:45 wizzyrea      I know. they say lots of misleading things.
20:45 pianohacker   especially given that LEK is developed off HEAD
20:45 owen          LibLime now sees the open-source version of Koha purely in "official release" terms
20:45 wizzyrea      pianohacker: they say lots of things
20:44 pianohacker   when so many koha users are running off git
20:44 slef          is there an independent SIP test-suite?
20:44 * pianohacker finds it very odd that 3.0.x is being pushed as community
20:44 sekjal        that bloody comparison chart.... very misleading.  Is it even true if "Koha Community" = 3.0.2?
20:43 wizzyrea      yea, I was thinking about that actually
20:43 chris         wizzyrea: maybe drop atz a note
20:43 wizzyrea      priceless: "The mayor, who was cycling past, stopped and chased the girls down the street, calling them 'oiks'. "
20:42 wizzyrea      slef:  the headline alone
20:42 wizzyrea      I mentioned that I wouldn't personally take any claims from that certain vendor very seriously.
20:42 * slef        boggles at the completely unrelated http://www.bikeradar.com/road/news/article/london-mayor-chases-would-be-attackers-on-bike-23846
20:42 nengard       okey dokey
20:41 wizzyrea      from one of their representatives
20:41 wizzyrea      so you may see a post to the koha list re: sip2 compliance
20:41 wizzyrea      nengard: nothing to be done about it really. I think we set them straight
20:41 chris         thats stooping even lower than usual
20:41 wizzyrea      rhcl: oh good
20:41 nengard       wizzyrea once again sorry you're dealing with such stuff :(
20:41 * chris       boggles !!!!
20:40 wizzyrea      yea, you know, that NEKLS paid for and is mostly in the project already
20:40 rhcl          BTW, we had an excellent visit in Atchison.
20:40 owen          "Enhanced SIP2" ?
20:40 wizzyrea      not sure how the money would work out but we like having biggish libraries in our consortium
20:39 wizzyrea      rhcl: well, actually... you'd have to talk to my boss about that (lol)
20:38 * jdavidb     thinks of a paraphrase of Mark Twain:  There are lies, damn lies, and...
20:38 wizzyrea      colin: I will remember you said that :)
20:38 wizzyrea      you know, the notorious table of features
20:38 Colin         wizzyrea: any SIP issues feel free to punt them in my direction
20:38 wizzyrea      they consulted a certain vendor's website,  yes
20:38 ricardo       wizzyrea: OK, thanks
20:38 rhcl          Is that an invitation?
20:38 owen          Is said vendor saying this in reference to LEK?
20:38 wizzyrea      still evaluating
20:37 wizzyrea      ;)
20:37 ricardo       wizzyrea: And they accepted it after that -or- are they still evaluating it?
20:37 wizzyrea      well, are you coming into NExpress?
20:37 rhcl_away     Did you leave any money for us?
20:37 wizzyrea      not sure what they're smoking
20:37 wizzyrea      because it's working right now
20:37 wizzyrea      yes, we convinced them to take a second look
20:37 owen          "almost deny" ?
20:36 wizzyrea      ok, we just had a grant funding board almost deny our grant because they say someone *cough formerly the only us vendor cough* said that community koha wasn't SIP compliant
20:36 hdl           really awfully sorry.
20:36 ricardo       ftherese: Using other words -> A record will (almost) always have 200$f filled in. Besides that, it *may* also have some 7XX fields filled in, if you're using Authorities
20:36 hdl           I missed you all.
20:35 wizzyrea      hdl: we missed you :)
20:35 ricardo       ftherese: You're welcome  :)
20:35 ftherese      ok... thanks ricardo
20:34 ricardo       hdl: LOL! Happens to the best :)
20:34 ricardo       ftherese: Yes... You do want those (200$f and 200$g)... but you may also want 7XX (if you want to use "Authorities" for people's names, in order to find that Joseph Ratzinger and Pope Benedict XVI are the one and same person)
20:34 hdl           chris : I forgot the meeting.
20:33 hdl           ... anyaone there ?
20:32 ftherese      I think I want 200$f and 200$g
20:32 ricardo       ftherese: "Authorities are much wider than just the dude who wrote the book" - agreed. They are also used for "Subjects" as well, for example
20:31 ftherese      Authorities are much wider than just the dude who wrote the book... I take it...  I am looking for just a marc number that works for 1. "the main dude who wrote the book, and who should get the most credit." and then 2. etc.  "other dudes who also kinda helped out or something"
20:26 ricardo       ftherese: 7XX is for linking names to "Authorities"
20:26 ricardo       ftherese: LOL!
20:26 ftherese      I hope people know they become intellectually responsible for something when they write it!!!
20:26 thd           ftherese:  7XX is for the official name.
20:25 ftherese      lol @ 7-- Intellectual Responsibility Block
20:25 thd           ftherese:  200 $f is for the first statement of responsibility 200 $g is for additional statements of responsibility
20:24 ftherese      ahhh
20:23 pianohacker   brb foodz
20:23 thd           ftherese:  200 1#$aThree adventures of Asterix$iAsterix in Switzerland$ftext by Goscinny$gdrawings by Uderzo$gtranslated by Anthea Bell and Derek Hockridge.
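[thd's record above, expressed with MARC::Field for anyone scripting this in Perl; the 700 subfield values for Goscinny's name are an illustrative assumption, not from the discussion:]

    use MARC::Record;
    use MARC::Field;

    my $rec = MARC::Record->new();
    $rec->append_fields(
        # 200: title ($a, $i) plus transcribed statements of responsibility ($f, $g)
        MARC::Field->new('200', '1', ' ',
            a => 'Three adventures of Asterix',
            i => 'Asterix in Switzerland',
            f => 'text by Goscinny',
            g => 'drawings by Uderzo',
            g => 'translated by Anthea Bell and Derek Hockridge',
        ),
        # 700: controlled (authority) form of the primary author's name
        MARC::Field->new('700', ' ', '1', a => 'Goscinny', b => 'René'),
    );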
20:23 slef          I saw Aussie rules football on tv yesterday.  The channel name was something completely unrelated, though, like "Real Estate TV"
20:22 thd           ftherese:  That is a mistaken help message left from Koha 1.9
20:22 pianohacker   hell, you play cricket
20:22 chris         nzers will play any sport tho
20:22 ftherese      thd: I am looking at the koha to Marc links in the administration section, and I see no other author input blanks other than the 200$f.  Why is that?
20:22 chris         pianohacker: wellington has 2 teams
20:21 davi          Australians?
20:21 pianohacker   that's sad
20:20 pianohacker   Hahaha, there's actually people outside the US that care about american football?
20:19 slef          pianohacker: and conversely, rumours are that the NFL is going to get a European team.
20:19 pianohacker   us_centric_naming++
20:19 slef          chris: hopefully one match :)
20:19 ftherese      chris_n: I didn't use it with a scanner, I converted old book scans in pdf format to pdm (I think it was) and it usually did work pretty well, even with french text
20:19 chris         slef: will probably still lose :)
20:19 slef          chris: did you see, their rugby team have resorted to playing English club teams?
20:19 pianohacker   slef: reminds me of the "world" series
20:19 chris         :-)
20:18 chris         slef: they are probably out practicing bowling underarm
20:18 slef          chris: say that when they're awake
20:18 thd           ftherese:  The statement of responsibility in UNIMARC 200 is only for transcribed authorship statements
20:18 chris_n       heh
20:18 ricardo       chris: LOL!
20:18 chris         no one cares about AU
20:18 slef          chris_n: I think most of us do at the moment.
20:18 ricardo       chris_n: Sure. Portugal does DST (as do some other countries).
20:18 slef          pianohacker: European Summer Time, Eastern Standard Time (US) and Eastern Standard Time (AU)
20:18 tajoli        BUT as they are written on the front page
20:18 chris_n       slef: I think some European countries do DST as well
20:17 pianohacker   slef: well, exactly
20:17 slef          pianohacker: we get the additional fun of "which EST is that?" sometimes.
20:17 ricardo       ftherese: http://www.unimarc.info/bibliographic/2.3/en/200
20:17 tajoli        authors/editors
20:17 chris_n       ftherese, thd: I run tesseract integrated with xsane and they work great together
20:17 tajoli        In unimarc 200$f and 200 $g are for all aut
20:17 ftherese      I don't see any koha fields for them?
20:17 ftherese      so where do the other ones go?
20:17 slef          pianohacker: so? We have to decode US times more than you get to decode UTC.
20:16 pianohacker   I think it's mainly DST that screwed people up this time, but...
20:16 ricardo       ftherese: Nope... You should get only one entry in 200$f I think
20:16 ftherese      or do they all get 200$f?
20:16 slef          I mean, most countries are in only one timezone.
20:16 pianohacker   slef: you are a lot closer to gmt than we are
20:16 ftherese      what about the other authors?
20:16 chris_n       slef: lol
20:15 * pianohacker shrugs
20:15 chris_n       thd: not sure
20:15 slef          pianohacker: why are people in the US less competent at telling the time?
20:15 ftherese      plus it is open
20:15 thd           chris_n: I wonder how it does with CAPTCHA.
20:15 ftherese      it does work well
20:15 chris_n       http://code.google.com/p/tesseract-ocr/
20:15 ftherese      chris_n wow! They use tesseract?!  I compiled that for use with linux a few years ago
20:15 thd           chris_n: Oh yes, I had experimented with that slightly.
20:15 pianohacker   slef: Well, all of the developers, yes. :) I'll just follow up the meeting notices with a quick run-through of times in the US myself
20:15 ricardo       ftherese: Yes, for the author mentions, as they appear in the book cover (that's my understanding, at least)
20:14 chris_n       thd: supposed to be some of the best and its FOSS
20:14 chris_n       thd: tesseract is the engine google uses (old HP code iirc)
20:14 ftherese      ok... always trying to find better ways to use the unimarc classification... koha Field author 200$f  should I be using that one?
20:14 ricardo       chris: *nod*  It's just that more people are starting to adopt / want to adopt Koha here in Portugal, and I would feel more safe if we could get the "bugfix releases" more tested, that's all
20:14 slef          pianohacker: everyone has access to one of those.  There were also "world time" links in some announcements.
20:13 pianohacker   slef: that command _is_ very useful, but is somewhat limited to those with access to a linux command line
20:13 chris         ricardo: your questions are about 1.5 years too late for 3.0.x :-)
20:12 slef          pianohacker: I added date -d @123123123 to some notices, but that seems to have stopped
20:12 thd           chris_n: I have no idea about the basis of the software but there are postal system hackers who test its limits.
20:12 pianohacker   bye, danielg, see you at next meeting
20:12 chris         otoh for 3.2 we will have alphas and betas
20:12 danielg       bye-bye
20:12 chris         ricardo: nope its a bugfix for a stable release
20:12 wizzyrea      lol chris
20:12 nengard       bugs reported and one assigned to me - now to remember how to connect via FTP to my local virtual machine ...
20:12 * pianohacker thinks we should add all 4 US timezones plus nzst to meeting notices (maybe just the us, it only seems to be us that get caught out)
20:12 chris         cos frankly, the name is awesome
20:12 ricardo       chris: Which reminds me (forgot to talk about this during the meeting, damn...) - should we have some "Beta" and/or "RC" stage for 3.0.5, before 3.0.5 final (to see if we can attract people that would test it, but don't have / aren't accustomed to git / git snapshots)?
20:11 chris_n       thd: is it tesseract based?
20:11 chris         but i use 'Petes Parcels' a lot
20:11 thd           chris: US Postal Service uses OCR to read envelopes which can interpret CAPTCHA images perfectly.
20:11 chris         most people still use NZ post for domestic mail
20:11 chris_n       oops
20:11 chris_n       deregulation++
20:10 chris         thd: we also have deregulated mail service
20:10 magnusenger   chris: heh, that sounds a bit like norway ;-)
20:10 chris         it would get there
20:10 chris         thd: i could send a letter, Aunty Dawn, Brunswick Road, Wanganui
20:10 danielg       same here
20:09 schuster      phoey guess I'll read the log.
20:09 ricardo       chris: Agreed... But it's hard for me to test 3.0.x branch as it is (to find bugs before they reach 3.0.5 and such). I won't be able to do the testing for both branches, but I guess (and hope!) that other people can do that
20:09 chris         thd: some, not most
20:09 chris         schuster: you missed it .. daylight savings caught you out
20:09 danielg       hi schuster
20:09 schuster      Howdy!
20:09 thd           chris: Do you have barcodes printed at the base of envelopes when they arrive?
20:09 schuster      David Schuster - Plano ISD
20:08 chris         its not hard to find someone
20:08 chris_n       hi schuster
20:08 chris         but it also has only 4 million people
20:08 chris         thd: yep
20:08 thd           chris: does NZ have automated mail handling?
20:08 chris         ie 3.0.x was tested before the 3.0.0 release
20:08 nengard       grr - okay i'm off to edit the template for the cities/towns page so that it's clear what to add in the form
20:08 * pianohacker threatens 3.0 with overgrown, beastly sysprefs editor
20:08 chris         ricardo: you are supposed to test it before a stable release not after :-)
20:07 owen          nengard: it works if you populate it with the right data!
20:07 chris         ricardo: development on 3.2 has been going on for a year now ;)
20:07 cait          owen: country not state
20:07 chris_n       nengard++
20:07 nengard       we need to make it clear that the city field is for city, state
20:07 danielg       hi to all saying hi to me!
20:07 ricardo       chris: Right, agreed. I'm just wondering if we shouldn't let people test Koha 3.0.x more intensely before moving development efforts to 3.2... but I won't make a fuss about this
20:07 nengard       owen and chris - and all - basically the cities, towns is not 100% useful
20:07 owen          I thought that someone recently added a state field to the table?
20:07 chris         we do have postcodes, no one uses them though
20:07 pianohacker   interesting. hi, danielg
20:06 owen          nengard: In the past Koha stored everything as "Athens, OH," city+state in the same field.
20:06 chris         yep
20:06 pianohacker   chris: just cities?
20:06 sekjal        hey, danielg!
20:06 chris         we dont have states or zips so its never been an issue for me :)
20:06 nengard       off to play with it a bit
20:06 nengard       thanks chris - good to know i'm not alone
20:06 chris         ricardo: then you could submit it, and say, "this might apply on master also, but i havent tested"
20:05 chris         nengard: i have no idea about your question
20:05 ricardo       chris: Even if I *don't* have a master setup (code + database) but I do have a 3.0.x setup?
20:05 chris         hiya danielg :)
20:04 chris         and note if they also apply in 3.0.x
20:04 chris         ricardo: yep if you find any bugs that exist in master, you should patch against master
20:04 ricardo       danielg: Welcome then  :)
20:04 thd           slef: Having people believe that they can still vote with the current text from shortly after 12 September is a problem.
20:04 ricardo       s/where the tests/where the tests *and* bugs
20:04 danielg       daniel grobani here, first IRC experience ever
20:04 ricardo       chris: Well, that depends on who is doing the tests and where the tests are  :)
20:04 chris         ricardo: hdl and I spent a few nights porting about 300+ patches from master to 3.0.x before 3.0.4
20:03 chris_n       tnx gmcharlt
20:03 chris         and ported back to 3.0.x (not the other way)
20:03 chris         ricardo: the bugs in 3.0.x are specific to it, any 3.2 ones are fixed in master
20:03 nengard       i have a question from writing documentation - cities and towns asks for city and zip - but not state - how is this data stored - cause on the patron record it's the city, state line that the pull down appears next to.
20:03 nengard       sorry all, thought the meeting was over guess I missed the part where we started discussing stuff again - anyway - I have a question now that I think the meeting is over :)
20:03 thd           slef: you should update the wiki relicensing ballot to state closing by 2 December.
20:03 ricardo       gmcharlt++
20:03 davi          ack
20:02 gmcharlt      davi: I think we'll be going with 10:00 UTC+0 on the 2nd
20:02 davi          same hours?
20:02 rafael        bye
20:02 ricardo       chris: how so?
20:02 gmcharlt      thanks to all for participating, and we'll meet again on 2 December
20:02 chris         nope
20:02 ricardo       Question: wouldn't Koha 3.2 development benefit from waiting for some tests & fixes for 3.0.x?
20:01 gmcharlt      OK, I think we've covered everything for this meeting
20:01 ricardo       chris: OK, thanks
20:01 chris         nope that is fine ricardo
20:01 nengard       np
20:01 gmcharlt      nengard: hold that thought a second, please
20:01 ricardo       chris: Like I said, I'll probably have to send you an updated Portuguese PO (during this or next week). Will this hurt your plans?
20:00 nengard       i have a question from writing documentation - cities and towns asks for city and zip - but not state - how is this data stored - cause on the patron record it's the city, state line that the pull down appears next to.
20:00 chris         i suspect 3.0.5 will be the last 3.0.x release
20:00 tajoli        dix/fix
20:00 chris         and then you guys have to translate, and then we are ready
20:00 chris         i just have to update the .po
20:00 tajoli        correct, I see the dix
20:00 chris         yep, fixed now
20:00 ricardo       tajoli: It is, but I believe hdl is submitting patches for correcting that bug
19:59 tajoli        OK
19:59 chris         tajoli: yes that is the main fix for 3.0.5
19:59 ricardo       chris: OK, thanks for the info
19:59 chris         when i update the templates ill call a string freeze
19:59 tajoli        missing submit buttons not translated is a bug for me
19:59 pianohacker   chris_n: it'll mostly consist of pestering people about bugs that are lacking info; I'm an expert at pestering
19:59 chris         not yet
19:59 ricardo       chris: Any "freeze dates" for 3.0.5 already?
19:59 davi          http://wiki.koha.org/doku.php?id=en:development:roadmap3.0 ?
19:59 tajoli        3.0.5 will also have bugfixes?
19:58 chris         there were some missing submit buttons not translated, which 3.0.5 will fix
19:58 chris         slef: its only bugfixes from now
19:58 chris         i can speak a little to that, there will be a 3.0.5 release in the near future i have to tidy up some translation issues
19:58 slef          well, does anyone know where the current 3.0.x roadmap is?
19:58 ricardo       (*nod* pianohacker++ fdemians++)
19:58 gmcharlt      ricardo: yes, it would depend on hdl unless chris has something to say about 3.0.x
19:57 ricardo       Question: are we still going to talk about Koha 3.0.x roadmap in this meeting? Or can't we do it because hdl is silent now?
19:57 * chris_n     tries to imagine pianohacker in boots and chaps
19:57 thd           fdemians++
19:56 thd           pianohacker++
19:56 ricardo       Colin: :)
19:56 ricardo       sekjal: Sure (+1)
19:56 Colin         ditto to ricardo's comment
19:56 chris_n       +1
19:56 gmcharlt      sekjal: +1
19:55 ricardo       gmcharlt: Fine, by me... I won't be able to attend it, but I don't think that it should be changed because of me
19:55 davi          ack thd
19:55 tajoli        +1
19:55 sekjal        so, a cutoff date of, say Nov 15th for objections?
19:55 thd           davi: The exact form is undecided.  Imprecise language as slef said.
19:55 chris_n       +1
19:55 rafael        for me is ok
19:55 gmcharlt      I'd say that if we don't hear objections  in the next week or so, we'll run with 19:00 UTC+0 for the foundation meeting
19:54 davi          ok ok
19:54 slef          davi: first move is to HLT, a trust.
19:54 davi          ack
19:54 chris_n       19:00 UTC was set on the condition that there were no objections
19:53 slef          davi: no, not yet. sloppy language on some.
19:53 chris         davi: not outright, but for the immediate future yes
19:53 thd           davi: Which association option?
19:53 thd           Anyone who wants to know when it will be.
19:53 davi          Was the Association options rejected?
19:53 ricardo       thd: Who's "we"?
19:53 thd           yes time.
19:53 chris_n       thd: the time?
19:52 thd           We do have an open question on the next foundation forming meeting
19:52 pianohacker   gmcharlt: I've volunteered as a generic bug wrangler; fdemians came up with idea, still trying to get idea of precise meaning
19:52 sekjal        thd: okay, I think I'm clear.  If my notes on the wiki are inaccurate, let me know afterwards and I'll correct
19:51 nengard       thanks gmcharlt
19:51 wizzyrea      I missed that part sorry gmcharlt
19:51 gmcharlt      of course, anybody who wants to volunteer to wrangle bugs for a particular module should feel free to do so at any time
19:51 wizzyrea      ah, ok
19:51 gmcharlt      mostly that I think we should defer it and discuss next meeting when we start talking about 3.4 roles
19:51 thd           sekjal: More permissive would be an option of a contributor.  Not the other way around.
19:51 davi          Association -or- Foundation?, or is going to Foundation already decided?
19:50 nengard       was that a point already covered?
19:50 nengard       no one answered wizzyrea about bugzilla
19:50 ricardo       gmcharlt: Darn, I guess I'll probably also miss that one, then. Oh well...
19:50 davi          yes,  http://wiki.koha.org/doku.php?id=relicensing
19:50 gmcharlt      as a reminder, the next foundation forming meeting is currently scheduled at 19:00 UTC+0 on 3 December 2009
19:49 davi          ?
19:49 davi          "GPL v2 or later"
19:49 ricardo       gmcharlt: Yeah, I had to skip that meeting unfortunately. Will read the transcript, when I get around to it
19:49 sekjal        so the rule would be all new content is defaulted to GPLv2, with more permissive licenses available on request
19:49 gmcharlt      the other issue is the foundation, but I don't think there's all that much to say about it since it's been only a week since the last foundation-forming meeting
19:49 thd           ricardo: FSF has a list for software licenses and compatibility.
19:49 ricardo       davi: *nod*
19:48 thd           sekjal: It would mostly be for quoting content of others.
19:48 davi          sekjal, We should also care about all those licenses being compatible, or just allow only one, so it would be easier to use such material to write a book, manual or similar
19:48 ricardo       thd: My only question with that would be what would be a more permissive licensing - public domain, others?  (Is my double use of "would be" here correct? I think it is...)
19:47 wizzyrea      so where are we with bugzilla default assignees?
19:47 chris_n       +1
19:47 thd           sekjal: However, I doubt the issue of alternative licenses will arise.
19:47 gmcharlt      any objections?
19:47 gmcharlt      and I propose 10:00 UTC+0 as the time, since we've had two general meetings in a row at 19:00 UTC+0
19:47 gmcharlt      December 2nd is the first Wednesday of December
19:47 thd           sekjal: we should certainly allow more permissive licensing of sections on request.
19:47 gmcharlt      which I propose to make an agenda item for next meeting
19:47 ricardo       gmcharlt: Ah, OK.
19:46 gmcharlt      BZ = Bugzilla
19:46 gmcharlt      and at this point we may as well roll it into discussion of 3.4 roles
19:46 ricardo       gmcharlt: Huh?
19:46 sekjal        this could be decided at a later meeting, if appropriate
19:46 gmcharlt      regarding other action items, the main one left from last meeting was BZ default assignees
19:46 sekjal        I agree that all the content on the wiki should be fully redistributable.  Do we want to make that a condition for publishing to the wiki, or just the default (with exceptions available upon request)?
19:46 slef          chris_n: or just that it may be replaced with something under more friendly terms if possible.
19:45 nengard       home
19:45 nengard       oh - well had car trouble and just got hom
19:45 thd           sekjal: that would defeat the purpose of the wiki.
19:45 wizzyrea      nengard: daylight savings :(
19:44 gmcharlt      but I agree that it should be an unusual thing
19:44 chris         ill try
19:44 chris_n       perhaps "on approval of the community or its reps"?
19:44 thd           sekjal: we should discourage incompatible license notices appearing in the wiki
19:44 nengard       did the meeting start? I thought it was in 15 min ...
19:44 ricardo       chris: stay put, will ya?  ;-)
19:44 gmcharlt      I think the main case would be republishing content that's under another license that allows distribution
19:44 sekjal        ricardo: no, nothing in mind.  I just have it noted as an unresolved question
19:44 slef          sekjal: I think we should, but it should not be usual.
19:44 ricardo       sekjal: I don't think so (just my opinion). Did you have any particular license / case in mind?
19:44 thd           We should briefly address sekjal's question.
19:43 chris_n       we're multiplying rapidly
19:43 chrisc        yeah, network went down, and came back up
19:43 pianohacker   looks like it
19:43 gmcharlt      clones?
19:43 chris_n       and chris
19:43 chris_n       wb chrisc
19:43 chrisc        well that was annoying
19:43 sekjal        are we going to decide to allow other new content to be published in the wiki with other licensing if explicitly requested?
19:43 ricardo       chris_n: *nod*
19:42 davi          slef, copyright law being extended in a lot of countries is what has allowed the growth of the Free Software community IMHO
19:42 chris_n       we probably have a similar issue with the code itself given the various copyright holders of the various pieces
19:42 thd           gmcharlt: I think that we could although any old copies in the internet archive would still be GPL 2 or later.
19:42 gmcharlt      ok, anyway, I think we have discussed this enough for now - hopefully we get to finalize at least some of this next month
19:41 davi          so, no dishonest IMHO
19:41 slef          Have I mentioned recently how much I dislike copyright law?
19:41 davi          slef, the original is "GPL v2 or later" not just "GPL v2"
19:41 slef          dishonest is a bit too strong
19:41 slef          davi: legal, but slightly dishonest because the original content is also available under GPL-2.
19:41 gmcharlt      doesn't "GPL2 or later" mean that the recipient has the option to redistribute per GPL2?  I don't see how that can turn into GPL3 or later w/o a relicensing vote
19:40 davi          ack
19:40 thd           davi: That would make this task much easier if we collectively decided that would be helpful.
19:40 pianohacker   bah
19:40 ricardo       Ah! Here's my homonym. Hi richard!  :)
19:40 davi          I  know
19:40 davi          ack
19:40 thd           davi: That is an advantage, not a problem.
19:39 * gmcharlt    tells pianohacker to get off his lawn
19:39 * wizzyrea    calls pianohacker a whippersnapper
19:39 davi          thd, It is not a problem as all the wiki could be moved from "GPL v2 or later" to "GPL v3 or later" just by changing the notice. It is legal.
19:39 richard       hi
19:39 * pianohacker hands gmcharlt his cane
19:39 thd           davi: we do not quote much code in the wiki in any case.
19:39 * jdavidb     is olde.
19:39 gmcharlt      chris_n: the bunch is olde, relatively speaking - I'm making no assertions about any of the members of same ;)
19:38 thd           davi: However, any special section marked GPL 3 or later could be kept as such.
19:38 thd           davi: yes
19:38 * chris_n     wonders about the young ones in the bunch ;-)
19:38 davi          thd, GPL v2 is incompatible with GPL v3
19:37 ricardo       gmcharlt: OK, thanks
19:37 gmcharlt      ricardo: ye olde bunch o' people
19:37 ricardo       I have a newbie question (sorry): I thought that works under the GPL are / could be copyrighted (although allowing the right for modifications to others, of course). If this is so, who is the copyright holder, in this case?
19:37 davi          just change the license and go.
19:36 davi          Being "GPL 2 or later" it is very easy publish a copy as "GPLv3 or later"
19:36 slef          thd: yes, so we'd still need to track terms.
19:36 ricardo       gmcharlt: OK
19:36 thd           slef: the only issue would be quoting GPL 3 code in the wiki and that content can always be marked appropriately and is covered by the or later clause.
19:36 gmcharlt      so what the yes voters have assented to is "GPL 2 or later"
19:35 davi          chris_n++
19:35 gmcharlt      "changing the wiki page license to the GPL version 2 or later terms used by the main Koha download"
19:35 davi          slef, It would be improbable, but this would be a good time to avoid such risk
19:35 slef          chris_n: yes
19:35 chris_n       if that is possible
19:35 * chris_n     thinks that both Koha and the wiki and the documentation should be sync'd in licensing
19:34 slef          it seems unlikely that any fatal GPL-2 bug will harm wiki text IMO
19:34 thd           Wikipedia had an easier time because they had an or later clause.
19:34 gmcharlt      regarding GPL 2 => GPL 3 or later, yes, we would have to go through this exercise again if we decided to do that, but practically speaking, unless somebody discovers a major flaw with "GPL 2 or later", I don't think it would be necessary
19:34 davi          good
19:34 thd           chris_n, davi:  The or later clause provides flexibility without needing a similar vote.
19:34 davi          ack
19:34 slef          davi: I think the proposal is for the same terms as Koha. Check the wording.
19:33 gmcharlt      I don't know if we've ever decided this, but we could decide to make it stronger and *require* that new content be GPL
19:33 davi          Shall we relicense to "GPL version 3" or "GPL version 3 or later" to avoid having to go through this trouble again when "GPL version 4" is out?
19:32 gmcharlt      chris_n: yes - an implicit part of this would be adding a statement to the wiki (on the account creation page, among other places), that any content added to the wiki should be GPL unless the contributor *explicitly* licenses it otherwise
19:32 slef          chris_n: only as long as we keep track of the licensing status.
19:32 chris_n       thd: having to seek out each editor every time there is a copyright question
19:32 ricardo       chris_n: *nod*
19:32 * chris_n     can imagine many people editing and then disappearing over time
19:32 thd           chris_n: Which sort of question?
19:32 chris_n       will licensing under GPL clear up the possibility of this sort of question in the future?
19:31 thd           s/edited/edited by other than the original author/
19:30 gmcharlt      thd: true, any content clearly originating from one of the "yes" voters can be marked as GPL after the voting closed
19:30 thd           gmcharlt: I know that many pages have never been edited.
19:30 pianohacker   +1 on relicensing plan, be nice to get this over with
19:30 thd           gmcharlt: We should be able to relicense at least some old content.
19:30 pianohacker   owen: we have 38 more steps before we have to worry about panicking
19:29 thd           gmcharlt: you may have shortened a step
19:29 gmcharlt      slef, thd: +1
19:29 chris_n       slef: +1
19:29 chris_n       owen: lol
19:29 owen          Don't panic!
19:29 slef          4. thd to ask SFLC opinion on joint authorship theory?
19:29 ricardo       Oops... There goes Chris
19:28 slef          gmcharlt: +1
19:28 gmcharlt      I would like to see if we can get this issue closed, at least for new content, by shortly after the December meeting
19:28 chris_n       gmcharlt: +1
19:28 thd           slef: Which is why we should ask now. SFLC has more experience with this particular issue.
19:28 wizzyrea      sorry I had another meeting >.<
19:28 wizzyrea      hehe
19:27 ricardo       wizzyrea: I think you got some *huge* network lag  ;-)
19:27 thd           slef: They are both probably slow unless you are being sued.
19:27 slef          FTF = Freedom Task Force http://fsfe.org/projects/ftf/
19:27 gmcharlt      and mark it for replacement - given need to document 3.2, I think we'll have reason to do that anyway
19:27 ricardo       gmcharlt: Sounds good to me...
19:27 wizzyrea      Liz Rea, NEKLS
19:27 gmcharlt      3. if necessary, tag any content that's still under CC-BY-NC
19:27 ricardo       thd: SLFC = Software Freedom Law Center - http://www.softwarefreedom.org/   Right?
19:27 slef          thd: any idea of the relative response rates of SFLC and FTF?
19:27 gmcharlt      2. after that meeting, plan to update wiki to specify that all new content will fall under the GPL -
19:26 thd           slef: We should probably just pose the question to SFLC in any case.
19:26 gmcharlt      1. keep the voting period open until the next general meeting
19:26 gmcharlt      here's what I propose we do:
19:26 thd           slef: We should seek legal advice if we are asking about what is safe
19:26 mason         hi slef
19:26 slef          s/safe even/worth trying/
19:25 slef          thd: is it safe even without one significant copyright holder?
19:25 thd           legally we could relicense under joint authorship with merely the support of significant copyright holders, but we should not morally
19:24 slef          mason?
19:24 gmcharlt      and perhaps make it incumbent on them to clearly identify the portions of the content that should remain under CC-BY-NC
19:24 slef          I'm not comfortable with the joint authorship theory, so I would prefer to track things touched by the "yet to vote" list.
19:24 * mason       waves...
19:24 gmcharlt      yes
19:24 thd           I think that if anyone would object we would need to attach a special notice to their content with the CC-BY-NC license.
19:23 ricardo       gmcharlt: *nod*
19:23 gmcharlt      obviously, one of the biggest holdouts is kados, and frankly that's the main direction I would expect any challenge to come from
19:22 thd           gmcharlt: yes we have no objection from any voters
19:22 chris_n       perhaps future licensing should have a "contact-ability" clause in it?
19:22 slef          presumably someone else at katipo could speak for one of them as their employer - maybe one or two others are similar
19:22 gmcharlt      thd: we do have a majority of voters
19:21 slef          for those on the Yet to vote list whose email bounces, I'm not sure what to do with their contributions
19:21 thd           gmcharlt: We could under the joint authorship theory of the construction of the wiki.
19:21 slef          another escape route is to email all of the "Yet to vote" list with the decision and give them chance to object
19:21 thd           gmcharlt: I think that we do not quite have a majority.
19:21 gmcharlt      but we cannot, ultimately, just relicense existing content w/o the consent of the copyright holder
19:21 slef          chris_n: a majority of authors or of content?
19:20 slef          the escape routes are trying to track which pages have "clean" licenses, but I think the recent namespace restructuring may complicate that.
19:20 thd           gmcharlt: the election rules called for a majority.
19:20 gmcharlt      and putting up appropriate notes on the interface
19:20 gmcharlt      chris_n: I think we have a basis for now stating that *new* content on the wiki will be GPL
19:20 slef          pianohacker: no.  I suspect we'll keep this going until next meeting at least.
19:19 davi          slef, I would agree with the voting not being closed, allowing new votes at any time, and modification of votes already issued
19:19 chris_n       don't we really just need a simple majority?
19:19 gmcharlt      the main issue at this point would be if one of those who hasn't voted yet decides to come back and want to keep their content non-GPL
19:19 gmcharlt      I'm pretty sure that we've established that a plurality of the editors are OK with the relicensing
19:19 slef          davi: I don't see much point in closing the vote.
19:18 * owen        is late too
19:18 pianohacker   thd, slef: do you have a set quorum before you consider the issue closed?
19:18 chris         ill take them back off, doesnt look like i missed too much
19:18 ricardo       If anyone is wondering what's this "wiki relicensing thing", we're talking about this - http://wiki.koha.org/doku.php?id=relicensing
19:18 chris         naw, tried to get to work before 8, didnt think id make it, so put my apologies on the wiki already
19:18 thd           :)
19:18 chris         heh
19:18 slef          chris: did you forget to blow on the pie?
19:18 gmcharlt      I'm pretty sure that at least three of the accounts are spam accounts, back when the wiki was World of Warcraft gold-trading central
19:17 chris         apologies for being late
19:17 slef          I also want to analyse what proportion of pages they represent.  I suspect having kados||liblime approval would cover lots of them.
19:17 davi          Is it boring to vote again and again? Should we keep current votes or allow updating them, while letting new people add votes?
19:17 thd           Some people may never respond or really be found.
19:17 thd           The vote is quite close to half of the electorate.
19:16 thd           I will also contact some again.
19:16 thd           Some have now voted, a couple of others who I can identify I still need to contact.
19:16 thd           I have contacted some people who had not voted.
19:15 slef          thd: go ahead
19:15 slef          thd and myself need to sweep through the yet to votes
19:15 thd           I have an update if slef does not.
19:15 slef          not much
19:14 gmcharlt      the wiki relicensing ballot is still open as far as I can tell - slef, do you have an update on that?
19:14 gmcharlt      items from last meeting
19:14 gmcharlt      so jumping onwards
19:13 gmcharlt      indeed, hdl++
19:12 ricardo       hdl++
19:12 gmcharlt      regarding 3.0.x, I didn't see hdl_laptop introduce himself, so in case he's not here, he has released 3.0.4
19:11 ricardo       gmcharlt: I agree with davi. But I guess that's not a big issue: weren't deadlines created to be skipped?  ;-)
19:11 thd           I will try to have some neglected MARC 21 framework updates in that period including a German translation from kf
19:10 davi          ack
19:10 thd           davi: RC1 is still not a release
19:10 pianohacker   davi: 4 weeks from alpha to release candidate is what I think gmcharlt had in mind
19:10 gmcharlt      alpha to release *candidate* - I'm not expecting that the translations would be done that quickly
19:09 davi          4 weeks from alpha to release is too short?
19:09 gmcharlt      after which the release will be in bugfix mode running up to the RC1 which I hope to get out 3-4 weeks after the alpha
19:09 chris_n       heh
19:08 gmcharlt      (hey, it's going to be 48-hour long November 4th :/ )
19:08 gmcharlt      that was the main (non-bug) blocker, so I'll pull that in and put out the alpha in the next day or two
19:08 gmcharlt      so regarding 3.2, I've now gotten confirmation from paul_p and hdl_laptop that they've fixed the DB update issues in their new_acq branch
19:08 Melanie       Melanie Hedgespeth, Salina, KS  Salina Public Library
19:07 thd           Thomas Dukleth, Agogme, New York City
19:07 slef          (6pm-8pm most nights at the moment :-/ )
19:07 dbirmingham   David Birmingham, PTFS
19:06 SelfishMan    Blaine Fleming, Livingston-Park County Public Library (mostly)
19:06 slef          I pass to sekjal due to network instability.
19:06 gmcharlt      sekjal: thanks, please go ahead
19:06 sekjal        gmcharlt: I can take notes, if slef passes
19:06 pianohacker   Jesse Weaver, John C. Fremont Library District
19:05 Nate          nate curulla bywater solutions
19:05 gmcharlt      before we get started, can I impose on slef or somebody else to be notetaker?
19:05 gmcharlt      Agree times of next meetings.
19:05 gmcharlt        4.
19:05 gmcharlt      Follow-up on actions from General IRC Meeting 7 October 2009.
19:05 gmcharlt        3.
19:05 gmcharlt      Update on Koha 3.0 Roadmap.
19:05 gmcharlt        2.
19:05 gmcharlt      Update on Roadmap to 3.2.
19:05 gmcharlt        1.
19:04 gmcharlt      and the agenda is
19:04 gmcharlt      http://wiki.koha.org/doku.php?id=en:events:meetings:irc_meetings:meetingnotes09nov04
19:04 gmcharlt      the page for this meeting is
19:04 gmcharlt      ok, cool
19:03 ricardo       Ricardo Dias Marques, Portugal
19:03 collum        Garry Collum, Kenton County Public Library, Kentucky
19:03 sekjal        Ian Walls, NYU Health Sciences Libraries
19:03 rafael        Rafael Antonio, Portugal
19:03 jdavidb       J. David Bavousett, PTFS
19:03 brendan       Brendan Gallagher, ByWater Solutions
19:02 slef          MJ Ray, member of software.coop
19:02 Colin         Colin Campbell, PTFS Europe
19:02 davi          Davi Diaz, worker for software.coop
19:02 magnusenger   Magnus Enger, Libriotech, Norway
19:02 cait          Katrin Fischer, BSZ, Germany
19:02 chris_n       Chris Nighswonger, FBC
19:02 Ropuch        Piotr Wejman, Biblioteka CSNE, Poland
19:02 * gmcharlt    - Galen Charlton, Equinox, RM
19:02 gmcharlt      let's start with roll call
19:01 gmcharlt      one sec
19:01 gmcharlt      welcome to the November 4 meeting of the Koha project
19:00 chris_n       lol
19:00 gmcharlt      and good night
19:00 gmcharlt      good morning
19:00 gmcharlt      good evening
19:00 gmcharlt      good afternoon
18:59 wizzyrea      afk, hopefully back soon
18:59 pianohacker   hi everyone
18:58 cait          :)
18:58 davi          hi
18:58 chris_n       hi cait
18:57 jdavidb       Hi cait! :)
18:57 slef          hi cait
18:57 cait          hi #koha
18:50 sekjal        ftherese++
18:49 Ropuch        I will sure look into it, my boss likes the google books ;>
18:48 ftherese      cute is the right word... just a sample of how Google Books can be integrated into any site @wizzyrea
18:48 wizzyrea      cute in a good way
18:48 sekjal        hi, ricardo
18:48 ricardo       Hi everyone! :)
18:47 wizzyrea      oh that's cute ftherese
18:44 sekjal        pianohacker: oh, right.  I failed to factor in DST when I made the appointment in my calendar
18:44 sekjal        ftherese:  nice!
18:43 wizzyrea      bah, I have to go to another meeting in 18mins
18:43 ftherese      if you click on one of the books it returns, you get an inline viewer right on the page
18:43 pianohacker   just to confirm, irc meeting in 18 min, right?
18:42 ftherese      http://sites.google.com/site/bibliothequeproject/
18:42 ftherese      here is one that I threw together real quick, I am not good at html... but this one only returns results that have some sort of preview:
18:41 sekjal        brendan: impressive.  sounds like its pretty much there, except for some minor cleanup
18:38 brendan       even if the ISBN can't be found in google -- so it's not correct yet
18:38 brendan       and I haven't worked on it in little bit -- so it displays on every result
18:37 brendan       but it doesn't match that often
18:37 brendan       is one version that I hacked together quickly
18:37 brendan       http://catalog.my.pacifica.edu/cgi-bin/koha/opac-detail.pl?biblionumber=4331
18:36 sekjal        easy is good
18:36 ftherese      it would be really easy to set up... just a few lines of javascript and some ajax calls
18:35 sekjal        sounds like it would be a great additional feature to be able to offer in Koha. perhaps set it up so it can display in a different tab in the details page... (next to Holdings, Title Notes or Subscriptions)
18:33 ftherese      sorry koha
18:33 ftherese      the best part is you could keep the user on kola and integrate a viewer/fulltext search
18:33 sekjal        ftherese: I've looked at it some, but not in depth.  I do so little with actual content these days...
18:33 ftherese      have you checked out Google books yet?
18:32 sekjal        it replaces the old hardcoded links to WorldCat, Google Scholar and Bookfinder.com
18:31 sekjal        ftherese: most excellent!  there is a new system preference in 3.1 for searching other sources.  It can parse in the title, author or ISBN
18:30 ftherese      I already did some java scripting and ajax stuff with Google Books, so I know the api pretty well
18:29 ftherese      that way... you can do an ajax to google on your book's isbn and get an inline fulltext serchable if it is available
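To illustrate the mechanics ftherese describes, here is a server-side Perl sketch (rather than the browser-side ajax the demo sites above use); the 2009-era Dynamic Links endpoint, the jscmd=viewapi parameter, and the response format are assumptions to verify against Google's documentation:

    use LWP::UserAgent;

    # Ask Google Books whether a preview exists for a given ISBN.
    my $isbn = '9780515147001';    # hypothetical example ISBN
    my $url  = "http://books.google.com/books?jscmd=viewapi&bibkeys=ISBN:$isbn&callback=cb";

    my $ua  = LWP::UserAgent->new( timeout => 5 );
    my $res = $ua->get($url);

    if ( $res->is_success && $res->decoded_content =~ /"preview"\s*:\s*"(partial|full)"/ ) {
        print "Preview available ($1) for $isbn\n";    # safe to embed a viewer
    } else {
        print "No preview found for $isbn\n";
    }

The same lookup can of course be done client-side in JavaScript, which is what keeps the user on the OPAC page.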
18:29 sekjal        but at least the mechanics are in place
18:29 sekjal        ftherese: longer answer: I have no idea how to actually make it work.  I imagine some trial and error
18:28 slef          sekjal's way is probably simpler if you can
18:28 ftherese      ahhhh great!
18:28 slef          ftherese: there's javascript in the koha-tmpl folder
18:28 sekjal        ftherese: short answer, yes.  you can overlay arbitrary Javascript on your OPAC from the system preferences
18:28 Ropuch        Evenening #koha
18:27 slef          ftherese: everything is possible, but some things are difficult or expensive.
18:27 munin         brendan: The current temperature in Northwest Goleta, Goleta, California is 15.1°C (10:22 AM PST on November 04, 2009). Conditions: Overcast. Humidity: 82%. Dew Point: 12.0°C. Pressure: 30.02 in 1016.5 hPa (Steady).
18:27 brendan       @wunder 93117
18:27 ftherese      everything seems to be in perl
18:27 ftherese      I'd like to add a google books interface on koha using javascript... is that possible?
18:26 munin         Colin: Error: "useproprietrysoftware" is not a valid command.
18:26 Colin         munin: useproprietrysoftware
18:22 munin         pianohacker: Error: "command" is not a valid command.
18:22 pianohacker   munin: command
18:22 sekjal        I find that hard to believe
18:22 munin         sekjal: Error: "aardvark" is not a valid command.
18:22 sekjal        munin: aardvark
18:21 jdavidb       hehe.
18:21 munin         jdavidb: Error: "what?" is not a valid command.
18:21 jdavidb       munin: what?
18:20 munin         pianohacker: Error: "question" is not a valid command.
18:20 pianohacker   munin: (pianohacker [question]) -- attempt to answer question, verbosely, or provide existential angst if no question is given.
18:11 sekjal        aww, thanks, wizzyrea.  I aspire
18:11 wizzyrea      probably a helpful one ;)
18:10 * sekjal      wonders what kind of command he'd be, if he were valid
18:08 sekjal        ftherese: good luck!
18:07 ftherese      :sekjal thank you
18:07 munin         ftherese: Error: "sekjal" is not a valid command.
18:07 ftherese      @sekjal thank you!
18:07 ftherese      perfect... that's what I needed to know
18:05 sekjal        ftherese: I've had difficulty mapping my own unique identifier into the Koha biblionumber field.  You'd probably need to pick a field of your own (for example, 942$9 would probably work)
18:04 hugo          I will, for sure. Thanks - and I'll run the install now
18:04 hugo          Oh - I see, maybe this is what you were asking - it says the failed tests in t/Label were 62 64 66 and 68
18:04 ftherese      sekjal: I mean, there is a unique identifier for each biblio, but I dont' know where to map it in MARC
18:04 chris_n       yup, but please consider filing a bug
18:03 hugo          should I just keep going and go on to make install?
18:03 ftherese      sekjal: no, but I can
18:03 hugo          the last line of the make test reads, "make: *** [test_dynamic] Error 255
18:02 chris_n       and so needs to be fixed
18:02 chris_n       which means it will break for others of the same type (exact same)
18:02 sekjal        ftherese: you just have to be sure to have a solid matching rule. do you have unique identifiers for each biblio that get mapped to MARC fields?
18:01 chris_n       it most likely means that the lccn splitting algorithm breaks on that particular call number
18:01 ftherese      perfect sekjal!
18:01 hugo          oh - OK - I assumed it was all me and not a bug at all.
18:00 chris_n       hugo: it's not a show stopper for your installation, but it would be helpful to file a bug along with a cut and paste of the output from the test at bugs.koha.org
18:00 ftherese      I was going the opposite way
18:00 ftherese      start with the concrete and work your way to the abstract
18:00 hugo          looking at the output from the make test, the errors start where it says...
18:00 sekjal        ftherese: then, on Staging for Import, you'd select your matching rule, and the options to skip adding the biblio if you match, but to still add the item
18:00 ftherese      I was just conceptualizing it wrong
17:59 ftherese      thank you sekjal
17:59 ftherese      lights go on in the room...
17:59 ftherese      oh man...
17:59 sekjal        ftherese: one option would be to have a line in the spreadsheet for each item, rather than just each biblio.
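Concretely, such a spreadsheet repeats the bibliographic columns once per physical copy, e.g. (invented layout, reusing ftherese's example numbers):

    title,author,old_bibid,barcode
    "Dune","Herbert, Frank",1000293845,65537
    "Dune","Herbert, Frank",1000293845,65538

After conversion, each row becomes one item field attached to the same biblio.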
17:59 hugo          I'm a newbie with this stuff - how would I see which call failed?
17:58 sekjal        ftherese: I think I see your problem.  You have a variable number of items per biblio record, so mapping them from Excel columns isn't really going to work
17:58 slef          3.00.04_final?
17:57 chris_n       hugo: which call number failed?
17:57 hugo          sorry - 3.00.04
17:56 hugo          chris_n:3.00.03
17:56 ftherese      but I can't create a table on an excel sheet... or at least I don't know how...
17:56 chris_n       hugo: what version of koha?
17:55 ftherese      and I managed to map my previous database through an excel sheet to the marc
17:54 ftherese      I am using MarcEdit
17:54 ftherese      but I don't know how
17:54 ftherese      that's what I need to do
17:54 chris_n       hi pianohacker
17:54 sekjal        you can embed the item data in the MARC records that you import.  It's the 952 field in MARC21 (and I think in UNIMARC as well, but I'm not sure)
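For reference, an embedded item looks roughly like this in MarcEdit's mnemonic format (branch codes, barcode, and item type invented here; the subfields follow Koha's MARC21 defaults, where 952$a/$b are home/holding branch, $p the barcode, and $y the item type):

    =952  \\$aMAIN$bMAIN$p31234000123456$yBOOK

One such 952 field is repeated for each physical copy attached to the biblio.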
17:54 ftherese      an ID number if you will
17:53 ftherese      My old library database used a number to link the two
17:53 ftherese      My problem is conceptual... I don't see how to relate my items with the Catalog
17:52 ftherese      stage marc records
17:52 hugo          Failed test '5 of 3 pieces produced' at t/Labels_split_ddcn.t line 30 etc.
17:52 sekjal        did you use the Stage MARC Records for Import tool in the Tools area, or bulkmarcimport.pl?
17:51 pianohacker   good morning
17:51 ftherese      Unimarc
17:51 ftherese      I know there is a 995$9
17:51 ftherese      thati s correct sekjal
17:51 ftherese      yes
17:50 hugo          many fewer errors, but still some -
17:49 sekjal        ftherese: so you've loaded in all the bibliographic records, and now need the item records to be attached?  Am I understanding correctly?
17:49 hugo          no fire on my return...
17:48 ftherese      but now I need to know how to get my barcode information in there
17:48 slef          chris_n: code always litters my desktop
17:48 ftherese      I got the whole catalog uploaded into koha
17:48 chris_n       I actually finished up the core code for card generation (C and M)last week and am now working it into the interface (V)
17:47 ftherese      ok... I am stuck again
17:46 chris_n       code litters the desktop
17:46 slef          rather you than me
17:45 * chris_n     is ripping out and rewriting the rewrite of labels to implement patroncards atm
17:45 slef          hihi
17:44 chris_n       howdy slef
17:42 * slef        flits through, tidying up RFID some more
17:42 slef          so will hugo's computer have caught fire on return?
17:41 hugo          and walking down the drive to get the mail. thanks for the suggestons, I'll see what I've got when I return
17:41 hugo          ok, since I'm living on the edge here, I used sudo, your suggestion worked, I'm running make test again
17:40 hugo          and should I do this with sudo? Or is the regular user OK?
17:40 wizzyrea      colin++ that's what I was looking for to suggest
17:37 Colin         try libyaml-perl  as the package name
17:35 hugo          but failed
17:34 hugo          I thought there was an apt package - I tried: apt-get install YAML
17:33 Colin         hugo: use cpan (although there should be an apt package on debian for it)
17:31 hugo          make sense?
17:30 hugo          well, I guess it's a test - mainly because I've been given this newer (newish) dell box to run it on, and all I had was a PATA drive, while the machine really takes SATA drives. So I've rigged it to work with the PATA drive just so I can see that I can make it work. Once that's done, I'll go find a SATA drive and do it again.
17:29 wizzyrea      is this a test or are you planning on using this particular install for production?
17:29 hugo          And I'm sure I'm much closer than before to getting this working for our little library.
17:28 hugo          well, I've been trying this all summer (on the OS X machine) so a little delay now is not awful.
17:28 hugo          lol
17:28 wizzyrea      oh hugo, you turned on the lights and all they all scattered :)
17:26 hugo          any help very much appreciated :)
17:26 hugo          so not sure if that would be wise (using CPAN)
17:25 hugo          I would use CPAN to install YAML, but the install doc says "only use CPAN for Perl dependencies which are NOT, etc. etc."
17:25 hugo          followed by can't locate YAML.pl in @INC
17:24 hugo          the first error says "Failed test 'use C4::Circulation;'
17:23 hugo          There were more "yaml' not installed errors... but I've rebooted, and am running make test again to see what I get - ah, just failed again....
17:22 hugo          but got lots of errors when I ran 'make test'
17:22 hugo          I got a few messages about YAML not being installed... that was the first thing that didn't seem quite right. But I kept on going...
17:21 hugo          I'm following the INSTALL.debian-lenny instructions and reading instructions at http://blog.triumphovermadness.com/2009/05/koha-3-on-debianlenny.html too
17:20 hugo          and am stuck with a few questions.
17:20 hugo          I'm doing my first install of Koha on Debian (after many failed attempts on OS X)
17:20 sekjal        hi, hugo
17:19 hugo          Hello...
17:19 sekjal        things we'd want to facet on
17:19 jdavidb       lol owen
17:19 sekjal        I'd say branch, item type, collection code, call number, and enumeration/vol/copy info is good to keep in the 952
17:19 owen          jdavidb: And one of those places is the back of my car, and you know it's a mess back there.
17:18 jdavidb       Keyword:  *single*.  Storing things multiple places is asking for gobs of trouble.  (read:  Item type, which is in *four* places.)
17:18 sekjal        also, I think there has been talk of pulling the circ-related info out of MARC to speed up circulation
17:18 wizzyrea      branch etc probably should be left in the 952
17:17 wizzyrea      well, koha really does need a single place for STATUS
17:17 sekjal        owen: we'd definitely need some 952 item data in the MARC records that are fed to Zebra, but maybe by reducing it to just the key elements, we can minimize the impact of the bug.
17:16 owen          sekjal: If items can't be indexed by Zebra, how do you search by branch, availability, etc.?
17:16 wizzyrea      NEKLS might have some money to put towards that
17:16 sekjal        I've got some ideas for how to make Koha more consortia/electronic resources friendly, but they involve lots of gut-level changes
17:16 wizzyrea      :/ yea, maybe
17:16 sekjal        perhaps this and other major revisions to the Koha infrastructure are best saved for Koha 4.0
17:15 wizzyrea      hm, interesting
17:13 munin         04Bug http://bugs.koha.org/cgi-bin/bugzilla3/show_bug.cgi?id=2453 critical, P3, ---, gmcharlt@gmail.com, NEW, (very) large biblio/item handling
17:13 sekjal        I cite bug 2453 again
17:13 sekjal        having them in the MARC means they can get indexed by Zebra, but is that really a benefit worth the time/cost of large MARC records
17:13 Colin         the database can handle much of this data better than in a marc subfield
17:13 sekjal        there are 5 acquisitions-related subfields in 952, as well as 5 circulation-related ones
17:12 sekjal        this ties into a conversation yesterday about moving certain kinds of info out of the 952, and keeping it just in the database
17:10 jdavidb       We've got a recent (still in testing) enhancement that does some additional status work; I don't think it does anything much in 952 fields, though.
17:10 sekjal        not only would the logic attached to each of those 6 statuses need to be routed through the new configurable table, but we'd have to build the update script to convert libraries running the current setup to this new setup
17:09 sekjal        wizzyrea: it would probably be a pretty big deal to change
17:09 wizzyrea      ew
17:09 sekjal        there are currently 6 separate subfields in 952 for status information
17:09 wizzyrea      how difficult of a job, do you think, that would be
17:08 wizzyrea      sekjal, I like that idea
17:07 sekjal        it'd be easy enough to ship Koha with some pre-created statuses, then let the library add/modify/delete as they like
17:06 sekjal        we could then create what statuses we want, on a per library basis, and apply logic to them to give them meaning (i.e. LOST items cannot be checked out and do not show up in search, ORDERED items can have holds placed on them, etc.)
17:05 sekjal        it seems to me it would make sense to just have a single "status" field which points to a user-configurable table
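A tiny sketch of that idea in Perl terms; the structure and names are invented for illustration (the real design would presumably be a database table, per sekjal's description above):

    # hypothetical user-configurable status definitions, one entry per status,
    # mirroring the examples above: LOST items cannot circulate and are hidden
    # from search; ORDERED items are visible and holdable
    my %item_status = (
        LOST    => { can_checkout => 0, show_in_opac => 0, can_hold => 0 },
        ORDERED => { can_checkout => 0, show_in_opac => 1, can_hold => 1 },
    );

    # circulation logic would then consult the single status field
    sub can_checkout {
        my ($status) = @_;
        return $item_status{$status} ? $item_status{$status}{can_checkout} : 1;
    }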
17:05 Jpr           owen: thank you very, very much
17:04 sekjal        s/'tatus'/'status'/
17:04 sekjal        wizzyrea: I was not aware of this, but it makes sense.  the 'tatus' of an item is kept in many different fields
17:04 owen          ...to: <form name="f" action="/cgi-bin/koha/opac-authorities-home.pl" method="get">
17:04 owen          Jpr: You'd change line 12 from "<form name="f" action="/cgi-bin/koha/opac-authorities-home.pl" method="post">"
17:03 owen          Yes, I believe so. You'd be modifying opac-authorities-home.tmpl. That's the template, not the script.
17:03 Jpr           owen: in short, what you're saying is that I could more or less apply that patch to my own cgi-bin/koha/opac-authorities-home.pl in order to then be able to interact with opac-authorities-home in the aforementioned manner? (out of breath)
17:02 wizzyrea      is that true, is koha really not z-standard compliant?
17:01 wizzyrea      I just got a question based on this statement: Since Koha treats circ status information in a non-Z standard way (they put it in a separate field apart from the call number, rather than in a subfield in the same field), the only time we can route requests around Koha libraries is when the item is checked out and has a due date. Any other statuses, such as Lost, Missing, On Order, etc. don't get transferred.
17:01 Jpr           ack, I don't think we've got it, as we haven't updated since installation at the beginning of January
17:01 wizzyrea      hmm, this is interesting
17:01 owen          If you're self-hosted and have access to the Koha files you could modify the template
17:00 owen          Jpr: I would think that would mean you'd have the fix. Maybe I've got my version numbers mixed up.
17:00 Jpr           owen: that is to say...?
16:59 |Lupin|       till soon all, bye
16:59 |Lupin|       np, sorry I couldn't help more ftherese
16:59 ftherese      that's ok... thanks Lupin
16:59 owen          Jpr: It looks like the authorities search form changed from "method=POST" to "method=GET" on 2009-04-18
16:59 |Lupin|       ftherese: I have to go now, can't help more, sorry
16:58 |Lupin|       ftherese: I'm not sure but I'd say you really have to script it and add the info progressively to the marc records that have been previously imported
16:55 Jpr           I also have to admit that we haven't updated from 3.0.1, and this could be something that's been corrected in one of the more recent releases
16:54 Jpr           no, it's in-house only
16:54 owen          Jpr: Do you have a publicly-accessible OPAC to demonstrate?
16:54 Jpr           it remains simply 'cgi-bin/koha/opac-authorities-home.pl'
16:54 * owen        rarely works with authorities
16:54 Jpr           exactly
16:54 owen          Jpr: does the result of your normal authorities search not result in a URL like that?
16:53 ftherese      would it cause the items to correspond to the information contained in the catalogue?
16:52 ftherese      then import that seperately
16:52 Jpr           where you follow opac-search.pl with a question mark and then various search terms (like you can see in the address bar after doing a normal or advanced search)
16:52 ftherese      now if I map my old BibID number to, say 955$z, could I then make a separate marc file with all my barcode numbers and also put a reference to 955$z with the old BibID?
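A rough sketch of the kind of script being discussed here, using the MARC::Batch and MARC::Field modules from CPAN; the 955$z source subfield follows ftherese's suggestion, while the %barcodes_for map (built from the old database) and the use of 995$f for the barcode are assumptions to adapt to your own UNIMARC framework:

    use MARC::Batch;
    use MARC::Field;

    # hypothetical map from the old database: old BibID => list of barcodes
    my %barcodes_for = ( '1000293845' => [ '65537' ] );

    my $batch = MARC::Batch->new( 'USMARC', 'catalogue.mrc' );
    open my $out, '>', 'catalogue-with-items.mrc' or die $!;

    while ( my $record = $batch->next ) {
        my $bibid = $record->subfield( '955', 'z' ) or next;   # old BibID stashed here
        for my $barcode ( @{ $barcodes_for{$bibid} || [] } ) {
            # one 995 field per physical copy (UNIMARC items; $f = barcode
            # in the default Koha UNIMARC framework -- check yours)
            $record->append_fields(
                MARC::Field->new( '995', ' ', ' ', f => $barcode )
            );
        }
        print {$out} $record->as_usmarc;
    }
    close $out;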
16:51 Jpr           hey, I'm curious if anyone knows whether it is possible to do a search through 'opac-authorities-home.pl' in the address bar of a browser, the way it is for opac-search.pl
16:50 kf            bye #koha
16:48 |Lupin|       ftherese: perhaps use something in the 9xx block ?
16:48 |Lupin|       ftherese: that I don' know, I'm really not a MARC specialist
16:46 ftherese      numbers.
16:46 ftherese      so what could I use as a 8xx$x for my old identity record number?
16:46 ftherese      ok... that's fine
16:44 |Lupin|       ftherese: and then once it's done you could write a script that goes over your aold db and Koha's db and adds your numbers as barcodes ?
16:43 |Lupin|       ftherese: well, so you could first import your MARC records, and during the import you would keep track of the mapping between your old numbers (those which identify records) and those assigned by Koha
16:42 ftherese      basically yes
16:42 |Lupin|       ftherese: is this number a kind of barcode ?
16:42 ftherese      I need to connect the items to the main database
16:42 |Lupin|       ftherese: aaaaah...
16:42 ftherese      but that item number is not directly present in the main database
16:41 ftherese      which already have an item number on them
16:41 |Lupin|       ftherese: so what exactly is it that you want to migrate ?
16:41 ftherese      my concern is... I don't want to have to relabel all my books
16:41 |Lupin|       ok
16:41 ftherese      no loan history
16:41 ftherese      no
16:41 |Lupin|       ftherese: your concern is that you want to migrate the loan historiy from your old system in Koha, right ?
16:40 ftherese      ok
16:40 |Lupin|       ftherese: I think you have to proceed in several steps
16:39 ftherese      that is also found in the abstract information database
16:39 ftherese      whose only connection to their information is a number
16:39 ftherese      ok... that is helpful... now how do I take a database full of item numbers
16:38 |Lupin|       ftherese: and I think to refer to a copy of a book, for loans and so on, it's the itemnumber which is used.
16:37 |Lupin|       ftherese: yes, yes
16:37 |Lupin|       ftherese: item number: you have one per physical book. So if for instance for one given book you have several copies of it, each copy will have its own itemnumber which is stored in 995$9 in Unimarc
16:36 ftherese      in MY old database
16:36 ftherese      BibID is the reference to an abstract piece of information
16:35 ftherese      item number refers to information or to concrete objects?
16:35 ftherese      I am not sure... I am brand new to the process
16:35 |Lupin|       ftherese: are you talking about itemnumber field in Koha ?
16:34 ftherese      this actually does help... but I am trying to leave this system behind as well, so it is important only for the sake of transition
16:33 ftherese      since we have multiple copies of the same book
16:33 ftherese      and if I look up the BibID on my information database I know the title, author, shelfmark, etc.
16:32 ftherese      I know that the book registered with 65537 has BibID=1000293845
16:32 ftherese      for example
16:31 ftherese      and it is how I can link the two
16:31 ftherese      that has no importance or significance other than it is what the items and their information card have in common
16:30 ftherese      an index number
16:30 ftherese      it is a key
16:30 |Lupin|       ftherese: which number do you try to store in a unimarc field ?
16:29 ftherese      I just need to know what field I can put that number in
16:29 |Lupin|       and I'm wondering whether this line doesn't destroy the previous content of the variable...
16:29 slef          I'd have to look it up and I'm busy, sorry :-/
16:29 ftherese      I already have an ID number that corresponds between the item information and the items themselves
16:29 slef          |Lupin|: you can login and then give the CGISESSID cookie extracted from your browser on the command-line but I forget exactly how.
16:29 |Lupin|       there is a variable which is re-declared
16:28 |Lupin|       at line 119 of virtualshelves/addbybiblionumber.pl
16:28 |Lupin|       slef: but I may have found something weird...
16:28 |Lupin|       slef: it's what I meant, yes
16:27 ftherese      I want my match point to be a number that I can put into the marc
16:27 schuster      Is the information card # on the MARC someplace?
16:26 slef          |Lupin|: debug info is in the koha-error_log if the ErrorLog and koha debug preferences are right, but you can't easily run it through the perl debugger, if that's what you mean.
16:26 schuster      doesn't matter on items.  You would need a match point to load the items into the database from your MARC records already in Koha.
16:25 ftherese      sorry
16:25 ftherese      I am using unimarc
16:25 schuster      ftherese - embedded information on the marc is in the 952 tag
16:24 |Lupin|       slef: I don't know how to debug a script that has been ran by apache... is that possible ? If it is I'm gonna be very interested !
16:23 |Lupin|       jwagner: however, she is the owner of the shelves, she has _all_ the permissions set, including super librarian
16:22 |Lupin|       jwagner: the problem is that when our librarian tries to add something to a virtual shelf, the existing ones are not proposed
16:22 slef          |Lupin|: I think you get into managing the CGISESSID cookie. Probably easier to script running it through the webserver.
16:21 jwagner       |Lupin|, which script & for what reason?
16:15 |Lupin|       Anyone knows how to pass login and password to a Koha script while running it from the command-line, please ?
16:13 ftherese      thd: are you around?
15:57 ftherese      could I just import a second marc file for the items?
15:55 wizzyrea      no clue, sorry
15:55 * wizzyrea    is like a fish gasping for water while trapped on the beach
15:54 ftherese      then how would I connect the barcode numbers to their respective info card #?
15:54 ftherese      ok... so if I used MarcEdit, and assigned some tag 8xx$a to all the "info card #"
15:52 wizzyrea      :(
15:51 wizzyrea      if I'm understanding correctly, you could use marcedit to map the bib to the item, but that would be a feat of data compilation that I probably wouldn't have much luck helping you with
15:51 ftherese      I have nearly imported all the information cards
15:51 ftherese      which is their "info card #"
15:50 ftherese      to their information card
15:50 ftherese      so I need to connect the items (based on their barcode number)
15:50 wizzyrea      yea
15:50 wizzyrea      wow
15:50 ftherese      and it has another number that refers to its information card (in another database)
15:49 ftherese      and the barcode number itself refers to an individual book
15:49 ftherese      barcode number
15:49 ftherese      right
15:49 wizzyrea      like, barcodes?
15:48 ftherese      so if I have a database full of book ID numbers
15:48 wizzyrea      doesn't say a lot, but my guess is that it looks for 952 fields
15:47 wizzyrea      http://koha.org/documentation/manual/3.0/tools/stage-marc-records-for-import
15:46 wizzyrea      the manual might have something about it, nengard isn't here, let me go see if i can find the section in the manual
15:42 ftherese      like on how it works
15:42 ftherese      I am importing records for the first time into koha from a marc file... one of the options was "check for embedded item record data?" how do I get more information on this?
15:41 wizzyrea      not in lynx, but saw the dropdown
15:40 wizzyrea      er
15:40 wizzyrea      I was too :/
15:39 owen          |Lupin|: I was able to successfully add items to a list in lynx
15:36 rhcl          OK, looks like Southiana may be India - Mainpur
15:34 wizzyrea      odd, it appears to be possible on my install, to add an item to an existing list
15:33 |Lupin|       wizzyrea: I think she has...
15:32 wizzyrea      it may not work if say, she doesn't have permissions to do  lists?
15:32 wizzyrea      I can send a screenshot for your librarian if you like
15:31 wizzyrea      when you get in that pop-up window
15:30 wizzyrea      |Lupin| there is a dropdown at the top that says "select existing list"
15:30 |Lupin|       wizzyrea: np ! I'm too happy that you accept to look to raise any complaint !
15:29 Melanie       Hi Owen.
15:29 owen          rhcl: I wondered the same thing
15:29 wizzyrea      I'm going to look right now
15:29 wizzyrea      hi melanie
15:29 wizzyrea      sorry |lupin| I am always getting off track
15:29 rhcl          where is Southiana? Even Google can't google it.
15:25 |Lupin|       hello Melanie
15:24 owen          Hi Melanie
15:24 |Lupin|       wizzyrea: thanks !
15:23 wizzyrea      |Lupin| hmm... let me go look
15:23 munin         04Bug http://bugs.koha.org/cgi-bin/bugzilla3/show_bug.cgi?id=1499 normal, P3, ---, gmcharlt@gmail.com, RESOLVED FIXED, add bilbio fails during dupe check if ISBN (020$a) has parentheses
15:23 chris_n       bug 1499
15:22 |Lupin|       I tried with lynx and couldn't figure out what to do to add to an existing list... can someone please help ?
15:22 |Lupin|       it proposes her to create a new list and there seems to be no option to add to an existing list
15:21 |Lupin|       my librarian tells me that when she's on a record and uses the add to list button
15:21 |Lupin|       pls I need some help to add a title to a list
15:21 gmcharlt      that only does part of it
15:20 chris_n       wow... fixed over two years ago
15:20 chris_n       jwagner: http://git.koha.org/cgi-bin/gitweb.cgi?p=Koha;a=commit;h=afe642bc5233bb316537558351bc26e49bff7a9c
15:19 jwagner       Not a problem -- whenever you get a chance.  Thanks much!
15:19 gmcharlt      I'll have to look it up - doing other stuff atm
15:18 jwagner       gmcharlt, thanks -- is there an ID number/date for the patch file?
15:18 gmcharlt      the basic approach is to normalize the ISBN into both the ISBN-10 and ISBN-13 forms, then store them in a subfield in the 942 for indexing
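A minimal sketch of that normalization, assuming the Business::ISBN module from CPAN (the variable names are illustrative; this is not the actual patch):

    use Business::ISBN;

    # strip trailing qualifiers such as ' (pbk.)' or ' ;' before parsing
    my $raw = '9780515147001 (pbk.)';
    ( my $cleaned = $raw ) =~ s/^\s*([0-9Xx-]+).*$/$1/;

    my $isbn = Business::ISBN->new($cleaned);
    if ( $isbn && $isbn->is_valid ) {
        # as_isbn10 returns undef for 979-prefixed ISBNs, hence the grep
        my @forms = grep { defined } ( $isbn->as_isbn10, $isbn->as_isbn13 );
        print $_->as_string( [] ), "\n" for @forms;   # hyphen-free forms for indexing
    }

Stripping everything after the leading digits, as jwagner suggests below, is what makes '9780515147001 (pbk.)' and '9780515147001 ;' collapse to the same match key.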
15:18 gmcharlt      jwagner: the patches are out there and will be in 3.2
15:17 chris_n       anyhow, gmcharlt would know
15:17 chris_n       it calls SimpleSearch which i seem to recall might be a bit hackish
15:16 chris_n       it may be in that query that the filter could be adjusted (but I'm not sure at all as this is code I'm not familiar with)
15:15 chris_n       it appears to build the query that determines matches
15:15 chris_n       perhaps in the first foreach loop
15:13 chris_n       line 654
15:13 chris_n       then get_matches in Matcher.pm
15:12 jwagner       chris_n, that looks like the right place.
15:11 chris_n       jwagner: BatchFindBibDuplicates may be your function
15:09 jwagner       OK, thanks. gmcharlt, you online?
15:08 schuster      Sorry you need to check with gmcharlt
15:08 jwagner       schuster, do you know if that feature is backported for 3.01, or only in 3.2?
15:08 chris_n       jwagner: right
15:06 jwagner       chris_n, a regex such as stripping out everything that's not numeric?
15:05 schuster      We have not had issues since we did this development.
15:05 chris_n       a regex in the proper place would work
15:05 schuster      I don't know if the ISBN matching rules were part of the "patch" release or the 3.2 release.
15:05 jwagner       That sounds like what I need.
15:05 schuster      He had some type of algorithm that stripped "extra" characters when matching but left the characters intact in both the existing and incoming MARC.
15:05 jwagner       schuster, different systems at different levels of 3.01.00.0xx
15:04 schuster      Plano ISD sponsored some work on ISBN matching/loading that gmcharlt put together...
15:04 schuster      jwagner - are you running 3.2?
15:00 jwagner       I've poked at this at intervals for months, & never found a good solution.  But there has to be a way to do it properly (she said optimistically).
14:59 chris_n       ahh
14:59 jwagner       Yes -- frequent cases where a MARC import doesn't match an existing record, and the ISBNs aren't absolutely identical (i.e., extra characters in the ISBN).
14:58 chris_n       jwagner: is this an import operation?
14:58 jwagner       There are a couple of lines back in tools/manage-marc-import.pl that might be the point -- $new_matcher_id eq $current_matcher_id
14:57 chris_n       tnx hdl
14:57 chris_n       at that point it appears that the matching is done
14:57 hdl           chris_n it has : you can use rtrn and ltrn for that.
14:55 jwagner       I'm looking at the GetImportRecordMatches function, I think.
14:55 * chris_n     takes a quick look
14:55 jwagner       I'm looking at the ImportBatch.pm file, which is where the matching seems to take place, with that possibility in mind, but so far I haven't deciphered the code enough to see if that's possible.
14:54 chris_n       jwagner: perhaps the zebra syntax has something similar to sql 'like' rather than '='?
14:48 jwagner       I tried some tweaking of the 020 index, but haven't been successful yet.
14:48 jwagner       Question about indexing and matching rules -- I keep coming back to this every so often, but haven't found a good solution yet.  If the ISBN or whatever field isn't exactly identical, a MARC import doesn't find the match.  For example, ISBNs of 9780515147001 (pbk.) and 9780515147001 ; (note pbk and extra semicolon respectively).  Isn't there any way to structure the indexes or the rules to make these match?
14:48 chris_n       ok, back to patroncard stuff
14:48 Nate          hi everyone
14:47 brendan       wb Nater
14:47 chris_n       hi Nate
14:47 brendan       :)
14:47 munin         chris_n: Karma for "espresso" has been increased 2 times and decreased 0 times for a total karma of 2.
14:47 chris_n       @karma espresso
14:47 chris_n       oops
14:47 chris_n       wow
14:47 munin         chris_n: expresso has neutral karma.
14:47 chris_n       @karma expresso
14:47 brendan       espresso++
14:46 chris_n       espresso++
14:46 chris_n       hehe
14:46 * brendan     is flying while typing
14:46 brendan       I made two espresso shots
14:46 * chris_n     wonders if brendan found the coffee pot
14:37 hdl_laptop    amadan: see private message
14:32 amadan        or alternatively you can redo the training with me;-)
14:29 amadan        O:-)
14:26 amadan        with Mali
14:22 amadan        hdl, how about the link
14:20 amadan        ur welcome anytime
14:20 chris_n       I hope to visit sometime
14:19 amadan        i c
14:19 * chris_n     has a friend about an hour or so out of Accra
14:18 amadan        been here b4
14:18 amadan        am in Accra
14:18 chris_n       amadan: are you near Accra?
14:17 amadan        gr8 stuff man
14:17 amadan        gr8 stuff will be grateful so much
14:16 hdl_laptop    (I trained him in English, so he should be able to give some hints ;) )
14:16 hdl_laptop    I could put you in contact with them...
14:16 hdl_laptop    No. But I know the person who did it.
14:13 amadan        any links
14:13 amadan        really?
14:13 hdl_laptop    amadan: some ppl installed Koha in Mali
14:13 amadan        its installed but i have some hitches
14:12 amadan        well, i live in Ghana trying to install koha for my uni
14:11 kf            amadan: and the list is not complete
14:09 amadan        kool thks man
14:08 chris_n       check out http://koha.org/support/pay-for-support (fwiw, I would not base any aspect of my decision on the order of the list)
14:05 amadan        anyone in mind? I'll really prefer if it was online
14:03 chris_n       amadan: there are quite a few companies which provide services such as training, etc. for Koha
14:02 amadan        Does anyone know where i can get training on koha administration and installation?
13:56 amadan        sorry i'm a newbie
13:55 amadan        Do i have to do another setup for the staff client or does it install automatically after the initial install
13:54 Nate          good morning #koha!
13:53 Colin         The staff client is set up on a separate virtual host, by default on port 8080
13:53 amadan        How does a user access it then?
13:52 Colin         There is no separate staff client. Koha is a web based application and you use a browser to access it
13:51 * owen__      appears to be multiplying
13:50 amadan        where can i download koha staff client
13:41 * chris_n     hands brendan some fresh grind
13:40 * brendan     off to go make some coffee
13:40 * chris_n     lets his dog run interference on outside noises
13:40 brendan       ah...  good reminder
13:40 chris_n       great after I got the first cup of coffee down :-)
13:39 brendan       how you doing chris_n
13:39 chris_n       ouch
13:39 brendan       can't fall back to sleep -- so I'm starting the day :)
13:39 brendan       just got woken up by some weird noises outside
13:39 brendan       morning
13:39 chris_n       hi owen
13:39 chris_n       you're up early brendan
13:17 chris_n       g'morning #koha
13:00 |Lupin|       hi jdavidb
12:56 jdavidb       hello, #koha
12:39 audun         nevermind, thd. Found it :)
12:38 jwagner       Good morning |Lupin|
12:38 |Lupin|       hi Jane
12:38 |Lupin|       hdl_laptop: re:news: thanks
12:30 audun         thd: or is there some clever way to avoid having to do that?
12:21 audun         well, as that data does not exist yet, I suppose I have to add it to the source db before using marcedit
12:16 thd           audun: Be certain to populate 952 $a and $b with a code for the library or branch.
12:15 thd           audun: Match your data to the values for 952 described in http://git.koha.org/cgi-bin/gitweb.cgi?p=Koha;a=blob;f=installer/data/mysql/en/marcflavour/marc21/mandatory/marc21_framework_DEFAULT.sql
12:14 audun         hmm..thanks. I'll give it a try
12:13 thd           audun: So if MARC 21 then create items with repeated 952 fields in the bibliographic records for each item in the bibliographic record.
12:12 audun         marc21
12:12 thd           audun: Are you using NORMARC or BSMARC?
12:11 thd           audun: Are you using MARC 21?
12:11 thd           audun: whether you can reimport without purging your records in Koha or not, you would still need to create item fields within the bibliographic record for each item.
12:08 thd           kf: http://www.loc.gov/cds/PDFdownloads/marc/index.html
12:03 kf            hm lunch time now - colleagues are waiting. will be back in about half an hour.
11:58 thd           kf: Library of Congress is the official maintenance agency for MARC 21.
11:58 audun         Used staged import
11:58 kf            audun: did you use bulkimport or staged import? you can reimport your data with staged import, if you have a good matchkey,  just adding the items and leave the records as they are
11:58 thd           kf: my source for MAB2 unification was merely the Library of Congress update to MARC 21.
11:56 kf            audun: I think you can use marcedit, when you already have fields for accession number
11:56 thd           audun: 952 $p is for barcode by default.
11:56 audun         kf: right..so i -do- have to add it manually then
11:55 kf            audun: you need a little more, item type, home and holding branch, according to your configuration in Koha
11:54 kf            thd: I can test that, but need to finish some other things today, so I can tell you next week
11:54 audun         kf: Hmm..I have. must have done something wrong. I have mapped the acquisition number  from the source db to 952$p
11:53 thd           audun: What is your previous ILS?
11:53 thd           kf: The web based import from a file and the internal record editor are the risks.
11:52 kf            audun: marc21? you need to add 952 fields, its repeatable, one for each item
11:52 thd           kf: I think that Z39.50 is safe now and I merely forgot about the fact.
11:52 audun         using marcedit
11:52 kf            thd: what was your source for the work you did on mab2 unification?
11:51 audun         Migrating from an ms access database
11:51 thd           audun: From what automation system are you migrating?
11:51 kf            thd: I think this is no problem. how to proceed? I can do some testing with staged import / z39.50 next week and find out about 689 and similar fields.
11:49 thd           audun: Sorry, what format are your items in currently?
11:49 thd           kf: Well we should include a link to the website in a note for the frameworks.
11:49 audun         thd: How would I go about adding those items?
11:48 kf            689 is on dnb web site I think, but probably only in german
11:48 kf            thd: im just the one to teach koha - because koha is the first ILS that will use marc21 in our consortia
11:47 thd           kf: It would be good to have a list of sources of documentation for major uses which are not part of the official standard.
11:47 kf            thd: I will ask my colleague about a list, she is the real expert and was involved in "mab2 unification"
11:47 thd           kf: Definitely.
11:46 kf            thd: ok, I just wondered if its ok to include them
11:46 thd           kf: I have special fields used by OCLC and RLIN (now part of OCLC) included in the frameworks.
11:45 thd           kf: Everyone should have them available in all languages for anything major.
11:45 thd           kf: We should make a list of those.
11:44 thd           kf: Oh yes
11:44 kf            schlagwortketten (subject chains? )
11:44 kf            689
11:44 kf            there are some fields not documented in marc21 I think but used by dnb and union catalogs
11:44 thd           kf: I need to check the items fields for some changes so that the default framework would not necessarily break things.
11:43 thd           kf: there is an SQL file or a set of them but I have already done most of the work.
11:42 thd           kf: the Z39.50 client changed 3 years ago so I had forgotten that the issue is probably fixed for Z39.50.
11:42 kf            thd: anyway the marc21 frameworks should be updated, but I was not sure how to start with it
11:42 kf            thd: I will be on vacation for the next 2 days, but I think we should test if the bug is still there
11:42 thd           kf: Using the internal record editor is the big risk.
11:41 thd           kf: The command line scripts are safe for importing.
11:41 thd           kf: I think that if you use the internal Koha record editor to edit your records, you will still lose data for undefined subfields.
11:40 kf            thd: yes, and we will use staged marc import now and bulkauthimport to update data weekly from union catalog, nightly once we have an automated solution
11:40 audun         thd: So I have to manually add items to each record? woah
11:39 thd           kf: and bulkauthimport.pl ?
11:39 kf            thd: I think we need to add all $0 and all $w, and some new fields?
11:39 thd           kf: so you first started with bulkmarcimport.pl ?
11:38 kf            audun: thd is right, this should be the easiest way
11:38 thd           audun: the feature you imagine does not exist
11:38 thd           audun: add your items to the bibliographic record set before importing
11:37 thd           audun: start again and add your items
11:37 kf            thd: yes, but I think I will leave them untranslated
11:37 kf            thd: I think we used all of them, first bulk, but also staged because we migrated bibliographic information from union catalog and item information from ms access (creating very basic records, overlayed by union catalog records later)
11:36 thd           Real libraries still have records with the obsolete fields
11:36 audun         So I've managed to import 5000 marc records from an msaccess db. Trouble is, none of them contain embeded items. Is there some way to do a bulk item creation?
11:36 thd           :)
11:36 kf            thd: I basically finished German translation for bd yesterday, but I set many markers, because I have no translation for all those obsolete fields
11:35 thd           kf: how had you been importing data?
11:35 thd           kf: yes that was my question for you
11:35 kf            thd: I just wondered if the bug is still there, I did some testing for z39.50 with marc21 from our union catalog today, I got a field called LOK, which was (of course) not defined in the framework, but got saved in xml in import_records
11:32 kf            thd: reading your mail right now
11:30 thd           kf: are you there?
11:24 hdl_laptop    It is in Tools
11:24 hdl_laptop    |Lupin|: yes there is.
11:20 |Lupin|       so that librarians can post news to the OPAC... ?
11:20 |Lupin|       isn't there a news system somewhere in Koha ?
10:55 |Lupin|       hdl_laptop: ok, thanks
10:55 hdl_laptop    it is enough.
10:54 |Lupin|       hdl_laptop or fredericd: for a field not to be indexed by zebra, is it enough to comment it in records.abs, or is there something else to take care of ?
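
For illustration, the change really is a one-line comment in record.abs; the melm lines below only indicate the file's format, and Zebra picks up the change after a full reindex.

    # record.abs (excerpt; illustrative field/index names)
    melm 245$a      Title,Title:p
    #melm 590       Note    <- commented out: 590 is no longer indexed
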
10:46 hdl_laptop    and word to query and indexing
10:45 hdl_laptop    sort applies to sorting
10:34 Jpr           ?
10:34 Jpr           is it word-phrase-utf.chr that is in /etc/koha/zebradb/etc or one of the sort-string-utf.chr in ...zebradb/lang_defs/...
10:33 Jpr           in the /etc/koha/zebradb there are various .chr files (zebra character set files), and I'm not sure which one applies to searches in Koha:
10:31 Jpr           hey all, I have a question on Zebra character sets in Koha:
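
A hedged pointer for the question above, based on a stock Koha 3.x layout: the word/phrase indexes are built through word-phrase-utf.chr, which default.idx names in its charmap directives, while the sort-string-utf.chr files under lang_defs feed only the sort registers. An illustrative excerpt (details may differ by version):

    # etc/zebradb/etc/default.idx (excerpt)
    index w
    charmap word-phrase-utf.chr   # word and phrase searches use this map

    sort s
    charmap sort-string-utf.chr   # sorting only; not used for queries
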
09:53 amadan        no worries
09:51 hdl_laptop    chris around ?
09:50 nicomo        sorry, wrong channel
09:50 nicomo        http://pepys.biblibre.com/dol_cvs/dolibarr_2_4/htdocs/soc.php?socid=116
09:45 amadan        thanks man
09:45 kf            perhaps this will help: http://wiki.koha.org/doku.php?id=koha_3_install_guide_ubuntu_hardy
09:43 kf            you must reindex your records with a cronjob every x minutes - rebuild_zebra.pl and zebrasrv must be running
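
An illustrative crontab entry for the reindexing part; the path depends on the install, and the -b/-a/-z flags of rebuild_zebra.pl mean biblios, authorities, and "process only the queued changes". zebrasrv itself is started once as a daemon, not from cron.

    # m  h  dom mon dow  command
    */5 * * * *  /usr/share/koha/bin/migration_tools/rebuild_zebra.pl -b -a -z
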
09:43 amadan        do you have the link?
09:42 kf            I think it was explained in one of the installation manuals for ubuntu. hm.
09:41 amadan        perhaps that might help
09:41 amadan        you can share your experience
09:40 kf            I have only done it on my test laptop and I am not using cronjobs there... perhaps someone else can help out?
09:39 amadan        fedora 10
09:39 amadan        I mean how do i do that?
09:39 kf            which OS are you using?
09:38 amadan        any pointers to that?
09:38 kf            amadan: you probably need to index your data and zebrasrv must be running
09:38 amadan        i installed koha with zebra
09:37 amadan        yes i have
09:37 amadan        does anyone know where i can get training on koha either online or face to face?
09:37 kf            amadan: have you installed koha with zebra?
09:36 amadan        Need help, installed koha but the search engine seems not to bring up any items stored in koha
08:42 munin         paul_p: Error: I haven't seen owen, I'll let you do the telling.
08:42 paul_p        @tell owen about: sometimes misses the ability in 2.x to search a specific MARC field...but wouldn't give up Zebra to get it back
08:34 thd           ftherese:  Actually, the notes seem to indicate to use both 225 and 410 in all cases except when there is no standardised name established for the series.
08:30 thd           ftherese:  410 is for the authority controlled name of a series if the form transcribed from the work being catalogued does not use the standard series name.
08:29 thd           ftherese:  I had never thought about that but you may have found a good rule
08:28 ftherese      It seems that the logic there is to use the lower number whenever possible
08:27 thd           ftherese:  225 and not 410 is the primary series field in UNIMARC.
08:27 ftherese      I am now dealing with another problem... I think MarcEdit must have a limit on how many lines per file it can translate
08:27 thd           ftherese:  You should check the old UNIMARC concise bibliographic manual, http://archive.ifla.org/VI/3/p1996-1/sec-uni.htm
08:26 ftherese      that's ok
08:26 thd           ftherese:  I had forgotten that the recent UNIMARC documentation linked from BNF has no helpful explanation nor examples.
08:22 |Lupin|       hi ftherese
08:16 ftherese      morning :)
08:14 |Lupin|       good morning kf
08:10 kf            good morning all
08:03 |Lupin|       hdl_laptop: ok, sorry
08:01 hdl_laptop    was just to say hi
08:01 |Lupin|       hdl_laptop: ?
07:54 hdl_laptop    |Lupin|:
07:54 hdl_laptop    hi
07:49 |Lupin|       chris: around ?
07:49 |Lupin|       oops
07:49 |Lupin|       chris_n: hi
06:50 |Lupin|       good day / evening all
06:48 Ropuch        Good morning
06:32 nicomo        hello koha
05:15 brendan       cya all later
05:15 munin         brendan: The operation succeeded.
05:15 brendan       @later tell brendan - you got some things to remember
05:14 brendan       later all
04:16 SelfishMan    Is it just me or are people multiplying in here?
03:31 Jo            hi amit
03:25 brendan       hi Amit
03:25 chris_n2      g'morning Amit... g'night #koha
03:21 Amit          good morning #koha
03:21 Amit          hi chris, brendan
02:31 Jo            still working out how to maximise the power of kete
02:30 * chris_n2    greets gmcharlt
02:30 Jo            so using it to gather up research around the topic which I want to keep on hand, share, and discuss. The PEST analysis is being carried out among a bunch of us - the wiki side of kete has not really been explored much by us yet.
02:29 brendan       cool - i will read about it
02:29 Jo            i have been using kete as a consultation tool too - on visioning libraries of the future: http://kete.library.org.nz/site/topics/show/76-horowhenua-library-services-2030
02:28 Jo            hehe
02:27 brendan       I also have fun looking at your site -- I seem to learn something every time (that means a good site in my mind)
02:25 brendan       really cool
02:25 Jo            a real treasure
02:25 Jo            it is nice
02:25 brendan       enjoying the artwork of wendy
02:25 brendan       I was looking at your kete earlier
02:25 brendan       cool
02:24 Jo            Brendan: want to see an example of Koha drawing in results from Kete: http://opac.koha.catalystdemo.net.nz/cgi-bin/koha/opac-search.pl?q=maori+battalion
02:24 Jo            afternoon all
02:24 Jo            hi Brendan
02:23 brendan       hi Jo
01:44 brendan       evening #koha
01:30 chris_n2      does this '__PACKAGE__->foo_class('Package::foo');' make foo class a part of the __PACKAGE__ class?
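
For the question above: conventionally no. A call like that usually just stores the class name in class-level data (the Class::Data::Inheritable / Class::Accessor idiom) so the package knows which class to construct or delegate to later; it does not splice Package::foo into the caller's inheritance. A hand-rolled sketch of the idiom, with every name hypothetical:

    #!/usr/bin/perl
    use strict;
    use warnings;

    package Base;
    my %foo_class_for;            # per-class storage, keyed by class name
    sub foo_class {
        my ($class, $value) = @_;
        $foo_class_for{$class} = $value if defined $value;
        return $foo_class_for{$class};
    }

    package My::Thing;
    our @ISA = ('Base');
    __PACKAGE__->foo_class('Package::foo');   # records the name, nothing more

    package main;
    print My::Thing->foo_class, "\n";         # prints "Package::foo"
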
01:23 brendan       cya all in a bit #koha
01:10 chris_n2      or at least a part thereof
01:10 chris_n2      there's a cool $150K for the first prime with at least 10,000,000 digits
01:08 * chris_n2    participates in the Mersenne prime search (http://mersenne.org)
01:08 * thd         is off to perform his civic duty
01:07 brendan       with the last one solved by a recluse in Russia, who never collected the prize
01:06 brendan       yeah -- my wife is always interested in some of the "open questions" in mathematics
01:05 chris_n2      back to reality now :-P
01:05 thd           brendan: Solve the carbon sequestration problem and Richard Branson will give you a billion or some such sum.
01:05 brendan       cool -- enjoyed looking at the site -- was never serious :)
01:04 thd           brendan: There are also much larger prizes.
01:04 thd           brendan: There are much less expensive contests with large prizes
01:04 brendan       more fun to watch :)
01:03 * chris_n2    's wallet is smoking :-)
01:03 chris_n2      after Jan 1, 2010, it jumps to $50K
01:02 chris_n2      ok brendan... the registration is a mere $30K
01:02 thd           brendan: The robot part confused me about which contest it was.
01:01 thd           brendan: Oh yes, most participants are spending far more than that to compete.
01:01 brendan       thd: http://www.googlelunarxprize.org/
01:00 thd           brendan: Who is offering the $30 million?
01:00 ftherese      ok... I'll take a look at that... Right now I am writing a function to concatenate my call numbers
00:59 thd           ftherese: Using the 'hidden' values for the bibliographic frameworks and other values for subfields, you can control how the record editor functions.
00:54 thd           ftherese: There is a table for possible values for hidden in the popup help linked from the '?' help link in the upper right corner of the subfields editor in the bibliographic frameworks editor.
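
A hedged sketch of making the same change in SQL rather than through the frameworks editor; the exact meaning of each hidden value should be taken from that popup help (0 meaning "visible everywhere" is an assumption to verify there).

    #!/usr/bin/perl
    # Un-hide 852$j in the default framework so the editor shows,
    # and therefore preserves, the subfield.
    use strict;
    use warnings;
    use DBI;

    my $dbh = DBI->connect('DBI:mysql:database=koha', 'kohaadmin', 'secret',
                           { RaiseError => 1 });
    $dbh->do(q{
        UPDATE marc_subfield_structure
           SET hidden = 0
         WHERE tagfield = '852' AND tagsubfield = 'j' AND frameworkcode = ''
    });
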
00:51 brendan       build robot, send to space -- sounds good
00:51 chris_n2      hehe
00:51 brendan       think I have a new company mission :)
00:50 brendan       yup
00:50 chris_n2      brendan: $30 million is very tempting...
00:50 brendan       needed a moment of distraction and that is perfect
00:50 brendan       ha thanks chris_n2
00:49 thd           ftherese: The English UNIMARC framework is however not well tested to my knowledge and I only partially edited someone else's initial work on that framework.
00:48 thd           ftherese: The MARC 21 and English UNIMARC frameworks are already set up to preserve data which you add for what is defined in those frameworks.
00:47 chris_n2      and writing the software in perl... :-S
00:47 thd           ftherese: There are only a few hidden ID fields which are an exception to the risk of losing data if you do not set the frameworks properly.
00:47 * chris_n2    considers entering the Google Lunar X PRIZE
00:47 chris_n2      mason++
00:46 thd           ftherese: You can change the behaviour so that a subfield will appear, but if you do not, you will lose the data stored in any subfield which does not appear in the record editor.
00:44 thd           ftherese: It is very important to know what is defined to appear in the record editor, including which subfields within a field
00:44 hugo_         mason: I was trying on 10.5 just about all summer
00:43 thd           ftherese: You could populate 852 but unless you modify the Koha MARC bibliographic frameworks to use the field in the record editor, anything which you add to 852 will be lost upon editing the record with the internal Koha record editor.
00:41 thd           ftherese: Use 995 $k which is defined for items.itemcallnumber .
00:41 mason         i got my OSX 10.4 running the latest dev version of koha3 only 2 weeks ago... it took *lots* of experimenting
00:40 thd           ftherese: one day in the future it will be but for now we have 995 for Koha UNIMARC holdings default and 952 for Koha MARC 21 holdings default.
00:39 mason         then use your working debian install to help set up your OSX koha
00:39 hugo_         mason: thank you. Downloading netinstall of Debian now
00:39 mason         if you have problems with your debian install, everyone can help you,
00:39 thd           ftherese: 852 is not supported in Koha for items management.
00:39 hugo_         yes - that's my plan mason - I just want to get it running - see that I can do it.
00:38 thd           ftherese: Then you merely need to combine it into a single string
00:38 ftherese      I could just put them all as 852$j
00:38 ftherese      it is already split up
00:37 thd           ftherese: Why would you split it?
00:37 ftherese      I've got to split that up into the options they have
00:37 thd           ftherese: If you ask on the French Koha mailing list, I suspect that some people could direct you to some helpful guides for UNIMARC.
00:37 ftherese      A - ## - AAA - ## -- *
00:36 ftherese      ok our call number (cotation) system looks like this:
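
Something like the following could do that concatenation for 995 $k; the part names from the FileMaker export are invented for illustration.

    #!/usr/bin/perl
    # Join pre-split call number parts into one "A - ## - AAA - ## -- *" string.
    use strict;
    use warnings;

    sub make_callnumber {
        my %part = @_;
        return sprintf '%s - %02d - %s - %02d -- %s',
            @part{qw(letter num1 section num2 extra)};
    }

    print make_callnumber(
        letter => 'A', num1 => 3, section => 'PAT', num2 => 12, extra => '*',
    ), "\n";   # prints "A - 03 - PAT - 12 -- *"
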
00:36 thd           ftherese: I only know of the very complete UNIMARC texts, not the friendly tutorials
00:36 mason         hugo_ , get a debian koha running 1st, then attempt your OSX install..
00:35 ftherese      you are being a big help
00:35 ftherese      no worries
00:34 thd           ftherese: I have poor access to UNIMARC material in the US except mostly for what is available electronically.
00:34 ftherese      not your fault
00:33 ftherese      :(
00:33 thd           ftherese: Unfortunately, I have not included any UNIMARC material.
00:33 thd           ftherese: I have a bibliography for learning library cataloguing and classification at http://wiki.koha.org/doku.php?id=en:standards:cataloguing_classification:bibliography
00:31 hugo_         Cool - thank you.
00:31 thd           ftherese: You would need to find the UNIMARC equivalent.
00:31 chris         if you have a decent net connection, the net install is fine
00:30 thd           ftherese: The introduction to MARC cataloguing at http://libraries.idaho.gov/page/able is very easy but it is MARC 21 based.
00:30 chris         nope
00:30 hugo_         I'd like to get the install started tonight. I notice that the link here http://kohaindia.org/debian-lenny/ says to download the DVD's. If instead I follow this link http://www.debian.org/distrib/netinst and do the net install, am I starting off on the wrong foot?
00:26 thd           ftherese: 215 is for physical description such as number of pages or number of volumes in a set.
00:25 ftherese      what about pages?
00:25 ftherese      right... I've done that once before
00:24 thd           ftherese: Home : Administration :  Libraries and Groups is the path for defining libraries and their associated codes.
00:23 hugo_         (Debian is new to me too - I use Mac OS X day to day)
00:23 thd           ftherese: yes
00:23 chris         hugo_: thats the right starting point, and go for stable (lenny)
00:23 thd           ftherese: One of the first tasks is to define the library branches and branch codes; this may be only one main library, but it is an important step.
00:23 ftherese      so both 995$b and $c need to have the branch/library code in them?
00:21 thd           ftherese: In addition to 995 $b (Propriétaire, i.e. the owner library, items.homebranch), each item should have 995 $c (dépositaire permanent, i.e. the permanent depository, items.holdingbranch) even if the branch codes are the same for both.
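
Pulling that together, a sketch of one complete 995 item field for a single-branch setup; the branch code MAIN is a placeholder that must be defined under Libraries and Groups.

    #!/usr/bin/perl
    # One UNIMARC 995 item field carrying both branch subfields.
    use strict;
    use warnings;
    use MARC::Field;

    my $item = MARC::Field->new('995', ' ', ' ',
        b => 'MAIN',                    # items.homebranch    (owner)
        c => 'MAIN',                    # items.holdingbranch (depository)
        k => 'A - 03 - PAT - 12 -- *',  # items.itemcallnumber
    );
    print $item->as_string, "\n";       # "MAIN MAIN A - 03 - PAT - 12 -- *"
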
00:21 hugo_         My second question is, do I simply go here: http://www.debian.org/distrib/ - or are there choices I need to make to get the right thing?
00:18 hugo_         cool, thanks
00:18 chris         yeah that should be fine
00:18 hugo_         maybe 500 patrons
00:18 hugo_         I'd guess a couple thousand books, tops
00:18 hugo_         very small local library. One branch.
00:18 chris         how big a library are you going to be running?
00:17 hugo_         my guess is that's plenty - am I right?
00:17 hugo_         it's a 3.2 GHz P4, with 1 GB RAM
00:17 thd           hugo_: OSX is not worth the effort unless you have a great amount of time.
00:17 hugo_         a client has donated an old dell (they bought a Mac), and I'd like to know if it's up to the job -
00:17 hugo_         that is my hope!
00:16 chris         right, following the INSTALL.debian-lenny guide is probably your best bet then
00:16 hugo_         though I'd love to get the Mac working later...
00:16 hugo_         ah - but you see, with the deadline, I'm "giving up" and going to go with Debian
00:15 thd           ftherese: if you have many numbers for a book series then you might create a separate record for each with a 410 field to designate the series which they all have in common.
00:15 munin         chris: mason was last seen in #koha 1 day, 11 hours, 3 minutes, and 19 seconds ago: <mason> fyi: use zebra, nozeb has old functionallity gaps, and no-one is working on fixing them
00:15 chris         @seen mason
00:15 chris         if you can catch him, he might be some help
00:15 chris         hugo_: mason has been doing some installs on macs
00:14 chris         mason: are you around?
00:14 hugo_         I've spent quite some time learning that I don't know enough to get Koha to install on a Mac. Frustrating, and I'd like to keep trying... but I'm running up against a deadline.
00:13 hugo_         I wonder if I might ask some questions before beginning an install...?
00:12 gmcharlt      yep, I'm just telling you that I have in fact made a change
00:12 thd           gmcharlt: I just copied the previous agenda and simplified it.
00:12 thd           gmcharlt: I suggested that you change the agenda
00:12 hugo_         Hello everyone
00:11 gmcharlt      thd: note change I made to agenda for tomorrow
00:11 thd           ftherese: 410 is for books
00:11 ftherese      series of books
00:11 thd           ftherese: do you mean a regularly issued periodical or a series of books?
00:10 ftherese      with lots of numbers in the collection
00:10 ftherese      Sources Chretiennes
00:10 ftherese      so... for a collection like
00:09 thd           ftherese: 410 $h is for the number of a series
00:07 ftherese      where does the number go?
00:07 ftherese      I see 410
00:07 ftherese      what about collection
00:07 ftherese      ah
00:07 thd           ftherese: 200 $c might have "by T S Eliot" exactly as the words appear in the book.
00:06 thd           ftherese:  The joined representation transcribed from the book would go in 200 $c
00:05 thd           ftherese:  7XX is for the official name.
00:05 ftherese      then would those fields be "joined"
00:05 ftherese      ok
00:05 thd           ftherese:  As you said all tertiary, etc. authors would be added to repeated 702 fields
00:04 ftherese      ahh got it
00:04 thd           ftherese:  701 is when you cannot establish a primary author
00:03 ftherese      what about 701?
00:03 thd           ftherese:  The secondary author in 702
00:02 thd           ftherese:  The primary author would go in 700
00:02 thd           ftherese: MARC 21 is simpler only in its treatment of author fields.
00:02 thd           ftherese:  yes I failed to remember UNIMARC correctly
00:00 ftherese      or would I join all the secondary authors, by using the "join" feature
00:00 ftherese      and they would remain separate
00:00 ftherese      all the secondary authors would go in 701
00:00 thd           ftherese:  The secondary author in 701
00:00 thd           ftherese:  The primary author would go in 700
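
To summarise the author-field exchange above in runnable form: the primary author goes in 700, each secondary author in a repeated 702, and 701 is reserved for works where no primary author can be established. The names and the surname/forename split across $a/$b are purely illustrative.

    #!/usr/bin/perl
    # Primary and secondary UNIMARC author fields on one record.
    use strict;
    use warnings;
    use MARC::Record;
    use MARC::Field;

    my $record = MARC::Record->new;
    $record->append_fields(
        MARC::Field->new('700', ' ', ' ', a => 'Eliot', b => 'T. S.'),
        MARC::Field->new('702', ' ', ' ', a => 'Pound', b => 'Ezra'),
    );
    print $record->as_formatted, "\n";
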