Time  Nick      Message
11:05 kados     morning folks
11:15 _hdl_     morning kados ?
11:15 kados     _hdl_: guess afternoon for you :-)
11:16 _hdl_     yep :)
11:16 _hdl_     I looked for a use of purchaseordernumber but could not find one.
11:16 _hdl_     Do you know what it is used for ?
11:22 kados     yes
11:22 kados     a purchase order is used by a library to allocate funds for a specific purpose
11:22 kados     before actually paying
11:22 kados     it also acts as a reference point between a contract and an invoice
11:22 kados     and a check
11:23 kados     so someone wonders 'what is this check made out to LibLime for?'
11:23 kados     they can see the purchase order
11:23 kados     then check the contract and invoices with the same purchase order
11:23 kados     does that make sense?
11:44 _hdl_     Yes but in Koha, there is no use of this field.
11:45 _hdl_     As I read you, this field would act as an internal field for searching baskets.
11:45 _hdl_     Do I understand correctly ?
12:13 kados     hmmm
12:13 kados     I think it is mainly for budget-based acquisitions
12:14 kados     so when a library makes a purchase for books
12:14 kados     they create a purchase order
12:14 kados     with a purchase order number
12:14 kados     which is used by the library to track the purchase
12:15 kados     if you write a mail to koha-devel I'm sure Stephen can answer your question better as he served as NPL's clerk for some time
12:35 _hdl_     shedges : I am wondering how and where purchaseordernumber is used in Koha.
12:35 _hdl_     Seems never used in my code.
12:35 _hdl_     But what was it used for ?
12:35 _hdl_     Kados seemed to say you helped Katipo with Acquisitions ?
12:36 _hdl_     (see logs today)
12:36 shedges   most libraries here have to have a "purchase order" before they can buy things.
12:36 shedges   It's just a paper form, usually signed by the treasurer and the director
12:37 shedges   it says that the purchase has been approved, that the library has enough money to pay for it, that the administration is aware of the order.
12:38 shedges   Auditors look to be sure each invoice was pre-approved by a purchase order.
12:39 shedges   So as orders arrive, they are usually matched with the corresponding purchase order.
12:39 _hdl_     But is the invoice linked to a parcel you receive or to an order ?
12:39 _hdl_     (SHOULD be both. ;) in an ideal world.)
12:40 _hdl_     I am asking because you say a purchase order is a way to approve an invoice BEFORE paying it.
12:48 _hdl_     kados shedges any comment ?
12:50 shedges   the invoice is linked to each shipment
12:51 shedges   the purchase order is linked to each order
12:51 shedges   one purchase order can have several invoices
12:51 shedges   for example, I order 10 books, using one purchase order
12:51 shedges   seven of the books are shipped (with an invoice) but three are backordered (out of stock)
12:52 shedges   the three come later, on a separate invoice
12:52 shedges   so I have one purchase order, one order, two shipments and two invoices
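A tiny data-structure sketch of the relationships in shedges's example (one purchase order, one order, two shipments, each with its own invoice); the names below are illustrative only, not Koha's actual schema:

    # Illustrative only: one PO covering one order of 10 books,
    # received as two shipments, each carrying its own invoice.
    my %purchase_order = (
        purchaseordernumber => 'PO-2006-042',
        order => {
            titles_ordered => 10,
            shipments => [
                { invoice => 'INV-1001', titles_received => 7 },  # 3 backordered
                { invoice => 'INV-1002', titles_received => 3 },  # arrive later
            ],
        },
    );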
13:06 _hdl_     I understand.
13:08 _hdl_     But I am wondering if it is possible to manage all these things at the moment, and keep a history of parcels arrived.
13:09 _hdl_     Because if many parcels come for one order, the earlier ones will be overwritten by the latest input.
13:10 shedges   ahh
13:10 _hdl_     aqorderdelivery was maybe designed for this purpose.
13:10 _hdl_     But I wonder.
13:25 paul      hello all
13:25 paul      kados around ?
13:40 shedges   hi paul
13:40 paul      hello stephen.
13:40 shedges   I think he is on a conference call right now
13:40 shedges   (should be done soon)
13:40 paul      ok, i'll leave in around 30mn
13:40 paul      hoping i'll catch him
13:43 kados     paul: you still here?
13:43 paul      yes
13:43 paul      I have some bad news, posted a few minutes ago on koha-devel...
13:43 kados     opactest.liblime.com is running head and searches work!
13:44 paul      it seems that DBI::mysql and utf8 are NOT friends at all.
13:44 kados     hasn't come through yet
13:44 paul      we have a major problem...
13:44 kados     yikes!
13:44 paul      copying some links :
13:44 kados     do we need to move to postgres?
13:44 kados     :-)
13:44 paul      http://www.cpanforum.com/threads/654
13:44 paul      http://lists.mysql.com/perl/3714
13:44 paul      http://marc.theaimsgroup.com/?l=msql-mysql-modules&m=111970179409036&w=2
13:44 paul      http://lists.mysql.com/perl/3006?f=plain
13:45 paul      in fact, they CAN be friends, but only through one of 2 changes :
13:45 paul      * bug DBD::mysql maintainer to have the fix included. Would require an official release, an upgrade of DBD::mysql, but it's a better solution in the long term.
13:45 paul      * modify EVERY SQL query to Encode::decode_utf8() every value. A quite huge task !
13:45 kados     hmmm
13:45 kados     why do we need utf-8 in the mysql db?
13:45 paul      (unless i'm missing something, & believe me, I really expect it !)
13:46 kados     don't we just need it for the bibliographic data?
13:46 paul      of course, if we want to be utf8 compliant
13:46 kados     I don't think we need it in mysql
13:46 kados     I think we can just encode as utf-8 before reaching the template
13:46 kados     meaning all the browser sees is utf-8
13:47 kados     ahh ...
13:47 paul      that would be a poor solution I think. For example LDAP is utf-8 by default. Thus, for libraries using ldap & koha, the borrowers' names look strange.
13:47 kados     I see ... borrower names is a problem
13:47 kados     forgot about those
13:47 kados     hmmm
13:48 paul      EMN has the problem & accepts it, but only because I told them "3.0 will be unicode" !
13:48 kados     is there a fix for DBD::mysql?
13:48 paul      (one of the 4 links)
13:48 kados     (can we patch our local installations until it's released)
13:48 paul      except it seems the patch is quite old, and was never released :-(
13:48 kados     hmmm
13:49 kados     I am happy to send an email to the maintainer
13:49 kados     explaining our situation and asking him for advice
13:49 kados     if he's not able/willing to fix it
13:49 kados     it looks like we'll have to modify our SQL
13:49 kados     if you describe to me how to do this I'll be happy to be the grunt man
13:50 kados     so ... I'll await your email on koha-devel
13:50 kados     you can rest easy tonight :-)
13:50 thd       paul: have you looked at my SQL and quoting proposal?
13:50 paul      it's quite simple : every time you fetch something, add a Encode::decode_utf8()
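A minimal sketch of the per-fetch workaround paul describes, assuming Koha's C4::Context->dbh handle and the standard borrowers table; the query and the example borrowernumber are illustrative, not the actual patch:

    use Encode qw(decode_utf8);
    use C4::Context;

    # Workaround sketch: DBD::mysql at the time returned raw UTF-8 bytes,
    # so mark every fetched value as UTF-8 ourselves after each fetch.
    my $borrowernumber = 1;    # example value
    my $dbh = C4::Context->dbh;
    my $sth = $dbh->prepare(
        "SELECT surname, firstname FROM borrowers WHERE borrowernumber = ?");
    $sth->execute($borrowernumber);
    my ($surname, $firstname) = $sth->fetchrow_array;
    for ($surname, $firstname) {
        $_ = decode_utf8($_) if defined;
    }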
13:50 kados     thd: good point this might be a good opportunity to apply that
13:50 paul      thd : I saw it, but could not read it completly yet.
13:51 paul      but still in "unread" status.
13:51 kados     paul: unless you object, I think I'm going to have stephen add it to the coding guidelines
13:51 kados     paul: thd's proposal that is
13:52 paul      kados : if the maintainer can't include it, then we will have to change coding guidelines. Except that such a fix would enforce mySQL-only compatibility
13:52 thd       paul: the identifier quoting part is the most important aspect for the short term
13:52 paul      because other DBD::XXX don't suffer this problem, it seems.
13:52 paul      thd : I think i'll agree with your proposal.
13:52 kados     paul: yep
13:52 paul      thus by far the best solution would be to have the fix integrated !
13:52 kados     paul: I'm actually seriously considering moving LibLime's future dbs to postgres
13:53 paul      imho, that would :
13:53 kados     paul: which is based on ingres, one of the best SQL dbs ever created
13:53 paul      * delay koha 3.0
13:53 paul      * be a problem for libraries using Koha 2.2 when they migrate.
13:54 kados     (I don't mean that mysql will be excluded, only that I want to create support for postgres)
13:54 thd       paul: my proposal is only for placeholder code as SQL code is newly created or modified.
13:54 kados     (and probably not for 3.0, I'm thinking 3.1 or 3.2)
13:55 kados     (many large libraries would never go with a solution based on mysql ... with good reason i think)
13:55 thd       paul: this would make it much easier to add flavour specific code later.
13:56 thd       paul: the quoting aspect is to use MySQL-specific back quotes for the default MySQL code.
13:58 thd       paul: aside from the identifier quoting that is needed already, marketing should be the significant motivator.
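A small illustration of driver-appropriate identifier quoting with DBI (a sketch of the idea, not thd's proposal text), assuming an existing $dbh handle; DBI's quote_identifier emits backticks under DBD::mysql and double quotes under other drivers:

    # DBI generates the connected driver's own identifier quoting:
    # backticks under DBD::mysql, double quotes under DBD::Pg and friends.
    my $table  = $dbh->quote_identifier('items');         # `items` on MySQL
    my $column = $dbh->quote_identifier('biblionumber');  # `biblionumber` on MySQL
    my $sql    = "SELECT $column FROM $table WHERE $column = ?";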
14:00 thd       _hdl_: are you creating a parallel order package management system to that of normal acquisitions?
14:00 paul      I hope not !!!
14:01 thd       paul: what is he doing then?
14:01 paul      I asked him to provide some tools to improve receiving, not rewrite it !
14:01 paul      i'm afraid he is doing too much.
14:02 _hdl_     I am not rewriting it.
14:02 thd       paul: every task tends to become too much after a closer look at the problem :)
14:02 paul      hdl : ouf ;-)
14:03 paul      I just wanted him to provide some facilities to see all receives.
14:03 _hdl_     Just trying to understand well what has been done.
14:03 paul      hdl : I know that we have a problem when a receive is partial.
14:03 thd       _hdl_: are you writing this 2.X also?
14:03 _hdl_     yes.
14:03 paul      but there is no simple solution to this problem
14:03 _hdl_     First for 2.2.X
14:04 kados     bbiab
14:04 paul      we have to create a new table.
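A purely illustrative sketch of what such a new table could record so partial receives are kept instead of overwritten; the table and column names are hypothetical, not a real Koha schema:

    # Hypothetical parcel-history table: one row per receive event.
    $dbh->do(q{
        CREATE TABLE aqparcelhistory (
            parcelid         INT AUTO_INCREMENT PRIMARY KEY,
            ordernumber      INT NOT NULL,        -- points back at aqorders
            datereceived     DATE NOT NULL,
            quantityreceived INT NOT NULL,
            invoicenumber    VARCHAR(32)
        )
    });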
14:04 paul      bye kados. i wont be here tomorrow.
14:04 paul      but friday yes
14:04 thd       _hdl_: I am happy about the first and would want to give you as much assistance as I can for that.
14:05 thd       _hdl_: you do know that normal acquisitions is still currently broken.
14:07 thd       _hdl_: chris ran out of time to fix it all for 2.X and made some uncommitted New Zealand specific shortcut fixes for a Katipo 2.X customer.
14:12 thd       _hdl_: I am also interested in the issue of adding items for a biblio upon unpacking: automatically adding them to the database with pregenerated barcodes rather than manually manipulating the item editor each time.
14:14 _hdl_     You just type in the barcodes separated by a space.
14:14 thd       _hdl_: I did not know.  Is that how it is meant to work?
14:15 thd       good evening _hdl_ and paul.
14:15 paul      I leave too.
14:15 thd       _hdl_: will you be about tomorrow?
14:16 paul_away (I think he should)
14:16 _hdl_     yes
14:18 thd       hdl_away: I will  try to sleep earlier today and rise earlier tomorrow or stay awake forever.
18:59 kados     paul_away: you around?
19:47 russ      jo you about?
19:47 russ      oops wrong channel
22:07 russ      hi jo
22:07 russ      yep
22:07 pez       hello
22:07 pez       : )
22:07 russ      hi pez
22:18 audrey    yes
22:24 kados      _d1099 at C4/Biblio.pm line 158.
22:24 kados     ZOOM error 224 "ES: immediate execution failed" (addinfo: "update_record failed") from diag-set 'Bib-1'
23:04 chris     actually koha might handle 13 now, lemme check
23:05 chris     yep we handle up to 14
23:15 kados     er?
23:15 kados     chris: is that a joke?
23:16 chris     sorry was talking in hlt
23:16 chris     and then changed channels without noticing :)
23:17 chris     14 in the isbn field
23:17 chris     (13 char isbns are coming out soon)
23:17 chris     was just checking how big 2.2 would handle
23:21 kados     ahh
23:22 kados     so the thing is ... I'm getting this error when I try to import records
23:22 kados     this is line 158 of biblio.pm:
23:22 kados     warn "zebra_create : $biblionumber =".$record->as_formatted;
23:22 chris     didnt we fix that the other day?
23:22 kados     no
23:23 chris     im sure we were importing ok werent we?
23:23 kados     well ... it died on me
23:23 kados     and I assumed that was because it had finished
23:23 chris     ahh i imported 3000 ok .. maybe i better check again
23:23 kados     but I don't think that was why it died
23:24 kados     I can make a 135M MARC file available to you if you want it
23:24 kados     it's from a dynix system ... one of LibLime's new clients
23:24 chris     ooh
23:25 chris     yeah pop it up somewhere and drop me an email with a url
23:25 kados     k
23:25 chris     so when im home i can take a look
23:26 kados     k
23:28 kados     sent
23:28 chris     cool
23:29 chris     so whats the actual error its throwing?
23:30 chris     i think its scrolled out of my scroll buffer
23:37 kados     I think I found it
23:38 kados     this client sent me two marc files
23:38 kados     the first one had some errors in it
23:38 kados     and that's the one that is throwing errors for us
23:38 kados     I'll send you the link to the one that's been indexing for about a minute now
23:39 chris     sweet
23:39 chris     yeah we need nicer error handling for zebra ... that traps the error and reports back nicely
23:40 kados     yep
23:40 kados     ok ... same link as before
23:40 chris     cool
23:41 kados     I wonder if there's an equiv to select count(*) in zebra
23:41 chris     it might actually be MARC::File
23:41 chris     that dies
23:41 chris     when it tries to render the marc
23:41 chris     either way, nicer error reports are needed
23:41 kados     yep
23:41 chris     hmm bound to be something like that
23:42 kados     whatever it was, it required manual editing of the MARC
23:42 chris     you can do a search that will find everything, and then count the results
23:42 kados     wonder what search that would be
23:42 chris       my $rs = $Zconn->search($q);
23:42 chris             my $numresults=$rs->size();
23:43 kados     shoot
23:43 kados     it just borked again
23:43 chris     but yeah, i wonder what $q needs to be :)
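One guess at what that $q could be, assuming Zebra's special _ALLRECORDS index is available; the host and database name are placeholders:

    use ZOOM;

    # Count everything in the Zebra database by matching all records.
    my $Zconn = ZOOM::Connection->new('localhost:9999/biblios');  # hypothetical host/db
    my $q     = ZOOM::Query::PQF->new('@attr 1=_ALLRECORDS @attr 2=103 ""');
    my $rs    = $Zconn->search($q);
    print "records indexed: ", $rs->size(), "\n";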
23:43 chris     ok, it should report and skip
23:43 chris     i reckon
23:44 chris     report a bad record, and then continue on
23:44 chris     or say
23:44 chris     bad record, skip and continue y/n
23:44 kados     that makes sense
23:45 chris     fatal errors should only be things like zebra dying
23:45 kados     opac-detail is 500ing on me
23:45 kados     is there stuff you haven't committed?
23:46 chris     switch marc off
23:46 chris     in system pref
23:46 chris     and it should work
23:46 chris     marc display isnt finished yet
23:47 chris     tho why the opac tries to display marc im not sure :)
23:49 kados     it's actually set to 'normal'
23:49 chris     the marc on or off?
23:49 chris     can you set it to normal?
23:49 chris     i thought it was just on or off
23:50 kados     log has:
23:50 kados     [Wed Feb 15 20:08:07 2006] [error] [client 70.106.188.196] Undefined subroutine &C4::Biblio::MARCfind_MARCbibid_from_oldbiblionumber called at /home/koha/testing/koha/opac/cgi-bin/opac-detail.pl line 82., referer: http://opactest.liblime.com/cgi-bin/koha/opac-search.pl?op=do_search&type=opac&marclist=&and_or=and&excluding=&operator=contains&value=test
23:50 chris     ah yep, marc is still on
23:50 kados     preferences say:
23:50 kados     Define the default view of a biblio. Can be either normal, marc or isbd
23:50 chris     naw not that one
23:50 chris     that doesnt work :)
23:51 kados     gotcha
23:51 kados     hey, stuff's even displaying
23:51 chris     :)
23:51 kados     http://opactest.liblime.com/cgi-bin/koha/opac-detail.pl?bib=1570
23:51 chris     sweet
23:51 kados     nice job chris!
23:52 chris     yeah it should just drop in for 2.2.5 too ... paul did most of it
23:52 chris     havent got there yet
23:52 kados     right
23:52 chris     its fetching from koha tables at the mo
23:53 kados     there's still data in the koha tables?
23:55 chris     well there will need to be at least skeletons in items, with at least barcode and itemnumber
23:55 chris     there is nothing in the marc_word table though
23:55 chris     thats all in zebra
23:56 chris     currently SearchMarc.pm is a drop in replacement
23:57 chris     as Search.pm gets bigger more stuff disappears from the db
23:57 chris     make sense?
23:57 kados     yea makes sense
23:57 chris     so it will always be workingish .. just slowly working more and more from zebra
23:58 kados     I'm not clear whether my contractual obligations are fulfilled yet :-)
23:58 kados     well, after items is done
23:58 chris     once i have a nice "i have a barcode, give me all the info i need" routine in Search.pm, then 99% of the biblio info in koha will no longer be needed
23:59 kados     gotcha
23:59 chris     but if i take it out now, circ will break
23:59 kados     yep
23:59 chris     the good news is
00:00 chris     the stuff on the first search results page is fetched from zebra
00:00 chris     the detail.pl touches koha tables
00:01 chris     but i think ill leave that, get my search-test.pl going
00:01 chris     and then get my get_record() going
00:01 kados     sweet
00:01 chris     and then opac-detail.pl should be easy
00:02 chris     it seems to be clicking into place
00:02 kados     excellent ...
00:02 kados     so I can hack on the import problem
00:03 chris     cool
00:03 kados     any ideas for how to make it 'skip' a record?
00:03 kados     rather than just dying?
00:03 chris     wrap the bit that is dying
00:03 chris     in an eval { };
00:03 chris     then do an
00:03 kados     (it looks like it's dying with the ->as_formatted method)
00:03 chris     if ($@){
00:03 kados     k
00:03 chris     print out some error message
00:03 chris     }
00:03 chris     and it will keep on running
00:04 chris     but at least grizzle about the error
00:04 kados     k
00:04 chris     it might just die on the next bit :)
00:05 chris     i think its probably dying because $record doesnt exist
00:05 chris     so maybe before the ->as_formatted
00:05 chris     do an if ($record){
00:05 chris     all the stuff it does with record
00:05 chris     } else {
00:06 chris     print "record wasnt set .. error woop woop etc";
00:06 chris     }
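The skip-and-grizzle approach chris outlines, pulled together into one runnable shape as a sketch only (the real zebra_create code may differ):

    # Sketch: skip a bad record instead of dying, and grumble about it.
    my $formatted = eval {
        die "MARC::Record did not produce a record\n" unless $record;
        $record->as_formatted;
    };
    if ($@) {
        warn "skipping biblionumber $biblionumber: $@";
        next;   # assumes this sits inside the per-record import loop
    }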
00:06 kados     actually ... I should probably write to a file
00:06 chris     i think marc::record couldnt make a record, and then we try to use it, but it doesnt exist
00:06 kados     cause the output's going by way too fast to catch that
00:07 chris     yeah, or pipe to a file
00:07 chris     when you run
00:08 kados     piping's easier :-)
00:08 chris     :)
00:08 kados     ok ... running now
00:08 chris     ideally
00:09 chris     we should hand the error back
00:09 chris     so that the calling program can deal with it
00:09 chris     rather than printing it in biblio.pm
00:09 chris     then bulkmarcimport.pl could print it out with the count, so you know what marc record is duff
00:10 chris     and acquisitions could pass it to a template etc
00:10 kados     yep
00:10 kados     makes sense
00:11 chris     ok, gonna go sort out some more bugs from my bugzilla list, catchya later
00:11 kados     later
00:22 thd       kados: are you still there?
00:28 kados     thd: yep
00:29 thd       kados: I committed a new version of SearchMarc.pm to HEAD.
00:30 thd       kados: I have a separate rel_2_2 checkout and tried to commit a rel_2_2 version.
00:31 thd       kados: I aborted that commit when the log was showing me that I was about to commit a modification of every file in the koha branch.
00:32 thd       kados: the file dates of my rel_2_2 checkout seem right, so what could be wrong?
00:33 thd       kados: I had tested the rel_2_2 cvs commit with my local rsync archive of the Koha CVS tree.
00:34 kados     ?
00:34 kados     what did the new version contain?
00:35 thd       kados: Quick fix for functional bug in getMARCsubjects to avoid returning values that vary between different uses of the same authorised subject heading, causing linked subject searches from the detail view to fail.  Other presentation fixes within getMARCsubjects.
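The general shape of that kind of fix, as a sketch rather than the committed code, assuming $record is a MARC::Record: build each heading by joining the pieces once, so the last heading carries no trailing " -- " and numeric control subfields such as $2 are skipped.

    # Sketch only: join subject subdivisions with " -- " instead of appending
    # a separator on every pass, so nothing is left dangling after the last one.
    my @headings;
    for my $field ($record->field('6..')) {
        my @parts = map  { $_->[1] }
                    grep { $_->[0] =~ /[a-z]/ }   # letter subfields only; skips $2 etc.
                    $field->subfields;
        push @headings, join ' -- ', @parts;
    }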
00:38 audrey    kados: know much about MARC tag designations?
00:39 audrey    what is difference between 852 subfield h and 852 subfield k?
00:39 kados     audrey: thd would be a better one to answer that
00:39 audrey    ok. thd?
00:40 audrey    do you know what the finer differences are?
00:40 audrey    852 is location according to http://www.loc.gov/marc/bibliographic/ecbdhold.html#mrcb852
00:40 kados     chris: whenever you get back, it still died
00:40 kados     chris: even with the eval
00:41 audrey    thd: but what is difference between subfields h and k?
00:41 thd       audrey: $k is a prefix like JUV for juvenile; $h is the base call number without the cutter number or possible location  suffix.
00:42 audrey    and m is the location suffix or the cutter?
00:44 thd       $i is the cutter; $m is the uncommon suffix.
00:44 audrey    thanks! really helps my understanding.
00:45 audrey    thd: what does a suffix look like?
00:45 audrey    the $m one?
00:46 thd       audrey: http://www.loc.gov/marc/ has much documentation but little explanation.
00:46 audrey    i am seeing that
00:47 audrey    do you know a better, more explanatory website?
00:49 thd       audrey: 852
00:49 thd       ##$aDLC$bc-G & M$hG3820 1687$i.H62$mVault
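Reading thd's 852 example with MARC::Field, one subfield per component; the blank indicators and the comments are annotation, not from the chat:

    use MARC::Field;

    my $field = MARC::Field->new('852', ' ', ' ',
        a => 'DLC',         # location: the holding institution
        b => 'c-G & M',     # sublocation or collection
        h => 'G3820 1687',  # classification part (the base call number)
        i => '.H62',        # item part (the cutter)
        m => 'Vault',       # call number suffix, here a room
    );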
00:50 audrey    so the specific room or branch is described right in the call number?
00:50 audrey    not used by many libraries, eh?
00:50 thd       audrey: try a cataloguing textbook.  There is a good one by Chen.
00:50 audrey    right, but I am not near any cataloguing textbooks right now.
00:51 thd       audrey: $m is seldom ever used in my experience.
00:51 audrey    ok, so not to worry too much about it.  cool.
00:51 audrey    good.  now i know what it is, can explain it to others, and know not to stress about it.
00:52 audrey    thanks:)
00:52 thd       audrey: There is a good brief guide to MARC on the LC website but it does not have the detail for which you are looking.
00:52 kados     thd: following up on our discussion about possible writing opportunities for you
00:52 kados     thd: would you be interested in preparing 'fact sheets' for Koha?
00:53 thd       audrey: Years ago I did find the notes for a library science class online but I have no idea now.
00:54 audrey    thd: that's cool. may try to find other marc sites later.
00:54 thd       kados: what would a 'fact sheet' as distinct from a 'lies sheet' contain?
00:54 kados     thd: :-)
00:56 thd       kados: I had thought about an approach for writing for the first issue of the LibLime newsletter.
00:57 kados     thd: yea?
00:57 kados     thd: I'm all ears
00:58 thd       kados: I was late for my dentist appointment thinking about it
01:00 kados     thd: :-)
01:00 kados     thd: so ... what's your idea?
01:00 thd       kados: tentative title, "Take back control of your library systems"
01:02 thd       kados: The principles of free software freedom and their application to libraries, with special consideration of the issue that there are very few free software projects for large complete systems as opposed to tools for building them.
01:07 thd       kados: what do you mean by Koha 'fact sheets'?
01:09 kados     thd: well ... detailed descriptions of features maybe?
01:11 thd       kados: Why did "cvs -z3 -d thd@cvs.savannah.nongnu.org:/sources/koha commit $Log -r rel_2_2 koha" want to commit everything in my rel_2_2 checkout when I only modified one file?
01:11 kados     what you wanted to do
01:11 kados     was just do
01:11 kados     cvs commit filename
01:11 kados     you don't need to commit the whole tree
01:11 kados     just the file you changed
01:12 thd       kados: cvs commits only the modified files automatically.
01:14 thd       kados: that code worked fine to my local copy of the Koha CVS tree downloaded with rsync.
01:15 kados     rsync is your problem
01:15 kados     I bet
01:16 thd       kados: my CVS checkout used CVS directly from savannah I hope :)
01:17 kados     thd: looks like your changes killed my opac search :-)
01:17 kados     http://opactest.liblime.com/cgi-bin/koha/opac-search.pl
01:17 thd       kados: Well, I will check that again.
01:17 kados     wait
01:18 kados     yea, something weird happening
01:18 thd       kados: Head does not work on Koha 2.X.  You need my rel_2_2 file.
01:18 kados     [Wed Feb 15 21:34:52 2006] [error] [client 70.106.188.196] ZOOM error 10012 "CQL transformation error" (addinfo: "Illegal or unsupported index (addinfo=\xb0\x9e\xdf\blastic)") from diag-set 'ZOOM', referer: http://opactest.liblime.com/cgi-bin/koha/opac-search.pl
01:18 kados     ahh ... you changed 2.2
01:18 kados     so this problem is unrelated then
01:19 kados     i see
01:19 kados     it's a CQL parsing error
01:19 kados     can't handle multiple subject search terms yet
01:20 kados     well any multiple terms it seems
01:20 thd       kados: yes, I did not break anything that was not already broken.  I only fixed a few things that will give you a different appearance in every detail view.
01:21 kados     thd: could you be more specific?
01:22 thd       kados: I dealt with the problem you had with Sears in 650 $2.
01:23 kados     I thought we already had a fix for that?
01:23 thd       kados: Also, there was a line missing after the while loop that prevented the trailing " -- " from being removed from the last 6XX.
01:25 kados     thd: but searches were working fine now
01:25 thd       kados: And other improper extra " --  ".  I just realised that I may have not caught all possible cases, but certainly all that I have seen.
01:25 thd       kados: that fix was never committed.
01:27 thd       kados: the other issues were presentation problems that did not affect searching but gave every detail view a poor appearance, with a trailing " -- " after the end of the last subject heading.
01:27 kados     thd: I see
01:28 kados     OK ... I'll try those out on a working system sometime soon
01:36 thd       kados: that did not work, I killed my connection to be extra certain of not corrupting the whole Koha CVS tree.
01:37 thd       kados: how do I commit a single file or small group of files?  More importantly, what is the problem? rsync is not involved this time for certain.
01:41 thd       chris: not to take you from your important work, but why would CVS insist that I modified the whole rel_2_2 checkout when I only changed one file?
01:41 kados     to commit a file just type:
01:41 kados     cvs commit filename
01:41 kados     if you want to commit a small group of files inside a directory:
01:41 kados     cvs commit *
01:41 kados     from within the directory
01:41 kados     but don't do atomic commits
01:41 kados     like cvs commit koha
01:41 kados     :-)
01:42 thd       kados: How will CVS know where to put the file?
01:42 thd       kados: What are atomic commits?
01:42 kados     cvs commit koha
01:42 kados     is an atomic commit
01:42 kados     thd: cvs just knows
01:43 kados     thd: based on where it is in your repo
01:43 kados     thd: all that info is stored in the CVS directory
01:43 kados     thd: inside every repo directory
01:44 thd       kados: am I supposed to remove the unmodified files?  CVS is just supposed to be able to know the difference.
01:44 kados     no don't need to remove anything
01:45 kados     you can read some docs on how to use cvs on the savannah site
01:45 kados     if you have usage questions
01:47 thd       kados: I have read all the CVS docs and while I do not have extensive practise of usage I studied the docs from several CVS systems.  I am using just CVS and it is supposed to know what is modified and what is not.
01:49 thd       kados: I had failed to understand a point about log messages a couple of months ago before my first commit.  chris put me straight about automatic logging.
01:52 thd       kados: You are saying that I have to preserve the directory tree within which a single file (or maybe even more than one file with the same name in different locations) would be, are you not?
01:54 thd       kados: If that question bores you and you are still awake, who is meant to read the 'fact sheets'?
03:14 audrey    Chris, hi
03:18 chris     hi audrey
03:18 audrey    have a moment
03:18 audrey    ?
03:19 chris     sure
04:52 |hdl|     hi
04:53 thd       hello |hdl|
04:53 hdl       could you sleep a little ?
04:54 hdl       (non-sleeping work is no good.)
04:54 thd       hdl: I hope to sleep but I have been testing a problem committing to rel_2_2
04:55 hdl       Search.marc ?
04:55 thd       yes
04:55 hdl       (I saw you committed something.)
04:55 hdl       what kind of problem ?
04:57 thd       I have the same code modifying rel_2_2 but when I go to commit CVS presumes that I have modified every file in rel_2_2
04:58 thd       hdl: I have tested my CVS arguments with no problem using a local copy of the source tree.
04:58 hdl       maybe you have a recursive commit on your directory ?
04:59 hdl       This IS indeed a problem ?
04:59 chris     you can just commit one file thd
04:59 chris     cvs commit path/filename
04:59 thd       hdl: Yes, that is the default but CVS is supposed to know the difference between what I have modified and what is the same.
04:59 chris     eg cvs commit C4/SearchMarc.pm
05:00 chris     you can go cvs diff filename to see what cvs thinks is different too
05:01 thd       chris: would that actually be cvs commit koha/C4/SearchMarc.pm ?
05:01 chris     well im normally in koha/
05:02 chris     where koha is my cvs checkout
05:02 chris     so cd koha
05:02 chris     cvs commit C4/SearchMarc.pm
05:03 chris     you could go cvs diff someotherfile .. to see why cvs thinks it is different
05:04 thd       chris: I will try checking the diff return for all the files in rel_2_2 now :)
05:11 osmoze    hello
05:13 hdl       hi chris
05:13 hdl       Have you seen the logs ?
05:14 hdl       Trying to dive into acquisitions, one more time :)
05:15 chris     ahh, ill probably be doing some work on acquisitions for 3.0 in the next few weeks
05:15 chris     on the full acquisitions anyway
05:54 hdl       chris : what are you planning to do ?
06:03 osmoze    do you know if it's possible to convert unimarc to usmarc ?
06:04 thd       hdl: I assume chris intends to fix what is still broken.  I am curious about what aspects exactly chris knows to remain broken.
06:06 thd       osmoze: I would be very pleased if you could find the correct files for doing that.
06:07 thd       osmoze: http://www.bl.uk/services/bibliographic/usemarcon.html
06:08 thd       osmoze: what is your purpose for conversion exactly, particularly why do you want to convert into USMARC?
06:10 thd       osmoze: paul wrote a rough conversion for MARC 21 to UNIMARC.
06:11 osmoze    in fact, we have to send our "catalogue" to the departmental library, but we are in unimarc and they are in USMARC. My question: is it possible to send our "catalogue" ?
06:11 osmoze    (excuse my poor english ^^)
06:11 thd       osmoze: paul was not using USEMARCON.
06:13 thd       osmoze: Do you really want to convert into UNIMARC from USMARC?  USMARC -> UNIMARC ?
06:15 osmoze    no, it's the opposite
06:15 osmoze    UNIMARC --> USMARC
06:15 thd       osmoze: some libraries have the configuration files for USEMARCON.  If you find a library that has them, please let me know.
06:15 osmoze    k
06:16 osmoze    in fact it's to have interoperability between KOHA and Multilys
06:17 thd       osmoze: That is a big part of the holy grail.
06:17 osmoze    lol :)
06:18 thd       osmoze: contact some people at BNF or other large libraries about configuration files for USEMARCON to do that.
06:20 thd       osmoze: Someone has the files somewhere already.  You could make your own but that is a nontrivial task with the special syntax for USEMARCON.
06:21 hdl       osmoze : we could do that, but it would be two days' work.
06:21 osmoze    ok
06:22 osmoze    hdl : what do you mean exactly ?
06:22 thd       osmoze: There is or was originally a proprietary version of USEMARCON that had configuration files not contained in the free version.  Find someone who is willing to share them or contact the original company about what leads they may have.
06:24 thd       hdl: only two days?
06:24 osmoze    thd>  french are good :p
06:25 hdl       thd: I know his catalogue.
06:25 hdl       :)
06:26 thd       hdl: you mean two days for the values he has in his catalogue.
06:28 thd       hdl: I want a tool to convert all BNF etc. records as completely as possible into MARC 21 and the other way from LC etc.
07:14 |hdl|     thd: sorry : still working remotely on my computer.
07:14 |hdl|     you said you wanted a tool to convert BNF biblio to LOC.
07:14 |hdl|     This is not two days' work.
07:15 |hdl|     Their biblios are by far much more complicated.
07:16 |hdl|     And UNIMARC to MARC-21 is not a mathematical bijection.
07:17 |hdl|     So there are choices to be made. Generally it is done on a target-display basis.
07:18 |hdl|     We want this kind of information in this field.
07:18 |hdl|     When you have a well-defined kind of biblio, it is quite easy.
07:18 |hdl|     But with BNF it is truly a great deal.
07:18 |hdl|     thd: ???