IRC log for #koha, 2006-07-13

All times shown according to UTC.

Time S Nick Message
12:19 kados sheesh
12:19 kados phone ringing off the hook
12:19 kados owen: ok, I'm back :-)
12:20 kados so ... where shall we start?
12:21 owen Yesterday as far as we got was "needs to fit an 800x600 monitor"
12:22 kados right
12:22 kados so as far as the OPAC goes ...
12:22 kados the 800 x 600 thing
12:23 kados anything else?
12:23 kados the faceted searches we're going to skip I think, right?
12:23 owen Was that issue with the search form or the results?
12:23 kados the search form I think
12:23 owen Both look okay to me
12:23 owen Only the simple search is a little squeezed, but that's just the big form field.
12:24 kados I'll send you a screenshot
12:26 kados owen: check your inbox
12:28 owen I think yes to skipping the faceted searches, in the interests of time
12:30 owen Wow, kados, that's remarkably different than Firefox's rendering.
12:31 kados owen: that _is_ firefox :-)
12:31 kados owen: it looked the same way on the circ computer at the plains
12:31 kados owen: which I assume is win98 or 2000
12:32 owen Just have to resize the form field a little
12:33 kados so ...
12:33 kados as far as maintenance goes, are you using a fresh copy of the dev-week code?
12:33 kados cuz that's what's up on zoomopac
12:34 kados (do you need write access to more than just the template files?
12:34 kados (to upload changes?
12:34 owen I don't know
12:34 kados ok, well now you've got access to the koha-tmpl dir
12:35 kados let me know if you need more
12:40 kados http://zoomopac.liblime.com/cg[…]tail.pl?bib=49201
12:40 kados I suspect these are triple or even quadruple encoded
12:41 kados marc8->latin1->latin1->latin1 :)
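The "latin1 -> latin1 -> latin1" joke above describes repeated mis-decoding. A minimal illustrative sketch (Python rather than Koha's Perl, with made-up sample data) of how one bad round trip mangles text and how peeling the layers off in reverse repairs it:

```python
# Illustrative sketch, not Koha code: "double encoding" arises when correct
# UTF-8 bytes are mistakenly decoded as Latin-1 and re-encoded as UTF-8.
# Each extra round trip adds another layer of mojibake; repair reverses them.

def mangle(text: str) -> str:
    """Simulate one bad round trip: read UTF-8 bytes as if they were Latin-1."""
    return text.encode("utf-8").decode("latin-1")

def repair(text: str) -> str:
    """Peel off mangle() layers until none remain."""
    while True:
        try:
            peeled = text.encode("latin-1").decode("utf-8")
        except (UnicodeEncodeError, UnicodeDecodeError):
            return text  # no further Latin-1 layer to peel off
        if peeled == text:
            return text  # pure ASCII round-trips unchanged; stop
        text = peeled

original = "Bibliothèque"
broken = mangle(mangle(original))  # double encoded junk like "BibliothÃ\x83Â¨que"
print(repair(broken))              # Bibliothèque
```

Note the sketch can over-correct rare strings that are simultaneously valid Latin-1 and valid UTF-8, which is why a real migration script would want human review of the flagged records.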
12:52 owen Did you see the issues with "you did not specify any search criteria" with sorted search results?
12:53 kados yea
12:53 kados I think that's fixed in devweek
12:53 kados but if not I'll certainly have a look
12:54 kados (not that interesting to you :-)
12:58 kados thd: 1106 is fixed
12:59 thd kados: is the fix committed, or where can I test it?
12:59 kados thd: afognak has the fix
12:59 kados thd: and it's committed
12:59 thd goody
13:06 thd kados: works nicely now
13:06 kados thd: great!
13:06 thd kados: where is the working editor code if it is not in devel-week?
13:07 thd kados: only on the Liblime server?
13:08 kados thd: it is now working in dev-week and rel_2_2 afaik
13:08 kados thd: dev-week for sure
13:09 thd kados: you have updated devel-week since it was not working yesterday?
13:09 kados thd: yes
13:09 kados thd: dev-week is now completely in sync with rel_2_2 except without the MARC editor bugs :-)
13:10 kados thd: didn't you see the 40+ commits this morning? :-)
13:10 thd kados: I had to avoid overfilling an email quota with commits so I do not see them in real time.
13:11 kados ahh
13:13 thd kados: I would need to have much more time than I have to fix my local mail system again to solve that quota problem
13:14 thd kados:  what Koha SQL columns need linking to the bibliographic framework for 2.2.4?
13:15 kados ?
13:15 thd kados: did you link only 2 yesterday
13:15 thd ?
13:15 kados linking?
13:16 thd kados: items.onloan etc linked to 952 $X?
13:17 kados ahh
13:18 kados yes, that's the only one
13:18 kados thd: I could use your advice on a good mapping scheme
13:18 thd kados: did you not need another for date?
13:18 kados date is in issues, not in items
13:19 thd kados: I thought you added date because things were not working.  Not that they worked afterwards.
13:20 thd kados: what do you want to map in a good mapping scheme?
13:20 kados well ...
13:20 kados thd: items fields mainly
13:21 kados thd: we need to come up with a scheme for call number
13:21 kados thd: that may include adding a column to items (fine by me)
13:21 kados thd: all of the statuses need to be mapped
13:21 kados things like datelastborrowed would be useful too
13:22 kados datelastseen, etc.
13:22 kados the more we can store in zebra the better
13:22 kados because then we can search on it
13:22 thd kados: what are all of the statuses?
13:22 thd kados: how many are there?
13:22 kados notforloan, itemlost, wthdrawn, binding I think
13:23 kados thd: we also need to clean up the descriptive language of the item labels
13:23 kados thd: so that a librarian is not confused :-)
13:23 thd kados: those are already mapped except for binding
13:23 kados paidfor
13:23 kados another status
13:24 thd kados: the descriptions were written to not confuse me :)
13:24 kados :-)
13:24 kados yes, I understand that
13:25 thd kados: do you think that we might run out of punctuation symbols?
13:25 kados thd: also, barcode should not be a mandatory field
13:25 kados thd: but I agree we need a method to alert if a field hasn't been filled in but let the cataloger proceed
13:25 thd kados: I never set it to mandatory except by request
13:26 kados thd: it's set to mandatory in the framework I believe
13:26 thd kados: I think that had been your request at one time
13:28 kados perhaps :-)
13:30 thd kados: I have code for filling the various call number parts for use in labels etc.
13:30 kados in perl?
13:31 thd yes
13:32 thd kados: however, there is no code for what we had discussed previously for the case where NPL needs a couple of system preferences for treating fiction differently etc.
13:34 kados well ... not necessarily
13:34 kados it can be achieved with minimal coding
13:34 kados if we distinguish between classification and call number
13:34 kados and local call number
13:35 kados and define local call number as either classification or call number depending on whether it's fiction or non-fiction
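The rule kados sketches above can be written out concretely. This is a hypothetical illustration (the field names and `is_fiction` flag are invented, not actual Koha schema): the local call number falls back to the classification for non-fiction, but fiction files under the library's own call number.

```python
# Hypothetical sketch of the "local call number" rule described in the chat:
# non-fiction shelves by classification (e.g. a Dewey number), while fiction
# shelves by the library's own call number (e.g. "FIC " plus an author cutter).
# The dict keys and is_fiction flag are illustrative, not real Koha columns.

def local_call_number(item: dict) -> str:
    """Pick the shelving number depending on whether the item is fiction."""
    if item.get("is_fiction"):
        return item["call_number"]   # e.g. "FIC SMI"
    return item["classification"]    # e.g. "813.54"

novel = {"is_fiction": True, "call_number": "FIC SMI", "classification": "813.54"}
print(local_call_number(novel))  # FIC SMI
```

This keeps the distinction kados draws between classification, call number, and local call number as three separate values, with the last one derived.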
13:37 thd kados: to remind you we need a preference for call number type LCC, DDC, UDC, etc. and a preference for cutter type and rule and a preference for secondary classification defined by an exception list such as material type specifying fiction.
13:39 kados hmmm
13:39 thd kados: mere classification is unknown because there are many different classifications and MARC never has one and only one place to hide a given classification.
13:39 kados couldn't we just define it in the framework instead of a preference?
13:40 thd kados: define what in the framework?
13:41 kados thd: define the type of classification being used
13:41 thd kados: we could add a column to the framework to specify which locations are good for identifying particular parts of a call number.
13:42 kados thd: ie, the mappings would define it for us
13:43 thd kados: the type of classification that a library is using is up to the library and cannot be defined for all libraries
13:44 kados thd: what I mean is that each library will have to define that part of their framework
13:44 thd kados: the bibliographic framework should not be the place to define whether the library prefers DDC, UDC, LCC, etc., should it?
13:44 kados hmmm
13:45 kados so what is our objective?
13:45 thd kados: you could define the standard locations in advance but they are multiple
13:45 kados to search and display call numbers correctly right?
13:45 kados do we need to distinguish between call numbers and classification?
13:46 thd kados: the objective is to fill the call number by the library's preference
13:46 kados thd: fill it with values coming from the MARC?
13:46 kados thd: because with zebra, we don't need it to be in the koha tables to search or display it
13:46 thd kados: call numbers are numbers of which the classification element is only one part
13:47 kados thd: the only reason for it to be in the koha tables with zebra is if it is used by the system in managing the collection somehow (like circ rules)
13:47 kados (or inventory)
13:48 thd kados: it does not need to be in the SQL tables
13:50 thd kados: MARC has provided 852 $k $h $i $m
13:51 kados thd: http://wiki.koha.org/doku.php?[…]rchingdefinitions
13:51 thd kados: I provided all those in 952 in the framework
13:51 kados thd: this is a document I wrote for NPL to help them clearly define their cataloging practices
13:51 kados thd: I think it would also be useful in this case
13:52 kados thd: the 'organization of materials' section is most relevant
13:52 kados thd: I could use some expansion on what I've written to more accurately describe the components of collection organization
13:56 thd kados: it seems as if they are using 942 $c for the equivalent of 852 $k call number prefix
13:57 kados thd: that is my mistake
13:57 kados thd: it should read 952$c
13:58 kados fixed
14:01 thd kados: do they have the values in 952 $c on their spine labels?
14:04 kados yes
14:04 kados owen: right?
14:04 owen Sorry, what? What's 952 $c?
14:04 thd kados: then they are using that for 852 $k
14:06 thd owen: the MARC location defined in the framework used originally by NPL to store the contents of `items.location`
14:07 owen Our current production database doesn't store anything in items.location.
14:08 thd kados: I think that you had it correct previously it was 942 $c
14:09 thd owen: do the item type codes that you use appear on the spine labels?
14:10 owen Yes
14:10 thd kados: you were correct originally you should revert to the previous version in the wiki
14:11 kados right
14:11 owen biblioitems.itemtype
14:13 thd owen kados: NPL uses biblioitems.itemtype stored in 942 $c as the equivalent of 852 $k
14:13 kados right
14:13 thd kados: all our libraries do the same as far as I know
14:14 thd s/our/your/
14:15 kados yep
14:15 kados but 942 $c is not where the spine labels are printed from
14:15 kados afaik the spine labels are coming from a value mapped to callnumber
14:15 thd kados: so you should fill 852 $o in the current standard MARC 21 framework with the contents of 942 $c
14:16 kados but that's the itemtype
14:16 kados not the call number
14:16 thd kados: I thought that NPL was not yet using Koha for spine labels
14:16 kados they aren't
14:16 kados but they do generate spine labels
14:16 kados from fields in the MARc
14:17 thd kados: yes but that is the way in which NPL is using the itemtypes that it has defined
14:18 thd kados: 852 $k is usually something like JUV, REF, FIC, etc. at most libraries
14:19 thd kados: those are item types of some sort but not a true material type
14:20 kados right
14:20 kados is there a standard that defines them somewhere?
14:21 thd kados: no just common practise
14:21 thd kados: material types have multiple standards
14:21 kados which turns out not to be very consistent :-)
14:22 thd kados: amazing how far a little cultural conformism will go to address the same problem in the same manner
14:24 thd kados: oh you were saying it was not consistent.  Each library has to apply the basic intent to its own needs.
14:25 thd kados: yet English language usage provides FIC more usually as opposed to FICT, REF as opposed to REFE, etc.
14:26 kados :-)
14:26 thd kados: and then there is the need to economise the space on the small spine label.
14:27 kados right ... any suggestions on that?
14:28 thd kados: the 852 $h $i tend to be more standard than 852 $k and $m which depends on how the library wants to organise its collection
14:30 thd kados: spine labels should be 852 $k $h $i $m with the addition of c. $t where $t > 1.
14:32 thd kados: in the current MARC 21 framework that is 952 $o $h $i $t
14:34 thd kados: so 952 $o could be filled from the values contained in 942 $c for NPL.
14:37 thd kados: 952 $h would come from either 082 $a, 092, 055 $a, or 852 $h depending on what was filled.
14:38 thd kados: 055 and 852 should be tested for having the correct indicators or the correct number type when indicators were not set.
14:42 thd kados: 952 $i would come from either 082 $b, 092, 055 $b, or 852 $i depending on what was filled except that libraries may have their own preferred cuttering scheme and the supplied item part of the number may not be unique in the library's own collection.
14:43 thd kados: 952 $m is seldom used but would come from 852 $m only from previous records of the same library.
14:44 thd kados: 952 $t would come from 852 $t only from previous records of the same library.
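thd's fill rules above amount to a fallback chain per spine-label part. A rough sketch (flat dict keys like `"082a"` are an invented stand-in for real MARC field/subfield access, not Koha code): the classification part comes from the first of 082 $a, 092, 055 $a, or 852 $h that is filled, the item/cutter part likewise from the $b/$i locations, and a copy statement is appended when the copy number exceeds 1.

```python
# Sketch of the spine-label fill rules from the chat (852 $k $h $i $m plus
# "c. $t" when $t > 1). The flat keys like "082a" are hypothetical stand-ins
# for MARC field/subfield lookups; this is not actual Koha code.

def first_filled(record: dict, places: list) -> str:
    """Return the first non-empty value among the candidate locations."""
    for place in places:
        value = record.get(place)
        if value:
            return value
    return ""

def spine_label(record: dict) -> str:
    parts = [
        record.get("852k", ""),                                  # prefix, e.g. JUV
        first_filled(record, ["082a", "092", "055a", "852h"]),   # classification
        first_filled(record, ["082b", "092", "055b", "852i"]),   # item/cutter part
        record.get("852m", ""),                                  # call number suffix
    ]
    copy = record.get("852t", 1)
    if copy > 1:
        parts.append("c. %d" % copy)
    return " ".join(p for p in parts if p)

print(spine_label({"852k": "JUV", "082a": "813.54", "082b": "SMI", "852t": 2}))
# JUV 813.54 SMI c. 2
```

As thd notes afterward, a real implementation would also check the 055 and 852 indicators before trusting those values, which this sketch omits.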
14:45 kados hmmm
14:45 kados that's a lot to parse
14:46 kados it occurs to me to ask why we're trying to map 852 to 952
14:46 thd kados: I already have most of the code for parsing that neatly written for you in Perl
14:47 kados thd: but why do we need to parse it at all?
14:47 kados thd: with zebra?
14:48 thd kados: well one reason for continuing to use a Koha specific items field is that you have the data in the wrong locations in existing records
14:48 kados thd: rebuildnonmarc can fix that up
14:48 kados thd: given a proper framework
14:49 thd kados: another reason would be that you do not have enough subfields for everything that you need to track in one field unless you use capital letters.
14:50 kados capital letters--
14:50 thd kados: 852 $A $B etc.
14:50 kados capital letters--
14:54 kados thd: how many items do we need to track?
14:54 kados thd: have you considered writing a 'standard MARC holdings' framework?
14:54 thd kados: there are not enough lowercase letters and numbers and punctuation to hold all the predefined or reserved 852 subfields in addition to things that are not in 852 such as the things in 876-878
14:55 kados thd: I've been thinking it might be useful to separate the holdings definitions
14:55 kados thd: from the standard MARC framework
14:55 kados thd: to make it easier for customization  (ie, a smaller file)
14:56 thd kados: I did write a more standard MARC holdings framework in the recommendations comment within the standard MARC 21 framework
14:56 kados ahh
14:56 kados thd: perhaps I'll just pull that out then
14:56 kados thd: add the onloan stuff
14:56 kados thd: does it include call number goodness?
14:58 thd kados: It cannot be really standard unless you change all the code that needs one and only one MARC field defined for holdings.
14:59 thd kados: my recommendation is just like what I had specified for 952 except that there is a greater alignment between MARC 21 subfield names and what I specified.
15:00 kados thd: I think we could create a holdings framework
15:00 kados thd: and link holdings records to bib records
15:01 thd kados: oh you mean really standard
15:02 thd kados: do you mean with multiple fields for the holdings instead of everything in one field?
15:02 kados thd: well ... for 2.4 I mean just 852
15:02 kados thd: for 3.0 I mean a really standard one with multiple fields for holdings
15:04 thd kados: so 3.0 could work with the needed code changes but there is not enough room in 852 for everything needed unless you can use capital letters for the subfields
15:06 thd kados: would you change the data type as needed to allow capital letters for the subfields for 2.4. ?
15:07 kados thd: no
15:07 thd kados: otherwise you would need to remove standard subfields from MARC 21 to use 852
15:07 kados thd: too many components rely on aA equivalence
15:08 kados thd:  you mean that not all koha item fields map to standard 852?
15:08 thd kados: the best I have then is what I recommended for 852 in the framework comments with the addition of punctuation symbols for status etc.
15:10 thd kados: the problem is that they map to 852, 876-8 and a few others including one or two that are not part of standard MARC.
15:10 kados hmmm
15:11 thd kados: which is why it had filled all the letters and numbers when you looked
15:11 kados tumer is removing that limitation for 3.0
15:11 kados right
15:13 thd kados: 876 has some additional medium or long term status indications, where purchased, for how much, replacement price etc.
15:14 thd kados: those are all mapped to existing Koha items columns and are not in 852
15:17 thd kados: 852 has a very few things which I did not map to 952 such as the address of the library
15:23 thd kados: 952 as I defined it is basically 852 combined with 876-8 with a few minor additions and omissions.  Obviously, the subfield names cannot match perfectly so I wrote the verbose librarian text to keep everything straight.
15:23 thd by subfield names I mean $a $b etc.
15:25 kados thd: I don't think it matters if they are all mapped to 852
15:25 kados I'm not sure what is preventing us now from being completely compliant with MARC Holdings
15:25 kados other than a fully defined framework
15:27 thd kados: you have a fully defined framework but you do not have code allowing the use of more than one field for holdings.
15:27 kados thd: code meaning the editor?
15:27 kados thd: what code is needed?
15:29 thd kados: wherever there is code that falls down when biblioitems.itemnumber is linked to multiple fields.
15:29 thd s/biblioitems/items/
15:30 kados hmmm
15:32 thd kados: because standard MARC needs 852 and 876-8 not just 852.
15:33 thd kados: you need to be able to distinguish between 852 $c shelving location and 876 $c cost
15:35 kados I see
15:35 thd kados: Koha also needs to distinguish different types of cost, which MARC stores in yet other locations: acquisitions cost versus replacement cost
15:36 kados Koha does too
15:36 kados cost and replacementcost
15:36 kados ahh, but they aren't in 852
15:36 kados in standard MARC
15:36 thd kados: but Koha has them both in the items table
15:36 kados but we aren't limited to using the items table now
15:36 thd kados: nothing about acquisitions is in 852
15:37 kados I see no reason to map cost to items at all
15:37 thd kados: do you not want to know what your items cost?
15:37 kados thd: with zebra we can still see that data
15:38 kados thd: we are no longer limited to searching/seeing data in the koha tables
15:39 thd kados: yes exactly but if acquisitions cost is associated with an item and not only in SQL then you need it stored in MARC which means that you need more than one field.
15:39 thd for standard MARC 21.
15:40 kados thd: so I suppose we should create holdings based on more than one field
15:41 thd kados: well you were proposing to do that for 3.0 not 2.4 a few minutes ago.
15:43 kados thd: I didn't understand the issues fully
15:44 kados it's hard to decide what is best in this case
15:45 thd kados: do you have a good concept of how much work that would be to fix on top of encoding?
15:47 kados thd: no
15:47 thd kados: you had said that tumer was going to do that for 3.0 does tumer have an idea?
15:48 kados thd: maybe
15:48 kados :-)
16:00 thd kados: some problems are amazingly simple to solve if you look at them closely.
16:01 thd kados: other problems like encoding rely on external software libraries that you cannot have time to rewrite yourself if they do not work.
16:01 thd kados: obviously, many uses of Encode::decode are needed for encoding where that can work.
16:01 thd kados: Debian stable may be at a special disadvantage with the out of date MySQL libraries that do not understand UTF-8.
16:02 thd kados: it was issues like that which drove me to switch to Debian testing much to my later regret.
16:02 chris thats what backports are for :)
16:03 thd chris: is there a backport of MySQL for upgrading that library?
16:03 kados thd: yes
16:03 kados thd: backports.org
16:03 kados hey chris
16:03 chris morning
16:03 kados chris: I'm ready to do a 2.3.0 release
16:04 kados chris: and I'd like to re-tag dev_week as rel_2_4
16:04 kados chris: if that's possible and you have time to show me how :-)
16:04 chris its pretty easy
16:04 chris easiest way
16:04 dewey easiest way is to try
16:04 chris do a clean checkout of dev_week
16:05 thd kados: what is 2.3 as distinct from 2.4? Is it the unstable version of 2.4?
16:05 kados thd: yes
16:06 thd kados chris: at the time I switched to testing I could not get the updated applications I needed from backports. I should have been more patient.
16:07 chris then cvs tag -b branchname
16:07 kados can I then delete dev_week tag?
16:07 chris nope, you cant delete tags
16:07 kados ahh
16:08 chris you'll just be branching a new branch at that point
16:08 thd chris: what can be deleted in CVS?
16:08 chris nothing, thats pretty much the point of version control
16:08 chris doesnt hurt kados
16:08 kados k
16:08 chris you could just tag it
16:09 chris not branch .. but then youd still be working in the dev_week branch
16:09 chris thats what the script that creates the tarball does
16:09 kados hmmm
16:09 chris tags up the files when you make a release
16:09 chris so at any point you can check out those files using that tag
16:09 chris depends on what you want to do
16:10 kados well ...
16:10 kados what I think I want
16:10 kados is to have a branch that will become 2.4.0
16:10 kados I suppose I can just use dev_week for that
16:10 chris yep
16:10 kados poorly named, but fine
16:11 kados ok, I"ll just do that then
16:11 chris thats what id do, and use the script to tag the files as 2.3.0
16:11 kados no sense making it more complicated
16:11 kados ok
16:11 kados so when I run the script will it show up on savannah?
16:11 chris if you tell it to
16:11 kados cool
16:12 chris hmm did i commit my fixes to the script to dev_week?
16:12 chris lemme check
16:14 chris looks like its all set up for savannah yep
16:14 kados sweet
16:14 chris misc/buildrelease
16:14 kados I'll give it a shot then
16:16 kados chris: any specific way I should run it?
16:16 kados chris: and from a specific directory?
16:17 chris id run it from the base
16:17 chris in ./misc/buildrelease
16:17 chris s/in/ie/
16:17 chris sorry prolly need
16:18 chris perl misc/buildrelease
16:18 kados perl misc/buildrelease
16:18 kados yea
16:18 kados ok
16:18 chris it should ask you
16:18 chris it makes a good guess
16:18 chris so if you are in the base .. then the guess will be right
16:18 chris (making sure you are in the dev week checkout of course :-))
16:19 kados chris: should it be a fresh check out?
16:19 chris doesnt have to be
16:19 chris but is probably safest
16:19 kados k
16:20 chris hmm interesting synchronicity
16:20 chris http://www.bigballofwax.co.nz/[…]07/11/eclipse-ide
16:21 kados neat
16:21 kados nice new website chris :-)
16:21 chris id encourage you to take a look at eclipse sometime .. with the epic plugin
16:21 kados will do
16:22 chris the syntax highlighting, and completion ... plus cvs integration
16:22 chris make it pretty cool
16:22 kados cool
16:22 chris ie you type " and it does the matching "
16:22 chris {} etc
16:22 kados sweet
16:22 chris plus you type $
16:22 chris and you get a list of variables you have already used
16:23 chris so its really good writing new big chunks of code
16:23 kados Would you like to tag the CVS repository?
16:23 kados I'm guessing Y
16:23 chris yes
16:23 kados well here it comes :-)
16:23 chris kados: that blog is my first real play with ruby on rails
16:24 kados cool
16:24 kados yea, I browsed some books at B&N last week about ruby
16:24 kados ok, the tarball is in my dir now
16:24 kados is it also on savannah?
16:24 chris sweet
16:24 chris no
16:24 chris its not that cool :-)
16:24 kados :-)
16:24 chris you will have to go to savannah and do the release thingy
16:25 chris hmm lemme look
16:25 chris i havent done a savannah one before
16:25 chris but it should be similar to sourceforge
16:27 chris hmm maybe not
16:27 chris http://download.savannah.nongn[…]rg/releases/koha/
16:27 chris i think you just have to stick it in here, somehow
16:27 kados their searches suck
16:28 kados ahh
16:28 chris then write some release notes for the news bit
16:28 chris https://savannah.nongnu.org/projects/koha/
16:28 chris for on that page
16:28 kados yea
16:28 kados I've done news before
16:28 kados that's pretty simple
16:30 chris https://savannah.gnu.org/faq/?[…]o_I_add_files.txt
16:30 chris so we need a gpg key
16:30 kados k, I can handle that
16:37 chris :)
17:27 chris is half an hour up yet? :-)
17:31 kados yea, it's uploaded
17:31 kados and I did a news item on it
17:31 chris excellent
17:31 kados http://download.savannah.nongn[…]rg/releases/koha/
17:31 russ news item ? where?
17:32 chris on savannah
17:32 kados russ: probably not newsworthy in a general sense
17:32 chris http://koha.org/download/  <-- should we write something here
17:32 chris to make sure ppl know that 2.3.0 is unstable
17:32 kados definitely
17:32 kados but I plan on doing 2.3.1 in a day or two
17:32 kados so I'm not sure just how much news we want to generate :-)
17:32 chris otherwise they will click on the 2.2.5 link
17:33 kados yea ... 2.2.5 is still good I think
17:33 chris and see 2.3.0 sitting there
17:33 chris and grab it cos its newer
17:33 kados can't we just link directly to the download ?
17:33 russ two minutes
17:33 kados ie, not the file list?
17:33 chris probably the best bet
17:34 kados too bad you can't put text in the list
17:34 kados names and such like on sourceforge
17:34 chris yep
17:35 chris but then you also have to click 54 times to actually download the file
17:35 kados yea :-)
17:35 chris i think linking to the file and having good text around the link is the way to go
17:35 kados yea I agree
17:35 russ kados - can we move those wiki notes into the en section?
17:36 kados russ: sure
17:36 chris ill have a go with 2.3 after work, on my koha at home
17:36 kados russ: be my guest :-)
17:36 kados chris: I haven't touched the installation scripts yet
17:36 kados chris: but I can send you a tarball of my migration stuff
17:36 chris but if i follow your docs on the wiki it should work eh?
17:36 kados yea
17:36 chris ill have a go by hand first
17:36 kados cool
17:37 chris so i have an understanding, then ill try out the automatic way
17:37 kados yep
17:37 kados it's no walk in the park :-)
17:37 chris :)
17:37 kados and there are some hard-coded things still
17:37 kados that need to be moved to the config file
17:37 chris yep i figure ppl will be asking questions sooner or later
17:37 kados yea
17:40 kados nothing like a release to make you feel like you've accomplished something in a day :-)
17:40 kados even if I know it's not a stable release :-)
17:41 chris :-)
17:53 thd kados: have you solved the problem of migrating a rel_2_2 installation with UTF-8 data?
17:53 kados thd: not yet
17:53 kados thd: it's more a 'how to migrate from mysql 4.0 to 4.1' problem
17:53 kados thd: for NPL it's hopeless
17:54 kados thd: because they had marc-8 data that was interpreted as latin1 data and I think it's been completely mangled in some cases :-)
17:54 thd kados: so you did not worry about the issue for NPL because NPL had relatively little MARC 8 data?
17:54 kados right
17:55 thd kados: If you had a script that searched for encoding problems you could concentrate on upgrading those records.
17:57 kados thd: I've got one
17:57 kados thd: it just needs to be updated a bit to deal with these specific probs
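kados's script is not shown in the log, so here is a rough sketch (hypothetical, not his actual code, and with invented sample data) of one way such a scan can work: a field is suspicious if re-encoding it as Latin-1 and decoding the result as UTF-8 succeeds and yields a shorter string, which is the signature of a value that went through one encoding layer too many.

```python
# Hypothetical sketch of a scan for double-encoded records, not the actual
# script discussed in the chat. A value is flagged when peeling one Latin-1
# layer off yields valid, shorter UTF-8 text, i.e. the classic mojibake case.

def looks_double_encoded(text: str) -> bool:
    try:
        peeled = text.encode("latin-1").decode("utf-8")
    except (UnicodeEncodeError, UnicodeDecodeError):
        return False          # no extra Latin-1 layer present
    return len(peeled) < len(text)  # shrinking means a layer was removed

# Invented sample data keyed by biblio number:
records = {49201: "BibliothÃ¨que", 49202: "Library"}
suspects = [bib for bib, title in records.items() if looks_double_encoded(title)]
print(suspects)  # [49201]
```

Concentrating repair work on the flagged records, as thd suggests, keeps the migration tractable when only part of the data is mangled.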
17:58 thd kados: really, would it also work for Afognak double encoding, and other similar problems that you have seen?
17:58 kados yes, given enough time to troubleshoot
18:00 thd kados: I think tumer has many records that he needs to fix before his records, including the problem ones, become canonical for Turkey
18:01 thd kados: nothing like a new national authority file complete with encoding problems :)
18:02 kados :-)
18:03 kados that wouldn't bode well for Koha :-)
18:04 thd kados: no, having the national library of Turkey using Koha should be all to the good as long as the records can be shared without problems.
18:06 russ philip you there?
18:06 thd russ: I do not see philip logged in?
18:07 russ oops wrong channel
18:07 russ :-)
18:16 thd kados: looking at http://wiki.koha.org/doku.php?[…]tion_of_materials.  Does NPL not have full call numbers in 952 $k?
18:16 kados they do
18:16 thd kados: also, I have noticed that you have been using the old WIKI namespace
18:17 thd kados: the new namespace from devweek has en for English
18:18 kados yea, russ is gonna fix that for me :-)
18:18 thd kados: That way Pascal or whomever can make a French translation under /fr/ instead of /en/
18:18 kados yep
18:19 thd kados: russ had been cleaning the whole thing so that you would not have made the mistake in the first place.
18:19 kados :-)
18:20 thd kados: pierrick had also been working on that but he is no longer participating :(
18:22 thd kados: if using multiple holdings fields is discovered to be too much work for 2.4 do you know of any problems using punctuation for subfields all in one items field?
18:25 kados thd: looking at the framework right now
18:25 kados thd: got some questions
18:25 kados INSERT INTO `marc_subfield_structure` VALUES ('952', '2', 'Source of classification or shelving scheme (similar to 852 $2)', 'Source of classification or shelving scheme', 0, 0, '', 10, '', '', '', NULL, 0, '', '', '');
18:25 kados is that really necessary?
18:26 kados some of these seem like they won't ever be used
18:26 thd kados: it is necessary for designating the call number for fiction in the case of NPL.
18:26 kados how so?
18:27 kados (btw: items.wthdrawn not items.withdrawn ... a minor misspelling in the framework due to a poorly named column)
18:28 thd kados: indicators are used to specify if the call number is LCC, DDC, NLM, or SUDOC, not that Koha is providing access to the 952 indicator currently.
18:29 thd kados: anything that uses some other classification is supposed to specify the classification in $2
18:29 kados specify what about it?
18:29 kados the name of it?
18:29 thd kados: the classification code and version number
18:30 thd kados: kados has not seen nearly enough correctly encoded records
18:30 kados :-)
18:31 thd kados: NPL's private scheme may not have a standard classification code but they can pretend if there is not one for generic local scheme
18:31 kados they aren't really up to doing that :-)
18:32 kados and it doesn't really serve any purpose
18:32 thd kados: no of course not, but you can have the system do it for them
18:32 thd kados: it allows classification searches to use information in the records to do an intelligent search
18:33 kados ok so I'll call it NPL v1
18:33 kados :-)
18:33 kados thd: do you know NPL's MARC ORG code?
18:34 thd kados: no but classification codes in MARC 21 are all lower case
18:34 kados thd: it's 'ONe' :-)
18:34 thd kados: do you not have the code list bookmarked
18:34 kados thd: 'ONe' :-)
18:34 kados thd: that's it ... pretty cool, eh?
18:35 thd kados: does LC do vanity classification codes for a special fee? :)
18:35 thd s/classification/organisation/
18:39 thd kados: the only thing that I remember having put in 952 that is not most likely to be useful is article '952', 's', 'Copyright article-fee code (similar to 018 $a, 852 $s)'
18:40 thd kados: although WIPO might care very much about that
18:41 thd kados: do you think they observe the rules they establish for everyone internally?
18:41 kados thd: so ... question
18:41 thd yes
18:42 kados if we created a full MARC authorities file
18:42 kados with MARC holdings
18:42 kados 852, etc.
18:42 kados how many fields would need to contain itemnumber?
18:43 kados and could we accomplish the same thing with a link?
18:43 kados ie, link the itemnumber field to a field elsewhere in the record
18:43 thd kados: changing the code to use a link would be more MARC like but may not be the most efficient
18:44 kados with zebra I don't think it matters
18:44 kados in terms of searching we have no problem
18:44 kados of course, display is not a problem either
18:44 thd kados: do you mean full MARC holdings file not full MARC authorities file
18:44 thd ?
18:45 kados correct
18:46 thd kados: quite a few fields
18:50 thd kados: some are more important than others
19:04 thd kados: perhaps about 25 fields
19:04 thd kados ...
19:04 thd 541 - IMMEDIATE SOURCE OF ACQUISITION NOTE (R)
19:04 thd 561 - OWNERSHIP AND CUSTODIAL HISTORY (R)
19:04 thd 562 - COPY AND VERSION IDENTIFICATION NOTE (R)
19:04 thd 563 - BINDING INFORMATION (R)
19:04 thd 583 - ACTION NOTE (R)
19:04 thd 841 - HOLDINGS CODED DATA VALUES (NR)
19:04 thd 842 - TEXTUAL PHYSICAL FORM DESIGNATOR (NR)
19:04 thd 843 - REPRODUCTION NOTE (R)
19:04 thd 844 - NAME OF UNIT (NR)
19:04 thd 845 - TERMS GOVERNING USE AND REPRODUCTION NOTE (R)
19:04 thd 850 - HOLDING INSTITUTION (R)
19:04 thd 852 - LOCATION (R)
19:04 thd 853 - CAPTIONS AND PATTERN--BASIC BIBLIOGRAPHIC UNIT (R)
19:05 thd 854 - CAPTIONS AND PATTERN--SUPPLEMENTARY MATERIAL (R)
19:05 thd 855 - CAPTIONS AND PATTERN--INDEXES (R)
19:05 thd 856 - ELECTRONIC LOCATION AND ACCESS (R)
19:05 thd 863 - ENUMERATION AND CHRONOLOGY--BASIC BIBLIOGRAPHIC UNIT (R)
19:05 thd 864 - ENUMERATION AND CHRONOLOGY--SUPPLEMENTARY MATERIAL (R)
19:05 thd 865 - ENUMERATION AND CHRONOLOGY--INDEXES (R)
19:05 thd 866 - TEXTUAL HOLDINGS--BASIC BIBLIOGRAPHIC UNIT (R)
19:05 thd 867 - TEXTUAL HOLDINGS--SUPPLEMENTARY MATERIAL (R)
19:05 thd 868 - TEXTUAL HOLDINGS--INDEXES (R)
19:05 thd 876 - ITEM INFORMATION--BASIC BIBLIOGRAPHIC UNIT (R)
19:05 thd 877 - ITEM INFORMATION--SUPPLEMENTARY MATERIAL (R)
19:05 thd 878 - ITEM INFORMATION--INDEXES (R)
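The field list thd types out above can be captured as a small lookup table. A minimal Python sketch (tag names taken verbatim from the list; the "scary" and per-workflow groupings are inferred from thd's remarks elsewhere in this log, not from any Koha code):

```python
# MARC 21 holdings fields from thd's list, keyed by tag.
HOLDINGS_FIELDS = {
    "541": "IMMEDIATE SOURCE OF ACQUISITION NOTE",
    "561": "OWNERSHIP AND CUSTODIAL HISTORY",
    "562": "COPY AND VERSION IDENTIFICATION NOTE",
    "563": "BINDING INFORMATION",
    "583": "ACTION NOTE",
    "841": "HOLDINGS CODED DATA VALUES",
    "842": "TEXTUAL PHYSICAL FORM DESIGNATOR",
    "843": "REPRODUCTION NOTE",
    "844": "NAME OF UNIT",
    "845": "TERMS GOVERNING USE AND REPRODUCTION NOTE",
    "850": "HOLDING INSTITUTION",
    "852": "LOCATION",
    "853": "CAPTIONS AND PATTERN--BASIC BIBLIOGRAPHIC UNIT",
    "854": "CAPTIONS AND PATTERN--SUPPLEMENTARY MATERIAL",
    "855": "CAPTIONS AND PATTERN--INDEXES",
    "856": "ELECTRONIC LOCATION AND ACCESS",
    "863": "ENUMERATION AND CHRONOLOGY--BASIC BIBLIOGRAPHIC UNIT",
    "864": "ENUMERATION AND CHRONOLOGY--SUPPLEMENTARY MATERIAL",
    "865": "ENUMERATION AND CHRONOLOGY--INDEXES",
    "866": "TEXTUAL HOLDINGS--BASIC BIBLIOGRAPHIC UNIT",
    "867": "TEXTUAL HOLDINGS--SUPPLEMENTARY MATERIAL",
    "868": "TEXTUAL HOLDINGS--INDEXES",
    "876": "ITEM INFORMATION--BASIC BIBLIOGRAPHIC UNIT",
    "877": "ITEM INFORMATION--SUPPLEMENTARY MATERIAL",
    "878": "ITEM INFORMATION--INDEXES",
}

# The fields thd later calls "scary" (machine-parsed caption/enumeration data).
PATTERN_FIELDS = {"853", "854", "855", "863", "864", "865", "866", "867", "868"}

# Minimal per-workflow subsets, per thd's later remarks in the log.
ACQUISITIONS_FIELDS = {"541", "583", "876"}
CATALOGUING_FIELDS = {"852", "856", "863", "864", "865", "866", "867", "868"}
```

The count matches thd's "perhaps about 25 fields" estimate.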
19:06 thd kados: do you think that would be enough?
19:07 kados enough for what?
19:08 thd linking to items.itemnumber ?
19:09 kados hmmm
19:09 kados we need to link all of those?
19:13 thd kados: the only scary ones are 853-855 and 863-868
19:14 thd kados: 866-868 are not scary for humans, they are merely scary for computers.
19:34 thd kados: you would not need to link specifically to 841 but the shiny forest in SQL would be much more efficient to parse for the scary cases.
19:36 thd kados: I can present a simplified case for the only users that you can obtain now in any case
19:39 thd kados: acquisitions would need 541, 583, and 876 that is just 3 fields with 877-8 for very special cases
19:43 thd kados: cataloguing would need 852 or 856, and sometimes 863-868
19:44 thd tumer: is that really you?
19:44 tumer yep awake and zebraing
19:45 thd tumer: kados had said that you were intending to eliminate the dependence that Koha had on using one and only one MARC field for holdings.
19:45 tumer i already  did
19:45 thd for 3.0
19:46 tumer yep
19:46 thd tumer: really?
19:46 dewey really are quite different
19:46 thd tumer: and it works?
19:46 tumer yes i have separate marc record for holdings now
19:47 tumer thd: which field is best to hold the biblionumber in holdings record?
19:47 thd tumer: and do these separate holdings records use more than one of the holdings fields?
19:48 tumer they have their own 001 005 007 852- to 999 all
19:48 tumer a complete marc record
19:49 tumer any field to put anything in
19:49 tumer no restrictions
19:49 tumer well within reason
19:50 tumer if kados spares same space i will commit the whole lot to the cvs
19:50 thd tumer: well if you are doing something non-standard for Koha I like some field with a letter in it like 90k but I realise that may require some code change
19:51 tumer i used to, now i stick with standard LC numbers
19:52 tumer with separate holdings, biblios and authorities records you do not require fields with letters; there is an abundance of fields
19:52 thd tumer: well the standard place for the link number is 004 - CONTROL NUMBER FOR RELATED BIBLIOGRAPHIC RECORD
19:53 tumer great thanks
19:53 thd tumer: that would be the place for storing the 001 from the bibliographic record
19:53 tumer i hope to get a full koha intranet working on this model by weekend
19:54 thd tumer: are you no longer using SQL code for tracking the items?
19:54 tumer i already have biblionumber in 001 in biblios itemnumber 001 in holdings
19:55 tumer there is nothing in sql any more only blobs of marc records
19:55 tumer except issues and borrowers and subscriptions of course
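The linking scheme tumer and thd settle on can be sketched in a few lines: the bibliographic record's 001 carries the biblionumber, each holdings record's 001 carries the itemnumber, and 004 (CONTROL NUMBER FOR RELATED BIBLIOGRAPHIC RECORD) carries the link back. This Python sketch uses plain dicts as a stand-in for the real MARC blobs:

```python
# Sketch of the record-linking model described above. Assumption: records
# are shown as simple tag -> value dicts, not real MARC; only the control
# fields relevant to the link are modelled.
def make_holdings_record(itemnumber, biblio):
    """Build a minimal holdings record linked to a bibliographic record.

    001 = the holdings record's own control number (the itemnumber);
    004 = the biblio's 001, the standard place for the link per thd.
    """
    return {"001": str(itemnumber), "004": biblio["001"]}

biblio = {"001": "49201", "245": "Some title"}
copy1 = make_holdings_record(100001, biblio)
copy2 = make_holdings_record(100002, biblio)  # second copy = its own record

assert copy1["004"] == copy2["004"] == biblio["001"]
```

Note how tumer's design falls out naturally: a second copy is simply a second holdings record with its own 001, both pointing at the same biblio via 004.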
19:55 thd tumer: do you mean you are creating a separate holdings record for each item even if you have duplicate items for the same biblio?
19:57 tumer yes because even a second copy is a different entity which has copy 2 in it
19:57 thd tumer: do you have multiple copies of the same biblio in your library?
19:58 tumer yes but all call numbers end with c1 c2 etc so they are different and unique
19:58 thd tumer: you do not attempt to make use of the repeatable fields so that you have for example 852 $t1 and 852 $t2 for the first and second copies?
19:59 tumer its easier to manage them this way
19:59 thd tumer: yes that is very evident
19:59 tumer easier on indexing, releases the strain on zebra
20:00 thd tumer: and there I was thinking about how to do it the most difficult way
20:00 thd kados: are you there?
20:04 thd tumer: that is brilliant to have done it the way that is both easiest and most efficient
20:04 tumer well thanks i dont know whether correct but thats how i designed it
20:05 thd tumer: that is perfectly in conformance with the standard which is the only important obstacle
20:06 thd tumer: I would like to persuade you to proceed with authorities in a standards compliant manner
20:07 tumer thd: i have to redesign authorities, the way they are now is not working
20:07 thd tumer: It may be a little painful in the short run but you will be much happier if they interoperate well with other systems
20:08 thd tumer: what about them is not working?
20:08 tumer thats my intention
20:08 thd tumer: specifically is not working?
20:08 thd tumer: what specifically is not working about authorities?
20:09 tumer all linking of authorities etc i have to study it a bit more
20:09 tumer now we have thd producing some things, i did some others, all not complementing each other
20:10 tumer the editor does not work with them either
20:10 thd tumer: the editor code was never fixed for authorities
20:11 tumer i know
20:11 thd tumer: the authorities editor code needs to be copied from the bibliographic editor code
20:12 tumer yes but try to add an author's name to a bibliographic record from authorities. Its now a mess
20:12 thd tumer: do you mean with encoding?
20:12 tumer no
20:13 tumer you try filling 700 it fills 690
20:13 tumer all the java script messing it all up
20:13 thd tumer: ooooh that is bad
20:13 tumer i reported a bug but in vain
20:15 thd tumer: unfortunately, the only Koha libraries actually using authorities merely use the script designed to  fill existing values in the bibliographic records into the database.
20:16 tumer its not obvious until you use subfield cloning -- the bug i mean
20:17 tumer whats the use of creating authorities that you do not comply with?
20:17 thd tumer: those libraries are all using UNIMARC
20:17 thd tumer: they comply if they comply by using already compliant records.
20:19 tumer well its not good for me
20:19 tumer we are creating 90% of our records
20:19 thd tumer: subdivided subjects will seem to need repeated $9s for most free floating subdivision $z $x $y $v
20:20 tumer thd i am so far away from the subject, also they have to be freely ordered
20:21 thd tumer: I suspect that having repeated $9 in the same way that BnF has repeated $3 for subject subdivisions is the only way to have a manageable authority file.
20:22 tumer thd:ok
20:22 thd tumer: repeatable $9 should allow for freely ordering the subfields
20:23 thd tumer: your library is  translating LC authority files, is it not?
20:23 tumer but i dont even know whether we actually fill $x $y $z from different authorities or one record that has all fields in it
20:24 tumer currently its hybrid. LC subject headings being translated but no actual records
20:24 thd tumer: decades ago LC stopped creating authority records with all the subdivisions in the authorised form
20:24 thd for subject
20:25 tumer BnF do dont they?
20:26 thd tumer: LC subject authority records only comprise about 13% of subject authority strings
20:26 thd 13% of subject bibliographic strings I mean
20:27 thd tumer: BnF bends the UNIMARC rule that specifies $3 as not repeatable for subject subdivisions
20:27 tumer thats a relief, I thought i was going to have them all like that in millions
20:29 thd tumer: subject authority files would be millions of authorised forms if they had to be complete subject strings with all the subdivisions in one record
20:30 thd tumer: LC only has circa 125,000 authorised subject headings which can be used to build all the others.
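The point thd is making is that a full subject string is assembled on the fly from an authorised base heading plus free-floating subdivisions ($x general, $y chronological, $z geographic, $v form), which is why roughly 125,000 authority records suffice. A minimal Python sketch (the heading reused from thd's 750 example later in the log; the "--" join is LC display convention):

```python
# Sketch: assemble a subdivided LC-style subject string from an
# authorised heading plus free-floating subdivisions. The subdivision
# codes here are purely informational in this toy model.
SUBDIVISION_PUNCT = "--"

def build_subject_string(heading, subdivisions):
    """subdivisions: list of (code, value) pairs in display order."""
    parts = [heading] + [value for _code, value in subdivisions]
    return SUBDIVISION_PUNCT.join(parts)

s = build_subject_string("Summer resorts",
                         [("z", "United States"), ("x", "History")])
assert s == "Summer resorts--United States--History"
```

Only "Summer resorts" needs an authority record; the subdivided string never does, which keeps the file far smaller than the millions of complete strings thd mentions.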
20:31 tumer i am in more clear now
20:31 chris is there authorities for other languages thd?
20:31 thd yes it is very easy
20:32 thd chris: although tumer says there are none yet for Turkish so he must create them
20:32 chris right
20:32 chris im betting there are none for maori too
20:33 chris which would be needed if the total immersion schools were to use them
20:34 chris maybe thats something the future foundation could do, help hold/build authority records for other languages
20:34 thd tumer chris: preserve the original LC record you may be translating into Turkish or Maori by giving the original heading in a 7XX linking field in the authority record
20:35 chris ahh good idea
20:36 tumer i give it in 750
20:36 tumer i even linked them so searching in one language brings the other as well
20:37 tumer but thats not standart i know
20:38 thd tumer: that is standard
20:39 thd chris: national libraries usually do that or some really large university library
20:39 thd chris: although most countries lack authority records
20:41 thd chris: authority records are found in Latin America even if under used but they are largely absent even in some relatively rich countries like Italy
20:42 thd tumer: you have done it precisely correctly as long as you are not still using $6 to link outside the record
20:43 thd $6 is for linking fields within the same record
20:44 tumer which subfield should it be 8?
20:44 thd tumer: $8 is also for linking within the record
20:44 tumer or a specific field?
20:45 tumer as i say i know zilch about authorities
20:46 thd tumer: $0 is used for linking from the authority record to the original record although you should also preserve the original record control number in 035
20:47 thd tumer: an example is 750 #0$aSummer resorts$0(DLC)sh#85130430#$wna
20:48 thd tumer: http://www.loc.gov/marc/author[…]link.html#mrca750
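thd's 750 example, `750 #0$aSummer resorts$0(DLC)sh#85130430#$wna`, uses '$' for the subfield delimiter and '#' for blanks, per LC documentation convention. A small Python sketch of pulling the subfields apart (using '$' as a stand-in for the real MARC delimiter byte 0x1F; the example string is copied from the log as-is):

```python
# Sketch: split an LC-documentation-style subfield string into
# (code, value) pairs. '$' stands in for the MARC subfield delimiter.
def parse_subfields(body):
    pairs = []
    for chunk in body.split("$")[1:]:  # text before the first '$' is skipped
        pairs.append((chunk[0], chunk[1:]))
    return pairs

parsed = dict(parse_subfields("$aSummer resorts$0(DLC)sh#85130430#$wna"))
assert parsed["a"] == "Summer resorts"
assert parsed["0"].startswith("(DLC)")  # $0: control number of the linked record
assert parsed["w"] == "na"
```

The $0 subfield is the piece thd is recommending: it carries the linked record's control number, here prefixed with the MARC organization code (DLC) for the Library of Congress.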
20:49 tumer i will look into it
20:49 tumer i have to check this zebra which is refusing to start
20:50 thd tumer: be careful not to be kicked from behind by your zebra
21:41 kados Burgundavia: still waiting for that email :-)
21:41 kados Burgundavia: I'll hold you to your purpose :-)
21:41 Burgundavia kados: it is stuck in the loop of no time
21:42 kados :-)
21:42 Burgundavia I am about to run out to the local LUG meeting
21:42 kados I hear you
21:42 kados cool
21:42 Burgundavia it is mostly written, just missing the mockup
21:42 kados excellent
21:42 kados can't wait to read it
21:42 kados I'll be working pretty much all week on 2.3.x
21:43 Burgundavia I just saw that annoucement
21:43 kados so now would be a really great time to have the ideas/mockup
21:43 Burgundavia what is policy on posting stuff to koha.org?
21:43 kados posting what?
21:43 kados announcements?
21:43 Burgundavia yep
21:43 Burgundavia the new development announcement
21:44 kados do you mean to savannah? or koha-devel? or on the website koha.org?
21:44 kados well I'm the release manager
21:44 kados so I can basically do anything :-)
21:44 kados muhahaha
21:44 kados if you have something you need announced you can generally send it to russ or someone
21:44 Burgundavia the website, in the little news box which has nothing in it
21:44 Burgundavia nothing new, that is
21:44 kados right
21:45 chris yep, ppl have to tell us news, then we'll put it there
21:45 kados well basically if someone has time to write something they submit it to russ
21:45 kados or someone
21:45 kados the project is really loosely defined
21:45 kados there are not that many core developers
21:45 kados so we basically know what's going on :-)
21:46 Burgundavia indeed
21:46 kados and news and stuff is loose as well
21:46 kados plus, koha.org is really just the project page ...
21:46 chris pretty much anything thats interesting or new, and someone tells us about can go up
21:46 Burgundavia the start of a major new development cycle is pretty good news, I think
21:46 kados the liblime team has enough trouble keeping our marketing website up to date :-)
21:46 Burgundavia gets people all excited, reminds them the project is not dead
21:46 kados yep
21:47 kados but it's really the tail end of the dev cycle
21:47 chris yep
21:47 kados we've been working on this release for about two years now :-)
21:47 Burgundavia 2.3?
21:47 kados yea
21:47 Burgundavia see, that is not how I (as an outsider) interpreted your email
21:47 kados right
21:47 chris work leading up to 2.4 and 3.0 has been going on since 2.2.0 was released
21:48 kados yea
21:49 chris when 2.4 comes out there will be tons of fanfare on koha.org .. and on the main koha-devel list
21:49 Burgundavia yep
21:49 kados so 2.3.0 is basically where we've said "ok, stuff is actually working ... now lets test and bugfix, etc. and in a month or so, we'll have a stable product"
21:49 chris but for now 2.3 is so bleeding edge we dont really want too many people trying it
21:49 chris cos there will be blood everywhere :-)
21:50 Burgundavia no, but you do want people to be excited about it existing
21:50 kados yea, bleeding edge is key
21:50 chris its hard to get them excited without them asking where is it, can i have it
21:50 kados Burgundavia: not really, because the core developers don't even know how to install it :-)
21:50 Burgundavia heh
21:50 chris its a tricky one
21:50 kados Burgundavia: only tumer and I have got it going so far :-)
21:50 chris maybe in a week or so
21:50 kados yea
21:50 Burgundavia ok
21:50 chris when we are at 2.3.2 ish
21:50 kados yep
21:51 kados I'll do 2.3.1 tomorrow afternoon
21:51 Burgundavia just don't want to miss an opportunity to get us talked about
21:51 kados already spotted some probs
21:51 kados also I want to get some public demos going
21:51 kados so folks can try it out with real data
21:52 kados Burgundavia: well, when you get to that email, I'm all eyes :-)
21:52 Burgundavia ok
21:52 chris yep its true, more publicity would be good, but id hate to have ppl see it, and then try it out and say it sucks because they cant install it :-)
21:52 Burgundavia too many projects, so little time
21:53 kados heh
21:53 chris but there are a few new libraries running koha
21:53 Burgundavia somewhere in there I also need to finish that edubuntu case study
21:53 chris which we should do some news about
21:54 Burgundavia chris: just to introduce myself, I am Corey Burger. I work for Userful in the day and volunteer with Ubuntu/Edubuntu at night
21:54 chris ahhh ive seen your name around :)
21:54 Burgundavia nothing bad I hope :)
21:54 chris im Chris Cormack, i work for Katipo Communications in the day .. and some of the night too, and Koha all over the place
21:54 Burgundavia cool
21:55 chris trying to remember where ive seen it
21:55 Burgundavia I am co-author on the Official Ubuntu Book?
21:55 chris ahh
21:55 chris that'll be it
21:55 chris planet ubuntu
21:56 Burgundavia ubuntu is everywhere
21:56 chris yep
21:57 chris i met mark at linuxconf.au this year
21:57 Burgundavia ah yes
21:57 Burgundavia I was supposed to be at the Paris development conference, but it clashed with ALA in New Orleans, sadly
21:57 chris ahh yes
21:57 chris how was ALA?
21:57 Burgundavia big
21:57 chris it sounded big
21:58 Burgundavia a big, giant Windows world, with a few beacons of hope: our booth and Index Data's
21:58 chris :-)
21:58 Burgundavia our booth = Userful's
21:58 chris right
21:58 chris i think liblime was sharing with index data?
21:59 Burgundavia had a laugh at SirsiDynix claiming to built on "open standards"
21:59 Burgundavia yep
21:59 chris yeah
21:59 chris interestingly enough sirsidynix are taking up the first 2 hours of the first day of lianza (the nz equiv of the ala conference) here this year
21:59 Burgundavia interesting. What with?
22:00 chris the keynote address
22:00 Burgundavia ah
22:00 chris and then chairing a panel
22:00 Burgundavia "how libraries can spend more money on us"
22:00 russ http://www.lianza.org.nz/event[…]06/programme.html
22:01 chris im picking that will be the angle they will take
22:01 Burgundavia a library I know is migrating Dynix Classic --> Sirsi Unicorn. Losing 900k in fines, due to Unicorn not understanding that fines should persist after the book/thing has been removed from the collection
22:01 chris roll up roll up, get locked in to our solution, ull never escape
22:01 chris yikes thats a lot to lose
22:02 Burgundavia not to mention the headache of the migration itself
22:02 russ oooh that would cripple a public library in our region
22:02 chris 2 libraries in nz have migrated from old dynix to koha in the last year
22:02 Burgundavia they said that getting support for classic has basically been impossible since the merger
22:02 chris yeah
22:03 chris the ones who migrated really had no choice, migrate or upgrade
22:03 Burgundavia it is important to publicize any libraries that do migrate
22:03 Burgundavia the koha website gives the impression that it is a one library trick
22:03 chris yep, we are working with the libraries to get them to do some publicity
22:03 Burgundavia well, two, if you count athens county
22:04 chris yeah, its more like a 200 library trick
22:04 chris but its hard to show that in a nice way
22:04 chris we used to have a map
22:04 chris on the old website ... and we need something like that again
22:04 Burgundavia a little news piece on the main website
22:04 Burgundavia X library has migrated from Y to Koha
22:05 chris http://old.koha.org/about/map/index.html <-- old map
22:05 Burgundavia I have noticed libraries hate being pathfinders on technology
22:05 russ i guess we have steered clear of that in the past
22:05 chris we try to get the libraries to give us the permission for that
22:05 Burgundavia ok, that is an amazing map
22:06 chris thats old too, theres lots more than that now
22:06 Burgundavia who does Koha marketing?
22:06 chris no one :)
22:06 Burgundavia hmm
22:06 chris katipo markets its services ... which semi markets koha
22:06 chris liblime does the same
22:07 chris paul in france does the same
22:07 Burgundavia but you have no Koha case studies, in printed form
22:07 Burgundavia nor a product guide
22:07 chris in printed, nope
22:07 chris we have lots in html
22:07 chris in the wiki
22:07 chris and at www.kohadocs.org
22:07 chris hehe
22:07 russ lol
22:08 chris have you looked at wiki.koha.org ... and www.kohadocs.org ?
22:08 Burgundavia yep I have dug through
22:08 chris thats pretty much what we have
22:08 chris we'd love printed material
22:08 Burgundavia most oss projects are bad on this
22:08 chris but suffer from the same only so many hours in the day problem
22:09 Burgundavia hence the edubuntu case study
22:09 chris right
22:09 Burgundavia http://ubuntu.ca/Edubuntu-casestudy.png
22:09 chris we have some printed material
22:10 chris some brochure stuff etc, that we (katipo) have used at ALIA online, and lianza
22:10 Burgundavia ah
22:10 chris and i think liblime has some
22:10 Burgundavia let me get this edubuntu case study out the door and the OPAC UI critique and I will see if I can take stock of what is available
22:11 chris but there is no really general koha ones
22:11 chris excellent :)
22:11 Burgundavia ideally you want something you can co-brand: liblime/katipo/koha-fr and koha
22:11 chris yeah that would be ideal
22:12 Burgundavia does koha have a palette? (aside from purple)
22:12 chris koha the project?
22:12 Burgundavia yep
22:12 chris hmm ill defer to russ on that one
22:13 chris you dont want me doing anything to do with design :-)
22:13 Burgundavia heh
22:14 russ ahh not really
22:14 Burgundavia ok
22:14 Burgundavia the blue and green on the website are not bad
22:14 Burgundavia anyway, I have to run
22:14 russ it is mainly the green and the purple
22:14 Burgundavia I will get my work IRC client on this channel as well
22:14 russ catch you later
22:15 chris cya later
23:16 thd kados: are you still awake?
02:39 ToinS hello world
02:40 chris hi toins
02:40 ToinS hello chris
03:43 chris toins: with the work you are doing with C4::Koha ... are you planning to use get_itemtypeinfos_of instead of GetItemTypes ?
03:44 chris or keep using GetItemTypes ?
03:44 ToinS i've not really worked on koha.pm
03:45 ToinS i've just met getitemtypes in a script and renamed it
03:45 ToinS so i don't know
03:45 chris ahh ok
03:45 btoumi hi all
03:45 ToinS hi btoumi
03:45 btoumi hi toins:
03:45 btoumi hi chris
03:46 chris hi btoumi
03:46 chris toins: i just noticed the FIXME but im not sure who wrote that
03:46 btoumi chris i have a question for u
03:47 btoumi can I?
03:47 chris i think they do different things, so i think keep using GetItemTypes for now
03:47 chris sure btoumi
03:50 ToinS chris: ok !
03:53 btoumi i work on fines
03:54 btoumi but i can't see all value for accounttype can u help me?
03:55 chris right
03:55 chris there is Pay
03:56 btoumi i find fu o f m lr lc status but not all
03:56 chris as well as those ones
03:57 chris L, F, Rep, F, FU, A, Pay
03:57 chris are the ones i know
03:57 btoumi ok
03:58 chris plus the
03:58 chris CS, CB, CW, CF and CL
03:58 btoumi can u configure them or not?
03:59 chris no, they arent configurable
03:59 chris you can add more if you need more tho
03:59 btoumi now i'm learning more about how the fines work
04:00 chris all the C ones are credits
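The account-type codes chris lists can be sketched as two sets. Assumptions flagged up front: the only split the conversation actually gives is "all the C ones are credits" plus Pay being a payment; the debit set below also uppercases the codes btoumi reported in lowercase (fu, o, f, m, lr, lc), and none of the individual meanings are spelled out in the log.

```python
# Sketch of the account-type codes from the conversation above.
# Assumption: C* codes and Pay are credits (per chris); everything
# else mentioned is treated as a debit. Meanings are not modelled.
DEBIT_TYPES = {"L", "F", "FU", "Rep", "A", "O", "M", "LR", "LC"}
CREDIT_TYPES = {"Pay", "CS", "CB", "CW", "CF", "CL"}

def is_credit(accounttype):
    """True if the account line reduces what the borrower owes."""
    return accounttype in CREDIT_TYPES

assert is_credit("Pay")
assert not is_credit("FU")
assert all(t.startswith("C") for t in CREDIT_TYPES - {"Pay"})
```

As chris notes, the codes are not configurable but new ones can be added, so a lookup like this would need to stay in sync with the code that writes account lines.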
04:01 btoumi because for my library we must  manage fines
04:01 chris right
04:02 btoumi but now the calculation of fines is done by fines2.pl ?
04:02 chris yes
04:02 chris and Accounts2.pm
04:03 btoumi i find a lot of problem with this file
04:03 chris and Circulation/Fines.pm
04:03 chris fines2.pl ?
04:03 btoumi yes
04:03 btoumi it uses a table which does not exist
04:04 chris ahh 2 seconds
04:04 chris ill update it
04:04 btoumi ok thanks
04:04 chris hm which table does it use?
04:06 chris maybe im looking at the wrong file
04:06 chris is it misc/fines2.pl ?
04:08 btoumi just one minute i think i got confused with the file, i think i must sleep now
04:08 btoumi ;=)
04:08 chris :-)
04:09 btoumi i confirm
04:10 chris in head?
04:10 btoumi not fines2.pl but fines.pm because in fines2.pl u use fines.pm
04:10 chris ah ha
04:10 btoumi yes i work only in head
04:11 btoumi line 161 in fines.pm
04:12 chris ahh yes i see it
04:12 chris categoryitem
04:12 chris that should be issuingrules
04:12 btoumi yes
04:12 chris 2 minutes ill fix that sql up
04:13 btoumi i do it in my repository but i wasn't sure
04:13 btoumi that why i ask u
04:14 chris back before issuingrules existed, the issuing rules were in a table called categoryitem (borrower categories, plus itemtypes = categoryitem)
04:14 chris but issuingrules replaced that
04:15 chris there we go
04:16 chris damn
04:16 chris i edited that file just a while ago and didnt even notice that :-)
04:18 btoumi i'm happy to help u ;=)
04:19 chris :)
04:21 btoumi perhaps i have another question for u but not for the moment
04:21 btoumi ty
04:22 chris no problem
04:25 btoumi another problem with fines
04:25 chris yep?
04:26 chris hi arnaud
04:26 alaurin hi, how are u .????
04:26 btoumi calcfines was not called in export
04:27 chris ahh, it should be, can you fix that?
04:27 btoumi yes i do it
04:27 chris arnaud: im good thanks, and you?
04:32 alaurin fine, i'm near my holidays, so I feel good !!!!!
04:33 ToinS hello arnaud
04:36 alaurin hi Toins
04:37 btoumi chris: ok for fines.pm i have some problems with my repository because the call to calcfines is ok in fines.pm
04:37 chris ahh ok
04:37 btoumi need one week for sleep
04:38 btoumi ;=)
04:39 chris hehe
04:40 paul hello world
04:40 chris hi paul
04:41 chris oh no
04:41 paul so works from Antoine home !!!
04:42 chris up near Notre-Dame de la Garde ?
04:42 paul yep
04:42 chris there must be a great view for Antoine's home
04:42 paul my new home being close to ND
04:42 chris ohh cool
04:43 paul just on the other side of the hill (not sure of the word hill)
04:43 chris i think hill is right ... its a big hill though :)
04:49 chris will you get cable or adsl in your office paul?
04:49 paul adsl
04:49 paul something like 4Mb dn / 800Kb up
04:49 chris cool, thats not bad at all
04:50 chris i have 2mb dn, 2mb up
04:50 chris on a cable modem
04:50 paul I had 6Mb previously, but my main concern is the up bandwidth
04:50 chris yeah
04:51 paul to investigate zoom & unimarc things...
04:51 chris excellent
04:51 chris did you see there is a 2.3.0 ?
04:51 paul ToinS will start cleaning the acquisition module this afternoon
04:51 paul yep. it's a dev_week snapshot I bet ?
04:51 chris yep
04:52 chris cool, we merged all our fixes in the acquisition module in head a few weeks ago
04:52 chris so it probably needs some cleaning :)
04:52 paul did you look at ToinS code cleaning on suggestions ?
04:53 chris yes its really good
07:43 kados hi paul
07:44 paul hello kados
07:44 paul good morning to you
07:44 paul (working from ToinS home, because I still don't have any connection at my new home :( )
07:45 paul do you have a few seconds for some questions ?
07:45 kados sure
07:45 paul http://wiki.koha.org/doku.php?[…]ingzebraplugin226
07:45 ToinS hi kados
07:45 kados hey ToinS
07:45 paul I'm not sure to understand :
07:45 paul 1. run rebuild-nonmarc from rel_2_2 if your framework has changed
07:45 paul 2. run missing090field.pl (from dev-week)
07:46 paul 3. run biblio_framework.sql from within the mysql monitor (from dev-week)
07:46 paul 4. run phrase_log.sql from within the mysql monitor (from dev-week)
07:46 paul 5. export your MARC records
07:46 paul 6. run them through a preprocess routine to convert to utf-8
07:46 paul 7. double-check again for missing 090 fields (very critical)
07:46 paul rebuild non marc is OK I think
07:46 kados except it should say from dev_week :-)
07:47 paul i have completed this page & will continue as well, when I encounter something I don't understand well
07:47 kados have you done those steps? or don't understand how?
07:47 paul I haven't done anything yet
07:48 paul I just want to know what those scripts do ;-)
07:48 kados http://cvs.savannah.nongnu.org[…]_with_tag=R_2-3-0
07:48 kados they are in the zebraplugin dir
07:48 paul (+ learn what you mean by "preprocess routine to convert to utf-8 ? don't you have one already ?)
07:49 kados well ... preprocess includes many things
07:49 kados every previous version of Koha produced improper MARC records
07:49 kados so all of those marc records must be preprocessed
07:49 kados change encoding is just one step
07:49 dewey kados: that doesn't look right
07:50 kados other steps are to add a new leader
07:50 paul dewey : go back to bed please. It's time to sleep in new zealand...
07:50 dewey paul: excuse me?
07:50 kados add all of the fixed fields
07:50 kados (used for searching by date and format/content/audience)
07:51 kados plus ...
07:51 kados koha 2.x doesn't export items properly
07:51 kados so it's necessary to query items table to get the right values for items fields
07:51 kados all of that is covered under preprocessing
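The preprocessing steps kados lists can be sketched as a tiny pipeline. Everything here is a hypothetical stand-in (records as dicts, 952 as the item field, an items lookup passed in); kados says the real script was per-client and not yet fully written:

```python
# Sketch of the preprocessing described above (hypothetical helpers):
# fix the leader's encoding byte, ensure 090 is present, and rebuild
# item fields from the items table since 2.x exports were out of sync.
def preprocess(record, items_by_biblionumber):
    record = dict(record)  # work on a copy
    # 1. declare the record UTF-8: leader position 9 = 'a' in MARC 21
    leader = record.get("leader", " " * 24)
    record["leader"] = leader[:9] + "a" + leader[10:]
    # 2. ensure the 090 biblionumber field exists (zebra indexing
    #    crashes on records where it is missing)
    if "090" not in record:
        record["090"] = str(record["biblionumber"])
    # 3. rebuild item fields from the items table (952 is an assumption
    #    for where item data lands)
    record["952"] = items_by_biblionumber.get(record["biblionumber"], [])
    return record

rec = {"biblionumber": 49201, "leader": "00000cam  2200000   4500"}
out = preprocess(rec, {49201: [{"barcode": "b1"}]})
assert out["leader"][9] == "a"
assert out["090"] == "49201"
assert len(out["952"]) == 1
```

The actual character-set conversion (marc8 to utf-8) is the one step this sketch leaves out entirely; kados's roundtrip.pl linked later in the log handles that part.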
07:52 ToinS has anyone ever seen this error : HTML::Template->param() : You gave me an odd number of parameters to param()! ???
07:52 paul yes ToinS : you have a 'XX' => $variable and $variable is probably empty.
07:52 paul try 'XX' => "$variable"
07:52 ToinS ok
07:52 paul kados: is there a script that does all of this for you somewhere ?
07:53 kados paul: it's not completely written yet
07:53 kados paul: and every client has different needs
07:53 paul yes, but that would be nice to have a starter ;-)
07:53 kados right ... but MARC21 is quite different than UNIMARC
07:53 kados I'm afraid it won't be useful to you
07:54 paul because I can accept that " so it's necessary to query items table to get the right values for items fields" but I don't know what it means exactly & what to code to fix this !!!
07:54 kados well ...
07:54 kados what it means is that somewhere in rel_2_2 code
07:55 kados when items are updated, MARC isn't
07:55 paul on transfers, right ?
07:55 kados also, somewhere in rel_2_2 code, 090 fields are being deleted or are never added for some records
07:55 kados NPL has several thousand!! missing 090 fields
07:56 paul wow ! strange, I never had this problem in France !
07:56 kados have you run missing090.pl before? :-)
07:56 kados you may be surprised :-)
07:56 paul I'll run & let you know if there is the problem
07:57 kados paul: also, we spoke briefly about acquisitions bugs before
07:57 kados http://wiki.liblime.com/doku.php?id=koha226bugs
07:57 kados a client has explained the bugs
07:57 kados and I have submitted bug reports for each confirmed bug
07:57 kados (welcome back :-))
07:58 paul ToinS will start code cleaning on head in the afternoon
07:58 mason hiya guys
07:58 kados excellent
07:58 kados hey mason
07:58 kados mason: kinda late, eh? :-)
07:58 kados paul: I hear your new home is very beautiful
07:59 mason i have some acqui. changes done in that last couple of weeks that ill commit to head in the next hour, too
07:59 mason tis only 1am :)
07:59 paul great mason, we will wait for them then
07:59 mason cheers paul
08:00 kados paul: it's good to have you back :-)
08:01 kados paul: been very lonely in #koha in the morning for me :-)
08:02 kados paul: http://liblime.com/public/roundtrip.pl
08:02 kados paul: there is a sample of a preprocess script that only converts to UTF-8
08:02 kados paul: for MARC21 only
08:03 kados paul: it also keeps track of problem records and saves them in XML and iso2709 form in separate dump files
08:06 paul2 ok kados, i'll investigate in the next minutes, i'll let you know if I have another problem
08:41 kados paul: I just investigated whether it would be possible to keep the rel_2_2 style API for searching, and I think it could be done with some minor changes
08:41 kados paul: changes only to the routines in SearchMarc.pm
08:42 paul good news...
08:42 kados paul: the CCL style of searching is very similar to the old API
08:42 kados :-)
08:43 kados the only tricky part is how to integrate the results
08:43 kados because with zebra, we get back full MARC records
08:43 kados not bibids
08:44 kados (in fact, we don't get the whole set of bibids at all ... that is kept in zebra)
08:44 kados so maybe each MARC record in the result set must be processed to split it into Koha fields and MARC fields for display
08:49 paul if you have the MARC record, MARCmarc2koha sub in Biblio.pm will create the Koha field easily
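The step paul points at, splitting each MARC record from the zebra result set into Koha-level fields for display, can be sketched as a simple mapping. The mapping table below is a tiny hypothetical stand-in for what Biblio.pm's MARCmarc2koha does, not the real framework configuration:

```python
# Sketch: map fields of one MARC record into the flat Koha fields the
# templates expect. The (tag, subfield) -> name pairs are illustrative
# assumptions, not the real Koha MARC framework.
MARC_TO_KOHA = {
    ("245", "a"): "title",
    ("100", "a"): "author",
    ("090", "c"): "biblionumber",
}

def marc2koha(fields):
    """fields: list of (tag, subfield_code, value) triples from one record."""
    koha = {}
    for tag, code, value in fields:
        key = MARC_TO_KOHA.get((tag, code))
        if key:
            koha[key] = value
    return koha

rec = [("245", "a", "Koha in action"), ("100", "a", "Doe, Jane"),
       ("090", "c", "42")]
assert marc2koha(rec) == {"title": "Koha in action",
                          "author": "Doe, Jane",
                          "biblionumber": "42"}
```

This is the shape of the change kados describes: since zebra returns full MARC records rather than bibids, each hit is run through a mapping like this before the old-style template variables can be filled.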
08:50 kados cool, thanks
09:12 paul kados : what do you mean by "update to the latest bib framework"? do you mean "update to the latest marc21 frameworks from thomas" ?
09:13 paul if yes, what is specific with them (as I can't do it for unimarc)
09:15 paul other question : I think convert_to_utf8.pl is redundant with updater/updatedatabase (see updatedatabase line 1404+ : it's the same alter table)
09:22 paul kados : another problem/question : i try to run missing090field.pl, but it fails, as it requires zebra working. are you OK if I move this part AFTER Starting zebra ?
09:25 paul kados : another one : biblio_framework.sql is partially already in updatedatabase. Are you OK if I put everything in this script ? (should be easy)
09:25 paul same for phrase_log.sql
09:25 paul hello tumer/
09:25 tumer hi paul sorry about france
09:26 paul the priest of my church is italian, so we all had a very large problem on sunday : we could not have been happy with a victory from either side.
09:27 paul in fact, this end was the worst possible, but also the best possible : it means nothing to win with penalties.
09:27 paul so it also means nothing to lose with penalties ;-)
09:27 tumer not if you ask italians
09:28 paul not all italians : our priest was happy, but only half happy.
09:28 paul and in 2 or 3 days, their joy will end with the justice decision, probably.
09:28 paul did you hear of this scandal in cyprus as well ?
09:28 tumer paul i am almost finishing a complete new api for record handling. 1 question?
09:29 tumer do you think we should manage itemtype at biblio level or item level?
09:29 paul mmm... this question should be asked to koha-devel. In UNIMARC, itemtype is a biblio level information.
09:30 paul the best being to have something like FRBR, that is incompatible with unimarc.
09:30 paul (and maybe with marc21 as well)
09:30 tumer so you cannot have a CD version of a book under the same biblio
09:30 paul no you can't, because it's a different intellectual object.
09:31 paul otherwise, we could, for example, put all books from the same author in the same biblio.
09:31 tumer a book with a cd is causing us problems -- the CD gets loaned out for shorter periods
09:31 tumer and sometimes separately
09:31 tumer but they are in the same biblio
09:32 tumer well, maybe we should change library policy then
09:33 paul note that we could have itemtype at item level & still have the same behaviour as previously, the library just would have the same itemtype for each item.
09:34 paul but I think it must be asked to koha & koha-devel ml
09:34 tumer i'll ask it there
09:34 tumer here is what i have changed:
09:34 tumer biblio table has only marc,biblionumber and itemtype
09:35 tumer items has itemnumber,biblionumber,marc,barcode
09:35 kados paul: just make sure you have two conf files
09:35 tumer no biblioitems
09:35 tumer biblio also has frameworkcode
09:35 kados paul: and you are using rel_2_2 bases when running missing090
09:35 kados hi tumer
09:35 tumer hi kados
09:36 paul ah, ok , it means you must have 2 DB during migrations.
09:36 kados tumer: I read your conversation with thd yesterday
09:36 kados paul: not quite
09:36 kados paul: you run missing090 in the database before you mysqldump it
09:36 kados paul: then you must run it again when you have the MARC data
09:37 kados paul: (because sometimes there are still missing 090 fields and zebra indexing will crash if they are missing)
09:37 kados tumer: please proceed with your plan and feel free to commit to HEAD
09:38 tumer kados: it will break everything, it's a complete new api
09:38 kados tumer: that's the point
09:38 kados tumer: there is no official HEAD api yet
09:38 kados tumer: what you are doing is closest to my vision of it
09:39 tumer ok but head has lots of old junk scripts in it, i have cleaned mine
09:39 tumer how can i delete from head
09:39 kados good question
09:39 kados I wonder if savannah supports subversion yet
09:39 kados that would allow this
09:39 kados you can also do:
09:39 kados cvs delete file.pl
09:40 tumer does that put them in archive?
09:40 paul tumer: yes
09:40 kados yes, you can't actually delete them
09:40 paul (in Attic directory)
09:41 tumer so what i will do is commit the new biblio code, which will break all record handling, ok?
09:41 paul you plan to break everything ? sounds cool to me :D
09:42 kados tumer: ok
09:42 tumer i think thats the only way to actually get good clean code
09:43 kados tumer: if you commit it I will take a look immediately
09:43 kados tumer: and tell you if I don't like something :-)
09:43 kados (note that HEAD is already completely broken)
09:43 kados (so it won't change anything :-))
09:47 tumer i am currently working on it hope to get it fully functional by this weekend and put it to production
09:48 tumer then i will commit
09:49 tumer btw it was not a zebra problem: zebra was crashing because this new M:F:X was not giving me as_xml_record
09:49 tumer i did not understand why
09:58 kados weird
09:58 kados what new M:F:X?
09:58 kados is it the newest from SF?
09:58 tumer yep
10:00 tumer i call $record->as_xml_record() and still get a <collection> wrapper
10:00 paul kados : I can't get missing090.pl work correctly.
10:00 paul it calls MARCgetbiblio with a biblionumber as parameter.
10:00 paul in rel_2_2, the parameter must be a bibid.
10:01 paul and if I try with PERL5LIB pointing to dev_week, it fails, because the KOHA_CONF is not xml, but the old .conf file
10:02 paul so, I have to run it from dev_week 100% :
10:02 paul - copy rel_2_2 DB
10:02 paul - run updater/updatedatabase
10:02 paul - run missing090.pl
10:03 paul the good news being that i don't have any missing 090
10:03 kados excellent
10:03 paul oops, no, I have a few.
10:03 kados ahh...good for me :-)
10:04 paul 10 for a 14000 biblio DB
10:04 tumer but should be 0
10:05 paul right tumer
10:05 paul it's really a lot
10:06 paul kados : you've missed some of my questions it seems :
10:06 paul  - update to the latest bib framework
10:07 paul means "update to marc21 from thomas" ? if yes, what is interesting for me to know for unimarc ?
10:07 paul (hello owen)
10:07 owen Hi paul
10:08 tumer paul i don't think you need it, and in fact should not, otherwise you will lose your mappings
10:08 paul - biblio_framework.sql, phrase_log.sql are already partially in updatedatabase. Is it OK if I put everything there ?
10:08 kados yes
10:08 kados everything should be in dev_week updatedatabase
10:09 kados eventually I would have put it in but didn't get to it
10:09 paul ok, i take care of it & update the wiki
10:09 kados thx
10:09 tumer biblio_framework update is not an essential part of this upgrade is it?
10:09 kados yes
10:09 kados it's poorly named
10:09 tumer it merely adds more field definitions to what we already have and which we will not use
10:09 kados biblio_framework.sql is what alters biblio and moves the frameworkcode as well as adding some columns
10:10 tumer ahhh
10:10 paul things that are already in updatedatabase (like utf-8 conversion)
10:10 kados utf-8 conversion should not be in updatedatabase
10:10 paul why ?
10:10 kados the wiki is not up to date with my latest tests
10:11 kados because depending on the version of mysql you are migrating from, converting to utf-8 could mangle all your data
10:11 kados if you are running rel_2_2 on 4.0
10:11 kados there is no character set support
10:11 paul right. but we require 4.1, don't we ?
10:11 kados but your table and database definitions are probably set to latin1
10:11 kados I speak of migration
10:11 kados for new installs the procedure is different
10:12 paul so we should add a test to check for 4.1 when starting updatedatabase ?
10:12 kados you can't
10:12 paul I can't what ?
10:12 kados you'd have to test the mysqldump to see if the data was coming from 4.0 or 4.1
10:12 paul add a test or require 4.1 ?
10:12 kados 4.1 is required for dev_week
10:12 kados but most libraries will be migrating from 4.0
10:13 kados if you migrate from 4.0 you must do special things to preserve your character sets
10:13 paul but I thought we have said previously that 4.1 will be required for Koha 3
10:13 kados yes
10:13 kados 4.1 _is_ required for 2.4 and 3.0
10:14 paul so a library migrating from Koha 2.2 will have to deal with the mysql upgrade if needed.
10:14 kados but most libraries are running 2.2 on mysql 4.0
10:14 paul before upgrading.
10:14 kados yes
10:14 kados but that is a complicated process
10:14 paul and we could check this in updater.
10:14 kados I doubt it
10:14 kados there are so many cases
10:14 kados what encoding they started on, what they want to end up with, etc.
10:14 paul so, in updater: if mysql 4.0 => stop immediately, if 4.1 => continue
10:15 kados if 4.1, make sure the mysqldump is from 4.1 or stop immediately
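[editor's note] kados's check could be approximated by inspecting the dump header: mysqldump records the server version in a leading comment, so a script can refuse dumps taken from a 4.0 server. A hedged sketch -- the sample file below is fabricated for illustration; a real updater would read the library's actual dump:

```shell
# mysqldump output begins with a header like:
#   -- MySQL dump 10.10
#   -- Host: localhost    Database: koha
#   -- Server version    4.0.27
# fabricate such a header for demonstration purposes
cat > /tmp/sample-dump.sql <<'EOF'
-- MySQL dump 10.10
-- Host: localhost    Database: koha
-- Server version    4.0.27
EOF

# extract the server version and refuse 4.0 dumps
version=$(sed -n 's/^-- Server version[[:space:]]*//p' /tmp/sample-dump.sql)
case "$version" in
  4.0.*) echo "refusing: dump comes from mysql $version" ;;
  *)     echo "ok: dump comes from mysql $version" ;;
esac
# → refusing: dump comes from mysql 4.0.27
```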
10:15 paul but everybody & everything will be in utf8 in 3.0
10:15 kados yes, but you need to convert it to utf-8 _and_ tell mysql that it is utf-8
10:15 paul and mysql's alter table knows what it is converting from, doesn't it ?
10:16 kados http://wiki.koha.org/doku.php?[…]ncodingscratchpad
10:16 kados ok
10:16 kados I will give you NPL as an example
10:16 paul yes, i've read, but that's not enough for me
10:16 paul ok, listening
10:16 kados before Koha, NPL had MARC-8 encoded records
10:16 kados they imported them into a mysql 3.23 db
10:16 kados the database was set up for latin1 defaults
10:16 kados so ...
10:17 kados when NPL migrated from 1.9 to 2.0 using mysqldump
10:17 kados all of the data was mangled by mysql
10:17 kados so now NPL has a database that thinks it has latin1 data
10:17 kados but actually has marc8, mangled once in a conversion
10:18 kados to upgrade, NPL must:
10:18 paul what does 'mangled' mean ?
10:18 kados alter the mysql tables and convert to binary or blob
10:18 kados (so now mysql doesn't know the encoding)
10:18 kados (mangled means the characters have been re-encoded)
10:18 kados (as was common when moving from 3.23 to 4.0 (which started having some character set support))
10:19 kados (so now mysql doesn't know the encoding because the data is of type blob, etc.)
10:19 kados so now we can run mysqldump without mangling the data
10:19 kados (again)
10:20 kados now we need to convert everything to utf8
10:20 kados then we need to tell mysql that the tables are utf8
10:20 kados and in the case of the MARC data, we need to update all the leaders
10:21 kados but every Koha migration will be different
10:21 kados depending on the mysql version and encoding defaults they are using
10:21 kados and the type of MARC
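[editor's note] the blob trick kados describes can be sketched in SQL. This is a hedged outline, not the actual NPL migration script: the table and column names (`biblio`, `marc`) are illustrative, every affected text column would need the same treatment, and converting the bytes themselves from marc8 to utf-8 still has to happen outside mysql:

```sql
-- step 1: cast to BLOB so mysql forgets the (wrong) latin1 label;
-- mysqldump will now emit the bytes untouched instead of re-encoding them
ALTER TABLE biblio MODIFY marc BLOB;

-- (dump with mysqldump, convert the bytes to utf-8 externally, reload, then:)

-- step 2: relabel the column as utf8 text; mysql now takes the bytes at face value
ALTER TABLE biblio MODIFY marc TEXT CHARACTER SET utf8;

-- step 3: make future tables default to utf8
ALTER DATABASE koha CHARACTER SET utf8;
```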
10:22 kados paul: make better sense?
10:22 kados paul: for a new install it's much simpler
10:22 paul yes partially make sense now.
10:22 kados paul: a library must simply make sure that mysql 4.1 is set up with the correct encoding
10:23 kados paul: and that all their data is in utf8 before importing
10:23 paul to rewrite it : depending on what you really have and what mysql thinks you have, you will be in big pain ;-)
10:23 kados yep :-)
10:23 kados esp for french libraries
10:23 kados who have many non-ascii characters
10:24 paul to transform from one encoding to another, is iconv enough (on the dump) ?
10:24 kados you can transform within mysql itself
10:24 kados I don't know about iconv
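[editor's note] on paul's iconv question: iconv can convert a dump file between encodings, provided you actually know the source encoding, which kados points out you often don't. A minimal sketch (filenames are illustrative):

```shell
# converting a whole dump (filenames illustrative):
#   iconv -f ISO-8859-1 -t UTF-8 koha-latin1.sql > koha-utf8.sql
# iconv fails loudly on bytes invalid in the source encoding.

# sanity check on a single character: latin1 0xE9 ("é")
# should become the utf-8 byte pair 0xC3 0xA9
printf '\351' | iconv -f ISO-8859-1 -t UTF-8 | od -An -tx1
```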
10:24 paul how do you do to know what is the real encoding ?
10:24 kados hehe
10:24 kados there is no way to know
10:24 kados unless you know beforehand :-)
10:25 paul really nice...
10:25 kados you can examine the hex :-)
10:25 kados the reason I know all of this
10:25 paul but we don't have any hex in mysql
10:26 kados is because with WIPO, I was losing hair trying to get mysqldump to export utf-8 data to their db
10:26 kados because I didn't know that mysql thought it had latin1, not utf-8
10:26 kados very frustrating
10:26 kados no hex in mysql
10:26 kados you need to export to a file
10:26 paul ah, I understand now why you don't have your long hair anymore...
10:26 kados well ... even that could change it
10:26 kados some tests will need to be done
10:27 kados mysqlhotcopy will take a snapshot of the db
10:27 kados without touching the encoding
10:27 kados so you can set up a test environment
10:27 kados on another server
10:28 kados it's a really big problem, unfortunately, maybe too big a job for updatedatabase :(
10:28 kados but it's also unrelated to zebra :-)
10:28 kados it's a problem we created ourselves :-)
10:28 paul do you know how we created it ?
10:28 kados yes
10:29 kados because we said 'koha can run perfectly on any version of mysql with any settings'
10:29 kados and not 'make sure you are running mysql 4.1 with encoding set to the same as your data'
10:29 paul mmm... I never said that... but I agree that I never checked ;-)
10:30 kados (we of course, didn't say it, it was a sin of omission :-))
10:30 paul my problem is that I think I always had iso8859 data, but I was probably wrong, and I'll discover it soon
10:30 paul another question :
10:30 paul if the data were not iso8859, how is it possible to have a correct MARCdetail ?
10:30 paul (with proper accented chars)
10:30 kados hehe
10:30 kados that's just the best part
10:31 kados mysql will let you store any encoding
10:31 kados it doesn't care what you put in
10:31 kados except when you try to mysqldump or convert to another encoding
10:32 kados so you may have correctly encoded values
10:32 kados but to get them out you must trick mysql
10:32 paul but when you put it on a web page, if it's not 8859, the page should be wrong in the browser isn't it ?
10:33 kados isn't 8859 eq latin1?
10:33 paul yep
10:33 kados so if your encoding is 8859, and your db is 8859, and your meta tag is 8859, it's all good
10:34 paul but you said that I could not be sure of my real encoding. So, if mysql = latin1, meta = latin1 and the pages are OK, then I have real latin1 data ?
10:34 kados sounds like it
10:34 paul if yes, then all my libraries are ok !
10:35 paul that's what is nice with a language that uses accents : any problem is easy to find ;-)
10:35 kados :)
10:35 kados so you can probably convert to utf8 before mysqldump
10:35 kados (use a test db first of course)
10:36 paul of course.
10:37 paul bon appetit !
10:45 paul kados : how do you export your data ?
10:46 paul because export/export.pl is buggy (compilation failure)
11:07 kados paul: in rel22 it should work
11:10 paul forget this, my copy was wrong. i had updated from something locally modified, so I got some cvs errors. restarting from a fresh copy
11:11 thd kados: did you understand from reading the logs what tumer had done for holdings?
11:11 paul hello thd.
11:11 thd hello paul
11:16 kados thd: yes, and I approved it
11:16 ToinS paul : updatedatabase has not been merged between rel_2_2 & head
11:17 kados thd: it will be in head as soon as tumer deems it stable :-)
11:17 kados ToinS: I did it yesterday
11:17 ToinS kados ah ok
11:17 thd kados: he is not using it in production yet?
11:18 thd kados: did you see my message about how there are only about 3 fields for the librarian to worry about in standard MARC 21 holdings?
11:18 kados thd: no
11:18 kados thd: tell me
11:18 thd kados: or 3 at a time
11:19 thd kados: acquisitions would need 541, 583, and 876 that is just 3 fields with 877-8 for very special cases
11:19 dewey i already had it that way, thd.
11:20 kados mainly dev_week and rel_2_2 are merged
11:20 thd kados: cataloguing would need 852 or 856, and sometimes 863-868
11:20 kados there are a few that need to be manually merged
11:20 kados like biblio.pm, if there was anything in there
11:20 kados thd: if they are defined in tab 10 they will show up I think
11:21 kados even in the current scheme
11:21 thd kados: only 952 is defined in tab 10
11:22 thd kados: would you not use the MARC editor to load a holdings record?
11:23 thd kados: that would give the librarian access to whatever is needed
11:24 thd kados: a holdings framework would in the case of MARC 21 be a very trimmed down bibliographic framework
11:25 kados ahh ...
11:25 kados right
11:25 thd kados: the fields are all there it is just a question of unhiding them
11:25 kados so the regular MARC editor would be used
11:26 thd kados: although, tumer has spotted an editor bug for filling a field from an authority field
11:26 thd s/field/record/
11:27 thd kados: tumer reports that if you add a repeated field the authorised heading fills the wrong field
11:29 thd kados: in his case he found that an attempt to add an authorised heading was filling 690 instead of 700 after a repeatable field had been added
11:29 kados thd: I'll take a look
11:29 paul kados/tumer/thd : which templates are OK for dev_week ? npl ?
11:30 thd kados: will there be a meeting today?
11:30 thd now that paul is back
11:30 kados paul: only npl are tested
11:30 paul i'm not back thd
11:30 thd welcome back paul
11:30 thd no
11:30 kados paul: and don't expect searching to work except at opac-zoomsearch.pl
11:30 kados paul: :-)
11:30 paul i'm at ToinS home, so I will not be here all the time
11:30 thd paul: you are still moving?
11:31 kados thd: no time for a meeting today
11:31 paul my move is done. but still no DSL
11:31 thd paul: do you have dial-up?
11:32 paul i've DSL at ToinS home. I may be able to have a dial up, but it's very expensive
11:33 thd paul: I have cheap dial-up because my building has equipment too old for cheaper DSL and the telephone company will not replace it.
11:35 thd paul: I will have optical fibre or a hole in a foundation wall before the telephone company will replace a circuit box from the 60s.
11:36 thd monopolies in action
11:39 thd kados: why was it so difficult to think of the simplest most efficient solution for standard holdings?
11:40 kados thd: sorry, got a lot on my plate right now
11:40 kados thd: I really want to resolve the standard holdings issues
11:40 kados thd: and I think we're very close
11:40 kados thd: but don't have the time today to discuss deeply
11:41 thd kados: tumer has it: no repeated fields needed, just repeated holdings records
11:42 thd kados: OK I have to go now in any case
11:51 paul kados ?
11:51 dewey kados is probably becoming a true Perl Monger...
11:52 paul zebraidx -g iso2709 -c /koha/etc/zebra-biblios.cfg -d kohaplugin update /path/to/records => /path/to/records is the path to all records, right ?
11:52 paul so we have to export all of them in 1 big file (using export/export.pl) or 1 for each biblio ?
11:53 kados all in one file
11:53 tumer[A] paul one big chunk
11:53 kados file must be named *.iso2709
11:53 kados and put in /path/to/records
11:53 paul the /path/to/records being a directory with only this file, or can it contain other things ?
11:53 kados it can contain other things
11:54 kados but they will be ignored
11:54 tumer[A] only with marc records
11:54 tumer[A] sometimes does not ignore them
11:54 kados only files with an extension of .iso2709 will be noticed
11:54 kados ahh ... tumer is probably right, i have not tested this extensively
11:55 tumer kados: it tries to read every file and sometimes goes crazy saying they don't have ids, so best to have marc records only
11:55 kados :-)
