Time  Nick      Message
11:05 |hdl|     ok
11:20 pierrick_ I'm trying some catalogue searches but they always return "no record found"
11:20 paul      |hdl| has the same kind of problem.
11:20 paul      iirc, chris gave him a useful hint
11:20 pierrick_ OK, I've understood chris was working on it
11:21 |hdl|     pierrick_: did you commit your database?
11:21 |hdl|     zebraidx commit
11:21 paul      (not a cvs commit, a zebra commit!)
11:21 pierrick_ (ah ah ah Paul)
11:22 pierrick_ Cannot perform commit
11:26 pierrick_ I don't really understand at what point I should have done the commit. Shouldn't rebuild_zebra.pl have taken care of it?
11:30 paul      if it hadn't been buggy, yes ;-)
11:33 |hdl|     pierrick_: do you have a shadow directory?
11:33 |hdl|     did you modify zebra.cfg to include the shadow registers?
11:34 |hdl|     do you have a lock directory?
11:34 |hdl|     a tmp directory?
11:35 |hdl|     changes to add to zebra: http://pastebin.com/590802
11:39 pierrick_ (back) no, no shadow directory
11:41 paul      hello kados.
11:41 pierrick_ hi kados
11:42 paul      (i'm working on rle_2_2 myself. you can see commits on koha-cvs, that is faster than lightning today !!!)
11:42 paul      kados :
11:42 paul      http://127.0.0.1:9018/cgi-bin/koha/opac-authorities-home.pl
11:42 paul      search "job" as "Auteur NP"
11:42 paul      http://bureau.paulpoulain.com:9018/cgi-bin/koha/opac-authorities-home.pl
11:42 paul      do you have an idea why there's nothing in the summary column ?
11:43 paul      is it the construction you showed me a few days ago (see, see also...)?
11:43 paul      in this case, it doesn't work with unimarc :-(
11:43 kados     perhaps it doesn't work with unimarc
11:43 kados     but it should be quite simple to fix in fact
11:43 pierrick_ I've modified my zebra.cfg, restarted zebrasrv, but...
11:44 pierrick_ 15:42:15-08/03 zebraidx(14478) [log] nothing to commit
11:44 paul      ok, gotcha the code in AuthoritiesMarc.pm
11:44 kados     paul: if you take a look at the code, you will see that the $summary is now constructed very simply
11:44 kados     paul: it can be adapted to unimarc practice
11:45 kados     of course ... it should be handed back to the template as a LOOP
11:45 kados     I think there is a FIXME about that in the code :-)
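A minimal sketch (not the actual AuthoritiesMarc.pm code) of what handing the summary back as a template LOOP could look like; the sub name, field ranges and subfield list are illustrative assumptions:

    # sketch: turn an authority MARC::Record into a loop for the template
    sub summary_loop_for {
        my ($record) = @_;                                   # a MARC::Record authority record
        my @loop;
        foreach my $field ( $record->field('1..') ) {        # authorized heading(s)
            push @loop, { heading => $field->as_string('abcdq') };
        }
        foreach my $field ( $record->field('4..') ) {        # 'see from' tracings
            push @loop, { heading => $field->as_string('abcdq'), seefrom => 1 };
        }
        return \@loop;
    }
    # later: $template->param( summary_loop => summary_loop_for($record) );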
11:46 kados     pierrick_: here is how I handle zebra from the beginning:
11:46 paul      shame on you for this fix : opac.liblime.com is also hardcoded...
11:46 kados     paul: I intend to hand the variables to the template
11:47 kados     paul: as a LOOP ... so there is no hardcoded html in the script
11:47 paul      I'll let you do this, OK?
11:47 kados     yep
11:47 kados     if you tell me the unimarc practice for constructing main heading, see, see also, I will also fix it for unimarc
11:48 paul      I'll fix quickly AuthoritiesMarc.pm to handle unimarc.
11:48 paul      give me 30mn
11:48 kados     excellent
11:52 paul      kados : you don't show the rejected form, do you?
11:53 paul      main entry : ferraro, rejected form : kados,...
11:54 kados     I do
11:54 kados     that's the 'see' and 'see also'
11:54 paul      mmm...
11:55 paul      in unimarc, we have :
11:55 paul      * accepted form : 2xx
11:55 paul      * rejected form : 3xx
11:55 kados     in MARC21 it's called 'authorized heading' and 'unauthorized headings'
11:55 paul      * associated form : 5xx
11:55 kados     there can be multiple authorized headings
11:55 paul      * parallel form : 7xx
11:55 kados     I see
11:55 paul      you have only 3 things
11:55 kados     right
11:55 kados     it should be quite simple to fix
11:56 kados     in USMARC 1XX is authorized heading
11:56 kados     so that explains why nothing is showing up :-)
11:56 kados     in unimarc it is 2XX
11:56 paul      yep, i've fixed this.
11:57 paul      i'm just wondering how to deal with my 4 forms where you have only 3 !
11:57 kados     I would handle them separately
11:57 kados     ie if (UNIMARC) {
11:57 kados     elsif (USMARC) {
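A rough sketch of the branching kados is sketching here, using the ranges given in the chat (UNIMARC 2xx accepted / 3xx rejected / 5xx associated / 7xx parallel, MARC21 1xx authorized with 4xx/5xx tracings); $marcflavour and the sub name are assumptions, not the real Koha code:

    # sketch: choose heading/tracing tag ranges per MARC flavour
    sub heading_tags {
        my ($marcflavour) = @_;          # e.g. 'UNIMARC' or 'MARC21'/'USMARC'
        if ( $marcflavour eq 'UNIMARC' ) {
            # accepted 2xx, rejected 3xx, associated 5xx, parallel 7xx
            return ( accepted => '2..', rejected => '3..', associated => '5..', parallel => '7..' );
        }
        else {
            # MARC21/USMARC: authorized 1xx, see-from 4xx, see-also 5xx
            return ( accepted => '1..', rejected => '4..', associated => '5..' );
        }
    }
    # a caller might do: my %tags = heading_tags($marcflavour);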
12:02 paul      where are unauthorized headings hidden in MARC21 ?
12:02 paul      (can't find them on loc.gov/marc/authority/)
12:14 kados     2XX in MARC21
12:14 kados     3XX are 'see also' in unimarc probably 'parallel form'
12:14 kados     or maybe 'associated form' I'm not sure
12:17 paul      OK, AuthoritiesMarc.pm commited.
12:18 pierrick_ kados: you wanted to tell me the way you handle zebra ?
12:22 kados     pierrick_: sorry :-)
12:22 kados     zebraidx create kohademo
12:22 kados     zebraidx commit
12:22 kados     zebrasrv localhost:2100
12:23 kados     if you create the db first in this way it should work
12:23 kados     but I don't know the specifics of your problem
12:24 pierrick_ my problem: search.marc/search.pl returns no record on any keyword
12:25 pierrick_ I recreate my zebra database, I suppose it was obvious I had to "rebuild_zebra.pl"
12:26 kados     hmmm
12:26 kados     I don't think so
12:27 kados     ahh ... well you need to have some records in there :-)
12:27 kados     I usually insert them with bulkmarcimport.pl
12:30 pierrick_ I have no iso2709 files :-/
12:31 pierrick_ what is rebuild_zebra.pl for if I should not use it in my case?
12:34 |hdl|     pierrick_ : try to commit your db
12:34 |hdl|     again.
12:34 |hdl|     when you are done with  rebuild_zebra.pl
12:35 |hdl|     (VERY long for me)
12:39 paul      kados/pierrick : pierrick has a copy of a 2.2 DB. He should use rebuild_zebra to populate the zebra DB. no need to bulkmarcimport, that is for starting libraries
12:41 pierrick_ 9943 MARC record done in 81.2802159786224 seconds
12:41 paul      joshua told me he made many improvements in speed.
12:41 paul      but this one is something like x25
12:42 paul      i'm a little afraid by such a speed...
12:42 kados     :-)
12:42 pierrick_ My computer is quite fast and has 1GB memory
12:42 kados     still seems very fast
12:42 kados     my servers are pretty quick and have 4 gigs of memory :-)
12:43 pierrick_ `zebraidx commit` still says it has nothing to commit
12:43 kados     maybe you have a better xml parser installed
12:43 kados     hmmm
12:43 kados     you don't need to run the commit operation _after_ you import records
12:43 kados     that should happen automatically
12:44 pierrick_ that's what I thought, because I read in rebuild_zebra.pl that you already do it, but hdl advised me to do it
12:44 pierrick_ (and still no record found)
12:54 paul      (on phone)
12:54 paul      back
12:54 paul      I have moved the template->param(XXX => xxx) calls that were previously in all opac scripts.
12:55 paul      thus, we don't have to do them in new scripts & can't forget one in a script
12:55 paul      (2 had been forgotten in I-don't-remember-which script)
12:55 paul      that changes absolutely nothing from a template designer point of view ;-)
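A sketch of the kind of centralization paul describes: the params every page needs are set once in a shared routine rather than repeated in each OPAC script. The sub name is illustrative (paul does not say where his commit put it); the two preferences are the ones owen mentions below:

    # sketch: set the params every page needs in one shared place
    use C4::Context;

    sub set_common_template_params {
        my ($template) = @_;
        $template->param(
            LibraryName => C4::Context->preference('LibraryName'),
            opacnav     => C4::Context->preference('opacnav'),
        );
    }
    # each OPAC script then gets these for free from the shared entry point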
12:55 owen      So those things can be left out of future scripts and could even be removed from existing ones?
12:57 paul      yep. I've taken care of this as well ;-) (removing from existing scripts)
12:57 kados     thanks paul!
12:58 kados     I didn't realize we could do that or I would have done it that way from the beginning :-)
12:58 owen      Could the same be done in the intranet to get 'current branch' into global headings?
12:58 paul      for sure.
12:58 paul      there are already some things like loggedinusername & CAN_user_permissionToDoSomething
12:58 paul      (CAN_user... is underuser anyway)
12:59 paul      (underused I mean)
12:59 owen      Definitely
13:00 owen      But current branch isn't a preference like opacnav or LibraryName...
13:05 kados     no
13:05 kados     that would mean it would be a system-wide setting
13:07 owen      So how to do it?  Is it possible?
13:10 owen      We continue to have problems at NPL where browsers get 'reset' and lose their home branch setting.
13:10 owen      It's often days or who knows how long before the mistake is discovered, and meanwhile all the circs are credited to the wrong branch
13:11 owen      We could get away with just modifying intranet-main and returns, but it would be nice if there was a global variable so that each script didn't have to be modified.
13:39 kados     what if we had a resident 'current branch' listed at the top of every page
13:39 kados     might not be so long before someone at the desk noticed if it was always there
13:40 owen      That's what I'm talking about
13:42 kados     I see
13:43 kados     I'll put it on my list ... there's got to be a simple way to do that
13:44 paul      kados :
13:44 paul      about your biblio.pm commit.
13:44 kados     paul: two actually :-)
13:44 paul      you've removed 5 lines i committed 1 hour ago :
13:44 kados     paul: woops ... sorry
13:44 paul      -			# deal with &, <, >,", ' that are not valid in a XML file.
13:44 paul      -			@$values[$i] =~ s/&/&amp;/g;
13:44 paul      -			@$values[$i] =~ s/</&lt;/g;
13:44 paul      -			@$values[$i] =~ s/>/&gt;/g;
13:44 paul      -			@$values[$i] =~ s/"/&quot;/g;
13:44 paul      -			@$values[$i] =~ s/'/&apos;/g;
13:45 kados     I don't know how it happened
13:45 paul      ok, I let you reintroduce them
13:45 kados     yep
13:45 paul      (& it's a good thing ml are now fast !)
13:45 kados     :-)
13:46 kados     are those the only chars that make an XML file not valid?
13:46 paul      yep, unless i'm mistaken
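For reference, the same escaping as a small helper; these five characters are indeed the only ones with predefined XML entities, and the & substitution has to run first so the later entities are not double-escaped. The sub name is just for illustration:

    # sketch: escape the five predefined XML entities, & first
    sub xml_escape {
        my ($s) = @_;
        $s =~ s/&/&amp;/g;      # must come before the others
        $s =~ s/</&lt;/g;
        $s =~ s/>/&gt;/g;
        $s =~ s/"/&quot;/g;
        $s =~ s/'/&apos;/g;
        return $s;
    }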
13:46 kados     (this is a concern I had not anticipated, did you notice it in real data?)
13:46 paul      yep.
13:46 paul      the 1st biblio I edited !!!
13:46 paul      it contained a & and I got a superb Internal server error !!!
13:47 paul      (& "malformed data" in the logs)
13:47 kados     wow ...
13:47 kados     that routine is very tricky
13:50 kados     paul: ok ... committed
13:55 paul      kados : look at => http://bureau.paulpoulain.com:9018/cgi-bin/koha/opac-search.pl
13:56 paul      i've ported liblime stylesheet to css templates.
13:56 paul      unless you're against, i'll commit it & announce in release notes for 2.4
13:56 kados     hehe
13:56 kados     it's a bit lopsided for me
13:56 paul      lopsided ??
13:57 kados     sorry ... bad word to use and probably not correct spelling :-)
13:57 kados     the left-hand bar is not on the left ... it's at the top-left
13:57 kados     and the midway content is shifted down
13:57 kados     but I think it's ok
13:58 paul      you're testing with firefox ?
13:58 paul      firefox & konqueror are fine.
13:58 kados     firefox on OSX
13:59 kados     maybe not high enough resolution
13:59 kados     paul: what is your resolution?
13:59 paul      problem with opera.
13:59 kados     i have 1024 X 768
13:59 paul      (it's the img)
13:59 paul      I checked with 1024 too.
14:00 kados     hmmm
14:00 paul      (i'm 1400x1050 natively)
14:00 kados     owen's the expert template designer
14:00 paul      the toolbar on opera is not on the left.
14:00 kados     I just meddle :-)
14:00 kados     ahh ... same as for me then
14:02 owen      paul, are you floating the navigation menu left ?
14:02 paul      it's fixed I think
14:03 paul      yep :
14:03 paul      	float:left;
14:03 kados     fixed for me as well
14:03 kados     nice job paul!
14:03 paul      (image height was hardcoded to 65px that is too much)
14:03 kados     ahh
14:04 owen      Yes, fixed now.
14:04 owen      Even Internet Explorer doesn't object
14:06 paul      great !
14:07 paul      yes thd. You have something like 20mn ;-)
14:10 thd       paul: Is there any place other than marc_subfield_structure where the original Koha column names are stored in relation to MARC links?
14:10 paul      no
14:11 thd       paul: when I use the delete option in bulkmarcimport.pl not everything is really deleted.  What is actually happening there?
14:12 paul      everything in :
14:12 paul      * biblio, biblioitems, additionalauthors, bibliosubject, bibliosubtitle
14:12 paul      * marc_biblio, marc_subfield_table, marc_word
14:12 pierrick_ (I wanted to tell #koha that instead of trying to make HEAD work today, I've decided to install a 2.2 working copy, with EMP data, and it works fine; I have HEAD and 2.2 simultaneously, with HEAD making a lot of errors)
14:13 paul      great !
14:13 paul      it's always discouraging to spend days on a problem when you begin.
14:13 pierrick_ I've improved my installation skills
14:14 pierrick_ I've rewritten Joshua's symlink creation scripts
14:14 thd       paul: I find ghost data in marc_subfield_table
14:15 thd       pierrick: Did you find the same errors in that script that I had?
14:16 thd       pierrick: I had supposed that Koha files may have been organised a little differently when that script was written
14:16 pierrick_ thd: I use an already-filled database, no need for bulkmarcimport.pl (and I have no MARC iso data)
14:17 pierrick_ i must admit I don't understand why the directory tree is different between "installation directory" and "working copy"
14:18 thd       pierrick: sorry, by that script in your case I was referring to the symlink creation script which kados had written
14:18 pierrick_ in my opinion, at the top of the directory tree, we should have "opac", "intranet", "modules", "misc", "doc"
14:18 pierrick_ thd: Oh... so I agree
14:20 thd       pierrick: I suspect the script may have worked perfectly at the time it had originally been written and then would still work for initial users because the directory locations were preserved in koha.conf.
14:22 thd       paul: Should I have ghost data in marc_subfield_table after using the delete option in bulkmarcimport.pl?
14:26 thd       paul:?
14:27 |hdl|     kados : is there a patch for MARC::Charset ?
14:27 |hdl|     there is a problem with accents here.
14:28 thd       |hdl|: What problem?
14:29 thd       Is paul still here?
14:29 |hdl|     no mapping found at position 7 in NuclaI\xcc\x80\xc2\x8cire (Orange) at /usr/lib/perl5/site_perl/5.8.7/MARC/Charset.pm line 194.,
14:30 |hdl|     Maybe I should recompile MARC::Charset too ???
14:31 thd       |hdl|: I would imagine those two processes to go together
14:32 kados     |hdl|: I don't know of one
14:32 kados     |hdl|: I was not aware of the problem though
14:33 kados     |hdl|: what version of MARC::Charset are you running?
14:33 |hdl|     0.95
14:34 pierrick_ I'm leaving, see you tomorrow :-)
14:34 |hdl|     good evening pierrick_
14:36 paul_away leaving too
14:36 kados     |hdl|: seems that's up to date
14:37 kados     |hdl|: when did that problem occur?
15:00 chris     hmm i have an idea
15:01 kados     hey chris
15:01 kados     did you see my commit to Biblio.pm?
15:01 kados     well ... two?
15:01 kados     I think it fixed the prob :-)
15:01 owen      Man, chris is up early!
15:01 kados     NBBC is testing again
15:01 chris     not yet i just woke up with an idea what might be causing hdls problem
15:02 kados     the charset prob?
15:02 chris     yep
15:03 chris     i think its this line
15:03 chris      my $record = MARC::Record->new_from_xml($rs->record($i)->raw());
15:03 kados     that should read:
15:03 kados     MARC::Record->new_from_xml($rs->record($i)->raw(), 'UTF-8');
15:03 chris     yes
15:04 chris     hdl are you there
15:04 chris     ?
15:04 kados     otherwise, MARC::File::XML will turn it into MARC-8
15:04 kados     (the new version that is)
15:04 chris     i think that might be what is happening
15:05 chris     thats why the charset is complaining
15:05 kados     I'm not clear on when it was happening for him
15:05 kados     addbiblio and additem both already include this fix
15:05 chris     when he searches
15:05 chris     so its in SearchMarc.pm
15:05 chris     line 270
15:05 kados     we translate from XML to MARC::Record for the search?
15:05 chris     for the results yes
15:05 kados     weird
15:06 kados     that's got to be slow ...
15:06 chris     we search
15:06 chris     we get results back
15:06 chris     then we make them a marc::record
15:06 chris     its pretty fast
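A sketch of the loop chris is describing, with the encoding passed explicitly so MARC::File::XML does not fall back to MARC-8; $rs (a ZOOM result set, as in the line chris quoted) and the surrounding search code are assumed:

    # sketch: turn each raw XML hit back into a UTF-8 MARC::Record
    use MARC::File::XML;
    use MARC::Record;

    my $hits = $rs->size();
    for my $i ( 0 .. $hits - 1 ) {
        my $xml    = $rs->record($i)->raw();
        my $record = MARC::Record->new_from_xml( $xml, 'UTF-8' );   # not new_from_xml($xml) alone
        # ... build the result entry from $record ...
    }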
15:07 thd       chris: I had expected that the data would now be used in XML.
15:07 kados     [jmf@gandalf koha]$ grep -r new_from_xml *
15:07 kados     doesn't turn up any results from SearchMarc ...
15:07 chris     Search.pm as well
15:08 chris     line 270 of SearchMarc.pm in head
15:08 thd       chris: Is that merely to preserve existing functioning using MARC::Record?
15:08 kados     wait ... that'd be head :-)
15:08 kados     chris: I'll clean up all those calls and recommit them
15:08 kados     chris: nice catch!
15:08 chris     at this stage yes thd
15:09 chris     ok, now im going back to bed
15:09 thd       chris: Do you intend for that to change?
15:09 chris     thd: probably for head
15:09 chris     not for the 2.2.x plugin
15:09 owen      Think how many problems we'd solve if we spent all our time in bed and in the shower?
15:10 thd       chris: have a good rest
15:11 thd       owen: I would like to figure out how to take a shower in bed for twice the thinking power :)
15:12 kados     fix committed
15:13 kados     thd: :-)
15:13 thd       owen: yet, perhaps if it were all the time we would solve the problems when not in shower or bed
15:14 owen      You're probably right
15:14 kados     thd: how's the framework coming?
15:15 kados     thd: (it's a popular question these days :-))
15:15 thd       kados: being disconnected for a whole day did not help
15:16 thd       kados: I was working on it last night with documents recovered from my browser cache
15:16 thd       kados: Identifying those documents was not extremely easy
15:16 kados     thd: is your test box not located in your 'office'?
15:17 thd       kados: I have but one tiny room for everything
15:17 thd       kados: I live in New York City where a closet can cost a fortune to rent.
15:18 kados     right :-)
15:19 thd       kados: I had the longest internet outage in years that was not caused by telephone line failure.
15:20 thd       kados: So I never finished stating what remained to be done because you had to go.
15:21 kados     right
15:21 kados     please continue :-)
15:21 kados     iirc there are issues with subjects
15:21 kados     that I don't fully comprehend :-)
15:21 thd       kados: Much of today to finish the initial work.  Probably late tonight or early tomorrow morning.
15:22 kados     great!
15:22 thd       kados: 2 or so days of very careful checking.
15:23 kados     yep
15:23 thd       kados: 2 or so days thinking about and testing something much better than a mere table replacement.
15:25 thd       kados: I want to be able to move 090, 942 (except that 942 is OK), and 952 by changing the data in marc_subfield_table
15:27 thd       kados: more than just the field numbers: even changing the subfield assignments linked between MARC and the original Koha columns.
15:28 kados     thd: that's a nice aim
15:28 kados     thd: but I'm not sure it's a priority for the coming release
15:29 kados     thd: paul is threatening us with feature freeze very soon :-)
15:29 kados     thd: the most important thing for me is having a solid working framework
15:30 kados     thd: for new customers
15:30 kados     thd: because I have several migrations coming up in the next few weeks
15:30 thd       kados: So the existing values for items.whatever would be read from the installed framework and then changed in the Koha install prior to installing the new bibliographic framework.  Then run rebuildnonmarc.pl.
15:31 thd       Run rebuildnonmarc.pl after the new framework is installed.
15:32 thd       kados: This is not a new feature it is a bug fix overdue for 2 or 3 years :)
15:34 thd       kados: And I can see why it had never been done comprehensively before.  However, if I had realised what was missing I should have scripted the LC data elements file and been done quite some time ago.
15:36 thd       kados: I still would have needed to consult 6 other references but that would have been most of the work done with much less typing.
15:37 thd       kados: I have a non-framework question.
15:38 kados     sure
15:38 thd       kados: Should I currently have UTF-8 templates returned in rel_2_2 or did I misunderstand what you had said?
15:38 kados     NPL currently has all UTF-8 templates in CVS
15:38 kados     we were discussing head though
15:39 thd       s/templates/HTML pages/
15:39 kados     what we were discussing doesn't relate to rel_2_2
15:39 kados     if you use NPL you will get UTF-8
15:39 kados     I'm not sure about default
15:39 thd       kados: I am referring to the OPAC
15:40 kados     me too :-)
15:40 kados     check the charset on koha.liblime.com
15:40 kados     and opac.liblime.com
15:40 thd       kados: I certainly see that on opac.liblime.com.
15:41 thd       kados: Is it testing my locale setting?
15:42 thd       kados: If my locale is set to ISO-8859-1 are the NPL templates returning 8859-1 versions?
15:44 kados     as far as I know, there is no way to detect the browser's system locale setting
15:44 kados     from the web server
15:44 kados     therefore, if you own a computer that uses only 8859-1
15:45 kados     it will show all utf-8 within the ascii range
15:45 kados     for extended characters, you'll have to upgrade your system ... sorry :-)
15:45 thd       kados: My computer can use either but I have not updated in CVS in 2 days.
15:46 kados     btw: there's a new survey on open source in libraries:
15:46 kados     http://www.zoomerang.com/survey.zgi?p=WEB224ZXHBAYFD
15:46 kados     owen: shedges: you guys might want to check this out
15:46 thd       kados: I can see almost any character on my system in different encodings and can encode the pages myself to produce the desired effect.
15:47 thd       kados: system capabilities within the browser are independent of the locale setting.
15:48 thd       kados: Is this fix older than 2 days in CVS?
15:49 thd       for the NPL templates?
15:50 thd       kados: You had spoken about a circumstance where 8859 would be returned depending on some factor but that was only for HEAD?
15:51 kados     I don't recall
15:52 thd       kados: I will look a little further and go back to the bibliographic framework.
15:54 thd       kados: Had you noticed ghost data in the marc_subfield_table after using bulkmarcimport.pl with the delete option.
15:54 shedges   kados: owen:  ok, i did the survey
15:54 thd       ?
15:55 kados     shedges: was it hard? :-)
15:55 shedges   no, i just lied when i didn't know the answer ;-)
15:56 kados     hehe
15:56 kados     thd: I've not noticed that
15:57 kados     thd: how much ghost data are we talking about?
15:57 thd       kados: I have not tested deeply but potentially very much data.
15:58 thd       kados: I notice that old biblio numbers are reused for material that was already there.
16:03 kados     thd: huh ... so if I run bulkmarcimport.pl -d
16:03 kados     thd: you're saying it's not deleting all the data?
16:04 kados     thd: doesn't it actually do a 'delete from X' for all the relevant tables?
16:20 thd       kados: The largest most obvious thing is holdings data when the import contained no Koha holdings.  The holdings data does not appear in the OPAC so you would never know unless you looked at the contents of marc_subfield_table.
16:21 thd       kados: maybe I have some mistake where my import script is grabbing a Koha export file.  I will check that before asking paul again tomorrow.
16:21 thd       kados: also, I suspect that some searching anomalies that you have experienced may be due to my having accidentally filled a duplicate MARC to Koha link when editing 650 in your bibliographic framework a few weeks ago.  A careless mouse keys motion on my part.
16:22 thd       kados: You ran rebuildnonmarc.pl just after I had created that problem and I suspect did not run it after I noticed my carelessness and corrected it a couple of hours later.
16:23 thd       kados: This was the day I tried to fill seealso for 650a past the 255 character limit.
16:24 thd       kados: You might profit from running rebuildnonmarc.pl while you sleep tonight.
16:25 kados     hmmm ... I haven't noticed any strangness ...
16:27 thd       kados: the strangeness to which I was referring just now is where 4 hits are reported in the '...' OPAC search and 6 appear in the actual results.
16:27 kados     hmmm
16:28 kados     I still need to look at how that works
16:28 kados     I'm not clear on what it's intended to do
16:28 thd       kados: What happened to the changes where '...' had been changed to value?
16:29 thd       or whatever you had used for greater visibility and new user understanding
16:33 thd       kados: The anomalies search is an AND word search against the contents of the corresponding field in the original Koha tables.  Not the newer authorities search where the anomalies are well understood.
16:36 thd       kados: correction: the search searches the marc_word index for the see also field/subfield pairs, but I understand it displays initial results from the original Koha table.
16:38 thd       kados: I think that may be the answer.  Two different search types may be mingling there in the whatever-it-is-called field-values search, not the authorities search.
16:39 kados     ?
16:39 thd       kados: paul had explained the improper authorities matching but that is different.
16:41 thd       kados: remember when you had me search forest and choose one of the bottom set of links claiming 4 hits but the actual results showed six biblios when clicking?
16:44 thd       kados: http://opac.liblime.com/cgi-bin/koha/opac-search.pl
16:45 thd       kados: search forest from the subject '...'
16:46 kados     I remember
16:46 thd       kados: Then choose Forest animals, reporting 4 hits, from the 'Catalog Search Results' set at the bottom.
16:47 thd       kados: the result set has 6 records.
16:51 kados     yep
16:52 thd       kados: I think the reason is that the 'Catalog Search Results' show hits from 650 $a or rather biblio.subjects only while the final results search every field/subfield pair in the seealso for 650 $a from the bibliographic framework.
16:52 kados     because all the dictionary search does is fill the search input box with values for the search
16:52 kados     thd: of course that's the reason :-)
16:54 thd       kados: so what happened to 'values' or whatever instead of '...' for this eminently useful search?
16:56 kados     guess it got lost in the shuffle :-)
16:58 thd       kados: It would be much easier to refer to even as the values search instead of the '...' search :)
17:01 thd       kados: Unless you really still believe that it has no value.  It has some bugs but those can be fixed.
17:05 kados     I'd rather it not be highlighted until it's really great
17:06 kados     if you know what I mean :-)
17:06 kados     I have some ideas for how to improve it
17:06 kados     but no time right now I'm afraid
17:07 thd       kados: Once we have authority record import working it will be really great.  Even authority building which should work first will make it quite good.
17:08 kados     yep
17:08 kados     I've gotta grab a snack ... brb
17:08 thd       kados: Authority building is fairly trivial.  Paul explained the issues that I had not seen in the past.
17:11 kados     I have some time to work on that this afternoon
17:12 kados     the problem is, without a good authorities file and matching biblios
17:12 kados     it's hard to build an import script :-)
17:14 thd       kados: I only meant that building authority records from no authorities file is fairly trivial and you should ensure that is working as expected before working on the ultimate solution with authority file importing.
17:20 kados     i see
17:20 kados     ok ... so lets get that working right now
17:21 thd       kados: I would be happy to; that should certainly be part of 2.2.6/2.4
17:22 thd       kados: I have not answered that thread but there are few new features.  Mostly major bug fixes.
17:23 thd       kados: Maybe you did not count them as features previously if they had not worked :)
17:24 kados     they are bug fixes :-)
17:27 kados     so I still don't quite understand paul's explanation of the roles of the 'taglist', 'key', 'other' and 'authtag'
17:27 kados     thd: do you?
17:27 thd       kados: yes I do so I understand everything now :)
17:28 kados     could you explain it to me?
17:29 thd       kados: I am still grabbing CVS and updating.  I will start with what I can remember well.
17:31 thd       kados: key is those subfields which are invariant in the controlled bibliographic field for all instances of the same controlled value.
17:32 thd       kados: other is those subfields which may be different in the controlled bibliographic field for all instances of the same controlled value.
17:32 thd       s/all/some/
17:33 thd       kados: Is that as clear as mud?
17:33 kados     heh
17:33 thd       kados: Do you understand that or should I give an example?
17:34 kados     I think I understand
17:34 kados     but I'm not clear on how to use that information to best create auth records given my current data
17:35 kados     so lets take some examples
17:35 kados     NAME
17:35 thd       kados: Is NAME an example?
17:35 kados     yea
17:35 kados     taglist should be '100'
17:35 kados     key should be 'a'
17:36 thd       kados: So NAME is personal author only ?
17:36 kados     other should be 'b|c|d|e|f|g|h|i...0|1|2|3' etc.
17:36 kados     right?
17:36 kados     well my thought was
17:36 kados     I would have to run the script separately for each type of name
17:36 kados     with different values in the hash
17:37 kados     (ie, don't overwrite the existing values, just append more )
17:38 thd       kados: yes but you can see from what we have already done that the authorities searching in the OPAC takes care of different types of name authorities.
17:39 kados     only if you manually add them I think
17:40 thd       kados: you need to run the script only once with all the values contained in buildauthoritis.pl
17:41 kados     i think there is a limitation in the existing script
17:41 thd       kados: If you include provision for every controlled bibliographic field and authority type buildauthoritis.pl will work on them all unless I am missing something.
17:44 kados     thd: I don't think you can specify more than one authcode
17:44 kados     thd: which if I'm not mistaken, is critical to how MARC21 authorities work
17:44 kados     thd: right?
17:46 thd       kados: the authority codes are SAUT, SAUTIT, etc in the existing UNIMARC script.
17:48 kados     sorry, I meant authtag
17:49 kados     you can't specify more than one authtag
17:49 thd       kados: UNIMARC and MARC 21 authorities function comparably for everything except subject subdivisions.
17:49 kados     authtag :-)
17:51 thd       kados: The only difference there is that authtag is in the 2XX for UNIMARC authorities and 1XX for MARC 21 authorities.
17:56 kados     right, but you can't have more than one authtag for one auth type
17:56 kados     for instance, we have more than just 100 that belongs in NAME
17:56 kados     I hope I'm making sense
17:57 thd       kados: You are making sense but you are a little confused.
17:58 kados     could you enlighten me? :-)
17:58 thd       kados: Let me clarify one issue that you had been confused about yesterday when we spoke.
17:58 kados     sure
18:00 thd       kados: searches for tracings and references would always be on the authority record.  Authority searches start at the authority records and are then run against the bibliographic records keyed to that matching authority record .
18:01 thd       kados: This would be the behaviour of an authority search in the OPAC or intranet.
18:01 thd       s/or intranet/
18:01 thd       s/or intranet//
18:03 kados     that's not how an authorities search works now I don't think
18:03 thd       kados: I guess the intranet authority search also allows pulling up the bibliographic records and not just the authority record.
18:05 thd       kados: It may be structured internally in some different way but I was trying to clarify why 4XX and 5XX from the authority record do not end up in the bibliographic record only 1XX from the authority record ends up in the bibliographic record.
18:06 thd       kados: With the exception of subdivided subjects that may include 7XX from the authority record as a subject subdivision in the bibliographic record.
18:10 thd       kados: So as you were saying before I digressed 100 and only 100 will appear for authtag for your NAME which would be less ambiguous as PNAME.
18:10 kados     ok ... here's the problem
18:10 kados     if I break down my auth types into PNAME, CNAME, etc.
18:11 kados     when someone goes to do an auth search they won't understand how it's supposed to work
18:11 kados     there should only be four search points:
18:11 kados     NAME, NAME/TITLE, SUBJECT, UNIFORM TITLE
18:12 thd       kados: That is why you want to rerun the script with different values.
18:12 kados     yep
18:12 kados     make sense?
18:13 thd       kados: The OPAC '...' authorities search takes care of that automatically.
18:14 thd       kados: Only the intranet authorities search which you had transferred to the OPAC has that problem.
18:15 kados     hmmm
18:15 kados     but the dictionary search doesn't provide any details on the subject headings
18:15 thd       kados: The intranet authorities search is designed for librarians who want or need that degree of precision in their search.
18:16 |hdl|     chris : I read your message issued at 19:05 about  MARC::Record->new_from_xml($rs->record($i)->raw(), 'UTF-8');
18:16 thd       kados: which dictionary search does not provide what details?
18:16 kados     |hdl|: i updated cvs
18:16 kados     |hdl|: so just update and it should work
18:20 thd       kados: Do you mean that the '...' authorities search in the OPAC does not return the type of subject heading, topical, geographic, etc.?
18:21 |hdl|     It works YES.
18:21 |hdl|     Can I commit those changes to Search and SearchMarc.pm ?
18:22 chris     i think kados already has, hdl
18:23 kados     |hdl|: i did already :-)
18:26 thd       kados: I still have ISO 8859-1 XHTML from the rel_2_2 templates.
18:26 kados     thd: npl templates?
18:26 thd       kados: yes
18:27 thd       unless rsync is failing to copy the cvs checkout correctly
18:29 thd       kados: the character sets in the templates are trivial.  I will look at the issue later.
18:29 kados     thd: opac.liblime.com is stock CVS
18:29 kados     thd: so they are definitely utf-8 in npl's templates
18:30 kados     |hdl|: you should change your templates to utf-8 too if you haven't already
18:30 thd       kados: except that you have special LibLime npl templates
18:31 kados     nope I don't
18:31 thd       kados: your templates on LibLime are different from the ones on CVS.
18:31 kados     nope :-)
18:32 thd       kados: unless you changed them in the past few days.
18:32 kados     thd: they are stock cvs
18:32 |hdl|     > as far as I know, there is no way to detect the browser's system locale setting
18:32 |hdl|     kados :
18:32 |hdl|     [19:47:24] <kados> from the web server
18:32 |hdl|     [19:47:40] <kados> therefore, if you own a computer that uses only 8859-1
18:32 |hdl|     Yes
18:32 thd       kados: cvs npl templates had the link to authorities which yours did not.
18:33 |hdl|     but you can add a charset default to your webserver.
18:33 |hdl|     And it should do the job.
18:33 kados     |hdl|: ahh ... I didn't know that
18:33 kados     |hdl|: thanks for the tip
18:33 |hdl|     thx for the commit.
18:34 |hdl|     I didn't know you did. (was reading the log along... and doing tests.)
18:34 |hdl|     But there is still a problem with accents for me.
18:35 |hdl|     MtÌŒhodes de renormalisation
18:35 |hdl|     should be méthodes de renormalisation.
18:35 thd       |hdl| my webserver has a default but will return pages in whatever character set the charset meta-tag specifies in the page it serves.
18:36 kados     |hdl|: maybe your data was converted to MARC-8 on import
18:36 kados     |hdl|: so you will need to re-import in that case from the original files
18:36 kados     |hdl|: just a guess
18:38 thd       kados: And hope that my ghost data issue for bulkmarcimport.pl -d is just my own user error; otherwise you might be stuck in MARC-8 until you scrub the marc_subfield_table thoroughly.
18:39 thd       |hdl| above
18:39 thd       kados: Do you mean that the '...' authorities search in the OPAC does not return the type of subject heading, topical, geographic, etc.?
18:40 thd       kados: which dictionary search does not provide what details?
18:41 kados     thd: the dictionary search does not return the nicely formatted headings we spent so much time crafting :-)
18:42 thd       kados: Do you mean the '...'search in the OPAC?
18:42 kados     yes, that's the dictionary search
18:43 thd       kados: just clarifying my presumption
18:43 thd       kados: it will return Twain and Lewis.
18:45 thd       kados: There is a different template used than the one you had been working on so it does not have template features that you changed.
18:46 thd       kados: owen had suggested that it was controlled by the param template but I am sure |hdl| knows.
18:46 kados     yea, I might be able to modify the dictionary search to do what I want
18:46 kados     maybe that's the best way to proceed
18:47 kados     thd: so I concede ... we need 8 authority types :-)
18:47 kados     X00  	Personal names
18:47 kados     X10 	Corporate names
18:47 kados     X11 	Meeting names
18:47 kados     X30 	Uniform titles
18:47 kados     X48 	Chronological terms
18:47 kados     X50 	Topical terms
18:47 kados     X51 	Geographic names
18:47 kados     X55 	Genre/form terms
18:47 kados     right?
18:47 kados     authtag for all of them will be 1XX
18:47 thd       kados: author searches from the dictionary search search across all author types etc.
18:49 thd       kados: yes, please choose unambiguous names like PNAME for the authority frameworks.
18:49 kados     thd: I will :-)
18:51 kados     thd: so the 'report tag' in the thesaurus structure ... is that the bib or the auth tag?
18:51 thd       kados: There are the subject subdivision authority types that complicate things for MARC 21 but those are of no value in building authority records from the bibliographic records.
18:53 thd       kados: that is the authority field for the authority type if you are building a new authority framework with a name and report tag
18:53 thd       kados: something in the 1XX range matching your list above.
18:54 kados     k
18:56 kados     thd: is this all of them?
18:56 kados     http://koha.liblime.com/cgi-bin/koha/admin/authtypes.pl
18:58 kados     ok ... tags and subfields all entered
18:59 thd       kados: looks good as long as no one trips over anything in their mind misapplying CNAME and CTERM
18:59 kados     heh
18:59 kados     ok ... so now we set up the script
18:59 kados     PNAME
19:00 thd       s/their/his/
19:00 kados     taglist should be '100' right?
19:00 thd       yes
19:00 kados     key?
19:00 thd       100 no
19:00 thd       kados: if taglist is a list we can add more
19:01 kados     ok
19:01 kados     such as 100, 400, 500?
19:01 kados     what are these going to be used for?
19:01 thd       kados: These would be bibliographic record tags where personal name is controlled
19:02 kados     so 100 and 700 maybe?
19:02 thd       kados: so certainly 100 and 700
19:02 thd       kados: most likely more but those are the most obvious ones
19:03 kados     ok ... how about 'key'?
19:03 thd       kados: maybe 600, but I am not perfectly clear that applies.
19:05 kados     I think just 'a' in the key
19:05 kados     to maximize the number of items that are matched :-)
19:05 kados     thd: right?
19:06 thd       kados: I think 600 and 800 at least for taglist
19:06 kados     ok
19:06 kados     100|600|800|700 then
19:07 thd       good
19:08 kados     what about 'key'?
19:08 kados     just 'a' right?
19:08 kados     to maximize the number of matches?
19:08 thd       kados: do you think that you have many 100 $a in your data where that would fail to match 100 $a $d for the same author?
19:09 kados     I'm not sure
19:10 kados     thd: lets assume so
19:10 kados     thd: I know that many records don't have $d in the 100
19:10 thd       kados: if you had all national quality bibliographic records obtained from authority controlled sources or created with authority control then I would suggest the following
19:11 thd       kados: It does not matter that many records do not have $d just that those records which should have $d are not also in the database along with those that do have it.
19:12 thd       kados: For the same author
19:13 kados     ok ... lets assume national quality records etc.
19:13 thd       kados: key should be "a|b|c|q|d|" to match authority controlled use.
19:14 thd       kados: In that order for records which had been created with a record editor that preserved order.
19:14 kados     right
19:14 kados     anything in 'other'?
19:16 thd       kados: all other subfields except $3 and $9 might go in other but I am uncertain about real records where $t is used for example.
19:17 thd       kados: you probably have no other subfields than those in key contained in your data.
19:18 kados     so that would be: "1|2|4|5|6|7|8|0|e|f|g|h|i|j|k|l|m|n|o|p|r|s|t|u|v|w|x|y|z"
19:18 thd       kados: very pretty
19:19 kados     authtag is 100
19:19 thd       my lucky day at scrabble :)
19:19 thd       yes
19:19 kados             PNAME   =>      {       taglist => "100|600|700|800",
19:19 kados                                     key     => "a|b|c|q|d",
19:19 kados                                     other   => "1|2|4|5|6|7|8|0|e|f|g|h|i|j|k|l|m|n|o|p|r|s|t|u|v|w|x|y|z",
19:20 kados                                     authtag => "100",
19:20 kados                             },
19:20 kados             CNAME   =>      {       taglist => "110|610|710|810",
19:20 kados                                     key     => "a|b|c|q|d",
19:20 kados                                     other   => "1|2|4|5|6|7|8|0|e|f|g|h|i|j|k|l|m|n|o|p|r|s|t|u|v|w|x|y|z",
19:20 kados                                     authtag => "110",
19:20 kados                             },
19:20 kados             MNAME   =>      {       taglist => "111|611|711|811",
19:20 kados                                     key     => "a|b|c|q|d",
19:20 kados                                     other   => "1|2|4|5|6|7|8|0|e|f|g|h|i|j|k|l|m|n|o|p|r|s|t|u|v|w|x|y|z",
19:20 kados                                     authtag => "111",
19:20 kados                             },
19:20 kados             GNAME   =>      {       taglist => "151|651|751|851",
19:20 kados                                     key     => "a|b|c|q|d",
19:20 kados                                     other   => "1|2|4|5|6|7|8|0|e|f|g|h|i|j|k|l|m|n|o|p|r|s|t|u|v|w|x|y|z",
19:20 kados                                     authtag => "151",
19:20 kados                             },
19:20 kados     does that do it for NAME?
19:20 kados     shall we move on to TITLE?
19:21 thd       kados: corporate names do not have $q
19:21 kados     ok
19:21 kados     fixed
19:24 kados     thd: anything else before we move to TITLE?
19:24 thd       no
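A rough sketch (not the actual build_marc21_authorities.pl logic) of how taglist/key/other entries like the PNAME block above might be used: group bibliographic fields by their "key" subfields so one authority is created per distinct heading, while the "other" subfields are allowed to vary between instances. @biblio_records is a placeholder for the bib records being scanned:

    # sketch: one authority candidate per distinct key string
    my %config = (
        PNAME => { taglist => "100|600|700|800", key => "a|b|c|q|d", authtag => "100" },
    );
    my %seen;
    foreach my $record (@biblio_records) {                     # placeholder: bib records to scan
        foreach my $tag ( split /\|/, $config{PNAME}{taglist} ) {
            foreach my $field ( $record->field($tag) ) {
                my $key = join ' ', grep { defined($_) && length($_) }
                          map { scalar $field->subfield($_) } split /\|/, $config{PNAME}{key};
                next unless $key;                              # nothing to control on
                $seen{$key} ||= $field;                        # 'other' subfields may differ per instance
            }
        }
    }
    # each key in %seen would then become one new authority, its 100 (the authtag) built from that field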
19:26 kados     so what's the proper taglist for TITLE?
19:31 thd       kados: taglist for uniform title "130|210|211|212|222|240|243|630|730|830"
19:32 thd       quite a mouthful including some obsolete forms :)
19:33 kados     nice
19:33 thd       oops I missed some important ones
19:33 kados     heh
19:42 kados     thd: you still there?
19:49 thd       kados: taglist for uniform title "130|210|211|212|222|240|243|400|410|411|440|630|730|830|930|980|981|982|983"
19:49 kados     wow
19:49 kados     :-)
19:49 kados     thd: are you sure that won't auto-fill all of those fields in MARC
19:49 kados     thd: in the biblio record?
19:50 thd       kados: you can add the corresponding 9XX for PNAME, CNAME, and MNAME for use in Canada.
19:51 kados     ?
19:52 thd       kados: add 900 to taglist for PNAME, 910 for CNAME, and 911 for meeting name.
19:53 thd       kados: The above 9XX is for bilingual record use in Canada
19:54 thd       kados: taglist for uniform title "130|210|211|212|222|240|243|400|410|411|440|630|730|830|930|940|980|981|982|983" includes one obsolete Canadian use
19:55 thd       kados: I am uncertain about 490.
19:56 kados     ok
19:57 kados     thd: key?
19:58 thd       kados: taglist for uniform title "130|210|211|212|222|240|243|400|410|411|440|490|630|730|830|930|940|980|981|982|983"
19:58 kados     ok ... how about key?
19:58 thd       kados: I do not know why I was uncertain.  obviously 490
19:59 kados     ok
20:00 thd       kados: hmm key will not be the same for all of those
20:00 kados     it's as I feared :-)
20:00 thd       kados: it is easy to separate the ones which are a problem
20:01 kados     ok ... lets take out the problematic ones
20:01 kados     we can add them separately if need be
20:01 kados     lets just try to get a 'fairly good' model of this going in our lifetimes :-)
20:02 kados     ie, just the basics
20:02 kados     then we can see how it works and can refine it
20:02 thd       kados: take out 400, 410, 411; they are obsolete anyway.
20:02 kados     ok
20:03 kados     thd: if you could only pick one what would it be?
20:04 kados     and if you could only pick 5 what would they be?
20:04 thd       kados: key "a" will work
20:07 kados     thd: ok ...
20:07 kados     are we ready to move to 'GENRE'?
20:08 thd       kados: yes lets move on :)
20:09 thd       if you have just $a then you do not need to worry about obsolete
20:09 kados     ok
20:09 kados     should I have just $a for the NAMES as well?
20:10 thd       s/obsolete/obsolete or other for uniform title/
20:10 thd       kados: NAMES were fine
20:10 kados     ok ...
20:10 kados     taglist for GENRE, TTERM and CTERM?
20:11 thd       kados: I think that they will work well and if they do not then we should discover that because we need to know.
20:11 kados     (now we're in subjects :-) ... almost done :-)
20:12 kados     thd: where are you referencing the values for the taglist?
20:13 thd       kados: a great many places
20:13 thd       kados: 8 different references
20:14 kados     hehe
20:14 thd       kados: that was why it took so long
20:14 kados     right
20:14 thd       kados: they do not all have the same currency but that is good for records derived from older source records
20:15 thd       kados: actually, I did not consult all 8 so there are some things missing from taglist.
20:16 kados     well ... I think it's good enough ... I doubt we really want to fill up the bib record in all of those palges
20:16 kados     places even
20:16 kados     we should probably trim it down to one or two tags
20:16 thd       kados: But you do not have them in your test records and most of what I gave you would be lost when applying the incomplete framework anyways.
20:17 kados     otherwise we run the risk of bloating the bib records
20:17 thd       kados: Nothing will be bloated that is not there to have values applied.
20:17 kados     incomplete MARC framework?
20:17 kados     for biblios? or authorities?
20:18 kados     thd: I was under the impression that 'taglist' is the tags that will get the values from the auth record inserted into them
20:18 kados     thd: so with uniform titles 'taglist':
20:19 kados     130|210|211|212|222|240|243|440|490|630|730|830|930|940|980|981|982|983
20:19 thd       kados: I know that the authorities framework is partially incomplete but at least that is not an issue for this question.
20:19 kados     we will have probably 18 tags created in the bib records all with the exact same subfield values inserted into them
20:19 kados     or am I incorrect in that assumption?
20:19 thd       kados: half of those are obsolete fields and fields that would only be found in Canadian records
20:20 kados     canadian authority records?
20:20 kados     or canadian bibliographic records?
20:20 thd       kados: Canadian bibliographic records
20:20 kados     but with this script, we'll be inserting values into them regardless of whether they exist in the framework
20:20 kados     unless I'm not understanding something
20:21 kados     so what you're telling me is that you want to insert obsolete fields into the bib records?
20:21 thd       kados: The only insertion will be for matches unless I am mistaken about how the script works.
20:22 thd       kados: If your records have no Canadian 930 nothing will be inserted in a bib record for a field that is not present.
20:22 kados     ok ... we'll see soon enough :-)
20:22 kados     so we need a taglist for GENRE, TTERM and CTERM
20:23 kados     as well as 'key' for those three
20:24 thd       kados: it is sad that you are missing some of the fine genre/form fields from your bibliographic framework.  However, they may have never been used in your records.
20:24 thd       kados: so the taglist ...
20:28 thd       kados: "655|656|657|658"
20:28 kados     k
20:29 kados     key?
20:29 kados     thd: I've only got 15 more minutes
20:29 kados     thd: got a conference call
20:29 kados     thd: and I'd like to get this script started before then :-)
20:30 thd       kados: key "a|z|x|y|v"
20:31 thd       kados: the rest is easy because we can reuse the key
20:31 kados     great!
20:31 thd       kados: other is the rest of the subfields except $3 which would not even be MARC 21 and $9
20:32 kados     for all the TERMs and GENRE?
20:33 kados     "0|1|2|4|5|7|8|b|c|d|e|f|g|h|i|j|k|l|m|n|o|p|q|r|s|t|u|w"
20:33 thd       kados: CTERM taglist "648"
20:33 kados     k
20:34 thd       kados: genre/term is one authority type
20:34 kados     yep
20:34 kados     I have GENRE, TTERM and CTERM
20:34 kados     all I'm missing now is the taglist for TTERM
20:34 thd       kados: CTERM key and other is the same as with GENRE
20:35 kados     and the 'other' for TITLE
20:36 thd       kados: for tiel we only defined "a" as key so everything aside from that and $3, $9 could be other
20:36 kados     ok
20:37 thd       s/tiel/TITLE/
20:38 kados     now all I need is the taglist for TTERM
20:38 thd       kados: TTERM taglist is 650
20:38 kados     woot
20:38 kados     ok ... I'm gonna commit this so you can look at it
20:38 kados     then I'll run it
20:38 thd       kados: key "a|z|x|y|v"
20:39 kados     it's committed:
20:39 thd       kados: other the same as with GENRE
20:39 kados     build_marc21_authorities
20:39 kados     build_marc21_authorities.pl
20:39 kados     ok ...
20:39 kados     I'm going to try to run it now
20:40 thd       kados: no
20:40 thd       kados: one more
20:40 kados     which one?
20:40 thd       kados: GTERM taglist 651
20:41 kados     what's GTERM?
20:41 kados     I don't have it in my thesaurus framework
20:42 thd       kados: sorry, then that is GNAME, but the key and other are wrong
20:42 kados     hmmm
20:42 kados     I do have GNAME
20:42 kados     I'm confused
20:42 thd       kados: you have GNAME do you not?
20:43 thd       kados: What do you have for geographic name or geographic term?
20:43 kados     nothing yet
20:43 kados     but I don't see that listed on the 'understanding MARC authorities' document
20:43 thd       kados: missed one I think :)
20:43 kados     maybe it's uncommon?
20:43 kados     so there are 9 auth types then
20:43 thd       kados: no it is very common
20:44 kados     http://www.loc.gov/marc/uma/pt1-7.html#pt4
20:44 kados     not listed there
20:44 thd       kados: you have many of them in your records
20:44 kados     ok
20:44 kados     what's the report tag?
20:44 kados     151?
20:45 kados     problem is GEOGRAPHIC NAMES is 151
20:45 kados     maybe GTERM and GNAME is the same thing?
20:45 kados     I have GNAME already
20:46 thd       kados: yes GNAME, I maybe confused the reference
20:46 kados     ok so we're done
20:46 thd       kados: GNAME taglist 651
20:46 kados     for now at any rate
20:46 kados     yea, got it already
20:46 thd       do you have that?
20:46 kados     yep
20:47 thd       kados: key "a|z|x|y|v"
20:47 kados     it's running :-)
20:48 thd       kados: sorry for my delay consulting references earlier and almost omitting 490 from uniform title in the process
20:48 kados     for GNAME I have:
20:48 kados     GNAME   =>      {       taglist => "151|651|751|851",
20:48 kados                                     key     => "a|b|c|q|d",
20:48 kados                                     other   => "1|2|4|5|6|7|8|0|e|f|g|h|i|j|k|l|m|n|o|p|r|s|t|u|v|w|x|y|z",
20:48 kados                                     authtag => "151",
20:48 kados                             },
20:48 kados     that's what you gave me about an hour ago
20:48 kados     is that wrong?
20:48 thd       kados: stop the process
20:48 kados     stopped
20:49 thd       kados: GNAME is too important
20:49 kados     (in fact, I think it will take about a week to run :-)
20:49 kados     thd: so what should it be?
20:49 kados     just 651 and a|z|x|y|v?
20:50 kados     eeep :-)
20:50 thd       kados: taglist is right if you add obsolete 752 and Canadian something
20:50 thd       kados: key "a|z|x|y|v"
20:51 kados     ok ... running again
20:51 kados     I've got to go
20:51 kados     thd: thanks for your help!
20:52 thd       kados: other would be everything else except $3 $9
20:52 thd       kados: I hope it has something pretty to show tomorrow
20:53 kados     thx
20:54 thd       kados: you're quite welcome
20:55 thd       kados: Some of the Spanish subjects where Spanish was used in the same 650 as English may have a problem but those were not catalogued correctly.
20:56 thd       kados: There seem to be many that were catalogued correctly to show how it should work if the multilingual subject headings fail.
20:58 thd       kados: Multilingual subject headings are wrong.  Subject headings are repeatable for that purpose.
21:02 thd       kados: There is unfortunately no easy document that I have seen clearly stating which bibliographic fields are controlled by which authority types.  I had to use my knowledge of cataloguing practise but that is certainly imperfect as it applies to some uncommon and obsolete fields which may exist in great abundance somewhere.
21:43 Jo        Jo from HLT
21:43 Jo        maybe I can help
21:43 Jo        (as a librarian)
21:43 Jo        although I don't quite understand what you are discussing ....
21:46 thd       Jo: we were discussing matching bibliographic record fields with authority record fields
21:46 Jo        i know about cataloguing
21:46 Jo        and I use koha!
21:47 thd       Jo: I know that we made some mistakes in our haste to test this
21:51 kados     thd: I'm back from my con call
21:51 kados     thd: the import seems to be proceeding nicely, though there are quite a few warnings
21:51 thd       Jo: one thing that I was uncertain of is  whether MARC 21 656, 657, and 658 were considered topical authority terms or genre/form authority terms
21:52 kados     thd: it has finished 3500 out of about 19,000
21:52 Jo        and there, my friend, I cannot help. Not up with marc tags. but I can go have a look at the trusty guide to marc if you like?
21:53 kados     thd: try an authorities search on 'forest' and you'll see entries that include 'Sears' which I assume is incorrect
21:53 thd       Jo: I have all the MARC references there are, with slight exaggeration, and I have not noticed any clarity on that question.
21:54 thd       kados: I thought that we put $2 in other for 650
21:54 kados     thd: 'topical terms'
21:54 kados     i thought so too
21:54 thd       yes :)
21:55 kados     TTERM   =>      {       taglist => "650",
21:55 kados                                     key     => "a|z|x|y|v",
21:55 kados                                     other   => "0|1|2|4|5|6|7|8|b|c|d|e|f|g|h|i|j|k|l|m|n|o|p|q|r|s|t|u|w",
21:55 kados                                     authtag => "150",
21:55 kados     it's definitely in other
21:58 thd       kados: we may have to ask paul for clarity, or maybe that behaviour needs to be controlled with a modification elsewhere, just as I had done for SearchMarc.pm
21:58 kados     could be
22:01 thd       kados: Ideally you may only want sears to match Sears; however, the proper encoding of $2 has a different $2 for the edition of Sears in use at the time of cataloguing.
22:03 kados     you mean $2 is repeatable within the tag?
22:03 thd       kados: Extra parsing would be required for correct behaviour and many of your records seem to show that $2 was not always applied when it should have been so you would miss some matches.
22:04 kados     I don't quite understand why $2 is making it into the authorized heading
22:04 kados     I'm assuming it shouldn't be there
22:04 kados     right?
22:04 thd       kados: $2 is not repeatable but is supposed to include both the name and the current edition of the thesaurus in use at the time of cataloguing within the same $2
22:05 kados     ok
22:05 kados     but still, I don't think it should be in the authorized heading
22:06 thd       kados: my guess for its appearance is that all the subfields are being glued together
22:07 thd       even if they are in other but that does not make sense if I understood paul correctly
22:07 kados     interesting
22:07 kados     http://opac.liblime.com/cgi-bin/koha/opac-authoritiesdetail.pl?authid=4158
22:07 kados     in that case, it is probably my fault
22:07 kados     I need a list of valid subfields for each of the authorized heading types
22:08 kados     (and unauthorized, parallel, etc.)
22:08 kados     thd: do you have such a list?
22:08 kados     (obviously $2 does not belong)
22:10 thd       kados: they are in the official documentation for authorities, but I think I know the problem.
22:11 thd       kados: I suspect the authority framework is supposed to exclude anything that is not wanted.
22:12 thd       kados: Real authority records for Sears headings would and should have $2.
22:13 thd       kados: If you had them all as a uniform set from a matching edition, the edition variance anomaly would not matter, as you would ignore that variance when linking your bibliographic records.
22:22 thd       kados: To control behaviour for your case, where $2 may not always be encoded when it should be, we could use a workaround as I had before.  When you have real Sears authority records you can then override the workaround.
22:24 kados     thd: but in both cases, we don't want $2 to appear in the heading
22:24 kados     thd: in fact, I don't understand what you're saying
22:25 kados     thd: what are you referring to?
22:25 kados     what I'm saying only applies to the _display_ of the headings
22:27 kados     I can fix it now that I have the list of proper subfields to display ($2 is not on that list)
22:27 thd       kados: Using $2 is good if you do not want Sears to match LCSH, provided that you have a complete, up-to-date set of authority records and a working bulkauthimport.pl
22:34 kados     thd: it's fixed now ... a search on forest no longer shows 'sears' in the authorized heading
22:34 thd       kados: I see that so $2 would only be in the bibliographic record to determine which set of authority records to search
22:34 kados     I fixed it only for topical terms ... now I should fix the others
22:34 thd       kados: Did you merely remove $2 from the authority framework?
22:35 kados     no
22:35 thd       kados: how did you fix it?
22:35 kados     I merely removed it from the list of subfields that are displayed in the authorized heading
22:35 kados     the authorized heading for topical terms previously was all of the subfields appearing in the 150
22:35 kados     now it is only: abvxyz68
22:36 kados     those subfields
22:36 kados     I speak merely of display of course
22:36 kados     if you view the full heading there is still a $2 there
22:36 kados     I'm only speaking of the display of the authorized heading when it is displayed in the authorities search
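A minimal sketch of the display-side filter being described, assuming MARC::Record is available; the subfield list abvxyz68 comes from the conversation above, while the helper name and sample data are made up.

    use strict;
    use warnings;
    use MARC::Record;
    use MARC::Field;

    # Hypothetical display helper: build the authorized-heading string
    # from a 150 field, keeping only the subfields abvxyz68 so codes
    # like $2 no longer leak into the display.
    sub display_heading {
        my ($authority) = @_;
        my $field = $authority->field('150') or return '';
        my @parts;
        for my $sf ( $field->subfields() ) {     # [ code, value ] pairs
            my ( $code, $value ) = @$sf;
            push @parts, $value if $code =~ /^[abvxyz68]$/;
        }
        return join ' -- ', @parts;
    }

    my $auth = MARC::Record->new();
    $auth->append_fields(
        MARC::Field->new( '150', ' ', ' ', a => 'Forests and forestry', 2 => 'sears' )
    );
    print display_heading($auth), "\n";   # "Forests and forestry" (no $2)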
22:37 thd       kados: Oh yes, that is what my SearchMarc.pm fix had done, but does that automatically remove it from the 150 in the newly built authority record?
22:37 kados     no ... should it?
22:37 kados     are you saying that it is not valid to have the $2 in the first place?
22:39 thd       kados: It should not be in the authority record according to the documentation unless we are missing some qualifying information.  $2 is for the bibliographic record only.
22:40 kados     ahh
22:40 kados     in that case, perhaps we have incorrectly inserted the $2 in 'other'
22:40 thd       kados: It would still need to be supplied from the authority record somehow even if not from 150 but I will look at that issue later.
22:41 kados     ok
22:45 thd-away  kados: there is no advantage in removing it from the display if it is in the authority record.
22:45 kados     thd-away: except that it looks better :-)
22:45 thd-away  kados: the display should match the authority record.
22:46 thd-away  kados: It may look better, but it makes it more difficult to diagnose behaviour.
22:46 thd-away  bye for now
22:46 kados     yep ... so I should cancel the update and remove all inappropriate fields from the 'other'
01:28 kados     thd-away: are you back yet
01:28 kados     thd-away: I'm about to head to bed myself
01:53 thd       kados: I am back now
01:57 thd       kados: "used in 0 records".  What happened?
01:59 thd       kados: Without more information I would guess that there is a problem with multiple subfield matching.
05:23 paul      you have to admit, RDDV really is quite the comedian !!!
05:23 pierrick_ hi Paul
05:23 pierrick_ who is RDDV? (the minister?)
05:23 paul      yep yep
05:23 paul      (it's about Article 1, removed, then put back)
05:24 paul      frankly hilarious!
05:24 pierrick_ I'm having trouble following how DADVSI is evolving
05:24 paul      the question is: will the protest end up getting us a 3rd Minister of Culture?
05:24 paul      go have a look over there, it's comical:
05:24 paul      http://www.ratiatum.com/forum/index.php?showtopic=54212&st=60
05:24 paul      (go back up to the start of the thread)
05:24 paul      (that link is the 4th page)
05:39 pierrick_ (it's all rather complicated, those debates in the Assembly look pretty sterile to me)
05:39 paul      mostly I find it hilarious
05:39 paul      but sadly a fairly usual atmosphere:
05:39 paul      * the majority votes.
05:39 paul      * the opposition obstructs and cries wolf.
05:39 paul      except that this time:
05:40 paul      * there are a few UMP members in the opposition camp (Boutin, Dupont-Aignan and a few others -not many-)
05:40 paul      * the minister has completely tripped over his own feet.
05:40 paul      he tried a dirty trick, and it is backfiring on him
05:48 hdl       paul: is there any way to drop the zebra database?
05:48 paul      rm -rf *.mf and presto, it's gone!
05:49 paul      the extended "drop" command being buggy, according to indexdata
05:49 hdl       is it normal that I have a database where the collation is sometimes latin1 and sometimes utf8?
05:50 hdl       (mysql)
05:50 paul      mmm... no.
05:53 hdl       you have everything in utf8 yourself.
05:53 hdl       hello to everyone, by the way
05:55 pierrick_ hi hdl
05:55 pierrick_ hdl: about the MySQL collation, it actually is completely normal (sorry Paul)
05:56 hdl       oh really. And why?
05:56 hdl       I thought updatedatabase was going to convert everything to utf8 ... Did I miss a step?
05:57 pierrick_ http://dev.mysql.com/doc/refman/4.1/en/charset-syntax.html
05:57 pierrick_ character sets and collations at four levels: server, database, table, and column
05:58 pierrick_ the current HEAD version of updatedatabase does not convert the table/column properties to UTF8; it's commented out, yet it claims that it does!!! (I asked Paul why the day before yesterday and got no answer :-/)
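A minimal sketch, assuming MySQL 4.1+ and a DBI handle, of the table-level conversion step that appears to be commented out in updatedatabase; the connection parameters, database name, and collation choice are assumptions, not Koha's actual configuration.

    use strict;
    use warnings;
    use DBI;

    # Hypothetical connection details for illustration only.
    my $dbh = DBI->connect( 'DBI:mysql:database=koha;host=localhost',
        'kohaadmin', 'password', { RaiseError => 1 } );

    # Set the database-level default so newly created tables inherit UTF-8.
    $dbh->do("ALTER DATABASE koha DEFAULT CHARACTER SET utf8 COLLATE utf8_general_ci");

    # Convert each existing table (and its columns) in place.
    my $tables = $dbh->selectcol_arrayref("SHOW TABLES");
    for my $table (@$tables) {
        $dbh->do("ALTER TABLE `$table` CONVERT TO CHARACTER SET utf8 COLLATE utf8_general_ci");
    }

    $dbh->disconnect;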
05:59 paul      ah, the fact that it's commented out may well be a mistake. I did quite a bit of tinkering.
05:59 paul      (I missed that remark the day before yesterday)
05:59 paul      in fact, to be honest, the switch to utf8 was a real ordeal for me. And clearly I still haven't found the right recipe...
06:00 paul      so if anyone wants to put on the apron and finish the dish, they are most welcome ;-)
06:02 pierrick_ hmm... I'm happy to take care of it if you want
06:02 pierrick_ ... to try to take care of it
06:04 hdl       pierrick_: could you keep me posted on your progress/tests?
06:04 hdl       So that we are not each always searching separately on our own.
06:05 pierrick_ OK, I'll keep you posted
06:06 paul      great, thanks.
06:06 paul      don't hesitate to ask us when you're struggling. We can tell you about the pitfalls we have already fallen into!
10:52 kados     hi everyone
10:52 kados     hdl: did you get the encoding problem fixed?
10:52 paul      hello shedges & kados.
10:52 paul      i don't think so.
10:52 shedges   hi paul
10:52 paul      pierrick is working on it too.
10:53 paul      the 1st customers that have seen what will become 2.4.0 are very happy with the new OPAC features!
10:53 pierrick_ Hi :-) Yes I'm working on UTF-8 issues
10:53 paul      (& everybody prefers the liblime stylesheet for the css templates to the previous ones!)
10:54 kados     paul: :-)
10:54 paul      + someone suggested I make it the default, to show everybody that something has changed.
10:54 kados     hehe
10:54 paul      I think i'll do it...
10:54 kados     cool ... well I'm glad to have contributed something useful :-)
10:55 kados     pierrick_: so when is the problem occurring?