Time  Nick      Message
10:21 _hdl_     good night chris :)
07:27 chris     right bedtime
07:16 chris     then i have to go to bed :)
07:16 chris     yep hdl .. for a little while
06:05 |hdl|     chris around ?
04:52 |hdl|     hi
02:33 thd       kados: Those servers are fantastically good compared to some that time out on almost any request for a single record.
02:18 kados     heh
02:05 thd       kados: for amusement rather than accuracy: http://urbansemiotic.com/2005/05/30/top-five-worst-z3950-connections/
02:03 thd       kados: Maybe I would have some better results from using 101 instead of 1.
02:02 thd       kados: "The type-101 query is defined as identical to the type-1 query, with the exception that the Prox operator and Restriction operand are defined not only for version 3, but for version 2 as well. Thus the definition of the type-101 query is independent of version."
01:56 thd       kados: Is 101 user defined?
01:55 kados     query-type 1 and 101
01:55 kados     Section 3.7 of the spec (page 104 of the NISO-2003) specifies
01:54 kados     hmmm ... it could be that 'type' is different than 'attribute'
01:53 thd       kados: Zing is partly a simplification of Z39.50 that people could agree upon.  There is, obviously, a high value on searching across a consistently implemented standard.
01:50 thd       kados: The problem is always how much and in what manner the target supports the possible features.
01:50 kados     ok ... I'm gonna start at the source -- the z39.50 agency -- and try to figure out how all these specs are related
01:50 kados     right ... it's mainly SRW/U + CQL if I remember correctly
01:49 thd       kados: Zing is a major modification to elements of Z39.50, prepared by the same agency, not an alien system of different origin.
01:48 kados     that's what i would think
01:46 thd       kados: They may not belong.  Are they then Bib-2, Bib-3, etc. ?
01:45 kados     they don't belong ... I'm sure of it :-)
01:45 kados     yea, I see them
01:45 thd       kados: relation, structure, truncation, etc. are all there in that document.
01:44 kados     so bib-1 should only cover type = 1 or USE
01:44 kados     characteristics of a search term in a Type-1 query
01:43 kados     The attributes of Attribute Set bib-1 are used to indicate the
01:43 kados     thd: see section 2.
01:43 kados     thd: I think I'm right
01:43 kados     what the heck is the difference between bib-1 and the bath profile?
01:42 kados     shit ... I'm really not getting this then
01:42 thd       kados: Those are also present, look further down that link.
01:42 kados     right?
01:42 kados     bib-1 is strictly for defining what to _use_ when applying those other attributes
01:41 kados     meaning bib-1 doesn't define things like relation(2), position(3), structure(4), etc.
01:41 thd       kados: Zing SRU/SRW have other search options but they are rough equivalents.
01:41 kados     thd: so I'm right that the '1' in bib-1 is the 'USE' attribute in Z39.50 right?
01:39 thd       kados: Zing is made to replace Z39.50, and Bib-1 is tied to Z39.50.
01:38 kados     thd: you mean zing is made to replace bib-1?
01:38 kados     thd: er?
01:38 kados     (which, now that I'm thinking of it, could probably be done in zebra with a simple encoded hierarchy)
01:37 thd       kados: Bib-1 is the needed set for retrieving records from the rest of the world which does not Zing yet.
01:37 kados     where given any arbitrarily complex hierarchy of branches, they can find records at any node or branch of the tree
01:37 kados     but I think they might make it up in things like scoped searching
01:36 kados     they are sacrificing on support for things like bib-1
01:36 kados     yea ... well they've done quite well actually
01:36 thd       kados: Much of Evergreen seemed to have a common design mistake/bug set with Koha 2 :)
01:35 kados     for us even
01:35 kados     their mappings should work well for us
01:35 kados     specifically, it's the 1=4 for title, etc. stuff
01:34 kados     bib-1 is the attribute set
01:34 kados     so ... I think this is starting to sink in
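(A minimal sketch of what such a Bib-1 attribute query looks like from Perl, using the ZOOM API that comes up elsewhere in this log; the host, database name, and search term are hypothetical.)

    use ZOOM;
    # 1=4 is the Bib-1 Use attribute for Title; 2=3 asks for the "equal" relation
    my $conn = ZOOM::Connection->new('localhost:2100/kohatest');
    my $rs   = $conn->search_pqf('@attrset Bib-1 @attr 1=4 @attr 2=3 "programming perl"');
    printf "%d hits\n", $rs->size();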
01:34 kados     thd: they've got a couple of years head start :-)
01:33 kados     thd: I'm playing catch up with the PINES guys :-)
01:33 kados     thd: sorry ... I don't have much more info yet
01:26 thd       kados: tell me about the Evergreen "meta records".  The last time I looked, downloads were 'come back later, you missed the test window'.
01:25 kados     ahh
01:24 thd       or toward the bottom
01:24 thd       kados: look at the bottom of that link for some simple suggested mappings.
01:23 thd       kados: http://www.loc.gov/z3950/agency/bib1.html
01:22 thd       kados: There is a very simple mapping for the Bib-1 attributes and a much better one for MODS.
01:20 thd       chris: I will be more helpful when I catch up.  I am still a month and a half behind.
01:19 thd       certainly, if I have been helpful chris.
01:19 kados     thd: It seems like it's left to the implementor
01:18 kados     hehe
01:18 thd       kados: sleep is unnecessary today.
01:18 chris     thanks for your help thd
01:18 kados     chris: have a nice evening
01:18 kados     chris: heh ...
01:18 chris     kados, get some sleep :)
01:18 thd       kados: Do you mean that is left as an exercise to the user?
01:18 chris     well im gonna take a break for a bit now
01:17 kados     sigh ... turns out the bath profile doesn't get down into the record level to define specific tags/subfields
01:17 thd       kados: a common title search should search the title, uniform title, key title, and other relevant fields.
01:15 thd       kados: key title is a similar standard abbreviated title commonly used with serial titles.
01:14 thd       kados: Uniform title usually only applies for works appearing in translation or with many variant functional titles.
01:13 thd       kados: Uniform title is a controlled title generally used for the original title in the original language.
01:13 kados     thd: the bath profile for cql has 'keyTitle' and 'uniformTitle'
01:12 thd       we still have mostly Z39.50.
01:12 kados     thd: is 'uniform title' what most people call 'title'?
01:12 thd       unfortunately, the rest of the world does not Zing yet.
01:10 chris     :)
01:10 kados     best thing I've read all night
01:10 kados     http://zing.z3950.org/cql/intro.html#4.2
01:10 kados     This is totally worth the 10 minutes it takes to read:
01:10 chris     yep
01:09 kados     is that you can have more than one spec in the same query :-)
01:09 chris     how cool is that
01:09 chris     and not have to touch anything else
01:09 kados     the really neat thing
01:09 kados     yep ...
01:09 chris     and a HLT spec
01:09 chris     to make a NPL spec
01:09 chris     well this gives libraries the opportunity
01:09 kados     http://zing.z3950.org/srw/bath/2.0/#2
01:09 kados     and here's the bath context set:
01:09 chris     :)
01:08 kados     we'll come up with a 'koha' spec :-)
01:08 kados     and eventually, when we're rich and famous
01:08 chris     cool
01:08 kados     and use that to build my abs file
01:08 kados     yay ... so I should be able to get a complete spec of the 'bath' index set
01:08 kados     The only reliable way to find answers to such questions is by reference to the index-set definitions;
01:07 kados      What exactly are the meanings of indexes? We have an idea that bath.title is some kind of bibliographic title search, but does it include journal titles as well as book titles? Does it include searching for words that occur only in subtitles? And should searches against the dc.subject index-set be simple word-searches, or use a controlled subject-vocabulary? And if the latter, which subject vocabulary? LCSH, the Library of Congress Subject Headings? MeSH, the Medica
01:07 kados     here's the paragraph I've been hunting for all night:
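(For example — illustrative queries only; the exact index names and their semantics come from the bath and dc context-set definitions linked above:)

    bath.title = "voodoo" and dc.creator = smith
    dc.subject = cryptography or bath.isbn = 0131103628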
01:05 thd       Actually, Amazon still has a command text box search which they had once called power search.
01:04 chris     cross site scripting is the rage with script kiddies at the moment, and it doesnt hurt to make their lives difficult :)
01:03 thd       chris: oh, I forgot Koha uses CGI !! :)
01:03 chris     before you do anything with it
01:02 chris     using taint and checking for malicious code
01:02 chris     its a good habit to get into with cgi
01:02 thd       chris: Is there no simple way to protect the system by limiting the affected system area of any malicious code to the perfectly harmless?
01:00 chris     thats right
01:00 thd       chris: you mean bad as wicked first then pass to Zebra for bad as in malformed.
01:00 kados     excellent
00:59 chris     for our little test anyway
00:59 chris     it gives quite good error messages
00:59 chris     and then zebra can grizzle about bad cql
00:58 chris     is strip out bad stuff
00:58 chris     basically what ill do
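(A rough sketch of the taint-plus-whitelist habit chris describes — the field name and the allowed character class here are illustrative, not Koha's actual rule:)

    #!/usr/bin/perl -T
    # taint mode (-T) forces us to validate anything coming from the browser
    use strict;
    use CGI;

    my $cgi = CGI->new;
    my $raw = $cgi->param('q') || '';
    # keep only characters we expect to see in a CQL query; anything else fails
    my ($cql) = ($raw =~ m{\A([\w\s".=<>()/*-]+)\z});
    die "malformed query\n" unless defined $cql;
    # $cql is now untainted; Zebra will still complain if the CQL itself is bad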
00:58 thd       syntax checking for parentheses matching etc.
00:58 kados     chris: or the cql context set?
00:57 kados     chris: should we start with the 'bath' context set in cql?
00:57 chris     ahh no, probably just say its malformed
00:57 thd       I was imagining that you would be repairing malformed queries.
00:56 thd       oh yes, I was not thinking about malicious in that context.
00:56 kados     doubt that would actually work
00:56 kados     well ... something like that anyway
00:56 kados     thd: or better yet `mailx -s "stolen passwords" < `cat /etc/passwd`
00:55 chris     no input should ever be trusted to be sane :)
00:55 kados     :-)
00:55 kados     thd: "\; `cat /etc/passwd`"
00:54 kados     thd: things like:
00:54 kados     chris: meanwhile, I'll get a basic abs file for all the cql stuff best I can
00:54 thd       kados chris: what user input would not be trusted?
00:53 kados     thd: and not trust user input
00:53 kados     thd: but as chris pointed out, we'll want to parse it
00:53 kados     thd: I think we'll have it
00:53 kados     sweet
00:53 thd       kados: some cranky librarians may insist on having that text box as a feature.
00:53 chris     ill make a little search-test.pl
00:52 chris     ill work on that
00:52 chris     good idea
00:52 kados     chris: might help us better evaluate how to handle pattern matching, relation, etc.
00:52 kados     chris: for test purposes, it might actually be interesting to have a text box where we can enter in cql queries and see how zebra handles them
00:51 kados     which is 'sgml-like' in its hierarchical nature
00:51 kados     yea, I think in their case, it's just a naming convention for their internal structure
00:50 thd       kados: What I have not fully examined is the Zebra Docs.  I made an improper reference to SGML earlier today that did not apply to Zebra.
00:50 kados     http://zing.z3950.org/cql/intro.html
00:50 kados     here's a nice intro:
00:49 kados     excellent
00:49 thd       kados: CQL is easy and very close to CCL with which I grew up.
00:48 thd       kados: fortunately, I still have almost better than perfect vision
00:48 chris     :)
00:48 kados     we _do_ have those :-)
00:47 kados     and a finger on their lips :-)
00:47 kados     with glasses low on their noses
00:47 kados     hehe ... well ... they're librarians :-)
00:47 thd       kados: you mean more cranky than I must seem at times :)
00:46 kados     thd: have you looked at the cql spec?
00:46 kados     hehe ... if you only knew :-)
00:46 thd       kados: you have cranky librarians? :)
00:45 kados     which makes me happy :-)
00:45 thd       chris: no need to restrict anything to searching merely MARC as long as MARC itself is not restricted.
00:45 kados     and I can play with the .abs file until it does
00:45 kados     can say 'I did a search on X and it didn't come up'
00:45 kados     it also means that my cranky librarians
00:45 kados     which is faaaantastic! :-)
00:44 kados     !!
00:44 kados     thd: and they won't need to code anything
00:44 kados     thd: so any library can just write a new .abs file if they want search behavior to change
00:44 kados     thd: the .abs syntax is quite sane
00:44 kados     thd: the nice thing is
00:44 kados     bummer
00:43 thd       kados: MODS mapping does look like a good starting point for determining which fields to index for common queries, however, I noticed some significant gaps in subfield coverage.
00:43 chris     the underlying structure will be marc-xml .. and maybe others .. we dont want to restrict the search to just marc either :)
00:40 thd       chris kados: try to preserve the ease of extending searching by not fixing the search parameters to a subset of MARC.  This would result if GRS-1 or MODS were the actual target of searches and not an underlying MARC or MARC-XML record.
00:40 chris     :)
00:40 chris     voodoo?
00:38 kados     I'm actually composing a question about that to koha-zebra right now
00:38 kados     hehe
00:35 chris     theres no sacrifice three chickens, point your toes eastward and now 245z = the name of the first born child of the 33rd leap year
00:34 chris     yeah its actually spec'd in a sane way
00:33 kados     well ... the easy stuff is
00:33 kados     cql didn't seem like a walk in the park when I looked at the spec last
00:33 kados     it seems like quite a job to parse incoming cql
00:32 kados     yep
00:32 chris     dont want to trust user input :)
00:32 kados     yea, makes sense
00:32 chris     for sanity checking
00:32 chris     id parse it anyway
00:32 kados     I guess we'd have to directly parse a query coming in from an input box ... or just feed it directly to zebra
00:31 chris     as long as input name=something where something is the same as url?something=stuff
00:31 chris     should be able to
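(In other words — a sketch, with 'q' as a hypothetical field name — CGI's param() reads the value whether it arrived as url?q=... or as a POSTed form field, so one code path can take a typed-in CQL query from either:)

    use CGI;
    my $cgi   = CGI->new;
    my $query = $cgi->param('q') || '';   # works for GET and POST alike
    # hand $query to the search layer (after the sanity checks discussed above)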
00:30 kados     in cgi ...
00:29 kados     with the same param()
00:29 kados     and direct cql syntax in a text box
00:29 kados     post queries (from a form)
00:29 kados     url queries
00:29 kados     i wonder if there's a way to handle
00:28 kados     pure cql ... that's definitely the way to go
00:28 chris     and Search.pm  deals with the rest
00:28 kados     yep
00:28 chris     so we just say search author=bob
00:28 chris     and make the template writers and opac-search not have to know about marc
00:28 chris     lets cut out the middle man
00:28 chris     then convert it to cql
00:28 kados     or marc.title=''
00:28 chris     is take a bunch of marc stuff
00:27 kados     dc.title=''
00:27 chris     currently what searchmarc does
00:27 kados     cause you can do things like
00:27 chris     yeah
00:27 kados     cql will be nice for that
00:27 kados     nice
00:27 chris     in theory
00:27 chris     and doing more than just marc
00:27 kados     excellent
00:27 kados     based entirely on perl-zoom
00:27 chris     thats right
00:27 kados     a new set of search methods
00:27 chris     yep
00:26 kados     I see ... so Search.pm is where you're aiming
00:25 chris     meanwhile we can make Search.pm do cool stuff
00:25 kados     yep ...
00:25 chris     and get zebra going
00:25 kados     makes good sense
00:25 chris     just new Biblio.pm and SearchMarc.pm
00:25 kados     right
00:25 chris     without needing to change bunches of templates and scripts
00:25 chris     can be upgraded to head
00:24 chris     it means that any 2.2.5
00:23 chris     while i work on Search.pm
00:23 chris     its pretty much a drop in replacement
00:23 kados     nice ...
00:23 kados     right
00:23 chris     yep
00:23 chris     just its searching zebra, not the mysql db
00:23 kados     so $query is in CQL
00:23 chris     so opac-search.pl just keeps working like it alway did (to outside appearances)
00:22 chris     and hands them back in the same format it used to
00:22 chris       my $line = MARCmarc2koha($dbh,$record);
00:22 chris     it goes off to zebra, gets the results, runs them thru
00:22 chris     yep
00:22 kados     through the catalogsearch sub?
00:21 chris     on down
00:21 chris     look at about line 240 in SearchMarc.pm
00:21 chris     opac-search.pl is fetching its results from zebra
00:21 chris     yeah that runs thru zebra
00:19 kados     opac-search.pl right?
00:19 kados     I still don't get how the search is running through zebra
00:19 kados     so where are title and author coming from in default view?
00:18 chris     try a marc view and youll get a server error
00:18 kados     or is that just it ... it's not there at all ? :-0
00:18 chris     i went off to work on making sure acquisitions is working
00:18 kados     where's it coming from then?
00:18 chris     i figure thats a minor detail
00:18 chris     got sidetracked
00:18 chris     nope
00:17 kados     so are you pulling out the MARC now from zebra for display?
00:17 kados     heh
00:17 kados     chris: nice!
00:14 chris     i just managed to acquisition a book .. the add item was stuck
00:14 chris     http://opac.koha3.katipo.co.nz:81/cgi-bin/koha/opac-detail.pl?bib=101
00:14 chris     woot
22:35 kados     be back later
22:33 kados     I could be totally wrong though
22:33 kados     systag sysno 090$c
22:33 kados     but I _think_ you should be able to do something like
22:33 kados     I'm not sure why that's in the zebra.cfg instead of the collection.abs
22:32 kados     systag sysno sysno
22:32 kados     we have:
22:32 kados     in the zebra config
22:31 kados     sysno: An automatically generated identifier for the record, unique within this database. It is represented by the <localControlNumber> element in XML and the (1,14) tag in GRS-1.
22:31 kados     Specifies what information, if any, Zebra should automatically include in retrieval records for the ``system fields'' that it supports.  systemTag   may be any of the following:
22:31 kados     chris: there's something in there about 'systag'
22:30 kados     http://indexdata.dk/zebra/doc/data-model.tkl#field-structure-and-character-sets
22:28 thd       chris said MARC is cool.  I saw :)
22:27 kados     ahh
22:27 thd       kados: you have the intranet/OPAC preference backwards.  Only non-MARC changes the intranet view.
22:26 chris     that looks cool kados
22:26 kados     heh
22:26 kados     thd: are you sure it's a preference for the OPAC? I think it's just a pref for the intranet
22:25 chris     we should fix that :)
22:25 thd       :)
22:25 chris     dear god, the default is to show marc in the opac?
22:24 kados     find @attr 1=/*/datafield[@tag='090']/subfield[@code='c'] somenumber
22:24 thd       kados: in all templates it is a preference setting for Koha 2.
22:24 kados     chris: so you should be able to get the biblionumber with a query like:
22:23 kados     in NPL it's the non-marc view, in css templates its MARC view
22:23 kados     thd: well ... in rel_2_2, default differs in NPL vs default templates for OPAC view
22:23 kados     thd: dunno, we're just brainstorming ... I think I mean the default view
22:22 kados     one that's not quite as fast
22:22 thd       kados: which do you mean as the simple view?
22:22 kados     using another index altogether
22:22 kados     elements, if enabled, allow you to do searches on specific elements in the data
22:22 kados     attributes are your classed searches, defined with names in the abs file
22:22 chris     cool
22:22 kados     so it's what I thought
22:22 kados     >>searching, as an alternative to numerical USE attributes.
22:22 kados     >>In other words, XPATH-statements are used to select elements for
22:22 kados     this is a useful sentence:
22:21 chris     yep, lots to do, but its mostly straightforward
22:20 kados     yep ... pretty exciting eh? :-)
22:20 chris     yep lots of options
22:19 chris     $biblionumber=$line->{biblionumber};
22:19 kados     for the simple view, we may want to do the same, but pass it through ISBD
22:19 chris     and you end with a nice hash ref
22:19 kados     and let him deal with it:-)
22:19 chris     my $line = MARCmarc2koha($dbh,$record);
22:19 kados     directly to the template designer
22:19 chris     for the simple view we just go
22:19 kados     it might be interesting to just dump out the tags/subfields
22:19 chris     yep
22:19 kados     for the marc view
22:18 kados     yep
22:18 chris     then its just a matter of making $record into something nice to hand to the template
22:17 chris     my $record = MARC::Record->new_from_xml($raw);
22:17 chris     $raw=$rs->record(0)->raw();
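(Pieced together, the fetch chris is describing looks roughly like this — a sketch, not the committed code; it assumes the biblionumber is indexed under the Bib-1 Local-number Use attribute (1=12) and that Zebra hands back MARCXML:)

    use ZOOM;
    use MARC::File::XML;          # gives MARC::Record its new_from_xml constructor
    use MARC::Record;
    use C4::Context;
    use C4::Biblio;               # for MARCmarc2koha

    my $dbh          = C4::Context->dbh;
    my $biblionumber = 101;       # e.g. the record behind opac-detail.pl?bib=101
    my $conn = ZOOM::Connection->new(C4::Context->config("zebradb"));
    my $rs   = $conn->search_pqf('@attr 1=12 ' . $biblionumber);   # 1=12 = Local-number
    my $raw    = $rs->record(0)->raw();                  # the stored MARCXML
    my $record = MARC::Record->new_from_xml($raw);
    my $line   = MARCmarc2koha($dbh, $record);           # same hash the templates already expect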
22:17 kados     one is to search for results and pull them out at the same time
22:17 kados     then put out one
22:17 chris     you can use pqf
22:17 kados     one is to search for results
22:17 chris     yeah
22:16 chris     and once we can fetch the matching record
22:16 kados     i think
22:16 kados     there is more than one way to search in Z3950
22:16 kados     and 'present' in Z3950 terms (or 'fetch', I can't remember which)
22:16 chris     basically at this point we know the biblionumber
22:16 kados     I think it's 'show' in Yaz terms
22:15 kados     hehe
22:15 chris     once i figure out how :)
22:15 kados     sweet!
22:15 chris     and i should be able to just get that all from zebra
22:15 kados     and elements are a way to search on specific elements in the data (that aren't indexed?)
22:15 chris     ok, its just the marc view thats broken now
22:15 kados     so maybe attributes are 'classed' searches
22:15 chris     http://opac.koha3.katipo.co.nz:81/cgi-bin/koha/opac-detail.pl?bib=2
22:14 kados     attribute set is attset bib1.att
22:14 kados     in the collection.abs
22:13 kados     esetname B @
22:13 kados     esetname F @
22:13 kados     is that this:
22:13 chris     dunno
22:13 kados     do we have an element specification defined?
22:13 kados     The attribute set (which can possibly be a compound of multiple sets) which applies in the profile. This is used when indexing and searching the records belonging to the given profile.
22:12 kados     Element set names, which are a shorthand way for the client to ask for a subset of the data elements contained in a record. Element set names, in the retrieval module, are mapped to  element specifications  , which contain information equivalent to the  Espec-1   syntax of Z39.50
22:12 kados     so what's element and what's attribute?
22:11 kados     A ! in place of the attribute name is equivalent to specifying an attribute name identical to the element name. A - in place of the attribute name specifies that no indexing is to take place for the given element.
22:11 thd       kados: A possible need to re-export would seem crazy.
22:10 kados     we need some kind of myisamchk util
22:10 kados     I've also got questions about what to do if zebra crashes
22:10 kados     or if we need to first export them
22:10 kados     is whether we can reindex records that are already in zebra
22:09 kados     one thing I'm not clear on
22:09 chris     yep
22:09 kados     you'll of course need to reindex and restart zebra
22:09 kados     word will work for now
22:08 chris     ill have a play
22:08 kados     and you prolly want 'number' but I don't know the code for that
22:08 chris     right
22:08 kados     and 'w' is for 'word'
22:08 kados     oops ... you'll need a tab after biblionumber
22:08 kados     090$c     biblionumber:w
22:07 kados     something like:
22:07 kados     in your collection.abs
22:07 kados     and give it a name like 'biblionumber'
22:07 chris     right
22:07 kados     another is to index 090 as a word or number in zebra
22:06 kados     one is to do a MARC search as Seb outlined in a recent email
22:06 chris     i want to be able to fetch a record, using the biblionumber
22:06 kados     I think there are two ways to do it
22:05 kados     chris: you need biblionumber to be searchable?
22:05 kados     sec
22:01 thd       kados: are you busy finding the answer for chris?
21:55 chris     cos if we get the xml back from zebra .. i can get that in a nice form that opac-detail.pl knows how to handle in 2 lines
21:53 chris     any ideas kados?
21:53 chris     but im not sure what i need to do to make that a searchable field
21:53 chris     the biblionumber is sitting there in 090c
21:53 chris     and to fetch the marc from zebra
21:52 chris     i want to be handed a biblionumber
21:52 chris     is get_record
21:52 chris     so what im working on now
21:52 chris     ok
21:51 thd       kados: That is exactly what the old MELVYL system did for 'meta records' that the Ex-Libris system opted out of implementing.
21:50 kados     thd: meaning 'title', 'author' and the like
21:50 Destinati Chris/Kados - thanks for the sanity check.
21:50 kados     thd: I think they use MODS to build what they call 'classed' searches
21:50 kados     thd: and then indexing those
21:49 kados     thd: (ie, using subjects from all of them)
21:49 kados     thd: pulling together the 'best' parts of each similar record
21:49 kados     thd: they are made by taking the many versions of MARC records that exist in their 5 million record dataset
21:48 chris     destinati: that should work just fine
21:48 kados     thd: it's what they use for searching
21:48 chris     cc
21:48 thd       ?
21:48 thd       kados: what are the PINES 'meta records'
21:47 kados     yep
21:46 Destinati It appears like a USB keyboard to my computer
21:45 Destinati I bought a Symbol brand LS2208
21:45 Destinati I'm new to creating a library from scratch and bar codes
21:45 kados     Destinati: some of their scanners support the check, some don't
21:45 thd       s/can/ought to/
21:45 kados     Destinati: I've got that working for one of my clients
21:45 kados     Destinati: yes, you can have it confirm the check digit
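(For what it's worth, a minimal sketch of one common mod-10 (Luhn-style) check-digit test — whether this matches the particular codabar/mod-10 variant a vendor prints is an assumption:)

    sub check_digit_ok {
        my ($barcode) = @_;
        return 0 if $barcode =~ /\D/;        # digits only
        my @digits = reverse split //, $barcode;
        my $sum = 0;
        for my $i (0 .. $#digits) {
            my $n = $digits[$i];
            if ($i % 2) {                    # double every second digit from the right
                $n *= 2;
                $n -= 9 if $n > 9;
            }
            $sum += $n;
        }
        return $sum % 10 == 0;
    }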
21:44 thd       Destinati: your scanner can check a scanned code.
21:44 kados     thd: you'll want to take a look at the algorithms the PINES guys have put together for Evergreen's 'meta records'
21:44 Destinati or more likely... a human putting in the code
21:44 Destinati to double check the scan
21:44 Destinati but a nice to have
21:44 Destinati Not a required feature
21:44 Destinati I didn't know if it's possible to have Koha confirm the check digit
21:43 thd       Destinati: your presumption is correct to my knowledge that Koha cannot distinguish as long as your bar codes have unique numbers.
21:43 chris     itll work with Koha :)
21:43 chris     if numbers appear
21:43 chris     scan something
21:43 chris     good way to test, open text editor of your choice
21:43 Destinati Chris: Whew - thanks! I'm excited about putting Koha into action.
21:42 chris     yes thats right Destinati
21:41 thd       Audrey: The challenging things can still be done but not efficiently.  Much would need to be done as a batch process during off-peak hours and stored in a normalised manner to overcome data inconsistency and incompleteness.
21:40 Destinati I am about to purchase bar code labels for the first time for our small library of 9000 items. I think that using the codabar format with a mod 10 check digit is fine. I just wanted a sanity check that Koha really won't care about the details as long as my bar code reader gives it a number. Can anyone confirm this?
21:40 Audrey    ok
21:39 thd       Audrey: A huge problem confirmed by OCLC research is lack of control over many important fields for FRBR relations.  Variance and error in cataloguing makes matching difficult relative to an ideal world.
21:37 Audrey    international standard book number?
21:37 Destinati Anyone around?
21:36 thd       Audrey:  Exploiting multiple ISBN matching from OCLC data is the next easiest.
21:35 thd       Audrey: Searching controlled against controlled values for controlled fields should work most easily.
21:33 chris     yep, we predated FRBR .. they just copied us *grin*
21:33 Audrey    which FRBR concepts would work with koha?
21:33 thd       Audrey: Koha started with a relational database model that was somewhat like FRBR even if inefficient for managing textual data.
21:31 thd       Audrey: I believe that both kados and I have a significant enough interest in FRBR concepts to make some aspects work.
21:30 thd       Audrey: I have received excellent advice from someone who knows as much or more than anyone that such a scheme is currently impractical in the way Martha envisions it, however, that will not stop me from trying to cheat around the edges.
21:28 thd       Audrey: It is worth reading, however, her proposals are very CPU intensive.
21:27 Audrey    not yet
21:26 thd       Audrey: have you seen Martha Yee's paper on FRBRizing the OPAC?
21:26 Audrey    Just a conceptualization question from a graduate student looking at koha.
21:25 Audrey    thd: will koha eventually use FRBR? and why or why not?
21:12 thd       kados: read above and signify when you are back.  I can copy, post, and/or fax the needed pages to whomever might need them.
21:10 thd       kados: The relevant fields and subfields should be essentially the same for MARC 21 now as they ever were for US MARC, although, auditing should always be done.
21:06 thd       kados: Ex-Libris merely chooses the best single record.  It does not merge data other than holdings.
21:04 thd       kados: It was creating a super-record that combined useful data from all duplicate records in contributing institutions.
21:03 kados     I'll be back later
21:02 kados     which is?
21:02 thd       kados: Ex-Libris is a recent development and missed their single best feature.
21:01 kados     I'm wondering if their mappings are out of date (or maybe these things don't go out of date)
21:01 thd       kados: however, I have copied all the relevant sections after the interlibrary loan that took over 2 months.
21:01 kados     but ... MELVYL uses Ex-Libris these days
21:01 kados     cool
21:01 kados     wow, and you have one
21:00 thd       kados: There are only about ten copies of the MELVYL system reference manual. (1990- ) listed on OCLC.
20:58 kados     thd: I don't think I have that ... where can I get it?
20:58 kados     thd: ?
20:55 thd       kados, chris: you should have access to the same information I have.  It is an instruction manual on how to build a robust standards compliant union catalogue that scales to many millions of records.
20:52 thd       kados chris: There is also the issue of normalising the values before including the terms in the index.
20:51 thd       kados: title word indexes for the keywords used in the find command are half a page of codes.
20:50 thd       kados: so far the find and browse indexes match for the MELVYL system.
20:41 thd       kados: subject searches are half a page of codes.
20:40 thd       kados: I was listing the searches for authorised values used in browse searches.
20:38 kados     this abs file's gonna be HUGE
20:38 kados     hehe
20:38 thd       kados: corporate author from fields 110, 410, 710, 810, 897, 797 with subfields abckq.  Also from fields 111, 411, 711, 811, 898, 798  with subfields abcdegknq.
20:34 thd       kados: personal author from fields 100, 400, 700, 800, 896, 796 with subfields abckq.
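(As a rough illustration of how such a field list translates into Zebra terms — the index names and the abbreviated subfield coverage here are assumptions, not the MELVYL or Koha configuration:)

    # collection.abs: fold the MARC author fields into searchable indexes
    melm 100$a      Author:w,Author:p
    melm 700$a      Author:w,Author:p
    melm 110$a      Author:w,Author-name-corporate:w
    melm 710$a      Author:w,Author-name-corporate:w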
20:32 chris     heh
20:32 kados     wow ... that's a quotable quote
20:31 kados     hehe
20:31 chris     if the sun shines in ecuador then 245z equal title in swahili
20:31 kados     thd: if you're willing, I'm all ears :-)
20:31 chris     excellent thd
20:31 kados     and I _think_ that's one reason MARC must die
20:31 thd       kados: I have the page in front of me for the fields and subfields used in the original MELVYL indexes.
20:31 chris     thats all it is
20:31 chris     its retarded
20:31 chris     yeah
20:30 kados     that kind of thing really worries me
20:30 kados     [If $f$g$h$k follow $b they go with <subTitle>. If they follow $a they go with <title>
20:30 kados     in paticular:
20:30 kados     titleInfo
20:30 chris     but it might, we'll jsut have to see
20:30 kados     take a look at 3. Mapping
20:30 kados     http://www.loc.gov/standards/mods/v3/mods-mapping.html
20:29 chris     i dont believe it will
20:29 thd       kados: That should not produce a match in Koha 3 but that requires separate tests for index matching against the contents of each field.
20:27 thd       A search for author Fred Smith will match a biblio by Jack Smith and Fred March as joint authors in Koha 2.
20:26 kados     thd: could you explain to me what you mean?
20:26 thd       kados: I mentioned the problem last night to chris about not mushing the indexes for mere speed.
20:23 thd       kados: chapter 6 Index contents
20:20 kados     thd: yikes ...
20:20 thd       kados: there might be copyright issues and I have it in printed form copied from the looseleaf binder.
20:19 kados     thd: is it copyright? can you post it to the list?
20:19 thd       kados: Everyone working on Koha should have a copy.
20:18 kados     sweet ... that'd be interesting to look at
20:18 thd       kados: I have the relevant information from the original MELVYL system.
20:18 kados     thd: any thoughts on MARC searching?
20:17 kados     w is for word
20:17 kados     sweet ... we can specify field types too
20:16 kados     or whatever you think
20:16 kados     yea
20:15 thd       kados: your thought is to use the MODS mapping to determine which MARC fields to index for the simpler common searches in a search form.
20:15 kados     I'm guessing subject'll really be fun :-)
20:14 kados     etc.
20:14 kados     which for an author search
20:14 kados     I'm just looking for a way to determine which fields in MARC should be searched when I do a title search
20:14 kados     no ... not at all
20:14 thd       kados: as long as you were not planning to use MODS to store data or index against
20:14 kados     right
20:13 chris     ie if i search title "chris" on the opac .. what marc fields should that search?
20:13 chris     for searching purposes
20:13 kados     just ... what should be 'counted' as a 'title' in MARC
20:13 kados     I'm not talking about export yet
20:13 thd       s/fro/for/
20:12 thd       kados: yes, however, they are not a one to one mapping between MARC and MODS which will cause problems if you rely upon MODS as distinct from supporting MODS fro export.
20:11 kados     or should I steal your ISBD config?
20:11 kados     I'm wondering if we should steal their mappings for our index configuration
20:11 kados     etc.
20:11 kados     MODS describes some elements like title, author
20:11 kados     thd: http://www.loc.gov/standards/mods/
20:11 kados     thd: I think there are some mappings from MARC to MODS
20:10 thd       kados: What do you mean by the MODS conventions in this context?
20:10 thd       congratulations kados chris paul etc.
20:10 kados     thd: should we follow the MODS conventions?
20:09 kados     for things like 'title' author, etc.
20:09 kados     thd: now's your chance to shine with your mad MARC skills and tell us what we should be indexing on
20:09 thd       yes kados
20:09 kados     thd: you around?
20:05 chris     yeah go home :-)
20:04 kados     sweet ... that makes it easy
20:03 chris     yep
20:03 kados     with bulkmarcimport?
20:03 kados     can I still do a -d for deleting?
20:03 chris     im working on getting opac-biblio.pl to work
20:03 chris     go for it
20:03 kados     I'm thinking of expanding the collection.abs
20:03 chris     yeah we have to build up what we index now
20:03 kados     ok ... this is only indexed with title and author
20:02 kados     to put it in
20:02 kados     cause I had to create a new symlink
20:01 kados     so actually ... I'm not gonna commit that
20:01 chris     yep
20:01 kados     damn fast too
20:01 kados     search working
20:01 kados     http://opactest.liblime.com/
20:01 kados     yay
20:00 chris     ta
19:59 kados     k ... I"ll commit it too
19:58 chris     and stick it there
19:58 chris     ah grab it from unimarc
19:58 kados     chris: you have that file?
19:58 kados     chris: [Tue Feb 14 16:15:52 2006] [error] [client 70.106.188.196] ZOOM error 10012 "CQL transformation error" (addinfo: "can't open CQL transform file '/home/koha/testing/koha/intranet/zebra/pqf.properties': No such file or directory") from diag-set 'ZOOM', referer: http://opactest.liblime.com/
19:57 kados     hehe
19:57 chris     so im testing with npl :)
19:57 chris     css hurts my eyes
19:57 kados     right
19:57 chris     not sure what id change
19:57 owen      I haven't written a prog template for the OPAC
19:57 chris     i havent had to make any template changes yet
19:56 chris     testing with npl
19:56 kados     you developing with css or npl?
19:56 chris     owen is the man to ask
19:56 chris     umm i dont think so
19:56 kados     chris: do you have a 'prog' template for the OPAC?
19:55 chris     until we can swap it in for SearchMarc.pm
19:54 chris     Search.pm should slowly build up
19:54 kados     huh ... no prog templates for the opac?
19:54 chris     im hoping to as well
19:54 kados     I'll prolly be committing a lot of stuff this week
19:54 kados     now that I've got a system up and running
19:53 kados     heh
19:53 chris      :)
19:53 chris     yeah, they mostly sort of work
19:53 kados     so maybe I should try those
19:53 kados     I think paul's actually been developing with prog
19:53 kados     yea ...
19:53 kados     there are no tabs so you can't navigate it
19:53 chris     gonna be lots of template fixing
19:53 chris     heh
19:53 kados     shoot ... system prefs is borked in the npl templates
19:52 kados     yea, just installed it
19:52 chris     if you want to see my debugging output
19:52 chris     or install it :)
19:51 kados     k ...
19:51 chris     you dont need it
19:51 chris     ah sorry, comment that out
19:51 thd       kados chris: be careful with MARC in SGML.  LC no longer maintains the MARC to SGML mappings and they never made it out of beta.
19:51 kados     Can't locate Smart/Comments.pm in @INC
19:51 kados     hehe
19:50 chris     thanks
19:50 kados     ok ... zebra.cfg and collection.abs committed
19:46 kados     but I will in a sec
19:46 kados     didn't commit the usmarc stuff yet
19:46 chris     alter table is for structure
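(i.e. the data change wanted here is an UPDATE rather than an ALTER — a sketch of the equivalent statement:)

    UPDATE systempreferences
       SET value = 'npl'
     WHERE variable = 'template'
       AND value = 'default';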
19:45 chris     you committed ur zebra changes eh?
19:45 chris     yes
19:45 kados     chris: what's wrong with that query? do I need to do an update instead?
19:45 kados     chris: alter table systempreferences set value='npl' where variable='template' and value='default'
19:42 kados     hehe ... error 505 now
19:41 kados     ahh
19:41 chris     to the url
19:41 chris     ahh just add /admin/
19:41 chris     http://opac.koha3.katipo.co.nz:81/cgi-bin/koha/opac-search.pl?op=do_search&type=opac&marclist=&and_or=and&excluding=&operator=contains&value=the
19:40 kados     yikes ... can't even get to the parameters screen
19:40 chris     try the npl ones
19:40 kados     default I think
19:40 chris     what templates?
19:40 kados     opac-new.pl not found
19:39 kados     hmmm
19:39 chris     the one in cvs should work
19:39 kados     so I can try some searching?
19:39 kados     do you have a working SearchMarc.pm?
19:38 kados     why I have hundreds of Connection Closed messages in the server log is a mystery
19:37 kados     and our diag errors are just wrong
19:37 kados     so maybe it finished all the records
19:37 kados     still running
19:37 kados     15:52:44-14/02 zebrasrv(412) [session] Connection closed by client
19:37 kados     15:52:44-14/02 zebrasrv(1) [session] Connection closed by client
19:37 kados     15:52:44-14/02 zebrasrv(1016) [request] EsRequest OK: Done !
19:37 kados     15:52:44-14/02 zebrasrv(1016) [log] user/system: 6/0
19:37 kados     wait
19:35 chris     its still running tho?
19:35 kados     bunch of them ... hundreds
19:35 kados     15:52:44-14/02 zebrasrv(959) [session] Connection closed by client
19:34 chris     what does you server log tell ya?
19:34 kados     looks like it crashed :-)
19:34 kados     Fatal error, cant connect to z3950 server at /home/koha/testing/cvsrepo/koha/C4/Biblio.pm line 165.
19:34 kados     Error 10000: Connect failed
19:34 kados           _d1040 at /home/koha/testing/cvsrepo/koha/C4/Biblio.pm line 158.
19:34 kados     uh oh
19:34 chris     oh yeah it can
19:34 kados     that's what I used before to test with
19:34 kados     ahh ... I'm thinking of grs.marc.usmarc
19:34 kados     unless I'm misunderstanding
19:34 kados     sgml can handle marc files fine
19:33 chris     MARC::File::SGML :-)
19:33 chris     we just have to get someone to write us
19:33 kados     I'll commit the new config stuff
19:33 chris     well 3.1 could be sgml .. ie once all the code is there .. it wont be hard to swap the backend at all
19:32 kados     it would have crunched this db in like 30 secs
19:32 kados     too bad we can't do the sgml ... it's hundreds of times faster to index
19:31 kados     yea ... this rocks
19:31 chris     cooking with gas
19:31 chris     excellent
19:31 kados     chris: working!
19:31 kados     wohoo
19:30 kados     right ... it's subfield C ...
19:30 chris     from addbiblio.pl
19:30 chris     this is my new record
19:30 chris       </datafield>
19:30 chris         <subfield code="d">3027</subfield>
19:30 chris         <subfield code="c">3027</subfield>
19:30 chris      <datafield tag="090" ind1=" " ind2=" ">
19:29 chris     ah ha
19:29 kados     http://koha.liblime.com/cgi-bin/koha/export/marc.pl
19:29 chris     ohh cool
19:29 kados     (BTW: I think the export tool grabs items these days)
19:29 kados     I'll export some records and test with those
19:28 chris     maybe he was only modifying not adding a new one
19:28 kados     not in Biblio either
19:27 chris     he was working with acquisitions
19:27 chris     hmm i dunno
19:27 kados     that already have 090
19:27 kados     my guess is paul's using MARC records exported from Koha
19:27 chris     it will be in C4::Biblio
19:27 kados     I don't see anything in bulkmarcimport that adds a 090
19:25 chris     im getting the same error in acquisitions
19:25 kados     we'll have to put one in with bulkmarcimport
19:25 chris     it should be making them tho, and then inserting into zebra
19:25 kados     so there is no id
19:24 kados     these aren't koha records
19:24 kados     ahh ... of course it is
19:24 chris     lemme dump the xml and see
19:24 chris     could it be 090 is empty
19:24 kados     but didn't help
19:24 kados     I changed the 'i' to caps
19:24 chris     ahh
19:24 kados     melm 090$a      Identifier-standard,Identifier-standard:p
19:24 kados     in collection.abs
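(A hedged sketch of the pairing that the "Bad match criteria (recordID)" / "Record didn't contain match fields in (bib1,Local-number)" warnings just below point at — the field choice and index name here are illustrative, not the settled Koha config:)

    # zebra.cfg: use a Bib-1 index to decide whether an incoming record
    # matches one already in the database (for updates/inserts)
    recordId: (bib1,Local-number)

    # collection.abs: make sure that index is actually populated,
    # e.g. from the Koha biblionumber in 090$c
    melm 090$c      Local-number:w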
19:22 chris     11:22:19-15/02 zebrasrv(4) [warn] zebra_insert_record failed r=1
19:22 chris     11:22:19-15/02 zebrasrv(4) [warn] Bad match criteria (recordID)
19:22 chris     11:22:19-15/02 zebrasrv(4) [warn] Record didn't contain match fields in (bib1,Local-number)
19:22 chris     nope
19:21 chris     and see what happens
19:21 chris     lemme switch it to recordInsert
19:21 chris     hmm
19:20 kados     same error here
19:19 kados     title is 245a
19:19 kados     author is 100a
19:19 kados     yea, you'll need to change the tag/subfields tho
19:19 kados     (because shouldn't we need an xml-based .abs)
19:19 chris     yeah maybe ill copy the collection.abs from unimarc instead
19:19 kados     (which actually seems a bit strange to me)
19:19 kados     even copied over usmarc.abs to collection.abs
19:18 kados     yea ... changed it, restarted zebrasrv ... no go
19:16 chris     local if you compile it yourself
19:16 chris     if you install the debian package
19:16 chris     its in /usr/share/idz
19:16 chris     take local out
19:16 kados     and there's nothing there :-)
19:16 chris     right
19:16 kados     /usr/local/share/idzebra/tab/
19:16 kados     it's in zebra.cfg
19:16 chris     oh no thats right
19:15 chris     i think its looking in
19:15 chris     check C4::Biblio
19:15 chris     looks like it
19:15 kados     maybe my path is wrong
19:15 kados     15:32:11-14/02 zebrasrv(4) [warn] Unknown register type: 0
19:15 kados     15:32:11-14/02 zebrasrv(4) [warn] Unknown register type: w
19:15 kados     15:32:11-14/02 zebrasrv(4) [warn] collection.abs:22: Couldn't find att 'any' in attset
19:14 chris     hmm
19:14 kados     15:32:11-14/02 zebrasrv(4) [warn] collection.abs:14: Couldn't find attset  bib1.att
19:14 kados     15:32:11-14/02 zebrasrv(4) [warn] Couldn't load attribute set bib1.att [No such file or directory]
19:14 kados     <collection xmlns:xsi="http://www.w3.org/2001/XMLSchema-instance" xsi:schemaLocation="http://www.loc.gov/MARC21/slim http://www.l ...
19:14 kados     <?xml version="1.0" encoding="UTF-8"?>
19:14 kados     15:32:11-14/02 zebrasrv(4) [log] 1705 bytes:
19:14 kados     15:32:11-14/02 zebrasrv(4) [log] record 0 type XML
19:14 kados     I also get some other warnings
19:14 chris     i think we can use usmarc.abs as collection.abs
19:13 chris     11:10:00-15/02 zebrasrv(6) [warn] zebra_update_record failed r=1
19:13 chris     11:10:00-15/02 zebrasrv(6) [log] zebra_update_record returned res=1
19:13 chris     11:10:00-15/02 zebrasrv(6) [warn] Bad match criteria (recordID)
19:13 chris     yeah if you look back its warning about
19:13 kados     is the error I'm getting
19:13 chris     altho i wonder if this routine is used for updates as well
19:13 kados     EsRequest  ERROR 224 update_record failed
19:12 kados     yea, that sounds better
19:12 chris     maybe it should be doing a recordInsert
19:12 chris     a specialUpdate that is
19:12 chris     i wonder if it shouldnt be doing an update?
19:11 chris     11:10:00-15/02 zebrasrv(6) [warn] Bad match criteria (recordID)
19:11 chris     11:10:00-15/02 zebrasrv(6) [warn] Record didn't contain match fields in (bib1,Local-number)
19:11 chris     11:10:00-15/02 zebrasrv(6) [warn] Couldn't open collection.abs [No such file or directory]
19:11 chris     ttp://www.l ...
19:11 chris     yeah if you check the zebra log
19:11 kados     ZOOM error 224 "ES: immediate execution failed" (addinfo: "update_record failed") from diag-set 'Bib-1'
19:11 kados     ok ... yea I'm still getting an error:
19:10 kados     hmmm
19:10 chris     in usmarc
19:10 chris     oh we need a collection.abs file too
19:08 chris     one thing i like a lot about zebra, its log is verbose
19:08 chris     sweet
19:08 kados     so I'll update my zebracfg and commit it once I've got it working
19:07 kados     gotcha
19:07 chris     we kinda need to either deal with xml .. or pass around marc
19:07 chris     so without writing a marc->sgml parser
19:07 kados     right
19:07 chris     record => $record->as_xml()
19:07 chris     eg
19:06 chris     to do the translation for us
19:06 kados     ahh
19:06 kados     ing even
19:06 chris     we can use MARC::File::XML
19:06 kados     but I could be misunderstand it
19:06 chris     with xml
19:06 chris     hmm
19:06 kados     as xml can only handle one record per file
19:05 kados     not xml
19:05 kados     I think we ultimately want sgml
19:05 kados     Local Representation
19:05 kados     that section
19:05 kados     http://indexdata.dk/zebra/doc/record-model.tkl
19:05 chris     we should make sure usmarc and unimarc zebra.cfg stay in sync
19:05 kados     lemme find a link
19:04 kados     grs.sgml means it's stored internally much differently than grs.xml
19:04 kados     yea
19:04 chris     ?
19:04 chris     that bit
19:04 chris     recordType: grs.xml
19:04 chris     # Specify record type
19:04 kados     that's the underlying zebra storage format
19:03 chris     saw that
19:03 kados     paul switched from sgml to xml
19:03 kados     couple of other things
19:02 chris     ok add that line, restart the zebrasrv and you should be away laughing
19:02 chris     yep
19:01 kados     very steep learning curve with zebra
19:01 kados     yea, frustrating though
19:01 chris     well not really, its all learning
19:01 chris     hehe
19:01 chris     that was 1.5 hours wasted
19:01 chris     right
19:01 chris     its in the unimarc one
19:01 kados     dou!
19:01 kados     ahh ... right ... I remember that
19:01 chris     perm.anonymous: rw
19:00 chris     we need this line in our zebra.cfg
19:00 chris     ah ha
18:59 chris     This class represents an Extended Services Package: an instruction to the server to do something not covered by the core parts of the Z39.50 standard (or the equivalent in SRW or SRU). Since the core protocols are read-only, such requests are often used to make changes to the database, such as in the record update example above.
18:59 chris     hmm i wonder if you have to set permissions when you create a db
18:58 chris     we are in extended service i think
18:58 kados     (ie, outside of bib-1)
18:58 kados     right ... but doesn't that still use bib-1 diag-set ... or are we in extended services now?
18:58 chris     or koha3 in my case
18:58 chris     but it should be modifying kohatest
18:57 kados     it's the Bib-1 diagnostic set
18:57 chris     thats what it says
18:57 chris     yep
18:57 kados     ES: permission denied on ES - cannot modify or delete
18:57 kados     error 223
18:57 kados     http://www.loc.gov/z3950/agency/defns/bib1diag.html
18:56 chris     why diag-set though
18:56 chris     ES: permission denied on ES - cannot modify or delete" from diag-set 'Bib-1'
18:56 chris     on the client side i get
18:56 chris     yep
18:56 kados     there's a list of those error codes somewher I think
18:55 kados     14:49:32-14/02 zebrasrv(1) [session] Connection closed by client
18:55 kados     14:49:32-14/02 zebrasrv(1) [request] EsRequest  ERROR 223
18:55 kados     14:49:32-14/02 zebrasrv(1) [log] database: kohatest
18:55 kados     14:49:32-14/02 zebrasrv(1) [log] specialUpdate
18:55 kados     14:49:32-14/02 zebrasrv(1) [log] action
18:55 kados     14:49:32-14/02 zebrasrv(1) [log] Received DB Update
18:55 kados     on the server side I'm getting:
18:53 chris     hmm still in the dark
18:43 chris     ohh its further down in the man ZOOM
18:42 chris     ill go see if its on cpan
18:42 kados     maybe perldoc?
18:42 kados     heh
18:42 chris     No manual entry for ZOOM::Package
18:42 chris     chris@wolf:~/koha$ man ZOOM::Package
18:42 chris            optionally be passed in.  See the "ZOOM::Package" documentation
18:42 chris      Creates and returns a new "ZOOM::Package", to be used in invoking an Extended Service.  An options block may
18:41 kados     yea ...
18:41 chris     and this is annoying
18:41 chris     hmm its a puzzle
18:36 kados     right
18:36 kados     changes committed
18:34 chris     and i have yet to figure out what
18:33 chris     theres something in the new way .. such that its not using our db, but is trying to modify the diag-set
18:33 chris     in the zebra_create routine
18:33 chris     it used to do the update an old way .. you can see that its commented out
18:32 kados     right ... it's already loaded
18:32 chris     and something else has use ZOOm
18:32 chris     but thats probably because its running under mod_perl
18:32 kados     weird
18:32 chris     and mine works
18:32 chris     im not sure, since i dont have a use ZOOM
18:32 kados     should the use ZOOM go at the top?
18:31 kados     ok ... I can commit these changes to Biblio.pm
18:31 kados     :-)
18:31 chris     now you have caught up to me
18:31 chris     right
18:31 kados     ZOOM error 223 "ES: permission denied on ES - cannot modify or delete" from diag-set 'Bib-1'
18:31 kados     hehe
18:31 chris     and see what happens
18:31 kados     ok
18:31 chris     try adding a use ZOOM;
18:31 kados     should there be?
18:30 chris     there isnt
18:30 kados     in biblio.pm
18:30 kados     I don't see a use  Net::Z3950::ZOOM
18:30 chris     right
18:29 kados     > force install Net::Z3950::ZOOM
18:29 kados     cpan
18:29 kados     to install perl-zoom i did:
18:29 chris     hmm
18:29 kados     Can't locate object method "code" via package "Can't locate object method "new" via package "ZOOM::Connection" (perhaps you forgot to load "ZOOM::Connection"?
18:28 kados     interesting
18:28 chris     we need to check we connected ok, before we try to set an option
18:27 chris     ah sorry, before line 161
18:27 kados     huh ... it's not throwing the error
18:26 chris     ill commit that to C4::Biblio
18:26 chris     then we should be able to see whats happening
18:26 chris             }
18:26 chris                 die "Fatal error, cant connect to z3950 server";
18:26 chris      warn "Error ", $@->code(), ": ", $@->message(), "\n";
18:26 chris     if ($@){
18:25 chris     actually go
18:25 chris     this should be in there anyway
18:25 chris             }
18:25 chris                 die "Fatal error, cant connect to z3950 server";
18:25 chris     if ($@){
18:25 chris     add
18:25 chris     after line 161
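(Assembled right-side-up — this log reads newest-first — the check chris is pasting amounts to roughly this; the eval and the cqlfile line are the ones quoted from Biblio.pm further down:)

    use ZOOM;
    my $Zconn;
    eval {
        $Zconn = new ZOOM::Connection(C4::Context->config("zebradb"));
    };
    if ($@) {
        # $@ is a ZOOM::Exception when the connect fails
        warn "Error ", $@->code(), ": ", $@->message(), "\n";
        die "Fatal error, cant connect to z3950 server";
    }
    $Zconn->option(cqlfile => C4::Context->config("intranetdir")."/zebra/pqf.properties");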
18:25 kados     right
18:25 chris     ie 127.0.0.1
18:25 chris     no, but we are telling it to connect to localhost
18:25 kados     so it shouldn't matter?
18:24 kados     this box has multiple IPs ... but we're doing socket connections right?
18:24 chris     ok add this
18:24 kados     no deal
18:23 kados     (though I had the db name in the zebra.cfg ... maybe localhost is the trick)
18:23 kados     I'll try that
18:23 kados     ahh
18:22 chris     zebrasrv localhost:2100
18:22 chris     i did zebraidx -d kohatest update Biblios
18:22 kados     it's running ... but no connections thusfar
18:22 chris     right
18:22 kados     zebrasrv @2100
18:22 kados     zebraidx update Biblios
18:22 chris     ok
18:22 kados     and I started zebra with:
18:22 kados     my zebra db is called kohatest
18:21 kados     same error ..
18:21 kados     hmm
18:20 kados     right
18:20 chris     localhost:2100/koha3
18:20 chris     for eg i made a zebra db called koha3 so mine says
18:19 chris     (or whatever is relevant for you)
18:19 kados     k
18:19 chris     zebradb=localhost:2100/dbname
18:19 chris     add
18:19 chris     edit koha.conf
18:19 kados     the last line there
18:19 chris     right
18:19 kados             $Zconn->option(cqlfile => C4::Context->config("intranetdir")."/zebra/pqf.properties");
18:19 kados             };
18:19 kados                     $Zconn = new ZOOM::Connection(C4::Context->config("zebradb"));
18:19 kados     eval {
18:18 chris     what is line 161?
18:18 chris     cos i was too lazy to type the full path
18:18 kados     Can't call method "option" on an undefined value at /home/koha/testing/cvsrepo/koha/C4/Biblio.pm line 161.
18:18 kados     gives me:
18:18 chris     right ./ is just saying relative to this dir
18:18 kados     perl -I /home/koha/testing/cvsrepo/koha bulkmarcimport.pl -file /home/jmf/working/nbbc/data/nbbcmarc.mrc
18:17 kados     (with no '.')
18:17 kados     well ... the old bulkmarcimport needed -file /path/to/file.mrc
18:16 chris     i cant remember if it needs the space or not
18:15 chris      perl -I /path/to/koha bulkmarcimport.pl -file ./file.mrc
18:15 chris     or maybe
18:15 chris      perl -I /path/to/koha bulkmarcimport.pl -file./file.mrc
18:15 chris     try
18:15 chris     its trying to execute file.marc as perl :)
18:15 chris     ah ha
18:14 kados     perl -I /path/to/koha bulkmarcimport.pl file.mrc
18:14 chris     bulkmarcimport -file/path/to/file
18:14 chris     how did you run it
18:14 kados     a different error all together
18:14 kados     Unrecognized character \x1E at /home/jmf/working/nbbc/data/nbbcmarc.mrc line 1.
18:14 kados     syntax error at /home/jmf/working/nbbc/data/nbbcmarc.mrc line 1, near "00382nam  "
18:14 kados     Illegal octal digit '8' at /home/jmf/working/nbbc/data/nbbcmarc.mrc line 1, at end of line
18:14 kados             (Do you need to predeclare a?)
18:14 kados     Number found where operator expected at /home/jmf/working/nbbc/data/nbbcmarc.mrc line 1, near "a 4500005001700000100001600017245001300033260005000046300001000096500001800106650002600124852011000150"
18:14 kados             (Missing operator before a?)
18:14 kados     Bareword found where operator expected at /home/jmf/working/nbbc/data/nbbcmarc.mrc line 1, near "2200121 a"
18:14 kados             (Do you need to predeclare nam?)
18:13 kados     Number found where operator expected at /home/jmf/working/nbbc/data/nbbcmarc.mrc line 1, near "nam  2200121"
18:13 kados             (Missing operator before nam?)
18:13 kados     Bareword found where operator expected at /home/jmf/working/nbbc/data/nbbcmarc.mrc line 1, near "00382nam"
18:13 kados     interesting
18:10 chris     off to read man pages
18:10 chris     when i try to acquisition a book, but from what I can see, it shouldnt be trying to add it to the diag-set .. it should be adding it to koha3
18:09 chris     [Wed Feb 15 10:08:32 2006] [error] ZOOM error 223 "ES: permission denied on ES - cannot modify or delete" from diag-set 'Bib-1'
18:08 chris     hmm paul is away
14:43 kados     heh
14:31 kados     in fact ... I'll send one immediately about next week
14:30 kados     I'll be sure to send a separate email about meetings from now on ;-)
14:30 |hdl|     I had overlooked the meeting information note.
14:30 kados     np
14:30 |hdl|     kados : sorry for not having been there yesterday.
14:08 kados     ciao
12:46 owen      Hi kados
12:46 kados     morning owen
11:40 kados     that's good at least
11:40 kados     ahh
11:40 paul      it's default behaviour. You can define char encoding for every database & table & field.
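(For instance — illustrative MySQL statements with hypothetical database/table names:)

    ALTER DATABASE koha CHARACTER SET utf8 COLLATE utf8_general_ci;
    ALTER TABLE biblio CONVERT TO CHARACTER SET utf8;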
11:40 kados     (sounds good to me)
11:39 paul      (could be in may, 8-25)
11:39 kados     s/sql/mysql/
11:39 kados     do I remember correctly that 'utf-8 = on' in sql is a server-wide setting? and not an individual database setting?
11:39 paul      + i'm preparing a mail for KohaCon.
11:38 kados     ahh
11:38 paul      ok, i'll check (but my problems are not here for instance, they are in SQL tables moving to UTF8)
11:37 kados     paul: so you will get some strange results
11:37 kados     paul: because CPAN version doesn't calculate directory offsets for utf-8 outside the normal ascii range
11:37 kados     paul: not CPAN version
11:37 kados     paul: on sourceforge
11:36 kados     paul: make sure you're using the latest version of MARC::Record
11:36 paul      (with many Tümer Garip mails & hints on this. Thanks to him)
11:36 kados     :-)
11:36 kados     ok :-)
11:36 paul      I know you are impatient, but it would be better to wait until next week.
11:36 kados     (I think it works fine, as I've done it before)
11:35 kados     (import of new records)
11:35 kados     should I begin testing with large datasets?
11:35 paul      import ?
11:35 kados     and import too?
11:35 paul      (& storing in zebra I mean ;-) )
11:35 kados     woohoo!
11:35 paul      some good news : biblio editing in HEAD works quite good.
11:35 kados     afternoon then :-)
11:35 kados     ahh ... right
11:34 paul      hello. Morning ended here in Europe...
11:34 kados     morning all