Time  Nick        Message
11:54 hdl_laptop  in the advanced tab, you can find boolean operators iirc.
11:53 hdl_laptop  mmmm...
11:21 Hui_Nan_    through OPAC
11:21 Hui_Nan_    not au:Smith?
11:21 Hui_Nan_    how could I search for records where author is not Smith?
11:20 Hui_Nan_    hello
10:45 k3rn3l      is there any framework (mvc) behind koha?
09:58 hdl_laptop  not sent on list at the moment
09:58 Amit        k
09:57 hdl_laptop  no
09:51 Amit        hdl : this patch http://lists.koha.org/pipermail/koha-patches/2008-December/002309.html
09:10 mason       but.. my dangerous patch is only used for the initial marc-import - then the codebase is switched back, and all is happy again
09:08 mason       hmm, i don't think so, for this specific upgrade
09:07 hdl_laptop  i think
09:07 hdl_laptop  mmmm... You could have done it with kohafield linking
09:06 mason       yes, a bit dangerous.... but needed for a tricky 2.2 -> 3.0 upgrade
09:06 hdl_laptop  will do
09:06 Amit        yes
09:06 hdl_laptop  if it is hardcoded.
09:06 hdl_laptop  mason: huh.... seems quite dangerous to do so.
09:05 Amit        u can send me if possible
09:05 hdl_laptop  Amit: no url but can send patch
09:05 mason       999$ has bib, bibitem and item numbers...
09:04 mason       i have modified bulkmarcimport.pl, addbib, addbibitems, additems to import new marc records with 999$ ids
09:04 Amit        have u any url
09:04 Amit        k
09:04 hdl_laptop  And it also adds importing authorities.
09:04 hdl_laptop  I have already patched bulkmarcimport for that
09:04 Amit        hdl: by hardcoding the MARC tag, one tag at a time
09:03 mason       kernel: and taking a look at koha src-code too ;)
09:03 Amit        no i have written some code last night but not working
09:02 hdl_laptop  Amit: how would you do this?
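
For reference, the stock import script this thread is tweaking is normally run along these lines (a minimal sketch: the file path and record count are illustrative, and the exact options can differ between Koha versions):

    # import a MARC file as biblios, verbosely, stopping after 1000 records
    perl misc/migration_tools/bulkmarcimport.pl -file /path/to/records.mrc -b -v -n 1000
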
09:02 mason       hi hdl
09:02 hdl_laptop  k3rn3l: : KohaCon could be a good place too.
09:02 mason       ah, that would be handy
09:01 Amit        i m also planning to add matching in ./bulkmarcimport.pl
09:00 Amit        in my case there is matching problem with issn
09:00 mason       i have had success with isbn and issn matching
08:59 Amit        there is matching rule problem with issn
08:59 mason       hmm, yes - a little
08:58 Amit        yes
08:58 mason       amit, for staged-importing?
08:58 mason       thats a good history of koha's development, for you
08:58 mason       kernel: http://git.workbuffer.org/cgi-bin/gitweb.cgi?p=koha.git;a=blob_plain;f=docs/history.txt;hb=8d7f33c548600177a2b55f46a53f9828238c6e05
08:54 Amit        mason: have u checked matching rule in koha 3.01 version
08:53 mason       hey amit
08:53 Amit        heya mason
08:53 k3rn3l      I would like to know a little more about how koha was programmed, is there a good place to learn this?
08:45 mason       heya kernel
08:44 k3rn3l      hi
07:49 nicomo      hi Amit
07:48 Amit        hi nicomo
07:30 kf          good morning all
07:30 hdl_laptop  hi all
07:30 Amit        hi hdl
07:30 Amit        hi kf
07:20 matts       yep
07:20 hdl_laptop  I don't see anyone
07:20 hdl_laptop  matts: are you on biblibre?
07:09 Amit        hi greenmang0
04:10 Amit        same to u
04:09 brendan     I'm signing off and will be back next week... have a good week
04:08 brendan     ;)
04:08 brendan     cool
04:08 Amit        preparing for mumbai seminar
04:08 Amit        everything is fine
04:08 brendan     how's things
04:08 brendan     Heya Amit
03:53 Amit        hi chris, mason, brendan
01:17 atz         not too bad... taxes done in 2 hours
23:17 pianohacker bye
23:15 pianohacker Had completely forgotten about smolder 'till I saw the newsfeed in #kohanews
23:14 gmcharlt    there's a cronjob - I'll check it tomorrow
23:12 pianohacker Do we have a smokebot running? There's a lot of empty reports on smolder
23:10 atz         true
23:09 gmcharlt    well, there's always the option of an extension - form to request one is much shorter :)
23:09 atz         meh.  incentive, inschmentive.
23:08 gmcharlt    atz: incentive to do them (slightly) earlier?
23:06 atz         bad timing between kohacon and taxes :(
22:00 pianohacker Bye, see you there
22:00 liz_nekls   2 more days to kohacon! Woo! Ttyl!
21:00 chris       liz_nekls: ahh in that case my git log --grep="2940" command won't help you :)
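
The git log --grep search chris mentions only works against a local clone, but for completeness it looks roughly like this (the bug number is the one from the conversation):

    # find commits whose message mentions bug 2940
    git log --grep="2940"
    # the same search across all branches, not just the checked-out one
    git log --all --grep="2940"
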
21:00 chris       yep, and now im awake again :) but this time voluntarily
20:40 liz_nekls   totally sucks, for all involved
20:40 liz_nekls   chris: I can totally relate to the 5am child wakies
20:39 liz_nekls   chris: wow, I totally missed your question. I use the gitweb, we are hosted. It's a long story.
17:35 pianohacker Good night
17:35 chris       ok it's all quiet now, im gonna try and get more sleep, cyas later
17:34 chris       http://koha.pastebin.com/m3255725a
17:33 chris       right added :)
17:29 chris       adding a function to change the prompt if you are in a dirty state
17:29 chris       http://gist.github.com/31631
17:28 chris       i do like this tho
17:28 chris       heh
17:27 pianohacker One wonders what that prompt would be like after editing four template files in our directory layout
17:26 chris       :)
17:25 atz         wow... that's overkill... but then, he's probably russian.
17:25 gmcharlt    some of that should be ported to gitweb
17:25 gmcharlt    chris++
17:24 chris       for a really over the top prompt :)
17:23 chris       http://volnitsky.com/project/git-prompt/
17:23 chris       heh
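
A rough sketch of the kind of dirty-state prompt being discussed above, assuming a bash shell (the function name and prompt layout here are made up, not the code from the pastebin or gist):

    # append the current git branch to the prompt, with a * when the working tree is dirty
    git_prompt() {
        local branch
        branch=$(git symbolic-ref HEAD 2>/dev/null) || return
        branch=${branch#refs/heads/}
        git diff --quiet 2>/dev/null || branch="${branch}*"
        printf ' (%s)' "$branch"
    }
    PS1='\u@\h:\w$(git_prompt)\$ '
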
17:23 atz         not sure i can do much for your kid from here, though  :)
17:23 atz         ?
17:22 chris       atz: you about?
17:18 chris       liz_nekls: do you have a git clone checked out locally, or do you just use gitweb for searching?
17:16 chris       thanks
17:15 pianohacker Oh, man. Sorry, hope he feels better soon
17:14 chris       but it means im awake at 5am having just got him back to sleep
17:14 chris       just a cold
17:13 pianohacker Yikes. How sick?
17:13 chris       sick_child_the_night_before_flying_to_texas--
16:08 liz_nekls   to avoid that
16:08 liz_nekls   thanks, I'll just hit the home page every time to start my searches
16:07 liz_nekls   yes, that must be it
16:07 liz_nekls   hm that could explain why it would work on some pages and not others
16:07 gmcharlt    and the search was relative to that
16:07 gmcharlt    it's possible that you may have been looking at an older commit
16:06 gmcharlt    one possibility - gitweb's searches are in the context of whatever branch you're viewing
16:04 liz_nekls   <commences muttering>
16:03 liz_nekls   ok, I'm crazy I guess
16:03 liz_nekls   and I just now did it, and it worked, from the home page. one second, maybe it's the location of the search box that is different
16:02 liz_nekls   I did the exact same thing and it didn't come up
16:01 gmcharlt    a commit search of '2940' on gitweb turned it up for me
16:00 liz_nekls   i had to look through the shortlog to find it
16:00 liz_nekls   but here it is: http://git.koha.org/cgi-bin/gitweb.cgi?p=Koha;a=commit;h=bf17eb3902b2d7091406c43cf10834e45935c6f7
16:00 liz_nekls   i did a search for 2940, and got no results
15:59 liz_nekls   ok, here's what I'm seeing (feel free to say "git: ur doing it wrong")
15:58 Hui_Nan_    !
15:58 Hui_Nan_    atz, thanks for help~
15:57 Hui_Nan_    gonna check that line tomorrow %-)
15:57 Hui_Nan_    ough, there are a lot of Auth.pm errors today :-(
15:56 Hui_Nan_    authentication, well
15:56 Hui_Nan_    I can't see what kind of error it is
15:55 Hui_Nan_    oops... that's something strange
15:55 Hui_Nan_     /home/www/koha/lib/C4/Auth.pm line 1423., referer: http://catalog.isact.ru/cgi-bin/koha/tools/tools-home.pl
15:55 Hui_Nan_    [Mon Apr 13 13:44:25 2009] [error] [client 192.168.4.56] [Mon Apr 13 13:44:25 2009] stage-marc-import.pl: superlibrarian at
15:48 Hui_Nan_    I'll wait for the answer from the zebra list and then maybe some code studying will help %-)
15:47 Hui_Nan_    well, Zebra says there are 11 records in the biblio base, all 11 `biblioitem`.`marcsql` are correct
15:46 atz         before writing to biblio.author
15:46 atz         Hui_Nan_: that suggests that import is failing at some point
15:43 Hui_Nan_    Where does Zebra find the data on records?
15:43 Hui_Nan_    well, anyway, I'm fighting with Zebra, not Koha
15:42 Hui_Nan_    each record
15:42 Hui_Nan_    each has rusmarc 700 and 701 fields
15:41 Hui_Nan_    Is it OK?
15:41 Hui_Nan_    I noticed that `biblio`.`author` is empty for all the imported records
15:40 Hui_Nan_    but, the same irrelevant records %-(
15:40 Hui_Nan_    the order of records in the results list has changed
15:38 atz         may want to use -x (for XML indexing too)
15:38 atz         misc/migration_tools/rebuild_zebra.pl -b -r  # the r clears index first
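
Putting atz's two suggestions together, a full from-scratch reindex would look something like this (run from the Koha source tree; whether -x is appropriate depends on how the instance was set up):

    perl misc/migration_tools/rebuild_zebra.pl -b -r -x -v
    # -b  rebuild the biblio index
    # -r  clear the existing index first
    # -x  export the records as MARCXML for indexing
    # -v  verbose output
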
15:34 Hui_Nan_    is it correct?
15:34 Hui_Nan_    ./rebuild_zebra.pl -b -w
15:34 Hui_Nan_    rebuilding does not help
15:33 Hui_Nan_    11 records for testing purposes
15:32 Hui_Nan_    XXX is not /Author.+/
15:32 Hui_Nan_    that's all
15:32 Hui_Nan_    [warn] Index 'XXX' not found in attset(s)
15:31 atz         otherwise, (if it isn't a huge dataset) try rebuilding the index from scratch and see if it still happens
15:30 atz         strange... check your zebra logs to see if indexing is failing on certain records
15:30 Hui_Nan_    ...an author by whose name I found this record
15:29 Hui_Nan_    that guy is certainly not mentioned anywhere in the record
15:28 Hui_Nan_    just looking at that irrelevant record in `biblioitems.marcxml`
15:25 liz_nekls   gmcharlt: thanks for checking.
15:21 Hui_Nan_    well, why could this happen? I import some records, rebuild zebra's index, and get a few irrelevant records mixed in with the relevant ones when searching?
15:19 Hui_Nan_    I'm asking in the hope that someone here has played with zebra::index %-)
15:19 Hui_Nan_    certainly it is
15:18 atz         perhaps this is a question for the zebra listserv
15:17 Hui_Nan_    I believe I have to specify utf-8 somewhere inside retrieval
15:17 Hui_Nan_    to access zebra::XXX element sets
15:16 Hui_Nan_    I added lines like "<retrieval syntax="xml" name="zebra::index" />"
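
The lines Hui_Nan_ is describing go in the <retrievalinfo> block of the relevant server section in koha-conf.xml; roughly like this fragment (only the added entry is shown next to one existing one, the rest of the block is omitted):

    <retrievalinfo>
      <!-- existing entries for normal record retrieval stay as they are -->
      <retrieval syntax="xml"/>
      <!-- added so the special zebra:: element sets (e.g. zebra::index) can be requested -->
      <retrieval syntax="xml" name="zebra::index"/>
    </retrievalinfo>
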
15:16 atz         interesting!
15:16 Hui_Nan_    I believe that the trouble is in koha-conf.xml
15:16 atz         that makes sense
15:16 Hui_Nan_    that's why I see cyrillic when I ask for record
15:15 Hui_Nan_    charset UTF-8 UTF-8 UTF-8 is in my ~/.yazclientrc
15:15 Hui_Nan_    certainly I do
15:15 atz         you just need to negotiate character set
15:15 Hui_Nan_    but if I ask for indexing data - then alas
15:15 atz         ah, ok
15:15 Hui_Nan_    and if I ask zebra for just a record data I get marcxml/unimarc with russian
15:14 atz         just in zebra?
15:14 Hui_Nan_    it is not a problem 'cause in OPAC I see correct biblio
15:13 Hui_Nan_    I have there Character Set 1 = 50-UTF-8
15:13 Hui_Nan_    yep exactly
15:13 atz         i think it is in the 100 field then.  paul_p could confirm
15:12 Hui_Nan_    I'm using unimarc (rusmarc)
15:10 atz         that would be leader/09="a" for marc21
15:10 atz         Hui_Nan_: make sure the internal MARC flag for encoding matches your expectation (probably utf8)
15:08 Hui_Nan_    records were imported, using tools
15:07 Hui_Nan_    I worry because a SimpleSearch through opac "au=<some cyrillic surname>" gives a couple of records where <some cyrillic surname> is not mentioned at all!
15:05 Hui_Nan_    I'm using yaz-client connected to zebrasrv via unix socket which Koha uses for biblios
15:04 Hui_Nan_    like <index name="Author" type="w" seq="15">@@@@@@@@</index>
15:04 Hui_Nan_    just "dogs" (@ signs) instead of letters
15:03 Hui_Nan_    I can't get utf-8 encoded text
15:03 Hui_Nan_    had anyone played with zebra::index element set in zebra?
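
For anyone following along, a yaz-client session against the biblio Zebra socket looks roughly like this (the socket path is illustrative; Hui_Nan_'s real path will differ):

    yaz-client unix:/var/run/koha/bibliosocket
    Z> charset UTF-8                 # negotiate UTF-8, as discussed above
    Z> base biblios
    Z> find @attr 1=1003 "surname"   # Bib-1 attribute 1=1003 is the author index
    Z> format xml
    Z> elements zebra::index         # ask for the special index element set
    Z> show 1
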
14:51 gmcharlt    liz_nekls: checked - didn't see any particular reason why gitweb should have acted up
14:51 paul_p      & very excited to come to TX! (i've been in OH, but Nelsonville is not a large city!)
14:50 gmcharlt    +1 to that
14:50 liz_nekls   ;)
14:49 liz_nekls   but a welcome change in state of affairs
14:49 liz_nekls   very
14:49 atz         odd
14:49 liz_nekls   (correct results)
14:49 liz_nekls   atz: hrmph, it works now, not sure what's different. I'm getting different results for things I searched for not 30 minutes ago
14:48 liz_nekls   meeting, etc
14:48 liz_nekls   there are a lot of people I'm looking forward to seeing
14:47 owen        I'm looking forward to it :)
14:46 liz_nekls   prepare to have your brain picked.
14:46 liz_nekls   owen: yessir, I will be there
14:45 owen        liz_nekls: are you coming to KohaCon?
14:30 atz         liz_nekls: not sure what you mean... got a link?
14:28 liz_nekls   err, can't "seem" to get results after dec 08
14:28 liz_nekls   (web interface)
14:28 liz_nekls   q: is there something amiss with git? I can't get search results after dec. 08
14:05 atz         nightly should do it
14:05 jwagner     Thanks, atz.  Can you clarify when that should be run?  If I'm reading the manual & syspref description correctly, if dontmerge is set to ON, things happen on the fly.  If it's set to OFF, this script should be cronned (nightly? how often?). Is that correct?
14:02 atz         slight imprecision in docs
14:02 atz         jwagner: misc/migration_tools/merge_authority.pl
13:57 jwagner     Question for folks on authorities -- the dontmerge authorities syspref references a merge_authorities.pl cron job.  However, I don't find that script anywhere on the server (either 3.0 or 3.0.1 versions).  Is the manual outdated for this, or are we missing a script?
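
When merges are deferred to the script rather than done on the fly, a nightly cron entry along these lines covers it (the schedule and install path are illustrative, and the script's own usage message should be checked for any options your version needs; as with any Koha command-line script, KOHA_CONF and PERL5LIB must be set in the cron environment):

    # run the deferred authority merges once a night at 02:05
    5 2 * * * perl /usr/share/koha/misc/migration_tools/merge_authority.pl
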
13:42 Hui_Nan_    answer's found!
13:42 Hui_Nan_    http://lists.indexdata.dk/pipermail/zebralist/2007-July/001694.html
13:28 Hui_Nan_        [25] Specified element set name not valid for specified database -- v2 addinfo 'zebra::index'
13:28 Hui_Nan_    however koha's biblios database says: Diagnostic message(s) from database:
13:26 Hui_Nan_    zebra admin's manual states there is a special element set 'zebra::index'
13:25 Hui_Nan_    does anybody know how to figure out the data stored in Zebra indexes for a record?
13:25 Hui_Nan_    hello
13:09 owen        :)
13:08 owen        When are you leaving?
13:08 owen        Thanks
13:08 owen        And to you too, paul_p
13:08 paul_p      (hello & happy easter)
13:07 paul_p      owen: it can be taken out of the framework.
13:06 owen        Or can it be taken out of the framework?
13:06 owen        If you're using item-level itemtypes, is there any reason at all to specify an itemtype at the biblio level?