Time  Nick          Message
00:59 BobB          @wunder Sydney, Australia
00:59 huginn        BobB: The current temperature in Sydney, New South Wales is 16.0°C (10:30 AM EST on June 09, 2014). Conditions: Scattered Clouds. Humidity: 59%. Dew Point: 8.0°C. Pressure: 30.36 in 1028 hPa (Steady).
01:00 mtompset      Greetings, #koha.
01:01 mtompset      @seen dcook
01:01 huginn        mtompset: dcook was last seen in #koha 2 days, 18 hours, 8 minutes, and 3 seconds ago: <dcook> yo reiveune
01:03 eythian       mtompset: Class::ISA doesn't seem to be a dependency.
01:04 mtompset      It was on an install attempt for me under Ubuntu 14.04
01:04 mtompset      (if I recall correctly)
01:04 eythian       robin@zarathud:~/catalyst/koha$ grep -r Class::ISA * | wc -l
01:04 eythian       0
01:05 mtompset      Just a second... I'll attempt something.
01:06 * mtompset    grumbles about 2 minute boot time, because of wanting to be flexible.
01:08 mtompset      Oh... it probably isn't a koha dependency. It's probably an issue with packaging up a perl library.
01:58 pastebot      "mtompset" at 127.0.0.1 pasted "What does the suggested command mean?" (6 lines) at http://paste.koha-community.org/34
01:58 mtompset      eythian: Any ideas?
01:58 wahanui       Any ideas are welcome :)
01:59 eythian       yeah, it's a branch that became a directory with the same name
01:59 eythian       try running git remote prune origin, though I don't know what it actually does.
02:16 mtompset      Well, it worked, but I got a whole bunch of things "pruned".
02:17 eythian       It's probably for the best...
02:24 rangi         if you havent run that in a long time, or git gc, you will have had all the old branches that were archived
02:25 rangi         http://git.koha-community.org/gitweb/?p=koha-archive.git;a=heads all of those
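(A minimal sketch of the prune workflow described above; --dry-run is a standard git flag that previews what would be removed:)
    git remote prune origin --dry-run   # list stale remote-tracking branches without deleting
    git remote prune origin             # actually remove them
    git gc                              # optionally collect unreachable objects afterwards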
02:51 jcamins       wizzyrea: when you're the organizer and had months to plan, you don't get a bye that easily.
03:04 mtompset      Yes, that looks like it, rangi.
03:12 wizzyrea      jcamins: actually, if it's what I'm thinking of, and it was a live demo, I've actually done that. But I didn't have months to plan.
03:12 jcamins       wizzyrea: oh, I missed some important context.
03:12 wizzyrea      more like "oh by the way we want you to show this giant room the mobile app"
03:13 wizzyrea      yeah context would be helpful ^.^
03:13 jcamins       It was a hackathon. The entire point was that a bunch of people would be presenting interesting hacks they'd done.
03:14 jcamins       The subject was a website.
03:14 jcamins       They announced after everyone was there that they were using a document camera.
03:15 jcamins       And it wasn't a cost thing... I think they spent over $100k.
03:15 wizzyrea      we are talking about a thing that is kind of like an overhead projector, right?
03:15 jcamins       Yes.
03:15 wizzyrea      but digital
03:15 jcamins       Which they brought in especially to project pictures of computer screens.
03:16 jcamins       Exactly.
03:16 jcamins       Ever pointed a digital video camera at a computer screen?
03:16 wizzyrea      ok, then what is the proper way to put a phone interface on a projector?
03:16 eythian       in bad cases, you'd get flicker
03:16 wizzyrea      yes, actually
03:16 jcamins       wizzyrea: that's fine if you're presenting on a phone.
03:16 wizzyrea      the time I did this, there was no flicker, which may be why I'm a bit sympathetic to this
03:17 jcamins       But there was no indication up-front that this was a mobile hackathon.
03:17 eythian       it's ideal for a phone, because you can show ui. It seems dumb for a computer though.
03:17 jcamins       I wouldn't have gone had I known.
03:17 jcamins       eythian: exactly.
03:18 jcamins       Fortunately when a sizable portion of the participants said "oh well, I guess we won't present... when's dinner?" they decided to set up the projector hookup.
03:18 wizzyrea      wait they were pointing a document camera at a computer screen?
03:18 jcamins       wizzyrea: yes!
03:18 wizzyrea      oh I misunderstood I thought you were upset that they were pointing it at a phone.
03:19 wizzyrea      :)
03:19 jcamins       No, that's a good idea.
03:19 wizzyrea      I was like, well that seems sensible.
03:19 jcamins       What's dumb is pointing it at a computer.
03:19 jcamins       They wanted two people... one to hold the laptop upside down, the other to type backwards.
03:19 wizzyrea      uhhhh
03:19 wizzyrea      ok yeah, that's not sensible.
03:20 mtj           my brain hurts just thinking about that :0)
03:21 jcamins       :)
03:21 jcamins       Good night!
03:21 wahanui       If you feel like someone is looking through your window, it's OK, it's just me.
03:21 mtj           cya jcamins
03:36 mtompset      wizzyrea: That's another way of thinking about bug 6874. :)
03:36 huginn        Bug http://bugs.koha-community.org/bugzilla3/show_bug.cgi?id=6874 enhancement, P3, ---, julian.maurice, Needs Signoff, Attach a file to a MARC record (Was: File upload in MARC)
04:33 wizzyrea      nobody could find the bug with the old title.
04:38 eythian       really it should be a biblio record I suppose
05:04 eythian       @later tell tcohen how do you mean?
05:04 huginn        eythian: The operation succeeded.
05:46 paxed         *grmbl* fun, just noticed Yet Another Conversion Problem. the due dates were set with time 00:00:00, when they should've been 23:59:00
06:01 mtompset      And with that horrible grammar moment, I think it is best to head to sleep.
06:01 mtompset      dcook: 11592 is ready for testing. :)
06:02 mtompset      @later tell dcook bug 11592 is ready for testing now.
06:02 huginn        mtompset: The operation succeeded.
06:02 mtompset      Have a great day, #koha eythian wizzyrea mtj
06:15 cait          @wunder Konstanz
06:15 huginn        cait: The current temperature in Taegerwilen, Taegerwilen, Germany is 20.0°C (8:15 AM CEST on June 09, 2014). Conditions: Clear. Humidity: 83%. Dew Point: 17.0°C. Pressure: 30.12 in 1020 hPa (Rising).
06:36 reiveune      hello
06:37 cait          hi reiveune
06:37 cait          holiday here today :)
06:37 reiveune      here too, hi cait
06:37 cait          oh nice :)
06:54 Joubu         Hi #koha
06:54 ashimema      Morning Joubu
07:00 yohann        salut
07:01 cait          hi yohann and ashimema
07:01 cait          oh and hi Joubu
07:01 cait          *reads back*
07:01 ashimema      moring cait, morning yohann
07:02 * ashimema    wishes google chrome would 'just work' the way it used to on this Ubuntu box.
07:05 * cait        offers cookies
07:12 ashimema      brb..
07:20 gaetan_B      hello
07:21 cait          hi gaetan_B
07:21 cait          everyone around on a holiday?
07:22 gaetan_B      gaetan_B: hmmm, it's not really a holiday in France anymore but it used to be so a lot of people take a day off i guess
07:22 cait          ah
07:22 gaetan_B      not so many here at biblibre though
07:26 fridolin      hie all
07:27 cait          hi fridolin
07:53 Tony          Greetings. I'm getting errors on staging marc for import: stage-marc-import.pl: Filehandle STDOUT reopened as FH only for input at /usr/lib/perl5/Template/Provider.pm
07:53 Tony          This is a clean install and we are trying to populate the db for the first time
07:54 Tony          On Manage staged MARC records, the citations are null. Any thoughts?
08:10 tony123       Greetings. I'm trying to do a bulk import (file generated from MarcEdit) and on Stage MARC for import, I'm getting this error: stage-marc-import.pl: Filehandle STDOUT reopened as FH only for input at /usr/lib/perl5/Template/Provider.pm
08:11 tony123       In Manage staged MARC records, the Citation field is null for each record. The staging results show items as 0
08:11 tony123       Any thoughts? This is a clean install and we're trying to import the data for the first time
08:28 paxed         does anyone else find advance_notices.pl formats the date fields in the wrong format?
08:30 cait          tony123: it's hard to tell
08:30 cait          are you importing a valid marc file?
08:30 cait          in iso format?
08:30 cait          paxed: there is a bug for that - the problem was that the proposed solution slowed down the notice generation quite a bit, so it got stuck I think
08:31 paxed         cait: ugh. and one of the most visible notices for patrons, at that. sucks.
08:31 cait          i can find the bug for you
08:32 tony123       I assume so... I used MarcEdit to build the file. I used the Z39.50 client to retrieve the ISBN data. It built the file and decompiled the file (back to the mrk file)
08:32 cait          bug 11244
08:32 huginn        Bug http://bugs.koha-community.org/bugzilla3/show_bug.cgi?id=11244 normal, P5 - low, ---, kyle, In Discussion, notices ignoring the dateformat preference
08:32 cait          tony123: did you convert it to an mrc file after using marcedit?
08:32 cait          or with marcedit?
08:33 tony123       I used marcedit to convert it to the mrc file
08:33 cait          tony123: sorry to ask so many questions, but it's hard to tell - the message from the logs is probably just a warning and not a problem
08:33 cait          ok
08:33 tony123       no worries :)
08:33 cait          your system is set up to use marc21? :)
08:33 tony123       Here is the results log:
08:33 tony123       MARC staging results: 2 records in file; 0 records not staged because of MARC error; 2 records staged; Did not check for matches with existing records in catalog; 0 item records found and staged
08:34 cait          that looks not bad
08:34 cait          but you say the title does not show up in the table below?
08:34 cait          on manage staged?
08:34 tony123       the values in Staged MARC record management are all null
08:34 cait          hm
08:34 cait          what do you mean by values?
08:34 cait          does the batch not show up at all?
08:35 tony123       the values in the Citation field are all null. Status: Staged. Match details and Record are blank
08:35 tony123       the text is "null" with a hyperlink
08:36 tony123       When clicked, the javascript error is: TypeError: $(...).html(...).dialog is not a function 	  title: _("MARC Preview")
08:37 BobB          tony123, you could go back to MarcEdit and run the marc validator - that will show if your records are valid;
08:37 tony123       sure, thanks.
08:38 cait          packages?
08:38 wahanui       somebody said packages was at http://wiki.koha-community.org/wiki/Debian
08:38 BobB          do that on the .mrk file, then recompile it to .mrc
08:38 cait          tony123: just for a test - can you catalog a record manually?
08:38 BobB          are you sure you did that before?
08:38 cait          tony123: i had this happen when someone deleted the default framework once
08:38 BobB          Anyway, if it compiles ok, then it should mean your marc file is ok and your problem is a Koha one
08:38 cait          so there were no definitions for the fields and subfields in Koha - it's in the mandatory part of the web installer, but it's possible to uncheck it
08:39 cait          and also what BobB says :)
08:39 BobB          :)
08:39 tony123       MarcEdit's MARCValidator results: No errors were reported using the specified rules file.
08:40 tony123       Koha is installed on Ubuntu 12.04
08:42 BobB          by what method?  how did you install Koha?
08:42 tony123       apt-get via the instructions on the website
08:42 tony123       If you think using a different distro will fix it, or reinstalling, I'm happy to do that
08:42 BobB          hmm ...
08:43 cait          tony123: when you go into cataloguing
08:43 cait          can you add a record there?
08:43 tony123       hang on, one sec. Let me check
08:43 cait          and when you open your file in an editor... it should be basically unreadable, then it's right :)
08:52 BobB          dinner time, I'm off
08:56 tony123       Thanks BobB for your help! :)
08:57 tony123       I did create a record. I realized that I didn't create an Item type... I'll retry the import now.
08:58 tony123       cait: yes, the mrc file is largely unreadable.
09:00 tony123       Same result: Citation field reads "null", Status reads "Staged". Match details and Records are both blank.
09:00 tony123       Do you recommend doing a clean install?
09:00 tony123       If you recommend a different distro, I can do that too
09:05 cait          did you try cataloguing manually?
09:05 cait          if you haven't made any configuration yet
09:06 tony123       I catalogued a record manually. It went in fine.
09:06 cait          i am not sure reinstalling would make a difference
09:06 cait          what's your marcflavour system preference set to?
09:08 tony123       Good question... where do I find it? I think I just selected the default flavour
09:08 cait          administration > system preferences
09:08 cait          and search for marcfl
09:11 tony123       thanks. UNIMARC
09:14 cait          oh
09:14 cait          where are you located?
09:15 tony123       Central Asia
09:15 cait          hm
09:15 cait          my guess is that your database is unimarc but your data is marc21
09:15 tony123       ahhh
09:15 cait          for example, the title in unimarc is 200 and the title in marc21 is in 245
09:15 cait          so there are quite some important differences
09:16 cait          i am not sure which is the most common format in central asia
09:16 tony123       We have Cyrillic titles, so we need to run unicode
09:16 tony123       Ok, this gives me something to go on
09:16 cait          it's a misunderstanding that happens often
09:16 cait          unimarc has nothing to do with unicode
09:16 tony123       oh, ok. wrong assumption. :)
09:16 cait          marc21 can be unicode too :) we are using it with hebrew for example
09:17 cait          ok, now we got the problem
09:17 cait          just switching the pref won't fix it all
09:17 cait          it might be best to redo the installation - did you install from packages?
09:17 tony123       I did
09:18 cait          that's cool then
09:18 tony123       No problem. At least I have something to go on now :D
09:18 cait          in the conf file for creating the instance
09:18 cait          check what marcflavour is given there
09:18 cait          it should be marc21
09:18 tony123       ok
09:18 cait          then drop your instance or create a new one, make sure in the web installer, that you select marc21 as well
09:18 cait          and then try reimporting
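(A minimal sketch of those package-based steps; ZEBRA_MARC_FORMAT and the paths are from a stock Debian-package install, and the instance name is hypothetical, so verify against your own /etc/koha/koha-sites.conf:)
    # make new instances index as MARC21 rather than UNIMARC
    sudo sed -i 's/^ZEBRA_MARC_FORMAT=.*/ZEBRA_MARC_FORMAT="marc21"/' /etc/koha/koha-sites.conf
    # create a fresh instance, then select MARC21 again in the web installer
    sudo koha-create --create-db mylibrary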
09:19 tony123       Great, thanks! I'll make the changes and reinstall.
09:20 cait          hope it all works out on second try :)
09:20 cait          making sure the instance is marc21 is important so your data can be searched correctly (indexing configuration)
09:21 cait          it's a lot easier that way than trying to fix everything manually
09:21 cait          hi Viktor :)
09:21 Viktor        Hi cait :)
10:06 * paxed       boggles at advance_notices.pl
10:06 paxed         $titles .= join("\t",@item_info) . "\n";
10:06 paxed         'items.content' => $titles,
10:07 paxed         so, <<items.content>> in the notice will contain tab-separated lines of crap that's mostly useless to the patron?
10:08 cait          it contains the fields you define
10:08 cait          to generate an item list
10:08 cait          but yeah, it's tab separated, only overdues can currently do the nicer <item></item> formatting i think (haven't tested lately)
10:09 cait          there is a command line option for the fields it will output for items.content
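(For illustration, roughly how that option is passed when the cron script runs; the field list is only an example and the exact flags should be checked against your version's --help:)
    perl misc/cronjobs/advance_notices.pl -c --itemscontent=date_due,title,author,barcode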
10:10 paxed         also, not very HTML-friendly, that.
10:11 cait          paxed: i might remember incorrectly, but i thought there was some code to build a table from it
10:11 cait          for the html, but that might be another notice... hm
10:16 sophie_m      paxed: cait: it used to be formatted as an array, I don't know when that disappeared
10:17 paxed         *grmbl*
10:41 paxed         bug 11607
10:41 huginn        Bug http://bugs.koha-community.org/bugzilla3/show_bug.cgi?id=11607 enhancement, P5 - low, ---, koha-bugs, NEW, items.content does not contain any formatting when HTML message is selected.
13:00 druthb        o/
13:05 oleonard      Hi druthb, and a belated happy birthday to you
13:05 oleonard      I hope it was a nice one
13:08 druthb        It was lovely!  Had a great weekend—busy busy, but good.
13:48 ashimema      hmmm. no cait today?
13:54 oleonard      So the deletedborrowers table doesn't show a timestamp for when they were deleted, or am I missing it?
14:16 ashimema      There was a bit of a debate around that oleonard..
14:16 ashimema      I'll dig out the bug number from when it was all worked out.
14:16 ashimema      the data is there.. somewhere.
14:20 oleonard      Oh, in action_logs it seems ashimema?
14:20 ashimema      There's one.. http://bugs.koha-community.org/bugzilla3/show_bug.cgi?id=8926
14:20 huginn        Bug 8926: enhancement, P5 - low, ---, gmcharlt, NEW, deletedborrowers should have a timestamp
14:20 ashimema      but thats not the one I was thinking of..
14:20 ashimema      maybe it's deleted bibs I was thinking of..
14:20 ashimema      there was a bug about updating the timestamp in a deleted table.. which I QA'd
14:21 ashimema      must be getting my wires crossed
14:23 ashimema      I can't find the bug I was thinking of.. so must have wires crossed somewhere.. sorry.
14:28 oleonard      Oh nice. You can't use a saved SQL report to query action_logs for 'DELETE' transactions because the query has the word DELETE in it :P
14:29 paxed         use concat?
14:29 paxed         or some other way to split the searchable text
14:38 ashimema      oh dear.. how silly
15:31 * cait        waves
15:56 jcamins       @later tell gmcharlt It occurs to me that it might be a good idea to have something in the code of conduct about cultural sensitivity, given the different traditions and cultures of members of the community.
15:56 huginn        jcamins: The operation succeeded.
15:56 gmcharlt      jcamins: indeed
15:57 jcamins       I guess it's later than I realized.
15:57 barton        oleonard: are you looking to run the report as a cron job or from the reports interface?
15:59 barton        (... reports interface => koha's reports page)
15:59 oleonard      From the reports interface. I ended up getting good results by using SUBSTRING
15:59 barton        hmm. tricky.
16:01 barton        Do you mind posting that? I'd like to see what you did.
16:04 pastebot      "oleonard" at 127.0.0.1 pasted "Querying deletedborrowers based on DELETE action in action_logs" (1 line) at http://paste.koha-community.org/36
16:04 oleonard      ...of course one would want to limit that query in other ways on a production system
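(A minimal sketch of the trick being discussed, splitting the keyword so the saved-report filter never sees the literal word; this uses paxed's CONCAT idea and assumes the stock action_logs columns:)
    SELECT timestamp, object, info
    FROM   action_logs
    WHERE  module = 'MEMBERS'
      AND  action = CONCAT('DEL','ETE');  -- evaluates to 'DELETE' only at run time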
16:25 mtompset      Greetings, #koha.
16:55 gaetan_B      bye !
17:04 reiveune      bye
18:09 jcamins       nengard: are you still on the constant lookout for video suggestions? You may already have this one, but if not a video showing how to mark items lost on recent versions of Koha might be an idea.
18:58 cait          gmcharlt++
19:23 cait          @later tell tcohen ping me when you got some time to talk?
19:23 huginn        cait: The operation succeeded.
20:10 nengard       jcamins - haven't had time to do videos in forever
20:10 nengard       what what has changed about marking items lost?
20:10 jcamins       nengard: you used to do it in the additem screen but now you do it on the moredetails screen.
20:11 nengard       that happened a LONG time ago so I think I have that covered - not in a video but in a tutorial i wrote
20:14 rangi         morning
20:14 jcamins       nengard: yeah, I only thought of it because someone who learned from 3.2 tutorial videos and didn't need to mark anything lost since then (!) mentioned it.
20:14 nengard       hehe
20:14 nengard       got it
20:51 jce           I'm working on a Koha 3.08 server that's having problems with OPAC.  When I do a search under "Library Catalog" I can get plausible results.  But if I select 'Author', 'Title', 'Subject', 'ISBN' or any of the other options from the pull-down list, I get "No results found!"  I've tried rebuilding Zebra many times.  Are the "Library Catalog" results not related to zebra?
20:53 cait          jce: hmmm are you using opacsuppression?
20:53 cait          are you using searchmylibraryfirst or a similar setting?
20:54 jce           cait:  I'm not aware of having any of those options set.  Where are they found?
20:55 jce           I have the same problem on the original server that I'm trying to fix, and on a server running in a VM that I loaded an SQL backup into.
20:55 cait          yeah, looks like something in your configuration
20:55 cait          check administration > system preferences > opac suppression first
20:55 cait          well opacsuppresson
20:55 cait          opacsuppression
20:55 wahanui       well, opacsuppression is different.
20:57 jce           I know there are some bad biblio records that I need to fix, but I don't think this problem is related.
20:57 cait          did you check the setting?
20:58 cait          turn the feature off ( don't hide records) if it's on
20:58 jce           Just getting the VM fired up again.
21:09 jce           OpacSuppression is set to "Don't hide".  It also says: "Note that you must have the Suppress index set up in Zebra and at least one suppressed item, or your searches will be broken."  Does that pertain only if you have it set to 'hide'?
21:13 jce           SearchMyLibraryFirst is set to "Don't limit".
21:15 jce           Could it be a problem of items (holdings?) vs. biblio records?  Is that the difference between the "Library catalog" search and the others?
21:18 cait          hm
21:18 cait          what does your search url for a non keyword search look like?
21:19 jce           You mean a search under "Library catalog" in the pull-down?
21:20 jce           localhost/cgi-bin/koha/opac-search.pl?q=searchterm
21:21 cait          so that works
21:21 cait          how does the other url look like? for a title search or similar?
21:23 jce           Hmm.  It does look different:  bcmckoha/cgi-bin/koha/opac-search.pl?idx=ti&q=searchterm  [bcmckoha would probably be an instance name, and may not be in /etc/hosts]
21:27 cait          hm that doesn't look quite right
21:27 cait          really one is localhost and the other bcmckoha?
21:28 jce           Actually, as I look at it again, I don't think that's the problem.  On the 'real' server, the URL reads:  localhost/cgi-bin/koha/opac-search.pl?idx=au&q=searchterm
21:30 jce           Sorry, it reads differently between the 'real' server and the VM, but is consistent on each.
21:30 mtj           morning all...
21:31 mtj           jce: if i were you, i would load your 3.0.8 db into a new 3.14.x koha
21:32 mtj           ...and see if your search problems are fixed
21:33 cait          jce: running out of ideas sorry
21:33 jce           Well, I've tried that and not had good results.  I'm trying to fix 3.08 first, before upgrading.
21:33 cait          it's certainly weird...
21:33 cait          you didn't change any settings? when did it stop working?
21:33 jce           cait:  I
21:33 cait          you could check the action_logs table for configuration changes done in that time span
21:33 mtj           yeah, it is a bit weird :/
21:34 mtj           jce, perhaps your mappings or frameworks are a bit wonky?
21:34 jce           'm not exactly sure.  I'm an outside consultant who comes in to maintain a small church library's Koha server when problems arise.
21:34 jce           mtj:  that's entirely possible.
21:35 cait          jce: action_logs should have logged changes to system preferences
21:35 cait          so if the problem is in the configuration... might be a chance to find it there
21:35 cait          if it happened out of nowhere without updating
21:36 jce           cait:  I doubt there was a change in the config, but I guess it could be.
21:36 mtj           Tools -> Koha to MARC mapping
21:37 mtj           jce: if you really get stuck - you usually need to set up a fresh/clean koha, and start comparing settings
21:38 jce           The system was originally set up in about 2007 or 2008 and has had several OS and Koha version upgrades.  I think that one problem came in when updating to 3.08.  Some of the MARC records have the infamous frey50 infection.  But I don't think that pertains to this problem.
21:39 cait          it's like i have seen something like this somewhere, but can't figure out where and what it was :(
21:39 mtompset      Is it 3.0.8 or 3.8?
21:39 jce           The VM is a fresh install that has the most recent SQL backup loaded.
21:39 mtompset      There is that extra upgrade step if it is prior to 3.4 :)
21:41 mtj           jce, if its a really old koha, the zebra config files might be way out of date too
21:41 jce           It's 3.8.  The package download file is 'koha-common_3.08.16.1-1_all.deb'.
21:42 mtj           ah ok... not too many zebra changes since 3.8 - thats good
21:43 jce           mtj:  I did a "Complete Removal" of Koha in Synaptic, which I understand removes config files as well.  Is it possible that old zebra config could have survived?
21:43 jce           mtj:  but it did start out life long before 3.8.
21:43 mtj           hmm, yeah, its possible
21:44 mtj           pro-tip: i use git to track /etc, before i do any upgrade
21:44 mtj           ...then you can always do a nice before/after diff, to see what really changed
21:45 jce           Part of this long, sad saga is that I did upgrade from 3.8 to 3.14, but it didn't go well, so I put it back to 3.8 to fix it.  I'm now going to upgrade a VM first, then migrate that data to the production server.
21:46 jce           Guess that's another reason to learn how to use git.  :)
21:46 mtj           yeah, i use git for general sysadmin tasks too
21:47 jce           On the other hand, stale config files aren't likely to be plaguing the VM.
21:48 mtj           yeah, correct
21:48 jce           That's a fresh install of 3.8.
21:48 jce           So it must be something in the SQL.
21:49 cait          jce: it would be my guess too... but really hard to tell :(
21:49 cait          you could check the error logs
21:49 cait          or run zebrasrv in the foreground (can you say that in english?) and look at the queries
21:49 cait          see if that gives you some kind of hint
21:50 mtj           you really need to start testing from scratch -  against a clean koha, with some clean test records
21:51 mtj           prove that an ISBN search works on your clean koha, and work backwards
21:52 mtj           here is a dir of handy test records from LOC -> http://www.loc.gov/catdir/cpso/RDAtest/
21:53 jce           mtj:  by clean koha, you mean a fresh instance?
21:54 cait          a fresh instance... and some records, see if that can be searched
21:54 cait          both instances use the same koha code.. might give a clue?
21:54 cait          but probably I'd take a look at the logs first
21:55 mtj           jce:  yeah, a fresh instance
21:56 mtj           load these bibs into your fresh instance -> www.loc.gov/catdir/cpso/RDAtest/extra_bib_marc.zip
21:57 jce           Ok, on the VM, the last error in /var/log/koha/BCMCKoha/opac-error.log is:  opac-search.pl:  Use of uninitialized value $sort_by[0] in join or string at /usr/share/koha/opac/cgi-bin/opac/opac-search.pl line 698.
21:57 mtj           ...then index, and confirm that the seaching behaves 'sanely'
21:58 mtj           ...then load those bibs into your problem koha, and confirm that you get different search results
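(A minimal sketch of that from-scratch test on a package install; the instance name is hypothetical, koha-shell only exists on newer package versions, and paths/flags should be checked against your Koha:)
    sudo koha-create --create-db cleantest
    sudo koha-shell cleantest            # drop into the instance's environment
    perl /usr/share/koha/bin/migration_tools/bulkmarcimport.pl -b -file extra_bib_marc.mrc
    exit
    sudo koha-rebuild-zebra -f -v cleantest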
21:58 cait          ah
21:58 jce           mtj:  I'll have to add items against these bibs, correct?
21:58 cait          hm what is the default sort option set to in this instance? there are prefs for that
21:58 cait          look for search
21:59 mtj           jce:  no need for items
21:59 pianohacker   hallo
22:02 pianohacker   http://devopsreactions.tumblr.com/post/88260308392/testing-my-own-code
22:03 cait          aaw
22:03 cait          that kid is cool
22:03 rangi         http://kohadevreactions.tumblr.com/post/85406182518/coming-back-to-a-feature-you-wrote-months-ago
22:04 pianohacker   rangi: I was looking at the last attempt I did at ajaxcirc back in 2009 the other day like that...
22:04 rangi         :)
22:05 pianohacker   "What a dumbass idea. Who thought that was reasonable? ... Oh..."
22:05 cait          it shows your development :)
22:07 jce           Ok, I searched 'search' in preferences and found that NoZebra was set to "Don't use".  Hmm.  I set it to 'Use' and saved Searching preferences.  Didn't seem to help immediately.  Do I need to restart apache to make Koha see that change?
22:08 pianohacker   jce: which version of koha are you using?
22:09 cait          jce: undo that
22:09 jce           3.8
22:09 mtj           jce:  that is very very suspicious
22:09 cait          jce: Don't use NoZebra is correct
22:09 cait          it's double-negated... it means use zebra then
22:09 cait          in other words: you don't want to use NoZebra :)
22:09 pianohacker   jce: NoZebra should be set to don't use, it's old code that was removed not long after the version you were using
22:10 mtj           jce:  your search should not work at all, without zebra
22:11 jce           It's worded very poorly.  The pull-down says [Use|Don't use] the Zebra search engine.  But I'll set it back the way I found it.  It didn't appear to make any difference.
22:11 mtj           is your koha config pointing to another db?
22:12 cait          jce: hm the wording is irritating me too indeed
22:12 pianohacker   jce: yeah, sorry about that. That and several other things were why it's since been eviscerated
22:12 jce           mtj:  No.  And again, I do get results when I don't specify Title, Author, etc.
22:13 pianohacker   jce: Do you have zebra set up for unimarc and are using MARC21, or vice versa?
22:14 jce           pianohacker:  No, it's MARC21 all the way.
22:14 pianohacker   hmm, okay. what flags are you passing to rebuild_zebra.pl?
22:16 jce           cait:  I see defaultSortField and defaultSortOrder and OPAC versions thereof.  Both are set to relevance and ascending.
22:16 pianohacker   and wait a second. From what I'm reading in the preferences file, shouldn't NoZebra be set to "use"? The .pref file accounts for the negation
22:16 pianohacker   at least as of 8626a5bebb3f900d852a3d98c5f45d95cea5272d
22:17 cait          pianohacker: i was wondering that
22:17 jce           pianohacker:  -f -v {instance}
22:17 cait          maybe you are right
22:17 cait          i am glad we finally removed that
22:17 cait          end of confusion
22:17 pianohacker   amen!
22:18 pianohacker   there are several .pref descriptions that I wrote that only partially removed confusion... things like gist, for instance
22:18 jce           So to clarify, the NoZebra preference is already do-nothing cruft by 3.8, or do I need to make sure it is set right?
22:19 pianohacker   that's a good question.
22:19 jce           It didn't seem to have an effect.
22:19 mtj           looks like its still live on 3.8.16
22:20 mtj           the search code in opac-search.pl refers to it
22:20 pianohacker   yup
22:21 pianohacker   ace
22:21 mtj           jebus
22:21 pianohacker   jce: the only thing I can think of is to also pass -x to rebuild-zebra
22:21 mtj           according to searching.pref, nozebra  = yes *enables* zebra?!
22:22 jce           Would I need to rebuild zebra's index to get the NoZebra preference to take effect?
22:22 mtj           oops, i take that back ^^ :)
22:22 pianohacker   the wording is confusing, I hated describing the No* style prefs...
22:22 jce           mtj:  But in the web interface, the wording is "Use" or "Don't use" zebra search engine.
22:23 mtj           yes = 'dont use'
22:23 pianohacker   jce: It should be set to "Use", from what we can tell
22:23 jce           Mine was set to "Don't use."  But setting it to "Use" didn't immediately change anything.
22:23 jce           Ok, I'll set it back, restart apache, reindex (with -x) and see what that does.
22:24 pianohacker   mtj: since "Use" should set NoZebra to no or 0
22:24 pianohacker   jce: also restart the zebra server, just to be safe
22:29 mtj           yeah, nozebra needs to be set to 'use' - thanks pianohacker, cait :)
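(If the pref wording keeps causing confusion, the raw value is unambiguous; a quick sketch against the stock systempreferences table, where a YesNo pref like NoZebra is stored as 1 or 0:)
    SELECT variable, value FROM systempreferences
    WHERE variable IN ('NoZebra', 'marcflavour');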
22:29 jce           Ok, after I set it to "Use" I got no results in "Library catalog".  When I put it back to "Don't use" "Library catalog" searches work.
22:30 cait          hmmmm - but only library catalog searches, not the other options
22:30 jce           This suggests to me that my zebra is broken and "Library catalog" searches aren't using zebra.  Is this possible?
22:30 jce           cait:  right.
22:30 cait          they normally use zebra
22:30 cait          but that is really what the pref does
22:31 cait          it might be zebra is generally broken for some reason... and that only nozebra partially works
22:31 mtj           your koha has not been using zebra, perhap jce?
22:31 eythian       hi
22:31 jce           mtj:  It was at one time.  It may be broken now.
22:31 cait          does the rebuild give positive feedback? you could try adding the -x and a second -v
22:33 mtj           jce: if you stop zebra, and get search results... you have some problem :)
22:34 pianohacker   yeah, that's a good idea. Stop zebrasrv/koha-zebra if it's running, then try again
22:34 mtj           nozebra = 'dont use'  should *not* give you any results, on 3.8.16
22:34 jce           cait:  The syntax on koha-rebuild-zebra:  -x -x -v -v, or -xx -vv?
22:34 cait          one x and two v, i think
22:34 cait          -x -v -v for more output
22:35 cait          also the -f i think
22:37 jce           It exports 3295 biblios, which is plausible.  It gives a warning:  Record didn't contain match fields in (bib1,Local-number)
22:37 cait          ah
22:37 cait          that's not good
22:37 jce           Is that a problem with mapping or frameworks?
22:39 mtj           hmm, mapping (i think?)
22:40 jce           mtj:  pianohacker:  I did 'koha-stop-zebra' and I can still search by "Library catalog" and get results.  Keyword searches still don't work.  Clearly Zebra is not working.
22:40 cait          can you still search when you flip the nozebra?
22:40 mtj           jce:  double check
22:40 mtj           $ ps -ef | grep zebra
22:42 mtj           jce: my hunch is you are not actually using zebra
22:43 jce           mtj:  assuming koha-stop-zebra actually shut Zebra down, that's what's happening.  So how do I get Zebra running again?
22:44 cait          i think it's a data problem
22:44 cait          something about the biblionumber?
22:45 mtj           yeah, a 999/mapping problem
22:45 jce           cait:  It must be a data problem that has moved from the original server to the VM in the SQL.  A mapping problem would make sense.
22:46 tcohen        #koha: i'm on a small trip to Ecuador, i'll try to push stuff to master soon, but not today
22:47 eythian       jce: koha-start-zebra
22:47 eythian       jce: though, I recommend 'sudo service koha-common start/stop/restart' instead
22:48 jce           I ran a bibliographic framework test a couple days ago and it said I had tabs 6 and 3 in use in tag 696, but when I went to try and fix it, I couldn't figure out how to get rid of one.
22:49 jce           eythian:  the real problem is not how to start the server, but how to get it working correctly.
22:49 cait          tcohen: have fun :)
22:49 cait          jce: the problem might be that your records are missing the 999 tag
22:49 cait          with the biblionumber
22:50 tcohen        eythian: don't know what that was about
22:50 mtj           tcohen:  Ecuador sounds nice :0)
22:50 eythian       tcohen: ES API
22:50 tcohen        it is :-D
22:50 tcohen        eythian: oh
22:50 eythian       I didn't understand your question
22:50 tcohen        we all know Zebra's code needs to be refactored
22:51 eythian       s/refactored/taken out and shot/ but yes.
22:51 tcohen        and also, that in the mid term, Zebra might even disappear
22:51 tcohen        I was thinking, that if there was an API for implementing the ES code
22:51 tcohen        (like there was for SolR)
22:51 tcohen        some people might like to work on making Zebra code a bit better
22:52 tcohen        following that hypothetical API
22:52 eythian       tcohen: there is, sorta.
22:52 pianohacker   It's going through K::SearchEngine, no?
22:52 tcohen        pianohacker: that's SolR
22:52 eythian       I'm writing a compatibility layer so it can get requests from the existing stuff
22:52 eythian       it is also mostly using K::SE
22:52 wahanui       okay, eythian.
22:53 tcohen        ok, so sticking to K::SE is a good way to do it
22:53 tcohen        thanks eythian
22:53 eythian       It also has its own API, though it's a bit under-defined at the moment as I'm sorting out use cases.
22:53 eythian       Basically you feed it a lucene search string.
22:53 tcohen        instead of PQF
22:54 eythian       Yeah.
22:54 tcohen        so the search is not built by QueryParser
22:54 eythian       It is
22:54 cait          oh
22:54 eythian       If you're using the zebra-like code, it builds a lucene-type search string, and then turns that into an ES query.
22:54 cait          interesting
22:54 wahanui       somebody said interesting was sometimes good and sometimes bad
22:55 eythian       this also means that you can type lucene stuff straight into the keyword box and it should work.
22:55 eythian       e.g. title:foo AND subject:bar
22:55 pianohacker   wait, does that mean the existing code allows raw PQF for zebra?
22:56 eythian       probably
22:56 eythian       possible
22:56 wahanui       i heard possible was everything, but I think it's probably easy to miss something here
22:56 eythian       y
22:57 jce           cait:  I'm looking through a MARC XML file I exported and it appears that you're right.  No 999 tags.  Could it be that we were never using Zebra even though we thought we were?  Seems more likely than that all the 999 tags got stripped out.  Is it possible that there was another tag being used instead?  I have vague thoughts about 7xx-something.
22:58 tcohen        eythian: will there be something like K::SE::ES ?
22:58 jce           I see 700 tags that contain what look like author and title information.
22:58 eythian       tcohen: yep
22:58 tcohen        awesome
22:59 cait          jce: you are basically looking for 2 identical numbers
22:59 eythian       that aspect of it needs a bit of work right now, but it's really just a bit of rearranging of files and such.
23:00 mtj           jce, you are prolly missing 999 framework mappings - thats why your bibs have no 999 fields
23:00 mtj           Home › Administration › Koha to MARC mapping
23:00 tcohen        eythian: thanks for the update
23:01 mtj           ..its a common-ish bug for upgraded old kohas
23:01 cait          tcohen: when will you be back? :)
23:01 eythian       some time soon I hope to squash patches appropriately and publish something for testing.
23:01 eythian       well, super-alpha testing
23:01 tcohen        saturday actually
23:02 tcohen        eythian: looking forward to it, thanks
23:03 mtj           jce: based on your info - you probably have not been using zebra for a while
23:03 * tcohen      notices that the hotel hasn't filtered port 22 :-D
23:03 jce           mtj:  That could be.
23:04 mtj           ..if ever?
23:05 jce           So, I'm looking in Koha to MARC mapping, and I don't find any reference to 999.  I'll need to add it then?  Should this be under biblio, biblioitems, or items?
23:05 cait          hmm
23:05 eythian       that sounds like it would be very problematic.
23:06 cait          i'd recommend looking at a standard default framework
23:06 eythian       I'd be inclined to reload them from default
23:06 cait          probably even load a standard framework
23:06 cait          yeah
23:06 cait          i agree
23:07 mtj           me... i would load your bib, bibitems and items tables into a fresh koha, and test from there
23:07 cait          the problem is, even after reloading the frameworks
23:07 rangi         mtj: that wont work if those records have no 999 on them
23:07 mtj           or the other way around, too :)
23:07 cait          you'd need to add biblionumbers i guess
23:07 rangi         yep
23:08 rangi         the best thing to do
23:08 * eythian     wonders if there's a fixbiblionumbers type script around
23:08 rangi         before worrying about frameworks
23:08 mtj           aah, yeah - those arent automagically created :/
23:08 rangi         is do a select marcxml from biblioitems limit 2;
23:08 mtj           me forgot that
23:08 rangi         and have a look
23:11 jce           rangi:  I'll need a little more context for your suggestion.  I've exported all the records in marcxml.  Are you just saying to check if I have 999 in the records?  I can already confirm I don't.
23:16 dcook         pianohacker: If you prefix your query with "pqf=" you can send straight PQF to Zebra
23:16 dcook         I don't think it would work without that prefix though as it would just interpret it as a kw search, me thinks
23:20 * pianohacker has a geeky trick
23:20 pianohacker   dcook++
23:20 dcook         hehe
23:27 jce           Ok all, here's another bit of information.  When I last tried to re-index Zebra, it reported the following:  BIBLIONUMBER in: 090$c   BIBLIOITEMNUMBER in: 090$d.  I take it that these should be in 999$c and 999$d?
23:28 rangi         isnt 090 unimarc
23:28 rangi         ?
23:28 rangi         hmm and old old koha too i think
23:29 jce           www.loc.gov/marc/bibliographic/bd09x.html says 090 is "Local Call Number [OBSOLETE, 1982]"
23:30 jce           So maybe I simply need to run a script to upgrade the schema?  [hoping for something easy like that]
23:30 cait          you should change the records
23:30 cait          if you can do that, i think it might work
23:31 cait          loc is not relevant here - this is koha specific
23:31 dcook         rangi: It looks like 090$9 is unimarc
23:31 dcook         So I'm guessing old Koha?
23:32 jce           Would it work to do a global text replace in a marcxml dump, changing the 090 tag to 999?
23:32 rangi         id try it on 2 or 3 records
23:32 rangi         reindex
23:32 jce           dcook:  I think this is still leftover mess from years ago.
23:32 rangi         and see if it works, then do all of them (after a backup of course)
23:32 dcook         jce: I think you'll also need to change the place that is telling Zebra to look in 090$c and 090$d, I suspect
23:33 rangi         dcook: zebra is looking in 999
23:33 jce           rangi:  sounds like a good idea.  I have a working backup, the original server, and 2 virtual machines.
23:33 dcook         "GetMarcFromKohaField"
23:33 rangi         the problem is the are in 090
23:33 dcook         rangi: BIBLIONUMBER in: 090$c   BIBLIOITEMNUMBER in: 090$d is based on GetMarcFromKohaField
23:33 rangi         you will need to fix up your frameworks too jce
23:33 rangi         yes
23:33 rangi         thats the problem
23:34 rangi         the data and the frameworks say 090
23:34 rangi         zebra looks for 999
23:34 rangi         need to fix the data and frameworks to say 999
23:34 dcook         That's what I meant. Bad communication on my part :p
23:34 dcook         Also want to update Marc => Koha mappings, as those get used quite a bit in Koha
23:34 rangi         (without fixing the frameworks, the problem will recreate itself over time)
23:35 dcook         Err "Koha to MARC Mapping"
23:36 jce           dcook:  is GetMarcFromKohaField in system preferences?  I'm not finding it there.
23:37 cait          it's in administration
23:37 cait          koha 2 marc mapping
23:37 dcook         /cgi-bin/koha/admin/koha2marclinks.pl
23:37 wahanui       /cgi-bin/koha/admin/koha2marclinks.pl is the place to check 1st
23:38 dcook         Whoa. Wahanui and I are in sync.
23:38 cait          you'd want to make sure that the mappings there are correct - they get applied to all frameworks
23:38 dcook         Not to be confused with nsync...
23:38 dcook         Yep. If those have 090 for the biblionumber/biblioitemnumber, new records and mods will get the number changed back to 090.
23:39 pianohacker   cait: that's a question I recently ran into, as some places use the marc mapping of the relevant framework and others just use the default
23:39 pianohacker   is there a best practice?
23:40 pianohacker   (or at least an accepted one)
23:40 pianohacker   I've been looking at circulation code, so my head hasn't exactly been in the clean, recently architected part of Koha lately :P
23:41 cait          i think it doesn't make sense to me for different frameworks to have different mappings right now
23:41 cait          there might be a use case i can't think of...
23:41 cait          you never know with libraries
23:41 pianohacker   always is in the library world, yeah...
23:42 * dcook       supposes it might depend on the case as well
23:42 * dcook       turns back to his Friday work before he starts thinking about other things..
23:42 dcook         But yay for pianohacker looking at circulation! :D
23:42 pianohacker   well, from a very unscientific ag -i marctokoha | grep -v framework | wc -l
23:43 pianohacker   about 80% of the instances use it
23:43 eythian       what is 'ag'?
23:43 * dcook       was also wondering this
23:44 pianohacker   sorry, 80% of instances in Koha don't pass frameworkcode
23:44 jce           dcook:  I don't find any reference to GetMarcFromKohaField in /usr/share/koha/intranet/cgi-bin/admin/koha2marclinks.pl  Is it a variable name?  A function?  A Perl module?
23:44 dcook         jce: a function. It's referred to in rebuild_zebra.pl
23:44 pianohacker   dcook, eythian: It's a superfast recursive grep that automatically ignores certain directories
23:44 dcook         What you want to do is go to the page koha2marclinks.pl
23:44 eythian       pianohacker: you mean ack
23:45 eythian       ?
23:45 pianohacker   eythian: nope, even faster than ack, though not quite as full featured
23:45 cait          jce: i think before thinking further, just try changing the data for 2 records or so and see if you can search them - might save you more headache, doing it step by step
23:45 jce           dcook:  Ah, I was looking in the code.
23:45 eythian       pianohacker: ah, interesting
23:46 pianohacker   eythian: see http://geoff.greer.fm/2012/09/03/profiling-ag-writing-my-own-scandir/ and http://geoff.greer.fm/2011/12/27/the-silver-searcher-better-than-ack/
23:46 pianohacker   dude is a bit performance crazy
23:46 cait          night
23:46 cait          :)
23:46 pianohacker   night cait
23:46 rangi         jce: id just try editing a couple of records, reindexing and see if it fixes the search
23:46 rangi         jce: everything else is moot if it doesnt
23:47 dcook         jce: Both cait and rangi have been doing this  much longer than me, so I'd probably listen to them :p
23:48 rangi         once we are sure that shifting the 090 to 999 fixes it, then we can worry about the mapping/frameworks :)
23:51 mtj           ahh, this script might do the 090->999 fixup jce
23:51 mtj           http://git.kohaaloha.com/?p=misc/.git;a=blob;f=fix-090-to-999.pl
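(Not mtj's script, just a rough MARC::Record sketch of what such a 090-to-999 fixup could look like on an exported binary MARC file; as rangi says, try it on a couple of records and reindex before touching the whole set:)
    #!/usr/bin/perl
    use strict;
    use warnings;
    use MARC::Batch;
    use MARC::Field;

    my $batch = MARC::Batch->new( 'USMARC', 'records.mrc' );
    open my $out, '>', 'fixed.mrc' or die $!;
    while ( my $record = $batch->next ) {
        if ( my $f090 = $record->field('090') ) {
            my ( $bibnum, $bibitemnum ) = ( $f090->subfield('c'), $f090->subfield('d') );
            if ( defined $bibnum && defined $bibitemnum ) {
                # move biblionumber/biblioitemnumber into the 999 that Zebra expects
                $record->append_fields(
                    MARC::Field->new( '999', ' ', ' ', c => $bibnum, d => $bibitemnum )
                );
                $record->delete_field($f090);
            }
        }
        print $out $record->as_usmarc();
    }
    close $out;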
23:51 jce           Ok, so I should edit a couple records by changing their 090 tags  to 999, or copying them to 999 tags in marcxml, import those records, re-index Zebra, and see if I can search those records on Title, Author, etc.  If that works, I should fix the framework and Koha to MARC mapping, mass-edit the rest of the marcxml records, import them, re-index, and there's a 100% guarantee that everything will be happy in the world.  :)
23:51 jce           mtj:  Ooh, I may check into that.
23:51 eythian       pianohacker: it's around half the speed of ack for me.
23:51 pianohacker   never a 100% guarantee
23:51 eythian       though perhaps there are differences in options
23:52 pianohacker   eythian: that's interesting. running a simple ag/ack MarcToKoha in my kohaclone is 0.47 vs 22.27 seconds
23:52 jce           pianohacker:  :O  shock and disbelief!
23:52 eythian       pianohacker: make sure you have a warm cache though
23:52 pianohacker   (on my machine)
23:53 eythian       i.e. run each one twice at least
23:53 pianohacker   jce: heh. Just making sure :)
23:54 pianohacker   eythian: Just did, same result. What test are you running/are you on an SSD?
23:54 jce           May have to take a supper break before diving into this.  Thanks all for your help.
23:54 eythian       > time ack GetMarcFromKohaField
23:54 eythian       not on SSD
23:54 mtj           jce: you are making good progress :)
23:55 eythian       pianohacker: OK, when I make a .agignore from my .ackrc, it's a whole lot faster.
23:55 jce           mtj:  I do think this sounds like a plausible explanation for what is going on.
23:56 eythian       I had ack tuned to drop a large amount of unnecessary Koha directories
23:56 mtj           jce, yeah - ive travelled your path a few times, myself :)
23:57 rangi         pianohacker: you did the overdrive work eh?
23:57 mtj           a well worn road
23:58 pianohacker   eythian: Ah, yeah, even just making ack ignore .po files like I did for ag reduced the difference a lot
23:58 pianohacker   rangi: Yup. That reminds me, though
23:58 rangi         pianohacker: http://ils.stdc.govt.nz/cgi-bin/koha/opac-overdrive-search.pl?q=harry%20potter
23:59 pianohacker   gmcharlt: do you have a second to discuss a dependency problem that affects 3.16?
23:59 pianohacker   rangi: cool, didn't know NZ libraries used overdrive