Time  Nick              Message
00:17 smeagol           feesh..nice fish..
01:41 pastebot          "wajasu" at 127.0.0.1 pasted "zebra facet yaz-client result" (58 lines) at http://paste.koha-community.org/174
01:43 wajasu            looking into facets.
01:44 dcook             Did you set up zebra::facet::title:w in your zebra config?
01:45 dcook             You could try zebra::facet::title:0 since you're using ICU
01:48 wajasu            i've been putting <retrieval syntax="xml" name="zebra::facet::title:w"/>  into  ... zebradb/retrieval-info-bib-dom.xml
01:48 dcook             Also, wajasu, I don't think we actually have a "title" index set
01:48 dcook             You might try "Title"
01:48 dcook             Caps might matter
01:48 * dcook           can't remember the actual index he tried out in October
01:48 wajasu            i tried, but will try again.
01:49 wajasu            got the same thing with Title. let me try 0
01:51 wajasu            tried 0 instead of w
01:52 wajasu            got an error
01:52 dcook             That's a bit bizarre
01:53 dcook             Let's see...I could probably take a few minutes to try this again
01:53 dcook             I really need to remember to set up a dev machine one of these days..
01:55 wajasu            i even tried the retrieval lines from comment 6 of bug 11232
01:55 huginn            Bug http://bugs.koha-community.org/bugzilla3/show_bug.cgi?id=11232 new feature, P5 - low, ---, gmcharlt, NEW , Retrieve facets from Zebra
01:56 dcook             Hold on. I'm just getting it set up now :p
02:01 dcook             Hmm
02:03 wajasu            maybe i forgot to put something in the yazgfs section
02:03 dcook             Mmm, I doubt it
02:03 dcook             Maybe I forgot to put something in instructions
02:03 dcook             wajasu: Are you using a package install?
02:04 wajasu            yes
02:04 dcook             Have you tried koha-zebra-restart?
02:04 wajasu            well i lied.
02:04 dcook             lol
02:05 wajasu            i created a VM and did koha packages to get dependencies, but then did a kohaclone dev install
02:05 wajasu            the VM is about 5 weeks old.
02:05 dcook             In any case, if you can restart zebra itself, try that
02:07 wajasu            i have a clean db with 3 records i pulled and cataloged that have "heart" in the title
02:07 wajasu            i've run rebuild_zebra and restarted zebrasrv
02:07 dcook             That's not what I said to do :p
02:07 dcook             Ah
02:08 dcook             I'm thinking that there is something missing in my directions..
02:09 wajasu            with base biblios, 'find @attr 1=4 heart' gets 4 hits.
02:09 wajasu            it's the elements line
02:10 dcook             Yes, that would be it
02:11 wajasu            element zebra::facet::any:w,title:w,title:0       ends up giving me:
02:11 wajasu            [25] Specified element set name not valid for specified database -- v2 addinfo 'zebra::facet::any:w,title:w,title:0'
02:12 dcook             Yeah, same.
02:12 wajasu            maybe my retrieval should be combined
02:12 dcook             Same for any combo for me at the moment
02:12 dcook             O_o
02:13 wajasu            i hope it's not because it's squeeze and an older zebrasrv/yaz-client
02:13 wajasu            when you were hacking could it have been a newer compiled version?
02:13 dcook             Well, I'll try it on my wheezy vm in a minute
02:13 dcook             Well could've been
02:13 dcook             Facetting is only supported from Zebra 2....
02:13 dcook             2.1
02:14 dcook             Wait..
02:14 wajasu            zversion says 3
02:14 dcook             Maybe 2.0.20
02:14 wajasu            but that may be the default
02:20 wajasu            got yaz-client version 4.0.11
02:21 wajasu            option   gives:
02:21 wajasu            search present delSet triggerResourceCtrl scan sort extendedServices namedResultSets
02:25 wajasu            I just recently tried:
02:25 wajasu            base biblios
02:26 wajasu            format xml
02:26 wahanui           format xml is a client side transformation of whatever comes back from the server.
02:26 wajasu            elements zebra::facet::itype:w
02:26 wajasu            f @attr 1=8031 BK
02:27 wajasu            and got 7 hits.   then did a show and got a facet xml response, but text was G/A
02:28 wajasu            just grabbed 8031 from the ccl.properties mapping
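For reference, the full yaz-client session wajasu is describing looks roughly like this, once yaz-client is connected to the bibliographic Zebra server (the connection details come from the <listen> entries in your koha-conf.xml):

    Z> base biblios
    Z> format xml
    Z> elements zebra::facet::itype:w
    Z> f @attr 1=8031 BK
    Z> show 1

The show then returns Zebra's facet XML for the result set instead of a MARCXML record, which is the response wajasu describes above.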
02:29 dcook             G/A ?
02:30 wajasu            i see this in the log
02:30 wajasu            21:16:20-19/03 zebrasrv(1) [log] dict_lookup_grep: (\x01G\x01\x02)/A\x01\x01\x01\x06\x01\x01\x01\x06
02:30 wajasu            there is a G  and /A in that string
02:31 pastebot          "wajasu" at 127.0.0.1 pasted "zebra log" (9 lines) at http://paste.koha-community.org/175
02:32 wajasu            facet first phase real=0.00 cat=index     might be a lead
02:35 pastebot          "wajasu" at 127.0.0.1 pasted "zebra log -v all" (188 lines) at http://paste.koha-community.org/176
02:39 * mtompset        is glad dcook and wajasu are talking zebra and facets. :)
02:54 dcook             Aha!
02:54 wajasu            Wha?
02:55 dcook             I was wondering why it wasn't working on my Debian VM, but it's because I was pointing to the wrong file...
02:56 dcook             Alas, that's for package installs
02:57 dcook             So not really relevant to the task at hand
02:57 wajasu            ok. be right back
03:00 dcook             wajasu: I'm quite confident that the problem is with restarting Zebra
03:00 dcook             Double-check your koha-conf.xml to make sure that retrieval-info-bib-dom.xml is the correct file
03:01 wajasu            ok
03:05 dcook             So yeah...I think the problem is making sure you're changing the correct file, and then making sure that Zebra is restarting correctly
03:15 mtompset          dcook: less kohaclone/debian/scripts/koha-restart-zebra
03:15 mtompset          look for the long daemon line. ;)
03:15 dcook             Yeah, I already looked there
03:16 dcook             I don't have root access for my dev install in any case, so it looks like I'll have to ask someone else to do it
03:16 mtompset          You don't have root access for your dev install?!
03:17 dcook             It's not my server :p
03:17 dcook             Someone else runs it
03:17 dcook             I just provide the code :p
03:17 mtompset          So then... why don't you have a VM for your own development?
03:18 dcook             I don't really have my own development
03:18 dcook             But I do have a VM and I use packages and gitify :p
03:18 dcook             And we wind back up at the beginning where I have to ask someone else to restart zebra in this case :p
03:18 mtompset          I have 13 VM's to play with. :)
03:18 * mtompset        shouts, "MORE VMS!"
03:19 * mtompset        grins.
03:59 dcook             Ahhh!
03:59 dcook             I'm awesome!
03:59 * dcook           does a little dance
03:59 dcook             It was so obvious in the end
03:59 dcook             Gahhh
04:00 dcook             Too much excitement. Not enough outlet...
04:00 wizzyrea          hehe
04:00 dcook             We're doing our Zebra config all wrong for facets
04:00 wizzyrea          that is unsurprising to me.
04:01 dcook             Initial testing suggests that it'll be so easy to access Zebra's special element set
04:01 dcook             http://www.indexdata.com/yaz/doc/tools.retrieval.html
04:01 dcook             It specifies that "name" is optional...
04:01 dcook             Wait...
04:01 dcook             I need to double-check things
04:01 * dcook           halts the celebration
04:01 dcook             I might be an idiot still
04:03 dcook             Nope! I was right!
04:03 * dcook           continues dancing
04:04 dcook             We just need "<retrieval syntax="xml"/>" at the bottom within <retrievalinfo>
04:04 dcook             Then we regain access to the Zebra special retrieval elements
04:04 dcook             That we lost by using <retrievalinfo> in the first place
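Concretely, the change dcook describes amounts to this at the end of zebradb/retrieval-info-bib-dom.xml; the existing <retrieval> entries above it are left untouched (a sketch of the placement, not a full copy of the file):

    <retrievalinfo>
      <!-- ... the existing <retrieval> entries stay as they are ... -->

      <!-- catch-all with no name attribute: lets requests for Zebra's
           special element sets (zebra::facet::..., etc.) through again -->
      <retrieval syntax="xml"/>
    </retrievalinfo>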
04:05 dcook             hdl posed the question in June 2009 and now there is an answer :)
04:30 dcook             @later tell wajasu take a look at bug 11232 again. I've updated the instructions.
04:30 huginn            dcook: The operation succeeded.
04:32 wajasu            trying out
04:33 dcook             It only fixes one of the problems though it appears..
04:33 pastebot          "dcook" at 127.0.0.1 pasted "Zebra output on non-package install" (25 lines) at http://paste.koha-community.org/177
04:35 pastebot          "dcook" at 127.0.0.1 pasted "Zebra facet output on package install" (51 lines) at http://paste.koha-community.org/178
04:40 dcook             Hmm, I'm using DOM for both of them..
04:40 dcook             Not sure about ICU
04:41 dcook             Interesting...
04:41 wahanui           hmmm... interesting is sometimes good and sometimes bad
04:41 wajasu            my koha-conf has a bunch of retrieval entries as well as the included one.
04:42 dcook             Yeah, those are fallbacks
04:42 dcook             It looks like my package install is non-ICU, while my tarball install is ICU
04:45 wajasu            i haven't had any success yet.
04:45 dcook             I doubt you will while using ICU
04:45 wajasu            remind me how to turn off icu
04:45 dcook             Although if you replaced all the specific zebra::facet::*:* type entries with just a <retrieval syntax="xml"/> you should be able to do any facet you want in yaz
04:46 dcook             You'll need to change default.idx
04:46 dcook             Possibly somewhere else. I'm not sure.
04:46 dcook             I do have an idea though..
04:47 wajasu            now things make sense.  those index w and index p  definitions in there link stuff up
04:48 dcook             Link stuff up?
04:49 wajasu            the index w, and index p,  must have to do with    zebra::facet::Title:p    and such
04:50 wajasu            can u paste your default.idx
04:50 dcook             Yeah, "zebra::facet::Title:p" would be a facet for the title phrase index
04:53 wajasu            i suppose i can do another make and choose chr
05:03 dcook             Ooo
05:03 dcook             wajasu: With your ICU install, try a facet of "itype:w"
05:03 dcook             That's working for me
05:04 dcook             As does pubdate
05:05 dcook             wth...
05:05 dcook             Suddenly title is working now too
05:06 dcook             wajasu: Try something like "elements zebra::facet::Title:w,itype:w,pubdate:w,rtype:w" with ICU
05:06 dcook             For some reason, it's working great
05:06 dcook             Oh I wonder..
05:07 dcook             Oh wait...it's because I'm an idiot
05:07 dcook             Probably at least
05:07 wajasu            i haven't had any luck yet
05:08 dcook             I switched back to non-ICU and that's why it was working..derp derp
05:09 wajasu            ok i'll try switching back to non-ICU
05:09 dcook             Use "charmap word-phrase-utf.chr" instead of "icuchain words-icu.xml" in default.idx for whatever index type you want to try
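In default.idx terms, that swap for the word (:w) index type looks something like the following; any other directives in the stanza are left as they are (a sketch, not the whole file):

    index w
        # non-ICU word indexing:
        charmap word-phrase-utf.chr
        # the ICU variant this replaces:
        # icuchain words-icu.xml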
05:09 dcook             Still...it's not a great solution if it doesn't work for ICU..
05:09 wajasu            i'll copy my default.idx from the package install into my kohadev
05:09 dcook             That's certainly one way to do it, if your package install doesn't use ICU
05:10 dcook             Doesn't hurt to understand what's going on in the config files though :p
05:11 dcook             Of course, the reason it might not seem to work with ICU might be because the terminal isn't expecting utf8..
05:14 wajasu            i am thinking its a character conversion thing as well
05:15 dcook             Hmm, I just told yaz-client to use UTF-8 and that's not working..
05:15 wajasu            maybe inputcharset outputcharset stuff
05:15 dcook             Yeah, I changed the output
05:17 dcook             Hmm, none of those options are doing much
05:23 wajasu            http://lists.indexdata.dk/pipermail/zebralist/2008-January/001859.html
05:25 wajasu            http://lists.indexdata.dk/pipermail/zebralist/2009-April/002190.html
05:25 dcook             Interesting
05:25 wahanui           i think Interesting is sometimes good and sometimes bad
05:25 dcook             Gotta love how their bugzilla links are broken..
05:26 wajasu            :)
05:28 dcook             Would you post those links on Bugzilla?
05:28 dcook             Probably makes more sense to focus on ElasticSearch and/or Solr anyway :p
05:30 dcook             the :0 thing might fix this but I'm not sure..
05:31 wajasu            i did get facet counts working, where i retrieve all the result records (marcxml) and inspect hidden etc.  but what if the system is big, like 1 million records?
05:32 dcook             How do you mean that you got facet counts working?
05:32 dcook             As in, at the Perl code level?
05:33 dcook             Indexdata claims that Zebra special retrieval elements should work faster than using marcxml
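As a rough Perl illustration of what that claim means (a sketch only: it reuses the ZOOM connection Koha already opens and the facet element set that worked in the yaz-client tests above, and it is not how C4::Search currently builds its facets):

    use C4::Context;

    # ask Zebra for its facet element set instead of full MARCXML records
    my $zconn = C4::Context->Zconn('biblioserver');
    $zconn->option( elementSetName => 'zebra::facet::itype:w,pubdate:w' );

    my $rs = $zconn->search_pqf('@attr 1=4 heart');   # same search as earlier in the log
    # the record request now returns Zebra's facet XML for the whole
    # result set, as seen with "show" in yaz-client
    print $rs->record(0)->render() if $rs->size();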
05:33 wajasu            yes, the other day. i have a bug 11909 that i was working on with mtompset
05:33 huginn            Bug http://bugs.koha-community.org/bugzilla3/show_bug.cgi?id=11909 major, P5 - low, ---, matted-34813, ASSIGNED , Fix hidelostitems, OpacHiddenItems total count, prog them facet display
05:33 mtompset          -- which by the way is on my "get back to quickly" list.
05:34 mtompset          I should go. I need sleep.
05:34 mtompset          Have a great day, #koha wajasu dcook. :)
05:34 dcook             Then I'm not sure of the relevancy
05:34 dcook             ta mtompset
05:34 dcook             I suspect that querying Zebra for facets from a million results would be a lot faster than iterating through a million results to generate a facet list
05:34 dcook             Not that that was what you were precisely saying of course :p
05:35 wajasu            i agree. that's why i was investigating the facets.
05:35 wajasu            now that i understand search so well.
05:37 wajasu            what i wish i could do is query zebra and just get back the biblionumber (Local-number) for a search.  so i can grab items from the DB and count facets myself.
05:38 wajasu            but we get back a marcxml chunk and must decode with MARC::Record.
05:39 wajasu            what i noticed with my patch is that as i went through more than the N records that a syspref restricts, the resulting 5 facets for author changed.
05:41 wajasu            so then you start realizing how much is left out.   when i increased the number from 5 to 100, the catalog got more interesting because i saw more writers who wrote on a topic, or such.
05:41 wajasu            so i was learning about new authors, etc.
05:41 wajasu            so i think facets are worth it.
05:43 dcook             I didn't really follow most of what you said there, but yeah...the current way of doing facets really sucks
05:43 dcook             There might be some hope though: http://lists.indexdata.dk/pipermail/zebralist/2013-February/002563.html
05:43 dcook             Probably unrelated though
05:44 wajasu            i'm sure if we could read the source code we could find out, and see if there is any ICU support code.
05:46 dcook             Alas, I don't know, and I have other things to do :/
05:46 dcook             Definitely learned something today though and we are farther than we were
05:47 wajasu            yup
05:47 wajasu            thx
05:49 dcook             You're welcome. It was interesting!
06:40 paxed             how does the cataloguer set the framework used for a record?
06:49 wajasu            when i search (z3950) and get the Add MARC Record page, there is a default framework dropdown. if other framework mappings have been set up by the admin, i would expect they would be available.
06:51 indradg           good morning #koha
06:55 magnuse           happy equinox, #koha!
07:04 paxed             hm. any reason why the 008 value builder doesn't know how to show the type of material correctly? (eg. LDR/06=a and LDR/07=s, 008 builder shows BKS, when it should show CR)
07:04 paxed             marc21^
07:08 * cait            waves
07:09 paxed             ah, it's bug 9093
07:09 huginn            Bug http://bugs.koha-community.org/bugzilla3/show_bug.cgi?id=9093 normal, P5 - low, ---, gmcharlt, NEW , 008 forgetting what material type was chosen
07:09 paxed             lovely.
07:11 paxed             our cataloguer just had a fit over that.
07:29 alex_a            bonjour
07:43 paxed             so, is there any kind of ETA for Rancor?
07:44 magnuse           bug 11559 is currently "failed qa"
07:44 huginn            Bug http://bugs.koha-community.org/bugzilla3/show_bug.cgi?id=11559 enhancement, P5 - low, ---, jweaver, Failed QA , Professional cataloger's interface
07:45 paxed             because the current cataloguing system is bad.
07:45 magnuse           so pianohacker will have to fix the issues and then it will go through signoff, qa and getting pushed by the rm
07:45 magnuse           there was quite some interest around it in marseille, so i think it will make it into 3.16
07:45 magnuse           but you never know...
07:46 Joubu             hello
07:46 wahanui           what's up, Joubu
07:46 paxed             i have no idea how the koha maintainers haven't been killed by cataloguers.
07:48 reiveune          hellok
07:54 paxed             what is the reserveconstraints table for?
08:11 cait              good morning #koha
08:16 magnuse           kia ora cait
08:26 cait              morning magnuse and paul_p
08:28 marcelr           hi #koha
08:43 magnuse           hiya marcelr
08:43 marcelr           hi magnuse
08:43 wahanui           hi magnuse are you spending much time in oslo?
08:48 cait              wahanui forget hi magnuse
08:48 wahanui           cait: I forgot hi magnuse
08:48 cait              hi marcelr - thx for continuing :)
08:49 cait              marcelr++
08:49 marcelr           hi cait
08:49 marcelr           continue what btw
08:49 cait              QA
08:49 marcelr           ok now i see :)
09:05 Hopla             Hey koha, i just upgraded from 3.8 to 3.12. i have some problems with my old css file so i want to make a new one, only i find there are 2 now (i wanted to copy one to edit it): one in ccsr and one in prog ... in 3.8 there was only 1, so should i copy both or just 1 into a new local css file?
09:09 cait              ccsr and prog are themes
09:09 cait              each has its own css file
09:09 cait              but don't change the default css
09:09 Hopla             i wont
09:09 cait              ok :)
09:09 cait              you can just add your things to opacusercss system preference
09:09 cait              then you don't need to care about the files and it will just overwrite the default
09:09 cait              if you want to use a file, there are other preferences
09:10 cait              the theme you use can be found out by looking at the opacthemes system preference
09:10 Hopla             sec
09:10 Hopla             ok im using prog
09:12 cait              ccsr is going to go away with 3.16 probably, it's responsive, but there is a new theme bootstrap since 3.14 that's going to be the new and only default theme at some point
09:12 Hopla             so if i make a copy of the opac.css, edit it and point to it in opaclayoutstylesheet, it would be ok
09:12 cait              if you worked like that in the past, it should work again
09:12 Hopla             ow
09:13 Hopla             thx fr the info :)
09:13 cait              if you only have minor changes, i'd recommend just changing what you want to change and use opacusercss - it's a bit easier to maintain
09:13 cait              sure
09:13 Hopla             aha
09:13 Hopla             searching that one
09:14 cait              you can put the css in there directly and change it anytime by logging in
09:14 cait              also on updating, you won't miss the changes from the new css, so probably less work to make it fit again
09:15 Hopla             i'll try, i only change logo + colours so
09:20 cait              ah yeah, probably easier then :)
09:54 vfernandes        hi :)
09:54 cait              hi vfernandes :)
09:54 vfernandes        where can I see the koha roadmap and new features/enhancements for 3.16?
09:56 cait              you could do a search in bugzilla
09:56 cait              for pushed since last release
09:56 cait              you can create searches by status change - that should work
09:56 cait              and master as version
09:57 vfernandes        i thought there was a simple way :D
09:57 cait              there is also a release notes script that can be fun - it's on the  git repo
09:58 cait              there's still a bit of time to go, so hard to tell what will make it
09:58 cait              i would look at what's already pushed
09:58 cait              i think it was get-bugs.pö
09:59 cait              .pl - http://git.koha-community.org/gitweb/?p=release-tools.git;a=tree;h=c5e072bd0ddc5f4467408f330ed94fabd6db4ea0;hb=c5e072bd0ddc5f4467408f330ed94fabd6db4ea0
09:59 cait              hi petter
09:59 petter            hi cait!
10:03 magnuse           kia ora petter!
10:03 petter            hei magnus!
10:03 magnuse           cait: .pö - is that for german perl scripts? ;-)
10:03 vfernandes        but are there any new features that you have information on, cait?
10:06 cait              i'd have to think - there are some :) the dashboard always shows the last five enh/features pushed
10:07 cait              http://dashboard.koha-community.org/
10:07 wahanui           http://dashboard.koha-community.org/ are the better stats to optimise
10:14 magnuse           enhancements and new features pushed to master since 2013-11-22: http://bugs.koha-community.org/bugzilla3/buglist.cgi?bug_severity=enhancement&bug_severity=new%20feature&bug_status=UNCONFIRMED&bug_status=NEW&bug_status=REOPENED&bug_status=ASSIGNED&bug_status=In%20Discussion&bug_status=Needs%20Signoff&bug_status=Signed%20Off&bug_status=Passed%20QA&bug_status=Pushed%20for%20QA&bug_status=Failed%20QA&bug_status=Patch%20doesn%27t%20apply&bug_status=Pushed%20to%20Master&bug_status=Pushed%20to%20Stable&bug_status=RESOLVED&bug_status=VERIFIED&bug_status=CLOSED&bug_status=BLOCKED&chfield=bug_status&chfieldfrom=2013-11-22&chfieldto=Now&chfieldvalue=Pushed%20to%20master&list_id=93066&query_format=advanced
10:14 magnuse           whoa, that's a long url
10:15 magnuse           short version: http://tinyurl.com/o2l47s6
10:16 magnuse           hm, it just shows one new feature - everything might not be correctly tagged...
10:17 cait              i think we are quite strict with 'new feature', but there are not so many more modules we can write... so maybe we should be a bit less strict :)
10:18 magnuse           :-)
10:18 magnuse           well, there are some features waiting to get in, i guess
10:18 magnuse           like rancor
10:22 cait              hm i forgot to make tea today.
10:22 * cait            wanders off
10:23 vfernandes        thanks magnuse and cait
10:51 magnuse           woohoo - bug 10003
10:51 huginn            Bug http://bugs.koha-community.org/bugzilla3/show_bug.cgi?id=10003 enhancement, P5 - low, ---, tomascohen, Passed QA , koha-* scripts (packages) should provide tab-completion in bash
11:20 * BTT             slaps francharb_afk around a bit with a large fishbot
11:22 francharb         good morning
11:38 * magnuse         will have french cheese for lunch, but only one kind...
12:05 drojf             hi #koha
12:16 oleonard          Hi #koha
12:17 cait              hi oleonard :)
12:20 nlegrand          hey #koha !
12:55 oleonard          Seems like people are entering search terms into Koha list emails like it's Google.
12:57 drojf             it isn't? ;)
13:34 kivilahtio        m de rooy
13:34 kivilahtio        does anyone know his nick?
13:34 oleonard          irc regulars?
13:34 wahanui           irc regulars is at http://wiki.koha-community.org/wiki/IRC_Regulars
13:35 kivilahtio        oleonard: thanks :D
13:35 kivilahtio        marcelr: about bug 11974
13:35 huginn            Bug http://bugs.koha-community.org/bugzilla3/show_bug.cgi?id=11974 enhancement, P5 - low, ---, gmcharlt, NEW , Enable unix socket connections for database connections.
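For background, the difference the patch is after boils down to the DSN handed to DBD::mysql. A minimal DBI sketch; the socket path and credentials are placeholders, and the actual patch threads the option through Makefile.PL and koha-conf.xml rather than hard-coding it:

    use DBI;

    # TCP connection, roughly what a stock install does today
    my $dbh_tcp  = DBI->connect(
        "DBI:mysql:database=koha;host=localhost;port=3306",
        "kohaadmin", "secret", { RaiseError => 1 } );

    # the same database over the local unix socket, skipping TCP entirely
    my $dbh_sock = DBI->connect(
        "DBI:mysql:database=koha;mysql_socket=/var/run/mysqld/mysqld.sock",
        "kohaadmin", "secret", { RaiseError => 1 } );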
13:35 marcelr           hi kivilahtio
13:35 kivilahtio        hi there :D
13:36 kivilahtio        good suggestion btw
13:36 kivilahtio        I am not very good with Makefile.pl
13:36 marcelr           :)
13:36 kivilahtio        or make whatsoever
13:36 kivilahtio        it is bad for my health
13:36 marcelr           you can do it !
13:36 kivilahtio        I was hoping you could do it
13:36 marcelr           ;)
13:36 kivilahtio        that would force you to sign off my patch
13:36 kivilahtio        and I could sign off your patch
13:37 kivilahtio        this way we both get a sign off and commit
13:37 kivilahtio        win win
13:37 kivilahtio        you scratch my back i scratch yours?
13:37 marcelr           better offer?
13:37 kivilahtio        :)
13:37 kivilahtio        I'll take a look real quick
13:37 kivilahtio        but if ANY issues arise...
13:37 marcelr           so good
13:37 markvandenborre   I am looking into the best way to simplify entry of sheet music books into koha
13:38 markvandenborre   we only keep a relatively minor subset of metadata about them
13:38 markvandenborre   any pointers towards relevant bits of documentation?
13:38 cait              markvandenborre: sounds like you might want to use a custom framework
13:38 markvandenborre   cait: as in not koha?
13:39 markvandenborre   there seem to be provisions in koha for sheet music...
13:39 markvandenborre   even ISMN (sheet music) numbering lookup
13:39 cait              hm?
13:40 cait              a custom framework would be just a cataloguing form defined with the fields you want to use
13:40 cait              to make data entry easier
13:40 cait              a bibliographic framework
13:40 cait              are you using marc21 or unimarc?
13:40 markvandenborre   right, I was already guessing that this was some bibliographic or koha specific terminology, not a custom web framework :-)
13:41 markvandenborre   unimarc iirc, I chose the EU specific version
13:41 cait              unimarc is only used in a few countries
13:41 cait              france, italy...
13:41 markvandenborre   Belgium?
13:41 cait              depending on where you are located marc21 might make sense
13:41 markvandenborre   what am I going to get the best support on?
13:41 cait              i can't really tell you, maybe look at what your national library uses, or what z39.50 servers in your region use
13:41 markvandenborre   ok
13:42 cait              germany is all marc21 now
13:42 cait              but that's something that is hard to change later
13:42 cait              because the formats are different
13:42 cait              so worth investigating a bit
13:42 markvandenborre   it's not incredibly important
13:42 markvandenborre   very small library
13:42 markvandenborre   2702 books
13:42 cait              also documentation about the format
13:42 markvandenborre   some audio cd's and only a tiny number of dvd's
13:43 cait              both formats are similar and then very different
13:43 markvandenborre   with no big increase on either a close or faraway horizon
13:43 cait              in which fields are used
13:43 markvandenborre   can you point me at more specific docs about this "custom framework" reference you just made
13:43 markvandenborre   "custom framework" is horrible as a web search term
13:43 cait              look in the manual for bibliographic frameworks
13:43 petter            olli: about your socket patch -  did you measure the speedup yourself?
13:43 petter            30-40%
13:44 petter            is it for real?
13:46 markvandenborre   heh, the links to the manual dropped off my screen
13:47 markvandenborre   when I clicked the documentation page
13:47 markvandenborre   so I just now looked at it more closely
13:47 markvandenborre   they were entirely invisible when arriving to http://koha-community.org/documentation/
13:48 markvandenborre   don't know if there's anyone with the ambition to make the docs more prominent, but it would have helped stupid me
13:49 markvandenborre   the manual seems to be quite comprehensive
13:55 markvandenborre   cait: so the idea is I remove all the tags I don't need from a copy of the default framework, correct?
13:55 marcelr           cait: bug 9032
13:55 huginn            Bug http://bugs.koha-community.org/bugzilla3/show_bug.cgi?id=9032 enhancement, P5 - low, ---, m.de.rooy, Signed Off , Share a list
13:56 marcelr           i would have preferred to not include the notices for all the other languages at all :)
13:56 marcelr           but that project got killed rather quickly
13:57 markvandenborre   just answered my own question here after reading the f*cking manual
13:58 marcelr           keep it up, markvandenborre
13:59 markvandenborre   marcelr: I was hoping to find an easy way to delete tens of tags we don't need at once
13:59 markvandenborre   am I overlooking something
13:59 markvandenborre   or is that just not possible?
13:59 marcelr           command line?
14:00 markvandenborre   marcelr: where is this even stored, I presume in the sql db
14:00 marcelr           you mean user tags or marc tags? btw..
14:01 markvandenborre   I have made a copy of the default framework, which I suppose contains only marc tags
14:01 markvandenborre   and I'm deleting all kinds of information from that
14:02 marcelr           ok you mean tags in framework
14:02 markvandenborre   with the goal of creating a very _simple_
14:02 marcelr           you can hide them
14:02 markvandenborre   ah? please tell me more about that?
14:02 marcelr           look for tables marc_tag.. marc_subfield.. etc (please see the docs :)
14:03 cait              markvandenborre: better not delete but hide
14:03 cait              in my experience
14:03 cait              and you can use export/import and edit as spreadsheet
14:04 markvandenborre   ok... so now the question becomes how to "hide"
14:04 cait              export from koha
14:04 * markvandenborre starts looking through the manual
14:04 cait              i think -5 should do it
14:04 cait              in the hidden column
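For the command-line route marcelr mentioned, the same thing straight against the marc_subfield_structure table would look something like this sketch; the framework code and the 9XX tag range are made up for illustration, and -5 is the hidden value cait suggests, so check the manual's hidden-value table before running anything like it:

    use DBI;

    my $dbh = DBI->connect( "DBI:mysql:database=koha", "kohaadmin", "secret",
        { RaiseError => 1 } );

    # hide every 9XX subfield in a hypothetical SHEET framework copy
    $dbh->do(
        "UPDATE marc_subfield_structure SET hidden = -5
          WHERE frameworkcode = ? AND tagfield LIKE '9%'",
        undef, 'SHEET' );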
14:04 markvandenborre   this feels very much like a welcoming irc channel, more so than many others
14:05 marcelr           this feels like a quote :)
14:05 markvandenborre   thank you
14:05 marcelr           you can add quotes via a bot
14:06 markvandenborre   thank you for the info, but I'm more interested in getting this up and running
14:07 markvandenborre   cait: -5? hidden?
14:07 markvandenborre   I have this export open in libreoffice now
14:07 cait              first
14:07 cait              ah
14:07 cait              there is a hidden column
14:07 cait              in the spreadsheet
14:08 markvandenborre   there is no column with the header "hidden" as far as I can see
14:08 markvandenborre   but maybe you mean something more magic?
14:10 markvandenborre   I see column headers tagfield, liblibrarian, libopac, repeatable, mandatory, authorised_value, frameworkcode
14:10 markvandenborre   ah
14:10 markvandenborre   there is more info lower in the csv
14:15 markvandenborre   so for a lot of info, this is clear enough
14:17 markvandenborre   is there some overview page where I can find the meaning of all these unimarc fields?
14:17 markvandenborre   some are quite cryptic
14:17 jcamins           unimarc?
14:17 wahanui           i guess unimarc is http://www.ifla.org/en/publications/unimarc-formats-and-related-documentation
14:18 pastebot          "petter" at 127.0.0.1 pasted "bug 11974 before / after benchmarking" (34 lines) at http://paste.koha-community.org/179
14:18 huginn            Bug http://bugs.koha-community.org/bugzilla3/show_bug.cgi?id=11974 enhancement, P5 - low, ---, gmcharlt, NEW , Enable unix socket connections for database connections.
14:30 petter            kivilahtio: I get a minor speedup from using domain sockets, from a few percent to at best 19% faster
14:30 kivilahtio        petter: how are you testing?
14:30 petter            I'd be happy to sign off
14:30 petter            by the benchmarking scripts in misc/load_testing
14:30 petter            check the linked paste
14:30 petter            http://paste.koha-community.org/179
14:31 nengard           oleonard around?
14:31 oleonard          Yes
14:31 kivilahtio        petter: a sec I almost have the Makefile.PL modified
14:32 nengard           I have a library that wants to hide the guarantor info on the child record add/edit form
14:32 nengard           the borrowerunwanted field isn't working
14:32 nengard           and the section has no id
14:32 nengard           tips on hiding it with jquery?
14:32 marcelr           kivilahtio++
14:32 kivilahtio        it is a pain :)
14:32 kivilahtio        and I can feel the maintenance issues
14:33 petter            hm, now that i look at it actually some benchmarks are slower
14:33 petter            needs more testings
14:33 petter            more benchmarks i mean
14:34 oleonard          nengard: I shouldn't even ask, but... Why do they want to hide it?
14:34 nengard           i don't know the answer to that ... I just do what i'm asked to do :)
14:34 nengard           Probably they don't want to link the child to an adult but do want other child functionality like the j to a cron
14:35 oleonard          Do they think you can't create a child record without going through the "add child" link?
14:35 oleonard          What does it have to do with the J to A cron?
14:36 nengard           wait
14:36 nengard           i had a typo let me see if it works now
14:37 nengard           Yes you can ...
14:37 nengard           create a child without going through that link
14:37 kivilahtio        petter: marcelr: go for it! bug 11974
14:37 nengard           you just go to 'New Patron'
14:37 huginn            Bug http://bugs.koha-community.org/bugzilla3/show_bug.cgi?id=11974 enhancement, P5 - low, ---, gmcharlt, NEW , Enable unix socket connections for database connections.
14:37 nengard           and choose the patron type you want :)
14:37 nengard           and the J to A Cron turns a child to an adult when they reach the right age per the patron category
14:38 marcelr           kivilahtio: i will be looking at it (in some stage of the process)
14:38 marcelr           but not now
14:39 oleonard          nengard: I'll give up trying to understand, but I think their request might indicate a misunderstanding of how Koha works on their part that you might want to investigate.
14:39 oleonard          nengard: $("legend:contains('Guarantor information')").parent().hide();
14:41 petter            olli - how to test the Makefile stuff. I always installed using packages then gitify..
14:41 petter            ?
14:45 kivilahtio        petter: erm
14:45 kivilahtio        petter: go to the folder where your INSTALL.ubuntu is
14:46 kivilahtio        petter: maybe the easiest way is to use those fancy virtual sandboxes you have
14:47 kivilahtio        petter: but from the INSTALL.ubuntu folder, run perl Makefile.PL
14:47 kivilahtio        answer questions
14:47 kivilahtio        make
14:47 kivilahtio        make test
14:47 kivilahtio        nano blib/KOHA_CONF_DIR/koha-conf.xml
14:47 petter            So do the packages use the Makefile in any way?
14:48 kivilahtio        see if mysql_socket is there with the value you gave
14:48 kivilahtio        petter: no idea
14:48 petter            If packages are the recommended & preferred way, who uses the perl Makefile way?
14:48 kivilahtio        but if you have the source from git, you should have Makefile.PL
14:48 petter            of course
14:48 petter            I have it
14:48 kivilahtio        petter: don't ask me
14:48 kivilahtio        petter: if you do crazy stuff like me you will enjoy the Makefile.PL
14:48 petter            haha
14:48 petter            never
14:49 petter            Anyway, I did some more benchmarkings
14:49 kivilahtio        I dont think you need to do make install (to overwrite your configs)
14:49 petter            It seems indeed to be about a 10% speedup on average
14:49 petter            ok
14:49 kivilahtio        just verify that the file there is modified
14:49 petter            sure, got it
14:50 petter            Did you run the benchmarking scripts?
14:50 kivilahtio        i tried, i failed miserably
14:50 kivilahtio        I will write my own
14:50 petter            but its easy
14:50 kivilahtio        I got like 100% error
14:50 kivilahtio        dunno
14:50 kivilahtio        need to take a look there
14:50 kivilahtio        but now i need to configure zebra + mariadb + koha with unix sockets
14:50 petter            cpan install HTTPD::Bench::ApacheBench
14:50 petter            then run the scripts
14:50 kivilahtio        yeah
14:51 kivilahtio        like i said i got some errors
14:51 petter            you should measure!
14:51 kivilahtio        jaja
14:51 petter            if not you dont know if it works
14:51 kivilahtio        could be
15:55 nengard           anyone around to help me find a bug report?
15:55 nengard           I'm looking for a bug report I think we have that talks about the date shown for serials in the opac
15:55 nengard           how it's not the date every library wants to see
16:02 reiveune          bye
16:15 wajasu            nengard: maybe bug 8296
16:15 huginn            Bug http://bugs.koha-community.org/bugzilla3/show_bug.cgi?id=8296 enhancement, P5 - low, ---, julian.maurice, Needs Signoff , Add descriptive (text) published date field for serials
16:56 bag               magnuse: HELLO there
17:09 pianohacker       hello
17:27 mtompset          Greetings, #koha/.
17:30 rhcl              hihi
17:34 magnuse           bag: HI
17:52 Brooke            o/
17:57 francharb         \o
17:57 Brooke            we gots us any big city fancy pants label makin' folkses?
18:04 Brooke            gadzukes there are actual pictures in the label creator :)
18:11 wajasu            should i start testing signoffs with wheezy for master from now on?
18:18 mtompset          wajasu: What do you mean?
18:22 wajasu            i wasn't sure if squeeze is the target debian, or wheezy.  i wondered if i needed to build my test VM with wheezy
18:30 mtompset          I would guess that squeeze is okay until some time in 2015. :)
18:30 mtompset          https://wiki.debian.org/DebianReleases
18:32 alphaman          Anybody have any experience using John Wohler's sip2.class.php? I'm having trouble getting a stable connection to 3.14 with it.
18:51 mtompset          Did you just ask about PHP?
18:51 mtompset          @karma php
18:51 huginn            mtompset: php has neutral karma.
18:52 mtompset          @karma perl
18:52 huginn            mtompset: perl has neutral karma.
18:52 alphaman          yup. sorry...
18:52 alphaman          I was worried I'd get neutralized...
18:52 mtompset          No need to apologize. Everyone does it. They are just ashamed to admit it. ;)
18:53 alphaman          In all fairness, I've been doing Perl far longer than PHP…
18:53 alphaman          sometimes you just gotta dig down in the muck to find a solution
18:55 alphaman          yay, tho I walk through the valley of PHP, I shall fear no evil, for Perl is my shepherd, I shall not stack dump…
18:58 rhcl              alphaman: just out of curiosity are you trying to connect some specific app to Koha?
19:03 alphaman          homegrown. trying to auth patron's cards for our Drupal server so they can access our online resources
19:25 jenkins_koha      Project Koha_Docs_3.12.x build #52: FAILURE in 47 sec: http://jenkins.koha-community.org/job/Koha_Docs_3.12.x/52/
19:26 jenkins_koha      Project Koha_Docs build #458: FAILURE in 18 sec: http://jenkins.koha-community.org/job/Koha_Docs/458/
19:30 jenkins_koha      Project Koha_Docs_3.14.x build #32: FAILURE in 14 sec: http://jenkins.koha-community.org/job/Koha_Docs_3.14.x/32/
19:39 jenkins_koha      Project Koha_Docs_3.12.x build #53: STILL FAILING in 4.5 sec: http://jenkins.koha-community.org/job/Koha_Docs_3.12.x/53/
19:44 jenkins_koha      Project Koha_Docs_3.14.x build #33: STILL FAILING in 4 sec: http://jenkins.koha-community.org/job/Koha_Docs_3.14.x/33/
19:54 jenkins_koha      Project Koha_Docs_3.12.x build #54: STILL FAILING in 5 sec: http://jenkins.koha-community.org/job/Koha_Docs_3.12.x/54/
19:56 jenkins_koha      Project Koha_Docs build #459: STILL FAILING in 4.8 sec: http://jenkins.koha-community.org/job/Koha_Docs/459/
19:59 jenkins_koha      Project Koha_Docs_3.14.x build #34: STILL FAILING in 3.9 sec: http://jenkins.koha-community.org/job/Koha_Docs_3.14.x/34/
20:09 jenkins_koha      Project Koha_Docs_3.12.x build #55: STILL FAILING in 4 sec: http://jenkins.koha-community.org/job/Koha_Docs_3.12.x/55/
20:14 jenkins_koha      Project Koha_Docs_3.14.x build #35: STILL FAILING in 3.6 sec: http://jenkins.koha-community.org/job/Koha_Docs_3.14.x/35/
20:24 nengard           hmm jenkins isn't helping me - he's not telling me why the manual is failing :( 	-- "No problems were identified. If you know why this problem occurred, please add a suitable Cause for it"
20:24 jenkins_koha      Project Koha_Docs_3.12.x build #56: STILL FAILING in 4.8 sec: http://jenkins.koha-community.org/job/Koha_Docs_3.12.x/56/
20:25 nengard           I can't make it stop failing if you don't tell me what I did wrong jenkins!!
20:25 nengard           :(
20:26 jenkins_koha      Project Koha_Docs build #460: STILL FAILING in 5.3 sec: http://jenkins.koha-community.org/job/Koha_Docs/460/
20:27 bag               nengard: can you get to where the docs are?
20:27 mtompset          @later tell papa So tired trying to figure out bug 11213 test plan. I think I finally got it.
20:27 huginn            mtompset: The operation succeeded.
20:27 nengard           bag you mean git? yes
20:27 nengard           usually jenkins emails me though with the exact error
20:27 nengard           he's being a .... not nice bot .... today
20:27 magnuse           it fails awfully fast...
20:28 mtompset          "Cannot allocate memory" That's the problem, I figure.
20:28 mtompset          nengard: Can you reboot the machine it is building on?
20:28 nengard           not sure what machine that is ....
20:28 magnuse           i can see "ERROR: Workspace has a .git repository, but it appears to be corrupt." here: http://jenkins.koha-community.org/job/Koha_Docs/460/console
20:28 nengard           I type my fancy git commands and it magically updates
20:28 nengard           bag is it on our server?
20:29 mtompset          nengard: free -m
20:29 mtompset          How much is left?
20:29 mtompset          When the script runs, it probably is a memory hog.
20:29 bag               mtompset: are you talking about the git server?  or the jenkins server?
20:29 jenkins_koha      Project Koha_Docs_3.14.x build #36: STILL FAILING in 4.4 sec: http://jenkins.koha-community.org/job/Koha_Docs_3.14.x/36/
20:29 magnuse           i also see "Building remotely on Galen in workspace /var/lib/jenkins/workspace/Koha_Docs" - maybe gmcharlt knows something about it?
20:29 mtompset          http://jenkins.koha-community.org/job/Koha_Docs/460/console
20:30 mtompset          ERROR: Workspace has a .git repository, but it appears to be corrupt.
20:30 mtompset          ewww....
20:31 nengard           is it corrupt on my local machine? or is that something that I have no control over?
20:31 nengard           it all seems okay from my end
20:32 magnuse           yeah, looks like a problem on the jenkins server
20:33 nengard           whew
20:33 nengard           so i'm all good :)
20:34 magnuse           from that console output it looks like the git repo on the jenkins server is corrupt, maybe because of some lack of memory (is my guess, based on looking at the error messages)
20:35 bag               nengard what are you doing to gmcharlt's memory :P
20:35 nengard           breaking it
20:39 jenkins_koha      Project Koha_Docs_3.12.x build #57: STILL FAILING in 7.9 sec: http://jenkins.koha-community.org/job/Koha_Docs_3.12.x/57/
20:39 nengard           :P
20:41 jenkins_koha      Project Koha_Docs build #461: STILL FAILING in 6 sec: http://jenkins.koha-community.org/job/Koha_Docs/461/
20:44 jenkins_koha      Project Koha_Docs_3.14.x build #37: STILL FAILING in 4.8 sec: http://jenkins.koha-community.org/job/Koha_Docs_3.14.x/37/
20:47 jenkins_koha      Project Koha_Docs build #462: STILL FAILING in 4.8 sec: http://jenkins.koha-community.org/job/Koha_Docs/462/
20:53 jenkins_koha      Project Koha_Docs build #463: STILL FAILING in 5.6 sec: http://jenkins.koha-community.org/job/Koha_Docs/463/
20:55 rangi             it's not actually corrupt
20:55 rangi             it's just that the box is out of ram
20:55 rangi             not on the jenkins server either, but on galen's node
20:56 mtompset          rangi++ # yay for a diagnosis. :)
20:56 Dyrcona           gmcharlt is giving a presentation. I'll let him know after he's finished.
20:57 * cait            waves
20:58 jenkins_koha      Project Koha_Docs_3.12.x build #58: STILL FAILING in 3.8 sec: http://jenkins.koha-community.org/job/Koha_Docs_3.12.x/58/
20:59 rangi             Dyrcona: thanks, ill disable the node for the time being
21:00 rangi             http://jenkins.koha-community.org/computer/Galen/
21:09 magnuse           so Viktor's public library has been testing koha under plack on a server i set up for them, and it has not blown up while they have been testing the functions they usually use
21:10 magnuse           they are now begging me to let them test koha+plack for one day in real use :-)
21:10 rangi             sweet, we run the opac under plack in production
21:10 rangi             however without fixing the zconn subroutine
21:10 rangi             in C4::Context
21:11 rangi             it isn't safe, it randomly gives no search results (connection has gone away)
21:11 rangi             but if you fix that, it's mostly safe enough (there are more issues but we plan on fixing them soon)
21:15 magnuse           rangi: cool. and "the zconn subroutine" would be fixed by your patch which is floating around somewhere?
21:15 magnuse           if there are known issues it would be really good to have them documented somewhere
21:15 rangi             yep thats the plan
21:16 rangi             but sysprefs is one
21:16 rangi             fixing the memcached config is another
21:16 rangi             we cant be having env variables in apache ..
21:18 eythian           hi
21:18 magnuse           "sysprefs is one" - does that mean there is a problem with sysprefs in general, or that changing them can be troublesome?
21:18 magnuse           hiya eythian
21:22 * cait            waves
21:23 * magnuse         waves to cait
21:23 pianohacker       hi all
21:23 magnuse           hiya pianohacker
21:24 pianohacker       hey magnuse
21:24 pianohacker       how was your flight back?
21:24 magnuse           looong :-)
21:24 magnuse           well, not as long as yours, but...
21:24 cait              hi pianohacker
21:24 cait              :)
21:25 magnuse           we left our apartment at 05:00 am and arrived at home at 20:00 pm
21:25 magnuse           3 flights and lots of waiting
21:25 magnuse           pianohacker: how was yours?
21:25 pianohacker       magnuse: past a certain point, it's just long and exhausting
21:25 magnuse           yeah...
21:26 * cait            nods
21:26 magnuse           leonard is now ~22 months and has flown 22 times
21:26 pianohacker       magnuse: mine was decent. Got patted down twice, really should have shaved, and customs in the US was way understaffed, but everything happened in time :)
21:27 magnuse           that is something :-)
21:27 pianohacker       magnuse: wow. I hope you don't keep that up, that's a lot of time in the air!
21:32 magnuse           pianohacker: it is. huge carbon footprint for a small boy. it will be a lot less once he starts kindergarten and his mother starts working again...
21:33 pianohacker       ahh, yeah... he's coming with you to oslo, I'm guessing?
21:37 magnuse           not so far, but we have talked about spending a couple weeks there before summer
21:37 magnuse           we'll see
21:37 * magnuse         wanders off
21:37 pianohacker       bye
21:41 tcohen            rangi: has anyone profiled whether caching syspref fetching is worth the trouble?
21:41 tcohen            i guess mysql has to be good for caching such trivial queries
21:42 jcamins           tcohen: it is *not*.
21:42 eythian           jcamins: which comment are you refuting?
21:43 jcamins           Oh.
21:43 jcamins           Right.
21:43 tcohen            the second?
21:43 wahanui           the second is, like, by design
21:43 jcamins           tcohen: yes, I've profiled, and it's not worth it.
21:43 jcamins           Maybe with Plack it could be worthwhile, but that's kind of iffy, too.
21:43 jcamins           And the absolute worst thing you can do performance-wise is to preload all the sysprefs.
21:44 bag               right
21:46 jcamins           Well, no, the worst thing you can do performance-wise is preload all the sysprefs and discard them all.
21:46 jcamins           Or, I suppose, try to cache the entire database.
21:47 eythian           yeah, it's worth noting that in many cases deserialisation can be pretty expensive.
21:48 jcamins           It's better if you're using JSON::XS, but it's still bad.
21:48 eythian           I found a case a little while ago where the memoisation of a function made it slower.
21:49 eythian           (because it was doing its own internal caching anyway, which has its own problems. But someone had added the memoisation without benchmarking before/after.)
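eythian's point in miniature: wrapping Memoize around something that already keeps its own cache just adds a second result store and argument-stringification cost in front of it. A generic sketch, not the actual Koha function he is referring to:

    use Memoize;

    my %cache;   # the function's own internal cache
    sub lookup {
        my ($key) = @_;
        return $cache{$key} //= _expensive_fetch($key);
    }

    # memoize() now keeps a second copy of every result and normalises
    # the arguments on every call, on top of the %cache hit above
    memoize('lookup');

    sub _expensive_fetch { my ($key) = @_; return "value for $key" }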
21:49 wajasu            could it be worth using the memcached option for zebra queries?  and possibly namedResults for paging
21:49 eythian           zebra is not something that is slow.
21:49 eythian           you do a search and profile it, zebra doesn't really show up at all.
21:49 tcohen            fetching the frameworks is
21:49 bag               it's just a pain :P
21:49 eythian           frameworks are slow.
21:49 wajasu            premature optimization then
21:50 eythian           they should be handled in a better way.
21:50 tcohen            question: would it be a bad thing to force a collation for SQL queries on the Perl side?
21:51 wajasu            what if the mapping framework could be generated as an XSLT transform that zebra uses and maybe we return MARC as JSON
21:51 wajasu            just a wild idea
21:51 eythian           nothing about that sounds like it's likely to be an improvement :)
21:51 tcohen            MiJ++
21:52 jcamins           Well, MiJ is faster than Marcxml.
21:52 tcohen            i was looking for a way to have dbic tell me the attributes of a db connection (character set, collation)
21:52 eythian           http://debian.koha-community.org/~robin/opac-search-cached/nytprof/ <-- wajasu, start here.
21:53 tcohen            and found that we could actually force those from the code
21:53 jcamins           (n.b. this was not a vote for making a change like that, just observing that MiJ is faster)
21:53 eythian           http://debian.koha-community.org/~robin/opac-search/ <-- or here for a version with memcache off.
21:54 rangi             the thing for remembering with caching
21:54 rangi             its not for speed
21:54 rangi             its for scalability
21:54 rangi             memcached scales more easily than mysql
21:55 wajasu            i wish i could get the biblionumbers for a resultset from zebra without having to process the marcxml chunk.  then i could write batched queries against the db to determine if an item is hidden.
21:56 tcohen            you can do it
21:56 tcohen            jcamins knows how :-P
21:58 tcohen            found the bottleneck?
21:57 eythian           yes
21:58 tcohen            found the bottelneck?
21:58 pianohacker       rangi: that reminds me, did you get a chance to look at the Koha::Database slowness?
21:58 rangi             nope, ive been buried in ncip
21:59 pianohacker       that's a heavy, smelly pile to be buried in good sir
21:59 eythian           tcohen: it's mentioned on bug 11051
21:59 huginn            Bug http://bugs.koha-community.org/bugzilla3/show_bug.cgi?id=11051 enhancement, P5 - low, ---, robin, Signed Off , Performance of opac-search
22:01 wajasu            by the way, is it better to run C4::Context::preference once at the top of the script instead of in the loops later when deciding to process?
22:02 eythian           you should avoid calling functions in loops anyway, just to be safe.
22:02 eythian           (in general, anyway.)
22:03 eythian           (often it's not optional, but if you can readily store something in your own namespace, it's probably good to.)
22:03 eythian           but again, premature optimisation etc etc.
22:03 eythian           so do what looks best and then see.
22:03 eythian           if your loop happens 5 times, no one cares. If it happens 5,000 times, then maybe worth refactoring.
22:03 eythian           your profiler will let you know.
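A small sketch of the syspref case wajasu asked about; the loop body, @items and process_item() are placeholders, while C4::Context->preference is the real call:

    use C4::Context;

    # read the syspref once, outside the loop...
    my $hide_lost = C4::Context->preference('hidelostitems');

    for my $item (@items) {
        # ...instead of calling C4::Context->preference() on every pass
        next if $hide_lost && $item->{itemlost};
        process_item($item);
    }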
22:05 wajasu            is opac-search really geared toward returning 1000 results? since the zebra sort seems like it's configured for that many results.
22:05 eythian           I don't really know, I've not looked at that sort of thing very deeply.
22:07 wajasu            i know we get the hits, then page around 20 per page, so we grab 20 marcxmls to present.
22:07 tcohen            actually 1000 might be a low value
22:08 wajasu            when the build-facets code runs it also has a max-records-to-process syspref, and that constrains the amount of marcxml and then the mysql db items retrieved to calculate facet counters.
22:09 tcohen            facets aren't even facets
22:09 * tcohen          hides
22:09 wajasu            true
22:10 wajasu            but we must process all the items for the biblionumbers for a given result set.
22:11 tcohen            context?
22:11 wahanui           i think context is everything?
22:12 wajasu            i'm wondering if i can just generate chunked queries supplying a subset of the biblionumbers, that use the where clause to match the OpacHiddenItems rules.
22:12 tcohen            ask zebra for several biblionumbers for using them?
22:13 bag               @quote random
22:13 huginn            bag: Quote #175: "*oleonard is waiting for a good scientist -> hulk patron category transition script" (added by wizzyrea at 03:44 PM, January 04, 2012)
22:13 wahanui           i already had it that way, huginn.
22:14 wajasu            got to run.
23:06 tcohen            [off] f*k indexdata
23:20 dcook             [off] well said, tcohen
23:33 pianohacker       amen
23:47 gmcharlt          rangi: jenkins slave process got bloated; I've killed it and put it back in the cluster
23:52 jenkins_koha      Starting build #38 for job Koha_Docs_3.14.x (previous build: STILL FAILING -- last SUCCESS #31 1 day 10 hr ago)
23:52 jenkins_koha      Yippee, build fixed!
23:52 wahanui           o/ '`'`'`'`'`'`'`'`'`
23:52 jenkins_koha      Project Koha_Docs_3.14.x build #38: FIXED in 5 min 30 sec: http://jenkins.koha-community.org/job/Koha_Docs_3.14.x/38/
23:52 jenkins_koha      Nicole C. Engard: update local cover images info
23:52 tcohen            @wunder cordoba, argentina
23:52 huginn            tcohen: The current temperature in Bo Alto de San Martin, Cordoba City, Argentina is 20.6°C (8:50 PM ART on March 20, 2014). Conditions: Clear. Humidity: 81%. Dew Point: 17.0°C. Pressure: 29.71 in 1006 hPa (Steady).
23:56 dcook             tcohen: How long until you head to London? :)
23:56 dcook             @wunder london, england
23:56 huginn            dcook: The current temperature in London, United Kingdom is 8.0°C (11:50 PM GMT on March 20, 2014). Conditions: Clear. Humidity: 93%. Dew Point: 7.0°C. Windchill: 6.0°C. Pressure: 29.68 in 1005 hPa (Steady).
23:56 pianohacker       bye
23:56 dcook             ta, pianohacker
23:56 pianohacker       cya dcook
23:59 jenkins_koha      Starting build #59 for job Koha_Docs_3.12.x (previous build: STILL FAILING -- last SUCCESS #51 1 day 10 hr ago)
23:59 jenkins_koha      Yippee, build fixed!
23:59 wahanui           o/ '`'`'`'`'`'`'`'`'`
23:59 jenkins_koha      Project Koha_Docs_3.12.x build #59: FIXED in 1 min 55 sec: http://jenkins.koha-community.org/job/Koha_Docs_3.12.x/59/
23:59 jenkins_koha      Nicole C. Engard: update local cover images info