Time  Nick          Message
00:10 wizzyrea      unfair weather.
00:10 wizzyrea      first day back to work with brilliant sunshine after two wet days, one cold.
00:11 wizzyrea      three wet, one cold, I mean.
00:11 rangi         yeah
00:24 ibeardslee    go home, 'sick .. of work'
00:25 * Francesca   waves
00:25 rangi         hi Francesca
00:25 Francesca     sup
00:26 wizzyrea      ibeardslee: yeah nah, that can't happen
04:27 * dcook       waves
04:28 dcook         Don't suppose rangi or gmcharlt are around at the moment regarding a Debian package question?
04:28 dcook         Or does anyone recall if we're still supporting Debian Squeeze?
04:29 dcook         I suppose http://wiki.koha-community.org/wiki/Koha_on_Debian doesn't specify Squeeze anymore..
07:00 Joubu         hi
07:32 * magnuse     waves
07:39 * Francesca   waves
07:43 Joubu         @later rangi there is no way to disable PR on github
07:43 huginn        Joubu: I've exhausted my database of quotes
07:43 Francesca     lol
07:45 eythian       dcook: wiki page history will tell you when it was removed 🙂
07:46 Joubu         @later tell rangi there is no way to disable PR on github
07:46 huginn        Joubu: The operation succeeded.
07:46 Joubu         @later tell rangi I have updated the description of the project with "Note: This project uses its own bug tracker, see http://bugs.koha-community.org to report a bug or submit a patch." Let me know if you prefer a better wording
07:46 huginn        Joubu: The operation succeeded.
07:46 reiveune      hello
07:47 Francesca     hi
07:47 wahanui       privet, Francesca
07:49 magnuse       Joubu++
07:57 alex_a        bonjour
07:57 wahanui       hola, alex_a
08:00 magnuse       @wunder boo
08:00 huginn        magnuse: The current temperature in Bodo Vi, Norway is -6.0°C (4:00 AM CET on January 05, 2016). Conditions: Partly Cloudy. Humidity: 33%. Dew Point: -16.0°C. Pressure: 29.99 in 1016 hPa (Falling).
08:02 rangi         Joubu++
08:02 rangi         thanks
08:02 rangi         at some point i should grab those PR and make them into patches
08:04 Joubu         At least 1 is important (it fixes a syntax issue in an sql file), the last one
08:04 Joubu         I have closed the PR, asking to open a bug report on bz
08:10 gaetan_B      hello
08:10 Joubu         @later tell khall How did you generate the schema? I get an error "make_schema_at(): Checksum mismatch in './/Koha/Schema/Result/Borrower.pm', the auto-generated part of the file has been modified outside of this loader"
08:10 huginn        Joubu: The operation succeeded.
08:12 Joubu         @later tell khall certainly comes from 017f62ea3752a459a1f5cafecae85e9fb5bfbdd1
08:12 huginn        Joubu: The operation succeeded.
08:17 magnuse       kia ora slef!
08:22 mveron        Good morning / daytime #koha
08:22 * mveron      waves to rangi - thanks for your blog!
08:22 magnuse       hiya mveron
08:23 * cait        waves
08:23 rangi         no problem, thanks for all your hard work
08:23 cait          rangi++ mveron++ :)
08:23 mveron        hiya magnuse, cait, Joubu everybody...
08:24 Joubu         hi mveron
08:24 cait          and a happy new year!
08:24 mveron        cait: likewise, a happy new year!
08:24 Joubu         mveron: what is your solution on bug 15462
08:24 huginn        Bug http://bugs.koha-community.org/bugzilla3/show_bug.cgi?id=15462 critical, P5 - low, ---, jonathan.druart, Needs Signoff , Unable to renew books via circ/renew.pl
08:24 Joubu         ?
08:24 Joubu         This is a tricky one
08:25 mveron        Joubu: Did you try my patch?
08:25 Joubu         I haven't seen it yet
08:25 Joubu         well yes
08:25 Joubu         that was the naive approach
08:25 Joubu         and it could work
08:25 Joubu         but I was not sure enough yet
08:25 Joubu         did you catch the reason (the unique key on itemnumber)?
08:27 mveron        Joubu: If my patch is wrong or too naive, feel free to obsolete it.
08:29 Joubu         item->issues was (automatically) renamed to item->issue after we added the unique key constraint on item.itemnumber
08:29 Joubu         I am not sure I understand why, but when this unique key was added, the has_many rs was replaced with a might_have
08:30 Joubu         and the plural of issue is gone
08:30 Joubu         so that would mean we can only access 1 issue for a given item
08:30 drojf         morning #koha
08:30 Joubu         which sounds odd
08:31 Joubu         hi drojf :)
08:31 mveron        Hi drojf :-)
08:31 cait          not sure i can follow the 2 of you :)
08:31 cait          i think 1:1 between items and issues seems correct, 1:n for old_issues?
08:32 drojf         jo Joubu, mveron and cait :)
08:32 mveron        cait: I fixed soething thet seems to be a symptom of some bigger issue.
08:32 mveron        smoething
08:32 drojf         huh, i meant hi. but jo works too
08:32 drojf         :D
08:32 mveron        something (should put my glasses)
08:32 Joubu         yes it only refers issues of course
08:32 drojf         @wunder berlin, germany
08:32 huginn        drojf: The current temperature in Berlin Tegel, Germany is -9.0°C (9:20 AM CET on January 05, 2016). Conditions: Light Snow. Humidity: 79%. Dew Point: -12.0°C. Windchill: -15.0°C. Pressure: 29.53 in 1000 hPa (Rising).
08:32 drojf         eeek
08:32 mveron        @wunder Allschwil
08:32 huginn        mveron: The current temperature in Grenchen, Switzerland is 4.0°C (9:20 AM CET on January 05, 2016). Conditions: Light Rain Showers. Humidity: 100%. Dew Point: 4.0°C. Pressure: 29.44 in 997 hPa (Rising).
08:32 Joubu         there is no fk with old_issues
08:32 cait          @wunder Konstanz
08:32 huginn        cait: The current temperature in Saint Gallen-Altenrhein, Germany is 5.0°C (9:20 AM CET on January 05, 2016). Conditions: Light Rain Showers. Humidity: 87%. Dew Point: 3.0°C. Windchill: 4.0°C. Pressure: 29.47 in 998 hPa (Rising).
08:33 magnuse       @wunder boo
08:33 huginn        magnuse: The current temperature in Bodo, Norway is -6.0°C (9:20 AM CET on January 05, 2016). Conditions: Clear. Humidity: 42%. Dew Point: -17.0°C. Windchill: -15.0°C. Pressure: 29.98 in 1015 hPa (Steady).
08:33 liw           @wunder Espoo, Finland
08:33 huginn        liw: The current temperature in Viherlaakso, Espoo, Finland is -19.8°C (10:29 AM EET on January 05, 2016). Conditions: . Humidity: 90%. Dew Point: -21.0°C. Windchill: -20.0°C. Pressure: 29.68 in 1005.0 hPa (Rising).
08:33 cait          it picks the funniest places... all but Konstanz
08:33 * Joubu       is trying something
08:33 cait          no rain here, but cloudy and rather warm for winter
08:33 mveron        brb
08:33 magnuse       liw wins
08:41 rangi         @later tell tcohen can you give me ssh access to the debian_7 jenkins node, I
08:41 huginn        rangi: The operation succeeded.
08:42 rangi         @later tell tcohen I'd like to see if I can figure out why some of the tests are failing
08:42 huginn        rangi: The operation succeeded.
08:45 rangi         @later tell tcohen or can you install URL::Encode  that will fix 2
08:45 huginn        rangi: The operation succeeded.
08:50 ashimema      morning
08:51 lds           hello
08:51 cait          morning ashimema and lds
08:51 sophie_m      hello cait and #koha
08:51 mveron        hi ashimema and lds and sophie_m
08:57 LibraryClaire morning #koha :)
09:01 mveron        morning LibraryClaire
09:02 mveron        Joubu: Could your findings on http://bugs.koha-community.org/bugzilla3/show_bug.cgi?id=15462#c4 have impact at other places?
09:02 huginn        Bug 15462: critical, P5 - low, ---, jonathan.druart, Signed Off , Unable to renew books via circ/renew.pl
09:02 ashimema      morning LibraryClaire.. and all others I missed since last saying good morning ;)
09:02 * ashimema    needs tea
09:03 * mveron      fetches some coffee...
09:04 LibraryClaire hi ashimema. mveron :)
09:05 mveron        brb
09:09 Joubu         mveron-away: no, seems to only impact this script
09:09 mveron        OK, glad to hear :-)
09:09 mveron        Joubu++
09:10 drojf1        mveron: does fetching coffee work with git?
09:11 mveron        drojf1: I did not try yet...  :-)
09:12 ashimema      Morning Joubu
09:16 Joubu         hi ashimema
09:16 cait          git fetch coffee? :)
09:19 ashimema      'git worktree coffee_break'
09:19 ashimema      :)
09:19 drojf1        :)
09:20 drojf         i should fetch the irc password from the old notebook …
09:22 ashimema      lol
09:26 * mveron      thinks about a coffee enhancement for Koha, maybe cron job driven (every 30 minutes a coffee?)
09:26 cait          hm maybe we need a sensor too
09:27 drojf         in koha-create, we have a lot of sed -e "s/__PLACEHOLDER__/$replacement/g" … but for UPLOAD_PATH it is sed -e "s#_UPLOAD_PATH__#UPLOAD_PATH#g" … is the # some magical sed variant or an error?
09:27 cait          to check caffeine levels
09:27 cait          so the coffee flow can be customized :)
09:27 liw           drojf, sed allows any character to be used as the delimiter; / is just the usual one, but since a path is likely to contain / that expression uses # instead
09:27 liw           drojf, you can even use a space character, I think
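liw's point can be checked quickly; a minimal sketch (the placeholder and the replacement path here are made up for illustration, not the actual koha-create values):

```shell
# sed accepts (almost) any character as the s-command delimiter; these two
# substitutions are equivalent. The # form avoids escaping when the
# replacement contains slashes.
printf '%s\n' 'upload_path = __UPLOAD_PATH__' \
  | sed -e 's#__UPLOAD_PATH__#/var/lib/koha/uploads#g'

# With / as the delimiter the same substitution needs escaped slashes:
printf '%s\n' 'upload_path = __UPLOAD_PATH__' \
  | sed -e 's/__UPLOAD_PATH__/\/var\/lib\/koha\/uploads/g'

# both print: upload_path = /var/lib/koha/uploads
```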
09:28 drojf         ah, yes. now that you say that, i remember i have learned that before. and forgot. i'm old :/
09:28 drojf         liw++
09:29 drojf         i probably asked that for the exact same line when i tested the patch…
09:29 * drojf       hides in shame
09:29 liw           the good thing about getting old and forgetful is that you get to re-visit all manner of wonderful things as if they were new
09:29 drojf         heh
09:30 liw           and also you get to show off your knowledge of the esoteric things you didn't forget, of course
09:30 drojf         you mean the stuff everyone else has forgotten because it's obsolete? :)
09:31 drojf         now i forgot what led me to the question in the first place. i'll try the coffee thing
09:33 Joubu         @later tell khall please regenerate the Schema files, some tests are failing because of bug 13624
09:33 huginn        Joubu: The operation succeeded.
09:39 nlegrand      hey #koha
09:45 sophie_m      hi rangi, would it be possible to add v3.18.13 on koha? It seems that Liz is on holiday and I need it to upgrade a client. Or can someone else do that?
09:46 sophie_m      add the tag I mean
10:01 Joubu         Does anyone understand this test?
10:01 Joubu         153 warning_is { BuildSummary($marc21_subdiv, 99999, 'GEN_SUBDIV') } [],
10:01 Joubu         154     'BuildSummary does not generate warning if main heading subfield not present';
10:02 Joubu         gmcharlt maybe?
10:04 nlegrand      I'm looking at translation and I'm a bit lost, how are the .po files generated?
10:06 magnuse       nlegrand: by misc/translator/translate
10:08 nlegrand      magnuse: ha! thanks, now I know where to start ^^
10:08 drojf         @later tell marcelr would you agree that having the upload folder in a separate backup file would be a good idea? right now (i think) it ends up in the configs.tar.gz when it is in the standard path, or nowhere if it is somewhere else (package installation)
10:08 huginn        drojf: The operation succeeded.
10:38 nlegrand      I've been messing a bit with holds lately, for our personal use, not sure it will be useful for others, and I didn't want to interfere with bug 5609
10:38 huginn        Bug http://bugs.koha-community.org/bugzilla3/show_bug.cgi?id=5609 enhancement, P1 - high, ---, koha-bugs, NEW , Holds Rewrite
10:39 nlegrand      I just wanted to know, what's the status of this bug? Is it currently worked on? I guess I will put my nose on it when something comes up since we will soon use holds to ask for books in stacks :)
10:42 cait          I think it's an old omnibus bug
10:42 magnuse       and all the bugs it depends on seem to be fixed
10:42 cait          the depends on seem all resolved by now
10:43 cait          I think it would probably be ok to even close it and then have a new one if someone wants to tackle a bigger holds rewrite again
10:43 magnuse       someone from bywater should say if there is more that will be done, perhaps?
10:44 cait          @later tell khall is 5609 currently being worked on or could it be closed?
10:44 huginn        cait: The operation succeeded.
10:44 magnuse       or just close it and say "looks like all the pieces have been done, please reopen if that is not the case"
10:44 nlegrand      ha right!
10:46 nlegrand      pff I should ask things more often, you answer everything nicely :)
10:46 nlegrand      thanks ^^
11:18 drojf         is there no direct link to upload.pl in tools? how are people supposed to set files nonpublic? i don't see that option if i use the upload.pl plugin from cataloguing
12:17 drojf         for the record, it's in bug 14686
12:17 huginn        Bug http://bugs.koha-community.org/bugzilla3/show_bug.cgi?id=14686 enhancement, P5 - low, ---, m.de.rooy, NEW , New menu option and permission for file uploading
12:19 tcohen        mornin
12:19 tcohen        g
12:21 cait          hi tcohen and a happy new year :)
12:24 tcohen        hi cait!
12:24 tcohen        @later tell rangi you already have access to the server. Remember you need to jump from the master jenkins server
12:24 huginn        tcohen: The operation succeeded.
12:48 oleonard      Hi #koha
12:49 cait          morning oleonard :)
12:49 LibraryClaire hi oleonard
13:33 oleonard      @wunder 45701
13:33 huginn        oleonard: The current temperature in Heatherstone, Athens, Ohio is -10.4°C (8:33 AM EST on January 05, 2016). Conditions: Clear. Humidity: 63%. Dew Point: -16.0°C. Windchill: -10.0°C. Pressure: 30.63 in 1037 hPa (Steady).
13:35 oleonard      I wonder if we could work github pull requests into our workflow somehow. It would be great to not have to put off people's offer of help.
13:35 cait          true
13:36 cait          but not sure how
13:36 oleonard      I don't even know what one does with a pull request, so I'm no help there
13:40 Joubu         nothing directly I suppose, or you can write a hook to open a bug report on bugs.k-c.org, then fill it and attach the patch
13:40 Joubu         s/you/we :)
13:51 cait          oleonard: normally you merge them into your codebase - with a click on a button.. i didn't really get further than that so far :)
14:27 cait          rainbow outside my window :)
14:55 ashimema      pull requests are lovely..
14:56 ashimema      shame we won't ever use them
14:57 ashimema      They basically do the 'report bug -> fix bug -> submit to community -> feedback on submission -> update submission -> merge to community' in one nice streamlined interface
14:57 * ashimema    uses github for pretty much every project except koha
14:59 ashimema      I did add bug 15465 this morning in relation to github as it happens though
14:59 huginn        Bug http://bugs.koha-community.org/bugzilla3/show_bug.cgi?id=15465 enhancement, P5 - low, ---, nengard, NEW , README for github
14:59 ashimema      we are putting people off with our workflow me thinks
14:59 Joubu         Is there a way to have both?
14:59 Joubu         github and bz
14:59 ashimema      erm..
15:00 ashimema      we basically have bugzilla 'cause we wanted to host it ourselves
15:00 Joubu         (There were 6 PR in 1 year on github)
15:00 ashimema      which makes sense
15:00 ashimema      indeed.. there aren't many because the vast majority of us know that github is not our primary place to work
15:01 ashimema      it's only people entirely new to the project that tend to submit there
15:02 ashimema      hence my thoughts on having a README.md file especially for github pointing them in the right direction before they even try
15:25 cait          have to run (bus) bye all
16:07 eythian       @wunder ams
16:08 huginn        eythian: The current temperature in Schiphol, Badhoevedorp, Netherlands is 7.2°C (5:07 PM CET on January 05, 2016). Conditions: Overcast. Humidity: 90%. Dew Point: 6.0°C. Windchill: 7.0°C. Pressure: 29.27 in 991 hPa (Steady).
16:12 drojf         @wunder berlin, germany
16:12 huginn        drojf: The current temperature in Holzdorf, Germany is -7.0°C (5:00 PM CET on January 05, 2016). Conditions: Light Snow Grains. Humidity: 86%. Dew Point: -9.0°C. Windchill: -12.0°C. Pressure: 29.53 in 1000 hPa (Rising).
16:12 drojf         getting warmer ;)
16:13 reiveune      bye
16:53 gaetan_B      bye
17:55 mveron        bye
18:30 Nemo_bis      Hi
18:46 rangi         morning
18:46 rangi         @later tell tcohen thanks
18:46 huginn        rangi: The operation succeeded.
20:26 rangi         @later tell tcohen i don't suppose you could gpg encrypt and email the passphrase for the jenkins ssh key, I have totally forgotten/mislaid it
20:26 huginn        rangi: The operation succeeded.
20:40 rangi         @later tell fredericd I have changed cf_release_notes to be a textarea now, on bugzilla
20:40 huginn        rangi: The operation succeeded.
21:20 cait          hi #koha :)
21:20 wizzyrea      hi cait
21:21 wizzyrea      https://github.com/mozilla/autolander
21:21 wizzyrea      ^ could be useful
21:21 wizzyrea      for github/bugzilla
21:22 rangi         hmm
21:23 rangi         that might encourage people to use github, which busts all our signoff/qa workflow
21:23 rangi         if you could switch a repo on github readonly
21:24 rangi         and just use it as a backup
21:24 rangi         or just get rid of it ...
21:24 wizzyrea      yeah the bad thing is that people kind of know how to use github, or it's an easy access point
21:25 wizzyrea      your points are valid though, it would break our workflow as it exists now
21:25 rangi         if someone wanted to take over it, and submit their pull requests as patches
21:25 wizzyrea      that's kind of what I was thinking this tool might do
21:25 rangi         seems mostly the other way
21:26 rangi         ie pulls from bugzilla
21:27 rangi         but if we could disable that bit, and just make pull requests create bugs/attachments
21:27 rangi         that might be useful
21:27 wizzyrea      yeah, that's the only part I was interested in
21:28 wizzyrea      if it could create bugs from incoming pull requests and immediately close them
21:28 wizzyrea      that would be ace
21:28 rangi         yup
21:29 rangi         actually i could script something like that
21:29 rangi         it just needs to be run as a cron, fetch the pull requests from github
21:29 wizzyrea      I saw a thing yesterday that could automatically close pull requests
21:29 rangi         make bugs
21:29 rangi         theres cpan libs for most of that
21:29 rangi         ill have a look at the weekend
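rangi's cron idea could be sketched roughly as below - in shell rather than the cpan libs he mentions, so treat it as a sketch only. The GitHub endpoint shown in the comment is real; the canned JSON, the grep/sed field extraction, and the bugzilla step (left as a stub) are assumptions for illustration:

```shell
#!/bin/sh
# Sketch: fetch open pull requests and turn each into a would-be bug report.
# Real endpoint: GET https://api.github.com/repos/Koha-Community/Koha/pulls
# For this sketch we parse a canned response instead of hitting the network:
response='[{"number": 42, "title": "Fix syntax error in sql file"}]'

# Crude JSON field extraction with grep/sed (jq would be the robust choice):
titles=$(printf '%s' "$response" \
  | grep -o '"title": "[^"]*"' \
  | sed 's/^"title": "\(.*\)"$/\1/')

printf '%s\n' "$titles" | while read -r title; do
  echo "would file bug: GitHub PR - $title"
  # stub: POST to bugs.koha-community.org here, then close the PR with a
  # comment carrying the new bug number so there is a record
done
```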
21:30 wizzyrea      hm, it'd have to check to see if there was an existing bug
21:30 rangi         yep
21:30 rangi         it probably cant do that
21:30 rangi         not in any reliable way
21:30 rangi         that pretty much takes a human
21:31 rangi         because english sux
21:31 wizzyrea      :)
21:31 wizzyrea      it might be better if we started omitting the "bug" part of the commit messages
21:31 rangi         hmm?
21:31 wizzyrea      so just "xxxx - description of my very annoying bug"
21:32 wizzyrea      instead of "bug xxxx - description of my very annoying bug"
21:32 rangi         just the number?
21:32 wizzyrea      yeah
21:32 rangi         how would that help?
21:32 wizzyrea      dunno, fewer patterns to worry about?
21:32 rangi         if there's a number in the string thats easy to parse
21:33 rangi         its more that i highly doubt there will ever be a bug number
21:33 rangi         in the commit
21:33 rangi         just a string of text
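For commit subjects that do carry a number, the parse rangi calls easy really is a one-liner; a sketch using a subject borrowed from bug 15462 earlier in this log:

```shell
# Pull the leading bug number out of a conventional commit subject.
subject='Bug 15462 - Unable to renew books via circ/renew.pl'
bugno=$(printf '%s' "$subject" | grep -oiE '^bug [0-9]+' | grep -oE '[0-9]+')
echo "$bugno"   # 15462

# wizzyrea's bare variant ("xxxx - description") still parses:
printf '%s' '15462 - description of my very annoying bug' | grep -oE '^[0-9]+'   # 15462
```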
21:33 wizzyrea      oh from github
21:33 rangi         so we'd pretty much have to make a new bug for every pull request
21:33 wizzyrea      yeah, I was just thinking about that
21:33 rangi         but if we tag them
21:33 wizzyrea      bit of chicken and egg there
21:34 wizzyrea      maybe when you close add the bug number to the close message
21:34 wizzyrea      so there's a record
21:35 rangi         yep
21:35 rangi         also if we mark them as created from github in bugs
21:35 rangi         then a human could eyeball and resolve duplicate etc
21:37 wizzyrea      https://github.com/gera/gitzilla there's this one too
21:39 rangi         ta
21:40 geek_cl       hi guys, what about this circulation.pl run time? top pastebin: http://pastebin.com/fsuPFSfM
21:40 geek_cl       3 hours and counting
21:45 geek_cl       is that normal?
21:45 rangi         of course not :)
21:45 rangi         whats the load on that machine?
21:45 geek_cl       on the roof
21:45 rangi         theres your problem then
21:45 geek_cl       all cpu (8) up to 100%
21:45 rangi         you might want to wind down the max number of connections that apache is letting through
21:46 rangi         also, is it swapping?
21:46 geek_cl       swapping a little
21:46 geek_cl       3 MiB
21:47 rangi         yeah, it just looks like that machine is trying to do too much, and more and more is piling up
21:47 rangi         is the db on the same machine?
21:47 geek_cl       nop
21:48 rangi         i'd check it as well
21:48 geek_cl       the db server is a remote machine
21:48 rangi         also the connection between the two
21:48 geek_cl       i already checked the db server and it is OK
21:48 rangi         whats your max connection for apache?
21:48 geek_cl       let me check
21:49 rangi         but yeah 2-4 seconds
21:49 rangi         is what circulation.pl should take
21:49 rangi         if it's taking longer than that, something is going wrong
21:49 rangi         what does show processlist
21:49 dcook         Wasn't there a bug at one stage where circulation was taking a long time?
21:49 rangi         on the mysql server tell you
21:50 rangi         not that long
21:50 wizzyrea      yeah not that long
21:50 dcook         Ahh
21:50 * dcook       retreats back into the shadows
21:50 rangi         not hours ;)
21:50 geek_cl       rangi, http://pastebin.com/T1WNz2dH
21:51 geek_cl       i will check processlist
21:51 rangi         dcook: it does take longer than it should, theres quite a few wins to be made, especially if more of it is done using the REST api
21:51 dcook         rangi: Ah, I was reading that as seconds rather than minutes. Yikes.
21:51 geek_cl       rangi, too many Sleep processes
21:52 rangi         i think what i would do is restart apache
21:52 geek_cl       7 , like circulation
21:52 geek_cl       ok
21:52 bag           is it all from the same IP?
21:52 rangi         and then watch the log
21:52 geek_cl       rangi, what about max conns of apache
21:52 rangi         you have 8 cores eh? and how much ram?
21:53 geek_cl       RAM 8 GiB
21:53 rangi         yeah, if you aren't OOMing .. you can probably leave that
21:53 geek_cl       after apache2 restart, the CPUs breathe again
21:53 rangi         something else is hanging those circs
21:54 rangi         yeah id tail the access log for a while .. and see if you can spot where it comes from if it happens again
21:54 bag           find the computer that the traffic is coming from and clear the cache on that computer
21:54 bag           (that’s worked for us)
21:54 * geek_cl     nice
21:54 geek_cl       let me check
21:57 geek_cl       i found an internal (customer) IT guy hitting circulation.pl
22:00 geek_cl       ohh. no no, it's the IP of the gateway, koha is behind a NAT
22:01 nuentoter     hello everyone
22:02 wizzyrea      hi
22:05 nuentoter     have a question, finally was able to get things imported into koha from our old system. If i go under tools>inventory I see my records and if i click them they all have items
22:05 nuentoter     i cannot search anything
22:05 rangi         how did you install koha?
22:05 nuentoter     figured zebra needed a restart
22:05 nuentoter     what do you mean how did i install koha?
22:06 rangi         did you install it from packages
22:06 rangi         or from the tarball
22:06 nuentoter     im running it on a virtualbox using debian7 installed through packages
22:06 rangi         right
22:06 rangi         so the thing you want to make sure you are doing is using the package commands, not by hand
22:07 rangi         so to restart zebrasrv it would be
22:07 rangi         sudo service koha-common restart
22:07 rangi         and to rebuild your index
22:07 rangi         sudo koha-rebuild-zebra -v -f <instance name goes here>
22:07 rangi         eg
22:07 rangi         sudo koha-rebuild-zebra -v -f mylibrary
22:08 geek_cl       thanks rangi ;)
22:08 rangi         a lot of problems are caused by people running commands as root, like rebuild_zebra.pl .. and messing up the permissions, and then the cronjobs cant run
22:08 nuentoter     thats what i did
22:08 nuentoter     oops
22:09 eythian       wahanui: zebra troubleshooting
22:09 wahanui       zebra troubleshooting is see [understanding zebra indexing] and [yaz client]
22:09 rangi         yeah, so now the zebrasrv cant actually read those indexes
22:09 rangi         because root owns them, and it runs as koha-instancename
22:10 rangi         so you will want to fix those permissions
22:10 rangi         if you do a ls -l /var/lib/koha
22:10 rangi         what is in there?
22:10 wahanui       i heard in there was a css dir
22:10 nuentoter     i was logged in to root for an unrelated reason, while still logged in as root i sent the rebuild_zebra.pl command
22:11 rangi         yeah, dont ever do that :-)
22:11 rangi         koha-rebuild-zebra is what you want to use, but we can fix it, if you fix the permissions
22:12 rangi         in /var/lib/koha there should be a dir owned by instance-koha:instance-koha
22:12 rangi         eg
22:12 rangi         drwxr-xr-x 6 catalyst-koha     catalyst-koha     4096 Dec 14 12:09 catalyst
22:12 nuentoter     total 4     drwxr-xr-x 5 abelj-koha abelj-koha 4096 Nov  5 20:34 abelj
22:12 rangi         cool, and if you ls -l inside there?
22:13 rangi         its probably easiest just do
22:14 rangi         sudo chown -R abelj-koha:abelj-koha /var/lib/koha/abelj
22:14 rangi         to make sure
22:14 rangi         then sudo service koha-common restart
22:14 rangi         to restart the zebrasrv
22:14 rangi         and see if that has got your searching back
22:14 rangi         you might want to do a
22:15 rangi         sudo koha-rebuild-zebra -f -v abelj
22:15 rangi         as well, just for good measure
22:20 nuentoter     yay biblio export runnin looks good so far
22:21 nuentoter     and search works again
22:21 nuentoter     TY rangi!!!
22:21 wizzyrea      yay!
22:22 cait          rangi++ :)
22:22 rangi         no worries
22:22 rangi         it's almost always permissions
22:22 nuentoter     hahaha you dont know me, i always worry lol
22:23 cait          German? :P
22:24 rangi         now you've learnt some nz slang as well as fixing your koha :)
22:24 nuentoter     french/irish
22:24 cait          :)
22:24 cait          maybe it's more wide spread in europe :)
22:25 cait          worrying too much that is
22:26 nuentoter     now next step for me is to get all my records imported........ ugghhhhh
22:26 nuentoter     doing batches of 25 books at a time, 35,673 books left to go! YAY
22:26 nuentoter     i love my library lolol
22:27 wizzyrea      why 25 at a time?
22:27 nuentoter     because of the way our current system exports things I have to hand edit every single book. so i do it in batches to prevent screw ups
22:28 wizzyrea      oh that is rubbish :(
22:28 nuentoter     winnebago/spectrum is a pain in my @$$
22:28 wizzyrea      hm
22:28 wizzyrea      I know someone who may have some tips for you
22:28 wizzyrea      re: winnebago
22:28 nuentoter     !!!! i will take any info they have
22:28 wizzyrea      see your PM's in 2 secs
22:29 nuentoter     most people that converted did so YEARS ago, i inherited this old system :(
22:29 wizzyrea      *nod* it's crap
22:37 dcook         I have some design questions if people have a moment to offer thoughts
22:37 rangi         im going to get nachos
22:37 dcook         That's fair enough
22:37 rangi         but ill read back
22:38 wizzyrea      yeah will read them when I have a minute, go ahead and ask :)
22:38 dcook         For this OAI-PMH stuff, I have a daemon that downloads records and writes them to disk. That all works quite well.
22:38 dcook         But now I'm thinking about a daemon as an importer as well
22:38 dcook         Something that monitors the disk periodically and does X with them
22:38 dcook         But... I'd like that daemon to have access to Koha modules
22:39 dcook         So while the OAI-PMH downloader (it's actually extensible so it could pretty much do anything) is loosely coupled
22:39 dcook         I want the importer to be fairly tightly coupled
22:39 dcook         I could just have a cronjob run instead of a daemon, but a daemon is going to be a lot less problematic
22:40 dcook         So I could just start up the daemon with PERL5LIB set...
22:40 dcook         or use FindBin
22:40 wizzyrea      how does the zebra one do it?
22:40 dcook         That's a good point. We only use the cronjob... I hadn't thought about that
22:40 dcook         Tamil's daemon reads the database using koha-conf.xml
22:41 dcook         I don't know how the community one works, but that's a good point
22:41 dcook         Actually, the zebra one doesn't need any modules I think?
22:41 dcook         Just DB access
22:41 * wizzyrea    admits to not knowing
22:41 dcook         Hmm rebuild_zebra.pl does use a few C4 modules
22:41 dcook         wizzyrea: That's really helpful
22:42 dcook         Because if there's a precedent... I'll use that
22:42 dcook         I find the whole module path thing to be annoying in all projects..
22:42 dcook         Ideally, it would be kind of nice to just be able to use /usr/local or something...
22:42 dcook         Although then our gits wouldn't work..
22:43 dcook         Although I guess for git-dev stuff we could use PERL5LIB..
22:43 * dcook       wonders how all projects ever do this
22:43 nuentoter     i dont think zebra actually accesses the DB
22:43 wizzyrea      it about has to, to get the records to index
22:43 dcook         Well, not Zebra itself
22:43 wizzyrea      the indexer does though
22:43 dcook         rebuild_zebra.pl writes records out to a directory that Zebra manages
22:44 dcook         wizzyrea: what do you mean by indexer?
22:44 wizzyrea      rebuild_zebra
22:44 wahanui       it has been said that rebuild_zebra is already there but only every 10 mins....
22:44 dcook         Ah yeah
22:44 wizzyrea      zebra reads its own databases
22:44 dcook         Yeah, rebuild_zebra.pl uses C4 modules to do stuff
22:44 wizzyrea      but the link is rebuild_zebra
22:44 dcook         Well, Zebra reads from disk to build its databases
22:44 dcook         rebuild_zebra writes to disk so that Zebra can read from it
22:45 dcook         With my project, my downloader writes to disk and the importer reads from disk
22:45 dcook         So sort of backwards to what we're doing with Zebra
22:45 wizzyrea      note I"m using 'database' in a loose form
22:45 wizzyrea      not strictly in the relational-database form
22:45 dcook         Good point
22:45 wahanui       I know! The blade went right through that child!
22:46 dcook         We sure have some odd ones in #koha :p
22:46 wizzyrea      hehe
22:46 dcook         Lesse... I should just fire up my Debian VM
22:46 dcook         My life would be so much easier if we just used Debian for dev and prod
22:47 dcook         I did take a few lessons from Zebra when doing this
22:47 nuentoter     what do you use if not debian?
22:47 dcook         I wrote a little "icarus-client" in a sort of shout out to "yaz-client"
22:47 dcook         openSUSE
22:47 wahanui       openSUSE is, like, not used by many developers, and will likely be difficult to get Koha working with
22:47 dcook         hehe
22:47 wizzyrea      it's like you like punishment
22:48 dcook         Ikr?
22:48 dcook         Slowly effecting change around here. That would be something that would make me really happy
22:48 dcook         O_O
22:49 dcook         These bywater folks are blowing my mind right now
22:49 wizzyrea      oh?
22:49 nuentoter     I've used linux for quite a few years personally, but never did a whole lot with it tbh, it was just a free os to surf the web with cuz windows sucks
22:49 bag           happy birthday thatcher
22:49 dcook         barton and thatcher are the same person? :p
22:49 wizzyrea      lol no they are not
22:49 dcook         nuentoter: Yeah, I started using Linux for work, and now I use it personally too
22:49 barton        no...
22:49 dcook         And I've used... a fair few distros
22:49 * dcook       still likes Debian best
22:50 wizzyrea      I like debian and its derivatives best.
22:50 wizzyrea      everyone else wants to be like them :P
22:50 dcook         hehe
22:50 dcook         Yeah, we're a Debian/Ubuntu house
22:50 dcook         And work is all openSUSE/SUSE
22:50 nuentoter     I have used a handful and on my personal laptop i use #! only because its slick and quick with no bells and whistles
22:50 dcook         Except for one Ubuntu server I have..
22:50 nuentoter     i use debian on desktop at home
22:50 dcook         nuentoter: Yeah, I thought about #! for my old netbook
22:51 dcook         Went with lubuntu in the end though
22:51 dcook         Then it got stolen so it didn't matter much
22:51 dcook         wizzyrea: koha-indexer has /usr/share/koha/lib hard-coded into PERL5LIB
22:51 dcook         And not in the best of ways..
22:51 nuentoter     for older hardware #! is nice, not the new #!++ which is debian 8 based
22:52 wizzyrea      \o/
22:52 * dcook       ponders
22:52 dcook         I'll take that as incentive to just require users to have PERL5LIB set ahead of time I guess..
22:52 dcook         Or maybe use FindBin
22:52 nuentoter     the whole openbox setup is what hooked me, no icons, no menus just simple
22:52 wizzyrea      yeah, that's a precedent alright. rangi might have opinions.
22:52 dcook         wizzyrea: It's a Debian-specific precedent, but yeah, I'd be curious to hear what rangi would say
22:53 dcook         What is in @INC most of the time..
22:53 dcook         I was just looking yesterday
22:53 dcook         I'm cheating with something..
22:53 dcook         Oh, I do have another one..
22:53 wizzyrea      you could create a startup option where you use hardcode unless you specify something different
22:53 dcook         I have stuck some modules into "bin" :O
22:53 dcook         wizzyrea: Yeah, that's true
22:54 wizzyrea      i mean, there's probably a reason not to do that.
22:54 nuentoter     anyway, gotta close up the library and go home for the evening, thank you kind folks!
22:54 dcook         laters nuentoter
22:54 wizzyrea      good luck nuentoter :)
22:54 dcook         wizzyrea: Yeah, I'm trying to think of the ideal way of doing things
22:55 dcook         In the case of the downloader, it could be completely separate from Koha
22:55 dcook         So in theory... the downloader could be replaced down the line with something else
22:55 dcook         And the importer could handle other data providers
22:56 dcook         In theory, I'd like the importer to be given a bunch of data and go to town on it... and then you could check on the progress later
22:56 dcook         So you upload a MARC file, tell it to import, and then go off and do something else without worrying about any timeouts or anything
22:56 dcook         I mean... we already do have BackgroundJob which works in the background, although I haven't studied that extensively
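The "upload, walk away, check progress later" idea is a pollable background job. A minimal sketch of the shape (this is not Koha's BackgroundJob; the job table and field names are invented, and a real system would persist status rather than hold it in memory):

```python
import threading
import time
import uuid

JOBS = {}  # job id -> status dict; a real system would persist this

def submit_import(records):
    """Kick off an import in the background and return a job id the caller
    can poll, instead of holding an HTTP request open until it times out."""
    job_id = str(uuid.uuid4())
    JOBS[job_id] = {"state": "running", "done": 0, "total": len(records)}
    def run():
        for rec in records:
            # stand-in for filtering/matching/importing one MARC record
            JOBS[job_id]["done"] += 1
        JOBS[job_id]["state"] = "finished"
    threading.Thread(target=run, daemon=True).start()
    return job_id

def job_status(job_id):
    """What a 'check on the progress later' page would read."""
    return JOBS[job_id]
```

The staff client would show `done`/`total` as a progress bar and refresh until `state` is `finished`.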
22:56 dcook         Anyway, just babbling now
22:57 dcook         I don't really like the idea of a closely-coupled daemon... but surely other projects must do it too?
22:57 dcook         So that you can exploit your project's existing code..
22:57 dcook         I get the whole "have your daemon do one thing" but..
22:58 dcook         I suppose that would be more possible if Koha were smaller
22:58 dcook         Anyway, thanks for that wizzyrea :)
22:58 wizzyrea      yep, rangi will probably have better opinions
22:59 dcook         Hmm, maybe I'll wait for his opinion before I start on that
23:02 dcook         Actually, now that I think about it, another idea I had wouldn't need that..
23:02 dcook         Rather, the "importer" daemon would download "import profiles" from a Koha web service
23:02 dcook         Each profile would have it monitoring a different place (e.g. a directory)
23:03 dcook         Then the daemon would simply send the record off asynchronously to a Koha import web service
23:03 dcook         I need a more complex import web service than any we currently have though
23:03 dcook         Something that filters a record, performs matching, and then does X
23:03 dcook         For that... the only thing the daemon would need is username/password and a web service URI
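dcook's sketch so far: the daemon fetches "import profiles" from a Koha web service, then posts each record to an import service along with the filter/match/action decisions from the profile. A hedged illustration of that data flow (the endpoint URL and every field name here are invented for the example; no such Koha service exists yet):

```python
import json

def load_profiles(profile_json):
    """Parse the hypothetical import-profile document the daemon would
    fetch once, on start or reload."""
    return json.loads(profile_json)

def build_import_request(profile, record):
    """Bundle one record with its profile so a single POST to the import
    service carries everything: how to filter, how to match, what to do."""
    return {
        "endpoint": profile["import_url"],
        "payload": {
            "record": record,
            "filter": profile["filter"],
            "matcher": profile["matcher"],
            "action": profile["action"],  # e.g. add / update / ignore
        },
    }
```

With this split, the daemon itself really does need only credentials plus the profile service URI; everything record-specific travels inside the request.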
23:04 dcook         The downloader daemon uses a "task format" while this import daemon would use an "import profile format"
23:04 dcook         That would be the only sort of dependency
23:04 dcook         That whatever task provider or import profile provider uses the format that the daemon understands
23:05 dcook         The import profiles could in theory also be moved into other parts of Koha...
23:06 dcook         So that all imports could perform record filtering, matching, and X
23:06 dcook         X being particular to that "kind" of import
23:06 dcook         X probably just being "authority","bibliographic","items" (and maybe "holdings")
23:07 dcook         And the daemon only needs to read from that "import profile" web service on start or reload
23:08 dcook         I suppose that web service would actually sort of be a "discovery" document
23:09 dcook         Well, no, not quite..
23:09 dcook         Because the daemon would need to know more than just "Oh hey... these are the import endpoints we support"
23:12 pianohacker   we need to add a feature to huginn that tracks the longest streak of someone talking to themselves, dcook ;)
23:13 dcook         hehe
23:13 dcook         I'm pretty sure I'd occupy all the top spots ;)
23:13 pianohacker   also, hi! :)
23:13 dcook         But I'm actually only talking to myself and whoever wants to listen/comment :p
23:13 dcook         also hi :)
23:13 dcook         New year going well so far?
23:26 pianohacker   dcook: absolutely. Looks to be much easier than the last :)
23:26 rangi         dcook: do you know much about gearman
23:27 rangi         http://gearman.org/
23:27 rangi         http://search.cpan.org/~dormando/Gearman-1.12/lib/Gearman/Client.pm
23:27 rangi         http://search.cpan.org/~dormando/Gearman-1.12/lib/Gearman/Worker.pm
23:27 rangi         etc
23:27 dcook         pianohacker: Awesome
23:27 dcook         rangi: Nopes
23:28 rangi         i think instead of reinventing the queue/work distribution wheel, we could slowly move bits of koha out and have gearman looking after who does what and when
23:29 dcook         I like this idea
23:29 rangi         file comes in for import, farm it off to the import worker
23:29 rangi         keep on trucking along
23:29 dcook         Well, the Gearman model is what I'm doing with the downloader already
23:29 dcook         Web UI sends a job to the job server, job server spawns a process to deal with it
23:29 rangi         yep
23:30 dcook         child process queues up the record for something else down the pipeline
23:30 rangi         gearman lets you create workers in whatever language you like etc too
23:30 dcook         Yeah, I'd be down with Gearman at a glance
23:30 rangi         handles the reporting back
23:30 rangi         all that stuff that's a pita to deal with
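The shape rangi is describing — clients submit named jobs to a job server, registered workers pull them and report results back — can be sketched in-process. To be clear, this is a toy stand-in for the pattern, not the Gearman API (the real Gearman::Client/Gearman::Worker modules speak the Gearman protocol over the network, and gearmand handles queuing, retries, and reporting):

```python
import queue

class MiniJobServer:
    """Toy gearmand-style job server: routes named jobs to the worker
    registered for that function name and collects the result."""
    def __init__(self):
        self.jobs = queue.Queue()
        self.workers = {}

    def register_worker(self, function, handler):
        """Worker side: 'I can do jobs named <function>'."""
        self.workers[function] = handler

    def submit(self, function, data):
        """Client side: hand the job off and keep on trucking along.
        Returns a dict the result will be reported back into."""
        result = {}
        self.jobs.put((function, data, result))
        return result

    def run_one(self):
        """Dispatch one queued job to its worker."""
        function, data, result = self.jobs.get()
        result["value"] = self.workers[function](data)
```

"File comes in for import, farm it off to the import worker" is then `submit("import_marc", path)`, with the worker free to be written in any language, which is exactly Gearman's selling point.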
23:30 dcook         Yeah, the worker in the downloader case is an OAI-PMH module I made, so that would be great
23:31 dcook         Yeah, the more I read, the more I like this idea
23:32 dcook         I have a project at home that this would be great for as well..
23:32 rangi         the thing I like is its a tried and tested thing, so one less bit of code we have to worry about
23:32 dcook         ^
23:32 dcook         Exactly
23:32 dcook         So much that
23:33 dcook         In the case of the importer, I'm not 100% sure what I want to do yet
23:33 dcook         I don't know if I want an importer reading in files and processing them..
23:33 dcook         Or if I want something reading in files and sending them to a Koha API
23:33 dcook         Handing them off to a Koha API would probably be faster and more generalizable..
23:34 dcook         But then the transfer agent thing needs to know extra details to tell the API..
23:34 dcook         Mainly (how do you want to filter, match, and ultimately add/update/ignore this)
23:34 dcook         While we could put that in a configuration file, it would be nice to have it in the database so that end users could make those decisions
23:35 dcook         But then there needs to be a link between the file being read and all that import info..
23:35 rangi         yeah but the worker doesnt need to know its in the db
23:35 dcook         True. It just needs to know it in some way.
23:35 rangi         the client hands it off to the jobserver, the jobserver passes that info to the worker, with the data
23:36 dcook         Hmm
23:36 rangi         heres some data, and heres the config
23:36 dcook         That's what I'm doing with the downloader
23:36 pianohacker   dcook/rangi: I've been wanting to make the cronjobs managed (configurable from web, etc.), and this sounds like it could be useful for that as well
23:36 dcook         I wasn't sure if I wanted to go that route with the importer
23:36 dcook         pianohacker: Yeah, I've been thinking about that too
23:36 rangi         pianohacker: yup
23:36 rangi         the task scheduler would be the first thing to do
23:36 dcook         ^
23:36 pianohacker   it's on the todo list, somewhere down there :)
23:36 rangi         its small, self contained, known problem
23:37 pianohacker   and doesn't currently work :)
23:37 dcook         The only thing is I have a deadline for this one :p
23:37 rangi         rewrite to use gearman instead of at
23:37 dcook         The downloader is a task scheduler at this point
23:37 dcook         Gearman would be great
23:37 dcook         So here's the flow I have so far
23:37 dcook         Client tells task/job server "Here's a task"
23:38 dcook         The task/job server spawns a worker to deal with that task
23:38 dcook         In this scenario, it's downloading records via OAI-PMH
23:38 dcook         If it's a repeating task, it loops forever following a certain interval
23:38 dcook         It writes those records out to disk atm
23:38 dcook         I suppose I could use that same task/job server to handle the importing..
23:39 dcook         And then when time isn't of the essence, we replace my task/job server with gearman
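The flow dcook just walked through — a worker is spawned for a task, and a repeating task loops forever on its interval — can be sketched like this (a minimal illustration, not dcook's actual task server; the task-dict keys are invented, and `iterations` bounds the loop only so the sketch terminates, where a daemon would run until stopped):

```python
import time

def run_task(task, iterations, sleep=time.sleep):
    """One worker handling a task description: run the job, and if the
    task repeats, keep looping on its configured interval."""
    results = []
    for _ in range(iterations):
        results.append(task["job"]())  # e.g. one OAI-PMH harvest pass
        if not task.get("repeat"):
            break
        sleep(task.get("interval", 60))
    return results
```

Injecting `sleep` keeps the scheduling testable; the job itself would write its harvested records out to disk as described above.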
23:39 dcook         The OAI-PMH worker is specific to a module already so that could be pretty much drag and drop..
23:39 * dcook       is tempted to post his task/job server code separate to the OAI code...
23:40 dcook         rangi: Are there packages for gearman?
23:40 rangi         yep
23:40 rangi         https://packages.debian.org/search?keywords=gearman
23:41 dcook         Oh wow... it goes back a ways
23:41 dcook         Hmm pre version 1
23:41 dcook         rangi: Have you used it yet?
23:41 rangi         yeah, its been around for a while
23:41 rangi         yep
23:41 rangi         tumblr and yelp others use it
23:41 dcook         Yeah, I was seeing that
23:41 dcook         I was just wondering which version they're on
23:41 pianohacker   is it a swagger2 situation where we'd really want latest?
23:42 dcook         Yeah, that's what I'm worried about
23:42 rangi         id go for stable
23:42 rangi         over shiny and new
23:42 rangi         specially in a job queue
23:42 dcook         Totes
23:42 dcook         I'm just wondering what is stable
23:43 dcook         Hmm no news in a few years
23:43 dcook         That could be a good sign I suppose
23:43 dcook         It should be fairly rock solid as it's a fairly simple concept
23:43 rangi         yep
23:44 dcook         Hmm, no packages for openSUSE. Boo..
23:44 dcook         Although someone apparently has already done some legwork: https://gist.github.com/CauanCabral/5967374
23:45 rangi         i reckon it's worth trying out anyway
23:45 dcook         Agreed
23:46 dcook         For now, I'll probably just keep chugging with what I have, as I'm running out of time, but I like it
23:47 dcook         But that makes me wonder if I should use my existing task server for both downloading and importing..
23:47 dcook         Actually, how would that importing work?
23:47 dcook         You can pass data to the task server
23:47 dcook         But you couldn't necessarily take advantage of your Koha Perl modules
23:47 rangi         you hand off jobs to a worker suited to do them
23:47 dcook         I suppose you could tell it to send the queued records to a Koha API
23:48 rangi         obv a worker to do the import, should be able to talk to the db
23:48 dcook         Hmm, I'll have to look at those Perl modules to see how they implement that
23:48 rangi         either directly, or via some api
23:49 rangi         /svc/bib/new
23:49 rangi         eg
23:49 dcook         Yeah, atm I'm thinking api
23:50 dcook         Interesting... you have to spawn your own workers manually..
23:50 dcook         There has to be a way around that as it says it scales..
23:50 dcook         Unless you pre-spawn a certain number..
23:52 dcook         Since you're starting your own workers, you could provide all sorts of info..
23:53 dcook         Yeah, you have to manage your own clients and workers... but that's all right I guess.
23:53 dcook         I autospawn workers and cap out at a configurable limit
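The "autospawn workers, cap at a configurable limit" approach dcook describes maps directly onto a bounded worker pool. A sketch using Python's standard library (again just illustrating the pattern, not dcook's daemon, which spawns processes rather than threads):

```python
from concurrent.futures import ThreadPoolExecutor

def process_jobs(jobs, handler, max_workers=4):
    """Run handler over jobs with at most max_workers concurrent workers:
    the pool spawns workers on demand up to the cap and reuses idle ones,
    rather than starting one worker per job."""
    with ThreadPoolExecutor(max_workers=max_workers) as pool:
        return list(pool.map(handler, jobs))
```

Pre-spawning a fixed set of long-lived Gearman workers trades a little idle overhead for not paying startup cost per job, which is the "less overhead, more control" dcook notes.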
23:54 dcook         Probably less overhead with the Gearman way of doing it..
23:54 dcook         Also a lot more control
23:55 dcook         I wonder a bit about what happens when Gearman goes down
23:55 dcook         Well Gearmand
23:56 dcook         Looks like it uses a relational database to manage jobs..
23:56 dcook         rangi: Would you have one Gearmand per client or just one for all Koha clients?
23:56 dcook         Wait that didn't make sense
23:57 dcook         One gearmand per system or one per Koha client
23:57 * dcook       wonders how the Gearman client stores information about ongoing jobs..
23:57 rangi         that'd be up to you
23:58 rangi         all koha needs to know is what ip to talk to
23:58 dcook         Let's say you want to stop a job midway through though... does Gearmand return enough info to the client?
23:59 rangi         itll return what the worker gives it
23:59 dcook         Hmm
23:59 dcook         I wonder how that would work..