Time |
S |
Nick |
Message |
00:18 |
|
sylvar |
Hi y'all. If a search from the admin side shows 14 items, 14 available (for a particular bib record) and the same search phrase on the public side shows "(1)", what might cause that difference? |
00:19 |
|
chris |
none of them are marked lost? |
00:19 |
|
sylvar |
I'll double-check. The bib detail page says they're all Available, on both sides. |
00:19 |
|
chris |
hm interesting, what version of koha? |
00:22 |
|
sylvar |
chris: 3.02.00.007, but I suspect I've made a mistake during the data load, so I wouldn't consider it a bug. Not a Koha bug, anyhow. ;) |
00:23 |
|
chris |
:) |
00:23 |
|
sylvar |
Thanks, though. |
00:23 |
|
ebegin |
in your breadcrumbs, what does koha show? "kw,phr: your search" |
00:24 |
|
ebegin |
or "kw,wrdl: your search" |
00:37 |
|
ebegin |
I'm having trouble on my side finding words preceded by "L'"; for example, I can find "L'entreprise" but not just "Entreprise"... |
00:54 |
|
sylvar |
OK, I thought I knew this, but... is borrowers.cardnumber where barcodes should go? |
00:55 |
|
chris |
yes |
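(For context, and not part of the log: borrowers.cardnumber is indeed where Koha keeps the patron barcode, so a data load can be spot-checked with a direct lookup. A minimal sketch using DBI; the database name, credentials and barcode below are example values only.)

    use strict;
    use warnings;
    use DBI;

    # Example connection details -- adjust for the local Koha database.
    my $dbh = DBI->connect( 'DBI:mysql:database=koha;host=localhost',
        'kohaadmin', 'secret', { RaiseError => 1 } );

    # borrowers.cardnumber holds the patron barcode.
    my $sth = $dbh->prepare(
        'SELECT borrowernumber, surname, firstname FROM borrowers WHERE cardnumber = ?');
    $sth->execute('0001234567');
    while ( my $row = $sth->fetchrow_hashref ) {
        print "$row->{borrowernumber}: $row->{surname}, $row->{firstname}\n";
    }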
00:55 |
|
sylvar |
OK, thanks. |
00:57 |
|
ebegin |
Is there a way to know what is indexed in zebra for a specific bib? |
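(ebegin's question gets no answer in this log. One partial way to check, sketched here as an illustration rather than taken from the discussion, is to ask the running Zebra server over Z39.50 with the ZOOM bindings and see what it returns for a given biblionumber; the port, database name and the use of Bib-1 attribute 1=12 for the local record number are assumptions that depend on the local koha-conf.xml and index configuration.)

    use strict;
    use warnings;
    use ZOOM;

    # Assumed Zebra connection details; check koha-conf.xml for the real ones.
    my $conn = ZOOM::Connection->new('localhost:9998/biblios');
    $conn->option( preferredRecordSyntax => 'usmarc' );

    # 1=12 (Local-number) is commonly mapped to the biblionumber.
    my $rs = $conn->search_pqf('@attr 1=12 1234');
    printf "hits: %d\n", $rs->size();
    print $rs->record(0)->render() if $rs->size();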
01:15 |
|
|
ronald left #koha |
02:28 |
|
|
rachel joined #koha |
02:38 |
|
|
rachel left #koha |
02:50 |
|
munin |
New commit(s) kohagit32: Fix for 5143, now with IE debug removed <http://git.koha-community.org/[…]8b44c9a8f740311b7> / Bug 3789 Set off shelving location in staff and OPAC title display <http://git.koha-community.org/[…]5a343b7e505bbd0d4> / Bug 4937: Fixes XHTML in the pagination links of a saved report. <http://git.koha-community.org/gitwe |
02:55 |
|
|
druthb joined #koha |
02:56 |
|
chris |
hi druthb |
02:56 |
|
druthb |
hi, chris! :) |
03:00 |
|
hudsonbot |
Starting build 59 for job Koha_3.2.x (previous build: STILL UNSTABLE -- last SUCCESS #54 8 days 4 hr ago) |
03:22 |
|
hudsonbot |
Project Koha_3.2.x build #59: STILL UNSTABLE in 22 min: http://hudson.koha-community.o[…]ob/Koha_3.2.x/59/ |
03:22 |
|
hudsonbot |
* Katrin Fischer: Bug 4218: Fixes display problem introduced by last patch |
03:22 |
|
hudsonbot |
* Frédéric Demians: Bug 5041 Allow to delete non-repeatable field |
03:22 |
|
hudsonbot |
* Owen Leonard: Fix for Bug 5416, Template syntax error in moredetails.tmpl |
03:22 |
|
hudsonbot |
* Robin Sheat: Bug 5313 - allow creation of libraries with hyphens |
03:22 |
|
hudsonbot |
* Robin Sheat: Bug 5228 - make rebuild_zebra handle fixing the zebra dirs |
03:22 |
|
hudsonbot |
* Owen Leonard: Fix for Bug 5208, Language chooser missing on Batch item deletion/modification |
03:22 |
|
hudsonbot |
* Nicole Engard: bug 5150 change issuing to circ & fine rules |
03:22 |
|
hudsonbot |
* Chris Cormack: Bug 5484 - Handling bad borrower categories in serial routing lists more gracefully |
03:22 |
|
hudsonbot |
* Owen Leonard: Updated fix for Bug 2170, Adding 'edititems' user-permission |
03:22 |
|
hudsonbot |
* Colin Campbell: Bug 2170 Supplementary Fix Wrap link in permissions check |
03:22 |
|
hudsonbot |
* Nicole Engard: bug 4252 add authorites permission to menus |
03:22 |
|
hudsonbot |
* Nicole Engard: bug 5255 change 'document type' to 'item type' |
03:22 |
|
hudsonbot |
* Colin Campbell: Variable redeclared in same scope |
03:22 |
|
hudsonbot |
* Owen Leonard: Fix for Bug 5000, Uncertain prices misses option to choose display language |
03:22 |
|
hudsonbot |
* Srdjan Jankovic: Bug 2965: Allow due date in the past |
03:22 |
|
hudsonbot |
* Robin Sheat: Bug 5084 - hide funds that are part of an inactive budget |
03:22 |
|
hudsonbot |
* Katrin Fischer: Bug 2965: Allow due date in the past - small template fix |
03:23 |
|
chris |
chris_n`: you about? |
03:23 |
|
|
chris_n` is now known as chris_n |
03:23 |
|
chris |
did you catch the revert for Ergonomy improvement in smart rule management |
03:24 |
|
chris |
http://git.koha-community.org/[…]cde115764fa1afe44 |
03:31 |
|
chris_n |
chris: was that directed to me? |
03:31 |
|
chris |
yup |
03:32 |
|
* chris_n |
just wandered in |
03:32 |
|
chris_n |
I did |
03:32 |
|
chris |
cool |
03:32 |
|
chris_n |
pused |
03:32 |
|
chris_n |
pushed even |
03:32 |
|
chris |
:) |
03:33 |
|
chris_n |
its been one crazy ride the past two weeks here :-P |
03:33 |
|
chris |
oh yeah? |
03:33 |
|
chris_n |
end of fall semester schedule pains |
03:33 |
|
chris |
ahhh right |
03:35 |
|
chris_n |
tell sharrow that I put together a script to have nagios monitor the hot water boiler loop in a large chilled water cooling system this week |
03:35 |
|
chris |
will do :) |
03:35 |
|
chris_n |
its pretty cool; it texts the temps in the loop along with the boiler status |
03:35 |
|
chris |
cool :) |
03:35 |
|
chris_n |
in perl... of course :-) |
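(Aside, purely illustrative and not chris_n's actual script: a Nagios check plugin just prints one status line and exits 0/1/2 for OK/WARNING/CRITICAL, so the Perl skeleton is tiny; the temperature value and thresholds below are made up.)

    use strict;
    use warnings;

    my $loop_temp = 180;                 # placeholder reading, degrees F
    my ( $warn, $crit ) = ( 190, 200 );  # hypothetical thresholds

    if    ( $loop_temp >= $crit ) { print "CRITICAL: loop at ${loop_temp}F\n"; exit 2; }
    elsif ( $loop_temp >= $warn ) { print "WARNING: loop at ${loop_temp}F\n";  exit 1; }
    else                          { print "OK: loop at ${loop_temp}F\n";       exit 0; }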
03:36 |
|
chris |
of course |
03:36 |
|
chris |
http://xkcd.com/224/ |
03:37 |
|
chris_n |
scary thought :) |
03:40 |
|
chris_n |
off to sleep, g'night |
03:41 |
|
chris |
night |
03:54 |
|
|
Brooke joined #koha |
04:00 |
|
|
richard left #koha |
04:53 |
|
|
druthb left #koha |
04:54 |
|
Brooke |
@roulette |
04:54 |
|
|
Brooke was kicked by munin: BANG! |
04:54 |
|
* munin |
reloads and spins the chambers. |
04:55 |
|
|
kmkale joined #koha |
04:56 |
|
kmkale |
Good morning |
05:17 |
|
|
kmkale left #koha |
06:22 |
|
|
cait joined #koha |
06:22 |
|
cait |
hi #koha |
06:26 |
|
|
Brooke joined #koha |
06:27 |
|
chris |
hi cait and Brooke |
06:27 |
|
cait |
morning chris |
06:27 |
|
* Brooke |
waves at Chris |
06:27 |
|
cait |
morning Brooke |
06:27 |
|
Brooke |
eeyah |
06:28 |
|
* Brooke |
cites the old adage that good things happen early in the morning. |
06:30 |
|
cait |
hmpf! |
07:01 |
|
|
Brooke left #koha |
07:06 |
|
|
laurence joined #koha |
07:17 |
|
|
hdl joined #koha |
07:38 |
|
|
cait left #koha |
07:55 |
|
fredericd |
chris: Are you aware that the tmpl_process3.pl script messes up template .po files? |
07:56 |
|
fredericd |
A LOT of strings are marked as fuzzy, but they shouldn't be |
07:56 |
|
fredericd |
it means that translators will have to re-translate strings they already translated |
07:58 |
|
fredericd |
on stable branch, it means that to be able to have a few new strings translated into your language, you have to do a huge work of re-translation... |
07:59 |
|
chris |
hmmm interesting that must be a new thing, because it didnt used to |
08:00 |
|
fredericd |
for example on French staff .po files 1843 strings have been marked as fuzzy |
08:00 |
|
chris |
when you did an update? |
08:00 |
|
fredericd |
yes, just an update |
08:00 |
|
|
Oak joined #koha |
08:00 |
|
chris |
and they definitely werent fuzzy before? |
08:01 |
|
fredericd |
new strings are added, which is the expected behavior |
08:01 |
|
fredericd |
chris: yes |
08:01 |
|
Oak |
\o |
08:01 |
|
chris |
is pootle down? |
08:01 |
|
fredericd |
just before, the French .po files were without fuzzy |
08:01 |
|
chris |
you could try updating english-nz |
08:01 |
|
fredericd |
yes, I deactivated Pootle in the meantime |
08:01 |
|
chris |
because that had nothing untranslated |
08:02 |
|
chris |
i had updated them quite a few times |
08:02 |
|
chris |
without that happening |
08:02 |
|
chris |
i wonder if its to do with new pootle? |
08:03 |
|
fredericd |
I don't think so, because the French .po files have been updated outside Pootle |
08:03 |
|
fredericd |
chris: same for en_NZ |
08:03 |
|
chris |
hmm, afaik nothing has changed with tmpl_process3.pl |
08:03 |
|
chris |
and it certainly never used to do this |
08:04 |
|
fredericd |
but less fuzzy: just 356 strings |
08:04 |
|
hdl |
fredericd: it can be that strings changed in context |
08:04 |
|
chris |
yep, that would make them fuzzy |
08:05 |
|
chris |
and that will be msgmerge doing that |
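(Background for readers: msgmerge marks an entry "#, fuzzy" when it can only guess a match against the updated template, and fuzzy entries are ignored until a translator confirms them. A quick way to count them, sketched with the Locale::PO module and an example file name from the 3.2 naming scheme:)

    use strict;
    use warnings;
    use Locale::PO;

    # Example path -- point it at the real staff .po file.
    my $entries = Locale::PO->load_file_asarray('fr-FR-i-staff-t-prog-v-3002000.po');

    # fuzzy() reflects the "#, fuzzy" flag on each entry.
    my $fuzzy = grep { $_->fuzzy } @$entries;
    printf "%d of %d entries are marked fuzzy\n", $fuzzy, scalar @$entries;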
08:07 |
|
hdl |
fredericd: most of those strings will be unchanged. and will just have to be confirmed... |
08:07 |
|
|
francharb joined #koha |
08:07 |
|
|
sophie_m joined #koha |
08:08 |
|
hdl |
fredericd: but some will still need some work |
08:09 |
|
fredericd |
hdl: Yes, but 1843 strings... |
08:09 |
|
hdl |
welcome the translating process ;) |
08:11 |
|
hdl |
it does require work |
08:11 |
|
fredericd |
300 strings in German, so there is something special in French |
08:12 |
|
fredericd |
Clearly the templates strings extraction will have to be improved for TT |
08:12 |
|
chris |
yep, its tons easier |
08:12 |
|
chris |
its already underway |
08:13 |
|
chris |
we dont have to try and look for comments |
08:13 |
|
chris |
that arent really comments |
08:13 |
|
chris |
[% %] |
08:13 |
|
chris |
easy |
08:13 |
|
chris |
so you can just use a proper HTML parser |
08:13 |
|
chris |
to extract the alt tags etc |
08:14 |
|
chris |
i reckon go from about 2000 to about 200 lines |
08:14 |
|
fredericd |
tmpl_process3.pl is a blackbox for me, I can't fix it |
08:14 |
|
chris |
yeah, it doesnt have too much longer to liv |
08:14 |
|
chris |
e |
08:14 |
|
chris |
just for 3.2.x |
08:14 |
|
fredericd |
hope so |
08:14 |
|
chris |
theres certainly no way we are going to try to bend it to work with TT |
08:15 |
|
fredericd |
I'm not sure it will be possible to update translation from 3.2.x regularly as planned |
08:15 |
|
chris |
once for each release is enough |
08:21 |
|
fredericd |
YEAH |
08:21 |
|
fredericd |
I was wrong for the fuzzy strings in French |
08:21 |
|
fredericd |
I wasn't working on the correct French version of .po files |
08:22 |
|
|
kf joined #koha |
08:22 |
|
kf |
hi #koha |
08:22 |
|
fredericd |
I was working on hdl .po files version, before updates we've done the last few days |
08:22 |
|
fredericd |
hi kf |
08:22 |
|
|
ebegin left #koha |
08:23 |
|
kf |
hi fredericd |
08:24 |
|
hdl |
fredericd: pleased for you |
08:24 |
|
kf |
fredericd: are you updating the pootle files today? |
08:24 |
|
kf |
sorry, missed the beginning of your conversation |
08:25 |
|
|
ivanc joined #koha |
08:28 |
|
|
hdl left #koha |
08:29 |
|
fredericd |
kf: it was about .po files update for 3.2.2 |
08:29 |
|
kf |
ah |
08:29 |
|
fredericd |
I was discovering that the update was marking as fuzzy more than 1800 string in French |
08:29 |
|
kf |
I think strings went fuzzy before when the template file got touched |
08:29 |
|
fredericd |
it was my fault |
08:29 |
|
kf |
you don't really have to retranslate, but submit them again |
08:30 |
|
kf |
ah |
08:30 |
|
kf |
ok |
08:30 |
|
fredericd |
kf: for German you will have exactly 300 strings marked as fuzzy |
08:30 |
|
kf |
puh |
08:30 |
|
kf |
could be worse :) |
08:31 |
|
fredericd |
yes, it could have been worse |
08:32 |
|
kf |
but lots of new strings I expect |
08:34 |
|
kf |
fredericd: can you ping me when pootle is back? |
08:34 |
|
ivanc |
hi #koha |
08:35 |
|
ivanc |
hi kf |
08:35 |
|
fredericd |
kf: yes, I'm updating all .po files and try to automate the process |
08:35 |
|
kf |
yeah, no problem. :) Just eager to start. |
08:40 |
|
kf |
hi ivanc |
08:42 |
|
|
bigbrovar joined #koha |
08:46 |
|
|
hdl joined #koha |
08:53 |
|
|
sophie_m left #koha |
08:53 |
|
|
alex_a left #koha |
08:56 |
|
|
clrh joined #koha |
08:56 |
|
clrh |
Hello everyone |
09:00 |
|
chris |
hi clrh |
09:00 |
|
|
magnus joined #koha |
09:01 |
|
|
thd-away is now known as thd |
09:06 |
|
magnus |
kia ora, #koha! |
09:06 |
|
magnus |
solr meeting in about 50 minutes, right? |
09:06 |
|
chris |
hi magnus |
09:06 |
|
magnus |
hey chris |
09:07 |
|
clrh |
right magnus |
09:07 |
|
magnus |
chris: did you ever get around to looking at the logrotate stuff I mentioned you mentioning in bug 5055? |
09:07 |
|
munin |
Bug http://bugs.koha-community.org[…]w_bug.cgi?id=5055 enhancement, PATCH-Sent, ---, magnus, NEW, crontab.example should use standard file paths |
09:07 |
|
magnus |
thanks, clrh |
09:16 |
|
chris |
not ringing a bell |
09:17 |
|
chris |
lemme look what line 79 is |
09:18 |
|
chris |
hmm i wonder what files i was going to create? |
09:19 |
|
chris |
ahh the logrotate.conf |
09:19 |
|
magnus |
probably ;-) |
09:20 |
|
chris |
actually i think taking that line out altogether |
09:21 |
|
chris |
and telling people to do it properly |
09:21 |
|
chris |
with /etc/logrotate.d/ |
09:22 |
|
chris |
is probably a better fix and then making a file that they can drop into there |
09:22 |
|
chris |
that way, it can get installed with the package |
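(A drop-in file of the kind chris describes might look like the sketch below; the log path and rotation policy are assumptions, not something settled in this conversation.)

    # /etc/logrotate.d/koha -- example only
    /var/log/koha/*.log {
            weekly
            rotate 4
            compress
            delaycompress
            missingok
            notifempty
    }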
09:22 |
|
magnus |
sounds nice, and a bit above me... ;-) |
09:23 |
|
magnus |
should i take the logrotate stuff out and submit a new patch without it? |
09:23 |
|
chris |
yeah that would be good |
09:26 |
|
magnus |
ok, will do |
09:26 |
|
|
BobB joined #koha |
09:28 |
|
|
irmaB joined #koha |
09:28 |
|
BobB |
Good evening all. Is Chris Cormack about? |
09:29 |
|
hdl |
hi BobB yes he is |
09:30 |
|
BobB |
Good evening HDL, how's it going? |
09:30 |
|
hdl |
(or was a few minutes ago) |
09:31 |
|
BobB |
I'll wait. I have something to ask him. |
09:31 |
|
clrh |
It would be great for us to have more than 2 connected persons to katipo for one ip... si ? |
09:32 |
|
hdl |
busy, as we all are. |
09:32 |
|
hdl |
what about you BobB? |
09:32 |
|
BobB |
Hi clrh. I don't think we've met. I'm Bob Birchall from Sydney Australia. |
09:32 |
|
magnus |
hi BobB & irmaB |
09:32 |
|
clrh |
Hi BobB :) |
09:33 |
|
hdl |
clrh: is from BibLibre, our new dev manager. |
09:33 |
|
BobB |
Ahhh, bienvenue! Good to meet you. |
09:33 |
|
clrh |
thanks |
09:33 |
|
magnus |
hi clrh, nice to meet you! ;-) |
09:33 |
|
hdl |
She is at the moment mostly on the solr developments. |
09:34 |
|
chris |
hi BobB |
09:34 |
|
BobB |
Hi Magnus. Obviously you arrived home safely from Wellington. I hear it is very cold in Northern Europe. It was 28 c in Sydney today. |
09:35 |
|
magnus |
@wunder bodo, norway |
09:35 |
|
munin |
magnus: The current temperature in Bodo, Norway is 6.0°C (10:20 AM CET on December 15, 2010). Conditions: Mostly Cloudy. Humidity: 66%. Dew Point: 0.0°C. Windchill: 0.0°C. Pressure: 29.77 in 1008 hPa (Falling). |
09:35 |
|
BobB |
Summer! |
09:35 |
|
magnus |
hehe |
09:36 |
|
magnus |
BobB: not quite summer, but yes the return trip was as smooth as the outward trip |
09:37 |
|
BobB |
It was so good to meet people at KohaCon. KohaCon turns email addresses into people! |
09:38 |
|
magnus |
yeah, not to mention irc nicks! ;-) |
09:38 |
|
BobB |
That too.:) |
09:39 |
|
BobB |
I need to find out who in the Koha community has lots of experience with DSpace? |
09:40 |
|
magnus |
are you thinking of connecting them? |
09:40 |
|
chris |
i know enough about dspace to know i dont like it |
09:41 |
|
BobB |
Hi Chris. Can we have a chat somewhere? |
09:41 |
|
kf |
hi Irma and Bob |
09:41 |
|
chris |
yep, just messaged you |
09:41 |
|
BobB |
Good evening/morning KF |
09:45 |
|
|
tajoli joined #koha |
09:45 |
|
tajoli |
Hi to all |
09:46 |
|
tajoli |
The Solr meeting will be here ? |
09:46 |
|
clrh |
yep tajoli |
09:46 |
|
magnus |
tajoli: yep |
09:46 |
|
clrh |
hi |
09:46 |
|
magnus |
in about 10 minutes |
09:48 |
|
hdl |
yes tajoli |
09:54 |
|
|
Joubu joined #koha |
09:55 |
|
|
reed joined #koha |
09:55 |
|
clrh |
hey Joubu |
09:55 |
|
clrh |
he is working with me on solr dev for biblibre |
09:55 |
|
Joubu |
hello |
09:55 |
|
magnus |
hi Joubu |
09:56 |
|
hdl |
meeting in two minutes. |
09:57 |
|
hdl |
hi reed |
09:57 |
|
reed |
hellow |
09:58 |
|
hdl |
#startmeeting solr meeting |
09:58 |
|
munin |
Meeting started Wed Dec 15 10:00:17 2010 UTC. The chair is hdl. Information about MeetBot at http://wiki.debian.org/MeetBot. |
09:58 |
|
munin |
Useful Commands: #action #agreed #help #info #idea #link #topic. |
09:58 |
|
|
Topic for #koha is now (Meeting topic: solr meeting) |
09:58 |
|
irmaB |
Hi all |
09:58 |
|
hdl |
Hi.. |
09:58 |
|
hdl |
let's proceed to a round call |
09:59 |
|
* hdl |
Henri-Damien LAURENT, BibLibre |
09:59 |
|
thd |
Thomas Dukleth, Agogme, New York City |
09:59 |
|
* clrh |
Claire Hernandez, Biblibre |
09:59 |
|
irmaB |
irma birchall from CALYX in Sydney |
09:59 |
|
reed |
Reed Wade, Catalyst, NZ |
09:59 |
|
Joubu |
Jonathan Druart, Biblibre |
10:00 |
|
magnus |
Magnus Enger, Libriotech, Norway |
10:00 |
|
|
miguelxer joined #koha |
10:00 |
|
hdl |
any other persons ? |
10:01 |
|
hdl |
hi miguelxer |
10:01 |
|
miguelxer |
hello |
10:01 |
|
ibot |
bonjour, miguelxer |
10:01 |
|
hdl |
it is presentation time |
10:01 |
|
miguelxer |
good morning everyone!!, ha |
10:02 |
|
hdl |
#topic Why taking on solr |
10:02 |
|
|
Topic for #koha is now Why taking on solr (Meeting topic: solr meeting) |
10:02 |
|
hdl |
I think that this topic has been long advocated... |
10:03 |
|
|
josepedro joined #koha |
10:03 |
|
hdl |
If you have any questions or doubt on what we said previously and posted on list, |
10:03 |
|
hdl |
then ask. |
10:04 |
|
clrh |
#link http://librarypolice.com/koha-[…]07-20.00.log.html |
10:05 |
|
clrh |
#link http://wiki.koha-community.org[…]witch_to_Solr_RFC |
10:05 |
|
hdl |
Since there are no questions then we will skip to next topic : what we did and are up to. |
10:05 |
|
hdl |
no questions one |
10:05 |
|
clrh |
#link http://www.biblibre.com/en/blo[…]lopments-for-koha |
10:05 |
|
hdl |
two |
10:05 |
|
hdl |
three |
10:05 |
|
hdl |
#topic what is done |
10:05 |
|
|
Topic for #koha is now what is done (Meeting topic: solr meeting) |
10:06 |
|
|
juan_xerc joined #koha |
10:06 |
|
tajoli |
Zeno Tajoli, CILEA |
10:06 |
|
thd |
Does BibLibre intend to do work towards refactoring existing Koha record indexing and retrieval for use alongside Solr/Lucene and not as an either/or option? |
10:06 |
|
magnus |
i guess my main concern is support for Koha acting as Z39.50 and SRU server - what is the status on that and Solr? |
10:06 |
|
hdl |
getting back then. |
10:07 |
|
hdl |
our work now is not an either or option... |
10:07 |
|
hdl |
Because of time and resources. |
10:07 |
|
|
juan_xerc left #koha |
10:07 |
|
hdl |
But bricks that we used are flexible. |
10:07 |
|
hdl |
And I think that we could wrap ZOOM in them |
10:08 |
|
hdl |
That would be excellent not only for Koha... But also for Data::SearchEngine. |
10:08 |
|
thd |
magnus: #link http://git.biblibre.com/?p=koh[…]fs/heads/wip/solr misc/z3950.pl shows BibLibre work on a Simple2ZOOM gateway. |
10:08 |
|
hdl |
And we would be grateful if the community would help us in achieving that. |
10:09 |
|
clrh |
#link https://github.com/eiro/rg-z3950-rpn/ wip about rpn grammar and z3950 server development |
10:09 |
|
|
juan_xerc joined #koha |
10:09 |
|
hdl |
#help on building Data::SearchEngine::ZOOM and Data::SearchEngine::Query::PQF via ZOOM wrappers |
10:09 |
|
clrh |
we have to match both together and translate in solr requests |
10:10 |
|
hdl |
magnus: we are doing some work on that... And will use 17 Mo of real use cases z3950 RPN queries to validate. |
10:11 |
|
hdl |
(well not all of those are interesting... But we plan to take out the most relevant ones) |
10:11 |
|
hdl |
What we achieved |
10:11 |
|
hdl |
- advanced search is now working pretty well. |
10:12 |
|
hdl |
- Configuration of indexes via database is now ok. |
10:12 |
|
hdl |
- display items information is now OK. |
10:12 |
|
hdl |
as you can see and test on solr.biblibre.com |
10:12 |
|
irmaB |
wow that is impressive |
10:12 |
|
hdl |
or catalogue.solr.biblibre.com |
10:12 |
|
thd |
hdl: 17 mo? |
10:13 |
|
hdl |
mega bytes. |
10:13 |
|
tajoli |
I have a suggestion about Zebra in Koha |
10:13 |
|
|
Oak left #koha |
10:13 |
|
hdl |
listening. |
10:13 |
|
tajoli |
One of the problem that you report is about Facets |
10:14 |
|
tajoli |
why not insert Pazpar2 as mandatory and use its facets ? |
10:14 |
|
tajoli |
Ok is a dead road ? |
10:15 |
|
hdl |
pazpar2 could have been chosen... We chose solr because the community is much more active. |
10:15 |
|
thd |
tajoli: Pazpar2 requires CCL support which BibLibre have removed in their implementation. |
10:15 |
|
fredericd |
tajoli: if you want accurate facets with Pazpar2, you need to send to it the whole resultset |
10:15 |
|
fredericd |
it's not a solution |
10:15 |
|
hdl |
And CCL is not a standard... It needs configuration |
10:16 |
|
hdl |
and all the setups are a little bit different. |
10:16 |
|
josepedro |
that's true. we have proved it. |
10:16 |
|
thd |
tajoli: #link http://bugzilla.indexdata.dk/show_bug.cgi?id=2048 which is a bug for no facets with the ICU in Zebra which Index Data will only fix with a support contract. |
10:16 |
|
munin |
Bug 2048: blocker, PATCH-Sent, ---, gmcharlt, CLOSED FIXED, Kohazebraqueue daemon MAJOR issue |
10:16 |
|
hdl |
fredericd: miguelxer and josepedro can you preset your selves for the records ? |
10:17 |
|
hdl |
What we have problems and are willing to do ... |
10:18 |
|
hdl |
Z3950 support on top of solr. |
10:18 |
|
hdl |
clrh: mentioned that we worked on the Z3950 grammar to get the whole of it (at least what is presented on the Indexdata website, which was the only resource we got that from) |
10:19 |
|
hdl |
#action Z3950 support on top of solr |
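(To make the shape of that action item concrete: a Z39.50 front end in Perl is normally built on Net::Z3950::SimpleServer, whose search handler receives the client query, including the parsed RPN tree, and reports a hit count, while the fetch handler returns records one at a time. The outline below is only a sketch of that wiring; the Solr translation and record retrieval are stubs, and none of it is BibLibre's actual code.)

    use strict;
    use warnings;
    use Net::Z3950::SimpleServer;

    sub search_handler {
        my $args = shift;
        my $rpn  = $args->{RPN};       # parsed RPN query tree from the client
        # TODO: walk the RPN tree, build the equivalent Solr query, run it.
        $args->{HITS} = 0;             # report the result-set size back
    }

    sub fetch_handler {
        my $args = shift;
        my $offset = $args->{OFFSET};  # 1-based position in the result set
        # TODO: pull the matching record back out of Solr / the Koha database.
        $args->{RECORD} = '<record/>'; # placeholder record
    }

    my $server = Net::Z3950::SimpleServer->new(
        SEARCH => \&search_handler,
        FETCH  => \&fetch_handler,
    );
    $server->launch_server( 'z3950-solr-gateway', @ARGV );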
10:19 |
|
josepedro |
we started with pazpar but we thought that is not the best solution |
10:19 |
|
irmaB |
josepedro was that with Koha 3? |
10:19 |
|
hdl |
#action improving the indexing time. |
10:20 |
|
josepedro |
yes |
10:20 |
|
hdl |
indexing speed is still quite slow compared to zebra... |
10:20 |
|
hdl |
But we are working on two ideas. |
10:20 |
|
clrh |
#link http://www.indexdata.com/yaz/doc/tools.html |
10:21 |
|
hdl |
- DIH Marc |
10:21 |
|
clrh |
#link http://lucene.472066.n3.nabble[…]DIH-td504691.html |
10:21 |
|
hdl |
(posted from erik hatcher...) |
10:21 |
|
magnus |
and SRU? |
10:22 |
|
josepedro |
since Biblibre started with solr, we started to review their code comparing with Vufind |
10:22 |
|
hdl |
magnus: I guess and hope that SimpleServer will also cope with SRU |
10:22 |
|
magnus |
ok |
10:22 |
|
hdl |
- forking and sending parallel batches to index to solr. |
10:23 |
|
clrh |
#link http://www.nntp.perl.org/group[…]0/12/msg2836.html about another idea for improving => multi-threaded call |
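(A reference point for the indexing-speed discussion, illustrative only and not the wip/solr code: with a client like WebService::Solr most of the cost is the per-document HTTP round trip, so documents are usually sent in batches and committed once at the end. The URL and field names are assumptions.)

    use strict;
    use warnings;
    use WebService::Solr;
    use WebService::Solr::Document;

    my $solr = WebService::Solr->new('http://localhost:8983/solr');   # assumed core URL

    my @batch;
    for my $biblionumber ( 1 .. 1000 ) {
        # In a real indexer these fields would come from the MARC record.
        push @batch, WebService::Solr::Document->new(
            id    => $biblionumber,
            title => "record $biblionumber",
        );
        if ( @batch == 100 ) {      # send in chunks rather than one by one
            $solr->add( \@batch );
            @batch = ();
        }
    }
    $solr->add( \@batch ) if @batch;
    $solr->commit;                  # a single commit at the end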
10:23 |
|
thd |
magnus: SimpleServer will map CQL to PQF for SRU. |
10:24 |
|
josepedro |
we are reviewing the sru/srw libs, too |
10:24 |
|
hdl |
josepedro: what have you found ? |
10:24 |
|
josepedro |
libs for dspace |
10:24 |
|
hdl |
josepedro: what have you figured out from your audits ? |
10:25 |
|
hdl |
and code reviews ? |
10:25 |
|
josepedro |
we think that the implementation is not going to be very difficult with time enough. |
10:26 |
|
hdl |
Do you have a plan or could devote people to work with us ? |
10:26 |
|
josepedro |
anyway, our main aim is facets solution |
10:27 |
|
josepedro |
yes, we would like to collaborate with you |
10:28 |
|
hdl |
josepedro: it looks like we have a nice solution... With true facets. It needs some more work on thorough configuration of indexes. |
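(For readers who have not used Solr facets: the counts come back with the result set of a single query, which is what makes them cheap compared to the Zebra scan approach. An illustrative request with WebService::Solr; the field name "itemtype" and the URL are assumptions.)

    use strict;
    use warnings;
    use Data::Dumper;
    use WebService::Solr;

    my $solr = WebService::Solr->new('http://localhost:8983/solr');

    # One request returns both the hits and the per-value counts for the facet field.
    my $response = $solr->search( 'history',
        { facet => 'true', 'facet.field' => 'itemtype', 'facet.mincount' => 1 } );

    printf "hits: %d\n", $response->content->{response}{numFound};
    print Dumper( $response->content->{facet_counts}{facet_fields} );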
10:28 |
|
hdl |
and adapting the solr configuration for MARC21 ( |
10:28 |
|
hdl |
we based our work on UNIMARC) |
10:29 |
|
josepedro |
have you seen something about printing solr records directly?? |
10:29 |
|
josepedro |
without looking the database?? |
10:29 |
|
hdl |
Plugin system we implemented for solr (that could be also used for zebra) is quite handy |
10:30 |
|
hdl |
josepedro: I donot understand. |
10:31 |
|
hdl |
We are using records. getting them from koha and processing information to send to solr. |
10:31 |
|
thd |
hdl: Where did you find a Solr/Lucene configuration for MARC 21? |
10:31 |
|
tajoli |
Well, as I have understood, the main problems are Zebra + ICU and facets. |
10:31 |
|
tajoli |
For me setup Zebra is not a problem |
10:32 |
|
josepedro |
when you get the solr records, you search them in the database instead of printing them directly |
10:32 |
|
tajoli |
and realtime indexing work (with a daily reboot) |
10:32 |
|
thd |
tajoli: Zebra has a few problems but we should be able to have both Zebra and Solr/Lucene together. |
10:33 |
|
tajoli |
But I understand that Zebra + ICU is mandatory with more than one charset (like in France). |
10:33 |
|
hdl |
whatever quite big library you are. |
10:33 |
|
thd |
tajoli: That is exactly how BibLibre came to their problem. |
10:34 |
|
hdl |
even small have some special books |
10:34 |
|
hdl |
in Hebrew, arabic... and Georgian.. |
10:34 |
|
thd |
tajoli: As hdl states big libraries need full Unicode support. |
10:34 |
|
magnus |
so what is the real question here? no one seems opposed to solr as such, but there are some good reasons for keeping zebra around too. as long as solr is introduced as an option along side zebra everyone is happy, right? |
10:34 |
|
hdl |
This is how we ... and the whole community came into that problem. |
10:34 |
|
tajoli |
And Zebra + ICU doesn't work |
10:34 |
|
thd |
tajoli: Zebra is fixable but with support money. |
10:35 |
|
hdl |
tajoli: koha 3.0 was claimed to support full utf8 search |
10:35 |
|
hdl |
magnus: yes. Sure. |
10:35 |
|
hdl |
our main problem is that we have limited ressources. |
10:36 |
|
thd |
Would BibLibre not at least consider abstracting the search calls so that others attempting to reintroduce Zebra, Pazpar2, etc. support on top of BibLibre's Solr/Lucene work would not entail rewriting BibLibre Solr/Lucene work for better abstraction? |
10:36 |
|
hdl |
We are willing to share ideas and development. |
10:36 |
|
tajoli |
Well, what I want to say is that CILEA can TRY to help biblibre to develop an abstract search call interface |
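(One way such an abstract search call interface is commonly shaped in Perl, sketched here purely as an illustration -- Koha::SearchEngine and the backend class names are hypothetical, not existing code: a thin factory reads the configured backend and hands callers a single interface, so scripts never mention Zebra or Solr directly.)

    package Koha::SearchEngine;    # hypothetical module, for illustration only
    use strict;
    use warnings;
    use Module::Load;

    sub new {
        my ( $class, %args ) = @_;
        my $backend = $args{backend} || 'Zebra';         # 'Zebra' or 'Solr'
        my $impl    = "Koha::SearchEngine::$backend";    # hypothetical backend classes
        load $impl;                                      # require the chosen class
        return $impl->new(%args);                        # same search interface either way
    }

    1;

    # Callers would then be backend-agnostic:
    #   my $searcher = Koha::SearchEngine->new( backend => $syspref_value );
    #   my $results  = $searcher->search($query);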
10:36 |
|
reed |
there's a separate project here and I think that's 'make koha support various search back ends' |
10:36 |
|
tajoli |
sorry Biblibre |
10:37 |
|
hdl |
Well reed it is not much separate. I think it could be built on top of what we did. |
10:37 |
|
tajoli |
But with pointing that Zebra continues to have the problems: |
10:37 |
|
magnus |
hdl: we all have limited resources |
10:38 |
|
tajoli |
-- diffcult indexes setup |
10:38 |
|
tajoli |
-- no ICU |
10:38 |
|
tajoli |
-- bad facets |
10:38 |
|
tajoli |
but |
10:38 |
|
tajoli |
Less RAM to use |
10:38 |
|
tajoli |
For us this is the key point |
10:39 |
|
reed |
hdl, good to hear -- but still is a thing that sounds like it needs additional attention |
10:39 |
|
clrh |
I have to see how much ram is needed with solr, didn't test anymore |
10:39 |
|
tajoli |
We can't ask to improve RAM requests |
10:39 |
|
hdl |
big CPU consumption |
10:39 |
|
tajoli |
big CPU consumption with solr ? |
10:39 |
|
hdl |
big CPU consumption is for zebra. |
10:39 |
|
tajoli |
are you sure ? |
10:40 |
|
hdl |
absolutely. |
10:40 |
|
thd |
tajoli: Zebra certainly has ICU but it does not work for scan queries for facets nor truncation other than right truncation. |
10:40 |
|
tajoli |
I don't see it. |
10:40 |
|
hdl |
If you read some logs about performance improvements, mason pointed |
10:40 |
|
hdl |
that you needed to set zebra on a different machine. |
10:40 |
|
tajoli |
Ok, thank you. |
10:41 |
|
hdl |
reed: I always said that we would like to build that. |
10:41 |
|
hdl |
reed: but we cannot do that alone. |
10:41 |
|
hdl |
Refactoring the C4::Search was a priority for us. |
10:41 |
|
reed |
solr ram reqs will vary depending on updates and number of indexes and searches and catalog size and it'll be a few years before we have some stable config advice |
10:41 |
|
hdl |
We chose the best bricks for that. |
10:41 |
|
reed |
(I might be exaggerating the case a little) |
10:41 |
|
tajoli |
With Solr as option I don't suggest to use Zebra with ICU |
10:42 |
|
hdl |
tajoli: said he would help us in making solr an option... any other persons , |
10:42 |
|
hdl |
? |
10:43 |
|
thd |
hdl: Do you understand the problem that anyone starting from your work on C4:Search to add other non-Solr/Lucene options would need to rewrite all search calls to keep your Solr/Lucene work? |
10:43 |
|
magnus |
i do not like the sound of "others attempting to reintroduce Zebra, Pazpar2, etc. support on top of BibLibre's Solr/Lucene work" - sure biblibre has put a lot into this, but why should "others" have to re-implement something that is working (although not perfectly) today? |
10:43 |
|
thd |
I should have s/on top/along side/ |
10:44 |
|
hdl |
magnus: C4::Search refactoring was not set by BibLibre. |
10:44 |
|
thd |
hdl? |
10:44 |
|
ibot |
well, hdl is in France. France is in a galaxy far, far away. |
10:44 |
|
hdl |
and is a point in 3.4 |
10:44 |
|
hdl |
forget hdl |
10:44 |
|
ibot |
hdl: I forgot hdl |
10:44 |
|
thd |
ahhh |
10:45 |
|
hdl |
magnus: we worked on that... And for solr integration... because it fixed many problems at once... |
10:45 |
|
hdl |
And would allow better end user experience. |
10:45 |
|
thd |
hdl: However, the particular implementation of refactoring C4::Search is BibLibre's work. |
10:45 |
|
tajoli |
I confirm, I will TRY to help BibLibre to have Solr and Zebra as index tools in Koha. |
10:46 |
|
hdl |
We are willing to add advanced search customization from administration. |
10:46 |
|
thd |
hdl: Substituting one record indexing system for another is not refactoring as such. |
10:46 |
|
tajoli |
But not in the same install, as an option to select in install |
10:47 |
|
tajoli |
clearly with Zebra, none of those options: |
10:47 |
|
reed |
tajoli, yeah, that sounds sensible |
10:47 |
|
tajoli |
-- No ICU |
10:47 |
|
tajoli |
-- No advanced search customization from administration |
10:47 |
|
tajoli |
-- No improvement on facets |
10:48 |
|
hdl |
thd: show me any other code that works as much what we did. And I will be happy |
10:48 |
|
tajoli |
No indexes from administration |
10:48 |
|
tajoli |
No checks on data |
10:48 |
|
hdl |
#action work in pairs with CILEA for zebra as an option implementation |
10:49 |
|
thd |
hdl: I am trying to understand your last post. |
10:49 |
|
tajoli |
Etc. |
10:49 |
|
tajoli |
Zebra as is today |
10:49 |
|
irmaB |
and proprietary LMS already offer (or say they do) facet searching and truncation etc. |
10:50 |
|
hdl |
and many are using solr internally |
10:50 |
|
* slef |
= MJ Ray, worker-owner of software.coop |
10:51 |
|
hdl |
#help new ideas for plugins to add so that the indexing could be better. |
10:51 |
|
irmaB |
hi MJ |
10:51 |
|
irmaB |
late ... |
10:51 |
|
irmaB |
but here :) |
10:51 |
|
slef |
dentist, unavoidable |
10:52 |
|
thd |
hdl: I think I have understood your post about comparable code but stating that other work is inadequate should not be a basis for not attempting to develop a better model than other work. |
10:52 |
|
slef |
#welovethenhs but it does mean I'm reluctant to waste public money by moving appointments |
10:52 |
|
hdl |
josepedro: miguelxer we would appreciate your feedback |
10:52 |
|
hdl |
on the review. |
10:53 |
|
thd |
hdl: I do not question that much of the best work in Koha is work from BibLibre and Paul Poulain's business previously. |
10:53 |
|
hdl |
Can I add an action from xercode as of code review on what we did ? |
10:54 |
|
hdl |
thd: it is not a question of comparison. |
10:54 |
|
magnus |
hdl: no one is denying that biblibre is doing good work here - it just seems odd to me that one of the biggest companies should introduce new "things" that break old "things" that a lot of people still want... |
10:54 |
|
hdl |
There is no other code to compare. |
10:54 |
|
hdl |
magnus: we donot want to break... |
10:55 |
|
magnus |
good :-) |
10:55 |
|
hdl |
But to build on safer ground. |
10:55 |
|
josepedro |
we do not understand |
10:55 |
|
thd |
hdl: safer ground? |
10:55 |
|
hdl |
We would like to have your feedback from the code review you did. |
10:56 |
|
thd |
hdl:I have been typing furiously on that since last night in addition to other days. |
10:56 |
|
hdl |
thd: more abstract bricks so that it is more flexible. |
10:56 |
|
hdl |
Have any thing that worked before working with the new system. |
10:57 |
|
hdl |
And then... use that abstraction to reintroduce options |
10:58 |
|
thd |
hdl: Yet, not everything that worked before would work with the new system otherwise we would merely be busy praising your effort without these qualifications. |
10:58 |
|
hdl |
We are gathering use cases of searches that worked in koha previously. |
10:58 |
|
thd |
hdl: Do not mistake that I do praise BibLibre's work. |
10:58 |
|
clrh |
#link https://spreadsheets.google.co[…]hl=en&output=html |
10:59 |
|
hdl |
#help please try and contribute yours |
10:59 |
|
slef |
but not use cases like "run in less than a gig" which are also a vital concern |
10:59 |
|
hdl |
are you having servers with less than one gig ? |
10:59 |
|
josepedro |
we consider it a great job but we think that there are several points that need to review deeply. |
10:59 |
|
slef |
yes, lots of our libraries have sub-gig servers |
10:59 |
|
thd |
hdl: One thing which would have worked on Koha using Pazpar2 which needs CCL is metasearch. |
10:59 |
|
josepedro |
for example, the last i commented you |
11:00 |
|
clrh |
ok josepedro can you send us a mail with your points? |
11:00 |
|
thd |
hdl: What support do you envision for metasearch? |
11:00 |
|
hdl |
solr has internal support for metasearch. |
11:00 |
|
thd |
hdl: To Z39.50 servers? |
11:01 |
|
slef |
hdl: The co-op may be unusual in that we support almost as many self-hosted libraries as shared-hosted ones, but I would expect a lot of independent libraries to be worried by the increased resource demands of solr too. |
11:01 |
|
thd |
hdl: Solr/Lucene is not the API for library databases while Z39.50/SRU is. |
11:02 |
|
* magnus |
agrees with slef |
11:02 |
|
josepedro |
yes, no problem. But we have already sent you something about this. |
11:02 |
|
hdl |
slef: can you then be accurate in your demands ? |
11:02 |
|
hdl |
Those libraries surely donot have 300000 records. |
11:02 |
|
|
laurence left #koha |
11:02 |
|
hdl |
And it would be quite nonsensical to pretend that koha3.0 works in that context. |
11:03 |
|
slef |
hdl: reportedly (see link I added to RFC), solr defaults to a 1Gb RAM usage. |
11:03 |
|
thd |
hdl: Do you find database size limitations for Zebra? |
11:04 |
|
hdl |
#action josepedro send a mail with code reviews. |
11:04 |
|
hdl |
again, it is still work in progress. |
11:04 |
|
hdl |
We can help. |
11:04 |
|
hdl |
We are willing to recieve help. |
11:04 |
|
reed |
slef, hdl - solr is just not going to be viable in small installations, it can't become a requirement for using koha |
11:04 |
|
hdl |
Even comments. |
11:04 |
|
slef |
hdl: I don't know how many records, but I suspect most koha libraries are smaller than that. Solr may be needed for the top 10% of libraries, but we cannot let 10% of libraries increase expense for the 90% unnecessarily, can we? |
11:05 |
|
reed |
snap |
11:05 |
|
thd |
hdl: I think you missed a 0 in 3 million if you intended 3 million. |
11:05 |
|
hdl |
slef: if there is an abstraction layer you will have no problems. |
11:06 |
|
hdl |
thd: no. 300,000 with less than 8Gb is not a viable option. |
11:06 |
|
hdl |
with koha3.2 |
11:06 |
|
slef |
reed++ |
11:06 |
|
magnus |
redd++ |
11:06 |
|
magnus |
reed++ # sorry |
11:07 |
|
hdl |
slef: reed we are willing to help but we cannot do that alone. |
11:07 |
|
thd |
hdl: wow, I have only tested very small record sets. |
11:07 |
|
hdl |
tajoli: proposed to help. |
11:07 |
|
slef |
(I don't know OTTOMH, that is) |
11:07 |
|
tajoli |
yes, I confirm |
11:07 |
|
hdl |
and that is fine. |
11:07 |
|
reed |
hdl, right -- was going to say that I don't think you expect it should be a requirement for koha |
11:07 |
|
hdl |
We will try and help him. |
11:07 |
|
thd |
hdl: Do you have a comparison of RAM requirements in your Solr/Lucene test? |
11:08 |
|
clrh |
nop thd I ll try to provide it |
11:08 |
|
hdl |
#action provide a comparison of RAM requirements between zebra and solr |
11:08 |
|
hdl |
thd: to be honest... it would require to do multiple tests. |
11:09 |
|
thd |
hdl: how multiple? |
11:09 |
|
hdl |
I am sure that crosswalking records in zebra is also RAM demanding. |
11:09 |
|
josepedro |
From our point of view, Koha has 3 big problems: 1- Facets. 2- Abstraction. 3- ModPerl. At present Zebra does not meet our expectations about facets, so we believe SOLR is the best solution and we would like to collaborate with BibLibre to develop this solution. |
11:09 |
|
hdl |
josepedro: Mod Perl Plack will be another meeting. |
11:09 |
|
reed |
re: ram requirements --- my prediction is that solr schema tuning is going to be a very long process and so any profiling done now is likely to go out of date fast |
11:10 |
|
hdl |
josepedro: Please keep contributing... |
11:10 |
|
hdl |
any other questions ? |
11:10 |
|
tajoli |
No |
11:10 |
|
irmaB |
I propose to look at the MARC21 implications with solr - adapting the solr configuration for MARC21 |
11:11 |
|
thd |
reed: Improvements in the sophistication of indexing may actually greatly increase RAM requirements for all options. |
11:11 |
|
reed |
agree |
11:11 |
|
hdl |
#action irmaB build an adaptation of MARC21 for solr |
11:12 |
|
hdl |
#action BibLibre make a solr instance for MARC21 and publicise that to Irma |
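(On the MARC21 side the mapping work is mostly deciding which tags feed which Solr fields; mechanically it is the kind of extraction MARC::Record already does. A minimal sketch -- the tag/subfield choices, field names and file name are assumptions, not an agreed mapping.)

    use strict;
    use warnings;
    use MARC::Batch;

    my $batch = MARC::Batch->new( 'USMARC', 'records.mrc' );   # example export file

    while ( my $record = $batch->next ) {
        # MARC21: 245$a title, 100$a main author, 020$a ISBN.
        my %doc = (
            title  => scalar $record->subfield( '245', 'a' ),
            author => scalar $record->subfield( '100', 'a' ),
            isbn   => scalar $record->subfield( '020', 'a' ),
        );
        # %doc would then become one Solr document, as in the indexing sketch above.
    }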
11:12 |
|
irmaB |
yes. |
11:12 |
|
thd |
hdl: Did you state that you adapted an existing MARC 21 Solr/Lucene schema to UNIMARC? |
11:13 |
|
hdl |
No. |
11:13 |
|
tajoli |
Vufind has a MARC21 setup for Solrs |
11:13 |
|
hdl |
tajoli: with solrmarc. |
11:14 |
|
tajoli |
because you don't use solrmarc ? |
11:14 |
|
hdl |
yes... we investigated that. And think that we could build bridges. |
11:14 |
|
hdl |
No. |
11:14 |
|
thd |
hdl: I think that was merely a confusion between your description and a comment next to yours. |
11:14 |
|
hdl |
It was proved not to be that efficient in indexing. |
11:15 |
|
magnus |
hmmm... seems to me the main focus now should be on getting solr and zebra to both be options along side each other - otherwise it sounds like the solr solution will have a hard time becoming part of koha/replacing zebra... |
11:15 |
|
thd |
hdl: How do you return a complete record from Solr/Lucene? |
11:15 |
|
hdl |
And would have required too much time.... and would not have enabled ppl with the flexibility we wanted to provide them. |
11:16 |
|
clrh |
thd, you can look in Search::IndexRecord |
11:16 |
|
tajoli |
Clearly it is not a task for 3.4 (april 2011) |
11:16 |
|
hdl |
magnus: we cannot. This is why we asked for help. |
11:16 |
|
hdl |
tajoli: agreed.. |
11:16 |
|
tajoli |
But for 3.6 |
11:16 |
|
clrh |
a record is constructed before indexing |
11:16 |
|
clrh |
s/before/during |
11:16 |
|
hdl |
But anyway, it is work on progress. |
11:17 |
|
hdl |
and if all the RFCs cannot be integrated into 3.4 RM is fine with that. |
11:17 |
|
thd |
clrh: I had looked at addbiblio.pl |
11:18 |
|
slef |
hdl: it is difficult to persuade 90% of libraries that they should fund something to enable support for the biggest 10%, and I mentioned last meeting that I don't think our members will pay from the co-op's community donation fund. You see our difficulty here? |
11:18 |
|
hdl |
slef: our work was only very little funded. |
11:18 |
|
hdl |
And we did that. |
11:19 |
|
magnus |
hdl: which is cool, but it seems there are lots of fun things to do with solr that may not be so important compared to getting solr into koha in the first place (which seems to imply making zebra and solr work as options next to each other) |
11:19 |
|
thd |
hdl: Do you not think you should have tried to obtain more funding for a development with such a large scope? |
11:20 |
|
tajoli |
well, for me the main bonus of Solrs is to replace Zebra +ICU |
11:20 |
|
hdl |
thd: noone would have ever funded refactoring. |
11:20 |
|
hdl |
you donot want to redo things. |
11:20 |
|
hdl |
libraries want features. |
11:20 |
|
slef |
hdl: that is your decision. Maybe this is easier for a private company. I just explain the difficulty of our membership organisation in the hope you will comprehend it. |
11:20 |
|
tajoli |
Is clear that Zebra+ICU doesn't work. |
11:20 |
|
hdl |
I explain ours. |
11:21 |
|
slef |
hdl: that's not true. The co-op funded a lot of SQL-injection/placeholder cleanup way back when. |
11:21 |
|
thd |
tajoli: It works mostly but with important exceptions. |
11:21 |
|
tajoli |
So every library with a complex charset (like Arab+latin) can't use koha in good way now |
11:21 |
|
slef |
I think someone now is funding template refactoring (sorry I forget who). |
11:21 |
|
thd |
tajoli: Yes that is correct. |
11:21 |
|
tajoli |
Koha in fact is used in a single charset environment |
11:22 |
|
tajoli |
The English-speaking countries and countries like Italy with a Latin charset only |
11:22 |
|
hdl |
ok. |
11:22 |
|
tajoli |
And in this situation Zebra is good |
11:22 |
|
hdl |
ok. |
11:23 |
|
hdl |
I propose to stop the meeting now. |
11:23 |
|
tajoli |
But if you need two charset, problems arise |
11:23 |
|
hdl |
And if you have other questions, or concerns or feed back on test instances, let us know. |
11:23 |
|
hdl |
on list please |
11:24 |
|
tajoli |
bye |
11:24 |
|
|
tajoli left #koha |
11:24 |
|
slef |
catalyst are funding template refactoring |
11:26 |
|
hdl |
#endmeeting |
11:26 |
|
|
Topic for #koha is now Welcome to #koha - www.koha-community.org. Koha 3.2.1 is now available. Next general meeting on 5 January 2011 |
11:26 |
|
munin |
Meeting ended Wed Dec 15 11:28:19 2010 UTC. Information about MeetBot at http://wiki.debian.org/MeetBot . (v 0.1.4) |
11:26 |
|
munin |
Minutes: http://librarypolice.com/koha-[…]-12-15-10.00.html |
11:26 |
|
munin |
Minutes (text): http://librarypolice.com/koha-[…]0-12-15-10.00.txt |
11:26 |
|
munin |
Log: http://librarypolice.com/koha-[…]15-10.00.log.html |
11:26 |
|
clrh |
Thanks all! Happy to see more interactions today :) |
11:26 |
|
magnus |
yeah, interesting meeting! |
11:27 |
|
hdl |
slef: we all contribute. There are many places to work on. |
11:27 |
|
hdl |
We will have work on Acquisitions and serials. |
11:28 |
|
hdl |
But rather than letting ppl alone... working in small groups of interest could make better work in the end. |
11:28 |
|
thd |
hdl: I have a concern that if we take the approach that refactoring will never be funded too much new development will be at the expense of old features. |
11:28 |
|
hdl |
thd: this has already been the case... And will be the case unless we have unit TESTS. for all. |
11:29 |
|
hdl |
every single bug every single feature should have one. |
11:29 |
|
thd |
hdl: I know that it has already been the case and that BibLibre have been great victims of the problem as well as perpetrators. |
11:29 |
|
hdl |
But it costs... And Libraries have not been used to pay much on Koha developments. only the minimum |
11:30 |
|
irmaB |
G'night all and thanks hdl for the good meeting. |
11:30 |
|
hdl |
thx irmaB |
11:30 |
|
magnus |
i guess getting funding for writing tests is about as easy as getting funding for refactoring... |
11:30 |
|
|
irmaB left #koha |
11:31 |
|
hdl |
magnus: you are right. |
11:31 |
|
thd |
hdl: We need a better scheme for distributing costs where more than one library contributes to payments. |
11:31 |
|
hdl |
This is why we try and make those refactorings part of new features. |
11:31 |
|
slef |
Would someone review a patch branch for me, please? |
11:31 |
|
hdl |
But it needs planning and sharing. |
11:31 |
|
hdl |
slef: which one ? |
11:32 |
|
thd |
hdl: I still think that we have a different conception of refactoring. |
11:33 |
|
magnus |
planning_and_sharing++ |
11:33 |
|
thd |
hdl: Refactoring is not rewriting it is abstracting. |
11:33 |
|
slef |
hdl: thanks but I can't find the bloody one for HEAD. Only for 3.0. Interested? |
11:33 |
|
hdl |
Sure ;) |
11:33 |
|
hdl |
Will try and make it in for 3.0.x |
11:34 |
|
slef |
gitgitorious.org:koha/mainline.git for-3.0/bug_5394 |
11:34 |
|
hdl |
If you find it for HEAD then let me know. |
11:34 |
|
slef |
once I free some disk space, I'll port it to HEAD |
11:35 |
|
BobB |
Late here now. I'm off. Good night. |
11:35 |
|
|
BobB left #koha |
11:36 |
|
reed |
g'nite all |
11:36 |
|
|
reed left #koha |
11:37 |
|
magnus |
good night reed |
11:44 |
|
magnus |
lunchtime! |
11:44 |
|
|
magnus is now known as magnus_a |
11:51 |
|
thd |
hdl: Is 300,000 records the largest BibLibre library? |
11:54 |
|
|
tcohen joined #koha |
11:54 |
|
|
druthb joined #koha |
11:54 |
|
|
ivanc left #koha |
11:55 |
|
|
ivanc joined #koha |
11:59 |
|
|
miguelxer left #koha |
12:00 |
|
|
Joubu left #koha |
12:08 |
|
slef |
thd: might be commercially-sensitive info. |
12:10 |
|
|
juan_xerc left #koha |
12:11 |
|
thd |
slef: It should not be in the sense in which I am asking. |
12:13 |
|
thd |
slef: If 300,000 records needs 8GB for Zebra that must not be a base number of records per GB or some services which use Zebra would be unaffordable to create. |
12:14 |
|
thd |
slef: Think of Biblios.net with millions of records indexed in Zebra. |
12:17 |
|
kf |
thd: I think the 8gb are disk space not ram |
12:17 |
|
kf |
we once made a bigger database for a demo, I think 250.000? |
12:18 |
|
kf |
the indexes get pretty big, but we have a lot of indexes. |
12:18 |
|
thd |
kf: hdl and I were having multiple linguistic confusions during the meeting. |
12:18 |
|
kf |
I just read back |
12:18 |
|
|
alohalog_ joined #koha |
12:19 |
|
kf |
I can not confirm zebra + icu does not work at all |
12:19 |
|
kf |
we are using it |
12:19 |
|
kf |
with hebrew |
12:19 |
|
kf |
and german umlauts, french diacritics |
12:19 |
|
|
alohalog left #koha |
12:19 |
|
thd |
kf: I tried to qualify that statement correctly every time it was raised. |
12:19 |
|
kf |
the truncation and scan problems, yes. you have to live without some of the search features, but not sure nothing can be done about that. |
12:20 |
|
kf |
the problem I see is, that these problems are not really discusses |
12:20 |
|
kf |
d |
12:20 |
|
thd |
kf: Money for a support contract with Index Data could fix those problems. |
12:20 |
|
kf |
it's not like the community decided to go to solr - it's biblibre's decision. and the decision process is missing. |
12:22 |
|
kf |
and although I would love to have those solr features, I think we need to have an option for those real small libraries... and it will need time to prove a better solution. |
12:22 |
|
thd |
kf: Solr/Lucene is a good choice. The implementation is merely problematic for adding the feature along side other features. |
12:22 |
|
kf |
thd: perhaps that could have been funded by the community, lots of people, smaller amounts |
12:23 |
|
kf |
thd: I am not against it, but as I stated previously - it should be an option - at least at first |
12:23 |
|
|
nengard joined #koha |
12:23 |
|
kf |
and zebra is very fast in indexing |
12:23 |
|
thd |
kf: Many things can be funded with many people and small amounts. |
12:23 |
|
kf |
yeah, but it has to be discussed before development starts to make that work |
12:24 |
|
nengard |
thd i like that quote - we need to add that to the quotes |
12:24 |
|
nengard |
which i have no idea how to do |
12:24 |
|
thd |
nengard: neither do I. |
12:24 |
|
kf |
I have |
12:24 |
|
kf |
... @quote add ... |
12:25 |
|
kf |
want to try nengard? |
12:25 |
|
nengard |
k |
12:25 |
|
nengard |
@quote add thd> Many things can be funded with many people and small amounts. |
12:25 |
|
munin |
nengard: Error: You must be registered to use this command. If you are already registered, you must either identify (using the identify command) or add a hostmask matching your current hostmask (using the "hostmask add" command). |
12:25 |
|
nengard |
hmm |
12:26 |
|
kf |
@hostmask add |
12:26 |
|
munin |
kf: The operation succeeded. |
12:26 |
|
kf |
huh |
12:26 |
|
slef |
@quote add thd> Many things can be funded with many people and small amounts. |
12:26 |
|
munin |
slef: Error: You must be registered to use this command. If you are already registered, you must either identify (using the identify command) or add a hostmask matching your current hostmask (using the "hostmask add" command). |
12:26 |
|
druthb |
@quote add <thd> kf: Many things can be funded with many people and small amounts. |
12:26 |
|
munin |
druthb: The operation succeeded. Quote #111 added. |
12:26 |
|
slef |
munin: grrr |
12:26 |
|
munin |
slef: Error: "grrr" is not a valid command. |
12:26 |
|
kf |
ah no |
12:26 |
|
kf |
that was not my quote |
12:26 |
|
kf |
it was thd |
12:26 |
|
|
bigbrovar left #koha |
12:26 |
|
kf |
ah... he said it to me... right |
12:26 |
|
* kf |
confused |
12:26 |
|
druthb |
:) |
12:26 |
|
slef |
kf #fail |
12:26 |
|
kf |
:( |
12:27 |
|
slef |
but still infinitely better than Initial Citylink |
12:27 |
|
kf |
? |
12:27 |
|
slef |
kf: http://identi.ca/mjray |
12:28 |
|
slef |
I've been having a few problems with a courier. |
12:29 |
|
kf |
but why kf #fail? |
12:31 |
|
slef |
confusion |
12:31 |
|
nengard |
data import question "Can a record import to Koha with no values in the three required fixed fields?" |
12:31 |
|
slef |
sorry, didn't mean anything by it |
12:31 |
|
slef |
nengard: not from tools last I tried. Would expect bulkmarcimport to barf too. |
12:31 |
|
nengard |
that's what I thought |
12:31 |
|
nengard |
thanks for confirming |
12:32 |
|
slef |
nengard: might be able to import from Z39.50... I don't remember. |
12:32 |
|
nengard |
what about using the command line? |
12:32 |
|
slef |
want me to check? |
12:32 |
|
nengard |
yes please :) |
12:32 |
|
nengard |
and thank you |
12:34 |
|
thd |
kf: I think that if there had been better communication by BibLibre with the rest of the Koha community about problems with Zebra ICU support etc. they would not have been in such a rush to add Solr/Lucene without preserving Zebra. |
12:34 |
|
kf |
yes, I agree |
12:36 |
|
thd |
s/etc./etc. we would have formed a plan to fund fixing the bugs and/ |
12:37 |
|
slef |
nengard: looks like it depends on C4::AddBiblio which depends on the biblioitems columns being NOT NULL or some similar restriction. Checking that on a live server |
12:38 |
|
nengard |
this is so silly ... libraries used to delete the fixed fields to save space cause they were being charged too much, this is the second library I've had tell me this - so when you go to import their old records they don't work - stupid proprietary nonsense! |
12:39 |
|
slef |
nengard: erm. Ow. Which fields do you think are required? |
12:39 |
|
nengard |
000, 007, 008 |
12:39 |
|
nengard |
or one of the three |
12:39 |
|
nengard |
these are the ones that are missing |
12:39 |
|
thd |
LibLime once paid for a support contract for Zebra with Index Data but such burdens should not fall on one company alone. |
12:40 |
|
thd |
nengard: The original MARC Koha clobbered those fields as well. |
12:40 |
|
slef |
nengard: oh, it'll also depend if MARC::Batch can read them without those! |
12:41 |
|
thd |
nengard: kados wrote a script for NPL to reconstruct at least 000 from other data in the records. |
12:43 |
|
nengard |
and is that script available in the wild? |
12:43 |
|
|
jwagner joined #koha |
12:48 |
|
thd |
nengard: Perhaps NPL has a copy but it may be lost on some LibLime servers. |
12:48 |
|
nengard |
figures |
12:48 |
|
nengard |
okey dokey - at least we've confirmed some details |
12:48 |
|
thd |
nengard: kados had told me he spent two weeks working on the script. |
12:49 |
|
kf |
we normally use some default values if we can't make real fields |
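(kf's "default values" approach can be done in a pre-processing pass before staging the file, assuming the records still parse at all -- slef's caveat above. A sketch with MARC::Record; the leader here is a generic book placeholder, and a similar placeholder 008 could be added with MARC::Field->new('008', ...).)

    use strict;
    use warnings;
    use MARC::Batch;

    my $batch = MARC::Batch->new( 'USMARC', 'export.mrc' );   # example input file
    open my $out, '>', 'fixed.mrc' or die $!;

    while ( my $record = $batch->next ) {
        # Supply a generic book leader if the old system blanked it out.
        $record->leader('00000nam a2200000 a 4500')
            unless $record->leader && $record->leader =~ /\S/;
        print {$out} $record->as_usmarc;
    }
    close $out;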
12:49 |
|
thd |
nengard: I have a better option for you. |
12:49 |
|
kf |
it's nice to have the year in 000 for publication year search in opac |
12:49 |
|
nengard |
thd whatcha got? :) |
12:49 |
|
thd |
nengard: Distributed Z39.50 searches with record matching. |
12:50 |
|
hdl |
thd: We have a library which has 800,000 records. |
12:50 |
|
thd |
hdl: I think that we had another linguistic confusion. |
12:50 |
|
hdl |
thd: and we read that one library in India has 7M records. |
12:50 |
|
thd |
hdl: How much RAM do those libraries need? |
12:53 |
|
hdl |
For 300,000 records, it is not more a RAM problem than a CPU consumption. |
12:53 |
|
hdl |
We are trying to use zebra with no icu... in order to check whether it is icu. |
12:54 |
|
hdl |
kf: Taking decisions... about money or technical decisions... is a problem we raised... and that some said there were no problems with. |
12:55 |
|
thd |
hdl: Do you mean a real time problem for CPU usage during queries or an indexing CPU problem. |
12:55 |
|
hdl |
thd: realtime problems... zebrasrv using one proc for him alone. |
12:56 |
|
hdl |
when you launch multiple concurrent queries... your server goes wild. |
12:56 |
|
thd |
hdl: I probably was not paying attention to Koha and I know that support companies not using the ICU may not have investigated well. |
12:57 |
|
thd |
hdl: That is the worst problem identified thus far. |
12:57 |
|
hdl |
thd: about distributed solution... are you talking about the work that was in an unstable status (despite quite promising) from LibLime ? |
12:58 |
|
thd |
hdl: No I mean my own work in PHP which needs porting to Perl. |
12:58 |
|
hdl |
nice... |
12:59 |
|
hdl |
there is also iirc apache2-pazpar2 or masterkey for that. |
12:59 |
|
thd |
hdl: I keep it a secret because not everyone would think the default configuration to be nice. |
12:59 |
|
slef |
nengard: I'm not happy about the newsletter linking to biblibre.fr which requires one to pass an audio-visual ability test to comment. Could we have newsletter articles hosted on k-c.o instead or is that a ton more work? |
13:00 |
|
nengard |
It would require authors of posts in the newsletter to publish to k-c.org |
13:00 |
|
nengard |
I can say that if your info isn't on the official koha site then I can't include it |
13:01 |
|
nengard |
or I can start publishing entire articles in the newsletter instead of short blurbs (but I fear people won't read them) |
13:01 |
|
nengard |
I'm up for any of these options |
13:01 |
|
slef |
I think it's a bit odd clrh didn't link to the RFC or email thread where some claims in that article are queried, and other drawbacks are mentioned, but that's not an editing/compilation matter. |
13:02 |
|
slef |
nengard: short blurbs are fine. Remind me: do we have feedwordpress installed? |
13:02 |
|
nengard |
I don't know ... wizzyrea would know that |
13:02 |
|
nengard |
not sure I have access to the plugins page ... off to check |
13:02 |
|
slef |
I'll look into it, if you're not against the idea. |
13:02 |
|
nengard |
yeah I don't have plugin access |
13:03 |
|
nengard |
and I don't like the idea of all posts from other Koha sites coming into the official site |
13:03 |
|
nengard |
things get in that don't belong that way |
13:03 |
|
nengard |
if I post something Koha related on the ByWater site that is more ByWater related it shouldn't go to the k-c.org site |
13:03 |
|
slef |
no, not all posts, just selected ones |
13:03 |
|
nengard |
oh cool - if we can do that then a-ok with me :) |
13:04 |
|
slef |
I'm thinking set up feedwordpress to read category feeds like for-koha-newsletter (which I just imagined) into a koha-newsletter-proposals category on k-c.o and set it to manual update |
13:04 |
|
nengard |
that sounds fine with me |
13:05 |
|
nengard |
we might want to put it on the agenda for next meeting so others know the options/plan |
13:05 |
|
slef |
then newsletter time, you'd tell it to update, see what comes in, delete crap, reorder what you want, then maybe just copy-paste the category index into the newsletter, with header/footer. |
13:05 |
|
slef |
yeah, I'll write something up |
13:05 |
|
nengard |
awesome |
13:05 |
|
slef |
@later tell wizzyrea do we have feedwordpress on koha-community.org? |
13:05 |
|
munin |
slef: The operation succeeded. |
13:05 |
|
nengard |
well starting in 2011 it's a bi-monthly newsletter - it's too hard to get enough content from people for monthly |
13:06 |
|
slef |
nengard: Shame, it seems like the newsletter is really popular, which I put down to your great work and persistent promotion. |
13:06 |
|
slef |
nengard: today I got email about it from a librarian before I saw the newsletter myself. |
13:07 |
|
nengard |
well then maybe we need to put that to a vote too at the meeting |
13:07 |
|
slef |
I can appreciate the struggle for monthly, though. It's more stuff that it's hard to get paid for. |
13:07 |
|
nengard |
I'll do it monthly but I don't want to have to keep sending out 4 reminders to get articles |
13:07 |
|
nengard |
it's too much nagging for me :) |
13:07 |
|
|
jcamins_a is now known as jcamins |
13:07 |
|
slef |
want me to schedule them for you? ;-) |
13:08 |
|
hdl |
nengard: the link could be done on the RFC around solr and the progress. iirc we posted that. |
13:08 |
|
hdl |
or use the meetings logs. |
13:09 |
|
nengard |
slef yes |
13:09 |
|
slef |
nengard: You really don't want to know how many emails we send co-op members... one member's email was down for a few months and I reconnected him yesterday and he had 2800 emails. Our daily newsletter is a lot of that. |
13:09 |
|
nengard |
and hdl, actually I link to the RFC too in that newsletter so maybe we didn't need the two posts, but like I said I can't get enough content so I put them both in :) |
13:10 |
|
nengard |
slef hehe - scary stuff :) hehe |
13:11 |
|
hdl |
sorry nengard thanks for your action |
13:11 |
|
slef |
nengard: yes to scheduling? After I write the category idea, shall I base suggested reminders on your emails and send them to you for approval? |
13:11 |
|
nengard |
slef yes to it all |
13:11 |
|
nengard |
reminders and scheduling |
13:11 |
|
nengard |
yes to anything that means less work for me :) |
13:11 |
|
nengard |
yes to anything that means helping nengard do her many jobs :) |
13:12 |
|
slef |
nengard: do I send to your on-list email address? |
13:13 |
|
nengard |
yes please - nengardgmail.com |
13:13 |
|
|
owen joined #koha |
13:13 |
|
slef |
next meeting time anyone off the top of their head? |
13:14 |
|
slef |
Next general meeting on 5 January 2011 |
13:14 |
|
slef |
heh, topic |
13:15 |
|
nengard |
creating a wiki page |
13:15 |
|
slef |
bbi60 |
13:16 |
|
slef |
nengard: and do you have somewhere with a crontab that can send email? I can run it, but then you rely on me stopping/changing if/when you want. |
13:17 |
|
nengard |
someone who knows the time please update: http://wiki.koha-community.org[…]g,_5_January_2010 |
13:17 |
|
nengard |
slef I don't ... |
13:18 |
|
nengard |
I could if needed but I'd probably need some instructions |
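(A rough sketch of the kind of crontab entry slef is offering to manage; the schedule, wording, and list address below are placeholders only, and it assumes a box with a working mail command and local MTA:)

    # m h dom mon dow  command
    # Remind the list to send newsletter articles on the 1st and 15th, at 09:00.
    0 9 1,15 * *  echo "Reminder: Koha newsletter articles are due soon" | mail -s "Koha newsletter reminder" list@example.org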
13:23 |
|
hdl |
nengard: I think it was 9pm? |
13:23 |
|
hdl |
(Paris time) |
13:24 |
|
hdl |
(maybe previous logs could give you the precise time) |
13:26 |
|
hdl |
thd, kf: about communication, I do not think we failed to communicate. We are really willing to do things with the community. We organise meetings. |
13:27 |
|
hdl |
we try to demonstrate. |
13:27 |
|
thd |
hdl: I do not blame you. |
13:28 |
|
thd |
hdl: I think that some problems could have been addressed with the LibLime support contract with Index Data. |
13:28 |
|
kf |
hdl: yes, I see that you make an effort. I was talking about the zebra problems you encountered. |
13:28 |
|
kf |
at the last meeting you said: biblibre made a choice |
13:29 |
|
hdl |
thd: it could have... But those problems were not encountered at that time... |
13:29 |
|
kf |
I would have liked to see this happen in the community - the whole process: these problems exist, we tried this and that... see bug ... |
13:30 |
|
thd |
hdl: It is very unfortunate that we had not recognised the lack of Unicode support from the outset with Zebra. |
13:31 |
|
thd |
hdl: We could have been ahead of the issue if we had been aware of lack of Unicode support at the outset. |
13:31 |
|
gmcharlt |
Zebra lacks Unicode support? that is demonstrably false |
13:32 |
|
hdl |
gmcharlt: it lacks unicode support for search. |
13:33 |
|
hdl |
icu is a friend of yaz, not really integrated. |
13:36 |
|
thd |
gmcharlt: Zebra had no support for searching using Unicode when Koha adopted Zebra and we had not even realised. |
13:37 |
|
owen |
FWIW, NPL does not have a copy of a script to reconstruct 000 from other data in MARC |
13:37 |
|
owen |
...if anyone is still wondering. |
13:37 |
|
thd |
owen: I doubted that you had the code. |
13:37 |
|
gmcharlt |
thd: considering the hundreds of Koha installations that are successfully storing and searching MARC records using UTF-8, whatever applied at the time Zebra was adopted is a moot point |
13:38 |
|
thd |
owen: However, I hoped that you would. |
13:38 |
|
thd |
gmcharlt: I think that I still did not state it correctly. |
13:38 |
|
owen |
thd: Considering some of the errors in our MARC records, I'm not sure how well it worked... |
13:40 |
|
thd |
gmcharlt: Without the problematic ICU implementation for Zebra, there is no tokenisation, and little other support, for more than circa 250 characters in Zebra. |
13:41 |
|
thd |
gmcharlt: Tokenisation without the ICU is based on single byte character encoding. |
13:42 |
|
thd |
hdl: Can you help me distinguish better what cannot be done without using the ICU in Zebra? |
13:45 |
|
|
thd left #koha |
13:46 |
|
|
thd joined #koha |
13:50 |
|
kf |
gmcharlt: without icu it's not possible to search for Hebrew - at least we found no other way |
13:50 |
|
kf |
and using ICU means you lose some other features. but for our library it has worked so far. |
13:50 |
|
gmcharlt |
thanks - that is the sort of specific information I'm looking for |
13:50 |
|
kf |
the facets as is are a problem - not sure how much of it is zebra and what could be solved and what not |
13:51 |
|
kf |
gmcharlt: standard indexing and hebrew records meant you always got all hebrew records back - with every hebrew search |
13:51 |
|
kf |
term |
13:51 |
|
kf |
we didn't see that with just a few test records, but icu solved the problem and we were able to add rules to ignore diacritics and such for search |
13:52 |
|
kf |
but some search options in koha don't work with icu - I have to retest that sometime with 3.2 |
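(If you want to see concretely what the ICU chain does to a string, yaz-icu can run a chain config against stdin. A minimal sketch, assuming yaz-icu is installed and that the chain file sits where a standard Koha install puts it; the path and the sample text are just examples, adjust for your setup:)

    # Show how the ICU chain tokenises and normalises mixed diacritic/Hebrew input.
    echo "Müller ספרים" | yaz-icu -c /etc/koha/zebradb/etc/words-icu.xml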
13:54 |
|
|
druthb is now known as drb_mtg |
13:54 |
|
gmcharlt |
kf: would you mind emailing me a few of your Hebrew MARC records? |
13:56 |
|
kf |
gmcharlt: I can do that - but there is a problem |
13:56 |
|
kf |
they have hebrew in 880 |
13:56 |
|
kf |
and not sure that is indexed in the standard setup |
13:56 |
|
hdl |
gmcharlt: I can send you some korean, japanese and hebrew unimarc records too |
13:57 |
|
gmcharlt |
kf: no problem, I can deal with the 880s |
13:57 |
|
gmcharlt |
hdl: thanks |
13:57 |
|
hdl |
(they are committed in the solr branch iirc.) |
13:57 |
|
owen |
jwagner: Two months ago you got an offer of help on Bug 3509. Did anything come of that? |
13:57 |
|
kf |
ok, I will make a note - I have a sample file for my bug 5406 that I wanted to clean up - it has multiple languages too |
13:57 |
|
munin |
Bug http://bugs.koha-community.org[…]w_bug.cgi?id=3509 enhancement, PATCH-Sent, ---, gmcharlt, NEW, Batch item edit |
13:57 |
|
munin |
Bug http://bugs.koha-community.org[…]w_bug.cgi?id=5406 enhancement, P5, ---, gmcharlt, NEW, Allow Koha to authenticate using LDAPS as well as LDAP |
13:57 |
|
kf |
... 4506 |
13:57 |
|
kf |
bug 4506 |
13:57 |
|
munin |
Bug http://bugs.koha-community.org[…]w_bug.cgi?id=4506 enhancement, PATCH-Sent, ---, katrin.fischer, NEW, Add support of record linking by record control number in $w |
13:57 |
|
owen |
s/Two/Six |
13:57 |
|
kf |
*sighs* |
14:00 |
|
kf |
gmcharlt: which email address? |
14:00 |
|
gmcharlt |
gmcharlt at gmail.com |
14:01 |
|
kf |
ok |
14:06 |
|
|
davi joined #koha |
14:07 |
|
thd |
gmcharlt: Some people had speculated that ICU problems with Zebra such as no facets were misconfiguration problems. |
14:09 |
|
thd |
gmcharlt: I have determined that facet and truncation issues when using the ICU are confirmed bugs in the Index Data bugs database. |
14:12 |
|
thd |
Unfortunately, I only found the official bug reports after scouring the documentation for configuration options which were not there to find. |
14:17 |
|
owen |
If I got to test a patch for a bug marked "needs signoff" and the patch doesn't apply, should I mark it as "failed qa?" |
14:18 |
|
owen |
s/got/go |
14:24 |
|
gmcharlt |
owen: hmm - maybe mark it as 'patch does not apply' |
14:24 |
|
gmcharlt |
less strong of a statement, as a patch failing to apply is often due to a matter of timing |
14:25 |
|
owen |
I have been making comments to that effect, just didn't know if we needed a specific patch status |
14:27 |
|
|
drb_mtg is now known as druthb |
14:28 |
|
kf |
owen: perhaps set back to --? |
14:29 |
|
kf |
gmcharlt: failed qa is a new status in the pull down - we don't have one for does not apply yet :) |
14:30 |
|
gmcharlt |
kf: we do now :) |
14:31 |
|
kf |
ah :) |
14:45 |
|
|
Brooke joined #koha |
14:48 |
|
Brooke |
http://www.grindtv.com/outdoor[…]set+world+record/ |
14:49 |
|
jwagner |
owen, was off in a meeting -- sorry, what were you asking about? |
14:50 |
|
owen |
jwagner: You got an offer of help on Bug 3509. Did anything come of that? |
14:50 |
|
munin |
Bug http://bugs.koha-community.org[…]w_bug.cgi?id=3509 enhancement, PATCH-Sent, ---, gmcharlt, NEW, Batch item edit |
14:51 |
|
jwagner |
Nothing at the moment -- we haven't had time to go back and pull it up or work on it. |
14:51 |
|
jwagner |
It's still on my list to do |
14:51 |
|
jwagner |
Was just talking about that one, as a matter of fact -- trying to figure out how to merge functionality with what's in 3.2 now |
14:52 |
|
slef |
owen: are we going with the liblime one or the parallel biblibre one? |
14:52 |
|
jwagner |
There are parts of both that would be good to keep, question is how best to merge. Which edit interface screen would be best? The one in our screenshot, or the BibLibre approach which looks like the standard item edit screen? |
14:53 |
|
hdl |
slef which ? |
14:53 |
|
slef |
hdl: batch item edit |
14:55 |
|
wizzyrea |
Having seen both, NEKLS feels that the searching interface in the harley batch edit is too dangerous. We only train people to use the barcode entry portion of it. |
14:55 |
|
hdl |
jwagner: if you would make the select interface a new page that would then send itemnumbers or barcodes into the tool... |
14:56 |
|
hdl |
Then it could be quite handy to reconcile both. |
14:56 |
|
wizzyrea |
it is far too easy to select items unintentionally, and there's no undo |
14:56 |
|
jwagner |
Yes, the ability to scan in barcodes at the batch edit screen is one of the things we liked about yours. |
14:56 |
|
hdl |
librarians need to validate the selection before doing the update. |
14:56 |
|
jwagner |
We wanted to bring that capability in. |
14:56 |
|
jwagner |
hdl, and I've also had sites ask for output of the edit to go to a file or something more than just the screen display. |
14:57 |
|
jwagner |
i.e., a report of which barcodes were edited. |
15:00 |
|
hdl |
jwagner: that would be good. |
15:01 |
|
hdl |
maybe we could serialize results in an array and use json or something to display them. |
15:07 |
|
|
drotsk joined #koha |
15:07 |
|
* Brooke |
waves at drotsk |
15:07 |
|
drotsk |
o/ |
15:12 |
|
|
Casaubon joined #koha |
15:23 |
|
owen |
nengard: I agree about Bug 5500--I don't think it used to work that way. |
15:23 |
|
munin |
Bug http://bugs.koha-community.org[…]w_bug.cgi?id=5500 major, P5, ---, oleonard, NEW, shelf browse changing bib record |
15:23 |
|
nengard |
good to know i'm not imaging things :) |
15:23 |
|
nengard |
or imagining |
15:24 |
|
gmcharlt |
nengard: paint us a picture ;) |
15:26 |
|
Brooke |
galen, behave. |
15:26 |
|
gmcharlt |
heh |
15:27 |
|
* Brooke |
notes the Documentation Force is strong in this Channel. |
15:28 |
|
kf |
nengard: I agree with you and owen |
15:28 |
|
kf |
about bug 5500 :) |
15:28 |
|
owen |
See also: http://wiki.koha-community.org[…]OPAC_Comments_RFC |
15:29 |
|
owen |
...which might as well be addressed to "The Great Pumpkin" since no one is putting up money for it :) |
15:33 |
|
* Brooke |
has faith that on Samhain eve, the Great Pumpkin will in fact appear. |
15:35 |
|
kf |
see you all tomorrow |
15:35 |
|
kf |
hm or later |
15:35 |
|
kf |
:) |
15:35 |
|
|
kf left #koha |
15:37 |
|
|
Casaubon left #koha |
15:42 |
|
|
ivanc left #koha |
15:50 |
|
Brooke |
@roulette |
15:50 |
|
|
Brooke was kicked by munin: BANG! |
15:50 |
|
* munin |
reloads and spins the chambers. |
15:53 |
|
owen |
hdl still around? |
15:53 |
|
hdl |
yes. |
15:53 |
|
owen |
Hi, I'm looking at Bug 5157 and willing to test the patch but I don't understand what the problem is |
15:53 |
|
munin |
Bug http://bugs.koha-community.org[…]w_bug.cgi?id=5157 normal, PATCH-Sent, ---, henridamien, ASSIGNED, borrowers top issuers filters problems |
15:57 |
|
hdl |
mmm... I think that this is one of the pieces of information we got. |
15:57 |
|
hdl |
can't find an instance. |
16:01 |
|
hdl |
can't find an official dev instance to test that on. |
16:11 |
|
|
Oak joined #koha |
16:11 |
|
Oak |
\o |
16:27 |
|
slef |
o/ |
16:34 |
|
owen |
jwagner? |
16:34 |
|
ibot |
i heard jwagner was needing a vacation. |
16:34 |
|
* jwagner |
definitely needs a vacation! |
16:34 |
|
jwagner |
Maybe I can go back to New Zealand.... |
16:34 |
|
owen |
jwagner: I'm testing the patch for Bug 4329 and I have a question |
16:35 |
|
munin |
Bug http://bugs.koha-community.org[…]w_bug.cgi?id=4329 enhancement, PATCH-Sent, ---, chris, NEW, OPAC search by shelving location option |
16:35 |
|
jwagner |
yes? |
16:35 |
|
owen |
Do you need to modify the configuration of zebra? |
16:35 |
|
jwagner |
Shouldn't have to -- shelving location is one of the delivered searches as far as I know. |
16:36 |
|
jwagner |
melm 952$c location |
16:36 |
|
jwagner |
Unless a site has modified record.abs (or maybe early sites didn't have that?) it should work as is. |
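(A quick way to check that, sketched for a standard install; paths differ on dev installs, and KOHA_CONF/PERL5LIB need to be set before reindexing:)

    # Confirm the shelving location index is defined in record.abs ...
    grep -F '952$c' /etc/koha/zebradb/biblios/etc/record.abs
    # ... then reindex biblios so existing records become searchable by location.
    perl misc/migration_tools/rebuild_zebra.pl -b -r -v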
16:36 |
|
owen |
Okay, now I'm getting results. I guess you can't get results if all you do is check the box but don't supply search terms |
16:37 |
|
jwagner |
No, I got results just by checking the box. Dumb question -- did you search a location that you knew had items? |
16:37 |
|
owen |
Hm, maybe not. |
16:38 |
|
jwagner |
And a question for you -- I didn't control this with a syspref partly because so many people were complaining about proliferating sysprefs at the time. But do you think it should be? |
16:38 |
|
jwagner |
You can hide it with a jquery -- I designed it that way -- but maybe a syspref would be better? |
16:38 |
|
owen |
The presence of the shelving location search? |
16:38 |
|
jwagner |
Yes, on the advanced search screen. Some sites don't use shelving locations. |
16:39 |
|
owen |
I would be content to make it a default and wait for others to clamor for a system pref. |
16:39 |
|
wizzyrea |
probably the code to suppress something you've added should be in the bug report with the patch. But that's (a lot of extra) work |
16:40 |
|
wizzyrea |
though I'm for adding that kind of stuff to the wiki |
16:40 |
|
wizzyrea |
also work |
16:40 |
|
owen |
It is in this case |
16:42 |
|
jwagner |
wizzyrea, the jquery to hide it is in the bugzilla entry. |
16:42 |
|
owen |
I find myself expecting that if I search by itemtype and shelving location it will do an AND search rather than OR |
16:43 |
|
wizzyrea |
for that one, it is |
16:43 |
|
owen |
Is that just me? |
16:43 |
|
wizzyrea |
i'm thinking of the other ones ;) |
16:43 |
|
jwagner |
No, that's one of the problems with combining searches, I think. |
16:44 |
|
owen |
Searching by keyword + itemtype gives me an AND search |
16:45 |
|
jwagner |
It's been so long since I worked on this, can't remember details. I do remember a lot of trouble getting it to OR within the location set if you selected multiples. |
16:46 |
|
owen |
wizzyrea: Is the shelving loc. search on your system? |
16:47 |
|
* owen |
can see for himself |
16:48 |
|
|
cait joined #koha |
16:48 |
|
cait |
hi #koha |
16:48 |
|
drotsk |
hi cait |
16:48 |
|
wizzyrea |
lemme check, hi cait |
16:48 |
|
cait |
hi drotsk and wizzyrea |
16:49 |
|
jwagner |
owen, it's on Arcadia's OPAC (they sponsored): |
16:49 |
|
jwagner |
http://koha.arcadia.edu/cgi-bi[…]ha/opac-search.pl |
16:52 |
|
|
drotsk left #koha |
16:52 |
|
|
drotsk joined #koha |
16:57 |
|
|
Brooke joined #koha |
16:58 |
|
Brooke |
Tēnā koutou |
16:58 |
|
owen |
munin blew you far away Brooke |
16:58 |
|
munin |
owen: Error: "blew" is not a valid command. |
16:59 |
|
Brooke |
Ko Omàmiwininiwak te iwi. |
16:59 |
|
Brooke |
owen, that is Munin's way of telling me to do "real work." |
17:02 |
|
owen |
I wonder if Bug 3693 wouldn't be better implemented as a CSS customization |
17:02 |
|
munin |
Bug http://bugs.koha-community.org[…]w_bug.cgi?id=3693 enhancement, PATCH-Sent, ---, gmcharlt, NEW, Display options for buttons when holds triggered |
17:05 |
|
jwagner |
owen, the dev server where I did the shelving location work is gone. Can someone remind me how to bring a branch down from a remote server? I can see it on the git repo, how do I create a local branch with that code so I can work on it? |
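(One way to do that, assuming the public repo URL and a made-up local branch name; substitute the actual remote branch you can see in gitweb:)

    git remote add kc git://git.koha-community.org/koha.git   # only needed once
    git fetch kc                                               # pull down the remote branches
    git checkout -b my_shelving_loc kc/new/enh/bug_4329        # local branch based on the remote one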
17:06 |
|
jwagner |
On 3693, maybe css would work better. At the time I didn't know how to do it any other way. Sysprefs might be easier for sites to implement, though. |
17:11 |
|
owen |
jwagner: http://koha-community.org/koha[…]ry-2010/#biblibre |
17:11 |
|
owen |
...for an example. |
17:12 |
|
jwagner |
Thanks -- finally deciphered my sketchy notes too |
17:21 |
|
wizzyrea |
jcamins: we made the gnocchi last night -- Totally awesome, and super fast when you steam the potatoes |
17:22 |
|
wizzyrea |
and the spud gobbled them up |
17:22 |
|
jcamins |
Mmmm. |
17:22 |
|
* jcamins |
just finished having gnocchi for lunch |
17:22 |
|
* Brooke |
is jealous. |
17:22 |
|
jcamins |
Brooke: they're super easy. |
17:22 |
|
Brooke |
lies! |
17:23 |
|
jcamins |
Somewhere in the scrollback is the recipe. |
17:23 |
|
jcamins |
wizzyrea can confirm they're super easy. |
17:24 |
|
wizzyrea |
they are in fact super easy |
17:24 |
|
wizzyrea |
like, so easy you'll wonder why you don't eat gnocchi all the time |
17:24 |
|
jcamins |
See? wizzyrea can confirm |
17:24 |
|
wizzyrea |
and then you probably will |
17:24 |
|
wizzyrea |
eat gnocchi all the time |
17:24 |
|
jcamins |
wizzyrea: wait, "why you don't eat gnocchi all the time"? |
17:24 |
|
jcamins |
Errr... don't you? |
17:24 |
|
jcamins |
;) |
17:24 |
|
Brooke |
I've made them many times before |
17:24 |
|
wizzyrea |
these happen to have been made with sweet potatoes |
17:25 |
|
jcamins |
Brooke: then you should know they're super easy |
17:25 |
|
Brooke |
bah |
17:25 |
|
* Brooke |
likes leveraging her pasta machine for labour savingses. |
17:25 |
|
owen |
Maybe Brooke thinks if she protests too much someone will break down and mail her some? |
17:25 |
|
cait |
hehe |
17:25 |
|
jcamins |
owen: has that worked for you yet? |
17:26 |
|
* owen |
hasn't complained enough about anything yet |
17:26 |
|
Brooke |
because owen is not a whiner :D |
17:27 |
|
|
wizzyrea left #koha |
17:27 |
|
|
wizzyrea joined #koha |
17:27 |
|
wizzyrea |
oops |
17:27 |
|
jcamins |
Sneaky Command-W? |
17:27 |
|
wizzyrea |
lol ya |
17:29 |
|
jwagner |
owen, I've been looking at the search code, but i can't see where it decides to use AND or OR. Any pointers? |
17:29 |
|
jcamins |
Hey, didn't it used to be an Apple key? |
17:29 |
|
owen |
Sorry jwagner no idea |
17:31 |
|
jcamins |
(yes, it did) |
17:31 |
|
* Brooke |
nods at jcamins |
17:31 |
|
cait |
koha++ |
17:31 |
|
cait |
@karma koha |
17:31 |
|
munin |
cait: Karma for "koha" has been increased 17 times and decreased 0 times for a total karma of 17. |
17:31 |
|
cait |
hm! |
17:31 |
|
Brooke |
it's okay, though, soon Mr. Fairey will come and label it "obey". |
17:37 |
|
jcamins |
Okay, time to get going. |
17:37 |
|
jcamins |
So long, #koha |
17:37 |
|
|
jcamins is now known as jcamins_a |
17:54 |
|
|
drotsk left #koha |
17:54 |
|
|
drotsk joined #koha |
17:55 |
|
|
clrh left #koha |
17:56 |
|
|
hdl left #koha |
18:04 |
|
Brooke |
Ka kite anō. |
18:04 |
|
|
Brooke left #koha |
18:05 |
|
wizzyrea |
owen: re 3262 I don't think the full name is being generated by the script for the homebranch |
18:05 |
|
wizzyrea |
I personally would like to have them both standard on the code :P |
18:06 |
|
wizzyrea |
but I understand why people would want the full name, since not everyone uses sane library codes |
18:06 |
|
wizzyrea |
and I don't mean that as an indictment of how others do things |
18:26 |
|
|
owen left #koha |
18:29 |
|
|
owen joined #koha |
18:32 |
|
brendan |
@wunder 93117 |
18:32 |
|
munin |
brendan: The current temperature in Northwest Goleta, Goleta, California is 13.3°C (10:34 AM PST on December 15, 2010). Conditions: Overcast. Humidity: 87%. Dew Point: 11.0°C. Pressure: 29.96 in 1014.4 hPa (Falling). |
18:32 |
|
druthb |
@wunder 20852 |
18:32 |
|
munin |
druthb: The current temperature in Woodley Gardens, Rockville, Maryland is -1.8°C (1:30 PM EST on December 15, 2010). Conditions: Clear. Humidity: 34%. Dew Point: -16.0°C. Windchill: -5.0°C. Pressure: 29.81 in 1009.4 hPa (Steady). |
18:32 |
|
wizzyrea |
@wunder lawrence, ks |
18:32 |
|
munin |
wizzyrea: The current temperature in Channel 6 Downtown, Lawrence, Kansas is -1.1°C (12:34 PM CST on December 15, 2010). Conditions: Overcast. Humidity: 73%. Dew Point: -5.0°C. Windchill: -6.0°C. Pressure: 29.68 in 1005.0 hPa (Falling). |
18:33 |
|
nengard |
@wunder 19030 |
18:33 |
|
munin |
nengard: The current temperature in CWOP # AR939, Penndel, Pennsylvania is -2.7°C (1:35 PM EST on December 15, 2010). Conditions: Mostly Cloudy. Humidity: 43%. Dew Point: -14.0°C. Windchill: -6.0°C. Pressure: 29.72 in 1006.3 hPa (Steady). |
18:34 |
|
wizzyrea |
nengard: glad you made it home safely from your car debacle yesterday |
18:34 |
|
nengard |
thanks - that was no fun |
18:34 |
|
wizzyrea |
AAA++ |
18:34 |
|
nengard |
it was bitterly bitterly cold and windy |
18:35 |
|
nengard |
and the only thing that was hot was the burning engine |
18:35 |
|
nengard |
or smoking engine |
18:35 |
|
wizzyrea |
this last summer my car overheated on the hottest day of the year |
18:35 |
|
brendan |
It seems that most of you are going to be jealous of my daily temps - so I'll make sure to check daily |
18:35 |
|
chris |
Morning from the bus |
18:35 |
|
wizzyrea |
mornin' |
18:35 |
|
brendan |
heya chris - wizzyrea et a; |
18:35 |
|
brendan |
al |
18:35 |
|
chris |
@wunder wellington nz |
18:35 |
|
munin |
chris: The current temperature in Wellington, New Zealand is 14.0°C (7:00 AM NZDT on December 16, 2010). Conditions: Partly Cloudy. Humidity: 88%. Dew Point: 12.0°C. Pressure: 30.09 in 1019 hPa (Steady). |
18:35 |
|
wizzyrea |
<3 |
18:36 |
|
brendan |
ooh chris squeaks out an early morning win |
18:36 |
|
chris |
Heh |
18:37 |
|
nengard |
looks like we should have had KohaCon in Dec instead :) |
18:38 |
|
chris |
I thought the weather was ok |
18:38 |
|
cait |
good morning chris |
18:39 |
|
chris |
December has been a lot more humid and rainy |
18:39 |
|
wizzyrea |
it was windy and chillish but not awful |
18:39 |
|
wizzyrea |
didn't keep us in |
18:39 |
|
cait |
@wunder Konstanz |
18:39 |
|
munin |
cait: The current temperature in Taegerwilen, Taegerwilen, Germany is -5.8°C (7:40 PM CET on December 15, 2010). Conditions: Light Snow. Humidity: 87%. Dew Point: -8.0°C. Windchill: -9.0°C. Pressure: 30.30 in 1026.0 hPa (Steady). |
18:42 |
|
nengard |
chris i was cold most days |
18:42 |
|
chris |
I told you layers sheesh :) |
18:43 |
|
* cait |
thought really liked the weather :) |
18:43 |
|
cait |
-thought |
18:46 |
|
chris |
Jan/feb are the best months |
18:46 |
|
nengard |
chris - yes you did - I don't blame you at all |
18:46 |
|
nengard |
I brought light layers :) |
18:50 |
|
gmcharlt |
it has been below 0 C in Gainesville for the past week |
18:50 |
|
chris |
I just hope it is fine on the 27th |
18:50 |
|
gmcharlt |
consequently, everybody is dressing as if it were Anchorage |
18:50 |
|
nengard |
hehe |
18:51 |
|
chris |
Kristina is going to her first cricket game |
18:51 |
|
nengard |
yeah i was in FL a week ago - panhandle - and everyone had scarves and coats |
18:51 |
|
cait |
:) |
18:51 |
|
cait |
hope she has fun |
18:54 |
|
chris |
I'm sure she will |
18:55 |
|
cait |
:) |
18:56 |
|
chris |
Ok my stop |
18:56 |
|
chris |
Bb after coffee |
18:59 |
|
|
Oak left #koha |
19:07 |
|
owen |
minimum system requirements for windows: one Debian ISO </snark> |
19:08 |
|
drotsk |
priceless |
19:09 |
|
|
tcohen left #koha |
19:09 |
|
|
cait left #koha |
19:09 |
|
chris |
back |
19:10 |
|
|
cait joined #koha |
19:10 |
|
cait |
back |
19:11 |
|
owen |
wb chris and cait |
19:11 |
|
chris |
owen, i basically said that |
19:12 |
|
cait |
oh so many windows mails today |
19:12 |
|
chris |
friends dont let friends use proprietary software |
19:13 |
|
cait |
lol |
19:14 |
|
chris |
ohh a pull request for the qa'd serials work, I'll pull that down and do some testing on that myself then push it up |
19:14 |
|
cait |
I am working more with the old laptop (ubuntu) than with my newer one (vista) since kohacon |
19:16 |
|
* chris |
makes a bug for it, so he doesn't break his "commits must have a bug number" rule |
19:16 |
|
cait |
clever :) |
19:17 |
|
chris |
bug 5508 |
19:17 |
|
munin |
Bug http://bugs.koha-community.org[…]w_bug.cgi?id=5508 enhancement, PATCH-Sent, ---, colin.campbell, NEW, Biblibre Serials work |
19:18 |
|
chris |
http://git.koha-community.org/[…]/new/enh/bug_5508 |
19:19 |
|
|
Elwell left #koha |
19:20 |
|
|
Elwell joined #koha |
19:21 |
|
|
Elwell left #koha |
19:31 |
|
|
richard joined #koha |
19:34 |
|
owen |
Colin++ |
19:35 |
|
chris |
yup |
19:35 |
|
chris |
colin++ |
19:36 |
|
owen |
He managed to get so much done, yet we rarely see him around here. Is there a connection? |
19:37 |
|
cait |
colin++ |
19:40 |
|
|
Elwell joined #koha |
19:44 |
|
|
briceSanc joined #koha |
19:44 |
|
briceSanc |
hello all ! |
19:46 |
|
* owen |
is closing on a fix for Bug 3523 |
19:46 |
|
munin |
Bug http://bugs.koha-community.org[…]w_bug.cgi?id=3523 normal, P5, ---, oleonard, ASSIGNED, Menu of existing lists limited to 10 |
19:46 |
|
chris |
yay |
19:48 |
|
briceSanc |
Is Koha frozen for 3.2.2? |
19:48 |
|
chris |
template freeze |
19:48 |
|
briceSanc |
C4 freeze ? |
19:48 |
|
chris |
no, just template |
19:48 |
|
briceSanc |
ok |
19:49 |
|
chris |
chris n sent a mail yesterday |
19:49 |
|
briceSanc |
Have you estimated the remaining time? |
19:49 |
|
chris |
did you see the email? |
19:50 |
|
chris |
http://lists.koha-community.or[…]ember/034871.html |
19:51 |
|
briceSanc |
no, I didn't see that mail |
19:51 |
|
chris |
ahh it went to koha-devel and koha-translate .. if you are working on either, those are 2 lists you should be on :) |
19:52 |
|
briceSanc |
i'm on request and patches, devel is missing |
19:54 |
|
chris |
devel is a good one to be on, that's for sure |
19:54 |
|
chris |
it's quite low traffic, but that's where things like the release schedule, RFCs etc are talked about |
19:54 |
|
briceSanc |
you're right ! |
19:56 |
|
|
francharb left #koha |
19:57 |
|
|
hdl joined #koha |
19:57 |
|
munin |
New commit(s) kohagit: bug 5398: make additional pages in staff interface obey noItemTypeImages <http://git.koha-community.org/[…]82a25ee2450470e79> / bug 4826 change 'add basket' to 'new basket' <http://git.koha-community.org/[…]4c39654049b890951> |
19:59 |
|
|
druthb is now known as drb_brb |
20:00 |
|
hudsonbot |
Starting build 221 for job Koha_Master (previous build: STILL UNSTABLE -- last SUCCESS #188 8 days 23 hr ago) |
20:02 |
|
|
briceSanc left #koha |
20:05 |
|
|
Brooke joined #koha |
20:06 |
|
|
owen left #koha |
20:08 |
|
Brooke |
Tēnā koutou |
20:13 |
|
|
Elwell left #koha |
20:13 |
|
|
Elwell joined #koha |
20:17 |
|
|
drb_brb is now known as druthb |
20:21 |
|
|
magnus_a left #koha |
20:22 |
|
hudsonbot |
Project Koha_Master build #221: STILL UNSTABLE in 22 min: http://hudson.koha-community.o[…]/Koha_Master/221/ |
20:22 |
|
hudsonbot |
* Nicole Engard: bug 4826 change 'add basket' to 'new basket' |
20:22 |
|
|
jwagner left #koha |
20:22 |
|
hudsonbot |
* Galen Charlton: bug 5398: make additional pages in staff interface obey noItemTypeImages |
20:28 |
|
druthb |
@karma Brooke |
20:28 |
|
munin |
druthb: Karma for "Brooke" has been increased 4 times and decreased 1 time for a total karma of 3. |
20:28 |
|
druthb |
:-O |
20:28 |
|
druthb |
Brooke++ #has her reasons, okay? |
20:29 |
|
wizzyrea |
chris: for 5143 do we need to update the database with those koha -> marc mappings or just document and trust people will add them? |
20:31 |
|
chris |
for subtitle? |
20:31 |
|
wizzyrea |
right |
20:31 |
|
chris |
the keyword mapping? |
20:31 |
|
wizzyrea |
yep |
20:32 |
|
chris |
probably should edit the help file for that page |
20:32 |
|
chris |
to tell them to |
20:32 |
|
wizzyrea |
right-o |
20:32 |
|
wizzyrea |
will do |
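(For anyone who would rather seed the mapping directly instead of clicking through the Keyword to MARC mapping page, something like the line below would do it. The table and column names here assume the 3.2 fieldmapping schema, so check kohastructure.sql before trusting it:)

    # Map the 'subtitle' keyword to MARC21 245$b for the default framework.
    mysql koha -e "INSERT INTO fieldmapping (field, frameworkcode, fieldcode, subfieldcode) VALUES ('subtitle', '', '245', 'b');"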
20:32 |
|
chris |
thanks |
20:32 |
|
wizzyrea |
yup :) |
20:34 |
|
|
Casaubon joined #koha |
20:34 |
|
|
Casaubon left #koha |
20:49 |
|
nengard |
are we talking about a koha help file that's being edited? If so let me know so I can update the manual to match |
20:49 |
|
nengard |
those help files are usually copies of the manual |
20:49 |
|
wizzyrea |
yeap |
20:52 |
|
|
cait left #koha |
20:55 |
|
|
thd is now known as thd-away |
21:05 |
|
wizzyrea |
ohh.... bug or intentional: WARNING: You will not be able save, because your webserver cannot write to '/home/liz/kohaclone/koha-tmpl/intranet-tmpl/prog/en/modules/help/circ/view_holdsqueue.tmpl'. Contact your admin about help file permissions. |
21:06 |
|
Brooke |
are joo a superlibrarian? |
21:06 |
|
wizzyrea |
yep, it's not those permissions, it's file system permissions |
21:06 |
|
wizzyrea |
I could fix this, easily |
21:07 |
|
wizzyrea |
but I don't know if it's intentional |
21:08 |
|
wizzyrea |
I should mention that I'm not making the change I proposed this way; I was curious and clicked on it |
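(On a dev install the usual fix is just to hand the help templates to the webserver user; the user/group below assume Debian's www-data, and the path is the one from the warning above:)

    # Let the webserver write to the help templates so the in-browser help editor can save.
    sudo chown -R www-data:www-data /home/liz/kohaclone/koha-tmpl/intranet-tmpl/prog/en/modules/help/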
21:13 |
|
gmcharlt |
apropos of nothing except wizzyrea's request |
21:13 |
|
gmcharlt |
only superlibrarians can be trusted with capes |
21:13 |
|
* Brooke |
grins |
21:13 |
|
wizzyrea |
yay! |
21:13 |
|
wizzyrea |
@quote add gmcharlt: only superlibrarians can be trusted with capes |
21:13 |
|
munin |
wizzyrea: Error: You must be registered to use this command. If you are already registered, you must either identify (using the identify command) or add a hostmask matching your current hostmask (using the "hostmask add" command). |
21:14 |
|
Brooke |
hmph. Oirish seems to have disapparated from Pootle. |
21:14 |
|
wizzyrea |
@quote add gmcharlt: only superlibrarians can be trusted with capes |
21:14 |
|
munin |
wizzyrea: The operation succeeded. Quote #112 added. |
21:14 |
|
wizzyrea |
woot! |
21:15 |
|
wizzyrea |
@quote random |
21:15 |
|
munin |
wizzyrea: Quote #10: "< pianohacker> You helped start an open source project; clearly your sense of what to avoid to make your life easier has been impaired for a while :)" (added by chris at 07:59 PM, June 23, 2009) |
21:15 |
|
wizzyrea |
@quote random |
21:15 |
|
munin |
wizzyrea: Quote #107: "<kmkale> This is a food channel. Sometimes we discuss Koha too ;)" (added by jwagner at 02:49 PM, November 29, 2010) |
21:15 |
|
wizzyrea |
@quote random |
21:15 |
|
munin |
wizzyrea: Quote #71: "cait: hm it works now and I have no idea why :)" (added by chris at 07:47 PM, April 08, 2010) |
21:32 |
|
|
ebegin joined #koha |
21:33 |
|
* Brooke |
waves at ebegin |
21:33 |
|
ebegin |
Hey! |
21:33 |
|
|
wizzyrea left #koha |
21:33 |
|
|
wizzyrea joined #koha |
21:33 |
|
ebegin |
@wunder montreal quebec |
21:33 |
|
munin |
ebegin: The current temperature in Montreal, Quebec is -12.0°C (4:17 PM EST on December 15, 2010). Conditions: Light Snow. Humidity: N/A%. Windchill: -21.0°C. Pressure: (Rising). |
21:34 |
|
ebegin |
... cold ... |
21:34 |
|
Brooke |
but, your croissants are not crap. |
21:34 |
|
Brooke |
Silver lining :) |
21:34 |
|
ebegin |
:) |
21:36 |
|
ebegin |
Brooke, is it summer now in NZ? |
21:37 |
|
Brooke |
Em, I think that hypothesis requires in person exploration. |
21:37 |
|
Brooke |
I'll volunteer for the expedition. |
21:37 |
|
Brooke |
Selflessly, of course. |
21:37 |
|
Brooke |
(But yes, course it is.) |
21:37 |
|
ebegin |
Oh! sorry, I thought you were there. |
21:38 |
|
ebegin |
Where are you ? Virginia? |
21:38 |
|
Brooke |
a yep |
21:41 |
|
|
Brooke left #koha |
21:48 |
|
|
nengard left #koha |
21:57 |
|
|
druthb left #koha |
22:04 |
|
|
davi left #koha |
22:12 |
|
|
drotsk left #koha |
22:25 |
|
|
wasabi left #koha |
22:28 |
|
|
thd-away is now known as thd |
22:29 |
|
|
thd is now known as thd-away |
22:46 |
|
|
wasabi joined #koha |
22:48 |
|
chris |
hdl: you still awake? |
22:48 |
|
hdl |
still |
22:49 |
|
chris |
go to sleep ;-) |
22:49 |
|
chris |
but before you do |
22:50 |
|
chris |
http://git.koha-community.org/[…]b259cbdc7e13e3b8f |
22:50 |
|
chris |
anything you want to change there? |
22:50 |
|
chris |
just fixed the copyright statements getting ready for the merge, it's passing all the unit tests now |
22:51 |
|
hdl |
ok for me. |
22:51 |
|
hdl |
doing nightly housekeeping on our servers. |
22:51 |
|
chris |
ah cool |
22:52 |
|
chris |
ok, just have to do a little more testing, but this should be merged before you wake up tomorrow |
22:52 |
|
chris |
but now, lunch |
22:53 |
|
hdl |
have a nice meal |
23:48 |
|
|
hdl left #koha |