Time |
S |
Nick |
Message |
12:58 |
|
kados |
http://www.oclc.org/research/projects/fast/ |
13:02 |
|
tumer[A] |
kados:check mail. Was it clear enough? |
13:28 |
|
hdl |
kados around? |
13:34 |
|
kados |
hdl: yep |
13:34 |
|
kados |
hdl: what's up? |
13:34 |
|
hdl |
I have a problem |
13:34 |
|
hdl |
MacOSX perl Module installation |
13:34 |
|
hdl |
It seems it cannot launch make |
13:35 |
|
hdl |
Do you know how to cope with it ? |
13:36 |
|
kados |
hmmm |
13:36 |
|
thd |
hdl: what version of OSX? |
13:37 |
|
kados |
hdl: what's the exact error? |
13:37 |
|
dewey |
i guess the exact error is just before : |
13:37 |
|
hdl |
kados : make command not found. |
13:37 |
|
kados |
hdl: and what module? |
13:38 |
|
kados |
hdl: it sounds like you need to install the developer's tools |
13:38 |
|
kados |
hdl: on the original CD |
13:38 |
|
kados |
hdl: Xcode tools |
13:38 |
|
hdl |
The first I tried was MARC::Record. |
13:38 |
|
thd |
hdl: you have to have the developer's tools disc installed; well, kados types faster |
13:39 |
|
hdl |
But is there a way to install Perl modules without building them? |
13:39 |
|
kados |
hdl: yes |
13:39 |
|
kados |
hdl: look in the lib dir |
13:39 |
|
hdl |
I always use perl -MCPAN -e install ? |
13:39 |
|
kados |
hdl: and copy it into your perl path |
13:39 |
|
kados |
all make does is run the tests |
13:39 |
|
kados |
and copy things over to the right spots |
13:39 |
|
kados |
so you can do it manually |
13:39 |
|
kados |
it's just a lot of work |
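
For a pure-Perl module such as MARC::Record, the manual route kados is describing is roughly the sketch below (a minimal sketch: the archive name and the destination directory are illustrative, and anything with XS/C code still needs make and the Xcode tools):

    tar xzf MARC-Record-*.tar.gz
    cd MARC-Record-*
    perl -le 'print for @INC'                 # pick a writable directory already on the module path
    cp -R lib/MARC /Library/Perl/5.8.6/       # hypothetical OSX site_perl location
    perl -MMARC::Record -le 'print $MARC::Record::VERSION'   # quick sanity check
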
13:40 |
|
kados |
hdl: install Xcode tools and it should work properly |
13:40 |
|
hdl |
Holy ... |
13:40 |
|
hdl |
It is on a machine in switzerland. |
13:40 |
|
kados |
hehe |
13:40 |
|
hdl |
via ssh. |
13:40 |
|
kados |
hmmm |
13:40 |
|
kados |
that is a problem :-) |
13:41 |
|
hdl |
:D |
13:41 |
|
thd |
hdl: the incompatibilities of building things on OSX are also a lot of work |
13:41 |
|
kados |
there is no way to install remotely that I know of |
13:41 |
|
thd |
kados: I suspect there is a way |
13:41 |
|
kados |
hdl: one way would be via VNC |
13:41 |
|
thd |
hdl: is this the server version of the software? |
13:42 |
|
hdl |
I think it is MacOSX Server Edition. |
13:42 |
|
kados |
hdl: http://tomclegg.net/xcode-remote-install |
13:42 |
|
thd |
hdl: in any case, you can download the developer's tools |
13:42 |
|
kados |
hdl: I'm not responsible if it breaks their system :-) |
13:43 |
|
kados |
hdl: so according to that guy you can do it |
13:43 |
|
thd |
hdl: you should also install Fink, which has ports from Debian, so that you can install things correctly |
13:44 |
|
thd |
hdl: you need developer's tools first |
13:45 |
|
thd |
hdl: you have to match versions correctly, so be certain you know which animal you have (Panther, Cheetah, etc.), i.e. which OS 10.x release |
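
Before matching Fink and Xcode versions, the stock sw_vers utility will tell you which release the remote box is actually running (sw_vers ships with OS X; 10.0 was Cheetah, 10.3 Panther, 10.4 Tiger):

    sw_vers -productVersion      # prints e.g. 10.3.9 or 10.4.7
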
13:46 |
|
hdl |
kados: are there any tricks I should be aware of before installing? |
13:46 |
|
hdl |
For instance : apache is named httpd |
13:46 |
|
kados |
i only run OSX as a desktop |
13:46 |
|
kados |
don't have Koha installed |
13:47 |
|
kados |
I use debian sarge for all koha installations |
13:47 |
|
thd |
hdl: you should get a copy of "OSX for Unix Geeks" |
13:47 |
|
hdl |
services are launched with service httpd restart |
13:47 |
|
thd |
hdl: I successfully installed Koha 2.2 on OSX. |
13:48 |
|
shedges |
thd: did your install include the Z3950 search? |
13:49 |
|
thd |
shedges: I wasted months trying to find an alternative a year ago |
13:50 |
|
shedges |
me, too. I thought maybe somebody had solved the problem. |
13:50 |
|
thd |
shedges: so no, I would have needed to recompile Apache and that might have broken other things so I refused to do more work on OSX after a certain point. |
13:52 |
|
thd |
shedges, hdl, kados: Apple made OSX sufficiently different from FreeBSD that doing work on the command line in the proper Unix way was a recurring exercise in frustration over minor incompatibilities |
13:54 |
|
thd |
hdl: I did get things to work, but I had to Google to work around errors too often when installing some things from source |
13:54 |
|
thd |
hdl: Koha should not be a major problem |
13:54 |
|
hdl |
thd: Do you have some guidelines for Koha Installations ? |
13:55 |
|
thd |
hdl: It was too long ago to remember except that you need to install the developer tools and I would recommend installing Fink. |
13:57 |
|
thd |
hdl: I remember some problems with the directory where OSX puts man pages |
13:58 |
|
thd |
hdl: Google searches on site:macosxhints.com are helpful |
14:01 |
|
thd |
hdl: often you may find that you want to install a different version of something than what Apple provides; Fink has a separate, non-conflicting directory for doing just that |
14:07 |
|
thd |
hdl: I really needed the OSX for Unix Geeks book to find my way around how Apple renamed everything etc. The book also has some undocumented commands |
14:12 |
|
thd |
kados: where is tumer's message? |
14:14 |
|
hdl |
thd: Many thx. |
14:16 |
|
thd |
hdl: you are quite welcome. OSX is great for the GUI but is not what you want to install Unix software on if you have a choice. |
17:43 |
|
thd |
kados: are you there? |
18:05 |
|
thd |
tumer hello |
18:05 |
|
thd |
tumer: what was the message that you sent? |
18:05 |
|
tumer |
hi, i am still struggling |
18:06 |
|
tumer |
oh i sent a report to ID about the issue we discussed |
18:06 |
|
thd |
tumer: with what are you still struggling? |
18:06 |
|
thd |
tumer: kados thought of a way around the problem for some cases |
18:07 |
|
tumer |
like ??? |
18:09 |
|
thd |
tumer: you could put record IDs which needed matching into some local use filed and index on the local use field which would then be the same field for bibliographic and holdings records. |
18:09 |
|
thd |
s/filed/field/ |
18:10 |
|
tumer |
thd: i don't think i made myself understood very well. give me a mail address and i will send you an email |
18:11 |
|
thd |
tumer: you could then have 99X for the bibliographic ID in both bibliographic and authority records |
18:12 |
|
thd |
s/authority/holdings/ |
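
A minimal sketch of the local-use-field idea with MARC::Record, assuming an illustrative 999 tag and $c subfield (the point is only that the same bibliographic ID lands in the same field of both the bibliographic and the holdings record, so a single Zebra index can cover both):

    use MARC::Record;
    use MARC::Field;

    my $bibid    = '12345';               # hypothetical bibliographic ID
    my $bib      = MARC::Record->new();
    my $holdings = MARC::Record->new();
    # copy the ID into the same local-use field on both records
    $_->append_fields( MARC::Field->new( '999', ' ', ' ', c => $bibid ) )
        for ( $bib, $holdings );
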
18:12 |
|
tumer |
authority and bibliographic works fine |
18:13 |
|
thd |
tumer: try koha at agogme.com |
18:14 |
|
thd |
tumer well it is the authority records about which I have the greatest concern |
18:15 |
|
tumer |
agogme.com what do i do there? |
18:15 |
|
thd |
tumer: I want to be able to search the references and tracings for multiple authority records and return bibliographic records |
18:15 |
|
thd |
tumer did you not want an email address? |
18:16 |
|
tumer |
yes your email please |
18:16 |
|
thd |
tumer: koha at agogme.com |
18:16 |
|
tumer |
oops sorry |
18:19 |
|
tumer |
thd: koha-devel already does that |
18:20 |
|
tumer |
we do the similar thing with items as well |
18:20 |
|
tumer |
search biblios and retrieve related items -- no problem |
18:20 |
|
tumer |
or vice versa |
18:21 |
|
tumer |
but SQL like JOIN is what we are after |
18:22 |
|
tumer |
i want to be able to limit the search to 120,000 records out of 200,000 depending on which criteria i put in, like a branch |
18:27 |
|
thd |
tumer: yes, had you posted to koha-devel? |
18:28 |
|
tumer |
not yet |
18:30 |
|
thd |
oh I have the message now |
18:32 |
|
thd |
tumer: how did you implement the recursive search of IDs? What was recursive about your search? |
18:34 |
|
tumer |
well, search author and get 1000 records, search branch main and get 120,000 records, then do a search of the 1000 biblionumbers within the 120,000 |
18:34 |
|
tumer |
the result gives you 121 records |
18:35 |
|
thd |
tumer I guess it is recursive if each record must search for its own ID again |
18:35 |
|
tumer |
yes, you do it 1000 times |
18:37 |
|
thd |
tumer: do you have problems with one database failing to respond if you search multiple databases in the same domain? |
18:38 |
|
thd |
s/domain/server/ |
18:38 |
|
tumer |
no, but try 1000 searches of biblionumbers one after the other and everything comes to a standstill |
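
For concreteness, the per-ID querying tumer describes would look something like this with the ZOOM Perl API (the host, database name and the branch attribute 1=9001 are assumptions; 1=12 is the standard Bib-1 Local-number attribute). One PQF round trip per biblionumber is exactly why 1000 of them grind to a halt:

    use ZOOM;

    my $conn = ZOOM::Connection->new('localhost:2100/biblios');   # hypothetical Zebra target
    my @biblionumbers = ( 1 .. 1000 );    # stand-in for the IDs returned by the author search
    my @matches;
    for my $bib (@biblionumbers) {
        # one search per ID, each limited to branch MAIN via a made-up local attribute
        my $rs = $conn->search_pqf(qq{\@and \@attr 1=12 $bib \@attr 1=9001 MAIN});
        push @matches, $bib if $rs->size;
        $rs->destroy;
    }
    print scalar(@matches), " records matched both criteria\n";
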
18:39 |
|
thd |
tumer: will you not have the same issue when you have 1000 simultaneous users? :) |
18:40 |
|
tumer |
not really, scripts work faster than fingers |
18:41 |
|
thd |
tumer: so 1000 simultaneous users is really only ten simultaneous users? |
18:42 |
|
tumer |
and they only retrieve some records, while i have to retrieve all 1000, extract the biblionumbers, then do a search for each to get 120 results |
18:48 |
|
thd |
tumer: I hope Index data supports record linking. I think they must because they helped with a demo system that must have had linked record indexes. |
18:49 |
|
tumer |
demo where? |
18:50 |
|
tumer |
can we look at it? |
18:50 |
|
thd |
tumer: it has not been working for a few weeks and it is years old |
18:51 |
|
thd |
tumer: I think Index Data forgot about it |
18:51 |
|
tumer |
well lets hope and wait |
18:53 |
|
thd |
tumer: maybe they never ran that whole system but they had won the contract to help build it |
18:55 |
|
thd |
kados: are you there? |
18:55 |
|
thd |
kados: FAST is not a solution for records with LCSH |
18:57 |
|
thd |
kados: FAST was Chan's idea from a decade ago to simplify LCSH for the era when no one knew how to catalogue anymore |
02:06 |
|
osmoze |
hello #koha |
02:11 |
|
btoumi |
hi all |
02:19 |
|
osmoze |
good morning btoumi |
02:19 |
|
btoumi |
good morning osmoze |
02:19 |
|
btoumi |
how are things this morning? |
02:22 |
|
osmoze |
fine, fine, like a Monday :) and you? |
02:26 |
|
btoumi |
osmoze: it's Tuesday??? that must be a joke :=) |
02:27 |
|
osmoze |
ah yes, but no, it's not a joke, I work Saturdays so your Tuesday is my Monday :) |
02:27 |
|
btoumi |
lol |
02:27 |
|
btoumi |
ok |
02:27 |
|
btoumi |
I worked Saturdays for five years, so I understand you |
02:29 |
|
osmoze |
^^ |
02:29 |
|
btoumi |
osmoze: and is your investigation work on koha making progress? |
02:29 |
|
osmoze |
those are the joys of working in a public lending library :/ |
02:29 |
|
osmoze |
btoumi, investigation of what? |
02:29 |
|
btoumi |
on koha => tests etc. |
02:31 |
|
osmoze |
no, not at the moment, I'm in the middle of reinstalling all the public and staff PCs. The big summer cleanup. So right now, koha is not the priority ^^ |
02:33 |
|
osmoze |
by the way, I have to go over to another branch, see you later |
02:34 |
|
btoumi |
ah ok |
02:34 |
|
paul |
tada ... my internet connection works ! |
02:34 |
|
btoumi |
yesssssssssssss! |
02:34 |
|
btoumi |
good news for Paul |
02:47 |
|
qiqo |
ei help.. |
02:48 |
|
paul |
toins: you can come back to the office whenever you want ;-) |
02:48 |
|
qiqo |
hello paul |
02:48 |
|
qiqo |
can you help me? |
02:48 |
|
toins |
paul, ah, great !!! |
02:49 |
|
qiqo |
im getting this error message while updating koha to 2.3.0: not well-formed (invalid token) at line 2, column 8, byte 9 at /usr/lib/perl5/site_perl/5.8.6/i486-linux/XML/Parser.pm line 187 |
02:49 |
|
qiqo |
from 2.2.5 to 2.3.0 |
02:49 |
|
toins |
paul, I don't have a car right at the moment... so this afternoon |
02:50 |
|
chris |
theres about 0% chance that an upgrade from 2.2.5 to 2.3.0 will work |
02:50 |
|
qiqo |
ah really? |
02:50 |
|
chris |
just installing 2.3.0 and getting it to work is very hard |
02:50 |
|
qiqo |
why is that so chris? |
02:50 |
|
chris |
yes, 2.3.0 is purely a development release |
02:52 |
|
chris |
http://savannah.nongnu.org/for[…]php?forum_id=4530 |
02:52 |
|
qiqo |
:( |
02:52 |
|
chris |
tells you a little bit more about it |
02:52 |
|
qiqo |
ah alright, i understand |
02:52 |
|
chris |
if you actually want to use it, id wait for a 2.4.x release |
02:53 |
|
chris |
if the second number is an odd number |
02:53 |
|
chris |
then its unstable/development and only *might* work |
02:53 |
|
qiqo |
ermm ok so i have another question... how do i reset the Z3950 module |
02:53 |
|
chris |
if its even, like 2.2 or 2.4 ... then it should work |
02:53 |
|
qiqo |
ahh ok now i understand |
02:54 |
|
chris |
to go back down to a previous version? hmm not sure about that |
02:54 |
|
chris |
i think just get the older version and install that |
02:54 |
|
chris |
but im not sure about that |
02:54 |
|
qiqo |
yup im doing that as of the moment |
02:55 |
|
qiqo |
i ran koha.upgrade for 2.2.5 |
02:55 |
|
qiqo |
ok im back with 2.2.5 |
02:57 |
|
qiqo |
my Z3950 module isnt working |
02:58 |
|
chris |
did you upgrade it when you were trying to upgrade to 2.3 ? |
02:59 |
|
qiqo |
nope, the upgrade never finalised |
03:00 |
|
hdl |
hello all |
03:00 |
|
qiqo |
hello hdl |
03:00 |
|
chris |
was the z3950 working before you started upgrading? |
03:00 |
|
chris |
hi hdl |
03:00 |
|
qiqo |
nope it wasnt.. actually i just finished installing 2.2.5 a while ago |
03:00 |
|
qiqo |
then i attempted an upgrade |
03:00 |
|
chris |
its a bit tricky to get going |
03:01 |
|
chris |
the main thing to check is do you have z3950 daemon running? |
03:01 |
|
qiqo |
according to the mailing list, after adding a z3950 server i have to restart the daemon |
03:02 |
|
chris |
maybe, but i dont think so, you do need to have the daemon running though |
03:02 |
|
chris |
it will log |
03:02 |
|
chris |
to /usr/local/koha/log/ |
03:02 |
|
chris |
so you can see what its doing |
03:03 |
|
qiqo |
i think it's not running |
03:03 |
|
qiqo |
my query is included in the koha-errorlog |
03:04 |
|
chris |
z3950-daemon-launch.sh |
03:04 |
|
chris |
is the script to start it |
03:04 |
|
chris |
in /usr/local/koha/intranet/scripts/z3950daemon |
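
Using the paths chris gives, starting the daemon by hand and watching what it logs might look like this (a sketch; details vary by install, and the launch script itself takes care of switching to the web-server user):

    cd /usr/local/koha/intranet/scripts/z3950daemon
    ./z3950-daemon-launch.sh &
    tail -f /usr/local/koha/log/z3950-daemon-*.log
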
03:04 |
|
qiqo |
yup |
03:05 |
|
qiqo |
how do you know if its loaded? |
03:06 |
|
chris |
what happens when you type |
03:06 |
|
chris |
ps axf | grep "z3950" |
03:06 |
|
chris |
should get something like |
03:06 |
|
chris |
1999 pts/17 S 0:00 su -c /usr/local/koha/intranet/scripts/z3950daemon/z3950-daemon-shell.sh - www-data |
03:06 |
|
chris |
2000 pts/17 S 0:04 \_ /usr/bin/perl /usr/local/koha/intranet/scripts/z3950daemon/processz3950queue /usr/local/koha/log |
03:06 |
|
qiqo |
4904 pts/1 S+ 0:00 | | \_ grep z3950 |
03:06 |
|
chris |
right its not running then |
03:07 |
|
qiqo |
errmm.. |
03:07 |
|
chris |
in /usr/local/koha/log |
03:07 |
|
chris |
are there any files like |
03:07 |
|
chris |
z3950-daemon-20060725-2005.log |
03:07 |
|
qiqo |
yup |
03:07 |
|
qiqo |
there is one |
03:07 |
|
chris |
todays date? |
03:07 |
|
qiqo |
yup |
03:08 |
|
chris |
what does it say? |
03:08 |
|
qiqo |
Bareword "Net::Z3950::RecordSyntax::USMARC" not allowed while "strict subs" in use at /usr/local/koha/intranet/scripts/z3950daemon/processz3950queue line 260. |
03:08 |
|
chris |
there we go, thats why its not starting |
03:09 |
|
chris |
2 secs i think i remember the fix for this |
03:09 |
|
qiqo |
line 260 says: eval { $conn->option(preferredRecordSyntax => |
03:10 |
|
qiqo |
i think the problem is between MARC21 and USMARC |
03:10 |
|
qiqo |
are MARC21 and USMARC alike? because my teacher in library science told me that they are somewhat different |
03:10 |
|
chris |
you could try commenting that line out |
03:11 |
|
qiqo |
oh its running |
03:13 |
|
qiqo |
ohh nope false alarm |
03:13 |
|
qiqo |
ehehe it's still not running |
03:13 |
|
chris |
anything else in the error log now? |
03:13 |
|
qiqo |
Bareword "Net::Z3950::RecordSyntax::UNIMARC" not allowed while "strict subs" in use at /usr/local/koha/intranet/scripts/z3950daemon/processz3950queue line 261. |
03:14 |
|
qiqo |
ill have the two commented out |
03:14 |
|
qiqo |
then ill try again |
03:14 |
|
chris |
right |
03:16 |
|
chris |
actually |
03:16 |
|
qiqo |
heres what i get in ps: 4950 pts/1 S 0:00 /usr/bin/perl /usr/local/koha/intranet/scripts/z3950daemon/processz3950queue /usr/local/koha/log |
03:17 |
|
chris |
try making it $Net::Z3950 .... |
03:17 |
|
chris |
ie add the $ |
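
What chris is suggesting is roughly the one-character change below in processz3950queue; the original line is reconstructed from the fragment qiqo pasted, so treat both versions as approximate:

    # before (around line 260), rejected as a bareword under "use strict subs":
    #   $conn->option( preferredRecordSyntax => Net::Z3950::RecordSyntax::USMARC );
    # after, with the leading $ chris mentions:
    #   $conn->option( preferredRecordSyntax => $Net::Z3950::RecordSyntax::USMARC );
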
03:17 |
|
chris |
that looks like its running |
03:18 |
|
qiqo |
hurrah |
03:19 |
|
qiqo |
still im not getting results from the library of congress |
03:19 |
|
qiqo |
:( |
03:20 |
|
chris |
whats the log tell you |
03:20 |
|
chris |
loc doesnt have the most reliable server |
03:21 |
|
chris |
you could try monash |
03:21 |
|
qiqo |
what servers can i use? |
03:21 |
|
qiqo |
the logs are blank |
03:21 |
|
chris |
zconn.lib.monash.edu.au |
03:21 |
|
chris |
7090 |
03:21 |
|
chris |
did you do an isbn search or a title one? |
03:22 |
|
chris |
db is voyager for monash |
03:22 |
|
qiqo |
yup |
03:22 |
|
chris |
which one isbn? |
03:22 |
|
qiqo |
harry potter 1 |
03:22 |
|
qiqo |
the isbn of hp |
03:23 |
|
qiqo |
then i tried harry potter |
03:23 |
|
chris |
the little popup window comes up, but never gets any results? |
03:25 |
|
qiqo |
none |
03:25 |
|
qiqo |
no results found |
03:25 |
|
qiqo |
and i dont get any popup window |
03:25 |
|
chris |
umm, you should; that's where the results show up |
03:26 |
|
chris |
so you go to add biblio, choose add new biblio? |
03:26 |
|
qiqo |
no popups :( |
03:26 |
|
qiqo |
yeah |
03:27 |
|
chris |
you end up at a page with a bunch of fields |
03:27 |
|
qiqo |
yeah marc inputs |
03:27 |
|
chris |
where you can enter marc into eh? |
03:27 |
|
qiqo |
tags etc.. |
03:27 |
|
qiqo |
yup |
03:27 |
|
chris |
and you entered stuff in the isbn one? |
03:27 |
|
chris |
and hit z3950 search? |
03:27 |
|
qiqo |
errmm.. you are actually adding a new book then, right? |
03:28 |
|
qiqo |
i need the Z3950 feature... |
03:28 |
|
chris |
no |
03:28 |
|
chris |
thats how the z3950 works |
03:29 |
|
chris |
there is a z3950 search button on that page |
03:29 |
|
qiqo |
ahhh |
03:29 |
|
chris |
that should open a popup and start searching servers |
03:29 |
|
qiqo |
yeah |
03:30 |
|
qiqo |
still no results found |
03:30 |
|
chris |
if you are lucky you get some results; you can then click on a record, and it will populate your table |
03:30 |
|
chris |
does it say ?? requests to go |
03:30 |
|
chris |
and does the log say anything now |
03:31 |
|
qiqo |
zconn.lib.monash.edu.au |
03:31 |
|
qiqo |
ah sorry |
03:31 |
|
qiqo |
Still ?? requests to go <--i get this |
03:31 |
|
chris |
right |
03:31 |
|
chris |
i get stuff like this in my log |
03:32 |
|
chris |
@attr 1=4 "freakonomics" at /usr/local/koha/intranet/scripts/z3950daemon/processz3950queue line 279. |
03:32 |
|
chris |
2007/86 : Processing title=freakonomics at MONASH zconn.lib.monash.edu.au:7090 voyager MARC21 (1 forks) |
03:32 |
|
qiqo |
i only get blank logs |
03:33 |
|
qiqo |
0B in size |
03:33 |
|
chris |
hmm, well i gotta go sorry. Hopefully some of the other ppl can help |
03:34 |
|
qiqo |
aww.. |
03:34 |
|
qiqo |
huhu :'( |
03:34 |
|
chris |
8.30 here, and i should spend some time with my wife .. before i get banned from the computer |
03:34 |
|
chris |
:) |
03:34 |
|
qiqo |
heheh |
03:34 |
|
qiqo |
alright |
03:34 |
|
qiqo |
thank you very much sir |
03:35 |
|
qiqo |
these would help |
03:35 |
|
chris |
no problem, good luck with it |
03:40 |
|
qiqo |
hayz... |
03:53 |
|
qiqo |
hmmm |
04:03 |
|
qiqo |
ei i have a question, in 2.2.5 do we still need PDF::API2 v 0.3r77?? |
04:04 |
|
paul |
yep qiqo |
04:04 |
|
qiqo |
or will any version of PDF::API2 do? |
04:04 |
|
paul |
no |
04:04 |
|
qiqo |
errm.. does perl still host the file? |
04:05 |
|
qiqo |
ok ill start googling |
04:05 |
|
qiqo |
hehe :) |
04:12 |
|
qiqo |
hi ozmoze!! |
04:12 |
|
qiqo |
oh hes gone |
04:39 |
|
paul |
toins_: have you seen http://www.silicon.fr/articles[…]-ADSL-Orange.html |
04:40 |
|
toins_ |
paul: I'm taking a look |
04:40 |
|
toins_ |
ah yes... we lived through that outage! |
04:41 |
|
paul |
that confirms it really was general and on the wanamou side |
04:43 |
|
toins_ |
yep |
07:14 |
|
paul |
toins_: are u around ? |
07:15 |
|
toins_ |
yep |
07:15 |
|
toins_ |
i'm here |
08:03 |
|
paul |
hdl around ? |
08:52 |
|
paul |
hello again, everybody |
08:52 |
|
paul |
is kados around or still away ? |
08:52 |
|
tumer |
hi paul |
08:52 |
|
paul |
hi tumer |
08:52 |
|
dewey |
i heard hi tumer was still struggling |
08:53 |
|
tumer |
yes dewey and still |
08:53 |
|
paul |
aren't you too disappointed by ID's answer (about merging data results)? |
08:53 |
|
tumer |
very much |
08:53 |
|
tumer |
so we have to keep holdings data in biblios |
08:54 |
|
tumer |
my mistake was i missed this and had finished writing a whole new API for it, ready to go |
08:55 |
|
paul |
:-( |
08:55 |
|
paul |
sorry for you |
08:56 |
|
tumer |
i wish i knew some xml xsl and xslt |
09:34 |
|
slef |
Hello all! |
09:34 |
|
paul |
hi slef |
09:35 |
|
slef |
Finally I stop doing VOIP and VPN and can get back on web sites more-or-less full-time. |
09:36 |
|
slef |
Soon I will need to update my Koha installation to be useful again. Should I aim for rel_2_6 or HEAD? |
09:37 |
|
paul |
good question. depends on what you plan to do. |
09:38 |
|
paul |
head and/or dev_week are definitely for developers only. |
09:38 |
|
slef |
priorities: get a working koha, fix the installer |
09:39 |
|
slef |
(as in fix the new installer) |
09:39 |
|
paul |
so, rel_2_2 |
09:47 |
|
kados |
paul: I've replied to your email |
09:47 |
|
kados |
hi all |
09:47 |
|
paul |
kados, yes, i've seen & read the mail. |
09:47 |
|
paul |
I had a question for you, about your recent commit : |
09:47 |
|
kados |
paul: http://wiki.liblime.com/doku.php?id=koha226bugs |
09:47 |
|
paul |
it's about addbiblio.pl, once again. |
09:48 |
|
kados |
paul: don't know if you've seen this report or not |
09:48 |
|
paul |
it doesn't work anymore, with NPL or default templates. |
09:48 |
|
kados |
yep |
09:48 |
|
kados |
addbiblio-- |
09:48 |
|
paul |
with default, I have an empty screen & with NPL, the authority report doesn't work anymore. |
09:48 |
|
paul |
addbiblio-- ??? |
09:49 |
|
kados |
I'm very sorry |
09:49 |
|
kados |
it's never worked correctly for MARC21 records |
09:49 |
|
paul |
do you mean MARC21 or npl templates ? |
09:49 |
|
kados |
but I seem to have broken your stuff in my attempt to fix |
09:49 |
|
kados |
MARC21 |
09:50 |
|
paul |
imho, addbiblio is a proof that we must use only 1 set of templates in Koha. |
09:50 |
|
kados |
yep |
09:50 |
|
paul |
could you explain your problems ? |
09:50 |
|
paul |
I could revert your commit & try to fix them myself |
09:51 |
|
kados |
you can revert my latest commit |
09:51 |
|
kados |
it was mistake |
09:51 |
|
kados |
a mistake even |
09:51 |
|
kados |
I can show you the problems when you do revert it |
09:52 |
|
paul |
OK, i'll revert immediately; give me 10 min to finish what i'm working on, and then I'll revert. |
09:52 |
|
kados |
k, ping me when done |
09:52 |
|
paul |
+ look at my commits from today, I think i've solved a few of the acquisition problems you reported on koha226bugs |
09:52 |
|
paul |
(the receive one at least) |
10:09 |
|
owen |
kados, you around? |
10:12 |
|
kados |
owen: sure am |
10:12 |
|
kados |
owen: glad to see you back :-) |
10:12 |
|
owen |
I came in to find a note saying the internet was up and down all day yesterday, so who knows how long I'll be back |
10:12 |
|
kados |
owen: just going through our todo list |
10:13 |
|
kados |
owen: 100 is still in sync with dev_week |
10:13 |
|
owen |
zoomopac? |
10:13 |
|
dewey |
zoomopac is probably stock dev-week with only minor changes to searching soon to be committed |
10:13 |
|
kados |
owen: yea |
10:13 |
|
kados |
owen: just added isbn search |
10:15 |
|
owen |
The "most recent additions" search doesn't seem to be working |
10:15 |
|
kados |
hmmm ... |
10:16 |
|
kados |
they might not all work |
10:16 |
|
kados |
DVD does though |
10:16 |
|
kados |
and by work, I mean that it relies on what's in the record |
10:16 |
|
kados |
and in some cases, the date is listed as 2099-x-x |
10:16 |
|
kados |
more preprocessing for me to do and it's on my list already |
10:17 |
|
kados |
eg: http://zoomopac.liblime.com/cg[…]ail.pl?bib=139954 |
10:17 |
|
kados |
date acquired is: 202004-02-04 |
10:17 |
|
kados |
which is obviously more recent than anything with 2006- :-) |
10:17 |
|
slef |
Got a strange problem with z39.50 search in a 2.2.5 system - everything is returning "Nothing found" even if the daemon finds it. I'll read the debug log once I return in a bit, but any tips welcome. |
10:18 |
|
kados |
slef: using Net::Z3950? or Net::Z3950::ZOOM? |
10:18 |
|
slef |
owen: hi, by the way. |
10:18 |
|
slef |
kados: whatever's default. Net::Z3950? |
10:19 |
|
kados |
slef: your best bet on that is upgrading to Net::Z3950::ZOOM and grabbing the latest code from rel_2_2 |
10:19 |
|
kados |
slef: troubleshooting the 2.2.5 version has been known to drive one mad |
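
The upgrade kados recommends is the usual CPAN invocation (the module name is real; building it generally needs Index Data's YAZ toolkit installed first, and the matching rel_2_2 code still has to be pulled separately):

    perl -MCPAN -e 'install Net::Z3950::ZOOM'
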
10:21 |
|
kados |
owen: the queries on all those are correct |
10:21 |
|
kados |
owen: so if they don't work as expected, look to the record |
10:21 |
|
kados |
well, it's a problem we can fix :-) |
10:22 |
|
kados |
why call number sorting isn't working has got me stumped |
10:23 |
|
kados |
owen: 'fix the "bold_title" in search results page' |
10:23 |
|
kados |
owen: is that where the search query used to be highlighted? |
10:23 |
|
owen |
Yeah, that was a long time ago. |
10:24 |
|
owen |
(in dev_week terms) |
10:24 |
|
kados |
owen: well, I can add it back, just tell me what you want the term to be wrapped in |
10:24 |
|
kados |
owen: a span? |
10:24 |
|
kados |
with a specific clas? |
10:24 |
|
kados |
class even |
10:24 |
|
kados |
<span class="term"></span> maybe? |
10:25 |
|
owen |
Sure, why not |
10:26 |
|
kados |
k ... |
10:28 |
|
owen |
So kados bring me up to date on what you've been working on. |
10:29 |
|
kados |
sure |
10:29 |
|
kados |
mainly, over the weekend, I worked on the OPAC |
10:30 |
|
kados |
my goal was to clean up all the OPAC scripts and remove all redundant/obsolete data |
10:30 |
|
kados |
and address some of the usability feedback I've gotten from various places |
10:31 |
|
owen |
Like what? |
10:31 |
|
kados |
well, I've gotten numerous comments that all account-related stuff should be in one place |
10:31 |
|
kados |
or at least in one area |
10:32 |
|
kados |
several people didn't understand why the book bag link was above search results |
10:32 |
|
kados |
they expected it to be in the 'account' section |
10:33 |
|
kados |
several clients also didn't understand why there were so many copies of the same link on a given page |
10:33 |
|
kados |
the 'Search Home' link on http://search.athenscounty.lib.oh.us/ |
10:34 |
|
owen |
Book bags aren't tied to accounts |
10:34 |
|
kados |
is also called 'library catalog' |
10:34 |
|
kados |
in the opacnav |
10:34 |
|
kados |
which several people found confusing |
10:34 |
|
kados |
I agree book bags aren't tied to accounts |
10:35 |
|
kados |
but conceptually, that's where people seem to want them |
10:35 |
|
kados |
at least based on the feedback I solicited |
10:35 |
|
owen |
It's getting into shaky territory, because we can't lead people to believe that the book bag is saved with their account settings. |
10:35 |
|
kados |
I cleaned up all the opac scripts to use the new API (hopefully didn't miss anything) |
10:36 |
|
kados |
added a resident search to the masthead that should show up on all screens |
10:36 |
|
kados |
modified how the searchdesc displays |
10:37 |
|
kados |
'search' returned X results is always resident but not a large thematic element |
10:37 |
|
kados |
added a rss feed icon |
10:37 |
|
owen |
How does that work? |
10:37 |
|
kados |
floated the re-sort list to the right to free up space |
10:37 |
|
kados |
the rss relies on the OpenSearch plugin I haven't committed yet |
10:38 |
|
kados |
so the feed is actually generated from amazon.com (though we can generate feeds natively ... I just didn't get around to it) |
10:38 |
|
kados |
s/amazon.com/a9.com/ |
10:38 |
|
kados |
lets see ... |
10:38 |
|
kados |
opac-main has had a complete facelift |
10:39 |
|
kados |
in preparation for a couple of features I've been working on |
10:39 |
|
kados |
they are: |
10:39 |
|
owen |
opac-main seems to be empty! |
10:39 |
|
kados |
yep |
10:39 |
|
kados |
1. 4-5 items related to items you've previously checked out |
10:40 |
|
kados |
2. 4-5 items pulled from a given staff list (virtual shelf) |
10:40 |
|
kados |
3. 4-5 items recently returned |
10:40 |
|
kados |
so the idea was, make the main page more interactive |
10:41 |
|
kados |
if they need to do a search, they can do it from any page |
10:41 |
|
kados |
advanced search is always visible (in opacnav) |
10:41 |
|
kados |
and then there's the new facets feature |
10:42 |
|
owen |
Is that in an iframe? |
10:42 |
|
kados |
(before I get to that ... |
10:42 |
|
kados |
no iframe |
10:42 |
|
owen |
Just overflow? |
10:42 |
|
kados |
yea, it's overflow |
10:43 |
|
kados |
the content is in javascript |
10:43 |
|
kados |
yep, you'll get that |
10:43 |
|
kados |
I turned on overflow: auto to enable that |
10:43 |
|
kados |
because sometimes the list goes beyond the space designated |
10:45 |
|
kados |
all of my design choices were made from looking at the following sites: |
10:46 |
|
kados |
http://aqua.queenslibrary.org/ |
10:46 |
|
kados |
http://www.amazon.com |
10:46 |
|
kados |
http://firstsearch.oclc.org |
10:46 |
|
kados |
http://search.ebay.com |
10:46 |
|
kados |
http://www.lib.ncsu.edu/ |
10:47 |
|
kados |
as well as the 'don't make me think' book |
10:48 |
|
kados |
as far as facets go |
10:49 |
|
paul |
kados, could you explain to me what "facets" means pls ? |
10:49 |
|
kados |
paul: if you do a search, the faceted results are those listed on the left-hand side under 'Subject' Authors' 'series' |
10:50 |
|
kados |
paul: it is a compilation of all of a given aspect of the results |
10:50 |
|
kados |
paul: such as subject, author, etc. |
10:50 |
|
kados |
owen: so facets ... |
10:51 |
|
kados |
will be completely re-written from a functional POV |
10:51 |
|
kados |
probably to follow the FAST guidelines: http://www.oclc.org/research/projects/fast/ |
10:51 |
|
kados |
as far as display, I plan to only show the first 3-5 |
10:52 |
|
owen |
3-5 what? |
10:52 |
|
dewey |
-2 |
10:52 |
|
kados |
and hide the rest within a final one called 'see X more' |
10:52 |
|
owen |
Thanks dewey! |
10:52 |
|
kados |
3-5 of each type of facet |
10:52 |
|
kados |
so for instance, a search on neal stephenson |
10:52 |
|
kados |
has: |
10:52 |
|
kados |
Subjects |
10:52 |
|
dewey |
Subjects are authority controlled |
10:52 |
|
kados |
Scientists |
10:52 |
|
kados |
Treasure trove |
10:53 |
|
kados |
Kings and rulers |
10:53 |
|
kados |
See 25 more |
10:53 |
|
kados |
Series |
10:53 |
|
kados |
The Baroque cycle |
10:53 |
|
kados |
Volume two of The Baroque cycle |
10:53 |
|
kados |
A Bantam Spectra book. |
10:53 |
|
kados |
Authors |
10:53 |
|
kados |
Stephenson, Neal |
10:54 |
|
kados |
that way, the user won't have to scroll down to see that there are multiple types of facets |
10:54 |
|
kados |
the final thing I did was the 'remove search' and 'further limit search to' |
10:54 |
|
kados |
and the 'default facets' |
10:55 |
|
kados |
default facets show up when there hasn't been a search |
10:55 |
|
kados |
the others allow refining the current search to include additional characteristics |
10:55 |
|
kados |
well ... 'Further limit' does |
10:56 |
|
kados |
(and only works with CCL queries at the moment) |
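
An illustrative CCL refinement of the kind kados means could be as simple as the query below; the qualifier names au and su are assumptions that depend on the local ccl.properties mapping:

    au="Stephenson, Neal" and su="Scientists"
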
10:56 |
|
kados |
well ... that's enough to fill a book :-) |
10:57 |
|
kados |
owen: comments, questions? |
10:57 |
|
paul |
kados : addbiblio.pl reverted. http://i5.bureau.paulpoulain.c[…]mple/addbiblio.pl works (login test/test) |
10:57 |
|
kados |
paul: did you change templates too? |
10:57 |
|
kados |
paul: or are they stock cvs? |
10:57 |
|
paul |
mmm... the default templates are OK |
10:57 |
|
paul |
npl ones are not. |
10:58 |
|
kados |
I suspect MARC21 doesn't work in default templates |
10:58 |
|
kados |
I will test that first |
10:58 |
|
kados |
using default |
10:58 |
|
paul |
but I don't have a MARC21 DB |
10:58 |
|
paul |
(if you have a small one, throw me a link to DL it) |
10:58 |
|
kados |
ok |
10:59 |
|
owen |
How is 'remove search' supposed to work? |
10:59 |
|
kados |
owen: you click on it and it takes you back to the advanced search page :-) |
10:59 |
|
kados |
owen: it's pretty minimal at the moment |
11:00 |
|
kados |
owen: but eventually I'd like it to be a search history |
11:00 |
|
kados |
owen: I'm not 100% sure that the js hierarchy is the best way to do that |
11:00 |
|
kados |
owen: but it was pretty quick for a prototype mechanism |
11:02 |
|
paul |
kados : http://i8.bureau.paulpoulain.c[…]mple/addbiblio.pl is now UNIMARC with NPL templates (login test/test as well) |
11:04 |
|
owen |
kados: my instinct is to say that the js hierarchy is too complex for our users |
11:04 |
|
owen |
Particularly when the 'nodes' hardly ever contain more than one item each |
11:04 |
|
kados |
owen: well that will change |
11:07 |
|
owen |
How so? |
11:09 |
|
kados |
well ... I can't explain the specifics because I haven't figured it out completely |
11:09 |
|
owen |
:) |
11:09 |
|
kados |
thd and I spent a good deal of the weekend pondering LC subjects |
11:10 |
|
kados |
they present some pretty interesting puzzles :-) |
11:10 |
|
kados |
owen: there were 5 original proposed ways to nest a given subject in a hierarchy: |
11:10 |
|
kados |
http://kados.org/subject_hierarchy.html |
11:10 |
|
kados |
and now we discovered OCLC's FAST project |
11:11 |
|
kados |
so that's a 6th that I haven't taken the time to dissect yet |
11:11 |
|
kados |
but one of those 6 will be used |
11:11 |
|
thd |
kados: and Chan used a non-LC heading as an example |
11:12 |
|
kados |
thd: right ... |
11:12 |
|
thd |
kados: so I have erred in the absence of an authority file |
11:13 |
|
kados |
thd: to err is human :-) |
11:13 |
|
kados |
:-) |
11:14 |
|
owen |
I like the 'narrow results by' options on the NCSU site |
11:15 |
|
owen |
But I'm confused about what we're trying to do with facets...expand or narrow or both? |
11:15 |
|
thd |
kados: there are still 5 examples did you mean 5? |
11:15 |
|
kados |
good question |
11:15 |
|
kados |
thd: the sixth is FAST from OCLC |
11:15 |
|
kados |
owen: good question |
11:15 |
|
thd |
kados: what I am hoping we would do is change |
11:15 |
|
thd |
kados: FAST does nothing for legacy records |
11:16 |
|
kados |
thd: good point |
11:16 |
|
kados |
owen: well, I think for the OPAC, expanding is probably the goal |
11:16 |
|
kados |
owen: for the Intranet, I think staff will want narrowing |
11:16 |
|
kados |
owen: so both :-) |
11:16 |
|
thd |
kados: FAST is a system for how records might be subject coded when none of the cataloguers know how to subject code any longer |
11:17 |
|
kados |
thd: that description fits NPL :-) |
11:18 |
|
owen |
I like that the NCSU sidebar: is a simple list; has limited results for each facet; has a 'show more' link |
11:18 |
|
thd |
kados: the problem with FAST is that all subject strings are very short so there is no way to specify the two narrow questions that a work may treat |
11:19 |
|
kados |
owen: yep, we could do it that way |
11:19 |
|
kados |
owen: the js was just for fast prototyping |
11:19 |
|
owen |
I also like the way NCSU handles the narrowing process: you can delete the additional search terms by clicking an X in the results screen |
11:19 |
|
kados |
owen: and I think they use FAST for display, or a subset of it |
11:19 |
|
thd |
kados: when everything is top level you have no way to associate which strings belong together. |
11:20 |
|
thd |
kados: FAST is subject soup. |
11:20 |
|
kados |
owen: where is that? |
11:21 |
|
thd |
kados: LCSH are a mess but a reasonably precise mess. |
11:21 |
|
owen |
Above the search results: |
11:21 |
|
owen |
"Search 'something': Early works to 1800 [remove] : England [remove] : Doctrines [remove] |
11:21 |
|
owen |
We found 2 matching items. " |
11:21 |
|
kados |
owen: I also like the subtle use of color for checked out vs available |
11:21 |
|
kados |
owen: ahh, very nice, I see it now |
11:21 |
|
kados |
owen: we could easily do that |
11:24 |
|
owen |
I think the Queens Library search is way too complex |
11:27 |
|
owen |
kados: How can I best be of help? |
11:33 |
|
thd |
owen: what is too complex about a search that is too simplistic? |
11:34 |
|
owen |
What do you mean thd? |
11:34 |
|
thd |
owen: I would agree that it is not intuitive |
11:35 |
|
thd |
owen: I think the progressive hierarchy of subject subdivisions implied by the Queens library search is mistaken |
11:35 |
|
kados |
owen: paul has just fixed addbiblio in rel_2_2 |
11:36 |
|
kados |
owen: so one thing we need to do is sync default and npl templates |
11:36 |
|
owen |
Okay. |
11:36 |
|
thd |
owen: I think that the search should be faceted by place topic time and form |
11:38 |
|
thd |
owen: I think that such subject elements from the same facet should be grouped together in the same facet for extending the search |
11:39 |
|
thd |
owen: I suspect that kados will not go so far as adding the facility to browse for other terms to use in the same facet but I will in future |
11:43 |
|
tumer |
hi all |
11:44 |
|
tumer |
kados: i am in pain with mike's answer |
11:44 |
|
kados |
tumer: I can imagine :( |
11:45 |
|
tumer |
so no more holdings records |
11:45 |
|
kados |
tumer: even if we stored them in the same db with different indexing rules |
11:45 |
|
kados |
tumer: I think two queries is too many :( |
11:46 |
|
tumer |
i am reverting back to dev_week |
11:46 |
|
paul |
the next question being: does it mean we forget about storing the issues information in zebra too ? |
11:46 |
|
tumer |
paul:no |
11:46 |
|
tumer |
we already do that |
11:47 |
|
paul |
but it seems updating biblio + item for each issue/return is too CPU-consuming, isn't it ? |
11:47 |
|
paul |
s/seems/seemed/ |
11:47 |
|
kados |
it might be |
11:47 |
|
kados |
plus in dev week, we do: |
11:47 |
|
kados |
* store in items table |
11:47 |
|
tumer |
that is another matter i am investigating |
11:47 |
|
kados |
* store in zebra |
11:47 |
|
kados |
so two operations with each circ, rather than just one |
11:48 |
|
tumer |
waiting to see if it is windows issue only |
11:48 |
|
paul |
store in items or issues table ? |
11:48 |
|
kados |
tumer: waiting to see if what is windows issue only? |
11:48 |
|
paul |
in 2.2, it's just in issues & calculated on the fly |
11:48 |
|
kados |
paul: sorry, issues |
11:48 |
|
paul |
ah, ok. |
11:49 |
|
kados |
if we want to search by availability, we must store in zebra |
11:49 |
|
kados |
and several clients want this feature |
11:49 |
|
kados |
so the next question is: how can we speed up saving records to zebra? |
11:49 |
|
paul |
by doing a commit only once every 10mn with shadow DB, iirc |
11:49 |
|
kados |
one idea I had was to do a batch 'commit' operation once every 5 minutes or so |
11:50 |
|
kados |
paul: you beat me :-) |
11:50 |
|
paul |
so, no more commit in koha, but in crontab. |
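
With Zebra's shadow registers enabled, the commit-from-crontab idea can be a single cron line; zebraidx and its commit command are real, but the five-minute interval and the config path here are illustrative:

    # apply the accumulated Zebra updates every five minutes
    */5 * * * *  zebraidx -c /usr/local/koha/etc/zebra-biblios.cfg commit
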
11:50 |
|
kados |
right ... one possible solution |
11:50 |
|
paul |
sounds like an acceptable plan to me. |
11:50 |
|
kados |
but it should be a syspref |
11:50 |
|
kados |
IMO |
11:50 |
|
kados |
because code already exists to commit with every operation |
11:50 |
|
paul |
I agree, because for small libraries it won't be a problem |
11:51 |
|
kados |
right |
11:51 |
|
paul |
(or for large ones with only a few circs) |
11:51 |
|
kados |
yep |
11:51 |
|
kados |
and for libraries where accuracy is more important than speed |
11:51 |
|
paul |
so I was right when I heard the systempref crying, 5mn ago ;-) |
11:51 |
|
kados |
:-) |
11:51 |
|
tumer |
i already added a batch commit syspref |
11:52 |
|
paul |
tumer is faster than lucky luke |
11:52 |
|
paul |
(if you know this cartoon) |
11:52 |
|
tumer |
yep |
11:52 |
|
tumer |
but my pistol is aching today |
11:53 |
|
thd |
owen, kados: see Norgard et al., The online catalog: from technical services to access service, in Advances in Librarianship, v. 17. |
11:54 |
|
thd |
the title does not do justice to the content |
11:54 |
|
kados |
heh |
11:56 |
|
kados |
thd: I will probably not have time to work on facets today |
11:56 |
|
kados |
thd: maybe later this evening or tomorrow |
11:56 |
|
thd |
kados: well later this evening or tomorrow would be good |
11:56 |
|
kados |
thd: :-) |
11:59 |
|
paul |
bye bye everybody, see you on thursday |