Time |
S |
Nick |
Message |
11:20 |
|
kados |
thd: yep |
11:22 |
|
thd |
kados: The record in your example may be using 610 rather than 650 for Linux as a corporate name. |
11:25 |
|
thd |
kados: have you included all 6XX $a, $v, $x, $z, $y, or whatever may be needed in any possible case, as see also indexes in the biblio framework? |
11:27 |
|
thd |
kados: Also, have you unhidden them so that they would appear in the MARC view? |
11:32 |
|
thd |
kados: the above presumes that you have 650 $a as the MARC subfield linked to biblio.subjects and need to expand the indexes searched for the biblio framework. |
11:34 |
|
thd |
kados: are you still awake? :) |
11:39 |
|
thd |
kados: Another possibility is that your example record has been poorly catalogued, as is otherwise evident. |
11:39 |
|
kados |
thd: still here |
11:39 |
|
kados |
thd: shouldn't it show up in the MARC editor? |
11:39 |
|
kados |
thd: for that example record? |
11:40 |
|
kados |
thd: I couldn't find any more 6XX fields that had 'Linux' |
11:40 |
|
thd |
kados: Any arbitrary 6XX may have been used to catalogue "Linux" as a subject by the inexperienced or unprofessional cataloguer who had catalogued that record. |
11:42 |
|
thd |
kados: It will only show up in the MARC editor if a tab has been assigned for the relevant subfields and then they will appear under the assigned tab according to the biblio framework. |
11:43 |
|
kados |
right ... so I'll check the SQL then |
11:44 |
|
thd |
kados: The relevant field may not even be in the biblio framework, while paul's fix for subdivided subjects searches 600-699 sequentially. |
11:46 |
|
thd |
kados: A 69X or 6X9 field may have been used properly for a locally defined thesaurus to supply additional subject terms not found in a standard thesaurus. |
11:48 |
|
thd |
Kados: Where a standard thesaurus would be one such as Sears or LCSH which had been found insufficient. |
11:49 |
|
thd |
kados: Is your quoted error message from that record? |
11:52 |
|
kados |
here's the stuff in the 600s: |
11:52 |
|
kados |
| 707177 | 12699 | 630 | 16 | 07 | a | 1 | Linux | NULL | |
11:52 |
|
kados |
| 707178 | 12699 | 630 | 16 | 07 | 2 | 2 | Sears | NULL | |
11:53 |
|
kados |
| 707179 | 12699 | 650 | 17 | 07 | a | 1 | Computer operating systems | NULL | |
11:53 |
|
kados |
| 707180 | 12699 | 650 | 17 | 07 | 2 | 2 | Sears | NULL | |
11:53 |
|
kados |
| 707181 | 12699 | 650 | 17 | 07 | x | 3 | sistema de operacion de computadoras | NULL | |
11:54 |
|
thd |
kados: 630 is uniform title used as a subject. |
11:56 |
|
kados |
thd: so it's a case of poor cataloging then ... and I need to account for it with seealso ... seems like there are going to be quite a lot of cases similar to this |
11:56 |
|
thd |
kados: you need to add the appropriate see also index links to 650 $a or whatever else you may have used instead for linking to biblio.subjects. |
11:57 |
|
thd |
kados: Also, any subfields you want to appear in the MARC view have to be unhidden, and they have to be assigned to a tab in order to be edited in the editor. |
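(A rough sketch of the framework change thd is describing, expressed as Perl DBI statements against the Koha 2.x marc_subfield_structure table. The column names (seealso, hidden, tab) and the value format of seealso are assumptions based on that schema and should be verified; normally these settings are changed through the MARC framework editor in the staff interface rather than with raw SQL.)

    #!/usr/bin/perl
    # Hedged sketch: make 650 $a search the other 6XX $a subfields as "see also"
    # indexes, and unhide 6XX $a so it appears in the MARC view and editor.
    # Database name, credentials, and the seealso value format are placeholders.
    use strict;
    use warnings;
    use DBI;

    my $dbh = DBI->connect('DBI:mysql:database=Koha;host=localhost',
                           'kohaadmin', 'secret', { RaiseError => 1 });

    # Link the other subject fields to the subfield mapped to biblio.subjects.
    $dbh->do(q{
        UPDATE marc_subfield_structure
           SET seealso = '600a,610a,611a,630a,651a'
         WHERE tagfield = '650' AND tagsubfield = 'a'
    });

    # Unhide the 6XX $a subfields and assign them to a tab so they can be edited.
    $dbh->do(q{
        UPDATE marc_subfield_structure
           SET hidden = 0, tab = 6
         WHERE tagfield LIKE '6%' AND tagsubfield = 'a'
    });

    $dbh->disconnect();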
11:58 |
|
kados |
thd: thanks for your help |
11:58 |
|
kados |
thd: I can't wait until we start using Zebra :-) |
11:59 |
|
kados |
thd: btw: have you tried out perl-zoom yet? |
11:59 |
|
thd |
kados: This case may be poor cataloguing. The use of the Spanish translation as a subdivision instead of a repeated field suggests poor cataloguing. |
12:00 |
|
thd |
kados: I have not had time, but I have closely examined the documentation and noticed some problems, the first of which I have posted to the koha-zoom list, though it has not appeared yet. |
12:01 |
|
thd |
kados: The Subject of my first post is incomplete CQL support. |
12:02 |
|
thd |
kados: I know you are keen to use CQL, for good reason. |
12:04 |
|
thd |
kados: Is Mike planning to add scan() support for CQL any time soon? |
12:04 |
|
kados |
thd: shouldn't be necessary |
12:05 |
|
thd |
kados: there is also a library problem with sortby() support for CQL. |
12:06 |
|
kados |
thd: I asked Mike about that |
12:06 |
|
kados |
thd: here was his response: |
12:06 |
|
kados |
"you can do:" |
12:06 |
|
kados |
$ss = $conn->scan('@attr 1=1003 a'); |
12:06 |
|
kados |
($term, $occ) = $ss->term(0); |
12:06 |
|
kados |
$rs = $conn->search(new ZOOM::Query::CQL(qq[dc.author="$term"])); |
12:07 |
|
thd |
kados: scan() is not required but could be the basis of a very significant set of features that would help to put Koha far ahead of other ILS systems. |
12:07 |
|
thd |
kados: that is PQL not CQL |
12:07 |
|
kados |
thd: fortunately we can use scan() :-) |
12:08 |
|
kados |
thd: why does it matter? it's all going to be done behind the scenes |
12:08 |
|
kados |
thd: (and notice the $rs line _is_ CQL ...) |
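(For reference, a self-contained sketch of the mixed approach Mike's snippet illustrates: a PQF scan() to pull a term from the author index, then a CQL search on that term. The host, database name, and CQL index are placeholders, and the target has to accept CQL queries for ZOOM::Query::CQL to work.)

    #!/usr/bin/perl
    # Hedged sketch: mix PQF (for scan) and CQL (for search) on one connection.
    use strict;
    use warnings;
    use ZOOM;

    my $conn = new ZOOM::Connection('localhost:2100/kohadb');   # placeholder target
    $conn->option(preferredRecordSyntax => 'usmarc');

    # scan() takes a PQF query; browse the author index (Bib-1 use attribute 1003).
    my $ss = $conn->scan('@attr 1=1003 a');
    my ($term, $occurrences) = $ss->term(0);
    print "first scanned term: $term ($occurrences occurrences)\n";

    # Feed the scanned term into a CQL search.
    my $rs = $conn->search(new ZOOM::Query::CQL(qq[dc.author="$term"]));
    printf "CQL search found %d records\n", $rs->size();

    $rs->destroy();
    $ss->destroy();
    $conn->destroy();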
12:08 |
|
thd |
kados: So mixing the query syntax will not be a problem? |
12:11 |
|
thd |
kados: Is mixing the use of PQL and CQL the way around for sortby() as well? |
12:14 |
|
kados |
thd: mixing query syntax won't be a problem |
12:14 |
|
kados |
thd: don't think that applies to sortby() |
12:14 |
|
kados |
(meaning I think you can use CQL for that) |
12:14 |
|
kados |
thd: but really, any question you have would be useful to ask on koha-zebra |
12:14 |
|
kados |
as I'm sure others will have the same question |
12:15 |
|
kados |
and Mike can do a much better job answering |
12:15 |
|
thd |
The other issues that I noticed: there is still no documentation for the extended services, which are at least now supported. |
12:15 |
|
thd |
kados: I posted a couple of hours ago but my first post has not appeared yet. |
12:16 |
|
thd |
kados: Also, still no documentation for asynchronous connections. |
12:16 |
|
kados |
thd: good point |
12:16 |
|
kados |
thd: I hope you posted that as well :-) |
12:17 |
|
kados |
thd: (yea, turns out that savannah is pretty slow for mailing list purposes eh?) |
12:29 |
|
thd |
kados: Even if the case that you found for 630 is poor cataloguing, you should still have biblio frameworks that support any 6XX that may already be properly used in some record, both for the present set of records and any future ones. |
12:30 |
|
kados |
thd: well I'm certainly not going to put _every_ possible 6XX field in seealso |
12:31 |
|
kados |
thd: that kind of subject support will have to wait until 3.0 |
12:32 |
|
kados |
thd: there is documentation for the extended services |
12:32 |
|
thd |
kados: It should be built into the default frameworks which should work for Koha 3.0 as well. Then the work would not be redundant. |
12:32 |
|
thd |
kados: where??? |
12:32 |
|
kados |
thd: check the ZOOM::Package section |
12:33 |
|
thd |
kados: It only gives the options without showing how to use them concretely |
12:34 |
|
kados |
thd: I thought it was fairly obvious :-) |
12:34 |
|
thd |
kados: really? :) |
12:34 |
|
kados |
thd: did you see my announcement post? |
12:35 |
|
kados |
thd: it has a bit of code that _should_ work |
12:35 |
|
kados |
and that will illustrate how to use the methods, etc. |
12:35 |
|
thd |
kados: The PERL-ZOOM POD declares them not yet documented anywhere. |
12:36 |
|
kados |
thd: http://search.cpan.org/~mirk/N[…]ZOOM%3A%3APackage |
12:36 |
|
kados |
thd: scroll down to the ZOOM::Package section |
12:36 |
|
thd |
kados: I think we are not referring to the same extended services. |
12:37 |
|
kados |
"This class represents an Extended Services Package" |
12:39 |
|
thd |
kados: even this link does not work "Package options are listed at http://indexdata.com/yaz/doc/zoom.ext.html ". |
12:42 |
|
thd |
kados: Perhaps some extended services that I am thinking of are listed in the Z39.50 documentation but are not part of Zoom. |
12:43 |
|
thd |
kados: The issue would be moot for Perl-Zoom if what I am thinking of is not in Zoom. |
12:47 |
|
thd |
kados: I do not remember the names of the particular services off hand but they seemed to be intended for managing result sets in some undocumented fashion. |
12:48 |
|
thd |
kados: They had to do with storing queries and result sets for later retrieval, as a convenience and resource-management measure. |
12:50 |
|
thd |
kados: If what I am thinking about is not in Zoom, Index Data may have added something similar for YAZ Proxy. |
13:00 |
|
kados |
thd: yea, that's yaz proxy i think |
13:29 |
|
osmoze |
ouch |
13:30 |
|
osmoze |
paul, just a question: I have a book from La Francaise out on loan. I place a hold on it from home.... When it comes back, I try to transfer it to another branch, and it asks me to cancel the hold for the transfer... And during the transfer, we lose the item record. In 2.0 and 2.2.4, is this a known bug or not? |
13:30 |
|
osmoze |
paul or |hdl|, for that matter :) |
13:31 |
|
osmoze |
basically we lose all the location info, etc. |
13:35 |
|
paul |
ah, no, there's no known bug for that... |
13:35 |
|
paul |
and from what you describe, that's a big bug! |
13:35 |
|
osmoze |
hmm... well yes, although I'm getting a bit ahead of myself in saying it's on 2.2.3 too |
13:35 |
|
osmoze |
I'll test it properly right away |
13:41 |
|
osmoze |
roh |
13:41 |
|
osmoze |
it's on 2.0 |
13:41 |
|
osmoze |
we don't have the problem with 2.2.4 |
13:42 |
|
osmoze |
sorry for my alarm... In a way, I was alerted after the war was already over; it really is time I left for the weekend anyway |
13:42 |
|
osmoze |
on the other hand, sandrine is asking whether the item records lost in this case can be recovered easily, or whether they will have to be redone one by one |
13:43 |
|
paul |
I'm afraid they will have to be redone one by one. |
13:43 |
|
paul |
you're not in production on 2.2.4 yet? |
13:44 |
|
osmoze |
well no, my librarians' little hands have been getting comfortable with 2.2.4 for a week now; I just have to bring the database up to today and it could be good, but since there are a few bugs, I'm waiting for 2.2.5 before putting it in.... |
13:45 |
|
paul |
oki |
13:45 |
|
osmoze |
sandrine and the others have made me a list of bugs and usability problems |
13:45 |
|
osmoze |
we're going to review it on Wednesday so we can send you a niiiiice email |
13:45 |
|
osmoze |
:) |
13:45 |
|
osmoze |
by the way, I met francis this afternoon ^^, it's coming along :) |
13:46 |
|
osmoze |
the big request comes from using the module for journals |
13:47 |
|
paul |
hehe, on Wednesday I'm on vacation, so you can do whatever you like, it makes no difference to me ;-) |
13:47 |
|
paul |
you mean periodicals/serials check-in. |
13:48 |
|
paul |
in 2.2.4 it's starting to be pretty good, and in 2.2.5 there are a few more minor improvements. |
13:50 |
|
osmoze |
yes, serials check-in :) by the way, before you leave, you will set up my database for me, right? (°_°) |
13:51 |
|
paul |
oops, I almost forgot. |
13:51 |
|
paul |
where is it available again? |
13:59 |
|
osmoze |
the last I knew, hdl was scp-ing the database to you |
13:59 |
|
osmoze |
otherwise I'll make it available to you again, but tomorrow :( |
13:59 |
|
osmoze |
because the kids' Christmas party at the com starts in 5 minutes ^^ |
14:01 |
|
osmoze |
right, I'm off |
14:01 |
|
paul |
tomorrow I won't be here |
14:02 |
|
osmoze |
ok |
14:02 |
|
paul |
because now the office is no longer at home! |
14:02 |
|
osmoze |
heh heh, you're quite right :) |
14:02 |
|
paul |
so I'm not back until Monday |
14:02 |
|
osmoze |
we'll sort it out Tuesday, if you have time to set it up for me quickly |
14:02 |
|
paul |
oh yes indeed. It's just that I feel a bit alone and in silence |
14:02 |
|
osmoze |
otherwise I'll do it tomorrow or over the weekend and send you an email so you have it on Monday |
14:02 |
|
osmoze |
me, I always liked getting the secretary :) |
14:03 |
|
osmoze |
paul, that's what you get for becoming an old geek ;) |
14:03 |
|
osmoze |
good bye all |
14:03 |
|
paul |
bye |
18:07 |
|
rflach |
Can anyone tell me what a "This account is currently not available" error when attempting to run z3950-daemon-shell.sh means? |
18:07 |
|
rflach |
whoops, I mean -launch.sh |
18:08 |
|
rflach |
-shell.sh & runs ok but has other problems. |
18:11 |
|
rflach |
specifically, after my $rs=$conn->search($query); the $numresults=$rs->size() call was generating a "can't run size() on an undefined value" error. I added my $numresults="0"; if(defined($rs)){ before it, and that fixed a lot of problems. |
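(A small sketch of the guard rflach describes, with the failure surfaced through the connection object. It assumes the connection exposes errcode()/errmsg() as in the Net::Z3950 API used by the 2.x daemon; if the object differs, just log a generic failure instead.)

    # Hedged sketch: never call size() on an undefined result set.
    my $rs = $conn->search($query);
    my $numresults = 0;
    if (defined $rs) {
        $numresults = $rs->size();
    } else {
        warn sprintf("z3950 search failed: %s (code %s)",
                     $conn->errmsg(), $conn->errcode());
    }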
18:13 |
|
rflach |
but both before and after I have the following problem: after search results first show up, the next refresh tries to call /cgi-bin/koha/z3950/0 instead of search.pl which causes a server error. I added some javascript to stop the refreshes if search results exists (with a button to continue searching if desired), but I'm not sure that's an adequate solution. Any thoughts? |
03:55 |
|
root |
hi a newbie here |