Time | S | Nick | Message
12:03 |
|
slef |
ok np |
12:06 |
|
kados |
mailman will do it automatically |
12:24 |
|
slef |
no it won't, as it's bouncing to the From not the envelope |
12:24 |
|
slef |
(buggy RFC-ignorant mailserver) |
12:30 |
|
kados |
ahh |
13:19 |
|
owen |
Hi GrahamDoel |
13:19 |
|
GrahamDoel |
Hi owen |
13:20 |
|
GrahamDoel |
I have a fresh install of Koha and am struggling with the z39.50 searches, do you think you might be able to point me in the right direction
13:20 |
|
GrahamDoel |
? |
13:20 |
|
owen |
I can try |
13:20 |
|
GrahamDoel |
thanks |
13:21 |
|
owen |
What are the symptoms of your problem? Are the searches not returning any results? |
13:22 |
|
thd |
kados: are you back? |
13:22 |
|
GrahamDoel |
when I do an add biblio, enter the isbn and click the z3950 search, it just returns "z3950 Still ?? requests to go"
13:22 |
|
kados |
thd: yep |
13:22 |
|
GrahamDoel |
I have run the start daemon script, but don't know how to check |
13:23 |
|
thd |
/msg kados kados: I am still finishing the Afognak job |
13:23 |
|
kados |
:-) |
13:23 |
|
owen |
Hmmm... that's one I /don't/ know. |
13:24 |
|
owen |
kados: can you assume the daemon is running properly if the script doesn't generate any errors? |
13:24 |
|
kados |
GrahamDoel: check the logs |
13:24 |
|
kados |
you need to see what the log says to be sure |
13:24 |
|
GrahamDoel |
ok... could you remind me where they are? |
13:25 |
|
kados |
GrahamDoel: wherever you specified in the z3950-daemon-options file
13:25 |
|
GrahamDoel |
ok, thanks... I'll look now |
13:27 |
|
owen |
GrahamDoel: you filled in a search term, right? An ISBN or title? |
13:28 |
|
GrahamDoel |
yes, I found that on a list somewhere and did check that I had done it.
13:32 |
|
GrahamDoel |
ahh.. I see. No connection at... |
13:42 |
|
thd |
kados: there is a bug in the 008 plugin |
13:42 |
|
kados |
thd: do tell |
13:45 |
|
thd |
kados: if the first 6 positions for the record creation date are blank, then using the plugin does not set them automatically but eliminates the first six positions, moving everything over by 6 positions
13:45 |
|
thd |
kados: is that clear? |
13:45 |
|
kados |
yes |
13:46 |
|
kados |
should be a simple fix |
13:46 |
|
thd |
kados: maybe this happens for the 008 plugin under other conditions but this was the condition where I noticed it. |
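The guard thd is describing could look roughly like this in Perl. It is only a sketch of the idea, not Koha's actual 008 plugin code: the helper name and sample handling are made up, and it simply fills blank date positions with today's date instead of dropping them, so the rest of the field keeps its alignment.

    use strict;
    use warnings;
    use POSIX qw(strftime);

    # If positions 00-05 of the 008 (date entered on file) are blank, fill
    # them with today's date in YYMMDD form instead of dropping them, so the
    # remaining positions are not shifted left by six characters.
    sub fix_008_date {                       # illustrative helper, not Koha code
        my ($f008) = @_;
        $f008 = '' unless defined $f008;
        $f008 .= ' ' x (40 - length $f008) if length $f008 < 40;   # MARC21 008 is 40 chars
        if (substr($f008, 0, 6) =~ /^\s*$/) {
            substr($f008, 0, 6) = strftime('%y%m%d', localtime);
        }
        return $f008;
    }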
13:51 |
|
GrahamDoel |
thank you kados and owen, problem solved. Thanks again |
13:55 |
|
kados |
thd: could you file a bug in bugzilla ... mark it as 'blocker' since it prevents creation of valid MARC21 records |
13:56 |
|
thd |
kados: is that what blocker means? |
13:56 |
|
kados |
blocker means that you can't release the software until it's fixed |
13:57 |
|
kados |
IMO anything that prevents us from creating valid MARC21 is a blocker :-) |
13:57 |
|
thd |
kados: you do not have to use the plugin and so there is a workaround |
13:58 |
|
thd |
kados: editing fixed fields without some extra aid is crazy though |
13:58 |
|
kados |
yep |
14:59 |
|
thd |
kados: bug filed but I have not been able to assign it to you |
15:01 |
|
thd |
kados: when I have tried assigning or CCing a bug to you it tells me that jmf@liblime.com is not a recognised email address for that purpose.
15:03 |
|
thd |
kados: how can I assign a bug to you |
15:03 |
|
thd |
? |
15:10 |
|
kados |
thd: I'm here |
15:10 |
|
kados |
thd: jmf@kados.org is the right address I think
15:10 |
|
thd |
ahh |
15:11 |
|
thd |
kados: I remembered an important question |
15:11 |
|
kados |
thd: I'm all ears :-) |
15:12 |
|
thd |
kados: we are using MARC instead of MARC-XML in Zebra for performance reasons? |
15:12 |
|
kados |
not quite |
15:12 |
|
kados |
we are using binary MARC for the initial import |
15:12 |
|
kados |
and MARCXML after that |
15:12 |
|
kados |
internally it's stored in Zebra's internal format |
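A rough sketch of the two formats kados describes, assuming MARC::Record, MARC::Batch and MARC::File::XML are installed; the file name is illustrative. Bulk import reads binary ISO 2709 records, and the same record object can then be serialised as MARCXML for later handling.

    use strict;
    use warnings;
    use MARC::Batch;
    use MARC::File::XML ( BinaryEncoding => 'utf8' );

    # Bulk import reads binary (ISO 2709) MARC, which is fast to parse...
    my $batch = MARC::Batch->new('USMARC', 'records.mrc');   # file name is illustrative
    while (my $record = $batch->next) {
        # ...while the same record can later be handled as MARCXML.
        my $xml = $record->as_xml();
        printf "%d bytes as MARCXML\n", length $xml;
    }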
15:13 |
|
thd |
kados: importation would be much slower if it were in XML?
15:15 |
|
kados |
yes |
15:15 |
|
kados |
about 1000 times slower :-) |
15:16 |
|
thd |
kados: does the record editor add a new record in MARC-XML or MARC? |
15:17 |
|
thd |
kados: does the record editor also submit the resulting record from edits in the same format?
15:19 |
|
thd |
kados: can we break the MARC record size limit without problem using MARC-XML once the record has been added or newly created? |
15:19 |
|
kados |
I think so |
15:19 |
|
kados |
though I haven't tested |
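The reason MARCXML sidesteps the limit is that ISO 2709 stores the total record length in the first five characters of the leader, so a binary record cannot describe itself past 99,999 bytes, while XML carries no such length field. A tiny check along these lines (the helper name is made up) shows where the limit bites:

    use strict;
    use warnings;
    use MARC::Record;

    # ISO 2709 keeps the record length in the first five characters of the
    # leader, so binary MARC cannot exceed 99999 bytes; MARCXML has no
    # equivalent limit.
    sub fits_in_iso2709 {                    # illustrative helper
        my ($record) = @_;
        return length($record->as_usmarc) <= 99999;
    }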
15:20 |
|
thd |
kados: what format does the record editor save to the database for Zebra? |
15:23 |
|
thd |
kados: is importation 1000 times slower if the records have already been converted to UTF-8? |
15:23 |
|
kados |
yes |
15:23 |
|
thd |
kados: what format does the record editor save to the database for Zebra? |
15:24 |
|
kados |
there is code to do both |
15:24 |
|
kados |
but currently, I'm using xml |
15:25 |
|
thd |
kados: is there any performance difference for Zebra presenting one format or other for search results? |
15:26 |
|
kados |
not that I know of
15:26 |
|
kados |
of course, XML is much more verbose :-) |
15:26 |
|
kados |
a 10K MARC21 file could easily be 100K in MARCXML |
15:27 |
|
thd |
kados: yet the browser only has to manage the browser view |
15:29 |
|
thd |
kados: have you bookmarked MARC::Record relative to MARC::File::XML? |
15:30 |
|
kados |
bookmarked? |
15:31 |
|
thd |
s/bookmarked/benchmarked add field, and other transformations/ |
15:33 |
|
thd |
kados: my spell checker sometimes gives my strange transformations that I do not always catch |
15:33 |
|
thd |
s/my/me/ |
15:35 |
|
kados |
I haven't benchmarked anything yet |
15:35 |
|
kados |
officially :-) |
15:38 |
|
thd |
kados: I guess that a benchmark is not needed in instances when you have a 1000 times difference. No one cares if it is 998 or 1002 times too long for that particular case. |
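A benchmark of the kind being discussed might be sketched with the core Benchmark module, assuming a file holding a single record; the file name and iteration count are placeholders.

    use strict;
    use warnings;
    use Benchmark qw(cmpthese);
    use MARC::Record;
    use MARC::File::XML ( BinaryEncoding => 'utf8' );

    # Load one record in each serialisation, then compare round-trip costs.
    open my $fh, '<', 'sample.mrc' or die $!;       # a file holding a single record
    my $blob   = do { local $/; <$fh> };
    my $record = MARC::Record->new_from_usmarc($blob);
    my $xml    = $record->as_xml();

    cmpthese(1000, {
        binary => sub { MARC::Record->new_from_usmarc($blob)->as_usmarc },
        xml    => sub { MARC::Record->new_from_xml($xml, 'UTF-8')->as_xml },
    });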
15:44 |
|
kados |
:-) |
17:15 |
|
thd |
kados: are you still there? |
21:13 |
|
russ |
dewey seen thd? |
21:13 |
|
dewey |
thd was last seen on #koha 3 hours, 57 minutes and 37 seconds ago, saying: kados: are you still there? [Fri Jun 30 10:15:30 2006] |
21:13 |
|
russ |
thd are you about? |
21:13 |
|
thd |
yes russ |
21:15 |
|
russ |
hiya |
21:15 |
|
russ |
i have a marc question for you if you have a minute |
21:15 |
|
thd |
yes |
21:15 |
|
russ |
the 300 b subfield, is that repeatable? |
21:16 |
|
russ |
or should it be repeatable? |
21:16 |
|
thd |
yes, I believe |
21:18 |
|
thd |
russ: apparently, I was wrong |
21:19 |
|
thd |
russ: 300 is repeatable; even 300 $a is repeatable, and $a is seldom repeatable
21:20 |
|
thd |
russ: 300 $b is not repeatable |
21:20 |
|
russ |
right ok |
21:20 |
|
russ |
thanks |
21:20 |
|
russ |
what resource do you use to find this out? |
21:22 |
|
thd |
russ: http://www.loc.gov/marc has the most up-to-date concise format information
21:22 |
|
thd |
russ: I may guess at what you want to know for encoding 300 |
21:24 |
|
thd |
russ: If you were wanting to use a repeated $b to add additional information about accompanying material in $e, which is not allowed, then ..
21:24 |
|
thd |
put all the information in $e itself for the accompanying material |
21:26 |
|
russ |
ok cool thanks |
21:26 |
|
thd |
russ: consider the example of the atlas in http://www.loc.gov/marc/biblio[…]phys.html#mrcb300 |
21:27 |
|
thd |
russ: 300 ##$a271 p. :$bill. ;$c21 cm. +$eatlas (37 p., 19 leaves of plates : 19 col. maps ; 37 cm.) |
21:28 |
|
thd |
russ: $e contains everything for the accompanying atlas which might have been in a repeated $b were it allowed. |
21:29 |
|
thd |
russ: $e also contains what might have been in a repeated $c which is allowed but I have not noticed it used. |
21:31 |
|
thd |
russ: $e, itself, is not repeatable |
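For reference, the LC atlas example could be built in Perl with MARC::Field like this, keeping all of the accompanying-material detail inside the single, non-repeatable $e rather than a repeated $b:

    use MARC::Field;

    # The LC atlas example built programmatically: everything about the
    # accompanying atlas stays inside the single, non-repeatable $e.
    my $f300 = MARC::Field->new('300', ' ', ' ',
        a => '271 p. :',
        b => 'ill. ;',
        c => '21 cm. +',
        e => 'atlas (37 p., 19 leaves of plates : 19 col. maps ; 37 cm.)',
    );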
21:41 |
|
russ |
thanks for all of that, that link to the loc was helpful, i have found what i need. |
02:11 |
|
osmoze |
hello |
02:16 |
|
btoumi |
hi all |
02:16 |
|
ToinS |
hello |
02:29 |
|
btoumi |
hi toin's |
02:34 |
|
ToinS |
hi bruno
02:35 |
|
btoumi |
how's it going
02:35 |
|
btoumi |
? |
02:36 |
|
hdl |
hello world |
02:36 |
|
ToinS |
very well
02:36 |
|
ToinS |
hi hdl
02:36 |
|
btoumi |
hello hdl |
03:39 |
|
slef |
hii |
04:00 |
|
ToinS |
hi slef |
06:40 |
|
tumer |
hdl:are you around? |
07:20 |
|
hdl |
tumer[A]: I'm here. |
07:20 |
|
hdl |
I was having lunch and overlooked your beep |
07:20 |
|
hdl |
tumer ??? |
07:20 |
|
dewey |
tumer is here for a few seconds ;-) |
07:21 |
|
tumer |
hi hdl? |
07:21 |
|
hdl |
hi |
07:21 |
|
hdl |
how are you ? |
07:21 |
|
tumer |
i am having problems with authorities
07:21 |
|
btoumi |
hi all |
07:22 |
|
hdl |
If I can help. |
07:22 |
|
hdl |
Can you tell me ? |
07:22 |
|
tumer |
hdl: on the editor when you get an authority with all these repeatable fields.. |
07:22 |
|
tumer |
the biblio-blind-search.pl gets confused |
07:23 |
|
hdl |
an url ? |
07:23 |
|
tumer |
if the cataloger plays with cloning subfields or changing their order.. |
07:24 |
|
tumer |
and then tries to fill the field from the authorities..
07:24 |
|
hdl |
Oh Yes. |
07:24 |
|
tumer |
they get filled into different tags ..
07:24 |
|
hdl |
OUPPS. |
07:24 |
|
tumer |
or if you try to delete them..
07:24 |
|
tumer |
it deletes a different tag |
07:25 |
|
tumer |
hdl:did you get this? |
07:25 |
|
hdl |
No, but I have VERY simple authorized forms. |
07:26 |
|
tumer |
hdl: do you understand the problem? |
07:26 |
|
hdl |
Yes. |
07:26 |
|
hdl |
Should be a problem of duplicating the authtagtoreport in the right order.
07:26 |
|
tumer |
so authorities is now broken with this new editor and i had to stop using authorities
07:27 |
|
hdl |
Is this only because of the latest changes in MARC editor ? |
07:27 |
|
tumer |
i think so because we never had it before |
07:28 |
|
tumer |
we are in the middle of cataloguing 10,000 books and the system is now broken |
07:28 |
|
hdl |
BEST is the enemy of good. :(
07:29 |
|
hdl |
which version are you working on ? |
07:29 |
|
hdl |
devweek ? |
07:29 |
|
tumer |
if you want to reproduce the problem.. |
07:30 |
|
tumer |
just have a page where there is more than one field that uses authorities on the same page
07:30 |
|
tumer |
thd: i am using dev_week merged to rel_2_2
07:31 |
|
thd |
tumer: yes |
07:31 |
|
tumer |
i am using npl templates though |
07:32 |
|
hdl |
wow, what a mess. |
07:32 |
|
tumer |
hdl:also i think there is a missing code |
07:33 |
|
hdl |
Is there an official branch for devweek merged with rel_2_2 or is it that You did a merge ? |
07:33 |
|
tumer |
when you clear a field of an authority the authority number does not get cleared..
07:34 |
|
tumer |
the blind-biblio-search.pl only clears subfields a..z
07:34 |
|
thd |
tumer: I had thought that there had been no changes to the authorities editor in a very long time |
07:35 |
|
tumer |
thd:i am talking about using authorities in normal marc editor |
07:35 |
|
hdl |
tumer: It is not so easy to understand but with default templates, you have to open the popup and clear entry. |
07:35 |
|
tumer |
thd: that's exactly what i am talking about
07:35 |
|
thd |
yes, I was about to guess that |
07:36 |
|
tumer |
oh hi thd |
07:36 |
|
thd |
hello tumer |
07:36 |
|
tumer |
hdl:the previous line was supposed to be for you, not for thd
07:37 |
|
thd |
I realise you pinged me by mistake |
07:37 |
|
tumer |
its just that i write to td more so used to write ths automatically |
07:38 |
|
tumer |
the last line is a mess of mistakes.sorry |
07:38 |
|
hdl |
It's ok. |
07:38 |
|
tumer |
hdl:its the same way with npl templates. You use a popup for authorities
07:39 |
|
thd |
tumer: so the problem is that $3 is not being cleared when the popup is opened? |
07:39 |
|
hdl |
But on my devweek version, which is quite old, the "clearing" of authorities seems to work. |
07:40 |
|
hdl |
(thd: $9 kohaauth number) |
07:40 |
|
tumer |
hdl:yes on the screen but the marc record could be left with $9 authid filled
07:40 |
|
thd |
oops $9 I mean |
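A sketch of the cleanup tumer is asking for, using only documented MARC::Field methods; the helper name is made up and this is not a claim about what blind-biblio-search.pl actually does:

    use MARC::Field;

    # When an authority-controlled field is cleared, drop the $9 (Koha
    # authority id) along with the visible $a-$z subfields so a stale
    # authority link is not left behind.
    sub clear_authority_link {               # illustrative helper
        my ($field) = @_;
        my @kept = grep { $_->[0] !~ /^[a-z9]$/ } $field->subfields;
        return undef unless @kept;           # nothing left worth keeping
        return MARC::Field->new(
            $field->tag, $field->indicator(1), $field->indicator(2),
            map { @$_ } @kept,
        );
    }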
07:41 |
|
thd |
tumer: are you using a recent version from CVS? |
07:41 |
|
tumer |
hdl:the main problem is not cleaning |
07:41 |
|
tumer |
the problem occurs if you use this cloning of subfields and then use the authorities
07:42 |
|
thd |
tumer: which version are you using? |
07:42 |
|
tumer |
my marc editor version is the one just before paul broke it
07:43 |
|
tumer |
i am afraid to upgrade now |
07:43 |
|
thd |
tumer: that version has a problem when cloning fields and certainly you should not upgrade the addbiblio.pl |
07:44 |
|
thd |
tumer: kados made a fix which he has not posted yet, but he described the needed changes
07:45 |
|
tumer |
well, not upgrading is not so easy, as the system is now a half-breed
07:46 |
|
hdl |
Yes, I understand. |
07:47 |
|
tumer |
hdl:say a subject authority has 150$a, 150$x and another 150$x, does it get transferred to the marc editor correctly?
07:48 |
|
hdl |
I have not tested. But paul told me that there could possibly be a problem. |
07:49 |
|
tumer |
hdl:well there is. It does not |
07:49 |
|
hdl |
I could investigate but not before next week. |
07:50 |
|
tumer |
even if you prepare multiple $x's in the editor they get filled with the same data all over.
07:50 |
|
hdl |
It should be a problem of PERL context. |
07:51 |
|
tumer |
hdl:the problem i am having is not of multiple subfields but of completely wrong fields getting cleared or even sometimes filled |
07:51 |
|
thd |
tumer: I have found the fix kados suggested on #koha |
07:52 |
|
hdl |
you know: when using $field->subfield('a') it returns either a string OR a list depending on the variable on the left of your =
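Assuming a MARC::Record version whose subfield() honours list context, as hdl describes, the difference looks like this (the 150 field content is invented for the example):

    use MARC::Field;

    my $field = MARC::Field->new('150', ' ', ' ',
        a => 'Cookery', x => 'History', x => '19th century');

    # Context comes from the left-hand side of the assignment:
    my $first = $field->subfield('x');   # scalar context: first $x only
    my @all   = $field->subfield('x');   # list context: every $x value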
07:52 |
|
thd |
25/06/06 12:09:00+-5<kados:#koha>for a temporary fix |
07:52 |
|
tumer |
hdl:the changes in marc editor breaks authorities |
07:52 |
|
thd |
25/06/06 12:09:11+-5<kados:#koha>look for all instances of 'new_from_xml' in addbiblio.pl |
07:52 |
|
thd |
25/06/06 12:09:17+-5<kados:#koha>and make sure they look like this: |
07:52 |
|
thd |
25/06/06 12:09:18+-5<kados:#koha>my $record=MARC::Record->new_from_xml($xml, 'UTF-8'); |
07:52 |
|
tumer |
thd:i already have that |
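Spelled out with its prerequisites, the quoted fix would sit in code roughly like the following; the eval and the STDIN read are illustrative additions, not part of addbiblio.pl:

    use strict;
    use warnings;
    use MARC::Record;
    use MARC::File::XML ( BinaryEncoding => 'utf8' );

    my $xml = do { local $/; <STDIN> };   # a MARCXML record from somewhere

    # new_from_xml is only available once MARC::File::XML is loaded, and a
    # malformed record should not take the editor down with it.
    my $record = eval { MARC::Record->new_from_xml($xml, 'UTF-8') };
    warn "could not parse record: $@" if $@;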
07:53 |
|
thd |
oh, I suspected that all the bugs would not be gone |
07:54 |
|
thd |
tumer: I have seen more examples of double encoding UTF-8 to UTF-8 |
07:54 |
|
tumer |
kados has been silent about the encoding problems i reported, does anyone know why?
07:55 |
|
thd |
tumer: kados is extremely fatigued by encoding problems |
07:56 |
|
tumer |
thd:well at least that relieves me of some of mine
07:56 |
|
slef |
Is there any easy way to debug a z39.50 connection that I think is being blocked by a firewall, or do we need to wait for the administrator to interrogate the firewall? |
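One low-tech way to test the firewall theory from the Koha server itself, without waiting for the administrator, is a plain TCP connect to the target's Z39.50 port; the host and port below are just an example target.

    use strict;
    use warnings;
    use IO::Socket::INET;

    # If a plain TCP connection to the target's Z39.50 port cannot be opened,
    # the problem is the network or firewall rather than the Koha daemon.
    my ($host, $port) = ('z3950.loc.gov', 7090);   # example target only
    my $sock = IO::Socket::INET->new(
        PeerAddr => $host,
        PeerPort => $port,
        Timeout  => 5,
    );
    print $sock ? "reached $host:$port\n"
                : "no connection to $host:$port: $!\n";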
07:58 |
|
thd |
tumer: there is still also a problem with the fonts used in the CSS for every template which will obscure the actual character content for some UTF-8 multibyte characters. |
07:59 |
|
tumer |
thd:yes i know but not for internet explorer which i use |
07:59 |
|
thd |
tumer: this problem is browser independent |
08:00 |
|
tumer |
thd:as long as i stick with the true type (unicode) characters of windows i do not get that problem
08:02 |
|
thd |
tumer: my problem of posting UTF-8 content is gone now with Firefox 1.5.04. What I am identifying is a problem for character display that may not affect Turkish characters but certainly exists for French character display in both Firefox and the Opera post from Windows. |
08:02 |
|
thd |
s/post/port/ |
08:03 |
|
thd |
slef: what problems does your Z39.50 connection report
08:03 |
|
thd |
? |
08:06 |
|
thd |
tumer: I am observing this character display problem especially in the uncommon cases where a fixed width font is used but maybe it is my GNU/Linux system |
08:08 |
|
slef |
thd: none, as far as I can tell. |
08:09 |
|
thd |
tumer: kados did say that he would hunt down the double encoding problem that you reported |
08:09 |
|
thd |
tumer: he has only been back from ALA yesterday |
08:10 |
|
thd |
slef: do you mean the connection is working fine or that you have no error messages? :) |
08:11 |
|
thd |
slef: can you determine that the daemon is running with the ps command? |
08:15 |
|
thd |
tumer: we could test whether you can see the same problem when you are off the phone
08:20 |
|
tumer |
thd: i am back now |
08:21 |
|
tumer |
thd: what do you want me to do? |
08:25 |
|
tumer |
thd: the page looks normal with french accented characters |
08:27 |
|
thd |
tumer: do you see a difference between the representation of the accented character in the search string and in the author results column?
08:27 |
|
tumer |
thd:no it looks perfectly fine |
08:27 |
|
thd |
tumer: what accented characters do you see? |
08:28 |
|
tumer |
thd:i see accented e that is é |
08:28 |
|
slef |
thd: yes, and I can run it through the debugger. It forks but never changes from Still ?? requests to go. |
08:29 |
|
thd |
tumer: so the problem is with X-windows or fonts on my GNU/Linux system then. |
08:29 |
|
tumer |
thd: only in the name Valérie is there an accent
08:30 |
|
tumer |
thd: by the way this is what i am trying to achieve |
08:30 |
|
thd |
tumer: thank you, now I know that I suffer deeply for my freedom. I see some very strange out-of-order accents on this page
08:31 |
|
thd |
tumer: yet, because different fonts are used on the page with an individual record instead of multiple search results, the individual record looks fine
08:32 |
|
tumer |
thd:yes, the problem is how did you manage to get this page to display correctly like this and not have the double utf8 problem?
08:33 |
|
tumer |
doing the same search on my system will give the double encoding problem |
08:33 |
|
thd |
tumer: my X-windows may also be partly misconfigured like everyone's |
08:34 |
|
tumer |
thd:so you mean you know the solution for this?
08:34 |
|
thd |
tumer: I have not seen the double encoding problem on the installation on my system at home but I have seen it on a production system that I have been working on for kados |
08:35 |
|
thd |
tumer: I do not know the solution for double encoding other than finding where it is happening and stopping it. |
08:36 |
|
tumer |
thd:now this is serious. What you are saying is that it's not a double encoding problem but a problem of X-windows or Windows?
08:37 |
|
tumer |
thd: by the way are your search results coming from ZEBRA or mysql?
08:38 |
|
thd |
tumer: i expect that the double encoding I see on the production system where I have been editing records is from some part of the system not recognising that the record was already converted from MARC-8 to UTF-8. There had somewhat recently been a problem for MARC::File::XML doing that.
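The guard thd is describing would amount to checking Leader/09 before converting, so a record already flagged as UCS/Unicode ('a') is never pushed through MARC-8 conversion a second time. A sketch, assuming MARC::Charset is available; the helper name is made up:

    use strict;
    use warnings;
    use MARC::Record;
    use MARC::Charset qw(marc8_to_utf8);

    # Leader/09 is the character coding scheme: 'a' means the record is
    # already UCS/Unicode, a blank means MARC-8. Converting a UTF-8 record
    # a second time is what produces the double-encoded display.
    sub ensure_utf8 {                        # illustrative helper
        my ($record) = @_;
        return $record if substr($record->leader, 9, 1) eq 'a';
        for my $field ($record->fields) {
            next if $field->is_control_field;
            my @subs = map { ($_->[0], marc8_to_utf8($_->[1])) } $field->subfields;
            $field->replace_with(MARC::Field->new(
                $field->tag, $field->indicator(1), $field->indicator(2), @subs));
        }
        my $leader = $record->leader;
        substr($leader, 9, 1) = 'a';
        $record->leader($leader);
        return $record;
    }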
08:38 |
|
tumer |
thd: are you using zebra? |
08:38 |
|
thd |
tumer: my search results are coming from SQL |
08:39 |
|
thd |
tumer: I have not taken the time actually to set up Zebra yet. |
08:40 |
|
thd |
tumer: I have spent most days recently too busy to advance my installation of Koha
08:40 |
|
tumer |
thd:we may be sitting on another problem that this thing actually happens with ZEBRA only and not with SQL? |
08:41 |
|
thd |
tumer: I expect that the production system where I have seen double encoding is not using Zebra but I am not certain |
08:42 |
|
thd |
kados: are you awake yet? |
08:42 |
|
tumer |
thd:the same record with the same search will yield the double encoding on my system and kados's which both use zebra |
08:43 |
|
thd |
tumer: do you mean that you added it to your system? |
08:43 |
|
tumer |
thd: no, but i have the same author name, i mean
08:44 |
|
thd |
tumer: that is very bad |
08:44 |
|
thd |
tumer: has your record been saved with the record editor or only imported? |
08:45 |
|
tumer |
thd:unfortunately i have to go to a reception now, but at least we are starting to all see the same problem which is good |
08:45 |
|
thd |
:) |
08:46 |
|
tumer |
thd:if you catch kados can you please mention this conversation to him?
08:47 |
|
thd |
tumer: I have been trying to show him this problem since late yesterday |
08:47 |
|
thd |
tumer: I hope that he has not gone away |
08:48 |
|
thd |
tumer: you also have the other problem with not clearing $9. |
08:50 |
|
thd |
slef: do you have a valid reliable Z39.50 target configured and are you searching for a record that can certainly be found? |
08:50 |
|
kados |
thd: I will be soon :-) |
08:51 |
|
kados |
thd: I've got to head out for a couple hours |
08:51 |
|
kados |
thd: I'll be back later |
08:51 |
|
thd |
kados:OK tumer and I have found something |
08:52 |
|
kados |
thd: can't wait to hear about it ;-) |
08:53 |
|
kados |
(btw: entity encoding the utf8 may be the answer to our previous encoding probs, I thought of that last night ...) |
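What kados means by entity encoding might look like this at output time: anything outside printable ASCII is turned into a numeric character reference, so the rendered page no longer depends on the declared charset matching the bytes sent. A sketch using HTML::Entities; the sample string is invented.

    use strict;
    use warnings;
    use HTML::Entities qw(encode_entities_numeric);

    # Replace everything outside printable ASCII with numeric character
    # references, so the page displays correctly whatever charset the
    # browser assumes.
    my $title = "Val\x{e9}rie";              # sample string only
    print encode_entities_numeric($title, '^\n\x20-\x7e'), "\n";   # é becomes &#xE9;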
09:30 |
|
thd |
paul: are you there? |
09:51 |
|
thd |
paul: are you still here? |
10:41 |
|
slef |
thd: no and no |
10:41 |
|
slef |
thd: :) |
10:41 |
|
slef |
thd: I'm 99% sure the firewall is blocking z39.50 to all other servers :-/ |
10:43 |
|
thd |
slef: z39.50 servers are only good. You should let them all in. My firewall does not block z39.50 responses over port 80 but my security is low :) |