Time |
S |
Nick |
Message |
13:10 |
|
thd |
owen: My test showed that there is no JavaScript functionality in the add repeatable field link for Firefox 1.07 |
13:11 |
|
owen |
Wow...so it just fails silently? |
13:12 |
|
thd |
owen: I have only a non-JavaScript link to the same page itself with a blank anchor at the end. |
13:13 |
|
owen |
Are there any errors in the Javascript Console? |
13:14 |
|
thd |
owen: I did not check very extensively. Let me look now. |
13:14 |
|
owen |
It would be very helpful to know any relevant Javascript errors. |
13:15 |
|
thd |
owen: This may take a few minutes because most resources are being consumed by a giant system update on my low bandwidth. |
13:16 |
|
owen |
No problem. |
13:22 |
|
thd |
owen: no error messages and the add field has only a link to the same addbiblio.pl page for the record with a blank page anchor. |
13:23 |
|
owen |
You're talking about the little plus sign link after the tag description, right? e.g. 'INTERNATIONAL STANDARD SERIAL NUMBER' |
13:24 |
|
thd |
owen: I have this as an example for the link intended to create another ISBN, field 020: http://localhost:8082/cgi-bin/[…]ldbiblionumber=30# . |
13:24 |
|
owen |
The link is just a dummy--it's a trigger for a javascript function. So the link itself doesn't matter. |
13:25 |
|
thd |
owen: I do no where it always had been actuated previously :) but that is good to check |
13:25 |
|
thd |
s/no/know/ |
13:25 |
|
kados |
thd: i will have a couple hours later today to troubleshoot these issues |
13:25 |
|
kados |
thd: i also discovered some major bugs in the new editor |
13:27 |
|
thd |
kados: in another three or four days or so I will have updated Firefox to version 1.5 along with just about everything else on my system |
13:29 |
|
thd |
kados: This morning paul suggested to me that he was supposing the DOM had changed in some significant way between Firefox 1.0X and 1.5 so that the means for accessing the same part of the document in one version would not work in the other version. |
13:31 |
|
thd |
kados: I suppose that could be possible but that did not seem to be a reasonably likely account of the difference without actually knowing how the code changed and how it worked. |
13:32 |
|
owen |
Anyway, the problems kados is seeing are much more than just a javascript issue |
13:33 |
|
thd |
owen: yes, I am also skeptical about that as I would have seen only changes in the DOM that provide enhanced methods of access to the document historically. |
13:33 |
|
dewey |
okay, thd. |
13:34 |
|
thd |
owen: what sort of problems is kados seeing, which are unrelated to JavaScript? |
13:36 |
|
owen |
kados noticed and I confirmed that information was getting lost when saving records after editing. |
13:37 |
|
owen |
kados also saw a problem with the "duplicate record" function from the catalog |
13:39 |
|
kados |
it's possible paul changed something in the scripts that broke MARC21 |
13:39 |
|
kados |
I'm planning to spend a few hours on this later today |
13:42 |
|
owen |
kados: Did you mean that when you clicked the 'duplicate record' button the record that opened was incomplete and/or empty? |
13:42 |
|
kados |
no ... |
13:42 |
|
kados |
it's when I go to save it |
13:42 |
|
kados |
and it asks me if it's a duplicate |
13:42 |
|
kados |
of the original record |
13:42 |
|
kados |
(which it is of course) |
13:42 |
|
kados |
that screen has a mangled record on it |
13:43 |
|
kados |
since all of this was working in rel_2_2 right before paul made his changes |
13:43 |
|
kados |
I can only assume something he did is responsible ;-) |
13:43 |
|
thd |
I have now tested with standards compliant Opera 8.52 and observed the same conditions as with Firefox 1.07 exactly. |
13:46 |
|
owen |
thd: the clone tag function? It works fine for me in Opera 8.54 (Windows) |
13:47 |
|
thd |
owen: maybe I need 8.54 instead of 8.52 (GNU/Linux) |
13:48 |
|
owen |
Again, it seems unlikely that it's a browser version issue |
13:48 |
|
owen |
Are you sure javascript is turned on? |
13:48 |
|
thd |
owen: actually Linux versions are usually a little behind. |
13:48 |
|
thd |
owen: yes |
13:50 |
|
thd |
owen: yes now I am certain |
13:50 |
|
kyle |
hey all |
13:50 |
|
owen |
Bummer, that would have been an easy fix ;) |
13:51 |
|
thd |
hello kyle |
13:51 |
|
kyle |
hello thd : ) |
13:51 |
|
kyle |
fyi, I created a project on sourceforge called koha-tools. |
13:51 |
|
kyle |
It has my firefox extension, reports generators and other stuff in it. |
13:52 |
|
thd |
owen: there should, however, be a standard warning message visible when JavaScript is turned off, on pages that need JavaScript to function. |
13:52 |
|
kyle |
The perl module for user simulation, and a marc tool for fixing 245 fields. |
13:53 |
|
thd |
owen: Very few pages should actually need JavaScript |
13:53 |
|
thd |
all pages needing JavaScript should be in the intranet only. |
13:54 |
|
owen |
few pages should /require/ javascript, but addbiblio is the exception, because of the complexity it demands |
13:54 |
|
thd |
kyle: are your 245 fields broken? |
13:54 |
|
kyle |
extremely |
13:55 |
|
kyle |
many that should have an indicator > 0 don't, and some that *should* have 0 have some other number. |
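For context, the 245 second indicator records how many leading characters (an initial article plus its trailing space) to skip when filing titles. A minimal sketch of recomputing it, assuming a toy English article list and plain-string titles rather than real MARC records:

```python
# Sketch: recompute the MARC 245 second indicator (nonfiling characters)
# from the title text itself. The article list is a small English sample;
# a real fix script would need the cataloguing language's articles and
# would handle diacritics and leading punctuation as well.

ARTICLES = ("the ", "an ", "a ")

def nonfiling_indicator(title):
    """Return the number of leading characters to skip when sorting."""
    lowered = title.lower()
    for article in ARTICLES:
        if lowered.startswith(article):
            return len(article)  # article plus its trailing space
    return 0

print(nonfiling_indicator("The Grapes of Wrath"))  # 4
print(nonfiling_indicator("Grapes of Wrath"))      # 0
```

A batch repair pass would apply this to each record's 245 $a and rewrite the indicator only when the computed value disagrees with the stored one.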
13:55 |
|
thd |
owen: Perl is a much better scripting tool for managing complexity. |
13:55 |
|
kyle |
our current ILS only stores marc data, but doesn't use it. |
13:56 |
|
kyle |
it has its own scheme for ordering search results. |
13:56 |
|
thd |
However, that would be server side. |
13:56 |
|
owen |
thd: true, but for an advanced editor to be efficient, it needs to be fast. addbiblio is too large a page to continually refresh as you're editing a record. |
13:57 |
|
thd |
owen: yes, I understand the reason and agreed reluctantly that it must be JavaScript |
14:00 |
|
thd |
owen: there is yet another possible model, sending small pages of one tab at a time, where almost everything stays on the server. |
14:01 |
|
thd |
owen: such a model could have the same resource consumption as the current JavaScript model. |
14:02 |
|
thd |
owen: especially as every plugin is a server side call. |
14:05 |
|
thd |
kyle: Does your ILS store record data in SQL or some other database model and only read MARC at the time of record creation? |
14:05 |
|
thd |
s/record creation/original record creation/ |
14:05 |
|
kyle |
thd: it uses a database called Btrieve. |
14:06 |
|
kyle |
thd: basically, yes, it only grabs data from the MARC record on import, then stores and ignores it. |
14:07 |
|
thd |
kyle: Can it export MARC from its own format if the MARC records were gone or would it be just a small subset of MARC. |
14:07 |
|
thd |
? |
14:09 |
|
thd |
kyle: actually, as I remember, your database, like most similar databases, does not export its own data without a fee being paid. |
14:10 |
|
kyle |
thd: we can get MARC data out, but it stores a marc record for each item in the database, 5 copies of a book = 5 identical marc records. |
14:11 |
|
kyle |
thd: I spent an incredible amount of time writing a program to fix it so that koha sees one biblio with multiple items. |
14:14 |
|
thd |
kyle: was your incredible amount of time spent fixing Koha so that it would use the records in the same manner, or fixing your MARC records so that Koha could use them. |
14:14 |
|
thd |
? |
14:15 |
|
thd |
kyle: I mean did you change Koha to accommodate your records as they were or did you attach all your holdings to a single record before import into Koha? |
14:16 |
|
kyle |
thd: we could have used the records as they were, but koha would have shown five separate instances of said book, instead of one book with five items attached. |
14:16 |
|
kyle |
thd: the latter |
14:17 |
|
kyle |
thd: actually, what we did was import them into koha as is, then I wrote a program to manipulate the koha database directly to remove the duplicates and readd them as additional items to a single biblio. |
14:17 |
|
kyle |
thd: so it will only work on the 2.2 branch of koha. |
14:17 |
|
thd |
kyle: I was wondering if you had added some record deduplication function to Koha itself, about which I had no knowledge. |
14:18 |
|
thd |
kyle: no wonder that took an incredible amount of time :) |
14:18 |
|
kyle |
thd: no, I was going to go that route but the processing time would have been astronomical, to the tune of factorial($numberOfMarcRecords) |
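Grouping on a normalized key is the usual way to sidestep that blow-up: even naive pairwise comparison costs roughly n(n-1)/2 comparisons, already prohibitive at a couple hundred thousand records, while a single hash-keyed pass is linear. A hedged sketch, with made-up field names rather than Koha's actual schema:

```python
# Sketch: collapse item-level duplicates into one biblio with attached
# items by grouping records on a normalized key, instead of comparing
# every record against every other. Field names are illustrative only.
from collections import defaultdict

def normalize_key(record):
    # A real key would weigh several fields; ISBN + title is a toy choice.
    return (record.get("isbn", "").replace("-", ""),
            record.get("title", "").strip().lower())

def collapse(records):
    groups = defaultdict(list)
    for rec in records:
        groups[normalize_key(rec)].append(rec)
    biblios = []
    for same in groups.values():
        biblio = dict(same[0])            # keep the first record's fields
        biblio["items"] = [r["barcode"] for r in same]
        biblios.append(biblio)
    return biblios

records = [{"isbn": "0-14-303943-4", "title": "Dune", "barcode": b}
           for b in ("b1", "b2", "b3")]
print(len(collapse(records)))  # 1 biblio, three items
```

Since kyle's source records were byte-identical copies, an exact normalized key suffices; fuzzier data would need the weighted matching discussed later in this log.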
14:19 |
|
thd |
kyle: yes, that would not be practical for CPU usage in real time |
14:20 |
|
thd |
kyle: I assume that you can now export your records from Koha as individual MARC records with holdings attached? |
14:21 |
|
kyle |
thd: that's why I took the route I did. Processing a couple hundred thousand records only takes a few hours with my current program. |
14:21 |
|
kyle |
thd: yes, that is correct. |
14:23 |
|
thd |
kyle: well, you have certainly solved it in what would seem to me the difficult way, but I guess every way requires creating a database of which records are which before merging holdings content. |
14:24 |
|
thd |
kyle: and Koha 2.2 provided that. |
14:24 |
|
kyle |
thd: yeah. |
14:27 |
|
thd |
kyle: your script may be tied to Koha 2.2, but if you needed to do something similar in the future, it could be done more easily with Zebra and MARC::Record than mostly in SQL, as I imagine you did. |
14:30 |
|
kyle |
thd: yes, I imagine so. The problem being that I have no idea how to manipulate data in Zebra. |
14:30 |
|
thd |
kyle: you export it first and manipulate it in MARC record. |
14:31 |
|
thd |
s/MARC record/MARC::Record/ |
14:31 |
|
kyle |
thd: what I would like to write is a MARC "scrubbing" program that would look at a MARC file's ISBN or other identifier, download the corresponding MARC record from LOC, and fill in any missing fields. |
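The merge half of such a scrubbing program can be sketched independently of the Z39.50 fetch. Records are modeled here as flat tag-to-value dicts, which ignores real MARC indicators and subfields; `scrub` is a hypothetical name, not code from the log:

```python
# Sketch of the "fill in missing fields" step only; fetching the donor
# record from LC over Z39.50 is out of scope here. Records are modeled
# as simple tag -> value dicts, which flattens real MARC structure.

def scrub(local, donor):
    """Return a copy of `local` with fields it lacks copied from `donor`."""
    merged = dict(local)
    for tag, value in donor.items():
        if tag not in merged:
            merged[tag] = value
    return merged

local = {"245": "Moby Dick", "020": "0142437247"}
donor = {"245": "Moby Dick", "300": "xvi, 625 p.", "650": "Whaling--Fiction"}
print(scrub(local, donor))
```

A real version would merge at the subfield level and never overwrite locally edited data, which is exactly where the false-match risk thd raises below starts to matter.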
14:32 |
|
thd |
kyle: I have some code to do just that. |
14:32 |
|
kyle |
thd: that's great, is it available somewhere? |
14:34 |
|
thd |
kyle: I would be happy to share it but it needs a little more work to avoid false matches. Maybe you could hire LibLime to hire me to improve the code. |
14:36 |
|
kyle |
thd: I wish I had the clout to do that, but John's still waiting for Zebra to be fully integrated into koha, I don't think I'll be talking him into funding another project anytime soon ; ) |
14:37 |
|
thd |
kyle: It currently consists of an LWP Perl script that drives special hidden features in a PHP/Yaz client that I had started before Perl::Zoom was ready. |
14:38 |
|
kyle |
thd: that sounds nifty. |
14:40 |
|
thd |
kyle: It need not cost much because I am very cheap but have John talk to kados. I do want to share the code but it has no comments and I have changed the PHP/Yaz client just recently so that the Perl would need updating at least to communicate the correct variables. |
14:41 |
|
kyle |
thd: I'll ask him about it next time I see him. Can't hurt to ask. |
14:41 |
|
thd |
kyle: the basic problem is that being confident that you have the same record is much more tricky than you would imagine at first. |
14:41 |
|
kyle |
thd: yeah, I can imagine so. |
14:42 |
|
thd |
kyle: I am actually desperate for work but I have spent months researching how to do this well so that I can do it well. |
14:43 |
|
thd |
kyle: I am actually desperate for any kind of work, so I am having to do things which keep me from working on Koha. |
14:45 |
|
thd |
kyle: I have tried this for one LibLime customer already so that I do have real experience with matching some very poor quality records and I know all the published research on MARC record matching techniques. |
14:46 |
|
thd |
kyle: Unfortunately, the LibLime customer was anxious to have records quickly so I did manual post processing of a few hundred records to give them results without correcting my script to do it right automatically. |
14:48 |
|
thd |
kyle: the LibLime customer did not even have MARC data, so obviously that made the matching more difficult when sometimes the only field was a title field with a couple of words. |
14:50 |
|
kyle |
thd: wow, that must've been fun. |
14:50 |
|
thd |
kyle: the greatest difficulty, even with real records, is that no two cataloguers will catalogue perfectly matching records for the same material, so you have to match on multiple fields with varying weights and give the records a score to know that you have a good automated match. |
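That weighted-score idea can be sketched as follows; the field names, weights, and threshold are invented for illustration, not taken from thd's actual client:

```python
# Sketch of weighted multi-field matching: each field contributes its
# weight when the normalized values agree, and the total is compared to
# a threshold. Weights and threshold here are made-up illustrations.

WEIGHTS = {"isbn": 5.0, "title": 3.0, "author": 2.0, "date": 1.0}
THRESHOLD = 6.0

def norm(value):
    """Collapse case and whitespace so trivial differences don't block a match."""
    return " ".join(value.lower().split())

def match_score(a, b):
    score = 0.0
    for field, weight in WEIGHTS.items():
        if field in a and field in b and norm(a[field]) == norm(b[field]):
            score += weight
    return score

def is_match(a, b):
    return match_score(a, b) >= THRESHOLD

a = {"isbn": "0684833395", "title": "Catch-22", "author": "Heller, Joseph"}
b = {"isbn": "0684833395", "title": "Catch-22 ", "date": "1996"}
print(match_score(a, b), is_match(a, b))  # 8.0 True
```

A production matcher would also normalize punctuation and diacritics, and would penalize conflicting values rather than just ignoring them.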
14:51 |
|
kyle |
thd: that makes a lot of sense |
14:52 |
|
thd |
kyle: yes, lots of fun, especially as bugs in the Koha MARC editor have delayed manually making the resulting records perfect. |
14:55 |
|
thd |
kyle: also, ISBNs, LCCNs, or other standard identifiers alone have a 15 to 20 percent error rate for yielding a valid match if they were entered manually. |
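One cheap defense against such manual-entry errors is validating the check digit before trusting an identifier as a match key. ISBN-10's modulus-11 check catches single-digit typos and adjacent transpositions; a small sketch:

```python
# Sketch: ISBN-10 check-digit validation. The weighted sum of the ten
# digits (weights 10 down to 1, with a final 'X' standing for 10) must
# be divisible by 11 for the number to be structurally valid.

def isbn10_valid(isbn):
    digits = isbn.replace("-", "").upper()
    if len(digits) != 10:
        return False
    total = 0
    for i, ch in enumerate(digits):
        if ch == "X" and i == 9:
            value = 10
        elif ch.isdigit():
            value = int(ch)
        else:
            return False
        total += (10 - i) * value
    return total % 11 == 0

print(isbn10_valid("0-306-40615-2"))  # True
print(isbn10_valid("0-306-40615-3"))  # False
```

A valid checksum still doesn't guarantee the number belongs to the material in hand, which is thd's next point: even correct identifiers can point at related but non-identical editions.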
14:57 |
|
thd |
kyle: and then even valid numbers will match records which are related but not identical material. |
14:59 |
|
thd |
kyle: a collection of weighted key fields can yield very accurate matches for almost all records with a large enough set of targets. |
15:00 |
|
kyle |
thd: that's interesting. However, I've got to go. Talk to you later. |
15:01 |
|
thd |
kyle: furthermore, my Z39.50 client identifies the most complete record among multiple matching records and will give you the record with the complete table of contents, most subject headings, etc. |
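Choosing the most complete record among several matches can be approximated with a completeness score; the scoring below (field count, with extra credit for a 505 contents note and 6XX subject fields) is an invented illustration, not thd's actual code:

```python
# Sketch: among several records matching the same material, prefer the
# most complete one. Records are flat tag -> value dicts; the weights
# (x2 for a 505 contents note, +1 per 6XX subject tag) are illustrative.

def completeness(record):
    score = len(record)
    score += 2 * sum(1 for tag in record if tag == "505")     # contents note
    score += sum(1 for tag in record if tag.startswith("6"))  # subject fields
    return score

def most_complete(records):
    return max(records, key=completeness)

sparse = {"245": "Walden"}
rich = {"245": "Walden", "505": "Economy -- Where I lived ...",
        "650": "Natural history", "300": "357 p."}
print(most_complete([sparse, rich]) is rich)  # True
```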
15:03 |
|
kyle |
thd: talk to you later. |
15:03 |
|
kyle |
bye all |
15:03 |
|
thd |
goodbye kyle |
16:20 |
|
hdl |
kados around ? |
16:20 |
|
hdl |
thd ? |
16:20 |
|
dewey |
thd is probably just dying to tell you but I fear that if I reveal all I will lose the ability to obtain funding for a business myself. or skeptical about that as I would have seen only changes in the DOM that provide enhanced methods of access to the document historically. |
16:21 |
|
thd |
dewey: what silly things you have collected together. |
16:21 |
|
dewey |
thd: i don't know |
16:21 |
|
thd |
dewey: at least the rest of us can appreciate the humour |
16:21 |
|
dewey |
thd: excuse me? |
16:22 |
|
thd |
hdl: I am here |
16:22 |
|
thd |
kados: is somewhere, maybe working on the record editor bugs. |
16:23 |
|
thd |
hdl: above |
16:23 |
|
hdl |
thd : |
16:24 |
|
hdl |
do you know where I could get access to a working current dev on zebra ? |
16:24 |
|
thd |
hdl: you mean head is not working so you want a working checkout? |
16:25 |
|
hdl |
No. |
16:25 |
|
hdl |
I mean I have a demo tomorrow and client would certainly be impressed to see what kados has already up :))) |
16:26 |
|
thd |
hdl: you do not mean his zoomopac search forms? |
16:26 |
|
hdl |
yes. |
16:27 |
|
thd |
hdl: http://zoomopac.liblime.com |
16:28 |
|
hdl |
:D |
16:28 |
|
hdl |
thx anyway. |
16:28 |
|
hdl |
hope you are fine. |
16:28 |
|
thd |
you are quite welcome hdl |
16:30 |
|
thd |
hdl: I am ok. I will be better in a couple of days, after a full system update gives me Firefox 1.5 along with other things, so I can use the record editor with fewer bugs. |
18:25 |
|
thd |
dewey: what happened to hewey and louie? |
18:25 |
|
dewey |
thd: i haven't a clue |
18:26 |
|
thd |
s/hewey/huey/ |
18:27 |
|
thd |
dewey: huey and louie are broken. |
18:27 |
|
dewey |
OK, thd. |
18:27 |
|
thd |
dewey: it is up to you to take good care of the forest. |
18:27 |
|
dewey |
...but it is <reply>... |
18:40 |
|
thd |
dewey: take good care of the forest. |
18:40 |
|
dewey |
thd: what? |
18:41 |
|
thd |
dewey: dewey is to take good care of the forest. |
18:41 |
|
dewey |
...but dewey is an infobot... |
18:42 |
|
thd |
dewey: dewey the inforbot is to take good care of the forest. |
18:42 |
|
dewey |
OK, thd. |
18:43 |
|
thd |
dewey: do you fix spelling errors? |
18:43 |
|
dewey |
wish i knew, thd |
18:44 |
|
russ |
dewey: spell nessacery |
18:44 |
|
dewey |
I can't find alternate spellings for 'nessacery' |
18:44 |
|
thd |
dewey: dewey, the infobot, is to take good care of the forest. |
18:44 |
|
dewey |
OK, thd. |
18:44 |
|
thd |
dewey spell inforbot |
18:44 |
|
dewey |
I can't find alternate spellings for 'inforbot' |
18:45 |
|
thd |
dewey: what is an infobot? |
18:45 |
|
dewey |
an infobot is at http://www.cs.cmu.edu/~infobot |
18:46 |
|
thd |
dewey: remember all infobots named dewey must take good care of the forest. |
18:46 |
|
dewey |
thd: i'm not following you... |
18:47 |
|
thd |
dewey: you do not need to follow me as long as you take good care of the forest. |
18:47 |
|
dewey |
thd: i'm not following you... |
18:47 |
|
thd |
dewey: exactly, carry on. |
18:47 |
|
dewey |
thd: i'm not following you... |
18:48 |
|
thd |
dewey: CMU should give you a better parser. |
18:48 |
|
dewey |
thd: what? |
18:49 |
|
thd |
exactly |
02:10 |
|
btoumi |
hi all |
02:39 |
|
btoumi |
one question for all champions |
02:39 |
|
btoumi |
is the deletedborrowers table the same as the borrowers table? |
02:41 |
|
osmoze |
hello #koha |
02:43 |
|
btoumi |
hi osmoze |
02:43 |
|
osmoze |
no champion here :p |
02:44 |
|
chris |
yes it should be the same bruno, and when you delete a borrower it should get stored there |
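The pattern chris describes, with deletedborrowers mirroring borrowers so a delete is archive-then-remove, can be sketched in SQL; the two-column schema here is a stand-in for Koha's real borrowers columns, not its actual schema:

```python
# Sketch of the delete-to-archive pattern: because deletedborrowers
# mirrors borrowers column-for-column, deleting a borrower is a copy
# into the archive table followed by a delete from the live table.
import sqlite3

conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE borrowers (borrowernumber INTEGER, surname TEXT)")
conn.execute("CREATE TABLE deletedborrowers (borrowernumber INTEGER, surname TEXT)")
conn.execute("INSERT INTO borrowers VALUES (1, 'Smith')")

def delete_borrower(conn, borrowernumber):
    # Identical column lists are what let SELECT * move the whole row;
    # this is why the two schemas must stay in sync across upgrades.
    conn.execute("INSERT INTO deletedborrowers "
                 "SELECT * FROM borrowers WHERE borrowernumber = ?",
                 (borrowernumber,))
    conn.execute("DELETE FROM borrowers WHERE borrowernumber = ?",
                 (borrowernumber,))

delete_borrower(conn, 1)
print(conn.execute("SELECT COUNT(*) FROM borrowers").fetchone()[0])        # 0
print(conn.execute("SELECT surname FROM deletedborrowers").fetchone()[0])  # Smith
```

This is exactly why a schema mismatch between the two tables, which btoumi reports below, breaks the delete path and needs an updatedatabase fix.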
02:44 |
|
btoumi |
i think they work too much |
02:44 |
|
chris |
because people often go .. oh i didnt mean to delete that, can I get it back |
02:45 |
|
btoumi |
i ask u because actually the deletedborrowers table is not the same; i made the change in updatedatabase |
02:45 |
|
btoumi |
and commit |
02:45 |
|
chris |
cool that would be good |
02:46 |
|
btoumi |
hi chris and ty |
03:02 |
|
paul |
hdl around ? |
03:37 |
|
btoumi |
:chris around? |
03:52 |
|
btoumi |
another question for campion |
03:53 |
|
paul |
campion is away I think :-D |
03:53 |
|
btoumi |
champion |
03:53 |
|
btoumi |
lol |
03:55 |
|
btoumi |
actually the deletedborrowers table is not correct: we have bad field names and useless fields, but if i modify updatedatabase |
03:57 |
|
btoumi |
i must do it as if i have a koha 2.2.5 database |
03:57 |
|
btoumi |
isn't it? |
03:57 |
|
paul |
not sure I understand the question correctly... |
03:57 |
|
btoumi |
I'll say it in French because it's tough in English |
03:58 |
|
btoumi |
in fact, in the current version of the koha 3.0 database, the deletedborrowers table contains the fields of the current borrowers table plus the old fields |
03:59 |
|
paul |
yep |
03:59 |
|
btoumi |
I am in the process of modifying updatedatabase so that the database from an earlier version can be updated |
04:00 |
|
btoumi |
and not the modifications that must be made to the current 3.0 database |
04:00 |
|
btoumi |
is that understandable? |
04:14 |
|
btoumi |
on the other hand, the HEAD database will have to be modified |
04:16 |
|
btoumi |
so I don't know if it's you, paul, or chris |
04:16 |
|
btoumi |
who will do it |
04:52 |
|
btoumi |
changes have been done for updatedatabase |
04:52 |
|
dewey |
btoumi: that doesn't look right |
04:54 |
|
btoumi |
:nick btoumi_lunch |
04:54 |
|
btoumi |
nick btoumi_lunch |
09:03 |
|
btoumi |
bye all, have a good weekend |
09:08 |
|
paul |
hello pierrick |
09:09 |
|
paul |
one last little visit? |
09:09 |
|
paul |
(from pfw.ineoms.com) |