Time  Nick    Message
11:43 hdl     am I alone ?
11:05 hdl     Sylvain: have you found answers?
11:04 hdl     hi
08:09 Sylvain hi
07:01 osmoze  paul_away>  when you're around, can you ping me please :)
18:07 thd     kados: are you still with us?
18:03 thd     Kados, Keeper of the Roadmap
18:01 thd     kados: Did paul write most of the roadmap descriptions?  hdl was unable to resolve my confusion; he seemed to defer to you.
17:59 thd     kados: most of the roadmap feature descriptions seem as if they have some Babelfish problems in translation to English, but three had me baffled.
17:58 thd     kados: Very early this morning, I posted some questions to hdl about some confusing language for a few features in the Koha roadmap.
17:56 thd     kados: I do not know the parameters for reaching the danger threshold.  You would need a real MARC expert for that, aside from much experimentation to see what happens when building records with known test characteristics.  I could enquire of some people who may know.
17:53 thd     kados: If the record approaches the maximum size limit, which apparently you found, then records should be converted to MARC XML before a danger threshold is reached.
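[A minimal sketch of the size check thd is describing, not code from Koha: binary ISO 2709 MARC stores the record length in a five-digit leader field, so 99,999 bytes is a hard ceiling (and 9,999 bytes per field); the threshold value and the idea of switching to MARC XML near the limit are illustrative assumptions here.]

    use strict;
    use warnings;
    use MARC::Record;
    use MARC::File::XML ( BinaryEncoding => 'utf8' );

    # ISO 2709 caps a whole record at 99,999 bytes (5-digit leader length),
    # so stop trusting the binary serialization well before that point.
    my $threshold = 90_000;    # hypothetical safety margin, not a Koha value

    my $raw    = do { local $/; <STDIN> };              # one binary MARC record
    my $record = MARC::Record->new_from_usmarc($raw);

    if ( length($raw) > $threshold ) {
        print $record->as_xml();       # MARC XML has no such size limit
    }
    else {
        print $record->as_usmarc();
    }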
17:51 thd     kados: YAZ also has character set conversions.  However, I have found a very bad bug in the MARC-8 to ISO 8859 conversion for YAZ.  When the bug is encountered, the remainder of the record is eaten :)
17:50 kados   thd: here's a question for you about MARC ... should there be system limitations on the number or length of records and fields (e.g., notes fields in order or MARC records)?
17:49 thd     kados: Of course if the script is not supported in an ISO 8859 encoding then the client must support UTF-8, which means no MS Windows for the Chinese users unless someone is willing to cope with the complications of UTF-16 conversion.
17:49 kados   gotcha
17:46 thd     kados: Queries coming back from the client have to be converted back to UTF-8 for Koha.
17:45 thd     kados: You do not want to mess with UTF-16 just to support a non-free OS.  UTF-16 introduces complications.
17:44 thd     kados: If your library is running MS Windows for the client browser then there is no UTF-8 support, only UTF-16, which is a problem unless everything can be converted to ISO 8859 going out.
17:42 thd     kados: OPAC clients cannot be expected to have UTF-8 support and even record editor clients have a problem.
17:41 thd     kados: Then for the record editor and OPAC the conversion should go out in ISO 8859
17:41 kados   sweet
17:40 thd     kados: I intend to.  A conversion to UTF-8 in Koha is necessary whatever character set the record has.
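[A minimal sketch of the flow described above, not Koha's actual code: MARC-8 input is normalized to UTF-8 for storage, then downgraded to ISO 8859-1 on the way out to clients that cannot handle UTF-8; the field chosen and the variable names are illustrative.]

    use strict;
    use warnings;
    use MARC::Charset qw(marc8_to_utf8);
    use Encode qw(encode);

    my $marc8_value = shift @ARGV // '';    # e.g. a 245$a taken from a vendor record

    # 1. Normalize incoming MARC-8 to UTF-8 for storage inside Koha.
    my $utf8_value = marc8_to_utf8($marc8_value);

    # 2. Going out to an OPAC or editor client without UTF-8 support,
    #    downgrade to ISO 8859-1; characters outside Latin-1 are lost.
    my $latin1_value = encode( 'iso-8859-1', $utf8_value );

    print $latin1_value, "\n";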
17:38 kados   thd: do you think you could get MARC-8, ANSEL and ALA working in Koha?
17:38 thd     kados: almost
17:38 rach    :-)
17:37 thd     kados: I have become one
17:37 kados   ooh ... that sounds nice
17:37 kados   or someone needs to become one ;-)
17:37 thd     kados: Encode::MAB2 will convert from ISO 5426 one way only for UNIMARC.
17:37 kados   we need a charset expert
17:37 kados   I suppose it would be hard to get all of them to display at the same time eh?
17:37 kados   It would be very nice if we could get MARC-8, ANSEL and ALA char sets supported in the display
17:36 kados   hmmm ...
17:33 thd     kados: What I do not understand is why BnF is distributing records in ISO 8859, which is not a UNIMARC character set.
17:31 thd     kados: This affects any non-ASCII character, even simple accented characters from French.
17:30 thd     Kados: Someone did not do their homework :)
17:29 thd     kados: No, I was very disappointed :)
17:29 kados   Koha doesn't use MARC::Charset, do we?
17:29 kados   shoot ...
17:29 thd     kados: however, there is MARC::Charset for the MARC 21 conversions.
17:27 thd     kados: Library character sets preceded Unicode in solving the same problem, but they are not the same and are not supported by common programming language functions.
17:26 kados   ie if we switch to UTF-8 will we have support for them or is it more complicated?
17:26 kados   are they covered in UTF-8?
17:25 kados   do you have any knowledge of these?
17:25 kados   yep
17:25 thd     kados: had you wanted to know about MARC-8, ANSEL, ALA character sets?
17:25 kados   it had to do with ALA character support in Koha
17:24 thd     kados: what had you wanted to ask me on Saturday?
17:24 kados   thd: do you have a pressing question?
17:24 kados   thd: not really ;-)
17:16 thd     kados: are you there?
15:33 kados   hdl: I don't see the question
15:32 kados   no ... missed it
15:09 hdl     hi osmoze
15:04 hdl     kados: Did you read thd's question?
14:47 genji   and I'm back.
14:33 genji_  going offline for a bit. changing modems.
14:26 owen    I don't remember anything like that from our previous system
14:26 owen    Sounds like a very specialized cataloging tool
14:26 kados   I guess Koha doesn't support this yet ;-)
14:26 kados   ahh ...
14:25 genji_  Comparison, for record merge.
14:25 kados   does anyone know why someone would want to display multiple MARC records side-by-side ... ?
14:12 owen    Right, I understand
14:12 genji_  This is pure logic speaking.
14:12 genji_  else, why put the [0]
14:12 genji_  so you'd only get the first item.
14:11 genji_  I think so, but $subscriptions seems to be a multi-item array.
14:11 owen    Hmmm...so if $subscriptions also contains a value for subscriptionid, I should be able to get it by saying, my $subscriptionid = $subscriptions->[0]{subscriptionid}; ?
14:09 genji_  First item in hash array, field bibliotitle.
14:09 owen    How does the [0] affect it?
14:08 owen    It's part of serial-issues.pl
14:08 owen    my $title = $subscriptions->[0]{bibliotitle};
14:08 owen    I'm confused about a line of code that has an unfamiliar syntax
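[A minimal sketch of the data structure being discussed, with made-up values: $subscriptions is a reference to an array of hashrefs, so ->[0] selects the first subscription and the hash key pulls out one of its fields.]

    use strict;
    use warnings;

    my $subscriptions = [
        { subscriptionid => 1, bibliotitle => 'Serials Review' },    # illustrative data
        { subscriptionid => 2, bibliotitle => 'Library Journal' },
    ];

    my $title          = $subscriptions->[0]{bibliotitle};      # 'Serials Review'
    my $subscriptionid = $subscriptions->[0]{subscriptionid};   # 1

    # To walk every subscription rather than only the first:
    for my $sub (@$subscriptions) {
        printf "%d: %s\n", $sub->{subscriptionid}, $sub->{bibliotitle};
    }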
14:08 genji_  you didn't. So, what's up?
14:08 owen    Sorry, didn't mean to wake you :)
14:07 genji_  -yawn- yes?
14:07 owen    any Perl-savvy humans out there?