Time |
S |
Nick |
Message |
12:56 |
|
jwagner |
Can someone answer a question about the logic behind calculating due dates with a holiday in the calendar? |
12:58 |
|
jwagner |
I had assumed that the holiday part only came into effect if a due date fell on a day defined as a holiday. What seems to be happening is that if a day defined as a holiday falls anywhere in the loan period, the loan period is extended by a day. Is this the way it's supposed to work? |
13:00 |
|
paul_p |
hi jwagner. strange, you're right. The way it's supposed to work, afaik, is to add days only if the due_date falls on a closed one. |
13:02 |
|
jwagner |
Hmmmm. We've tested and it doesn't seem to be working that way. Feb 16 is defined as a holiday. If the loan period extends past Feb 16 (e.g., would normally be due on Feb 20), the system is making it due a day later (Feb 21). However, if the loan period ends before Feb 16, the due date falls on the proper date. As far as I can see, the calendar is set up correctly. What else can I look at? I don't see many policies or sysprefs other than the ones |
13:04 |
|
paul_p |
jwagner: I'm afraid I won't be helpful here : our libraries usually have due dates calculated in weeks. So if someone issues a book on monday (open), it's due on monday (open too !). So nobody ever reported a bug on that, & I never had to investigate. |
13:07 |
|
jwagner |
Thanks, Paul. I'll keep poking around. I didn't see any relevant bugs on the bugzilla site, but maybe this is a new one. |
13:26 |
|
kf |
there is a syspref |
13:26 |
|
kf |
jwagner, i thought you can configure how calendar is handled |
13:27 |
|
kf |
https://sites.google.com/a/lib[…]nces--Circulation |
13:27 |
|
kf |
useDaysMode |
13:27 |
|
paul_p |
kf ++ !!! I forgot this one !!! |
13:28 |
|
kf |
im glad if i can help, because normally im just asking questions |
13:29 |
|
kf |
atm i have a problem with german umlauts and how they are displayed in firefox :( |
13:31 |
|
Elwell |
hmm. If I want to get the addresses in swiss format (streetname number, postcode town) where do I start hacking? |
13:32 |
|
owen |
Elwell: does the database have the appropriate fields for you to work with? Is it only a display issue? |
13:49 |
|
jwagner |
Paul and KF, sorry, I stepped away for a few minutes. Our UseDaysMode syspref is set to Calendar. If I'm understanding it correctly, that means it should pay attention to the holidays defined in the calendar. It seems to be doing that; it's just extending the duedate even if the defined duedate isn't on the holiday. |
13:51 |
|
kf |
i think you want to use datedue |
13:52 |
|
kf |
with calendar it does not count the closed days for the due date, as i understand it |
13:52 |
|
kf |
what you want is that closed days cant be a due date? |
13:54 |
|
jwagner |
Yes. So, looking again at the values for that syspref, what we want is datedue rather than calendar? |
13:54 |
|
kf |
i think so |
13:55 |
|
jwagner |
Many thanks! I'll give that a try. |
13:55 |
|
owen |
jwagner: We had the same confusion here. People were wondering why the due date had been extended by a day |
13:55 |
|
jwagner |
Owen, did the datedue fix it for you? |
13:56 |
|
kf |
i think datedue is what the libraries here want too - that due dates dont fall on days when the library is closed, but on the next day the library is open |
13:57 |
|
owen |
I'm still confused because the description of the syspref says "select Calendar to use the holidays module, and Days to ignore the holidays module" |
13:57 |
|
owen |
...but it doesn't mention datedue |
13:58 |
|
owen |
Ah, but in the manual: Datedue = the calendar only affects a due date if the due date would normally land on a closed date. |
13:59 |
|
owen |
so yeah, that's the right setting. |
14:00 |
|
kf |
owen, can you perhaps take a look at my problem with umlauts? |
14:00 |
|
owen |
I don't know if I can help, but I'd be glad to take a look |
14:00 |
|
kf |
perhaps you have an idea, how to teach firefox to display them right :( |
14:11 |
|
owen |
Any Koha statistics experts around? |
14:15 |
|
owen |
I'm wondering if Koha is still not recording a branch for some renewal transactions |
14:16 |
|
owen |
It makes it very difficult to calculate transactions per branch if 500,000 of your renewal stats for the year don't have a branch listed. |
14:23 |
|
atz |
owen: i don't think koha stats even records "issuingbranch" for a regular checkout |
14:24 |
|
atz |
just "branchcode" |
14:24 |
|
atz |
(i.e. owner of the item) |
14:24 |
|
owen |
I thought branchcode *was* issuingbranch |
14:25 |
|
atz |
well, i should be more specific. branchcode is the branch whose rules govern the checkout |
14:25 |
|
atz |
if you are setup to circ like most ppl, it is the owning branch |
14:25 |
|
atz |
but can be issuing branch (i think the syspref is HomeOrHoldingBranch |
14:25 |
|
atz |
) |
14:26 |
|
hdl_laptop |
atz: CircControl. |
14:27 |
|
atz |
thx hdl_laptop |
14:27 |
|
hdl_laptop |
np. |
14:29 |
|
owen |
So with CircControl set to ItemHomeLibrary, the statistic for a book checked out at *any* branch will list the item's home branch? |
14:30 |
|
atz |
i think it should. the problem is that issuingbranch is the field it should query, but that is rarely if ever populated |
14:31 |
|
atz |
basically the stats all key on "branchcode" because it is the one that gets results. but it isn't really the right one. |
14:31 |
|
atz |
i regard koha stats w/ suspicion. |
14:31 |
|
owen |
F'ing hell don't tell my boss that |
14:32 |
|
atz |
for single branch systems, i think it is equivalent |
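A minimal sketch of the per-branch transaction count owen is after, keyed on the statistics table's branch column as atz describes. It is written in Perl/DBI; the connection details, the type values counted, and the handling of empty branch codes are illustrative assumptions, not anything confirmed in this conversation.

    #!/usr/bin/perl
    # Sketch only: count issues and renewals per branch from Koha's statistics
    # table. The DSN/credentials are placeholders; NULL or empty branch values
    # are lumped together as "(no branch)".
    use strict;
    use warnings;
    use DBI;

    my $dbh = DBI->connect( 'DBI:mysql:database=koha', 'kohauser', 'secret',
        { RaiseError => 1 } );

    my $sth = $dbh->prepare(
        q{SELECT branch, COUNT(*) FROM statistics
           WHERE type IN ('issue', 'renew')
           GROUP BY branch}
    );
    $sth->execute;

    while ( my ( $branch, $count ) = $sth->fetchrow_array ) {
        $branch = '(no branch)' unless defined $branch && $branch ne '';
        printf "%-12s %d\n", $branch, $count;
    }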
14:32 |
|
owen |
hdl_laptop: do you know how 2.x versions handled this situation? Do you know which branch would be credited with the issue statistic? |
14:34 |
|
hdl_laptop |
Sorry, off the top of my head I can't answer you. |
14:36 |
|
kf |
hdl, can i ask you something about french accents and utf-8? |
14:38 |
|
hdl_laptop |
yes |
14:38 |
|
hdl_laptop |
kf ? |
14:38 |
|
kf |
here |
14:39 |
|
kf |
we have a problem with german umlauts in utf-8 |
14:39 |
|
hdl_laptop |
owen: in the statistics table in koha 2.2 it was the "issuing" branch which was stored. |
14:40 |
|
owen |
Thank you for confirming that hdl_laptop. That is what I wanted to hear :) |
14:40 |
|
hdl_laptop |
kf are you using MARC21 or unimarc ? |
14:40 |
|
kf |
marc21 |
14:40 |
|
hdl_laptop |
Is it a display or search problem ? |
14:40 |
|
kf |
a display problem |
14:41 |
|
hdl_laptop |
??? |
14:41 |
|
kf |
both, but zebra can solve the search problem |
14:41 |
|
kf |
i will try to explain |
14:41 |
|
kf |
in english its difficult |
14:41 |
|
hdl_laptop |
do you have a url ? |
14:41 |
|
hdl_laptop |
ich kann ein wenig deutsch sprechen. (I can speak a little German.) |
14:45 |
|
kf |
i sent you the link |
15:50 |
|
owen |
Hi liz and danny |
15:50 |
|
danny |
hey owen and #koha |
15:50 |
|
liz |
'mornin Owen |
15:51 |
|
liz |
did everybody have a lovely weekend? |
15:51 |
|
kf |
hi liz |
15:53 |
|
hdl_laptop |
hi liz |
15:57 |
|
liz |
lol owen |
15:57 |
|
liz |
and hi everybody :D |
16:02 |
|
nicomo |
hi liz |
16:03 |
|
liz |
I feel unnaturally welcomed this morning, thank you :) |
16:14 |
|
hdl_laptop |
owen: i just sent an update on circulation.tmpl on 3.0.1. can you proofread the result so that we can be sure not to have broken any feature? 6 eyes are better than 1 |
16:14 |
|
hdl_laptop |
(nicomo is testing it on his machine.) |
16:15 |
|
pianohacker |
owen: just saw your post |
16:15 |
|
pianohacker |
That setting really doesn't even work correctly? That would be the icing on the cake |
16:16 |
|
owen |
pianohacker: As far as I can tell. When I set the "hidden" value to display in the editor, it doesn't. It only displays if there is a pre-filled value or if it is mandatory. |
16:17 |
|
pianohacker |
Hrm |
16:17 |
|
pianohacker |
Maybe a double patch |
16:19 |
|
hdl_laptop |
owen: cnighswonger is not around, is he? |
16:20 |
|
owen |
hdl_laptop: you sent an update where? |
16:21 |
|
hdl_laptop |
well.... I should have sent it to the list, but I pushed it through directly. |
16:22 |
|
hdl_laptop |
It is on 3.0.x branch |
16:35 |
|
liz |
you know, the add patron interface really ought to exclude patron categories that don't match the type of patron you are adding |
16:35 |
|
liz |
i.e. if you are adding an adult patron, it shouldn't show you child patron categories |
16:37 |
|
owen |
hdl_laptop: I'm not clear on what your change was supposed to fix. Can you explain? |
16:45 |
|
pianohacker |
liz: believe it or not, I think you have me to blame for that |
16:46 |
|
pianohacker |
It was a fix for not being able to change the patron category across supercategories; i.e, an adult patron could not become a homebound patron |
16:47 |
|
owen |
it preselects the patron type you chose to add though |
16:51 |
|
hdl_laptop |
owen: there are 2 fixes in it. |
16:52 |
|
hdl_laptop |
a) divs behaved quite badly when a user was selected => the lefthand menu bar was displayed at the bottom of the circulation page. |
16:53 |
|
hdl_laptop |
b) Null checkouts and fines were displayed even when no borrowers were selected. |
16:53 |
|
liz |
owen: hm, ours just shows the categories alphabetically, starting with the child categories, regardless of the supercategory |
16:54 |
|
liz |
pianohacker: I can appreciate that fix, for sure |
16:55 |
|
liz |
lordy, everything you've ever heard about kansas and wind is coming true today |
16:56 |
|
rhcl |
yea, really strong wind up here too, and we're not even in Kansas! |
17:04 |
|
owen |
hdl_laptop: I see the problems you describe, and your updated file seems to fix them |
17:06 |
|
hdl_laptop |
Can you do some more tests ? |
17:07 |
|
hdl_laptop |
just to check that nothing else is broken by this patch. |
17:07 |
|
hdl_laptop |
It is quite hard to read tmpl files. |
17:07 |
|
hdl_laptop |
(They are not well factored. |
17:07 |
|
hdl_laptop |
) |
17:10 |
|
owen |
Certainly |
17:17 |
|
hdl_laptop |
(I think that eventually it could be good if error messages, warnings, or even the working area were in different included files, but that would be a heavy refactoring.) |
17:18 |
|
owen |
I do see a layout bug...let me try to track it down. |
17:23 |
|
hdl_laptop |
owen : can you point it to me ? |
17:24 |
|
owen |
If the patron has messages (overdues, circulation note, etc) that block is showing up below the checkout form rather than to the right of it |
17:26 |
|
owen |
according to the diff on git.koha.org, you took out a <!-- TMPL_UNLESS --> and a <!-- /TMPL_IF -->. Aren't you getting template errors from that? |
17:27 |
|
hdl_laptop |
I took out TMPL_UNLESS with its /TMPL_UNLESS |
17:28 |
|
hdl_laptop |
mmm... strange. |
17:35 |
|
hdl_laptop |
owen: this warning, "stack underflow: tags stack is empty"? |
18:33 |
|
Elwell |
hmm. kohadocs for 3.0 -- pdf's screwy for all? |
18:33 |
|
Khalsa |
the ones that are really old? |
18:34 |
|
Khalsa |
oh ncm |
18:34 |
|
Khalsa |
nvm |
19:50 |
|
owen |
Every time I try to delete an extra framework Koha says it's being used 312 times...Each of them is being used exactly 312 times? |
19:55 |
|
chris |
i find that hard to believe |
20:04 |
|
gmcharlt |
owen: it's counting number of MARC tags you have defined in framework, not number of bibs using that framework |
20:04 |
|
gmcharlt |
not saying that the current behavior is *useful*, mind you |
20:04 |
|
owen |
:) |
20:05 |
|
owen |
Would anything negative result from deleting a framework which was "in use" by existing records? |
20:06 |
|
gmcharlt |
likely bib display oddities |
20:06 |
|
gmcharlt |
and possibly worse |
20:07 |
|
gmcharlt |
assigning bibs in question to the default framework should be reasonably safe, at least in MARC21 |
20:08 |
|
gmcharlt |
assuming you haven't changed the Koha field to MARC mapping |
20:08 |
|
owen |
But there's not any way from within Koha to tell what frameworks are in use. |
20:08 |
|
liz |
omg, gmcharlt that is so good to know |
20:09 |
|
gmcharlt |
no, except for running the query: select frameworkcode, count(*) from biblio group by frameworkcode; |
20:09 |
|
gmcharlt |
biblio_framework.pl should be patched to do that as part of its delete_confirm check |
20:10 |
|
chris |
you could even check things like, if you try to change a framework that has fewer marc tags defined, what information will be 'lost', etc |
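As a rough illustration of the delete_confirm check gmcharlt suggests for biblio_framework.pl, the query above can be restricted to the framework being deleted. The helper name and how the script would wire it in are assumptions, not existing Koha code.

    # Sketch only: count how many bibliographic records still use a framework
    # before allowing it to be deleted. framework_usage_count() is a
    # hypothetical helper, not part of biblio_framework.pl today.
    use strict;
    use warnings;

    sub framework_usage_count {
        my ( $dbh, $frameworkcode ) = @_;
        my ($count) = $dbh->selectrow_array(
            'SELECT COUNT(*) FROM biblio WHERE frameworkcode = ?',
            undef, $frameworkcode
        );
        return $count;
    }

    # In the delete_confirm step one could then warn instead of deleting:
    # my $in_use = framework_usage_count( $dbh, $code );
    # warn "framework '$code' is still used by $in_use records" if $in_use;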
20:39 |
|
Elwell |
hmm, see the changelog for INSTALL.debian -- wants to install HTML::Scrubber from cpan and not from libhtml-scrubber-perl ?? |
20:43 |
|
chris |
what version is in debian? |
20:44 |
|
chris |
if its greater than 0.08 then it will be fine |
20:44 |
|
Elwell |
2 ticks (waiting for git) |
20:45 |
|
Elwell |
0.08-3 |
20:45 |
|
chris |
that'll be fine |
20:47 |
|
Elwell |
http://packages.debian.org/etc[…]tml-scrubber-perl |
20:48 |
|
Elwell |
should I do a diff and add it to the debian.packages rather than the cpan line and submit? |
20:50 |
|
chris |
yep |
20:50 |
|
chris |
that would be great |
20:51 |
|
Elwell |
ok - I'll double check what else is packaged with lib*-perl first |
21:15 |
|
Elwell |
ok - libpoe-perl --etch packaged version is at 0.3502, CPAN 1.003. Both get installed. |
21:15 |
|
Elwell |
koha:/home/build# locate POE.pm |
21:15 |
|
Elwell |
/usr/local/share/perl/5.8.8/POE.pm |
21:15 |
|
Elwell |
/usr/share/perl5/POE.pm |
21:31 |
|
Elwell |
chris: OK. if I've battled correctly against git, commit d68e4340c41b7e59ad2cad6aef078dd381a341e4 |
21:31 |
|
Elwell |
should have the changes |
21:32 |
|
chris |
you will want to git format-patch |
21:32 |
|
chris |
then git send-email |
21:32 |
|
gmcharlt |
to patches@koha.org |
21:32 |
|
chris |
and send it to patches@lists.koha.org |
21:32 |
|
chris |
or that |
21:32 |
|
chris |
they go to the same place :) |
21:33 |
|
Elwell |
meh |
21:33 |
|
Elwell |
git: 'send-email' is not a git-command |
21:34 |
|
gmcharlt |
what's your git --version |
21:34 |
|
Elwell |
ah. old. very. 1.4.4.4 |
21:34 |
|
chris |
apt-get install git-email |
21:35 |
|
chris |
comes as a separate package in debian |
21:38 |
|
Elwell |
bugger - can't see my diff as I did a git commit |
21:55 |
|
Elwell |
ok - I'm not convinced anything got sent (maillog looks empty), but I managed to extract the patch into a file and run git send-email |
21:56 |
|
Elwell |
blame gmail :-) |
21:56 |
|
Elwell |
** patches@koha.org R=nonlocal: Mailing to remote domains not supported |
21:57 |
|
chris |
git format-patch |
21:57 |
|
chris |
git format-patch origin |
21:57 |
|
chris |
did that not make a patch for you? |
21:58 |
|
Elwell |
yeah - got that (eventually once I worked out syntax -- first time with git) |
21:58 |
|
Elwell |
can I send the patch from gmail or does it break your handling? |
21:58 |
|
chris |
nope you can send it from gmail |
21:59 |
|
gmcharlt |
you can send as an attachment if need be (would prefer to not munge any whitespace diffs) |
22:04 |
|
Elwell |
ok - got as far as mailman@koha.org - I'm off to bed. |
03:23 |
|
Amit |
hi good morning |
03:23 |
|
Amit |
good morning mason, chris |
03:34 |
|
mason |
morning amit |
03:35 |
|
Khalsa |
Amit: what time zone are you in? |
03:39 |
|
Amit |
i m from india |
03:39 |
|
Amit |
9.09 A.M IST |
03:39 |
|
Amit |
hi khalsa |
03:39 |
|
Khalsa |
Hi |
03:42 |
|
Amit |
r u from khalsa? |
07:20 |
|
kf |
good morning #koha |
07:20 |
|
chris |
evening kf :) |
07:31 |
|
kf |
:) |
07:31 |
|
kf |
im still working on my encoding problems |
07:33 |
|
chris |
never fun |
07:34 |
|
Elwell |
hmm. who's the listadmin? http://lists.koha.org/ vs http://lists.koha.org/mailman/listinfo |
07:36 |
|
chris |
it will be someone at biblibre |
07:38 |
|
mc |
wow ... storm begun! |
07:39 |
|
mc |
hello world |
07:39 |
|
chris |
hi mc |
07:40 |
|
kf |
MARC8 -> UTF8 encoding is problematic |
07:40 |
|
chris |
yes MARC8 was a very stupid idea |
07:40 |
|
kf |
but at least i know that now - iso5... -> UTF8 works fine |
07:41 |
|
SelfishMan |
"problematic" is an interesting term to describe that |
07:41 |
|
kf |
but our union catalog is MARC8 + MARC21 of course... |
07:41 |
|
kf |
im too nice for other terms ;) |
07:42 |
|
SelfishMan |
A few more minutes of fighting with it and I would have been a broken person hiding in the corner in tears |
07:42 |
|
kf |
lol |
07:42 |
|
kf |
i was near yesterday |
07:42 |
|
kf |
i still have no solution, but i got some correct umlauts from a french z39.50-server now |
07:43 |
|
chris |
have you been using MARC::Charset? |
07:43 |
|
chris |
http://search.cpan.org/~esumme[…]b/MARC/Charset.pm |
07:44 |
|
kf |
its about the way umlauts are encoded, i want a single character version, but with marc8 i always get a 2 character version that is not searchable and displayed wrong |
07:45 |
|
chris |
ahh |
07:45 |
|
kf |
i think the two character version is already in marc8 |
07:46 |
|
kf |
so its not really a bug, but annoying :( |
07:47 |
|
chris |
how many letters can have an umlaut on them in german? |
07:47 |
|
kf |
i dont want a catalog, where german umlauts dont look the way they should |
07:47 |
|
chris |
if you code convert the marc records to xml |
07:47 |
|
kf |
its also about accents |
07:47 |
|
chris |
code=could |
07:47 |
|
kf |
umlauts are ä ö and ü |
07:47 |
|
kf |
its about the z39.50 download in koha |
07:48 |
|
chris |
then we could write some regex's in perl to convert them |
07:48 |
|
kf |
i dont know how to change that, we already converted data for batch import |
07:48 |
|
kf |
i think the problem is too big |
07:48 |
|
mc |
hmm |
07:48 |
|
kf |
i looked at french catalogs yesterday - there are many accents and things like â |
07:48 |
|
kf |
- i cant speak french |
07:48 |
|
mc |
want to convert umlauts symbols to what ? |
07:48 |
|
kf |
and there its always single character version, because of unimarc and iso5... |
07:49 |
|
mc |
ö > o ? |
07:49 |
|
mc |
or other thing ? |
07:49 |
|
kf |
but if you look for french records at LOC, its always two character |
07:49 |
|
kf |
i dont know if i can really explain that in english |
07:49 |
|
kf |
sorry mc? |
07:49 |
|
kf |
do you mean alphabetical sorting? |
07:50 |
|
chris |
mc: 2byte encoding to single byte encoding |
07:50 |
|
mc |
kf, it seems to me that you want to convert |
07:50 |
|
mc |
ohh |
07:50 |
|
kf |
ah chris, i think you understand me :) |
07:50 |
|
mc |
thx chris |
07:51 |
|
kf |
the problem is, i can have 2byte, it looks slightly wrong and can be searched, but if something is changed manually, it will get single byte |
07:51 |
|
mc |
utf8 > latin1, something like that ... but what if you don't have the same symbol? |
07:52 |
|
Elwell |
OK - seeing as Debian Lenny is coming out 'soon' (bearing in mind debians normal speed...) -- has anyone done an install on it? |
07:52 |
|
chris |
yep |
07:52 |
|
kf |
i think the symbols are not the problem |
07:52 |
|
mc |
Elwell, i've been using it for a while now |
07:53 |
|
kf |
its all there in single byte, just look at the french catalogs :( |
07:53 |
|
mc |
even in production |
07:54 |
|
Elwell |
mc: cool - deb testing proved pretty damn reliable last time I used it (fraid I jumped ship to the ubuntu side for desktops) |
07:54 |
|
mc |
Elwell, i left etch a while ago as many of the koha dependencies are in lenny, so the koha installation is very clean |
07:55 |
|
kf |
is there a way to get records in marc8 to utf8 with german umlauts as single character? |
07:56 |
|
paul_p |
hello everybody |
07:56 |
|
paul_p |
good morning from france ! |
07:56 |
|
mc |
i'm not aware of char problems, but is marc::charset using iconv or something like that? |
07:57 |
|
mc |
paul_p, hello |
07:57 |
|
Elwell |
bonjour paul from just across the border |
07:58 |
|
paul_p |
Elwell: hi. Where are you located exactly ? |
07:58 |
|
mc |
Elwell, experienced some problems with my ubuntu as a desktop. i'm not happy enough with it to use it for customers |
07:58 |
|
paul_p |
(/me in Marseille, south of france) |
07:58 |
|
kf |
i think marc::charset works correctly - but its still not what i need |
07:59 |
|
mc |
kf, sorry ... i'll stop bugging you as it seems i don't understand the problem |
07:59 |
|
kf |
and its confusing to have different versions of utf8-encoding for the same german letter |
07:59 |
|
kf |
mc, im glad if someone tries to understand and help, i think its difficult for me to explain in english |
08:00 |
|
mc |
kf: german ? |
08:00 |
|
kf |
yes im from germany |
08:00 |
|
Elwell |
paul_p: just over the swiss side of the Jura (work near geneva) |
08:00 |
|
kf |
Constance |
08:00 |
|
paul_p |
do you speak french ? (if yes, we have a french speaking channel, much more silent than this one though) |
08:01 |
|
mc |
Elwell, we're almost neighbor |
08:01 |
|
kf |
sorry, no :( |
08:01 |
|
Elwell |
paul_p: nope - still learning it |
08:01 |
|
paul_p |
kf: I asked Elwell ;-) (I already knew you're german. And i'm happy to see new europeans being interested in Koha!) |
08:02 |
|
mc |
my german isn't fluent enough for this kind of chat ... but i'm working on it! |
08:02 |
|
paul_p |
we have some (few) customers in Switzerland (in Lausanne) |
08:02 |
|
kf |
you can use me for training :) |
08:02 |
|
chris |
kf: maybe ask on the koha-translate list? |
08:02 |
|
paul_p |
(and hopefully another one in Neuchatel in the coming weeks...) |
08:02 |
|
Elwell |
I notice the WIPO use it too but looks like its supported by liblime |
08:02 |
|
chris |
someone might have run into something like this before |
08:02 |
|
mc |
kf thx for it: i'll try when i feel ready |
08:03 |
|
chris |
yep WIPO do .. and UNIDO in vienna |
08:03 |
|
kf |
mc: sure, im here |
08:03 |
|
paul_p |
Elwell: yep. WIPO sent me an RFP 3 years ago, but it was not unimarc, not french & I was too busy. So I suggested asking LL |
08:04 |
|
mc |
kf: so ... |
08:04 |
|
Elwell |
my plan is to learn it and do amlib.ch as a volunteer project |
08:04 |
|
kf |
its no problem for translation, because when using the keyboard you always get it right |
08:05 |
|
mc |
you want to translate from utf8 to which charset? |
08:05 |
|
mc |
i seen MARC8 |
08:05 |
|
mc |
but is it a charset ? |
08:05 |
|
mc |
(that's the point i missed, i think) |
08:06 |
|
kf |
i think i dont know how to explain it |
08:07 |
|
kf |
it seems there are two different ways to encode umlauts (as an example): you can use base character + diacritic, or you can simply use the character ü |
08:07 |
|
mc |
ok: where does the data come from? |
08:07 |
|
kf |
LOC is an example, they use diacritic version |
08:07 |
|
mc |
kf, sure! that's utf8 :) |
08:07 |
|
kf |
and its annoying, because its displayed wrong in firefox |
08:07 |
|
kf |
its not over the letter, but left of it |
08:08 |
|
mc |
outch |
08:08 |
|
kf |
cant show that to a library user |
08:08 |
|
paul_p |
kf: I confirm that ü can be represented differently in unicode (precomposed ü, and u + combining umlaut) |
08:08 |
|
mc |
kf, isn't it just a font problem in firefox? |
08:09 |
|
kf |
yes, but you cant force arial unicode ms i think, is it even available on every system? |
08:09 |
|
mc |
i fixed some weird displays just by switching to a better utf8 font |
08:09 |
|
kf |
then you can display it right |
08:10 |
|
kf |
but you will still have mixed unicode encoding in your database then |
08:10 |
|
mc |
kf, i use monospace under linux |
08:10 |
|
mc |
i don't know about mac |
08:10 |
|
mc |
(i have to test it) |
08:11 |
|
hdl_laptop |
Elwell: it is me |
08:11 |
|
mc |
kf, my two cents: you can't expect to avoid that if you don't check the data sent by the user on the fly |
08:12 |
|
kf |
and zebra has to be configured, to do an equivalence search for all those characters |
08:12 |
|
chris |
hdl_laptop: good morning, i pushed up the updated release notes |
08:12 |
|
kf |
its not a good solution i think |
08:12 |
|
mc |
kf, its the best one afaik :( |
08:13 |
|
kf |
we would not have this problem if i could use UNIMARC |
08:13 |
|
hdl_laptop |
chris: thanks |
08:13 |
|
kf |
or only french z39.50 with iso encoding |
08:14 |
|
kf |
i have to think about it, but i dont think its a good thing to use U+Umlaut, when its displayed wrong and cant be typed in by the user |
08:16 |
|
hdl_laptop |
kf: it seems that the international norm is tending to have combining diacritic signs and not one precomposed character for that. |
08:17 |
|
hdl_laptop |
This can explain the way that the MARC-8 to utf-8 conversion chose to deal with encoding. |
08:18 |
|
hdl_laptop |
We also have the problem here with some french diacritics. |
08:18 |
|
hdl_laptop |
Since é can be encoded two ways |
08:19 |
|
hdl_laptop |
The solution we are considering is using yaz-icu |
08:19 |
|
hdl_laptop |
And normalize accents in order not to have problems with diacritics indexing. |
08:23 |
|
hdl_laptop |
But display is then quite hard to fix. |
08:32 |
|
hdl_laptop |
kf: I can help you build up equivalences in zebra for those chars. |
08:35 |
|
kf |
thx hdl |
08:35 |
|
kf |
i think i know how to do that |
08:36 |
|
kf |
i asked you about that some time ago :) |
08:38 |
|
kf |
i saw only single character encoding in french catalogs - but thought it must be the same problem with é |
08:38 |
|
kf |
i have a little problem here atm, be back later |
09:43 |
|
kf |
hdl: normalize to e? |
09:56 |
|
hdl_laptop |
kf: we are facing same problem with é. |
09:57 |
|
hdl_laptop |
But if you are getting biblios from SUDOC with utf-8 encoding, you will see that é is combined diacritics and not \xe9 |
09:58 |
|
kf |
is it only é or all? â |
09:59 |
|
kf |
i just dont like the idea to have both versions in one database |
10:50 |
|
kf |
my colleague just sent me that link, seems NFC-normalization would be the right solution then: http://www.mail-archive.com/ko[…]org/msg00636.html |
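A minimal sketch of the approach that link describes, combined with the MARC::Charset module chris mentioned earlier: convert the MARC-8 field data to UTF-8, then normalize it to NFC so that "u + combining diaeresis" collapses into the single precomposed ü. The sample byte string and where this step would hook into the Z39.50 import are illustrative assumptions only.

    # Sketch only: MARC-8 -> UTF-8 -> NFC normalization.
    use strict;
    use warnings;
    use MARC::Charset qw( marc8_to_utf8 );
    use Unicode::Normalize qw( NFC );

    # In MARC-8 the combining diaeresis (0xE8) precedes its base character.
    my $marc8 = "M\xE8uller";          # illustrative MARC-8 bytes for "Müller"
    my $utf8  = marc8_to_utf8($marc8); # decomposed: "u" followed by U+0308
    my $nfc   = NFC($utf8);            # precomposed: U+00FC (ü)

    print "$nfc\n";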