IRC log for #koha, 2009-07-10

All times shown according to UTC.

Time S Nick Message
13:42 |Lupin| hello
13:45 ebegin hey |Lupin|
13:58 |Lupin| hey ebegin
14:16 Snow_Fox morning
14:16 ebegin are the accounttypes (Rep, L, ...) defined somewhere?
14:17 ebegin good morning Snow_Fox
14:17 paul_p ebegin: no, afaik, it's hardcoded (a koha 1.0 piece of code...)
14:17 ebegin grr... thanks paul_p
14:18 |Lupin| Is the ccode category predefined in Koha or is there something to do to activate it ?
14:18 paul_p ebegin: have you submitted a patch as of today? If not, then you may be the 96th or 97th committer!
14:20 ebegin no, i didn't :)
14:23 ebegin when was the first line of code written for Koha? 1999?
14:26 paul_p ebegin: yep, end of 1999. but this one may not be that old. (which script ?)
14:29 |Lupin| can anybody help with ccodes pls ?
14:29 |Lupin| went to administration -> authorized values and couldn't see anything
14:47 ebegin paul_p, you think you can give a try to fill some of those values? http://wiki.koha.org/doku.php?[…]opment:hard_coded
15:07 kf |Lupin|: it's the right place
15:08 kf |Lupin|: you can import some predefined CCODEs during installation
15:08 kf |Lupin|: when there is no CCODE you can add it
15:08 |Lupin| kf: I just had a look at the manual. It mentions the CCODE category, but it does not appear in the category drop-down I can see
15:09 |Lupin| after BSort2 I have CAND
15:09 |Lupin| and then COUNTRY
15:09 kf |Lupin|: try adding it with "new category"
15:11 |Lupin| kf: ok
15:16 kf |Lupin|: when adding a new category you will add the first value in the same step
15:19 |Lupin| kf: okay!
15:19 |Lupin| kf: just surprised that it is not there already
15:19 kf just use CCODE as category code
15:19 kf I don't know why, perhaps you chose not to install example values during installation, or it's a UNIMARC / French installation thing?
15:21 ebegin |Lupin|, IIRC, the CCODEs are part of the optional data that is added during the web setup.
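A minimal sketch of the kind of row the optional sample data adds for collection codes, assuming the stock authorised_values table; the code and description below are illustrative, not the actual sample set:

    INSERT INTO authorised_values (category, authorised_value, lib)
    VALUES ('CCODE', 'FIC', 'Fiction');  -- illustrative code/description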
15:22 Snow_Fox with the offline transaction tool it notes that the following items will not be processed or checked out, and I was wondering what it meant by "Item barcode fills a request"
15:22 |Lupin| kf & ebegin ok, thanks !
15:24 |Lupin| it may indeed very well be that I didn't install the example values...
15:58 ebegin |Lupin|, does lynx support JQuery?
16:00 |Lupin| ebegin: no, I don't think so
16:29 paul_p ebegin: lynx doesn't support JavaScript.
16:30 paul_p BUT the jQuery features have a fallback++ when jQuery isn't available
16:30 paul_p so it's not as nice & fancy as with jQuery "active", but it still works
16:32 ebegin paul_p, good.  Thanks for the info.  
17:03 |Lupin| bye #koha!
18:34 joetho what are these -null- things in circ reports??
18:35 joetho items with no collection code or something like that? Nope. I am perplexed and stymied.
19:10 atz joetho: right
19:10 atz NULL is the value for no value.
19:10 atz select itemnumber from items where ccode IS NULL;
19:11 atz or whatever your field of choice is
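A sketch of a report-style query that surfaces the -null- groups discussed above, assuming the stock items table (substitute whichever field the circ report groups on):

    SELECT COALESCE(ccode, '-null-') AS collection_code,
           COUNT(*)                  AS item_count
    FROM items
    GROUP BY ccode;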
19:17 mib_s1ex4e is this just for koha writers or for koha cataloging?  Need to know if I need to be in here or not..lol
19:18 ebegin mib_s1ex4e, this is for anybody who wants to talk about koha.  both librarians and coders
19:18 mib_s1ex4e cool, thankx
19:20 chris morning
19:21 ricardo "Morning" chris!  :)
19:21 ricardo (it's 20H21 here in Portugal  ;-)
19:22 ricardo Quick question: is there a way in Koha to delete all bibliographic records that belong to a specific "Item Type"?
19:22 chris 7.21am here, just feeding my son breakfast then i have to catch a bus to work :)
19:23 ricardo chris: I thought that Koha-related work made you a millionaire and that you didn't have to work anymore  ;-)
19:23 chris lol
19:23 ricardo Eheh
19:23 chris ricardo: i think you would have to do it in the db with some sql
19:24 Snow_Fox hey, on the offline circ program, when it says that it won't record a "fills a request"
19:24 Snow_Fox is that referring to an item hold?
19:24 chris my guess is yes, but kyle is who could answer definitively Snow_Fox
19:24 ricardo chris: That's what I'm afraid of  :-S  And afraid because, besides the biblio.* tables, I don't know if I have to delete other things because of the MARC representations
19:25 chris you want to get rid of biblio, biblioitems and items;
19:25 chris biblioitems contains the marc and marcxml blobs
19:25 chris the trick is
19:26 chris itemtype can be at itemlevel
19:26 Snow_Fox .seen kyle
19:26 Snow_Fox duh no x3
19:26 Snow_Fox heh
19:26 chris @seen kyle
19:26 munin chris: I have not seen kyle.
19:26 ricardo chris: Right. But, in my case, i have item types NOT set at "item level"
19:26 chris koha-devel mailing list is your best bet Snow_Fox
19:26 Snow_Fox roger that thanks chris
19:27 chris ricardo: cool, in that case it's easier; if it was at item level you would definitely have to script it
19:27 ricardo chris: I mean that I set the "item-level_itypes" System Preference to OFF
19:27 chris *nod*
19:27 ricardo chris: Yeah. Thank God for small favours, right?  ;-)
19:27 chris if it was at item level, you would have to edit the MARC (removing just the items of the certain type)
19:28 chris delete from biblio,biblioitems,items where biblioitems.biblionumber = biblio.biblionumber and items.biblioitemnumber = biblioitems.biblioitemnumber and biblioitems.itemtype='SOMETHING'
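The line above is shorthand; in MySQL a multi-table delete needs the DELETE t1, t2 FROM ... JOIN form, roughly as in the sketch below (the itemtype value is a placeholder, and as chris says just below, back up the database first):

    DELETE biblio, biblioitems, items
    FROM biblio
    JOIN biblioitems ON biblioitems.biblionumber = biblio.biblionumber
    LEFT JOIN items  ON items.biblioitemnumber = biblioitems.biblioitemnumber
    WHERE biblioitems.itemtype = 'SOMETHING';  -- placeholder item type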
19:29 ricardo chris: Right.  I must admit that I don't feel very comfortable with editing the tables directly... I wish there was some kind of API for this (or an option for bulkmarcimport). Oh well
19:29 ricardo chris: Thank you very much for the query.  :D
19:29 chris BACKUP THE DB FIRST :-)
19:29 ricardo chris: *nod*
19:29 chris then rebuild_zebra.plk
19:29 chris -k
19:29 ricardo Yeah... In my case "rebuild_nozebra"
19:29 chris righto
19:30 chris I'm fairly sure there is work being done on bulk edit/deletes of records
19:30 chris but I'm not sure where it's at
19:30 ricardo chris: OK. Thanks for that info, too
19:32 ricardo (I must say I'm not fond of data migrations, to say the least...)
19:32 jwagner PTFS is working on batch item edit for a client -- code is still in test, so I'm not sure when it will be available.
19:33 ricardo jwagner: Hi Jane! That's great! Thanks for the info  :)  And don't worry, I won't "nail" you to a deadline... But you said "tomorrow", right?  ;-)
19:34 ricardo (just kidding...)
19:36 jwagner Nope.  I didn't even say "yesterday" !
19:36 ricardo jwagner: Eheh
19:36 chris right off to ride my bus
19:36 jwagner Happy commuting, chris!
19:40 ricardo chris: Bye Chris... and thanks! :)
20:06 wizzyrea jwagner: have you looked at the NEKLS specs for bib maintenance?
20:06 wizzyrea (and I'm sure we'd be interested in seeing what you're doing for your client :P)
20:14 joetho we are also spec-ing out a development to deal with merging bibs together, that preserves all the item data.
20:14 joetho I think this will go a long way toward solving problems with batch deletions
20:14 jwagner wizzyrea, Sorry, I've been off on another system.  No, I haven't seen the NEKLS specs -- can you point me to them?
20:16 joetho not sure if they are vendor-specific-proprietary at this point. A very murky subject, that.
20:16 wizzyrea no, they are creative commons licensed
20:16 wizzyrea re: joetho
20:16 joetho yer sure.
20:16 wizzyrea yes
20:16 joetho yer totally sure.
20:16 joetho ha. a hesitation.
20:16 wizzyrea oh for pity's sake. Yes, look closely at what you have, it's printed at the bottom
20:17 pianohacker joetho: LibLime specs I've seen in the past were creative commons licensed
20:17 joetho uh oh dads here
20:17 ricardo joetho: LOL
20:18 gmcharlt wizzyrea: yes
20:18 joetho see I toldja
20:18 wizzyrea pbbbt
20:18 ricardo wizzyrea: LOL!
20:19 gmcharlt please send me at least 10% of any money wagered on that question ;)
20:19 ricardo gmcharlt: eheh
20:19 joetho how about ten percent of the money spent on it?
20:19 gmcharlt ooh, shiny!
20:20 ricardo Question: the easy way to remove duplicate bibliographic records in Koha (for a given item type) is...?
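One hedged way to start on that question: locate candidate duplicates by ISBN in biblioitems (an assumption that duplicates share an ISBN; actual merging or deletion still needs the bib-merge work discussed below):

    SELECT isbn, COUNT(*) AS copies
    FROM biblioitems
    WHERE isbn IS NOT NULL AND isbn <> ''
    GROUP BY isbn
    HAVING COUNT(*) > 1;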
20:20 wizzyrea ooh, that's the question of the day
20:21 wizzyrea gmcharlt: you crack me up. :)
20:21 pianohacker atz++
20:21 joetho thus our discussion of the specs for bib merging development.
20:21 pianohacker If you're not careful, you'll be assigned cleanup tasks for the rest of your career
20:21 wizzyrea wow, that was random lol (pianohacker)
20:22 pianohacker Hrmph. Mailing list
20:22 pianohacker random--
20:22 ricardo joetho: Really? That's great! That means I can stop working on this data migration and wait for you to finish the specs / development ;-)
20:23 jdavidb Yah, wizzyrea...if you kept up with the lists, whilst cutting up on here, you'd have known what pianohacker was talking about...c'mon...do try to keep up.  :P
20:23 ricardo @karma random
20:23 munin ricardo: Karma for "random" has been increased 0 times and decreased 1 time for a total karma of -1.
20:23 ricardo Poor random!  ;-)
20:23 wizzyrea *sigh*
20:23 joetho btw, atz's suggestion for my earlier "null" question didn't work.
20:23 pianohacker Not everyone is glued to their email inboxes (not entirely a bad thing...)
20:23 gmcharlt c'mon, wizzyrea - the whole point of social apps is that you're obligated to keep up with *everything*
20:24 joetho well..... bucks up.
20:24 jdavidb Awwwwww...we wuvs you, wizzyrea!
20:25 atz joetho: you didn't find NULL values in the records?
20:25 ricardo On a related note...
20:26 pianohacker joetho: Is it possible that some of your records have ccodes that are no longer defined in the authorised_values table ?
20:27 joetho hmmm.
20:27 ricardo Is it because development work is being done in this area that the only "Record matching rule" available in its dropdown list is "Do not look for matching records" (in the "Stage MARC Records for Import" screen)?
20:27 joetho I have been very careful about deleting authorized values.
20:27 pianohacker ricardo: You first have to define record matching rules
20:27 ricardo pianohacker: OK. Thanks for the info. Let me check that
20:28 joetho our development spec for this seems fairly extensive.
20:28 joetho it definitely includes expanded functionality for defining matching rules.
20:28 pianohacker joetho: SELECT * from items left join authorised_values on (ccode = authorised_value) where lib is null;
20:28 pianohacker Should pull up anything with a broken ccode
20:28 ricardo pianohacker: OK. I found the screen for adding a matching rule... but I'm beginning to think that I wish I *didn't*!  ;-)
20:29 pianohacker Heh
20:29 pianohacker I think you have gmcharlt to blame for that
20:29 joetho I only looked for missing ccodes- not dysfunctional orphans.
20:29 joetho I find those on match.com
20:29 pianohacker O_o There is no doubt a section for exactly that, this being the internet
20:30 atz ricardo: ask and ye shall receive.... eventually, ye shall stop asking.
20:31 ricardo atz: LOL! I think that would be a good "quip" for Koha's Bugzilla  ;-)
20:31 jdavidb Bugzilla?  Naah...front page of the Wiki.  That's priceless!
20:31 gmcharlt @quote add <atz> ricardo: ask and ye shall receive.... eventually, ye shall stop asking.
20:31 munin gmcharlt: The operation succeeded.  Quote #13 added.
20:31 Sharon joetho I have sql reports to find null itypes and ccodes and locations.
20:31 wizzyrea @quote add joetho: I only looked for missing ccodes- not dysfunctional orphans.
20:31 munin wizzyrea: The operation succeeded.  Quote #14 added.
20:32 ricardo jdavidb: eheh
20:32 pianohacker joetho, Sharon: The SQL I posted above should find those, ah, "dysfunctional orphans", rather than just null ccodes
20:32 ricardo @qote
20:32 munin ricardo: I suck
20:32 ricardo @quote
20:32 munin ricardo: I'll give you the answer as soon as RDA is ready
20:33 ricardo RDA?
20:33 Sharon cool beans
20:33 joetho I resolved all my actual "null" itypes etc, but not the ones that are THERE but not authorized.
20:33 pianohacker @quote random
20:33 munin pianohacker: Quote #5: "<jwagner> Why is it every Koha rock I turn over produces a zillion (metaphorical) ants, each with a new question????" (added by kf at 01:13 PM, June 12, 2009)
20:34 wizzyrea @quote random
20:34 munin wizzyrea: Quote #10: "< pianohacker> You helped start an open source project; clearly your sense of what to avoid to make your life easier has been impaired for a while :)" (added by chris at 07:59 PM, June 23, 2009)
20:34 ricardo pianohacker: Amusingly and sadly insightful!
20:35 richard hi
20:35 joetho pianoist: I haven't got it to work yet. Sqlirrelly something or other in there.
20:35 pianohacker Hrmm. Syntax error, or just don't pull nothin' up?
20:36 joetho syntax. I'm pecking at it.
20:36 ricardo Hi richard
20:36 jdavidb @quote random
20:36 munin jdavidb: Quote #9: "pianohacker ponders drumstick->ear as a method of food acquisition...We haven't gone to this good mexican restaurant in a while..." (added by wizzyrea at 08:23 PM, June 19, 2009)
20:36 pianohacker I'm starting to talk in soundbites, God help me
20:36 joetho but assuming I DO find some weird values in there- how did they get there?
20:37 pianohacker Accidental delete of ccodes by someone, odd imported data, who knows
20:37 wizzyrea I read drumstick and I thought "chicken drumstick?"
20:37 wizzyrea even though I know it's a drum stick
20:37 joetho An Un-named Employee left me a few code presents in the form of transposed ohs and seroes, but once you find them it's no big deal.
20:37 gmcharlt uh-0h
20:37 joetho zeroes*
20:39 ricardo gmcharlt: I think it's more like "Oh-0h"  (pun *fully* intended  ;-)
20:39 joetho but I don't see how that could happen with ccodes. One wrong character when you are importing huge batches should give thousands of errors, not 4 or 5 a week.
20:40 joetho In the interest of accuracy, I propose that I use only words and numbers that contain neither a zero nor "letter O".
20:42 atz DROP TABLE biblio;  -- done
20:43 jdavidb Awesome!  Thanks, atz!
20:43 jdavidb atz's clever patch will also eliminate your duplicate bib issues, see?
20:44 joetho is authorised spelled with a z or an s?
20:44 joetho s, right?
20:44 pianohacker s in the table, z in the interface
20:44 jwagner joetho, the answer is Yes
20:44 joetho Yez, I zee
20:45 ricardo jdavidb: LOL
20:45 pianohacker Learn Latin American Spanish, then you won't have to care about the difference
20:46 pianohacker joetho: What particular syntax error is it giving you? The fact that it worked at all might be a peculiarity of my setup
20:48 joetho Can't use an undefined value as an ARRAY reference at /home/sek/kohaclone/C4/Reports/Guided.pm line 412.
20:51 joetho SELECT  from items LEFT join authorised_values on (ccode = authorised_value) where lib IS null
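The pasted query has no select list after SELECT, which is the likely cause of the Guided.pm error above; a sketch with explicit columns (restricting the join to the CCODE category is an added assumption, so values from other categories don't match by accident):

    SELECT items.itemnumber, items.barcode, items.ccode
    FROM items
    LEFT JOIN authorised_values av
           ON av.category = 'CCODE'
          AND av.authorised_value = items.ccode
    WHERE items.ccode IS NOT NULL   -- skip the plain NULLs already resolved
      AND av.lib IS NULL;           -- ccodes with no matching authorised value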
20:54 chris back
20:54 ricardo Wb chris !  :)
20:54 pianohacker chris: hey
20:54 ricardo How do I export patron data in Koha 2.2.9 (so I can later import it in Koha 3.0.3)?
20:56 chris circulation data and accounts data too? or just borrowers?
20:56 chris the short answer is going to be, there is no tool in 2.2.9 to do it
20:57 ricardo chris: Ideally, those too. But for now, just borrowers (because biblionumbers will change and circulation historical data will probably get "confused" by that)
20:57 chris itemnumbers is what it cares about
20:57 ricardo chris: Oh, you're right. But those will also change :(
20:58 chris they dont have to
20:58 ricardo chris: Meaning...?
20:58 chris the best way to do it, is to take a copy of your 2.2.9 database
20:58 chris and put it somewhere then follow the upgrade to 3.0.0 rules
20:59 chris then you end up with a db that will work with 3.0.0, then you can upgrade it to 3.0.3 etc
20:59 chris and then you have tables that you can then mysqldump and load straight into your koha 3
20:59 chris and not lose any historical stuff
21:00 chris the next best way, is to do select * from borrowers into outfile '/tmp/borrowersdata';
21:00 ricardo chris: The upgrade doesn't seem possible, I'm afraid (character set problems, and fields that were wrongly filled in our Koha 2.2.9 - like the "sensitive" 100a field in UNIMARC - and that are now (correctly) validated in Koha 3.0.x)
21:01 chris load data infile '/tmp/borrowers' into borrowers (column_name,column_name.....)
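A sketch of that export/import pair in full MySQL syntax; the column names listed are illustrative, and the real list must name every column in the 2.2.9 dump, in dump order:

    -- on the 2.2.9 database
    SELECT * FROM borrowers INTO OUTFILE '/tmp/borrowersdata';

    -- on the 3.x database
    LOAD DATA INFILE '/tmp/borrowersdata'
    INTO TABLE borrowers
    (borrowernumber, cardnumber, surname, firstname);  -- illustrative subset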
21:01 ricardo chris: OK. Thank you very much for the tip  :)
21:02 chris or you can do the upgrade, simply ignoring the bibliographic stuff
21:03 chris so that your borrowers table at least gets upgraded
21:03 chris then you can mysqldump and load it in
21:03 chris OR
21:03 chris you can do the select into outfile
21:03 ricardo chris: Yeah... That's the other option. I don't know what would happen to the biblio stuff then, though
21:03 chris well, you'd just ignore that (ie you wouldn't use the upgraded one)
21:04 ricardo chris: And I would then delete the records and do "bulkmarcimport"s afterwards?
21:04 ricardo (records -> *biblio* records, I mean)
21:06 ricardo chris: I'll think about it. Thanks!  :)
21:07 ricardo Well, it's past 10 PM here, and I'm still at work and haven't had dinner. Going home now. Take care everyone! And thank you Chris!  :)
21:07 chris you could do the select into outfile and then munge the file
21:07 chris and use the upload borrowers tool in 3
21:07 chris good night :)
21:08 chris (sorry got distracted by my boss)
21:08 ricardo chris: LOL! No problem  :)
21:08 ricardo Thanks again
21:08 ricardo Bye (out!)
21:08 chris gmcharlt: nice hote?
21:08 chris hotel too
21:09 gmcharlt chris: nice enough; but given that it's a major metro hotel, gouging me for internet access
21:09 gmcharlt annoying, that
21:10 chris ahh i hate when they do that
21:12 chris what's on the cards for tomorrow?
21:13 gmcharlt setting up booth and preparing to wear down our feet
21:13 chris ahh :-) hope you brought a few pairs of shoes
21:22 joetho I will peck at this sql crap later.
21:22 joetho thanks for the tips,
21:22 joetho GOODBYE CRUEL WORLD
21:22 pianohacker Bye
21:52 chris http://www.linux.com/news/ente[…]anagement-softwar
21:59 rhcl I see DSpace got a mention.
21:59 chris yep, dspace == fedora now
22:00 chris well duraspace :)
22:00 chris its a nice balanced write up i thought
22:01 rhcl Yea, pretty good. I always forward articles like that to my director.
22:03 Sharon I posted that on our Tech blog
22:04 chris my director just twittered it
22:04 Sharon we're doing an Open Source tech day workshop in August, so that's timely
22:04 chris course my director is also the president of the NZOSS
22:04 chris :)
22:05 Sharon I'm supposed to be putting together a digitization pilot project, so I'm glad they linked to Greenstone and others.  I need some educatin'
22:05 rhcl Look at DSpace.
22:05 chris and check out kete too
22:05 rhcl IMHO, it's worlds better than Greenstone.
22:05 chris depending on what sort of project you are working on
22:06 rhcl Sharon: where are you located?
22:06 Sharon local history stuff - pictures, family histories, etc.
22:06 chris http://horowhenua.kete.net.nz/[…]false#comment-128
22:06 Sharon rhcl Kansas - part of the NExpress folks
22:06 rhcl I thought so.
22:06 chris http://horowhenua.kete.net.nz/about
22:07 Sharon awesome
22:07 rhcl We are about halfway+ through a digitization project- Project Bloodroot
22:08 Sharon big? small? replicable?
22:10 rhcl We have about 25 books with 350-400 pages each, plus some odds and ends like newspaper articles.
22:11 rhcl http://pastebin.com/d25764106
22:12 Sharon cool!  I'll take a gander
22:14 Snow_Fox I'm gonna assume that the offline circ system is not capable of having multiple stations write to the same file on the network at the same time, correct?
22:15 chris pass
22:15 chris never used it
22:16 Sharon there are at least 2 Kansas uses of Dspace
22:17 rhcl Yea, a lot of major universities use it. It's pretty polished and professionally well-done.
22:17 Sharon one is the State gov't.  funny
22:17 Snow_Fox hrm
22:17 Snow_Fox the way i figure on setup
22:17 Snow_Fox is if the system goes down
22:17 Snow_Fox have the users create a file on a mapped drive on one of our servers
22:18 Snow_Fox so that we can go back and update all at once more or less
22:18 rhcl We are actually using the jumpbox version in a VM.  http://www.jumpbox.com/
22:18 Snow_Fox if we could use one file
22:18 Snow_Fox though
22:18 Snow_Fox it would simplify everything
22:19 gmcharlt Snow_Fox: I wouldn't count on that working - although since it uses sqlite for the borrowers database copy, it presumably wouldn't be too far from it using sqlite for the transactions record
22:19 Snow_Fox ya i know what you mean, only way to see would be to test it
22:20 Snow_Fox and see if it works
22:20 gmcharlt which would probably give it enough concurrency for what you want
22:20 Snow_Fox true
22:30 rhcl chris: still on? I have been looking at Kete over the past several weeks. From the examples on the site I see it would be ideal for individuals to contribute to some collective goal.
22:30 chris thats what its for
22:31 chris individuals/organisations
22:31 chris its a tool for building a community and a repository
22:31 rhcl I can't find the link now, but I remember reading a complete page of the memories/writings of an early settler of some part of NZ--way back when. I think it was a transcribed diary.
22:32 rhcl Very interesting.
22:33 rhcl When we finish our actual project, I'm thinking of setting up a 'kete' and trying to get the still living staff of the cancer hospital add their thoughts and memories to it.
22:34 rhcl 'to add'
22:34 chris sounds like a great idea
22:36 rhcl another great idea is dinner, and I'm ready! Later...
22:36 chris :)
02:01 pianohacker 'night
03:36 Amit hi chris, brendan, Jo
03:36 Amit good morning #koha
04:05 Jo moring Amit
04:06 brendan heya Amit
04:06 brendan hi Jo
04:13 Amit hi indradg
05:41 kf good morning #koha
11:24 |Lupin| hello !
11:26 Amit hi Lupin
11:29 |Lupin| hi Amit
11:30 |Lupin| Could someone with a UNIMARC Koha please tell me to which MARC field the ccode is linked?
11:30 |Lupin| I think it's 995.8, but I'd appreciate a confirmation.
11:34 |Lupin| hmm, not sure about 995.8 actually, because this subfield is not listed in the Koha to MARC section as a possible target for CCODE
11:36 kf hi Lupin
11:36 |Lupin| hi kf !
11:43 |Lupin| nicomo: here ?
11:43 nicomo yes
11:44 |Lupin| nicomo: do you have access to a UNIMARC-set-up Koha?
11:44 nicomo yes, just reading your question above
11:45 |Lupin| nicomo: ok
11:45 nicomo you have to pick a subfield: there's none prescribed
11:45 nicomo then head to zebra's record.abs and tell it where you put ccode
11:45 |Lupin| nicomo: we don't use zebra here since our collection is small
11:46 |Lupin| nicomo: but when you install Koha and ask for sample data to be installed, it doesn't link ccode to any MARC field either?
11:46 nicomo not that I know of in unimarc, no
11:47 nicomo you basically select the subfield you want to link ccodes to, then tell the index (either Zebra or NoZebra) where it's at
11:48 |Lupin| nicomo: any recommendation regarding the subfield?
11:48 |Lupin| nicomo: and, how do you tell the nozebra index where it is ?
11:49 nicomo not really, except perhaps avoiding any subfield already used in the 995 recommendation
11:49 nicomo 995$Z might be ok
11:50 nicomo nozebra indexes are defined in the NoZebraIndexes syspref
11:51 nicomo but again : i never used noZebra, so you might want to check with someone else on this
11:51 |Lupin| nicomo: still one can select only those fields that are listed in the drop-down, as far as I can see
11:52 nicomo which dropdown?
11:52 nicomo my install shows a textarea for this syspref
11:52 |Lupin| in the Koha to MARC links
11:52 |Lupin| I chose items
11:53 |Lupin| then ccode
11:53 nicomo yes
11:53 |Lupin| and then I have several drop downs
11:53 |Lupin| one for each marc block
11:53 |Lupin| and in the 9xx one
11:53 nicomo yes
11:53 |Lupin| there is no possibility to select 995Z
11:53 nicomo you have to create it in the marc framework first
11:54 |Lupin| aaaaaaah
11:54 nicomo then it'll appear in koha2marc mappings
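A sketch for verifying which MARC subfield ends up mapped to items.ccode once the framework subfield exists and the Koha-to-MARC mapping is set, assuming the stock marc_subfield_structure table:

    SELECT frameworkcode, tagfield, tagsubfield
    FROM marc_subfield_structure
    WHERE kohafield = 'items.ccode';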
11:54 |Lupin| nicomo: I understand
11:54 |Lupin| nicomo: thanks
11:54 nicomo you're welcome
11:55 nicomo again, I haven't tested this with noZebra
11:55 nicomo but I'd be interested in the result though
11:55 nicomo keep me posted on this, will you?
11:55 |Lupin| nicomo: sure
11:55 |Lupin| nicomo: it's just that I'm not sure I have understood what exactly interests you
11:56 |Lupin| nicomo: you want to know how well noZebra will be able to search through ccodes?
11:56 nicomo exactly
11:56 |Lupin| nicomo: I don't know whether I'll be able to determine this
11:57 nicomo Well : if you do, ping me, if not, don't worry :-)
11:57 |Lupin| nicomo: but at least if things do not work as expected once ccodes have been set up, I'll know that one possible cause of the problem is the nozebra indexing...
11:57 nicomo indeed
11:57 |Lupin| nicomo: the idea is that we will use ccodes to keep track of different file formats
11:58 |Lupin| nicomo: so one ccode per file format
