Time Nick Message
03:52 dcook FYI for any Ubuntu users: https://github.com/perl5-dbi/DBD-mysql/issues/354
04:39 tcohen[m] Awesome
04:39 tcohen[m] Ha ha
04:54 dcook Ikr
04:54 dcook Like I needed this today haha
06:02 reiveune hello
06:02 oleonard o/
06:07 * dcook waves to folks
06:07 magnuse hiya dcook
06:10 magnuse kia ora khall
06:10 dcook btw I'm planning to try downgrading that mysql client to see if that fixes the issue
06:11 magnuse i have been working a bit on the protected patrons thingy. hope to have something to test this week
06:38 oleonard__ https://snipboard.io/CQ1P5F.jpg
06:38 davidnind nice!
06:39 dcook oleonard__ Could throw in Lebanon as well there
06:39 dcook We've got one library there using English/Arabic/French
06:39 dcook And the struggle is real with indexing...
06:39 oleonard__ Yeah I don't know how the speaker collected the numbers
06:42 ashimema interesting
07:02 tuxayo UAE counted with only 11 libraries? The speaker would be pleased to know that there are like 1100 more 🤯
07:02 tuxayo https://hea.koha-community.org/
07:02 tuxayo https://hea.koha-community.org/libraries-by-country
07:03 tuxayo Here we can see that it seems they have Koha in many (all?) of their schools, from kindergarten to secondary
07:07 tuxayo That's super cool.
07:07 tuxayo Maybe the speaker or someone else from the Middle East at the KohaCon could have leads to help reach their ministry of education's library department? Having contributions from the UAE would be very nice.
07:20 tuxayo Ah, I knew I missed something silly a few weeks ago.
07:20 tuxayo « paulderscheid : Just thought that tuxayo might get enraged that we recommend Microsoft stuff :D»
07:20 tuxayo lol
07:23 tuxayo Thanks for thinking about mentioning VSCodium/Code though 👍️
07:33 tuxayo And for the contribution to the wiki.
07:34 dolf Hi Kohaphiles, I recently did an update from 21.05.00-1 to 23.05.02-1 (we were two years behind!) and I got an error during the database upgrades, but the `apt upgrade` command still exited with code 0. Full output at https://pastebin.com/04B2Dhdp . Is there a way to fix this, or should I roll back my VM and try something else?
07:35 dolf The error is: ERROR - {UNKNOWN}: DBI Exception: DBD::mysql::db do failed: Row size too large. The maximum row size for the used table type, not counting BLOBs, is 8126. This includes storage overhead, check the manual. You have to change some columns to TEXT
07:35 dolf or BLOBs at /usr/share/koha/lib/C4/Installer.pm line 741
07:49 tuxayo dolf: what is your database software?
07:50 dolf MariaDB 10.3.39 (running on the same VM)
07:51 tuxayo "I got an error during the database upgrades"
07:51 tuxayo "but the `apt upgrade` command still exited with code 0"
07:51 tuxayo I'm not sure the upgrade script in the deb package is taking into account the res
07:51 tuxayo *result of the DB upgrade. Maybe yes, and then it's a bug.
07:51 tuxayo "MariaDB 10.3.39"
07:51 tuxayo ok, nothing weird here
07:54 tuxayo That's the version in Debian 10, and our CI tests it and it's indeed using MariaDB 10.3.39
07:55 dolf Thanks for your time so far! (I'm thinking hard this side as well...)
07:56 tuxayo dolf: you definitely should roll back your VM, or at least keep the backup somewhere safe
07:57 tuxayo During the time you examine your DB
07:57 davidnind Bug 28267 has some of the errors mentioned, such as "Row size too large", but it is a bit too complicated for me to make any sense of....
07:57 huginn Bug https://bugs.koha-community.org/bugzilla3/show_bug.cgi?id=28267 critical, P1 - high, ---, jonathan.druart+koha, Pushed to stable, Older databases fail to upgrade due to having a row format other than "DYNAMIC"
07:59 tuxayo davidnind: good find!
07:59 tuxayo dolf: the ticket mentions this: https://wiki.koha-community.org/wiki/Database_row_format
08:01 tuxayo So I guess you could check the row format you have
08:02 tuxayo And if it's wrong, roll back the VM, fix the row format and re-run the upgrade.
08:05 tuxayo Does anyone know why it's this upgrade that got a problem about row size? https://git.koha-community.org/Koha-community/Koha/src/commit/a6ef152db9e71805efe8e7dd6b2f8549da0f65b4/installer/data/mysql/db_revs/210600041.pl
08:05 tuxayo It doesn't seem related
08:06 tuxayo And it's not only the package upgrade script that doesn't seem to handle the error. It's updatedatabase.pl itself, right? Maybe it also exposes a bug there.
08:07 tuxayo (when looking at the output https://pastebin.com/04B2Dhdp)
08:18 dolf Thanks for the advice!! Do you need any more info from my broken VM before I roll back?
08:27 davidnind Joubu++ for presentation
08:30 dolf The row format seems to be "compact" rather than "dynamic" :) I'll try to change it!
08:40 dolf `show table status where row_format <> "dynamic";` gives me 202 rows. Would it be safe to blindly run ALTER TABLE on all of them to convert them to dynamic, or should I focus on specific tables?
08:44 dolf I can generate the ALTER TABLE query strings using: SELECT CONCAT('ALTER TABLE `', table_schema, '`.`', table_name, '` ROW_FORMAT=DYNAMIC;') FROM information_schema.tables WHERE table_schema = 'koha_rsc' and row_format <> 'Dynamic';
08:50 tuxayo > The row format seems to be "compact" rather than "dynamic" :) I'll try to change it!
08:50 tuxayo dolf ah great!
08:52 tuxayo > Would it be safe to blindly run ALTER TABLE on all of them to convert them to dynamic, or should I focus on specific tables?
08:52 tuxayo likely: https://bugs.koha-community.org/bugzilla3/show_bug.cgi?id=28267#c20
08:53 huginn Bug 28267: critical, P1 - high, ---, jonathan.druart+koha, Pushed to stable, Older databases fail to upgrade due to having a row format other than "DYNAMIC"
08:53 dolf I'm testing that now.
08:56 tuxayo If you are very very careful you could diff a dbdump before and after that.
08:56 tuxayo If you are not, just rely on the update script working and Koha basic usage working afterwards.
08:58 dolf I'm not that careful :P (I also have nightly mysqldumps, so if all else fails I'll just re-install and restore, or even build a new VM)
09:14 dolf Would this https://pastebin.com/5ujqW3H0 be worthy of adding to https://wiki.koha-community.org/wiki/Database_row_format ?
09:23 davidnind dolf: definitely worth adding! (assuming it worked for you)
09:24 davidnind If you don't have an account on the Wiki, I can add it.
09:24 dolf I requested a wiki account, but still waiting for the e-mail
09:25 davidnind dolf: it's still pending, but I can approve it anyway! (I find most often that the request confirmation email goes into people's SPAM folders...
09:25 dolf Just waiting for my VM provider to perform the rollback, then I'll do the final test to verify if it works.
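For anyone hitting the same upgrade error, here is a minimal sketch of the check-and-fix dolf describes above. It assumes the Koha database is called koha_rsc as in the log (substitute your own name) and that the mysql client can authenticate, for example as root over the unix socket on the database host; take a dump first, as tuxayo suggests.

```
# Dump first, then generate and apply an ALTER TABLE statement for every
# table whose row format is not DYNAMIC. Database name is an assumption.
mysqldump koha_rsc > koha_rsc-before-rowformat.sql

mysql --batch --skip-column-names -e "
  SELECT CONCAT('ALTER TABLE \`', table_schema, '\`.\`', table_name, '\` ROW_FORMAT=DYNAMIC;')
  FROM information_schema.tables
  WHERE table_schema = 'koha_rsc' AND row_format <> 'Dynamic';
" > fix_row_format.sql

mysql koha_rsc < fix_row_format.sql   # then re-run the Koha database upgrade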
09:27 dolf Thanks :)
09:27 dolf Nothing in spam yet (unless it was blocked on the outgoing server or on Google's servers, in which case there is a big problem with the mail delivery)
09:34 davidnind Let me know if you don't receive anything relatively soon - there may be something wrong with the email (it's not something I look after, so will have to raise a bug!)
09:57 eythian @seen drojf
09:57 huginn eythian: drojf was last seen in #koha 4 years, 37 weeks, 3 days, 14 hours, 43 minutes, and 0 seconds ago: <drojf> -s
09:59 ashimema no-one has heard from him in years eythian
10:04 eythian I know, I was looking up how many years it was, for a conversation :)
10:04 dolf davidnind Nothing yet. Did I make a typo? rudolfbyker@gmail.com
10:07 ashimema I see
10:08 davidnind dolf: that's what was entered, so looks like something is up with the email 😦 ... Your account is activated - you may be able to do a forgotten password reset, but this may not work either
10:13 davidnind @later tell thd Could you look at the Wiki setup to check whether emails are being generated? New user dolf didn't receive confirmation request or notification when approved (has checked SPAM folders)
10:13 huginn davidnind: The operation succeeded.
10:14 davidnind @later tell tcohen Could you look at the Wiki setup to check whether emails are being generated? New user dolf didn't receive confirmation request or notification when approved (has checked SPAM folders)
10:14 huginn davidnind: The operation succeeded.
10:22 dolf davidnind: The reset password e-mail also did not reach me
10:35 davidnind dolf: thanks for letting me know - KohaCon23 is on at the moment, so those that can look at it are likely to be busy...
10:38 jac Hello
10:40 Guest9049 We would like to know more about the Koha Library Software, any sales or contact in Hong Kong that we could reach?
10:53 Guest9049 OK, I'll try to contact them for more details. Thank you!
11:27 dolf I'm trying to upgrade Debian 10 to Debian 11 on my Koha VM (Koha is already at the latest version), but after changing the release in /etc/apt/sources.list from buster to bullseye and running `apt update` and `apt upgrade`, rabbitmq-server fails to start. Is rabbitmq-server used by Koha, or can I simply remove it and move along?
11:42 davidnind dolf: rabbitmq-server is used by Koha - it now runs some of the queued tasks. But I don't know how it works - there have been messages on the mailing list about this in the past, so maybe searching the archive will help...
11:43 tcohen[m] we need a reliable SMTP server for community services
11:46 dolf I just did `systemctl stop rabbitmq-server && systemctl disable rabbitmq-server` and re-ran `apt upgrade` and it worked. Then `apt full-upgrade`. This automatically re-enabled rabbitmq-server, and it started without issues this time.
11:47 dolf tcohen: Google Workspace for Non-Profits works very well for us. Is Koha Community a non-profit?
12:48 paulderscheid[m] @tuxayo concerning the dev workflow w/ ktd it is possible to get this working w/ the oss solutions but it's much more complicated and we have yet to document it so it's easy to understand for new devs
12:48 huginn paulderscheid[m]: I'll give you the answer as soon as RDA is ready
12:48 wahanui i already had it that way, huginn.
12:49 paulderscheid[m] we also need to mention devcontainers.json
15:11 reiveune bye
19:23 Mimir909 woah that's crazy
19:23 Mimir909 seeing someone named huginn is a coincidence for me. haha
19:52 davidnind Mimir909: higinn is one of the "helpful" bot accounts we have on the channel...
19:52 Mimir909 oh cool
19:52 davidnind Mimir909: That should be huginn!
19:52 Mimir909 I really need help understanding the basics of Koha for work. Hahaha
19:53 Mimir909 I've subscribed to the mailing list, pounded my head against the documentation, and even reached out to an active user listed on the site.
19:53 davidnind It's probably not that helpful 8-)
19:53 jalway Mimir909, What would you call your level of technical expertise?
19:55 Mimir909 I understand how programming works but have never really coded anything of major complexity. I can write HTML partially from memory or understand its working parts to use tools to put it together.
19:55 Mimir909 I consider myself 'moderate' where 'expert' is so many leagues beyond it, but I can't say I'm a total beginner either, just not super in-depth, haha
19:55 jalway Mimir909, Have you used Linux?
19:56 Mimir909 Until this, no - But I've gotten through Koha setup to the point where I was able to perform web installation and see the configuration options
19:56 jalway Mimir909, Hmm..., that's a nice starting point!
19:57 jalway Mimir909, Have you been able to login?
19:57 Mimir909 Yep, logged in and looking at the admin panel.
19:57 jalway Mimir909, Okay, so the hard part is doing all of the fiddly bits. xD
19:57 Mimir909 I can see settings are a series of named variables which can be modified through a menu, which is cool. The main thing I am trying to do right now is I have a big ol' list of ISBNs and I want to add them and get the zebra search to do its thing to them
19:58 jalway Mimir909, There is a boat load of settings, etc. that you need to configure to get a "functional" library setup in Koha.
19:59 Mimir909 If I can just get our base inventory up and running, the defaults should help quite a bit
19:59 jalway Mimir909, Probably the best way to start is to create a user, a book, and check-out the book to that user.
19:59 jalway Mimir909, Uh, you already imported everything?
19:59 Mimir909 The import is what I'm trying to figure out
19:59 Mimir909 I have MARC and CSV and it doesn't seem to take either and won't tell me why
20:00 jalway Mimir909, You are doing this without a support company like ByWater Solutions, yes?
20:02 Mimir909 We absolutely cannot afford the
20:02 Mimir909 I am the most technically inclined guy in the school I work for and just need to figure this out enough to make use of the software for tracking purposes for now, the actually functional library can come later
20:03 jalway Mimir909, Have you defined your collections?
20:03 Mimir909 Well, currently it looks like my setup started with the defaults - Fiction, Non-Fiction, and Reference
20:06 Mimir909 Which works for me for now
20:09 jalway So, what you want to do is use the Cataloging interface to stage records for import.
20:10 jalway More->Cataloging->Stage records for import
20:11 Mimir909 Cool, I've looked at that, and this is where I get a roadblock - When I upload my file (I have it in both MARC and CSV format, as mentioned) it says it starts the job but doesn't actually seem to do anything
20:11 Mimir909 And my best guess there is that I just don't have the file configured the way it wants
20:11 jalway Have you clicked on the Manage staged records button?
20:11 Mimir909 Hmmm, let me look (:
20:12 Mimir909 It says 'No records have been staged'
20:12 Mimir909 I've tried to upload multiple times and it seems to go nowhere, so far
20:13 jalway Hmm...
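One possibility not raised in the log: in recent Koha versions (including 23.05) staging MARC records runs through the background job queue, so if the job worker or RabbitMQ isn't running, the upload can look like it starts a job and then nothing ever appears under staged records. A hedged sketch for a Debian package install using the koha-common helper scripts; the instance name "library" is a placeholder.

```
sudo koha-worker --restart library        # restart the background job worker for the instance
sudo systemctl status rabbitmq-server     # the message broker the worker listens to
ps aux | grep background_jobs_worker      # confirm the worker process is actually running
```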
20:13 jalway Mimir909, I would suggest using this tool to verify that your records are valid. https://marcedit.reeset.net/
20:13 Mimir909 So far what I've theorized is just that my format is missing something so it's not taking it properly
20:13 Mimir909 Ooooh that sounds fun lemme try
20:14 Mimir909 Ah, okay, this is the MarcEdit software - I downloaded that and tried using it to get my initial MARC file
20:14 Mimir909 I may also not be doing that part correctly.
20:15 jalway Mimir909, Can you use marcbreaker to break the marc file into a human readable format? .mrk vs .mrc?
20:16 jalway In the event that it's throwing errors, then you have bad data.
20:16 jalway Mimir909, Where did you get your MARC file?
20:19 jalway khall_, You are the Kyle Hall, yes? I found a typo in the Patron Importer plugin.
20:19 Mimir999 Hello, jumped onto the Windows PC where my MarcEdit lives
20:19 Mimir999 I got the MARC file by taking a spreadsheet of our catalog and trying to convert it using MarcEdit
20:20 jalway Ah, nice, so no room for error, then. xD
20:21 jalway What system are you pulling your catalog data from?
20:21 jalway There's no way to get an actual marc file exported?
20:22 Mimir999 Well, the catalog is hand-assembled. I went through every item in the building and noted its ISBN in the spreadsheet
20:22 * paxed boggles
20:22 jalway Mimir999, I think my brain started bleeding there.
20:23 jalway Mimir999, What datapoints do you have?
20:23 Mimir999 That means we're on the right track
20:23 Mimir999 In the base catalog, just ISBNs and some questionable titles.
20:24 Mimir999 For context, they were using an utterly busted library software that is online only but stopped being supported by developers, so several basic functions broke down and no one knows when but everything kept going
20:24 jalway Mimir999, Is that what the MARC records consist of?
20:24 Mimir999 The MARC file will be whatever MarcEdit spat out when I attempted to convert it, assuming I even did it correctly
20:25 jalway Mimir999, MARC is not a magic ball. Did you only give it a questionable title and an ISBN?
20:25 jalway Mimir999, Or do you have complete records?
20:27 Mimir999 The complete records exist in the form of a wildly inaccurate catalog
20:28 Mimir999 From the old software I can pull something more in-depth, but it also will list 100 copies of something we only had 40 of, and has tons of other data errors.
20:28 Mimir999 The catalog I made is just titles and ISBNs
20:29 jalway Mimir999, Please see here: https://www.loc.gov/marc/bibliographic/examples.html
20:29 jalway Mimir999, Those are examples of full MARC standard records.
20:30 jalway Mimir999, How many records are we talking about? 1k, 2k, 30k?
20:30 Mimir999 Around 3000
20:30 Mimir999 A piddly library of textbooks for students of an independent magnet school, but one which I am the only person able to really do anything with
20:31 jalway Mimir999, The easiest solution may be to manually pull records for every book using Z39.50 from within Koha.
20:31 Mimir999 Right, that's what I'm hoping to do
20:32 Mimir999 But the actual doing it is not totally understood either
20:32 jalway Mimir999, Click Z39.50 in the Administration module.
20:32 Mimir999 I talked to Paul over at the Naval Archives in Ontario who is a Koha 3 user and he suggested coding a script that manually catalogs stuff 1 by 1.
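For reference, a record in the human-readable .mrk form jalway mentions (MarcEdit's MarcBreaker output) looks roughly like the minimal, made-up example below: just a leader, an ISBN in field 020 and a title in field 245. Real records carry many more fields, and the values here are invented for illustration.

```
=LDR  00000nam a22000000a 4500
=020  \\$a9780306406157
=245  10$aAn example title :$bwith a subtitle /$cExample Author.
```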
20:32 Mimir999 Checking there now (:
20:32 jalway More->Administration->Z39.50/SRU servers administration
20:33 jalway Mimir999, You'd possibly spend more time coding a script to do it imperfectly than you would just searching for all the records manually.
20:34 jalway Mimir999, What libraries do you have listed in the Z39.50/SRU servers?
20:34 Mimir909 Library of Congress (there are 4 of them) and a library in France
20:34 Mimir909 Checking with Paul, I determined I had correct settings for connection under these, as they seemed to be configured by default
20:36 jalway Here's a couple more that will be helpful:
20:36 jalway LINK - CALIFORNIA,csul.iii.com:210,INNOPAC,No,0,USMARC,utf8,0,Bibliographic
20:37 jalway OHIOLINK,olc1.ohiolink.edu:210,INNOPAC,No,0,USMARC,utf8,0,Bibliographic
20:38 jalway SAN FRANCISCO PUBLIC LIBRARY,sflib1.sfpl.org:210,innopac,No,0,USMARC,MARC-8,0,Bibliographic
20:38 Mimir909 What do 'No' and '0' refer to, before the 'USMARC' setting?
20:39 jalway That's just preselected and rank, stuff that you can decide on yourself.
20:40 jalway The keys are the name of where you're getting your record, the host:port, database name, syntax, and encoding.
20:41 jalway Err..., also, whether you're trying to pull bibliographic/authorities records.
20:41 jalway We just use Library of Congress for our Authorities.
20:42 jalway All the rest of the Z39.50 servers that we use are for bibliographic records.
20:43 jalway Mimir909, What I highly suggest is that you just manually catalog everything. It will take a significant amount of time.
20:44 jalway More->Cataloging->New from Z39.50/SRU
20:44 jalway Once you have all of those Z39.50 servers set up.
20:44 jalway Then, just do a search via ISBN
20:45 jalway Import the record that looks good and click save.
20:45 jalway Repeat, 3000 times.
20:45 jalway I highly recommend you add a column to your QuestionableTitle,ISBN spreadsheet. Add, ImportedToKoha.
20:46 jalway Then, you will have 3k known good records.
20:46 Mimir909 Hmmmmmm, cool
20:47 Mimir909 How should ISBN be formatted to get matches?
20:47 jalway You can fiddle with the items after you get all of the records in the system. As the items are how you check-out a book associated with that record.
20:47 jalway 10-digit or 13-digit. Koha doesn't like the dashes.
20:47 Mimir909 Also, are those Z servers or SRU servers?
20:48 jalway Mimir909, The ones I listed are Z39.50 servers.
20:49 Mimir909 Cool. I may have made a typo in one because it's getting timeouts, so I'll double check that
20:49 jalway Which one is timing out?
20:50 Mimir909 California
20:51 Mimir909 Also, I have a match! That's cool. Though, it didn't match the first one I tried which is perplexing.
20:51 jalway Hmm.., yeah, they're all working on my config, so not sure what I can tell you about that. xD
20:51 jalway Mimir909, Textbooks are stupid like that. I hate cataloging textbooks.
20:52 Mimir909 Yeeeaaaaah. Hahaha
20:52 Mimir909 Hey, I notice there's an item type called 'Continuing Resources' - What is this intended for?
20:53 jalway Continuing Resources == Serials == Periodicals == Magazines, etc.
20:53 jalway Theoretically, it could be a more book-like thing that is also a serial, but doesn't look like a magazine.
20:54 Mimir909 Interesting.
20:54 jalway LINK-California is a consortium of libraries, the OHIOLINK catalog is also a consortium of libraries, and the SF Public is just a great place to get records for DVDs, etc.
20:55 jalway SF Public just has some nice records, too.
20:55 jalway Quality-wise.
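Since Koha's Z39.50 search wants bare 10- or 13-digit ISBNs, a rough sketch for cleaning up the hand-made spreadsheet before searching. The file name and column number are invented; adjust them to your own CSV export.

```
# Extract the ISBN column from a CSV export, strip dashes and spaces,
# and keep only values that look like an ISBN-10 or ISBN-13.
cut -d',' -f2 catalogue.csv | tr -d '- ' | grep -E '^([0-9]{9}[0-9Xx]|[0-9]{13})$' > isbns.txt
```

The resulting isbns.txt is also the kind of plain list of ISBNs that the MarcEdit batch search discussed later in the log can consume.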
20:58 jalway I got started cataloging by cataloging serials, because we were paying something like $3k a year for a dumb tracker we only used internally.
20:58 jalway Which is where Koha came in for us.
20:58 jalway We just made the switch from our cataloging system of 20 years, to Koha, fully.
21:02 Mimir909 Whoops, crashed my Linux.
21:05 Mimir909 Great time to practice the actual server-side commands for Koha - Which thing in the terminal will actually start the server back up?
21:06 jalway Mimir909, That's what I have IT people for. xD
21:06 jalway Mimir909, I can probably find it somewhere, been a while though. xD
21:08 Mimir909 Ah, I have to start Apache
21:12 Mimir909 Crap. Server went down in the crash and I need to figure out how to get it up again. Lol
21:21 Mimir909 Okay, it's back up - Is there an easy way to just check the catalog directly? Or do I have to run a report and such?
21:23 jalway Mimir909, You mean to see if a record is in the system?
21:23 jalway Mimir909, Click Cataloging at the top and just do a search in the text box given.
21:23 jalway up at the top
21:24 jalway Mimir909, You should also have an OPAC view, but I have no idea how to tell you how to get to that. xD
21:25 jalway Mimir910, Actually, once you have an item up in the cataloging search, there is an Open in OPAC view.
21:26 jalway "OPAC view: Open in new window." Once you have clicked on an item that you've cataloged.
21:26 Mimir910 Hey, cool - I exported catalog data and then opened it in MarcEditor
21:26 jalway That's your "public" catalog view. I have no idea how you have Koha set up, so I have no idea if it's public or not.
21:27 jalway From Koha or from your old system?
21:27 davidnind You can list all records in the system - in the staff interface, from the dropdown list next to Search in the top header select Advanced search, then select the Item types you have and click on Search.
21:28 Mimir910 From Koha
21:28 jalway Mimir910, Cool, yeah, Koha is nice like that. Once you have your data in Koha, you can still get your data out of Koha. ;-)
21:30 davidnind I'll put together a list of resources that may help, but I can't do that until the weekend. In the meantime, here are some to get you started (bearing in mind that some things may not be up-to-date with the version you are using, assuming Koha 23.05.x).
21:31 davidnind ByWater Solutions learning resources: https://koha.bywatersolutions.com/
21:32 davidnind koha-US learning resources: https://koha-us.org/learn/
21:33 davidnind (bonus with koha-US - the special interest groups are really useful to attend, depending on your area of interest, and are open to anyone)
21:34 jalway Nice. :-)
21:35 davidnind Koha Geek - lots of useful tips and quick tutorials: https://kohageek.blogspot.com/
21:37 davidnind RAFLIMITS - I haven't watched many of these, but they seem great (more in-depth tutorials/instructions): https://www.youtube.com/@RAFLIMTS
21:40 davidnind And there is the Koha Community manual - it's always a work in progress, and we've never managed to have it 100% up-to-date! https://koha-community.org/manual/23.05/en/html/index.html
21:41 davidnind Probably something useful for you - Implementation checklist (https://koha-community.org/manual/23.05/en/html/implementation_checklist.html#implementation-checklist-label) and Cataloguing (https://koha-community.org/manual/23.05/en/html/cataloging.html)
21:42 jalway You can also click the help button on a page in Koha to be taken to the manual.
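On the earlier "which thing in the terminal will start the server back up" question: for a Debian/Ubuntu package install the usual suspects are roughly the following. This is a hedged sketch, not a complete list; "library" stands in for your instance name, and not every site runs all of these services.

```
sudo systemctl start apache2            # web front end (what Mimir909 found was down)
sudo koha-zebra --start library         # Zebra search daemon
sudo koha-plack --start library         # Plack application server, if enabled
sudo koha-worker --start library        # background job worker (staged imports, etc.)
sudo systemctl start rabbitmq-server    # message broker used by the worker
```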
21:42 davidnind I'll see if I can find a more recent tutorial on importing your data - there are a lot around, but some are quite dated - unless anyone else beats me to it!
21:43 Mimir910 Oh snap
21:43 Mimir910 MarcEdit has a Z39.50 client
21:44 jalway davidnind, Part of the trouble is the data. Mimir910 has 2 datapoints, QuestionableTitle+ISBN.
21:45 jalway Mimir910, Yes, it's a little finicky, but it is usable.
21:45 Mimir910 Any way I can do this in bulk is ideal, haha
21:45 Mimir910 One day just getting a bunch up on my Koha at once will speed up the rest of the actual work I need to do
21:46 jalway Mimir910, I used the MarcEdit Z39.50 client some. You can add those Z39.50 servers to the server list in MarcEdit as well.
21:47 jalway Mimir910, Hmm..., looks like you could do a batch search from a file with a list of ISBNs.
21:48 jalway Mimir910, Using MarcEdit.
21:48 jalway Then, you could download those records and stage those for import in Koha.
21:49 jalway At which point you'll be trying to import probably good records.
21:51 Mimir910 God yes this is what I need. Hahaha
21:51 Mimir910 This is all helping me understand how this stuff works, but I got orders to fill and was hoping to have the backing of a working catalog for it
21:51 Mimir910 To prevent this becoming a mess again in the future, because it's a huuuuge mess they handed me and asked me to fix
21:55 davidnind You need to make sure that your MARC records have the required information for Koha, such as the item type codes, and the codes you use for things like collections, library location, etc. - these need to match what you have set up in Koha. I will try to find a good resource about this for you.
21:58 davidnind If you aim for a minimal record - such as title, author, ISBN, etc., and item information, such as barcodes, etc. - you can always enhance later on, either by updating individual records using Z39.50 search, or by exporting, using MarcEdit for Z39.50, and then updating the records in Koha.
21:59 jalway Mimir910, Yeah, you can batch search / save marc records via a list of ISBNs. Using the batch search feature in MarcEdit.
22:01 jalway Mimir910, I just tested it, I didn't get any hits, but it seemed to function correctly. I just grabbed 3 random ISBNs and searched a random small library. So, it's not surprising that I didn't get any hits.
22:01 davidnind Hope we are not overwhelming you here! 8-) Start slowly, get the import of a representative sample of records and items working, then rinse and repeat.
22:07 jalway Oof, the documentation for MarcEdit is hopelessly out-of-date, but the Z39.50 batch search by ISBN seems to be functional.
22:11 jalway I'm out. Have a good one. o/
22:15 Mimir910 As a person who's worked in kitchens, I don't really do 'overwhelmed'
22:15 Mimir910 But I do live in a perpetual state of struggling against a new, higher level of understanding and performance. Haha
23:39 Penggu Hi. Anyone come across "WARNING: MYSQL_OPT_RECONNECT is deprecated and will be removed in a future version." errors that prevent the installer from succeeding?
23:40 dcook_ Penggu: That should be a warning rather than an error
23:40 dcook_ It shouldn't be preventing the installer from working
23:40 Penggu On the web: Updating database structure / Update errors: / WARNING: MYSQL_OPT_RECONNECT is deprecated and will be removed in a future version. / [Try again]
23:40 dcook_ Interesting..
23:41 dcook_ In any case, I suggest looking at https://github.com/perl5-dbi/DBD-mysql/issues/354
23:41 dcook_ You'll most likely want to downgrade libmysqlclient21
23:41 Penggu log files don't shine any light. Same line in /var/log/koha/site/updatedatabase-error.....log
23:41 dcook_ The latest client has been wreaking havoc on the Internet the past 24 hours or so
23:41 Penggu ok thanks for the pointer
23:42 dcook_ No worries. Unfortunately, it's the only real option at the moment until DBD::mysql releases an update to their Perl module not to set MYSQL_OPT_RECONNECT (even as false)
23:42 dcook_ This comment specifically will help you the most: https://github.com/perl5-dbi/DBD-mysql/issues/354#issuecomment-1679677264
23:43 Penggu perfect - thanks
23:43 Penggu i've got other upgrade errors now (progress!)
23:43 dcook_ Hehe
23:44 dcook_ You're in Victoria, yes? At a library or a company?
23:45 Penggu yes - a k12 school
23:46 dcook_ Ah very cool
23:47 dcook_ I'm in NSW so I'm around if you have questions/issues
23:47 Penggu thank you
23:58 Penggu Had to delete a couple of publishercode indexes on biblioitems and deletedbiblioitems because the upgrade script was trying to convert the column to text from varchar(255). Finally finished the upgrade!
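For anyone else landing here with the MYSQL_OPT_RECONNECT problem: the workaround dcook points to boils down to downgrading the Ubuntu client library and holding it there until DBD::mysql ships a fix. A rough sketch; the exact older version string depends on your release, so substitute one of the versions apt actually offers.

```
apt-cache policy libmysqlclient21                      # list the versions available to you
sudo apt install libmysqlclient21=<previous-version>   # substitute an older version from that list
sudo apt-mark hold libmysqlclient21                    # stop apt from upgrading it again
```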