Time  Nick             Message
23:22 CrispyBran       Have a good one.
23:22 CrispyBran       Thanks for the info.
23:14 wizzyrea         so that's where to start.
23:14 wizzyrea         either way you'll know when you go to bz apply the patch on current master.
23:13 wizzyrea         fixing an "insufficient blobs" error is way different from, say, a merge conflict
23:13 wizzyrea         the approach really depends on how it's not applying
23:13 wizzyrea         if not, find out why, and fix that.
23:13 wizzyrea         reattach
23:12 wizzyrea         if it works, yay
23:12 wizzyrea         you could also check out a clean master, and cherry pick your patch over
23:12 wizzyrea         that might do it
23:12 CrispyBran       do I just do a git pull on the master, test my patch again and then obsolete and attach again?
23:10 wizzyrea         (but it depends on the error message)
23:09 wizzyrea         rebase it, probably
23:09 CrispyBran       Joubu: when you create a patch and someone says it doesn't apply, what do I need to do on my end?
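
A minimal sketch of the rebase workflow described in the exchange above, assuming the patch was downloaded from Bugzilla as bug-xxxxx.patch (hypothetical filename) and that origin points at the main Koha repository:

    import subprocess

    def git(*args):
        subprocess.run(["git", *args], check=True)  # raise on failure

    git("fetch", "origin")                        # get current master
    git("checkout", "-B", "test-apply", "origin/master")
    try:
        # roughly what "git bz apply" does under the hood
        git("am", "--3way", "bug-xxxxx.patch")
    except subprocess.CalledProcessError:
        # merge conflict: fix it, run "git am --continue", then obsolete
        # the old attachment on the bug and attach the rebased patch
        raise

If it applies cleanly, the failure was something else (missing dependency patches, for example), and the error message will say so.
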
21:25 talljoy          hiya rangi!
21:25 rangi            hi talljoy
21:23 wahanui          o/ '`'`'`'`'`'`'`'`'`'`'`'`'`'`'`'`'`'`'`'`'`'`'`'`'`'`'`'`'
21:23 wizzyrea         more confetti!
21:23 wahanui          confetti is, like, http://25.media.tumblr.com/tumblr_lpm3j6aNaN1qh8hleo1_400.gif
21:23 wizzyrea         confetti!
21:16 alexbuckley      Nope, I did like your changes - better to see fewer screens by having the additional info about how to create things on the same page as the forms, for example
21:16 Joubu            have to run, see you tomorrow :)
21:15 Joubu            just hope you did not signoff because you gave up!
21:15 alexbuckley      No worries :)
21:14 Joubu            thx for the quick signoff alexbuckley :)
21:14 Joubu            ah, the dbic schema you meant - yes, that you will have to
21:14 alexbuckley      Yes thank you for the work on the onboarding tool Joubu
21:13 cait             so i don't need to regenerate?
21:13 cait             oh?
21:13 cait             just something i haven't done so far :)
21:13 Joubu            a simple c/p of the DB rev from master should be enough
21:13 Joubu            cait: do not worry about the schema changes, we now have checks to avoid failures on upgrade
21:12 cait             have to figure out the schema changes too i think
21:11 cait             i'd like to get the translators on it too - but will try to push it early
21:11 Joubu            yep, apparently hea will not be backported this month, but should be next month :)
21:09 rangi            to out of the SCO :-)
21:09 rangi            ill also fix the english
21:08 rangi            i think that is less clunky than he or she eh?
21:08 rangi            If the user logged in is the SCO user and they try to go out the SCO module, log the user out removing the CGISESSID cookie
21:08 cait             Joubu++
21:07 rangi            and its good to see hea2 live too
21:07 rangi            Joubu: thanks for the work on the onboarding tool
21:01 rangi            cool, ill update it now(ish)
21:01 Joubu            apparently the problem does not appear on the interface
21:01 Joubu            the patch is only about comments, so no translation needed
21:00 rangi            if we get the base neutral, individual communities can decide how to deal with the translations
20:59 rangi            and in english, they can be singular or plural, so easy to always use that
20:59 rangi            yeah, māori has no gendered pronouns so it is easy
20:57 cait             i didn't know until not so long ago
20:56 cait             i think some of the he/she/they problem is non native speakers not being aware of the neutral forms and how to use them
20:56 cait             so people are more aware?
20:56 cait             if we want to change it permanently, maybe we should also have a coding guideline
20:53 rangi            in fact, im gonna put my time where my mouth is and do a follow up doing that
20:52 rangi            it is actually important
20:52 rangi            http://timesofindia.indiatimes.com/city/thiruvananthapuram/Thiruvananthapuram-library-opens-a-new-page/articleshow/55395836.cms
20:48 rangi            there is no reason to ever need to use gendered language in an example
20:48 rangi            problem solved
20:47 rangi            just use they
20:19 CrispyBran       If we changed the language to female, I seriously doubt we'd lose any of the male programmers.  Anyway, that's all the energy I can contribute to this topic.  Moving on to problematic code.
20:17 CrispyBran       I am not sure how reference to a particular gender as an EXAMPLE proves to be problematic.  If a programmer has issue with this, there are deeper issues that should be addressed, rather than taking offense at an example.
20:09 CrispyBran       :)
20:08 cait             hm this vegetable needs to do laundry, brb
20:07 cait             ?
20:07 cait             when a notice with the code ACCEPTED is set up, a message will be sent from the kitten to the patron.
20:06 cait             lol
20:06 cait             ?
20:06 cait             hm stop her
20:06 * cait           has to admit a 'he' in a comment doesn't stop me
20:06 CrispyBran       'When a carrot manages the suggestion, it can set the status to "REJECTED" or "ACCEPTED".'
20:06 cait             CrispyBran: can you demonstrate this in a sentence? :)
20:05 * cait           waves
20:05 * CrispyBran     thinks all references to librarians should be in animal or vegetable form.
20:02 * Joubu          trolls and goes out
20:02 wahanui          and ever, amen.
20:02 Joubu            but "he or she" just makes things heavy to read IMO
20:02 Joubu            whatever
20:02 Joubu            can be replaced by she or whatever
20:02 CrispyBran       Joubu: really?  Someone is going to waste programmer time with this?
20:00 Joubu            to me it does not make sense to make these changes as it is only in code comments
20:00 huginn`          Bug http://bugs.koha-community.org/bugzilla3/show_bug.cgi?id=18432 trivial, P5 - low, ---, ephetteplace, Signed Off , Most code comments assume male gender
20:00 Joubu            I'd like a English native speaker to take a look at bug 18432
19:52 huginn`          Bug 18450: major, P5 - low, ---, koha-bugs, NEW , Renew in header bypasses hold block and renewal limits
19:52 CrispyBran       https://bugs.koha-community.org/bugzilla3/show_bug.cgi?id=18450
17:56 * magnuse        will try to squeeze in a signoff
17:45 oleonard         I hope to do so
17:45 magnuse          patches are welcome ;-)
17:44 magnuse          gotcha!
17:44 oleonard         (easily)
17:44 oleonard         So you can't add the OPAC as a search engine in Firefox
17:44 magnuse          ah
17:44 oleonard         It can provide OpenSearch results, if I recall correctly, but it doesn't enable auto-discovery of search for browsers
17:44 magnuse          is it something else that is used to show results from kete in koha and vice versa?
17:43 magnuse          oleonard: i thought koha had opensearch?
17:00 oleonard         I'm surprised I haven't wished Koha had OpenSearch for the OPAC enough yet to submit a patch.
16:59 oleonard         I'm surprised Koha doesn't have OpenSearch for the OPAC
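
For context, the auto-discovery oleonard describes needs two pieces: a <link> tag in the OPAC's HTML head, and an OpenSearch description document it points to. A sketch, assuming a hypothetical OPAC at opac.example.org (opac-search.pl is Koha's real search script; everything else here is made up):

    # The tag the OPAC would need to emit in its <head>:
    DISCOVERY_LINK = (
        '<link rel="search" type="application/opensearchdescription+xml" '
        'title="Example OPAC" href="/opensearchdescription.xml">'
    )

    # The description document that href would serve:
    OPENSEARCH_XML = """<?xml version="1.0" encoding="UTF-8"?>
    <OpenSearchDescription xmlns="http://a9.com/-/spec/opensearch/1.1/">
      <ShortName>Example OPAC</ShortName>
      <Description>Search the library catalogue</Description>
      <Url type="text/html"
           template="https://opac.example.org/cgi-bin/koha/opac-search.pl?q={searchTerms}"/>
    </OpenSearchDescription>
    """

With those two in place, Firefox offers to add the OPAC as a search engine.
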
16:39 reiveune         bye
15:11 barton           good morning #koha!
14:50 Mauricio_BR      Thank you Cait oleonard kidclamp ;)
14:40 cait             from the urls
14:40 cait             you could use a tool like piwik to get the searches
14:40 cait             the search history
14:40 cait             i think it's in a cookie before you log in?
14:34 kidclamp         could be a development though, seems possible in the existing code to support saving as the anonymous patron - but you'd want it tied to a syspref
14:33 oleonard         Are searches logged by zebra?
14:33 Mauricio_BR      ok, thanks guys for the information.
14:32 Mauricio_BR      is only for suggestions and/or reading history items.
14:32 Mauricio_BR      yes
14:30 eythian          ah, suggestions not history. I misremembered.
14:29 Mauricio_BR      http://translate.koha-community.org/manual/3.20/en/html/administration.html#AnonSuggestions
14:29 kidclamp         we only save in the table if we have a logged in user
14:28 kidclamp         in the code it seems non-logged in history is stored for session, but not in db
14:26 oleonard         The AnonymousPatron pref says "(for anonymous suggestions and reading history)"
14:26 Mauricio_BR      there is the user with id 0 but it seems to be the admin
14:25 Mauricio_BR      so it is bound to a user
14:25 eythian          isn't there a special user it becomes if it's anonymous that's configured by a syspref?
14:25 Mauricio_BR      but in this table every record has a user field
14:24 Mauricio_BR      yes
14:24 oleonard         Mauricio_BR: Have you looked at the 'search_history' table? I think kidclamp is right: I only see entries for logged-in users.
14:23 Mauricio_BR      XD
14:23 oleonard         Better than my... Every other language which exists.
14:22 Mauricio_BR      my english is not good as you can see...  haha
14:22 Mauricio_BR      oh, sorry
14:22 oleonard         Thought you were asking about a text string
14:22 oleonard         Oh I misunderstood what you were asking
14:21 kidclamp         I don't think search history is stored unless the user is logged in
14:21 Mauricio_BR      because i am working with datamining...
14:21 oleonard         Why are you looking for it?
14:21 Mauricio_BR      i have the 16.05 ver. of Koha
14:20 Mauricio_BR      i am looking for it in the tables of the database
14:19 oleonard         Mauricio_BR: Where do you see it?
14:17 Mauricio_BR      Hello friends. Please help me with something. I am trying to locate in the database the table which stores the words searched in OPAC as an anonymous user (no login in OPAC). Do you know where I can find it?
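
A quick way to verify kidclamp's point above: Koha's search_history table has a userid column, and only logged-in searches land there. A sketch, assuming the mysql-connector-python package and placeholder credentials:

    import mysql.connector

    conn = mysql.connector.connect(
        user="koha", password="secret", database="koha", host="localhost"
    )
    cur = conn.cursor()
    # every row here belongs to a logged-in borrower; anonymous OPAC
    # searches only live in the session, as discussed above
    cur.execute(
        "SELECT userid, query_desc, time FROM search_history "
        "ORDER BY time DESC LIMIT 10"
    )
    for userid, query_desc, when in cur:
        print(userid, query_desc, when)
    conn.close()
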
14:07 marcelr          Joubu: hi; any chance to have another look at the qa changes on the upload reports 17669/18300 ?
13:29 oleonard         Not a big project, just a lot of small ones
13:28 cait             revamping another website? :)
13:27 oleonard         More than me :P
13:26 * cait           didn't do much this release
13:25 oleonard         cait++
13:22 cait             mveron++ :)
13:19 mveron           :-)
13:18 oleonard         mveron++
12:22 oleonard         I will take a look
12:21 huginn`          Bug http://bugs.koha-community.org/bugzilla3/show_bug.cgi?id=7550 normal, P5 - low, ---, veron, Needs Signoff , Self checkout: limit display of patron image to logged-in patron
12:21 mveron           oleonard: Bug 7550 - what do you think about?
12:21 * cait           waves :)
12:21 * mveron         waves
12:03 * oleonard       waves
12:01 marcelr          hi #koha
11:57 toins            hi all
11:49 oha              we will be too!
11:47 ashimema         atheia will be super happy :)
11:47 ashimema         :)
11:45 magnuse          i have started to look at adapting my existing NNCIPP code, so I might be able to do a signoff pretty soon
11:45 magnuse          ashimema: ah, if you use it in production that should be an "argument" for getting it in
11:23 * cait           waves
10:55 ashimema         we use it in production a lot
10:55 ashimema         I'd love to see it in :)
10:34 huginn`          magnuse: The operation succeeded.
10:34 magnuse          @later tell atheia do you think it would make sense to try and get ILL as it stands now into 17.05?
10:34 magnuse          Feature Slush for 17.05 is May 5 2017
10:28 magnuse          ashimema: do you think it would make sense to try and get ILL as it stands now into 17.05? i should probably ask atheia...
09:18 wahanui          niihau, eythian
09:18 eythian          hi
08:28 wahanui          kamelåså
08:28 cait             hi magnuse
08:27 magnuse          kia ora cait
08:11 magnus_breakfast \o/
07:57 fridolin         happy Easter Egg :)
07:56 fridolin         hie there
07:30 dcook            Good luck Eurofolk
07:30 dcook            Anyway, I better head out
07:30 dcook            hehe
07:27 oha              speaking of oha vs koha, i usually add an "oha" comment when i change something so i can easily find it again. but with so many (k)oha strings this hasn't been working so well lately :)
07:20 wahanui          privet, gaetan_B
07:20 gaetan_B         hello
07:18 oha              dcook: oh! not sure. i've used oha for more than 20 years now.
07:17 dcook            I don't think so... I'm sure I've seen your full name somewhere :p
07:16 oha              dcook: eheh, is it because (k)oha? :)
07:14 dcook            oha... that name is familiar
07:01 oha              o/
06:51 wahanui          salut, reiveune
06:51 reiveune         hello
06:46 wahanui          hello, alex_a
06:46 alex_a           bonjour
06:46 dcook            Yeah maybe
06:41 magnuse          dcook: yeah, bit of a different use case, i guess
06:32 dcook            And they don't import RDF at all :/
06:32 dcook            At least to outside entities
06:32 dcook            Although maybe that's partially because they don't do any real linking
06:32 dcook            They don't use named graphs though, which was interesting
06:31 dcook            magnuse: Unfortunately, I don't think he'll be able to help too much, but it'll be good to get some more details
06:31 dcook            4-5 always seems the busiest time of day!
06:31 dcook            magnuse: No worry. I have about a million different things happening at once anyway :)
06:26 josef_moravec    morning #koha
06:16 mveron           Good morning / daytime #koha
06:13 magnuse          brinxmat probably has good advice
06:13 magnuse          dcook: sorry, was busy in another window
05:25 dcook            At the moment, I'm thinking we download the records via OAI-PMH then... run them through a filter which creates triples that Koha can understand. We could also use SPARQL CONSTRUCT, though that would in theory involve more hardcoding...
05:20 dcook            I think it relates to how I import/store the data though
05:20 dcook            Sorry if I'm stepping on your toes at all
05:19 dcook            Hey magnuse, chatting to @brinxmat on Twitter about LD atm
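
A sketch of the SPARQL CONSTRUCT option dcook mentions above: map a source vocabulary onto the one Koha would index, using rdflib. The input filename and the choice of target predicate are assumptions:

    from rdflib import Graph

    MAPPING = """
    PREFIX dc:     <http://purl.org/dc/elements/1.1/>
    PREFIX schema: <http://schema.org/>

    CONSTRUCT { ?work schema:name ?title }
    WHERE     { ?work dc:title ?title }
    """

    src = Graph().parse("harvested-records.ttl", format="turtle")
    mapped = src.query(MAPPING).graph   # new graph in the target vocabulary
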
05:19 * dcook          waves
05:14 * magnuse        waves
02:43 dcook            If you broke them up, your schema:name would go into that named graph...
02:43 dcook            Would you stick them all in one named graph or break them up..
02:43 dcook            You'd.
02:42 dcook            So if you did download from http://www.worldcat.org/title/good-omens/oclc/973430700
02:35 dcook            And this year the National Library of Finland is opening up its national bibliography as linked data it seems..
02:34 dcook            https://news.minitex.umn.edu/news/digital-initiatives-metadata-education/highlights-alcts-webinar-linked-data-cataloging
02:30 dcook            Actually, that all looks proposed...
02:27 dcook            Not that that is even anywhere near my problem atm..
02:27 dcook            I do wonder a bit about using some other tools for handling RDF records... and hooking them more loosely into Koha..
02:26 dcook            Interesting... and they're using BIBFRAME... of some kind
02:25 dcook            As I don't see a lot of data out there about that, yet that seems to be what UCDavis and Oslo both do..
02:24 dcook            I'm intrigued by a RDF->MARC conversion as well.
02:23 dcook            That seems...
02:23 dcook            But maybe just into the cataloguer
02:22 dcook            Interesting... it seems that they do download the OCLC graph..
02:21 dcook            I'm too young to be squinting..
02:21 dcook            Gotta love super low res images..
02:20 dcook            Hmmm https://bibflow.library.ucdavis.edu/copy-cataloging/
02:19 rangi            ah yeah
02:19 dcook            Tweets & replies rather than just Tweets... I guess because I @ed someone?
02:18 dcook            Ahh, because I don't understand conventions I guess..
02:18 dcook            Don't know why it's not showing up in my UI
02:17 dcook            Cheers
02:17 rangi            https://twitter.com/minusdavid/status/854155774943154176
02:17 dcook            Surely it shows up under my tweets..
02:16 dcook            I haven't tweeted in too long... can't even find my own tweet now..
02:16 dcook            hehe
02:16 ibeardslee       check twice, tweet once
02:15 dcook            I was going to double-check!
02:15 dcook            Ah balls
02:15 rangi            heh, im not @rangi im @ranginui (just fyi)
02:13 dcook            I think I can wrap my head around almost everything except copy cataloguing with RDF
02:13 dcook            tweet sent
02:11 dcook            Oh wait hackfest... those wouldn't be recorded
02:10 dcook            What was the talk about?
02:10 rangi            he was at Kohacon16 and gave a good talk at the hackfest
02:10 * dcook          thumbs up
02:10 rangi            drop him a tweet
02:10 dcook            I mean... they're really the perfect people to talk to since they're interfacing with Koha too
02:09 rangi            https://twitter.com/brinxmat
02:09 dcook            I figure if I can chat to them... they might be able to clarify everything
02:09 rangi            yup
02:09 dcook            Rurik Greenall?
02:08 rangi            rurik is the main project manager / tech lead
02:08 dcook            Not enough experience to know for sure though
02:08 dcook            Yeah, the LIBRIS records are intense... and I'm not sure if they're 100% correct..
02:08 rangi            lots and lots of others
02:08 dcook            Do you know if that's all done by Petter or if they have others?
02:08 rangi            (i meant libris not oslo)
02:08 dcook            Didn't have enough time to totally go through it all
02:08 rangi            ah cool
02:08 dcook            And that saves to both the triplestore and Koha's marc database
02:08 dcook            I think now they have an editor (Catalinker)
02:07 dcook            Or maybe it used to be..
02:07 dcook            I took a look a bit at their github
02:07 dcook            Mmm, I don't think so
02:07 rangi            i think its still marc, that they render as rdf on the fly
02:07 dcook            Need to talk to them too
02:07 dcook            Not sure if they're still demoing or not
02:07 dcook            I think LIBRIS might be too, but I'm not 100% sure yet
02:07 rangi            only ones in the world afaik
02:06 rangi            as they have a fully rdf based catalogue working
02:06 * dcook          agrees
02:06 rangi            you should really talk with the Oslo people
02:06 rangi            yep
02:03 dcook            But that doesn't necessarily make sense..
02:03 dcook            And show that somewhere on your web page..
02:03 dcook            Using owl:sameAs or some alternative... perhaps you could define some rules to crawl that entity...
02:02 dcook            Maybe you fill out some basic details for your indexer?
02:02 dcook            Do you just do a owl:sameAs?
02:01 dcook            You need a local record in your LMS so that your library patrons can find it
02:01 dcook            You subscribe to Worldcat
02:01 dcook            You've ordered "Good Omens" as you got a purchase request for it
02:00 dcook            Well that's where my mind bends a bit..
02:00 dcook            Now if you were to import this record from OCLC...
01:59 dcook            Although I suppose they could've constructed the entity for display purposes here too..
01:59 dcook            I reckon when they crawled the dbpedia URI... they must have run it through a mapper and only saved it with the mapped/filtered triples..
01:59 dcook            As that triple doesn't exist in dbpedia
01:58 dcook            schema:name is something that OCLC must have generated
01:58 dcook            rangi: The thing I find interesting with that example is the <http://dbpedia.org/resource/London> entry
01:58 dcook            Probably wouldn't be that hard..
01:56 dcook            Seemingly with SPARQL's DESCRIBE, although I haven't played with that in terms of named graphs yet
01:56 dcook            Although they do it in the downloads too
01:56 dcook            And they've just aggregated them here on the page..
01:56 dcook            As are the "related entities"
01:56 dcook            Looking at theirs... I feel like <http://www.worldcat.org/oclc/973430700> is probably in a named graph
01:55 rangi            right
01:53 dcook            http://www.worldcat.org/oclc/973430700
01:53 dcook            I wish I could see what OCLC is doing behind the scenes.. as I like some of their examples
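
The DESCRIBE idea mentioned above, as a sketch against a local endpoint (the endpoint URL is an assumption): DESCRIBE hands back every triple the store holds about a resource, which is roughly what the WorldCat page appears to aggregate for display.

    from SPARQLWrapper import SPARQLWrapper

    sparql = SPARQLWrapper("http://localhost:3030/koha/sparql")  # assumed
    sparql.setQuery("DESCRIBE <http://www.worldcat.org/oclc/973430700>")
    graph = sparql.query().convert()     # an rdflib graph
    print(graph.serialize(format="turtle"))
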
01:53 rangi            probably
01:52 dcook            I wonder if there's a nicer way to do subjects than schema:keywords..
01:51 dcook            That's pretty coool
01:50 dcook            Nope
01:50 rangi            http://linter.structured-data.org/?url=http:%2F%2Fdemo.mykoha.co.nz%2Fcgi-bin%2Fkoha%2Fopac-detail.pl%3Fbiblionumber%3D649%26query_desc%3Dkw%252Cwrdl%253A%2520fiish
01:50 rangi            have you seen this?
01:50 dcook            As you say, good starting point
01:50 dcook            Yeah, I think they all use literals for the moment but that's OK
01:49 rangi            yep a good starting point
01:49 dcook            We could have those RDF statements in a triplestore...
01:49 dcook            So yeah... we already have embedded RDF statements using HTML+Microdata...
01:47 * dcook          gives a thumbs up
01:47 rangi            we should use more of what is available here
01:47 rangi            https://bib.schema.org/
01:47 dcook            We could use that in a triplestore as well
01:46 dcook            So that's really good
01:46 rangi            yep
01:46 dcook            Well, in terms of microdata at least
01:46 rangi            and it is extensible
01:46 dcook            rangi: Yeah, that's what I mean
01:46 rangi            we have schema.org support in koha already
01:42 dcook            Hmm... maybe using the schema.org schema would be a good place to start
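
Since the OPAC already embeds schema.org microdata (see the linter link above), those statements can be lifted straight out as structured data. A sketch, assuming the third-party extruct and requests packages and the demo URL from earlier:

    import extruct
    import requests

    url = "http://demo.mykoha.co.nz/cgi-bin/koha/opac-detail.pl?biblionumber=649"
    html = requests.get(url).text
    data = extruct.extract(html, syntaxes=["microdata"])
    for item in data["microdata"]:
        # e.g. a schema.org type plus its name/author/keywords properties
        print(item["type"], item.get("properties", {}).get("name"))
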
01:39 dcook            hehe
01:38 kidclamp         night
01:38 kidclamp         go for it, I will talk with Joy and argue and respond :-)
01:38 dcook            I might send out another email... a much shorter one
01:38 dcook            Anyway, I won't keep you. Enjoy :)
01:38 dcook            That's what I always say too
01:38 dcook            hehe
01:38 dcook            Oh man, I misread beertime as bedtime
01:38 kidclamp         :D
01:38 kidclamp         but he is the best one
01:38 kidclamp         one
01:38 dcook            You have kids?
01:38 dcook            Yeah I wish our time zones coincided more too
01:37 kidclamp         then bedtime
01:37 kidclamp         it's my beertime
01:37 dcook            That's my ideal bedtime :p
01:37 dcook            9:37pm, eh? Yeah I guess that's fair
01:37 kidclamp         UTC -5?
01:37 kidclamp         Vermont, North East USA
01:37 kidclamp         I wish our time zones coincided when I was less tired :-)
01:37 dcook            night in any case :)
01:37 dcook            I don't even know where you live so I can't hassle you :p
01:36 kidclamp         but I am off for the night
01:36 kidclamp         supposedly at least :-)
01:36 dcook            Now that I think about it... aren't we already using microdata...
01:35 dcook            But then you might want to use your local schema that you could index easily..
01:35 dcook            Maybe too a library database like LIBRIS, Library of Congress, National Library of Australia, etc
01:35 dcook            You'd maybe use owl:sameAs for linkedmdb and dbpedia..
01:34 dcook            Let's say we were cataloguing The Shining in Koha..
01:34 dcook            On linkedmdb they use dc:title
01:34 dcook            The title is dbp:name
01:34 dcook            Then on dbpedia: http://dbpedia.org/page/The_Shining_%28film%29
01:32 dcook            I've seen this with LIBRIS... a mix of their own schema and established schemas
01:32 dcook            http://data.linkedmdb.org/page/film/2014
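
The Shining example above in Turtle, as a sketch: the local koha.example.org URI and the choice of schema:name are assumptions, and the two owl:sameAs targets are the pages just linked.

    from rdflib import Graph

    SHINING = """
    @prefix owl:    <http://www.w3.org/2002/07/owl#> .
    @prefix schema: <http://schema.org/> .

    <http://koha.example.org/bib/42>
        schema:name "The Shining" ;
        owl:sameAs <http://dbpedia.org/resource/The_Shining_(film)> ,
                   <http://data.linkedmdb.org/resource/film/2014> .
    """

    g = Graph().parse(data=SHINING, format="turtle")
    print(len(g))   # 3 triples
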
01:30 dcook            And surely there must be non-XML based mappings..
01:30 dcook            hehe
01:30 kidclamp         so simple
01:25 dcook            I guess that would look something like... kbv:Record/sdo:mainEntity/bf2:instance/bf2:title/bf2:InstanceTitle/bf2:mainTitle in xpath..
01:25 dcook            Just to get the title
01:25 dcook            Then sdo:mainEntity then bf2:Instance then bf2:title then bf2:InstanceTitle then bf2:mainTitle
01:24 dcook            At the top level we have kbv:Record
01:24 dcook            But check this out: https://libris.kb.se/data/oaipmh/?verb=GetRecord&identifier=https://libris.kb.se/r93fv6w306l65x4&metadataPrefix=rdfxml
01:23 dcook            Vocabulary mapping... and that's how we get LIBRIS into a format Koha knows..
01:21 dcook            Since it can handle any XML-based format
01:21 dcook            In theory, Zebra could actually be used too.
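
A sketch of that extraction with lxml, following the element path dcook lists above and assuming the GetRecord response was saved locally; the namespace URIs bound to kbv/sdo/bf2 are guesses and should be checked against the actual document:

    from lxml import etree

    NS = {
        "kbv": "https://id.kb.se/vocab/",                  # assumption
        "sdo": "http://schema.org/",
        "bf2": "http://id.loc.gov/ontologies/bibframe/",
    }

    doc = etree.parse("libris-getrecord.xml")   # saved OAI-PMH response
    titles = doc.xpath(
        "//kbv:Record/sdo:mainEntity/bf2:Instance/bf2:title"
        "/bf2:InstanceTitle/bf2:mainTitle/text()",
        namespaces=NS,
    )
    print(titles)
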
01:20 dcook            Well one way or another..
01:20 dcook            Actually, I think we'd still need a single target schema
01:19 dcook            I keep thinking this is going to be so inefficient..
01:19 dcook            So that should make data validation a bit easier
01:19 dcook            But I suppose subjects and predicates should all be in URI format
01:18 dcook            I really dislike how you can't really make parameterized SPARQL queries
01:18 dcook            I suppose that will have to be in the SPARQL..
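
On the parameterization gripe above: rdflib at least lets you bind variables at query time instead of splicing strings into the SPARQL, which covers part of it. A sketch (the input file is hypothetical):

    from rdflib import Graph, URIRef

    g = Graph().parse("harvested-records.ttl", format="turtle")
    QUERY = "SELECT ?p ?o WHERE { ?s ?p ?o }"
    london = URIRef("http://dbpedia.org/resource/London")
    # ?s is bound like a query parameter, not interpolated into the string
    for p, o in g.query(QUERY, initBindings={"s": london}):
        print(p, o)
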
01:16 kidclamp         agnosticism++
01:15 dcook            Not hard-coding things++
01:15 dcook            That's a good point
01:15 kidclamp         I think as long as we have a way to index any specific schema we can get away with being flexible, it just means a ton of mapping work - choosing one schema and running with that doesn't preclude supporting others, it just means we focus the work in one place
01:14 dcook            Well they've hindered us I think
01:14 dcook            MARC is a great interchange format, but I think its limitations...
01:14 dcook            To be honest, I think it's something that should've happened a long time ago
01:13 kidclamp         yeah, I am up in the air about choosing a schema
01:13 dcook            I'm being assimilated...
01:13 dcook            Spanner came to mind before wrench
01:13 dcook            Bloody hell..
01:13 dcook            That's the one
01:13 dcook            https://xkcd.com/538/
01:12 dcook            That and the one about encryption
01:12 dcook            wizzyrea: I think that's my favourite xkcd of all time
01:12 wizzyrea         https://xkcd.com/927/
01:12 dcook            kidclamp: And I think that's vitally important in terms of indexing the data
01:12 dcook            Whether Koha comes up with its own schema or uses an existing standard..
01:12 dcook            Folk like Oslo Public Library and LIBRIS use their own schemas...
01:11 dcook            Without those mappings... I think it would be impossible..
01:11 dcook            And that makes sense
01:11 dcook            I assume they're mapped somewhere in OCLC's back-end
01:11 dcook            Instead of the schemas preferred by dbpedia
01:11 dcook            While it points to http://dbpedia.org/resource/London, it uses the "schema" schema
01:10 dcook            I think we can see that with OCLC
01:10 dcook            "In order to understand as much Web data as possible, Linked Data applications translate terms from different vocabularies into a single target schema."
01:10 dcook            That kind of comes up at http://linkeddatabook.com/editions/1.0/#htoc84
01:09 dcook            Because the question becomes... what vocabulary/vocabularies do we use in Koha?
01:09 dcook            Those linkages using predicates that are pre-agreed upon
01:08 dcook            Named graph for a Koha bib that is
01:08 dcook            Then... create linkages automatically with a separate Koha named graph
01:08 dcook            I use OAI-PMH to save the LIBRIS record to a named graph
01:08 dcook            Mmm, or... instead of koha:derivedFrom or whatever
01:07 dcook            Ouais, wahanui, ouais
01:07 wahanui          somebody said Complicated was far too mild a term to describe Search.pm.
01:07 dcook            Complicated
01:07 * dcook          shrugs
01:07 dcook            Not for updates
01:07 dcook            Of course, I think this is a bit where old world and new world clash... shouldn't need OAI-PMH in theory
01:07 dcook            We don't need a crawl to fetch them
01:06 dcook            Since my OAI-PMH harvester is pushing up-to-date records into the triplestore
01:06 dcook            So that we don't recrawl it...
01:06 dcook            Maybe... koha:oai-pmh
01:06 dcook            Something like that
01:06 dcook            or koha:derivedFrom
01:06 dcook            and then in Koha... just have something like owl:sameAs
01:06 dcook            I should use the URI from LIBRIS
01:05 dcook            But since Thursday I've been thinking...
01:05 dcook            I've been thinking about saving those triples under a named graph with the name coming from Koha
01:05 dcook            But it provides some problems..
01:05 dcook            Makes sense to me. Cataloguing centrally is more efficient than duplicating effort across a country.
01:04 dcook            LIBRIS being the national library's union catalogue
01:04 dcook            So with Stockholm University Library, they're using OAI-PMH to get RDF records from LIBRIS
01:04 dcook            But... I need to know how we're doing that a bit if I'm going to do that
01:04 dcook            If I recall correctly, I'm supposed to just get the RDF in to the triplestore
01:03 dcook            Arguably none of this is relevant for my work of course...
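
A sketch of what dcook describes above: each record harvested from LIBRIS goes into a named graph keyed by its own URI, and a small linking graph ties the local bib to it. koha:derivedFrom is the hypothetical predicate from the discussion, and the koha.example.org URIs are made up:

    from rdflib import Dataset, Namespace, URIRef
    from rdflib.namespace import OWL

    KOHA = Namespace("http://koha.example.org/vocab#")      # assumption
    ds = Dataset()

    libris = URIRef("https://libris.kb.se/r93fv6w306l65x4")
    remote = ds.graph(libris)                 # named graph = source URI
    # RDF/XML payload extracted from the OAI-PMH envelope
    remote.parse("libris-record.rdf", format="xml")

    bib = URIRef("http://koha.example.org/bib/649")         # assumption
    links = ds.graph(URIRef("http://koha.example.org/graphs/links"))
    links.add((bib, KOHA.derivedFrom, libris))
    links.add((bib, OWL.sameAs, libris))

Re-harvesting then just replaces the one named graph, which is what makes updates tractable.
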
01:02 dcook            hehe
01:02 kidclamp         shhhhhh
01:02 kidclamp         don't tell talljoy, sometimes i like the idea of marc records supplemented by linked data - use the work links to aggregate, but keep our marc there as the base for searching etc
01:00 dcook            Not sure how practical it is :/
00:59 dcook            I like the idea of linked data, but...
00:59 dcook            Or maybe it's gone for good
00:59 dcook            Maybe it's a case of a web app error and it will come back
00:59 dcook            If you're crawling and you get a 404... what do you do?
00:59 dcook            I mean..
00:59 dcook            Yeah I think about that as well
00:58 kidclamp         please stop linking here
00:58 kidclamp         need the RDF equivalent of a 404 page - We deleted that info as it was old and moldy
00:57 dcook            Even if we were manually updating the record on Dave... I don't think it would make sense to check at deletion time if anyone else is referring to that graph... as that would be a heavy processing job..
00:56 dcook            We have this cached Birmingham graph
00:56 dcook            Because let's say that Dave moves away to the US from the UK. No longer lives near Birmingham.
00:56 dcook            Checking every graph if it's referred to by another graph within the triplestore...
00:55 dcook            I suppose you could have a cleanup job...
00:55 dcook            Although it still seems like you could wind up with a lot of useless data over time
00:54 dcook            Makes sense to me... I don't know how else you'd reasonably manage them..
00:54 dcook            It's time to refresh the data, so we start crawling the links... and we save the results to their own named graphs
00:53 dcook            So in this example... let's say we have an authority record about "Dave Smith"
00:51 dcook            Wonder if I hadn't read this before or just forgot about it since November 2015..
00:48 dcook            "Multiple Named Graphs can be represented together in the form of a graph set." that looks useful..
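
The cleanup-job idea above as a sketch: list named graphs whose name no other graph ever references, i.e. cached entities like the Birmingham one that nothing points at any more. Assumes the SPARQLWrapper package and a local endpoint URL:

    from SPARQLWrapper import SPARQLWrapper, JSON

    sparql = SPARQLWrapper("http://localhost:3030/koha/sparql")   # assumed
    sparql.setQuery("""
        SELECT DISTINCT ?g WHERE {
          GRAPH ?g { ?s ?p ?o }
          FILTER NOT EXISTS {
            GRAPH ?other { ?x ?y ?g }    # some graph still refers to ?g
            FILTER (?other != ?g)
          }
        }
    """)
    sparql.setReturnFormat(JSON)
    for row in sparql.query().convert()["results"]["bindings"]:
        print("candidate for deletion:", row["g"]["value"])

As noted above, this is a heavy query, which is why it belongs in a periodic job rather than at deletion time.
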
00:47 wahanui          wizzyrea: I forgot yeah
00:47 wizzyrea         forget yeah
00:46 dcook            http://linkeddatabook.com/editions/1.0/#htoc81
00:45 dcook            And that is adding to my confusion..
00:45 dcook            I feel a bit like LIBRIS might not be using RDF "correctly"...
00:44 kidclamp         interesting looking though
00:43 dcook            Too bad the word semantic can be used in too many different ways :p
00:42 dcook            But I don't think it's what I'm really looking for..
00:42 dcook            I'm looking at this at the moment: http://stanbol.apache.org/overview.html
00:42 dcook            Too many browser tabs
00:42 dcook            What was I thinking of..
00:41 dcook            I haven't looked at Kibana though so I'm just rambling
00:41 dcook            Mind you, there's the whole "full text indexing is all you need" thing
00:41 dcook            I do wonder sometimes how sophisticated these searches are though
00:41 dcook            :D
00:40 kidclamp         you don't have to build a search engine :-)
00:40 kidclamp         es + kibana is most common, because it works out of the box with no config basically
00:40 wahanui          i heard yeah, es was back working. Refactoring it to work with authorities at the moment
00:40 kidclamp         yeah, es
00:40 kidclamp         I like the 'percolation' feature - using a search of existing documents to classify a new document
00:40 dcook            I met someone from ElasticSearch a couple years ago and meant to stay in touch but just been too busy :/
00:40 dcook            Mmm I'd seen the logging thing. Is that with Kibana?
00:39 kidclamp         lots of statistics and logging stuff
00:37 dcook            I mostly see ads for it in conjunction with other modules though yeah
00:37 dcook            I admit I haven't kept up with ES too much
00:37 wahanui          hmmm... Yeah is there a good way to fix that to get the new one running?
00:37 dcook            Yeah?
00:37 kidclamp         like ElasticSearch, we are using it for searching - that is the least of the things people use it for now
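
The percolation feature kidclamp mentions above turns search inside out: you index the queries, then ask which stored queries match a new document. A sketch with the elasticsearch Python client; the index name and mapping are made up:

    from elasticsearch import Elasticsearch

    es = Elasticsearch("http://localhost:9200")
    es.indices.create(index="saved-searches", mappings={
        "properties": {
            "query": {"type": "percolator"},
            "title": {"type": "text"},
        }
    })
    # store a query as a document
    es.index(index="saved-searches", id="1",
             document={"query": {"match": {"title": "linked data"}}})
    es.indices.refresh(index="saved-searches")
    # classify a new document by the stored queries it matches
    hits = es.search(index="saved-searches", query={
        "percolate": {"field": "query",
                      "document": {"title": "new linked data guide"}}
    })
    print(hits["hits"]["hits"])
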
00:37 dcook            Seems like
00:37 dcook            Linked Data?
00:36 kidclamp         I think the more it is implemented the more different things will be done though
00:35 dcook            I mean... Zebra works but I don't think they quite knew how it worked at the start
00:34 dcook            I wouldn't really want to repeat the Zebra debacle
00:34 dcook            I feel like if I could just find one good production example... I'd have more confidence in the whole thing
00:33 dcook            Although I guess caches are supposed to be throwaway..
00:33 dcook            Especially since for updates, you need to delete your existing cache
00:33 dcook            Yeah, I'd like to see some real life examples of it though
00:33 kidclamp         caching and regular crawling seems to be the only theory
00:32 dcook            I think the keeping it up-to-date thing is what I struggle with most
00:32 dcook            Then keeping that up-to-date...
00:32 dcook            search/display... yeah
00:32 kidclamp         yeah, you need to index the terms you are linking to if you want to search at all
00:31 dcook            And potentially hardcode a lot of things... or make complex mappings
00:31 dcook            it still seems like you need to do some work locally
00:31 dcook            But other than that...
00:31 dcook            And we all know you're talking about London, England once we've followed the links
00:31 dcook            You can say "Oh, I'm talking about http://dbpedia.org/resource/London"
00:31 dcook            Like that OCLC example..
00:31 dcook            But that you can point to a thing and say "Yeah... that's the thing I'm talking about"
00:30 dcook            It doesn't seem like it's really meant to be interoperable per se...
00:30 dcook            Or I'm misunderstanding it all haha
00:30 dcook            Or maybe libraries are misunderstanding it all
00:29 dcook            kidclamp: I'm so in agreement there
00:29 dcook            So if you're using Lightroom or Darktable or whatever... you're using RDF O_O
00:29 kidclamp         it feels like too much was done without enough thought on how it would actually work
00:29 dcook            XMP sidecar files use RDF
00:29 dcook            https://en.wikipedia.org/wiki/Extensible_Metadata_Platform
00:29 dcook            Although little bit of trivia...
00:29 dcook            That's the thing I hate about RDF... it's so loose
00:26 dcook            Which makes it so much harder I think.. haha
00:26 dcook            Yeah, that's another thing I've been thinking about
00:26 dcook            Ah yeah I get you now
00:26 kidclamp         that^
00:25 dcook            Or like "Work", "Instance", "Person"
00:25 kidclamp         RDF type/class
00:25 dcook            Are you meaning like auth, bib, hold or a different type of "type"?
00:25 dcook            Hmm still not sure I follow
00:25 dcook            This person: http://www.meanboyfriend.com/overdue_ideas/about/
00:25 kidclamp         basically if they had a bunch of things, each type of thing was a table
00:24 dcook            It was suggested to me by..
00:24 kidclamp         any node that was a 'type' I think I mean class?
00:24 dcook            kidclamp: Have I shown you this link? http://linkeddatabook.com/editions/1.0/#htoc84
00:24 dcook            But I suppose if we use enough layers of abstraction, we shouldn't have to worry about that
00:24 dcook            I suppose I'm OK with using a relational database for a triplestore... unless it introduces peculiarities that make it impossible to move to a native triplestore
00:23 * kidclamp       gets very bad at words when thinking/talking RDF
00:23 dcook            What do you mean by "any type was table"?
00:23 dcook            any type?
00:23 kidclamp         it is definitely difficult
00:23 dcook            Nor should there be I suppose... but :S
00:23 kidclamp         I was at code4lib and talked to the Boston Public Library guys who are doing a lot of work, they implemented everything via SQL - any type was table
00:22 dcook            There's no "Delete: Cascade"
00:22 dcook            But then what about all the triples that are linked from that URI..
00:22 dcook            Like... sure a URI can replace an ID in a relational database
00:22 dcook            I find all the interconnections a bit mind boggling
00:22 dcook            Although maybe it's not so different from relational databases..
00:21 dcook            I still find updates so... troubling
00:21 dcook            It might make it difficult to create an API for the backend
00:21 dcook            Although I figure until we know what we want to do with the backend...
00:21 dcook            Yeah, I want it easy to switch backends as well
00:21 dcook            hehe
00:20 kidclamp         :-)
00:20 kidclamp         2 - we don't have to resolve our urls, it just seems way cooler to do so :-0)
00:20 kidclamp         two things quick 1- my big concern is just that we make it easy to switch backends
00:19 kidclamp         heh, as do we all likely
00:19 dcook            I need to keep researching I think
00:19 kidclamp         I need to reply to your post(s) by the way
00:19 kidclamp         hah, that is a problem
00:19 dcook            The post is from November 2015, so I guess it's not super recent..
00:18 dcook            And the two records I've linked to are already dead links :p
00:18 dcook            kidclamp: Looking at an old blog post I made about linked data...
00:12 aleisha_         good thanks!
00:08 * dcook          waves to folk
00:05 kidclamp         pretty good, how's about you?
00:05 aleisha_         hows it going kidclamp
00:04 wizzyrea         hi aleisha
00:04 kidclamp         hi aleisha_
00:04 aleisha_         hi all