Ron Burns from EBSCO was up next to talk about what EBSCO is doing with their Open Source library platform.
Ron started off with a few facts about EBSCO:
- EBSCO is not an ILS vendor
- 60+ ILS partnerships
- Supports Koha and was a board member of OLE
Then he talked about the current proprietary Library Service Platforms, likening them to an old set of antique drawers. He also touched on how they are monoliths, and how the bundling of discovery leads to lock-in. When you factor in the consolidation, there is a real lack of choice for libraries.
The new FOSS LSP should be modular and modern. Ron reminded us that open source drives innovation, so of course they should build with open source.
Folio (the new LSP) is built with a focus on
- Open Source
They decided to choose the Apache 2 License, with copyright transferred to the Open Library Foundation. The community at this point comprises Kuali OLE, some libraries, Index Data, industry partners, and EBSCO.
The architecture will look something like:
- System Layer
- API layer
- UI Toolkit
He did a really good job of giving a high-level view of what it might look like. If you look up the presentation online you will get a much better idea than I can impart.
Where does Koha fit? The Koha community can decide that.
- Development: now
- Mid 2016: code on GitHub
- 2017: first base platform release, and apps released
- 2018: marketplace
Paul Poulain was up next to talk about Koha and Coral and why/how we should link them.
To handle electronic subscriptions in Koha we have two options: we can either add a module to Koha, or we can find an existing FOSS project and integrate it. Coral is an existing FOSS project, so integrating is a good choice.
There are 4 modules in Coral
- Define licenses
- Enter licenses
- Import/export licenses
- Define resources: this is where you define your resources, linking them to organisations and licenses
Because Koha and Coral are both web applications, there are a few ways we can link them together. They are aiming for the middle of 2017 for the integration to be ready.
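The simplest way two web applications can be linked is URL-based deep linking. As a rough sketch (not the approach the presenters committed to), Koha's OPAC does expose each record at `/cgi-bin/koha/opac-detail.pl?biblionumber=N`, so a Coral resource page could carry a direct link to the matching Koha record; the base URL and biblionumber here are made up for illustration:

```python
from urllib.parse import urlencode

def koha_opac_detail_url(opac_base, biblionumber):
    """Build a deep link to a Koha OPAC record detail page.

    Koha's public catalogue serves records at
    /cgi-bin/koha/opac-detail.pl?biblionumber=N, so an external
    system (such as a Coral resource page) can link straight to
    the matching bibliographic record.
    """
    query = urlencode({"biblionumber": biblionumber})
    return f"{opac_base}/cgi-bin/koha/opac-detail.pl?{query}"

print(koha_opac_detail_url("https://opac.example.org", 1234))
# https://opac.example.org/cgi-bin/koha/opac-detail.pl?biblionumber=1234
```

Deeper integration (shared authentication, pushing license data into Koha's acquisitions) would need more than a hyperlink, which is presumably why they gave themselves until mid 2017.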
Mengü Yazıcıoğlu spoke next about what we are pretty sure is the biggest Koha installation, a cluster serving 1132 public libraries. (They are growing so fast it has moved from 1126 to 1132 libraries in the last 2 weeks.)
They have been live on Koha for 2 years; it is run as a project of the Ministry of Culture and Heritage.
In 2012 MCH signed with Near East University in Cyprus to migrate to Koha 2.2.x, with 900,000 users and 9,000,000 items to migrate. There were lots of problems with dirty data; centralising a decentralised model meant lots of duplicates. They also had problems with infrastructure and a lack of training. But they worked through this and ended up with a system that worked well.
They made quite a few customisations and scripts to deduplicate biblios; the Ministry of Culture and Heritage estimates they saved 5,000,000 euros on the deduplication alone by not having to do it manually.
In December 2015 they upgraded to Koha 3.20.x.
You can return your item anywhere in Turkey, so they make heavy use of the Koha transfers module, which tracks where any item is.
The infrastructure to run this is a lot less than most people would think, I am sure. Here is their infrastructure:
This was a fascinating and amazing presentation. Koha started in Levin, Foxton, Shannon and Tokomaru; now it is in 1132 cities and towns in Turkey. Mind-blowing.
Sonia Bouis stepped in to fill a vacant presentation slot with a demo of the Koha sandboxes and showing how to sign off bugs.
She started with a demo of our bugzilla at bugs.koha-community.org and showed how to pick a bug that is testable with the sandboxes.
Once you have found a bug in bugzilla, you can go to the sandboxes to test it. Some sandboxes are MARC21 and some are UNIMARC, so be careful to choose the appropriate one for you.
Joy Nelson and Jesse Weaver did a great talk about Linked Data.
Joy started off explaining what linked data and the semantic web are, which she did about the best of anyone I have seen do it. Did you know there is a Tutu ontology? I didn’t either.
She covered triples and gave some really good examples that showed how they are actually used. She covered why we should use linked data:
- Take advantage of standards
- Make library data discoverable on the web
- We need to be where the users are
- De-silo our data
- MARC replacement
She also covered the challenges:
- Transforming the data
- Choice of URI
- Long transition
- Retraining in terminology
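For anyone who missed the talk, the core idea is small: a triple is a (subject, predicate, object) statement, and linking comes from using shared URIs as identifiers. This sketch is my own illustration, not one of Joy's examples; the book URI is made up, while the predicates (Dublin Core, FOAF) and the VIAF identifier for Jane Austen are real vocabularies and identifiers:

```python
# A tiny in-memory graph of triples. Full URIs keep the statements
# unambiguous, which is what lets independent datasets interlink.
triples = [
    ("http://example.org/book/1", "http://purl.org/dc/terms/title", "Pride and Prejudice"),
    ("http://example.org/book/1", "http://purl.org/dc/terms/creator", "http://viaf.org/viaf/102333412"),
    ("http://viaf.org/viaf/102333412", "http://xmlns.com/foaf/0.1/name", "Jane Austen"),
]

def objects(graph, subject, predicate):
    """Return every object asserted for a subject/predicate pair."""
    return [o for s, p, o in graph if s == subject and p == predicate]

# Follow the creator link from the book to the author's name.
author_uri = objects(triples, "http://example.org/book/1",
                     "http://purl.org/dc/terms/creator")[0]
print(objects(triples, author_uri, "http://xmlns.com/foaf/0.1/name")[0])
# Jane Austen
```

Because the creator is a VIAF URI rather than a text string, any other dataset using the same URI is automatically talking about the same person; that is the de-siloing benefit in miniature.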
BIBFRAME is designed specifically for the GLAM sector (Galleries, Libraries, Archives and Museums).
Jesse then took over to show the technical side of linked data. He started on what we can already do, with connecting our existing data to external data sources like VIAF and DBpedia.
Jesse was brave enough to do a live demo showing how we could pull in author information into a Koha details page.
It generated a lot of buzz and some good questions.
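To give a feel for what a demo like that involves (this is my own sketch, not Jesse's code): DBpedia publishes machine-readable data for each resource at `http://dbpedia.org/data/<Name>.json`, and the work on the Koha side is mostly picking the right values out of that response. The sample below is a hand-trimmed stand-in for a real DBpedia response, which is far larger:

```python
import json

# A hand-trimmed stand-in for the JSON DBpedia serves at
# http://dbpedia.org/data/Jane_Austen.json; only the abstract
# property is kept, in two languages.
sample = json.loads("""
{
  "http://dbpedia.org/resource/Jane_Austen": {
    "http://dbpedia.org/ontology/abstract": [
      {"lang": "de", "value": "Jane Austen war eine britische Schriftstellerin."},
      {"lang": "en", "value": "Jane Austen was an English novelist."}
    ]
  }
}
""")

def abstract_for(data, resource_uri, lang="en"):
    """Pick the abstract in the requested language from a DBpedia record."""
    entries = data.get(resource_uri, {}).get(
        "http://dbpedia.org/ontology/abstract", [])
    for entry in entries:
        if entry.get("lang") == lang:
            return entry["value"]
    return None

print(abstract_for(sample, "http://dbpedia.org/resource/Jane_Austen"))
# Jane Austen was an English novelist.
```

A live version would fetch the JSON over HTTP and render the result on the details page; the parsing step is the same either way.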
The last presentation of the day was by Ann-Marie Breaux from YBP (a division of EBSCO) talking about YBP and GOBI.
YBP is a bookseller and sells mostly English-language books, both print and digital, to libraries all around the world. In 2015 they were purchased by EBSCO, but still focus on ebooks (not journals or discovery).
GOBI is their online system, it is how librarians, teaching faculty or anyone who makes purchasing decisions interact with YBP.
GOBI and Koha integration.
There are 2 ways of doing the integration:
- Library places an order in GOBI, and the record is loaded into Koha
- Place orders in Koha and push to GOBI
- GobiAPI: early 2017
Finishing this session, Galetsi Panagiota talked about Koha in their library.
It is a museum library, at the Folklife and Ethnological Museum of Macedonia and Thrace. They have permanent and temporary exhibitions (I want to visit this museum now). It is a small library with 8000 items; it is open to the public and has a reading room, but does not permit regular lending. The library is essential for researchers both within and without the museum.
They really like being able to store many photos, which makes it easier for researchers to be sure they have found the right item before they visit the library.
Karakitsiou Chrisoula then took over to talk more about her role and impressions of Koha in FEMM-TH. It was a really interesting project: they selected 3080 items, photographed all of them (front cover, back cover and table of contents page), and added the images to the catalogue.
Sher Afzal Khan from Bahria University in Pakistan was up next to talk about a Koha installation in Pakistan. He started by introducing us to Pakistan, before moving on to talk about Koha in Pakistan.
FOSS adoption is still low in Pakistan, but Koha is becoming very popular in libraries due to a live DVD with DSpace and Koha on it. There has also been a lot of work done in workshops and training, which has increased librarians' confidence.
Some of the Koha installations in Pakistan are:
- University of Malakand
- Bahria University Libraries
- Garrison Cadet College Koha
- Higher Education Commission
- The Islamia College University Peshawar
- Pakistan Institute for Parliamentary Services
- Sindh Madressatul Islam University
- Quaid-e-Awam University of Engineering
- Lawrence College Murree
- Lahore High Court
- Agriculture University Peshawar
He went on to explain how they integrated Koha and RFID.
It was really great to hear about how much work is being done in Pakistan.
After a fantastic lunch, next up was Despoina Gkogkou talking about the SELIDA framework, which is designed to provide a better audit trail / traceability of the items.
The library is based at the University of Patras, which is the 3rd largest in Greece, with 35,000 users and eight branches. They have been using Koha since May 2016, and they have translated the staff interface into Greek. They have had to make quite a few changes to get Koha working for them.
Up next was Filippos Kolovos from the University of Macedonia talking about their Koha journey.
They shifted from Horizon 7.3.2, which, although useful, was getting old and obsolete. They needed new features and wanted to follow the library's Open Source policy, being able to customise the software as needed.
Reasons to move
- Funding issues
- Koha is as good as many other systems.
They had 3 partners involved in the migration: EliDOC, BibLibre and Aristotle University. They migrated 11,105 users, 97,795 biblio records and 102,157 authority records.
They bumped into some issues
- Authority values
- New duplicated authorities
- Different name titles linked to the wrong biblio
- Weird renewal issue that I have to investigate after this talk
- The log grew too fast – cataloguing logging was on and a massive batch process was run each night
- Every major migration has different challenges
- Both for the software and the staff
- Sometimes you hit unexpected issues also