Changelog for Koha 3.2.0

This isn’t the authoritative list (Galen, the release manager, has a better idea than I do), but this is what I think is going to be in the 3.2.0 release.

This was a question I was asked a few times, as the marketing coming out of Liblime was confusing people about what is, and what is not, in Koha.

So here’s what I know of:

  • Offline Circulation
  • Callnumber splitting enhancements
  • Staff search results page enhancements
  • Email checkout slips
  • Hold request targeting
  • Overdue report enhancements
  • Default settings for patron notifications
  • Support for Syndetics content
  • Holds on multiple items
  • Tag multiple items in the cart
  • Support for different OPAC interfaces, by URL, so different libraries/branches can have different looks
  • OPAC display for UNIMARC using XSLT
  • Can now disable Amazon reviews, without disabling Amazon book covers
  • Many enhancements to patron messaging
  • Huge piles of acquisition enhancements, too many to list here
  • OPAC suggestions (checks the biblio doesn’t already exist)
  • OPAC suggestions allow hold to be placed if item is ordered
  • Search stemming support for other languages
  • Bulk changes to items
  • Moving items between MARC records
  • Lots of holds improvements
  • Allow borrowers to manage their privacy settings
  • Big improvements to system preferences
  • Labels rewrite
  • RFID support code
  • Patron group editing

This is just what I know of, and doesn’t include any partial features, so there is likely a lot more.

    LIANZA Conference 2009

    Well, I’ve now had a day to recover and collect my thoughts, so I thought it would be time to do a write-up on the LIANZA conference that I was fortunate to be able to attend.

    I arrived in Christchurch on Sunday to a beautiful spring day, which flushed the annoyance of the delayed flights (Air NZ’s network was down) right out of my mind. I met up with Jo and we shared a taxi to the hotel we were staying at. We then registered for the conference, and while Jo had a meeting I made use of the free wifi at the conference centre. Following an early dinner we spent the evening finishing off the presentation and deciding who would say what.

    Day 1

    The day started with a powhiri, and then Hana and Tipene O’Regan spoke about intergenerational knowledge transmission and the endangered state of Kai Tahu reo. It was a great talk, and the byplay between the two of them was something special. Richard Stallman was up next, talking about copyright vs community; he made some very salient points, and while I don’t think everyone got it, a lot of people did. It was something I think the library world needed to hear and to keep in mind whenever they are making purchasing decisions.

    Keeping with the theme of sharing and empowering people through knowledge, I went to hear Terehia Biddle from Archives NZ talk about the role they play, and about engaging successfully with Maori. She also made a lot of great points, two that I remember well were:

    • breakdowns in relationships are caused by egos getting in the way, and by people being unable to admit they are wrong
    • competency in te reo me tikanga adds to your credibility

    Following lunch, I went to listen to Brenda Chawner talk about online identities, and how people are using them for personal and professional purposes. Nathan Guy was supposed to be speaking next, but he couldn’t make it, and since there is only one thing worse than a politician reading their speech, and that is another politician reading their speech for them, I ducked out and did some Koha work instead.

    To finish the day off, Brenda, Jo, Richard Stallman and I went to a Bangladeshi restaurant for dinner. The topics of conversation included:

    • Koha
    • Software as a Service
    • Kea (and other NZ birds)
    • The use of passive tense in te reo Maori

    All in all, some of the best food and conversation I have had in a while.

    Day 2

    I missed the keynote on Tuesday; everyone said Claudia Lux spoke well, so I will have to read others’ reports on her talk. I was busy panicking slightly about Jo and me presenting at 10.

    From what I can tell, the presentation went well and was well received. I felt it was rushed, but we managed to communicate the points we were trying to get across. I spent a good chunk of the rest of the day with Jo doing demos, so I can only take that as a good sign.

    After morning tea and some quick Koha demos, I went to listen to Denis Dutton talk about “Evolution and Aesthetics”, and no disrespect to the other speakers or topics, but this was definitely my highlight of the conference. I had decided to buy his book ‘The Art Instinct’ within about 5 minutes of him starting, and by the end I wanted to race out, buy it right then and read it right away. I won’t try to report what he was talking about, as I won’t do it justice, but do read the book if you ever get the chance.

    The rest of the day was spent talking about and demoing Koha. To finish the day off, it was the informal conference dinner, with drinking at a variety of bars and restaurants in SOL Square (a recently redeveloped couple of lanes). It was a lot of fun: good people, good music, good food and good beer. Ticked all the boxes. And between dancing and drinking I still ended up talking about Koha, which I take as a really good sign that there is a lot of interest in it.

    Day 3

    The final day of the conference started with Penny Carnaby (the National Librarian) talking about citizen-created content, and the importance of not losing it. Both Koha and Kete got a shout-out in her speech. Following her was Jessica Dorr from the Bill and Melinda Gates Foundation. Her heart was definitely in the right place, and there is no doubt the Gates Foundation does a lot of good around the world. The cynic in me sees it as training the next generation of Microsoft-dependent users though.

    I didn’t get to any of the next sessions before lunch, as I ended up doing another Koha demo and talking to others about it.

    To finish the conference was Tim Spalding of LibraryThing fame. He covered a lot in his talk and made the point that if libraries want to get to Web 1.0, let alone Web 2.0, they need to go with free and open source software.

    So while the conference is still dominated by proprietary and locked-down software, I think this conference was a big step up from previous LIANZA conferences. I didn’t have to explain Free Software to people before I could start talking about Koha; they already knew what it was (even if they called it open source ;)). It was a really good conference, well organised and well executed, but of course what made it so great was all the people who attended.

    Koha Translation Stats

    We are getting close to two Koha releases, 3.0.4 and 3.2.0, and the translators have been working hard. Here are some statistics.

    Username    Language code    Strings translated
    Agnes hu 57
    AvaZabihi fa 5
    GOB ca 1402
    Hans nl 399
    Jakob sv 62
    Kaz tet 4886
    LordJABA pl 2
    Matija hr 3
    Najmun ben 2
    Rafael pt 18
    Ramakant mr 1
    RobertH de 22
    Selmas1an tr 2520
    Shtriter ru 1
    abdullatef ar 5
    akhundof az 1242
    albertojose pt 3
    alen hr 332
    ashraf ar 13
    asir tr 1152
    ataurrehman ur_AR 14
    atomus pl 100
    axelb nb 67
    beda de 843
    bmacan hr 39
    bonanome it 10
    bspt ca 57
    celle tl 388
    cettox tr 36
    chrisc en_NZ 1376
    chrisc mi_NZ 28
    cybermon khk 61
    dgrau ca 12
    diego gl 2
    ebegin fr_CA 267
    ebegin fr 9
    evan sk 2
    francofiorello it 8
    gkatsa fr 1
    gkatsa el 5960
    gulelat am 3
    hellen de 32
    henrikpedersen da 5
    hilongo es 99
    honyk cs 4
    ikranjec hr 572
    indradg ben 54
    jaroop pl 2
    jverissimo pt 79
    katrin de 3073
    kiolalis el 1
    kmkale mr 358
    kosmas el 7
    krishmp hi 2
    kristina hr 57
    laurenthdl fr 1
    ldiaz ca 1
    legendario pt 1
    legendario pt_BR 207
    ljhelbo da 46
    mao zh_TW 71
    marian sk 1
    mars pl 7
    mglavica hr 1696
    mmacht de 625
    moguro ja 78
    nalon fr 224
    neelawat th 1251
    nicomo fr 123
    nkawase ja 1
    nmarkop el 3
    pabloab es 4
    petras cs 536
    pycio pl 10
    rbuj ca 7
    rea1 el 6
    ricardo pt 723
    ropuch pl 538
    russel en_NZ 1289
    savitras hi 64
    savitras prs 64
    shermira es 23
    somchai th 210
    stefanos el 116
    tajoli it 28
    thawatchai th 2
    theogielen nl 73
    tjakopec hr 16
    wasimbhalli ur_AR 9

    Reflections on the fork, a week later

    When I was thinking back on this, I found myself wondering why I was so angry, disappointed and sad about it; after all, forks come and go in the FLOSS world. Often they wither and die, or are merged back into the main development line. Sometimes there is enough momentum behind them that they continue on, like the BSDs have done.

    Forks can happen for a philosophical reason, like Gnote and Tomboy. Forks can happen due to the main trunk stalling, or being unwilling to move development in a direction that people want. Forks can happen purely due to personality conflicts.

    So why did the Liblime fork cause so much of a stir? And it is a fork, there can be no argument about that: a separate development line equals a fork. The only argument is whether it will be a long-lasting or short-lasting fork. So no, it wasn’t the fact that it was a fork; I had been resigned to that for a while, ever since it became obvious months ago that there were significant amounts of work not being committed upstream. No, it was the ‘spin’ around the fork that was the most concerning.

    All sorts of reasons have been given as to why Liblime ‘had’ to fork:

    • They don’t have the time or resources to send patches upstream. Or, in another version, recent staff resignations mean they don’t have the resources.
    • The community’s code is so bad they have to maintain their own version.
    • They aren’t withholding code, or even if they are, it’s only for a month or two (which still makes it a fork for a month or two).
    • And lately, there is customer data bound in with their code so they can’t make it publicly available.

    I’m not going to rebut each of these excuses; people have already done so, and suffice it to say reality doesn’t support them. But it is distressing that what appears to me to be the real reason for maintaining their own repository and version has not been said.

    Given that the technical reasons for not releasing patches upstream are demonstrably false, the only reason left is to deprive other Koha users and developers of code, to gain some kind of competitive advantage in the marketplace. This is a valid business strategy, not one I would take, or that I think will succeed, but valid nonetheless. So I don’t really take huge issue with this. It is just that you should not make excuses, and in the process cast aspersions on a huge range of people, so that you don’t have to admit the real reason you are doing something. In the attempt to justify the fork, Liblime and their supporters have maligned a huge group of people who do not deserve it.

    The Koha community retains its Documentation Manager

    It has been a week of big news in the Koha world. Some bad, like Liblime’s fork of Koha finally becoming public knowledge, and some good, like Nicole staying on as Documentation Manager.

    What makes this news even better is the fact that two of the Koha support vendors teamed up to ensure Nicole could stay in the community.

    In an environment where some people are actively trying to paint the community as a bad thing, it’s great to see that the people who understand Free Software are prepared to walk the walk, not just talk the talk.

    Koha unsung heroes – Part 15

    The #koha IRC channel

    On irc.katipo.co.nz we have a #koha IRC channel (and have had since 2000). There have been literally thousands of times that someone has been helped there. Here are some of my favourites:

    • Thd helps audrey with understanding MARC21
    • si teaches kados about ssh-keychain
    • I help kados out with HTML::Template
    • We help 2 people with Koha installs, and then discussion turns to cricket and rugby
    • Lots of talk about cricket
    • Even more talk about rugby

    (I’ve only done the early years, and I leave it as an exercise for the reader to find other gems and link them in the comments)

    DBIx::Class and Koha

    I did some work today with DBIx::Class and Koha, and got opac-account.pl partially using it.

    You can see the schema here, and I had to make a few changes to C4::Context and a few to opac-account.pl.
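
    For anyone who hasn’t seen DBIx::Class before, the schema is just a set of Perl classes describing the database tables; they can be generated with DBIx::Class::Schema::Loader or written by hand. The following is only a trimmed-down sketch to show the shape of things (the file layout and the handful of borrowers columns shown are my own illustration, not the actual generated code):

    # Koha/Schema.pm -- the top-level schema class (sketch)
    package Koha::Schema;

    use strict;
    use warnings;
    use base 'DBIx::Class::Schema';

    # Load every result class found under Koha::Schema::Result::*
    __PACKAGE__->load_namespaces();

    1;

    # Koha/Schema/Result/Borrowers.pm -- one table class (sketch, columns trimmed)
    package Koha::Schema::Result::Borrowers;

    use strict;
    use warnings;
    use base 'DBIx::Class';

    __PACKAGE__->load_components('Core');
    __PACKAGE__->table('borrowers');
    __PACKAGE__->add_columns(
        borrowernumber => { data_type => 'integer', is_auto_increment => 1 },
        cardnumber     => { data_type => 'varchar', size => 16, is_nullable => 1 },
        surname        => { data_type => 'mediumtext' },
        firstname      => { data_type => 'text', is_nullable => 1 },
    );
    __PACKAGE__->set_primary_key('borrowernumber');

    1;

    The HashRefInflator result class you’ll see in the opac-account.pl diff below simply turns the row objects back into plain hashrefs, so the existing template code doesn’t notice the difference.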

    Here are the changes to C4::Context and opac-account.pl:

    diff --cc C4/Context.pm
    index 7ba57fb,4dab9a9..cafc0ca
    --- a/C4/Context.pm
    +++ b/C4/Context.pm
    @@@ -18,6 -18,6 +18,8 @@@ package C4::Context
    use strict;
    use warnings;
    ++use Koha::Schema;
    ++
    use vars qw($VERSION $AUTOLOAD $context @context_stack);
    BEGIN {
    @@@ -191,6 -191,6 +193,27 @@@ $context = undef;        # Initially, n
    =cut
    ++sub schema {
    ++    my $self = shift;
    ++    my $db_driver;
    ++    if ($context->config("db_scheme")){
    ++        $db_driver=db_scheme2dbi($context->config("db_scheme"));
    ++    }
    ++    else {
    ++        $db_driver="mysql";
    ++    }
    ++
    ++    my $db_name   = $context->config("database");
    ++    my $db_host   = $context->config("hostname");
    ++    my $db_port   = $context->config("port") || '';
    ++    my $db_user   = $context->config("user");
    ++    my $db_passwd = $context->config("pass");
    ++    my $schema = Koha::Schema->connect( "DBI:$db_driver:dbname=$db_name;host=$db_host;port=$db_port",
    ++      $db_user, $db_passwd);
    ++    return $schema;
    ++}
    ++
    ++
    sub KOHAVERSION {
    my $cgidir = C4::Context->intranetdir;
    diff --cc opac/opac-account.pl
    index 43a1e0c,43a1e0c..376ae98
    --- a/opac/opac-account.pl
    +++ b/opac/opac-account.pl
    @@@ -17,16 -17,16 +17,20 @@@
    # wrriten 15/10/2002 by finlay@katipo.oc.nz
    # script to display borrowers account details in the opac
    ++# Edited by chrisc@catalyst.net.nz
    use strict;
    use CGI;
    use C4::Members;
    ++use C4::Context;
    use C4::Circulation;
    use C4::Auth;
    use C4::Output;
    use C4::Dates qw/format_date/;
    ++use DBIx::Class::ResultClass::HashRefInflator;
    use warnings;
    ++
    my $query = new CGI;
    my ( $template, $borrowernumber, $cookie ) = get_template_and_user(
    {
    @@@ -40,9 -40,9 +44,16 @@@
    );
    # get borrower information ....
    --my $borr = GetMemberDetails( $borrowernumber );
    ++# my $borr = GetMemberDetails( $borrowernumber );
    ++my $context = C4::Context->new;
    ++my $schema = $context->schema;
    ++my $rs = $schema->resultset('Borrowers')->search({ borrowernumber => $borrowernumber });
    ++$rs->result_class('DBIx::Class::ResultClass::HashRefInflator');
    ++my $borr = $rs->first;
    ++use Data::Dumper;
    ++warn Dumper $borr;
    my @bordat;
    --$bordat[0] = $borr;
    ++push @bordat,$borr;
    $template->param( BORROWER_INFO => @bordat );

    So, not many changes at all. I’ll work on converting some more, but it looks like adding DBIx::Class can be done in a gradual way.
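
    To make that a bit more concrete, here is a rough sketch (not committed code; the borrowernumber is made up, and it assumes the schema method from the C4::Context diff above) of how another script could mix the new DBIx::Class access with the existing DBI handle during a gradual conversion:

    #!/usr/bin/perl
    # Sketch only: the same pattern as opac-account.pl, applied elsewhere.
    use strict;
    use warnings;

    use C4::Context;
    use DBIx::Class::ResultClass::HashRefInflator;

    my $context = C4::Context->new;
    my $schema  = $context->schema;

    # Fetch one borrower as a plain hashref, so code expecting
    # GetMemberDetails-style data keeps working unchanged.
    my $borrowernumber = 42;    # made-up value for the example
    my $rs = $schema->resultset('Borrowers')
                    ->search( { borrowernumber => $borrowernumber } );
    $rs->result_class('DBIx::Class::ResultClass::HashRefInflator');

    if ( my $borrower = $rs->first ) {
        print "Found $borrower->{firstname} $borrower->{surname}\n";
    }

    # Unconverted code can keep using the plain DBI handle alongside it.
    my $dbh = C4::Context->dbh;

    The nice thing about doing it this way is that scripts can be switched over one at a time, with the DBI and DBIx::Class code sharing the same connection details from the Koha config.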