All posts by Roger Hiles

I head the Collections and Processing Services unit at the Pollak Library at California State University, Fullerton. Interests include Library Automation Systems, Open Source software, Thin Clients, and the One Laptop Per Child project.

Wikis: when are they the right answer?

Jason Griffey of the University of Tennessee at Chattanooga presented a brief, bright and breezy look at the basics of wikis and their use in libraries to an attentive group of about 60 attendees at the end of day two of the 2006 LITA National Forum.

The basic appeal of the wiki is that it is a modern-day example of “many hands make light work.”

Wikis are designed to allow contributors to add to and revise the information on the site, so that the shape and scope do not have to be predetermined. In fact, wikis are a good choice when the shape and scope cannot be predetermined, and they can grow organically as new facts are added. They are good for dealing with “fringe” items.

Potential problems with wikis stem from the lack of control. Duplication, lack of cross references, and eventual entropy can make mature wikis less useful.

Wikis vs. Librarians

Wikis are foreign to the systems librarians are used to: they lack a preexisting classification scheme, they are always in the process of being organized, and absolute control is not necessary (in fact, it is usually impossible). This lack of control can be a problem for librarians.

Other “Web 2.0” technologies (folksonomies, tagging, del.icio.us) are similarly unstructured, and allow for “emergent order.”

Wiki options

There are three different types of wiki: server-based, remotely hosted, or installed locally (on a desktop, not available to the world).

Leading wikis

MediaWiki – free, open source, based on PHP/MySQL, and by far the most popular wiki. It has many extensions (e.g., maps, citations, MP3 support).
Example: jasongriffey.net/wiki
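
MediaWiki also exposes a web API (api.php) that programs can use to watch or extend a wiki. As a minimal sketch, assuming a hypothetical wiki address, here is how one might list a wiki’s most recent edits with Python; the Action API parameters shown are standard MediaWiki:

```python
import json
import urllib.parse
import urllib.request

# Hypothetical wiki address; any MediaWiki install exposes the same Action API at /api.php.
API_URL = "https://wiki.example.org/api.php"

params = urllib.parse.urlencode({
    "action": "query",
    "list": "recentchanges",           # the wiki's recent-edits feed
    "rcprop": "title|user|timestamp",  # which fields to return for each edit
    "rclimit": 10,
    "format": "json",
})

with urllib.request.urlopen(f"{API_URL}?{params}") as resp:
    data = json.load(resp)

# One line per edit: when, who, and what page changed.
for change in data["query"]["recentchanges"]:
    print(change["timestamp"], change["user"], change["title"])
```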

PBwiki – remotely hosted, fast, and easy. Potential downside: the hosted site could someday carry advertising.

TiddlyWiki – runs locally on a PC, with no web server. It can be used to create searchable, free-form notes.


Great library wikis:

Ohio University Libraries Biz Wiki – like a research portal, it is a controlled wiki: only the creator and faculty can edit, not students.


Library Success wiki – best practices for libraries


ALA Midwinter 2007 wiki

LITA wiki – very new

Lawrence Lessig’s collaborative book project online – soliciting help in revising a book he doesn’t have time to revise on his own.

When is a wiki appropriate?

When the scope is big, but not too big (i.e., when the project still has some focus).

When distributed creation is needed.

When you aren’t completely sure where you’re going with the project (i.e. when you are asking a fuzzy question).

Wiki best practices

Trust your users

Monitor the information

Seed your needs (make sure to include some content to invite input: a blank page is intimidating)

The Spin on Thin: Thin Clients in Academic Libraries

On the final day of the 2006 LITA National Forum, Helene Gold, electronic services librarian at Eckerd College (St. Petersburg, Florida), described how thin clients are being integrated into the computing environment in the college’s new library. The 25 people who braved Forum fatigue to attend were not disappointed by Helene’s engaging and accessible presentation.

When Eckerd College decided to build a new library, it also decided to house the campus ITS department in the new building. The ITS department in turn decided to use the opportunity to install thin clients in the new facility to showcase the technology. This had both positive and negative ramifications: while the ITS staff was committed to making the project work, “buy-in” from the library staff came more slowly.

Thin clients

Thin clients are relatively simple and durable devices that have little or no storage or computing power of their own, but which can be used to communicate with applications running on servers. While old PCs can be reused as thin clients (by removing or deactivating their hard drives), Eckerd decided to use “Sun Ray” thin clients from Sun Microsystems in its new library.

Eckerd’s ITS elected not to use a Citrix or Microsoft terminal server that could have presented students with the MS Windows user interface, and instead chose the GNOME user interface (most commonly used with Linux) supported by their Sun terminal servers. They made this choice both to lower costs and simplify systems management, and to make the thin clients stand out from the MS Windows PCs used in the library.

The Survey

When she heard about the project, Helene conducted a SurveyMonkey survey to discover library experiences with thin clients, and found that although few respondents were using thin clients, most of those using them were satisfied.

Negative comments elicited by the survey included software incompatibility, lack of CD-ROM support, frequent freeze-ups, network bandwidth too limited to make thin clients feasible, and finally, students’ unfamiliarity with the devices, which meant they needed more support than PC users. Notably, all the respondents to the survey were using the MS Windows interface (via Citrix or Microsoft terminal servers).

Positive comments included the compact, space-saving nature of the devices; energy savings (thin clients use one-sixth the electricity of PCs); centralized upgrades and management; longevity (they can last 7-10 years, compared to PC lifetimes of 3-5 years); and security (students log in securely to the network, server security is far better than that of Windows PCs, and there are no virus or worm threats). The devices also have virtually no value to thieves, since they only work in thin client environments. Finally, there are no games, no student-initiated downloads, and no chat.

From the Trenches

In her last segment, Helene shared insights from their day-to-day experiences with the devices:

Printing has proven to be a problem. The system’s Sun print server did not play well with the campus’ Novell iPrint system, and after much work the ITS staff eventually removed it and went with CUPS (the Common Unix Printing System) instead.
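
CUPS can also be driven from scripts, which makes queue checks easy to automate. A minimal sketch, assuming the pycups Python bindings are installed and using a purely illustrative queue name:

```python
import cups  # pycups bindings to the local CUPS server (an assumed dependency)

conn = cups.Connection()

# List the queues CUPS knows about, e.g. to verify a migration like Eckerd's.
for name, attrs in conn.getPrinters().items():
    print(name, "-", attrs.get("printer-info", ""))

# Send a test job to a hypothetical queue; the file path is a test page
# shipped with many CUPS installations.
conn.printFile("library-lab-1", "/usr/share/cups/data/testprint",
               "thin client test page", {})
```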

The devices do not have CD-ROM drives (or even support external ones), so students who need to use language CDs have to be directed to Windows PCs elsewhere in the building.

Although the Sun Rays can support USB storage devices, other types of USB devices usually do not work.

Students who need to use MS Windows applications (such as MS Office) access them over the network from a Linux-based CrossOver server from CodeWeavers, which runs them on a compatibility layer. Although CrossOver has been explicitly tweaked to run in Sun thin client installations, Office access is still the leading source of student complaints, and some of the steps students must take while using it seem cumbersome. Saving a document, for example, was a six-step process.

As noted, the library was faced with the double burden of supporting students on both a new hardware and software platform at the same time.

Software that is intensely computational is not a good fit for thin client computing, because all users are sharing the processor(s) on the server(s).

Thin clients are also very sensitive to network bandwidth problems, again because every task, every keystroke, is entirely dependent on the network.

Thin clients also mean a loss of local control. They are centrally managed by the systems staff, and little tweaking is possible at the user level.

On a positive note, Sun thin client environments offer Sun’s “SmartCard” technology. Students can be issued a card that they insert into a slot on the machine, log in over the network on any thin client, then pull out the card, take it to any other thin client, and rejoin their session. A session can remain open all week, allowing students a lot of flexibility in their work arrangements. Drawbacks include the cost of the card ($4) and student confusion because it is separate from the campus ID card.

The Set Up

Eckerd used two Sun 2-way servers with 4 GB of RAM each, plus a single-processor Dell server (with 2 GB of RAM) running CrossOver Server under Linux. They currently have 30 thin clients, which works out to roughly seven or eight sessions per Sun processor when every seat is in use.

Conclusion

In conclusion, Helene noted that planning is crucial in making a project like this a success. Who will control and implement the project? What are the needs of the users? Software compatibility issues should be carefully examined. What cost savings are anticipated? What is the projected return on investment?

Finally, does the library or the ITS staff have people with Unix experience? And does the library have a good relationship with its ITS staff?

Both the library staff and the systems staff need to work together on a project like this, and the earlier they get together in the process, the better.

In the discussion that followed, the open source K12LTSP project (http://www.k12ltsp.org/) was mentioned as a free way to experiment with thin client technology.

Are There No Limits to What NCIP Can Do?

This session was subtitled “E-Commerce, Self-Service, Bindery, ILL, Statistics – New Applications for the NCIP Protocol,” and as the session began attendees got an answer to the question posed by the title: presenter Ted Koppel of Ex Libris admitted that new applications for NCIP have not been as plentiful as anticipated, so the presentation was refocused to include a section on how the NCIP standards process is working, and the bindery, statistics, and e-commerce applications went missing. So I guess the question in the title has been answered: there are some things it can’t do (yet!).

After Ted’s introduction, the presentation came in three parts: Candy Zemon addressing problems and proposed solutions with NCIP, Jennifer Pearson describing an example of OCLC’s use of NCIP, and Candy Zemon again, this time filling in for an absent Gail Wanner, previewing a browser plug-in being developed as part of the “Rethinking Resource Sharing Initiative.”

Part 1

Candy Zemon of Polaris presented “NCIP: Growing Pains,” describing the protocol and how it came to be, which also helped explain why the session had to be rejiggered.

NCIP (the NISO Circulation Interchange Protocol, also known as Z39.83) was intended to help establish communications between disparate systems for use in direct consortial lending (DCL), circulation/ILL, and self-service circulation systems. Today, NCIP is an established standard, due for its regular review soon.
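
Under the hood, NCIP is a set of XML messages exchanged between an initiating and a responding system. As a rough sketch of the style only, here is how one might assemble a lookup-type request in Python; the element names are illustrative and not taken verbatim from the Z39.83 schema:

```python
import xml.etree.ElementTree as ET

# Illustrative NCIP-style request: one system asks another about a patron.
# A real message must validate against the Z39.83 DTD/schema; names here are approximate.
msg = ET.Element("NCIPMessage")
lookup = ET.SubElement(msg, "LookupUser")

user_id = ET.SubElement(lookup, "UserId")
ET.SubElement(user_id, "AgencyId").text = "example-consortium"          # hypothetical agency
ET.SubElement(user_id, "UserIdentifierValue").text = "21234001234567"   # hypothetical barcode

# Ext is the proposed extension element discussed later in this session.
ET.SubElement(lookup, "Ext")

print(ET.tostring(msg, encoding="unicode"))
```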

The question presents itself: why hasn’t this useful standard seen more success? First, NCIP is invisible to the user (when it works!). In many cases, it does what 3M’s SIP or SIP2 protocols do.

While there have been many pilot projects, current uses for NCIP include bindery, self-check, self-sorters, and self-service finance.

The NCIP Implementers Group met to review problems and perceived problems with the protocol, and to plan ways to solve them. In part, NCIP came to be used in ways not originally intended, finding fewer of the new applications Ted mentioned in the introduction and more use in self-service circulation and self-sort situations (perhaps because of difficulties with the rather loosely maintained 3M SIP protocols). The sense was that NCIP was too complex for some of these uses. The solutions proposed include making the messages smaller, with fewer mandatory elements, and removing some message elements in situations where a trust relationship between the communicating systems is already established.

Documentation was also felt to be a problem; the existing documentation will be reorganized and additional documents created, including more targeted guides for specific uses and some “Why use NCIP” guides.

Confusion has been caused by the overlap in functionality between the DCB (direct consortial borrowing) and C/ILL (circulation/inter library loan) profiles in NCIP. The solution: harmonize these profiles.

It was felt that NCIP needs greater extensibility, a major part of the appeal of the 3M SIP protocols. NCIP may incorporate an XML extension (Ext) tag.

As NCIP has found increasing use in self service situations, bandwidth concerns have emerged. The solution will be to add the ability to batch or list in messages, as well as reduce the overhead in trusted partner situations.

Finally, a number of bugs are still outstanding. The solution: fix ‘em!

Part 2

Jennifer Pearson of OCLC described the use of NCIP in OCLC’s Worldcat Resource Sharing program.

OCLC is seeking to broaden resource sharing from library-initiated “interlibrary loan” to patron-initiated “fulfillment” (i.e., to include purchase options). The hope is to keep libraries in the game in this Age of Amazon, and to keep OCLC in the game as a central, “neutral broker” of the whole process.

Authenticated and validated through NCIP, patrons could have borrowing capabilities from home (that might include home delivery), including purchase options, with all the disparate systems involved in such a process tied together through NCIP.

OCLC is working to make NCIP management less complex by serving as a central broker: each of n participating systems then needs only a single connection to the hub, rather than on the order of n² point-to-point setups.

OCLC is currently partnering with SirsiDynix, Polaris, and a group of Montana libraries. Work with TLC and CARL is expected to start later. The system may debut in the next calendar year.

Part 3

Candy Zemon, stepping in for original presenter Gail Wanner (of SirsiDynix and the Rethinking Resource Sharing Initiative), presented “Rethinking Resource Sharing: Getting What You Want.”

Candy briefly described the history and goals of the Rethinking Resource Sharing Initiative, which began with a white paper in February 2005, continued through several national forums and an updated white paper, developed a formal leadership structure, and now plans yet another forum.

Their goal is to create a new global framework that allows people to get what they want based on cost, time, format, and delivery: one that is user-focused (i.e., a request can both start and end outside a library, rather than being library-centered), vendor-neutral, global in context, and built on the concept of resource sharing, not just ILL. With ILL, scarce resources are allocated; with resource sharing, one picks from an abundance of resources.

They are currently working on a user-centric tool, the Get-It Button Project: an open source, cross-vendor, modular web browser plug-in that parses web pages to find published materials, performs an availability check, and displays results based on a patron’s profile. It may be previewed by ALA Midwinter.
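
No code was shown, but the parse-then-check flow described is easy to picture. A sketch of the idea only, in Python rather than as an actual browser plug-in, with a wholly hypothetical availability endpoint and a deliberately crude ISBN pattern:

```python
import json
import re
import urllib.parse
import urllib.request

# Crude ISBN-10/13 pattern; the real plug-in would parse pages far more carefully.
ISBN_RE = re.compile(r"\b(?:97[89][-\s]?)?(?:\d[-\s]?){9}[\dXx]\b")

# Hypothetical broker endpoint, for illustration only.
AVAILABILITY_URL = "https://broker.example.org/availability"

def get_it(page_html: str, patron_profile: str) -> None:
    """Find published materials on a page and check their availability for a patron."""
    for match in set(ISBN_RE.findall(page_html)):
        isbn = re.sub(r"[-\s]", "", match)
        query = urllib.parse.urlencode({"isbn": isbn, "profile": patron_profile})
        with urllib.request.urlopen(f"{AVAILABILITY_URL}?{query}") as resp:
            result = json.load(resp)
        # Display options ranked by the patron's cost/time/format/delivery preferences.
        print(isbn, "->", result.get("options", []))
```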

The session concluded with a discussion of marketing options for the plug-in.

The impending demise of the local OPAC

Gregg Silvis presented his view of the not-so-rosy future of the local OPAC to a capacity crowd on the first day of the 2006 LITA National Forum.

Reviewing the origins of today’s OPACs in the card catalogs of yore, he focused on the duplication of effort that has always been a part of the tradition of the local catalog, in both card and electronic form. The development of cooperative cataloging greatly reduced this duplication, but the advent of local automated systems caused libraries to migrate redundant physical processes to electronic form, and decades later, in a very different technological environment, libraries still largely operate the same way. Each library follows similar or identical steps to locate, load, and index copies of the same records and separately performs identical authority control steps; collectively, libraries independently maintain, upgrade, and back up thousands of servers to host their OPACs, devote massive staff resources to the design and implementation of thousands of similar but different user interfaces, separately test each changed feature for browser compatibility, and provide support to users for locally customized systems, each differing at least slightly from every other.

Silvis suggested that it’s time for a radical change, and that recent efforts by OCLC have created, in OCLC FirstSearch WorldCat and Open WorldCat, a potential replacement for the local OPAC. For most libraries, most of their holdings are in WorldCat, so a WorldCat search limited to an institution’s holdings already provides much of the functionality of an OPAC. Silvis admitted that for his idea to work, some key pieces are still missing: real-time links to local acquisitions and circulation systems, a way to handle location-specific links to electronic resources and contractually restricted locally enhanced table-of-contents data, and big issues of scalability, reliability, pricing, and loss of control. Still, OCLC has an active office of research and has taken the lead in incorporating a number of new technologies into its products, so it is quite possible these problems could be overcome. Such a step would put OCLC in a much more dominant position in the industry, effectively establishing a monopoly over one of the most important library functions, but, he stated, in a sense OCLC is us, not concerned with corporate profits or maintaining a stock price. The money gets reinvested in more services to libraries.
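
Mechanically, the proposal boils down to replacing a local catalog search with a union catalog search scoped to one institution’s holdings. A sketch of that idea only, using a wholly hypothetical union-catalog API (none of OCLC’s actual interfaces are reproduced here):

```python
import json
import urllib.parse
import urllib.request

# Wholly hypothetical union-catalog endpoint and parameter names, for illustration only.
UNION_SEARCH = "https://unioncatalog.example.org/search"

def opac_search(query: str, holding_symbol: str) -> list:
    """A union-catalog search limited to one library's holdings: the core of the idea."""
    params = urllib.parse.urlencode({"q": query, "heldby": holding_symbol})
    with urllib.request.urlopen(f"{UNION_SEARCH}?{params}") as resp:
        return json.load(resp).get("records", [])

# What a patron at a library with the (hypothetical) symbol "XYZ" would see:
for record in opac_search("information architecture", "XYZ"):
    print(record.get("title"), "-", record.get("callNumber"))
```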

Silvis concluded with a few “extreme ideas” that I had not heard from him before. Once cataloging and the OPAC are taken away, local systems would consist of acquisitions/serials and circulation/course reserves. Acquisitions/serials can be viewed as a redundant accounting system for many libraries, one that needs to be reconciled with the more general accounting systems used by the larger institutions of which libraries are a part (such as a university or local government); he noted that the University of Alberta has moved its acquisitions function to PeopleSoft. For academic libraries, student information in the patron records of the ILS largely duplicates what is held in student information systems. How much of the ILS could be replaced with added functionality in other programs?

Silvis closed by mentioning that the University of Washington is currently in discussion with OCLC to use WorldCat as its OPAC, and invited comments or questions from the 70+ people crowding the room.

A spirited and wide-ranging discussion followed, with many attendees expressing frustration with the current “state of the OPAC” and questioning how vendors could let their products stagnate, some suggesting that open source solutions like the Georgia PINES Evergreen ILS and Koha might provide relief for libraries, and several expressing interest in Silvis’ OCLC OPAC idea.

This was a well-informed group, painfully aware of specific new things they’d like to do (for example, someone wanted to open their OPAC to LibraryThing) but cannot do with their current systems.

Some criticized what they saw as resistance by vendors to anything but incremental changes in the established automation systems; others stood up for the system vendors, who say these incremental changes are what libraries ask for. If this group is representative, vendors will find that this is changing fast. There was a sense that bigger changes are necessary.