Evaluating Databases: Prioritizing on a Shoestring

Libraries have limited resources, and the portion afforded to electronic resources requires some savvy prioritization to meet patrons’ needs while sticking to budgets. Allocation of spending is a key issue for many libraries, and database subscriptions can cost thousands, even tens of thousands, of dollars. For smaller libraries, it’s possible to spend the electronic resources budget on a single amazing all-purpose database or piece together a collection from low-cost alternatives. What’s a librarian to do?

It’s important to note that there’s no right/wrong dichotomy in deciding which electronic resources are “best”; it’s always a matter of “best for the community”, i.e., the librarian’s rule of thumb: know thy service population. Does your library serve a population with a high unemployment rate? You may need to prioritize electronic resources focused on job training, skill-building, and resume writing. Are you situated in an area where students hang out after the school day? Consider electronic resources like educational games, homework helpers, and web-based tutoring. Are you nestled in the heart of an emerging tech boomtown? You might include resources on programming languages (reference sources, learning programs, etc.).

Over the years, I’ve explored various sources – from my MLIS textbooks to library websites to blog posts – and here’s the list of preliminaries I consider when I’m tasked with evaluating electronic resources for selection to serve my library’s community.

Content
In the same way I’d evaluate a print source, I consider the content of an electronic resource. Is it relevant to my community? What about scope – is the information comprehensive, or, if not, does it fill gaps in depth on a topic of special interest to my patrons? Is it updated often with timely, reliable information? Does a database include full-text content, abstracts, or citations? Is there a print resource that would be more useful?

Function
The how is as important as the content of a resource (the what). I ask myself: how simple is it for a layperson to use? Is the interface user-friendly? Is the indexing accurate and thorough? What about search – how does the database handle truncation, search types, filters, alternate spellings, and search history? Is there a FAQ or tutorial to help users who get stuck? Can they export and download materials? I’ve learned that these questions weigh heavily in how valuable patrons find a resource. A database may contain the deepest, broadest content possible, but if users can’t find it, it’s not much use to them. Like the proverbial tree falling in an empty forest, content can’t be called useful if no one is there to use it.
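When I’m testing a database’s search features, truncation is the first thing I try: a query like librar* should match “library”, “libraries”, and “librarian” but not unrelated words. As a rough mental model only – vendors implement this differently, and even the wildcard character varies (*, ?, $, #) – the behavior is like translating the wildcard into a pattern match:

```python
import re

# Mimic the common '*' truncation wildcard by translating it into a
# regular expression. This is an illustrative sketch of the concept,
# not any particular vendor's implementation.

def matches_truncated(query: str, term: str) -> bool:
    pattern = re.escape(query).replace(r"\*", ".*")
    return re.fullmatch(pattern, term, flags=re.IGNORECASE) is not None

for term in ["library", "libraries", "librarian", "libretto"]:
    print(term, matches_truncated("librar*", term))
```

Running a handful of probes like these against a trial subscription quickly shows whether the search engine handles truncation and alternate spellings the way its documentation claims.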

Technical Bits

Before digging deeper into authentication and content format, I have a list of technical odds and ends to consider in the preliminary evaluation. Does the vendor provide IT support for when technical issues inevitably arise? What about staff training or tutorials, so librarians can learn how best to assist patrons or teach classes on the database’s functionality? How do patrons access the database? Some vendors allow in-library access only, some provide limited content in their licensed versions, and some aren’t optimized for mobile; a resource with these limitations will need to be stellar in other ways to survive my evaluation. There’s also the biggie: cost. I weigh the expected value against the cost of the resource in electronic versus print format, e.g., is the electronic version more timely, cheaper per use, or vastly easier to use?
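The “cheaper per use” comparison is simple arithmetic: annual cost divided by recorded uses. The figures below are invented placeholders purely for illustration; substitute your own invoices and vendor usage reports.

```python
# Weighing electronic against print with a cost-per-use comparison.
# All dollar amounts and usage counts here are hypothetical examples.

def cost_per_use(annual_cost: float, annual_uses: int) -> float:
    """Annual cost divided by recorded uses. Pick one usage metric
    (sessions, searches, or full-text downloads) and apply it
    consistently across resources so comparisons are fair."""
    if annual_uses == 0:
        return float("inf")  # an unused subscription has unbounded cost per use
    return annual_cost / annual_uses

electronic = cost_per_use(annual_cost=12_000, annual_uses=4_800)
print_set = cost_per_use(annual_cost=900, annual_uses=150)
print(f"electronic: ${electronic:.2f}/use  print: ${print_set:.2f}/use")
```

In this made-up scenario the database looks expensive in absolute terms but is actually the cheaper option per use, which is exactly the kind of result that justifies a big-ticket subscription.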

Once an electronic resource is in use, I add a parameter or two to the annual evaluation process: whether the database generates enough use to warrant the expense; any patron feedback staff have received; how much librarian-patron interaction is required for users to engage with the resource effectively; and how often the resource crashes, plus how well the vendor’s IT staff resolves those issues. In the preliminary stages of electronic resource selection, I use content, function, and basic technical elements as the litmus test. If a resource passes all of these tests, the library can dig a level deeper to finalize its decision. I’ll discuss that next month in a follow-up post.
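Those annual-review parameters can be combined into a simple renewal screen. Every threshold below is a hypothetical placeholder, not a standard; tune the cutoffs to your own budget and community before using anything like this.

```python
# Sketch of an annual renewal check built from the review parameters
# above. Function name and all cutoffs are invented examples.

def warrants_renewal(cost_per_use: float,
                     uptime_pct: float,
                     feedback_avg: float) -> bool:
    """Keep a subscription only if it is cheap enough per use,
    reliable enough, and patrons actually like it (1-5 survey avg)."""
    return (cost_per_use <= 5.00
            and uptime_pct >= 99.0
            and feedback_avg >= 3.5)

print(warrants_renewal(2.50, 99.8, 4.2))  # healthy database
print(warrants_renewal(8.75, 99.8, 4.2))  # too expensive per use
```

A pass/fail screen like this doesn’t replace judgment – a low-use database may still serve a small but vital population – but it flags which subscriptions deserve a closer look at renewal time.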

Do you have any pro-tips? What has been your experience in implementing databases at your library?