De Lange Conference on Emerging Libraries: Notes, Part 3

Tuesday, March 6, 2007

9:00 a.m. “Science Wars: The Next Generation”
James Boyle, William Neal Reynolds Professor of Law at Duke University and co-founder of the Center for the Study of the Public Domain, is a member of the Creative Commons Board and one of the principal organizers of Science Commons.

The theme: technology has multiplied the materials from which scientific research is done, but the retrieval, access, and use of those materials has not kept up. We need to put the same energy into facilitating scientific inquiry that we put into MySpace, etc. Who is responsible for this, and how can it be done?

Copyright basics are not amenable to the networked age – as soon as you put something into even mildly original form, it is copyrighted and all rights are reserved. This is a problem for open access now that all your material can be so widely available and known, yet no one has permission to do anything with it or to build on it. Creative Commons attempts to deal with this by letting creators pre-clear uses of their work, reserving only the rights they choose (for example, prohibiting commercial use or derivative works). He is one of the founders of Science Commons, which applies Creative Commons ideas to science.

Recent case law has become vague in a couple of troubling areas – the expression of facts or completely obvious ideas has apparently started to become copyrightable or patentable. However, he has found that this has not been as big a problem for scientists as he would have thought. One reason is that they simply ignore it and assume there is a “research exception.” That exception has been found to be invalid.

The biggest problem is getting access to materials to do research – literally the biological or physical materials. The original researcher may withhold them due to competition, and this is becoming more common. There is also the physical cost of handling the materials. Agreeing to a transfer of material is a big legal problem; negotiation often takes a major amount of time, causing real research delays. There is a standard agreement form among universities, but people often want to make exceptions, and there is no standard for agreements between universities and commercial entities.

A good model would be a standard agreement that could be attached to materials and would completely regularize the process of obtaining and using them; use would be pre-cleared for a standard fee. The other problem is finding the information you need – we need a semantic Web for science. People don’t like adding metadata, though. Software could generate it, although that is a second-best solution, and it can’t be done for material for which you don’t have the rights.
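
(A rough, hypothetical sketch of what machine-readable, pre-cleared terms attached to a resource might look like – not something presented in the talk. It uses Python with the rdflib library to describe a made-up dataset record with a title, publisher, and a standard license URI, the kind of metadata a semantic Web for science could index; all of the URIs and property choices here are illustrative assumptions.)

    # Illustrative only: attach standard, machine-readable terms to a resource
    # so software can find it and pre-clear its use without case-by-case negotiation.
    from rdflib import Graph, Literal, Namespace, URIRef
    from rdflib.namespace import DCTERMS, RDF

    DCAT = Namespace("http://www.w3.org/ns/dcat#")

    g = Graph()
    dataset = URIRef("http://example.org/materials/cell-line-42")  # hypothetical record

    g.add((dataset, RDF.type, DCAT.Dataset))
    g.add((dataset, DCTERMS.title, Literal("Cell line 42 characterization data")))
    g.add((dataset, DCTERMS.publisher, Literal("Example University")))
    # A pointer to a standard agreement replaces one-off negotiation.
    g.add((dataset, DCTERMS.license, URIRef("http://creativecommons.org/licenses/by/3.0/")))

    print(g.serialize(format="turtle"))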

What about material that is already in the public domain, though? People agreed it sounded good, but nobody did anything about it. Now working on the Neurocommons project to try to build a semantic Web for neurology, nearly ready to release. It will be buggy but open for correction.
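
(Again purely illustrative, and assuming nothing about how the actual Neurocommons system works: the value of this kind of semantic Web is that it can be queried. The sketch below reuses the toy metadata from the previous example and asks, via SPARQL in rdflib, for every dataset that carries a given standard license.)

    # Illustrative only: query the toy metadata above with SPARQL to find
    # every dataset record that carries a given standard license.
    from rdflib import Graph

    g = Graph()
    g.parse("materials-metadata.ttl", format="turtle")  # hypothetical file of records

    results = g.query("""
        PREFIX dcterms: <http://purl.org/dc/terms/>
        SELECT ?dataset ?title WHERE {
            ?dataset dcterms:title ?title ;
                     dcterms:license <http://creativecommons.org/licenses/by/3.0/> .
        }
    """)
    for row in results:
        print(row.dataset, row.title)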

10:30 a.m. “Read as We May”
Paul Ginsparg, professor of Physics and Information Science, Cornell University, started the e-print arXiv in 1991 at Los Alamos National Laboratory.

Discussed the history of arXiv.org. Needing to sustain the project beyond his own involvement, he turned it over to the library. The will has been good, but he has been disappointed by the library’s ability to keep it technologically current.

What are the current pressures operating on the scientific information system? Soaring journal costs, plus the question of what publication actually costs and how much money is being paid for the journals. Scholars tend not to be aware of the per-article publication cost, and might be more upset about that than about the subscription prices being charged.
The cost to publish each article may range from a couple of thousand dollars at professional societies up to $25,000 at the American Chemical Society (which operates more like a commercial publisher). Web 2.0 holds promise for a system that keeps the more essential features of the publishing process but cuts the cost greatly (although it still would not be free).

Open access is thought by some to be self-evident from a public policy standpoint. He thinks the citation argument is specious: studies have shown open access items to be more highly cited, but if that is so, it may argue, out of self-interest, for scholars wanting fewer articles to be open access, since the ones that are will be more highly cited. Open access doesn’t mean free access, he noted.

The Federal Research Public Access Act of 2006, a proposed bill, would require federal agencies with research budgets of more than $100 million to make the papers they fund openly accessible within 6 months. Another approach being tried is for university faculties to “resolve” to hold onto the rights to their publications, but this rarely has sufficient faculty buy-in. Is a backdoor route to open access gradually happening anyway? More papers than you would expect can be found freely on the Web – for example, more than 1/3 of high-impact biomedical articles from 2003 are available. The mentality of the next generation assumes open access, so the debates may seriously decrease.