Issue 21/04

November 19, 2004

Paula Kaufman, University Librarian, Editor




MPs on the House of Commons Science and Technology Select Committee have asked the Government to “reconsider its position” on scientific publications after it released an obstructive Response to a Committee Report released in July this year. The MPs say that the Department of Trade and Industry (DTI) has clearly tried to “neutralise” the views put forward by other departments and Government-funded organizations, in particular the Joint Information Systems Committee (JISC), an expert advisory body funded indirectly by the Department for Education and Skills. The MPs said it was “worrying” both that an expert body had felt constrained in carrying out its advisory role, and that the Government had ignored JISC’s expert advice on the need for change in the system for publishing research findings. JISC’s very positive response to the Committee Report was watered down following negotiations with DTI. The Government Response focuses on criticism of the “author-pays” publishing model, despite the fact that the Committee’s Report did not recommend its wholesale adoption. Moreover, the Government has “prejudged” the publishing model, instead of encouraging experimentation as advocated by the Committee. MPs claim that the Government’s position owes more to the publishing interests supported by DTI than the best interests of the scientific community or evidence-based policy.  Science and Technology Committee, U.K. House of Commons 11/08/04



On November 9, the Financial Times published an unsigned editorial on open access. Excerpt: "Although the angry MPs may have gone too far in accusing the Department of Trade and Industry of kow-towing to the publishing lobby at the expense of British science, the government should not have taken such a negative stance. A more measured response would have been to adopt some of the committee's suggestions for establishing Britain as a test-bed for open access journals, with publishing and peer review costs met ultimately by the research funding agencies, while making clear that there would be no precipitate move away from the existing system....The main reason for considering a change now is that computer and communications technology make it possible, for the first time, to disseminate research results far beyond the traditional purchasers of scientific journals, such as university libraries. There is a powerful ideological argument that the public, having funded the research in the first place, should not have to pay again to see the results....Although the lukewarm attitude of the government will disappoint open access activists, the publishing industry must recognise the growing international pressure for fundamental change. The Wellcome Trust is determined to introduce open access publishing through the £400m a year it spends on biomedical research and there are powerful voices for reform in the US and elsewhere in Europe. A fair compromise might be to give journals six months exclusivity and then guarantee free public access." Open Access News 11/11/04



Excerpt: 'The Wellcome Trust, Europe's largest research charity, has become the latest grant-giving body to throw down the gauntlet to academic publishers in the debate over open-access literature. All papers reporting the results of research funded by the trust will in the future have to be placed in a central public archive within six months of publication, the organization said on 4 November. The move could bring the trust into conflict with publishers, who often hold exclusive rights on the use of such material. This in turn could restrict researchers' choices about which journals they publish in....Researchers funded by Wellcome could find that the new rules create some difficult choices. Some publishing houses, such as Elsevier, which publishes more than 1,800 journals including Cell and The Lancet, do not currently allow any version of a paper they have published to be placed on a public archive other than on websites restricted to the author's research institution. "This will put publishers and researchers in a difficult position," acknowledges Robert Terry, a senior policy adviser at the trust's London headquarters. But Terry believes that journals will modify their policies to allow papers to go to central archives. He points out that the US National Institutes of Health (NIH) is considering putting similar requirements on the research that it funds (see Nature 431, 115; 2004). "It would be quite a strange [biomedical] journal that didn't include research funded by the NIH and the Wellcome Trust," he adds.' Open Access News 11/11/04



On November 2, the Naperville City Council voted to ask the Illinois General Assembly to change the Library Records Confidentiality Act. The change would add these paragraphs: "Nothing contained in this act shall be construed as a privacy violation or a breach of confidentiality when a library cooperates with and provides information to sworn law enforcement officers in the process of a criminal investigation." "Furthermore, nothing contained in this act shall be construed as permitting a library to refuse to cooperate with and provide information to sworn law enforcement officers in the process of a criminal investigation." "Such cooperation and information shall include, but not be limited to, releasing registration records and identifying information of patrons, as well as access to computer and surveillance information." If passed, the changes would affect libraries across Illinois, including UIUC’s libraries. LIS News 11/08/04 



Earlier this year, the Australian affiliate of Project Gutenberg posted the 1936 novel "Gone with the Wind" on its Web site for downloading at no charge. After an e-mail message was sent to the site last week by the law firm representing the estate of the book's author, Margaret Mitchell, the hyperlink to the text turned into a "Page Not Found" dead end. At issue is the date when "Gone with the Wind" enters the public domain. In the United States, under an extension of copyright law, "Gone with the Wind" will not enter the public domain until 2031, 95 years after its original publication. But in Australia, as in a handful of other places, the book was free of copyright restrictions in 1999, 50 years after Mitchell's death. The case is one more example of the Internet's inherent lack of respect for national borders or, from another view, the world's lack of reckoning for the international nature of the Internet, and it is also an example of the already complicated range of copyright laws. The issue of national sovereignty over the Internet has not been firmly established, either by trade agreement or by court precedent, some legal experts say, and conflicts continue to be settled individually. But there are much bigger copyright battles looming as more material, including songs by Elvis Presley and the Beatles, approaches the public domain in countries around the world. Already the copyright battle is brewing in Europe, where the International Federation for the Phonographic Industry, a trade group in London for record companies, is urging the European Commission to extend copyright protection for performers from 50 years to 70 years, or even to the 95 years generally given to sound recordings in the United States. Without that extension, recordings from musical artists of the 1950s and 1960s will start entering the public domain in the 20 European Union nations this decade and next, allowing anyone to profit from them without paying performer royalties.
The first prominent rocker to be affected, according to the federation, is Elvis Presley, whose 1954 single "That's All Right" is set to become copyright-free in the European Union in January. More important—at least to EMI Records—the Beatles catalog would begin to fall into the public domain in the European Union starting with "Love Me Do" in 2013, although the publishing rights would remain intact. In the United States, performer rights are protected for 95 years…. "It may be that just the threat of pressure was enough incentive to get it removed," Carroll, the California lawyer, said. "Project Gutenberg is made up of volunteers and doesn't have deep pockets." Another reason for the quick removal of the book may be a trade agreement, expected to be ratified by the United States and Australia this year, which would require Australia to enforce a copyright limit of 70 years after the death of the author. 11/8/04



Even though political speeches are delivered by public figures at public events, there are few ways to obtain video footage of a speech. Morning Edition, National Public Radio. 11/10/04


Students at about a dozen colleges and universities have started organizations called Free Culture groups to educate other students about copyright and fight what they see as a tilting of the law to favor copyright owners. The first Free Culture group was started by Swarthmore College student Nelson Pavlosky, known for his successful legal challenge to Diebold Election Systems' use of the Digital Millennium Copyright Act in trying to suppress leaked company memos. Pavlosky and other Free Culture organizers want college-age people to understand how copyrights have changed in the electronic era, particularly with respect to legislation such as the proposed Induce Act. Pavlosky acknowledged that a danger of the Free Culture groups is that participants will simply be seen as "rich white kids who want free music." Jessica Litman, a law professor at Wayne State University and a speaker at a meeting of the Free Culture groups, noted that copyright law is traditionally written by lobbyists who represent copyright owners and said that consumers should be included in that process. Wired News, 10 November 2004 Edupage, November 10, 2004



Can it be that the Web is but 10 years old—perhaps a teenager if you stretch it? In that time, libraries have made great advances in providing Web-based access to a broad variety of services that used to be available only within the walls of the library. But many libraries' Web sites continue to replicate the physical and functional organization of the traditional library - a "thin veneer over library technical infrastructures that were designed to support traditional library services," says Krisellen Maloney, of the University of Arizona in Tucson. These sites are typically organized around functions (interlibrary loan, circulation, reference) or existing information stores (the card catalog, print indexes). But Web-savvy users unfamiliar with traditional library processes don't view such sites as transparent or able to meet their information-seeking needs, she says. Better to organize electronic services in a way that supports users' tasks—through such means as library portals that have integrated connections to all systems and information resources. It is no longer possible to think of library technical infrastructures as a group of separate systems that will be accessed and used from the same starting point and in a similar manner by all users, she says. (American Society for Information Science and Technology Bulletin Oct/Nov 2004) ShelfLife, No. 182 (November 11 2004)



"Public television programs such as Great Performances, NOVA, Nature and Frontline are important cultural artifacts," says Ken Devine, VP and chief technology officer for Thirteen/WNET New York, one of the leading U.S. producers of public television programming. With that in mind, Thirteen/WNET is partnering with WGBH Boston, PBS and New York University on a $3-million, three-year planning project that will lay the groundwork for preserving digital television programming. "Just as programs on videotape can be lost without proper storage or playback equipment, in their own way, programs in digital formats are in just as much danger—without proper plans for long-term preservation, storage and playback, they, too, could easily disappear forever," says Devine. The funding for Preserving Digital Public Television comes from the Library of Congress's National Digital Information Infrastructure and Preservation Program, and is one of eight awards aimed at long-term preservation of America's cultural materials and the only one preserving any type of television programming. The Preserving Digital Public Television project will inventory the at-risk programming at each public TV station, establish criteria and procedures for creating the archive, research technology issues, and outline operating policies for a cooperative archival facility. The next step after this stage will be to make such an archive operational. (Business Wire 1 Nov 2004) ShelfLife, No. 182 (November 11 2004)



Creative Commons is expanding from the realm of copyright into patents and scientific publishing. The group's move into the scientific sphere could help add new weight to growing criticisms that the current patent process has become too inflexible and often awards too much protection to ideas that are not genuinely unique.  BNA's Internet Law News (ILN) - 11/11/04



"Between 1935 and 1960, the paperback revolution created a new industry overnight, permanently changed our understanding of 'the book,' helped to democratize reading by increasing readership and eroding the lines between 'high' and 'low' literature, and created its own, unique genres and forms of expression." There are links here to cover art, articles, and an animated timeline. November 12 NeatNew and ExLibris

Most search engines just don't do a good enough job of distinguishing between paid and "natural" search results. That's the conclusion of a new study, which finds them in direct violation of government requirements. The study shows that all of the biggies are fundamentally flawed, including Google, Yahoo, and even AOL and Lycos. Will MSN search do a better job? Don't count on it. We've got details on the impeccable credentials of the research group, and what it found, along with what the search engines themselves had to say in response. What's New Now (Ziff Davis) 11/11/04



Creative Commons has named John Wilbanks to be the first director of Science Commons. Excerpt from John Borland's story: '"Wilbanks' addition as leader of the new Science Commons branch...marks a very exciting new phase, as the Creative Commons model is tested in uncharted areas of intellectual endeavor," Lawrence Lessig, Stanford Law School professor and organization founder, said in a statement....A posting on the group's Web site says its board of directors had been considering moving into the area of science almost since inception but that it did not initially have the "expertise or technical capacity" to enter that realm. An intellectual-property system that allows sharing between scientists is particularly important, given research grants that often make results proprietary, as well as recent international changes in patent law that expand the scope of data protection, the group said. The "commons" approach could help introduce needed flexibility, it added. "Right at the historical moment, when we have the technologies to permit worldwide availability and distributed processing of scientific data...we are busy locking up that data and slapping legal restrictions on transfer," the Creative Commons site says. "Judicious balance is needed. The tendency to claim that property rights are never the answer, or that openness always solves all problems, must be avoided." Science Commons will officially launch on January 1, 2005. Open Access News 11/11/04



John Ashcroft may be headed for retirement, but Section 215 of the Patriot Act, passed in the wake of 9/11, remains in force. Here from Bookselling This Week is a reminder from the Campaign for Reader Privacy about the ongoing effort to amend the act to protect reader privacy. ALA, ABA, and AAP are some of the groups involved. 11/11/04




The nation's 115 million home computers are brimming over with personal treasures - millions of photographs, music of every genre, college papers, the great American novel and, of course, mountains of e-mail messages. Yet no one has figured out how to preserve these electronic materials for the next decade, much less for the ages. Like junk e-mail, the problem of digital archiving, which seems straightforward, confounds even the experts. "To save a digital file for, let's say, a hundred years is going to take a lot of work," said Peter Hite, president of Media Management Services, a consulting firm in Houston. "Whereas to take a traditional photograph and just put it in a shoe box doesn't take any work." Already, half of all photographs are taken by digital cameras, with most of the shots never leaving a personal computer's hard drive. So dire and complex is the challenge of digital preservation in general that the Library of Congress has spent the last several years forming committees and issuing reports on the state of the nation's preparedness for digital preservation. Jim Gallagher, director for information technology services at the Library of Congress, said the library, faced with "a deluge of digital information," had embarked on a multiyear, multimillion-dollar project, with an eye toward creating uniform standards for preserving digital material so that it can be read in the future regardless of the hardware or software being used. The assumption is that machines and software formats in use now will become obsolete sooner rather than later. (Note: UIUC's GSLIS and University Library have been awarded a $2.7 million grant under this program.) In the meantime, individual PC owners struggle in private. Desk drawers and den closets are filled with obsolete computers, stacks of Zip disks and 3½-inch diskettes, even the larger 5¼-inch floppy disks from the 1980's.
Short of a clear solution, experts recommend that people copy their materials, which were once on vinyl, film and paper, to CD's and other backup formats. But backup mechanisms can also lose their integrity. Magnetic tape, CD's and hard drives are far from robust. The life span of data on a CD recorded with a CD burner, for instance, could be as little as five years if it is exposed to extremes in humidity or temperature. And if a CD is scratched, Mr. Hite said, it can become unusable. Unlike, say, faded but readable ink on paper, the instant a digital file becomes corrupted, or starts to degrade, it is indecipherable. Professional archivists and librarians have the resources to duplicate materials in other formats and the expertise to retrieve materials trapped in obsolete computers. But consumers are seldom so well equipped. So they are forced to devise their own stop-gap measures, most of them unwieldy, inconvenient and decidedly low-tech.   Proponents of paper archiving grow especially vocal when it comes to preserving photographs. If stored properly, conventional color photographs printed from negatives can last as long as 75 years without fading. Newer photographic papers can last up to 200 years. There is no such certainty for digital photos saved on a hard drive. The experts at the National Archives, like those at the Library of Congress, are working to develop uniformity among digital computer files to eliminate dependence on specific hardware or software.  New York Times 11/10/04


In the hottest story to come out of the 24th annual Charleston Conference, held from November 3-6, in Charleston, SC, Cornell University librarian Phil Davis presented a bombshell paper that detailed a pattern of republishing content without attribution at Emerald Publishing, formerly known as MCB University Press. Using simple keyword searching of the publisher's online journals, Davis identified 409 examples of duplicated articles from 67 journals, all "republished without notification from 1989 through 2003." Many of these articles, Davis noted, were published simultaneously in journals within the same or similar subject disciplines. The pattern suggests that libraries unwittingly may have spent "considerable sums of money on duplicated materials from Emerald." Davis's work was prompted by Chuck Hamaker, a librarian at the University of North Carolina at Charlotte, who Googled himself, only to learn he was published in journals to which he never sent submissions.  Davis would not speculate on what motivated the duplications, or whether the pattern of republication was a systemic, coordinated effort. In a written response, Emerald spokeswoman Gillian Crawford acknowledged that the company had erred: "We accept that explicit notice of dual publication should have been published alongside each article and regret any inconvenience as a result of that notice not being given." Crawford explained that, from 1989 to 2000, articles considered "to be of particular merit were occasionally published within another MCB journal where it was felt that their content would be of interest or benefit to the additional journal audience." Crawford said there has been no "deliberate dual publication" since 2001, adding that Emerald may still "occasionally republish" an article with the author's consent and full attribution, in, for example, a themed collection or review issue.  Authors are unpaid, but will libraries seek to be compensated for purchasing duplicated content? 
Davis, in his paper, was careful to note that, while perhaps unethical, Emerald's action is not necessarily illegal. He also notes, however, that library customers do have a significant complaint, since libraries pay subscription fees in advance for content they generally assume to be original. Without a notice to the contrary, libraries, short of performing a research exercise, would not know they were purchasing duplicate content. Crawford said that Emerald has conducted its own research and will contact all customers affected, but declined to say if any settlement would be discussed. Library Journal  11/15/04
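Davis reportedly found the duplicates with simple keyword searching of the publisher's online journals. The sketch below is a hypothetical reconstruction of that kind of check, not his actual method: it flags article records whose normalized titles match but which appear in different journals.

```python
import re
from collections import defaultdict


def normalize_title(title: str) -> str:
    """Lowercase, strip punctuation, and collapse whitespace."""
    cleaned = re.sub(r"[^a-z0-9 ]", "", title.lower())
    return re.sub(r"\s+", " ", cleaned).strip()


def find_duplicates(records):
    """Group records sharing a normalized title across different journals,
    i.e. candidates for undisclosed republication."""
    by_title = defaultdict(list)
    for rec in records:
        by_title[normalize_title(rec["title"])].append(rec)
    return [
        recs for recs in by_title.values()
        if len({r["journal"] for r in recs}) > 1
    ]
```

Exact-title matching would miss retitled duplicates, which is presumably why keyword searching, rather than mechanical matching alone, was needed to surface 409 cases.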



The U.S. Senate may soon vote on HR2391, the Intellectual Property Protection Act, a comprehensive bill that opponents charge could make many users of peer-to-peer networks, digital-music players and other products criminally liable for copyright infringement. The groups that lined up against the bill include the Consumer Electronics Association, the Computer and Communications Industry Association, the American Conservative Union and public-interest advocacy group Public Knowledge. BNA's Internet Law News (ILN) - 11/16/04



OCLC, the world’s largest library cooperative, and Yahoo! Inc., a leading global Internet company, have announced a pilot program that leverages the strength of the Yahoo! Toolbar and Yahoo! Search to enable consumers to explore the Web and WorldCat® (the OCLC Online Union Catalog) database. The program offers consumers a co-branded toolbar that provides one-click access to 2 million of the most popular records found in WorldCat, a central catalog of library holdings created and maintained collectively by more than 9,000 libraries. WorldCat includes books, movies and audio files. The Yahoo!/OCLC toolbar is a project associated with Open WorldCat, a new OCLC initiative designed to increase the online visibility of libraries and their collections. OCLC will be promoting the co-branded toolbar on its website, providing consumers access to information previously only available from within libraries. The toolbar enables consumers to narrow their search results to the WorldCat database and helps them locate libraries in their vicinity that have the record they are looking for. OCLC and Yahoo! will work together to increase accessibility to more of WorldCat’s 57 million records as they become available. To access WorldCat’s most popular records, consumers simply enter a query in the search box located in the toolbar and either click the WorldCat logo or use the drop-down menu which features a “libraries” link. Consumers will then be prompted for their zip code to determine if the library materials they are looking for are available in a nearby OCLC member library. The co-branded toolbar features Yahoo! Search, which provides consumers with a rich research technology to help them access both online and offline databases. 
The Yahoo!/OCLC Toolbar also includes a drop-down menu, located next to the WorldCat logo, which provides access to the OCLC FirstSearch service, the NetLibrary eBook service, the OCLC member library list, the OCLC Web site and a link to the About WorldCat site which leads to more information on the database. The co-branded toolbar will also be available in OCLC libraries across the nation. The Yahoo!/OCLC Toolbar can be downloaded on Monday, November 15th from the OCLC website.



Bookstore sales finished a soft third quarter by falling again in September. According to preliminary figures from the U.S. Census Bureau, sales in the month dropped 3.8%, to $1.53 billion. Sales for all of retail rose 8.1% in September. For the first nine months of 2004, bookstore sales were down 0.5%, to $12.43 billion, compared to a 7.8% increase for the entire retail segment.  Publishers Weekly 11/15/04



Sam Jaffe, Want a Jolt of Literature? Try Textpresso! The Scientist, November 8, 2004. Excerpt: "[A] new open-source tool called Textpresso can find a single fact just by typing in a quick search entry. Paul Sternberg’s lab at the California Institute of Technology designed Textpresso to organize papers on Caenorhabditis elegans. Unlike the popular PubMed online search tool, Textpresso does a full text search. And unlike other text-search devices, Textpresso bases its search on ontological relationships, thus increasing its precision." (PS: See our earlier blog posting about Textpresso for the OA connection, if it isn’t already clear.)  Open Access News 11/06/04



A new search engine that speaks its results aloud. It's an interesting approach, especially for visually impaired users.


10 X 10

Here’s a very interesting site that provides a visual picture of up-to-the-minute news.



An initiative of the National Endowment for the Humanities (NEH) and the Library of Congress will digitize millions of pages of historical newsprint and place them online. The goal of the National Digital Newspaper Program is to post 30 million newspaper pages originally printed between 1836 and 1922, replacing the current system of microfilm records of old papers. Bruce Cole, chairman of the NEH, announced the initiative at the National Press Club, saying that providing easy access to such historical records serves as an effective tool in the fight against what he called "American amnesia." When the newspapers are online, said Cole, everyone will have "immediate, unfiltered access to the greatest source of our history." Newspapers printed before 1836 will not be included because earlier typefaces cannot be read effectively by optical scanners; newspapers published after 1923 are covered by copyright restrictions. San Jose Mercury News, 16 November 2004  Edupage, November 17, 2004



Tomorrow Google will launch the beta version of Google Scholar, although it is already available online today. From the press release: '[W]e are excited to announce Google Scholar, a free search service that helps users find scholarly literature such as peer-reviewed papers, theses, books, preprints, abstracts, and technical reports. This service will be available tomorrow morning....Like Google Web Search, Google Scholar orders search results by relevancy to ensure the most useful references appear at the top of the page. This ranking takes into account the full text of each article as well as the article's author, the publication in which the article appeared, and how often it has been cited in scholarly literature....Whenever possible, Google searches across the full text of a paper, not just the abstract....Google Scholar offers relevant results for a wide range of scholarly materials including research that isn't yet online. For instance much of Einstein's work isn't online, but it is heavily cited by other researchers. Google Scholar leverages these citations to make users aware of important papers or books that are not online, yet may be available in their local library.' Open Access News 11/18/04
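The ranking idea in the press release, full-text relevance combined with how often a work is cited, can be illustrated with a toy scorer. This is only a sketch of the general principle, not Google's actual algorithm, and the record fields (`text`, `citations`) are made up for the example.

```python
import math


def rank_results(results, query_terms):
    """Toy scholarly ranking: raw term frequency in the full text,
    boosted by the log of the citation count, so heavily cited works
    rise even when the text match is comparable."""
    def score(record):
        text = record["text"].lower()
        tf = sum(text.count(term.lower()) for term in query_terms)
        return tf * (1 + math.log1p(record["citations"]))
    return sorted(results, key=score, reverse=True)
```

The logarithm keeps a massively cited classic from drowning out every textually better match, a common damping choice in ranking heuristics.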



In late 2002, after years of flowing red ink, Stanford University Press (SUP) announced that it would ambitiously overhaul both its publishing program and its operations, reducing its workforce, including editorial staff, and slashing the number of books published in some areas while starting programs in others. This week, just two short years after the overhaul began, the plan has yielded dramatic results. Geoffrey Burn, director of the press, and Alan Harvey, director of publishing and acquisitions, last week told the Stanford Faculty Senate that, since the 2002 reorganization, SUP revenue has grown by a whopping $1 million, up 26 percent. Making that figure even more impressive, the press's annual subsidies during that period were trimmed from $917,000 in 2001-02 to just $221,000 in 2003-04. "Like most presses in 2001-2002, we were seeing a downturn in our fortunes as a result of reductions in both library and retail buying, reflecting a downturn in the economy as a whole," Burn told the LJ Academic Newswire. "This meant that we needed to find niche revenue streams, and launch them quickly, to underwrite our core program of scholarly publishing in the humanities and social sciences." With the trade market flat, the undergraduate market dominated by large companies, and entry into the journals market both costly and in flux, options were limited. As if those market factors weren't daunting enough, they were compounded by SUP's internal situation—a publishing program that was "spread fairly wide," too few books that generated enough revenue to cover the cost of other books, and "processes, systems, and schedules" that were not efficient enough to meet the increasing demands of the market. Faced with those challenges, SUP came up with an ambitious "seven part plan" to enable SUP to re-tool its program.
The plan included a focus on areas where SUP had a "sustainable reputation," such as in the humanities and social sciences; also SUP abandoned areas in which it could not compete, such as science. SUP also cut back trade signings, and launched new programs in areas of strength for Stanford University such as economics, law, finance, business, and policy. The press also "re-engineered" its internal processes and managed to reduce its production schedules by about 50 percent. For now, Burn says he is comfortable with the direction of SUP and predicts even more growth. Still, he acknowledges, given the pace of technology, challenges loom for all presses, not just SUP. "At the risk of sounding complacent, I think our print model will require only fine tuning going forward," Burn said. "Our challenge, as is true for our whole community, is to play a part in finding ways to make large amounts of aggregated scholarly material available, in a readily navigable environment, with sufficient perceived value to support a business model."  Library Journal Academic News Wire: November 18, 2004



Visitors to the British Library will be able to get wireless internet access alongside the extensive information available in its famous reading rooms. Broadband wireless connectivity will be made available in the eleven reading rooms, the auditorium, café, restaurant, and outdoor Piazza area. A study revealed that 86% of visitors to the Library carried laptops. The technology has been on trial since May. Previously many were leaving the building to go to a nearby internet café to access their e-mail, the study found. "At the British Library we are continually exploring ways in which technology can help us to improve services to our users," said Lynne Brindley, chief executive of the British Library. "Surveys we conducted recently confirmed that, alongside the materials they consult here, our users want to be able to access the internet when they are at the Library for research or to communicate with colleagues," she said. The service will be priced at £4.50 for an hour's session or £35 for a monthly pass. The study, conducted by consultancy Building Zones, found that 16% of visitors came to the Library to sit down and use it as a business centre. This could be because of its proximity to busy mainline stations such as Kings Cross and Euston. The study also found that people were spending an average of six hours in the building, making it an ideal wireless hotspot. Since May the service has registered 1,200 sessions per week, making it London's most active public hotspot. The majority of visitors wanted to be able to access their e-mail as well as the British Library catalogue. The service has been rolled out in partnership with wireless provider The Cloud and Hewlett Packard. It will operate independently from the Library's existing network. The British Library receives around 3,000 visitors each day and serves around 500,000 readers each year.
People come to view resources which include the world's largest collection of patents and the UK's most extensive collection of science, technology and medical information. The Library receives between three and four million requests from remote users around the world each year. BBC NEWS LIS News 11/18/04


This scholarly communications newsletter is also available online at