Issue No. 34

December 30, 2002


Paula Kaufman, University Librarian



Happy New Year to all our readers.  I hope this is a year of good health, happiness, and peace for all.    PTK




A group of prominent scientists is mounting an electronic challenge to the leading scientific journals, accusing them of holding back the progress of science by restricting online access to their articles so they can reap higher profits.  Supported by a $9 million grant from the Gordon and Betty Moore Foundation, the scientists recently announced the creation of two peer-reviewed online journals on biology and medicine, with the goal of cornering the best scientific papers and immediately depositing them in the public domain. By providing a highly visible alternative to what they view as an outmoded system of distributing information, the founders hope science itself will be transformed. The two journals are the first of what they envision as a vast electronic library in which no one has to pay dues or seek permission to read, copy, or use the collective product of the world's academic research. Harold Varmus, Nobel laureate in medicine, serves as chairman of the new nonprofit publisher. "The written record is the lifeblood of science," he commented. "Our ability to build on the old to discover the new is all based on the way we disseminate our results." By contrast, established journals like Science and Nature charge steep annual subscription fees and bar nonsubscribers from their online editions, although Science recently began providing free electronic access to articles a year after publication.

The new publishing venture, Public Library of Science, is an outgrowth of several years of friction between scientists and the journals over who should control access to scientific literature in the electronic age. For most scientists, who typically assign their copyright to the journals for no compensation, the main goal is to distribute their work as widely as possible. Academic publishers argue that if they made the articles more widely available they would lose the subscription revenue they need to ensure the quality of the editorial process. Far from holding back science, they say, the journals have played a crucial role in its advancement as a trusted repository of significant discovery. The new venture's success will depend largely on whether leading scholars are willing to forsake the certain status of publishing in the established journals to support the principle of science as a public resource.



Creative Commons, a nonprofit organization dedicated to promoting the creative reuse of intellectual works, recently launched its first product: machine-readable copyright licenses, available free of charge from its website. The licenses allow copyright holders to easily inform others that their works are free for copying and other uses under specific conditions. These self-help tools offer new ways to distribute creative works on generous terms, ranging from copyright to the public domain.


Copyright holders who decide to waive some of their rights but retain others can choose a license that declares "Some Rights Reserved" by specifying whether they require attribution or allow commercial use or modifications of their work. Additionally, copyright holders may choose to waive all their rights by dedicating their work to the public domain. After the copyright holder chooses a license or public domain dedication, it is expressed in three formats to easily notify others of the license terms.  Various organizations and people have pledged their support for Creative Commons, including Byrds founder Roger McGuinn, DJ Spooky, iBiblio, the Internet Archive, the MIT OpenCourseWare project, O'Reilly & Associates, People Like Us, the Prelinger Collection/Library of Congress, Rice University's Connexions project, Stanford Law School, and Sun Microsystems. Implementers include musicians, writers, teachers, scholars, scientists, photographers, filmmakers, publishers, graphic designers, and Web hobbyists, as well as listeners, readers, and viewers.


Want to see and hear the talks at the Creative Commons launch?  Go to



Subscription agent Faxon, part of divine, Inc., has apparently shut down certain operations and is preparing for some sort of reorganization.  The company has not yet released an official statement, but here's what’s known to date:

- Customers have been referred to a Chicago firm, Development Specialists, Inc., which consults in reorganization, bankruptcy, and turnaround management.  DSI is making no statements but promises a clarification of the situation next week.

- Libraries are scrambling to figure out how their subscription needs will be filled for next year, with few alternatives.  Customers have been told to find other vendors; meanwhile, many of them have placed orders for 2003 - and sent in advance payments - that have not been passed on to publishers.

- Faxon apparently has not paid a number of publishers for their 2002 subscriptions, nor are publishers receiving any orders for 2003 subscriptions.

- Faxon's offices in London, Ontario, and Montreal have been closed and employees laid off.

- EBSCO announced that it would be acquiring the European operations of RoweCom, Inc., the part of Faxon/divine that serves Europe.  In the announcement, EBSCO stated that divine has announced its "intention to exit the content subscription business."

- Financial perspective from Outsell: divine's cash was down to $30 million this June.  In August, it received an equity investment of $61 million from Oak Investment Partners.  One month later, the company reported total cash of $61 million - the equivalent of the Oak investment, but the $30 million of its own cash from June was gone.  Current market valuation has the company worth $28 million, less than half of the Oak investment.

The implications of this fiasco are many and distressing:

- Lots of innocent people are going to be left high and dry.  Faxon customers will have a hard time scrambling to obtain services from the remaining industry players, EBSCO and Swets Blackwell, with no notice over the holiday season.

- The Enron-like aspect of this: Where is the cash?  This business has always been based on customers paying in advance for their subscriptions, while agents pay the publishers only at the end of a calendar year, and get by on the float in between.  Now the big question is - where did all that cash go?  It's on divine's balance sheet as deferred revenues, and the deferred payments have not gone to the publishers - so where has divine stashed it?  Regardless of what happens with divine, Outsell expects the practice of pre-payment will be gone for good after this episode.

- The publishers who have been stiffed by divine are also in a good position to help ease Faxon customers through the crisis, by continuing to honor subscriptions until the mess is worked out.

FW: Outsell's e-briefs, December 20, 2002

NOTE: The UIUC Library has no subscriptions with Faxon.



The Open eBook Forum has announced the findings of its Consumer Survey on Electronic Books. Conducted at the New York Is Book Country fair in October 2002, the survey examines consumer preferences regarding electronic and paper books. Unique to this study is an attempt to measure attitudes toward eBooks among people who read "paper" books. Contrary to a commonly held industry belief, results indicate no correlation between computer skills or daily Internet use and downloading an eBook. General readers are as likely to adopt eBook technology as people who have expert computer skills or who use the Internet daily. Factors in consumer willingness to read eBooks in the future were also measured.

There was, moreover, a correlation between having read an article about eBooks and being more likely to buy and read one - indicating that awareness is a key factor in the growth of the eBook business. The full survey is available only to OeBF members. Non-members can obtain a copy of the survey upon application for membership to the OeBF. Member benefits and forms can be found at



Russian software developer ElcomSoft has been cleared of charges that it illegally created a program to disable encryption on Adobe e-books. The jury verdict, announced earlier this month in U.S. District Court in San Jose, California, concludes the first criminal trial of a company accused of violating the Digital Millennium Copyright Act, a 1998 federal statute that protects copyrights on electronic content. As readers of this Newsletter know, Moscow-based ElcomSoft had been charged with violating the law by creating and selling a program called the Advanced eBook Processor, which allowed users to foil copyright protections put in place by e-book publishers. The government had charged the firm with four counts of violating the DMCA and one count of conspiracy. Assistant U.S. Attorney Scott Frewing maintained throughout the trial that ElcomSoft was aware it was violating the law by selling the Advanced eBook Processor. But defense attorney Joe Burton countered that ElcomSoft's behavior shows the company was clearly not aware it was doing anything illegal when it began selling the e-book decrypting program in June 2001. Burton asked jurors why the company would have sold, and even written press releases about, a program it knew broke the law.



As part of its recently expanded Web offerings, the Catholic Church is giving the public its first glimpse of numerous historic documents from the Vatican's Apostolic Library, including Luther's handwritten letters and translations of Aesop's Fables into German.

Over the past 18 months, the Vatican has been working closely with Hewlett-Packard on a project to bring the Vatican's historic archives to the Web. HP is providing the Vatican with the wherewithal (hardware, services, and Internet consulting) free of charge to make it happen. Visitors to the website can view such resources as photographed pages from papyrus Bibles and a selection of rare illuminated manuscripts. The Vatican has had a website since 1995, but some of its archives became available to scholars through an FTP service as early as 1985. Rare artifacts and manuscripts were off limits to all but the academic community until now.



President Bush has signed legislation aimed at improving online access to government information and services. The measure is designed to help the federal government take fuller advantage of the Internet and use information technology to maximize efficiency.

Bush pledged in July 2001 to introduce a more corporate management style to government by focusing on results and ease of use for its "customers." Increasing the use of electronic government is one piece of that agenda. Sponsored by Rep. Jim Turner, D-Texas, and Sen. Joe Lieberman, D-Conn., the legislation establishes a new Office of E-government within the White House's Office of Management and Budget to oversee government-wide efforts. It authorizes funding that increases from $45 million in the 2003 budget year, which began Oct. 1, to $150 million in 2006. OMB already essentially performs the functions of the new office, but the legislation codifies the position in law and makes it permanent. Recent improvements include redesigns of one-stop federal Internet portals, including an online gateway for Americans to federal services and information, a site that provides links to 1,900 federal parks, and a site where people can search for the benefits to which they are entitled. Starting with the 2003 tax year, some Americans will be able not only to file but also to prepare their tax forms online free of charge. Other online improvements in the works include a Web site making it easier to participate in the federal rulemaking process and an electronic payroll system that will consolidate 22 payroll service providers into four. The measure also requires regulatory agencies to conduct administrative rule-makings on the Internet and federal courts to post information and opinions on their Web sites; provides for temporary exchanges of information technology workers between the private sector and government; and authorizes "share-in-savings" contracts, in which contractors provide upfront technology and are paid out of some of the savings they reap for their federal agency customers.



The Office of Management and Budget did not collect complete business case information before selecting its 24 e-government initiatives, according to a General Accounting Office study released Dec. 19 by Senate Governmental Affairs Committee chairman Joseph Lieberman, D-Conn. For example, despite the importance that OMB attached to collaboration and customer focus in its e-government strategy, fewer than half of the initiatives' initial business cases addressed these topics, the study found. The initiatives include electronic travel and payroll systems, a government-wide portal for benefits eligibility information, and a portal for business compliance information. Lieberman had asked GAO to review information related to the selection and implementation of each e-government initiative. GAO staff reviewed best practices for preparing information technology business cases developed by leading government, academic, and private sector organizations, and compared the initial business cases used in the selection of the 24 initiatives with these best practices. GAO staff also compared the information in the May 2002 work plans and funding plans with identified best practices from GAO and OMB guidance on IT project management and oversight. The GAO analysis found that OMB did not have all the information it needed to fully monitor the progress and development of the initiatives. For example, only nine of the initiatives identified a strategy for obtaining needed funds. Also, the accuracy of the estimated costs in the funding plans may be questionable, the report said. GAO found that since May 2002, estimated costs for 12 of the initiatives have changed significantly, by more than 30 percent. Without accurate cost, schedule, and performance information, OMB cannot ensure that its e-government initiatives are on schedule and achieving their goals of providing value to customers and improving government efficiency, the GAO said.



Civic journalism was an early force behind interactive news, with innovative projects such as clickable maps and games being sponsored by the Pew Center for Civic Journalism. Earlier this fall, the Pew Center began preparing for the end of its 10-year charter by creating a new entity—called J-Lab, The Institute for Interactive Journalism—to carry on its mission of enhancing the way people connect with the news. Headquartered at the University of Maryland's Philip Merrill College of Journalism, J-Lab will solicit ideas for making news interactive, such as state budget calculators or environmental-choice games, and then work with university computer scientists and newsroom experts to bring these ideas to life. "Fewer people these days are reading narrative news stories," said Jan Schaffer, J-Lab's executive director. "Narrative stories won't, and shouldn't, go away, but we can't risk not informing people who won't read them. The challenge for interactive journalism is to create less noise and more meaningful interaction that informs and engages people about important issues." She thinks that not only the Internet, but also videoconferencing, Web cams and public kiosks, could help meet this challenge. But for now, consumers can practice self-empowerment by combining information from sources like newspapers, drive-time radio news and the Internet.  NEWS-ON-NEWS/The Ifra Trend Report: No. 176 (18 December 2002)   J-Lab, The Institute for Interactive Journalism 



Attorney Jonathan Band calls attention to a post-9/11 policy conflict that has received little public attention: the growing conflict between cyber-security and intellectual property. For several years, the entertainment industry has argued that the Internet in general and peer-to-peer networks in particular enable intellectual property infringement on an unprecedented scale. Industry representatives claim that this infringement cuts their profits and diminishes their incentive to invest in new products.

Accordingly, the entertainment industry has lobbied Congress to adopt a variety of measures aimed at facilitating the enforcement of intellectual property rights. Unfortunately, Band points out, these measures have the unintended consequence of undermining cyber-security. For example, in 1998 Congress passed the Digital Millennium Copyright Act. One provision of the DMCA prohibits the circumvention of technological measures that protect access to copyrighted works. The provision's intent was to impose legal penalties on hackers who penetrated the encryption and other technological measures copyright owners would use to protect their works in the digital environment.  In the four years since the DMCA's enactment, it has become increasingly clear that the law prohibits the research and testing necessary to develop new cyber-security products, despite the two narrow exceptions for encryption research and security testing in the DMCA. Computer science professors have found themselves entangled in litigation because of their academic activities, and universities and software companies have had to include attorneys in the research and development process to ensure compliance with the DMCA's arcane terms. In this way, the DMCA has hindered the development of technologies that can protect computer networks from cyber-attacks. Unlike the conflict between security and privacy, the conflict between cyber-security and intellectual property is completely avoidable. Copyright owners have numerous means at their disposal for protecting their intellectual property without compromising cyber-security. These include litigation, strong encryption, "spoofing" (posting corrupt music files on the Internet), and the development of new business models that discourage infringement.
These means might be more expensive than those permitted under the DMCA or proposed legislation, but Band contends that the cost to society of cyber-attacks that cripple our critical information infrastructure will be far greater.



The National Science Board has released its draft report on Science and Engineering Infrastructure for the 21st Century and invites public comment on the draft until January 9.  Recent concepts of infrastructure are expanding to include distributed systems of hardware, software, information bases, and automated aids for data analysis and interpretation. Enabled by information technology, a qualitatively different and new S&E infrastructure has evolved, delivering greater computational power, increased access, distribution and shared-use, and new research tools, such as data analysis and interpretation aids, web-accessible databases, archives, and collaboratories. Many viable research questions can be answered only through the use of new generations of these powerful tools.



More than any other country, the U.S. government has used the web to make a wealth of information available to its citizens. But we are now discovering that the dark side of web-based information is the ease with which it can be deleted. Government-sponsored information and research is disappearing from government web sites, much of it in the name of national security. Airport safety data vanished, and chemical plant risk-management plans were deleted from the Environmental Protection Agency's web site. The Department of Energy removed environmental impact statements that alerted local communities to potential dangers from nearby nuclear energy plants, as well as information on the transportation of hazardous materials. The U.S. Geological Survey asked depository libraries to destroy a CD-ROM database on surface water (as a result, University of Michigan researchers lost access to information vital to their three-year study of hazardous waste facilities, and community activists could no longer access data on chemical plants that violate pollution laws). According to the American Library Association, the Department of Energy has removed 9,000 scientific research papers that contain keywords such as "nuclear" or "chemical" and "storage" from national laboratory web sites and is reviewing them to see if they pose security risks. The Defense Technical Information Center has removed thousands of documents. But other information that has no relationship to security issues is also vanishing. The Centers for Disease Control removed reports from its web site on the effectiveness of condoms in AIDS prevention and on effective programs for the prevention of tobacco use, pregnancy, and sexually transmitted diseases among young people. The National Cancer Institute removed a report debunking the claim that abortions increase the risk of breast cancer, and the Department of Education is, it says, "reevaluating" hundreds of research reports available on its web site.
Furthermore, state governments are also removing data from public access. The problem is that the previous presumption, that publicly funded information is the rightful property of the public until proven otherwise, has been replaced by the presumption that the public has to prove to a suspicious government that it deserves the information. Gary Bass, of OMB Watch, a private group which monitors government spending and legislation, says "We are moving from a right-to-know to a need-to-know society." More information will presumably disappear when some government agencies cease to exist as their functions are folded into the new Department of Homeland Security. Among the agencies slated for extinction is the US Immigration and Naturalization Service. Will anybody in the reconstituted agency preserve the documents on their web pages? If not, will the University of North Texas librarians who operate a "CyberCemetery" of the documents of defunct government agencies preserve them? Who is keeping track of deleted data? As you would expect, government document librarians are monitoring the situation closely; information on deletions and other threats to public information is available on the Government Documents Round Table web site, and the Round Table has also created a Task Force on Permanent Public Access to Government Information.

University of North Texas "CyberCemetery"

OMB Watch monitors the deletion of government web pages.




College groups are again asking the U.S. Copyright Office to allow scholars to bypass technological devices that restrict electronic access to copyrighted works. In a letter to the Copyright Office, the groups say that a section of the Digital Millennium Copyright Act, known as the "anti-circumvention provision," needs to be revised to permit "fair use" of copyrighted material for research and teaching. Researchers and scholars maintain that they must be able to bypass the access-control devices and view digital texts and images without fear of breaking the law. The groups note that academic users have long been able to view nonelectronic copyrighted material under existing fair-use provisions of copyright law. The Association of American Universities wrote the letter on behalf of the American Council on Education and the National Association of State Universities and Land-Grant Colleges. The groups' comments are part of a process in the digital-copyright law that requires the Copyright Office to recommend exceptions to the anti-circumvention provision to the Library of Congress every three years. The head of the Library of Congress oversees the Copyright Office. The groups made a similar but unsuccessful appeal two years ago. But this year, in a departure, the groups are faulting the standards the Copyright Office uses to determine whether exceptions to the anti-circumvention provision should be granted. Library groups made a similar point in a letter to the Copyright Office. They said the office had "set the bar for relief at an unreasonable and unrealistic level." Arnold P. Lutzker, a Washington lawyer, wrote the letter on behalf of the American Association of Law Libraries, the American Library Association, the Association of Research Libraries, the Medical Library Association, and the Special Libraries Association.



The Bush administration is planning to propose requiring Internet service providers to help build a centralized system to enable broad monitoring of the Internet and, potentially, surveillance of its users. The proposal is part of a final version of a report, The National Strategy to Secure Cyberspace, set for release early next year, according to several people who have been briefed on the report. It is a component of the effort to increase national security after the Sept. 11 attacks. The President's Critical Infrastructure Protection Board is preparing the report, which is intended to foster public and private cooperation to regulate and defend the national computer networks, not only from everyday hazards like viruses but also from terrorist attack. Ultimately the report is intended to provide an Internet strategy for the new Department of Homeland Security. Such a proposal, which would be subject to Congressional and regulatory approval, would be a technical challenge because the Internet has thousands of independent service providers, from garage operations to giant corporations like America Online, AT&T, Microsoft, and WorldCom. The report does not detail specific operational requirements, locations for the centralized system, or costs, people who were briefed on the document said. While the proposal is meant to gauge the overall state of the worldwide network, some officials of Internet companies who have been briefed on it say they worry that such a system could be used to cross the indistinct border between broad monitoring and wiretapping. The government report was first released in draft form in September; that draft described the monitoring center but suggested it would likely be controlled by industry. The current draft sets the stage for the government to have a leadership role.



The Pentagon has released a study that recommends the government pursue specific technologies as potential safeguards against the misuse of data-mining systems similar to those now being considered by the government to track civilian activities electronically in the United States and abroad. The study, Security and Privacy, was commissioned in late 2001 before the advent of the Pentagon's Total Information Awareness system. The study was conducted by a group of civilian and military researchers, the Information Sciences and Technologies Study Group, or ISAT, which meets annually to review technology problems. The study concludes that technologies can be adapted to permit surveillance while minimizing exposure of individual information. Those technologies include automated tracing of access to database records; the ability to hide individual identification while conducting searches of databases with millions of records; and the ability to segregate databases and to block access to people without authorization.    Report at

Outlook 2003: Issues In The Information Marketplace



Outsell’s annual year-end Briefing looks forward to 2003. This year’s MetaIssue is “Incoming Asteroids!” Outsell sees a period of unprecedented change coming in 2003, with the Information Content (IC) industry about to be hit with a number of changes that will seem to drop from the sky. Among the asteroids coming our way: technology companies moving into the content space; the increasingly invisible content buyer; and a whole new generation of users with different ideas about information. The Briefing provides a look at what’s coming at us beyond the tall grass of the day-to-day challenges.


A study done by the Pew Internet & American Life Project finds that most Americans who do not use the Internet still have high expectations for getting information online, and those online have even greater expectations.  According to the telephone-based survey, 64 percent of nonusers expect to be able to find information in at least one of the following four categories: health care, government, news, and shopping. In fact, 16 percent of the nonusers say they would turn to the Internet first the next time they need health care and government information.  Meanwhile, 97 percent of Americans who use the Internet expect to find information in one or more of the categories. Overall, 84 percent of all Americans hold such expectations. At least 70 percent of the people who have sought information in each category say they usually find what they are seeking. The greatest successes are in news and shopping; government ranks the lowest. Expectations are lower when it comes to information about other people. Only 31 percent of Americans believe they could find reliable information on the Internet about someone else. However, 58 percent expect to be able to reach someone through e-mail. The study was based on phone calls made randomly Sept. 9 to Oct. 6 to 2,092 adults, including 1,318 Internet users. The margin of error was 2 percentage points for the general sample, 3 percentage points for the sample of only users, and 4 percentage points for nonusers.



It’s been an active year on the scholarly communications front.  Peter Suber of Earlham College, author of the Free Online Scholarship (FOS) Newsletter, offers these highlights:

·         January 1, 2002. BioMed Central started charging processing fees to cover the costs of free online access.

·         January 31, 2002. HINARI started delivering free online content.

·         February 6, 2002. The International Scholarly Communications Alliance (ISCA) launched.

·         February 14, 2002. Budapest Open Access Initiative (BOAI) launched by the Open Society Institute.

·         February 25, 2002. OAIster launched by the University of Michigan Libraries Digital Library Production Services.

·         April 2002. The Association of College and Research Libraries (ACRL) launched its scholarly communication initiative.

·         May 16, 2002. Creative Commons launched by Lawrence Lessig.

·         May 26, 2002. The FOS News blog launched by Peter Suber.

·         July 1, 2002. BioMed Central launched its Open Access Charter, assuring open access to its journal contents for the long-term, even after any future changes of ownership.

·         July 1, 2002. Eprints software affiliated with GNU, underlining its commitment to remain free and open source.

·         July 1, 2002. Ingenta announced its plan to create a commercial version of the eprints software and offer OAI eprint services.

·         July 11, 2002. Citebase launched by Southampton University.

·         August 1, 2002. Eprints-UK launched by JISC-FAIR.

·         August 1, 2002. Project RoMEO (Rights MEtadata for Open archiving) launched by JISC-FAIR.

·         August 1, 2002. Project SHERPA (Securing a Hybrid Environment for Research Preservation and Access) launched by JISC-FAIR.

·         August 1, 2002. Project TARDIS (Targeting Academic Research for Deposit and Disclosure) launched by JISC-FAIR.

·         October 31, 2002. DARE launched by the Dutch government.

·         November 4, 2002. DSpace officially launched by MIT.

·         November 4, 2002. PubSCIENCE was discontinued by the U.S. federal government in response to lobbying by commercial publishers.

·         November 6, 2002. Bonn statement issued by the German university rectors.

·         December 17, 2002. The Public Library of Science received a $9 million grant from the Moore Foundation for open-access publishing and announced its first two open-access journals.

This scholarly communications timeline is also available online at