Issue No. 56
December 9, 2003
Paula Kaufman, University Librarian



Taylor & Francis Group PLC, a specialist publisher of scientific, academic and professional books and journals, said it has agreed to acquire, subject to regulatory clearances, the business and publishing assets of the Dekker group of companies, a US-based scientific, technical and medical ("STM") publisher, from the family-owned Marcel Dekker Inc for $138.6 million. The acquisition is expected to be earnings-enhancing in the first year of ownership. The consideration will be made up of $122 million in cash, a loan note of $1.6 million and a further contingent cash payment at closing estimated to be in the region of $15 million. Taylor & Francis said the cash consideration will be paid from group cash resources and banking facilities. Dekker is a leading US-based STM publisher of journals, reference books, textbooks and encyclopedias in the specialist areas of science, engineering and medicine. Dekker's sales for the year ended Dec 31, 2002 were $42.0 million, producing an operating profit before exceptional items and shareholders' costs of $5.1 million and profit before tax of $3.2 million. The nature of Dekker's US dollar revenues and the US dollar purchase consideration will further reduce the enlarged group's cash-flow-based exchange exposure to the US dollar. The group said the high-quality Dekker titles will enhance Taylor & Francis' existing scientific, engineering and medical portfolios and have good growth potential as part of a larger focused publishing group. Taylor & Francis said its strategy remains to grow the business through a combination of organic growth and earnings-enhancing acquisitions.  Peter Scott’s Library Blog 11/18/03



The November 6 issue of Nature ran two stories on legal troubles that might plague arXiv, the most heavily used preprint archive in any field of science and scholarship.

(1)    The first (“Defamation online”) is an unsigned editorial noting that an article by CERN physicist Alvaro De Rújula, posted to arXiv on October 27, accuses Martin Rees, Britain's astronomer royal, of "claiming credit for other researchers' ideas". In the US, this would only be libelous if false, but in the UK it might be libelous even if true. Moreover, in the UK both the archive and the author might be liable, even though the archive does not read or approve the articles it hosts. In the OA movement, we usually distinguish archive deposits from true "publications," but any kind of exposure to third parties counts as "publication" for the purposes of defamation law. The editorial is accompanied by a short note by Jim Giles (Critical comments threaten to open libel floodgate for physics archive) reporting that arXiv founder Paul Ginsparg would remove a defamatory paper from the archive if advised by a lawyer to do so. Quoting Ginsparg: "ArXiv is just a mindless redistribution system. It's not implemented to be a global police force to detect or enforce professional ethics."

(2)    The second (“Preprint server seeks way to halt plagiarists” by Jim Giles) reports that 22 papers by Ramy Naboulsi were recently removed from the archive when some were discovered to be plagiarized. They were removed by the colleague who had submitted them to arXiv under the good-faith but mistaken belief that they were original. However, Paul Ginsparg is exploring ways to block plagiarized articles in the future by using software to measure the similarity of new submissions to articles already in the collection. Open Access News 11/10/03
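The article does not describe how Ginsparg's screening software would actually work; a common technique for flagging near-duplicate text is shingle-based Jaccard similarity, sketched below. The function names and the 0.3 threshold are illustrative assumptions, not arXiv's implementation.

```python
def shingles(text, k=5):
    """Split text into overlapping k-word shingles (lowercased)."""
    words = text.lower().split()
    return {" ".join(words[i:i + k]) for i in range(max(1, len(words) - k + 1))}

def jaccard_similarity(a, b, k=5):
    """Jaccard similarity of two documents' shingle sets: |A & B| / |A | B|."""
    sa, sb = shingles(a, k), shingles(b, k)
    if not sa or not sb:
        return 0.0
    return len(sa & sb) / len(sa | sb)

def flag_plagiarism(new_paper, archive, threshold=0.3):
    """Return IDs of archived papers whose overlap with a new submission
    exceeds the threshold; identical texts score 1.0, disjoint texts 0.0."""
    return [doc_id for doc_id, text in archive.items()
            if jaccard_similarity(new_paper, text) > threshold]
```

In practice a collection the size of arXiv would need an index (e.g., minhashing) rather than pairwise comparison, but the similarity measure is the same idea.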




The November 13 issue of The Economist contains an unsigned story, “Perishing publishing,” on the possible defamation in a preprint on deposit at arXiv. "On the face of things, pre-printing is a good idea. It exposes a paper to wider scrutiny than the old system did, which should improve its accuracy—as happened in this case. But it also suggests that the price of getting one's ideas into the public domain rapidly is a need to keep them continuously revised in order to avoid criticism, however moderately or immoderately expressed. Like the Red Queen, in 'Through the looking glass', today's physicists need to rush faster and faster merely to stay in the same place."  Open Access News 11/17/03



Steven Dickman writes on the challenges of searching the scientific literature in PLoS Biology, November 17, 2003. "Although the march toward better text-mining systems is building momentum, access could stop it in its tracks. Experts in text-searching uniformly cite access as a key obstacle for developing better search tools. 'Access is a bigger problem than algorithms' is how one machine-learning expert puts it, and a half-dozen others agreed."  Open Access News 11/17/03



Librarians have been concerned for decades about the rising costs of academic publications, sometimes referred to as the 'serials pricing crisis'. Scholarly journal prices have been rising faster than inflation, and faster than library budgets, for more than thirty years. The transition to electronic access should have brought relief for librarians - but instead they are now embroiled in lengthy negotiations with publishers who are demanding high prices for electronic site licenses. Open Access Now talked to Beverlee French about her challenging job as the Director for Shared Digital Collections at the California Digital Library.



Cornell University Library has posted a list of about two hundred Elsevier journal titles it is canceling for 2004. Harvard University says it is preparing for similar cuts in its Elsevier subscriptions. The University of California continues its negotiations with the Dutch publisher of scholarly scientific journals on behalf of all the UC campuses, while faculty on some campuses have resolved to boycott Elsevier if reasonable rates cannot be negotiated. Other universities and library consortia around the country are also in the throes of assessing what they can afford and what they will have to cancel due to price increases and budget constraints. ... Many faculty scholars at numerous universities have already embraced alternative scholarly publishing and open-access models, such as BioMed Central, Public Library of Science (PLoS), and others. SPARC (the Scholarly Publishing and Academic Resources Coalition) recently announced a partnership with PLoS 'to broaden support for open-access publishing among researchers, funding agencies, societies, libraries, and academic institutions through cooperative educational and advocacy activities.'  Open Access News 11/17/03



Both the faculty and staff senates at North Carolina State University have approved a tough resolution opposing the practice of bundling content and essentially authorizing the library not to renew its bundled deal with industry-leading STM publisher Elsevier. Like a number of research libraries, NCSU is currently negotiating a renewal of its Elsevier package, which expires on December 31. The resolution blasts the practice of bundling journals and explicitly charges the NCSU libraries with maintaining "strong and flexible control over the state funds entrusted to it." NCSU Head of Collection Management Suzanne Weiner told the LJ Academic Newswire that NCSU's current Elsevier deal, negotiated through the Triangle Research Library Network, costs the library roughly $1.4 million annually. That translates into roughly 15 percent of NCSU's $9.2 million collections budget. Some 38 percent of the libraries' serials budget goes to Elsevier, representing 11 percent of NCSU's journals. As at Harvard and Cornell, Weiner said, NCSU's major issue was the inflexibility of the bundled deal. Perhaps the most striking element of NCSU's resistance is the level of faculty engagement. Weiner said the NCSU library committee, composed of faculty, staff and students, invited NCSU librarian Susan Nutter to make a presentation about the Elsevier negotiations—the result of which was the resolution. The resolution affirms the libraries' ability to "decline highly restrictive offers such as those proposed by Reed Elsevier for its ScienceDirect online product." The resolution passed unanimously, so if the library refuses a bundled offer, Weiner says, it will do so with full support of the university community. Still, Weiner stressed that NCSU has not made a decision to renew or to cancel, and is continuing to negotiate with Elsevier.  Library Journal Academic News Wire: December 04, 2003   For information on the UIUC/UIC/UIS Elsevier negotiations see



Bobby Pickering, Elsevier hits back at journal cuts, Information World Review, December 4, 2003. (Thanks to Gary Price.) Elsevier tries to put a good face on the wave of cancellations, saying that most negotiations with subscribers are going well and the cancellations are about removing duplicates and shifting from print to electronic.  Open Access News 12/04/03   



Academic librarians across the country have long complained that the bundled subscription packages from large scholarly publishers and database aggregators force them to subscribe to journals they don’t want in order to get good prices on ones they do. Smaller publishers, such as those from scholarly societies, have often suffered as library budgets are absorbed by payments to large publishers. Now HighWire Press, the librarian-led journal aggregator from Stanford University, has launched a new subscription program called Shop for Journals. Initiated by a group of scholarly society publishers participating in HighWire, the new pricing/subscription model offers an alternative to the “Big Deal” packages and allows librarians to create their own packages using tiered pricing tied to library type. At launch, the service encompassed 57 journal titles from 25 scholarly society publishers, but a HighWire Press representative said that currently 50 publishers are watching the program closely and 10 are expected to join the program by summer 2004. Information Today 12/08/03



Two years after its launch, BioOne representatives report that the electronic journal database now contains 68 scholarly journals and one electronic book published by 56 scientific societies and other related organizations. More than 400 libraries around the world (mostly at colleges and universities) provide BioOne access to some 3.5 million scholars, students, researchers and other practitioners. The goal of the e-journal aggregation was to develop an academy-based alternative for the electronic publishing of journals by scholarly societies (focused on biological, ecological and environmental sciences) that lacked the financial and technical resources to become electronic publishers. The founding partners—including the American Institute of Biological Sciences, the Scholarly Publishing and Academic Resources Coalition (SPARC), Allen Press, the University of Kansas, and the Greater Western Library Alliance—also wanted to address the continuing need for academic libraries to acquire high-quality scientific literature at a reasonable cost. With a business model that puts as much subscription revenue as possible directly into the coffers of the societies that publish through BioOne, member journals received an average of about $8,500 in 2002. (C&RL News, Nov 2003)  ShelfLife, 11/13/03



George Plosker says that he and his fellow panelists at the Special Libraries Association conference "became increasingly concerned that professionals and researchers sincerely believe that searching the Open Web, particularly Google, is 'good enough.' Groups with degrees from excellent schools, Ph.D.s in environments that included technical R&D, and even biomedical and pharmaceutical professionals were using Google, not recognizing the significant differences in authority and quality between the Open Web and premium subscription content typically provided by the information centers/libraries." Google now gets 250 million search requests a day, and Searcher editor Barbara Quint says it is getting more searches in three days than all libraries combined globally get in one year. An amazing service! Plosker and his colleagues conclude that "what remains to be done is to inform and educate users that there is more to the content world than the Open Web." At the same time, the profession "can no longer resist these influences on the content environment, nor maintain dated points of view. What we learned in library school is not enough. We cannot sit at the reference desk and proclaim, 'We only support premium content databases.'" (InfoToday 3 Nov 2003)   ShelfLife 11/13/03

For information about the UIUC Library’s Open Archives Initiative Metadata Harvesting Project see




For a fee, thanks to a newly created tie-up with the Edgar Online database, users of Microsoft's Office 2003 will be able to graph, chart and compare financial results from income and cash flow statements, as well as balance sheets, for any of more than 10,000 companies that file with the U.S. Securities and Exchange Commission. Microsoft is connecting its popular Excel spreadsheet to the Edgar Online database of U.S. financial reports so that company analysts will be able to pipe in data currently entered by hand, thus saving the estimated hundreds of millions of dollars now spent retyping data from one application to another. The data, which go back five years, will be delivered in XBRL, a version of XML designed for business reporting. Microsoft is betting that XML will become the Internet's dominant language for computer-to-computer communications.  (Reuters 5 Nov 2003) ShelfLife 11/13/03
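Because XBRL is just XML with tagged financial facts, each figure can be pulled into a spreadsheet or program without retyping. The fragment below is a deliberately simplified, hypothetical illustration; real XBRL instance documents use registered taxonomies, namespaces, and context elements far richer than this.

```python
import xml.etree.ElementTree as ET

# Hypothetical, simplified XBRL-like fragment (real filings are much richer).
SAMPLE = """
<xbrl>
  <Revenues contextRef="FY2002" unitRef="USD">42000000</Revenues>
  <OperatingIncome contextRef="FY2002" unitRef="USD">5100000</OperatingIncome>
</xbrl>
"""

def extract_facts(xml_text):
    """Map each tagged financial fact to a number, keyed by element name,
    ready to feed into a spreadsheet row or a chart."""
    root = ET.fromstring(xml_text)
    return {child.tag: float(child.text) for child in root}
```

The point of the standard is exactly this machine-readability: once every filer tags "Revenues" the same way, one parser handles 10,000 companies.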



Five Scenarios for Digital Media in a Post-Napster World, a joint effort between GartnerG2 and the Berkman Center for Internet & Society, presents five possible scenarios for copyright law applicable to digital media in the United States.

Descriptions of each scenario were developed by Professor William Fisher and John Palfrey of the Berkman Center at Harvard Law School. The outcomes of each scenario were developed by GartnerG2 as “first takes” on the implications that changes in copyright law will have on future business models and markets.  The intent of the paper is to spark reasoned discussion and debate that can assist in the development of new business models for the entertainment industry, artists and technology companies, while enabling consumers to legitimately acquire and manipulate copyrighted digital media. These scenarios were the basis for several working sessions at a September 2003 conference sponsored by the Berkman Center and GartnerG2. Feedback from those work sessions was compiled and posted on both the GartnerG2 and Berkman Web sites.

This document is a supplement to “Copyright and Digital Media in a Post-Napster World,” available on, and also at 



Microsoft has launched a beta version of a competitor to the popular Google News automated headline site—MSN Newsbot. In addition to an English-language UK site, MSN has launched versions for France, Italy and Spain. According to the site, Newsbot is an experimental, automated news service that gathers news from more than 4,000 sites. Unlike Google, the site says it offers personalization: "If you sign in to Passport, we can personalize the news site based on your interests: showing you news from sources you've chosen in the past, or suggesting stories based on your previous interests." The site was developed in partnership with Moreover Technologies.



Internet Filters and Public Libraries by David L. Sobel is a new First Report now available from the First Amendment Center.  Sobel, general counsel of the Electronic Privacy Information Center, examines the effects of the U.S. Supreme Court’s June 2003 ruling in U.S. v. American Library Association, which declared the Children’s Internet Protection Act (CIPA) constitutional. CIPA mandates that libraries accepting federal funds install filtering software to block access to material that is “obscene,” “child pornography” or “harmful to minors.”



HarperCollins is introducing the literary equivalent of the extras that DVD buyers get - unseen footage, and so on - when it brings out its Harper Perennial line of upmarket paperbacks next year. Harper Perennial will add sections called "P.S.", containing interviews with the author, critical pieces and suggestions for further reading. The first titles, due in the spring, include Jane Dunn's Elizabeth and Mary, Katie Hickman's Courtesans, Ann-Marie MacDonald's The Way the Crow Flies, and Douglas Coupland's Hey Nostradamus!  The Guardian 11/15/03,6109,1085297,00.html



Further evidence that we need to revise the notion that libraries are places where we only borrow books appears in the statistics for U.K. library expenditure in 2001-02. Some promotions of literacy and reading take place in libraries, which have also developed the "People's Network" of online resources; spending on the sector went up for the fourth successive year. But less than 10p in the pound of that expenditure went for books, according to the Library and Information Statistics Unit. Libraries hold 18m fewer items of stock than they did 10 years ago. Library loans in that period have fallen by a third.  The Guardian 11/15/03,6109,1085297,00.html



Fair use can raise some of the most difficult and controversial issues in copyright law. Users often wish there were clear rules to establish exactly when a use is fair and when it is not. But the ambiguity of the fair use doctrine is also its strength, because it allows courts to apply fair use to new and sometimes completely unanticipated uses of copyrighted works, argues June Besek. Copyright owners have certain exclusive rights, in particular the right to copy their work and to create adaptations based on the original. Those rights are not absolute, however. The law provides exceptions to permit various uses without the copyright owner's consent. Perhaps the best known is fair use, a flexible exception that allows reasonable uses that will not unduly harm the market for the original copyrighted work. What makes a use "fair"? There is no simple formula. EDUCAUSE Review, Nov/Dec 2003  Read more at



A new report released by Shore Communications Inc. paints an optimistic outlook for the emerging ebook industry, which is already growing at exponential rates in year-over-year sales figures and seeing stronger usage in both traditional and non-conventional outlets for book-based content. According to the report, the consolidation and standardization of ebook platforms, formats and rights management, their broadening availability via consumer, library and professional outlets, and improved accessibility via search technologies are creating acceptance and use that can be expected to grow significantly in the near term. However, major challenges still confront the growth of ebook demand.  The Write News 11/14/03



The first leg of an ultra-high-performance network went live recently in what its backers call the most important networking experiment since Arpanet, the military network that laid the foundation for the Internet. The National LambdaRail is the biggest, fastest network ever undertaken for scientific research. Created by a private consortium of universities and tech companies, the NLR will link hundreds of research institutions around the United States with a dedicated, high-speed optical network.

The first leg links Chicago's TeraGrid facility and the Pittsburgh Supercomputing Center. The remainder of the network will be up and running by the end of 2004.,1377,61102,00.html



PubMed Central has launched an About Open Access page drawing attention to the journals that provide open access to their contents through PMC. The page also announces an important new policy: "[I]n October 2003, PMC began accepting individual open access articles from journals that do not participate in PMC on a routine basis. For the specific conditions under which PMC accepts these articles, see the relevant PMC agreement (in Microsoft Word format)." The offer is open to all authors in the life sciences willing to release their work to "open access" as defined by the Bethesda Statement on Open Access Publishing.  Open Access News 11/12/03 (Note, your editor is a member of the PubMedCentral Advisory Committee)



We have more information at our fingertips than any generation before us, yet there is little evidence that our ability to make good decisions has improved correspondingly. Instead, many people find it increasingly difficult to separate good information from bad. The goal of improving information literacy is one that a number of countries have established, and broad-based information literacy training will certainly help. But one researcher has found that personality plays a big role in the development of information literacy. In her study last year, Fast Surfers, Broad Scanners and Deep Divers, on how personality affects our ability to find and absorb information, information literacy researcher Jannica Heinstrom found that "personality and approach to studying influence (our) information-seeking habits." She finds that the neurotic, easily distracted and lazy "Fast Surfers" have difficulty formulating searches and then interpreting what they find. "Deep Divers" are identified by their willingness to consider viewpoints and link ideas. Strategic thinkers, or "Broad Scanners," are conscientious and have clear goals. The last two categories are highly motivated, Heinstrom says, and often able to find the information they seek, although for different reasons. While discipline and education can improve the way information sources are used once located, Heinstrom concludes that "personality (will) create boundaries and unique possibilities for the way information seeking is executed." (The Age 11 Nov 2003)

ShelfLife, No. 133 (November 20 2003)

(Information about the Library’s Information Literacy Services and Instruction can be found at



Simson Garfinkel argues that the hand-wringing about obsolete formats is misguided. The digital files we create today will be around for a very, very long time, he says.  It is simply inconceivable that documents created today in Adobe’s Portable Document Format (PDF), or images stored in the Joint Photographic Expert Group (JPEG) format, won’t be decipherable on computers in the year 2030. That’s because both the PDF and the JPEG formats are well-defined and widely understood. Adobe has lost control of PDF: there are more than a dozen programs that can create PDFs and display them on a wide range of computers. In other words, PDF is no longer a proprietary format. The same goes for JPEG. Yes, Adobe may fail and new 3D cameras may make two-dimensional photography obsolete. But we will always be able to read files in these formats, because the detailed technical knowledge of how to do so is widely distributed throughout society.



A US federal judge has approved a settlement of a lawsuit filed on behalf of millions of record club members who alleged that major record companies and large music retailers had overcharged them in a price-fixing conspiracy. Under the settlement, the CD buyers will receive vouchers to give them 75 percent discounts for new compact discs, which they will receive with no shipping or handling charges.  BNA's Internet Law News (ILN) - 12/8/03



Several recent global agreements among developed and undeveloped countries include measures about digital rights management (DRM) technologies that challenge the balance between rights of users and "creators" (content owners as well as authors), says Christopher May, a Reader in International Political Economy at the University of the West of England. "It seems at least possible that DRM may consolidate (or even worsen) the wealth effects disrupting the distribution of information and knowledge across the so-called 'digital divide.' This is manifest most generally in the collapse of the generally accepted social norms of content usage," he says. On one hand, users employ technology they have legally purchased to "infringe" the rights of content owners, while owners have sought to (re)establish robust control over their knowledge assets, he says. In order to reestablish an equitable balance, "A new politics of the knowledge commons needs to weigh in on the side of the public domain, to balance the well-established and powerful interests which have been mobilized, not so much by authors as by the content industries, to protect, advance and expand their commercial rights to profit from the exploitation of content. Any re-balancing will require political action," says May. (First Monday Nov 2003) ShelfLife, No. 135 (December 4 2003)



A recent study by the Pew Internet & American Life Project indicates that while technology increasingly has become interwoven into daily activities, there are very different patterns of online information consumption among age and economic cohorts. The three groups most likely to subscribe to online content are the Young Tech Elites (average age 22), the Older Wired Baby Boomers (average age 52) and the Wired Generation Xers (average age 36). The study also identified Wired Senior Men (average age 70) as "ardent, aging news hounds." This affluent group is relatively small, but its members are frequent news gatherers online, particularly of political news. Among the other three "heavy user" groups, Older Wired Baby Boomers place the largest emphasis on information gathering, rather than some of the more avant-garde activities such as music downloading that dominate younger users' time. Seventy-two percent have done work-related research online, and 88% have gone online to get news, with about half doing so on any given day. This group tends to be dominated by males (60%), and have higher incomes and more education than the average American. Wired GenXers are split 50-50 on gender, and they see the online world as a way to get things done. Sixty-seven percent have done work-related research online and while they often go online for news, only about a third do so on a typical day. Thirteen percent of both GenXers and Young Tech Elites have paid for online content. And while Young Tech Elites are avid surfers, they are also the most likely to download information and post it online. In addition, 44% have created content for the Web, compared with 19% for all users.

("Consumption of Information Goods and Services in the United States" 23 Nov 2003) ShelfLife, No. 135 (December 4 2003)



More than 2 million U.S. children aged 6 to 17 currently have their own Web sites, according to a survey by Grunwald Associates, and based on respondents' answers, the market research service predicts that more than 6 million children could have their own personal sites by 2005. "Previous generations of kids wrote earnest poetry, or joined rock bands to express themselves. Today's kids do so by building personal Web sites. And they're hungry for tools to help them build better, more engaging sites, and stay in closer touch with friends," says Peter Grunwald, president of Grunwald Associates. And while almost half of the teenagers with home Internet access polled either had a Web site or planned to build one, a third of 6-8-year-olds expressed similar plans. The study also indicated that girls were more likely to have their own sites than boys (12.2% vs. 8.6%). Meanwhile, the proliferation of broadband connections at home has rendered internet access from school a disappointing experience for many. 76% of kids with home broadband connections reported their home connection was faster than the school's. Even a majority of children with dialup access at home (62%) perceive their home connections to be the same or faster than those at school. And though the connections may be slow, both parents and kids expressed rising levels of dissatisfaction with the amount of time allotted to their children for computer use at school, with nearly half (49%) of kids and 34% of parents saying their children were getting "too little time online" during the school day. An overview of the "Children, Families and the Internet" survey findings is available at (Grunwald Associates 4 Dec 2003) NewsScan Daily, 5 December 2003



A new study published in the journal Science looked at footnotes from scientific articles in three major journals (the New England Journal of Medicine, Science, and Nature) at 3 months, 15 months and 27 months after publication, and found that the prevalence of inactive Internet references grew during those intervals from 3.8% to 10% to 13%. In another recent study, one-fifth of the Internet addresses used in a Web-based high school science curriculum disappeared over 12 months, and a third study found that 40% to 50% of the URLs referenced in articles in two computing journals were inaccessible within four years. Brewster Kahle, widely admired for his creation of the Internet Archive project, says: "It's a huge problem. The average lifespan of a Web page today is 100 days. This is no way to run a culture." (Washington Post 24 Nov 2003) ShelfLife, No. 135 (December 4 2003)
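Measuring link rot of the kind these studies report is mechanically simple: probe each cited URL and tally the failures. The sketch below is a minimal assumption of how such a survey might be scripted (the studies' actual methodologies are not described here); `check_url` needs network access, while `inactive_fraction` is pure arithmetic.

```python
from urllib.request import Request, urlopen
from urllib.error import URLError, HTTPError

def check_url(url, timeout=10):
    """Return True if the URL still resolves to a successful response.
    A HEAD request keeps traffic light; any error counts as inactive."""
    try:
        req = Request(url, method="HEAD")
        with urlopen(req, timeout=timeout) as resp:
            return 200 <= resp.status < 400
    except (URLError, HTTPError, ValueError):
        return False

def inactive_fraction(results):
    """Share of checked references that no longer resolve, i.e. the
    prevalence figure the Science study reports at each interval."""
    return sum(1 for ok in results if not ok) / len(results)
```

Running such a probe at 3, 15 and 27 months against the same citation list yields exactly the 3.8% / 10% / 13% style of time series quoted above.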



Berklee College of Music, an independent music college, has announced the launch of Berklee Shares. This new program provides free music lessons and encourages musicians to share and distribute these music lessons online. The Berklee Shares lessons are available at no charge and are made up of a growing catalog of MP3s, QuickTime movies and PDF files derived from curriculum developed at the college by its world-renowned faculty. The lessons are available for free download on ( ), affiliate partner sites and peer-to-peer networks including Limewire and Kazaa.  The Write News Weekly 12-1-03


Blogger Henry Jenkins lauds a recent essay by novelist Umberto Eco about the importance of books in human culture. Eco argues against the idea that new media will kill off print culture, and lays out some of the ways that they are apt to interact. Emerging Technologies Wednesday Update (12.03.03)



The National Digital Information Infrastructure and Preservation Program (NDIIPP) at the U.S. Library of Congress has published It's About Time: Research Challenges in Digital Archiving and Long-term Preservation. The report presents findings from a joint Library of Congress/National Science Foundation workshop on research challenges in digital preservation.  Margaret Hedstrom (University of Michigan) chaired the workshop, which was attended by experts from government, academia, professional organizations, and the private sector.  As detailed in the published report, the workshop identified a number of priority areas for research into new models, methodologies, and tools for long-term preservation of digital material.  The report is available at



In a lively, written debate, the LANCET (owned by Elsevier) offers some expert industry perspectives on open access publishing. Writing skeptically about the prospects of open access, Wiley vice-president Brian Crawford argues that there is little to suggest that the model is sustainable—and that its reliance on author fees to cover publication costs could actually harm science. Crawford asserts that, if authors are required to pay for publication, the "filtering" process in STM publishing will necessarily skew toward the author's goal—publication—possibly weakening the effectiveness of scientific research. Countering Crawford, Pritpal Tamber, of open access publisher Biomed Central, argues that subscription-based models drain funds from research institutions and reduce access to crucial research. It may be too early to say whether open access is the way of the future. For academic libraries, however, something clearly must give. ARL research shows that research libraries in North America have seen serial prices soar 215 percent since 1990. During that same period, research libraries' serials expenditures more than doubled, but the number of titles purchased by large academic research libraries generally decreased. In an era when technology was supposed to aid in the distribution of information, such statistics continue to confound librarians and researchers. "Ten years ago everyone believed that electronic [journals] would be really cheap," Yale Associate University Librarian Ann Okerson recently told the YALE DAILY NEWS. "I think there's a fair amount of disillusionment or disappointment that moving to electronic has not brought down costs or prices." To access the LANCET debate, visit and scroll down to "Series" (registration required).  Library Journal Academic News Wire: December 02, 2003



Researchers and academic librarians may be increasingly disillusioned about the marketplace for e-journals, but the emerging open access movement in STM publishing may help change that, says Georgia Tech economist Mark McCabe. In a conversation with the LJ Academic Newswire, McCabe, an expert on the evolving STM marketplace, said that open access has made a strong first step toward success—and may offer the only "socially sensible" solution to reversing STM inflation. McCabe is currently in the early stages of an Open Society Institute-funded study that will analyze various open access models vs. subscription-based models. He said that open access can succeed in STM publishing because it restores a concept to the STM market that has diminished in recent years: competition. Under the open access model, journals charge authors a fee to cover the cost of publication, while access for users is free. This offers more market efficiency, he says, because it reintroduces faculty to the costs of disseminating research in their academic field. The current practice, in which libraries purchase site licenses, in contrast divorces faculty members from the costs of publishing. With author fees instead of subscriptions, McCabe says, journals will compete with each other for authors—and their fees. To do that, they can offer financial incentives such as lower (or waived) author fees, quicker peer review and publication, or the promise of more readers or prestige.  Library Journal Academic News Wire: December 02, 2003



The recording industry this week claimed progress in a controversial legal campaign targeting individuals who use peer-to-peer networks, but its optimism appeared to clash with at least some of the evidence, which remains murky. By some measures, usage of peer-to-peer software such as Kazaa has been cut in half since the Recording Industry Association of America (RIAA) announced in late June that it would begin suing alleged file traders. The campaign to date has yielded 382 lawsuits and 220 settlements averaging close to $3,000 apiece. But by other measures, file swapping is hitting an all-time high.  Scholarly Electronic Publishing Weblog 12/08/03



The model of scholarly publishing can be reduced, in economic terms, to a Tragedy of the Commons, whereby the individual interests of publishers, libraries and scholars are in conflict with what is in the best interest of the public good. Serials inflation, price discrimination, and site-license pricing are all manifestations of this dysfunctional economic model. Moral arguments to change human behavior are not effective because they do not provide individual incentives. Technology alone is also not a viable solution since it fails to change the underlying human behavior that is driving the economic model. Abandoning the current system of publishing is both risky and costly. This paper argues for a reintermediation of the library as governor of the public scholarly commons, but illustrates that these solutions are in conflict with the mission of the library profession.  Scholarly Electronic Publishing Weblog 12/08/03



Internet Archive creator Brewster Kahle and Amazon founder Jeff Bezos have at least one thing in common: a desire to capture all human knowledge digitally. Kahle notes with regret, "For most students today, if something is not on the Net, it doesn't exist." So achieving "the Alexandrian fantasy"—an archive of all knowledge, accessible by everyone—will require scanning all those books written in the past. Is that really possible, and is it economical? Journalist Gary Wolf says the answer is yes, because the cost of scanning has been steadily dropping and is now as low as $1 a book. There are three ways of doing it: (1) tear off the spines and send the pages through a high-end scanner that costs about $25,000 and handles 90 black-and-white pages a minute, front and back; (2) ship the book overseas, to workers in India, China, and the Philippines who earn about 40 cents an hour to manually turn pages that are processed by $15,000 overhead scanners; or (3) hire a robot: Kirtas Technologies has introduced a bot that has both an overhead scanner and an automated page-turning arm. Dustin Goot explains: "While a book sits open in a special cradle, the arm swoops down, grabs the top page with gentle suction, and turns it." But he adds: "The machine boasts a speed of 1,200 pages an hour, but justifying its six-figure price to frugal librarians is tough." (Wired Dec 2003)  ShelfLife, No. 134 (November 26 2003)
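The throughput figures quoted above translate into a quick per-book timing comparison. A minimal sketch, assuming a hypothetical 300-page book; the scanner rates are the ones given in the item, and option 2 is omitted because no pages-per-hour rate for manual page turning is stated:

```python
PAGES_PER_BOOK = 300  # hypothetical average book length (not from the article)

# Option 1: sheet-fed scanner, 90 sheets per minute, scanned front and back,
# so each sheet yields two pages.
sheets = PAGES_PER_BOOK / 2
sheetfed_minutes = sheets / 90

# Option 3: Kirtas page-turning robot, 1,200 pages per hour.
robot_minutes = PAGES_PER_BOOK / 1200 * 60

print(f"sheet-fed scanner:  {sheetfed_minutes:.1f} min/book")  # ~1.7 min
print(f"page-turning robot: {robot_minutes:.1f} min/book")     # 15.0 min
```

On these rough numbers, destructive sheet-fed scanning is nearly an order of magnitude faster per book than the robot, which suggests why it comes closest to Wolf's $1-per-book figure while the robot trades speed for keeping the book intact.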



With the increasing demand for digital video in the educational and research communities, the Open Video Project aims to provide an easy-to-use open source digital video archive, while serving as a test bed for digital library research and development at the Interaction Design Laboratory. Currently the collection contains about 2,000 individual segments, in MPEG-1, MPEG-2, MPEG-4, and QuickTime formats, contributed by government agencies, universities, and individual collectors. Most of the 460 hours of video footage have been edited into shorter segments for faster downloading, and include metadata at three levels: entire video, segment, and frame. Various kinds of video representations, known as surrogates, have also been created to speed the process of selecting the desired clip. Surrogates range from a single thumbnail image to a storyboard displaying multiple key frames simultaneously, as well as slide-show and fast-forward types. There are also traditional text surrogates such as title, keywords, and descriptions. Each surrogate category is being evaluated to see how much users can understand from watching only these surrogates, as well as how they interact with textual and visual information. In the second phase, surrogates have been integrated to create 'Agile View' interfaces for users to explore video content. (Oct 2003)  ShelfLife, No. 134 (November 26 2003)
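The three metadata levels and the surrogate types described above amount to a simple containment hierarchy. A minimal sketch with entirely hypothetical field and class names (the project's actual schema is not given in the item):

```python
from dataclasses import dataclass, field

@dataclass
class Surrogate:
    # Kinds mentioned in the item: thumbnail, storyboard, slide show,
    # fast forward, and traditional text surrogates.
    kind: str
    uri: str

@dataclass
class Segment:
    title: str
    fmt: str  # e.g. "MPEG-1", "MPEG-2", "MPEG-4", "QuickTime"
    key_frames: list = field(default_factory=list)  # frame-level metadata
    surrogates: list = field(default_factory=list)

@dataclass
class Video:
    # Video-level metadata: the traditional text surrogates.
    title: str
    keywords: list
    description: str = ""
    segments: list = field(default_factory=list)

# Each downloadable segment carries its own metadata plus the surrogates
# a user can inspect before committing to a full download.
clip = Segment(title="Sample clip", fmt="MPEG-1",
               surrogates=[Surrogate(kind="storyboard", uri="sb.html")])
video = Video(title="Sample video", keywords=["demo"], segments=[clip])
```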



The U.S. Department of Energy is giving $4.5 million to Oak Ridge National Laboratory so the lab can develop the prototype of a computer network capable of transmitting massive amounts of science data. Called Science UltraNet, the network will operate at speeds of 10 to 40 gigabits per second. Nageswara Rao, one of the project directors, explains: "We're not trying to develop a new Internet. We're developing a high-speed network that uses routers and switches somewhat akin to phone companies to provide dedicated connections to accelerate scientific discoveries. In this case, however, the people using the network will be scientists who generate or use data or guide calculations remotely." (AP/USA Today 25 Nov 2003)  NewsScan Daily, 26 November 2003
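For a sense of what those line rates mean for scientists moving large datasets, an idealized transfer-time sketch (the 1-terabyte dataset is a hypothetical example, and protocol overhead is ignored):

```python
def transfer_seconds(terabytes: float, gbps: float) -> float:
    """Ideal, overhead-free transfer time for a dataset of the given size."""
    bits = terabytes * 1e12 * 8  # decimal terabytes -> bits
    return bits / (gbps * 1e9)   # gigabits per second -> bits per second

# Moving a hypothetical 1 TB result set over a dedicated connection:
print(transfer_seconds(1, 10))  # 800.0 seconds at the low end
print(transfer_seconds(1, 40))  # 200.0 seconds at the high end
```

Even at the low end, that is minutes rather than the hours or days a shared commodity link might take, which is the point of providing dedicated connections.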


The scholarly communications newsletter is also available online at