February 12, 2009 Meeting Minutes
Attending: Qiang Jin (chair); Atoma Batoma, Andy Bendel, Linde Brocato (minutes),
Chris Cook, Fang Gao, David Griffiths, Myung-Ja Han, Barbara Henigman, Susan Hill, Eleanor
Hornbacker, Robert Howerton, Anne Huber, Gail Hueting, Kevin McLaughlin, Michael Norman, Fung
Simpson, Diana Walters, Janet Weber.
The meeting was called to order at 9:37 by Qiang Jin.
The minutes of 13 November 2008 were approved.
ALA Midwinter 2009 reports:
- Qiang reported on the ALCTS Cataloging and Classification Section's discussion of RDA and its timeline:
  - June 2009: JSC finishes RDA and sends it to ALA Publishing
  - Third quarter 2009: RDA released
  - Third/fourth quarter 2009: National libraries test
  - First quarter 2010: National libraries decide whether to implement
  - Second/third quarter 2010: Training
  - Fourth quarter 2010: RDA implementation
- Qiang also reported on the discussion of evaluating FAST headings at ALA Midwinter. At the SAC Subcommittee on FAST meeting, it was noted that there are over 1,000,000 FAST headings, including personal names, corporate names, events, titles, periods, topicals, and geographics.
- The 31st ed. of LCSH will be available in the spring.
- Gail Hueting:
- OCLC news: OCLC planned to implement a new "record use policy" last November and then again in February, but has refrained because of extensive discussion on blogs, listservs, and the internet. It has created a "Review Board of Shared Data Creation and Stewardship" consisting of members of the consortium, not just the Board of Directors, since OCLC is a membership organization. No new policy will be implemented until Fall 2009.
- OCLC has conducted a major survey of end users and of librarians. What end users most want is more keyword-accessible subject information, while librarians want fewer duplicates and the ability to enhance records and fix errors. As a result, OCLC is running a pilot project, the Expert Community Experiment (February-August 2009), to expand who can change records and how they can change them.
- Atoma pointed out that RDA needs volunteers for testing, who will create records for materials in different formats using RDA tools. The testing will take six months: volunteers will spend the first three months becoming familiar with RDA, and the actual test will run during the remaining three. Afterwards, a report will be generated and the records will be made public. An advantage of participating is that equipment, the manual, and training will be provided free by RDA. The training is very technical. As of February, RDA is online for everybody to try.
Online Training Modules:
Gail Hueting and Fang Gao showed the CAM / Video Tutorials website, in particular "Understanding a Serials Bibliographic Record." Fang reported that two other video tutorials are in the works: (1) relinking new (and presumably better) bib records to the MFHD of MARCETTE records, which has proved to be somewhat complicated, and (2) creating a new MFHD, rather than just changing the present location in the existing MFHD, for serials when current issues stay in a library and back issues go to STOS (all of them go to STOS that way). The clarity and usefulness of the tutorials were applauded.
CARLI recommendation re: 440 vs. 490/830.
Gail Hueting reported that CARLI has suggested converting all 440 fields to 490/830, rather than our current policy (adopted at the November 2008 CWG meeting) of retaining 440s in copy-cataloging records and creating new records with 490/830. There are some 2,000,000+ records with 440 fields in the I-Share catalog. CARLI also notes that the 440 field will continue to be indexed. A new version of the Strawn tools can convert a 440 into a 490. The problems with automated changes of 440 to 490/830 were brought up, as were those caused by records swapping between 440 and 490/830 as they are overlaid in the I-Share catalog. The consensus was to continue with our own policy of leaving 440s in copy cataloging and using 490/830 in original records that we create, since converting would add too much time to copy cataloging. It was also brought up that it would take too much time in the retrospective work done in the STX office, in particular with analyzed serials.
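The automated conversion discussed above can be sketched roughly as follows. This is an illustrative sketch only, not the Strawn tools' actual implementation; the dict representation of MARC fields is an assumption, though the indicator mapping (490 first indicator 1 for a traced series, and the 440's nonfiling-characters indicator carried over to the 830) follows standard MARC practice for the obsoleted 440.

```python
# Illustrative sketch of a 440 -> 490/830 conversion. NOT the Strawn
# tools' implementation. Fields are modeled as simple dicts:
#   {"tag": ..., "ind1": ..., "ind2": ..., "subfields": [(code, value), ...]}

def convert_440(field):
    """Convert one obsolete 440 field into a traced 490 plus an 830.

    The 490's first indicator "1" means the series is traced in an 8xx
    field; the 440's second indicator (nonfiling characters) carries
    over to the 830's second indicator. In practice the 830 should hold
    the authorized series heading, which may differ from the transcribed
    form, so a real tool would also consult the authority file.
    """
    if field["tag"] != "440":
        return [field]  # leave non-440 fields untouched
    f490 = {"tag": "490", "ind1": "1", "ind2": " ",
            "subfields": list(field["subfields"])}
    f830 = {"tag": "830", "ind1": " ", "ind2": field["ind2"],
            "subfields": list(field["subfields"])}
    return [f490, f830]

def convert_record(fields):
    """Apply the 440 conversion to every field in a record."""
    out = []
    for f in fields:
        out.extend(convert_440(f))
    return out

# A tiny hypothetical record for illustration.
record = [
    {"tag": "245", "ind1": "1", "ind2": "0",
     "subfields": [("a", "Example title")]},
    {"tag": "440", "ind1": " ", "ind2": "4",
     "subfields": [("a", "The example series ;"), ("v", "no. 3")]},
]
converted = convert_record(record)
```

Even this naive version shows why bulk conversion is risky: the copied 830 heading is only the transcribed series statement, which is exactly the kind of discrepancy that surfaces when records are overlaid back and forth in I-Share.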
OCLC credits for enhancing records and adding new records in WorldCat: many catalogers here add or edit in Voyager rather than in Connexion. We miss the possible credits for such work by not doing it in Connexion.
- Records with encoding levels K, M, 2, 3, 4, or 7 can be upgraded to full level. NOTE: if you are not adding a standard Dewey or LC call number, enter and replace at level K so that others can still add a standard call number.
- Full-level records ([blank] encoding level) can have certain fields added (enhanced): a call number or subject headings in a scheme not already present (e.g., a Dewey number for a record with only an LC number), or a contents note. We are an Enhance library for books and scores; use the Enhance logon, available to those doing original cataloging, which allows addition of other fields to full-level records, such as added entries and notes that we feel are necessary.
- Adding new records:
- Make sure to follow OCLC's policy in "When to enter a new record."
- Parallel records: use when the 040 $b is [spa, ger, fre, etc.] and the language of cataloging is not English. Gail finds that records in French or Spanish can be upgraded to be acceptable for the U.S. without much trouble, but for German, Dutch, and Swedish records it is easier to input a parallel record. Include a 936 field with the OCLC numbers of the parallel record.
- Members with full-level authorization will be allowed to make changes to any record except LC and PCC (BIBCO and CONSER) records. Credits will be given for changes after the end of the experiment. Webinars on the program are listed on a handout for the CWG meeting. The latest word on a communal webinar experience: "Gail is registering for the Tuesday, February 24th session from 3-4 pm, and has made arrangements for folks to view this with her in room 291 UGL." [Email from Beth Woodard on LIBNEWS, Fri 13 Feb 2009, 10:34:53 CST.] More information is available.
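The encoding-level rules summarized in the list above can be captured as a small eligibility check. This is a hedged local sketch: the level codes are taken from these minutes, not from a definitive OCLC table.

```python
# Sketch of the OCLC encoding-level rules noted in these minutes.
# Level codes here come from the minutes, not an authoritative OCLC list.

UPGRADABLE_LEVELS = {"K", "M", "2", "3", "4", "7"}  # may be raised to full

def can_upgrade_to_full(level: str) -> bool:
    """True if a record at this encoding level can be upgraded to full."""
    return level in UPGRADABLE_LEVELS

def eligible_for_enhance(level: str) -> bool:
    """True for full-level (blank) records, which can only be enhanced
    (call number or subject headings in a new scheme, contents note,
    or, with the Enhance logon, added entries and notes)."""
    return level == " "
```

A check like this could gate a local workflow that routes records either to upgrade-and-replace or to Enhance-only handling.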
Authority Control Maintenance / Outsourcing / Backlog:
Michael reported that Backstage will be here again in March; they are interested in various levels of work on our catalog (for around $150,000), ranging from analyzing the whole catalog to determine what to replace or update, down to work on various subsets of the catalog.
- FUNDING of EFFORTS:
Paula Kaufman and Scott Walter are championing funding for cleaning up the catalog, in order to take advantage of second-generation OPACs, and for getting backlogs cataloged so that we rise in the "volumes held" rankings again (we are presently #5, and both Berkeley and Toronto have passed us). An example of cleanup of electronic bib records is a set of Springer titles that didn't display properly because fixed-field data wasn't correct; hundreds of thousands of bib records need similar work. The bib maintenance team will meet next month to determine the best way to handle bulk and batch cleanups.
- AUTHORITY CONTROL MAINTENANCE:
The entire authority file database is loaded weekly by CARLI. It was asked whether we get credit for creating NACO records; Gail Hueting pointed out that our obligation of 200 records is largely met by contributions from Asian and Rare Books as well. She also noted that NACO is scrapping the 80% rule for determining the authorized form (a preponderance of less than 80% is now acceptable).
- CATALOGING BACKLOG:
Paula Kaufman has earmarked $200,000 to address backlogs (including the S collection, some of which now has preservation issues because of a leak, and which will be sent to Oak Street after treatment). We can also outsource cataloging, at rates ranging from $12/book up to $35-$50/book for non-roman-alphabet materials. Backstage has put cards into the database for $2/card; these then get loaded into OCLC if there isn't already a record (the last time, we loaded about 80,000 records into OCLC). MARCETTE cleanup can also be part of the Google project, since many of the materials are on its candidate list; there is, therefore, grad or academic hourly money available to hire people. Many of the materials in the backlogs or in MARCETTE records are unique and have no records in OCLC; contributing these records gets us credit and increases our volume and title count (see the funding discussion above). Campus will also help fund the effort. Paula asked for a list of the backlogs, which Michael compiled from some of Naun's work and from NEH access grant proposals.
Outsourcing: Backstage, MARCnow (a vendor in India associated with Harrassowitz). The new vendor next year will be Yankee Book Peddler, so there will be changes in profiles (some books will come shelf-ready) and pilot tests at the beginning.
- RE: CREDITS from OCLC:
We can also upload new records from Voyager and get credit, but we get no credit for enhancements made only in Voyager.
Submitted by: Linde M. Brocato, GA
Version 1.1 (13 February 2009)
Version 2 (24 February 2009)