Time and Location of Meeting
October 14, 2015, 9:00 am – 10:30 am, Main Library Room 323c
- Review agenda
No changes made.
- Review Library Assessment Committee Grant Program.
- Last year the grant program’s rubrics and guidelines were solidified. Jen uploaded grant applications from the past year to the group’s Box folder so new LAC members can compare them to the rubrics, with the understanding that there has traditionally been some back and forth between applicants’ first and final submissions. The committee is encouraged to prioritize applications that focus on strategic planning rather than just research.
- The committee agreed that sharing the guidelines and rubrics on the LAC or assessment website may improve the initial quality of applications. Becky suggested an amendment to the impact segment of the rubric that encourages practical qualitative assessment.
- Jameatris also suggested finding a way to help applicants assess their own methods, e.g., guidelines on developing a clear hypothesis and methods. The committee agreed that these resources already exist but could be made more readily available to applicants, either on the LAC website or in a LibGuide, which may be more accessible since applicants are internal.
- The following changes were proposed for the LAC grant guidelines:
- The timeline for proposals was changed to lengthen the committee’s review time from 2 to 5 business days.
- Language will be added to let applicants know how and when their grant money must be spent.
- Jen is going to add the LAC grant to the staff orientation list and upload the grant record to the assessment website.
- Jemma Ku (the most recent recipient of the LAC grant) is formulating a report for the LAC on her experience.
- Projects for this year
Please review the LAC Member Updates attached to the agenda. Prepare to discuss and identify projects for this year.
- Resource that includes suggestions and strategies for incorporating assessment into library instruction
- Susan Avery suggested working with User Ed to implement this idea. There are different ways to assess instruction, such as clickers. We could develop online resources and/or contact information for help. This could be added to the assessment LibGuide or the assessment website. The current methods of reporting library instruction make assessment and reporting of the statistics difficult for everybody. For example, there is no good method at present to track outreach and orientation sessions by library faculty/staff or to collect consistent electronic resources group data.
- Create funding support for professional development in the area of assessment
- This matter will be revisited in spring when we start thinking about the Library Assessment Conference for 2016 (October 31, 2016 – November 2, 2016) and who will go.
- Review ongoing assessment projects, practices, and tools, and make recommendations for change if necessary. [For example, could/should the LAC review and educate the library on the best way to collect gate counts? Should we terminate the contract with Desk Tracker ($2,094.75 per year, paid by Library IT) and move to a Springshare product?]
- Springshare supports software called Lib Analytics Insight that is cheaper, but a switch would need to be discussed with the Reference Management Team and the User Education Committee to see whether it is feasible. We must consider how individual service points would deal with this change, perhaps by analyzing and shopping around the options. Ultimately the decision to change will lie with the Reference Management Team, but JoAnn referred the matter to the Coordinator for Library Assessment. Erin is on both the LAC and the Reference Management Team, and plans to investigate the matter further with both groups. Library IT already supports a possible change from Desk Tracker.
- How is privacy handled across the Library (such as at the reference/information desk, in IM/chat transcript handling, etc.)?
- Should the committee work on a “word cloud” of words/terms that are often associated with library assessment?
- This word cloud could be added to the assessment web page or the above-mentioned LibGuide. It may encourage staff and APs to look at more assessment-related materials, since assessment is a misunderstood aspect of our library. (People looking for assessment tools don’t know that’s what they’re called or that they fall under the umbrella of assessment.) Jen is going to start work on some words and phrases for this word cloud.
- IRB and library surveys
CITI training includes the following message under “Designed to Develop or Contribute Generalizable Knowledge”:
“Some activities that involve interactions with humans and data gathering may not meet the definition of research because they are designed to accomplish something else, such as program improvement. For example, university library staff may conduct a survey of members of an academic unit to find out if the library is meeting the department’s need. The project may be systematic, but is not considered research because the intent of the project is to improve the library’s service to its patrons, rather than contribute to a body of knowledge.
Publication of results is sometimes used, incorrectly, as an indicator that a project meets the definition of research. It is the intent of the project that matters. In the example above, the library staff could share the results of their program improvement activity at a conference without changing the intent. The project would not become research by virtue of sharing its results.”
- The committee will read these updates on their own time. There is a lot of ongoing IRB activity in the Libraries. Some of the language here is vague, so the committee will investigate further.
- Ithaka Graduate Student Survey update.
Lisa was not available to meet with us; however, she shared the following updates regarding the survey administration:
- Alisa Rod, who was our contact at Ithaka, has taken a job at Barnard. Her position is currently vacant but we are working now with the surveys administrator, Christine Wolff, who started a few months ago and has been fantastic. Roger Schonfeld continues to be the manager for the overall surveys program.
- The cognitive interviews (getting feedback from individuals in the target population as to whether they understand the questions as intended) for the STEM and International modules were completed in August, and the questions were finalized. It was particularly heartening to have the international students respond that it was great that the Library would care about their comfort and feeling welcome.
- Anna Lapp is the new Info Lit GA and will be taking Sarah Crissinger’s place as the GA assistant for this project. Rebecca Bryant has joined the working group in place of Merinda Hensley as a connection to the User Education Committee.
- We are working on developing a partnership with the Graduate College and hope they will support the survey implementation so we can do a full population survey.
- The timeline for administering the survey is not yet known; it depends on working with the Graduate College.
- Since the survey has been delayed, it may not happen this fall semester. If there are any other questions about the Ithaka survey, they can be directed to Jen or Lisa.
- Updates (Round-robin)
- Erin reminded the committee that there will be a reference road map event in spring, an open forum for library staff to discuss issues with reference management.
- Suzanne is working on the launch of the new library website with a toolkit for UX. That working group is currently mining for site failure questions and issues. (The site may advertise that people can ask questions directly to the LAC.)
- In the spirit of developing a culture of assessment, Lynn is doing some University Press ebook assessment and researching how to do collections assessment with a vocabulary that is accessible to all.
Attendees: S. Avery, C. Ingold, E. Kerby, J. Rimkus, B. Smith, J. Strutz, L. Wiley, J. Yu (ex officio), S. Chapman (ex officio) and E. Justiss (GA)