Statistics of Sessions

July 27, 2000

Last year, libraries on campus kept records of 1) how many sessions were held and 2) how many people attended each session. These figures were compiled by Bart Clark and sent to ARL (the Association of Research Libraries), which defines the criteria for these statistics. Adjustments are sometimes made before the figures are reported to ARL. The totals should be in line with those of peer institutions.

Five people at this luncheon discussion actually report the statistics for their departments.

MORE DETAILED STATISTICS

The larger categories of statistics could be broken down into smaller ones and used for local purposes. Possible information to record includes:

  • type of workshop
  • audience level
  • where the session was held
  • time (point in the semester, hour of day)
  • time spent by library staff
  • type of equipment used
  • number and type of handouts
  • involvement of specific staff (librarians vs. other staff)

This information would help determine how many staff are needed for instruction and what facilities are required. CCSO sites are set up as labs rather than instructional classrooms, and they are often not available.

Certain statistics would be helpful locally even if ARL does not use them. We must weigh their usefulness to decide what to collect and where to invest our time and money: we want to collect enough, but not too much. What do we want to say about user education? Keeping statistics is difficult, especially when staff are busy or when the form is long. How do we want to collect statistics: 100% of the time, during sweeps, or some other way?

CONSIDERATIONS

The statistics must be truthful: they must report what is actually being done as clearly as possible, and they must be consistent. When two departments are involved, who records the statistics? In Reference, statistics are credited to the department that does the training and is responsible for the workshop.

Libraries may not use the same definitions. Does in-house training count? PAC training for staff is currently not being counted. How do you count reference questions: broken into parts, or as one interaction? Do you take into account how long they take? Do the questions have teaching components, and does that change how the interaction is counted?

User education sessions differ in type, and libraries count them differently. Smaller libraries primarily do one-on-one training; larger libraries also give tours, visit classrooms, and offer workshops. Some departments count tours; some libraries count formal presentations; and some do not send statistics at all because their sessions do not conform to the definitions.

OUTCOMES

What outcomes do we want the statistics to provide, and what do we want to use them for? What are our goals for the statistics, and how do they fit with our goals for the user education program? Goals for the user education program should be defined and stated clearly as a mission statement at the top of the statistics form. What types of qualitative information do we need, and what does the quantitative information tell us about the qualitative side? How do we measure the outcomes?

WHAT TO DO NEXT

  • Contact libraries and get copies of the forms they use for reference statistics.
  • Once a form is developed, run it as a test during a sweeps period on campus.
  • Consider using a questionnaire to determine how data is collected by each library.