About a year and a half ago, as a colleague and I were preparing for an upcoming book club discussion, she asked me a basic question: How many chairs should I set up? On its face the question seemed simple, but it turned out to be one I could not really answer.
As a relatively new programming librarian at Towson University’s Albert S. Cook Library, I had tried to put on workshops, lectures, and discussions that I thought our students, faculty, and staff, as well as local community members, would find interesting and useful. Sometimes I succeeded, and sometimes I failed miserably. One thing I realized, though, was that I had no way to predict whether a program would be well attended. Worse yet, I had no way to demonstrate the value of the library’s programming to the university’s administration.
Background
With nearly 22,000 students, Towson University is one of the largest public universities in Maryland. Towson has expanded rapidly over the last ten years, adding close to five thousand students during that time. To keep up with the growing population, the library has expanded its staff and services. Last fiscal year, we hosted forty-one programs with a total attendance of almost two thousand people, twice as many programs as we had offered just two years earlier. To determine who is attending our events and what value these new offerings add to the university, I decided to begin formally assessing our library’s programs.
The Post-Event Survey
I began my adventures in assessment by drafting a pencil-and-paper survey for event attendees to complete at the end of our programs. I initially toyed with the idea of using an online assessment, but I feared that too few people would complete it and I would have no data. With a paper survey, I had the option of standing at the doorway of an event and strongly, but sweetly, encouraging attendees to fill out my forms before they left. I decided to ask four open-ended questions. First, I asked, “How did you hear about today’s program?” to find out which campus and community outlets would be best for publicity. Second, I asked, “What were the strengths of today’s program?” in the hope of getting constructive feedback on which event elements were worth repeating. Similarly, I then asked, “Is there anything about today’s program that could be improved?” The final question was “What other programs or events would you like to see Cook Library host?” which I asked in hopes of getting helpful suggestions about program offerings that would draw an engaged audience.
Then, after attending a presentation by Megan Oakleaf on the Association of College and Research Libraries (ACRL) publication Value of Academic Libraries: A Comprehensive Research Review and Report, and hearing about the importance of capturing the quality of what libraries do rather than just the quantity, I decided to incorporate questions about the value of our events into my survey. I added two questions: “Why did you decide to attend today’s program?” and “How was today’s program valuable to you?”
The Results
I tested the first draft of the questionnaire at a faculty book talk about school reform. This program came about when a new associate professor in the College of Education contacted me and asked if she and her co-authors could give a presentation in the library about their new book. Approximately forty people attended, a mix of education faculty and students as well as area public school officials. The attendees’ responses to my questions about the strengths and weaknesses of the program yielded some surprising information. Instead of commenting on which elements of the program should or should not be repeated, the respondents mentioned very specific things that the library staff could not control, such as the volume of a panelist’s voice or the lack of handouts from the presenters. While this type of feedback is helpful to speakers who will give their presentation again elsewhere, it did not help me discern whether we should run a program like this again. As a result, I decided to drop the strengths and weaknesses questions from future versions of the survey.
I next tested the updated post-event survey at three events. The first was a reading by a poet whom a faculty member had brought to campus. Approximately thirty people came to this event, mostly Towson University faculty and students. The other two were book club discussions. (When I started at Towson in 2009, I created a book club series whereby the group would meet twice a semester to discuss works related to campus events. Each book club meeting would begin with a faculty member providing background information about the selected text, and then a librarian would lead the discussion.) The first book club meeting had twelve people in attendance, mostly librarians. The second had about sixty in attendance, mostly students.
When I reviewed the post-event surveys from these three programs, I noticed that attendees reported hearing about our events through a variety of outlets. Multiple people mentioned that they had heard about the events through our campus email announcement system, Facebook, the university website, and word of mouth. Even though I had hoped to get a sense of which media outlets would be the best focus of our promotional efforts, these results told me that I needed to continue promoting our events through a variety of venues if I wanted an audience.
Second, I noticed that in response to the question, “Why did you decide to attend today’s program?” many of the respondents, especially at the second book club meeting, wrote that they attended because of a class requirement. When I became Towson University’s Communications and Development Librarian in 2009, I hoped to build an audience for our programming by creating events that highlight the expertise and interests of our faculty members. I thought that if the faculty knew that the library supported the speakers they were bringing to campus, they would encourage the students in their classes to attend our events. For the most part, this worked, but not as I had intended. Faculty members would often bring their classes to our book discussions or offer their students extra credit to attend, which meant that the room would be filled but only a handful of attendees would participate in the discussion. Looking at these responses made me realize that I need to assess not only what brings attendees to events, but also what would make those attendees active participants.
I next examined the responses to the “How was today’s program valuable to you?” question and noticed that the most common answer was “Yes,” with many others leaving the question blank. Although I cannot be certain, I imagine that people answered this way because it takes a good amount of cognitive effort to provide a substantive response to this question: it requires respondents to review the content of the program and analyze how that content relates to their lives. Since this was the third open-ended question on the survey, and because some of the attendees were there only because of a class requirement, I can easily understand why this question was bypassed or not really answered. On the other hand, the respondents who did provide substantive answers gave useful information that demonstrated the significance of library programming. For example, after one of the book club meetings, one person wrote that the discussion of Rebecca Skloot’s The Immortal Life of Henrietta Lacks “taught me to be more aware of my own medical health, as well as understanding what rights we are entitled to.” Even though it was by no means the most common answer on the survey, this attendee’s response serves as anecdotal evidence that our library programming encourages critical thinking and provides value to our campus learning community.
Lastly, I looked at the responses to “What other programs or events would you like to see Cook Library host?” and noticed a wide spectrum of answers. Most respondents left this question blank, and those who did answer wrote either very general responses, such as “I would be willing to attend events as long as the topic is interesting and I will learn something useful,” or very specific ones, such as “Potterpalooza.” While the specific responses were potentially the most helpful, they were the least common, and they often did not include enough detail for me to create the desired programs.
Conclusions
After analyzing the responses to the post-event surveys that I distributed at four library programs, I believe that I will retain only one of my questions in future program assessments: “How was today’s program valuable to you?” Most of the questions did not generate the responses I had expected, and the value question was the only one that ultimately provided useful data. Granted, it does not tell me how I could improve our library’s programming, but it does tell me, and in turn can tell the university administration, that our programs help students become more information literate. Thus, I can use this type of anecdotal evidence in library publications and presentations to make the case for sustained library and university funding for our programs.
Given that answering the value question in a meaningful way requires respondents to think critically, I will make it the first, if not the only, event-evaluation question on future post-event surveys. I may ask respondents some demographic questions after the value question, but I want to avoid asking too many open-ended questions, because I believe that doing so led to the numerous blank responses I received during this survey trial.
My first foray into library program assessment did not go as I had planned, but I did learn that it is important to focus on asking questions that will allow me to make the case for future programming. That case does not come from asking event attendees why they came to a particular program or what they liked or disliked about an event, but rather from asking them how a program impacts their lives. It is not an easy question to answer, but when event attendees put some thought into their responses to the value question, the end result of the assessment is worth the adventure.