Sunday, July 31, 2011

Thinking Like a Museum Evaluator

Poster Session conversations at the VSA Conference
In my next life I want to be a museum evaluator and researcher. I guess that will have to be after I am a museum planner, a librarian, an urban planner, and an arborist. I was reminded about how much I enjoy the evaluator perspective at the recent Visitor Studies Association conference in Chicago, July 24-27.

One thing I tuned into over the two-and-a-half-day conference was how presenters, evaluators, and researchers answered questions. When a question was posed, I often noticed a short but perceptible pause, followed by a well-crafted reply. In the cartoon part of my brain, I pictured a thought bubble above the presenter’s head with a checklist, or the steps of a logic model, she ran through before answering. I also imagined another thought bubble hovering over the head of someone whose answer drew on specialized terms.

This is the second year I have attended the VSA conference. Both years I have been drawn to and invigorated by the thoughtful and disciplined thinking and interest in critical reflection in so many sessions and among so many people. Again and again, I saw a willingness, even a zest, to explore questions. I often had a sense of an evaluator’s independence, objectivity, and neutral stance related to information that was gathered, presented, and interpreted.

Thinking About Practice, Finding Patterns
Listening to presentations on projects, studies, methods, and results, I thought about the thinking and practices that support this work. These practices not only made the sessions better, but are relevant for me in my work and for work across museums. Four sets of practices have stayed with me.
Questions Everywhere. There’s no doubt about it. Evaluators and researchers love questions. Sessions opened with questions as titles and ended with questions as well. “Based on this, how can we create programs that…?” was a question I heard more than once. It wasn’t just the sheer quantity of questions that was impressive. They were well-crafted questions. Some framed major conversations, others took into account the complexity of situations: “How do we evaluate fairly when pupils have different social, economic, and family circumstances…?” Even standing at the elevator, in the poster session, or at lunch, evaluators and researchers were asking questions.

Checking and Challenging Assumptions. Checking assumptions is sometimes procedural like asking if everyone has had a chance to speak. Other times it seriously challenges the fundamentals like asking how (in)adequate attendance is as a measure of success. Checking assumptions can also flip a switch and reroute thinking, as when Joe Heimlich said that measures are met when we’re successful; we’re not successful because we meet our measures. There were many friendly provocations such as asking whether we are preserving success measures to preserve ourselves. Besides a willingness to challenge assumptions, I admired an appetite for experimenting in what presenters shared and encouraged others to try. In the middle of a session, one of the presenters reminded participants, “We’re trying to do new things.” 

Finding Language. I appreciate precise language for its clarity, variety, and ability to get at meaningful distinctions. Presenters made multiple references to the role of language in visitor studies. In the first session I heard “languaging visitors,” or giving them the tools to talk about and share art. The value of visitors’ own language in getting at what’s intangible, like intrinsic benefits, also came up. For that matter, evaluator and researcher language and “articulating intended results” received attention. There were many acknowledgements of context-specific terms: Big “I” and little “i” identity, inquiry-based strategies, and the Exploratorium’s own definition of “immersives.”
Playing It Forward. Frequently, the follow-up to presenting a project or a study was a slide or the question, “What can we do better?” If this question wasn’t posed, then a focus on next steps or areas of future study was. Often, specific ways in which programs, exhibits, or marketing materials could be changed were highlighted. I greatly appreciated the push for improvement this represents: a strong interest in action, change, and closer alignment of intention and achievement. Someone in the final session asked, “How do we, as a field, increase the rigor of our work in ways that are supportive of our colleagues but hard on the research and evaluation?”

Now, there’s a great question that challenges assumptions, makes meaningful distinctions, and plays the conference forward.

An Evaluator’s Perspective
New perspectives arriving
These practices interested me, and I wanted to learn more about how evaluators see them and their own practice. During the conference I asked five people what they see as the distinct perspective of an evaluator. We talked in line for the bus, at the elevators, and before sessions started. Speaking primarily about the perspective they bring to evaluation, they mentioned the following.
• I’m constantly wondering how the visitor experience will be. I look at prototyping as the value of the visitor’s report and behavior. (Elizabeth, in-house exhibit planner with evaluator responsibilities)
• Testing assumptions about what the visitor, or learner, takes away or understands from the experience. I try and bring multiple stakeholders and their perspectives to the task of interpreting what people are taking away. (Camilla, independent evaluator)
• I’m constantly asking questions and wanting to know the reason for things. Everything you figure out leads to a new question. (Lorrie, independent evaluator)
• I’m hungry for context. I ask myself if I have and understand the context I need (background, perspective, familiarity, etc.) to bring the right tools to this person or team to do what they need to accomplish. (Nina, in-house evaluator)
• Objectivity. An evaluator has a neutral relationship with information; she needs to show the information and let program people bring their perspective and needs to interpreting it. (Andréa, in-house evaluator)

Over the course of my five conversations, I shifted my question from asking about an evaluator’s thinking to an evaluator’s perspective. I debated whether and when to tell them I was going to put their responses in this blog and decided to do so. I noticed I had a mix of internal and external evaluators and researchers, but also realized my sample included only women. I wished I had more time for conversations and a new round of questions.

The conference was valuable as was my little study. I have a lot to play forward and a lot of work ahead before I’m thinking like an evaluator.
