clickers on college committees
I'm co-chairing a blue-ribbon committee to help address our immediate budget squeeze while orienting toward a longer-term vision for my college. The committee is composed of 30 people, including deans, administrators, established and probationary faculty, and graduate and undergraduate students in the social sciences, arts, and humanities.
With such a large and diverse group, it can be difficult to come to consensus. Moreover, with such extreme power differentials in the room, it seemed important to build in some mechanism to ensure that we heard all the voices on the committee. We used email, of course, and set up a site to archive notes and reading materials. But not everybody weighed in either verbally or in writing. To get a better sense of the group's wishes before we prepared the first round of recommendations, my co-chair and I decided to do some instant polling using classroom clickers.
I can't report on the specifics yet, but I think the committee members generally found the process useful. Six observations:
1. We used them to get "the sense of the room" rather than to take high-stakes votes. People seemed comfortable using the results as a starting point for a focused discussion.
2. Though the committee seemed okay with the polling, they had lots of questions and qualifications about wording. For example, they noticed immediately when the answer categories were not mutually exclusive or exhaustive (in fact, several seemed to have taken some variant of Nora Cate Schaeffer's graduate seminar in questionnaire design).
3. Taking a tip from Carolyn Liebler, who uses clickers in her introductory sociology classes, we tried to offer response categories that left room for compromise (e.g., "mostly agree" or "mostly disagree") on potentially polarizing questions. We also asked variants of the "most important priority" questions that often arise in budget deliberations.
4. We were genuinely surprised by the results on some items, especially when we found consensus where we'd anticipated disagreement. This helped us move expeditiously to more contentious issues.
5. I'd expected to hear at least a few groans or grumbles about "dumbing down" the process, especially since the clicker system was based on simple PowerPoint slides. Nobody expressed such sentiments, but I wonder whether I should have asked this question on the last slide: How lame is this clicker exercise? (a) completely lame, (b) very lame, (c) moderately lame, (d) not so lame, or (e) not at all lame.
6. By directly addressing some of our uncertainties, the clicker data made the co-chairs a bit more comfortable writing on behalf of the group. I'm not sure whether the polls will lend any greater legitimacy to our recommendations, but I was glad to include them in our discussion of process.