Sunday, February 25, 2007

The Flashlight Approach to Evaluation, Summarized

For a variety of reasons, educators, their units, and their institutions now often gather evidence in order to evaluate 'stuff'. By 'stuff' we mean tools, resources, and facilities such as course materials, computer software, classroom design, blended courses, distance learning programs, network infrastructure, libraries, ePortfolios, ... By 'evaluate' we mean purposefully gathering the evidence needed to make better choices about what to do next.

These days when people think about evaluation, their thinking often begins and ends with 'outcomes assessment.' But information about outcomes is rarely enough, by itself, to show how to improve those outcomes. (For more on this point, click here.) The Flashlight approach to evaluation is designed to provide the best insights for improving outcomes for the least effort.

The Flashlight approach has a number of elements, of which these are the most important:


1. Activities: Focus on what people do with the 'stuff' at issue. For example, if you want to get more value from personal response systems (some of which are also known as 'clickers'), you first need to discover what faculty and students are actually doing with them. Are the clickers being used to create structured discussion about difficult ideas? To take attendance? To test memorization? Each of those patterns of use (which we refer to as 'activities') will create different benefits, costs, damage...

2. The dark side: Consider people's fears and concerns about the stuff, not just their hopes and goals. For example, if clickers are being used to take attendance, are students sending friends to class with a handful of clickers?

3. Motives, incentives, and disincentives: Value is created by the way people use stuff. So the best clues for increasing that value come from learning why people use the stuff as they do. For example, if you do a workshop on the value of using clickers for conceptual learning, and 30% of the participants don't start using clickers that way, you should investigate the reasons. Those reasons might be rooted in their personal approaches to teaching, their disciplines, your training, the reactions of students, their facilities, ... No matter what you discover, it's almost always going to be useful in helping you figure out whether and how to get more value from clickers.

4. Education is not a machine: Even when the same faculty member teaches two sections of the same course, and has taught them for years, what students do and learn will differ. Add clickers, or any other technology that increases options for faculty and students, and that variation will probably increase. That's one reason we emphasize a 'unique uses' perspective for evaluation, not just 'uniform impact' approaches.

5. Collaboration: Whose choices influence how the stuff is used? Whose choices could change how the stuff is used? Those are the people who should be involved in designing your study. If you need help in gathering data (e.g., getting good response rates to surveys), they can assist. They can also help you make sure that your questions and language are clear and compelling.

6. Start evaluating now: To improve outcomes (including costs), change activities. (Buying new stuff is just a means to change activities). And it's always the right time to begin studying an activity. So start evaluating now.
For example, if you're interested in using clickers to foster conceptual learning, start evaluating conceptual learning (including faculty development) now, whether or not you're using clickers yet, whether or not you're experienced in their use, and whether or not you're considering replacing one kind of clicker with another. If you are considering buying new stuff, your findings can help you
* choose products;
* remove barriers to effective use even before the technology becomes available; and
* provide baseline data for measuring the impact of the new stuff.
In short, whether or not it's time to buy, develop, or replace stuff, it's always time to begin studying activities that use, or that would use, that stuff.

The whole Flashlight Evaluation Handbook is designed to flesh out these and related ideas with examples and specialized guides. The TLT Group can also provide coaching (send us your draft plans and we can talk) or collaborators for such studies, or we can even conduct the studies for you. So take a look at the Handbook and contact us if you'd like to talk (301-270-8311; flashlight@tltgroup.org).

PS. If your institution is a subscriber, feel free to use this summary for workshops. If your institution is not yet a subscriber, we would appreciate it if you would ask permission to use this material.
