THIAGI GAMELETTER: August 2011

SERIOUSLY FUN ACTIVITIES FOR TRAINERS, FACILITATORS, PERFORMANCE CONSULTANTS, AND MANAGERS.

TABLE OF CONTENTS

Masthead
Our mission statement, copyright notice, and cast of characters.

Structured Sharing
Concept Analysis
Beyond memorizing definitions.

Sudden Survey
Betrayal
When trust is missing.

Jolt
Social Virus by Tracy Tagliati
It's catchy.

Improv
Transformers by Tracy Tagliati
When is a rope not a rope?

From Brian's Brain
The latest from Firefly News Flash by Brian Remer
Links to Brian's newsletter.

International Workshops
Thiagi's Workshops in Europe
Coming soon to Sweden and France.

Single Topic Survey
The Virtues of Virtual Training by Tracy Tagliati
Global training on a reduced budget.

Survey Results
To Tell the Truth by Tracy Tagliati
A summary of your responses.

Topical Tweets
Five Phases of Evaluation
This issue's collection.

Masthead

THIAGI GAMELETTER:

SERIOUSLY FUN ACTIVITIES FOR TRAINERS, FACILITATORS, PERFORMANCE CONSULTANTS, AND MANAGERS.

Mission

To increase and improve the use of interactive, experiential strategies to improve human performance in an effective, efficient, and enjoyable way.

Editorial Roster

Author and Editor: Sivasailam (Thiagi) Thiagarajan

Assistant Editor: Raja Thiagarajan

Associate Editors: Tracy Tagliati and Jean Reese

Contributing Editors: Brian Remer and Matthew Richter

Editorial Advisory Board: Bill Wake, Matthew Richter, Samuel van den Bergh, and <type your name here>

Copyright Info

The materials in this newsletter are copyright 2011 by The Thiagi Group. However, they may be freely reproduced for educational/training activities. There is no need to obtain special permission for such use as long as you do not reproduce more than 100 copies per year. Please include the following statement on all reproductions:

Reprinted from THIAGI GAMELETTER. Copyright © 2011 by The Thiagi Group, Inc.

For any other use of the content, please contact us ( thiagi@thiagi.com ) for permission.

Subscription Info

To sign up, or to donate and help us continue this newsletter, please see the Online Newsletter page on our website ( http://thiagi.com/pfp.html ).

Feedback Request

Thiagi believes in practicing what he preaches. This is an interactive newsletter, so interact already! Send us your feedback, sarcastic remarks, and gratuitous advice through email to thiagi@thiagi.com . Thanks!

Structured Sharing

Concept Analysis

Whenever you become completely absorbed in an activity and lose track of time, you are in a state of flow.

Whenever you see a closed shape that has three straight sides and three angles, it is a triangle.

State of flow and triangle are examples of concepts.

Concepts are key building blocks of all knowledge. They are essential elements of all training sessions.

Technically, a concept is a collection of features (and the relationships among them) that evoke a common response. When I was a novice instructional designer, I spent a lot of time analyzing each concept in my training topic, using a lengthy list of 25 questions. In the process of conducting these analyses, I discovered that the person who benefited the most was me, the instructional designer. Through concept analysis, I ended up with a rich understanding of the concept. However, I was not sure that this understanding transferred to the participants who went through my training packages.

Nowadays, having discovered that the laziest trainers (and instructional designers) are the most effective professionals, I use a structured sharing approach in which the participants undertake their own concept analysis—and learn by sharing what they already know.

Here are the details of this structured sharing technique.

Purpose

To explore various elements of a concept including its definition, critical attributes, and examples.

Participants

Minimum: 6
Maximum: 30
Best: 12 to 20.

Time

Depending on the complexity of the concept and the depth of analysis, 20 minutes to 2 hours.

Sample Concepts

Supplies

Preparation

Make one copy of the List of Concept Analysis Questions for each team. Circle a different (but overlapping) set of questions in each list according to this arrangement:

Number of Teams    Questions Circled
3    Team 1: 1-6; Team 2: 4-10; Team 3: 7-10 and 1-3
4    Team 1: 1-5; Team 2: 3-7; Team 3: 6-10; Team 4: 1-5
5    Team 1: 1-4; Team 2: 3-6; Team 3: 5-8; Team 4: 7-10; Team 5: 9, 10, 1, 2
6    Team 1: 1-4; Team 2: 3-6; Team 3: 5-8; Team 4: 7-10; Team 5: 9, 10, 1, 2; Team 6: 3-6
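
If you would rather compute these overlapping sets than look them up, here is a minimal sketch in Python. It assumes four questions per team with a two-question overlap, wrapping around the handout's 10 questions; this mirrors the five- and six-team rows above (the three- and four-team rows use slightly larger sets).

    def circled_sets(num_teams, num_questions=10, per_team=4, overlap=2):
        # Each team starts (per_team - overlap) questions after the
        # previous team and wraps around past the last question.
        step = per_team - overlap
        sets = []
        for t in range(num_teams):
            start = (t * step) % num_questions
            sets.append([(start + i) % num_questions + 1 for i in range(per_team)])
        return sets

    for team, questions in enumerate(circled_sets(5), start=1):
        print("Team", team, ":", questions)   # Team 5 : [9, 10, 1, 2]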

Flow

Form Teams. Organize the participants into three to six teams, each with two to six members.

Introduce the concept. Announce that the teams are going to analyze the concept to better understand it. Explain that you are not going to define the concept in order to encourage team members to share their multiple perspectives.

Distribute the lists of questions. Give a copy of the List of Concept Analysis Questions (with overlapping questions circled) to each team. Ask the team members to discuss the concept by sharing what they already know, in response to the questions circled in the list. Encourage each participant to take personal notes of the discussion, explaining that they will need these notes during the next phase of the activity. Announce a suitable time limit and let the discussions begin.

Monitor the discussion. Move from one team to another, listening in on the conversations without participating in them. Periodically announce the remaining time. At the end of the allocated time, announce the conclusion of the discussion.

Reorganize the participants. Ask the members of each team to count off one, two, three… so that each person has a different number. Ask all number ones to assemble into a new team, number twos into another team, and so on. Tell the participants to bring their notes with them. Point out that the new teams have representatives from each of the earlier teams.
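
To see why this count-off works, here is a minimal sketch in Python (the names are hypothetical): everyone holding the same number lands in the same new team, so each new team contains one representative from every original team.

    def regroup(teams):
        # Members of each original team count off 1, 2, 3, ...;
        # all the "ones" form a new team, all the "twos" another, and so on.
        new_teams = {}
        for team in teams:
            for number, member in enumerate(team, start=1):
                new_teams.setdefault(number, []).append(member)
        return list(new_teams.values())

    old_teams = [["Ann", "Ben"], ["Cy", "Di"], ["Ed", "Flo"]]
    print(regroup(old_teams))  # [['Ann', 'Cy', 'Ed'], ['Ben', 'Di', 'Flo']]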

Discuss the complete list of questions. Give a copy of the List of Concept Analysis Questions (without circled questions) to each participant. Ask the members of each new team to share their conclusions from the previous round of discussions, working through one question at a time. Explain that more than one team may have discussed each of the questions earlier. Encourage fresh comments from the team members who had not handled these questions. Announce a suitable time limit. As before, monitor the discussions without interfering.

Assign follow-up activities. Conclude the discussions at the end of the allotted time. As a homework assignment, ask each participant to write an extended definition of the concept, incorporating the key points from the discussions.

Handout

List of Concept Analysis Questions

Here are 10 questions for you. To explain some of the questions, we have given an example using the concept of triangles.

  1. What is the dictionary definition of this word?
    Example: A triangle is a closed figure with three straight sides and three angles.
  2. What are some synonyms of this word?
  3. What are some antonyms of this word?
  4. What are some clear examples of this concept?
  5. What are some borderline non-examples of this concept? These non-examples should appear to be genuine examples but should subtly lack a critical attribute.
    Example: A sector of a circle has three sides, but it is not a triangle because one of its sides is curved.
  6. What are the critical attributes of this concept? Critical attributes of a concept are a part of the definition. They should be present in all examples.
    Example: Three sides is a critical attribute of a triangle.
  7. What are some variable attributes of this concept? Variable attributes are features that may be different in different examples.
    Example: Color is a variable attribute of a triangle.
  8. To what higher-level category does this concept belong?
    Example: Shape is the higher-level category to which triangle belongs.
  9. What are some related concepts at the same level?
    Example: Squares and rectangles are concepts that are related to triangles at the same level.
  10. What are different types of this concept?
    Example: Right triangles, equilateral triangles, and isosceles triangles are different types of triangles.

Sudden Survey

Betrayal

Recently, we designed and delivered an activities-based workshop on building trust. After exploring various strategies for increasing trust among partners, the workshop turned to the opposite concept: betrayal. We pointed out that while it takes several positive interactions to build trust, it takes just a single negative interaction to break it.

Betrayal is an activity that is based on the SUDDEN SURVEY framegame (see the January 2009 issue of TGL). It focuses on how to conduct a difficult conversation related to a betrayal incident.

Purpose

To discuss a betrayal incident and to move on to a positive future.

Participants

Minimum: 8
Maximum: 52
Best: 12 to 32

Time Requirement

30 to 50 minutes.

Supplies

Preparation

Assemble a packet of playing cards. Estimate the number of participants. Divide this number by four, rounding up the answer if necessary. From a deck of cards, remove that many cards of each suit. Shuffle this packet of cards and use it to allocate participants randomly to the four different teams.

Example: You have 29 participants. Dividing this by four, you get 7.25. You round this up to 8. From the deck of cards, you remove Ace, 2, 3…8 from each of the four suits. You shuffle this packet.
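
Here is a minimal sketch in Python of the same arithmetic (the rank and suit labels are just illustrative):

    import math
    import random

    def build_packet(participants):
        per_suit = math.ceil(participants / 4)        # e.g. 29 -> 8
        ranks = ["A", "2", "3", "4", "5", "6", "7",
                 "8", "9", "10", "J", "Q", "K"][:per_suit]
        # One card of each selected rank from each of the four suits.
        packet = [rank + suit for suit in "SHDC" for rank in ranks]
        random.shuffle(packet)
        return packet

    print(len(build_packet(29)))  # 32 cards: Ace through 8 in all four suits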

Flow

Brief the participants. Review the principles and procedures from the earlier discussions on building trust. Explain that you are going to conduct an activity to explore the opposite concept of betrayal. Rather than lecturing about the concept, you will conduct a structured sharing activity in which all participants will collect and share responses to four critical questions. Invite participants to use their own experiences and their observations of other people's relationships (either in real life or in fiction) in coming up with their responses.

Introduce the four questions. Explain that you are interested in getting responses to these four questions from the participants:

Explain the activity. Tell participants that you are going to organize them into four teams. Each team will be assigned one of the four questions. Explain that the teams will be collecting responses from all participants—including members of their own team—related to the question assigned to them.

Set the agenda. Explain the following schedule:

Make team allocations. Shuffle the packet of playing cards and ask each participant to take a card. Ask participants to find other members of their team who have a card of the same suit. Invite each team to gather around in a convenient location. Assign the questions to the teams in this order:

Coordinate the planning activity. Ask each team to begin planning how to collect responses from everyone in the room (including members of their own team). While planning, participants may not talk to the members of other teams. Announce a 3-minute time limit and start the timer.

After 2 minutes, announce a 1-minute warning. After 3 minutes, blow the whistle and announce the end of the planning period.

Coordinate the information-collection activity. Announce that each team now has 3 minutes to collect responses to the question assigned to it. Announce a 3-minute time limit and start the timer. Get out of the way as everyone tries to talk to as many other participants as possible.

After 2 minutes, announce a 1-minute warning. After 3 minutes, blow the whistle and announce the end of the information-collection period.

Coordinate the analysis activity. Ask members of each team to return to their team. Invite team members to share and organize all the responses they collected. Distribute a sheet of flip chart paper and ask each team to summarize the information on this sheet. Announce a 3-minute time limit and start the timer.

After 2 minutes, announce a 1-minute warning. After 3 minutes, blow the whistle and announce the end of the analysis period.

Coordinate the reporting activity. Randomly select a team and ask it to display the flip chart. Ask a representative from this team to present its results and conclusions. Start the timer and announce the end of the reporting period at the end of 1 minute.

Repeat the procedure until all teams have given their reports.

Debrief and follow up. Comment on the findings from different teams. Add additional information about conducting difficult conversations about a betrayal incident. With the help of the participants, create a suitable checklist and several authentic scenarios. Use them to conduct and discuss role-play sessions.

Jolt

Correction: We would like to acknowledge that a similar version of this activity was published by Sharyn Weiss and Doni Tamblyn in 2000 as Emotional Contagion. Our apologies for not mentioning this last month.

[book cover]

You will find Emotional Contagion and many other effective activities in The Big Book of Humorous Training Games by Doni Tamblyn and Sharyn Weiss. (Learn more about the book on Amazon.)

Social Virus
by Tracy Tagliati

We all know how quickly a cold or the flu can spread through the office, but we don't often think about how contagious our emotions can be. This jolt provides a brief simulation of how quickly both negative and positive emotions can be transmitted.

Synopsis

One participant is selected to be the Negative Infector General and asked to infect others with a negative emotion. During the next round, you pretend to select another participant to be the Positive Infector General. At the end of the second round, participants are surprised to find out that they became more positive even though no one initiated the emotion.
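
The spread is fast because it compounds: each newly infected person passes the expression to three others, as the facilitator instructions below specify. Here is a minimal sketch in Python of that idealized chain reaction (the group sizes are arbitrary examples):

    def rounds_until_all_infected(group_size, contacts_per_person=3):
        # One Infector General starts; each round, everyone infected in the
        # previous round passes the expression to three uninfected people.
        total, fresh, rounds = 1, 1, 0
        while total < group_size:
            fresh = min(fresh * contacts_per_person, group_size - total)
            total += fresh
            rounds += 1
        return rounds

    for size in (10, 25, 100):
        print(size, "people:", rounds_until_all_infected(size), "rounds")
        # 10 people: 2 rounds; 25 people: 3 rounds; 100 people: 4 rounds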

Topics

Purpose

Participants

Minimum: 10
Maximum: Any number
Best: 20-30

Time

5 minutes for the activity and 5 minutes for the debriefing.

Flow

Set up the first scenario. Ask the participants to stand up and close their eyes while you give the following instructions (in your own words):

You are all gathered here for an employee social in your organization. In a moment, I will tap one of you on the shoulder. If you are the chosen one, you become the Negative Infector General. Your task is to make eye contact with three other people, one at a time, and infect them with a negative expression. You can choose any negative expression. You can frown, glare, grimace, pout, sulk, glower, or show some other negative expression that is unique to you. The rest of you, your job is to mill around the room. If someone makes eye contact with you and infects you with a negative expression, then you have to infect three others. Once infected, you should maintain the expression.

Start the first round. While the participants still have their eyes closed, tap one of them on the shoulder to identify the Negative Infector General. Move away from this person and ask all the participants to open their eyes and begin milling around the room.

Stop the activity. After a minute, or when a majority of the participants have been infected with negative expressions, call time.

Identify the Negative Infector General. At the count of three, ask all the participants to point to the participant they think was the Negative Infector General, the initial person chosen to infect the group with negativity. Count “One, two, three.” Confirm (or identify) the correct person.

Set up the second scenario. Ask the participants to close their eyes again while you give them the following instructions (in your own words):

Let's continue with the same team at the same event as before. In a moment, one of you will be tapped on the shoulder to become the Positive Infector General. If you are the chosen one, your task is to make eye contact with three other people in the room, one at a time, and infect them with a positive expression. This may be a smile, a wink, or a positive expression unique to you. The rest of you, your job is to mill around the room. If someone makes eye contact with you and infects you with a positive expression, then your job is to infect three others. Once infected, you should maintain the positive expression.

Start the second round. While the participants still have their eyes closed, pause for a moment, but do not tap anyone's shoulder. The participants will assume you have identified someone to be the Positive Infector General. Move to the front of the room and ask all the participants to open their eyes and begin milling around the room as before.

Stop the activity. After a minute, or when a majority of the participants have been infected with positivity, call time.

Identify the Positive Infector General. As before, ask all the participants to point to the person they think was the Positive Infector General, the initial person chosen to infect the group with positivity. Ask the person to identify himself or herself. Pause while the group waits in suspense. After a few moments, reveal that no one was chosen to be the Positive Infector General.

Debriefing

Point out that emotions can be transmitted very rapidly and often without either person realizing it.

Explain that the group became happy because it expected to be happy.

Ask and discuss the following types of questions:

Learning Points

Emotions are easily passed from person to person, without either party realizing what is happening.

We have a choice to have a positive or negative outlook on everything we do, and that choice will affect others.

Improv

Transformers
by Tracy Tagliati

Finding alternative uses for products and services can increase a company's market share, and ultimately its bottom line. For example, duct tape was created during World War II to serve as water-resistant sealing tape for ammunition cases. Very soon, it was used to repair firearms, jeeps, and airplanes. Since that war, duct tape has adhered itself so well to American culture that people now use it to make prom dresses, wallets, iPod cases, and other wild things. Tom Heck has even published Duct Tape Teambuilding Games, a book of teambuilding activities built around duct tape. (Permacel, the company that created duct tape, couldn't be happier.)

You can use this improv activity to discover alternative uses for your company's products and services.

Purpose

To discover alternative uses for products and services.

Participants

5 or more
Best with 20

Time

15-20 minutes.

Flow

Form groups. Divide the participants into groups, each with about five members.

Brief the participants. Provide each group with a common object (for example, a piece of rope). Explain that one member in the group will start by picking up the rope and demonstrating a single alternative use. Give an example by saying, “This is not just a rope. This is also a belt.” Demonstrate by tying the rope around your waist. The rope is then passed to the next person who will demonstrate another alternative use. Explain that the process will continue until the group has run out of ideas.

Begin a new round. Now that the groups have warmed up with the first object, conduct another round. This time, use one of your company's products. Continue as before until the group has run out of steam.

Collect the ideas. Facilitate the collection of ideas by assembling all groups and asking each group to take turns and share one idea at a time. To speed up the process, ask the groups not to repeat any idea that was already called out by another team. Record the ideas on a flip chart at the front of the room. Continue until the group has run out of ideas.

Clarify the ideas. Read all the ideas out loud and ask if anyone needs further explanation or wants to provide additional information. Let the group members provide such explanations or information.

Identify the top ideas. Ask the participants to come up to the flip chart and put a 5 next to their top choice, a 4 next to their second choice, and so on. Total the numbers next to each idea and assign a ranking based on these total scores.
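
Here is a minimal sketch in Python of this 5-4-3-2-1 scoring (the ballots and idea names are hypothetical): each participant's top choice earns 5 points, the second choice 4, and so on, and ideas are ranked by total score.

    from collections import defaultdict

    def rank_ideas(ballots):
        # Each ballot lists one participant's choices, best first.
        scores = defaultdict(int)
        for ballot in ballots:
            for points, idea in zip(range(5, 0, -1), ballot):
                scores[idea] += points
        return sorted(scores.items(), key=lambda kv: kv[1], reverse=True)

    ballots = [["belt", "leash", "clothesline"],
               ["leash", "belt", "jump rope"]]
    print(rank_ideas(ballots))
    # [('belt', 9), ('leash', 9), ('clothesline', 3), ('jump rope', 3)]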

Coordinate action planning. Compliment the groups on their creative ideas. Conduct a brief discussion to decide on a time line for putting the top ideas into action.

Variation

If you have a very large number of participants, let each group collect and rank its own ideas. Simply assign someone in each group to facilitate the process. Afterwards, you can assemble all the groups and ask them to report their top choices. From this point, you can use the same scoring and ranking process.

From Brian's Brain

Thiagi writes: Every month, we have been reprinting a previous issue of Brian Remer's Firefly News Flash. Beginning this month, we decided to tease you with a synopsis and link you directly to Brian's website.

Here are teasers for the June and July issues of Firefly News Flash. Enjoy!

The latest from Firefly News Flash
by Brian Remer

June 2011: Beyond Polarities

Typically we think in dualistic terms, positioning our opinions, values, methods, and philosophies as polar opposites. What we often forget is that sometimes opposites attract and when they do, that is exactly when creative solutions to seemingly intractable problems can be found. The science of opposable thinking is the subject of this month's issue and it is explored through a review of Roger Martin's book, The Opposable Mind. Key point: Keep a pair of dice in your pocket as a reminder that you have many more options than the yes-no flip of a coin would suggest.

Read more in the June 2011 issue: http://www.thefirefly.org/Firefly/html/News%20Flash/2011/June%202011.htm .

July 2011: Strings and Beads

String, beads, and imagination are all it takes to create a novel memory aid that you can use to recall a long list or the steps of a detailed process. Discover how to apply it to your training and coaching to boost retention and increase participant engagement. Meet the inventor of this method in an interview with Michele Deck, author, educator, and co-founder of G.A.M.E.S., a company that specializes in adult learning and interactive teaching methods. Michele's top tip: Engage learners by having them make emotionally relevant connections to what they learned.

Read more in the July 2011 issue: http://www.thefirefly.org/Firefly/html/News%20Flash/2011/July%202011.htm .

International Workshops

Thiagi's Workshops in Europe

Stockholm, Sweden

Interactive Techniques for Instructor-Led Training
7-9 November 2011

This 3-day workshop helps you design and conduct different types of effective training games, simulations, and activities. Based on 30 years of field research, these design formats enable you to create training faster, cheaper, and better.

You will receive two manuals of training games and simulations during the workshop and have access to 2000+ web pages with additional games, activities, and facilitation tips.

Stockholm Brochure (454K PDF)

Special Discounted Registration Fee for readers of the Thiagi GameLetter: SEK 9,900
(rises to SEK 10,900 after 15 September 2011)

Introduction to Interactive Training Techniques
10 November 2011

The best way to improve your training is to encourage participants to interact with each other, with the content, and with you. In this workshop, Thiagi demonstrates techniques for designing interactive training. He also helps you acquire effective facilitation skills that permit you to conduct training activities without losing control, wasting time, or being attacked by participants.

Stockholm Brochure (454K PDF)

Special Discounted Registration Fee for readers of the Thiagi GameLetter: SEK 4,900
(rises to SEK 5,900 after 15 September 2011)

Paris, France

Interactive Techniques for Instructor-Led Training
15-17 November 2011

Organized by best-selling French author Bruno Hourst and his colleagues at Mieux Apprendre ( http://www.mieux-apprendre.com/ ), this 3-day workshop helps you design and conduct different types of effective training games, simulations, and activities. Based on 30 years of field research, these design formats enable you to create training faster, cheaper, and better.

Thiagi will facilitate this workshop in English and his colleagues will provide simultaneous translation into French.

Registration fee for individuals: 800 Euros

For more information, send an email to contact@mieux-apprendre.com .

Special 1-day Workshop
18 November 2011

This workshop will cover a variety of additional topics and will involve Thiagi and his French colleagues.

Registration fee for individuals: 200 Euros

For more information, send an email to contact@mieux-apprendre.com .

Single Topic Survey

The Virtues of Virtual Training
by Tracy Tagliati

Delivering corporate training at multiple sites around the world can be challenging. The solution for many has been virtual classrooms that are conducted in real time using web-conferencing tools. Companies that provide web-conferencing services promise that training can be delivered anytime, anywhere, and to anyone with access to an Internet-enabled computer. It's no wonder that virtual classrooms are one of the fastest growing trends in the educational uses of technology. So, if virtual training is that great, why isn't everyone using it?

Critics of virtual learning argue that too often it tends to be a dry content dump that relies mainly on text and graphs to convey information. They add that typically the instructional design lacks any participant interaction with the content, with the other participants, or with the instructor.

Supporters appreciate the convenience of attending a virtual real-time workshop from their desk. They value the time and the money saved, especially the expenses associated with travel to the training venues.

What are your thoughts?

Poll Question

Are you a supporter of virtual real-time web-based training?

Vote

(The poll opens in a new window.)

Open Question

In your opinion, what are some of the advantages and disadvantages of using virtual real-time web-based training tools?

Respond

(The survey opens in a new window.)

You may include your name along with your response, or if you prefer, keep it anonymous.

We asked some of our colleagues and here is what they had to say:

Dave: Our company has multiple locations that are geographically separated by long distances. In the past, many of the locations worked in silos. Video conferencing in real-time has allowed employees from multiple locations to network, problem solve, and work collaboratively with each other.

Carol: I find that some participants feel isolated and lonely during the virtual learning process. I think this is due to the lack of a shared physical space with other participants and with the instructor.

Michael: In my experience, participants who voluntarily enroll in a virtual real-time session tend to be highly motivated and are likely to cope well in isolation. By contrast, participants seem to resent the virtual real-time session if they perceive it as a cost-cutting strategy.

Survey Results

To Tell the Truth
by Tracy Tagliati

Last month we asked if you think it is acceptable to lie during training.

Here are the results:

Of those of you who responded, 16% said “Yes”, 33% said “No”, and 51% said “It Depends”.
(Percentages reflect 55 votes received by August 2, 2011.)

We also asked you for your thoughts and experiences about this topic. Here are a few of your responses.

Response 3) Sometimes I magnify part of a bigger issue or give an “inexactitude” to make or reinforce a point. I don't consider this lying if it helps participants learn. —Brian Tayor, Sydney

Response 4) Like some of the others, I reckon it's OK to exaggerate/magnify when your purpose/intent is to facilitate/support learning. Maybe one test may be whether your intention is to “exploit” or “manipulate”! It's an interesting question. —Mike Sherry, Woodend, VIC

Response 6) I will sometimes use deception to make a point—especially if it's important for people to question their assumptions. But I always “come clean” in the end and point out that I was playing a role. If you do too much of this, people become suspicious of everything you do and they spend more time looking for the “trick” than focusing on the learning.

See more readers' responses or add your own.

Thank you for your responses.

Topical Tweets

Five Phases of Evaluation

Here is a collection of tweets (begun a few months ago) on the topic of evaluating training products and programs. Follow @thiagi for the latest tweets on different aspects of the design and facilitation of training activities.

  1. Evaluation is typically inserted before or after “implementation” in most ID or performance technology models. This is dysfunctional.
  2. Evaluation after development or production wastes resources. Revisions at this stage are expensive.
  3. IMHO, evaluation is not a discrete step. It is an integral part of analysis, specification of requirements, design, and so on.
  4. In the real world, I use five convenient phases of evaluation. They are integrated with each other and with the total process.
  5. Phases: 1. Initial debugging, 2. Expert appraisal, 3. Developmental testing, 4. Typical use testing, 5. Long-term validation.
  6. Five phases of evaluation, integrated with each other and with the steps of ID (instructional design OR intervention design).
  7. First phase of evaluation is “initial debugging”. Conduct it with your initial ideas and scribbles on the back of an envelope.
  8. During initial debugging, you evaluate crude ideas, specifications, your personal capacities, and the plans for the project.
  9. Why initial debugging? To specify requirements and outcomes. Also, to make go/no-go, do/drop decision.
  10. Initial debugging is an introspective activity. You are both the client and the designer. You take on the roles of the SMEs and end users.
  11. What do you evaluate during initial debugging? Requirements for the training package. Objectives and metrics. Crude prototype.
  12. Formative purposes of initial debugging: improve objectives, metrics, media and method selection.
  13. Formative purpose of initial debugging: Increase the probability that the intervention will reduce the performance problem.
  14. Summative purpose of initial debugging: Decide if you (and your team) can handle the project and if the objectives are realistic.
  15. Guidelines for initial debugging: Lay aside the specs for a couple of days to incubate. Return to take a fresh look.
  16. Review needs analysis, task analysis, and target-population information. Review budget and constraints.
  17. Prepare a checklist to counteract your personal biases. Example: Am I using too many activities?
  18. Use an appreciative-inquiry approach. What are you doing right and how can you leverage it?
  19. Take a broad systems look: In the overall scheme of things, is your intervention worth a hill of beans?
  20. Phase two of the evaluation process is expert review. You ask different experts to review the technical aspects of the intervention.
  21. When do you conduct expert review? Same as when you conduct initial debugging. Also whenever intervention is revised significantly.
  22. Why expert review? Because there are technical errors that the end users cannot detect.
  23. Summative purpose of expert review: To get the quality of your intervention certified by different experts.
  24. What types of experts should review your intervention? Content experts, intervention experts, target-population experts, and language editors.
  25. During initial stages of expert review, use people familiar with your project. Later, use experts who are unfamiliar with your project.
  26. Focus each expert on her domain. For example, don't require technical experts to do language editing.
  27. During expert review, use two experts to review each area. Differences of opinion between these two could be enlightening.
  28. Ask experts to make changes in your documents instead of telling you what's wrong with them and what you should do.
  29. Wait until all other expert reviews are completed and revisions made before asking an editor to do the language review.
  30. Avoid using well-recognized authorities to provide expert review of your interventions. They tend to be prima donnas.
  31. Next phase of evaluation: Developmental testing.
  32. Developmental testing is one-on-one tryout of the prototype materials with typical end users.
  33. Developmental testing involves in-depth observation of individual learners working through your prototype package.
  34. During developmental testing, you evaluate the prototype version of the training package.
  35. I learned how to do developmental testing from Susan Markle and from Robert Horn in the late 60s.
  36. A key concept in developmental testing is to start with a lean version of the training package. Begin with the bare minimum.
  37. The formative purpose of developmental testing is to eliminate glitches through individual (or small group) testing.
  38. The summative purpose of developmental testing is to decide whether the project is worth pursuing or whether it should be dumped.
  39. In developmental testing, we trade off the number of people involved with the amount of fine-grain data collected.
  40. Research by Kandaswamy & Stolovitch shows that cheaper developmental testing works as well as more expensive large-scale field testing.
  41. Revisions made on the basis of in-depth data from a few are as good as revisions based on data from many.
  42. To start developmental testing, give the prototype package to the participant and encourage her to ignore you and work through it.
  43. If you are developmentally testing self-instructional materials or online courses, let the participant interact. Get out of the way.
  44. If you are testing instructor-led training, fill in what the instructor will say. Stick to the outline.
  45. Developmental testing involves trying out your prototype package with one learner at a time and collecting actionable data.
  46. Give the prototype package to the learner and ask her to work through it. Explain that you are collecting data to improve it.
  47. Explain ground rules: You are an observer, not an explainer or teacher. Learner follows the instructions on the package.
  48. If the learner asks for directions, just say “Follow the instructions in the document. I'm not permitted to talk.”
  49. As the learner goes through the developmental testing, take notes on a duplicate copy. Note down remarks, reactions, and responses.
  50. Don't begin developmental testing by saying, “Help me improve my prototype training package.” Say, “Help me improve this prototype.”
  51. The prototype should require frequent responses. If it does not, ask questions from time to time during the developmental testing.
  52. Include a confusing sentence or an error in the first section of the prototype. Reinforce the learner when she comments on it.
  53. If the learner gets stuck during developmental testing, encourage her to think aloud. Ask her what's stumping her and what she's doing.
  54. During developmental testing, if the learner is not able to figure out the answer, explain things to her as a last resort.
  55. Rather than talking to the learner, type or write additional instructions and explanations and ask her to read and use them.
  56. At the end of developmental testing, debrief the learner's experience with suitable questions.
  57. After developmental testing, debrief by requiring a summary of key points. Ask the learner to identify the most difficult and most interesting topics.
  58. Ask the learner what you could do to make the prototype training package more effective and interesting.
  59. At the end of developmental testing, revise the prototype training package. Clean it up and get ready for the next tryout.
  60. Continue the tryout and revision cycle until the prototype package appears to produce consistent learning results.
  61. Next phase of evaluation: typical-use testing.
  62. Typical-use testing is like road testing or flight testing. Watch what happens in the real world with authentic trainers and learners.
  63. Start typical-use testing when developmental testing produces consistent and satisfactory results.
  64. During typical-use testing, check out the materials and activities as implemented by real trainers and learners.
  65. Whenever I facilitate a workshop that I designed, it works great. Whenever someone else conducts the same workshop, it sucks.
  66. Formative purpose of typical-use testing: to improve the support provided to implementers.
  67. Summative purpose of typical-use testing: to check whether the intervention will work in the absence of the designer.
  68. In typical-use testing, work with trainers who were not involved in the earlier design process.
  69. Typical-use testing involves a lot of participants, preferably stratified into different groups.
  70. Typical-use evaluation guidelines: Prepare different instruments to measure input, process, and output data.
  71. Keep your hands off. Let someone else conduct typical-use testing. Hire an evaluator who is not your friend.
  72. Collect pretest data on the participants. Collect competency data on the implementers.
  73. During the implementation, collect process data: time spent, interest level fluctuations, glitches, and participant comments.
  74. Collect output data in terms of improved participant performance and transfer.
  75. After the typical-use evaluation, compare the output data with the objectives. If they match, you are in business.
  76. If the output data does not match the objectives, examine process data. Check if facilitator instructions were followed correctly.
  77. Process data includes facilitator behaviors and participant behaviors during the training activities. Revise as needed.
  78. Suitable changes after typical-use testing: additional modules, enhanced facilitator's guide, troubleshooting ideas.
  79. Next phase in evaluation: Long-term validation.
  80. Long-term validation is the “final” phase in my evaluation approach. This phase is repeated once every 6 months.
  81. Long-term validation is seldom budgeted. Yet, it's an important evaluation activity.
  82. You begin long-term validation 6 months after your intervention has been implemented and the novelty has worn off.
  83. Formative purpose of long-term validation: to identify and incorporate local revisions and improvements to the intervention package.
  84. Summative purpose of long-term validation: to pull the plug on the intervention on the basis of long-term data.
  85. During long-term validation, you collect data from end users (both new and old), implementers, and beneficiaries.
  86. During long-term validation, be as unobtrusive as possible. Use existing data collected for other purposes.
  87. During long-term validation, focus on long-term payoff data. Also check for unanticipated consequences.
  88. During long-term validation, use experts to review the timeliness of content. Check for compatibility with the current environment.
  89. Here are principles that I use to increase the effectiveness and efficiency of evaluation:
  90. Incorporate evaluation in all other design and development steps, from front-end analysis to final implementation.
  91. Harness team power. Whenever possible, use focus groups instead of individual interviews. Involve everyone in the process.
  92. Make optimum use of computer software and online tools. You can use software during all phases of evaluation.
  93. Get more out of less data. There is no correlation between quantity of data and quality of research.