Workshops by Thiagi, Inc.

Faster, Cheaper, Better


by Sivasailam "Thiagi" Thiagarajan

A modified version of this article was published in Educational Technology [Volume 42, Number 3, May-June 2002].


Some Opening Thoughts

Instructional designers have a favorite statement: You can have any two of these three--faster, better, or cheaper--but never all three.

During the past 10 years of instructional design, I have discovered that it is possible to have all three (faster, cheaper, and better) by using unconventional strategies. My colleagues who have embraced these strategies agree with me. We all feel that faster, cheaper, and better are strongly associated with each other.

There is an obvious connection between faster and cheaper: Since the major cost of instructional design is directly related to the time spent by the designer and subject-matter experts, a faster project turns out to be a cheaper one also. But the connection between faster and better appears to be contradictory. Isn't faster usually associated with cutting corners and producing sloppy materials? Contrary to that assumption, actual field results indicate that faster instructional design produces better outcomes (as measured by improvements in learners' performance).

We are still trying to explain this paradoxical outcome. At present, we have only a few guesses about why it happens.

Case Studies

The strategy that we use for producing faster, cheaper, and better training is not a procedural model, but a series of principles to be flexibly applied to each training context. Before exploring these principles, let's take a look at a few composite case studies based on our recent training-design projects.

Application Program. The training objective for this project was to use a proprietary application software program to create databases with specific fields and functions. We began this training design project by constructing the final performance test: You are given an in-basket folder with 20 dossiers. You have to create a database with a set of required fields and populate it with information from the dossiers. You pass the test if you can print out a report that supplies a summary of the data in a specified format. We actually created six parallel versions of this performance test, each with a different set of information and specifications. During the training session, participant teams had access to reference manuals and videotaped demonstrations of the steps for designing specific database functions. Each team also had 45 minutes of targeted coaching from an expert. Teams were permitted to take the performance test twice for practice. Each practice test was different from the other and from the "real" final test administered to individual participants.
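The proprietary application itself is not shown in the original project description. Purely as an illustration of the kind of artifact the performance test asked for, here is a minimal sketch in Java of an in-memory "database" with required fields and a summary report; the class name, field names, and sample data are all invented for this sketch:

```java
import java.util.ArrayList;
import java.util.LinkedHashMap;
import java.util.List;
import java.util.Map;
import java.util.TreeMap;

public class DossierDatabase {
    // Each record maps field names to values; the fields are illustrative.
    private final List<Map<String, String>> records = new ArrayList<>();

    // Add one dossier's data as a record with the required fields.
    void add(String name, String department) {
        Map<String, String> record = new LinkedHashMap<>();
        record.put("name", name);
        record.put("department", department);
        records.add(record);
    }

    // Summary report: number of records per department, sorted by key.
    Map<String, Integer> summaryByDepartment() {
        Map<String, Integer> counts = new TreeMap<>();
        for (Map<String, String> record : records) {
            counts.merge(record.get("department"), 1, Integer::sum);
        }
        return counts;
    }

    public static void main(String[] args) {
        DossierDatabase db = new DossierDatabase();
        db.add("Lee", "Sales");
        db.add("Park", "Sales");
        db.add("Kim", "Support");
        // Print the summary in a fixed format, one department per line.
        db.summaryByDepartment().forEach(
            (dept, count) -> System.out.println(dept + ": " + count));
    }
}
```

A real test, of course, would specify the exact fields and report format; the point here is only that the test verifies an observable product, not the learner's ability to describe the steps.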

Java Programming. The training objective for this project was to enable participants to write a simple Java program. To design the training package, we used a modified version of the "extreme-programming" methodology borrowed from software engineering. An instructional designer (ID) and a subject-matter expert (SME) shared a single computer. The ID asked a series of questions (such as "What exactly will I be able to do at the end of this training session?") and used the SME's responses to create the prototype instructional material. The SME and the ID took turns at the keyboard to write and revise the instructional material. They continued their interaction until both were satisfied with the quality of the material. Then they sent for a representative learner who had been lurking in the waiting room, reading back issues of People magazine. This learner tested the instructional material by working through it and completing each exercise, thinking aloud when stuck. Whenever this happened, the ID or the SME silently grabbed the keyboard and made appropriate revisions to the materials to see if the learner was able to proceed. At the end of the test session, the ID debriefed the learner (with questions such as, "What was the most difficult concept?") and made immediate revisions based on learner feedback. They sent the learner back to the waiting room and continued to work on the next section of the module.
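The article does not show the program the learners wrote. As a hedged illustration only, here is the scale of "simple Java program" such a session might target; the class and method names are invented:

```java
// A minimal sketch of a first Java program: a class with one
// testable method plus a main entry point.
public class Greeter {
    // Build a greeting for the given name.
    static String greet(String name) {
        return "Hello, " + name + "!";
    }

    public static void main(String[] args) {
        System.out.println(greet("world"));
    }
}
```

Keeping the behavior in a small named method (rather than only in main) gives the ID and SME something concrete to test the learner against, in the same spirit as the performance tests in the other case studies.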

New Process. The training objective for this project was to introduce a new procedure for converting market data into a new product through engineering specification, blueprinting, prototyping, beta testing, and final release. All employees in a chip manufacturing company were scheduled to attend this training. While technical manuals provided obsessive details of each stage in this new procedure, we decided to conduct a face-to-face facilitated briefing with groups of 30 participants. The session began with a panel of four experts, each making a 99-second presentation on what they considered to be the most important element of the new procedure. After this, participants were organized into five teams of six members each and asked to generate questions reflecting their immediate concerns. After 5 minutes, teams took turns to read one of their questions. A different panel member answered each question and the other three members provided additional details if needed. Teams took care to ask the really important questions, avoid duplicate questions, and take copious notes. After 30 minutes of this "press-conference" format, each panel member made another 99-second presentation dealing with critical elements that were missing from the earlier questions and answers. Each team now had 10 minutes to summarize the key points about the new procedure, limiting their list to a single page of flip chart paper. As a final activity, all participants went on a gallery walk to review the posters created by the other teams and to award score points to reflect the usefulness of each poster.

Sales Letters. The training objective for this project was to write direct mail sales letters. We began our design with the construction of the final performance test: You have to write a sales letter for an existing item in the catalogue and mail it to 100 people randomly selected from the in-house mailing list. You pass if five or more people who received your sales letter send in their orders. We rapidly cranked out the other components of the training package: We designed a checklist for writing winning sales letters, collected and annotated the top ten sales letters from the past year, and assembled four how-to books. We made sure that all learning resources were aligned with each other. During the actual training session, participant teams created sales letters for a specific product by using these resources and by consulting with the facilitator. The sales letters from different teams received feedback from the other teams and from a panel of expert judges. After the session, participants continued to work independently until they successfully passed the performance test.

Selling Automobiles. The training objective for this project was to provide automobile salespeople with product knowledge about new models. The instructional designer assembled existing materials ranging from slick sales brochures to dull technical manuals. A collection of these materials was shipped out to each participant with a tongue-in-cheek note admonishing them to diligently review the content or face potential public humiliation during the ensuing workshop. At the workshop, the facilitator organized participants into teams and asked them to prepare a series of closed questions (such as "What pieces of equipment are included in the basic package?") and open questions (such as "What benefits would you emphasize to a Soccer Mom who is interested in buying a minivan?"). The facilitator combined these questions with other questions prepared by experts and used them to conduct a quiz tournament.

Team Facilitation. The training objective for this project was to improve the facilitation skills of managers in a multinational high-tech corporation. The project lasted for about four months, with training design and delivery taking place simultaneously. The course used a series of e-mail games. Participants were encouraged to learn from their workplace experiences and by using different books, videotapes, and web sites. They spent about 30 to 45 minutes every day and responded to each round of the game within 48 hours. In different e-mail games, participants identified characteristics of effective facilitators, defined critical features of each characteristic, discovered potential dangers associated with each, and developed practical facilitation strategies. Later games involved participants sharing facilitation problems, offering alternative solutions, evaluating the strengths and limitations of these solutions, and recombining them into improved ideas. The content generated by the first group of participants was re-used with ensuing groups. However, all participants were required to create and submit their own responses and solutions before reviewing those from earlier groups.

Basic Principles

A common theme that stands out clearly among all our faster-cheaper-better training design activities, especially in contrast to the traditional ISD (Instructional Systems Design) model, is the focus on activities instead of content. We believe that an effective training package should contain these ABC elements: activities, behavioral outcomes, and content. We also believe that these elements should be tightly aligned with each other to avoid teaching one thing, testing something else, and using a procedure that proclaims "Don't do what I do. Just do what I say!"

Traditional approaches to training design start with the goal and proceed through task analyses to identify content elements and break them down to a molecular level. Our faster-cheaper-better approaches also begin with the training goal. We immediately convert this goal into a performance test and identify suitable learning activities to help participants master the skills and concepts needed to pass this test. We use learning activities as containers for incorporating and managing existing content resources rather than creating our own content.

Additional Principles

Here are some additional principles that we have teased out of our successful projects. Sometime in the future, we will apply reconstructed logic and claim that these principles were systematically derived from our background in cognitive sciences, knowledge management, and complexity theory. We may also create a packet of job aids, conduct a training workshop, and retire to Florida. In the meantime, however, in the true spirit of collaborative design, we offer the following raw list of training design principles for your applications, comments, and sarcastic remarks:

  1. Accept the fact that you are never going to produce "The Final Version" of any training package. Instead, keep continuously updating and upgrading training materials and methods.
  2. Avoid analyzing the content into meaningless, disjointed learning objects.
  3. Don't confuse the ability to talk with the ability to perform; focus on performance.
  4. Combine delivery and evaluation with revision activities.
  5. Develop effective job aids, and then train participants to use those job aids.
  6. Don't waste time by trying to amuse participants with irrelevant activities that are fun. Instead, focus on designing and using relevant activities that are engaging.
  7. Incorporate content generated by current participants in future versions of the training package.
  8. Increase participant motivation by ensuring that what is being taught is directly relevant to success in the work place.
  9. Invite and encourage the trainer to become a co-designer and facilitator.
  10. Locate different types of existing content materials and incorporate them into training activities.
  11. Make the training session simulate on-the-job training as closely as possible.
  12. Motivate participants with a combination of cooperation and competition.
  13. Present content elements in a raw form and invite learners to organize them into meaningful clusters.
  14. Require participants to work in teams and to learn collaboratively from each other.
  15. Re-use effective templates for learning activities in future training packages.
  16. Select and use activity templates that match the type of learning content and objectives.
  17. Shift a significant part of the responsibility for training design and delivery to participants themselves.
  18. Treat all evaluation as formative: Always use evaluation feedback to improve the training package.

Some Closing Thoughts

Three decades ago, when I first entered the United States--and my career as an instructional designer--I remember a prediction that programmed instruction (the precursor to systematic instructional design) was dying of premature hardening of categories. I was skeptical of the claim, since everybody else was proclaiming that the educational revolution had finally been won. But the prediction was true. The rigid behavioristic model that informed instructional design during those days was soon replaced by an information-processing model and later by a constructivist model. This repetitive history has transformed me into a rigid eclectic. My model for instructional design is to combine what works, irrespective of what ism it is based on. I have borrowed principles and procedures indiscriminately from such disciplines as creativity, chaos theory, complexity, improvisation, self-adaptive systems, anthropology, and paradox management. In the process of continuously changing my design models, I have figured out that I am a long way from arriving at a final version.

And I hope never to arrive at one.