Listening to the Learner: Continuous Improvement in Course Development
We regard a continuous improvement (CI) process as vital to any effective training or learning program aimed at adult learners. “Listening to the learner” can dramatically improve the quality and impact of your content and courses. At MindEdge, we believe the process of continuous improvement must begin with goal-oriented advance planning, followed by appropriate revisions and enhancements based on learner feedback.
The five steps
Our continuous improvement process has five steps that apply to both online and face-to-face settings. For the purposes of this discussion, we’re going to describe the process MindEdge employs for developing its online courses and educational solutions.
1. Establish learning goals.

We begin course development by thinking about what the learner will take away after he or she has completed the course or instructional activity. At this stage, we ask ourselves: What body of knowledge should learners have mastered after completion? What should they be able to do with this knowledge? How can this be assessed? Will there need to be evidence of student learning for accreditation? What will be considered an acceptable performance level?
Spending the time up front on establishing these goals makes course development easier and results in a well-thought out experience for the learner. (It’s worth heeding the advice of Abraham Lincoln on the question of preparation: “Give me six hours to chop down a tree and I will spend the first four sharpening the axe.”)
2. Define learning objectives and create feedback and assessment mechanisms.
Once learning goals are established, it’s time to develop learning objectives and establish a way to collect feedback and assess progress. Learning objectives are brief, clear, specific statements of expected student performance: “The learner will be able to calculate probability and explain its practical application in assessing project risk.”
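As a concrete illustration of that objective, here is the sort of exercise such a course might include (a sketch only, with an invented risk register), computing expected monetary exposure as probability times impact:

```python
# Hypothetical risk register for a project: (risk name, probability, cost impact in $)
risks = [
    ("vendor delay",   0.30, 20_000),
    ("scope creep",    0.50, 15_000),
    ("staff turnover", 0.10, 40_000),
]

# Expected monetary value of each risk = probability of occurring x cost impact
for name, probability, impact in risks:
    print(f"{name}: expected loss = ${probability * impact:,.0f}")

# Summing expected losses gives the project's total expected risk exposure
total_exposure = sum(probability * impact for _, probability, impact in risks)
print(f"total expected exposure = ${total_exposure:,.0f}")  # $17,500 for these figures
```

A learner who can both perform this calculation and explain why the figures matter for project planning has met the objective at the application level, not just the knowledge level.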
We craft learning objectives to cover the range of learning in Benjamin Bloom’s taxonomy, with a focus on the cognitive domain (knowledge, comprehension, application, analysis, synthesis, and evaluation).
The course development quality review process presents an ideal opportunity to compare learning objectives against the instructional presentations, exercises, and activities in the course. Our reviewers have a different role from the subject matter experts or instructors who develop the course and who often “know too much.” Especially when the reviewer is new to the discipline or area of knowledge, he or she brings a fresh set of eyes and can assess the course as a new learner rather than a seasoned expert.
During this step, we ask a number of CI questions:
- Do reviewers find the course learner-friendly?
- Are terms and concepts clearly defined?
- Are there connections made with real-world applications (always appreciated by adult learners)?
- Is there an appropriate balance between knowledge, comprehension, application, analysis, synthesis, and evaluation?
- Is there attention paid to differing learning styles (visual, auditory, tactile/kinesthetic)?
Performance measures (open-ended questions, multiple-choice questions, short answer, interpretative essay questions) offer another channel for feedback. It’s important to make sure that these measures are aligned with the course learning objectives, so that what you measure is what needs to be mastered.
3. Identify any gaps between learner performance and learning objectives based on feedback.
We have found that the most valuable feedback is the actual learner feedback we receive once the course is live. Adult learners generally are not shy about expressing themselves, so we provide multiple opportunities for learner response: feedback forms (based on IACET-approved questions) embedded in our courses and an “Ask the Expert” function that lets learners send us questions as they work their way through the course.
Monitoring and analyzing learner feedback represents a marvelous opportunity for continuous improvement. In some cases, adult learners will tell you directly where improvements are called for (“A fuller explanation of Tuckman’s model of team development, please.” or “How to arrive at the payback period isn’t clear to me.”), and revisions are straightforward. In some cases it may make sense to provide supplementary explanations or additional exercises.
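To illustrate the second request above: the payback period is simply the time it takes cumulative cash inflows to recoup an initial investment. A minimal sketch, with hypothetical figures:

```python
def payback_period(investment, annual_inflows):
    """Return the years needed for cumulative inflows to recoup the investment.

    The partial year of recovery is interpolated; returns None if the
    investment is never recovered within the given inflows.
    """
    cumulative = 0.0
    for year, inflow in enumerate(annual_inflows, start=1):
        if cumulative + inflow >= investment:
            # Fraction of this year needed to cover the remaining shortfall
            return (year - 1) + (investment - cumulative) / inflow
        cumulative += inflow
    return None

# Hypothetical project: $10,000 up front, $4,000 returned each year
print(payback_period(10_000, [4_000, 4_000, 4_000]))  # 2.5 years
```

Walking through one such worked example, step by step, is often exactly the kind of supplementary explanation that resolves this sort of learner question.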
And since performance measures are linked to learning objectives, we can also tell whether the course needs improvement by looking at test scores and student work. If it becomes clear that, for example, many learners cannot calculate probability and explain its practical application in assessing project risk, then it’s time to make changes in the course to address that gap.
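One way to act on that link between performance measures and objectives is to tag each assessment item with its learning objective and tally pass rates. A minimal sketch, assuming hypothetical tagged results and an 80 percent mastery threshold (both invented for illustration):

```python
from collections import defaultdict

# Hypothetical assessment results: (learner id, objective tag, answered correctly?)
results = [
    ("s1", "calc-probability", False),
    ("s2", "calc-probability", False),
    ("s3", "calc-probability", True),
    ("s1", "define-risk", True),
    ("s2", "define-risk", True),
    ("s3", "define-risk", True),
]

MASTERY_THRESHOLD = 0.80  # assumed acceptable performance level

tallies = defaultdict(lambda: [0, 0])  # objective -> [correct count, total count]
for _, objective, correct in results:
    tallies[objective][0] += int(correct)
    tallies[objective][1] += 1

# Objectives whose pass rate falls below the threshold are candidates for revision
gaps = {obj: correct / total for obj, (correct, total) in tallies.items()
        if correct / total < MASTERY_THRESHOLD}
print(gaps)  # here only "calc-probability" falls short
```

With reporting like this, the decision about where to revise follows directly from the data rather than from guesswork.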
4. Revise content to address gaps.
For the most part, online courses allow an accelerated continuous improvement process. Revisions can be rapidly incorporated, even into a live course—with proper planning. One of the questions we ask ourselves when making decisions about course development is: “How will this decision affect our ability to continuously improve based on learner feedback?” Our course structure has become more modular over time because of this consideration.
5. Monitor feedback to ensure effectiveness.
A continuous improvement process should be…continuous. That means continuing to monitor feedback, especially when learners have indicated (through comments or performance) that there are problem areas or barriers to learning. When learners bring challenges to our attention, we like to follow up with them and make sure that the revisions have addressed whatever problems they’ve encountered. It can take several iterations to find the optimal way to teach something.
Room for innovation
Along with this five-step approach, we also actively seek fresh ways to improve the way we teach and develop courses. We don’t want our overall CI process to become overly reactive—too focused on “checking and fixing” at the expense of innovation and potential quantum improvements. That’s why we review our courses on an ongoing basis, always looking to incorporate the latest insight or best practice into what we do.
Copyright © 2009 MindEdge, Inc.