The Myth of e-Learning Levels of Interaction

For years the e-learning industry has categorized custom solutions into three or more levels of interactivity, from basic to complex, simple to sophisticated. The implication is that learning effectiveness increases with each higher level of interactivity.

You don’t have to look hard to find them:

Levels of Interactivity in eLearning: Which one do you need?

CBT Levels

How Long Does it Take to Create Learning?

These kinds of categorizations surely originated in the vendor community as it sought ways to productize custom development and make it easier for customers to buy standard types of e-learning.  I won’t quibble that “levels of interactivity” has helped simplify the selling/buying process in the past, but it’s starting to outlive its usefulness.  And it does a disservice to intelligent buyers of e-learning.  Here’s why:

1. The real purpose is price standardization

Levels of interactivity are usually presented as a way to match e-learning level to learning goals.  You’ve seen it: Level 1 basic/rapid is best for information broadcast, Level 2 for knowledge development, and Level 3 and beyond for behaviour change, or something to that effect.  In reality, however, very simple, well-designed low-end e-learning can change behaviour, and high-end e-learning programs can wow while providing no learning impact whatsoever.

If vendors were honest, they would admit that the real purpose of “levels of interactivity” is to standardize pricing into convenient blocks that make e-learning easier to sell and purchase.  “How much is an hour of Level 2 e-learning again?  OK, I’ll take three of those please.”  Each level of e-learning comes with a pre-defined scope that vendors can put a ready price tag on.

It’s a perfectly acceptable product positioning strategy, but it’s not going to get you the best solution to your skill or knowledge issue.

2. Interactivity levels artificially cluster e-learning features and in doing so reduce choice

Most definitions of what features are included in each of the “levels” are vague at best.  Take this widely used definition from Brandon Hall:

[Image: Brandon Hall’s levels of interactivity]

It’s hard to imagine looser definitions of each level.  In fact, a variety of factors drive scope and effort (and therefore price) in a custom e-learning program, and they go well beyond “interactivity”.  They include interface choices, type and volume of graphics and media, instructional strategies, existence and quality of current content, and yes, the type and complexity of user interactions.

Each of these features has a range of choices within it, but the levels of interactivity approach tends to cluster everything into one bucket and call it a level.  It might look something like the following:

[Image: Levels of e-Learning]

A Level 1 program is essentially a template-based page turner with a few relevant (or maybe not so relevant) stock images and interactivity limited to some standard multiple choice self-checks.  In contrast, a Level 3 program is loaded with a custom-designed interface, user controls, media and graphics, along with the complex interactions assumed to be required for simulations and scenarios.  Level 2 is the middle choice most buyers and vendors alike are happy to land on.  None of these choices, by the way, has anything to do with accomplishing a desired learning outcome, but that’s another discussion.

If this artificial clustering of features was ever true, it’s not any longer.  Advanced simulations and scenarios can be created with very basic media and user interface features.  Advanced custom interfaces and controls with rich custom media are often used to support simple information presentation with very little interactivity.  Powerful scenario-based learning can be created with simple levels of interactivity.  Rapid e-learning tools, once relegated to the Level 1 ghetto, can create quite advanced programs, and custom HTML/Flash can just as easily churn out page turners.  Out-of-the-box avatars can be created in minutes.

This clustering of features into three groups gives you less choice than you would receive at your local Starbucks.  If I’m a customer with a learning objective that is best served by well-produced video and animations followed by an on-the-job application exercise, I’m not sure what level I would be choosing.  A good e-learning vendor will discuss and incorporate these options into a price model that matches the customer requirement.

3. It reduces effectiveness and creativity

Forcing solutions into one of three or four options stunts creative thinking and pushes the discussion towards media and interactivity rather than where it should be: closing a skill gap.

4. It hurts informed decision making

It appears to create a common language but actually reinforces the myth that there are only three or four types of e-learning.  The combinations of media, interactivity and instructional approaches are as varied as the skill and knowledge gaps they are meant to address.

5. It encourages a narrow vision of e-learning

e-Learning has morphed into new forms.  Pure e-learning is already in decline, cannibalized by mobile and social learning and the glorious return of performance support.  These approaches are much more flexible and nimble at addressing knowledge and skill gaps.

Flipped Classroom – A Student Perspective

My son is in a graduate program in Medical Physics at the University of Toronto.  I sent him this recent article from the Atlantic on the concept of the flipped classroom (in higher education).

The Post-Lecture Classroom: How Will Students Fare?


He sent a thoughtful email response which was interesting from a student perspective (and it was nice to see more words from him than “send money”!).  He just finished reading Quiet by Susan Cain, and I like the connection he makes here.

Good article dad. I’d be interested to actually read the research article  myself, to look at sample sizes and statistical analysis showing their confidence intervals for the data and how statistically significant it is. I think a push towards new and improved teaching techniques is awesome, but you have to be really cautious with it. They mentioned that one year there were student presentations with discussions led by those students, which I have had several courses implement, and it’s a pretty mixed bag. At their worst, those presentations were students presenting the material in just as much of a dry, “Powerpoint poisoning” kind of way as possible. In one class, they were a bit more successful because the prof really helped structure the discussions so we all got a lot out of it.
 
I also have experience with the technique of having us read the book before class and answer clicker questions during lecture, to root out common misconceptions. There were online quizzes before lecture (very simple) on the readings to make sure we did them. I think that this was a VERY effective technique, and made me feel much more engaged in class, and definitely succeeded at giving the professor the information they needed in order to teach the class well, addressing problems that were common in the class.
 
I’m also uncertain about the whole discussion and collaboration in class. That CAN be a good way of engaging the class, but having read “Quiet” by Susan Cain, it definitely seems to fall squarely into this push towards an “extrovert ideal” in education, where it is really designed to most benefit those with more extroverted personalities, and can actually lead to less creativity and innovation, since the consensus in groups will be built by those with the loudest opinion, not necessarily the most informed one. That doesn’t mean this type of teaching is without merit, but I think it should be used with caution, and not be made the centre of the curriculum.
 
Lastly, I think that it’s always important to consider what is actually being taught in these kinds of studies, and not apply it to other teaching subjects without due consideration. Graduate Pharmacology seems like a perfectly suited topic for this kind of approach, since everyone is aware of the foundations of the subject, and can focus discussion on “higher-level” stuff like clinical trials. I can tell you that, in my experience, not all attempts to deviate from standard lectures are successful, and there are times, especially at a foundational level, where I personally think that it’s much more beneficial to acquire the knowledge in a more traditional lecture setting, even if it’s less engaging.

Learning Failure in 7 Easy Steps

We all make mistakes. We know better, but we follow old ways or accept cultural practices that don’t work. There are patterns that produce successful projects and those that lead to failure (see the project death march). I did a recent presentation on the classic mistakes we make in the planning, design and implementation of learning (and how to avoid them). I finished the session with a tongue-in-cheek 7-step prescription for a failed learning initiative. Follow them carefully for guaranteed failure.

Step 1: Ensure your program is not connected to a business or performance need

A critical first step.  If your learning initiative in any way contributes to individual or organizational performance, you’re risking success.  Market your program as urgent and essential to convince your audience they need it (while you’re at it, it’s best if you can convince yourself too).  You then have the option to bury the program somewhere deep in your corporate training catalog or roll it out aggressively as a mandatory program.  Both tactics are equally effective at failing dismally.

Step 2: Rely on your fave management training vendor for program content 

Some say learning programs should be driven by real job or role requirements–that the context in which people work should be the source for learning content.  Don’t believe it.  Instead, close your door, pick up the phone and call your favourite training vendor.  Ask them what training you should be providing this year.  They will have the answer you need and a program with all sorts of great content ready to go.  Another option would be to simply gather as much content as you can from the internet.  Have you seen how much information is out there!

Step 3: Choose a solution that suits you rather than your learners 

There are so many ways to deliver and support learning now.  Gone are the days when your only option was a classroom training program.  You probably have your favourite.  Trust your gut.  Go with that favourite.  It will be more interesting for you.  Just be sure your choice is detached from the preferences of your audience.

Step 4: Load up on information. Make no provision for practice

Information-driven learning is one of the best strategies for failure we know of. Designing practice is hard. It’s even harder to design practice that works: on-the-job activities that develop skill in context. So don’t sweat it. There are great examples out there of PowerPoint-driven classroom learning, “click next” e-learning, and social learning that’s all about sharing information without actually using any of it. Mimic those examples and you’ll get closer to putting the nail in the coffin of your failed project. But you’re not quite there yet.

Step 5: Let technology drive your solution

Technology is hip. And they tell us it’s needed to capture the attention of the digital natives entering your organization. So keep it cool. Your program must incorporate the most current devices and tech tools–tablets, smartphones and web 2.0 apps. Don’t think about how they support the objectives of your initiative.

Step 6: Boldly assume the learning will be used on the job

Your program is that good!  It will be more powerful than the current barriers, lack of support and reinforcement that learners will encounter when they are back at work.  Mastery was your goal and no refinement of those skills on the job will be necessary.  Really.

Step 7: Develop your program in a client vacuum

Above all, do not partner with your internal customers to identify business needs or develop a plan to support them through formal or informal learning.  One of the best predictors of success is working collaboratively with your client through the planning, design and development of your program.  Avoid at all costs.  Failure Guaranteed. You’re welcome.


Practice and the Development of Expertise (Part 3)

[Image courtesy of HRVoice]

This is a variation of an article I prepared for CSTD and HRVoice, summarizing research on practice and expertise.  Part 1 introduced the signature skills demonstrated by experts that separate them from novices. Part 2 presented the type of practice that develops experts.  This post gets to the implications I see for Learning and Development and makes the connection to existing approaches that embody the principles of deliberate practice.

It would be easy to position deliberate practice in the formal learning camp. Indeed, for some physical and routine skills, elements of deliberate practice can be built into formal training programs until a learner reaches mastery.  However, in the modern workplace jobs are more complex and demand greater cognitive (versus physical) skill.  The research findings challenge us to consider how we can better support the full novice to expert journey, embed learning and practice in the job, design experience to include practice and reflection, build tacit knowledge, and design rich feedback. In a past post I listed some general principles for using deliberate practice in learning.

Fortunately we have a number of approaches available to us that align well to the conditions of deliberate practice. Most of these approaches are not training in the traditional sense. They do, however, have a structure to them and require significant support. Consider them more non-formal learning than pure informal learning.  Here are some well-defined but under-used learning methods that match well to deliberate practice.

  • Action Learning. Small teams create a plan of action to solve a real business problem. Impacts of these actions are observed, analyzed, lessons extracted and new actions prepared. This cycle of plan, act, observe, reflect embodies the key elements for deliberate practice. The approach has a significant and growing following. Used frequently for management development, it would be great to see it expanded to other types of professional work. See this  earlier post on action learning. 
  • Cognitive Apprenticeship. The standard apprenticeship model updated for modern knowledge work. Instead of demonstrating a manual skill, experts model and describe their thinking to “apprentices” who then work on the same problem while they articulate and verbalize their own reasoning. The expert provides coaching and feedback to encourage reflection. Support is “scaffolded”–gradually released as skills build and confidence is gained.
  • Communities of Practice. Groups with common professional or project goals work together sharing and discussing best practices. In doing so they develop rich tacit knowledge and the hidden “how to’s” that are often missed in formal learning programs. New knowledge is created in the process of collaborating with others. Social media environments can provide a home for the conversations and knowledge that is created.
  • Simulation and Games. Great simulations are a surrogate for real experience and incorporate authentic  work tasks. This allows the learner to attempt challenging tasks, experience failure and learn from errors–all critical elements of deliberate practice.  I like games that model real work and allow for fun, repeatable practice, but worry about “gamification” that uses game mechanics to motivate employees to use the same old ineffective training.
  • Feedback in the Workflow. Wonderful natural feedback exists in the form of business results and performance data. We don’t tend to think of it as a learning tool, but in the context of deliberate practice, it is one of the most powerful. It requires connecting the data to individual or team behavior. It is the cornerstone of approaches to team learning found in improvement methods like Lean, Six Sigma and performance technology. Here’s a post with some ideas on implementing a learning feedback system
  • Stretch Assignments with Coaching. One of the most powerful approaches to “practice” is challenging work assignments that push current capabilities. Already a staple of executive development, we need to see much more of it for other types of professional development.
  • Open Practice Centres. Instead of tired corporate universities and course catalogs populated with learning programs, Practice Centres could provide progressively challenging practice, simulations and work assignments matched to key job roles. Individualized practice is designed to support the full novice to expert journey using the principles of deliberate practice. Learning “content” is considered only in a support role to accomplish practice goals. Here’s an idea for organizing the learning function around practice instead of content and courses. And the core idea applied to Management Development.

These approaches and others like them occupy that fuzzy middle ground between informal and formal learning. Each can be aided significantly by social media/social learning and learning technologies. Most importantly however they are approaches that allow us to apply the research on “deliberate practice” to help improve our organizations and in doing so improve our own professional performance.

Practice and the Development of Expertise (Part 2)

This is a variation of an article I prepared for CSTD and HRVoice, summarizing important research on how practice develops expertise and implications for the learning function.  Part 1 introduced the signature skills demonstrated by experts that separate them from novices.

It seems these signatures of expertise are the result of years of effortful, progressive practice on authentic tasks accompanied by relevant feedback and support, with self-reflection and correction. The research team has labeled this activity “Deliberate Practice”. Others have called it deep practice and intentional practice. It entails considerable, specific, and sustained efforts to do something you can’t do well, or at all. Six elements are necessary for practice (or on-the-job experience) to be “deliberate” practice:

  • It must be designed to improve performance. Opportunities for practice must have a goal and evaluation criteria. The goals must be job/role based and authentic. General experience is not sufficient, which is where deliberate practice varies from more laissez-faire approaches to informal learning. Years of everyday experience does not necessarily create an expert. Years of deliberate practice does.  See my post Everyday Experience is not Enough for a deeper discussion on this.
  • It must be based on authentic tasks. The practice must use real work and be performed in context, on the job. The goal is to compile an experience bank, not a vast list of completed formal training programs.
  • The practice must be challenging. The tasks selected for practice must be slightly outside of the learner’s comfort zone, but not so far out as to produce anxiety or panic. Deliberate practice is hard work and stretches a person beyond their current abilities. The experience must involve targeted effort, focus and concentration.
  • Immediate feedback on results. Accurate and diagnostic feedback must be continuously available both from people (coaches) and the business results produced by the activity. Delayed feedback is also important for actions and decisions with longer term impact as is often the case in knowledge based work.
  • Reflection and adjustment. Self-regulatory and metacognitive skills are essential. This includes self-observation, monitoring, and awareness of knowledge and skill gaps. Feedback requires reflection and analysis to inform behaviour change. Experts make mindful choices of their practice activities.
  • 10,000 hours. For complex work, ten years, or roughly 10,000 hours, seems to be the necessary investment in deliberate practice to achieve expertise. Malcolm Gladwell drew attention to the 10,000 hour rule in his book Outliers and it is one of the most robust findings in this research.  It poses a real challenge for our event-based training culture. Of course the less complex the work, the less time required to develop expertise.

If we aspire to evidence-based approaches to learning it’s hard to ignore this body of research. Among other things, it challenges us to consider how we can better support the novice to expert journey, embed learning and practice in the job, design experience to include deliberate practice, build tacit knowledge, and build rich feedback into our organizations.

Fortunately we have a number of approaches available to us that align well to the conditions of deliberate practice. I’ll outline those approaches in Part 3.

Here is a nice illustration of the key process and concepts of deliberate practice courtesy of  Michael Sahota:

[Illustration: Deliberate Practice]

Practice and the Development of Expertise (Part 1)

I’ve been presenting recently on the research and application of Deliberate Practice for developing expertise in the workplace.  I’ll be doing so again at the upcoming British Columbia Human Resources Management Association (BCHRMA) conference on May 2.  A version of the following article on the topic appeared in the CSTD Learning Journal.  A modified version will also appear in an upcoming issue of HR Voice.  It is posted here in three parts.

******

We all know well-designed practice is critical for effective training.  It’s what differentiates meaningful learning from passive information presentation.  But as work becomes more complex and knowledge-based, are the practice activities we design for our formal learning programs (both classroom and e-learning) enough to meet the need for expertise in the modern workplace? A comprehensive body of research on how professional expertise is developed suggests they may not be.

This research, led largely by Anders Ericsson at Florida State University and popularized in recent books such as Malcolm Gladwell’s Outliers and Geoff Colvin’s Talent is Overrated, indicates that the type of practice needed to develop true expertise is more intensive and “deliberate” than we thought, and that it must be embedded in the context of real work.  It must also occur on a regular basis over much longer periods of time. The research has implications for us as learning and performance professionals.  It argues for a profound shift away from event-based formal learning to approaches that could be categorized as informal learning or learning from experience.  However, it turns out not all experience is created equal when developing expertise, so simplistic notions of informal learning also won’t work.  So how should we rethink the design of practice if it is to truly develop the complex skills of the knowledge workplace?  To answer that question it helps to first understand what expertise looks like.

Characteristics of expert performance

Ericsson’s research has found that top performing individuals at work, besides being very good at what they do, consistently demonstrate the following differences compared to novices and lower performing individuals:

  • They perceive more.  Experts see patterns, make finer discriminations, interpret situations more quickly and as a result make faster, more accurate decisions.  Novices slowly review all information and don’t have the contextual experience to recognize patterns.
  • They know more.  Not only do experts have more facts and details available to them, they have more tacit knowledge, that all-important unconscious “know how” that only comes with experience.  Novices rely on limited explicit knowledge.
  • They have superior mental models.  Experience helps experts build rich internal representations of how things work and how knowledge is connected.  They use these to learn and understand situations more rapidly. Novices rely on simple, sometimes inaccurate, rules of thumb and loosely connected knowledge.
  • They use personal networks more effectively.  Experts know who to go to for help and answers.  Novices are not able to identify and access critical information and people as quickly.
  • They have superior “meta-cognition”. Experts are better self-monitors than novices.  They set goals, self-evaluate against a standard, and make corrections and adjustments more quickly from feedback.


These are skills we want in all employees.  At times they seem to come from an innate ability or deep “competency” unachievable by others.  However, the research shows that while natural ability may play a small role, practice and experience are far more significant.  The nature of this experience is critical.  “Practice makes perfect” is only true for practice of a certain type, which I describe in Part 2.

Epic Learning Fail

If you’ve been in the learning business for a while you’ve likely seen a few examples where learning initiatives have simply missed the mark. They didn’t produce the anticipated return on investment.  Planned performance improvement did not materialize. Learners didn’t develop the skills targeted by the program. Or if they did, the skills certainly aren’t being applied on the job. Maybe all of the above.

Failures like these are more common than we like to think. Sometimes they’re not as visible as failures in other more tangible areas of the business, like manufacturing defects (Toyota’s sticky brakes) or financial misadventures (take your pick).  When substantial resources have been committed to well-intentioned, hard-to-measure initiatives like training, sometimes success is declared when real evidence suggests otherwise (somehow I’m seeing George Bush on the deck of an aircraft carrier here). The phenomenon is not limited to formal training. Informal learning and, I suspect, a few recent social learning/social media efforts have also met with limited success.

If you’re honest, some of those programs might have even been your own (I know a few of mine have had less than stellar impact).  Or you may have had a role in a larger effort.  Perhaps you designed it, or identified the original need, or delivered it, or managed the implementation effort. Maybe you created an impeccable design that was poorly implemented, or vice versa. Learning can derail in many ways.

If we follow our own advice, there is a lot of learning to be had in these partial successes and outright failures. I’m doing a presentation soon titled When Learning Fails: Classic Mistakes and How to Avoid Them. I’m interested in your anecdotes and stories. Have you been involved in a learning program that did not go so well?  Have you observed any train wrecks or near misses?  What caused the issues? What was learned that helped improve your next initiative?

Please post any input in the comments section below or enter a link to a blog post or Tweet to #epiclearningfail. I’ll be reviewing the responses for the most common causes and learnings, and I’ll summarize them in later posts.  Looking forward to your responses.

Update: here is the presentation from the National CSTD conference, Nov 12, 2012.

Four Learning Traps to Avoid

The Learning End Game Trap

Perhaps you’ve re-committed to improving learning as the mission of your department (or next big initiative, or…).  It’s well-meaning but can be self-defeating (or worse, self-fulfilling). The term leaves the impression that learning is the end game, your raison d’être. The real end game is performance, individual and organizational, defined in terms the business uses to measure itself. Sure, you don’t have control over all the factors that influence performance, but that doesn’t mean your solutions can’t be intimately connected to them. Thinking performance first is liberating and opens up whole new perspectives on the types of solutions you can and should be bringing to the table.

Antidote to the end game trap:  Performance Thinking (Cathy Moore and Carl Binder have nice methods for deriving learning from performance needs)

The Planning Trap

I used to believe in the perfect annual plan all wrapped in MBO goodness, aligned and linked to organizational objectives. But over time I’ve come to two conclusions. First, the plans are rarely fully realized. The more interesting innovations and strategies emerged from responses to opportunities throughout the year. Second, senior teams rarely have their act together enough to create strategies and business plans that are meaningful enough to wrap a good training plan around. Highly analytic planning processes can deceive us into thinking we are planning strategically and improving future organizational performance.

To borrow an argument from Henry Mintzberg, strategy is actually a pattern embodied in day to day work more than an annual plan. Strategy is consistency in behaviour, whether or not intended. Formal plans may go unrealized, while patterns may emerge from daily work. In this way strategy can emerge from informal learning. I’ve always liked this image of planning from Mintzberg:

from Henry Mintzberg “The Rise and Fall of Strategic Planning” (1994)

Antidote to the planning trap:  Beware the best laid plans. Go ahead and work with your business units to create a simple training plan linked to whatever business plans they may have in place. But have a rock-solid process in place to respond to the requests that will inevitably come that are not in line with the plan. Be ready to develop solutions that adapt quickly to whatever white water your company or industry happens to be swimming in. Be nimble and flexible in response to business changes. Watch for patterns and successes in that work and incorporate them in your training strategy.

The Measurement and Evaluation Trap

Evaluation is a hot button that causes more wringing of hands and professional guilt than it should. Evaluation is meant to inform decisions. Some types of training are simply easier to measure than others. Everything can be measured, but not everything is worth measuring. When you do evaluate, use business metrics already in use and explore methods focused more on collecting evidence of success than definitive proof. Myopic and overly rigorous measurement drives out judgment and causes us to start measuring trees and forget about the forest. Our attempts at evaluation are often disproportionate to evaluation elsewhere in the organization (we only think everyone else knows their ROI).

Antidote to the measurement trap: Don’t emphasize short-term ROI or cost reduction measures at the expense of true investments in the future that do not have immediate and calculable ROI. When you do evaluate, use existing measures that the business uses to judge success.

The Technology Trap

We seem to be hard wired to line up enthusiastically behind each new wave of technology. Each wave has left behind tools and innovations that changed learning for the better (and also, embarrassingly, for the worse). It offers increasingly wondrous ways to improve access to learning, immerse employees in true learning experiences, share learning in collaborative spaces and extend the tools we use to do our work. And it offers equally wondrous ways to create truly disastrous learning experiences.

Antidote for the technology trap: Understand and embrace technology, especially game-changing social media, but protect yourself from panacea thinking and keep your eye on the prize of improved performance.  Success lies in the design, not the technology.

Everyday Experience is Not Enough

A core tenet of informal and social learning is that we learn through experience. It’s the elephant in the 70-20-10 room. It’s often used as an admonishment of formal learning. Advocates of the most laissez-faire approaches to informal learning suggest that given the right tools (social anyone?) employees will do just fine without all the interference by the learning department, thank you very much.

No one in their right mind would argue that experience is not a powerful teacher, or that much of our most valuable learning doesn’t occur while working. But it’s a pretty broad generalization, don’t you think? Some experiences must be more valuable than others for achieving learning and performance goals. And if so, what makes those experiences more valuable and how do we know them when we see them? Or, from the perspective of the learning professional, how can we help create the right experiences to help people develop their skills? These seem to be important questions if we are to get beyond loose approaches to informal learning.

Indeed, research in developing expertise has shown that not all experience is created equal. Years of experience in a domain does not invariably lead to expert levels of performance. Most people, after initial training and a few years of work, reach a stable, acceptable level of performance and maintain it for much of the rest of their careers. Contrast that with those who continue to improve and eventually achieve the highest levels of expertise. It seems that where high performers may have 20 years of experience, average performers have 1 year of experience 20 times!

The following chart, from the body of research on developing expertise, illustrates the results of different types of “experience” on workplace performance.

Ericsson, K.A., "The Influence of Experience and Deliberate Practice on the Development of Expert Performance," The Cambridge Handbook of Expertise and Expert Performance (2006)

Average performers learn just enough from their environment (experience) to perform everyday skills with a minimal amount of effort. In contrast, experts continually challenge their current performance and seek feedback from their environment to stay in a more or less permanent learning state, mastering everyday skills but continuously raising their personal bar. This deliberate approach to learning from experience is what separates top performers from the norm. Continuously challenging current skills is hard work and it takes its toll. Some decrease or stop their focus on deliberate practice and never achieve the excellence of the expert (arrested development).

Designing experience

So, performance does not improve simply through cumulative everyday experience, gained face to face, using social media or otherwise. It requires targeted, effortful practice in an environment rich in accurate and timely feedback. That does not mean formal training.  It does mean experience designed and targeted to develop skills and expertise. This is a very different thing than routine, everyday work experience.

Some of the best learning approaches for helping people challenge their current skill levels fall into that fuzzy middle ground between formal and informal learning (see this post for a continuum of learning experiences). They include approaches such as action learning, cognitive apprenticeship, communities of practice, simulations and stretch assignments with coaching.

Designing, fostering and supporting work experiences that develop expertise is an emerging role for the learning professional. That role is to assure that people are working in a setting where they can challenge and develop their knowledge and skills. You can’t make them learn but you can help surround them with the resources they need to learn. This approach to learning is truly a partnership between the individual, their managers and you as a learning professional. In doing that work you are practicing and developing your own expertise.

Moving Practice to Centre Stage

As digital content becomes more prevalent (free and otherwise), there’s much talk about the new role of the learning professional as content curator or content strategy developer.

I agree this will be an important role, but worry it yet again puts the focus on structuring and controlling all that information (another round of knowledge management anyone?) while minimizing the critical role of practice and application of the “content”. We all know the importance of practice and feedback in the progression of knowledge to performance (we do all know that, don’t we?). If we truly believe it, then we need to put the design of practice and feedback at the centre of our work, and content (information) in a supporting role. This simple change in vantage point has the potential to radically change the way we approach learning and performance.

Organizing the learning function around practice (vs. courses and content)

What if the learning function were structured around the design and management of practice centres (virtual and physical), rather than the design and delivery of formal training events? It could once and for all move us away from formal event-based learning to process-oriented learning. The activities in each practice centre would vary by the type of skill being developed. Practice centres to support management and knowledge work, for example (simulations, problem solving, cognitive apprenticeships), would look much different than those supporting procedural and task-oriented work (performance demonstrations, skill development). I explored this approach applied to management development in a previous post.

What would be different?

Designing practice centres would require us to establish standards (ideally in collaboration with the people doing the work), derive authentic problems and tasks that help people achieve those standards, scaffold practice exercises in a progression towards expertise in the job/role, and source and manage the “content” that will help employees make their way through the practice exercises. The framework puts practice in the centre and moves content to a supporting (but critical) role. You might think of the approach as Cathy Moore’s Action Mapping applied at the organizational level (rather than at the course level).

Separating content from practice

Traditional instructional design tightly connects information presentation (electronic or otherwise) with practice in structured learning events.  However, separating content from practice is positive and liberating (no matter what your ID traditionalists tell you) as long as practice does not get lost. Learning functions centred around the design of progressive practice would ensure that.

In the right context, Web 2.0 and social learning can beautifully separate content and application.  At other times they can result in little more than information dissemination.  Knowledge is an inert thing without application, and consuming information is no substitute for true learning. Much of the awesome user-generated content out there focuses on informing and much less on doing (thus the calls for content curation).  When social learning encourages sharing, thinking, collaborating, and real-world application, as it does in an excellent community of practice, it fits well into the definition of practice I’m suggesting.

The role of Deliberate Practice in the development of expertise

In preparation for an upcoming presentation on designing practice to improve performance, I’ve been reading much of the excellent source research on the role of deliberate practice in developing expert performance (popularized recently in well known business books). It’s sparked some ideas on how we might manage the shift I’m suggesting above.

If the research on deliberate practice has taught us anything, it’s that developing expertise is a long-term proposition (about 10,000 hours, depending on who you believe).  One-off practice exercises built into formal training events only introduce employees to the “feel” of a skill and in no way produce the expertise needed in the modern workplace. If work performance is important and effective practice is a proven way of getting there, we should take it seriously enough to get it right.

I’ll explore the application of deliberate practice to various types of learning in my next few posts. In the meantime here are 10 ideas from a previous post that just scratch the surface on how Learning Professionals can use “deliberate practice” to improve workplace skill and performance.

  1. Move from “mastery learning” to designing practice with feedback over longer periods of time (from learning events to a learning process). Deliberate Practice differs from the concept of “Mastery Learning” at the heart of much instructional design. Mastery learning assumes a skill is perfected (or at least brought to a defined standard) in a fairly short period of time, often within the scope of a single course. The complex professional skills of modern knowledge workers and managers demand a stronger focus on long-term practice and feedback and building learning around long-term objectives.
  2. Develop the person. Time, practice and individualized feedback imply a long term focus on individuals rather than on jobs or roles.
  3. Informal learning efforts like action learning, coaching and cognitive apprenticeships are critical, but they must be focused on practice and immediate feedback and extend over long periods of time.
  4. Relevant, frequent and varied practice must be the dominant and most important element in all formal training programs.
  5. Practice opportunities must extend far beyond initial training programs, to allow people to hone their skills through experimentation with immediate feedback.
  6. Create practice sandboxes and simulation centres for key organizational skills where people can practice their skills and experience immediate feedback in a safe environment.
  7. Design visual feedback directly into jobs so professionals can immediately see the results of their work. In this way working IS deliberate practice.
  8. Turn training events into the first step of a learning journey that will continue to provide opportunities to practice and refine skills throughout a career.
  9. Identify the interests and strengths of people and nurture them through opportunities for deliberate practice. Provide resources and support that encourage early effort and achievement.
  10. Ensure social media environments provide opportunities for coaching and mindful reflection on performance.