Archives for category: Instructional design

For years the eLearning industry has categorized custom solutions into three or more levels of interactivity, from basic to complex, simple to sophisticated. The implication is that learning effectiveness increases with each higher level of interactivity.

You don’t have to look hard to find them:

Levels of Interactivity in eLearning: Which one do you need?

CBT Levels

How Long Does it Take to Create Learning?

These kinds of categorizations surely originated in the vendor community as it sought ways to productize custom development and make it easier for customers to buy standard types of e-learning.  I won’t quibble that “levels of interactivity” have helped simplify the selling/buying process in the past, but the concept is starting to outlive its usefulness.  And these categorizations are a disservice to intelligent buyers of e-learning.  Here’s why:

1. The real purpose is price standardization

Levels of interactivity are usually presented as a way to match e-learning level to learning goals.  You’ve seen it: Level 1 basic/rapid is best for information broadcast, Level 2 for knowledge development, and Level 3 and beyond for behaviour change, or something to that effect.  In reality, however, very simple, well-designed low-end e-learning can change behaviour, and high-end e-learning programs can wow while providing no learning impact whatsoever.

If vendors were honest, they would admit the real purpose of “levels of interactivity” is to standardize pricing into convenient blocks that make e-learning easier to sell and purchase.  “How much is an hour of Level 2 e-learning again?  OK, I’ll take three of those please.”  Each level of e-learning comes with a pre-defined scope that vendors can put a ready price tag on.

It’s a perfectly acceptable product positioning strategy, but it’s not going to get you the best solution to your skill or knowledge issue.

2. Interactivity levels artificially cluster e-learning features and in doing so reduce choice

Most definitions of what features are included in each of the “levels” are vague at best.  Consider this widely used definition from Brandon Hall:

[Image: Brandon Hall levels of interactivity]

It’s hard to imagine looser definitions of each level.  In fact, a variety of factors drive scope and effort (and therefore price) in a custom e-learning program, and they go well beyond “interactivity”.  They include interface choices, type and volume of graphics and media, instructional strategies, existence and quality of current content, and yes, the type and complexity of user interactions.

Each of these features has a range of choices within them, but the levels of interactivity approach tends to cluster everything into one bucket and call it a level.  It might look something like the following:

[Image: Levels of e-Learning]

A Level 1 program is essentially a template-based page turner with a few relevant (or maybe not so relevant) stock images and interactivity limited to some standard multiple-choice self-checks.   In contrast, a Level 3 program is loaded with a custom-designed interface, user controls, media and graphics, along with the complex interactions assumed to be required for simulations and scenarios.   Level 2 is the middle choice most buyers and vendors alike are happy to land on.  None of these choices, by the way, has anything to do with accomplishing a desired learning outcome, but that’s another discussion.

If this artificial clustering of features were ever true, it’s not any longer.  Advanced simulations and scenarios can be created with very basic media and user interface features.  Advanced custom interfaces and controls with rich custom media are often used to support simple information presentation with very little interactivity.  Powerful scenario-based learning can be created with simple levels of interactivity.   Rapid e-learning tools, once relegated to the Level 1 ghetto, can create quite advanced programs, and custom HTML/Flash can just as easily churn out page turners.   Out-of-the-box avatars can be created in minutes.

This clustering of features into three groups gives you less choice than you would receive at your local Starbucks.  If I’m a customer with a learning objective best served by well-produced video and animations followed by an on-the-job application exercise, I’m not sure what level I would be choosing.   A good e-learning vendor will discuss and incorporate these options into a price model that matches the customer’s requirement.

3. It reduces effectiveness and creativity

Forcing solutions into one of three or four options stunts creative thinking and pushes the discussion towards media and interactivity rather than towards closing a skill gap, where it should be.

4. It hurts informed decision making

It appears to create a common language but actually reinforces the myth that there are only three or four types of e-learning.  The combinations of media, interactivity and instructional approaches are as varied as the skill and knowledge gaps they are meant to address.

5. It encourages a narrow vision of e-learning

e-Learning has morphed into new forms.  Pure e-learning is already in decline, cannibalized by mobile and social learning and the glorious return of performance support.  These approaches are much more flexible and nimble at addressing knowledge and skill gaps.

My son is in a graduate program in Medical Physics at the University of Toronto.  I sent him this recent article from the Atlantic on the concept of the flipped classroom (in higher education).

The Post-Lecture Classroom: How Will Students Fare?


He sent a thoughtful email response which was interesting from a student perspective (in addition to seeing more words from him than “send money”!).  He just finished reading Quiet by Susan Cain, and I like the connection he makes here.

Good article dad. I’d be interested to actually read the research article  myself, to look at sample sizes and statistical analysis showing their confidence intervals for the data and how statistically significant it is. I think a push towards new and improved teaching techniques is awesome, but you have to be really cautious with it. They mentioned that one year there were student presentations with discussions led by those students, which I have had several courses implement, and it’s a pretty mixed bag. At their worst, those presentations were students presenting the material in just as much of a dry, “Powerpoint poisoning” kind of way as possible. In one class, they were a bit more successful because the prof really helped structure the discussions so we all got a lot out of it.
 
I also have experience with the technique of having us read the book before class and answer clicker questions during lecture, to root out common misconceptions. There were online quizzes before lecture (very simple) on the readings to make sure we did them. I think that this was a VERY effective technique, and made me feel much more engaged in class, and definitely succeeded at giving the professor the information they needed in order to teach the class well, addressing problems that were common in the class.
 
I’m also uncertain about the whole discussion and collaboration in class. That CAN be a good way of engaging the class, but having read “Quiet” by Susan Cain, it definitely seems to fall squarely into this push towards an “extrovert ideal” in education, where it is really designed to most benefit those with more extroverted personalities, and can actually lead to less creativity and innovation, since the consensus in groups will be built by those with the loudest opinion, not necessarily the most informed one. That doesn’t mean this type of teaching isn’t without merit, but I think it should be used with caution, and not be made the centre of the curriculum.
 
Lastly, I think that it’s always important to consider what is actually being taught in these kinds of studies, and not apply it to other teaching subjects without due consideration. Graduate Pharmacology seems like a perfectly suited topic for this kind of approach, since everyone is aware of the foundations of the subject, and can focus discussion on “higher-level” stuff like clinical trials. I can tell you that, in my experience, not all attempts to deviate from standard lectures are successful, and there are times, especially at a foundational level, where I personally think that it’s much more beneficial to acquire the knowledge in a more traditional lecture setting, even if it’s less engaging.

We all make mistakes. We know better, but we follow old ways or accept cultural practices that don’t work. There are patterns that produce successful projects and those that lead to failure (see the project death march). I did a recent presentation on the classic mistakes we make in the planning, design and implementation of learning (and how to avoid them). I finished the session with a tongue-in-cheek 7-step prescription for a failed learning initiative. Follow the steps carefully for guaranteed failure.

Step 1: Ensure your program is not connected to a business or performance need

A critical first step.  If your learning initiative in any way contributes to individual or organization performance you’re risking success.  Market your program as urgent and essential to convince your audience they need it (while you’re at it, it’s best if you can convince yourself too).  You then have the option to bury the program somewhere deep in your corporate training catalog or roll it out aggressively as a mandatory program.  Both tactics are equally effective at failing dismally.

Step 2: Rely on your fave management training vendor for program content 

Some say learning programs should be driven by real job or role requirements–that the context in which people work should be the source for learning content.  Don’t believe it.  Instead, close your door, pick up the phone and call your favourite training vendor.  Ask them what training you should be providing this year.  They will have the answer you need and a program with all sorts of great content ready to go.  Another option would be to simply gather as much content as you can from the internet.  Have you seen how much information is out there!

Step 3: Choose a solution that suits you rather than your learners 

There are so many ways to deliver and support learning now.  Gone are the days when your only option was a classroom training program.  You probably have your favourite.  Trust your gut.  Go with that favourite.  It will be more interesting for you.  Just be sure your choice is detached from the preferences of your audience.

Step 4: Load up on information. Make no provision for practice

Information-driven learning is one of the best strategies for failure we know of. Designing practice is hard.  It’s even harder to design practice that works: on-the-job activities that develop skill in context. So don’t sweat it. There are great examples out there of PowerPoint-driven classroom learning, “click next” e-learning, and social learning that’s all about sharing information without actually using any of it. Mimic those examples and you’ll get closer to putting the nail in the coffin of your failed project. But you’re not quite there yet.

Step 5: Let technology drive your solution

Technology is hip. And they tell us it’s needed to capture the attention of the digital natives entering your organization. So keep it cool. Your program must incorporate the most current devices and tech tools–tablets, smartphones and web 2.0 apps. Don’t think about how they support the objectives of your initiative.

Step 6: Boldly assume the learning will be used on the job

Your program is that good!  It will be more powerful than the current barriers, lack of support and reinforcement that learners will encounter when they are back at work.  Mastery was your goal and no refinement of those skills on the job will be necessary.  Really.

Step 7: Develop your program in a client vacuum

Above all, do not partner with your internal customers to identify business needs or develop a plan to support them through formal or informal learning.  One of the best predictors of success is working collaboratively with your client through the planning, design and development of your program.  Avoid at all costs.  Failure Guaranteed. You’re welcome.


[Image: practice]

Image courtesy HRVoice

This is a variation of an article I prepared for CSTD and HRVoice, summarizing research on practice and expertise.  Part 1 introduced the signature skills demonstrated by experts that separate them from novices. Part 2 presented the type of practice that develops experts.  This post gets to the implications I see for Learning and Development and makes the connection to existing approaches that embody the principles of deliberate practice.

It would be easy to position deliberate practice in the formal learning camp. Indeed, for some physical and routine skills, elements of deliberate practice can be built into formal training programs until a learner reaches mastery.  However, in the modern workplace jobs are more complex and demand greater cognitive (versus physical) skill.  The research findings challenge us to consider how we can better support the full novice to expert journey, embed learning and practice in the job, design experience to include practice and reflection, build tacit knowledge, and design rich feedback. In a past post I listed some general principles for using deliberate practice in learning.

Fortunately, we have a number of approaches available to us that align well with the conditions of deliberate practice. Most of these approaches are not training in the traditional sense. They do, however, have a structure to them and require significant support. Consider them more non-formal learning than pure informal learning.  Here are some well-defined but under-used learning methods that match well to deliberate practice.

  • Action Learning. Small teams create a plan of action to solve a real business problem. Impacts of these actions are observed, analyzed, lessons extracted and new actions prepared. This cycle of plan, act, observe, reflect embodies the key elements for deliberate practice. The approach has a significant and growing following. Used frequently for management development, it would be great to see it expanded to other types of professional work. See this  earlier post on action learning. 
  • Cognitive Apprenticeship. The standard apprenticeship model updated for modern knowledge work. Instead of demonstrating a manual skill, experts model and describe their thinking to “apprentices” who then work on the same problem while they articulate and verbalize their own reasoning. The expert provides coaching and feedback to encourage reflection. Support is “scaffolded”–gradually released as skills build and confidence is gained.
  • Communities of Practice. Groups with common professional or project goals work together sharing and discussing best practices. In doing so they develop rich tacit knowledge and the hidden “how to’s” that are often missed in formal learning programs. New knowledge is created in the process of collaborating with others. Social media environments can provide a home for the conversations and knowledge that is created.
  • Simulation and Games. Great simulations are a surrogate for real experience and incorporate authentic  work tasks. This allows the learner to attempt challenging tasks, experience failure and learn from errors–all critical elements of deliberate practice.  I like games that model real work and allow for fun, repeatable practice, but worry about “gamification” that uses game mechanics to motivate employees to use the same old ineffective training.
  • Feedback in the Workflow. Wonderful natural feedback exists in the form of business results and performance data. We don’t tend to think of it as a learning tool, but in the context of deliberate practice, it is one of the most powerful. It requires connecting the data to individual or team behavior. It is the cornerstone of approaches to team learning found in improvement methods like Lean, Six Sigma and performance technology. Here’s a post with some ideas on implementing a learning feedback system
  • Stretch Assignments with Coaching. One of the most powerful approaches to “practice” is challenging work assignments that push current capabilities. Already a staple of executive development, we need to see much more of it for other types of professional development.
  • Open Practice Centres. Instead of tired corporate universities and course catalogs populated with learning programs, Practice Centres could provide progressively challenging practice, simulations and work assignments matched to key job roles. Individualized practice is designed to support the full novice to expert journey using the principles of deliberate practice. Learning “content” is considered only in a support role to accomplish practice goals. Here’s an idea for organizing the learning function around practice instead of content and courses. And the core idea applied to Management Development.

These approaches and others like them occupy that fuzzy middle ground between informal and formal learning. Each can be aided significantly by social media/social learning and learning technologies. Most importantly however they are approaches that allow us to apply the research on “deliberate practice” to help improve our organizations and in doing so improve our own professional performance.

This is a variation of an article I prepared for CSTD and HRVoice, summarizing important research on how practice develops expertise and its implications for the learning function.  Part 1 introduced the signature skills demonstrated by experts that separate them from novices.

It seems these signatures of expertise are the result of years of effortful, progressive practice on authentic tasks, accompanied by relevant feedback and support, with self-reflection and correction. The research team has labeled this activity “Deliberate Practice”. Others have called it deep practice and intentional practice. It entails considerable, specific, and sustained effort to do something you can’t do well, or at all. Six elements are necessary for practice (or on-the-job experience) to be “deliberate” practice:

  • It must be designed to improve performance. Opportunities for practice must have a goal and evaluation criteria. The goals must be job/role based and authentic. General experience is not sufficient, which is where deliberate practice varies from more laissez-faire approaches to informal learning. Years of everyday experience does not necessarily create an expert. Years of deliberate practice does.  See my post Everyday Experience is not Enough for a deeper discussion on this.
  • It must be based on authentic tasks. The practice must use real work and be performed in context, on the job. The goal is to compile an experience bank, not a vast list of completed formal training programs.
  • The practice must be challenging. The tasks selected for practice must be slightly outside the learner’s comfort zone, but not so far out as to produce anxiety or panic. Deliberate practice is hard work and stretches a person beyond their current abilities. The experience must involve targeted effort, focus and concentration.
  • Immediate feedback on results. Accurate and diagnostic feedback must be continuously available both from people (coaches) and the business results produced by the activity. Delayed feedback is also important for actions and decisions with longer term impact as is often the case in knowledge based work.
  • Reflection and adjustment. Self-regulatory and metacognitive skills are essential. This includes self-observation, monitoring, and awareness of knowledge and skill gaps. Feedback requires reflection and analysis to inform behaviour change. Experts make mindful choices of their practice activities.
  • 10,000 hours. For complex work, roughly ten years (or 10,000 hours) of deliberate practice seems to be the necessary investment to achieve expertise. Malcolm Gladwell drew attention to the 10,000-hour rule in his book Outliers, and it is one of the most robust findings in this research.  It poses a real challenge for our event-based training culture. Of course, the less complex the work, the less time required to develop expertise.

If we aspire to evidence-based approaches to learning it’s hard to ignore this body of research. Among other things, it challenges us to consider how we can better support the novice to expert journey, embed learning and practice in the job, design experience to include deliberate practice, build tacit knowledge, and build rich feedback into our organizations.

Fortunately we have a number of approaches available to us that align well to the conditions of deliberate practice. I’ll outline those approaches in Part 3.

Here is a nice illustration of the key process and concepts of deliberate practice, courtesy of Michael Sahota:

[Image: Deliberate Practice cycle]

I’ve been presenting recently on the research and application of Deliberate Practice for developing expertise in the workplace.  I’ll be doing so again at the upcoming British Columbia Human Resources Management Association (BCHRMA) conference on May 2.  A version of the following article on the topic appeared in the CSTD Learning Journal.   A modified version will also appear in an upcoming issue of HR Voice.   It is posted here in three parts.

******

We all know well-designed practice is critical for effective training.  It’s what differentiates meaningful learning from passive information presentation.  But as work becomes more complex and knowledge-based, are the practice activities we design for our formal learning programs (both classroom and e-learning) enough to meet the need for expertise in the modern workplace? A comprehensive body of research on how professional expertise is developed suggests they may not be.

This research, led largely by Anders Ericsson at Florida State University and popularized in recent books such as Malcolm Gladwell’s Outliers and Geoff Colvin’s Talent is Overrated, indicates that the type of practice needed to develop true expertise is more intensive and “deliberate” than we thought, and that it must be embedded in the context of real work.  It must also occur on a regular basis over much longer periods of time. The research has implications for us as learning and performance professionals.  It argues for a profound shift away from event-based formal learning to approaches that could be categorized as informal learning or learning from experience.  However, it turns out not all experience is created equal when developing expertise, so simplistic notions of informal learning also won’t work.  So how should we rethink the design of practice if it is to truly develop the complex skills of the knowledge workplace?  To answer that question, it helps to first understand what expertise looks like.

Characteristics of expert performance

Ericsson’s research has found that top performing individuals at work, besides being very good at what they do, consistently demonstrate the following differences compared to novices and lower performing individuals:

  • They perceive more.  Experts see patterns, make finer discriminations, interpret situations more quickly and, as a result, make faster, more accurate decisions.  Novices slowly review all information and don’t have the contextual experience to recognize patterns.
  • They know more.  Not only do experts have more facts and details available to them, they have more tacit knowledge, that all-important unconscious “know how” that only comes with experience.  Novices rely on limited explicit knowledge.
  • They have superior mental models.  Experience gives experts rich internal representations of how things work and how knowledge is connected.  They use these to learn and understand situations more rapidly. Novices rely on simple, sometimes inaccurate, rules of thumb and loosely connected knowledge.
  • They use personal networks more effectively.  Experts know who to go to for help and answers.   Novices are not able to identify and access critical information and people as quickly.
  • They have superior “meta-cognition”. Experts are better self-monitors than novices.  They set goals, self-evaluate against a standard, and make corrections and adjustments more quickly from feedback.

[Image: novice to expert]

These are skills we want in all employees.  At times it seems they come from an innate ability or deep “competency” unachievable by others.   However, the research shows that while natural ability may play a small role, practice and experience are far more significant.  The nature of this experience is critical.  “Practice makes perfect” is only true for practice of a certain type, which I describe in Part 2.

About half of the formal training provided in organizations is custom developed (the other half is packaged “off-the-shelf” programs).  That’s a lot of training.  Every week, internal learning design teams and their external partners are heads down at work developing learning programs of every description to help build skills and capability unique to their organizations.  In a knowledge economy, organization-specific knowledge and skill is at the heart of competitive advantage.

Yet organizations often don’t get the strategic bang for their custom learning buck.  We are getting good at producing more training in shorter time periods (rapid!) but not necessarily better training, and we are using technology to reinforce these patterns, not break free from them. Training functions continue to respond to ad-hoc requests, greasing those squeaky wheels.

On Monday, June 20th, at 1:00 pm (EST) I am doing a free webinar to discuss ways organizations can get more strategic value from their custom learning initiatives (including informal learning).  Panel guests from two Global Knowledge clients (Bell Canada and Service Canada) will participate.  Feel free to join us.  Click here to register.

Here are some of the practices we’ll be discussing.

1. (Really) Link Learning to Business Strategy

  • Business goals are your friend. Use them to support your decisions not to respond to low-value ad-hoc requests.
  • Get hooked into the annual planning cycle to truly understand your organization’s business strategy
  • Prepare proactive annual learning plans with your customers to jointly address business needs
  • Manage ad-hoc requests professionally

2. Target Signature Competencies that Differentiate your Organization

  • In today’s knowledge economy, organizational capability, skills and knowledge set companies apart and provide real strategic advantage
  • These signature competencies are often driven by key business processes
  • Custom learning will add more value when it focuses on these core competencies and not on lower leverage ad-hoc learning needs
  • Identify pivotal jobs, roles and associated skills. Target custom learning projects squarely at these all-important core competencies

 3. Start at the End

  • Custom learning programs too often start with “content” or subject matter, a sure-fire way to produce bloated, dull and low-value programs
  • By starting with the performance improvement needed from jobs and roles, custom learning programs can be leaner, more effective and faster to develop.  In fact you may not end up developing training at all.  Performance support, information and informal learning solutions will start to become obvious choices.
  • Work backwards:  business need –> performance needs –> practice/application –> minimal content
  • Content and subject matter should be the last decision, not the first

 4. Design with Integrity

  •  We know how to design effective learning programs.  We just usually don’t follow our own advice.  The key factors are practice, application, coaching and feedback (true even for informal learning).
  • In our efforts to meet training volume targets, respond to unplanned requests and meet impossibly short turnarounds we opt for speed, convenience and content “coverage” at the expense of real impact
  • Set design standards that produce high-impact learning and stick to them.  That doesn’t mean you can’t be flexible and have different approaches for information requirements vs. deep learning requirements.  But it does mean you need the knowledge to know the difference and the professional integrity to commit only to the appropriate solutions
  • Professionalize your team.  Hire people with the skills and track record to produce high impact learning and performance.  Develop those that don’t.  Set high standards.

 5. Get Informal

  •  Formal learning programs are only one way to accomplish learning outcomes.  And they are often the least effective and most costly
  • The majority of learning taking place in your organization right now is through informal learning
  • Tap the full range of learning solutions from informal to non-formal and formal learning to broaden your reach and influence the 80% of learning happening outside the training function
  • Performance support systems, communities of practice, job assignments, structured experience, collaborative learning and learning 2.0 solutions are all custom solutions that can have greater strategic impact than a formal training program

6. Innovate with Technology

  • Technology has given us e-learning, automated learning administration (LMS), learning content management and collaborative design (LCMS), mobile learning, assessment tools, and more
  • It has brought efficiencies but not always improved effectiveness or strategic value
  • Web 2.0 and social media are disrupting current views of how technology can and should support learning.  That’s a good thing.
  • Be creative in how you use technology to support learning.  Don’t simply be a servant to it.  Use it as a tool to innovate rather than institutionalize mainstream approaches that don’t add value

 7. Use Partners Strategically

  •  External partners can offer more than a “pair of hands” to design custom learning programs. There are many points in the analysis, design and development stages where external partners can add strategic value to your programs that you may not have thought of
  • Set up partnerships with defined roles for internal players and external partners
  • Encourage knowledge sharing
  • Establish a collaborative project workspace to work and learn together
  • Merge processes to develop a seamless flow for working together

 8. Measure Success

  • If it’s important to develop strategic programs it’s equally important to know if you accomplished your objective
  • To be effective evaluation has to be a part of the plan, not an afterthought.
  • Evaluation does not have to be complex and time-consuming. Use existing business measures as much as possible
  • Consider alternatives to the Kirkpatrick model
  • Don’t measure everything.  Find out what’s important to the business and make that your measurement focus.  If the business itself is lousy at measuring results, you have yet another opportunity to add value

July 10 update:

Here are the slides used for the Webinar mentioned in this post.  You can view a recording of the free webinar here.

Seeking ways to leverage new social media environments, learning departments are discovering ways to sneak a little formal learning through the informal learning back door. Some of our clients, for example, are looking to load up their social learning environments with small bits of learning content related to business goals. The notion is that these informal learning assets will live or die on the strength of their connection to employee performance needs. Informal learning assets (or perhaps more accurately, formal learning assets designed for informal consumption) are small segments of learning media: videos, podcasts, documents, animations, short interactive pieces, images, performance guides, job aids, process descriptions, anything with a learning intention that can be posted to a social media environment. They can be created by anyone, from learning designers to managers, employees and team members.

Survival of the fittest

The strategy creates a kind of Darwinian free-for-all of digital learning resources. Those that best fit real learning and performance needs will get viewed, liked, shared, discussed and commented on more than those that don’t quite measure up. The best become internal learning memes that do their viral tour of duty; those that miss the mark fall off the social radar, never to produce learning offspring. Or so the theory goes. It’s an interesting strategy with loads of implications for designers, suppliers and users of learning content. The idea (I hesitate to call it a trend) is leading some organizations and training suppliers to deconstruct their existing learning programs into bits and pieces for populating internal social media environments, such as they are.

Making informal learning assets work

I like the idea of infusing communities with digital learning assets, but there are a few cautions to watch as we head down this new path. Foremost is the profusion of “information” oriented learning assets at the expense of the practice, application and reflection that we know are at the heart of real learning and improvement. Information-based assets, no matter how novel or entertaining we make them, are not enough. To bastardize an old Magerism: if telling alone resulted in learning, we’d all be so smart we could hardly stand it.

There are ways to structure and design informal learning assets that maintain the best of what we’ve learned from formal design and bring it into the informal learning world. A model we’ve been experimenting with connects formal, informal and social learning, based on five learning essentials (you’ll recognize them if you are familiar with David Merrill’s First Principles or Bernice McCarthy’s 4Mat). Effective learning solves authentic problems and tasks (1), connects new knowledge with existing mental models (2), uses powerful ways of presenting and demonstrating new knowledge (3), provides many and varied opportunities to practice new skills with coaching and reflection (4), and finally guides the application of those skills to new situations on the job (5).

Too many informal learning assets target only the “key knowledge” requirement (#3), without any connection to the remaining four learning essentials. Well-designed learning programs will account for each of the essentials, but there is no reason they all have to be bundled up together in a tidy formal learning bow. In fact, the essence of good informal learning is that the guided practice essential (#4) takes place on the job, with feedback and coaching from colleagues or mentors inside a social media environment or face to face. Forums and discussions are excellent ways to gently guide application. Job aids and performance support systems are effective vehicles for building skills into the workflow (#5). Real business problems and tasks (#1) can be used instead of artificial cases. My point is that, with care, each of these other essentials can be developed as informal learning assets as effectively as a good information-driven asset.

This view can also serve as a guide when deconstructing classroom programs for conversion to social media environments. Instead of retaining only the key knowledge from your programs, look for effective ways to create assets that support the other learning essentials as well.

Learning assets associated with a specific knowledge domain, role or learning objective can be connected through tagging, linking or even a good old fashioned learning path.
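As a rough illustration of that tagging idea, a tag-based index over learning assets takes only a few lines. This is a hypothetical sketch; the asset titles and tag names are invented for the example, not taken from any real system:

```python
# Minimal sketch: connect learning assets through tags, then pull a
# "good old fashioned learning path" as an ordered list of matches.
# All titles and tags here are invented for illustration.
assets = [
    {"title": "Coaching conversation demo video", "tags": {"coaching", "video"}},
    {"title": "Feedback checklist job aid", "tags": {"coaching", "job-aid"}},
    {"title": "Budget basics podcast", "tags": {"finance", "podcast"}},
]

def assets_for(tag):
    """Return the titles of all assets carrying a given tag
    (one knowledge domain, role or learning objective)."""
    return [a["title"] for a in assets if tag in a["tags"]]

# One learning path per tag, in the order assets were added:
coaching_path = assets_for("coaching")
print(coaching_path)
```

The same structure supports linking (an asset can carry many tags) and ordered paths (just curate the list order), which is all the organizing machinery the paragraph above calls for.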

Once the assets are loaded into social media environments, users and community members will begin using them to improve their performance and manage their own knowledge. Not only will they consume the learning assets, they will create their own and, in doing so, create new and emergent knowledge. As new ideas emerge, they will evolve into standard practice and can feed the development of new or revised formal learning programs.

This connection between formal, informal and social learning might look something like the following:

Through her Action Mapping process, Cathy Moore has demystified, simplified and put a friendly face on an analysis process that produces lean and effective learning programs with an emphasis on practice and application. The four-step analysis process of identifying business goals (1), desired actions/behaviours (2) and practice activities (3) before identifying content (4) is much advocated but rarely practiced in instructional design. She also uses a helpful visual mapping method to work through this four-step process.

Extending the process to performance design

I used the process (and visual mapping approach) to facilitate a learning requirements session a while back. It worked like a charm. I thought then that the process might be taken a little further and used to identify gaps in the immediate performance environment known to impede optimal performance, and then to specify solutions for improvement. Here’s what I’m getting at…

Performance Consulting thought leaders (and hard-won experience) tell us that newly developed skills alone, without a supporting environment, rarely produce the performance impact we need. If you accept this view, you understand that skills and knowledge are only one factor among many needed for performance and that, in fact, it’s often the performance environment and not the skills that needs adjustment. Geary Rummler organized these critical performance factors within a systems framework and labeled it the Human Performance System (HPS); Thomas Gilbert categorized the factors in his seminal Performance Engineering Matrix, which Carl Binder has distilled into his Six Boxes Model; the Robinsons summarized the factors in their Performance Consulting process; and Mihaly Csikszentmihalyi has found similar factors in his work on optimal performance and flow states. These authors have developed diagnostic tools based on the performance factors that can be used by teams, managers and performance consultants to identify barriers in the work environment and to design tools, processes and systems that improve performance.

Borrowing from the above models, the critical performance factors might be summarized as follows:


  • Clear Expectations and goals (E)
    Do employees understand the behavior and results expected of them and their team?
  • Supportive Tools, resources and business processes (T)
    Are employees supported by helpful performance aids, process guides and knowledge tools?
  • Timely and meaningful Feedback on results of action (F)
    Is immediate feedback provided to employees and their team (system generated or human) on the quality and accuracy of their actions and output?
  • No Interfering or competing demands (I)
    Is the role free of demands on time and task that interfere with accomplishment of individual and team goals?
  • Consequences aligned to expectations and goals (C)
    Do good things happen when employees accomplish goals and meet expectations, or do they happen more for undesired performance?

So how might we extend Cathy’s Action Mapping method to design an optimal performance environment in addition to a learning solution? The first two steps remain the same: 1. Identify the business goal. 2. Identify what people need to do to reach the goal. At this point, however, the process shifts to the key performance support questions defined above. For each behaviour (or behaviour cluster), the following performance design actions can be taken:

  1. Create a vehicle to continuously communicate the key goals, expectations and standards of performance
  2. Design performance aids, automated tools, social learning environments, Communities of practice, and business process adjustments. The appropriate tools and supports will, of course, depend on the type of work.
  3. Create a mechanism for providing continuous information (feedback) to individuals or teams on how they are performing against the desired actions. (I have posted some ideas on this here and here).
  4. Define specific actions for reducing interfering tasks and multitasking and increasing opportunities for focus on task without competing demands.
  5. Revise the balance of consequences in favor of the desired performance.
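The five design actions above can be captured in a simple data sketch of the extended action map. Everything here is hypothetical and invented for illustration (the behaviours, support actions and factor labels E/T/F/I/C follow the list earlier in the post); note how one support action can serve more than one behaviour:

```python
# Hypothetical extended action map: each desired behaviour is linked to
# performance design actions tagged with the factor they address
# (E=Expectations, T=Tools, F=Feedback, I=Interference, C=Consequences).
action_map = {
    "Hold weekly coaching conversations": [
        ("E", "Publish coaching expectations on the team dashboard"),
        ("T", "Provide a coaching conversation guide"),
        ("F", "Run a monthly pulse survey on coaching quality"),
    ],
    "Complete call summaries the same day": [
        ("T", "Provide a coaching conversation guide"),  # shared support action
        ("I", "Block 30 minutes of protected wrap-up time"),
        ("C", "Recognize teams with complete same-day summaries"),
    ],
}

def actions_by_factor(factor):
    """List the distinct support actions addressing one performance factor,
    across all behaviours (shared actions appear once)."""
    return sorted({action for acts in action_map.values()
                   for f, action in acts if f == factor})

print(actions_by_factor("T"))
```

Querying the map by factor is one way to spot gaps: a behaviour with no F entry, for example, has no feedback mechanism designed for it yet.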

Using the labels I listed above, the extended Action Map might look something like this (common support actions could support more than one behavior):

Adding Outputs and accomplishments

The approach could be further enhanced by identifying desired work outputs before behaviours/actions (a revised step 2). This would be especially useful when starting with the analysis of a job rather than a specific business objective, and it matters for knowledge work, where there may be multiple behavioural paths to the same work output. Carl Binder has labeled this approach the performance chain. The same performance thinking is at the root of both Action Mapping and the Performance Chain approach. You can learn more about performance thinking and the performance chain approach at the Six Boxes web site here.

Implementation

Performance Consulting sometimes gets legitimate criticism for being too prescriptive and for relying on external “experts” to implement processes like those above. But there is no reason empowered self-managing teams or process improvement groups cannot use the same tools to diagnose and design, or influence, their own performance environment. A good performance consultant can facilitate teams through this process. I learned a while ago from Geary Rummler that good performance consultants can provide both the training artifact requested by the organization and an improved performance environment. The extended Action Mapping method may be a great way to sneak some performance improvement into your training projects.

My last few posts have been about management and leadership development. In this post, I thought I would bring some of those ideas together in the form of a process, or heuristic, for a management development system built around defined business challenges and informal learning approaches, with less reliance (or no reliance at all!) on classroom learning.

Here is an alternative management development process then…in just 5 easy steps!…built around authentic learning tasks and supported by informal learning assets and small team action learning sessions.

Step 1:  Define a management/leadership model suited to the organization

Too many senior leadership teams abdicate responsibility for defining coherent roles and expectations for their managers. Instead, they “buy” a management model in the guise of a training program rather than defining one consistent with their values, organization design and business goals.

HR and Training enthusiastically take up the charge and end up buying management programs currently in vogue or that suit their personal vision of leadership. Many purchased management programs also focus only on the “people” side of a manager’s role and ignore the many other important facets. I’ve seen many management development programs that are completely at odds with the actual management culture of the organization: the behavior and performance that gets rewarded and encouraged on the job is not what is taught in the development programs. Not good.

HR and Training have a role in helping the senior team craft a vision and approach for management, but they should not be digging it out of the recesses of a commercial management program. Learning and training initiatives should support a coherent model of management, not the other way around. Of course, the model can and should be fluid. Adjustments should be welcome and encouraged as an organization learns and matures over time.

Step 2:  Develop an inventory of management scenarios and business challenges

Using the management model as a guide, create a series of realistic scenarios, cases and business challenges (whatever your term of preference) that will be the core of your management development program. We know the best management development is built around authentic problems, tasks and opportunities. Managers also tell us it’s the way they prefer to learn.

Make it your mission to work directly with the managers of the business units you support to understand their day-to-day challenges, responsibilities, successes and best practices. Use this inside knowledge to create a progression of challenges from simple to complex, perhaps based on the levels of management in your organization. The inventory should be highly dynamic, constantly evolving as goals and objectives change.

Challenges can and should focus on desired competencies and target all of the spheres of responsibility managers have, including business, functional, financial and human. Managers themselves can start creating challenges and problem scenarios to add to the inventory.

Step 3:  Organize the business challenges into learning paths.

There are a variety of ways the business challenge scenarios could be organized: by management level, by progression of challenge complexity, by competency or skill area, or by management responsibility. All of the above are possible using simple tagging tools.
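Because each challenge can carry several facets at once, the same tagged inventory yields all of those organizing views without duplication. A hypothetical sketch (challenge names, levels and competencies are invented for the example):

```python
# Hypothetical inventory of business challenges, each tagged along
# several facets so one inventory supports multiple organizing views.
challenges = [
    {"name": "Resolve a team conflict", "level": "front-line",
     "complexity": 1, "competency": "people"},
    {"name": "Build a department budget", "level": "front-line",
     "complexity": 2, "competency": "financial"},
    {"name": "Lead a restructuring", "level": "senior",
     "complexity": 3, "competency": "business"},
]

def path_by(facet):
    """Group challenge names by one facet
    (e.g. management level, competency area)."""
    groups = {}
    for c in challenges:
        groups.setdefault(c[facet], []).append(c["name"])
    return groups

# Same inventory, two different organizing structures:
print(path_by("level"))
print(path_by("competency"))
```

Ordering each group by the `complexity` tag would give the simple-to-complex progression described in step 2, again with no extra authoring work.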

The most important thing is to provide an organizing structure for managers to access and use the learning assets.  One of the early failings of social learning environments is the assumption that people will fully manage their own learning in personal learning environments.  Some may, but the majority prefer guidance and a few nudges along the way.

The learning paths are most useful for new managers. More experienced managers will begin using the business challenges on an as-needed basis, which is the way it should be.

Step 4:  Acquire and/or develop a series of learning resources and performance aids to support solving the business challenges

Using the business challenges and problems as a guide, purchase or develop learning assets that contain the key concepts, principles and practices that will help managers solve the full range of business challenges in your inventory. Use media appropriate to your audience and technical infrastructure, including print, digital video, performance guides, e-learning, people (coaches/mentors), job assignments and others. They are getting easier to find as learning content suppliers start to deconstruct their programs into smaller learning assets for use in social media environments.

They can be housed in your organization’s social media suite, Management Community of Practice, Learning Management System (if you still have one) or an old-school style learning centre. Most importantly, they must be connected directly to the business challenges managers will be assigned to solve as part of their development.

Learning assets should not be the exclusive purview of the learning function.  Social Learning has taught us that “user-generated” content is both powerful and motivating.  Get managers involved in contributing learning assets.

Learning assets will be used by managers individually and in action learning teams to research and discuss solutions to the business challenges from your inventory. Assets can be organized into clusters or paths matched to the scenarios. The scenarios are the learning motivators; the content is only the path to the solution.

Step 5. Assemble action learning teams

The learning assets can and should be used independently to solve the business challenges, but doing so exclusively misses the benefits of social learning. We’ve learned that small teams of managers working together (face to face or virtually) to solve business challenges are a key success factor in management development. Action learning has refined a robust approach to small-group learning that incorporates the best of informal learning. Other problem-based and case-based learning models also offer springboards for building management learning teams (see the links here for a few examples). I offered an approach using management communities of practice here.

Whichever approach is used, the goal is for managers to share their experiences and perspectives as they solve the business challenges together. Here is a diagram of how management teams working together can use the business challenges and informal learning assets to continuously develop.
