Please note that this blog is no longer active. For the most recent insights on Change Management, strategy, and people, please go to www.parkourconsulting.com.

Monday, September 12, 2011

Lessons Ignored: What Corporate Projects can Learn from the Military

Anyone who has spent time around corporate projects has sat through a project wrap-up that included a lessons learned session.  The first time you're excited.  "I can offer insight and feedback that will help the next team avoid our mistakes!"  The second time you still hold out some hope.  "Maybe this group will use our input and build on our lessons learned."  A few projects later, you're resigned.  "Why bother?  They're just going to write my advice on a piece of paper, stick it in a folder, then never look at it again."

In 2005, Harvard Business Review published "Learning in the Thick of It" (Darling et al.).  This article looked at After-Action Reviews, the Army's version of lessons learned.  They found that instead of the typical post-mortem report generation that most companies slog through, the U.S. Army's Opposing Force had developed a dynamic, cyclical process that generated genuine learning and improved results.

If you want to read the article yourself (and I highly recommend that you do), the citation is located at the bottom of this post.  For those who don't, I offer a brief summary of highlights from the article.

The Process
  1. Despite their name, After-Action Reviews (AAR) do not begin after a task is completed.  Instead, they begin during the planning phase.  Before any action is taken, the senior leader develops operational orders.  These orders have four parts: the task, the purpose, the commander's intent, and the end state.
  2. The commander then shares these orders with his subordinates.
  3. Each subordinate is responsible for providing a brief back - "a verbal description of the unit's understanding of the mission and its role" (3).
  4. Soon after, the team holds a dress rehearsal.
  5. Finally, the team begins the mission.  After-Action Reviews are conducted throughout the mission, not just at the end.
The Meeting
The After-Action meetings consist of three main parts that address four questions.  The first step is a reiteration of the AAR rules (4):
  1. Participate.
  2. No thin skins.
  3. Leave your stripes at the door.
  4. Take notes.
  5. Focus on our issues, not the issues of those above us.
The team then conducts a comparison of the actual results versus the intended results.  To help focus this conversation, they answer four questions (4):
  1. What were our intended results?
  2. What were our actual results?
  3. What caused our results?
  4. And what will we sustain or improve?
Once the team has addressed these questions, the senior commander provides his assessment of the lessons learned and the lessons that will be relevant in the immediate future.  When the meeting ends, participants immediately hold AARs with their own teams, to continue the process through all levels of the unit. (7)


The Deliverable
What do these meetings produce?  Unlike most corporate lessons learned sessions, the Army's AARs aren't focused on creating a report or a presentation.  They are focused on generating lessons, theories, and plans that can be applied immediately to the on-going mission.  They avoid blame and instead focus on identifying ways to learn and improve.  Rather than producing a post-mortem, they produce results. 

AARs "focus on improving a unit's own learning and, as a result, its own performance." (4)  In fact, "the group does not consider a lesson to be truly learned until it is successfully applied and validated." (2)

If this sounds like a far cry from the lessons learned sessions you've sat through, you're not alone.  How does your organization conduct lessons learned?  Do they actually result in learning? 

Next up: Applying the principles of the AAR to the corporate project environment.

References
Darling, M., Parry, C., & Moore, J. (2005, July-August). Learning in the Thick of It. Harvard Business Review, 1-8.

Monday, August 15, 2011

Training and the One Minute Manager: The Ultimate Test

Reference note: The full citation for Leadership and the One Minute Manager can be found at the end of this blog post.

Years ago a friend let me borrow her apartment for a week.  One night I picked up her copy of The One Minute Manager by Ken Blanchard and Spencer Johnson.  At the time I was 23, out of college for less than a year, and not managing anyone.  Whatever wisdom the book had to offer was lost on me.

Then last week I picked up Leadership and the One Minute Manager.  The thing that most captured my interest was that, although the book is focused on leadership, it makes numerous references to training.  In fact, the more I read, the more I became convinced that the title could easily be changed to Training and the One Minute Manager.

Let me start my discussion with my favorite quote from the book.  Toward the end of the book, the discussion turns to performance evaluation, and one of the characters shares this anecdote:
...I think of my favorite college teacher.  He was always getting into trouble with the dean and other faculty members because on the first day of class he would hand out the final examination.  The rest of the faculty would say, 'What are you doing?'  He'd say, 'I'm confused.'  They would say, 'You act it.'  He'd say, 'I thought we were supposed to teach these people.'  They'd say, 'You are, but don't give them the questions for the final exam.'  He'd say, 'Not only am I going to give them the questions for the exam, but what do you think I'm going to do all semester?' (Blanchard 87)

Have you guessed his response?  "Teach them the answers."

This anecdote was meant to illustrate the proper way to conduct performance evaluations.  It can just as easily be applied, though, to corporate training.  It addresses some of the fundamental issues of training:
  • What is the objective of corporate training?
  • How do we measure success?
  • What is the role of the trainer and the trainees?
These are questions most trainers have thought about throughout their careers.  Here I set forth my answers, with Leadership and the One Minute Manager on my mind.


Practical Answers to 3 Training Questions

  1. What is the objective of corporate training?  Let's be practical.  Training, both development and delivery, costs money.  And these days, money is tight.  Most often the objective of corporate training is to teach people to do their jobs better, faster, more efficiently, etc.  We want them to use new software, manage their people more effectively, and follow new policies.  All of which is designed to help them help the organization meet its overall business goals.  Compare this to the goal so many educational institutions state: teaching students how to learn, how to reason, etc.  With such different objectives, it doesn't make sense to use the same training methods for employees that we would for students.  Which brings me to the next question...
  2. How do we measure success?  A student's success is often measured by his ability to memorize enough information from a course that he can answer questions on a test without advance knowledge of those questions.  Like the students in the anecdote, however, employees already know the questions on the test.  In fact, there's only one question: Can you successfully perform your day-to-day job activities?  And there's only one acceptable answer.  As trainers, we often get caught up in measuring success based on trainee feedback on course evaluations.  Or we administer end-of-course tests and count a course successful if the trainees can pass.  Sometimes we include pre-tests and post-tests and measure the level of improvement.  The list goes on.  Yet, even when these tests involve real-life simulations, correctly completing a simulation in class is very different from successfully completing your job.  In the end, the only true measure of success is whether an employee can consistently and accurately apply the skills learned in the training sessions to her daily activities.  This measurement of success raises the question...
  3. What is the role of the trainer and the trainees?  This view of training demands a partnership between the trainer and the trainee.  When the only question on the final exam is the ability of the trainee to apply the training to her job, who knows better how to write the exam than the trainee herself?  While the trainer is traditionally the one responsible for putting together the training curriculum and the evaluations, in reality, without the input of the trainee, the potential for the course to succeed is greatly reduced.  In this model, the trainee can't abdicate responsibility for her learning to the trainer.  And the trainer can't work in a vacuum deciding what information is important.  Instead, it's a symbiotic relationship. 
Not only does this model of training increase the chance of success from a content perspective, but it also addresses one of the fundamental concepts of Change Management: People are more invested in the success of endeavors that they have helped to shape and create.  The last time you were forced to attend corporate training, I imagine you did a bit (or a lot) of grumbling.  It's a waste of time.  You have more important things to do.  These courses never teach you what you need to know.
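One common way to summarize the pre-test/post-test improvement mentioned above is a normalized gain: the share of the *possible* improvement a trainee actually realized.  This is my illustration, not something from the book, and the scores below are invented:

```python
# Normalized gain: fraction of the available headroom a trainee closed.
# g = (post - pre) / (max_score - pre); all scores here are invented.
def normalized_gain(pre: float, post: float, max_score: float = 100.0) -> float:
    if pre >= max_score:  # already at the ceiling; no room left to improve
        return 0.0
    return (post - pre) / (max_score - pre)

# Both trainees improved by 20 raw points, but the second closed a far
# larger share of her remaining gap.
print(normalized_gain(40, 60))  # 0.333... of the possible gain
print(normalized_gain(75, 95))  # 0.8 of the possible gain
```

Even so, a classroom gain is a proxy at best; as argued above, the real final exam is whether the trainee can apply the skills on the job.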

What if you had helped to design the course?  What if you had decided what questions would be on the final exam, and knew that the entire course would be dedicated to teaching you the answers?  What if you knew the training would directly contribute to your success?

Trainers: How much do you collaborate with trainees in developing training?


Works Cited

Blanchard, Ken, Patricia Zigarmi, and Drea Zigarmi. Leadership and the One Minute Manager: Increasing Effectiveness Through Situational Leadership. New York: William Morrow, 1985.

Saturday, July 30, 2011

Art or Science? The Exciting Conclusion...

It turns out, Readers, that you are a quiet and moderate group.  The dominant refrain from reader response and conversations I've had with clients and colleagues is that Change Management must be a combination of art and science in order to be truly effective.

This is great progress from a few years back when many of the people at the firm I worked for had a tendency to interpret CMS - Change Management Strategy - as Chicks Making Slides.

I agree that Change Management requires a combination of the science of methodology and planning with the art of people management and flexibility.  Today, though, I'll focus on the science side.  Specifically, I'd like to call your attention to an area I feel has been greatly lacking in Change Management - the science of Return on Investment (ROI).

Heads up, all you Change Management grad students (does such a thing exist?), because I'm about to depart from my practical advice to lay out my Change Management ROI thesis proposal.

A (Theoretical) Approach to Measuring CM ROI 

Perhaps one of the most difficult questions I receive from clients is, "How can you prove the value add of Change Management?"  I can provide a large number of anecdotes.  I can point to surveys such as IBM's Making Change Work study.  I can list dozens of articles and books that discuss the value of Change Management.

What I cannot do is authoritatively say that if you invest X dollars in Change Management, you will receive Y dollars of value in return.

What we are sorely missing is a comprehensive, quantitative study of the value Change Management brings to a project or organization.

The Subject
I propose a study using a large organization such as IBM, Accenture, or Johnson & Johnson as the subject.  IBM and Accenture have the benefit of an extensive and diverse portfolio of projects that include Change Management.  A non-consulting firm such as Johnson & Johnson has the benefit of not needing to get permission from other organizations to use their data.

Even better - study more than one organization.  This would help control for differences in methodology and execution.

The Data
The chosen organization would need to gather a few major pieces of data for each project they implemented:
  • What was the total cost of the project?
  • What was the total spend on Change Management activities, broken down by Change Management, Communications, Training, etc.?
  • What were the project's objectives for itself (e.g., schedule, budget, team morale, etc.)?
  • How well did the project meet these objectives throughout the project lifecycle?
  • What were the project's business goals and/or Key Performance Indicators (KPIs) (e.g., save X amount of money, reduce process time by X minutes, increase customer satisfaction by X %)? 
  • How well did the project meet these goals and/or KPIs at Go-live, one month post-Go-live, one year post-Go-live, etc.?
As a control group, they should gather the same data for projects where they did not have any Change Management, as well as for projects that only had training.

This data should allow them to compare how well a project with Change Management met its internal and business objectives versus projects that did not utilize Change Management.  It would also allow them to do a cost/benefit analysis of the money spent on Change Management as a percent of overall budget versus the level to which they achieved their objectives.

I would recommend projects where objectives are very concrete, making them easy to set and measure.  It's much easier to determine how much a procurement system implementation will save you by reducing maverick spend than it is to determine how much happier your employees are after implementing a culture change.

The Roadblocks
There are a number of hurdles a researcher would have to overcome to implement this type of study.  First, my experience tells me that many projects don't set clear business objectives or KPIs.  I've been on a number of projects where there was no defined business case for the system implementation.

Second, on projects where business objectives are clearly set, I believe it's relatively rare for consulting firms to go back to the client on a regular basis post-Go-live to determine how well those objectives have been met.

Third, there are a large number of variables.  Would different consulting companies have different results based on their methodology?  How much of an impact does the experience of the Change Manager have?  How exactly would we define Change Management for the study?  And the list goes on.

There are other challenges, but the final one I would point out (and this is completely personal opinion) is that I think many Change Managers have a nagging fear in the back of their minds that a study like this might not conclusively demonstrate a positive ROI.

We shouldn't let that stop us, though.  I believe we would find a correlation between including Change Management on a project and the project's ability to help an organization meet its KPIs.  And if we don't, that at least would point out to us areas where we can improve our methodology and practice.

Do you think this type of study would work?  Have you seen any good, quantitative research on the value of Change Management?

Saturday, July 16, 2011

Change Management: Art or Science?

Today I have a question for you, Readers.  It is one of the ongoing debates among Change Management practitioners: Is Change Management an Art or Science?

Please leave your comments in the comments section below this post, or feel free to e-mail me directly at emilycarrconsulting@gmail.com

I'll post my position on the question next week.  I look forward to the debate!

Tuesday, June 21, 2011

The Danger of Smiling Faces: Fostering Two-Way Communication

There is an on-going debate over how to measure success in Change Management.  Whether your focus is on user adoption, communications, or training, there is a constant stream of "new and improved" methods for determining if your Change program is a success.

Regardless of the formal method used, I've found that the overwhelming majority of people have one fundamental measure for Change Management success: The more people who are happy, the more successful the Change program.  Sometimes we see this explicitly, as with the "Smile Sheets" that so many trainers distribute after each training course.  These surveys ask people whether they liked the course, whether they thought the trainer was good at his job, and whether they thought they learned something.  What the surveys don't do is tell the trainer whether or not he was actually successful in training the end users on a new skill.  I'm not saying these surveys are bad - I use them myself.  They are not, however, a reliable measure of training success.

While Smile Sheets very explicitly show the correlation most people see between happy end users and successful Change Management, the gut-level belief that most people have in this correlation is more often implied.  How often have you heard (or said) these lines?
  • I just got out of a meeting where everyone was really negative about the project.  We'd better change our Change Management approach.
  • I was talking to some people in the lunch room and they weren't happy about the change.  Why isn't the Change program working?
  • I'm getting a ton of calls from people who don't like the decisions we're making.  You'd better figure out how to make them realize this project is great.
All of these statements imply that because stakeholders are unhappy with the changes you're implementing, the Change Management program is failing and in immediate need of re-work.  Assuming that these statements are coming during the life of the project, however, and not at Go-live or later, I would argue that statements of unhappiness, requests to discuss project decisions, and expressions of annoyance are a sign of success.

What they demonstrate is that people are engaged.  The first stop on the Change Curve is Awareness, and it's not possible to be unhappy with a change unless people are first aware of the change.  Unhappy stakeholders?  Congratulations!  You've successfully reached Awareness. 

The second stop on the Change Curve is Understanding.  People yelling that they don't like how the changes you're making will impact them?  Congratulations!  Your stakeholders understand the change, and you've successfully reached the second step to achieving change.

When I get half-way through a project and all I see are happy, smiling faces, I get worried.  100% happiness tells me that I'm failing miserably at Change Management because one of three things is happening:
  1. People don't know a change is coming, which means my communications aren't reaching their intended audience.
  2. People don't realize how they will be impacted by the change, which means they don't understand the communications I'm sending.
  3. People are pretending to be happy, which means I haven't fostered good two-way communications.
For today, let's focus on the third option and look at:

Practical Tips for Building Two-Way Communications
  1. Make it Easy - People are busy.  If you make it difficult to communicate with you or the project team, no one will bother.  If you have an e-mail address, make it easy to remember.  If you want people to respond via a web site, remember to include the link so that all they have to do is click.  The easier it is for people to tell you what's on their minds, the more likely they are to communicate.
  2. Make it Confidential - How often have you sat in a meeting where no one offers an opinion, but as soon as you leave everyone hurries to someone else's office where they suddenly have lots to say?  Many people are more comfortable sharing their thoughts in a one-on-one setting than they are in a group.  Even more people are likely to communicate if they know that what they tell you will be anonymous when it's shared with others.  This not only requires that you provide people a safe, confidential forum for communicating, it also requires that they trust you to maintain the confidentiality.  Change Management often falls in the HR department for a reason.  Even if it doesn't, you should practice the same level of discretion as an HR professional.
  3. Make it Matter - If you want people to communicate with you on an on-going basis, you have to demonstrate that what they tell you matters.  This means that you pay attention to what they say, take appropriate action based on the conversation, then follow-up to let them know the outcome.  Did they ask you a question?  Make sure you get back to them with the answer.  Did they make a suggestion?  Even if you can't guarantee that their suggestion will be implemented, let them know that it will be considered and tell them when they can expect to hear more.  Did they just want to vent?  Remember that even if you don't agree, everyone is entitled to have an opinion.  One of my major rules of communications: Never ask people for input if you have no intention of seriously considering it and taking appropriate action.

Do you think happy stakeholders is a good way to measure Change Management success?  What other tips do you have for building two-way communication? 

Monday, June 13, 2011

If I Could Have One Wish: Better Documentation

When the first star comes out each evening and I think about what I'd like to wish for, I close my eyes, concentrate really hard, and wish for better documentation on projects.  Unfortunately, many projects don't take the time to develop thorough, consistent documents.  Even worse, some projects don't produce any documentation at all. 

Why do I care?  Change Management is often seen as a separate team working in a little silo of happiness.  At first glance, functional and technical teams can't fathom why a Change Manager would get so upset when she hears that they've decided not to create test scripts or document major project decisions.  Because Change Management is a perpetual "down stream" team, though, almost every action that the project team takes impacts Change Management activities.

Consider these examples:
  • Test scripts are often the foundation for training development.  With poor or non-existent scripts, the training developers are forced to reinvent the wheel by either researching customizations to the system, trudging through a series of trial-and-error, or peppering the functional teams with endless questions.  All of these methods require extra time that could have been avoided with detailed scripts from which to work.
  • Major decisions can occur on a daily basis on projects.  The Change Management team is expected to communicate this information both internally to the team and externally to end users.  If decisions are not documented, the communications person is often left to rely on second-hand notes and water-cooler chatter to figure out what message to send.  This can result in messages that are both inaccurate and late.
  • A favorite phrase of many functional leads is, "That's a training issue."  While it very well may be, if there's no place to document issues and risks, the Change team has no way of knowing what concerns they're expected to address.  Three months later when it's time to prepare for Go-live, there's very little chance that all of the teams will remember all of the issues they thought would be handled by Change Management.
If you've never been on a project, or if you're a client who has listened to firms pitch their methodologies, you may be wondering how a lack of documentation could possibly be an issue.  Every firm will tell you about how they have tried-and-true document templates that will be used throughout the project lifecycle to capture important information about the system and then left behind for the client to reference in the future.  If you've been doing project work for a while, though, you know that the reality is often very different.  The top 5 excuses I hear for why teams don't want to create documentation:
  1. We don't have time.
  2. We've done this 100 times.  We don't need to document.  We have all of the information in our heads.
  3. The client/end user already understands the system.  We don't need to document it for them.
  4. Documentation's a waste of time.  The system/processes are just going to change anyways.
  5. We're communicating all of the information verbally.
While I'm tempted to jump in and talk about why none of those excuses are valid, instead I give you:

A Practical List of 5 Documents that Should NEVER Be Skipped
  • A Project Work Plan - A project work plan is the backbone of your project.  It ensures that everyone is working toward the same dates, and provides an early warning system if tasks and time lines start to slip.  Without a strong project work plan, teams and individuals cannot properly plan their work and activities.  Many projects don't include Change Management activities in the work plan.  This isn't ideal, but I can live with it.  If a work plan is missing entirely, though, I recommend you pull the emergency cord and halt the project until you can figure out how to put a plan in place.  Whatever the cost of stopping for a week, it will be less than the cost of working for months without a plan, only to discover you're off track.
  • Test Scripts - I addressed this above from a Change Management perspective.  Even without considering training development, though, test scripts are still important.  The client should be involved in testing the system, and it is nearly impossible for them to do this without solid test scripts.  Further, you need detailed test scripts to verify that you have built and tested all of the requirements you gathered.  Finally, if you're implementing a system that needs to be SOX compliant, you'll want to have test scripts in place when the auditors come to call.
  • Issue/Risk Log - I'll keep it short.  Issues and risks that don't get documented typically don't get resolved.  The choice is yours.
  • Decision Log - If you have never sat through a meeting where someone said, "Who made that decision?  I never agreed to that!", and then proceeded to spend the rest of the meeting talking about a decision that was finalized months ago, you're very lucky.  Aside from the benefits a decision log has for the Change team, which are mentioned above, the log assures that everyone has the opportunity to read decisions as they are made.  If they don't agree, they can speak up before work based on that decision proceeds, saving time and money by reducing potential re-work.  They didn't read the log and months later decide they don't like the decision?  Too bad.  The decision log is also extremely helpful when months later people's memories have become fuzzy and disagreement arises about what exactly the decision was. 
  • Training Materials - Despite what college students everywhere like to believe, it really isn't possible to learn through osmosis.  If you want end users to understand and use the system you just spent countless hours and money implementing, you need to train them.  You've come this far.  Don't skimp now.
Is your project documenting?  If so, what benefits have you seen?  If not, why not?

Wednesday, June 1, 2011

Emily's Favorite...Training Development Tool: User Productivity Kit

Over the years, I have developed some favorites in the world of Change Management.  I'll occasionally highlight one of them in this blog.

Today, I'll share my favorite training development tool: Oracle's User Productivity Kit (UPK).

About seven years ago, I was on an SAP implementation.  The client was using a training development tool created especially for SAP, but decided to invite a company called Global Knowledge to demonstrate their training development software, OnDemand.  By the end of the demonstration, everyone in the room was drooling over the functionality this software provided, and the potential it had to improve training development.

After I left that project, I went to other clients where we used a number of the standard training development tools: Camtasia, Captivate, and SnagIt, to name a few.  Although they all have their uses, I still longed to get my hands on that OnDemand software.

So imagine my joy when three years ago I joined an Oracle implementation and discovered that Oracle had purchased Global Knowledge Software LLC and was now using the OnDemand software as their standard training development tool.  They had re-branded it, "User Productivity Kit" (UPK), but it was still the software I had dreamed about all those years.

I dove right in and learned everything I could about UPK, then proceeded to develop and teach courses on it, so that others could enjoy using it, too.  Before I embarrass myself with extreme amounts of gushing, though, let's look at:

5 Practical Benefits of UPK
  1. More than a Training Tool - You'll notice that User Productivity Kit does not have the word "training" anywhere in the title.  That's because this is more than just a training development tool.  It is actually designed to be used throughout the project lifecycle.  Teams that are truly dedicated to getting the most out of UPK can begin using it early in the project to create process documentation and test scripts.  Not only does this help enforce consistency in documentation throughout the project, it also directly feeds into training development efforts, reducing the amount of time required for the training development lifecycle.
  2. One Input, Many Outputs - Training developers often dedicate a lot of time to reformatting the same information into different training tools.  For a single process, they may need to create a training manual, job aid, demonstration, and hands-on activity.  With UPK, you record the process once, then in one step publish the information in all of these formats, plus more.  It is a huge time saver.
  3. Collaboration - UPK comes in two different versions.  The stand alone version sits on your computer, and no one can access the information other than the person working on that computer.  It also comes in a version that sits on a server, however, and this is a huge win for any project with multiple developers and/or a proper review process.  By putting UPK on a server, anyone with access can view and edit any of the training.  This removes the need to print or e-mail copies for review.  UPK also lets you assign people to each training topic and change the status of topics, which allows the project to see who is currently responsible for the topic and how close it is to being complete.  Anyone who has ever undertaken the complex task of running training development and review understands the benefit of having this level of collaboration.
  4. Integration - If you use UPK with an Oracle product, it has an outstanding level of integration.  As you record your actions, it not only captures the steps you're taking, but it also fills in information.  Click on a button, and it will A) enter the name of the button directly into the training, so that the developer doesn't have to type it in later, and B) automatically insert any alternate ways of completing the action.  For example, if I use UPK to record pasting data into a field using the Paste button, UPK will not only enter the phrase "Click the Paste button." but will also add an alternative action that states "Type CTRL-V."  UPK is also able to do this for some non-Oracle products.  They don't advertise it, but UPK works pretty well with other applications, such as the Microsoft suite of products.  The integration isn't as good as it is with Oracle products, but if you already own UPK, it doesn't become obsolete when your Oracle project is complete.
  5. Ease of Use - I taught myself how to use UPK.  If you're not always on speaking terms with your computer, you may want to invest in a training course.  At the end of the day, though, UPK is an easy tool to use.
Have you used UPK?  What are the pros and cons of working with it?