My daughter Emily is a sophomore engineering student and smarter than her dad. Exhibit A is her teaching me a lesson when I was teaching her how to drive.

As we sat at a stop sign contemplating a left turn, I began chatting away about gauging the distance and speed of perpendicular traffic, the risks of assuming that other drivers use their turn signals, and the benefits of firm use of the accelerator. Emily turned to me and said, “Dad, I think what I should do is make the left turn. Then we can discuss what happened and why. Then maybe I’ll change what I do the next time I make one. And that’s when I will learn something.”

It doesn’t take smart people long to make their points. Adults, including the staff at your firm, learn best by doing something with an aim or goal in mind, then stepping back and considering what happened and whether it met that goal. If we’re really smart, we then consider why something did not go quite right before we decide what to adjust for next time. Then we try something new, reflect on how that went, and continue to make adjustments until we achieve our goal. OK, I’m not smart enough to be done right there. I have to backtrack to that why moment. If we’re fortunate, we have resources to help us figure out why. Others have tried doing the things we’re trying to do in our firms, and they learned lessons they can share with us about both the why and what we can do about it. Doctors call this diagnosing a cause and determining a general remedy, and good resources, like good doctors, help us figure out what we need to change. Then we make the fix over time.

Sometimes these resources come in the form of models, tools, techniques, best practices, templates, tip sheets, standards and the like. The best of these are easy to understand, easy to apply and easy to change into something that best fits your firm. If we use a trainer or advisor to help us, this training and development content should be delivered in real time to solve a real-world problem by addressing its root cause.

The majority of staff-development programs and activities are not designed and delivered this way, which makes them questionable investments with a questionable impact on your brokerage. Working as an HR professional, I became familiar with the subtopic of training evaluation, or how you know if your training is any good. In that world, Don Kirkpatrick’s Four-Level Training Evaluation Model is regarded as the gold standard. According to Kirkpatrick’s model, if training participants are satisfied with their experience (Level I), they will retain more of the training content (Level II). If this occurs, they will perform better (Level III) and, ultimately, so will your firm (Level IV).

But Jim Kirkpatrick, Don’s son and also a professor and training expert, contends the link between post-training tests (Level II) and later performance change (Level III) doesn’t hold up in their research. He asserts that the general training community distorted the model by over-focusing on satisfaction and post-training knowledge retention and only inferring their impact on what truly matters: performance and business impact. He urges the training community to reform its approach in favor of more facilitative methods that actively support staff and management teams as they learn by doing, thereby affecting performance and business results directly. Because that’s how you know if training is any good.

None of us want to invest in a losing design, and, if we think about it, the reasons event-based classroom training programs generally fall short become clear. Many of us like training and trainers when they make us feel comfortable, but being comfortable is not the same as being prompted to change and improve. At times we enjoy training as one might enjoy a stage performance, but the applause fades quickly when we don’t see much relevance to our day jobs.

Some of us are adept at big-picture thinking that comes across well in training and post-training tests, but big-picture thinkers can struggle to actually make things happen. Most of us require ongoing reinforcement of a new practice for at least a month before it’s embedded. And if the root causes for performance gaps have mainly to do with teamwork and supervisory issues, these will not be addressed by sending one of us off to be trained.

In a winning design that’s most likely to generate a return on investment for your firm, individuals and real-world teams work through this “learning by doing” process:

  • They define in operational terms what they want to achieve.
  • They assess their current state compared with this desired state, noting strengths and gaps and reasons for the gaps.
  • They identify strategies to address these reasons and commit to quick, mid-term and longer-term actions to make them happen.
  • They implement these actions and monitor progress, impact, lessons and adjustments to make, which forms a cycle that renews itself as the team learns by doing.

This is not a description of an event, nor is it particularly content-heavy, as a training curriculum tends to be. Rather, content that supports this improvement cycle is introduced in small, responsive packages when individuals and teams need them as they work to solve real problems. Over time, working through this cycle becomes a common practice that’s embedded in the way your firm goes about its business.

This cycle is harder to turn into a product, price and deliver on a large scale, so the training field has been slow to adopt it. But at The Council we’ll be adding a few of these packages to our Web-based tools and Management Series Newsletter this year, with an initial focus on staff development and retention. We’ll also be gathering stories of how you, our member firms, are improving and driving innovations in these and other areas.

As Emily and Jim will tell you, it all starts with the doing.