What is stopping companies from measuring learning?

May 16, 2014 Monica Kraft

Is your company's learning and development strategy wedged between a rock and a hard place? At the risk of sounding hyperbolic, it's probably fairer to replace the rock with a series of boulders, which are best viewed through the lens of skillsets, datasets, toolsets and mindsets. In one of our recent posts, "Why you need to take a Google Analytics approach to measuring learning," we laid the foundation of our perspective on L&D's constant battle for greater relevance in business strategy and planning.

In short, learning and development organizations and departments are suffering from an analytics crisis. This has invariably led to doubts among L&D professionals - and corporate leaders alike - about their role and relevance in the company's overall structure. Are they simply another budget line item or overhead cost? And, at the same time, how do you measure return on investment of L&D initiatives? These questions have contributed to our conclusion that we need a new plan of attack to measure learning outcomes. By adapting our language and perspective to integrate terms like traffic, bounce rate, conversion, time on site and social sharing into an L&D context, we create a dynamic system that uses data to drive smarter decision-making and draw on accurate metrics for more actionable insights.
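To make that adaptation concrete, here is a minimal sketch in Python (the event schema, field names and thresholds are invented for illustration, not any particular vendor's data model): a session records who took a course, how long they spent, how far they got, and whether they completed or shared it, and the familiar web metrics fall out of simple aggregations.

from dataclasses import dataclass
from typing import List

@dataclass
class LearningSession:
    learner_id: str
    course_id: str
    minutes_spent: float   # analogous to "time on site"
    modules_viewed: int
    completed: bool        # analogous to a "conversion"
    shared: bool           # analogous to "social sharing"

def course_metrics(sessions: List[LearningSession]) -> dict:
    # Aggregate raw session events into Google Analytics-style learning metrics.
    traffic = len(sessions)  # "traffic": visits to the course
    if traffic == 0:
        return {"traffic": 0}
    bounces = sum(1 for s in sessions if s.modules_viewed <= 1 and not s.completed)
    return {
        "traffic": traffic,
        "bounce_rate": bounces / traffic,
        "conversion_rate": sum(1 for s in sessions if s.completed) / traffic,
        "avg_time_minutes": sum(s.minutes_spent for s in sessions) / traffic,
        "share_rate": sum(1 for s in sessions if s.shared) / traffic,
    }

sessions = [
    LearningSession("a", "onboarding-101", 4.0, 1, False, False),
    LearningSession("b", "onboarding-101", 22.5, 6, True, True),
    LearningSession("c", "onboarding-101", 18.0, 5, True, False),
]
print(course_metrics(sessions))  # e.g. bounce_rate 0.33, conversion_rate 0.67

None of this replaces a real analytics platform; the point is simply that once learning events are captured consistently, the same questions marketers ask of a website can be asked of a course.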

And this is a high-stakes issue. How much do you think enterprises are investing in their employees? It's no small sum. According to Deloitte's most recent research, annual spending on corporate learning increased 15 percent in 2013, reaching more than $70 billion in the U.S. alone as companies work to shrink the skills gap. How long can L&D organizations expect this to continue if they can't prove to their stakeholders that their training tools and programs are effectively influencing employee performance and driving ROI?


We put this question to various L&D professionals on LinkedIn to get to the bottom of what's stopping companies from measuring learning today. Here's what we found:

Skillsets

In any industry, clear communication is fundamental to showing stakeholders your worth within the company. L&D is no different—except that its professionals must work even harder to demonstrate impactful training outcomes and ROI.

"In my book, the buck stops fairly and squarely with the L&D profession as it's their job to persuade senior execs to invest in training to improve business performance," Graham Cook, founder of RSVP Design & Mobile Learning Design, argues. "The truth is the industry has done a lousy job over the years in explaining, indicating, measuring, proving or at least arguing credibly as to why particular training should be done in particular ways."

So, the basic skillset lacking in the L&D community is communication. Cook's argument is a call to action challenging learning professionals to expand their abilities as leaders and show those at the top that L&D is a personnel issue affecting virtually every other aspect of business operations.

Another skill that Cook claims needs reinforcing is trainer knowledge and planning.

"Trainers need to understand what the skill improvement or the behavior change the business needs to see being applied in the role before they start to design or select any training courses," he explained.

There's little argument against planning as a fundamental part of establishing a benchmark to measure ROI, but what happens once training is deployed? After you design courses based on the skillsets you and your employees agree need strengthening, a steady stream of data begins to flow—especially in digital environments—and it needs to be explored.

Datasets

Without substantial, accurate data, you're hard-pressed to demonstrate to anyone whether it's worthwhile to invest in L&D organizations. In too many cases, data simply isn't visible.

"The first step is to get clear visibility and trackable feedback from your coaches and managers," explained Simon Mormen, managing director and founder at the application developer Atomus.

In other words, data must be used to benefit learners in coordination with those in charge of training. This concept ultimately draws a connection to the tools we use to measure learning. As anyone familiar with analytics will tell you, the data exists; the difficulty many companies have is finding a way to collect it for meaningful analysis. Then, how do you tell which data is worth measuring? How do you put data to use?

"Data collected will not only assist in improving your training delivery but also highlight areas that may need addition training or adapted materials," Mormen continued.

So, from this we can see that your datasets have the potential to be incredibly powerful in understanding and adapting your L&D strategies, but the key is to make the data visible to all stakeholders. However, is all data equal?

A manager of Learning & Organizational Development at an American media and entertainment company argues that "soft skills" are too amorphous to draw out reliable datasets.

"How do you measure soft skills - management communication, influence, leadership?" he asked.

Toolsets

Part and parcel of the issue of data are the toolsets available to many L&D organizations. In many cases, professionals in the industry seek intuitive tools that make analyzing and integrating data a seamless experience.

"Make analytics easy to use, the reports easy to disseminate, and the ability to integrate one's learnings into the next round of courses is key," said Gregg Hill, president and CEO at Wavicle.

However, the obstacle facing many L&D professionals is the lack of a consistent standard to incorporate into their learning strategies.

Morne Lippiatt, business consultant at SunGard Consulting Services, admits L&D isn't measured accurately.

"What we need is a robust and reliable framework that is also an industry standard and more importantly - available to all learning professionals at all levels," Lippiatt suggested.

Hill's request is hardly out of reach, and these tools are readily available, but embracing them as the industry standard Lippiatt describes requires a larger mindset shift.

Mindsets

Among L&D professionals, there's fairly wide agreement that the real issue is how organizations perceive the value of learning. Take Graham Cook, for instance.

"Organizations don't care about measuring learning, they care about applying learning that improves business performance."

Understandably, most companies are focused on the financial outcomes of training - does L&D improve their bottom lines? Measuring learning isn't a priority. John Prpich, director of Learning and Development for Rush Enterprises, goes even further:

"There are many reasons why most organizations don't measure learning, it's real simple, they just don't care," Prpich explained. "[W]hen I say that most organizations don't care about spending the money they spend on training, I mean that their perception of its value is misguided, it's an investment and not an expense and should be treated as such."

Accordingly, many business leaders adopt a perspective that places L&D expenses alongside product development, for example. There's an outcome, and it costs a specific amount of money to get there, with the ultimate goal of getting a financial return that outweighs the costs.

What's more, Prpich cites organizational culture as another stumbling block.

"If the culture doesn't support learning, or if people aren't encouraged to learn, they'll just do what their boss tells them," he said.

Based on the discussion ignited in this LinkedIn forum, the primary obstacle is mindset. However, it's not as simple as you'd think. Yes, organizations continue to view L&D as a line-item expense in profit-and-loss analysis rather than the long-term investment it is. Yet L&D professionals continue to herald formal evaluation models - Kirkpatrick and Phillips - as industry standards, essentially disregarding new tools at their disposal.

What should we do? It's clear we have to demonstrate the value of informal learning, supported by ongoing, granular analysis using the right tools. In reality, that requires an organizational transformation, not a flick of a light switch. But it's clear - especially considering how quickly progressive companies are embracing digital L&D strategies - that we need a Google Analytics approach.

 

About the Author

Monica Kraft

Monica Kraft is the Director of Product Marketing for Xyleme, Inc. Xyleme provides content management for learning and development. In this role, Monica is responsible for PR, event planning, customer relationship development, sales enablement, content development for Xyleme.com, digital marketing and solution education, and input into the branding strategy, overall marketing plan and budget.
