About the Author:

Is Agile Working For You? Metrics For Agile Success

November 14th, 2018

Making the decision to move to an agile environment is tough enough, but opting to stay the course when agile gets difficult is even harder. In fact, it isn’t uncommon for companies to revert to what they used to do; it’s easier.

Like anything else, determining whether agile is working is impossible if you’re not measuring for success. It’s easy to assume you’ve adopted enough of the “agile culture” without tracking actual output and change, and most teams that fail at adopting agile in their work environment do so because they haven’t tracked their successes.

So, why did you switch to agile anyway? The promise is big – reducing risk across the business and delivering results more quickly and efficiently, which leads to greater customer satisfaction. Agile means that things change, but also that you (all of you!) have to actually do the work. Talking about change and actually changing are two very different things, which is why these metrics will help you determine whether you’re actually becoming more agile.

Measuring Agile Success

Metrics have a bad reputation, most of which is unwarranted. One sticking point? Choosing which metrics mean success for you.

There are a lot of ways to measure input and output, speed and efficiency, process improvement, and customer satisfaction, so which ones are most important? Since many measure similar things in different ways, it’s less about which are the most important than about which are best for your team:

  • Which metrics resonate the most?
  • Which metrics make the most sense for your business?
  • Which metrics can be tied to your overall KPIs?

The following sections provide an overview on product, process, and business success.

Achieving “done”

Turning to agile is supposed to make it easier and more efficient to reach a finished product. Agile relies on predetermined blocks of time, commonly a series of two-week sprints, and the team sets a goal for what to accomplish within each sprint that will set it up to finish the product over the whole series. A common agile principle is that product goals can shift across the series of sprints, but the scope should not change in the middle of a single sprint.

Agile metrics, then, can tell you exactly where your development process is improving – or not.

The biggest, most popular way to track success is on-time delivery, but that metric may come a little too late. Here are common ways to track and achieve “done” while working towards the finish line:

  • Sprint burndown, which tracks how much work the team has completed within a single sprint compared to what the team said it would accomplish. Consistently finishing the work before the sprint’s end means your team may not be taking on enough work, whereas rarely completing what you set out to do is a sign that you’re overcommitting.
  • Version burndown, which is similar to a sprint burndown but tracks progress over several or all sprints (epics), not just a single sprint. This metric can help determine whether you’re making real progress or whether the scope is changing too much or too frequently for meaningful work to be completed.
  • Cumulative flow diagram, which tracks the workflow across all teammates involved. This makes it easy to see who blazes through work and who is stuck on a problem, enabling more teamwork to accomplish the sprint’s goals.
  • Defects, which shows the problems and bugs that currently exist. Tracking these helps determine how much time you can or should dedicate to addressing defects during development, and whether to push some fixes to a post-release patch or save certain defects for future releases.
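
Burndown itself is simple arithmetic. The sketch below, with entirely made-up sprint numbers, shows the calculation behind a sprint burndown chart: the committed work remaining at the end of each day.

```python
def sprint_burndown(committed_points, completed_by_day):
    """Remaining story points at the end of each day of a sprint.

    committed_points: total points the team committed to at sprint start.
    completed_by_day: points completed on each day, in order.
    """
    remaining = committed_points
    burndown = []
    for done_today in completed_by_day:
        remaining -= done_today
        burndown.append(remaining)
    return burndown

# A hypothetical 10-day sprint with 40 committed points.
print(sprint_burndown(40, [0, 3, 5, 4, 0, 6, 5, 7, 4, 6]))
# Ending well above zero suggests overcommitting; hitting zero with
# days to spare suggests the team isn't taking on enough work.
```

Plotting that list against the "ideal" straight line from 40 to 0 gives the familiar burndown chart.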

Process improvement

The goals of agile are clear – efficient, speedy delivery that results in customer satisfaction – but the how is trickier. How do you improve? How do you get faster? How do you get more efficient? The answer lies in continuous improvement. If you’re seeking to implement an agile culture, then the team and management must support continuous improvement.

Here are two ways to track whether your process is actually improving:

  • Control chart. Though this chart can be used in tandem with measuring “done”, a control chart shows the progress of individual pieces or components. This is useful because it immediately reflects whether your tweaks and changes (being agile) are actually helping, hindering, or having no result on the work.
  • Velocity, which tracks the average amount of work completed per sprint. Also useful in measuring for “done”, velocity tracked over time confirms whether a particular change actually improved your process.
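
Velocity is likewise a simple average, and comparing it before and after a process change is how the metric answers the improvement question. A minimal sketch with hypothetical sprint numbers:

```python
def velocity(points_per_sprint, window=3):
    """Rolling average of completed story points over the last `window` sprints."""
    recent = points_per_sprint[-window:]
    return sum(recent) / len(recent)

# Completed points for six hypothetical sprints; imagine a process
# change was introduced after sprint 3.
history = [21, 19, 23, 27, 30, 29]
print(velocity(history[:3]))  # average before the change
print(velocity(history))      # average after the change
```

If the post-change average is durably higher (and quality metrics like defect counts haven’t worsened), the change helped.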

Business success

Of course, a finished project isn’t enough. Your entire development system may be agile, but if your clients or customers aren’t satisfied with the working and final products, what’s the point of it all? Only business metrics can tell you whether your products are meeting client and market needs.

The best way to track this is to engage your customers in several ways and at different points of the process. Metrics like the ubiquitous net promoter score (NPS), sales figures, and stats that consider usage or support calls versus delivered features are essential to gauging whether successfully implementing agile culture also successfully improves your business – that’s the whole point.
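
As an illustration, NPS follows a standard formula: the percentage of promoters (ratings of 9–10 on a 0–10 scale) minus the percentage of detractors (ratings of 0–6). A quick sketch with made-up survey data:

```python
def net_promoter_score(ratings):
    """NPS from 0-10 survey ratings:
    % promoters (9-10) minus % detractors (0-6); passives (7-8) are ignored."""
    promoters = sum(1 for r in ratings if r >= 9)
    detractors = sum(1 for r in ratings if r <= 6)
    return 100 * (promoters - detractors) / len(ratings)

# Ten hypothetical survey responses.
print(net_promoter_score([10, 9, 9, 8, 7, 7, 6, 5, 9, 10]))  # -> 30.0
```

Scores range from −100 to +100; trending the score across releases is more informative than any single reading.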

Traditional Learning Analytics Provide No Value

October 22nd, 2018

If you ask any LMS administrator what they report on a regular basis, you might hear things like course completions for e-learning modules, assessment scores, or class attendance. Historically, these things mattered – a number that proved the learner did something and got credit for doing it. The problem with those types of metrics is that they don’t really demonstrate whether learning actually took place. When it comes to true learning analytics, very few of the traditional levels of assessment provide value anymore. Technology has advanced in such a way that learning analytics now consists of tracking the transfer of knowledge and the effects that transfer has on a business.

Let’s take a look at Kirkpatrick’s four levels of assessment, one at a time. First, there’s Reaction. Did the participant find the learning event to be enjoyable and relevant? Answering this question on a survey tells the evaluator that the event wasn’t boring, the trainer was likable, or there was an above average catered lunch because that’s all that stood out to the participants. There’s nothing in this data that tells the evaluator whether learning actually happened. Participants can fully engage within a class and still walk away no more educated than they were when they walked into the class.

To really assess whether learning happened, let’s move on to Kirkpatrick’s second level – Learning. This level is meant to measure how well participants acquired the expected knowledge and skills based on participation. A typical assessment strategy that demonstrates an increase in knowledge and skills could be a pre-test followed by a post-test. Surely if participants score higher on the post-test than they did on the pre-test they have learned something, right? Those numbers are undeniable. Except that all that demonstrates is a participant’s ability to recall and then recite the information presented.

How many of you reading this ever crammed for a test in high school or college? How much of that information did you retain after the test was over? Let’s check the science on this one. According to German psychologist Hermann Ebbinghaus’s forgetting curve, humans tend to lose their memory of freshly learned knowledge in a very short time if they do not make the effort to retain it. In other words, use it or lose it. If you don’t apply what you’ve learned, the training will not stick. So, when we assess learners immediately after a class, chances are they will forget what they learned as soon as they click the submit button on that test.  
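
The forgetting curve is commonly approximated as exponential decay, R = e^(−t/S), where t is time since learning and S is the strength of the memory; the functional form is a modeling convention rather than Ebbinghaus’s own notation, and the numbers below are purely illustrative. A sketch:

```python
import math

def retention(t_days, strength):
    """Fraction of material retained after t_days, using the common
    exponential approximation of Ebbinghaus's forgetting curve:
    R = exp(-t / S). The strength S grows with reinforcement,
    e.g. spaced practice or on-the-job application."""
    return math.exp(-t_days / strength)

for day in (0, 1, 7, 30):
    # Unreinforced memory (small S) vs. reinforced memory (larger S).
    print(day, round(retention(day, 2.0), 2), round(retention(day, 10.0), 2))
```

The shape is the point: without reinforcement, retention collapses within days, which is why a post-test taken minutes after class says little about lasting learning.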

This is where most learning evaluations stop. The learning event occurred, and the session was evaluated, so the job is done, right? Wrong. Where you can really begin to see whether learning stuck is through Kirkpatrick’s third level – Behavior. Did participants change their behavior based on what they learned? The idea that you can observe this behavior change is exactly what can demonstrate whether learning has been transferred.

For tasks that don’t require decision making, repetition is the key to increased performance, or changed behavior. Think about a basketball player practicing his jump shot. Technically, the execution is perfect. Swish after swish alone in a gym. But what happens when he gets on the court for the first time, his team is behind by one point, the clock is winding down, and he has to choose to either take the shot in the face of a defender, or pass the ball to an open teammate? The behavior he has been training to perform may seem like it’s all about the shot, but it’s not. The goal of his training is to score points. The player’s decision making competence is what demonstrates his learning.  

Author and research translator Will Thalheimer is an out-of-the-box thinker when it comes to learning transfer. He developed a Learning Transfer Evaluation Model (check out the full paper here, or the one-page summary here) that speaks to what learning evaluations should look like: given a situation where a decision is necessary, learners must assess the situation and rely on their training to make the right decision and take the right course of action.

Let’s put it into a customer service context, for example. When the phone rings in the call center, there’s no way for the support rep to know who’s on the other end of the line. It could be an angry customer, or it could be someone with a general support question, but once they say “Hello,” it’s time to act. They have to think, decide, and act in a matter of seconds in order to solve the customer’s problem and keep them satisfied.

Getting back to Kirkpatrick, let’s move on to Level 4, Results. This level shows us where true learning analytics live. It all starts with the desired outcome for the learning event. It’s not about whether the learner passes a test, or whether or not they enjoyed the training. It’s about whether the learner can perform a task that drives business results. Can the learner sell more widgets? Can the learner handle customer inquiries faster with fewer errors?

It takes a fundamental shift in how you design a learning event to think through what the ultimate outcome should be – up front.  As learning professionals, we’re tempted to look at levels one and two as the golden ticket, but that’s only half the battle. Those metrics tell you whether people like what you’ve built. Moving on to levels three and four truly demonstrates what impact a learning event has on the business. That’s where companies start to realize how valuable learning is for their bottom line.

You Should Definitely Build a Continuous Learning Culture

September 24th, 2018

Thanks to a rapidly changing global business landscape driven in large part by technology, we can either be left behind, change along with it, or, ideally, outpace the change to gain a competitive advantage. To do that, the individuals who power the leading organizations in the market have to be continuously learning and trying new things.

That continuous learning, said Tim Rahschulte, CEO of the Professional Development Academy, business professor at George Fox University, and author of “My Best Advice: Proven Rules for Effective Leadership,” is largely a cultural phenomenon.

Learning How to Get Things Done

Nurturing a continuous learning culture can do great things for the business. According to Deloitte’s 2016 Global Human Capital Trends survey, 82% of respondents said they believe that culture is a potential competitive advantage.

Culture can be a complex concept, but in this context it refers simply to how work gets done. Learning, and by association, an organization’s training efforts, can significantly impact the quality and quantity of work. Organizations that understand the importance of learning also understand the importance of continual change, and as Rahschulte said, they recognize the good work employees do but are never quite satisfied with how good that work could be.

Continual change and change management, quality, learning – these interconnected things directly impact an organization’s ability to innovate as well as its bottom line. Findcourses.com’s 2018 U.S. L&D Report survey reports that 42% of learning and development professionals say that if their employees are highly engaged in learning, they’re also highly engaged overall at the organization. That has implications for productivity, product development, and, perhaps most importantly, technological innovation.

“Most individuals love the idea of change,” Rahschulte explained. “But they loathe the idea of having to change. If we’re going to innovate, we’re going to need new technology. It doesn’t matter if you’re innovating the next generation jumbo jet, smartphone, or a planned community in an urban setting. That happens by learning to do the next thing: to become more efficient, to create the next product, or the next feature within a product, to do something different than what you’re doing today.”

Technology changes so quickly that using learning as a change agent makes perfect sense. The need to build a continuous learning culture in today’s fast-paced business environment is why scientists and researchers are conducting R&D-level work at companies like Apple, Facebook, and Amazon. “What we know to be true today may very well be questioned tomorrow because of the speed of technology. Think about cyber security and threats from cyber attacks. Many things we address today didn’t exist five years ago. Therefore, we need this constancy of learning,” Rahschulte said. “If we’re just trying to maintain what we have today, we’re actually falling farther and farther behind.”

Change? Do I Have To?

Unfortunately, too many organizations struggle not only with change but also with creating a culture that supports continuous learning, and experts say that is a key responsibility for HR and learning leaders. Too often, leaders erroneously assume that hiring the top technical talent, for instance, means it’s up to those individuals to continuously get better and stay competitive. That “we’ll hire the best people, and if you don’t keep up we’ll find somebody else to take your place” attitude does very little to promote a positive culture, or to establish meaning in one’s work.

Organizations are also prone to silos, which are antithetical to a culture that promotes continual workforce development. Engineers, for example, may find themselves working with other engineers without thinking deeply about the business or service application of their work, or about how individuals consume or value the products or services they create. Those shifting dynamics dictate how technical talent might need to change over time and what development options will facilitate that change.

“To create a culture of constant or continual learning, some of the greatest organizations will promote, bring in, share or document consumer reactions to products and services provided,” Rahschulte said. “Being able to see the end result is extremely valuable.

“I might be on the team working to build the camera for the next smartphone. If I’m building that phone it’s great for me to see the value, the enjoyment, the experience the user is having using that feature in connection with the other features. Then I can start getting ideas for the next generation product.”

The Project Is Complete. What Did We Learn?

Innovative ideas will help to inform which new skills, capabilities, and technologies technical talent need to develop or be trained on to create next generation products and services. But before that happens, an organization must figure out how to build learning into the workflow as opposed to bolting it on after the fact.

With that in mind, Rahschulte said many organizations ensure there is a feedback loop embedded in project work. For instance, a company won’t wait until a six-month project ends to ask what it has learned. Instead, the team debriefs every week, even every day, asking: how do we get better for tomorrow? How do we work across silos in engineering, finance, marketing, HR, etc., to learn from one another and better conduct our work?

It’s a leader’s job to embed that retrospective learning into work, and that doesn’t just mean the learning leader; it means all leaders, technical leaders included. To facilitate that learning, some organizations bring in outside experts. It may not cost much to contact someone doing great research at a local university and ask them to come share their findings. Other companies leverage experts who are already inside the organization, tapping into internal experience, knowledge, and capabilities. That could mean providing a mentor for lab work. Or it could mean having staff help teach part of a class.

“Marc Varner, the CISO for Yum! Brands, once said to me, it’s important to realize that we don’t learn from the middle of our capabilities, or our comfort zone,” Rahschulte said. “We learn on the fringe of what we already know. Therefore, in order to get better we have to work at the fringe level. It’s at the fringe that we’re able to see those next generation advancements, innovations, and needs from our customers.”

Newsflash: Big Data, AI, and Machine Learning Aren’t the Same Thing

September 14th, 2018

One of our long-standing pet peeves is the confusion around artificial intelligence. Non-tech folk seem to use certain concepts interchangeably, leaving the impression that artificial intelligence, big data, and machine learning are all the same thing.

They aren’t.

When we’re feeling nice and patient, we can certainly understand this confusion. But we want to clear it up because this confusion, annoying at best, can actually be destructive at its worst. Some companies are correctly touting AI and machine learning, such as self-driving cars from Tesla, Uber, and Google’s Waymo, or software that understands and interacts with human speech. But other companies seem to use the terms in a half-hearted marketing ploy, just to show how cutting edge they are.

Here’s what we want to scream at the tops of our lungs: big data, AI, and machine learning aren’t the same thing! Sure, they’re related and there’s plenty of overlap, but only in the way that sugar, cake, and dessert are all the “same thing”. You may like dessert as a concept, understand how sugar plays an important role, but maybe you prefer ice cream or cookies over cake.

We all love a sweet treat, but we sure aren’t the biggest bakers. But we are tech experts and enthusiasts, so here’s how we clear the confusion on these major concepts:

Artificial intelligence is the concept. Machine learning is one method of attempting to achieve it.

Let’s break this down.

Artificial intelligence is a theory that dates back to at least the advent of computing. As scientists and engineers began to scratch the surface of what’s possible for computing technology, AI became a catch-all phrase for the wonders (and perils) of what a fully computed world could do. Historically seen as the point when machines can simulate the precision and subtlety of human intelligence, this ideal has played out in countless sci-fi movies, like Blade Runner, The Matrix, Ex Machina, and more. But how we actually get there is a matter of debate – we haven’t reached AI yet. We seem to be getting closer, but mistakes and drawbacks are evident at every turn, and the path that will get us there isn’t yet clear.

Machine learning, then, is just one way the world is getting closer to artificial intelligence. Machine learning is a practical application of AI that uses mathematics and statistics. At its most basic, ML is simply computers progressively training on massive sets of data to achieve an outcome, like finding an underlying pattern or deciding to act based on input. A programmer sets up the machine with some initial algorithm, and the computer trains on this set of rules in either a supervised or unsupervised learning environment. (Some industry leaders say the goal of machine learning is for computers to act without being explicitly programmed, but at least for now, most machine learning set-ups require at least some initial human-led programming.)

Your email’s spam filter is a great example of machine learning. Back in the day, filters may have followed a simple rule or two to filter out spam, such as removing any emails that refer to large sums of donated money, African princes, or online pharmacies with miracle weight-loss pills. Today, though, spammers are faster and smarter, so spam filters have to continuously learn what’s spam by looking at built-in metadata – the sender’s address, where the email was sent from, and whether the words inside match other types of email you receive. Machine learning also takes in user input: when you identify some coupon or newsletter as spam, the filter learns from that choice, even though your neighbor or friend may wholeheartedly welcome the same message into his mailbox.
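
To make the idea concrete, here is a toy version of the statistical approach many spam filters build on (a naive Bayes word model); the messages and the smoothing constant are invented for illustration, and real filters also weigh metadata and user feedback:

```python
from collections import Counter
import math

# Tiny made-up training sets standing in for labeled mail.
spam = ["win money now", "cheap pills money back", "claim your prize money"]
ham = ["lunch meeting tomorrow", "project status update", "see you at lunch"]

def word_counts(messages):
    return Counter(w for m in messages for w in m.split())

spam_counts, ham_counts = word_counts(spam), word_counts(ham)

def spam_score(message):
    """Log-likelihood ratio of spam vs. ham per word, with add-one
    smoothing so unseen words don't zero out the probabilities."""
    score = 0.0
    for w in message.split():
        p_spam = (spam_counts[w] + 1) / (sum(spam_counts.values()) + 1000)
        p_ham = (ham_counts[w] + 1) / (sum(ham_counts.values()) + 1000)
        score += math.log(p_spam / p_ham)
    return score  # positive leans spam, negative leans legitimate

print(spam_score("free money"))       # positive: looks like spam
print(spam_score("status of lunch"))  # negative: looks legitimate
```

When you mark a message as spam, a real filter effectively adds its words to the spam side of these counts, which is how the filter adapts to each user.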

Big data is the material fueling AI at large and machine learning specifically. Consider big data the tangible information that allows machine learning to work. As recently as a decade ago, companies didn’t have ways to collect and store enough data to even begin using machines to find unseeable patterns and relationships. Today, though, data is the product. Companies offer services for free in exchange for real data about their users, and more and more interactions have a computerized component – using a debit card or Apple Pay, texting someone, clicking specific links in a news article. All of this data can be collected and fed into machines to find patterns.

But this data is only as useful as the methods used to extract meaning from it. If you’re sitting on heaps of data but your computing and data mining processes aren’t in place, the data is essentially worthless. Used as an ingredient in a machine learning algorithm, however, that same data may start to yield intelligence you didn’t already have.

So why all the buzz about AI these days? Thanks to the explosion of cloud computing, gathering and storing big data is a breeze, and machine learning can take advantage of nearly unlimited computing power, learning at much quicker speeds than ever before. Now that you understand big data, AI, and machine learning separately, you can get other stakeholders on board to start understanding and using the concepts correctly.

How to Make Your Training Vendor a Business Partner

September 10th, 2018

Many organizations contract with vendors to purchase off-the-shelf solutions or custom training engagements. It’s a common practice. Unfortunately, it’s also common for organizations to be only marginally satisfied with the solutions those vendors produce.

According to a 2016 article from Chief Learning Officer magazine, only 25 percent of CLOs are “very satisfied,” and another 43 percent are “satisfied,” with their learning analytics vendor. Analytics is only one small part of the learning landscape, but those numbers suggest there’s room for improvement – if learning leaders know how to cultivate a more beneficial relationship with training vendors.

You might think the most important thing to do is find the right vendor, and you’re not wrong. But the first thing learning leaders should do when looking for a technical training vendor is to conduct an internal needs analysis. That way, they can compare organizational training needs to vendor offerings, and ensure training solutions are suitable for their individual and team needs.

Then it’s time to vet companies. For instance, is the vendor training on the most up-to-date versions of technology? What types of labs do they offer? How do they support various types of learners? Is training primarily hands-on? What sort of pre- and post-learning comes with a solution?

“You want to explore what value-add the vendor can provide,” said Carmel Ulbrick, Director of Customer Success for DevelopIntelligence.

Customization Is Common, But There Are Levels To It

Speaking of value-add, learning leaders should not be swayed by talk of customization during vendor negotiations because Ulbrick said that’s no longer unusual. Instead, explore the level of customization vendors offer. “We can start from the ground up,” she said of DevelopIntelligence. “We don’t have a general template that we build from. We meet with subject matter experts. We get to know intimately how technology is used within the organization, and then we collaborate with the client to draft a relevant outline and learning objectives for a class.”

The level of customization a training request requires should influence vendor choice. For instance, if an organization needs its training to feature very industry-specific language, has extremely targeted technical training needs, or there is proprietary information an external vendor would have no knowledge of or access to, that should ultimately influence vendor selection.

Once learning leaders determine the right vendor to meet their learning needs, Ulbrick said the next step in forming a highly beneficial vendor relationship is to introduce the teams. Vendor and client need a level of comfort. They should know who the players are on both sides and feel confident in their competence, because correspondence may be frequent depending on the level of engagement for the training project.

“DevelopIntelligence has a lot of touch points as we work with the client because we want the relationship to be strong,” Ulbrick explained. “We also want flexibility on both sides so that we can truly collaborate. That has always fostered the best relationships for us.”

You Get What You Pay For

When it comes to vetting potential vendors, too many learning leaders focus on cost more than on the product or service they need, or on how well those solutions align with their learning objectives. Further, Ulbrick says that what appear to be low-cost training solutions are often not the most effective. This type of buying behavior can result in L&D buyer’s remorse.

When vetting potential vendor partners, there are multiple things to consider, but Ulbrick said don’t neglect to evaluate testimonials from clients either for single classes or for large-scale programs. Also, check out the company’s client list. Even small training vendors are worth considering if they’ve worked successfully with enough big, global companies to gain valuable experience and solidify their position in the marketplace.

At the end of the day, however, people are the best barometer for a successful training vendor relationship and for an organization’s training strategy in general. Ulbrick said the benefits of great training speak for themselves: decreased attrition and increased retention because employees know there’s a career development path available; an external reputation as an organization that cares enough to develop the brightest talent in the industry; and upskilling or reskilling that decreases costs related to talent acquisition and ramping up new hires.

The cost will obviously be a consideration when sourcing a training vendor. Ulbrick said in that context, it pays to be wary of additional charges. For instance, DevelopIntelligence doesn’t charge for program design, which you may see as a line item in proposals for other training vendors. It might be called a program design fee, or a vendor will add in consultation hours. “It doesn’t make sense for us to charge for something that we see as an integral part of the program design,” she explained. “It’s important for us to truly understand the scope of a project and the organization’s needs.”

Your Vendor Should Be Your Business Partner

Once you’ve done the pre-work to determine what kind of training you need, identified an appropriate vendor, negotiated fees, and signed the contract, you have to nurture the relationship. There should be a smooth handoff from the representative who made the sale to the team that will support successful training delivery. Members from client and vendor teams should be introduced, and their roles explained, so everyone knows who to reach out to depending on their need.

Ulbrick said vendors should run through a timeline of what the client can expect in the weeks or months leading up to class delivery, and they should discuss what the post-learning state looks like. Follow-up is very important, as tools that extend learning for students post-engagement do a lot to ensure successful knowledge transfer. Depending on the program, follow-up might include labs and exercises, or what DevelopIntelligence calls post-learning milestones. Another option is a curated list of online, self-paced training or learning guides students can consume to build on the training they received in class.

Your training vendor should also conduct a formal debrief to evaluate the training engagement in retrospect. DevelopIntelligence, for instance, reviews evaluations and student feedback to determine whether training on additional topics might be necessary to support the recently completed class.

And of course, as you wrap things up, consider the analytics. After all, you may have to prove a positive training ROI. It helps to consider analytics at the end as well as at the beginning of your technical training engagement.

“Ask what metrics they measure their success on and what concessions they have if those are not met,” Ulbrick advised. “You want to stand by your service. If a training vendor balks at that, that’s a major red flag. For me that would be a deal breaker.”

Learning, Partner with Talent Acquisition. Your Company Will Thank You

September 8th, 2018

Training and talent acquisition seem like a logical pairing. One hand neatly washes the other when it comes to preparing top talent to perform on the job. Unfortunately, not all companies practice this integration well, and when it breaks down? Well, let’s just say silos are the least of your worries.

Dave DeFilippo, chief people and learning officer for Suffolk Construction, believes it’s critically important for learning leaders to partner with the talent acquisition function to build organizational capability. In fact, he said that while the employee lifecycle starts with sourcing, acquiring, and onboarding the best talent, there’s a key handoff to the learning function, which then develops people for their technical or functional roles.

“The risk is that you’re in a few silos,” DeFilippo said. Silos prohibit these two interconnected functions from working together effectively. “The name of the game is, hire the best, acculturate them to the organization, and get them up to speed. The underlying assumption, especially for technical talent, is that they bring some skills to the table; but you want to put employees in a system where they can get up to speed as quickly as possible.”

Develop the Perfect Relationship

DeFillippo said to avoid silos and prevent stopgaps in cross-departmental relations, the talent acquisition and learning teams must work in concert. “If it’s integrated you’re doing it strategically. Whereas if you’re silo’d, it’s transactional,” he explained. “Technology moves so quickly, savvy organizations have those functions fully developed so they can continually upskill talent.”

If learning and talent acquisition partner effectively, they’ve likely agreed in advance to operate within a specific HR operating model or human capital practice. This model or practice clarifies the importance of human capital in the organization, defines the different components of the employee lifecycle and provides structure for business-centric workforce planning.

Everything, whether it’s learning or talent acquisition related, begins and ends with the business, DeFilippo said. But the best sourcing and onboarding plans integrate cultural assimilation with more tactical items so that people quickly understand how to get things done – before the learning function takes the wheel to ensure talent can perform as expected.

Essentially, leaders should be able to fill any gaps that might inhibit employee performance on the job. “Think about what roles align with core competencies that differentiate the business,” DeFilippo explained. “Take an estimator, for instance. In our business, an estimator is really important because if you estimate the project wrong, you lose money. If you know it takes two to three years to build that capability, plan for the future. Workforce planning segments roles that are current and future focused. Then there should be a clear handoff to the learning function.”

Ensure the Handoff

That smooth handoff from talent acquisition to learning will only happen if there is an effective partnership between the two functions. For instance, leaders in both camps should know: when the organization acquires new talent, at what point does the learning organization take ownership of that person, in combination with their direct talent manager? Do employees know what learning resources are available to them during onboarding?

When it comes to compliance training, for example, that quick, early handoff ensures there’s no ambiguity. “You decide what learning content is important to deliver at a certain point in the employee lifecycle,” DeFilippo said. “For technical talent, a role-based curriculum can neatly define what positions require what learning.”

The partnership between talent acquisition and learning is also immediately advantageous because technical talent like developers and engineers have to be upskilled so often; if you start with the wrong person, the entire process will be flawed. “You have to learn and relearn because technology is going to change on a regular basis,” DeFilippo explained.

“Selecting for aptitude to learn becomes highly relevant. Through the talent acquisition process you can use behavioral interview questions or assessments to check people’s learning aptitude: ‘Tell me about a time you had to learn a new system or technology process to do your job.’ Have them play back for you when they had to make it real.”

Finally, DeFilippo said the partnership between learning and talent acquisition requires that each leader facilitate the other’s work. Acquiring talent in a competitive market – such as advanced technical skills in areas far from Silicon Valley – is so challenging that one hand – or one function – really does need to wash the other.

“For example, when we go on campus to recruit, individuals want to know as part of signing up with a company in a good labor market, how are they going to be developed? Our talent acquisition people need to understand the development process, the curriculum, and the opportunities,” DeFilippo said. “We partner with the talent acquisition people because they need to be educated so they can sell the job.”

In that way, talent acquisition leaders become learning advocates for the external market, and learning works as a branding tool. In the end, the goal is to get the best talent. But how leaders cultivate the relationship between talent acquisition and learning, how they build that partnership, could make all the difference.

About the Author:

Addressing the Gaps in the Professional Development of Instructional Designers

August 14th, 2018

When you think of the learning and development field, training is the first thing that might come to mind. However, in a field where the professional development of others is the priority, many learning professionals find that their own professional development is not. An article published by the University Professional and Continuing Education Association (UPCEA), New Study Delves Into Unexplored Professional Development Benchmarking for Instructional Design and Technology Teams, shared key findings of a study conducted by the UPCEA eDesign Collaborative. According to the post, the 2017 study, titled Instructional Design and Technology Teams: Work Experiences and Professional Development, “establishes benchmarks for experience and salaries among instructional design team members, and also reveals the professional development priorities.” Prior to this study, there had been no studies benchmarking these areas for learning and development teams.

As an Instructional Designer, I have first-hand insight into my own professional development and that of many of my peers. But before we dive into the key findings of the study, we need to define two key roles found on learning and development tech teams: Instructional Designer (ID) and Instructional Designer Developer (ID Developer).

An ID is a person who completes the analysis and design of a learning solution, providing the blueprint for its development. An ID Developer is a person who builds or programs learning experiences based on that blueprint using an authoring tool such as MS Word, PowerPoint, Articulate Storyline, Adobe Captivate, or Trivantis Lectora. These platforms give the ID Developer the ability to customize course builds further by using advanced features and JavaScript code snippets.
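To make that last point concrete, here is a minimal sketch of the kind of JavaScript snippet an ID Developer might attach to a trigger in Articulate Storyline, using its documented GetPlayer()/GetVar/SetVar pattern to personalize a slide. The variable names (LearnerName, WelcomeText) are illustrative assumptions, and the GetPlayer() stub below merely stands in for the real player object so the sketch can run outside a published course:

```javascript
// Stub for illustration only: inside a published Storyline course,
// GetPlayer() is supplied by the player itself.
function GetPlayer() {
  const vars = { LearnerName: "Ada", WelcomeText: "" };
  return {
    GetVar: (name) => vars[name],
    SetVar: (name, value) => { vars[name] = value; },
  };
}

const player = GetPlayer();
const name = player.GetVar("LearnerName");     // read a course variable
const today = new Date().toLocaleDateString(); // standard browser/Node API

// Write a personalized greeting back into a course variable,
// which a text box on the slide could then display.
player.SetVar("WelcomeText", `Welcome back, ${name}! Today is ${today}.`);
console.log(player.GetVar("WelcomeText"));
```

Even a few lines like these let an ID Developer go beyond what the authoring tool’s built-in triggers offer, which is exactly the kind of skill the professional development discussed below needs to keep current.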

It should be noted that many IDs in the industry serve as both ID and ID Developer, providing an organization with end-to-end services – from Analysis through Implementation – and many times also taking part in the evaluation phase. I should mention that many organizations do not invest the time and money to engage in a formal course evaluation; according to Evaluating Learning: Getting to Measurements That Matter, only “35 percent of 199 talent development professionals surveyed reported their organizations evaluated the business results of learning programs to any extent.” This means many IDs never have the opportunity to engage in this important instructional design phase – but I digress. For this post, we will group these two roles into the ID bucket, and you may see me refer to them as the ID tech team or tech team.

Today, many IDs face challenges as they begin to build learning solutions, such as heavier workloads, SME management, and demand for the integration of emerging technologies like Augmented Reality (AR) and Virtual Reality (VR). Many organizations are raising their expectations as to the types of interactions, features, functions, and technologies they expect to be integrated into a learning solution. However, these same organizations are not providing their IDs with the professional development support they need to upgrade their skills and deliver products that meet those expectations. We are not alone in this struggle: Develop Intelligence, creators of Developer Academy, published the results of their 2017 Software Developer survey, revealing that “as the technology industry changes, engineers and companies are continually challenged to keep pace,” and citing that for organizations to maintain a competitive edge, software developers, research & development (R&D), and learning & development (L&D) must all be aligned.

Unfortunately, IDs tend to engage in sparse professional development. This leaves many IDs scavenging for opportunities to keep a finger on the pulse of emerging trends, and to develop enough knowledge to at least design for the inclusion of emerging technologies.

According to the UPCEA study,

  • When looking at future professional development opportunities, free webinars and in-person conferences were the most sought-after development opportunities for both team leaders and team members. Team members were much more likely to seek paid webinars and courses associated with a degree program than leaders.
  • Almost two-thirds of respondents (both team leaders and team members) indicated that the frequency of professional development within their organization is determined more by the cost than the number of opportunities.

I wanted to dig a little deeper, so I created my own survey and asked all the IDs in my professional learning network on Twitter, LinkedIn and Facebook, who work across a number of industries, to participate and share their experiences and thoughts on their own professional development. Over a one-week period, 93 IDs responded. Not surprisingly, I received many messages from IDs telling me how ironic it seems that they don’t really think about their own professional development. In fact, the average ID spends about 10 hours a month of their own time on their own professional development. According to Develop Intelligence, the average software developer spends about seven hours a week – roughly 28 hours a month – of their own time learning new skills to do their job. This clearly signals that organizations need to provide on-the-job professional development opportunities, which can help reduce attrition and improve overall job satisfaction.

Back to my survey: I began by asking IDs about their experiences within their organizations, and some interesting insights emerged. For example, 43% indicated that their organizations offer regular professional development; however, based on the complete data story, it’s not focused on helping IDs learn the new technologies they need to do their jobs effectively.

I was curious as to how IDs keep up with current trends in the ID space, such as augmented reality, virtual reality, and machine learning. My survey results show that 94% turn to digital media and 93% use social media to learn about industry trends.

However, when asked what type of professional development media they preferred, 73% stated they preferred self-directed, self-paced learning.

The survey also found that only 33% chose local/national conferences. Anecdotally, I discovered that the cost of conferences prohibits many from attending: registration for a national conference (e.g., DevLearn, ATD) averages $1,695. A large number of IDs believe conferences are the most difficult form of professional development to get employer sponsorship for. My survey results also showed that 75% of IDs complete their professional development outside of business hours, citing workloads, time, and money as the big barriers to professional development in general.

As part of my research, I also took to Twitter to find out what IDs in my professional learning network thought would change – or needed to change – in ID tech team training in 2019. It seems that most of the IDs in my network didn’t believe that anything would change in the way organizations train ID Developers.

As the Learning & Development industry evolves with new technologies, methodologies, and metrics capabilities, there is a critical need for the roles and skill levels of ID tech teams to evolve as well. The question is, what do organizations need to do to ensure their existing tech teams are able to stay up to date with this evolution? The Bureau of Labor Statistics expects 11% job growth for IDs between 2016 and 2026, stating, “Training and development specialists will need to modify their programs in order to fit a new generation of workers for whom technology is a part of daily life and work.” In my survey, I asked all IDs an open-ended question: “If you could change how your organization trains or provides professional development for instructional designers, what would you change?” The 12 answers below sum up the theme of the responses I received:

  1. “Make it a priority to invest in at least one course/conference a year.”
  2. “Encourage all of us to upskill. Make time at work to share knowledge”
  3. “Provide more time and resource to them [professional development]”
  4. “Acknowledge that we don’t know everything and have gaps as a L&D function. Provide more resources and info on latest trends. Give professionals more time built into our week for development.”
  5. “A weekly boot camp to learn new technologies and trends and how they fit with business needs would be awesome. More and more project-related experimentation to test new ways and new technologies to develop learning would also help”
  6. “Provide training at different levels—beginning, intermediate, expert—and make it clear which training materials support which level.”
  7. “I would add internal ID certification track”
  8. “Allotted professional development days or budget for professional development”
  9. “Stop the culture of order taking and become consultative business partners instead.”
  10. “I wish there was a bit more guidance on what professional development options are out there. I’m happy to look for opportunities myself, but it would also be nice if there were more curated recommendations from the organization as well.”
  11. “That we be given the same ability to learn and grow as we give all the rest of the employees – we are designing and developing their learning experiences but nothing is designed or developed for us.”
  12. “Speak the language of the value of people development. Invest in the growth and development of their people. See the value of those investments, not just in the skill building of their employees and adding those skills to THEIR business, but in the ensuing outcomes in the value to their PEOPLE – more confident, more satisfied, feeling invested in, more committed to their work, etc.”

We all know that professional development initiatives can be expensive. However, it’s crucial that organizations invest in upskilling and reskilling their ID tech teams, ensuring they can apply innovative training methodologies and technologies as they move toward cutting-edge solutions.

Here are five strategies every organization can implement today:

  1. Set time aside for IDs to engage in their own professional development. Giving an ID at least 2 hours a week allows them to attend free webinars, watch industry-specific recordings, attend discussion sessions like TLDCast, and experiment with what they’ve learned. $
  2. Partner with vendors – If your organization is looking to use or currently using a vendor platform (e.g., Articulate Storyline, GoAnimate, Camtasia), ask the vendor to come onsite for a demo or workshop. $
  3. Sponsor ID memberships in local and/or national Learning & Development associations like ATD and The eLearning Guild. These organizations provide live local events, free webinars, blogs, articles, and more that will help your IDs stay current. $$$
  4. Offer monthly ID workshops by inviting internal experts and experts in the local community to come in and facilitate a workshop on trending topics and general technology topics; these can be put together on a very low budget. $$
  5. Sponsor a local Learning & Development conference – by sponsoring a local conference offered by organizations like ATD local chapters, organizations can do double duty, providing professional development opportunities for their IDs while marketing their business to a local audience. $$$$$

About the Author:

The Bay Area’s Problem: Attracting and Retaining Tech Talent

August 1st, 2018

In what is essentially a seller’s market, software engineers can pick and choose their jobs from a wide range of options. Working for FAANG seems too bureaucratic? There are tons of startups for you. Startups too unpredictable for your taste? Take advantage of a plethora of established companies like Walmart. These options are available despite the large number of engineers located in the Bay Area – with more moving in, applying for remote work, or coming up through nearby universities.

These engineers often have different backgrounds, but they are still driven by many of the same motivations: a fascination with technology (especially new technology), an aspiration to grow in their careers, and a desire to find new and creative uses for technology. Companies that want to attract and retain talent must learn how to leverage those motivations. Here are some ways Bay Area organizations are doing just that.

Engineers do what they do because they like technology. They like it because it allows them to do things that are otherwise impossible, like working to improve the accuracy of GPS tracking, coordinating the mass capture of digital monsters, or organizing the world’s knowledge into a free encyclopedia. This same drive often prompts engineers to find freelance or pro bono work to help scratch that creative itch. It’s why they spend their free time helping to roll out the next Linux patch, or tinkering to do wonders with a Raspberry Pi.

Many companies in the Bay Area tap into this desire by providing their engineers with extensive resources, which may be as simple as a lending library of reference books, or as large scale as subscriptions to training resources and on-site classes. The exact approach depends on the company’s culture and resources, but these resources give engineers the technical support they need to find new ways to use technology. As a bonus, they also give engineers extensive exposure to new ideas, which can have a direct impact on their work.

Being able to play with the latest technology is another way companies tap into their engineers’ drive. It connects with their natural affection for tech, and helps expose them to new ideas. Learning the newest programming language may feel like a waste of time – especially if there’s no intention to use it at your company – but learning new methodologies or code can help kick-start new ideas. That’s why it’s important to give engineers the space to play with a variety of tools during slow moments to keep them engaged. Consider that Python was born over a Christmas holiday.

In addition to providing the technical resources like books, guides and training that engineers crave, Bay Area companies are also starting to provide inspirational resources. Guest speakers, whether they’re internal employees who want to practice presentation skills or world-renowned experts, help spark inspiration. This cross-pollination of ideas helps kick-start creativity, and the availability of these experiences gives engineers even more reasons to stick around. As word gets around, these speakers also help attract future talent.

Many companies send engineers to talks or conferences, but some, like Airbnb, go a step further and send employees to business-related events as well. These experiences help engineers disconnect from work stress, promote work/life balance, and help drive their inspirational motivations when they get back to the job. This motivation can be infectious, spreading from the individual conference attendees to others on their team who want to know more.

Developing a culture that fosters an engineer’s creativity, and giving engineers space to explore new code and technology gives a company a lot of staying power for technical talent. One engineer I spoke with told me “…I prefer to work at a company where if I fail, I still have the confidence that I can come back and try again.” Nurturing this natural tendency, and giving engineers a safe place to explore, will help spread a company’s image as one where engineers can thrive.

Building that image both encourages existing employees to stick around and attracts new talent. This resilience is incredibly important, as organizations stand to spend up to three-quarters of an employee’s salary should they need to find a replacement. In addition to the monetary loss, there is also a hit to reputation, as others wonder why someone would leave after joining the company. In locations like the Bay Area, where competition is fierce, this is amplified even further.

At the end of the day, engineers just want to find creative solutions to problems. To attract and retain the best talent, organizations need to recognize this driving motivation, and build programs and policies that support those natural tendencies. What this looks like will differ based on factors such as your culture, but in the end it comes down to the same thing: inspire their creativity, and feed it with whatever you have available.

About the Author:

A Punch to the Mouth: What’s the Worst that Could Happen?

July 30th, 2018

As learning professionals, we’re all familiar with the likes of Kirkpatrick and Bloom when it comes to assessment, application, and objectives. Learning and development is all about being able to deliver information in an engaging and efficient way that can be measured and reported on to prove a return on the company’s investment in learning. That’s the ultimate goal, right? Not quite.

Learning and development is more than just meeting an objective to sell more widgets or troubleshoot a problem at a faster pace. Learning and development is about changing behavior to match what the company expects. The problem is that what the company expects isn’t always the way things work. Mike Tyson once said, “Everyone has a plan until they get punched in the mouth,” and I believe that to be one of the most accurate statements I’ve ever heard.

So, that begs the question – how can we punch our learners in the mouth more often? By that I mean, how can we make learning more impactful in order to prepare learners to think critically about a problem, rather than teaching them to handle every situation the same way? A study conducted by the Delft University of Technology demonstrates that learning and development is typically constructed in a way that makes it safe for the learner to pass or fail. There is usually content, knowledge checks, more content, and a final assessment. In the case of software training, there’s often a simulation that takes a user down a specifically planned path deemed the “best practice” method for accomplishing a task. Beyond that, there’s very little based on real-world application.

My main focus in the learning and development world is sales training, but this concept can be applied across disciplines. When I set out to test a salesperson’s knowledge of a product, I employ a technique known as video role play: a scenario is presented to the learner, who is asked to respond by recording a video of himself or herself – whether it’s a value pitch, handling objections, or any other soft skill worth assessing. These videos are graded by managers or subject matter experts using a defined set of criteria given to the learner up front. The same goes for software simulations – ask your learner to perform a system-based task and score their clicks as they walk through the system.
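As a purely hypothetical illustration of grading against a defined set of criteria, here is a small JavaScript sketch; the criterion names and weights are invented for the example, not an actual grading standard:

```javascript
// Illustrative rubric: each criterion has a weight, and the weights sum to 1.
const criteria = [
  { name: "Opens with a clear value pitch", weight: 0.4 },
  { name: "Handles the stated objection",   weight: 0.4 },
  { name: "Closes with a next step",        weight: 0.2 },
];

// A grader rates each criterion from 0 to 5; the weighted
// average of those ratings becomes the learner's overall score.
function scoreSubmission(ratings) {
  return criteria.reduce(
    (total, c, i) => total + c.weight * ratings[i], 0);
}

const overall = scoreSubmission([5, 4, 3]); // one manager's ratings
console.log(overall.toFixed(1)); // 0.4*5 + 0.4*4 + 0.2*3 = 4.2
```

Sharing the criteria (and weights) with learners up front, as described above, is what keeps this kind of scoring fair rather than arbitrary.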

What we’re testing is whether the learner is able to regurgitate facts or follow directions – robots can do these things, especially as AI continues to evolve. What we need to be testing our learners on is their ability to think critically in the moment of need – we need to punch them in the mouth.

In a sales pitch, how do you handle a situation where you walk into a customer’s office and he or she is angry about something and threatening to cancel your product or service? What happens to your planned agenda for that meeting? On the software side, what happens when you troubleshoot an issue, things aren’t responding as you expect, and you have an unhappy customer on the other end of the line? It’s those types of situations on which we need to be testing our learners. We can teach them the best practice, but we need to prepare them for the worst-case scenario.

By preparing our learners for the worst-case scenario, we’re giving them the knowledge of what should happen while asking them to think outside the box to problem solve. Learning and development isn’t about building robots to complete the same task over and over again. Learning and development is about teaching people to think and act like people, teaching them to respond in the moment and take charge of a situation. The healthcare industry is one that routinely requires its staff to take part in “worst-case scenario” training – and I, for one, am glad they do so. Think about it – when an ambulance comes whirring into the emergency lane at a hospital with a patient only moments away from taking his or her last breath, there’s no time to second guess what you’re doing or look something up in a reference manual. Nurses have to react and be able to problem solve in the moment – that’s the vital importance of this type of training.

There will certainly be situations that arise where your learners are in over their heads, and that’s something that needs to be taught as well – where’s the ripcord? If you can put your learners into a pressure-cooker situation and press them until they break, then they will learn their limits and know when to ask for help. If they don’t learn to ask for help, then they will ultimately continue to fail – demonstrating an inability to learn. This is the same reasoning that puts pilots through crash simulations. What would you do if the plane was going down and 200 or more lives depended on your ability to think critically and solve a problem? Fortunately, not many of us are in that situation, but the point remains the same. No two situations will ever be the same, and it is impossible to train for every scenario. However, the more you prepare your learners to handle the worst-case scenario, the more prepared they will be to think outside the box, solve problems for your customers, and provide a better experience for everyone involved. It’s like the old saying goes – what’s the worst that could happen?

About the Author:

How to Secure Budget for Technical Training

July 22nd, 2018

Organizational resources can be tough to get. Companies have become exceptionally tight-fisted and understandably cautious about how they spend money – even if what they’re spending it on has value for the business.

Technical training certainly has business value, but that value proposition can be difficult to articulate when it comes time to secure funding for new initiatives. Adri Maisonet-Morales, Vice President of Enterprise Learning and Development for Blue University at Blue Cross Blue Shield North Carolina, has learned how to successfully move from the “ask” to the actual funding. She has some concrete advice on how learning leaders can make the business case for technical training dollars and win budget not once, but many times.

What’s the best way to ask leaders for resources – time and money – to procure technical training? How do you prepare before you ask?

First, you really need to understand and be able to articulate the problem you’re trying to solve and the perceived value the solution is going to offer. In my experience, when people approach leaders or their business partners with anecdotal information, their business case falls flat very quickly. Instead, it will help to take a few ironclad steps.

One, take the time to really understand the problem that you are trying to solve, and be sure the problem can be addressed through training. This point is super important because sometimes we throw training at things inappropriately. The risk here is that the training won’t yield the right outcomes, and that may jeopardize your chances of getting funding in the future.

Two, do your homework. Be prepared to defend the technical training by explaining any and all alternatives that you’ve considered along with the associated pros and cons.

Three, enlist support and buy-in from key influencers and stakeholders. This step goes a long way to demonstrate both demand and engagement from the workforce.  

Finally, understand how you will measure the impact, the outcomes, and the return on investment in terms that really matter to the business.

Are there any best practices or strategies that increase the likelihood that you’ll get a yes when making a budget request?

In many cases, it comes down to timing and presentation. Be attuned to what is going on in the organization, and plan your ask for when it won’t be perceived as a low-priority distraction. Additionally, when requesting funding, highlight the organizational benefits, and tell the evidence-based “why we need this training” story using credible sources to make it practical.

If the training is for emerging technology, consider a proof of concept through a pilot or experiment. This is a great low-risk “happy medium” that many vendors willingly support. Whether you approach the ask with a full solution or an experiment, make a point to highlight the importance of upskilling the workforce for the many industry-agnostic disruptions that can impede organizational performance.

Finally, be transparent. Do not oversell the benefits, but do be prepared to discuss the pros and the cons using practical examples, and emphasize what will happen if this ask is not honored. In other words, ‘if I don’t get these resources,’ or, ‘if we don’t do this training, this is what it looks like for the organization.’ Paint a compelling picture that will influence the desired outcome.  

A lot of times technical training leaders have issues because they’re trying to secure training funds for emerging technologies, and there is no proof, or proof of success is limited. How should they approach those budget requests?

I would say, stay well informed on those emerging technologies, and be clear about how those technologies may add value to the organization. You need to be able to speak intelligently on: How does that value play out in your organization? How will it improve the workforce? Will it increase competitiveness or agility? Will it create a new and useful capability? What is the likelihood of adoption? What change efforts should be considered? Who are my champions? Who are my detractors?

To your point, there are a lot of technologies emerging right now that aren’t tested, and you need to be careful not to chase everything just because it seems sexy or cool to do. You really need to be informed about your learner preferences, the organizational mindset, and value against cost. Again, one way to dip your big toe into new technologies and secure funding is to experiment with small, straightforward initiatives that can be efficiently executed and evaluated.

Let’s say you’ve done all of those things, and it worked. How do you make the best use of your allotted training budget or resources, so that you can ask for money again, and get it?

Document, track, and communicate with transparency. I can’t emphasize that enough. Make sure that you stay well connected to your sponsors, and that they understand every step of the way how that money is being spent and how the budget is performing. In other words, do what you say, and say what you mean.

It is imperative that you be a good steward of the funding, so take care not to squander what you’ve been given. That’s why it’s important to do your homework before you make a funding request. Demonstrating that you can manage funding within the scope of an initiative goes a long way, so if you’re not sure how to budget for technical training, seek help from someone with that expertise.

Last but not least, if you can deliver the training for less than expected, demonstrate the cost savings, and return what’s left over. Then, celebrate with humility, tell the value story, share the credit with your partners, and accept accountability for any areas that need improvement.  

What more can you tell me about how to acquire and properly spend technical training dollars?

Don’t make the ask so complicated that your sponsors shy away, nor should you oversimplify it to the point where they miss the value or urgency. Remember, the first step is to verify that there is a problem that technical training can solve, and that you actually need the time, the money, and the resources.

Again, some of these technologies are emerging, so be clear what success looks like to the business. This will require that you have a quantifiable measurement and evaluation strategy at the ready. To ensure that your sponsor sees the value in the technical training, you should also have a plan that reinforces the learning once the training initiative ends. This approach will provide powerful insights into positive changes that can be made to align to key performance indicators.