A Simpler Path to Learning Evaluation with Dr. Alaina Szlachta

Episode Overview

Welcome to another episode of "Making Better" with your host, Matt Gjertsen. In today's episode titled "A Simpler Path to Learning Evaluation," we have a special guest joining us, Dr. Alaina Szlachta. Dr. Szlachta is an expert in measurement and evaluation in the learning and development field. She shares insights on the challenges of measuring and evaluating learning initiatives, the importance of a consulting relationship with business leaders, and practical strategies for incorporating measurement and evaluation into content development. Stay tuned for an engaging conversation that will provide you with valuable takeaways to improve your own learning evaluation processes.

About Dr. Alaina Szlachta

Dr. Alaina Szlachta is an expert in data-enablement systems for small businesses to show outcomes and impact of their learning programs. She is the founder and creator of Measurement Made Easy, a community of practice which uses micro-learning, mentorship, and real case studies to create simpler solutions to common measurement and evaluation challenges. Alaina is also the author of the upcoming ATD Press book "Measurement and Evaluation on a Shoestring," a resource written to make Measurement and Evaluation easier and more accessible for anyone - regardless of their expertise.

Full Transcript

  • Dr. Alaina Szlachta [00:00:00]:

    Just figure out what metrics you need using critical thinking and common sense and that that will take you a long way. I think sometimes we lead with what are my metrics? But we don't think about, well, how is my training designed to change performance? And what does that performance look like objectively? Well, if you can clarify that, you can basically pull a metric from out of that.

    Matt Gjertsen [00:00:24]:

    Hello and welcome to the Making Better podcast, where we talk about how to make organizations, teams, and individuals better. If you are a business owner, a learning and development professional, a manager, or even an individual contributor in your organization, this show will give you actionable insights to help improve your own performance and the performance of those around you. As our guest today, we have Dr. Alaina Szlachta, an expert in data-enablement systems for small businesses to show outcomes and impact of their learning programs. She's the founder and creator of Measurement Made Easy, a community of practice which uses micro-learning, mentorship, and real case studies to create simpler solutions to common measurement and evaluation challenges. Alaina is also the author of the upcoming ATD Press book "Measurement and Evaluation on a Shoestring," a resource written to make measurement and evaluation easier and more accessible for anyone, regardless of their expertise. Before we get into the discussion, I need to remind anybody new to the show to make sure to hit subscribe so that you never miss a future episode. And if you're already a subscriber, then I simply ask you to please share this show with at least one other person, because that is how we grow. I cannot tell you how much it means to me. With that, let's get started. Alaina, how are you doing today?

    Dr. Alaina Szlachta [00:01:47]:

    I am doing wonderful. I kicked off my morning with another wonderful conversation on LinkedIn Audio, which, if you haven't tried that, it's quite fun, actually. And so here we are having another fantastic conversation. So I couldn't be happier.

    Matt Gjertsen [00:02:02]:

    Yeah, I think this is going to be great. We met at a recent ATD Core Four conference, where you were presenting on exactly what we're going to talk about today. I think measurement and evaluation is always such a huge part of the discussion around learning and development because, quite frankly, it's forgotten a lot or just done poorly a lot. So to kick things off, I gave a little bit of an intro for you. How did you wind up in this space of focusing on measurement and evaluation?

    Dr. Alaina Szlachta [00:02:34]:

    So there's a few different takes on this, but in the world of business development, the coaches and gurus say that when you go into business, you ideally want to pair something that comes intuitively to you with something you're passionate about, with something that people need and are willing to pay for. So if I lean into what was intuitive for me, it goes back to the days when I was in high school. My mom was going to college and finishing her bachelor's degree, and I was tutoring her as she was taking some basic math classes, just to get those boxes checked as part of her overall fundamentals. Then I went and worked for a marketing company after getting my bachelor's degree, and everything came down to polling reports and working with numbers. Then fast forward: I took five years to get my PhD, and during those years I worked in the public health sector, was doing tons of research, had amazing mentors, and everything came down to numbers and metrics. Are we being effective, and are we using data to tell the impact story? Because our funders, our supervisors, everybody needed that information. And so I was basically lucky, when I look back on it, that the environment I was in from high school all the way to today was an environment that really pushed me to get comfortable with the measurement and evaluation piece of the puzzle. And then I'll say, Matt, you've heard me say this in the workshop that I did at Core Four. I say it all the time. But when I left academia, and I left the nonprofit world where grants were everything, and I left public health and came to corporate, nobody was measuring much of anything, and my supervisor was pushing NPS scores as the metric. That was the end all, be all for our whole department. And I was like, but there's so much more that we could be doing that would actually be way more helpful for us as a department to know if we're being effective. Well, that wasn't met with much openness. And that began my realization that in the corporate sector, there's a lot of need for support around the measurement and evaluation conversation.

    Matt Gjertsen [00:04:52]:

    Yeah, there certainly is. And I think that was one of the first questions I had, is like, so you've lived in these multiple worlds, some much more into evaluation than others. And like I said earlier, the entire time I've been in the space, this has been a topic. Why do you think we find it so hard, particularly in corporate learning development? Why is measurement so hard?

    Dr. Alaina Szlachta [00:05:17]:

    So I have a couple of answers for this, and the first one is really important. I mentioned that I was fortunate enough to be in an environment for many, many years that really supported me in building confidence and comfort, in trial and error, getting messy, rolling up my sleeves and being like, how do we evaluate this? I didn't have a million classes in my undergraduate or my graduate program that taught me to do measurement and evaluation. And I think most of us that have advanced degrees don't get any education around measurement and evaluation. Maybe, if you're lucky, one class. I know that's changing, but no one's really getting any education. So it's the environment. Add to that environment the fact that most of us don't have formal training or education, and that includes our supervisors, that includes leaders of businesses, and then it includes us, individual contributors in L&D. None of us have the expertise or education to even feel comfortable sticking our necks out there and going, hey, what if we did this? Or what if we did that? Because nobody wants to be…

    Matt Gjertsen [00:06:26]:

    Yeah.

    Dr. Alaina Szlachta [00:06:26]:

    So the environment has, first and foremost, a common lack of expertise and education around measuring and evaluating. It's not just you, listener, or you, Matt, or me in the past. It's all of us. We don't get enough structured education to practice measuring and evaluating and learn along the way. And then in corporate in particular, the environment is one where typically only about 50% of business leaders are supportive of the L&D department when it comes to giving them things like resources or continuing education so that people can better measure and evaluate. There's also often a lack of partnership between learning teams and learning leaders and business leaders. Unlike in marketing or product or customer success, the partnership is just often not there, and there's a lot of research that supports that. And then in L&D in particular, we're always crunched on time and resources. Anytime I go and speak, I like to ask people, what are your biggest challenges? And I'm never usually surprised. I hear the same thing over and over again: lack of time, lack of resources, lack of support. And then just, I don't know what I'm doing. So, back to that education piece. Imagine trying to be successful at measuring and evaluating learning when you're in that environment. No partnership, no support, no mentorship, no expertise, limited time, money, and resources. Who would do any measurement and evaluation in those circumstances? And I can tell you that it's that way today, and it was that way 60 years ago. This is the environment that we, fortunately or unfortunately, are in. And so I think that's a really important piece: it's not for lack of interest, it's not for lack of passion. There's lots of us who are passionate about measurement and evaluation. It's not for lack of wanting to know the outcomes and effectiveness of our work. We all want to know that. It's not that people don't care. It's that our environment makes it feel like we're pushing a giant rock up a mountain. And that is why I think it's been a challenge and a conversation for ages. So that would be my number one answer to why it's so hard.

    Matt Gjertsen [00:08:54]:

    Yeah, I mean, that's really interesting, because that's something I've never thought of before, especially the first point you mentioned around just the general training in this area. It really reminds me of Stephen Levitt, one of the co-authors of Freakonomics. He talks all the time about the lack of statistics training in school and is really doing a lot to try to fix that, because that feels like the root of this: most people graduate high school without any statistics, and like you said, if you get any kind of training in higher ed, that's probably it. You took a semester of statistics, let alone trying to apply it to anything. So that's really interesting. And yeah, I think when you have a world where a lot of businesses just look at L&D as a cost center, as more of an employee benefit, you're rarely thinking, does our health care impact business performance? And if you put learning and development in that bucket, you're just not going to ask those questions.

    Dr. Alaina Szlachta [00:10:06]:

    That's right. So all of us lack a big education in measuring and evaluating, and then you have business leaders who don't have that education either, but who also don't really understand the value of learning and development. Just like you said, it's a cost center. It's a nice-to-have, which is why we all get laid off when economic times get tough. And it's the perspective of the decision maker in the organization that really determines the kind of relationship that L&D has with business leaders. Now, that's not to say that L&D leaders and individual contributors can't do anything about it. We can totally influence the kind of relationship that we have with business leaders. We just have to put our necks out there, be a little uncomfortable, try some things, and at the end of the day, we have to share a perspective. So if you know that your business leader doesn't see learning and development as a value add to the business, that's just as much your fault as it is their fault, because we can do some work to come closer and closer together incrementally. It's not going to happen overnight, but it can definitely happen in time. And we have to do the work to change the environment that we're in. And then I have two more things to say about why this is a problem, but feel free to take us in a different direction if you'd like.

    Matt Gjertsen [00:11:23]:

    Well, I definitely want to hear those, but just quickly, I wanted to ask you, because you mentioned it so directly that we have a part to play in this. How do you start that discussion? If you're a learning person who knows your business doesn't really understand the impact that you can have, that's a very long change to make in any organization. But how do you start that?

    Dr. Alaina Szlachta [00:11:50]:

    Yeah. So what I'm going to share with you actually came out of one of the industry leader talks that I did two weeks ago, and I'm not going to take any credit for this. This is a direct line from my partner in crime in that industry leader talk, Zsolt Olah. He works at Amazon and has been in L&D for over 20 years. Maybe many of you listeners know his name. Here's what he said, because I think it's the right way of thinking about this: L&D is either going to have a relationship with business leaders that's transactional, which is what most of us have. Go do this training, you're a cost center, how can we reduce our costs, et cetera. That's very transactional. The order-taker reality is a transactional relationship. If we want to have a partnership with business leaders, we have to have a consulting relationship, where we are seen as consultants, just like I said, partners in crime with business leaders to help them achieve their goals. And the challenge, back to environment, is that if the decision maker or decision makers in an organization have been around a long time, and maybe you're new to the department, or maybe the department is new in and of itself (I know a lot of businesses will build an L&D department at a certain stage of growth), the leader might have the perspective of a transactional relationship, and you, as the L&D team or individual, inherit that. It's not that you want that transactional relationship, but you're stepping into an environment where that transactional way of being is just part of the culture. It doesn't mean you can't change it. It just means you have to be accountable: this is what I'm dealing with, so how can I make incremental steps toward being seen and acting as more of a consultant than an order taker? And what Zsolt suggested is that you learn to speak the language of the business. What's keeping them up at night? Know the metrics that they look at every single day in the morning. As a business leader, I have my metrics; I have my three things that I do every single day to be a good business leader and move forward. What are those metrics? What are those numbers for the decision makers in your organization? You live and breathe those numbers just as much as they do, and you take the initiative to understand the problems and the people performance that are influencing those metrics. And so it's not what we're taught, back to what I said earlier. We're not taught this in grad school. And a lot of us step into the field without any formal training at all, and we've just learned from whatever organizations we came from and how they did things, without necessarily knowing that there's a different way to operate. So again, a lot of this has to do with environment and context and culture, and literally the space that we step into emotionally, psychologically, and physically. And that has a lot to do with why measurement and evaluation is so hard.

    Matt Gjertsen [00:14:53]:

    Yeah. So that's awesome. Thank you for that. You mentioned there were a couple of other reasons that you had for why this is so hard.

    Dr. Alaina Szlachta [00:15:01]:

    Yeah. So these are a little bit simpler. I always lead with the environment one because I think it's the most important, and because if we really work to be accountable for our environments, I think we could make a lot of headway in improving our approach to measurement and evaluation, and also not feel so bad about being bad at it. The other things are things we can easily learn and change rather quickly. So one is the understanding of metrics and models. Again, back to environment: in our industry, I don't know why, but it's this common conversation that there's the Kirkpatrick model and we have to have these four levels, and then there's ROI, and really, that's it. We either want to show the four levels of evaluation, or we want to show ROI, and there's nothing else out there that we could use to approach measurement and evaluation. The four levels are just a concept. I literally had a conversation with someone in my community of practice who was like, how do I make that a level three metric? And I was like, why do you care about level three? What are you trying to do? You're trying to show that your learning program changed the performance of your learners? That has nothing to do with level three. Level three is just a construct that helps us organize different approaches to measuring stuff. So with the whole metrics-and-models question, just figure out what metrics you need using critical thinking and common sense, and that will take you a long way. I think sometimes we lead with, what are my metrics? But we don't think about, well, how is my training designed to change performance, and what does that performance look like objectively? If you can clarify that, you can basically pull a metric out of that desired change. And all of that is just critical thinking.

    Matt Gjertsen [00:16:57]:

    Yeah, and I think also, to go back to the last discussion and what you were mentioning about getting to know the business, it's getting to know your stakeholders and the people that are saying yes and no to this stuff. What do they care about? You might think that the perfect level four evaluation is X, but if they care about Y, Y is what pays the bills, Y is what gets you more resources. Once you figure out what they care about, there's no real need to fit that into some other framework. You have your answer.

    Dr. Alaina Szlachta [00:17:35]:

    That's right. So if we back up: environment is the biggest reason why measuring and evaluating has been so difficult for so long. Two is that we get stuck in these preexisting concepts. We pick our metrics first and then get in the weeds with that, or we try to use a model and then try to apply it, and it just doesn't really work. And then we get frustrated, and we either don't do anything at all or we don't do it well. That's the second thing. And the third thing, and again, this is something we can influence, is that oftentimes we're trying to measure and evaluate something that isn't designed to influence change in the way that we think it is. We try to have these robust evaluation strategies for information-sharing programs. Let me just say: if you're doing instructor-led training and the majority of the time the instructor is lecturing, you're probably not going to need to evaluate that, because that is not a learning and development design that is meant to facilitate change. When someone is purely lecturing, the only thing that could happen is, yeah, maybe their knowledge increases. But if learners don't get practice with skills, and they're not given simulations and ways to build skills or change their performance, you're not going to have anything to measure and evaluate, because no change is going to happen in the way we would like to see it happen, which is performance, and then, of course, how that performance influences a business objective. Our design of learning is the problem, not what we're measuring. And we get those confused. I had someone ask me yesterday, how could I measure and evaluate the outcomes of my one-hour seminars? And I'm like, you don't really need to.

    Matt Gjertsen [00:19:32]:

    No, yeah.

    Dr. Alaina Szlachta [00:19:36]:

    I mean, one-hour seminars are great for motivation, inspiration, and giving people some fresh insights. You and I are having this 30-minute conversation, and people are, I hope, fingers crossed, going to enjoy it and find value in the information, but there's nothing to evaluate, because we're not trying to change anything except give you some food for thought.

    Matt Gjertsen [00:19:58]:

    Again, that links back to the argument you were making earlier about getting hung up on these models. To bring up Kirkpatrick as an example, and I love Kirkpatrick, I use it a lot in different ways. But yeah, if you're doing that one-hour seminar and asking, how do I get to level four? You're not going to. One thing that I did really appreciate, though, from the session that you led at Core Four was the realization that sometimes that's okay. If you've been tasked to do a one-hour seminar for whatever reason, then get engagement numbers, and that's fine, because you want that one hour to be as engaging as possible. Same with this podcast: Spotify gets engagement numbers because that's what we care about. If you're leading a one-hour session that's an information dump, then measure the knowledge transfer, and that's fine. But don't kid yourself, and don't bend yourself into a pretzel trying to squeeze some ROI out of that, because trying to do that is just wasting time.

    Dr. Alaina Szlachta [00:21:09]:

    Yeah. And I love your example, actually. Take this podcast: you don't have to share your business model of why you're doing this, but I can say for myself, I have a future podcast that I'm going to be doing, and I have a community of practice, and I do these industry leader talks. I'm a business owner, I'm a consultant. I love what I do, and I want to be helpful, whether you buy my services or you just follow the stuff that I give away for free. But at the end of the day, why I do all of this is that I'm trying to grow a following. I'm trying to get my message in front of as many people as possible, because I do believe that's important. And I have a book coming out, and I'd like people to read it. So I have a reason for why I'm doing the information sharing and the inspirational stuff. So anyone who's doing seminars, or who's doing stuff that's more the one-off style: why are you doing it? What is it that you want to see? Maybe it's not change in your learners. Maybe it's that you want your L&D department to have a better reputation, because historically, people in the organization felt like the information was outdated or irrelevant or not very engaging, and you want to put some new initiatives in front of learners and stakeholders to show, hey, we're doing things differently. We heard you, it was boring and irrelevant, and we're doing something different. And then you can track how people feel about it. Right? So it all comes down to why you are doing what you're doing, which is the whole point of the conference presentation I did at Core Four, and then figuring out the best way to evaluate whether you did what you intended to do. And that may not even go all the way to four levels and ROI. It could simply be, like you said, are people engaging with it in the way that you hoped they would? And do you see growth in that engagement over time? Maybe that's enough.

    Matt Gjertsen [00:22:54]:

    That's great. Yeah, exactly. Well, you did a really expansive look at what our misperceptions are and how we kind of do this wrong. To flip that a little bit, and we've touched on this some already, what about outside of the learning team? Because this audience is meant to be broader than just us, so let's talk to the managers and the stakeholders and the SMEs that are out there. What kind of misperceptions do you often see from that side of the table when they think about measurement and evaluation of learning?

    Dr. Alaina Szlachta [00:23:27]:

    Well, I would say that if you're talking about other departments in an organization, like the marketing department and the product department, they've probably nailed measurement and evaluation, whether or not they're using those terms. They're probably using things like OKRs and KPIs, and they're probably talking about quarterly goals and annual targets and so on. So the language we use in the adult learning sector, or learning and development, is different. Other people are doing work that involves measurement and evaluation; they're just using a different language. And I don't know why it is the way that it is, but it may have everything to do with the fact that L&D hasn't historically been required to measure. If you think about the marketing team, marketing has some of the most robust, detailed, numerous metrics, because they are very easy to track. Marketing is all about brand awareness and acquiring new customers, getting new views, right? We market to put ourselves out there and get more awareness. There's millions of metrics, and there are ways to tie those to acquisition of new customers and revenue for the business. So I think there's a lot we could learn from other departments. If you're in L&D, maybe you don't have a great relationship with a business leader, but you've got an awesome relationship with the chief marketing officer or the marketing manager. You could just ask them: what are your metrics? What are the things you're tracking to know if your marketing work is making a difference? You can probably learn a lot from that conversation. You could ask the same question of somebody on the product team, somebody on the customer success team, somebody on the sales team. How do you know if what you're doing is working, and what are the goals? What do you do every day, and how do you know if what you're doing is moving the needle toward the goals? What are your metrics? You can learn a lot. So I would say lean into other departments to boost your own awareness, not only of the business, but of metrics and evaluation. I think we could all learn a ton. And then think about subject matter experts, and the other people on the periphery who are also probably interested in measuring and evaluating. What's going on with their perceptions? Like many of us, I have a love-hate relationship with subject matter experts. I'm just going to be perfectly honest. I work with subject matter experts all the time, but I hear things from them like, we don't need to do that activity, let me just tell them; we don't need to measure anything, they'll get whatever they get out of it. And the challenge with subject matter experts, in organizations I've worked with before, is that sometimes the subject matter expert has more clout, more reputation, more credibility, and they can overpower the L&D expert, because that's just how the culture and the environment is. And I know a lot of people who struggle with that. I can say for myself, I know what good learning is. Good learning is practice-based: simulation, trial and error. Like 70% of what we do in the design of learning needs to be that type of thing, or on the job, being observed and reporting back: how did that go? That practice, that real-life stuff. But instructor-led training is what we're all familiar with. It's what we did in high school, it's what we did in college, and it's what we still see a lot of in the world of adult learning: that instructor-led, lecture-based thing. So don't blame your subject matter expert for not getting that learning and development can look different. But we have to be able to say, hey, here's what the research says. Come armed to those conversations with subject matter experts. Hey, I totally respect that you know the content, but I know learning. How can we collaborate and make a win-win, so that your content and information actually translate to something? I can help you with that, and here's how. And measurement and evaluation is the same thing. I've had clients tell me, oh, let's just build this content and we'll measure later. Guess what happened? I don't work with those clients anymore.

    Matt Gjertsen [00:27:45]:

    Yeah, you didn't get the measurement, so you couldn't show any outcome. So then they're just not around.

    Dr. Alaina Szlachta [00:27:50]:

    Exactly.

    Matt Gjertsen [00:27:53]:

    I was really happy and really unhappy at once when I had what I thought was the perfect opportunity. I was working with a team (this was when I was still internal), and we were going to train a sales team, and I was like, here's what we're going to do. We're going to A/B test this, because you all are hiring tons of folks. Let's train these people; let's not train those people. Three months from now, we're going to have a 100% answer as to whether or not what we're doing is working. And they were just like, no, let's just train everybody. It's like, oh, that was such a perfect opportunity.

    Dr. Alaina Szlachta [00:28:21]:

    Yeah, I know. Or pilot testing. Or, back to Zsolt Olah, who I mentioned earlier: he was like, training isn't always the solution. Why don't we try something less costly? Why don't we just give folks a job aid? Or why don't we give them access to a mentor for the first three months on the job? Or why don't we pair people in the organization, like a buddy system where you have a designated person to go and cooperate and figure stuff out with? There's a million ways we can support people to build whatever skills or capabilities we need them to have. It doesn't have to be a formal training program.

    Matt Gjertsen [00:28:59]:

    Yeah, absolutely. That's fantastic. So as we start to close here, I do want to ask a question, because one of the number one things that I hear, and that I know I've experienced, around a lack of measurement and evaluation is people saying, well, we don't have the data. We don't have anything to work with. So for those kinds of people, when you come into an organization, where do you start? How do you think about getting that data? What's an easy place to start?

    Dr. Alaina Szlachta [00:29:31]:

    Yeah. So those are my people. As you mentioned, and as I've shared, I've worked in almost every different sector of adult education and learning. I was even a middle school teacher. I did community-based education. I've done all the things in education, and I love it. I love it dearly. And data: oftentimes we don't have it, and that's actually very normal. Or in some cases, people have tons of big data in their organization, and they're like, that seems overwhelming, I don't really know what to do with it, let's just leave it over there. And in many cases, that's okay. You don't need to dive into all of that big data, because again, the immediate function of learning and development is that we do some sort of intervention. It could be anything from a job aid that gets distributed all the way to a twelve-week leadership program. We do an intervention, and it's designed to change people's behavior. That's it. That's all that we do. We change behavior, we can change attitudes, and that translates to something on the job, which is dependent, of course, on your goals. And so oftentimes we don't have data, and we have the opportunity to create it, because the data that we need is actually the performance data. Now, I know some organizations do rigorously track performance. For the sales team and the marketing team, the activities and the things they do regularly are tracked, because time is money, we're paying for all of this, and we want to make sure it's giving us the return we're looking for. We want to do the same thing with learning and development. So I suggest you create your own data. And that goes back to what I talked about at Core Four, which is: create a hypothesis. You do this training. How is it designed, why is it designed, and what kind of performance is it meant to influence? Then get those behaviors down to a very clear, observable thing. For example, you're doing a database training. You've got some new naming conventions, because your database is a crapshoot and nobody can get anything out of it.

    Matt Gjertsen [00:31:40]:

    No database has ever had that problem.

    Dr. Alaina Szlachta [00:31:42]:

    Never. And that's why I use it, because it's a really simple one that we can all relate to. We all have a database of sorts that we work with. At my very first job, we had the most intense database training, because if we didn't all input the data in the same ways, we weren't going to get any value out of that system.

    So let's say you have a training. I did this in one of my previous jobs. All of our new hires had to have a database training for those very reasons: they had to input data, and everybody in the organization had to be doing it the same way, so that our funders and stakeholders and everybody had credibility and trust in the data they were seeing. So we did naming convention training. Well, what's the goal of a naming convention training? Is it for people to understand the naming conventions? Okay, yeah, sure. But really it's for them to use them, and to use them with 100% accuracy. And so that's the performance change. We're teaching you naming conventions and giving you a chance to practice them, and have a job aid or whatever. We're teaching you this so that you input the data correctly every single time.

    And so therefore, what's the metric that we pull out of that? I would probably look at the percent of inaccuracy. We know new people are new, they're figuring it out, so they'll have a certain level of inaccuracy, but within a certain amount of time there should be no inaccuracies at all. And that's how we create data. What is it that we're doing? What's the observable behavior that we want to see? And then, what's a metric that we can make that lets us know if that behavior is occurring in the way we imagined it or not? And that's it. That's really easy.
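    The percent-of-inaccuracy metric Dr. Szlachta describes can be sketched in a few lines. This is a hypothetical illustration, not her actual tooling; the `correct` field and the sample entries are invented:

    ```python
    # Hypothetical sketch: turning "did they use the naming convention
    # correctly?" into a trackable metric. Field names and data are invented.

    def error_rate(entries):
        """Percent of database entries that violate the naming convention."""
        if not entries:
            return 0.0
        wrong = sum(1 for e in entries if not e["correct"])
        return round(100 * wrong / len(entries), 1)

    # One new hire's entries, flagged correct/incorrect during review:
    # a baseline before training, then a check at the end of training.
    baseline = [{"correct": c} for c in (True, False, False, True, False)]
    after    = [{"correct": c} for c in (True, True, True, True, False)]

    print(f"baseline error rate: {error_rate(baseline)}%")       # 60.0%
    print(f"post-training error rate: {error_rate(after)}%")     # 20.0%
    ```

    Tracking that single number over a new hire's first weeks gives exactly the trend she mentions: some inaccuracy at first, approaching zero within a set amount of time.
    
    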

    Matt Gjertsen [00:33:16]:

    Yeah.

    Dr. Alaina Szlachta [00:33:16]:

    Okay. I think it's really easy.

    Matt Gjertsen [00:33:18]:

    Well, I think, yeah, it's easy to say. It's sometimes not as easy to do. It actually reminds me of the last discussion I had with Christy Oliva, a program manager at Amazon, on a previous episode. One of the things she talked about that she personally struggled with, and a lot of L&D people struggle with, is the fact that we have to cross so many department lines; we have to go somewhere else. And so in that example, chances are that data already exists, but you don't own it. It's a matter of going out, figuring out who does own it, and partnering with them to get it. And then the way I always think about it is, we have to give ourselves a break, and there are different tiers of how rigorous you can be depending on your environment. In the ideal state, you're going out to the business and getting that information, but that's just not practical all the time. And so there is worse data that you can use, but data that we can control and create if you just can't find a partner, through surveys or that kind of thing, to get an indirect measurement if you have to.

    Dr. Alaina Szlachta [00:34:26]:

    Yeah, and I would say, too, just to push back on that a little bit. So absolutely, I'm fully aware that it's difficult to get access to some of those outcome data points. What we did to circumvent that in this database training example, and this is a real one for me, is I got buy-in from somebody I had a relationship with. We had to input cases, so we would select the type of case and then input all the data about it. So I asked, hey, can we create something that's real, but it's a dummy? And so anytime somebody input practice data, they selected the dummy one, and it was super obvious, and I double-checked that was what was happening, so it didn't skew our data.

    And we were able to check: somebody goes through this program, and I had them all do a baseline. How would you fill this out? Here's some information, how would you intuitively do this? And then we checked their answers. Some people got it right because it was intuitive for them, or they had some previous training or whatever. Other people were really off, but at least they got a sense of, let's do this in real time, see what it's like. And then during the training, you have them continue to input data. And at the end of the program, using that dummy data input, you could see whether people were getting it right or not across a variety of easy-to-difficult scenarios. So that's something we can totally control.

    But again, remember, it's about how we design learning, and whether we're designing it in a way that allows us to collect data. Because, yeah, in a large organization, especially a global organization, it could be really difficult to get access to all employees and their accuracy numbers inputting data. But we can at least get a sense that they're leaving the training environment having simulated as close to the real environment as possible, with some data that shows they had growth in the accuracy of doing the work, whatever that is. That's what I would suggest.
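    The "dummy case" trick she describes is simple to mimic: practice entries carry a special case type so they're easy to pull out of, or exclude from, any report. A minimal sketch, with an invented case-type name and invented records:

    ```python
    # Hypothetical sketch of the "dummy case" trick: training entries use a
    # special case type so practice data never skews the real reports.
    DUMMY_CASE_TYPE = "TRAINING-DUMMY"  # invented label

    records = [
        {"case_type": "intake",        "name": "2024-01-SMITH"},
        {"case_type": DUMMY_CASE_TYPE, "name": "2024-01-smith"},  # practice entry
        {"case_type": "referral",      "name": "2024-02-JONES"},
    ]

    # Real reports filter the dummy type out; training reports filter it in.
    real     = [r for r in records if r["case_type"] != DUMMY_CASE_TYPE]
    practice = [r for r in records if r["case_type"] == DUMMY_CASE_TYPE]

    print(len(real), "real records,", len(practice), "practice record")
    ```

    The same filter then feeds the trainee-accuracy reporting she mentions, without ever touching production data.
    
    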

    Matt Gjertsen [00:36:40]:

    I love that. Yeah, because it is a way to bring it back a little bit more under our control by creating a very realistic environment. And then I know another principle that you push really strongly is doing the measurement at the time of training, incorporating it as part of the training, so that you make sure you get it and you're not just hoping.

    Dr. Alaina Szlachta [00:36:58]:

    And you have complete control. So on the back of that same database example, I had complete control. I was given access to be able to pull reports; everybody was added to the database as a user, so I could pull reports on how Matt and Alaina in this new hire training did, you know? Either immediately, where I could pull the report and everybody gets feedback on what they did and we could look at their results, or we did it cumulatively over a certain period of time. And so, yeah, I think it's about thinking creatively about how we simulate the right environment for people to practice the thing we want them to be doing. And then data enablement is all about embedding opportunities to collect data and evaluate it in real time, to know that what we're doing is actually working. And that is totally doable. Very doable.

    Matt Gjertsen [00:37:50]:

    Yeah. Absolutely. Awesome. Well, this has been a great discussion. We kind of touched on a lot. What's the best way for people to reach out to you if they want to keep hearing from you and find out more?

    Dr. Alaina Szlachta [00:38:01]:

    So I'm at drelaina.com, and I'm also on LinkedIn, so you can find me that way. And then I have a community of practice that's all about measurement and evaluation, and that's drelaina.com/mme: Make Measurement Easier.

    Matt Gjertsen [00:38:23]:

    Awesome. Well, that's really easy. And we'll make sure to link to it in the description. Thank you so much, Alaina. This was great.

    Dr. Alaina Szlachta [00:38:29]:

    This is lovely. Thank you for having me.

    Matt Gjertsen [00:38:33]:

    Thank you so much for tuning in today. If you liked the discussion, make sure to hit like and subscribe so you never miss an episode. As a reminder, if your team is struggling to keep up with the training development demands of your organization, we want to help. Better Everyday Studios is a full-service instructional design team that can help you with everything from ideation to actual content creation and delivery. Please reach out to us using the link in the episode notes below. Have a great day.

Thanks for Listening!

It means so much to me and the guests that you chose to spend your time with us. If you enjoyed listening, make sure you subscribe using your favorite player using the links below.

Spotify

Apple Podcasts

Google Podcasts
