View Recording: What Works and What Doesn’t in AI Adoption

October 8, 2024

Join us for an insightful webinar on “What Works and What Doesn’t in AI Adoption,” where we explore the key factors that contribute to successful AI implementation and the common pitfalls to avoid. This session is led by Nathan Lasnoski, Chief Technology Officer at Concurrency, and Brandon Dey, Head of Engineering at Concurrency. They share their expertise on how to align AI initiatives with business strategy, prioritize AI projects, and foster a culture of experimentation and innovation. Attendees will gain valuable insights into the practical steps needed to achieve tangible results with AI, from envisioning and strategy to scaling and follow-through.

During the webinar, we delve into real-world examples of AI adoption, highlighting benefits such as increased revenue, improved customer experience, and operational efficiencies. We also discuss the challenges and tensions that arise during AI implementation and how to navigate them effectively. Whether you are just starting your AI journey or looking to scale your existing efforts, this webinar will provide you with the knowledge and tools to drive successful AI adoption in your organization. Don’t miss this opportunity to learn from industry experts and take your AI initiatives to the next level.

Transcription

Nathan Lasnoski: OK. Hello, everybody. Welcome to the virtual edition of our AI symposium. This is our kickoff session, and we’re going to spend some time today talking about what works and what doesn’t work in AI adoption. It’s been about two years, maybe a little less, since the generative AI craze took off, and it’s hard to believe it’s been almost that long. I’m really excited to talk about what we’ve seen be successful and what we’ve seen not be successful, so you can learn from it in your own AI adoption journey and in how you lead your organization.

To do that, I’m going to start by introducing myself. My name is Nathan. I am Concurrency’s Chief Technology Officer. I have spent the last 23 years in consulting, and I’ve been with Concurrency for a long, long time. Especially over the last several years, I have spent time with over 70 executive teams, helping them navigate the adoption of AI within the context of their organizations. It’s been a huge blessing to do that, because have you ever seen a time when you’ve really been able to get to the heart of what an organization does, where the executive team thinks of technology as an asset and really challenges the IT organization and the rest of the company to use technology to make them better? Some of those organizations have harnessed AI and leveraged it in a way that has really helped them change and engage
the mission of their business. Others got it going and then maybe sputtered out, or never put the energy into it. I’m really looking forward to sharing what differentiated those companies in this conversation today.

That is my QR code, so if you would scan it, I’d love to connect with you on LinkedIn. I’m super active on LinkedIn, and I have a weekly newsletter on AI leadership; there are twenty-some past newsletter issues you can take advantage of. I’d really encourage you to follow that content. I produce it on a regular basis for you, so please connect with me on LinkedIn and make use of it. Nice to meet everyone today.

OK, so what are we doing in this session? First, we’re going to talk about what successful AI adoption looks like. What does it look like? What does it smell like? What does it feel like? How do I know if I’m doing it well? We’re going to talk about what that adoption looks like, then we’re going to go through a bit of an A/B of what works and what doesn’t work, and then we’re going to talk about how to take action on some of those ideas.

A few things I’d love for you to do: I’m a one-man band on this call, but I’m going to do my best to watch the chat, and I would love for you to put your questions there. I will do my best to answer them throughout the session; if not, I will reserve some time at the end, so liberally use the Q&A feature. I’d love to get your reactions and questions as we go. I’ll do my best on that. OK.

Amy Cousland: We have some people who can’t see the screen. I have people who can see it and people who can’t; sometimes restarting or joining in a certain way helps. This is getting recorded, so if you want to re-share your screen, hopefully that will help.

Nathan Lasnoski: OK, will do. Thank you for saying that. Look, this was actually a test of our broadcast system. I’ve just gone to full-screen mode.
Hopefully this gives you the ability to see my screen without any interruption. Looking good there, Amy?

Amy Cousland: It’s looking good for me. If other people can’t see it, maybe restart your Teams instance, but it’s showing up well for me. Hopefully this helps. Thank you so much.

Nathan Lasnoski: OK, thanks. I’m glad we did that before I made this point. If you leave this session with only one thing to bring back to your business, I want you to remember this sentence: successful AI adoption is about actualizing the mission of your business. Now, that may seem obvious, but it has not been obvious to many organizations as they’ve gone down this path. Successful AI adoption, actualizing the mission of your business, starts with understanding what that mission is, being able to translate it into the strategic objectives you probably already have, and then thinking about technology as an asset to make it true.

This is an opportunity for us to use an enabling technology. Think of it as a light bulb moment, or an electricity moment; it’s maybe even closer to electricity than to the light bulb. Or the Internet, or the smartphone. If you think about the moments when these technologies became a thing, how long did it take us to really realize and actualize the Internet? How long did it take us to really realize and actualize electricity, or even the smartphone? There was a period when we knew each existed, but it hadn’t yet hit our social consciousness, because it was an enabling technology that made other things true. AI is the same, except it’s moving a lot faster. It’s an enabling technology, it’s changing the game, and it’s letting our organizations think about things differently, but you have to understand the mission of your business and how AI relates to it, rather than looking at it the other way around. This is the starting point for the organizations that have made this journey well: they’ve enabled that culture of experimentation, but done so in the context of actualizing the mission of the business.

So here’s the challenge I would pose to you, and I’m going to go through some examples of this. It’s been a year and a half: what have you achieved with AI? What has your organization achieved? Has it achieved anything? Has it achieved small things?
Has it achieved something really significant? Maybe you did something even before the ChatGPT moment. Maybe you’ve been engaged in traditional ML for a long time, saw those results before this moment, and have force-multiplied them through this wave.

Here are some examples of what other organizations have achieved during that same period. So, if it’s been a year and a half, what have you achieved? Have you increased revenue for your business by winning more deals? Truly think about it that way. If I applied AI, whether at the commodity level, where my individual team members leverage something like a Copilot or a tool that accelerates their work product, or through an auto-quoting engine we’ve created, am I winning more deals because I can bring quotes to my customers in a faster and more accurate way that beats my competitors? Or have I not done that, and I’m still about where I was yesterday?

Am I able to ease frustration in my customer experience by applying AI? There’s a recent study in which over 90% of the businesses surveyed are looking at AI to optimize their customer experience. A small percentage have done it, but most see it as an opportunity. Why? Because we want to reduce the amount of time our customers sit in an unhappy state. If I have a customer who is unhappy, whether they have a question, aren’t sure how to use a product, have a broken product, or have a question about downstream activities in something they’re working on with us, how can I ease that frustration faster and put them in a happy state more quickly? Many of the companies we’ve worked with have used AI to arm their customer service teams, or their customers directly, with answers to those questions, or even with opportunities to take action based on something they know.

Have you reduced inventory carrying costs by driving efficiency through the supply chain? This happens to be one of the older uses of AI, predating even the ChatGPT moment, but it’s one of the most powerful. Can I optimize what my supply chain really is? Can I optimize how much product or staff I apply to a particular scenario, in a particular location, with particular skills or qualities, to optimize my cost at any given moment? It’s a tremendous opportunity, simply because the dollars here are so tangible. I was talking with an organization,
and they said: if you can reduce the price we buy this particular commodity at, at bulk cost, by one cent, you’ll save us $1,000,000. Save us $1,000,000 by driving one cent of efficiency. This is the opportunity that stands before us, and many organizations have taken advantage of it. I worked with an organization, about a billion-and-a-half-dollar company, that saved $40 million a year in carrying costs by applying AI.

Have I set clear expectations with my customers that are measurably more accurate than before? Can I set expectations such as: here is when your product is being delivered, and, if there’s an interruption to the supply chain, here is when you can now expect that product to be delivered? Do I have a Domino’s Pizza tracker? And when something goes wrong, does my Domino’s Pizza tracker adjust and give my customer service team the ability to interact with the customer? Can I even cue that up in an intuitive, customer-centric way?

I’m working with a travel agency; you can follow them on LinkedIn, Fox World Travel. They just went live with a product called Colby. Their customers are other businesses, and they buy travel services in bulk. Say I have 1,000 people flying this year and I want to know what percentage of my flights are going through Southwest versus Northwest, whatever. I want to ask a question and have it return that information to me from the business system in a Q&A-centric way. They just went live with that. So cool. Check them out. So: can I set those clear expectations?

Do my employees indicate that the availability of AI agents creates definitive efficiencies for them? This is not “is it turned on”; it’s “do they say they’re getting value, am I seeing efficiencies.” So if I have someone enabled for Copilot, for example, and they’re delegating activities to Copilot, such as “prepare this presentation” or “give me the outcome of this meeting and summarize the action items,”
are they able to measurably show that they’re getting real efficiencies from it? Or have I not achieved that because I haven’t trained them well, or because I’ve just been dinking around with it? Have I gotten through that channel?

Do I have new revenue streams created by using AI-driven information? This is truly about whether I can use data I know about my customers to create new revenue. For example, I have a customer that sells food products. Those food products have extremely low margin, but they’re in every restaurant, and you know what they really know? They know everything about the restaurant business. Restaurants have some of the highest turnover and failure rates, but this company knows a lot about what makes restaurants successful. They can use that information to sell insight: here’s what the top quartile does well, and here’s how you adjust to be in that top quartile. How do I use data as a new revenue stream or asset to create higher-margin activities with my customers?

Have I found unintuitive insights into my production processes? I was working with a company that happens to produce batteries. There was an unknown part of their production process, and they were able to use AI to discover essentially why they had variability in their production. They achieved normalcy in that production from what they learned in the OT data. I’ve heard OT data described as the undiscovered country, right? It’s out there, and I’ve never used it. How do I use that information to learn something about my process?

So, all of these things. This is what many companies have achieved. How do we take it forward? My question is: where are you in this process? Where are you? At the start, you have to begin with that mapping, that envisioning and strategy. Think of it as step one: I need to know how I’m going to
marry the mission of my business and the strategy behind it with AI to force-multiply those goals. Have I thought about that at the executive level, and has it bubbled down into prioritized alignment? A lot of people think about this just in terms of use cases. It’s not just about use cases; it’s about imagining the possible future and working backward from that future to translate it into action. And not all of those actions are going to work; we’ll talk more about that later. Creating this culture of innovation is not about one thing. It’s about translating that strategy into action and scaling it across my organization. Then, measurable results: we need to measure, we need to prove it, we need to show it actually happened, and then scale it within the context of our business. I’d venture that many of you are probably still at the beginning, but if you’re not, I’d love to understand where you are in the channel so we can start to talk about how we get you there.

To start that part of the conversation, I want to make a general statement. You’ve probably heard that the hardest company to disrupt is your own. Maybe you’ve heard it in the context of yourself: the hardest person to disrupt is yourself. It’s always hard for us to turn inward, to accept something that may be true about ourselves, or to change our own actions. Same thing with companies. The hardest company to disrupt is your own, particularly if it is already making money.

I saw a fantastic talk last week that harkened back to Clayton Christensen’s ideas, which are essentially that enterprise organizations drive toward efficiency, and they get really good at it. What they aren’t great at is disrupting that efficiency to create new business models that may not exist today. They may be really good at incremental innovation. A lot of times, when you look at the things you’re trying to do in your business, you think: here’s a process I do today; can I replace it with AI? That’s not bad; you’re optimizing an existing process. But what some companies need to do is ask: what’s the possible future in my market space that I’m not harnessing today, and what needs to be true for me to achieve that goal?

And then, in looking at that,
you start thinking about the jobs to be done, not only for your organization but for the customers you could serve more effectively. For example, I was working with a company that distributes a commodity. Do I need to quote my customers this, or can they quote themselves? What would need to be true for them to quote themselves? How would I enable them to get the best price, and to have certainty that they’re getting the best price, if they quoted themselves? What would need to be true for that to occur?

Thinking about jobs to be done makes that possible, and it drives us to the idea of two co-innovation paths: sustained innovation, which you’re always going to be doing to run your business, and disruptive innovation, which essentially enables your business to look at the market in a way that is different from how it does today. I’ll talk more about that in a minute.

So there’s this tension. Imagine yourself and your competitor. You have this revenue, and you’re in a tug of war with competitive forces against your competitor. As you gain efficiencies, you gain more in that competitive tug of war, whatever the percentage of market is. Let’s say the spread between you and your competitor is 50% of the available market, or maybe even smaller. I was working with a company that is the leading company in the world in its space, yet it owns only about 8% of the overall market. Nowhere close to a monopoly, but still the biggest company in the world in that space. So it’s a tug of war, and these efficiencies you’re trying to drive, this is where you apply AI and other tools to grab them; maybe that’s even the auto-quoting situation, just trying to get more of that pie.

But what probably also exists is an unharnessed opportunity that neither of you is capturing today. What would need to be true for me to harness that opportunity? What would need to be true for me to use technology to change the way I engage that market?
Sometimes it means fewer features, fewer capabilities, a different way of engaging, but a broader market than I can capture today. Maybe I’m a $2 billion company today, but the overall market opportunity is $40 billion. How do I get more of that 40 rather than fighting for the 2 I have right now? Or how do I do both: keep garnering that 2, but go after the 40 as a new opportunity? This is where the people who are being successful are thinking; they’re thinking about both of those ideas.

So let’s go into what works and what doesn’t. We’re going to do a little A/B between each of these areas.

First, business alignment. What doesn’t work is IT by itself, isolated, focused on use cases. IT is really good at coming up with use cases in a vacuum, right? If you gave IT the opportunity to go after AI, they’d come up with their big list, maybe share it with the business, and try to go after some of it. But what happens is that it becomes too siloed. It becomes IT’s objective, not the business’s objective. You need to make sure the business is driving those AI objectives and is the number one supporter behind why they are being pursued. The most successful AI initiatives have succeeded because the business cared about them even more than the tech organization did. You get there by mapping the priorities to the business, by having an AI upskilling campaign, and by having a culture of direct experimentation.

That culture isn’t just accepting failure; it’s expecting failure and looking at what can be learned from those moments. How can I have lines in the water and realize: oh, there are no fish in that hole, I’m not going to fish there again; or, I need to use a different fly on my rod, because that hole is a good spot, it’s deep, the water is still, but I need a different type of lure for a fish to hit that particular cast. I need to react to those moments, expect failure, and have the resilience to take the next steps.

So you need the ability to create and validate alignment to the business, understand what’s there, and understand the opportunities that exist before you. Here’s an example of that for a
software and digital platforms company. The idea is to think about the things we do: what are the categories, what are the names and descriptions, where is AI going to impact our business, and how hard is each item? I might not go after the moonshot opportunity on day one. I might go after some lower-complexity things to get my feet wet, to experiment and get some quick wins, but that shouldn’t distract me from the idea that I’m building muscle to attack the bigger goal. The failure is when companies treat the quick win as an end in itself. That’s a start, but if that’s your whole AI strategy, you’re missing the question of how you’re going to engage the business and transform, so that in two years you can look back and say you really did something. So start with this map, then get to the point where you’re following through on it in multiple lanes, while still picking some initiatives that are really focused on achieving measurable outcomes.

This pairs with the idea of co-innovation. In co-innovation, what doesn’t work is all AI efforts riding on a single use case’s success. If you go back to that previous slide, you pick one and say: this is the one we’re going after. And that one hits a roadblock; it doesn’t get to where you needed it to get. Suddenly, AI is a failure because that specific use case didn’t land. That’s really underfunding, under-engaging, and under-scaling the idea of how we’re going to pursue AI within our organization. Don’t put all your eggs in one basket. Don’t think that just because one use case isn’t successful, or needs to pivot, you suddenly aren’t going to see value from AI.

This is where we need to see value in experimentation. We need to be comfortable reacting to what’s happening in the market, continually recognizing that innovation is happening outside and inside our organization, and creating a culture where we build team momentum rather than one-and-done, the long game rather than the short game. Now, that also doesn’t mean we just tinker. I’ve seen companies that put a lot of value in one big thing, and it sort of rides or dies on that one big thing.
I’ve also seen the flip side: tons of energy on innovation, but none of it really gets to production, because that culture of innovation isn’t paired with the stick-to-itiveness to push things out of experimentation into a production use case, or into a channel that gets to production.

So I’m going to pause here for a second and ask another question, and I think it’s a really important question for all of us: can this be a moment when every person can be the best version of themselves, where we enable them to be that? I ask because in every major technology movement there are winners and losers. There are people who are impacted. There are individuals who receive the benefits, and those who, unfortunately, are in a sense taken advantage of so that others receive those benefits.

You saw that with the Industrial Revolution. There was a transition from point A to point B. You saw a dramatic improvement in the general populace’s access to goods: with the same number of people we could produce more output, and we enabled more economic GDP. Look at the hockey stick that happened after electrification in the Industrial Revolution, and then in travel. You saw dramatic changes, but you also know there were dramatic changes to the way people worked in those scenarios. You had people in factory settings who weren’t treated appropriately.

You also saw it in something we all probably lived through: the advent of the smartphone. What did getting your first smartphone mean to us? How did it change the way we work? Knowing what I know today, what would I tell someone getting a smartphone for the first time? How do I prevent them from developing the same addictive habits that maybe I have with a smartphone and don’t want them to have? Do we just let that happen to us, or do we lead through it? This is a moment for us to lead through it, knowing that every person in your organization is going to be using AI in some capacity. How do we give them the skills to function in this new world of work?

And that goes with scaling. When you think about scaling an AI engagement, ask: is my AI strategy just to enable a small team while everybody else keeps doing what they’re doing?
Or is it to understand that I have diverse lanes of AI engagement across my organization, enabling different types of capabilities and engagement that are centric to the type of user? For example, I might have individuals in my organization who are information workers. They’re doing all sorts of work all day long, and they’re on all these calls. How can I enable them with, say, Copilot, so they can record those calls, capture action items, and not spend time manually typing all of that? How can I amp up how they work? How can I enable my factory workers to have access to a Q&A agent that gets their HR questions answered faster, or lets them participate in the culture of the organization in a different way?

This is about scaling and encouragement: engaging a broad group, challenges and hackathons, executive buy-in. If your executive team isn’t talking about this and it’s just an IT thing, you’ve missed it. This is about scaling across the organization. You might say: OK, I realize we’re going to hit a trough of missed expectations along the way; how is that going to impact us? Realize that the same thing happened with smartphones and with the Internet. You’re going to have ups and downs throughout this transformation, but it will dramatically change everyone’s work.

Your leaders are going to be excellent leaders if they move through that channel in a way that enables every person to be more, and especially if they round out the curve: not quite as high a high, not quite as low a low, getting you broad engagement. Engagement meaning everyone feels they’re on the bus, everyone feels they’re in the boat rowing in the same direction, and no one is left on the shore, left behind.

And that means follow-through. What doesn’t work in follow-through is when you have a small effort and then get cold feet. I have a customer with something like 500 salespeople. We built an auto-quoting engine for them, and when they initially released it, we knew it worked, but the salespeople said: I don’t know, I’m faster doing it myself, or I’m not sure how to use it. That VP of sales could have said: whatever, just adopt what you want to adopt, it’ll be fine,
or maybe I won’t follow through on this. I saw the same thing in a customer service scenario, where the system was answering some of the questions well but not all of them well, and the reaction was: well, maybe it’s not successful. You need to have the follow-through. These efforts go to POC, you see initial success, and then you see edge cases where things don’t work, or even major cases where they don’t work. You have to have organizational follow-through, both on the adoption side in the business and on the development side in building solutions, to bring it all the way through the channel of adoption.

Ultimately, that 500-salesperson organization adopted the engine across the entire company, and one of the biggest benefits was that it leveled everyone up. Someone who’s been there 20 years and someone who’s been there five weeks now have a lot of the same assets available to them, creating a level playing field in how they engage their customers. Basically, the average ability to engage a customer went up, because everyone could take advantage of what the AI brought to the table. So commitment to follow-through is a critical capability for success.

Another pause. I think this is another really important point, almost as important as understanding the mission of the business and enabling each person to be the best version of themselves. I want you to remember that we are currently under-hyping AI upskilling and its impact. You might say: how can we be under-hyping anything? AI is pretty hyped right now. We are under-hyping it, and the way I know is that you have to look at organizations’ actions as evidence of what they actually believe to be true. Does your organization truly believe, in its heart, that AI is going to change the nature of the work it does? If there’s a hurricane coming, I’ve got to get out of the way, right? I’ve got to get out of town. If I don’t believe it’s going to have that impact, I’m not getting out of town. It might hit me anyway. But if I understand it’s a threat and I need to move, I’m getting out of town. We need to understand that this is not only a threat, it’s an opportunity, and it’s going to impact every person and organization.
The organizations that are skilled and capable with AI tools are going to be tremendously more effective than those that are not. Even at the commodity level by itself, the ones that use these tools well are going to be tremendously more effective, simply because it takes less time to do anything they need to do. I was working with a company over the last couple of weeks, and as I was starting on their idea registry, you know what I did? I went out to Copilot and said: here’s their website and here are my old registries; index that website and give me a table of scenarios that align to their functional towers, with categories and the return on investment associated with each. I got a great starter list. It wasn’t perfect, and it wasn’t what I was ultimately going to show them, but it got me going. And you know how much that helped me? Immensely, because it allowed me to make intentional changes instead of spending time on the busy work.
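He describes doing this through Copilot’s chat interface. As a rough sketch of the same bootstrap pattern in code, here is a minimal example using the OpenAI Python SDK; the model name, file path, tower list, and prompt wording are all illustrative assumptions rather than details from the talk.

```python
# Minimal sketch of the "bootstrap an idea registry" pattern described above.
# The speaker used Copilot's chat interface; this shows the equivalent idea
# via the OpenAI Python SDK. Model, paths, towers, and prompt wording are
# assumptions for illustration only.
from openai import OpenAI

client = OpenAI()  # assumes OPENAI_API_KEY is set in the environment

towers = ["Sales", "Customer Service", "Supply Chain", "Finance", "HR"]
old_registry = open("old_registry.md").read()  # hypothetical prior registry

prompt = f"""
Company website: https://example.com
Prior idea registries:
{old_registry}

Using the website and the prior registries, produce a table of candidate AI
scenarios for this company. Columns: scenario, description, functional tower
(one of {", ".join(towers)}), category, and estimated return on investment.
"""

response = client.chat.completions.create(
    model="gpt-4o",
    messages=[{"role": "user", "content": prompt}],
)

# Treat the output as a starter list to edit intentionally, not a finished
# deliverable: "a great start, not perfect," as described in the talk.
print(response.choices[0].message.content)
```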
So what’s required to enable that? Upskilling. What doesn’t work is not acting like AI is going to be transformative. It is transformative; it is going to change the very nature of work. That requires a dedicated and substantial upskilling process, and remember: this is not about technology skills, it’s about growth mindset. It’s about every person in your organization challenging themselves on what they do today, how those tasks will be replaced and changed in the future, and how their work will be augmented by AI agents. Remember, we’re at something like Internet 1.0 right now in terms of what this means. You’re seeing the infancy of delegation to AI agents; a lot of it today is “give me information back,” and it’s moving toward “go do something for me.” It’s like having an intern next to you who just started: you have to understand their skill set and know what they can and can’t do. I have my son power-washing my deck and then staining it. I had to teach him how to power-wash the deck, and as he improved his skills, I could better delegate that activity to him. Same thing with the staining process: as he better understood the skills and how to do the job, I could delegate more. That’s what’s happening with AI agents. As we get better at collaborating and communicating, understanding the capabilities of the AI agent, and building the AI agent’s capability to be delegated to, it gets better and better.

This is where AI enablement skills come in, really importantly; they become part of how we execute the work we’re doing.

As we move on from what works and what doesn’t, I want to give you some flavor of how companies are tackling these important objectives with well-structured development frameworks. One of the ways we’ve been thinking about this: a lot of AI pursuit is not all that different from software development or adoption. We’ve had to adopt things in the past, but now we’re adopting something really substantial. It’s as if you were told the smartphone drops tomorrow and it’s going to be impactful to you. I remember my first “smartphone” was the Palm IIIc, the color version of the Palm, and I thought it was so cool, but I was about the only person walking around with that Palm. I was tracking my help desk tickets on it. Now everyone’s got one. If you had told someone then, “this is going to change everything, it’s going to rock your world; in fact, you’re going to be addicted to this thing,” they would probably have looked at you and said: really? I’m not going to carry that thing around; that’s for geeky people. This is that same kind of moment.

So how do I think about this in the context of building these kinds of solutions? Really, it means you’re doing something hard, something complex. Let’s talk about what that is. If we had to say what the winners do well, the first thing is really rigorous problem selection. They pick the right things: small, internal, AI-friendly, or replacing bad existing solutions. They look for opportunities that are clear, the ones that stick out, the ones the business is willing to support and follow through on. They hold the belief that failure is good, which is a hard thing to say, because it means they can react to it. They’re pragmatic about what’s going on, and they’re willing to stop, start, move, and shift focus to the right objectives. And then, in execution,
they treat this as an engineering activity, and engineering in different lanes. We have the adoption lane. We have the low-code/no-code lane; you’re seeing that with Copilot Wave 2, for example, where agentic AI scenarios are starting to pop out. And then you get to the big, hairy, audacious AI solutions that support your business with huge follow-through. This is the idea of products, not projects. Products, not projects: I’m creating product teams that support the objectives of the organization and can react to the needs of the business, not just spinning up one-time projects to respond to something. Projects are sometimes too static for us to really be successful as an organization.

So it’s important to think about what happens once your business starts to tackle an objective, and this part is centric to building solutions. Especially in the high-build scenarios, where you’ll probably be going after some really important objectives, a lot of companies think: OK, we’re going to build an ML system; what is that made up of? They picture a data scientist, using generative AI or something else, creating ML code. And that’s true: there is ML code, there are the capabilities of the machine learning or generative AI system we’re building. But surrounding that are all these other things. There’s the data I’m using. There’s the serving infrastructure: how do I bring it to my customers? Am I building from scratch, creating a Power App, or building in Copilot Studio? Am I doing it in a self-service kind of way? Is there a model validation process? Have I enabled, for example, Microsoft’s AI safety and security layer, which looks for different types of compromises against my AI systems? How am I monitoring to validate that this all continues to be true, especially if I launch an AI agent that my customers interact with directly? Have I built the right layers between that customer and the business system that sits behind the AI tool?
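As a simplified picture of those surrounding layers, here is a minimal sketch of a serving endpoint with input validation and a monitoring hook, using FastAPI and Pydantic. The endpoint, schema, and scoring function are illustrative assumptions, not any specific system from the talk.

```python
# Minimal sketch of the layers that surround the ML code itself: serving,
# input validation, and monitoring. The quote schema and scoring stub are
# illustrative assumptions.
import logging

from fastapi import FastAPI, HTTPException
from pydantic import BaseModel, Field

app = FastAPI()
log = logging.getLogger("quote-service")


class QuoteRequest(BaseModel):
    # Validation layer: reject malformed input before it reaches the model.
    customer_id: str
    product_sku: str
    quantity: int = Field(gt=0, le=1_000_000)


def score(req: QuoteRequest) -> float:
    # Stand-in for the actual ML model call (the "ML code" box).
    return 9.99 * req.quantity


@app.post("/quote")
def quote(req: QuoteRequest) -> dict:
    try:
        price = score(req)
    except Exception:
        # Monitoring layer: surface failures instead of failing silently.
        log.exception("scoring failed for customer %s", req.customer_id)
        raise HTTPException(status_code=500, detail="scoring failed")
    log.info("quote served: sku=%s qty=%d", req.product_sku, req.quantity)
    return {"price": price}
```

The layering matters more than the framework: the model never sees unvalidated input, and every failure leaves a trace an operator can act on.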
Building AI systems, then, becomes very rigorous, especially when you’re creating systems for your customers directly, or systems your customer service, sales, or engineering teams use to serve your customers. You need to treat it rigorously, very much as a software development exercise. Below that sit the data, the infrastructure, and the capabilities. When you build AI solutions, underneath the AI solution is essentially software engineering, and it sits on your data and on the availability of the cloud.

Why bring this up? Because so many organizations have been dabbling in the cloud, leveraging it for different purposes, and the organizations that didn’t do the work up to this point to get themselves ready, skilling-wise, capability-wise, and data-wise, are going to be behind. They’re going to be challenged to take advantage of the data, the availability of the cloud, and the muscle memory of software engineering to land AI capabilities in the build lane, because they haven’t done the homework ahead of time.

You can think about the same thing outside the build lane, in the adoption lane, this way. Imagine you have a family member who does not have significant technology skills, just in their personal life. I think about one of my family members who still struggles with their smartphone and still calls with every question. Maybe they still call the airline rather than booking online. They just don’t get it; they didn’t live through it; they struggle with the technology today and haven’t been able to make the pivot. Maybe it’s your grandmother, your mother, your sister, or your brother. We’re going to live through that same kind of moment right now. Those of us who are getting access to these tools and growing along with them, or who are digital natives and adopt them because they’ve always been part of life, are going to take advantage of these tools really quickly. But we’re going to have a set of people whose businesses drag their feet, aren’t engaged, and don’t think it’s going to be a thing, and those people get left behind without having done the homework. We’ll get to a point, five years from now, where they’ll get an AI agent on their desk at a new job, and they’ll be like:
0:39:55.401 –> 0:39:58.721 Nathan Lasnoski How do I use this thing? How do I interact with it? What should I do? 0:39:59.241 –> 0:40:6.681 Nathan Lasnoski It's gonna feel so foreign, because they're not used to delegating to an AI agent the way another person might be very comfortable doing. 0:40:7.401 –> 0:40:10.841 Nathan Lasnoski So one of the things that's on us is to do the homework, right? 0:40:10.841 –> 0:40:27.601 Nathan Lasnoski So in the build lane, it's about getting all this ready, picking the right solutions to deliver on AI. But in the adoption lane, it's about how I bring my people along for this ride and enable them in a culture of innovation that helps them be successful. 0:40:29.571 –> 0:40:35.131 Nathan Lasnoski So when you're building these systems, you're always looking at opportunities to de-risk. 0:40:35.531 –> 0:40:39.131 Nathan Lasnoski What we do is divide them into three different phases. 0:40:39.131 –> 0:40:48.971 Nathan Lasnoski You have a proof-of-concept phase, where you're doing just the minimum capabilities necessary to validate that the scenario actually works. Like, will this work? 0:40:48.971 –> 0:40:50.971 Nathan Lasnoski Let's answer that question first. 0:40:51.251 –> 0:40:54.91 Nathan Lasnoski Can I get there from here? 0:40:54.881 –> 0:40:58.841 Nathan Lasnoski But then you get to a point where you're building a minimum viable product. 0:40:59.361 –> 0:41:17.561 Nathan Lasnoski A minimum viable product is, as we know, the minimum product to deliver value, the minimum product to deliver value in some way, and it's also knowingly missing some elements, because you're not going to over-invest until you're actually delivering value. You're getting it to a 0:41:17.561 –> 0:41:29.361 Nathan Lasnoski point where you can test it out on X number of customer scenarios or X number of use cases and validate not only that it can do it, but that it does it in a practical way. Remember, 0:41:29.491 –> 0:41:33.851 Nathan Lasnoski I said pragmatic. Does this pragmatically do what I need it to do? 0:41:35.401 –> 0:41:51.521 Nathan Lasnoski Which then leads us to this idea of machine learning operations: building operational resiliency into everything I create. And this is one of the things that many companies leave off. Going back ten years in the AI space, I've seen organizations build ML models 0:41:51.521 –> 0:42:0.41 Nathan Lasnoski for forecasting of their supply chains, but with really weak resiliency on the phase-three capabilities: 0:42:2.221 –> 0:42:8.141 Nathan Lasnoski really weak resilience in how the machine learning operations are built into the context of their operational state. 0:42:8.141 –> 0:42:12.381 Nathan Lasnoski So something breaks. They put the wrong data in, or they're not even evaluating the data. 0:42:12.981 –> 0:42:23.221 Nathan Lasnoski All those surrounding components become too brittle, and by the time it gets pushed into production, the person who built it leaves, or the wrong data gets pushed in. 0:42:23.621 –> 0:42:27.261 Nathan Lasnoski This is a production service.
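As a concrete illustration of the kind of resiliency the MLOps phase adds, here is a minimal sketch, assuming a hypothetical demand-forecasting pipeline: the incoming batch is validated before it is scored, so bad data fails loudly instead of silently corrupting the forecast. The names (EXPECTED_COLUMNS, score_batch, run_pipeline) are illustrative, not from any specific product.

```python
EXPECTED_COLUMNS = {"sku", "week", "units_sold"}

def validate_batch(rows: list[dict]) -> list[str]:
    """Return a list of problems; an empty list means the batch is usable."""
    problems = []
    for i, row in enumerate(rows):
        missing = EXPECTED_COLUMNS - row.keys()
        if missing:
            problems.append(f"row {i}: missing {sorted(missing)}")
            continue
        if row["units_sold"] < 0:
            problems.append(f"row {i}: negative units_sold")
    return problems

def score_batch(rows: list[dict]) -> list[float]:
    # Stand-in for the real model; the point here is the gate, not the math.
    return [round(row["units_sold"] * 1.1, 1) for row in rows]

def run_pipeline(rows: list[dict]) -> list[float]:
    problems = validate_batch(rows)
    if problems:
        # Fail loudly instead of silently forecasting from bad data;
        # the brittleness described above usually starts here.
        raise ValueError("bad input batch: " + "; ".join(problems))
    return score_batch(rows)

if __name__ == "__main__":
    good = [{"sku": "A1", "week": 37, "units_sold": 120}]
    print(run_pipeline(good))
```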
I need that resiliency to exist here. 0:42:28.221 –> 0:42:31.501 Nathan Lasnoski This is why follow-through is so important, but also why constant evaluation 0:42:32.111 –> 0:42:40.271 Nathan Lasnoski is so important to this channel. So this is a phase comparison of what elements are going to exist in each. 0:42:40.271 –> 0:42:43.151 Nathan Lasnoski You can see this a little bit. This is a good one to screen-capture. 0:42:43.151 –> 0:42:57.551 Nathan Lasnoski You can see the POC leaves off with just those first couple of components, but then the MVP and MLOps phases start to build on that, as you're creating that muscle memory and getting it further into the deployment process. 0:42:58.361 –> 0:43:4.601 Nathan Lasnoski Why this is important: if you go back to that arrow of adoption, this is not just important 0:43:5.191 –> 0:43:13.991 Nathan Lasnoski in the context of building a solution; it's important in governing what's necessary to be in production within our organizations. 0:43:14.151 –> 0:43:33.191 Nathan Lasnoski So as you look at creating your artificial intelligence center of excellence, or steering group, that steering group has to have responsibility not only to ensure that enablement is happening, but especially to ensure enablement is happening in the right ways against the context of the business, both in the commodity lane 0:43:34.41 –> 0:43:37.641 Nathan Lasnoski and in the buy lane, like I'm buying AI solutions and adopting them, 0:43:38.431 –> 0:43:57.111 Nathan Lasnoski or in the lane of building AI solutions, whether it's low code, no code, or some of the things we're talking about here, so that governance exists. And that's always a balance. You can't govern something that doesn't exist, and adopting something without governance is a mistake, right? 0:43:57.111 –> 0:43:58.991 Nathan Lasnoski It's like a road with no speed limit. 0:43:58.991 –> 0:44:2.391 Nathan Lasnoski Sometimes it can be OK, but most of the time you need a speed limit somewhere. 0:44:2.391 –> 0:44:3.471 Nathan Lasnoski You need some signs, right? 0:44:3.471 –> 0:44:7.71 Nathan Lasnoski Just to keep us going in the right directions, to pause at the right places, 0:44:7.721 –> 0:44:9.761 Nathan Lasnoski to enable us to be highly aware, 0:44:11.321 –> 0:44:29.241 Nathan Lasnoski to have the right traffic patterns. All of that matters to us being very successful in creating AI systems. So as you're driving toward using and adopting AI systems, or building them, know where what you're supporting sits on the spectrum: the range of capabilities 0:44:30.41 –> 0:44:37.641 Nathan Lasnoski of AI that we might be adopting. So on the far left-hand side you can see this idea of no AI. OK, no AI, 0:44:38.161 –> 0:44:42.681 Nathan Lasnoski simply meaning that, look, we're not using AI for anything, right? 0:44:42.801 –> 0:44:49.801 Nathan Lasnoski And then it gets to AI as a tool, where we're able to automate simple tasks. 0:44:49.801 –> 0:44:51.921 Nathan Lasnoski I kinda feel like this is where 0:44:53.561 –> 0:44:57.481 Nathan Lasnoski many of the AI solutions have been in the public space for some time now. 0:44:57.481 –> 0:44:58.921 Nathan Lasnoski It's like, OK, cool:
0:44:58.921 –> 0:45:3.721 Nathan Lasnoski I'm creating an image, or it's responding with some content, 0:45:3.721 –> 0:45:8.1 Nathan Lasnoski or I'm getting an optimized sentence from Grammarly, you know? 0:45:8.681 –> 0:45:13.441 Nathan Lasnoski It's a tool. It's providing me with an asset to do something, but 0:45:13.881 –> 0:45:19.441 Nathan Lasnoski it's not boosting me above that, right? It's not moving me past what my initial ask is. 0:45:21.11 –> 0:45:25.51 Nathan Lasnoski Which then moves into this idea of AI as a consultant: the idea that 0:45:26.601 –> 0:45:34.721 Nathan Lasnoski it's not only doing something I've asked it to do; it's kind of taking the next step beyond that and saying, did you think of X? Did you do X? 0:45:34.971 –> 0:45:39.691 Nathan Lasnoski It's providing recommendations back based upon a body of knowledge. 0:45:40.211 –> 0:45:45.171 Nathan Lasnoski And that's still more on the person than it is on the AI tool. 0:45:45.251 –> 0:45:47.571 Nathan Lasnoski But you can see it's a next step beyond the tool, right? 0:45:47.571 –> 0:45:49.731 Nathan Lasnoski It's asking the next question. 0:45:49.771 –> 0:45:59.51 Nathan Lasnoski It's infusing knowledge beyond what it was initially asked. But where we're going to arrive is this idea of AI as a collaborator. 0:45:59.171 –> 0:46:5.411 Nathan Lasnoski OK, AI as a collaborator: this idea of AI and humans playing equal roles within the process, 0:46:6.211 –> 0:46:10.811 Nathan Lasnoski bouncing ideas off each other. You've seen the infancy of this with reasoning systems, 0:46:11.461 –> 0:46:22.61 Nathan Lasnoski the infancy of this with some of the custom systems that have been built; they play this complementary path within the channel. Which gets us then to this idea of AI as an expert: 0:46:22.301 –> 0:46:25.461 Nathan Lasnoski the AI controls tasks and uses humans for feedback and input. 0:46:25.821 –> 0:46:30.781 Nathan Lasnoski Humans can execute the simple sub-tasks, but the AI is the expert; it's the one providing the direction. 0:46:30.781 –> 0:46:32.781 Nathan Lasnoski So if you look at this in terms of supply chain: 0:46:34.331 –> 0:46:38.411 Nathan Lasnoski level zero might be, you know, no supply chain tool at all. 0:46:38.771 –> 0:46:43.931 Nathan Lasnoski Level one might be: I now have an AI supply chain tool that is going to give me information 0:46:45.291 –> 0:46:49.931 Nathan Lasnoski about my predicted demand. 0:46:49.931 –> 0:46:57.691 Nathan Lasnoski It's not prescribing something yet, but it's giving me predicted demand. OK. AI as a consultant might predict what I should do about it, right? 0:46:57.691 –> 0:47:6.651 Nathan Lasnoski So it's not just predicting the demand; it's saying, here's the inventory you should buy based upon the prediction of the needed demand. 0:47:6.651 –> 0:47:15.11 Nathan Lasnoski It's taking the next step. And then AI as a collaborator might bounce the potential possibilities off of you: 0:47:15.581 –> 0:47:33.861 Nathan Lasnoski let's talk about what could change here, or here, or here, based upon that future. AI as an expert is almost coming back to the business and functioning with its own agency within the context of the organization, saying: the AI agent is the A to
0:47:33.861 –> 0:47:39.661 Nathan Lasnoski my B on the supply chain challenge, right? 0:47:39.661 –> 0:47:40.821 Nathan Lasnoski It's recommending back. 0:47:40.821 –> 0:47:45.501 Nathan Lasnoski It's saying: this is what we now understand; I would recommend doing this. Business, you make these 0:47:45.691 –> 0:47:57.251 Nathan Lasnoski choices. But it does so with more agency than the tool or the consultant. It has the agency, and it's simply asking for permission. And then autonomous AI is really completely on its own. 0:47:58.171 –> 0:48:6.51 Nathan Lasnoski Think about truly self-driving cars: do we get to a state where we have truly self-driving cars at some point? 0:48:7.611 –> 0:48:9.411 Nathan Lasnoski That would be an example of autonomous AI. 0:48:9.531 –> 0:48:14.771 Nathan Lasnoski Why is this so important? It's extremely expensive right now to build autonomous AI. 0:48:15.781 –> 0:48:19.501 Nathan Lasnoski Realize that most of the things you build are gonna be in this space, 0:48:21.11 –> 0:48:22.291 Nathan Lasnoski maybe eventually getting to here. 0:48:23.931 –> 0:48:27.971 Nathan Lasnoski Short term, at least, these are the spaces you're building in. That's OK. 0:48:28.51 –> 0:48:31.291 Nathan Lasnoski The difference between that and no AI is substantial. (A minimal sketch of these autonomy levels in code follows below.) 0:48:33.961 –> 0:48:36.441 Nathan Lasnoski Cool graphic. OK. So where do we go from here? 0:48:37.761 –> 0:48:46.521 Nathan Lasnoski Where do we go from here is: I want you to take action, and I want you to realize where you are on that channel. You may already be taking action. You may already have your ideas. 0:48:46.521 –> 0:48:48.921 Nathan Lasnoski You may already know how to map that. Or maybe you don't. 0:48:49.321 –> 0:48:59.481 Nathan Lasnoski So the ways that we help with this channel are a couple of different things. We help by doing executive AI envisioning. I do this all the time. 0:48:59.521 –> 0:49:3.601 Nathan Lasnoski Our team does it all the time. The results of this are substantial. 0:49:4.101 –> 0:49:14.381 Nathan Lasnoski This is a spot to engage us, to help you get going on the journey in the appropriate ways, to help engage your executive team, and to help understand which ideas will work and which won't. 0:49:14.701 –> 0:49:20.301 Nathan Lasnoski Seriously, we have so much background; we know which ideas are good ideas and which are not. 0:49:20.301 –> 0:49:30.141 Nathan Lasnoski Simply that by itself is helpful as you're starting to do that exploration, because there are some that simply work, and some that are simply challenged to work based upon the capabilities that exist today. 0:49:30.421 –> 0:49:34.91 Nathan Lasnoski So that's a great area for us to engage right at the start: executive AI envisioning. 0:49:34.531 –> 0:49:40.851 Nathan Lasnoski The second is if you're going down the lane of commodity adoption, especially Copilot; they just released Copilot Wave 2. 0:49:41.771 –> 0:49:49.491 Nathan Lasnoski We have a readiness workshop for that, and we have an adoption program that's very provable in terms of its ability to drive success.
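Picking back up the autonomy spectrum from a moment ago, here is the sketch promised above: a minimal, hypothetical illustration of the tool, consultant, and expert levels using the supply-chain example, where the expert level acts with agency but still asks a human for permission. None of the names here come from a real product, and the toy forecast stands in for a real model.

```python
from enum import Enum

class Autonomy(Enum):
    TOOL = 1        # predicts demand when asked
    CONSULTANT = 2  # also recommends what to buy
    EXPERT = 3      # drafts the order itself, then asks permission

def predict_demand(history: list[int]) -> int:
    # Toy forecast: average of recent weeks; a real system would use an ML model.
    return round(sum(history) / len(history))

def handle(history: list[int], on_hand: int, level: Autonomy) -> str:
    demand = predict_demand(history)
    if level is Autonomy.TOOL:
        return f"Predicted demand: {demand} units."
    to_buy = max(demand - on_hand, 0)
    if level is Autonomy.CONSULTANT:
        return f"Predicted demand {demand}; recommend buying {to_buy} units."
    # EXPERT: the agent acts with agency but still asks for permission,
    # a human-in-the-loop gate that stops short of fully autonomous AI.
    approved = input(f"Agent wants to order {to_buy} units now. Approve? [y/N] ")
    return "Order placed." if approved.strip().lower() == "y" else "Order held for review."

if __name__ == "__main__":
    print(handle([100, 120, 110], on_hand=40, level=Autonomy.CONSULTANT))
```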
0:49:49.491 –> 0:49:59.651 Nathan Lasnoski We're working with organizations all the way up to very highly regulated customers, as well as non-regulated, kind of anywhere in that space, for an M365 Copilot adoption. 0:50:0.451 –> 0:50:6.51 Nathan Lasnoski If you are thinking about going down the lane of Copilot adoption, we are a great company to work with there. 0:50:6.901 –> 0:50:20.891 Nathan Lasnoski And then the third is this idea of chatbot use case exploration: I'm looking to think about not only what I should build, but maybe I'm in the middle of building it. Interestingly enough, some very highly regulated chatbot 0:50:21.91 –> 0:50:29.131 Nathan Lasnoski scenarios have come to our table recently that we're helping companies collaborate on. I can't give you the details of them, but some amazing use cases. 0:50:30.771 –> 0:50:36.251 Nathan Lasnoski And it's even expanded beyond chatbots to the idea of an AI agent that's performing an activity within your environment. 0:50:36.571 –> 0:50:38.331 Nathan Lasnoski So many opportunities for next steps. 0:50:38.411 –> 0:50:42.131 Nathan Lasnoski I would love for you, as you're leaving today, to do two things. 0:50:42.131 –> 0:50:50.611 Nathan Lasnoski We'll get to your questions, by the way. As you're leaving today, I want you to fill out the survey. Please select one of those options and give me feedback. I want to know 0:50:51.231 –> 0:50:57.271 Nathan Lasnoski what you loved about the session, what you didn't love about the session, and what we can do better. 0:50:57.591 –> 0:50:59.151 Nathan Lasnoski Give me that feedback. I wanna hear it. I wanna know it. 0:50:59.991 –> 0:51:9.391 Nathan Lasnoski I wanna be able to react to what you're learning and not learning so we can do more. These next sessions are other sessions coming from our in-person events. 0:51:9.391 –> 0:51:12.151 Nathan Lasnoski So, "Why You Shouldn't Invest in Gen AI": 0:51:12.871 –> 0:51:17.671 Nathan Lasnoski this is a great session. Brandon's gonna do a great breakdown of how to even have a decision matrix: 0:51:17.671 –> 0:51:20.191 Nathan Lasnoski how do I choose what to do versus not do there? 0:51:20.931 –> 0:51:22.131 Nathan Lasnoski And then on the 24th, 0:51:23.671 –> 0:51:40.391 Nathan Lasnoski we get into next-gen agents, which compares Semantic Kernel, which is one of the AI agent frameworks, and Copilot Studio, which is a low-code, no-code vehicle for building AI agents that you're already seeing light up in M365 Copilot as well. That is a very interesting session. 0:51:40.391 –> 0:51:43.631 Nathan Lasnoski Tons of value in that as well. So both of those are going to be worth your time. 0:51:43.631 –> 0:51:44.791 Nathan Lasnoski Make sure you sign up for them. 0:51:45.111 –> 0:51:46.871 Nathan Lasnoski They're on our website right now. 0:51:47.271 –> 0:51:50.71 Nathan Lasnoski OK, I promised you I'd take some time for questions. 0:51:50.851 –> 0:51:54.851 Nathan Lasnoski I would love to answer them now, so I'm going to go check out the Q&A and chat 0:51:56.131 –> 0:51:58.771 Nathan Lasnoski and we'll see what is there. 0:52:0.291 –> 0:52:0.931 Nathan Lasnoski Give me a moment. 0:52:2.491 –> 0:52:2.691 Nathan Lasnoski OK.
0:52:2.691 –> 0:52:6.571 Nathan Lasnoski Oh, well, thank you for the note about the screen. I see that. OK. 0:52:8.121 –> 0:52:13.121 Amy Cousland I don't see anything yet, but if anybody wants to add any questions, we still have a few minutes. 0:52:14.531 –> 0:52:15.691 Nathan Lasnoski Right. I will just chill. 0:52:17.251 –> 0:52:19.331 Nathan Lasnoski Please drop them in there if you have questions. 0:52:19.331 –> 0:52:21.891 Nathan Lasnoski I'm here for you and would love to answer them. 0:52:39.121 –> 0:52:42.841 Nathan Lasnoski OK. That means you learned everything, and I'm so happy about that. 0:52:43.241 –> 0:52:52.881 Nathan Lasnoski But if you didn't, and you wanna have more conversations, we would love that as well. Please fill out the form, and we'd love to talk to you after the session is over. 0:52:53.321 –> 0:52:54.841 Nathan Lasnoski Yes, exactly. Thank you, Mark. 0:52:55.201 –> 0:52:59.401 Nathan Lasnoski AI experts, please fill out the form; we'd love to chat with you. 0:52:59.401 –> 0:53:0.441 Nathan Lasnoski I'd love to connect with you. 0:53:0.441 –> 0:53:4.441 Nathan Lasnoski Hit me up on LinkedIn, and let's go on to the next steps and talk more about this. 0:53:4.441 –> 0:53:8.281 Nathan Lasnoski So I'm just thrilled that you spent some time with us, and I'm looking forward to more. 0:53:8.361 –> 0:53:11.61 Nathan Lasnoski Have a great afternoon and a great day, and we'll see you soon.