Midwest Azure OpenAI Summit – Brookfield, WI: Watch the Recording Now!

Join us for this exclusive in-person event!

The Midwest Azure OpenAI Summit is a must-attend event for businesses looking to gain a competitive edge by leveraging the power of artificial intelligence. The summit will feature a series of sessions that will provide practical insights and strategies on how to apply AI to drive revenue, operational savings, and growth for your business. Our program will be packed with key AI innovators discussing a range of innovative and thought-provoking topics including:

  • Keynote Speaker Generac CIO Tim Dickson: Hear how an inspiring IT leader is applying AI in his business, and learn about practical, real outcomes driven by AI that create new revenue or operational savings.
  • Top Production Use Cases for Revenue and Operations: Curious about the top use cases and how to apply them to your business? The Concurrency team will walk through production use cases from completed client engagements, showing scenarios ranging from generative AI to supply chain. Get the chance to experience the platforms for yourself with interactive demos. We’ll also cover vertical-specific use cases in Manufacturing, Logistics, ISV, and FinServ.
  • How to Get Started with AI. What Works. What Doesn’t: So… you want to get started with AI, but aren’t exactly sure how? Concurrency will walk through learnings (with special guests) on how to get started with envisioning, creating your priority list, and attacking your first pilot. We’ll also spend time on scaled deployment models, security, ML Ops, and ethics patterns that accompany the topic.
  • Copilot, Semantic Kernel, Bing Chat Enterprise, Custom – Where do they Fit: How do commodity AI and company-specific AI fit together? Do they coexist and how do you avoid investing in the wrong thing? We’ll deep dive into Microsoft’s new commodity capability releases and talk about the future of AI in the context of applying effort into mission-enhancing custom AI that is complemented by commodity uplift for every employee.

Complimentary lunch will be provided for all in attendance.

Transcription

So thank you, everyone, for coming. You’ll notice on the agenda we have a break at 11:15. We’re gonna kick this off by talking about use cases. Then we’re gonna have Tim Dickson from Generac, and we’re gonna do a little bit of an interview-style walkthrough of everything that he’s learned through his AI pursuits. There’ll be a lot of opportunity for you to ask him questions as well, so get those ready as we go through this first one. I’m actually gonna show you some of the things that we’ve done for him, so you should be ready to ask him about what he has experienced. After we get through a little bit of interview, we’ll have a break. If you need to use the restroom, you can do that at your own leisure or at that point. Then we’ll talk about ways to get started, and that’s where we’ll pull in some people, like Brandon, one of our lead data scientists, to talk a little bit more about that. We’ll have lunch, and after lunch we’ll have the opportunity to talk about commodity versus custom, which is going to be a really fascinating conversation considering how much is dropping right now. So that’s our day. It’s gonna be a whirlwind tour. You will be drinking from the fire hose, so ask as many questions as you want. We’re trying to keep it on track; if we go a little bit over, we’ll make adjustments as we go through the day. Let’s introduce ourselves. My name is Nathan Lasnoski. I’m Concurrency’s Chief Technology Officer. I’ve been with Concurrency for about 22 years and have been doing data science work for about 7. This has been one of the most exciting points of my life in the technology space. I’ve never talked to so many executive teams about the transformation that AI is driving in their businesses. This is such a tremendous opportunity for all of us to think about our businesses in a new way. So I’m really privileged to have the opportunity to talk with you today, and I wanna introduce Brian. Yeah, I’m Brian Haydin.
I’m a solution architect at Concurrency. I’ve been here for six years, and we’ve been doing data science pretty much since I got here; we’ve had data scientists on staff. This last year has been really incredible with the leaps and bounds in the technology, and I’ll share that with you. Awesome. This is one of only a few AI events happening in the market area. Just two nights ago, Milwaukee Tech had the kickoff for its AI community group. So if you’re looking for more things like this, there is a ton getting started. We’re gonna be launching more events here and more events in the Milwaukee Tech community, so keep track of all the things that we’ve been talking about and future opportunities to chat about AI. Alright, so let’s talk about production use cases. There are three things I want you to take out of this session; let me encapsulate it down to a few main points. The first is that the AI opportunity is real. There are only a few major transitions that we experience in our lifetimes, and we have the privilege of experiencing one right now. Think back to the major transitions we may have experienced. The Internet? What else has been more monumental than that? Maybe the Industrial Revolution. One of the biggest opportunities here is, in a sense, enabling humans in a way that was undone by the Industrial Revolution. Think about what that did: it turned people into cogs in machines, right? It took things that were craftsmanship and turned them into opportunities to produce from a manufacturing line. We now have the opportunity to force-multiply people. That’s essentially what AI does: it force-multiplies people. It doesn’t remove people. It doesn’t replace people. It allows us to take the ingenuity and creativity of each individual and take that to the nth degree by driving more capability within our business.
And what we’ve found, not only from these last six months since ChatGPT became a thing, but from the last seven years, is that AI produces real outcomes within organizations. The biggest change now is that the bar to get in is lower than when we first started doing this. I’m going to show you, it’s funny, we were talking about a project that we did for this company, Clover, and there’s a video of it that I won’t really go through too much, but it talks about the optimization of their supply chain. Relatively old story; I think we started that one seven years ago or so. The bar to get into that project was about $1,000,000, and if we had to solve it again today it would be tremendously more efficient. So the thing you should realize is that for most of the use cases we’re showing you, the bar to get in is very attainable relative to the ROI that’s possible. So the opportunity is real. Second, operational savings tend to be the lowest of low-hanging fruit. These tend to be the things where, if you can quantify it and it happens a lot, you can apply AI to it, especially if you have the data and you have sponsorship behind it. So operational savings tends to be low-hanging fruit, and you’ll notice that some of these use cases drive efficiency within the organization. And the last point is that revenue opportunities exist in both incremental and disruptive innovation. I’ll describe the difference between those in a bit, but revenue opportunity or operational savings is truly what you have to understand. If you’re going to pursue any of these initiatives, the only reason you do it is because it pays off. You don’t build an AI model just for fun.
You build it because it provides benefits to your organization, either by driving operational savings and force-multiplying your employees, or by creating new revenue opportunities to drive your top-line revenue. It enables every one of us to think like business people: how can my customer benefit from what I’m doing? That’s what you’ll notice in most of the use cases we cover today. Anything to add? No, I think you hit it. I’d probably add a little bit: this is a new and emerging technology, and not every path is gonna be a home run every time. But you’re gonna get something valuable out of whatever engagement you take on. Even the projects that didn’t meet their objectives still provided real value to the organizations we were working with. That’s what I would add. OK. So this is gonna feel like a joke, but I think it’s an important thing to understand. If I were to articulate my own personal experience with the AI revolution, it’s this: this experience isn’t good anymore. I was listening to a talk on CNBC, and the CIO of this organization came on and was talking about their initiatives. I thought, oh, I’m really interested in what they’re doing, so I went to their customer service page and I found this. At one point this was really innovative: I don’t have to pick up a phone and call someone to get a response. I hate waiting on hold. Most of us do as well. We hate call trees. We hate waiting to talk to one person only to be handed off to someone else. So creating a form that would have someone call me back if I have a customer service issue, like I can’t get into my hotel, would be really meaningful. But what have I now come to expect?
I’ve now come to expect an experience that’s far greater than this, because this experience, although asynchronous, still requires me to pick up the phone. It still requires me to take energy and invest in this relationship at a point where I’m already mad. So here’s an alternative experience. I showed up at a hotel in Minnesota that I usually stay at, asked for my hotel room, and they said I didn’t have a room for the day. I said, why? I made this reservation earlier today. My lack of pre-planning, but I made it earlier today. Well, what I found out was that I had actually made the reservation for two weeks out. I accidentally clicked the wrong date, and because I had made it through a booking platform, like most of us do, they couldn’t just change it on their end without me paying more money. So I go to the app for that booking platform, and it has a chat experience. I go to the chat and say, hey, I’m at the hotel and I booked for the wrong day, and they say, no problem, I’ll take care of that for you. Within seconds there’s a phone call to the guy at the desk; he picks it up, and he’s talking to a person from that booking platform. They ask me on the chat if I’d be OK with a slight change in the charge. I say yes, he corrects it, hangs up. I go back up to the desk and say, I think it’s fixed now. He’s like, yep, we’re all good to go. That kind of asynchronous experience, for me, was much more capable than what I would have gotten with something like this. What they did was transform my experience. I have numerous examples where they’ve replaced something that would have interrupted my day with something that makes my day much more effective and requires less overhead, making the customer’s experience better and creating the right value for them.
So this isn’t good enough anymore, but here’s what also isn’t good enough, and you may not be able to see the bottom of the screen. We’ve all experienced terrible chatbots, and this is a chatbot for the Hyundai Tucson, a very recent platform as of July 21st. What kind of gas should I use in my Palisade? The answer: can you please say that in a different way? I answer best when asked short, general questions. I’m asking what kind of gas to put in your car. You should probably know that, right? It should be able to answer that question. Similarly, I was building this other platform for a company and the search experience wasn’t very good, and the engineer was arguing, yeah, it’s actually working hard, it’s bringing back data, it’s doing a good job. And I’m like, you can’t think of it that way. Your comparison is Google. If you can’t provide a search experience that’s on parity with Google, you’ve lost; the customer immediately doesn’t like it. That’s why everybody hates the SharePoint experience, because the SharePoint search experience is universally bad. You’re now held in comparison to ChatGPT. If your experience looks like a chatbot but it just gives you five options you gotta click on, that’s not really a chat. It’s more of a WYSIWYG pick-from-the-options thing. A chatbot is conversational, and this conversation failed. It might have tried to drive me to certain things, but it didn’t. So you’re gonna be measured, and you have to measure up against the best. Now, thankfully, we have tools in the toolbox. So as we talk today, there are two different types of innovation that you need to be thinking about. One type, where your first projects will probably come from, is incremental innovation, and this is: I already do this today.
I’m just gonna do it faster. So yesterday we were talking with a company that quotes additive metal manufacturing, and about 10% of those quotes are automated. The conversation was: how can I turn that 10% into 50%? It’s not about making it 100%; I just need to make a meaningful change in the number that I automate. That’s incremental change: instead of having to hire 10 more employees to do quoting, I can keep the current number I have, maximize their capability, return faster for my customer, and win more deals. That’s incremental innovation. Disruptive innovation is where I change the game. We’re working with another company that is automating the loan business. Think about what happens when you go to get a loan: you press the blue button, a session starts, and you get your free mortgage quote, right? Well, it’s not free; once they have to do the hard pull, it costs about $2,000 to do a mortgage. What this company is doing is taking the loan officer out of the quoting process. They intend to not pay loan officers for quoting mortgages by automating the entire front end of that process, and they’re using chat to do it. That is disruption. You take that and you change the entire market. So think about your business in the context of both of these things. How does the mission of my business change when I apply AI to different elements of the business in a way that allows me to engage my customers differently? That’s really what we’re getting at today with these use cases. One of the things we tend to come out of these conversations with is something called an idea registry, and the most important thing, which we’ll talk more about in the context of how to get started, is that you have executive alignment through to your individual envisioning sessions.
What you can see here is that we’re capturing the idea’s name, a description of the idea, a value category, and how hard it is, then setting priority based on category, and ultimately selecting where you’re gonna place your bets. And again, the biggest benefit is that the bar is lower to get into these bets. Four years ago I would have told you each one of these is a $1,000,000 idea, so it had better be worth something like $4,000,000 to you; in this context, that cost starts to drop. What’s really important is that your organization is able to create something that allows you to prioritize the right ideas to make your investment against. Brian’s gonna talk a little bit more about that in the getting-started session, but Brian, any quick thoughts on things we’ve learned from this? Sure. In terms of the value analysis, one of the things I’ve really enjoyed about doing these, and I’ve probably done 10 or 15 of them with different organizations, is getting the business users engaged in the art of the possible, in the ideation. It’s really fun to go through. This is so old school, but maybe 15 or 20 years ago I was working at a financial institution that was about to go belly up, and they basically had everybody stop working for two days to come up with ideas about how to save money. What we’re doing today kind of reminds me of that, not that your companies are going belly up, but it’s getting people to think: what are the pain points we have in our organization? What are some of the manual things I just hate doing every day? Getting people thinking about that, and then us helping them understand: yeah, that’s actually something we can do, and it’s really going to take four to six weeks to get you started, to get you progress on it.
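To make the idea registry concrete, here is a minimal sketch of what one could look like in code. The field names, the value-to-difficulty scoring rule, and the example ideas are all assumptions for illustration, not the actual registry the speakers use:

```python
from dataclasses import dataclass

@dataclass
class Idea:
    name: str
    description: str
    category: str            # e.g. "operational savings" or "revenue"
    est_annual_value: float  # rough dollar estimate from the envisioning session
    difficulty: int          # 1 (easy) .. 5 (hard)

def prioritize(ideas):
    """Rank ideas by a simple value-to-difficulty ratio (one possible scoring rule)."""
    return sorted(ideas, key=lambda i: i.est_annual_value / i.difficulty, reverse=True)

# Illustrative entries only, loosely echoing use cases mentioned in the talk.
registry = [
    Idea("Quote automation", "Auto-draft quotes from inbound email", "revenue", 400_000, 2),
    Idea("PO matching", "Match POs to orders automatically", "operational savings", 250_000, 3),
    Idea("HR benefits chat", "Answer benefits questions from plan docs", "operational savings", 80_000, 1),
]

for idea in prioritize(registry):
    print(f"{idea.name}: score {idea.est_annual_value / idea.difficulty:,.0f}")
```

The point isn’t the math; it’s that writing value and difficulty down per idea forces the executive-alignment conversation about where to place bets.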
It’s incredible. Absolutely. One thing that spurs for me: when you think about the disruptive side, one of the most important ways to think about it is, if I were starting my business from the ground up, from scratch, how would AI influence the way I went to market and built my business? Say I had no employees, no infrastructure, nothing but money. How would that influence the way I built my business from the ground up? Don’t take the sunk cost; think about what you would build starting now, and consider that in the context of your competitors, because your competitors are gonna think that way. OK, so let’s start talking about types of ideas. These are the five biggest buckets on the generative side, think large language models, the scenarios we see most often, and I’ll talk about the broader use cases in a minute and show you examples. On the left-hand side we have the customer service scenarios and the order status scenarios, which kind of go together. In a sense, this is about reducing the time a customer spends when they’re curious about something, when they have a question, when they need answers, when they have friction, that period where they’re frustrated with you and you’re losing the relationship. How can I reduce the time it takes to provide them with a satisfactory response? Sometimes that’s simply about giving them asynchronous experiences where they can get the answer. Sometimes it’s about arming your customer service teams to answer them faster by getting the answer faster. And sometimes it’s about predicting the data underneath it so you have a reasonable response to give them. Sometimes the reason we don’t respond to a customer is that we don’t know for sure.
Well, what if I could give a reasonable answer with a percentage or a confidence that gets us to the point where we can answer the customer’s question with more data than we had before? So a lot of these use cases are about easing the relationship with the customer, helping them get their answer faster. This middle one, sales quoting, is extremely popular, and one of the reasons is that it’s about driving top-line revenue. It’s about driving my customer relationships further and faster because I am more responsive than my competitors. One of the customers we talked with went from taking four hours to create a quote to only minutes. This particular business is metal production, and the person who delivers an accurate quote the fastest is usually the one who wins. So if we do that, we create more revenue faster. This right-hand side here, data mining, is really splitting two use cases. The first is about surfacing information. Sometimes this is internal ChatGPT-style instances; sometimes it’s internal scenarios where you’re surfacing healthcare information, or what days do I have off, or tell me about the product I’m trying to sell. Maybe I don’t know, say, the drafts of all the boats, and I wanna answer that question quickly for my customer. It’s also something where people want to ask questions of data. Someone might say, tell me my top 10 customers, and you say, well, I’ll go get that for you, I’ll create a report. No. What if I could just ask and get the answer to that question? That changes the game. And then this final item on the right-hand side: process automation. Think about it like this: anything that happens a lot is a great candidate; if it doesn’t happen a lot, it’s probably not a good candidate. If it happens a lot, it’s meaningful. One of the items on this previous page:
Automation of PO and order matching for 400,000 to 500,000 orders a year. Not a huge company, but they do a tremendous amount of matching of orders. What happens if they automate that 90% of the time? Tremendous impact. That’s what process automation is all about: driving benefit now. Some people see this and think RPA. The difference between what you can do now and what RPA could do before is that RPA is now enabled with artificial intelligence to recognize what it couldn’t recognize prior. We can look at an object or a set of text or a process and do more with it than what RPA used to be able to handle, so process automation is a tremendous opportunity. Yeah, I’m working with a call center where we did some analytics and identified five use cases that would cover 80% of their call volume, and we could automate those five use cases really easily. So it really can take a load off and allow people to do more meaningful work. As you can see here, and I won’t cover all of these, there’s a variety of other use cases that have been around for a long time that are now enabled as a result of generative AI. So think of generative AI as the front end, and sometimes the back end, of a lot of these use cases on the left, like supply chain optimization, predictive maintenance, smart warehouses, or smart products that create more intuitive experiences and lower the bar for your customers to engage. AI is not a new thing. It’s just a thing that’s now much more accessible to organizations, and the business realizes it. This is a list of the top ten industrial use cases. It’s interesting that predictive maintenance is at the top.
A word about predictive maintenance: it works if you built the smart product, or if you have a tremendous number of things that all happen the same way and you have the instrumentation deployed to do something with it. Where it doesn’t work is where it’s cheaper to just replace the bit or the saw or the element on the line every three or four months than to actually do prediction. So it’s interesting that it’s at the top, but it’s meaningful because a lot of companies benefit from it, along with quality, supply chain optimization, and cybersecurity; we’ll show an example of that later. Alright, let’s go into some examples, starting with customer service. This is an application that we built for Brunswick boats. You can see on the right-hand side a standard specification of a boat; it has things like propulsion and speed and draft and so on. On the left-hand side you can see the question we’re asking. In this case it’s: tell me about the Sundancer. See that right there? It’s giving me an answer, but the element you should recognize is that it gives me the reference, so it always shows me where I can find that answer within the document. Why is that important? It’s this whole hallucination topic, right? Is it giving me the incorrect answer? So we back the answer up with where it came from: a reference to its source. You can also see things like, what’s the draft? It’s finding that. Or what’s the maximum speed? Interestingly enough, it recognized that wide-open throttle is the maximum speed, so sometimes it figures out interesting things. However, one of the things we’ve learned is that sometimes it doesn’t recognize terms that you have, and that’s where the real work comes in. One of the things we found here is that it takes about two days to create a POC.
It takes more like two months to create something that’s actually production worthy. It’s easy for me to show you something that kind of works. It’s hard to create something that actually works for your end customer, that they get value and a satisfying experience from. I won’t steal the thunder on this one; we’re gonna talk about it in depth in the next session. Yeah, and your mileage is going to vary, right? In the case of Brunswick, giving incorrect information to somebody who might be on a phone in the middle of Lake Michigan, that could be a problem. But for somebody doing online shopping, having a little bit of a discrepancy or error isn’t that big of a deal. So it really depends on the use case and how far you wanna refine it. Your mileage will vary a little bit. Exactly. Yep, but you can see how you can create beautiful experiences that are very user friendly and provide more than just a simple chat experience, something that really gets people the information they’re looking for faster. Next use case, and you can pepper Tim with all sorts of questions about this one: the Generac use case. In this case we have a person asking how to shut down their standby home generator, and we are providing an AI-generated response to the customer service rep, who can then respond to the individual making the call. That tends to be the right way to start these kinds of projects. You probably wouldn’t want to give that information directly to the end customer until you’ve become satisfied that you’re achieving the necessary amount of accuracy in the responses. In this case we’re giving the response to the customer service rep; they have the chance to validate it, tell us whether it’s working or not, and see what the source was.
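The "always show the reference" pattern from the Brunswick example can be sketched in a few lines. This is a toy illustration, not the actual implementation: the keyword-overlap retriever stands in for a real embedding search, and the document chunks, paths, and specs are invented:

```python
import re

# Invented document chunks keyed by a citable location.
SPEC_CHUNKS = {
    "specs/sundancer.pdf#p3": "Sundancer 320: draft 3 ft 3 in, max speed at wide open throttle 42 mph.",
    "specs/sundancer.pdf#p5": "Propulsion: twin 350 hp sterndrives; fuel capacity 200 gal.",
}

def tokens(s: str) -> set[str]:
    """Lowercase word tokens, punctuation stripped."""
    return set(re.findall(r"[a-z0-9]+", s.lower()))

def retrieve(question: str):
    """Pick the chunk with the most word overlap with the question
    (a stand-in for vector similarity search)."""
    q = tokens(question)
    return max(SPEC_CHUNKS.items(), key=lambda kv: len(q & tokens(kv[1])))

def answer_with_reference(question: str) -> str:
    ref, passage = retrieve(question)
    # A real system would hand `passage` to an LLM to phrase the answer;
    # here we return the passage itself, plus the reference for verification.
    return f"{passage} (source: {ref})"

print(answer_with_reference("What is the draft of the Sundancer?"))
```

Returning the source location alongside every answer is what lets a rep (or an end user) check the model against the document instead of trusting it blindly.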
Then they can provide that response back to the customer. This is something that’s in production with their customer service team today. Now for something really interesting. In the upper right-hand corner, what you see is a diagram of IKEA instructions to put together a shelf. Just within the last couple of days, GPT-4V dropped, which is a multimodal interface. Now, does it solve all the problems we have with multimodal diagrams and documents? No. Does it advance things? Yeah, because I gave it that diagram and said, tell me what’s happening in step 7, and it gave me a pretty reasonable response. There’s no key, well, a very minimal key, no words to describe it, and it interpreted the document and gave me information about what’s happening in step 7. Why is this important? Because it even further lowers the bar for us to be successful when interpreting diagrams, product manuals, instructions, how-to questions, and where-do-I-find kinds of questions. I was looking at a manual yesterday, an image of the cockpit of a boat, and I said, show me where the throttle is based on this image. First I put in the image without the key, and it couldn’t answer, but it said most images like this have a key, which was interesting, that it even knew there would have been a key. I put it in with the key, and it correctly identified the throttle out of the 16 or so different options I was pointing toward, and where it was in the boat. It’s fascinating how low the bar is dropping in terms of capability, and in my testing I didn’t get it to answer the question incorrectly. So it’s lowering the bar and enabling us to do more. It’s not a magic wand, but it’s amazing how far multimodal will take us. Here’s another type of interface for customer experience.
Notice that these are going after the most annoying experiences we can have. What do we hate about healthcare? Pretty much everything, but what do I hate most? Dealing with health insurance companies. When you deal with a health insurance company, you usually have to call someone, you’re in a call tree, you’re waiting, the person doesn’t really wanna be working there in the first place, and you get this really challenging experience of trying to figure something out. What if I could just go to a chat and have it provide the answers to what I’m curious about? What is my deductible? Is this covered? Can I do this kind of procedure? This is where there’s opportunity to enhance the experience. This example is asking that same kind of question about deductibles. It’s actually relatively easy to deploy this inside every company represented in this room, because you all have documents that describe your health coverage, and you have people who answer those questions. Do you think they like answering those questions? No. Sometimes they don’t even know the answer; they have to go find the document to answer it. What if I could just arm the company with the answers by grounding a model on that document? These are the kinds of things we can enable to let the people in human resources do what they love doing, which is helping people be the best versions of themselves, as opposed to answering healthcare questions. The other kind of example: what happens if I have an X24 code in my F-150 truck? No problem, that’s a spark plug misfire. You can then enable that with different types of adjacent experiences you might have a relationship to. A lot of these are B2C, or business-to-customer-service-to-consumer, kinds of experiences, enabling either my direct customer or the business I’m working with.
Sometimes the customers we’re working with are B2B, so they sell through distributors. Same kind of scenario: the distributors have all the same questions. You go to your repair mechanic, how are they getting answers to their questions? They’re going onto forums, trying to figure things out, and that’s why the research takes so much time. What if you could arm them to answer those questions faster? Alright, I’m gonna let Brian talk about this one, because it ties directly to one of the use cases you’ve been hitting on: call analytics, customer call scenarios, understanding the results of calls. Alright, yeah. So like I mentioned before, we’re working with this call center, and there were a couple of different ways we were gonna tackle the problem. Some of the communications come in through email, so we can monitor emails that way. But a lot of the interactions are voice; somebody’s actually on the phone talking with somebody. So we’re doing real-time transcription of the call and providing feedback to the user in real time, as if you’re constantly talking to ChatGPT, and we’re trying to interpret or predict what people are doing. Another aspect of this, just to start things off, was: what can we learn about these conversations? Maybe not in real time, but can we score these things? So we look at three different metrics. First, generally speaking, when the call was terminated, did the user get what they asked for, simply yes or no? Second, the sentiment analysis: was it a good call or a bad call? And the last thing we did for them was train the model to predict whether we wanted a manager to go ahead and review the phone call.
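The three call-scoring metrics Brian describes (did the caller get what they asked for, sentiment, and a review flag) could be sketched like this. The keyword heuristics are stand-ins for the LLM or trained classifier the real system would use, and all names here are illustrative:

```python
from dataclasses import dataclass

@dataclass
class CallScore:
    resolved: bool      # did the caller get what they asked for?
    sentiment: float    # -1.0 (bad call) .. 1.0 (good call)
    needs_review: bool  # flag for a manager, instead of random sampling

def score_call(transcript: str) -> CallScore:
    """Score one transcribed call on the three metrics described above.

    A real system would ask an LLM (or a trained classifier) each question;
    the keyword rules below are placeholders so the sketch runs on its own."""
    text = transcript.lower()
    resolved = "thanks" in text or "that fixed it" in text
    negative = sum(w in text for w in ("angry", "cancel", "supervisor"))
    positive = sum(w in text for w in ("great", "thanks", "perfect"))
    sentiment = (positive - negative) / max(positive + negative, 1)
    # Review anything unresolved or clearly negative, rather than a random sample.
    needs_review = (not resolved) or sentiment < 0
    return CallScore(resolved, sentiment, needs_review)

print(score_call("I want to cancel and speak to a supervisor."))
```

The design point is the last field: replacing random call sampling with a targeted review flag is what surfaces the calls where coaching is actually needed.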
So most call centers have a certain percentage of random calls that get reviewed. This was a way for them to really capture where the training needed to happen, and provide some efficiencies and coaching that way. A really good use case. I was surprised by how often this use case comes up; I wasn't predicting it would be one of the main ones, but for large companies you can see how understanding the quality of their interactions with customers is really worthwhile. Yeah, and it's an area of the business with such high turnover, so training is a big issue there. OK, let's talk about accelerating sales to close. There are a couple of scenarios where this has become really meaningful. The first is creating quotes for customers in a way that reduces time. It may also be about increasing accuracy: sometimes quotes are accurate, and sometimes they have to be redone after they get to the actual production team. That was the other use case, checking things to make sure the quote was actually right; you want to catch that higher up, at the start. What's one thing we hate? We hate going backward in a process. People are really good about continuing forward. If I'm at Disney World and I know it's a 30-minute wait for the Batman ride (that's actually Six Flags, but there's a 30-minute wait for Batman), I want to do the Batman ride and I'm OK just continuing forward, because my expectation is set. But if I get there and I'm told, oops, sorry sir, you now have to go over to this other new line and restart, well, I get really upset. It's the same with customer service experiences: if I have to go back and explain it all again, I hate that. This is all about making sure that once I go forward in the process, I don't have to go back to correct it.
So: sales quote generation, sales quote accuracy, complex quote scenarios. If it's difficult to put together, can I do a lot of small things to accomplish a big thing? For example, we solved easy quotes for a customer, and I'll show you in a second. Once we solved the easy quote, we could combine a bunch of easy quotes to solve the complex quotes. Then there's the sales asset helper, which is "my customer has questions about X," and that goes back to the Brunswick use case: I want to know about the draft of the boat, I want to know about this, I want to know about that. How quickly can I enable my salesperson to answer those questions? So here's an example, once I get Zoom to cooperate. Down in the lower corner, the request asks: "Can you get me the following items for a new set we are designing? I need this ASAP." You probably can't read it, but what it ultimately says is 3301 stainless sheet, cold rolled, 122811 gauge, then a series of numbers which describe a piece of metal, a size of metal, that needs to be cut, produced, and delivered. It's unique to this company. I can't screw it up, but it's the thing I need for every order. So what we built for them is integrated right into the quoting experience inside Outlook, which is where this person lives. You could ask why they get all their quotes in Outlook; that's a different story, but 90% of their quotes come through email. So right here you have the extraction of the critical information from that quote request, which is then run through a machine learning model behind the scenes, which creates a quote right here that the person can either modify, or accept and send right back. And then we measure the extent to which they can accept it immediately, or whether they have to modify it.
And we've been winning incrementally by increasing the accuracy month over month, so that more and more often they can accept the quote the first time. The long-term vision is they want to go headless: chat-centric quoting for a set of use cases, where the customer directly, never having to interact with a person, gets the thing they wanted. How do they get there? They reach a point where they measure how accurate the system is when certain types of content come through, and when it needs an exception sequence to send the customer back out to talk to a real human and continue them forward on the journey. What this did is drop the time to quote substantially. You'll also notice that the text was right in the email, so one of the other things they had to solve was that people send over requests for quote as PDFs that contain several line items. So then we would extract those, turn them into individual items, and produce a master quote. That could be sent back to the customer without a human having to touch it. And these are not easy, right? There are 100,000 different possible SKUs of metal, and if you're producing the front of a dishwasher hood, that hood can only be used on that dishwasher; once they build it, it's gone. They can't reuse it, they can't sell it to someone else. So these are really important to get right, and what they've noticed is that their accuracy keeps going up, because the system can interpret the request and look for faults better than a person. This is across 600 sales reps. Think about how hard it is to train 600 sales reps on custom metal quoting; it's extremely difficult. This is an opportunity to enable all those people with a base set of capabilities that they can then adjust over time. Remember how long it took to onboard someone? It was over a year.
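The first step in that quoting flow, pulling structured line items out of a free-text request email, could be sketched like this. The field names and pattern are illustrative guesses based on the "stainless sheet cold rolled, ... gauge" example above; a real system would use an LLM or a trained extractor rather than one regex:

```python
import re

# Sketch: extract structured quote line items from a free-text email.
# Field names (qty, material, finish, gauge) are invented for illustration.

LINE_PATTERN = re.compile(
    r"(?P<qty>\d+)\s+(?P<material>[a-z ]+?)\s+sheet\s+"
    r"(?P<finish>cold rolled|hot rolled),?\s+(?P<gauge>\d+)\s+gauge",
    re.IGNORECASE,
)

def extract_line_items(email_body):
    """Return one dict per recognizable line item in the email."""
    return [m.groupdict() for m in LINE_PATTERN.finditer(email_body)]

email = ("Can you get me the following for a new set we are designing, ASAP:\n"
         "3 stainless sheet cold rolled, 11 gauge\n"
         "2 galvanized sheet hot rolled, 14 gauge\n")
items = extract_line_items(email)
# Each extracted item would then feed the pricing model that drafts the quote.
```

Requests the extractor cannot parse are exactly the ones that would route to the human exception path described in the talk.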
And we just did another project that's pretty similar to this, not as robust, but it took about two months, so this technology is solved. When we did the original one, it was before ChatGPT, so we had to create customized language models for the business; now it's just so much easier for us to do it. I remember our data scientist built that custom NLP model, and then when we had ChatGPT, we gave it three examples and it was largely getting it right. The CIO said, "That's scary." OK, so this gets to another interesting use case, for a company that builds windows. One of the things you learn about companies that build windows is how often these are custom products, way more than I ever expected. These are fully custom products, and it's all based on how much you know about the schema of producing them. The problem these companies have is that it's all Excel workbooks with macros, and it's a challenge to produce anything really interesting; it's this nightmare they've kept kicking down the road. You ever see that clown car, where they're trying to cram as many people in as possible, with stuff strapped on top? It's like that, except it's a window company. So what they're trying to get to is an immersive experience that lets a person point their phone at a window opening and actually design the window themselves, as an individual. Instead of having the guy come out to your location to do the spec, what if I lowered the bar and engaged my customer directly, letting them play around with this themselves?
So by the time the person comes to the house, the customer has already potentially specced out what they want using the AR experience. Not by putting on AR goggles, because nobody has those, but everybody has a cell phone. Your kids have probably had that phase playing Pokémon Go, right? Where there's something here, but it's not actually here. It's the same kind of experience: an AR experience that provides the answer, removing the need for an engineer to interpret something, put it into a spreadsheet, put that into a design that goes into a standalone document, which then goes into the actual thing they're building, all of which are manual steps. It completely reinvents the experience of engaging you at home, which is far preferable to going into Home Depot or Lowe's and talking to the person at the desk who really isn't that well trained on any one particular window. In fact, that's one of the biggest challenges these companies have: they know that person doesn't really know their products. What happens when you have that conversation? They open up a book with paper pictures of windows and you try to figure it out from that. All these companies are trying to create B2C experiences that work alongside their B2B partners. Here's another example, a quick order: I need 40 pallets of weather-treated two-by-fours. No problem, here's your order, and so on, giving the same kind of experience around complex orders or refunds, the same kind of experience you saw earlier, but for a simple contractor scenario. OK, next: supply chain. So this was the first, how do I say this, one of the first, probably the first really major AI project
we did, the one that became the cascading project for many follow-on projects of similar types, and it was about optimizing the supply chain around demand forecasting. This was a company called Clover. Some of you may have heard this example before, but I think it's really meaningful, and it's one we've now repeated at many other companies. This problem is actually a really old problem that never needed generative AI in the first place; it was a simple, old-school machine learning problem. What it's really about is solving the ratio between demand and inventory. At some points you want more inventory, at some points you want less. During COVID everybody wanted more inventory, right? You wanted the ability to keep producing; everybody wanted toilet paper. But now I don't want 16 sets of toilet paper in my house; I want just enough to get by for three weeks or so. There are points in time where different decisions are occurring and where you want to apply your capital differently. In this case it's about a $2 billion company, and they saved about $80 million over the course of the year in capital tied up in inventory, as a result of optimizing the relationship between demand, inventory, and locations. What that gives you is the ability to apply that capital somewhere else in your business. This kind of optimization is usually very attainable for organizations, because the technology in this space is really fragile: either your ERP system simply doesn't have this capability at all (very few ERP platforms have anything like it), or you have these middle-tier SaaS providers where people send their data in and get some kind of response back.
Sometimes that makes things better to some degree, but it tends to be an endless struggle of switching from provider to provider to get better value. Where I find people eventually land is creating a competitive advantage by building a model that actually understands their business, and that's what these guys did. The goal was to optimize the ratio of demand to inventory via forecasting and ultimately deliver savings year over year, which also let them prepare pretty effectively for COVID when it happened and respond to it afterward. I'll get you this deck; there's a great video of this company talking about what they did. The reason I bring this up is to communicate, again, that AI isn't new. It's just being explored and engaged with now because we can do this kind of project far more cheaply than we used to. That's what makes it so exciting. This CIO took a big bet, said "I'm going to focus on driving my inventory costs down," and spent a million dollars on it. Now you can do this for a lot less investment, which I think is what's really driving a lot of value. All right, let's talk about prediction now. Prediction is something people sometimes equate to predictive maintenance, which is one of the use cases we tend to talk about. But I think about prediction in the context of anything that happens within your business. If I knew X was going to happen, what would I then do about it? It's essentially storytelling, right? What would happen if I doubled my business? What would I need to do? What would happen if this site had an inventory loss? How would I handle it? It allows me to model what-if scenarios. We watch the Marvel show What If?, which is just fascinating; think about that sometimes.
In this context, it's imperative for us to understand what could happen to our business if something happened. Sometimes the question is: what if our strategy succeeds? We all have strategies for our business. What if I could predict what is going to be necessary if that strategy succeeds? So prediction has a lot of different angles. In this case, it's for a company called Herraiz. The video isn't working, but this is a company that does a lot of things, and mainly it melts metal. You've seen Terminator 2? I was reading this article about the times it's appropriate for a man to cry, and one of them is when the T-800 is getting lowered into the molten metal. So sad, right? I see it now. That's what these guys do. They have gigantic vats of molten metal, and what this video shows, though it's not showing right now, and what you can kind of see in these other pictures, is a guy who actually goes up to the vat with what is basically a dipstick and has to check what's happening within the metal, trying to predict when he has to turn it off or change the temperature. He's very experienced, but he only knows what he knows, and I hope he gets hazard pay for having to do something like that. What we built for them is an air-gapped solution that lets them actually predict what they need to do. They only have a couple of these machines, but the waste associated with screwing one up is tremendous, because that metal is only used for one customer; it's custom. And if they have to shut the unit down, it's a week-long process, which means lost revenue over a very extended period, because it takes two or three days to cool down and two or three days to bring the unit back up. So it solves a really expensive problem for them.
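The melt-monitoring idea above, predicting when an operator needs to intervene from a stream of readings, could be sketched as a simple trend forecast. The readings and threshold are invented for illustration; the production system was an air-gapped model trained on real furnace data, not a linear fit:

```python
# Sketch: fit a linear trend to recent temperature readings and estimate
# how many intervals remain before the melt crosses an operator threshold.
# All numbers are illustrative.

def linear_trend(readings):
    """Least-squares slope and intercept over evenly spaced readings."""
    n = len(readings)
    xs = range(n)
    mean_x = sum(xs) / n
    mean_y = sum(readings) / n
    slope = (sum((x - mean_x) * (y - mean_y) for x, y in zip(xs, readings))
             / sum((x - mean_x) ** 2 for x in xs))
    return slope, mean_y - slope * mean_x

def steps_until(readings, threshold):
    """Forecast how many more steps until the trend crosses the threshold."""
    slope, intercept = linear_trend(readings)
    if slope <= 0:
        return None  # not trending toward the threshold
    current = intercept + slope * (len(readings) - 1)
    return max(0.0, (threshold - current) / slope)

temps = [1480, 1484, 1489, 1493, 1497]  # degrees, one reading per interval
remaining = steps_until(temps, threshold=1510)
# An alert would fire `remaining` intervals before the limit is reached.
```

Even this naive version captures the shape of the win: the alert arrives before the expensive failure, instead of relying on one experienced person with a dipstick.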
So remember when I said look for things that happen a lot? Also look for things that are really bad, or expensive, if they go wrong. That's what this use case is about. They then scaled this to other locations. Another scenario, which I'm somewhat familiar with because I just had a baby, though not really familiar with because I'm not my wife, is about prenatal heartbeat detection. Who's been in the delivery room before? Yeah, OK. Or, for those who haven't, maybe you've seen a movie where they have the little strip of tape coming off the heartbeat monitor, and the doctor comes in and looks at the history of it, right? There's got to be a better solution to that problem. Why do they have to look at that on a piece of paper and then try to predict, based on what happened before, what's happening in the future with the baby? What this does is say: here's the view, here are the contractions (maybe I'm giving people too many flashbacks), and here's what could happen in the future based on what we're plotting. So it looks at the data, predicts, and gives a perspective on what's going to happen with the mother and up to four fetuses, which is pretty interesting and really changing the way people can look at that kind of data while helping a mother through that process. So prediction, again, doesn't have to be about a machine failing; it can be about almost anything. OK, hopefully we're not getting example fatigue. You're still with me? Yes? OK, cool. Quality inspection. What you are looking at is the inside of a sewer pipe. I've never seen that myself by crawling around in sewer pipes, but I did
watch The Shawshank Redemption, and it looked terrible. What you can see here is this robot going around the inside of the sewer pipe looking for failures. In the past, a person used to have to watch that video to look for those failures themselves. Now a bot can go down the sewer pipe, come back out, and indicate the little areas where issues were found, along with the period of video where each issue happened. This is for a company called Burgess & Niple, and they use it to speed up the process of looking for those failures, and also to drive accuracy in which ones are actually significant issues versus which ones aren't. So video analysis in the quality space can be a fantastic opportunity, and we'll talk about another example, for J. J. Keller, in just a minute, where we did a POC. OK, order status to action. We're getting there. Order status to action is about what happens when something goes wrong with my order. What I've experienced is that there are two parts of this screen where I can improve the experience for my customer. The first is making this conversational: allow it to be a chat experience I can ask questions of. When is my order being delivered? When can I expect it? What's wrong with it? What happens if I have to change my order at this phase? Can I? These are all questions customers have. You probably have a top-ten list of things people ask while the order is already in the process of being shipped. These are things we can answer via an asynchronous experience, or by arming the customer service team with the right answers. But what you can also see here is the entire state of the order. I don't know if you've ever heard the term Domino's Pizza Tracker, but there was just a really exciting announcement from Domino's and Microsoft
a couple of days ago about their partnership to further enhance the artificial intelligence surrounding order delivery and other activities in their stores. So even the historical Domino's Pizza Tracker still needs work put into it, because if you didn't know this, the first version of the tracker actually had no relation to whether your pizza was being produced or not. It was just a dumb timer. You'd order, they would set a timer, and it had just enough accuracy that it sounded good, but it wasn't actually accurate. Now it's gotten to the point where it is actually accurate. So what you can see here is that there are different stages. What customer service teams have a difficult time answering is: what do I say if I don't know the answer to the question? "My thing's delayed." But how do I know how long it will be delayed? Most customer service teams don't have the information about the business to know that answer; you need someone with pretty broad knowledge of the organization to answer those questions. So it causes delays in responding to the customer, which creates frustration and angst that, on the wrong day, might cause them to go somewhere else. What this allows me to do is say: what if I could predict, like this yellow section, that when this stage gets delayed it tends to be delayed by this period of time? Then I can start to set some level of expectation with my customer based on the information. This is the kind of capability that lets me, as a customer service rep, provide a reasonable answer to my customer, because I have some ability to predict different gaps within the process that are knowable, but not knowable
to me personally, because I don't have visibility across the organization. Now, one of the most common blockers people raise when getting started in this space is "I don't have the data." I think that's a cop-out. "I don't have the data" is a cop-out; "I don't have the data for that use case" is the real question. Most of these use cases are very attainable by getting the data; many are not. This is an example of one where you need an enormous amount of data, because you need to know what happens across the supply chain. So "I don't have the data for this part of this use case" is the right answer. "I don't have the data and I need to spend two years waiting for good data" is not the right answer; that's just not thinking about the entire problem in the context of your business. All right, data mining. Data mining has become very, very interesting, and there are a couple of different scenarios I'll pull out. Maybe they aren't truly data mining, but I think they fit. This left one here is monetizing data. One of my customers is a food distributor. They distribute about $50 billion of food a year, and they know a tremendous amount about restaurants as a result of distributing their food. They know which ones are in the top quartile. They know which ones are profitable and which ones aren't. They know which ones are more likely to stay in business for more than five years; the rate of restaurants that fail within five years is tremendous. This is where they can turn from being simply a food distribution company into a knowledge company. They can charge for premium SKUs, offer premium products to their customers, because they know something about them, even if the product itself is dumb. Food is relatively dumb, right?
I can't digitally enable a banana, but I can digitally enable the knowledge I have about the banana and about my customers. So data mining, in many cases, is about allowing my customer, or my internal organization, to learn something from what I do. It's also comparables with public competitor data, and connections between data, people, and products; identifying those connections is where we can drive tremendous value, and a lot of this is driven by AI. This is a view of me asking a question: show me the revenue by product in descending order. ChatGPT created the SQL code that produced this answer showing my top ten customers. Your mileage may vary here. If you know the answer to the question and you can write reasonable SQL queries yourself, you could probably create something like this for certain types of experiences; others might be too complicated. You have to really think about the kinds of answers you want to provide, but you can do really interesting things. Also notice that Power BI is going to have a lot of this baked in for analysts; there are a lot of interesting things happening in that space. OK, the last one I'm going to cover, and then we'll go on to our interview, is process automation. Anything that happens a lot, is expensive, or lets me create a new line of business. This could be customer contract process reviews; the scenario at the beginning was matching the PO with the order, because I need to produce the right thing every time. It could be supply chain insights, or reviews where I have to look for quality, validating something. Anything that happens a lot. So this is an example of customer contract automation. In this scenario we look at the customer's purchase order. They've sent that purchase order back to us, and it contains information that is not in the same format, or even the same diction, as my order.
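That diction problem, where the customer writes "RLL" and the internal order says "rolled," can be sketched with a synonym table plus a tolerant string comparison. The abbreviation table and the 0.8 threshold are invented for illustration; the real system trains on historical matches and routes low-confidence cases to a human exception queue:

```python
from difflib import SequenceMatcher

# Sketch: tolerant matching of purchase-order fields against internal
# order fields, so "RLL" is recognized as "rolled" before comparison.

SYNONYMS = {"rll": "rolled", "ss": "stainless"}  # invented abbreviation table

def normalize(term):
    term = term.strip().lower()
    return SYNONYMS.get(term, term)

def terms_match(a, b, threshold=0.8):
    a, b = normalize(a), normalize(b)
    return SequenceMatcher(None, a, b).ratio() >= threshold

def po_matches_order(po_fields, order_fields):
    """True when every field pair matches; otherwise route to a human."""
    return all(terms_match(po_fields[k], order_fields[k]) for k in order_fields)

auto_ok = po_matches_order(
    {"material": "SS", "finish": "RLL"},
    {"material": "stainless", "finish": "rolled"},
)
```

The design choice worth noting is the exception path: the system never has to be perfect, it only has to auto-clear the confident majority and hand the rest to a person.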
I have to match that to a production order and the sales order; all three things need to match. Typically I need a human to do that, and to interpret what's in the PO, because "rolled" might be written "rolled" or it might be "RLL." I need to know those are the same thing, where a human would say "yep, those match" or "no, they don't." Now I need to train a platform to do that same thing. If I can create an AI platform that can do that, I can validate the order before production and execute it automatically a percentage of the time, and the other percentage of the time send it to an exception queue for human intervention. That's a scenario that operates 500,000 times. So what if only 10% of those go to a human, as opposed to all of them? Now I've just force-multiplied the opportunities my organization has, and that's probably the way to think about a problem. Your best incremental scenarios are ones where partial success equals success. If yes, then I'm probably doing my first project in the right place. If partial success doesn't equal success, my bar for success is pretty high, and I have to have a lot of stick-to-itiveness around the value of the idea to go down that route. OK, now I'm going to close with two really interesting use cases. This is one we worked on as a demo, or POC, for J. J. Keller, and what we're looking at is distracted drivers. These are truck drivers driving down the road at 80 miles an hour in a multi-ton missile, and I need to make sure they're not going to run someone over or be unsafe themselves. So I'm looking for things like: is the seat belt on, is the driver distracted, is there phone use. This is actually a classical machine learning approach; I had a team, and we had to teach it what a seat belt is.
Is the driver using a phone? All those kinds of things. You can see on the right-hand side it's identifying them: this guy is flagged for phone use and seat belt; he's using a phone, and that's problematic. So these are problems we want to look for. And then I thought, what if I put this into ChatGPT's vision model? "Can you describe this image for me?" "I see a man sitting in a car. The man appears to be middle-aged and thinking." I'm not sure why it thinks he's thinking, but it notes he's got his belt around his arm, which doesn't appear to be safe. "Is he wearing a seat belt?" "I do not see him wearing his seat belt." That's pretty interesting, right? I didn't have to teach it what a seat belt was; it just knew. "What is the current speed of the vehicle, based on what is visible in the image?" "The image indicates the photo was taken while moving at 73 miles an hour." Note that the dash cam captures that as part of the camera in the cab, so even without training, I'm getting interesting things from this vehicle. Now here's where it gets really interesting. You can't see the image very well, but it's the one you saw before: "Is this person driving safely?" I gave it none of the information, just the image. I did this yesterday, put it into Bing's GPT-4 with vision. "Based on the image you sent me, I can see that the person in the driver's seat is not driving safely. Here are the reasons why. The person is holding a phone to their ear, which is a distraction and a violation of law in many states. The speed limit in residential neighborhoods is usually 25 miles an hour or lower." How did it even figure that out? Because I'm in a residential neighborhood, and that's a house. "The car is traveling at 47 miles an hour, which is too fast and dangerous for pedestrians and other drivers."
"The GPS coordinates show the car is in Richfield, WI, where the weather is currently cloudy and rainy, so you also shouldn't be driving that fast while on a phone in a residential neighborhood." Bad stuff, right? What I find really interesting about this is that this is the kind of question you really want to ask. This is the "why." I'm not asking about the speed, the seat belt, or the phone use directly; I'm asking, is my driver safe? And it had to put all those pieces together to answer that question. Now, again, your mileage may vary; this is very new. But I can do it with classical machine learning, I can do it with GPT-4V, and now I can do it even faster, and I think that's just fascinating. A little back story on this: J. J. Keller came to us with this problem. It's not that this kind of detection hasn't been around for a while; it's that the systems are so expensive and proprietary, and nobody wants to share, right? So all these pieces are starting to come together, and literally GPT-4V, which we're talking about here, was just announced this week. This is so fast-moving and so emerging; what it can do is amazing. OK, I lied, this is the last use case: smart products. Smart products apply to anyone who builds something your customer interacts with that has a digital element. I'll ask you to watch this later, but this is a video from a customer of ours, Brunswick. They built a smart boat; they're the leader in smart boating. So they build a boat that enables you to know: is my battery charged? Am I geofenced? Is it going to run out of gas by the time I get to my destination?
There are different elements they're factoring into the boat to enable you to have a great day. One of the things I hate, and I have a boat, and I just talked to the CEO of this company yesterday about this, and he ribbed me because I was trying to figure out what kind of gas to put in the boat, and he said that's obvious, though apparently it wasn't obvious to me. Providing a great boating experience is one of the most important things. You back your boat into the water and the battery isn't charged, and you've got the kids with their life preservers on, and then you've got to get back out and waste four hours of your day fixing something. Everybody hates that experience. Who's had a boat? Everybody loves the experience of going out and tubing. How can I have more fun tubing and fewer annoyances, like figuring out the right thing to put in my boat, or not knowing something was wrong? This is what they're trying to do. That's where predictive maintenance becomes real, where smart capabilities become real. The last scenario is the smart guard service. This is something JCI is working on: the idea of, is the wrong person in my facility? Is there a person in the facility at all? You've all seen the movies: guys sitting at a desk watching these little screens. Is someone in the facility, yes or no? Is someone within the yellow lines, yes or no? With a smart guard service you can, not replace, but augment that person's ability to accurately detect that something is happening. So: detection of incidents, detection of false alarms. One of the things we worked on with a customer was even enabling wayfinding within a warehouse to find parts for picking, and predicting the best routes an individual could take. All right, a couple of closing comments here. There are a tremendous number of examples.
Hopefully what you've taken from this is, "Wow, how do I even pick one? How do I select what's right for me?" That goes back to the beginning, to envisioning and executive alignment, and that's what we're going to talk about in our how-to-get-started session. But hopefully this enables you to think: these are possible, these are real, these are things companies are actually doing, and it's driving value for them. Think about that in the context of the mission of your business, and how you can make a difference in how you execute on your business over the next five years.


Shall we get started with our next section? What does everybody think so far? Information galore, and good? Hopefully. It's great to see so many familiar faces here. My name is Kate Weiland Moores, and I am Concurrency's Chief Operating Officer, privileged to work beside Nate Lasnoski. A lot of this information I've heard many times, and I'm so happy to share it with all of you. We're taking this on a roadshow, so we'll be presenting in Milwaukee, Chicago, and Minneapolis, and it's just really great to see the turnout. I'm especially pleased and honored to introduce our guest speaker, Mr. Tim Dickson. Tim is the CIO of Generac. Some of you may have had the pleasure of seeing him speak prior to this event, but he is really the driving force behind digital transformation and strategy over at Generac for, I think, three-plus years already. Tim is a friend of Concurrency; we've done projects with Generac, and I personally have been just so impressed with his level of enthusiasm and, honestly, his thirst for innovation: transforming Generac, the manufacturing world, transforming their business, resulting in multiple awards. You can take a look and find some of that information yourself on the Internet. I'm just so happy to introduce him. Thank you for being here today. So take it away, guys.

Thank you. Morning, everybody. I was here last year for this, in this same exact room; I spoke last year as well. If you were here, you got a good dose of Generac. Brand new audience, nobody knows my jokes. That's all right.

Well, Tim, thank you for joining us. I would love it if you could start by telling us: who are you? How did you get where you are? What have been your guiding principles at Generac? Let's start there.

Sure. Square one, yeah, so.
Born and raised. Who's ever been to La Crosse, Wisconsin? My hometown. God's country, they call it. Big news: La Crosse is going to start brewing Old Style again, but that's a story for another time. So I moved away after high school, 30 years ago, and came back for this job at Generac three years ago. Along the way I worked at a lot of different companies in a lot of different cities: HP, IBM, Dell, Motorola, Laureate, then Generac, and a few startups in between, across Philly, Raleigh, San Francisco, Austin, Chicago, DC, and then here to Milwaukee. I don't advocate that much moving, and certainly my kids wouldn't advocate it either, but I've learned a lot and worked in a lot of different companies and industries. I'd like to say that Generac was looking for someone who had been there and done that before. Three years ago, as they embarked on their digital transformation, it was just a perfect fit for me, knowing what I know, doing what I do, and being from the state of Wisconsin, to come back and lead Generac through that transformation.

I tell this funny story. If you've ever been to the Milwaukee Mitchell airport, coming off the Southwest gates there used to be two signs next to each other. One sign was Generac: "Our business is power." And that is very much the case. We're still very much driven by power outages and things of that nature, and our products, backup power generation, help you stay resilient through outages. And the sign right next to it said: "Welcome to Milwaukee, the Midwest's coolest, most underrated city."
So I thought, three years ago, if I could come back to Milwaukee, help lead Generac through their digital transformation, and at the same time give back to the city of Milwaukee through community service, hopefully improving its tech profile, that would just be an awesome dream come true for me. So that's why I do what I do, and that's why I'm here.

Generac has been an interesting place over the last three years. I would say it was a lot like other 60-plus-year-old manufacturing companies here in the area: it was ripe for change. They just didn't have a leader who knew what they were doing and was willing to take some risk. It was very risk-averse, very on-prem, very back-office, and it's been a great playground for me to do what I love to do, take a shot at it, and see if we can make a go of it. I would say the biggest thing I brought to the team was those opportunities. What are the events, the opportunities, the circumstances, the things we can do together, that we could try out and just see if they work? That not only gave vendor partners like Concurrency and Nate here an opportunity to partner with us, it gave my team an opportunity to learn these new tools, learn these skills, try things out, and change their career trajectories in ways they never would have had before, and I'm thankful for that as well. When I look back at my three years at Generac: we've probably had seven or eight hackathons, we've probably had 30 or 40 lunch-and-learns with various vendor partners, and we won a few awards along the way. I like to think we've moved, I don't know, 15 or 20 people into new roles that didn't exist in the company before, new data scientists and things of that nature, plus a number of promotions and a lot of recognition, including a seat at the table with the C-suite leadership team. So I'd like to think Generac is in a very good space.
Now, three years later, we're ready for the next level of transformation, all that stuff they talked about here earlier that we're trying out. I'd say phase one of the transformation is complete. We've done some wonderful things that have really moved the company forward. Now we're ready for the next challenge. Here we are.

Talk a little bit about what you think the state of the state is. You're connected with a lot of other leaders in the IT space. Some are moving, some are not. What's been your experience? What's unique about companies that are engaging in AI?

Sure. So look, I'm pretty opportunistic. I look for those moments of confusion and chaos and things of that nature. I'm a big proponent of changing the brand perception of IT in the eyes of our business partners and customers, and any chance I get, those moments of confusion and chaos are opportunities to market IT, to step up and show up like nobody's business. That was the situation with ChatGPT, and there were other situations prior to that. But take ChatGPT: you come back from the holidays wondering, what the heck is ChatGPT? You're not ready for it, and boom, all of a sudden you're hit with this thing. What is it? And so on and so forth. Those are the opportunities I look to seize and use to transform the business, with IT leading. So at Generac we were the ones that basically put our arms around it, dug deep, played with it, put some security around it, got security involved. There's a world of difference between being accepting and being scared, and ChatGPT very well could have been scary for a lot of people, but we were in the accepting camp.
What were the things we could use the tool for that would be accepted in the Generac culture, accepted from a business standpoint, that people could rally around? This new capability, these uncovered possibilities. I speak at a lot of conferences, and I remember back in the March-April timeframe last year, an audience of Wisconsin CIOs was asked, "Who has blocked ChatGPT?" and more than half of the people in the audience had blocked it. I couldn't believe it. I was sitting there stunned, because at that moment you've just killed any chance of innovation, any chance of collaboration, any chance of the art of the possible, any chance of IT stepping up and showing up. The complete opposite of what I'm all about, and I just couldn't believe it. I don't know how many have turned it on since, probably not enough. How could you do that to your teams, to your company?

Here's what happened with us, and this is a little plug for Concurrency. We had Concurrency out at Generac back in the March-April timeframe. We did an art-of-the-possible hackathon with you guys, bringing in subject matter experts around ChatGPT, with Nate and team showing us what folks were doing with it. We had an ideation session and came up with over 350 use cases for ChatGPT, both internal and external. We had a good lunch, we played around, we had a lot of Post-its. It was a fun event, and we left energized and charged up to go do something with this thing. And I think they mentioned we've already launched three of those ideas into production, one of them with Concurrency just this past Monday. I look at everything we've done to inject the company with this new possibility, this new energy, this new era of AI at Generac.
And if you had blocked it back then, you would have cut all of that off, and I just couldn't understand that. Now, is it perfect? No, we're still learning. And one thing I say, and this is one of the reasons folks come up to me: I happen to have an awesome CISO, but I also happen to have an awesome digital transformation leader named Dave Manske, and that is a healthy tension to have on your team. You have one guy pushing the edge, taking risks, wanting to do the next best awesome thing, and then you have the other guy, my cybersecurity CISO, saying, "Wait, let's talk about this. You've got to put some policy around this, secure it first," and so on. That is a healthy tension to have, and I'm very thankful I have both of those folks in the organization, because one, it makes me look good, and two, I know there's always going to be that challenge and tension, and I think that's healthy. Maybe other shops don't have that in their organization, or maybe they're just scared. I don't know. But we got smart about it really quick. We actually created a ChatGPT security policy. I remember this like it was yesterday. We posted the policy on our corporate intranet site; we're a Microsoft shop, so that's Microsoft SharePoint Online. The policy basically said: here's what you need to know about ChatGPT. We don't want you putting our company data in it. We're not going to block it. We're going to create a separate version internally that you can play around with, so we can actually learn what people want to do with it. And if you have any ideas for how this technology can improve the business, go to Dave Manske, one central place in the company to bring your ideas, and go nuts.
And that policy was awesome, because it said the company was willing to take some risk, to get past that risk aversion, to try this thing out, and it also encouraged people to play around with it. I think a month or two later we launched our internal version, built on GPT-3.5, and called it Power Chat. I think we had 5,000 users the first week, thousands of page views, and now we have analytics running on it, so we can actually see what people are doing with the tool: they're putting in code, using it for testing, using it for job descriptions, using it for all these wonderful things. Once again, that's all possible because we didn't block it in the first place. I'm very proud of the team for what we've done. Like I said, it's very, very early days. Out of those 350 use cases we've launched three, so we have a long, long way to go. I have a monthly update with my boss, the CEO, and I gave him an update on AI last Monday. So far we've been sticking to the internal use cases around ChatGPT and OpenAI, but he wants to go nuts. He wants to go external. He wants to put this stuff external-facing, with customers coming to the site. You saw some of the things Nate was showing around predictive maintenance and questions about generators and so on. He wants to go now. He wants to put this on the outside and deliver a great customer experience. That's where we're eventually going. We're not there yet, but when you have the CEO pushing on you to make it happen, it's going to happen. So it's a fun time to be in action.

What's been the process of shepherding those ideas? I know Dave Manske is very in the weeds of shepherding some of those, right?
Exactly: shepherding them from idea to actual, and from a sponsorship standpoint, getting the business behind it. I feel like that's one of the biggest things you need to be successful. I think that's actually one of the reasons the hackathon was so successful. It wasn't just a bunch of IT people coming together. You came into the room, kicked it off, said "we're going to do one of these," and then empowered the group to think. This wasn't just me talking about stuff for the sake of talking about stuff. We were looking at the best opportunities in the company and how to focus our energy toward them, and that gave them the sponsorship. Once they got out of the idea phase, what did you have to do to make sure that was successful?

So, one, and I think people know this, but that event could not be seen as just an IT event. Obviously you guys came in with great partnership there, but we had to involve the business for this to move anywhere: in the customer service space, internally, in some of the corporate communication spaces. The business absolutely had to be involved. So before the hackathon we went and asked my peers, the presidents of the business units, to nominate three, four, five people they wanted to send to the event. By coming to the event, you knew you were going to be educated, you knew you were going to have an opportunity to create some ideas, and you were also going to have an opportunity to vote. Remember, at the event people walked around with Post-its, or colored stickers if you will, and got to vote. We whittled it down from 350 use cases to the top 30, and then to the top three. So we actually used the audience to vote on the things they felt were most important to the business, and those are the ones we did.
So that gave you the prioritization: you got your ideation, you got your involvement, and then you got your top ideas implemented in front of everybody. We'll do another one. And I remember that day the CFO came. My God, when would there ever be a time a CFO would come to a hackathon? The CFO was there like a kid in a candy store, along with folks from engineering and marketing and so on. It was an awesome collaboration with the business. Does anybody know Dave Manske in here, besides me, obviously? He's been on a bit of a roadshow across Generac. Every single business unit head has asked him to come to their team and present. Monday or Tuesday this week, he had an all-hands with HR. The CHRO of Generac asked him to come in and do an hour and a half on ChatGPT for HR. I don't think we had any HR use cases come out of the hackathon, but now they have some. And I just think that's awesome. The fact that the CHRO asked him to take an hour and a half out of a busy HR day, with hiring and everything else going on, to learn about ChatGPT? Something is definitely going to come out of that. There were, I'm sure, a few ideas, and now he's got a new champion in the company. So it's spread like wildfire. But the thing is, if you open up the opportunity for these events, these conversations, these art-of-the-possible sessions, you have to do something about it. You can't do the one piece and not follow through on the other piece. You've got to talk the talk and then you've got to walk the walk. And I'd like to think that now, six months into this...
...this far into the journey with OpenAI, we actually have a huge amount of support in the business, and they come to us in this space. We hope it spreads a little bit, like citizen data science or citizen development, where you don't necessarily have to come to one place to get things going. If you have a private version right there on the desktop, you can prototype right then and there, and share and collaborate on what you're doing with other folks. So I like where we're at.

What have you learned pushing the use cases into production, like the customer service one? What was the tipping point where you said, it's ready to go, it's ready to put in front of people?

Yeah, so I'll share the three use cases we've launched. The first is the private instance called Power Chat. It went into pilot back in August and live in September, so both IT and the business had at least a month or two playing around with it. That was our first launch. The second launch was this thing we call the window of lost conviction. Does anybody here have a generator, by the way? You have a generator? Awesome. Did you buy it after a power outage? No? Well, I'm glad you're here. One of the use cases that came out of the hackathon was this idea called the window of lost conviction. When you have a power outage, the next day or two is when we get most of our leads, most of our requests for quotes, most of our phone calls, and most of our installation requests. The longer it goes after a power outage, the more you lose that conviction. Customers start to wean off and get less interested. So we want to capture that opportunity right when...
...that power outage event happens and it's all about the generator. The window of lost conviction is basically a ChatGPT-powered bot that generates any number of different email templates, so that at each point in time, the day after, two days after, three days after, a week after, two weeks, four weeks, all the way out to three months later if nothing has happened, it sends out campaign emails with different types of messages: ones that coddle the customer in the beginning and encourage them to buy a generator, and ones that get more pointed and persistent three or four months down the road. That drives sales. That improves lead conversion. That improves closure rate. That's real business value, and it's been launched. So that's the second use case.

The third use case is the work we did with Concurrency, called customer service chat. We have customer service agents answering the phones all the time, and when customers call in, we capture that data. So when a customer calls in and asks a question that has either been answered before or that the customer service agent doesn't know, a bot pops up and surfaces the answer, based on three or four years of previously recorded Teams messages from customers calling in. It launched on Monday. Since Tuesday we've had 62 bot interactions, all positive, all accurate, all emoji thumbs-up. That's 62 in about two days. These are real-life use cases, guys. These are really live things in production that are being used. Will they individually completely transform the company? Probably not. Are they going to replace humans in what they do? Probably not; they're going to augment the work that's already being done. But they are...
...showing the possibilities of this technology, they're exciting people about the technology, and they're improving customer experience, closure rates, and key KPI metrics. Absolutely. So this is contagious, and it's going to continue. Can we take a couple of questions from the audience?

I've got lots, but I'll boil it down. What type of impact has this had on your staffing and overall corporate budget, and what sort of QA policies have been adopted to quality-check the AI-driven products?

On the budget, and that's kind of a misnomer by the way: nobody has a budget for this stuff. We do budgets every October, and everybody knows IT gets its budget cut in the second half of the year, especially the last quarter. Nobody predicted OpenAI and ChatGPT, so nobody put any money in their 2023 budget for this technology. So as the CIO you've got to figure it out. You've got to rob Peter to pay Paul, slice it from one project and give it to another. For 2024, the way we'll probably work it, as an example, is to shave off some of the budget from Dave's larger program work, in the hope that a few ChatGPT use cases can fall under a major program. But this has all been part-time work. There weren't any full-time resources set aside for any of this stuff. People are doing this at Generac because they want to learn the tool and see if they can do something really cool with it; it's not anyone's full-time job. We have other things we're doing in the world of AI and so on, so this came out of left field and people just reacted. That's the budget and resource perspective. From a future-of-work perspective, here's where it's going.
Would you ever have seen a job description for a prompt engineer last year? Have you ever seen one from 2022? There was no such thing. Now there is, and we'll be hiring, hopefully, a couple of those. LLM architects too; we're coming up with our own version of that job description. Now there are jobs that are possible and opening up as a result of this skill, and that's exciting. That's a game changer. Data science took 30 years to come to fruition; data science wasn't even a college degree until a few years ago. This stuff happened in a year. So if there wasn't a war on talent already, there is definitely going to be a war for the guys and gals who are sharp in this space. They'll command whatever they want, wherever they live, so good for them. Does that answer your questions?

Yeah, just the other one, on QA. What type of QA?

There's no automation. It's not like Salesforce or something where you run test scripts and so on. This is all manual. For that customer service use case I was telling you about, my team spent part-time hours and weekends QA-ing the thing's responses, even pulling agents off the phones to check whether it answered the question correctly. So it's very manual right now. There isn't any real automation around this stuff yet, the way there is for other enterprise apps.

Next question: by role or by profile, who in your organization, inside and outside of IT, has been most important in driving these first three projects and the projects that are coming? And second: do these projects resemble the types of IT projects you've done in the past, or are they different in some way? Is it different technology, or a different process?
So, the guy who leads this for me is my digital incubator. And by pure coincidence, when I was in this room last year presenting the keynote, he was in this audience, working for another company. I hired him as a result of meeting him at this event last year. He was sitting in the back, saw my presentation, and I had an opening for a senior director of digital strategy and technology. He applied as a result of meeting at this event, and I hired him. So there you go. You have to have someone leading this for you. The person who runs it for me has a full-time job running our data, AI, and BI team, about a twelve-person team, and two or three of those folks are focused on this stuff now. You have to have a central incubator for this. This does not work in a normal PMO project team or a normal application team. It has to be a separate, central team that drives innovation and incubation. If other people have seen this work another way, I'm happy to be convinced, but I've not seen it work any other way. And it's part luck, part strategy that he also runs the data, AI, and BI team, because what do you need for all this stuff? Data. Everybody needs data.

And I can see this happening pretty quickly. If you've ever implemented MLOps, the key to classical AI, you know you have to keep feeding your models data all the time. We've acquired 15 companies in the last three years, and we pull that data into these models. If you're not constantly reinforcing and retraining your models, their insights go stale. I see the same thing happening with this.
If you're not constantly updating that Teams chat data with the phone calls coming in from customers, you're more likely to get a wrong answer out of that bot. These things need to be kept up to date and kept fed.

A little Microsoft plug here, and sometimes it's better to be lucky than good: we just happened to have all of our documents and manuals and processes stored in SharePoint, so we hooked all of that up to Power Chat, and now people can ask questions and get answers from a generative perspective. Some of this is luck, and some of it is standardizing on Microsoft; it's been a result of our great partnership with them. But if you had that stuff stored separately on a different platform, or if you weren't recording your customer phone calls in Teams, this stuff would be very difficult to pull together. We just happened to have all of it stored, in some way, shape, or form, in Microsoft technology.

Then your second question was, how does this resemble projects from the past, or what's different? You've answered some of that with the organization piece. The joke in IT is: give us a million dollars and you'll see a launch of something in a year. Salesforce, Workday, SAP, take your pick of big enterprise apps. This is not that. Nate and team were out at Generac in April helping us ideate these wonderful use cases, and we went live with three. So in less than four months we launched three ChatGPT-style use cases. That's quick, and this stuff has to go quick, because I think the business would lose interest if it drew out much longer than it has.
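The pattern underneath both Power Chat over SharePoint and the customer service bot over Teams transcripts is retrieval-augmented generation: index the history, retrieve the closest past material, and ground the model's answer in it. This is a minimal stand-in sketch, not Generac's implementation; a real deployment would use an embedding index such as Azure Cognitive Search rather than the toy word-overlap scorer below, and the transcript snippets are invented.

```python
# Toy retrieval over past support transcripts (stand-in for a vector index).
TRANSCRIPTS = [
    "Q: My generator won't start after a storm. A: Check the battery charge and fuel valve.",
    "Q: How often should I change the oil? A: Every 200 hours of run time or annually.",
    "Q: What fuel does the portable unit take? A: Unleaded gasoline, 87 octane or higher.",
]

def retrieve(question: str, docs: list[str], k: int = 1) -> list[str]:
    """Rank docs by crude word overlap with the question; a real system
    would use embeddings, but the shape of the pipeline is the same."""
    q = set(question.lower().split())
    scored = sorted(docs, key=lambda d: len(q & set(d.lower().split())), reverse=True)
    return scored[:k]

def grounded_prompt(question: str, docs: list[str]) -> str:
    """Build the prompt the LLM would actually see: past answers as context,
    so the reply is grounded in recorded history rather than model guesswork."""
    context = "\n".join(retrieve(question, docs))
    return f"Using only these past support exchanges:\n{context}\n\nAnswer: {question}"

print(grounded_prompt("How often should I change the oil?", TRANSCRIPTS))
```

This is also why Tim's point about freshness matters: the retrieval step only knows what's in `TRANSCRIPTS`, so if new calls aren't indexed, the bot's grounding goes stale.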
But four months is a nice, quick turnaround, and it keeps the business partners engaged. If it were much less than that, they probably wouldn't believe it's real; if it's anything longer than that, something else will come up as a higher priority. So keep it to around three or four months, which is a stress on the team, because if you have 10 or 12 of these going at once, each taking three or four months, that's a lot of IT work. That model has to be figured out somehow. I can see it maybe a year from now being less centralized and more decentralized, but right now, to charge it up, streamline it, and scale it fast, it has to stay within that one team. Does that answer your questions? Thank you. Anything else?

You talked at the beginning about the policy. Do you have a DLP in place to monitor it? At your size of organization, how do you plan to discourage or prevent somebody from putting sensitive information, like PII, into ChatGPT?

Yeah. So we have a DLP, and we monitor the heck out of that stuff. People have been caught, and when we asked them why they did it, they didn't know. Some people don't read policies; some people don't look at the corporate intranet. I don't know what percentage of cybersecurity policies actually get read, but I'm sure it's very low. You're going to have that fallout. And I would say, from the time this thing launched around Christmas until the time we actually issued the policy in February or March, who knows what got put out there. But since March, since the policy was documented, we have much tighter control over it. The way we do it is, we don't block somebody from going to OpenAI; we route them to our private instance instead.
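The redirect-instead-of-block approach Tim describes is typically a small rewrite rule at the proxy or DNS layer. A minimal sketch, with illustrative routing logic: the public hostname is real, and the internal target follows the chat.generac.com domain mentioned in the talk, but the rule itself is my assumption, not Generac's actual configuration.

```python
# Proxy-style host rewrite: requests to public ChatGPT are redirected to
# the company's private, DLP-monitored instance instead of being blocked.
REDIRECTS = {
    "chat.openai.com": "chat.generac.com",  # internal Power Chat, as described
}

def route(host: str) -> tuple[str, bool]:
    """Return (host to serve, whether the request was redirected)."""
    target = REDIRECTS.get(host.lower())
    return (target, True) if target else (host, False)

print(route("chat.openai.com"))   # steered to the private instance
print(route("portal.azure.com"))  # everything else passes through untouched
```

The design point is the "accepting, not scared" posture: the user's intent (use an AI assistant) is preserved, while the data path is moved somewhere the DLP can watch.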
So Power Chat's domain, which is marketed across the company, is chat.generac.com, but if you try to go out to OpenAI, it redirects you to the internal instance, which interacts with our DLP service. It's off-the-shelf, common technology; there's no secret sauce, and you don't have to be a clever security expert.

One more question. Do you think we're at a point where more CIOs need support convincing their CEO of the importance?

Oh God, yes. And once again, it's better to be lucky than good. I happen to report to the CEO at Generac, and without that, none of this would really even happen. So I'll tell you a couple of things that are important. One: not every CIO reports to the CEO, and it's unfortunate that that's the case. In my network, maybe half of the CIOs report to the CEO or the CFO; I even know a guy who reports to legal, and that's rough. You have to report at that level for any of this stuff to be possible. You can't drive transformation from two or three layers down. The second thing: I remember I was at my daughter's volleyball tournament the first or second week of January, and someone I follow posted on Twitter that he wanted everyone on his senior executive team to download ChatGPT and play with it. I sent that to my boss, and he sent out an email that weekend to his senior leadership team telling them the same thing. Not every CEO would do that. Would he have done it if I hadn't shown him the tweet? I don't know, maybe a few months later, but he did it right then and there, so I knew I had that support. If I hadn't had it, I probably still would have done some of the stuff we did, but I would have eventually had to ask him.
But that email gave me an open playbook to go after this stuff. I can tell you, not a lot of CEOs would send out that email, so that’s fortunate. One more question before we get back to it. What would your recommendation be for smaller organizations that don’t necessarily have really large IT organizations — maybe a team of five or ten IT individuals? They’ve really started seeing requests from the business. Like, I work in an organization where I do report to the CEO, which is great; the company is really open to and accepting of this new technology; they want to do more. We’re just trying to figure out the next step. Do we build something like a PowerChat, now that there are some great, simple examples like the ones you gave earlier? Any advice? So I was in Chicago the other day speaking at a Databricks conference, and that same exact question was asked. We had probably 100 to 125 folks in the audience basically asking how to get started with this stuff. I couldn’t believe that many people hadn’t gotten started. So, one: you need a champion. Does that champion need to sit in IT? No, but they need to be fairly technical, fairly knowledgeable about this technology. They need to have played with it, tried it out, and know what other folks are doing. But you need the champion — this goes nowhere without the champion, the person driving it. That’s number one. Number two: you’re ultimately going to have to educate people. That person’s job is to educate the company in some way, shape, or form, whether you use a partner like Concurrency or whether you publish a bunch of YouTube videos and send them out to people in email. You’re going to have to educate people, because this is changing so fast.
Look at this thing that got launched this weekend — where the heck did that come from? So somebody needs to be a champion, somebody needs to stay on top of this, somebody needs to educate the people. And then, like I said, you’re going to need a central point or central team that’s responsible for incubating, and that team needs to know they’re not only responsible for incubating but also responsible for sharing the learnings. If you don’t tell people what you’re doing and what you’ve learned, someone else is going to try that same thing and probably fail on a similar issue you already hit. So that’s an excellent place to start. There weren’t a lot of good YouTube videos when we first started, but there’s plenty of material out there now — or come up here and do a lunch and learn. You talked a lot about using AI for customer-facing and different business cases. What about larger projects — say construction IT — using it for planning, estimating, forecasting risk? Do you see it going that way too? Totally. In the supply chain space, I see a ton of use cases. Think about demand and supply planning, with all these supply chains dealing with chips and lead times — there’s a full area of use cases out there. So if you think of that as the back end, there’s a ton of opportunity in the supply chain area. And then you can go all the way to the front end. I do believe that the company — the team, the org — that figures out the customer-facing aspect of this wins the enterprise scenario.
We’re B2B — 95% of our revenue comes through a dealer and distributor network; we’re not really D2C. But if we can figure out a way this can augment our marketing information, materials, infomercials, branding to our dealers — holy cow, there’s a ton of spend on that stuff. If we can make this external-facing and draw our dealers to our site, or let them learn this on their own rather than us paying all that money out of the education budget, that is also a huge area of opportunity. So the back end and the front end are where I see the biggest value, and then there’s a whole bunch of use cases in the middle that will only get solved if the people who are aware of and live those problems bring them forward. Some people won’t — some worry about job security and won’t bring forward a use case that could automate an approval cycle, or pricing and quotes, and so forth. Those can be truly transformational. And then a question: why do you think all these technology platforms are coming out with their own GPT? I was at Salesforce’s conference a few weeks ago — Einstein GPT. SAP is coming out with theirs; UiPath has theirs. All these guys are trying to figure it out because they don’t want some other GPT tool brought in to take over their space. Every week there’s a new announcement — Workday GPT or whatever. And I think there’s an opportunity here, maybe not to replace some enterprise software platforms or packages, but certainly to delay the decision to buy, because you can pull those insights yourself. I can definitely see that happening.
Have you seen much of an effect on intellectual property — and I don’t mean protecting what you’ve done, but the creation of new intellectual property, for example in the generator itself? Is anyone doing that? So, yeah, I’ll give you an interesting use case. I saw this at the Salesforce conference, and there’s a huge DEI angle to it. Retailers are obviously using this for clothes and colors and different styles — the content they’re presenting is, say, a human trying on an extra-large, 46-long jacket online, shown in different styles and so on. So retail has a ton of use cases. But if you think about that image, that picture, you can get into DEI issues pretty quickly. Who is that person? Are they Black, are they white? Is it a woman, a man, transgender — all of this. Retail is going nuts in this space, but with that DEI component, you don’t want to end up like Bud Light. So there’s a lot of risk in it, but I’ve seen retail adopt it pretty aggressively. What about in the generator itself — have you seen, say, “we can do this with different voltage or fewer parts”? Do you have engineers looking at it to do something to the product itself? I haven’t seen those use cases myself. That doesn’t mean they’re not happening. But I know for a fact our software engineers are using this — we have monitoring and analytics on our PowerChat, and absolutely one of the most popular use cases is helping write code. So if you assume that code is part of what you’re talking about, then yes. We had an intern start with us this summer — the funniest thing — he’s from UWM, on the engineering team, and on his first day of work he was walking around and posted on LinkedIn:
“I’m walking around Generac today and everyone’s using ChatGPT for writing code.” So we brought him in, brought him onto the team, and he helped develop one of these use cases. People are doing it, and this intern was smart enough to figure that out on day one. And we’re not a huge software development organization — we outsource some of it. If Generac software developers in Waukesha, WI are using ChatGPT to write code, it’s happening everywhere. Any other questions? Are you worried about license leakage in what ChatGPT produces — the information it collects in its models, stuff that’s publicly accessible but licensed by an organization or person — and then using that in production without being aware of it? Yeah, absolutely. And just on budget: in PowerChat, our internal ChatGPT instance, if I remember correctly — the Concurrency folks can validate this — it doesn’t cost anything to submit, but when you receive content back, it’s something like a thousandth of a penny per section of content. So that’s pretty cheap, but you also, like I said, need people who know what they’re doing, because if you didn’t have a rule that manages the size of the response, this thing can go on forever, and those fractions of a penny add up pretty quickly. So we have a couple of rules that right-size the response because of that cost. Think about job descriptions — things of that nature that pull back a ton of content readily available on the internet; that adds up quick. You need people who are aware of these rules. But compared to blob storage in Microsoft Azure, this is cheap. Any other questions? All right.
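The response-size rule described above can be sketched as a simple guard on completion length. This is a hedged illustration only — the per-token price, the token cap, and the parameter names below are placeholders, not Generac’s rules or actual Azure OpenAI rates:

```python
# Illustrative guardrail that right-sizes responses to control cost.
# MAX_COMPLETION_TOKENS and PRICE_PER_1K_COMPLETION_TOKENS are made-up
# placeholder values, not real Azure OpenAI pricing.

MAX_COMPLETION_TOKENS = 400              # hypothetical ceiling on output length
PRICE_PER_1K_COMPLETION_TOKENS = 0.002   # placeholder $/1K completion tokens

def estimated_cost(completion_tokens: int) -> float:
    """Rough cost estimate for a single completion."""
    return completion_tokens / 1000 * PRICE_PER_1K_COMPLETION_TOKENS

def request_params(user_prompt: str) -> dict:
    """Build request parameters that cap how long a response can run."""
    return {
        "messages": [{"role": "user", "content": user_prompt}],
        "max_tokens": MAX_COMPLETION_TOKENS,  # hard limit on response size
        "temperature": 0.2,
    }

params = request_params("Summarize this job description.")
# An uncapped 10,000-token answer would cost ten times a 1,000-token one:
print(round(estimated_cost(10_000) / estimated_cost(1_000), 1))
```

The point is simply that without a `max_tokens`-style ceiling, verbose answers (job descriptions, boilerplate pulled from the web) multiply a tiny per-token price into real spend.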
It’s been fun, folks. Anything else? No? Just thank you. Thank you for being here. Maybe I’ll be back next year.


So we’re going to take the next 45 minutes or so and talk about how to get started. We have someone who hasn’t been introduced to you yet, who stands head and shoulders above me in intelligence and capability — this is Brandon. Brandon, why don’t you introduce yourself. Thanks for coming, everybody. My name is Brandon. I’m Concurrency’s lead data scientist, and I’ve been doing applied math for about eight years, building the machine learning systems that Nathan and the team talked about earlier today. Great to meet everybody. Awesome. So we’re going to break this presentation into a couple of sections. I’ll kick it off, then Brian will take the ball and really get into the weeds on some of the things Tim talked about, and then we’ll hand it back to Brandon. All right. I think this is probably the main thing I want you to remember from this session — it’s also how we can help you — and it’s really about where you are on a particular journey. Any one of your organizations is going to exist somewhere on this journey, or maybe even at multiple places simultaneously, because of the way your organization is structured. We’ve worked with companies all the way up to Johnson Controls and all the way down to ones with only a few hundred million in revenue. That’s a pretty big range, right? We’ve done the envisioning sessions for Johnson Controls and Generac and then down into smaller companies, and it’s a very different experience across those. But what’s been really fascinating is how the size of the company is somewhat immaterial to these pursuits — AI efforts are driven more by use case than by the size of your organization.
But whether you exist at multiple spots or just one tends to depend on the size of your organization. The far left-hand side is establishing executive alignment, and I think that’s the most meaningful thing I take away from what I saw happen at Generac. It was successful — and is successful — because you have executives who care about the initiative, know this is the future of the business, and are willing to take steps forward to do it. I saw the exact same thing over at Fox World Travel: Sam, their CIO — the first action he took was to brief the executive team, the President, CFO, CIO, VP of Sales. We briefed that team, and once they understood the very nature of what we were doing, they were willing to support the efforts cascading from it. So executive alignment is critical: establishing guardrails, establishing mission, understanding the different tool sets you might apply — the commodity-versus-custom discussion will come into play when we talk about what’s happening in the Copilot space versus what you would build. And that executive alignment lets you move into the group envisioning sessions that Tim talked about. The group envisioning sessions are really about how I create broad intent within my organization, so that the organization doesn’t just feel that ideas are sent to them but is participatory in the creation of opportunities, working with the team to be part of the solution rather than seeing this coming at them. It’s not coming at their job; it’s coming with their job. Those use cases then move into evaluating the scenarios and picking one or several where you’re placing your bets. Those bets can be disruptive moonshots, but you usually need some that are incremental, that deliver wins along the way so the company sees this is turning into something.
And in between, sometimes you might POC something — a POC is “prove to me this actually works.” Is this possible? Sometimes we’re doing that just to find out, like the IKEA-manual example: what would GPT-4 actually even do with this? We did a scenario for a customer who wanted to interpret code written in — it’s not even FORTRAN, it’s some ancient manufacturing 4GL. What is this? They wanted to turn that into documentation. Is that even possible? We threw a data scientist at it, and the answer was: it might be, but it’s more like a research project — that’s one you might not want to throw a lot of money at, versus this other one, which is probably a better pick. So you get to that point, and you’re building out pilots, which are moving toward a production state. That’s the process Tim went through on three different scenarios: I’m going to take this, get it in front of users who use it to do their job, start small, and grow. Then you eventually get to production iterations that build in machine learning operations — which Brandon is going to talk a lot about: what are the components we build in when this is truly at a phase where it’s impacting the business? It needs to run just like any other operational platform; once it’s delivering value, you need to treat it as an operational system. And this last piece is very interesting, because the scale pattern is really about what this looks like when I do a lot of it. In some of the companies we’ve engaged, that’s one of the first questions they ask — Johnson Controls, for example. They’re going to do this not once, not 10 times, not 30 times, but hundreds of times. How many bots am I actually building? How many models am I building? How much am I reusing them?
Is it componentized, plugging in across multiple platforms, or is it one bot to rule them all? There are some good answers there, and some emerging best practices — it feels like some of the things we did five years ago. It’s very interesting where that comes into the journey, but you have to consider it across the whole journey. So you’re somewhere on this journey right now, and sometimes you even need to back up to go forward. If you’re here and you haven’t really gotten this piece yet, sometimes you need to back up in order to be in a position where you get the support you need — the financial backing, the people, the upskilling — to move forward. When you think about executive alignment, you’re asking: what does the business’s mission look like through the lens of AI? That’s really the question. What does my business mission look like through the lens of AI? Then there’s the question of what the organization’s stance on it is — an interesting question. We were actually brought in by a CIO to talk with a CEO about what the organization’s stance is, to educate them about what that stance even is. And in a sense, the CIO’s question of “what is our stance” was almost the wrong question, because the CEO was pretty excited — he really just wanted to know how to guardrail it, how to set the right framework to put himself in the right position. Then there’s positioning the message and the relationship with people. Once you start driving initiatives in this space, people either see it as a threat or an opportunity. How do I make it an opportunity for them? A lot of that comes from the way you talk about it and the way you engage the people who are part of the solution, helping them see it as part of the solution for their own career. I think I mentioned this:
The last talk I gave was at this Milwaukee tech AI group mixer I spoke at briefly. There were 75 people there, and I told the group: next time we come back, there should be three times as many people here, because this is a fraction of the individuals impacted by AI within our professional ecosystem, and now is the time to be thinking about this in the context of everyone’s work. And then there’s understanding common blockers. Executive alignment is really about positioning the company, which leads us to what I walked through with that CEO: here are the main blockers you need to think about as you start down the journey. The first blocker is concern about data privacy, and in a sense this is a little bit of FUD — “anything you put into ChatGPT is going to be used to retrain models, so watch out.” And yes, it is public in that sense: you shouldn’t put the secret sauce of how you produce your product into the public ChatGPT; that would be a bad idea. But when people engage in these projects in a professional way, you’re using private instances of all of this, and nothing is shared to produce new versions of the models. So there’s reassurance that going down this path doesn’t mean the rest of the world learns your recipe for producing a certain type of rolled alloy, or the secret sauce in the type of carbon you’re using, or how I make my boats more efficient than someone else’s boats. I’m not sharing that with anyone when I deploy a private instance. I think that’s really important. Second is this idea of data not being ready, and we talked about this earlier. This is really about starting with the end in mind. Start with the end in mind: what am I trying to solve, and what do I need to be able to solve that problem?
What needs to be true for me to solve the problem? Don’t start with a generic “my data needs to be ready.” Start with what needs to be true to solve that problem, and then build incrementally from there. That may mean you don’t take the moonshot opportunity quite yet, but it might mean you can incrementally work toward it — progress over perfection. Human displacement is a concern, and this is more about attitude than the thing itself. Disruption of humans has been around forever. This is our opportunity to do it right, in a human-centric way. We all care about the people around us; this is our chance to force-multiply people rather than just machines. So attitude is important. If data privacy is concern number one, concern number two is probably this “what if it gets it wrong” thing. And you could ask yourself: how do I actually even know whether it gets it wrong? How do I know if a human gets it wrong? That’s another question to ask yourself. There are a lot of ways to handle this. It’s confidence scores. It’s giving a reference to the document the answer came from. It’s highlighting the text in the document the content came from. It’s measuring what confidence actually means in this context — what does it mean that it’s 99% confident versus 50%? It’s injecting different kinds of language into the prompts, like “don’t answer if you don’t know,” or inserting “maybe” or “might” like a human would when it’s not sure. What do I do when I don’t know? I’ll say the bathroom is through there, because I know. But maybe I don’t know exactly how to get to some other business in the office park, so I’ll say it’s probably over that way, right?
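As a rough sketch of that hedging idea — the system-prompt wording and the confidence thresholds here are made-up illustrations, not a production configuration:

```python
# Toy illustration of injecting hedging language, two ways:
# (1) a system prompt that tells the model to decline or hedge, and
# (2) post-processing an answer based on a retrieval confidence score.
# The wording and the 0.3 / 0.7 cutoffs are arbitrary examples.

HEDGED_SYSTEM_PROMPT = (
    "Answer using only the provided documents. "
    "If the documents do not contain the answer, say \"I don't know.\" "
    "If you are only partly sure, hedge with words like 'probably' or 'might'."
)

def hedge(answer: str, confidence: float) -> str:
    """Wrap an answer in human-style hedging based on a 0..1 confidence score."""
    if confidence < 0.3:
        return "I don't know."
    if confidence < 0.7:
        return "It's probably the case that " + answer[0].lower() + answer[1:]
    return answer

print(hedge("The bathroom is through there.", 0.95))   # sure: answer as-is
print(hedge("The other office is over that way.", 0.50))  # partly sure: hedged
print(hedge("Anything at all.", 0.10))                 # not sure: decline
```

The same “directionally that way” signal a person gives gets encoded either in the prompt or in a wrapper like this.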
I injected language there that we all understand to mean I only sort of know — it’s directionally that way. We can do the same thing with models. Or if I just straight up don’t know, I say I don’t know. We need models to do the same thing. And the last blocker is bias, which is a loaded term — something we need to be very cognizant of when we build these models. In a lot of cases it’s not super relevant, but in many cases it’s extremely relevant, because it imposes a need to think about who my AI platform impacts. When I think about the end in mind, who is it impacting, and how do I avoid the unintended consequences of designing this in a way that takes advantage of someone? A lot of that is about things Brandon will talk about, like not building black-box models: I want to see what’s happening inside the machine, understand it, gauge it, measure the output, and think about the data that feeds into producing it. So these are common things, and they’re all achievable — problems that can be addressed as a component of doing this right. So, the envisioning groups: these can be 5 to 100-plus people, and we’ve done them both virtually and hybrid. Creating ideas is one of the most empowering things for your teams, especially if you tell them it’s going to lead to something meaningful. If it’s just ideas for ideas’ sake, it’s not worth it; if it’s ideas backed by some acceleration and momentum, it works. Here’s an agenda of what that looks like — there’s Tim again; this is one from our on-site. They tend to run about three hours. You start with the art of the possible, getting people exposed to what this is. I’m actually fascinated by how many people haven’t even used ChatGPT yet.
One of the things I learned is that sometimes you need to start with just “here’s what ChatGPT is” and give people that light-bulb moment — “wait, that’s a thing?” “Yeah, that’s been a thing.” And then the rest of them go, “ah, now I understand.” Then you show them the art of the possible, then you break out into groups that do the ideation and bring them back based on prompts, and you immediately get hundreds of ideas as a result, and produce something like this, which we showed before. So I’m going to hand over to Brandon, and Brandon’s going to start talking about how we evaluate a scenario and how we get going. Yeah, thanks. So the life cycle of AI projects is, as Nathan pointed out, quite long, but it does have a starting point, and I want to talk about that starting point by explaining why I do what I do. What do I do? Data science. The world is super interesting, and I think data science — applied math — is the best way to understand it. Why do some countries win more Olympic medals than others? Why do some students drop out of high school and others don’t? Why do farmers sell more soybeans on Friday than they do on Monday? All these questions of causality are typically asked by clients as a starting point for projects, but they’re the least useful to actually answer, and we very rarely do, because it gobbles up resources without producing a useful enough predictive system. So starting with the right question is everything. They often come in as “why” questions, but we usually don’t want to answer those. There are two big distinctions in how data scientists work.
The first is explaining stuff, and the second is predicting stuff, and I think that jangles our intuition a little. It seems counterintuitive that if you can explain something, you couldn’t predict that something — if you understand it causally, you should be able to use that understanding to make inferences about the future. For a lot of technical reasons I’m not going to go into, this is just not true. Case in point: the CEO of a company that writes software for the commercial real estate industry approached me a couple of years ago and said, “We need to understand why some of our employees do certain projects faster than others, all else equal. What’s going on here?” So I asked: what are you interested in this for, for what reasons? And they said, “We just want to understand our business better.” OK — so no interest in predicting how long it’s going to take somebody to do a job? No, of course not. So we spent a bit of time getting to the bottom of that question. It involved a lot of complex charts that really unpacked and disentangled the nuance between variable one and variable two and rooted out all the uncertainty in the question. And they liked it — they loved it. At the end of the presentation, though, they go: how well does this predict how long it’s going to take somebody to do the job? And I said, well, this model really wasn’t built to predict; it was built to explain this phenomenon. So the intuition here is that explanatory models explain the past, whereas predictive models fill in a blank about the future. They feel very, very similar, but they’re not. So how do you prevent asking a sort of useless question? And I don’t mean that meanly — I still talk about “why” questions all the time in my work, because that’s just what we do as people, and who cares about the statistical details. The way you prevent it is by focusing on ROI.
And there are three components to this. There’s the measurable component — the things you can quantify. There’s capability ROI, which is basically how much the project develops your in-house ability to predict things. And there’s the strategic component of ROI, which is whether the project positions you to activate your strategy. Different projects have different ROIs. The ones you start with really ought to have measurable impacts, but you can also invest in a platform that has strategic or capability ROI. So once you have a sense of the kinds of ideas that define the projects you should do, you still need to find them, right? How do you find those things? The chatbot examples are all the rage because they’re super powerful and very useful, but there’s a whole family of projects that aren’t really chatbot-esque, and those follow this structure. First, they’re benchmarkable: either there’s an existing solution — some legacy analytics solution that’s suboptimal — or there’s a human doing it right now whose performance you can track, and so on. The next thing you need is for it to be “data-able” — there needs to be data for it. I’m not going to go into the characteristics of the data other than that it needs to be trusted, it needs to be clean (with varying definitions of clean), and it needs to be sufficiently big. I don’t mean “big data” in the marketing sense; there just needs to be enough of it, and “enough” is defined differently for different problems. The most important of the four things that let you zoom in on a use case, I think, is whether there are favorable adoption characteristics: the business doesn’t have unreasonably high expectations of model performance.
With a chatbot, I think we all get that it’s going to hallucinate or make mistakes. That’s an expectation that’s OK and tolerable in certain contexts, but in other contexts little mistakes are super problematic — in a healthcare setting, say. There are a number of adoptability characteristics that make things more likely to succeed early on. There are the minimum accuracy characteristics. There’s the evaluation window — the period after the pilot or POC ends during which you kick the tires and decide whether this thing is any good, whether it’s actually working. That should be short, hopefully under a month, because people have really short attention spans and move on to bigger and better things, so you want to capture attention quickly. And hopefully it’s not disruptive to people, process, or tech. This might seem to fly in the face of the two innovation paradigms we talked about before — incremental versus disruptive — but it doesn’t. This is just about picking problems well early in your journey, so you don’t throw the baby out with the bathwater. You want people to be interested in and invested in their machine learning programs, so pick the ones that aren’t going to require a ton of investment in change management. OK, so that’s the foundation. Brian is going to talk a little about a pilot we did, to click into some of these details. Yes, I’ll take this one. So, POC versus pilot — do you want to talk about that before we jump into what a pilot really means? OK, all right. A lot of times when people start this journey, we can come to the table and do a demonstration for you, even with your data. A customer sent me over five contracts: what can you do with this?
I spent about two hours, stood up a ChatGPT interface, and they asked questions of those contracts — like, wow, that’s really cool. But it’s never that easy. You can get something out of it pretty quickly, but there’s always more than meets the eye, so I want to walk through some of the challenges that actually come up. We’ve talked a lot about Generac today, and Tim was here. When we got into the Generac solution, we identified the outcomes we wanted to get out of it, the challenges it was going to solve, and what the journey would look like for us. We worked with Microsoft on this — this is from a case study Microsoft did with them — and we started to break it down: what does this chatbot look like, and what does it mean to build it? As Brandon mentioned, what’s the end result? We started there. The core components of a technical chatbot: obviously, some sort of language interaction with the users. The query is ad hoc — I don’t know what’s going into this thing; it’s going to be 62 people from customer service throwing stuff at it, and we have to support any kind of request. Then there’s the response that has to come out of it — the information the user is asking for. Then we decided we want to give them the source of that information. This goes back to the Brunswick example I talked about, where 70% isn’t good enough — it’s got to be 99% when you’re telling someone what to do with a fuel valve, right? And the last thing that’s really super important, especially with Generac, was the feedback mechanism, and it’s simple.
It's just thumbs up and thumbs down. So those are the four components you need, and ChatGPT out of the box only does one of them, the response, right there. So we've got to do a lot of work to get the input in, we have to figure out how to get the sources back to the user, and ChatGPT doesn't really care whether it got the answer right or not; it's just a bunch of words it threw back at you. The other thing is: what's the information we're feeding it? Most companies have product manuals for the products they sell, and the manual might look like this, a lot of pictures. The IKEA manuals are a really good example: here's a step-by-step guide, and it doesn't matter what language it's in, everybody understands it. Here's the screw, here's the tool, put it together, steps one through three. But suppose I want to answer the question, what tools do I need in order to install this window? You and I can look at this and it's pretty intuitive what we're getting at. But if you look over on the right side, that's what these OCR technologies see. All the imagery is gone: all the lines, all the context, all the boxes. It doesn't know what those words mean, or that they're related to, say, a picture of a drill. So you have to figure out how to break that information down, index it, and categorize it in a way that ChatGPT, these OpenAI models, these LLMs, can actually interpret. And then we saw this example; it's getting better. This multimodal capability has only been pieced together within the last few days. The technologies behind it existed before, but not in an LLM capacity. For years we've been able to take images, throw them at models, and have them describe things to us: what is this thing?
And that's typically object detection. So going back to the previous slide: we can have this image broken down and be told it's a drill, but in the context of this document, putting those pieces together, it's not there yet. We're starting to get there, and I don't even know how it's going to work yet because we've only had like two days to play with it, but it's pretty incredible is what it amounts to. There are going to be new ways for us to solve this as you go through this journey, but for some of the problems you're going to run into, the tools don't exist yet, and we're going to have to work to figure out those gaps. So now I've processed the information and I know what the result should be: how do I actually build a system like this, component by component? I'm not going to walk you through this whole diagram, feel free to take pictures, but at the end of the day I've got to have a data pipeline. I have to get that information and process it. Another common thing customers come to us with is that their documents are versioned. I might have a new technical document that comes out every month with certain updates, and I have to be able to push that information into my models on a regular basis. I can't have people copying and pasting and managing that by hand. So being able to create a data pipeline that flows that information into the system is important, and then everything else here is what you'd expect: I've got my language model, the decomposition that's going to attack the problem, and it's going to provide the result. But as you can see, it's busy. There are a lot of different components that go into it.
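As a side note, the versioned-document ingestion idea described here can be sketched in a few lines of Python. This is a hypothetical illustration, not the actual Generac pipeline; the dict stands in for a real search index or vector store, and all names and sizes are made up.

```python
# Hypothetical sketch of a versioned-document ingestion step: when a new
# revision of a manual arrives, replace the old chunks in the index rather
# than having anyone copy-paste updates by hand. A plain dict stands in
# for a real vector store or search index.

def chunk(text, size=200):
    """Split raw text into fixed-size chunks for indexing."""
    return [text[i:i + size] for i in range(0, len(text), size)]

def ingest(index, doc_id, version, text):
    """Index a document revision, keeping only the newest version."""
    current = index.get(doc_id)
    if current and current["version"] >= version:
        return False  # stale revision, nothing to do
    index[doc_id] = {"version": version, "chunks": chunk(text)}
    return True

index = {}
ingest(index, "manual-xg7000", 1, "old install instructions " * 30)
ingest(index, "manual-xg7000", 2, "new install instructions " * 30)
# the index now holds only version 2 of the manual
```

Running a job like this on a schedule is what turns "someone emails us the new manual" into the kind of data pipeline being described.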
And then the last part is monitoring, and this is something most people overlook when we do projects with them: getting a feedback mechanism in place. How do I know if this is good enough? It doesn't have to be complex; I don't want people to overthink it. I'd suggest just starting with that thumbs up, thumbs down: did I get close, did I not? That will tell you directionally whether this is going to work for your organization. And to Brandon's point, do these evaluations on a frequent basis: a month, make an iteration, another month, and then you can get ready to go to a production setting. That's a great point. So let's keep popping the hood; let's see what else is under the surface of one of these systems. Before I do: how many of you have, or know whether your company has, experimented with data science or machine learning without having gone to production? OK, maybe about 20 percent of the group. How many have gone to production with a model? Possibly a larger group. OK, so how do we get there? A lot of people think a machine learning system looks like this: here's the machine learning code, this is excellent, we're done. The reality is a little more complicated. There's a lot more going on, and in fact the machine learning component is typically the smallest component of a system, especially in robust production systems. There's infrastructure, without which you can't serve model output to the business. There's feature engineering, without which the model doesn't really have anything to learn from. There's monitoring, without which it's very hard to debug, which is problematic in regulated spaces. If you're a model risk management group at a credit union, you care very much about monitoring your models.
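That kind of monitoring can start very simply. Here is a minimal sketch of the directional thumbs-up/thumbs-down check; the 80% bar is an assumption for illustration, not a threshold from the talk.

```python
# Aggregate thumbs-up (+1) / thumbs-down (-1) votes per evaluation window
# and check the approval rate against a bar chosen up front. The 0.8
# threshold below is illustrative, not a recommendation.

def approval_rate(votes):
    """Fraction of votes that were thumbs up; 0.0 if no votes yet."""
    if not votes:
        return 0.0
    ups = sum(1 for v in votes if v > 0)
    return ups / len(votes)

month_1 = [1, 1, -1, 1, -1]   # first evaluation window
month_2 = [1, 1, 1, 1, -1]    # after one iteration

ready_for_production = approval_rate(month_2) >= 0.8
```

Iterating month over month on a number like this is exactly the evaluation cadence described above.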
There's a lot more going on, and if this sounds like a lot, it can be, which is why it's important to phase how you build each component. When you do this, you derisk the project. A lot of executives think of data science as a science experiment, and that's true until it's not. Machine learning engineering is engineering, but before it is, there are a lot of uncertain, science-like elements to it, which are very risky and unwanted for a business trying to maximize investment in these programs. So isolate the science parts from the engineering parts by doing something like this, typically in three phases. The first is a POC, which, as Ryan was nodding to, is a little different from a pilot. There is a difference there, and it's not without distinction. The POC is all about proving that business value can be created from the models, not that something can technically be done. I guess that's not always true; there are times when you're trying to prove something can technically be done, but the vast majority of the time you're just trying to prove that business value can be captured. And if your data science team is not focused on this, they're probably accountable for the wrong things. They should be accountable for business goals, not technical ones. The field has evolved to a point where we have a sense of what techniques are going to work for what use cases, so unless you're doing something extremely novel in a very narrow band of the world, you should know roughly what to invest in and what not to invest in. Anyway, this POC phase is really four things, which can be customized to some extent, but you pretty much always want to do the data collection, of course; the machine learning code itself, which you need to do feature engineering for; and then model validation.
This is the part of the project where you're basically making sure the model is producing reasonable results given whatever constraints you have, and you typically want to test it with business users: hey, does this look good or not? Before I got into consulting, I built the product recommendation engine for Publix, the Florida-based grocery store, and the model validation phase of that project was very interesting. As a product: you go to publix.com, you throw some stuff into your grocery cart, and the thing I built recommends complementary stuff, like Amazon does. Nothing fancy. They were trying to validate the model by confirming it produced intuitive recommendations. Put peanut butter in your cart? You want jelly. You buy a ham and put it in there? You want cheese of a certain kind. Very simple stuff that can be confirmed with users. But there's another function of product recommendation engines, which is to help your users explore your product universe. If you're Netflix, for example, you want your users to find new content so they continue to pay the monthly subscription. So my point is that model validation has a lot of flavors to it. But once that's done, you've confirmed that you can create business value, so let's move into actually capturing that business value. It can be done; let's actually go do it. That's where we move into the MVP, which is basically the POC but with some sort of infrastructure. You're also doing more robust testing, debugging, and some configuration and so on, because you don't want to be brittle. But you don't want to do so much derisking that you gobble up lots of resources that could otherwise be spent better later on. OK, so later on, after some conditions are met by the business: OK, this thing is running well. That's great.
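The complementary-item intuition from that Publix story (peanut butter suggests jelly) can be approximated with simple pair counting. This is a toy sketch of the idea, not the engine that was actually built; the carts and items are made up.

```python
from collections import Counter
from itertools import combinations

# Toy complementary-item recommender: count how often product pairs
# appear in the same cart, then recommend the most frequent partners
# of an item. Real engines are far richer; this is just the intuition.

def pair_counts(carts):
    """Count co-occurrences of product pairs across carts."""
    counts = Counter()
    for cart in carts:
        for a, b in combinations(sorted(set(cart)), 2):
            counts[(a, b)] += 1
    return counts

def recommend(counts, item, k=1):
    """Return the k items most often bought alongside `item`."""
    partners = Counter()
    for (a, b), n in counts.items():
        if a == item:
            partners[b] += n
        elif b == item:
            partners[a] += n
    return [p for p, _ in partners.most_common(k)]

carts = [
    ["peanut butter", "jelly", "bread"],
    ["peanut butter", "jelly"],
    ["ham", "cheese"],
]
counts = pair_counts(carts)
```

Validating a model like this with business users amounts to asking whether `recommend(counts, "peanut butter")` comes back with something intuitive.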
Let's move into the last phase, the ML Ops phase, which is essentially about automation. There's more to it than that, but: let's automate what we have, let's make sure it's reliable, logging is in place, et cetera. OK, so this is the same information just presented differently, and I'm not going to drain this whole slide because there's too much here, but it basically defines and explains the "so what" behind each of these components and when to do each. You can customize which of the components in blue you do during each phase. For example, we have some clients who know they want to go to production right away, maybe because they've done some of their own research, they've tested a model, they've decided it's good enough: can we just deploy this thing? And that's totally reasonable. We worked with a pathology diagnostics lab that was very interested in making their revenue cycle management team more efficient: the people responsible for collecting the accounts receivable from insurance companies that are reluctant to pay the claims. Their team had too many claims to get through, and some of them the insurers denied. They really had to be smart about picking the claims they thought were most likely to be paid out at the highest contracted amount, in the shortest amount of time, with the least effort. So there were a lot of little optimization questions they were interested in, and they had an economist on staff who had investigated this on the side; he just didn't know how to move it to production. So they said, just move it to production. Well, there were a number of things that hadn't been done that would allow that model, investigated on the side, to be successful in production, so we ended up doing the whole thing from the get-go while learning from what they had done prior.
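The claim-prioritization idea in that story can be sketched as a simple expected-value ranking. The scoring formula, claim fields, and numbers below are purely illustrative assumptions, not the economist's actual model.

```python
# Rank denied claims so the best candidates land on top: score each claim
# by approval probability times contracted amount, discounted by the
# rework effort it takes to appeal. Purely illustrative formula.

def score(claim):
    """Expected payout per hour of rework effort."""
    return claim["p_approval"] * claim["amount"] / max(claim["effort_hours"], 1)

def rank_claims(claims):
    """Highest-value claims first."""
    return sorted(claims, key=score, reverse=True)

claims = [
    {"id": "A", "p_approval": 0.9, "amount": 500, "effort_hours": 1},
    {"id": "B", "p_approval": 0.3, "amount": 2000, "effort_hours": 4},
    {"id": "C", "p_approval": 0.7, "amount": 300, "effort_hours": 1},
]
ranked = rank_claims(claims)  # likely approvals with low effort come first
```

The hard part of moving something like this to production is everything around the `score` function: the pipeline feeding it, the monitoring, and the serving.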
My point is you don't have to do all of this; it's not totally prescriptive, but it's a really good place to start. Any questions about this? OK, so here's the double-click slide. These are the POC components, data collection, feature engineering, ML development, and model validation, with some concrete examples to put boots on the ground a bit. Again, I'm not going to drain this whole slide; there's a lot here. But these are the things I would start with. It would be hard to do a POC without doing all four of these components. Sometimes you see people skipping the model validation phase, and what happens then? I don't know if anybody has experience with that. Yeah: basically, people don't use it. They don't trust it, which is problematic. So this is the thing that gets skipped the most, and it goes back to what we were talking about with explainability and making sure it's not a black box. You have to do this. Going back to the story I shared before about the CEO asking us why some employees work faster than others: that project was exclusively about transparency. Why do some people do what they do, and why are there particular outputs given particular inputs? That was the whole point, and some people don't prioritize it, or they prioritize it at the expense of prediction, like I mentioned. OK, does anybody have any questions on this? You'll have access to all of this afterward, so you can take a look at the examples and so on. OK. So you build a POC, and then you stack on these four additional components. Typically the most important one is serving infrastructure, which is just surfacing the model output to the business. It could be an endpoint; it could be that you're tabling data somewhere that's referenced in a report; it could even be things on the front end.
By infrastructure I don't just mean the back end; it could be the UI they're using, and so on. Testing and debugging: super important, and not talked about enough. Data verification: also something that gets skipped a lot; if there are any data engineers in the group, you might feel the pain of skipping this one. And then process management: again, super important. Orchestrating all these jobs and so on is kind of the core value-add of an ML Ops framework. But this phase is an MVP, not ML Ops, so what's going on? Again, you can customize this. ML Ops is really just shorthand for automation. So there's actual automation, making sure that tasks which in a POC are usually manual, because you don't want to prematurely optimize, get automated, plus monitoring, configuration, resource management, and metadata management. Each of these topics could warrant a twenty-minute talk of its own, so I'm not going to drain this, but if anybody has questions about the nuance of it, let's talk afterwards; I think it's an interesting conversation. OK, so you've done the POC and there's business value that can be extracted; you deploy the minimum viable product, so you're making sure you can actually capture it; and then you've automated it. But you're going to do this a lot more, right? You're going to do this at scale. So how do you ensure best practices, how do you enforce certain policies and so on as your data science and machine learning team grows? You should stand up a center of excellence. All that really is, is doing two things: enabling positive things, and enforcing that certain best practices are met. The most important thing I've noticed that should be enabled is curiosity-driven exploration. Before I got to consulting, I saw very few examples of building something valuable that was top-down led.
Consulting is a little bit different, but in industry, when you're the data scientist, all of the valuable projects, at least in my opinion the ones that turned out to be valuable, and I don't have a real horse in the race, tended to be ones the team thought of when they made an interesting observation. They go, oh, that's kind of curious, why is that the way it is? Those are not top-down, so it's really important to allow the team to explore their curiosities. This is not the same as open-ended research, though, and I think that's the tension. You'll hear executives say, this isn't grad school, this isn't academia, you're not publishing papers, this is about getting more business value. But the two aren't actually in competition with one another. The next thing you want to enable is very easy, very cheap access to the subject matter experts who know the business best. Data science doesn't happen in a vacuum; you need people who can explain the data and the business process behind it. This might seem really obvious, but avoiding it, or not focusing on it, does have unwanted outcomes. The next thing you want to enable is data access. I mentioned before my time at Publix: a senior data scientist without access to data is a problem for obvious reasons. They were really early in their journey, so basically I was relying on CSVs being sent to me, which is fine for POC testing, but when you go to production there are some issues there. So you want to enable data access, and compute access, obviously with some policies in place, otherwise you can really run up your cloud computing costs.
But you also don't want to give somebody access to only, say, 14 gigabytes of RAM when they're trying to train a very large, data-greedy model; now we're being silly. Next, you want to enable failure. This is the fail-fast notion: not just tolerating it, but encouraging it, enabling it to happen. And that's really what an ML Ops framework is about: you're competing on the process by which you go to production, not really competing on data or models anymore, but on the process behind those things. So the story here is, make all of this cheaper. Then you need to balance that with some rules. You want to put rules around the budget; you want to put some rules around data access. People shouldn't have keys to the whole castle, just the relevant rooms where there might be interesting things inside. And then enforce alignment. This is the alignment Nathan was talking about earlier, executive alignment: making sure people are not just in support of the projects but are real champions, like Tim was talking about. You can't have somebody just give a thumbs up, do this thing, and then when things go a little sideways they back away. You want somebody to champion these things. OK, so how do you go from curiosity-led development to business-impact-led development? There isn't really anything explicit here; your data scientists will do it for you, amongst themselves. They'll take very uncertain things, like, we don't know if we can do this particular technique and arrive at this particular outcome, and they'll investigate that on their own before they go talk to their manager or whoever. And by the time they've confirmed it, the way they talk about it changes.
They'll transition from that curiosity framing to an impact statement. They'll say, well, if we can incorporate this thing, then we'll arrive at these expected results, and we know this because we've done back-testing. And it happens naturally; it doesn't actually need to be enforced. The whole point of the CoE is to do better work, faster, without any extra risk. I think that's fairly obvious, but there are some really non-obvious things. I'm not necessarily going to talk about the organizational aspects, the team structure aspects of a CoE, which are a super important part, but they're very subjective and I don't want to get too prescriptive; we can talk afterwards about my recommendations. The things I can talk about relate to these ideas of when and where you enable and when and where you enforce. Again, there's a lot of nuance here, but typically you want to enable in the top two phases, the POC and MVP. Get through those cycles super fast: have an idea, test it, does it work or not? If it does, let's deploy it. And then you have to enforce some stuff, and that's down in the ML Ops phase. This is simplified; obviously you have to enforce compute and data access up in the POC phase too, but in general the top is where enablement is most important and the bottom is where enforcement is most important. Does anybody have any questions about this? And yes, these do look like Packers colors up here; Ryan and I noticed that when we were playing with these slides. I said this is orange, but it's not, or close enough. Alright, so the minimum components. We talked a lot about the components before, but to simplify things: the sandbox. This is where you're experimenting, and there are really two prongs to experimentation.
There's the A/B-testing kind of experimentation, where you ask whether the blue widget or the red widget drives higher conversion. That kind doesn't happen in a sandbox. But the other kind of experimentation, the R&D, the curiosity-led development, does. Hey, I noticed this weird spike in sales at this time, I wonder what's going on there; let's investigate it. I wonder if we can extract some information from a dataset that's currently being missed and incorporate it into our demand planning models. That kind of thing all happens in the sandbox. Then you move into the pilot/MVP. Here we've smashed those two words together, pilot and MVP, implying they're essentially the same: when you're piloting something, it's in the air, not just on the ground in the sand. That really is a distinction with a difference, but it does depend on how you define the terms. Then you move into production, which is about reusability, reliability, and simplicity, the ML Ops things. And then you have this thing all the way on the right, which is a specialized area for, basically, exceptions to everything I've said so far. It's uncommon, but it does happen in certain scenarios, especially where there are very special privacy concerns and so on. All of these components sit on a foundation of solid ethics and a good data platform. Ethics is a whole topic unto itself, but just to comment briefly on it: people will say you need to have an ethics committee to get started with data science, and that's only true sometimes, certainly not all the time. It just depends on the use case. Another thing to point out here is that sometimes it actually matters when it seems like it shouldn't.
So, the example from before of that diagnostics lab: they're basically taking the health insurance claim, making the transform, and submitting it to the insurance company, and they get paid by the insurance company if that claim satisfies the contract between the patient and the insurer. Some of those get denied, and the model we built basically takes that pile of denied claims and ranks them: give me the best ones on top, the ones that are going to pay out the most, and put the ones that are less likely to be approved and pay out lesser amounts on the bottom. It wouldn't really seem like you'd need an ethics committee to review the model in that sense, but you might, if the output of that model affects the way the lab does business with its customers. What if management finds out, hey, patients of a certain demographic tend to be approved more, so why don't we just accept those patients? That seems problematic; it seems like somebody should make ethical judgments about that. So my point is, sometimes you need it when it might not really seem like you do, and in cases where it seems like you do, maybe you don't. OK, here's the general scaled architecture. Nathan, do you want to talk about this? Oh, sure, sure. One of the common questions we get, well, "common" is probably overstated, one of the questions we get from larger enterprises is what I talked about earlier: am I building a hundred different models, or am I building one, or am I building some collection in between? I think it's important to note that this is not really a solved problem at the moment. Five years ago, maybe even longer than that, when we were getting started helping companies, companies were getting started deploying their cloud environments.
In their Azure and government environments, we had established practices that were better practices for doing that, and in fact we helped write the best practices for a lot of those things. But there wasn't a framework defined for how you make those decisions, and over time established frameworks have come out from Microsoft, AWS, and Google: here's the pattern our companies are using. Microsoft calls it the Cloud Adoption Framework: how should I position my subscriptions, how should I label them, what should I use for classification? Those things have become best practices that you need a really good reason to diverge from. At the moment, this space is still in the "better practice" zone: what are we learning, what is someone else learning about the same space? What I've leaned on to answer this question, or at least point in the right direction, is the two tension forces that exist in how you answer it. Who's familiar with the jobs-to-be-done framework? OK, very few, if any. So the jobs-to-be-done framework is the top of the thing; it's the why. I have a hammer because I need to put in a nail, right? But the guy doesn't actually want the hammer, and he doesn't even want the nail; he just wants to fasten two things together so they don't become unfastened, because he's building something. So the why is the fastening, and it may even be the thing he's building. The job to be done is ultimately what you're serving through the mission of your organization. So you can start thinking about this in the context of the scaled architecture as a top-down framework. Sometimes people think of it as customer 360. Another way to think about it: what are all the pieces I need to put together to build a view for my customers that feeds from data across the business, from different domains? OK.
So that's angle one. Angle two is the idea of data domains. Who's familiar with the data mesh architecture? OK. Data mesh architectures are built around the idea of data domains that are aligned to areas of the business, vertical stacks of things you're building within the organization, that isolate data into groups of stakeholders who have autonomy around that data. And that doesn't always align to the job to be done. The thing you're doing for the customer, which spans across data domains, doesn't always align to the functional delineation of groups within the organization. Therein lies the problem. And in between those two pieces are the models you've built on that data, which sometimes get combined together to serve a specific use case. So this problem is more complex than it seems, and, like the ethics problem, it's probably a talk in and of itself, which is a good idea for a future talk. But when you look at this idea, you have to think about it in the context of both of those pulling extremes and how they relate to how you build reusability across the overall spectrum. That reusability may be the interface; maybe that's the reusable thing. It may be that you have domain-specific AI models, or maybe you have domain-specific AI models that serve multiple use cases and an API that talks between them. Or maybe you even have shared-service domains that are reusable. All of these are kinds of modular components that you need to think about at scale. So I'll cap it there, but understand: this isn't a solved problem, but it is a problem where there are other frameworks we can apply to help position ourselves to be more flexible for when the day comes that it actually is solved. We can skip ahead. Yeah. Good. So this is very similar to what Nathan was talking about.
This is a simplified version of an ML Ops architecture. The big rectangle in the middle is really the whole machine learning lifecycle: you run experiments, train a model, get some results, maybe package them into modular components if you have a domain setup like Nathan was describing, deploy, and then of course monitor. This all happens within a CI/CD framework, promoting data and objects between environments and so on. This is kind of a caricature, so if you double-click on it, there's this, and in the interest of time I won't drain this slide either. But I guess one of the big takeaways from this slide is that there are many different iterations of this framework you could use depending on your data. This one is for a classic tabular setting, where you're doing demand planning or the sort of pathology example I gave before, where you have a tabular dataset. It will look different if you have text data, which is really popular right now, or unstructured data, or computer vision models, but they all follow something very similar, and there's a lot more information we can share. If you're interested in learning more, I can point you to resources or elaborate on any of these points. And with that, I wanted to close on just a few prescriptive thoughts on culture. I said I wasn't going to talk about it, but here are a few things I've noticed. Be action-driven; I've talked about that before. If you do that, you're going to encourage people to learn by doing things rather than just by talking about them, which I think is very important in a climate of AI hype. Let's actually put boots on the ground and do the thing. You also want to balance short- and long-term costs and benefits.
That might sound obvious, but I still have it as a point because I think it's sometimes lost on folks. We hear about projects where the engineering and science parts are meshed together when they shouldn't be yet. If you separate them, you can ensure short-term gains while still shooting for the North Star of long-term gains as well. You also want a culture with a very high tolerance for ambiguity, especially with some of these machine learning projects where you don't really know if the thing is going to work, or maybe the dataset the use case requires hasn't been used before, there's no great documentation on it, and there aren't really people around who seem to know much about it. There's a ton of ambiguity, and being able to enjoy and thrive on that is really important culturally. And this is probably the most contentious thing on here: I don't recommend having data science be a support team. That's fine if you want to focus on the incremental gains Nathan was pointing out, but if you really want disruptive stuff, let people's curiosity drive it. Make them responsible for business goals and they'll do great things. In every high-performing organization I've worked for, this was the case; where it wasn't, it didn't hold up. OK, so this is what I said before: you want to encourage exploration but not open-ended research, just to prevent throwing money down rabbit holes that don't go anywhere. And if you don't encourage it, people are going to do it anyway. I'm a very hard person to manage, I think; I've been told not to do things and done them anyway because I thought there was promise there, and there are very few times where that hasn't been the case. So it's very important to encourage exploration, but not open-ended research.
OK, some non-obvious tips. The companies who are most successful with their machine learning programs are the ones who have committed to it irrespective of the outcome of any individual project. They're not using a pilot project as a litmus test; they say this is just the first of many subsequent projects, let's do it, and I think that's something to take away from this. So on the macro level, investing in machine learning is important, but on the micro level, at the data scientist level, try to avoid it when you don't need it. There's just a lot of overhead involved with maintaining such a system; deterministic things are much simpler than the probabilistic systems we build, and from the nods in the audience, I'm sure people have very relatable stories. Start small and scope precisely. You can't really just say "let's do AI" and expect a great use case to appear; spend a lot more time thinking than doing, at least at the get-go. That's super important for success. And solve a real problem. I'm not trying to be coy with this; you'd be surprised how many times people think they're solving a real problem, but you talk to somebody else and it's not. I think that's probably just a communication thing, not necessarily anything deeper: people need access to information from a number of different lanes to identify the real problems. And if you don't have quick wins early on, you're going to be defunded; it just happens. And machine learning isn't really a science experiment anymore, it's just product development. I think this is the hardest thing for the people I work with to accept; it's table stakes now. That's important too. And then, because we're in a climate of AI hype, I should point out that machine learning projects unto themselves are not objectives.
They're just a means to an end, so treat them that way. That's probably more for the technically minded people in the audience who are super committed to the technical thing they built and have a problem giving it up and saying "this is close enough and good enough, we have to do something else." It's just a means to an end. And with that, we'll end where we started: the journey. We can help you at any stage. This is something everybody goes through in their own way, but there are common themes to what we've noticed and to where we've seen success, so we're happy to help at any time. And with that, I think I'm between you and lunch.


We are going to kick off the rest of the afternoon by talking about commodity versus custom. This is going to center a lot around the Copilot and Microsoft 365 type of capabilities, but also the broader context of building, integrating, and taking advantage of what's coming down the pipeline from SaaS companies, and how that relates to the things you might build that sit alongside, or are used in conjunction with, each of those use cases. You already know me, but you don't know my colleague, one of our strategic hires. Chris, please introduce yourself. "I'm Chris Blackburn, a solutions architect here on our Modern Workplace team, so pretty much anything Microsoft 365. I work with the rest of our delivery team as well as our sales team to not only educate customers, work through conversations, and understand their needs, but also help prescribe the right tools in that M365 stack for them to implement, and how to get ready for things like Copilot in their environment. I've been here a while now, and doing Microsoft consulting for over 20 years." Awesome, thanks, Chris. So Chris is going to run the majority of this session. I'm going to kick it off by talking about the relationship between these two synergistic angles of looking at AI projects, and we should be done by about 2:30-ish on our track. OK, so there are really two significant boxes that your AI initiatives might fit in. I suppose you could delineate this into softer boxes, but this is the way I've come to think about it: there's commodity, and then there's mission-centric, or you might call it custom, though I think mission-centric is a better way to think about it in the broader context. The way I like to think about commodity is like spell check.
We've all gotten super used to spell check, and if all of a sudden spell check stops working, all the words I always spell wrong but just rely on spell check to fix as I'm writing start showing up misspelled in whatever I'm writing. It's become something I don't even think about, and it's immensely useful to my productivity. If I didn't have spell check, your perspective on the documents I've written, or whatever I ultimately deliver to you, would get taken down a notch, because you'd notice I can't spell "actually" or "probably" or other often-misspelled words, or my grammar might not be as appropriate because I used a semicolon instead of a colon in a certain situation. Commodity is really about lifting up the general productivity of everyone participating in the space we're in. And when you think about where AI takes commodity capabilities, it goes even further than the generic spell check or word prediction we have in Outlook today. It's this idea of: how can I accelerate reuse without even thinking about reuse? Why am I so quick to produce a PowerPoint? Because I've produced PowerPoints for all these scenarios before and can pull things I know and create a PowerPoint really fast, because I know where I've stored all those assets. I've done a statement of work in this type of form, I've done this kind of project, I know where that old asset was, so I can pull that slide over and say: this is the perfect slide for this situation. AI has the ability to do that in an immensely more capable way, because it has an index of all the data that exists within my environment, whatever that is: my sales environment, my productivity environment, my customer service environment.
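The "index of all my data" idea Nathan describes is essentially semantic (vector) search. As a minimal, self-contained sketch of the ranking step, here are toy three-dimensional vectors standing in for real document embeddings; every asset name and vector below is purely illustrative, and a real system would use an embedding model over your actual documents rather than anything shown here:

```python
import math

# Toy "embedding" store -- illustrative only, not any Microsoft API.
ASSETS = {
    "SOW template for data platform project": [0.9, 0.1, 0.0],
    "Supply chain pilot slide deck":          [0.1, 0.8, 0.1],
    "Customer service bot architecture doc":  [0.0, 0.2, 0.9],
}

def cosine(a, b):
    """Cosine similarity between two vectors."""
    dot = sum(x * y for x, y in zip(a, b))
    na = math.sqrt(sum(x * x for x in a))
    nb = math.sqrt(sum(x * x for x in b))
    return dot / (na * nb)

def find_reusable_assets(query_vector, top_k=1):
    """Rank stored assets by similarity to the query embedding."""
    ranked = sorted(
        ASSETS,
        key=lambda name: cosine(ASSETS[name], query_vector),
        reverse=True,
    )
    return ranked[:top_k]
```

The point of the sketch is only that "finding the perfect old slide" becomes a nearest-neighbor lookup once everything is indexed, which is why the index is what makes the commodity uplift possible.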
On the left-hand side are examples of tools we might be using in the commodity space: Copilot in Windows 11, Copilot for Microsoft 365, Security Copilot, and the varieties of SaaS products you might be adopting that are centric to your industry. You'll see AI popping up as a commodity component of all the tools you adopt on a regular basis. On the right-hand side is everything we talked about this morning, which is where AI brings a competitive advantage to your ability to execute within your organization. It's not that commodity isn't an advantage; it's just more like table stakes. If I'm writing documents and I'm not using this and my competitor is, they're simply going to be faster at it than I can possibly be. On the right-hand side, it's a direct alignment to a job to be done that's tied to the mission of my business. I think about the mission of my business and the jobs to be done that my mission enables, whatever those are, whether I'm a manufacturer or a knowledge broker or whatever I do. This is where competitive advantage is driven by accuracy, ROI, sponsorship, and data, enabling the creation of AI capabilities that make my company more effective at what it does, very specifically in its organizational capabilities. So let's talk about examples, commodity first. "Summarize my code and improve it": this is my GitHub Copilot scenario. I've written something; now tell me how to make it better. It can maybe refactor my code, update it, or address it in a different way. "Help me write an email": I need to write an email to someone and I'm not sure how to say it; it's a complicated question, and I'm not sure how to do it with the right empathy. Maybe I want a prompt to help me do it better.
"Summarize a meeting" is going to be a great Copilot scenario. Don't you always hate being the person who has to be the note taker, write everything down, and send out the summary notes after the meeting? It's a thankless job, and a lot of times it makes you less engaged in the meeting. I actually hate taking notes in meetings; I prefer to close my laptop, put it to the side, look the person in the eye, really focus on internalizing what they're saying, and ask good questions while someone else takes the notes. And the person taking the notes tends not to be as engaged, because they're busy taking notes. What if I didn't have to take notes, something took them for me, and then captured the action items that came out of the meeting? That's a commodity capability we shouldn't necessarily need to do ourselves. "Find content in Teams": I used to say SharePoint, but this is even worse. Finding content in Teams is one of my least favorite things to do. SharePoint was hierarchical: you'd go through folders, like in Windows Explorer, and find stuff. Now with Teams it's just everywhere. So that's an example, and how can I make that experience faster? "Analyze security logs" is a Sentinel scenario: how can I enable the team to find the right things in the security logs in my Azure environment? Or "optimize Azure spend": this hasn't come out yet, but there are a variety of blogs about a copilot for Azure spend optimization, which is even better than Azure Advisor; it's super cool. So commodity is all about: all of us need it, so let's enable it broadly across our organizations. And mission-centric is the things we talked about this morning: create a quote for a customer, a customer service chat bot, automated PO-to-order matching, supply chain optimization.
Project health, care Q&A, RFQ filtering. One of the companies I was talking with was all about: how do I filter down the quotes I get so I only focus on the important ones? These are things that commodity probably isn't going to do well for you. When I looked at trying to do them with commodity tools, it sort of works, but doesn't reach the level of accuracy or helpfulness where it actually achieves the goal. Now, SaaS products might intermingle between the two: they might be built for a very specific use case to accomplish a goal, a SaaS platform driven toward making a commodity of doing a mission-centric thing, but we're not entirely there yet. So that's the major divide between these two items, and what we're going to talk about mostly in this session is: how do I enable the commodity use cases broadly across my organization? When you think about the cycle we've been talking about, where you get started, a lot of times deciding that AI is a thing and that it's important for us to do starts here, with group envisioning sessions. Deciding on your idea is where you might ask: is this a commodity scenario that's going to be served by what we adopt in Copilot, or is it a scenario served by us building something like a Concurrency GPT, or a Generac GPT, that does certain things for the organization early on and enables other use cases later? One of the things that's important to creating the strategy is mapping the core business strategy to those goals, aligning the revenue and operational savings opportunities tied to it, and understanding that the commodity uplift to general productivity isn't as easy to capture and articulate as the mission-centric kind. Mission-centric is usually very closely tied to a direct ROI, whereas commodity uplift is more like general productivity and competitiveness.
It's a little harder to measure. A lot of these things, in and of themselves, make you ask: how do I measure that? But put together, you say: wow, that represents a pretty substantial difference in how we do work. What does my life look like then versus now? It would be like trying to measure everybody in the company without spell check. It'd be an interesting measurement, right? Words per minute spelled wrong. So a lot of that goes into a general bucket, and then you select focus opportunities that are mission-centric AI ideas. The last thing I'd call out before handing this over to Chris is that these all work together. You may choose to have a Generac GPT, Copilot, and a customer service chat bot all in the same environment, and they all point at each other to do different things that are focused on the organization's goals, and they're not duplicative in nature, as long as you understand the job you're asking each to perform. One of the really interesting things we'll talk about is this idea of plugins, and that plugins can facilitate one of them talking to another: a Copilot that does a kind of generic job might point into a company GPT, or even directly into a customer service bot, to answer a certain question, and they can all play very relatedly to each other. This goes to that scale model idea too: all of these are part of the same ecosystem that's helping our company be more effective. So with that, I'll hand it to Chris. "And to extend that even more: in some cases they really do reference each other, and it's about which is the best tool for the job."
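One rough way to picture the "they all point at each other" idea is a dispatcher that routes each job to whichever assistant registered for it, so no assistant duplicates another's skill. This is only a sketch of the routing pattern under my own made-up names; it is not how Copilot plugins are actually implemented:

```python
# Hypothetical sketch: each assistant registers the "job" it performs,
# and a router picks the best tool for the job.
class AssistantRouter:
    def __init__(self):
        self._registry = {}

    def register(self, job, handler):
        """Associate a job name with the assistant that handles it."""
        self._registry[job] = handler

    def dispatch(self, job, request):
        """Send the request to the registered assistant for this job."""
        if job not in self._registry:
            raise KeyError(f"no assistant registered for job: {job}")
        return self._registry[job](request)

# Illustrative assistants -- stand-ins for a generic Copilot, a company
# GPT, and a customer service bot living in the same ecosystem.
router = AssistantRouter()
router.register("summarize_document", lambda req: f"summary of {req}")
router.register("customer_service", lambda req: f"support answer for {req}")
```

The design point mirrors the talk: the pieces aren't duplicative as long as each one has a clearly understood job, and the "plugin" layer is just the routing between them.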
As we start to look at the OpenAI platform, it might be about data within the business, or it could be Bing Chat Enterprise, where we're just using large language models to understand some concept or look at some data. And then as we start to talk about Microsoft 365 Copilot, at the heart of it is: what data is in that platform that's really going to help drive efficiencies in the business? As we dig into where a lot of us have our data today, we have it in Teams channels, in SharePoint sites, in mailboxes, and we want to be able to produce an answer to a question based on data that's already there. That's when we start to look at that first objective, the business outcome. What are we looking to get out of one of these ChatGPT-like platforms? What are we looking to get out of Microsoft 365 Copilot, which is going to be more publicly available in about a month? We'll talk about that a little. As you start to think about the business scenarios, if the Microsoft 365 scenario is one that interests you, this is a great time to be thinking about where you want that tool to provide business outcomes, because that's a tool a lot of businesses are evaluating for ROI as they look at the licenses, the requirements, and integrating it more into the business through things like connectors and plug-ins and their existing data. How is that going to drive business efficiencies across your user base? And I mention the cost because, as we look at each of these different SKUs as Microsoft presents the concept of their Copilot over and above Azure, over and above Bing, there are specific requirements you'll want to meet and specific costs that come with those. Starting with Windows itself: that is generally going to be available; it's starting to roll out now.
Probably by the time Ignite rolls around in the next month, we'll start to see 23H2 become available to the general public. You'll be able to see your users engage with Windows Copilot, or maybe you'll want to disable that; we're definitely seeing both scenarios. You can also push end users to something more like Bing Chat Enterprise, where we want them to be able to reference those large language models but do it in a protected manner, where we know that if they're searching with organization-specific content and looking at it in a broader context, it's not going to feed into those large language models. So it provides more security and commercial data protection. And then as we start to look at Microsoft 365 Copilot, that commoditized version of AI built into every facet of the Microsoft tools we use every day, like SharePoint, Teams, email, and the web: all of those require at least some understanding of how we're positioning our security, privacy, and compliance around our data. How are we ensuring that data is protected as we look to present it? Maybe some of that protection is just within your organization; maybe you don't want certain content you have within SharePoint to be widely available to everyone in your organization. So being mindful of your data protection policies is very important before you even turn a feature like this on. Then there's leveraging the Microsoft 365 chat and some of the plugins that Nathan mentioned; that's a capability we can integrate into other tools to make available inside of Teams, where we could look for certain content or integrate with a third-party platform you may be using; a common one would be something like Jira.
Maybe you're using that in the context of Copilot, and that's something you can integrate not only into your cloud-based experience, but also into your Microsoft 365 apps. I don't know if anyone has seen this diagram; you might have come across it on the web when asking: what exactly is Microsoft 365 Copilot? There's been some discussion of the semantic index and how it takes a look at your data. There's the context of your applications, and then the referencing of those large language models, which you can see on the right-hand side of the screen. This happens every single time you do a search using Microsoft 365 Copilot to ground the response, and at the end of each session it destroys the data. That's part of the corporate data protection that's in Bing Chat Enterprise today. As you look to leverage your own semantic index, which is something you'll be able to turn on in the upcoming months, it's going to give you the capability to leverage those ChatGPT-like functionalities within your Microsoft 365 environment. So indexing is at the heart of successfully leveraging your data in Microsoft 365 in the context of Copilot. If we break down a specific search using the semantic index: it uses things like organization terms, your custom dictionary of words used in your organization. It takes specific documents; maybe there's a phrase that's used commonly across your documents, and it's able to index that and recognize it as an important excerpt; it could be a mission statement. And then, looking at the semantic search itself, it's you interacting with your data in your organization, looking at the data or documents that might live in your OneDrive or a team site in SharePoint; it's providing that index.
It also looks at the context of a document when I'm doing a search, and at different searches within your organization, and can produce things like frequently asked questions: if multiple people are performing the same search, it learns from that, and based on the semantic index it can provide some of those results. So this is a high-level breakdown of what a result would look like using this semantic index: understanding what's in your environment and providing more accurate results than traditional SharePoint search today. So let's take a poll; raise your hands. Is everyone familiar with Microsoft 365 Copilot? Does anyone have Microsoft 365 Copilot today? Fantastic, so we've got two in the house; I'd love to talk to you afterwards, because there are some stringent requirements, which we'll discuss, for getting access to it within your organizations today. As we look at the different Copilots, you're going to hear it everywhere: there's a Copilot for Power BI, a Copilot for PowerPoint, one in Windows, and we'll talk a little bit about the security aspects and how it's coming to Intune to look at your devices in the future. AI is tying into everything in the Microsoft world, and understanding what it's capable of doing will help you decide some of those business cases down the road. As we look at Copilot itself, we can see that it's not only that cross-app capability; it also integrates into the whole of your Microsoft suite. It'll integrate into Windows and into every facet of the Microsoft world, as I mentioned. But let's start to look at some use cases where Copilot may be valuable: say you're going through and using Whiteboard, and you're doing some brainstorming.
It has the capability to take some of that brainstorming, some of those notes, and provide an interactive summary of what was captured, as well as generate action items. I think that's probably going to be one of the biggest values of Copilot itself: across all these different integrations into the Microsoft world, being able to take large amounts of data and provide quick, concise summaries. Here at Concurrency, some of us are starting to look at the Teams Premium license within Microsoft 365, and there are some meeting summary features we'll look at in a second, but I think the biggest driver of adoption, and some of the success of it, is how it will speed up your end users' day-to-day lives and how they work in each of these products. Copilot for Outlook: this particular add-in definitely requires the new version of Outlook; you might have seen that little toggle switch, if you're using Outlook on your desktop today, to activate the new version. It can look at your emails and provide a summary; if you're exchanging emails with multiple individuals in a thread, it can summarize that interaction, so if you get looped into a conversation, you can quickly get up to speed without going through multiple conversations and interactions in an email thread to understand what's actually happening, which I know I have to do sometimes. It has the ability not only to track assignments but also to attend a meeting for you. That I have not seen in person, but I'm very curious to see how Outlook can respond to emails or capture some of that, apart from the Teams Premium integrations.
So here's an example of that Outlook integration: you can see a meeting note and then a particular action item in a summary within an email. And on this next slide you can see meeting summaries. This is probably one of the coolest features: it takes the current Teams meeting experience and gives you the ability, even if you missed a meeting, to quickly catch up by looking at a meeting recap to understand what went on and whether there are specific tasks assigned to you, without having to go back and watch the meeting in its entirety, or even read through a transcript of the meeting, which I've done several times, and it's not a fun exercise. So being able to have those summaries definitely provides value. Then there's jumpstarting documents. Let's say your team is working on a number of documents; you know notes have been taken in OneNote, and there are other things like a contract, et cetera. You can go through and, just using natural language, say "draft a proposal for this customer based on this document," and it will help you create the foundation of the document without having to start from scratch. A lot of organizations have templates, so being able to take a template plus notes from a meeting will help jumpstart some of that work as well. And one of my favorites, which I'm really excited about, is on the go: being able to use natural language on my phone, with speech to text integrated with Copilot, and have it actually create a draft of an email or even a document on the go.
That's definitely a time saver, because I've had to type up emails from scratch, and being able to use Copilot to jumpstart that while on the go will be an immense time saver. So throughout each of these scenarios, I think the biggest theme is being able to take data, conversations, and documents, quickly streamline them, give you the knowledge of what's going on in a conversation, help jumpstart interactions with customers, and allow you to take that next step of productivity without having to start from scratch in each interaction. And then there's taking that interaction with the Microsoft 365 platform and starting to look at other facets of the business where there may be integrations. Microsoft has a number of Graph connectors that allow you to integrate Microsoft 365 with a plethora of platforms, and the list just continues to grow. Say you integrate a booking platform; a lot of us are familiar with booking platforms that let us make reservations. In this scenario, a fellow coworker has booked a flight, and now I've been asked to travel along with them. As we can see, we can ask Copilot to help us book this flight, and at the end we can just click that book button without having to send an email or Teams chat to that coworker asking for their itinerary. It's able to use the Graph, pull that information, look for that same flight, and give us the capability to book it as well.
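Scenarios like the flight lookup lean on Microsoft Graph under the hood. As one concrete but simplified illustration, Microsoft Graph does expose a `search/query` endpoint for searching across Microsoft 365 content; the sketch below only builds the JSON request body for such a call. The query string is made up for illustration, and authentication and the actual HTTP POST are omitted:

```python
import json

# Real endpoint; calling it requires an OAuth token, omitted here.
GRAPH_SEARCH_URL = "https://graph.microsoft.com/v1.0/search/query"

def build_search_request(query_string, entity_types=("driveItem",), size=5):
    """Build the JSON body for a Microsoft Graph search/query call.

    entity_types controls what is searched (e.g. driveItem for files);
    size caps the number of hits returned per request.
    """
    return {
        "requests": [
            {
                "entityTypes": list(entity_types),
                "query": {"queryString": query_string},
                "size": size,
            }
        ]
    }

# Hypothetical query, mirroring the itinerary scenario above.
body = json.dumps(build_search_request("coworker flight itinerary"))
```

This is only the shape of the request; the Copilot plugin layer handles the grounding, permissions trimming, and follow-on actions (like the "book" button) on top of results like these.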
All of this was done just within the Microsoft 365 platform, which makes it easy to perform tasks that may not necessarily be centric to word processing or creating documents or presentations, but still help business productivity within that M365 space. As we look at Copilot, and at AI-driven capabilities even outside the Microsoft 365 Copilot stack, there are a number that exist today we can take a look at as well. Bing Chat Enterprise is that protected version of ChatGPT without creating your own internal GPT instance. It allows you to leverage the large language models while keeping those searches private to your organization. So if I go to bing.com/chat and I'm already a Microsoft 365 customer and it's turned on in my environment, I'll see that I have a protected session and can start searching. I can go back and reference those searches, save them in different categories, and have peace of mind that none of the data I look for is contributing to the large language models. I also have the ability to look at meeting information, as we saw before with the meeting summaries; that's going to be a huge time saver with the Teams Premium capabilities. The intelligent recap, at its heart, is one of the biggest capabilities of that specific license. One thing to note: to really take advantage of intelligent recaps, you must have recording and transcription turned on for your meetings, so it can take the transcription and the meeting itself and produce the intelligent recap. So that's one of the biggest benefits of Teams Premium.
It gives additional meeting options: when you go into Outlook and click meeting options, you can do things like watermarks, manage who can record and who has access to some of the data, do virtual appointments, and get an improved webinar experience. If you're using a third-party webinar platform, this license will let you take advantage of that and potentially have cost savings over those third parties. One of the big things I'm really excited about, as a streamer myself, is the ability to take external encoders, external sound, and external cameras and produce a meeting with a live streaming input. So instead of leveraging just a laptop and starting a meeting in Teams, I can use some of that external hardware. It's very akin to the live meeting support Teams had when it was launched; it's built into the webinar experience now and adds that enhanced capability. The one thing to call out with the Teams Premium feature: it's a little discounted as an introductory price while Microsoft rolls it out, but at GA it's looking to be about $10 a user. So that's another component of the Copilot experience, the AI experience, that lives outside of just Microsoft 365 Copilot. And then there's Windows Copilot, which, as I mentioned at the beginning of the conversation, Microsoft is starting to roll out as part of 23H2. From what I've read, it's probably going to take the next month or so, I'm going to guess up until Ignite, to roll out to users. There is a guide I put together, at bit.ly slash Windows 11 copilot, that walks you through enabling the Windows Insider Preview and actually installing that capability.
So you can take a look at Windows Copilot inside the operating system. As part of that update, there's a lot of new functionality; think of it as Windows 11.1, as I've heard some people on the web refer to it, with new featured capabilities built into this particular build: using natural language and speech to navigate and interact with different features in Windows, and generating AI-driven images, which is one of the capabilities you'll find in Microsoft 365 Copilot as well and will exist within that new build. It also adds things like the new File Explorer and enhanced OneDrive; there are a lot of capabilities coming as part of that new Windows build. The only caveat to call out is that if you have any users in the EU, there's a pause on that rollout as Microsoft looks at the Digital Markets Act and works with the EU to ensure the new version of Windows 11 meets those guidelines. Security Copilot is one that's still currently in a private preview. It leverages tools like Sentinel and Defender, using AI and machine learning to take all that data, look at security events versus normal use in your environment, and help you respond at machine learning speeds. To talk a little about Security Copilot: it allows you to take incident response, usually within minutes, for any sort of security event.
It allows you to use natural-language prompts to interact with the log analytics data that may live in a platform like Defender or Sentinel and dig deeper, based not only on the use of those tools within your environment but also on Microsoft's broader security intelligence data. It helps your team so that you don't necessarily need someone fully trained on things like Log Analytics or Sentinel; it allows you to bring in analysts who can look at the data just by using natural language, and it can cut tasks that may have originally taken hours down to a few minutes. That lets you not only modernize your security operations through an enhanced posture, but also streamline incident response, provide summaries of security events, and build customized responses. The big thing is really that chat-based ability to interact with your security data and surface issues at a much more rapid pace. From an availability perspective, Security Copilot is still in Early Access preview: you have to be an enterprise customer and work with your Microsoft team to be nominated to use it. There is a fee for the one-year license, and at this current time there is still no information on what it will cost after it goes into GA. Some of the value is in helping your security analysts and in picking up additional use cases based on what Microsoft is seeing across its customer base. So think of it as Sentinel, but connected to security events in a way that's almost similar to what you'd have in Defender. And to call out the prerequisite as well: the Defender P2 license is required.
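To make that natural-language-to-query idea concrete, here's a toy Python sketch of the kind of translation an assistant like Security Copilot performs at scale: turning a question like "show me accounts with repeated failed sign-ins in the last day" into a KQL query against a Log Analytics table. The function and thresholds are purely illustrative assumptions, not any Microsoft API; `SigninLogs` is the standard Azure AD sign-in table in Log Analytics.

```python
# Illustrative only: building the KQL that would answer a natural-language
# question about failed sign-ins. A real assistant generates and runs this
# against Sentinel/Log Analytics for you.

def build_signin_failure_query(hours: int, min_failures: int) -> str:
    """Return a KQL query for accounts with repeated failed sign-ins.

    A non-zero ResultType in SigninLogs indicates a failed sign-in;
    the time window and failure threshold are example parameters.
    """
    return (
        "SigninLogs\n"
        f"| where TimeGenerated > ago({hours}h)\n"
        "| where ResultType != 0\n"
        "| summarize Failures = count() by UserPrincipalName\n"
        f"| where Failures >= {min_failures}\n"
        "| order by Failures desc"
    )

# "Show me accounts with 10+ failed sign-ins in the last 24 hours"
print(build_signin_failure_query(hours=24, min_failures=10))
```

The value proposition in the talk is exactly this step: an analyst who has never written KQL gets the equivalent of this query (and its results) from a plain-English prompt.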
For some customers that could be E3 with Defender added, or an E5 license, to get the P2 suite and the additional analytic data that comes as part of P2. So, with all of the different AI tools and Microsoft 365 Copilot coming into GA very soon, what are some things you can do to be ready when it becomes available? If adding AI-driven assistance for all of your users is paramount and you want to add it as soon as it hits GA, there are some things to keep in mind to prepare. Users need E3 or E5 licenses and an actual Azure Active Directory account. They also have to have OneDrive provisioned, because of how Copilot interacts with that data. Leveraging all of your services within Microsoft 365 is very important to get the richest data for the cross-app experience; we'll talk about that in a few slides. You'll want the new Outlook enabled for your users. The good news is they can use the new Teams or the classic Teams experience, whichever your users are happier with. There's also Loop: certain components of Microsoft 365 Copilot leverage Loop for some of the summaries, so Loop would need to be enabled. Your devices must be on the Current Channel or the Monthly Enterprise Channel. And then there's meeting the requirements of the Tenant Readiness Council. You might be asking yourself, what the heck is that? I asked the same question. In order to qualify to use Copilot, Microsoft wants to make sure your tenant is set up and configured for success, and the Tenant Readiness Council checks those requirements; we'll talk through them. As of November 1st, all enterprise customers are eligible to get Microsoft 365 Copilot.
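As a rough illustration of that readiness checklist, here's a hypothetical Python helper that flags unmet prerequisites for a tenant. The field names and structure are invented for the example; the real evaluation is done by Microsoft as part of the nomination process, not by code you run yourself.

```python
# Hypothetical sketch of the Copilot readiness checks described above.
# Tenant fields are made-up illustrations, not a real Microsoft schema.

def copilot_readiness_gaps(tenant: dict) -> list:
    """Return a list of unmet Microsoft 365 Copilot prerequisites."""
    gaps = []
    if tenant.get("license") not in ("E3", "E5"):
        gaps.append("Users need Microsoft 365 E3 or E5 licenses")
    if not tenant.get("onedrive_provisioned"):
        gaps.append("OneDrive must be provisioned for all users")
    if not tenant.get("new_outlook_enabled"):
        gaps.append("The new Outlook must be enabled")
    if tenant.get("update_channel") not in ("Current", "Monthly Enterprise"):
        gaps.append("Devices must be on Current or Monthly Enterprise channel")
    return gaps

# Example: a tenant that still has two items to fix before it is ready.
print(copilot_readiness_gaps({
    "license": "E3",
    "onedrive_provisioned": True,
    "new_outlook_enabled": False,
    "update_channel": "Semi-Annual",
}))
```

An empty list would mean the tenant clears these particular checks; the Tenant Readiness Council review adds further criteria such as usage health, covered below.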
I mentioned licensing on the previous slide. The thing is, with Copilot it's one-to-one: if your users have E3 or E5, you have to have the same number of those licenses in order to get the Copilot licenses. And you can't purchase it through your usual licensing provider; you can't go to a CDW or a SoftwareOne and buy it right off the shelf. Instead, you have to work with your Microsoft enterprise team to essentially be nominated, and then work alongside them to become eligible. As of right now, retail and CSP customers are not eligible, at least through the end of 2023, and that ties back to the requirement of a 300-seat minimum. The cost is $30.00 per user per month, so the minimum comes in at just over $100K a year (300 seats at $30 a month for 12 months is $108,000) for just the minimum number of Copilot seats. Looking at data residency requirements, Copilot is designed to work within GDPR and EU requirements, so that's taken into account. And then finally, Microsoft looks at your usage health to make sure you have healthy adoption. So there's quite a checklist, and I wanted to make sure it was detailed for customers: things they have to go through and meet in order to be eligible to use Copilot, on top of the licensing requirements and the minimum number of seats.
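The seat-minimum math mentioned above, worked out explicitly:

```python
# Minimum annual commitment for Microsoft 365 Copilot at the announced
# enterprise pricing and seat floor.
SEAT_MINIMUM = 300          # 300-seat minimum for eligibility
PRICE_PER_USER_MONTH = 30   # USD per user per month

annual_cost = SEAT_MINIMUM * PRICE_PER_USER_MONTH * 12
print(f"Minimum annual commitment: ${annual_cost:,}")  # $108,000
```

Which is why "roughly $100K a year" is the practical entry point before a single seat beyond the minimum is added.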
So, ensuring a healthy tenant: Microsoft has an Adoption Score report within your Microsoft 365 portal. You can go in and take a look at it at any point; this is some of the information Microsoft looks at to ensure you have adequate usage. As you can see from the sample image, an organization score of 58 isn't necessarily going to keep a customer from being eligible for Copilot, but it's going to drive a lot of their success. You can see that they use Teams a lot for their meetings, they're collaborating a lot within Teams and SharePoint, they have broad adoption among mobile users, they're using the Microsoft 365 apps, they have great network connectivity, and their endpoints are in a healthy state. All of those are contributing factors to your Adoption Score, and Microsoft looks at this to ensure you're on track. If that number were very low, the Tenant Readiness Council might come back and say: we think you need to go back and drive better adoption within your tenant, so that when you do turn on these features they provide the best benefit possible. The other thing on the readiness list for Microsoft 365 Copilot is your data, and that means three core tenets: looking at your permissions, understanding what you're doing for data loss prevention, and understanding what you're doing for sensitivity labeling today. If you have E5, leveraging that in conjunction with automatic classification is going to give you the best results when it comes to the privacy of your data, and you can leverage the reports that exist within the Microsoft portal today to be successful in adopting not only good data standards but Microsoft 365 Copilot itself. So there's a lot to think about in being ready for Microsoft 365 Copilot, not only your tenant itself but also the licensing and cost requirements, as we move into that November timeframe when it becomes available.
And then, over and above the interactions of your Microsoft 365 data with the OpenAI platform, there's the ability to use that Microsoft 365 data with the AI tools that accelerate development. This crosses the line between the commodity tools we just talked about and the mission-centric capabilities: understanding what's available to speed up adoption, help your developers and your teams, and build intelligent apps within your organization. The Microsoft AI journey has actually been going on since 2017. You heard Brian Haydin here earlier today saying that he's been doing AI projects for a long time, and he's right: a lot of these capabilities have lived within the Microsoft platform for some time, and they've continued to mature into 2022 and 2023. As ChatGPT hit the stage, everything Microsoft had been doing, going back to 2017 and defining their six core tenets of responsible AI, had really been building up to this point: this pivotal moment where OpenAI is ready for the main stage across all of the Microsoft tools that exist today. And as we continue to adopt AI, one of the biggest accelerators Microsoft sees for success in this era is no-code: taking tools like the Power Platform and having them widely adopted within your user base, so that users without a technical background can adopt these AI tools. They can do it quickly, and they can do it cost-effectively, because they don't have to have coding backgrounds.
There isn't the same need to hire developers, which reduces the overall cost of not only building these community-driven or employee-driven applications but also maintaining them going forward. The other component that really drives no-code is extensibility with the other Microsoft products: being able to tap into things like Microsoft 365 and have Copilot read data out of Teams, out of SharePoint, et cetera, extends an application into the real-world data that lives within your organization, and does it in a way that is scalable and secure. Understanding Microsoft's frameworks and security standards will ensure that data is protected, so this era of no-code is really going to drive a lot of what we see in the AI boom over the next few years. You can leverage the Power Platform to easily build out things like workflows: a user with no coding experience can build a workflow that looks at an email when it arrives and determines what happens with it. What does it do, and what kind of workflow does it go through? It pulls information, gets manager data, sends out an email, looks for keywords like "invoice," and then starts a specific workflow to approve or reject the message. All of this happens with your end users; no one in IT or development has to drive this level of adoption, because a tool like Copilot inside the Power Platform lets them easily build these types of business automation processes with the tools at their disposal, without the formal knowledge of workflows that developers have.
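The invoice-routing logic described above can be sketched in a few lines of plain Python. This is just an illustration of the decision rule; a real implementation would be a Power Automate flow using its mail and approval connectors, built with natural-language prompts rather than hand-written code, and the workflow names here are invented.

```python
# Toy sketch of the email-triage rule a citizen developer might describe
# to Copilot in Power Automate: "when an email arrives, check for invoice
# keywords and start the right approval workflow."

def route_email(subject: str, body: str) -> str:
    """Decide which (hypothetical) workflow an incoming email should start."""
    text = f"{subject} {body}".lower()
    if "invoice" in text:
        return "start-invoice-approval"    # approve/reject workflow
    if "purchase order" in text:
        return "start-purchasing-review"
    return "file-to-inbox"                 # default: no special handling

print(route_email("Invoice #4417 attached", "Please approve by Friday"))
```

The point of the no-code pitch is that the business user never sees logic like this; they describe the rule in plain language and the platform assembles the equivalent flow.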
The ability to leverage GitHub Copilot, which is now GA inside tools like GitHub and Visual Studio, will also let those in-house business tools be developed and matured a lot quicker, and lets your developers write code a lot more easily. It helps them not only with the languages they understand and the code they've written, but also saves time completing code, troubleshooting code, and so on, and they can drive it with natural-language commands. As you can see at the bottom, there's a lot of value behind the new version of Copilot integrated into GitHub: using the OpenAI models and natural language to solve problems within code, or even just to understand code. In this scenario, the user is looking at specific details within the code itself, maybe needing to understand how to call a specific piece of code, step by step, in order to fix an issue. GitHub Copilot enables and speeds up those troubleshooting processes as well as the development processes. And then finally, there's not necessarily Microsoft Copilot but Copilot chat in your own applications: taking integrations from applications that live outside of Microsoft 365's world and building chat into those tools using plugins, to speed up the interaction of your users with the data that lives within your business.
As Nathan referred to early on, there are three different vectors for interacting with your data: things like Bing Chat Enterprise for the large language models; Office 365 and Microsoft 365 Copilot to understand the data within your environment; and then, taking it a step further, your business applications, with integrated chat capabilities that reach inside your business as well. So you're probably asking yourself: where do I start? We've talked a lot about OpenAI today, and a lot about the different types of Copilots. There are multiple ways to approach these tools, and depending on where each of your organizations is in its maturity, you may see different places to plug these technologies in. Over the years at Concurrency, we've used this model of a technology maturity curve to look at where organizations are in their maturation process and to understand where you are in that journey. Keep in mind that innovation doesn't necessarily have to happen at the top of the curve; some of it can happen early on. But as your organization starts to crawl, walk, and run with AI, this is where some of those pieces of the journey fit in. If you're still on Office 365 today, maybe the next step is looking at the Microsoft 365 E3 platform. Maybe it's leveraging the large language models through Bing Chat Enterprise, which is just broad search, or maybe it's a more narrow search with Azure ChatGPT to look inside your organization. It's also looking at what you're running for Windows, to be ready for Copilot and that integration.
Not only on the desktop, but also maybe leveraging tools like Teams Premium to help your users be more efficient in the meetings they're in every day. And then, as you take the next step up, it's looking at Microsoft 365 Copilot or Security Copilot to enhance productivity and security, continuing to innovate not only through the no-code platforms but then really starting to leverage the pro-code platforms as well. That's all I've got. Any questions?

Question: the cost of this increases my Microsoft spend by 25%. Is there going to be an opportunity to evaluate it for a limited time, to see if my organization will find value before absorbing that additional cost? At this moment, the short answer is no. As Microsoft continues to iterate on the business cases they see with Copilot, and as they look to move into the retail and CSP channels where you could consume it on a smaller scale, maybe even month to month, there will probably be some opportunity. But without looking into a crystal ball, where I stand right now in October 2023, Microsoft doesn't seem to be going down that road, at least from a retail perspective, or a GA perspective I should say. I can posit one observational reason why that's the case: when you turn on the semantic index for Copilot, remember, what it's doing is burning a whole lot of GPU in a Microsoft data center to build a model over the data you have in SharePoint and Microsoft 365. That's what segments it from, say, a general GPT: a narrowly focused Copilot is building that sort of brain against your entire data set. It's not going to do a few things very precisely; instead it will do a whole lot of things really well. But that takes hardware and compute that someone needs to pay for. That's why you can't turn it on for just five people: it wouldn't be economically reasonable. There has to be some bar where it actually makes sense to run that against your environment. Great question. Yes, sir.

Question: is Teams Premium different than Office 365? Is it a subset? It is an additional license. If you have Microsoft 365 E3 or E5 today, it's a bolt-on license, kind of like the bolt-on licenses Microsoft now has for Intune to add additional features. Maybe they'll add another SKU at some point to bundle all of those together; only time will tell, but they've definitely branched out into these additional add-ons on top of your existing licensing.

Back up the slide to the maturity curve real quick. It's important to note that the curve isn't entirely intuitive, because it is a curve, but these things can happen somewhat out of order, in the sense that the things up in the innovation space are not really dependent on the things happening lower down in the spectrum. So be aware that it's not strictly a step curve; some things are simply more accessible than others. Doing a company-specific GPT is one thing, and E3 or E5 is a little more attainable than, say, Microsoft 365 Copilot, because of the steps you need to take to get there. But they all work together at different points in this journey, and they're all going to be valuable for you, especially considering that a lot of the items down here have been available since early 2023, and a lot of the items up here are coming out closer to Ignite.
If we were to take a step forward into 2024 and hold this same conference, I think this curve would make a lot more sense just because of the availability of these features, and how easy it will be to get into some of them at very low or no cost based on licensing you already have, versus the whole ideation process that lives around these last three. That's where the curve really starts to go upwards, because of the business readiness you'll need to implement those successfully. The bottom half is more generally driven by commodity enhancement, and the top half is more use-case driven by specific value scenarios.

Jeff, you had your hand up as well? It was basically the same question: what is it going to take for an individual user to get that Teams Premium access? Yes, let's see if we can go back here for a second. There's definitely a lot of value; even as we're using it here at Concurrency, we've started to see some of that be realized. Let me just back up the slides. Hang in there, we're going for a ride.

OK, while he's doing that, before everybody leaves, I want to ask one thing of you. We want feedback, and we also want you to win a raffle, so there's an incentive on this. If you still have your agenda, there's a QR code on it. I'd really appreciate it if all of you would click on that and fill out the feedback. Tell us what you liked, tell us suggestions that would make this more awesome, tell us what we need to improve, or why you hated it. Hopefully it's mostly positive, but we'd love to take critical feedback too.
Give us all of that, because we're on a roadshow and we keep making this better every single time. We're going to cover two more cities, and probably more after that, so we want to take all of your feedback into it. So please do us that favor, and if you want to talk after this, we'll be hanging around and would love to connect with each of you. And if anyone needs an agenda, I think we have extras; please raise your hand and we can get one to you. We'll put the QR code up on the last slide as well.

So, back to Teams Premium: the meeting recaps and some of the other functionality in Teams Premium come with that additional license, as I mentioned, that you can bolt on. And it is a little bit cheaper right now; I think Microsoft is using that as an incentive to get people into the feature. As I mentioned as well, meeting recording and transcription is an important piece of it. So as you're interacting with members of your team or outside customers, there's that caveat to remind them: "we're using this to provide better meeting recaps and efficiency; are you OK if we turn this on?" at the beginning of the meeting, rather than a customer or someone else walking in and seeing that it's automatically turned on. That itself is something to keep in mind, almost as a script to deliver to anyone who joins the meeting explaining why it's happening. Just for my own curiosity, by a show of hands, how many of you have transcription turned on for meetings today? OK, so it's still pretty low. For those who didn't raise their hands, does anyone want to share why you don't leverage it today?
Audience: on transcription, one of the big things there is that our attorneys have decided to say no. There are states that require both parties to consent to being recorded, and it's one of the reasons we haven't turned it on for our Zoom phone calls either. If you get into a call with somebody in a state that requires both parties to know, you can get into legal trouble if you don't tell everybody on the call, "hey, this call is being recorded," even if you're the one calling them. Sure, they could sign off on it inside their states, and I can see the legal quandary there. So thanks for that feedback; the legal part was definitely the biggest concern. Yes, sir.

Question: not on transcription, but have you come across an issue where only one person is licensed with Premium, and if they're not the meeting owner they don't get the recap afterwards, even if they recorded it? So unless everyone is licensed, they don't get the recap? To be quite honest, I haven't run into that yet, but I will definitely keep it in mind. As we're starting to turn it on for a lot of our delivery team, not only our delivery people within Modern Work but also our project managers, that's something we're kicking the tires on to see what its capabilities are. I know that in some of the Teams channels I'm in on Microsoft's tenant through Concurrency, if I try to go in and view a meeting recap, it says it doesn't share with outside organizations. So my guess is that at some point in the future, allowing licensed external organizations to see the recap will be a sharing option that has to be turned on, so both sides can actually see it. But that's a good call-out.
We'll keep an eye on that, but to answer your question, I haven't seen it yet. Audience: my experience was that even internally, it's only the meeting owner; if they were licensed for Teams Premium and started the recording and transcription, that would actually produce the recap. OK, I could see that making sense: if someone else in the meeting started recording and they weren't licensed, it wouldn't capture the recap. I have gone back to some meetings that were recorded and transcribed before Premium was turned on and didn't see any recap on them, so I'm thinking there's probably an additional check on the Microsoft side that generates the recap only when the recorder is licensed and is the meeting owner. Great question, thank you. Yes, sir.

Question: is there a big list of prerequisites for Bing Chat Enterprise? No, actually, that one's probably one of the easiest. Really the only prerequisite is E3, E5, Business Standard, or Business Premium. As long as you have one of those licenses, you can turn it on in the tenant. In fact, I've found in some of the tenants I have access to, even some demo ones, that as long as those licenses existed it was already turned on. I guess that might be a problem at some point, but at least the good news is that even if it's on inside your tenant, your users log in with their identity and it protects those searches: your prompts and data aren't fed back into the large language model. It's enabled at the tenant level; you have to go to a link as a global admin, though in some cases I've seen it already on. The main thing to be aware of right now with Bing Chat Enterprise is that you can't surface your company-specific data through the chat.
Say that again? You can't surface company-specific data. Right: I can't search for data that's in my SharePoint or Teams through this. So putting things into it is safe, but you can't get your own data back out? Exactly; it just doesn't serve any of that. Being able to say "I want to start surfacing this and this and this in the chat" is one of the main reasons companies take the next step of provisioning their own. But this is great for basic safety: I can search for, say, the ROI of a migration to the Azure cloud, but I couldn't go in and search what a specific customer project's ROI was. It wouldn't be able to return that information, because it doesn't index it.

Question: why would you pay 30 bucks a month per user? And if only, say, 15 people in your 300-person organization have Microsoft 365 Copilot, is it only that group that has access, or does everybody have to be licensed? I believe as long as a user has the license, you can turn on the feature for them. And there's also a standalone cost for Bing Chat Enterprise per user: yes, $5 per user per month.

If you go to the GitHub one: so you've been using GitHub Copilot X? Yeah, that is some really interesting stuff. Brian, if you have dev teams and they are not using Copilot or Copilot X, this is another easy win. It's not terribly expensive to enable, and once you get access it provides a tremendous amount of value to the productivity experience. It especially augments learning a new capability, understanding a code base, or learning a new language; it's a great extra benefit. And this one is a little older. "Older" is probably the wrong word: Copilot is older, X is not, and the chat just dropped for everybody very recently, I think yesterday or the day before. GitHub Copilot itself has been around; the X capabilities were rolled out at Build, and when they made it available you had to sign up for a waitlist to get on it. So if you have the chat, you're using X, is that true? Well, this is Copilot inside Visual Studio or VS Code, inside your IDE. I'm using Copilot in VS Code, and I'm not sure if I'm using GitHub Copilot or Copilot X, but you have the chat interface where you really interact with it. That chat comes with X; there's also the code-completion feature, the autocomplete, that is just Copilot. Being able to select portions of code, right-click, and have Copilot come back with an explanation or a fix: that experience is a super productivity increase for developers. It's natural-language prompts, basically; you watch your productivity increase when you can say "let me do something like this." It really does depend on the developer's skill level and their commitment to using it, but the people that are using it on a regular basis just can't do without the product.