View Recording: Milwaukee AI Summit 2026 – The Agent Portfolio

March 3, 2026

In this session, Brian Hayden, Solution Architect at Concurrency, shares a pragmatic framework for organizations that are trying to move beyond “AI excitement” and into repeatable, measurable outcomes. While AI capabilities are advancing at an extraordinary pace, Brian argues that most organizations, especially those outside major tech hubs, are living in a different reality: adoption moves slower than innovation, and that gap is not a failure. It’s an opportunity to move deliberately and win long-term.

Rather than chasing headlines or sprinting into broad experimentation, this talk focuses on how to close the capability-adoption gap intentionally, by treating agents like a portfolio of products, not one-off tools. Brian introduces practical models to help leaders decide what to build, how to prioritize, how much autonomy to allow, and how to put governance in place so teams can scale without pulling the plug on pilots.

This is an operating-model session, not a demo. You’ll leave with a clear way to identify high-value agent opportunities, score them, govern them, and build momentum over your first 90 days, without “altitude sickness” from moving too fast.

WHAT YOU’LL LEARN

In this webinar, you’ll learn:

- Why the AI capability curve and the AI adoption curve are fundamentally different, and how to close the gap deliberately
- The three “camps” of adoption Brian sees across organizations: Commodity AI (table-stakes tools), Broad experimentation (multiple pilots, often where the “wreckage” happens), and Purpose-built AI integrated into high-impact workflows for sustainable advantage
- A clear vocabulary for modern AI initiatives (LLMs, RAG, agents, and Copilots) and why these are “operating roles,” not just product categories
- The Autonomy Ladder (five levels) for agents, and how autonomy demands increasing discipline and governance: Observe → Recommend → Prepare (human approves) → Execute in sandbox → Execute in production, plus why many mature organizations start at Level 2–3 and why skipping straight to Level 5 causes pilots to fail
- The Agent Portfolio operating model: “Stop buying AI. Start managing agents like a portfolio.”
- A practical method to find your best agent candidates: frontline pain (ticket queues, production floor, repetitive work), leadership themes (strategic priorities like cycle time reduction, risk reduction), and build events (hackathons/ideathons, but only if fed into a governed portfolio)
- A simple opportunity scorecard that captures: workflow + systems touched + access + data required + control level + success metrics
- A prioritization “shaker table” scoring model that weighs business impact, time to value, execution fit, trust/risk, and autonomy ceiling (see the sketch after this list)
- Why prompt injection is one of the most underappreciated risks in enterprise AI, and the four foundational controls to start with: input validation, least privilege, human-in-the-loop, and auditing/replay
- A practical 30/60/90-day plan to move from ideas to controlled production and ongoing portfolio review
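To make the shaker table concrete, here is a minimal sketch of that weighted scoring pass in Python. The five criteria come from the session; the weights, the 1–5 rating scale, and the two example candidates are illustrative assumptions, not Concurrency’s actual tooling.

```python
from dataclasses import dataclass

# The five "shaker table" criteria from the talk; the weights and the
# 1-5 rating scale are assumptions for this sketch.
WEIGHTS = {
    "business_impact": 0.30,
    "time_to_value": 0.20,
    "execution_fit": 0.20,
    "trust_risk": 0.15,        # rated so that higher = lower risk
    "autonomy_ceiling": 0.15,  # highest ladder rung you could govern today
}

@dataclass
class Candidate:
    name: str
    ratings: dict  # criterion name -> 1..5 rating

    def score(self) -> float:
        return sum(WEIGHTS[k] * v for k, v in self.ratings.items())

candidates = [
    Candidate("work-instruction lookup", {
        "business_impact": 4, "time_to_value": 5, "execution_fit": 4,
        "trust_risk": 4, "autonomy_ceiling": 3,
    }),
    Candidate("autonomous invoice approval", {
        "business_impact": 5, "time_to_value": 2, "execution_fit": 3,
        "trust_risk": 2, "autonomy_ceiling": 2,
    }),
]

# Highest-scoring candidates become the "inner ring" you fund first.
for c in sorted(candidates, key=Candidate.score, reverse=True):
    print(f"{c.name}: {c.score():.2f}")
```

Sorting candidates by weighted score is what turns a pile of pilot ideas into a short “inner ring” worth funding first.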
FREQUENTLY ASKED QUESTIONS

What is an “agent portfolio” and why does it matter?
An agent portfolio is an operating model where you treat agents as managed assets, funded and governed like products, rather than scattered pilots. This helps organizations scale value while avoiding repeated risk debates for every new agent.

Why do so many AI pilots fail to scale?
Because teams start with a demo or a pilot without clear success criteria, prioritize nothing, and then can’t justify cost or risk when it’s time to productionize. The difference isn’t the model; it’s the operating model.

What’s the “Autonomy Ladder”?
It’s a five-level framework for deciding how much independence an agent should have, ranging from observation (summaries/drafts) to fully autonomous execution in production. Higher autonomy requires stronger guardrails, testing, and oversight.

Where should most organizations start on autonomy?
Brian recommends most organizations start around Level 2 (recommend), with many ready for Level 3 (prepare + human approval). Jumping straight to full production autonomy often causes pilots to be shut down.

What is prompt injection, and why should leaders care?
Prompt injection is an attack where malicious instructions are hidden in content your agent reads (documents, tickets, web pages), causing the agent to follow the attacker’s instructions instead of yours. It’s especially relevant for service desk and workflow agents that pull in external or user-generated context.

How do you reduce risk without slowing down innovation?
Guardrails actually help you move faster over time. When governance patterns are reusable (policies, permissions, approval paths, logging), your second agent ships faster than your first, and trust compounds.

What should we do in our first 90 days?
Brian’s recommended approach:
- 30 days: score ~10 candidates, pick 2 “inner ring” options, define autonomy ceilings
- 60 days: ship 1–2 to controlled production with approvals + logging
- 90 days: establish a monthly portfolio review to fund/scale/kill based on KPIs

ABOUT THE SPEAKER

Brian Hayden, Solution Architect at Concurrency, brings nearly 30 years of software engineering experience to helping organizations adopt AI in a way that’s practical, governed, and outcome-driven. His focus is on closing the gap between AI capability and real-world adoption: building operating models that prioritize the right agent opportunities, set the right autonomy levels, and scale responsibly through governance, controls, and measurable results.

TRANSCRIPT

1 00:00:11,511 –> 00:00:13,113 Thanks for the intro. 2 00:00:13,713 –> 00:00:15,281 And, I’m going to apologize. 3 00:00:15,281 –> 00:00:18,385 I’m going to try to get through this with a voice that, over the weekend, 4 00:00:18,385 –> 00:00:19,319 has just been trying to kill me. 5 00:00:19,319 –> 00:00:22,222 So, but I am Brian Hayden. 6 00:00:22,222 –> 00:00:23,723 I’m a solution architect. 7 00:00:23,723 –> 00:00:27,427 I’ve spent about 30 years in software engineering, and, 8 00:00:27,861 –> 00:00:32,532 with this, you know, being kind of like the third phase of software engineering, 9 00:00:32,832 –> 00:00:35,835 it’s been just a really exciting couple of years with AI. 10 00:00:36,269 –> 00:00:40,306 So just a few weeks ago, there’s this post that went viral, 11 00:00:40,874 –> 00:00:44,411 got about 80 million views from a guy I hadn’t heard of, 12 00:00:44,577 –> 00:00:47,580 before, but just in a matter of days that got a lot of views. 13 00:00:47,847 –> 00:00:50,583 And the title was Something Big Is Happening 14 00:00:50,583 –> 00:00:54,220 and, it was written by a startup CEO, Matt Schumer.
15 00:00:54,721 –> 00:00:57,524 And, you know, in it he compared 16 00:00:57,524 –> 00:01:00,860 where we are at with AI to like 2020, 17 00:01:01,161 –> 00:01:05,865 you know, pre-COVID when everything kind of blew up in the world and the message 18 00:01:05,865 –> 00:01:11,004 was basically something massive is shifting in technology right now. 19 00:01:11,704 –> 00:01:14,707 And I’ll be honest, you know, when I was reading it, 20 00:01:15,041 –> 00:01:17,644 it seemed a little bit unhinged. So 21 00:01:19,512 –> 00:01:20,680 it was written for his family 22 00:01:20,680 –> 00:01:26,419 and friends who really aren’t in tech, you know, and, the surprising thing, you know, 23 00:01:26,419 –> 00:01:29,589 somehow it ended up in everybody’s feeds, including mine. 24 00:01:30,256 –> 00:01:32,092 So I sat with it for a couple of days, 25 00:01:32,092 –> 00:01:34,227 you know, and I wrote a little post on LinkedIn. 26 00:01:34,227 –> 00:01:37,430 The title of it was Nothing Big Is Happening in AI, 27 00:01:38,331 –> 00:01:41,334 and I meant every word of that. 28 00:01:41,434 –> 00:01:43,736 So here’s what I actually meant. 29 00:01:43,736 –> 00:01:47,974 Matt’s writing from inside of San Francisco, inside of his, like, tech, 30 00:01:48,441 –> 00:01:52,178 you know, labs and where innovation is really fostered. 31 00:01:53,046 –> 00:01:55,215 He’s describing a capability curve 32 00:01:55,215 –> 00:01:59,752 that, is on the bleeding edge, and he’s right about the curve. 33 00:02:00,286 –> 00:02:01,888 Models are getting better. 34 00:02:01,888 –> 00:02:06,359 The technology is getting faster, the progress that we’re experiencing. 35 00:02:06,359 –> 00:02:06,893 It’s real. 36 00:02:07,927 –> 00:02:10,730 But I’m not here to argue with that. 37 00:02:10,730 –> 00:02:11,164 What I’m. 38 00:02:11,164 –> 00:02:13,133 What I’m here to say is that this room isn’t 39 00:02:13,133 –> 00:02:16,603 in San Francisco; we’re in Pewaukee, Wisconsin. 40 00:02:17,036 –> 00:02:19,639 We’re talking about manufacturing floors, 41 00:02:19,639 –> 00:02:24,043 you know, in Menomonee Falls, service desks, you know, in downtown Milwaukee. 42 00:02:24,377 –> 00:02:27,347 We’re talking about FinOps here in Pewaukee. 43 00:02:27,347 –> 00:02:30,016 And things like that, that’s the reality on the ground. 44 00:02:30,016 –> 00:02:31,684 That looks a lot different. 45 00:02:31,684 –> 00:02:35,288 So when he’s talking about the capability curve, 46 00:02:36,122 –> 00:02:38,625 it’s not the it’s not the adoption curve. 47 00:02:38,625 –> 00:02:39,959 And that’s different. 48 00:02:39,959 –> 00:02:43,029 So that moves at a completely different pace. 49 00:02:43,296 –> 00:02:46,599 You know, and the gap between these two curves, 50 00:02:47,500 –> 00:02:50,904 that’s not really a problem; that gap’s an opportunity. 51 00:02:51,337 –> 00:02:54,340 And so today we’re going to talk about closing that gap deliberately. 52 00:02:56,109 –> 00:02:59,112 So three years ago I remember this really well. 53 00:02:59,712 –> 00:03:00,346 I was at 54 00:03:01,381 –> 00:03:02,982 Microsoft Build. 55 00:03:02,982 –> 00:03:06,686 And it was the first Microsoft Build when OpenAI was released. 56 00:03:07,187 –> 00:03:09,956 And probably the first one I think that they had 57 00:03:09,956 –> 00:03:12,959 that was in person after Covid. 58 00:03:13,126 –> 00:03:16,896 And Microsoft announced this massive investment in Mount Pleasant, 59 00:03:16,896 –> 00:03:21,067 Wisconsin, billions of dollars, national headlines.
60 00:03:21,301 –> 00:03:22,569 And honestly, 61 00:03:22,569 –> 00:03:26,206 it felt like Milwaukee, Wisconsin was going to be the center of the AI universe. 62 00:03:26,639 –> 00:03:29,642 but, you know, if you think about it, this was announced in 2022. 63 00:03:29,809 –> 00:03:32,812 It’s 2026, it’s still not online. 64 00:03:32,979 –> 00:03:35,982 You know, and that’s not a problem. 65 00:03:36,216 –> 00:03:38,318 You know, that’s just reality. 66 00:03:38,318 –> 00:03:40,320 A lot of you probably drove past it. 67 00:03:40,320 –> 00:03:43,223 You know, maybe a quick show of hands if you drove past it today, 68 00:03:43,223 –> 00:03:46,492 on I-94, you still see the cranes? 69 00:03:47,493 –> 00:03:48,828 You still see the fencing. 70 00:03:48,828 –> 00:03:53,266 The steady progress that feels slow is not keeping up with the pace 71 00:03:53,266 –> 00:03:57,003 that the keynotes or the the, the presentations stated. 72 00:03:57,570 –> 00:03:59,772 So it’s not really a failure, though. 73 00:03:59,772 –> 00:04:01,207 That’s reality. 74 00:04:01,207 –> 00:04:03,610 It’s what it actually takes to do 75 00:04:03,610 –> 00:04:06,512 transformation, building infrastructure like this. 76 00:04:06,512 –> 00:04:07,714 You need permits. 77 00:04:07,714 –> 00:04:09,782 You need the power to get run out there. 78 00:04:09,782 –> 00:04:11,684 Labor agreements. 79 00:04:11,684 –> 00:04:14,354 And none of that really makes the keynote slide. 80 00:04:14,354 –> 00:04:17,924 The reality is that real transformation runs on unglamorous kind of things. 81 00:04:18,491 –> 00:04:19,626 It’s concrete. 82 00:04:19,626 –> 00:04:22,562 Concrete cures slowly, and so does transformation. 83 00:04:23,796 –> 00:04:25,465 So here’s the reality. 84 00:04:25,465 –> 00:04:27,967 The manufacturers, you guys in this room, 85 00:04:27,967 –> 00:04:30,637 aren’t flipping a switch and becoming AI overnight. 86 00:04:30,637 –> 00:04:34,841 We have to retrofit ERP systems, you know, and that takes years. 87 00:04:35,108 –> 00:04:38,111 We have to run pilots inside of our union agreements, 88 00:04:38,177 –> 00:04:41,180 that take months for us to negotiate and re navigate. 89 00:04:41,614 –> 00:04:45,485 And we’re waiting for the next capital cycle to get to fund our infrastructure. 90 00:04:46,085 –> 00:04:49,489 So that’s not a lack of ambition, but what it is, it’s a it’s 91 00:04:49,489 –> 00:04:51,124 a responsible transformation. 92 00:04:51,124 –> 00:04:55,094 That like takes into account your real customers, the real problems, 93 00:04:55,094 –> 00:04:59,766 real operations and has real consequences when something actually breaks. 94 00:05:00,500 –> 00:05:03,169 So the loudest voices that we’re hearing online, 95 00:05:03,169 –> 00:05:05,138 they’re talking about a thunderclap. 96 00:05:05,138 –> 00:05:08,508 And what I see working with organizations, many of you that I’ve worked with 97 00:05:08,708 –> 00:05:10,510 and are working with today, 98 00:05:10,510 –> 00:05:13,513 it’s something quieter and it’s a little bit more durable. 99 00:05:13,880 –> 00:05:18,484 You know, it’s AI compounding and it’s compounding slowly and deliberately. 100 00:05:19,519 –> 00:05:23,156 So the companies that are laying the right foundations, 101 00:05:23,423 –> 00:05:27,327 and looking like right now, they’re the ones that have an advantage, 102 00:05:27,493 –> 00:05:30,496 you know, that are going to have an advantage three years from now. 103 00:05:30,763 –> 00:05:33,766 And so how do you how do you become one of those? 
104 00:05:34,233 –> 00:05:36,035 So I’m going to do a little bit of a poll. 105 00:05:36,035 –> 00:05:38,438 I want to know a little bit more about you. 106 00:05:38,438 –> 00:05:41,074 And I’m going to frame a little bit of how I talk about some of these stories 107 00:05:41,074 –> 00:05:43,276 based on, you know, what I see from the audience. 108 00:05:43,276 –> 00:05:47,213 But, on the screen, you see kind of five different stages that I see companies in, 109 00:05:47,380 –> 00:05:50,249 and the first one is watching and waiting. 110 00:05:50,249 –> 00:05:53,052 these are companies that are evaluating AI. 111 00:05:53,052 –> 00:05:55,788 They really haven’t started experimenting. 112 00:05:55,788 –> 00:05:58,524 And be honest with me. Quick show of hands. 113 00:05:58,524 –> 00:05:59,659 You know who’s in that stage. 114 00:06:03,963 –> 00:06:04,397 What about 115 00:06:04,397 –> 00:06:07,500 experimenting people that are running actual pilots? 116 00:06:07,700 –> 00:06:09,669 Nothing in production yet. 117 00:06:09,669 –> 00:06:12,672 And who’s there? 118 00:06:13,039 –> 00:06:14,340 A little bit more. 119 00:06:14,340 –> 00:06:17,043 What about third? We’ve got something live. 120 00:06:17,043 –> 00:06:18,711 Maybe you’ve deployed some agents, 121 00:06:18,711 –> 00:06:21,748 but you’re not quite scale that the organization and something 122 00:06:21,748 –> 00:06:25,284 that that I hear from my customers a lot is how do I scale this? 123 00:06:26,052 –> 00:06:29,055 Can I get a show of hands? 124 00:06:30,256 –> 00:06:30,923 All right. 125 00:06:30,923 –> 00:06:34,327 Fourth, these are, you know, kind of the people that are excelling. 126 00:06:34,327 –> 00:06:35,461 You’re scaling. 127 00:06:35,461 –> 00:06:37,797 You have multiple agents in production. 128 00:06:37,797 –> 00:06:39,031 And the thing that I hear most 129 00:06:39,031 –> 00:06:42,969 in this group is that we need governance to make it accelerate faster. 130 00:06:43,302 –> 00:06:46,005 Who’s in that group? 131 00:06:46,005 –> 00:06:47,507 And what about the last group? 132 00:06:47,507 –> 00:06:50,276 Honestly, not sure. We’re still just trying to figure it out. 133 00:06:52,979 –> 00:06:54,781 Well, a few of you were honest. 134 00:06:54,781 –> 00:07:00,620 I think maybe a third of you raise your hand in total, 135 00:07:00,620 –> 00:07:04,757 so I’m not getting a really good, really good, understanding. 136 00:07:05,091 –> 00:07:06,692 But let’s, you know, let’s take a look where I think 137 00:07:06,692 –> 00:07:10,096 most of the organizations are, I would say, generally speaking, 138 00:07:10,363 –> 00:07:14,233 if you’re in the first two, you know, buckets don’t feel bad. 139 00:07:14,300 –> 00:07:16,068 you’re where most companies are. 140 00:07:16,068 –> 00:07:18,438 And I mean that, you know, as a compliment. 141 00:07:18,438 –> 00:07:19,772 You’re not chasing headlines. 142 00:07:19,772 –> 00:07:21,474 Most of you’re trying to be deliberate. 143 00:07:21,474 –> 00:07:24,410 But this is kind of what Concurrency is seeing across our clients. 144 00:07:24,410 –> 00:07:26,045 And we’ve done a little bit of polling with them. 145 00:07:26,045 –> 00:07:28,414 Informal, not like a fancy study. 146 00:07:28,414 –> 00:07:31,551 you know, majority customers are still kind of in that first two buckets. 147 00:07:32,051 –> 00:07:35,154 And, very few have something that’s live 148 00:07:35,154 –> 00:07:38,157 and actually in production that’s running their business. 
149 00:07:38,224 –> 00:07:40,560 That’s making a huge impact. 150 00:07:40,560 –> 00:07:44,664 You know, even fewer are doing the scale problem. 151 00:07:44,664 –> 00:07:48,301 We have a couple of customers that are looking to us for some advice on that. 152 00:07:48,768 –> 00:07:49,936 But if you’re in the first two, 153 00:07:49,936 –> 00:07:53,039 two groups, you’re probably there because you have discipline. 154 00:07:53,272 –> 00:07:56,442 And that’s the instinct to validate before you scale. 155 00:07:57,076 –> 00:07:59,445 And that’s probably your biggest competitive advantage 156 00:07:59,445 –> 00:08:02,415 right now: taking things methodically. 157 00:08:02,415 –> 00:08:05,685 You aren’t actually behind, if you think about it, looking at these numbers. 158 00:08:06,018 –> 00:08:07,787 So don’t feel bad about that. 159 00:08:07,787 –> 00:08:10,790 You’re exactly where a lot of thoughtful organizations are right now, 160 00:08:11,224 –> 00:08:14,660 and this isn’t really a moment where you want to sprint blindly, 161 00:08:14,961 –> 00:08:19,398 but you do have to get the pace picked up a little bit, 162 00:08:19,432 –> 00:08:22,435 but it’s the moment where we’re going to start moving with intention, 163 00:08:22,602 –> 00:08:25,037 so don’t rush ahead without a plan. 164 00:08:25,037 –> 00:08:27,573 You know, speaking of where these customers are, 165 00:08:27,573 –> 00:08:31,010 I had this idea, for one of my other talks a couple of weeks ago. 166 00:08:31,177 –> 00:08:33,679 And I’ll give you just a little bit of quick context about me. 167 00:08:33,679 –> 00:08:37,250 For those of you who don’t know me, I’m definitely an avid outdoors guy. 168 00:08:37,650 –> 00:08:39,552 I like to hike. 169 00:08:39,552 –> 00:08:40,486 I like to hunt. 170 00:08:40,486 –> 00:08:42,655 I’m out in Colorado, in the mountains. 171 00:08:42,655 –> 00:08:45,291 I do a lot of fishing on Lake Michigan. 172 00:08:45,291 –> 00:08:49,061 And that’s kind of how my brain works and, like, processes things. 173 00:08:49,395 –> 00:08:53,032 So, I understand the world kind of really through that dirt 174 00:08:53,032 –> 00:08:56,202 and mud and, you know, kind of being in the outdoors and thinking. 175 00:08:56,502 –> 00:09:00,273 And so when I talk, I use a lot of analogies that make sense to me, 176 00:09:00,606 –> 00:09:03,609 and I’m hoping that they can kind of make a little bit of sense to you. 177 00:09:03,676 –> 00:09:07,046 So, one of, you know, one of these analogies 178 00:09:07,046 –> 00:09:10,049 comes from like the old TV show Gold Rush. 179 00:09:10,182 –> 00:09:12,852 And, anybody remember that Gold Rush? 180 00:09:12,852 –> 00:09:14,654 I think I was kind of addicted to that. 181 00:09:14,654 –> 00:09:19,292 And, it was in Alaska, and it wasn’t really the drama. 182 00:09:19,692 –> 00:09:23,529 But it was the equipment that they used that started to make sense to me, 183 00:09:23,930 –> 00:09:26,933 the way that the miners chose their tools, 184 00:09:27,466 –> 00:09:32,038 often very innovatively, scaled their operations 185 00:09:32,038 –> 00:09:35,541 and managed risk was kind of a good lens for me to, like, think through. 186 00:09:35,541 –> 00:09:36,309 Like, AI. 187 00:09:37,743 –> 00:09:38,110 And so 188 00:09:38,110 –> 00:09:41,113 here’s a framework that I thought of that, 189 00:09:41,280 –> 00:09:44,283 you know, could help organizations understand things a little bit clearer.
190 00:09:44,650 –> 00:09:48,821 So I broke down organizations into three different camps. 191 00:09:49,422 –> 00:09:52,558 The first camp was where most organizations really are right now. 192 00:09:53,092 –> 00:09:55,628 They’re using commodity AI tools, 193 00:09:55,628 –> 00:09:58,631 and the commodity one for me is, table stakes. 194 00:09:58,864 –> 00:10:01,367 People are dipping their toes in the water. 195 00:10:01,367 –> 00:10:02,902 And there’s nothing wrong with that. 196 00:10:02,902 –> 00:10:05,104 But it is a commodity position. 197 00:10:05,104 –> 00:10:09,141 And so if everybody is doing it, which most of you probably are, 198 00:10:09,475 –> 00:10:10,343 then it’s just table 199 00:10:10,343 –> 00:10:13,446 stakes at this point and it’s not really any kind of a competitive advantage. 200 00:10:14,113 –> 00:10:18,384 So camp two, that’s where most of the energy is right now. 201 00:10:18,384 –> 00:10:20,987 And that’s where the excitement, that’s where people are talking. 202 00:10:20,987 –> 00:10:23,823 In camp two, this is broad experimentation. 203 00:10:23,823 –> 00:10:25,191 Multiple pilots, 204 00:10:25,191 –> 00:10:28,561 people are, trying to figure out what this means for their organization. 205 00:10:28,761 –> 00:10:31,764 They’re spinning up agents, you know, they’ve got some excitement. 206 00:10:31,964 –> 00:10:35,001 And honestly, this is where I see a lot of wreckage right now. 207 00:10:35,201 –> 00:10:38,170 People making mistakes and stumbling, and falling down. 208 00:10:38,337 –> 00:10:42,875 And then in camp three, it’s, the destination, purpose built. 209 00:10:42,875 –> 00:10:45,978 AI aligned with the mission of your business, and, 210 00:10:45,978 –> 00:10:48,948 deeply integrated into your high impact workflows. 211 00:10:49,181 –> 00:10:51,617 So this is where people want to get. That’s going to give 212 00:10:51,617 –> 00:10:53,019 you a sustainable advantage. 213 00:10:54,020 –> 00:10:54,286 But the 214 00:10:54,286 –> 00:10:57,657 problem is, how do you get from camp one where most of us are to camp three, 215 00:10:58,157 –> 00:11:02,461 and a lot of people try to jump straight from camp one to camp three. 216 00:11:02,995 –> 00:11:05,998 And the hiking analogy I have is, out in Colorado, 217 00:11:06,365 –> 00:11:08,300 that’s going to give you altitude sickness. 218 00:11:08,300 –> 00:11:11,771 So you move too fast and you’re not going to have the right governance 219 00:11:11,771 –> 00:11:15,875 rails in place in order to really absorb that kind of transformation. 220 00:11:16,308 –> 00:11:17,777 And so you’re going to make mistakes. 221 00:11:17,777 –> 00:11:20,746 And so my caution for you is if you’re in camp one, focus on what 222 00:11:20,746 –> 00:11:22,515 it’s going to take to get to camp two. 223 00:11:22,515 –> 00:11:25,818 And if you’re in camp two, focus on what it’s going to take to get to camp three. 224 00:11:27,420 –> 00:11:30,489 So here’s where we’re going to spend the next 45 minutes or so. 225 00:11:30,890 –> 00:11:31,323 Think about it. 226 00:11:31,323 –> 00:11:34,193 This is like kind of your Alaskan map. 227 00:11:34,193 –> 00:11:37,196 Lay out the land before we start getting the excavators out and start, 228 00:11:37,363 –> 00:11:39,065 start digging things up. 229 00:11:39,065 –> 00:11:42,535 Let’s start a little bit with, a sharp kind of vocabulary. 230 00:11:42,635 –> 00:11:44,203 There’s a lot of people in this room.
231 00:11:44,203 –> 00:11:45,438 I was looking at the attendees coming 232 00:11:45,438 –> 00:11:48,874 in, CEOs all the way down to interns and people in school. 233 00:11:49,375 –> 00:11:52,411 So let’s just make sure that, you know, everybody in the room is speaking 234 00:11:52,411 –> 00:11:56,916 the same language when we talk about LLMs and RAG and, you know, all those terms, 235 00:11:57,583 –> 00:12:00,586 and then we’ll kind of move to a control model that I, that I’ve started using. 236 00:12:01,020 –> 00:12:03,723 And it’s a framework for helping organizations to decide 237 00:12:03,723 –> 00:12:07,259 how much autonomy an agent is going to have. 238 00:12:07,860 –> 00:12:11,997 And, then from there, we’re going to go to the agent portfolio itself, 239 00:12:11,997 –> 00:12:13,666 which I think is what a lot of people were, 240 00:12:13,666 –> 00:12:16,669 were looking forward to, your operating portfolio. 241 00:12:16,869 –> 00:12:21,340 And then we’ll close here with like a 30, 60, 90 day little session, give you 242 00:12:21,340 –> 00:12:24,510 some practical things that you can take back to your office tomorrow morning. 243 00:12:24,910 –> 00:12:26,145 After this session, you’re going to have 244 00:12:26,145 –> 00:12:28,380 a couple of opportunities to go a little bit deeper. 245 00:12:28,380 –> 00:12:31,383 And I just wanted to make sure I planted the seeds for those. 246 00:12:31,417 –> 00:12:33,919 So the governing and securing 247 00:12:33,919 –> 00:12:37,223 AI at scale session, that’s going to build directly on this control model. 248 00:12:37,223 –> 00:12:39,759 We’re also going to talk about the data and AI session. 249 00:12:39,759 –> 00:12:42,161 That shows really how your data, 250 00:12:42,161 –> 00:12:45,898 your data foundation, is going to determine what that ceiling, 251 00:12:45,898 –> 00:12:48,901 on this ladder that I was talking about, what the ceiling is going to be. 252 00:12:48,901 –> 00:12:53,272 And then we’ll finish with that panel, with the three regional leaders 253 00:12:53,272 –> 00:12:55,174 that are going to help you understand 254 00:12:55,174 –> 00:12:58,310 how they’re making the real decisions that they’re making right now. 255 00:12:58,944 –> 00:13:01,947 I can tell you, because I know each one of them, they’re not hypotheticals. 256 00:13:02,181 –> 00:13:03,816 They’re going to be real stories. 257 00:13:03,816 –> 00:13:06,819 So let’s start with that vocabulary, large language model. 258 00:13:07,119 –> 00:13:10,589 You know, I know that some, some people are here to understand 259 00:13:10,589 –> 00:13:13,259 what most of these terms mean, but I want to make sure that, 260 00:13:13,259 –> 00:13:14,827 if you do know of them, that’s great. 261 00:13:14,827 –> 00:13:17,563 But I just want to make sure you’re using the same definition that I am. 262 00:13:17,563 –> 00:13:19,365 These words get kind of thrown around pretty loosely. 263 00:13:19,365 –> 00:13:22,234 So, first one, large language model. 264 00:13:22,234 –> 00:13:27,473 These are, these are tools that will predict and produce language. 265 00:13:27,940 –> 00:13:29,308 They’re super useful. 266 00:13:29,308 –> 00:13:31,911 And where most of the hype really started. 267 00:13:31,911 –> 00:13:35,981 But the challenge with LLMs is that it doesn’t know anything about your business. 268 00:13:36,448 –> 00:13:40,553 It doesn’t know what your data looks like, it doesn’t know what your documentation 269 00:13:40,553 –> 00:13:43,556 or SOPs look like. That’s not what it is.
270 00:13:44,089 –> 00:13:45,991 And it certainly doesn’t know anything 271 00:13:45,991 –> 00:13:48,494 about what your policies as an organization are. 272 00:13:48,494 –> 00:13:51,864 So, it’s a dangerous tool to a certain degree, 273 00:13:52,231 –> 00:13:54,033 because it’s not trusted by default. 274 00:13:55,201 –> 00:13:56,969 Next, let’s talk about RAG. 275 00:13:56,969 –> 00:13:59,972 And this stands for retrieval augmented generation. 276 00:14:00,673 –> 00:14:05,544 This is a fantastic tool that emerged once we had LLMs that 277 00:14:05,811 –> 00:14:08,614 could retrieve information from your knowledge, 278 00:14:08,614 –> 00:14:12,351 your company’s, and, you know, actually answer it in a way that makes sense. 279 00:14:12,818 –> 00:14:16,589 So now your SOPs and your customer data, are part of that response 280 00:14:16,589 –> 00:14:18,157 and are part of that cycle. 281 00:14:18,157 –> 00:14:21,861 So think of it just, an LLM plus your context. 282 00:14:22,595 –> 00:14:24,096 The third is an agent. 283 00:14:24,096 –> 00:14:26,131 And this is more than just a model. 284 00:14:26,131 –> 00:14:30,636 Similar, like with RAG, when we said that that’s your LLM with knowledge. 285 00:14:30,903 –> 00:14:33,873 An agent is essentially an LLM with tools. 286 00:14:33,873 –> 00:14:37,343 Now it doesn’t just answer questions, it actually does things for you. 287 00:14:38,043 –> 00:14:39,879 And that’s what makes them powerful. 288 00:14:39,879 –> 00:14:43,649 But it’s also why it requires a completely different level of governance. 289 00:14:44,450 –> 00:14:46,352 So finally there’s CoPilot. 290 00:14:46,352 –> 00:14:47,686 I feel like I have to explain 291 00:14:47,686 –> 00:14:51,156 CoPilots the most because Microsoft has 73,000 of them. 292 00:14:51,690 –> 00:14:54,159 But if you want to think of it a little bit simpler, 293 00:14:54,159 –> 00:14:56,662 CoPilots are really just the user interface. 294 00:14:56,662 –> 00:15:00,266 It’s how you interact with your agents, on a daily basis. 295 00:15:00,666 –> 00:15:03,836 And it’s the way they show up inside of, like, Microsoft 365 296 00:15:04,069 –> 00:15:08,474 or CoPilots inside of your ERP or CoPilot inside a service desk. 297 00:15:08,741 –> 00:15:11,410 I’m going to leave this slide with the one little key insight. 298 00:15:11,410 –> 00:15:14,613 I don’t like to think of these, you know, just as product categories. 299 00:15:14,880 –> 00:15:17,182 They’re actually more like operating roles. 300 00:15:17,182 –> 00:15:20,219 So when you deploy AI, you’re not just buying software, 301 00:15:20,619 –> 00:15:22,788 you’re deciding which roles in your organization 302 00:15:22,788 –> 00:15:25,391 are going to be staffed by some of these intelligent agents, 303 00:15:25,391 –> 00:15:28,360 and which ones are going to be remaining in the human lens. 304 00:15:28,727 –> 00:15:31,030 So that shift in thinking matters, you know, 305 00:15:31,030 –> 00:15:33,098 pretty much for everything that we’re going to talk about next. 306 00:15:34,533 –> 00:15:35,634 I mentioned the capability 307 00:15:35,634 –> 00:15:39,071 curve before, and Schumer was right about this, this kind of point. 308 00:15:39,471 –> 00:15:43,375 The capability curve is real and it’s moving exceptionally fast. 309 00:15:43,842 –> 00:15:46,612 So there’s an organization out there called METR. 310 00:15:46,612 –> 00:15:51,183 This is, you know, kind of a very new you know, in the news kind of thing.
311 00:15:51,183 –> 00:15:55,187 But what they do is they track the length of real world tasks 312 00:15:55,421 –> 00:15:58,290 that an AI model can complete successfully 313 00:15:58,290 –> 00:16:01,160 end to end, without any human help. 314 00:16:01,160 –> 00:16:04,463 So let me unpack that again just a second time. 315 00:16:04,997 –> 00:16:08,500 If a task would take a human ten minutes to complete it, 316 00:16:08,901 –> 00:16:11,937 and an AI can complete that task regardless if it takes it 317 00:16:11,937 –> 00:16:17,176 two hours, ten days, or one minute, the measurable output of 318 00:16:17,176 –> 00:16:20,179 that is that it’s a ten minute task that it was able to complete. 319 00:16:20,379 –> 00:16:23,182 Does that make sense to everybody? Okay, so 320 00:16:24,583 –> 00:16:26,719 about a year ago that number was ten minutes. 321 00:16:26,719 –> 00:16:31,824 Today, in the latest METR ratings, it’s up to five hours. 322 00:16:32,257 –> 00:16:35,260 So agents are now capable of doing 323 00:16:35,594 –> 00:16:38,797 five hours’ worth of human-complexity tasks. 324 00:16:39,098 –> 00:16:42,001 Whether that takes five hours or takes a minute doesn’t matter. 325 00:16:42,001 –> 00:16:43,135 But that’s how good they are. 326 00:16:43,135 –> 00:16:45,204 They can reason through those things. 327 00:16:45,204 –> 00:16:49,408 And so this has been doubling roughly every seven months. 328 00:16:49,742 –> 00:16:52,044 And kind of similarly to Moore’s Law, 329 00:16:52,044 –> 00:16:54,646 we were seeing this like doubling every seven months. 330 00:16:54,646 –> 00:16:58,250 But you know it’s changing now. 331 00:16:58,517 –> 00:16:59,585 Now we’re down to, 332 00:16:59,585 –> 00:17:03,288 you know, something more like three months in terms of how fast this is doubling. 333 00:17:03,956 –> 00:17:06,959 But, you know, take a step back and think about this. 334 00:17:07,026 –> 00:17:10,496 One year ago, AI could handle a ten minute task, 335 00:17:10,662 –> 00:17:13,866 and we were talking about that one year ago; today, it can complete 336 00:17:13,866 –> 00:17:16,869 nearly five hours of work. 337 00:17:16,935 –> 00:17:18,804 That’s a meaningful shift. 338 00:17:18,804 –> 00:17:22,107 And I think that’s where Matt is really kind of grounded in his statement. 339 00:17:22,674 –> 00:17:26,078 The models that we’re using today would have been unrecognizable, 340 00:17:26,311 –> 00:17:28,580 inconceivable, you know, to what? 341 00:17:28,580 –> 00:17:30,783 To what we’re capable of doing today. 342 00:17:30,783 –> 00:17:35,154 So I’m not here to argue that AI is not progressing, because it is. 343 00:17:35,521 –> 00:17:37,189 It absolutely is. 344 00:17:37,189 –> 00:17:41,293 But here’s what the hype kind of misses: the capability and the adoption. 345 00:17:41,994 –> 00:17:43,929 They’re completely different curves. 346 00:17:43,929 –> 00:17:45,097 And the adoption curve is the one 347 00:17:45,097 –> 00:17:47,032 that actually matters to most of the people in this room. 348 00:17:47,032 –> 00:17:48,233 That’s why you’re actually here today. 349 00:17:49,635 –> 00:17:50,903 So let’s 350 00:17:50,903 –> 00:17:53,872 pause and take a look at what some of the actual data says. 351 00:17:53,906 –> 00:17:55,574 And this is super recent, too. 352 00:17:55,574 –> 00:17:58,410 This is from PwC this year already. 353 00:17:58,410 –> 00:18:00,946 And they surveyed some global CEOs last year.
354 00:18:00,946 –> 00:18:05,984 And about 20% say that they’re applying AI extensively to core business functions, 355 00:18:06,185 –> 00:18:09,721 not just experimenting, but actually deploying agents at scale. 356 00:18:09,955 –> 00:18:13,058 One of the best, Morgan Stanley, is often cited, you know, 357 00:18:13,058 –> 00:18:16,061 as one of the most aggressive adopters. 358 00:18:16,295 –> 00:18:19,298 They went from just running pilots 359 00:18:19,331 –> 00:18:22,301 to having full rollout in 24 months. 360 00:18:22,434 –> 00:18:25,604 I mean, that seems kind of incredible to a certain degree, 361 00:18:25,604 –> 00:18:29,741 but also, that’s the standard of fast. So going back to what we were talking about 362 00:18:29,741 –> 00:18:33,679 with concrete curing slowly, fast is definitely a relative scale. 363 00:18:34,046 –> 00:18:37,683 You know, most regulated industries are looking at 3 or 5 years 364 00:18:37,983 –> 00:18:40,986 for meaningful AI transformation. 365 00:18:41,019 –> 00:18:43,589 And so none of that’s really meant to discourage 366 00:18:43,589 –> 00:18:46,592 you, but it is meant to give you some context. 367 00:18:46,758 –> 00:18:49,761 You know, if your organization is feeling 368 00:18:49,795 –> 00:18:53,699 you’re behind because you’re not moving at the speed of like a LinkedIn feed, 369 00:18:55,000 –> 00:18:57,636 you’re probably right on schedule right now. 370 00:18:57,636 –> 00:19:00,639 And the question isn’t really whether you want to move, 371 00:19:00,739 –> 00:19:03,642 it’s how you move with precision so that you close this gap 372 00:19:03,642 –> 00:19:05,344 instead of burning your budget 373 00:19:05,344 –> 00:19:08,347 on pilots that are probably never going to get shipped. 374 00:19:08,447 –> 00:19:11,316 So what actually separates the organizations 375 00:19:11,316 –> 00:19:14,319 that close the gap from the ones that drown in pilots? 376 00:19:14,786 –> 00:19:19,825 And I’ve come up with this control model that I want to share with you, 377 00:19:19,825 –> 00:19:23,462 and it’s a simple one that you can start using pretty much tomorrow. 378 00:19:24,029 –> 00:19:27,099 And so for me, I’m thinking of it as like an autonomy ladder, 379 00:19:27,666 –> 00:19:28,867 and there’s five rungs. 380 00:19:28,867 –> 00:19:31,937 And the higher that you climb, the more that you’re letting 381 00:19:31,937 –> 00:19:32,804 your agent do for you. 382 00:19:34,106 –> 00:19:35,007 But it’s 383 00:19:35,007 –> 00:19:38,010 also the more discipline that you need to have for your agent. 384 00:19:38,143 –> 00:19:39,378 And so, on 385 00:19:39,378 –> 00:19:42,381 the first level, level one, they just observe. 386 00:19:42,714 –> 00:19:45,651 So this is what most people think of, like just your chat bot. 387 00:19:45,651 –> 00:19:48,020 It’s just summarizing your email. 388 00:19:48,020 –> 00:19:49,721 It’s just creating your drafts. 389 00:19:49,721 –> 00:19:51,990 It’s not doing anything meaningful. 390 00:19:51,990 –> 00:19:54,660 And this is where almost everybody starts. 391 00:19:54,660 –> 00:19:58,363 But level two is where the agent starts to do things like recommend things, 392 00:19:58,830 –> 00:20:01,400 and so it can tell you what to do. 393 00:20:01,400 –> 00:20:04,102 But you’re the one that’s actually pushing the buttons. 394 00:20:04,102 –> 00:20:06,572 You’re the one that’s actually acting on it.
395 00:20:06,572 –> 00:20:10,142 And, you know, in this case, I would think of it more like a service 396 00:20:10,142 –> 00:20:13,979 desk use case where I’m trying to figure out what the problem is. 397 00:20:14,313 –> 00:20:18,016 What gear shift do I, you know, move or, what? 398 00:20:18,050 –> 00:20:19,551 You know, what part do I have to replace? 399 00:20:19,551 –> 00:20:21,186 That’s the recommend. 400 00:20:21,186 –> 00:20:23,355 But in level three, we get to prepare. 401 00:20:23,355 –> 00:20:26,258 And so now this is where things get a little bit interesting. 402 00:20:26,258 –> 00:20:28,260 The agent is going to draft the action. 403 00:20:28,260 –> 00:20:31,263 And it’s going to give you instructions and what you have to do. 404 00:20:31,463 –> 00:20:34,466 that might be something like, responding with this email, 405 00:20:34,499 –> 00:20:37,569 creating this purchase order, routing this decision. 406 00:20:38,036 –> 00:20:42,074 Those are a couple of good examples, but a human approves it before it goes out. 407 00:20:42,541 –> 00:20:43,775 And that’s important. 408 00:20:43,775 –> 00:20:47,179 Many mature and enterprise deployments are actually starting to live right 409 00:20:47,179 –> 00:20:51,516 in that space where the agent is actually doing the work under human supervision. 410 00:20:52,050 –> 00:20:54,853 but then level four is executing in a sandbox. 411 00:20:54,853 –> 00:20:57,489 Now we’re starting to talk about autonomous agents. 412 00:20:57,489 –> 00:21:01,293 And inside that sandbox we’re going to limit, what the agent has access to. 413 00:21:01,727 –> 00:21:05,030 But they can act now on their own in a really controlled environment. 414 00:21:05,130 –> 00:21:08,133 I think OpenClaw for people that are reading the news. 415 00:21:08,533 –> 00:21:11,036 You like most people that are, well, not most people. 416 00:21:11,036 –> 00:21:13,905 I think most people are probably just like willy nilly doing stuff. 417 00:21:13,905 –> 00:21:18,110 But, most responsible people are taking OpenClaw and putting it inside 418 00:21:18,110 –> 00:21:21,680 of like a sandbox so that it doesn’t have access to everything. 419 00:21:22,147 –> 00:21:25,884 But there’s no real world consequences until you validated 420 00:21:25,884 –> 00:21:28,887 the results, you’re not really letting it go, go wild. 421 00:21:29,388 –> 00:21:30,922 But this requires testing. 422 00:21:30,922 –> 00:21:32,724 It requires simulation. 423 00:21:32,724 –> 00:21:35,093 It’s going to require validation loops. 424 00:21:35,093 –> 00:21:38,530 And we’ve started to adopt this term called evals, 425 00:21:38,730 –> 00:21:41,733 which become really important when you start to have autonomous agents. 426 00:21:42,267 –> 00:21:43,835 Now, number five 427 00:21:45,103 –> 00:21:45,337 at the 428 00:21:45,337 –> 00:21:48,707 top, this is where we just let the agents execute in production. 429 00:21:49,308 –> 00:21:52,678 They’re acting on real systems with real data in real time. 430 00:21:53,145 –> 00:21:54,279 But at this point, 431 00:21:54,279 –> 00:21:57,349 if you’re doing it and doing it right, you’ve got the guardrails, you’re doing 432 00:21:57,349 –> 00:22:01,086 logging, you’ve got human in the loop, checkpoints for the right moments. 433 00:22:01,586 –> 00:22:05,190 lot of our systems that, you know, that we’ve built at Concurrency 434 00:22:05,490 –> 00:22:08,460 intentionally have human in the loop until the last minute. 
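To make the ladder concrete, here is a minimal sketch of the five rungs as an explicit ceiling enforced per agent; the enum and gate function are illustrative assumptions, not code from the session.

```python
from enum import IntEnum

class AutonomyLevel(IntEnum):
    OBSERVE = 1             # summaries and drafts only
    RECOMMEND = 2           # suggests the action; a human executes it
    PREPARE = 3             # drafts the action; a human approves before it runs
    EXECUTE_SANDBOX = 4     # acts alone, but only inside a sandbox
    EXECUTE_PRODUCTION = 5  # acts on real systems, with guardrails and logging

def within_ceiling(action: AutonomyLevel, ceiling: AutonomyLevel) -> bool:
    """Reject anything above the ceiling set for this agent in the portfolio."""
    return action <= ceiling

# A service desk agent capped at level 3, per the "start at 2-3" advice:
ceiling = AutonomyLevel.PREPARE
print(within_ceiling(AutonomyLevel.RECOMMEND, ceiling))           # True
print(within_ceiling(AutonomyLevel.EXECUTE_PRODUCTION, ceiling))  # False
```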
435 00:22:08,794 –> 00:22:12,230 And so if you’re thinking about starting, I would say 436 00:22:12,230 –> 00:22:15,534 most organizations should start around level two. 437 00:22:16,368 –> 00:22:19,971 A meaningful, meaningful number of you are already doing 438 00:22:19,971 –> 00:22:22,974 or are ready for things to do in level three. 439 00:22:22,974 –> 00:22:25,677 But if you jump straight into level 440 00:22:25,677 –> 00:22:28,680 five without ensuring that there’s the proper guardrails, 441 00:22:28,847 –> 00:22:32,217 what I’m seeing right now is that’s where most organizations, they pull the plug. 442 00:22:32,484 –> 00:22:34,753 Those pilots go there and they die. 443 00:22:34,753 –> 00:22:37,756 And so deciding which level agent, 444 00:22:38,357 –> 00:22:42,060 which level is going to fit your agent is important to start with 445 00:22:42,594 –> 00:22:46,331 to make sure that you are, governing that across your entire portfolio. 446 00:22:47,165 –> 00:22:50,969 another recent thing, a quick little field report from last weekend. 447 00:22:51,603 –> 00:22:56,341 Jack Dorsey from Twitter, also with Block the company that runs Square 448 00:22:56,575 –> 00:23:00,512 and Cash App, he announced that he’s cutting more than 4000 jobs. 449 00:23:01,046 –> 00:23:03,782 And that’s a very significant portion of their workforce. 450 00:23:03,782 –> 00:23:06,418 I think their workforce is 10,000 people. 451 00:23:06,418 –> 00:23:08,353 And a shift that overnight. 452 00:23:08,353 –> 00:23:10,756 But he wrote this really, 453 00:23:10,756 –> 00:23:13,759 interesting letter that he posted on, Twitter. 454 00:23:13,859 –> 00:23:16,862 And he was explicit about why he was going to cut the teams. 455 00:23:17,396 –> 00:23:22,434 Smaller teams can now accomplish as much as what larger teams used to be able to, 456 00:23:22,834 –> 00:23:25,837 because of the things that AI tools are able to do for them. 457 00:23:26,405 –> 00:23:28,940 I’m not here to judge whether he was making the right call. 458 00:23:28,940 –> 00:23:30,709 That’s completely Dorsey’s decision. 459 00:23:30,709 –> 00:23:34,012 And there’s a lot of people online that are saying, 460 00:23:34,012 –> 00:23:37,015 that it’s about time that Dorsey makes these kind of calls. 461 00:23:37,149 –> 00:23:38,550 But sit with this for a minute. 462 00:23:39,584 –> 00:23:39,918 Do you 463 00:23:39,918 –> 00:23:42,988 want your organization to be reactive to that kind of move, 464 00:23:43,789 –> 00:23:46,825 responding to the competitive pressures after it falls? 465 00:23:47,459 –> 00:23:50,996 Or do you want to be the organization that saw that shift coming, 466 00:23:52,030 –> 00:23:54,766 and have you had the ability to build the 467 00:23:54,766 –> 00:23:58,637 the capabilities internally so their team could move faster? 468 00:23:59,037 –> 00:24:01,006 When you have that, 469 00:24:01,006 –> 00:24:04,242 it’s not really a rhetorical question most of you are going to answer. 470 00:24:04,242 –> 00:24:06,044 You want to be ahead of that curve. 471 00:24:06,044 –> 00:24:10,315 So the difference between those two positions really is having a portfolio 472 00:24:10,749 –> 00:24:13,051 and this kind of control model that we talked about. 473 00:24:13,051 –> 00:24:14,686 Before, you need one. 474 00:24:14,686 –> 00:24:17,656 So a little bit of, you know, sobriety here. 475 00:24:17,756 –> 00:24:19,458 I’m going to spend a little bit of time. 476 00:24:19,458 –> 00:24:22,360 This is the kind of thing that keeps most people up at night. 
477 00:24:22,360 –> 00:24:25,330 But I think it’s really an underappreciated risk right now. 478 00:24:25,330 –> 00:24:30,135 In enterprise, enterprise AI, and that’s prompt injection. 479 00:24:30,702 –> 00:24:33,705 And, some of you may have heard that term. 480 00:24:33,839 –> 00:24:36,708 And even if you’ve never really written a line of code or don’t plan 481 00:24:36,708 –> 00:24:39,711 on writing a line of code, this is something that’s important. 482 00:24:39,711 –> 00:24:42,714 And here’s more or less a simple definition. 483 00:24:43,014 –> 00:24:47,652 prompt injection is an attack when someone hides malicious instructions 484 00:24:47,786 –> 00:24:52,257 inside of context or data, that your agent is going to read 485 00:24:52,624 –> 00:24:57,195 and that the agent then follows those instructions instead of yours. 486 00:24:57,996 –> 00:25:00,999 So The attacker didn’t hack your model. 487 00:25:01,199 –> 00:25:03,001 It didn’t hack your software. 488 00:25:03,001 –> 00:25:06,171 They hid instructions inside of something that your agent is going to use 489 00:25:06,638 –> 00:25:08,874 Sometimes it’s a document that it reads. 490 00:25:08,874 –> 00:25:11,877 Sometimes it could be some information that’s in a service ticket. 491 00:25:12,244 –> 00:25:15,180 It could be, a little bit more malicious than that. 492 00:25:15,180 –> 00:25:18,350 But the agent really can’t tell the difference between your directive 493 00:25:18,750 –> 00:25:20,919 and what the attacker’s directive is. 494 00:25:20,919 –> 00:25:24,089 So it just sees words, you know, when it gets put together, 495 00:25:24,389 –> 00:25:26,358 and that’s all it really cares about. 496 00:25:26,358 –> 00:25:29,027 We used to have this thing, when I was doing software 497 00:25:29,027 –> 00:25:32,030 engineering SQL injection work the same way. 498 00:25:32,097 –> 00:25:35,534 so this again, I want to just go back to, to what I said at the very beginning 499 00:25:35,534 –> 00:25:39,538 of this slide, which is this isn’t this isn’t a new problem, but 500 00:25:40,438 –> 00:25:43,875 people are building agents right now to help their service desk, 501 00:25:44,209 –> 00:25:48,146 to help, you know, to really help internal users or their, their customers 502 00:25:48,680 –> 00:25:51,683 and these are totally normal workflows that people are doing in, 503 00:25:52,117 –> 00:25:55,353 and they’re retrieving normal context and things that people would read, 504 00:25:55,554 –> 00:25:58,323 but they’re not actually thinking about the ramifications of it 505 00:25:58,323 –> 00:26:00,759 because they’re so impressed by the wow factor of it. 506 00:26:01,760 –> 00:26:03,728 the agents aren’t doing this because it’s evil. 507 00:26:03,728 –> 00:26:06,898 It’s doing it because it can’t distinguish between something that was legitimate, 508 00:26:07,432 –> 00:26:09,801 and something that has malicious content. 509 00:26:09,801 –> 00:26:12,103 But what do you actually do about this? 510 00:26:12,103 –> 00:26:15,240 There’s really just four kind of simple things that you can do right now. 511 00:26:15,674 –> 00:26:18,510 Not 40 things that you have to think about just four things 512 00:26:18,510 –> 00:26:20,045 that you should start with. 513 00:26:20,045 –> 00:26:23,548 And first input validation, just like we did in the past. 
514 00:26:23,715 –> 00:26:26,718 Sounds pretty simple, but before any external content 515 00:26:26,885 –> 00:26:30,655 gets to your agent, the documents, the tickets, 516 00:26:30,755 –> 00:26:34,759 the web pages, you have to treat it as something that isn’t trusted. 517 00:26:35,026 –> 00:26:36,394 You have to inspect it. 518 00:26:36,394 –> 00:26:38,063 You have to sanitize it. 519 00:26:38,063 –> 00:26:41,299 You cannot let the raw content hit that model directly. 520 00:26:41,967 –> 00:26:44,636 And there’s a lot of tools that are out there that can help you with that. 521 00:26:44,636 –> 00:26:47,572 And they’ve been shipping like this for a long time with these kinds of, 522 00:26:47,572 –> 00:26:50,942 parsing, tools. 523 00:26:51,476 –> 00:26:54,713 But the second concept, something we’ve been talking about a lot, too, 524 00:26:54,746 –> 00:26:55,914 is least privilege. 525 00:26:55,914 –> 00:27:01,086 Your invoice processing agent should never really have access to your HR data, 526 00:27:01,686 –> 00:27:02,621 or your customer 527 00:27:02,621 –> 00:27:05,624 service agents shouldn’t really have access to your financial systems. 528 00:27:06,057 –> 00:27:10,895 So give your agent just the access that it actually needs for that particular task. 529 00:27:11,262 –> 00:27:14,265 Microsoft has made that incredibly easy, 530 00:27:14,499 –> 00:27:17,502 with the new, Agent 365 framework. 531 00:27:17,769 –> 00:27:21,206 That makes agents essentially an identity inside of your network. 532 00:27:22,007 –> 00:27:25,010 Third, make sure that there’s a human in the loop. 533 00:27:25,377 –> 00:27:28,513 So you have those kinds of checkpoints: any action 534 00:27:28,513 –> 00:27:30,382 that touches any kind of an external system, 535 00:27:31,783 –> 00:27:32,350 anything that’s going to 536 00:27:32,350 –> 00:27:35,820 send data outside of your environment or reach out to a customer, 537 00:27:36,254 –> 00:27:39,391 or make any kind of change that you can’t undo easily, 538 00:27:40,725 –> 00:27:43,395 make sure that the human has the approval first. 539 00:27:43,395 –> 00:27:44,496 Full stop. 540 00:27:44,496 –> 00:27:46,031 Don’t pass that stop. 541 00:27:46,031 –> 00:27:49,034 This is your one chance as a circuit breaker. 542 00:27:49,267 –> 00:27:51,970 And then four, auditing. 543 00:27:51,970 –> 00:27:54,939 So, can you replay exactly what the agent does? 544 00:27:54,939 –> 00:27:55,774 And that’s important. 545 00:27:55,774 –> 00:27:58,510 Replay exactly what the agent did. 546 00:27:58,510 –> 00:28:02,047 So there’s going to be some concerns with that, because how do I replay 547 00:28:02,380 –> 00:28:05,550 something with the exact same context, 548 00:28:05,950 –> 00:28:09,287 and the exact words, while there’s privacy issues with that, 549 00:28:09,554 –> 00:28:12,557 you know, so it’s a complex subject. 550 00:28:13,391 –> 00:28:16,428 But if you can’t replay this and you can’t investigate 551 00:28:16,428 –> 00:28:19,431 incidents, then you can’t prove compliance. 552 00:28:19,597 –> 00:28:21,900 And that’s important for a lot of organizations. 553 00:28:21,900 –> 00:28:24,436 And you’re just flying blind at that point. 554 00:28:24,436 –> 00:28:28,773 So luckily the vendors are starting to catch up to this stuff too.
555 00:28:29,174 –> 00:28:33,344 So OpenAI, Anthropic, they all have features, things 556 00:28:33,344 –> 00:28:37,082 like lockdown mode, that are built in that help 557 00:28:37,082 –> 00:28:42,187 deterministically disable tools that get commonly exploited in the ecosystem. 558 00:28:42,721 –> 00:28:45,356 So make sure that you’re remembering those, 559 00:28:45,356 –> 00:28:48,660 and incorporating these into your workflows. 560 00:28:49,027 –> 00:28:51,362 And remember that altitude sickness that I talked about. 561 00:28:52,797 –> 00:28:53,231 Skipping 562 00:28:53,231 –> 00:28:56,234 this, it’s a really good way to get sick. 563 00:28:56,301 –> 00:28:59,070 So guardrails let you go faster. 564 00:28:59,070 –> 00:29:02,440 I think that there’s this idea that governance will slow you down 565 00:29:02,440 –> 00:29:06,211 as an organization, and that it’s a tax 566 00:29:06,211 –> 00:29:10,181 that you’re going to pay to use AI, but it’s not. Governance 567 00:29:10,181 –> 00:29:13,184 and guardrails are the engine that lets you actually scale. 568 00:29:13,351 –> 00:29:16,287 Without any kind of a real governance model, 569 00:29:16,287 –> 00:29:20,492 every agent that you build for your team is just going to be one single one-off, 570 00:29:20,658 –> 00:29:23,928 every deployment triggers a brand new risk conversation, 571 00:29:24,329 –> 00:29:27,732 you’re going to be starting your journey from zero every single time 572 00:29:27,732 –> 00:29:29,801 without these kinds of guardrails. 573 00:29:29,801 –> 00:29:32,537 So when you bake it into the pattern, you build it once 574 00:29:32,537 –> 00:29:36,074 and then you have these things that you can reuse, those security models, 575 00:29:36,441 –> 00:29:39,444 you know, they can incorporate those types of things right into it. 576 00:29:39,611 –> 00:29:41,312 You’ve got policies. 577 00:29:41,312 –> 00:29:44,415 You have, permission groups, agent sets. 578 00:29:44,616 –> 00:29:47,619 Those are things that help you build faster and scale faster. 579 00:29:47,652 –> 00:29:50,655 Your second agent’s going to ship faster than your first, 580 00:29:50,722 –> 00:29:54,526 because you have an approval path that’s already built in and it exists. 581 00:29:54,993 –> 00:29:57,996 Your fifth, it’s going to be faster, even still. 582 00:29:58,029 –> 00:30:01,599 And they compound, as you go through this journey 583 00:30:01,599 –> 00:30:04,602 so that trust actually accumulates within the organization. 584 00:30:05,103 –> 00:30:07,972 So the organizations that are winning right now, 585 00:30:07,972 –> 00:30:10,241 they’re not the ones that are just trying to figure this out. 586 00:30:10,241 –> 00:30:13,311 They’re the ones that are moving faster on day one, with agents 587 00:30:13,311 –> 00:30:16,514 that they’re building, and the ones who built the right foundations early, 588 00:30:17,115 –> 00:30:22,220 by, let’s say, like day 50, they’re moving faster than everybody else. 589 00:30:22,220 –> 00:30:27,525 And when they get to day 100, you know, they just move faster. 590 00:30:28,526 –> 00:30:32,397 So, using that 591 00:30:32,964 –> 00:30:34,165 altitude sickness, 592 00:30:34,165 –> 00:30:37,335 analogy again, the ones that are going to be really successful at this 593 00:30:37,335 –> 00:30:40,338 are the ones that are going to let themselves acclimate at each step, 594 00:30:40,638 –> 00:30:43,408 so pause and make sure you get this governance right.
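Here is a compressed sketch of those four controls wired into a single tool-dispatch path; the agent and tool names, the one-regex injection filter, and the approval list are deliberately simplified assumptions (a real deployment would lean on vendor guardrails and proper identity, as described above).

```python
import json
import logging
import re

logging.basicConfig(level=logging.INFO)
audit = logging.getLogger("agent.audit")

# 1. Input validation: retrieved content is untrusted until inspected.
#    One regex is nowhere near a real sanitizer; it just shows the hook.
SUSPECT = re.compile(r"ignore (all|previous) instructions", re.IGNORECASE)

def sanitize(text: str) -> str:
    if SUSPECT.search(text):
        raise ValueError("possible prompt injection in retrieved content")
    return text

# 2. Least privilege: an explicit allow-list of tools per agent.
AGENT_TOOLS = {"invoice-agent": {"read_invoice", "draft_po"}}  # no HR systems

# 3. Human in the loop: outbound or hard-to-undo actions need sign-off.
NEEDS_APPROVAL = {"draft_po", "send_email"}

def run_tool(agent: str, tool: str, args: dict, approved: bool = False) -> None:
    if tool not in AGENT_TOOLS.get(agent, set()):
        raise PermissionError(f"{agent} may not call {tool}")
    if tool in NEEDS_APPROVAL and not approved:
        raise PermissionError(f"{tool} requires human approval first")
    # 4. Auditing/replay: log exactly what was attempted, with its inputs.
    audit.info(json.dumps({"agent": agent, "tool": tool, "args": args}))
    # ... the actual tool call would go here ...
```

Because the allow-list, approval set, and audit logger live outside any one agent, the same pattern can be reused for the second and fifth agents, which is exactly why the guardrails compound rather than slow you down.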
595 00:30:43,408 –> 00:30:46,177 And now I want to talk a little bit 596 00:30:46,177 –> 00:30:49,180 about, like, what’s going to be, our first agent. 597 00:30:49,380 –> 00:30:53,685 So to make this real, we need some sort of an operating model. 598 00:30:54,219 –> 00:30:58,122 And, the framing, you know, for this summit really kind of stated it: 599 00:30:58,389 –> 00:31:01,759 Stop buying AI, start managing agents like a portfolio. 600 00:31:02,360 –> 00:31:05,830 So I want to start classifying them by business impact. 601 00:31:06,297 –> 00:31:08,700 And for many of you, what does that mean? 602 00:31:08,700 –> 00:31:09,667 Is it revenue? 603 00:31:09,667 –> 00:31:12,670 Is it cost? Is it risk? 604 00:31:12,770 –> 00:31:17,609 Customer service, operations, it could be any one of those things. 605 00:31:17,609 –> 00:31:19,911 What’s important to you as an organization? 606 00:31:19,911 –> 00:31:22,914 But when you think about these agents, fund them like they’re products, 607 00:31:23,381 –> 00:31:25,016 manage them like they’re assets. 608 00:31:26,251 –> 00:31:27,518 Let me bring that life. 609 00:31:27,518 –> 00:31:28,519 Let me bring that to life. 610 00:31:28,519 –> 00:31:31,422 A little bit of a metaphor that hopefully is going to fit with this room. 611 00:31:31,422 –> 00:31:34,425 Because back to my little gold mining operation. 612 00:31:34,959 –> 00:31:37,962 I kind of wanted to use AI to show me and the excavator, 613 00:31:38,363 –> 00:31:41,366 super kind of fun to go dig gold. 614 00:31:41,699 –> 00:31:43,368 But what are the excavators? 615 00:31:43,368 –> 00:31:45,837 They’re the ones that move the raw material, the earth. 616 00:31:45,837 –> 00:31:46,838 Right. 617 00:31:46,838 –> 00:31:50,441 So in your organization, that’s how we identify agent candidates. 618 00:31:50,708 –> 00:31:54,913 And where’s the work that can be, you know, assisting people, automating things, 619 00:31:55,480 –> 00:31:58,316 or augmenting the work that people do. 620 00:31:58,316 –> 00:32:01,319 Then we move on to the shaker table where, 621 00:32:01,452 –> 00:32:05,623 it separates the valuable material from just the waste, the dirt. 622 00:32:05,657 –> 00:32:06,724 Right. 623 00:32:06,724 –> 00:32:09,894 In your portfolio, that mechanism is essentially going to be 624 00:32:09,894 –> 00:32:12,897 a scoring and prioritization tool that we’re going to use. 625 00:32:12,897 –> 00:32:16,134 And we all know that not every idea’s actually worth building. 626 00:32:16,567 –> 00:32:19,971 But if it gets past that shaker table, that’s the filter. 627 00:32:19,971 –> 00:32:21,039 That’s 628 00:32:21,039 –> 00:32:24,642 where we’re going to catch the fine particles and the large nuggets. 629 00:32:24,642 –> 00:32:27,946 And we’re going to, make sure that we’re not letting anything escape. 630 00:32:28,680 –> 00:32:31,683 That’s the governance and kind of security controls that we’re talking about. 631 00:32:32,150 –> 00:32:34,886 They’re inside of those layers in the filter mats. 632 00:32:34,886 –> 00:32:37,422 And then finally we’re going to capture those nuggets. 633 00:32:37,422 –> 00:32:41,759 This is what you want to keep: measurable business outcomes, real impact, 634 00:32:41,759 –> 00:32:42,994 and something that you can show 635 00:32:42,994 –> 00:32:46,431 back to your CFO, so that we can get funding on some of these projects. 636 00:32:46,931 –> 00:32:49,400 So let’s walk through each one of these stages. 637 00:32:49,400 –> 00:32:50,768 But before we do.
I want you to do something inside your head; it's going to be a little silent for a second. Think about the work in your organization that happens the same way every single day, by hand. The workflow might be someone doing five lookups to answer one single question. We've all done that. A report that takes three hours to compile because the data lives in four or five different systems. What about an approval that sits in an inbox for two days because no one's quite sure who owns it? So take a second and think of one workflow where one mistake costs less than $1,000, maybe just a couple of bucks, but it happens 100 or 1,000 times in a week.

That pattern, that low individual cost, is cumulative drag on the organization. That's where your first nugget is probably hiding. It's not in the flashy transformation initiative. It's in the thing that's been quietly annoying somebody on the team every day for the last three years.

I think of one of my favorite stories from working with one of our clients, and it was a transformational shift. All it did was one person's work, but it demonstrated that we could reduce the time somebody was spending from six hours a day down to 15 minutes, and eventually down to about five minutes. And the end result wasn't that they had to let somebody go; it was that they closed the job req. Those are the kinds of impacts that can really sway an organization, because once that story made it to the board, it became easy to go back and get more ideas funded.

The first question in any portfolio conversation: where do you actually dig? I've got three different places you can think about. First, frontline pain. This is where the work is happening. Talk to your service desk. Look inside your ticket queues. Walk the production floor. I love actually getting out into the environment and watching how people actually do their work, what it actually takes to get things done. You see the patterns, people doing it the same way over and over. Those are the areas where you're going to find a lot of these quick wins.
Second, leadership themes. I've said this before in a lot of my conversations with customers: the priorities of your organization are going to drive your AI initiatives. If you can't define your North Star for me, what's important for your organization to focus on to be successful, then I'm not going to be able to help you accurately prioritize what kind of AI projects you should be funding. So if a client says "de-risk this process," or your CEO says "reduce this cycle time," that tells me those are strategic priorities for the organization, and those are the things I want to look at, because they're going to gain us momentum.

And then third, I'd look at build events: structured, time-boxed events that generate creative ideas. Hackathons, ideathons, people have a bunch of different names for them. But here's the caveat: only run these if you're willing to take the output and feed a governed portfolio. A hackathon that ends with a demo and nothing more, and I see this a lot, is a waste of time. What we really want is something that goes into a playbook we can run later.

So for every candidate you identify, I have something I built out for this: an opportunity scorecard. It looks at six fields: the workflow, the systems the workflow touches, who needs access, the data it requires, what the control level is going to be, and, most importantly, what the success metric is going to be when we're done.

All right, here's one that I actually worked through. Let's imagine there are 47 field technicians, and they spend eight minutes per shift just looking up the right work instructions. Through that process they're calling their supervisors, digging through their binders, texting each other. Add that up: that's 900 minutes of cumulative drag every single day. So that's what this card is going to reflect.
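Here is a minimal sketch of that card as a simple data structure, filled in with the numbers from this example. The class and field names are illustrative assumptions, not a tool shown in the talk; the six fields mirror the scorecard, and every value comes straight from the walkthrough that follows.

```python
from dataclasses import dataclass

@dataclass
class OpportunityScorecard:
    """One card per agent candidate: roughly the six fields on the slide."""
    workflow: str
    systems_touched: list[str]
    who_needs_access: str
    data_required: str
    control_level: str        # intended rung(s) on the autonomy ladder
    success_metrics: list[str]

# The field-technician example from the talk, filled in.
sop_lookup = OpportunityScorecard(
    workflow="Technician reports an issue and needs the correct SOP "
             "(47 technicians, ~12 lookups per day)",
    systems_touched=["SharePoint (SOP documents)", "ServiceNow (tickets)"],
    who_needs_access="Field technicians and their supervisors",
    data_required="~2,400 procedure documents, ~80% already digitized",
    control_level="Level 2-3: recommend the right SOP, no autonomous actions",
    success_metrics=[
        "Lookup time: 8 minutes -> under 90 seconds",
        "Supervisor escalations: down 30%",
    ],
)
```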
On the workflow: a technician reports an issue and needs the correct SOP. Users affected: we've got 47 technicians at about 12 lookups per day. Systems touched: SharePoint for the SOP documents, and ServiceNow to look at the tickets. Data required: 2,400 procedure documents, and about 80% of those we've identified as already digitized. I want to call that out, because that's a readiness signal; it means 80% of this is already accessible to my LLM. Control level: 2 or 3. We're basically just recommending the right SOP. We're not necessarily taking any actions; we're saying, give me a recommendation about what we should be doing so I can act on it. Now look at the metric: we reduce the lookup time from eight minutes to maybe under 90 seconds, and we cut supervisor escalations by 30%.

That card took me about 20 minutes to fill out, thinking through an idea out of one of these hackathons. But with it, I can answer the questions your IT lead, your security officer, or your CFO are going to want to ask before they approve any kind of pilot. Coming armed with this information is going to help you get it onto the shaker table, which is what we'll talk about next.

So once you have your candidates, now we've got to separate the gold from the gravel. That's what the shaker table does in the mining analogy: shake it all out, get the gold out of the dirt so we can start cleaning it up. Here's a simple scoring model you can use. There are a bunch of different ones out there; I had a meeting with one of my customers, he showed me what he's doing on his side, and I've incorporated that into the one I'd been using, but it's all pretty similar.

First, what's the business impact? How big is the KPI this agent could move? How many people, how many processes, what's the scale? Does it affect three people in one department? Because that's going to score differently than something that touches everybody in the organization.
Makes sense so far? Second, time to value. I did a webinar on this, talking about time to value, pro-code versus low-code. Nick, I know you're out there someplace; we'll talk later. Time to value: can you ship something meaningful in a couple of weeks, or is this going to take a long quarter, or even half a year? Anything that gets into that inner ring should be something you can ship in 30 to 60 days, something achievable.

Third, execution fit. There are a couple of angles on this one. Is your data ready? Are you able to use it? Do we have a readiness score on the data? What about your staff? Can they handle those kinds of integrations? Are you going to pick a low-code tool because you don't have any software engineers? Is the process clear enough that you could write it down for a new employee? Those are the things you want to check for execution fit.

Fourth, trust and risk. What happens if something fails? What's the blast radius? Something at level 1, 2, or maybe even 3 on our ladder is just recommending something; if it gets it wrong, that costs a minute. Not a big deal. But something that acts has a much bigger blast radius. Something that can authorize payments and makes a mistake, that's a totally different category of risk, and you're going to score it differently.

And finally, what's the autonomy ceiling? Not where you're going to start, but where the ceiling is. Where do you envision this getting to on the ladder? How high should you allow it to go? Some candidates should just stop at level two and stay there; you're never going to progress past it. Others you'll take up the ladder methodically, so you don't get altitude sickness and so you can test and validate the agent as it goes.
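Before the live exercise that follows, here is a minimal sketch of how those five shaker-table dimensions might combine into a ring. The talk names the dimensions but not the arithmetic, so the weights and cutoffs below are assumptions for illustration only.

```python
# Hypothetical weights: the talk gives the five dimensions, not the numbers.
WEIGHTS = {
    "business_impact":  0.30,
    "time_to_value":    0.25,
    "execution_fit":    0.20,
    "trust_risk":       0.15,  # higher score means a smaller blast radius
    "autonomy_ceiling": 0.10,
}

def shaker_table(scores: dict[str, int]) -> str:
    """Score each dimension 1-5, then return the ring a candidate lands in."""
    total = sum(WEIGHTS[dim] * scores[dim] for dim in WEIGHTS)
    if total >= 4.0:
        return "inner"   # shippable in roughly 30-60 days
    if total >= 3.0:
        return "middle"
    return "outer"

# A candidate like the SOP-lookup card: strong impact, modest ceiling.
print(shaker_table({
    "business_impact": 4, "time_to_value": 4, "execution_fit": 3,
    "trust_risk": 4, "autonomy_ceiling": 3,
}))  # -> "middle" under these assumed weights
```

The exact weights matter less than the fact that everyone scores candidates against the same dimensions, which is what makes the ring debates productive.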
Let's try a little live exercise. I've got three different use cases up here, and I know I'm throwing a lot of frameworks at you, but I want to make sure you can put them to use. As I go through each one, think back to the rings we talked about and vote: inner ring, middle ring, or outer ring.

First use case: imagine you're a manufacturer. You've got field technicians who constantly need the correct procedures for their equipment. Today they're calling their supervisors and digging through their binders, and you want an agent that surfaces the right work instructions when they describe the issue. So, real quick: Inner? Middle? Nobody's picking middle; you've got one vote here. All right. Outer? All right.

Use case B: you're an association or a nonprofit. Your members call and email about benefits, coverage, eligibility. Your staff spends a large portion of their time answering the same questions from documents, and those documents already exist. You really just want an agent that answers questions and routes the complex ones to somebody a little more knowledgeable in that area. Inner? Middle? Outer? All right.

Last one: you're in financial services. You file compliance documents; analysts draft them manually from source data, and attorneys review them. That's the important part: attorneys are going to review them. You want an agent that drafts from the source data, flags anomalies, and queues the attorney for approval. So what do you think about this one? Inner? Outer? Mixing it up, paying attention. Middle? All right.

So let's talk about how I would score these. I feel like A is in that middle ring, or the inner ring if the data is just the documents. The control level is probably a 2 or 3; it's just recommending and getting approved. Something like that is pretty much cookie cutter,
and there are even tools you can just buy off the shelf; you'll have something running in 30 days, no questions asked. B is probably somewhere between the inner and the middle ring, depending on how structured your data is; if you're in that area, I'd accept either answer as the right one. And C is probably in the middle ring. The compliance layer and the governance requirements are going to take some time to get right. But remember, I did say that attorney is going to be doing the review, so it's not actually autonomous.

Regardless of whether you agree with my scoring or not, here's the key point: there's a framework for this, and it's something you should do intentionally. It's going to be valuable for your organization whether you all agree or not, because the point isn't really whether a candidate lands in the inner, middle, or outer ring. The point is the conversation you're going to have with your teams about why it belongs in one of those areas. That's how you get organizational alignment before you spend a single dollar on these kinds of projects and wind up pouring some of it down the drain.

Now let me show you what the other side of the portfolio looks like when it works. This was a pretty good story in the news. Klarna deployed an AI assistant that handled the equivalent of roughly 700 of their full-time customer service representatives. Resolution time, and this was huge in the news, went from about eight minutes to under two minutes using these agents. And at the time they reported, I think, around $40 million in profit improvement over that single year. That's huge impact, and it got people talking about it. That's measurable.

Now let's take a step back. Gartner predicts that more than 40% of agentic AI projects this year are going to get canceled, not because the technology didn't work, but because of things like escalating costs, unclear business value, or weak risk controls that cause problems within the organization.
So the technology is often the easy part; the discipline is the hard part. And talking about Klarna, just to keep myself honest: they later adjusted that model. If you remember, they reintroduced more human involvement for some of the complex tasks, because customer experience matters. It happens, but that's not really a failure. That's portfolio management. If you think about it, AI handled the repetitive work and got most of those things done; they just misjudged what humans needed to handle and were overambitious with it. So the system actually matured, and that's the difference. These portfolio models I'm giving you are going to help you establish exactly that.

So here's the question that the Gartner number, that 40%, should make us ask: what did the nuggets do that the gravel didn't? The canceled projects, that 40%, started with the technology. They saw a demo, and I know that's how many people in this room have seen this stuff, and it didn't work out. They started with the demo, they stood up a pilot, but they never defined what the success criteria were. They never decided what the finish line was going to be. They funded everything and they prioritized nothing. And then the bill came due, and they couldn't justify what they were spending their money on.

The projects that succeeded, like Klarna, and the leaders you're going to hear from today on the executive panel, started somewhere a little bit different. They started with a measurable workflow, a number they were trying to move. Remember that one person, that job req we were trying not to have to fill? That was the KPI. Do I have to fill the job? Simple. They scored their candidates before they actually built anything. And they had some of these governance patterns in place, so the second pilot was approved in half the time of the first. When you have success stories you can take back to your CEO and your board, they're going to approve more.
And I've actually had boards come back to these leaders and ask them, why aren't you doing even more?

So the difference isn't the model. It's the operating model. And that brings me to the last thing I promised I'd talk about today: a 30/60/90 plan. A lot of you are looking for something you can take out of here: what should your first 90 days look like? Feel free to take a picture.

In the first 30 days, I encourage you to score ten agent candidates using the frameworks we've talked about. Make sure you use that opportunity scorecard, and we'll make sure we get you the slides, so if you missed taking a picture of some of this, we'll get it out to you. Then score them on the shaker table. Pick two that land in the inner ring, the ones you can get done fast that deliver value, and define their autonomy ceilings before you actually start building them.

Then, by day 60, see if you can ship one or two of these into a controlled production environment. Human approvals in place; you've got to keep that human in the loop, especially when you're getting started. Make sure you've turned logging on so you can retrace your steps at some point, and make sure real-world data is actually flowing through it.

And at day 90, set a goal of standing up a monthly agent portfolio review. Are you getting enough ideas into your portfolio? Go methodically through the value that's being delivered and make the decisions. Should we fund this one? Should we scale this one we've already built? Should we kill this one we delivered that isn't generating the kind of KPI value we said it would? Don't hesitate to make any of these decisions. This is new territory we're all in; we're not all experts, because AI hasn't been around for 25 years. For us it's been around maybe a little longer than for most, but it is still a new technology.

And here's the line I really want you to take home: momentum is cultural before it's technical. Right?
So build that momentum with your culture and make those shifts inside your organization first.

I gave you the map, but we're not done with the sessions today, so I want to give you a quick recap of what's coming up. The rest of today is about how you run each piece of the equipment without losing your limbs. The next session goes a little deeper into governing and securing AI at scale; that's the full operational playbook behind the control model and the guardrails we talked about. Then we move into data and AI, creating great agents with trusted data, because trusted data multiplies the value of great agents. So if your candidates keep landing in the outer ring of the shaker table, hopefully that session helps you understand why. Then finally we've got the executive panel with Brian Atkinson from Atlas Energy, Michael Barrett from Pottawattami, and Hao from Clarios. These are three leaders we've been working with who make real AI decisions inside real organizations, right here, that you can reach out to and talk to. Well, one of them isn't quite here yet. But anyway, these aren't theories. It's not just slides. It's the actual conversations they're having.

So I'm going to close with the same thought I started with, because I think it's even more true today. Those loud voices talking online about that thunderclap want you to feel there's some sort of urgency, some sort of crisis. Why? Because urgency sells. But in the real world, here in this room, in the Midwest, for these organizations AI is about compounding quietly. It's about being slow and deliberate with your actions. So remember, from the manufacturing side: concrete cures slowly, and so does transformation. But it does cure. And today is really about how you mix it.

Thanks.