View Recording: Milwaukee AI Summit 2026 – Governing & Securing AI at Scale



In this session, Joe Steiner, Solution Architect at Concurrency, breaks down what it really takes to govern and secure AI as adoption accelerates across the enterprise. As organizations move from experimenting with public AI tools to deploying Copilot, building agents, and developing their own AI solutions, Joe explains why governance isn't about slowing innovation; it's about enabling it responsibly.

Rather than advocating for extremes like banning AI entirely or embracing it without guardrails, this webinar outlines a pragmatic, staged approach to AI governance. Joe walks through how unmanaged AI use leads to shadow IT, data exposure, employee distrust, and operational risk, and how thoughtful governance helps organizations capture value while maintaining security, trust, and usability.

Using real-world examples and practical scenarios, the session covers the full AI adoption spectrum: from organizations just beginning to define acceptable use policies, to those securing Microsoft Copilot, to teams building and publishing AI agents at scale. Attendees gain a clear framework for balancing risk, usability, and business value as AI becomes a core part of how work gets done.

WHAT YOU'LL LEARN

In this webinar, you'll learn:

  • Why both banning AI and adopting it without controls create serious business risk, and what a balanced governance approach looks like
  • How AI adoption typically progresses, from no usage to public AI tools, to Copilot, to custom agent development
  • What "responsible AI governance" really means beyond security, including operational, financial, legal, and trust considerations
  • How to create and evolve an AI acceptable use policy that aligns with real employee behavior
  • Why unmanaged AI leads to shadow IT, data exfiltration, and employee dissatisfaction
  • How to govern public AI usage to reduce data leakage and exposure to untrusted tools
  • Best practices for securing Microsoft Copilot, including permissions hygiene, data classification, and retention policies
  • How poor data hygiene leads to bad AI outcomes, and why cleaning and protecting data is foundational to trustable AI
  • What changes when organizations start building their own AI agents and solutions
  • How shared responsibility shifts when you move from using AI tools to publishing AI products
  • The role of identity, access controls, monitoring, and auditability in securing agents at scale
  • Why lifecycle management, testing, and ongoing monitoring are essential for AI systems that evolve over time
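The permissions-hygiene point in the Copilot bullet above is easiest to see in miniature. The sketch below is illustrative only: the inventory format, group names, and classification labels are invented for the example and do not correspond to any specific Microsoft 365 API. It flags confidential content shared with overly broad groups before an AI assistant is allowed to surface it:

```python
# Hedged sketch of a permissions-hygiene pass over a toy content inventory.
# Items with a sensitive classification that are shared with broad groups
# are flagged before an AI assistant indexes them.

BROAD_GROUPS = {"Everyone", "All Company"}  # illustrative group names

def flag_oversharing(items):
    """Return sensitive items whose sharing scope makes them risky to index."""
    flagged = []
    for item in items:
        broad = BROAD_GROUPS & set(item["shared_with"])
        if broad and item["classification"] in {"Confidential", "Restricted"}:
            flagged.append({"path": item["path"], "exposed_to": sorted(broad)})
    return flagged

inventory = [
    {"path": "/hr/salaries.xlsx", "classification": "Confidential",
     "shared_with": ["Everyone"]},
    {"path": "/marketing/logo.png", "classification": "Public",
     "shared_with": ["Everyone"]},
]

# Only the confidential file shared with "Everyone" is flagged;
# the public file is fine to index.
print(flag_oversharing(inventory))
```

In practice this kind of check maps to reviewing sharing links, sensitivity labels, and retention settings before enabling Copilot over a content source.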

FREQUENTLY ASKED QUESTIONS

Why is AI governance so important right now?

AI adoption is happening faster than nearly any technology before it. Without governance, organizations expose themselves to data leaks, regulatory violations, employee mistrust, and poor decision-making based on unreliable outputs.

What's the biggest mistake organizations make with AI?

Taking an extreme approach: either banning AI entirely or allowing unrestricted use. Both lead to risk. Bans create shadow IT, while unrestricted use accelerates security and compliance problems.

Is AI governance just a security issue?

No. While security is critical, governance also includes data management, operational risk, financial oversight, legal considerations, ethics, and user enablement. Governance should help people use AI effectively, not block productivity.

How does governance change when we move from Copilot to custom agents?

Once you start building agents or AI solutions, you're no longer just a consumer; you're a provider. That brings new responsibilities around model behavior, data exposure, accuracy, monitoring, and ongoing support.

Why does data governance matter so much for AI?

AI surfaces whatever data it can access. If permissions, retention, and classification aren't clean, AI will expose outdated, sensitive, or unintended information, leading to bad outputs and loss of trust.
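That "surfaces whatever it can access" point can be sketched as security trimming: filter retrieved documents down to what the requesting user can already read before anything reaches the model. The document names and the ACL mapping below are invented purely for illustration:

```python
# Hedged sketch of security trimming for AI retrieval.
# The ACL here is a toy dict, not a real permission API.

def trim_for_user(docs, user, acl):
    """Keep only documents the user is permitted to read."""
    return [d for d in docs if user in acl.get(d, set())]

acl = {
    "q3-forecast.docx": {"cfo", "analyst"},
    "team-handbook.pdf": {"cfo", "analyst", "intern"},
}

retrieved = ["q3-forecast.docx", "team-handbook.pdf"]

# The intern's AI answer can draw on the handbook but never the forecast.
print(trim_for_user(retrieved, "intern", acl))
```

The design point is that trimming happens on the retrieval path, so the model never sees content the user couldn't open directly.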

What's the role of identity in agent governance?

Assigning identities to agents allows organizations to control what agents can access, who can interact with them, and how their activity is monitored. Identity is the foundation for secure, auditable agent behavior.
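A toy sketch of that pattern, with hypothetical agent and tool names: each agent identity carries an allow-list, every tool call is checked against it, and the outcome is written to an audit log whether it was allowed or denied:

```python
# Illustrative sketch only: the agent ids, tool names, and permission map
# are invented for the example and are not a real agent framework API.
import time

AGENT_PERMISSIONS = {
    "invoice-agent": {"read_invoices", "send_summary_email"},
    "hr-agent": {"read_policies"},
}

audit_log = []  # every call, allowed or denied, lands here

def call_tool(agent_id: str, tool: str) -> str:
    allowed = tool in AGENT_PERMISSIONS.get(agent_id, set())
    audit_log.append({"ts": time.time(), "agent": agent_id,
                      "tool": tool, "allowed": allowed})
    if not allowed:
        raise PermissionError(f"{agent_id} may not call {tool}")
    return f"{tool} executed as {agent_id}"

call_tool("invoice-agent", "read_invoices")      # allowed and audited
try:
    call_tool("invoice-agent", "read_policies")  # denied and audited
except PermissionError:
    pass
```

Because denials are logged as well as successes, the audit trail shows not just what agents did, but what they attempted.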

Do we need to solve everything at once?

No. Governance evolves over time. Organizations should start where they are, put foundational controls in place, and mature their approach as AI usage and sophistication grow.

ABOUT THE SPEAKER

Joe Steiner, Solution Architect at Concurrency, helps organizations adopt AI securely and responsibly as they move from experimentation to enterprise-scale deployment. With a deep background in cloud, security, and identity, Joe focuses on balancing enablement and risk, helping teams govern AI in ways that protect data, maintain trust, and unlock real business value without slowing innovation.

TRANSCRIPT


My name is Joe Steiner. I'm a solutions architect with Concurrency. We're going to be talking about AI governance: how do we govern and secure AI as we work along the journey into that? Each of you is probably in a different place there, so we're going to talk about the progression. That's really the best way to approach it; you're not going to solve it all at once. So we're going to talk about the best way to work your way through that.

After this, we've got what I view to be the highlight of the day: we have some panelists coming from the audience to talk about their real-world experiences with AI and what they've done. I'm really looking forward to that, so I hope you'll stick around for it.

So, when we talk about AI, just backing up a bit: humans have had kind of a weird relationship with AI for about 75 years.
When you think about it, back in the 50s there were stories about this. Asimov started with it, and plenty of other science fiction has been addressing it for some time. The concerns about what AI is going to do to the world, to our lives, to our relationships with it, span a spectrum between the doomsday scenarios (you saw that with Terminator and the Skynet stuff) and utopia, where everything's going to be golden and great, no worries, right?

The truth is, those make for good stories. They make for good movies. But that's not really where we're heading. Really, it's up to us to determine whether AI is detrimental to organizations, whether it's kind of neutral and doesn't do much, or whether we realize the possibilities of it really delivering some exponential improvement in what we're doing. It's on all of us how we get there.
As we've gone through history with different technologies, you can look back and compare. With AI, I look at what we saw with the internet, right? It's a similar adoption pattern, and there are different ways to approach it. Same thing with cloud.

Some organizations will start with: hey, we're going to avoid using this, right? We're going to let this settle out and see what happens. The problem with that is that you don't realize any of the benefits until you start using it. The other problem, and the bigger problem in my mind, is that frequently organizations will take an approach like: hey, we're going to ban this, right? We're going to wait until we're all set. That leads to what I would describe as your shadow IT mutiny. We saw this ten years ago.
I had a client here in Wisconsin, a large manufacturing company, that was banning cloud technology. Their people, in response, said: well, I still need this to get my job done, right? This is kind of ridiculous. So they would literally leave the office, take a personal computer, put their work files on it, and go to a coffee shop literally across the street. Some of you know what I'm talking about. Literally across the street. They'd get their work done and then come back in.

All for legitimate purposes, right? They weren't doing anything nefarious, they weren't trying to get away with anything. They were trying to get work done. And it got kind of ridiculous. It got particularly ridiculous when they started noticing other employees doing the same thing, sitting at the coffee shop with them. I mean, it's just moronic, right?
Once everyone starts noticing that, it starts leading to employee dissatisfaction. Then you have people starting to realize: hey, it's a lot harder to work here than it is somewhere else, right? I'm going to go somewhere that's going to let me do the job the right way. Which creates all kinds of new risks.

On the other end, when you have new technologies, you'll have organizations that say: hey, this is the best thing ever, don't worry about it, let's dive in headfirst, right? And it's great in one sense: you start realizing benefits early, and because you don't restrict what's happening, you can realize the maximum level of benefits. But you're not calculating for the risks, and there are massive risks with doing that. Now I have unmanaged usage of these new technologies, and there are assorted security problems, operational issues, and ultimately financial issues that come from that.
And ultimately, when you have an unmanaged technology in the environment, it starts to lead to employee distrust. Part of the problem is that if I've got something that's not managed, particularly with AI, and I'm using these tools while they present me bad information, then I'm making decisions off of bad information. That's not good. People start to have their opinion of AI soured by things like that. So you've got to manage that too.

The truth is, the best way to approach it, as with so many things in life, is somewhere in the middle. How do we establish a responsible use model for this? Realize maybe not all of the benefits, but most; have managed usage; drive towards employee satisfaction as best we can; and minimize the risk. That's where governance comes in.
For me, AI governance is about enabling responsible AI use and development by focusing on certain things. One is certainly the risks, and this is not just security risks: these are operational risks, financial risks, a host of things. Security is a big part of it, to be sure, but there's more than that.

The other side you're balancing is: how am I maximizing the business value I'm trying to realize? Like Brian's talk this morning, where he was talking about creating a portfolio of agents: how am I designing that? Somebody has come forward with a vision for what they want out of AI, and governance has a responsibility to help them realize that in a managed, safe fashion, right? That's really what this is about: ensuring usability, making sure you're not locking things down so much that people can't use it. Kind of like my example earlier.
Also, how am I making sure I'm maintaining trust, so that people are using this? We've made it available, and the information provided is of a quality people can trust, so they're not constantly second-guessing everything.

Key areas for doing this involve setting the right policy and strategy for the organization; putting the right education and enablement efforts in place, enabling the people (this is very much a people-first technology in so many ways); putting the right protections and monitoring in there; and ensuring there's ongoing testing and development too. It's not set it once and forget it. You need to continue to manage this over time.

Now, when I look at AI use, there's kind of a spectrum, right? There are those not using AI yet today.
I'm going to ask you, and I'm guessing there's probably a low number here, but how many of you are in organizations that say: yeah, no, we're not using AI? Show of hands, anyone? Good, that's great. There are certain things we'll talk about here for those that do operate that way. You want to make sure those things are in place, because we're going to show a progression for governance as you work through this. So hopefully those things are helpful.

How many of you are in organizations that allow you to use free public AI, but aren't investing in things like Copilot and aren't building agents yet? Show of hands. Okay, thank you.

How many of your organizations are investing in AI systems like Copilot, licensed for Copilot? Great, great. It's changed a lot. I left Microsoft two years ago. Three now, I guess. In 2023, Copilot was just coming out.
I can tell you, internally, everyone there was like: what is this thing, right? We were struggling to use it ourselves. It's come a long way in three years. I have not seen a technology get adopted this quickly in my 30 years in the business.

The next part of this is: how many of you are actually building your own AI offerings? This could be Copilot Studio, could be AI Foundry, could be hosted. All right, thank you. When you take that on, it's a whole other level of responsibility, because you're now providing a product for other people to use. So you have to take on some of the things that Microsoft, or other vendors, have to take on when they're delivering an AI product. We're going to talk about all of that as we go.

First of all, if you're not using AI directly, and again, I appreciate most of you aren't in that boat: the thing to remember is that there's no avoiding AI. It's here to stay. You can't put your head in the sand and hope it bypasses you. You have to embrace it. You have to realize this is our new reality, right?

In this stage, we encourage a lot of organizations to create an AI policy. If you don't have one, it's a good time to get caught up on that, so you can create a policy for how you want to responsibly use AI within your organization. We highlight it here because organizations that haven't quite gotten there yet are trying to create that vision: okay, how would we use this? You've got to get there. So how do you create that plan for what you would allow it to be used for?

The other important part here is educating yourself. A lot of you probably have; being here today has probably helped a little bit, and you've probably gone through a lot of what we're going to talk about already. The other thing is to restrict AI use and access until you're ready, right?
If you're really trying to avoid it, there are some tools that can be used, and they're beneficial to put in place as you start to control the environment.

Now, I appreciate that right after lunch, this is probably not the most exciting topic in the world, right? So we put in some fun videos. We had a couple of our associates play some fictional characters, Daryl and Daryl, highlighting some caricature-style situations that people might experience as they go through this. So here are Daryl and Daryl. They're not necessarily using AI yet, but it highlights the importance of AI awareness.

[Video plays] Hey, Daryl. Did you catch the game last night? The Kansas game? No, I didn't. Come on, come on, come on. Look at this. Have you ever seen a baseball game like this before? Look at this video. Check this out. What is going on?
I just want to say, I can't stop watching this. Like, what is that video? Are you sure this is real? It's a video, Daryl. Videos are real. Daryl, this is clearly AI generated. But it's a video! No, no, no, no, no, that does not happen in basketball games. AI clearly cannot... you're saying AI can do this? Yes, AI can do a lot of things, Daryl. And this is one of them. You've got to watch out for these things. [Video ends]

The thing we're highlighting there is that even if you're not using AI, you need to ensure you understand what AI can do, and be able to police yourself in terms of trusting what you're seeing out there. That goes for whatever stage you're in; it's an important aspect of this. You need to understand that there are a lot of possibilities with AI. Deepfakes are a very real thing. Phishing sophistication has gone through the roof now with AI.
Those emails you'll get from, you know, that uncle in a foreign country look far more legitimate than they used to, right? We've been dealing with these things for a long time, and AI just makes them harder to discern. But there are things you can do to educate yourself to deal with that, and it's important that everybody does.

In terms of getting ready for AI, or if you're already in AI, I highly encourage you to investigate embracing a zero trust strategy. Zero trust. The reason I really, really emphasize this is that zero trust, even though it's a security model, is actually about saying yes. Security is usually about saying no; zero trust is about saying yes. How do you enable your organization, and the security within your organization, to be able to say yes to things? This is a way to do that. If you have a strong enough security model, it allows security to become an enabler of the things you want to do.

If you're getting started with this, I highly recommend running a zero trust assessment on your environment, maybe running a workshop to get familiar with the concept. It's important to understand the real philosophy of it. It's not just about never trusting anybody (which it is), but there's a reason for that. The purpose is that if I treat the environment as assume breach, that allows me to lock things down enough that I'm able to enable other ways of working. This was highly important in 2020, when we went to hybrid work models. For a lot of organizations, this philosophy really helped them out, because they said: I've got to be able to enable these people. How do I secure the core enough that I can allow people to operate outside the network perimeter? Identities are very important for this.
So I would encourage you all to make progress towards adoption of that. One other thing: do make sure you have an acceptable AI usage policy. Make sure you've set some kind of parameters for your organization in terms of what we allow and what we don't in terms of AI use. And I would highly encourage you: don't just have this be IT and security. Bring HR into this, bring legal into this, bring as many people from your organization as you want, but I certainly would start with those four. You don't want to muddy the waters so much you can't get anything done, but you need certain core representation in order to do this right.

Then there's the education level: understanding what AI can do, identifying AI-generated content. There's a host of online learnings on this, and I would encourage anybody that hasn't been familiar with them to take a look. This is an ongoing thing; things are changing all the time.
Make sure the organization is educated. The first barrier there is making sure people are doing smart things and not making stupid mistakes, right?

In terms of restricting AI tool usage, we highlight this here. You can use things like Defender for Cloud Apps. This actually feeds off of your firewalls and the information within there. It's a great tool for starting to examine what people are doing with AI and offering a viewpoint on whether that's sanctioned or unsanctioned. If you mark something as unsanctioned, it'll start putting alerts out there, and if you have Defender for Endpoint in there, you can actually block use. You can also do this through certain networking tools as well. This is important, because the list of AI agents and tools out there continues to grow, and not all of them can be trusted. The nice thing is that Microsoft's keeping an eye on that for you.
They're providing risk scoring for the tooling and agents that are out there. That provides a guide for everybody, so you don't have to be a master of that domain on top of everything else you're doing in your organization.

All right, second stage: using public AI. Here I'm using things like ChatGPT; it could be Copilot Chat, could be Claude, or a host of other tools. The first thing to be aware of here is what you share, and what's happening with that information. Even as you're prompting, you are sharing information just in those prompts. If you share data up there, share documents, you're putting a whole other level of information out there to be used by whatever tool, and whoever's running that tool, on the back end. Continue to refine your policy and educate here. Make sure you're protecting your organization against untrusted tooling.
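The trusted-versus-untrusted distinction here amounts to an allowlist check on where employees send data. A minimal sketch, assuming a hand-picked set of sanctioned domains (the domains below are illustrative; in practice a tool like Defender for Cloud Apps manages this list, not code):

```python
from urllib.parse import urlparse

# Illustrative allowlist; a real deployment manages sanctioned apps
# in a CASB such as Defender for Cloud Apps, not in source code.
SANCTIONED_AI_DOMAINS = {"copilot.microsoft.com", "chatgpt.com"}


def is_sanctioned(url: str) -> bool:
    """Return True only if the AI tool's domain is on the approved list."""
    host = (urlparse(url).hostname or "").lower()
    # Match the domain itself or any subdomain of it.
    return any(host == d or host.endswith("." + d) for d in SANCTIONED_AI_DOMAINS)


assert is_sanctioned("https://copilot.microsoft.com/chat")
assert not is_sanctioned("https://free-ai-tool.example/upload")
```

The design choice worth noting: everything defaults to unsanctioned, and tools are added deliberately, which mirrors the "say yes on purpose" posture described above.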
560 00:16:25,517 –> 00:16:27,119 This is evolving things like Defender 561 00:16:27,119 –> 00:16:28,320 for Cloud Apps 562 00:16:28,320 –> 00:16:29,955 to allow for trusted tooling, 563 00:16:29,955 –> 00:16:31,190 but maybe not the untrusted 564 00:16:32,725 –> 00:16:33,959 letโ€™s have another video in terms of 565 00:16:33,959 –> 00:16:35,394 an example of where this can go wrong. 566 00:16:36,695 –> 00:16:38,063 Daryl, come over here, I want to show you something. 567 00:16:38,130 –> 00:16:39,298 Yea, what you working on Daryl? 568 00:16:39,298 –> 00:16:42,034 I am just doing some analysis 569 00:16:42,034 –> 00:16:43,235 for the sales team. 570 00:16:43,469 –> 00:16:44,403 And, 571 00:16:44,403 –> 00:16:46,372 there’s this new Russian AI tool. 572 00:16:46,405 –> 00:16:46,839 Yeah. 573 00:16:46,839 –> 00:16:48,674 It’s supposed to give you tremendous outputs 574 00:16:48,674 –> 00:16:49,942 just for the switch 575 00:16:50,309 –> 00:16:51,510 data prior to security. 576 00:16:51,643 –> 00:16:52,611 Nothing to worry about. 577 00:16:52,611 –> 00:16:53,812 It’s really cool. 578 00:16:53,879 –> 00:16:55,180 So are we paying for yours? 579 00:16:55,347 –> 00:16:56,382 Oh, no. Completely free. 580 00:16:56,382 –> 00:16:57,316 Oh, sweet. 581 00:16:57,316 –> 00:16:58,550 It’s the goodness of their heart. 582 00:16:58,984 –> 00:17:01,653 So can I, let’s go ahead here. 583 00:17:01,653 –> 00:17:04,056 I’m going to upload the sales 584 00:17:04,056 –> 00:17:05,624 forecast in spreadsheets. 585 00:17:05,691 –> 00:17:07,493 There’s not really much in it other than just, like, 586 00:17:07,760 –> 00:17:08,827 our client’s names 587 00:17:08,827 –> 00:17:10,095 and their addresses. Now. 588 00:17:10,129 –> 00:17:11,163 Yeah, it’s stuff like that. 589 00:17:11,163 –> 00:17:12,598 Maybe I should send you mine, too. 590 00:17:12,598 –> 00:17:13,999 And then you could do them both at once. 591 00:17:14,266 –> 00:17:15,401 That would be great. 
"Sure. Okay. I'm going to pull my clients out. I've got all my customers here. Here we go."
"Yeah, throw that in there."
"Okay, cool. Start it. Attach that one too. Submit."
"All right. Why is my computer black? Wait. Daryl?"
"Yeah?"
"Yeah, my computer's black."
"Me too."
"Mine too."
"Oh, no. We've been hacked. What happened? What site was that? Oh, God."
"Oh, Daryl."

So, yeah. Again, an extreme example, but you get the point, right? There are risks to sharing information through public AI. You want to make sure you're using trusted tooling, tooling that's not sharing your data outside of the sessions that you're running in. Copilot Chat does a nice job of this, right? Not all tools out there are created the same.
Some will mine the data that you're sharing with them. That may not lead to hacking incidents, but it will lead to data exfiltration that you may not have anticipated. So it's important to continue to use trusted public AI tools. There are a lot of readouts on that, a host of lists, and people that can guide you on what's safe, what's not, and why.

One important aspect I want to highlight here is that as you're using AI, you want to treat it like you're talking to a person, right? So if I'm talking to public AI, I want to treat it as if I'm talking to somebody outside of my organization. If I'm asking questions or sharing information there: would I tell them that? Being mindful of my responsibilities to whoever I'm working for is just a good initial frame of mind to have. Like: would I share this with a competitor? Probably not. Maybe not.
Would I want to put it in the public eye? Right.

The other thing is that you want to make sure to update your AI policy to take into account what you want to allow for, now that you've entered this age, and define what those acceptable tools are. Make sure you've communicated that with everybody. It's one thing just to say, okay, we're going to use this; if you still have people going around it, that's not helpful. You need to make sure people are aware that this is available. This will help against the shadow IT piece. Or you're saying: hey, we have these trusted tools to use, stop using everything else, whether you're blocking the rest or not.

Education and enablement again. Make sure you're running everybody through an appropriate AI use training module. Again, there's a whole host of these out there. We can help you with that, but there's plenty online that you can use to get through this.
What trusted AI tools are there, and what are appropriate use cases for them? What are the risks of data sharing? It's important people start to understand what might happen with the data when they're sharing things out there, and why that's a problem, right? Not everybody gets that. A lot of people are trying to do the right thing all the time, but mistakes happen when they don't know that they're making errors in judgment.

How to handle ethical issues: it's important that we're being responsible in how we behave out there, too, because sometimes in those prompts we're asking things maybe we shouldn't be. That is recorded, that's available out there, and it can come back to bite you too. So make sure that everyone's behaving appropriately. Again, treat it like a person: if you wouldn't say it to another person, it shouldn't be said to AI either. For most of you, I'm not worried about this, but that has been a problem for some in the past.

Make sure to verify, too.
This is where we get to "trust but verify." I might still be at "never trust": verify everything you see here. I prefer AI tools that give me the sources for where the information came from. It sped up my finding it and gave me a good readout, but I'm still checking the source: making sure that readout was correct, and that the information came from a legitimate source.

Protection and monitoring again: evolving Defender for Cloud Apps. Here, though, we're going to start getting into this: if I'm going to be sharing data out and able to upload a file, maybe even to Copilot Chat or Claude or ChatGPT, I want to start protecting my data at the document level, because there's certain data there that I don't want getting out. Now, this is a process unto itself. We're going to talk a little bit about data governance and protection throughout this, because it is an important topic at this point.
All we're stating is that for the really highly sensitive things, you think about marking those documents as sensitive, possibly putting some encryption on them so that only certain people are able to get to them. These are highly sensitive business documents; I want to start there. Protect yourself against the things that are the biggest threat to your organization, and evolve that over time.

The other thing to do is manage your browsers and devices. There are a host of tools that Purview has, in concert with Intune, and probably others as well. We're a Microsoft shop, so I'm going to be talking a lot about the Microsoft tooling today, but the principle is the same. You want to embed some level of DLP into the browser so it can monitor what you're doing; most of these AI tools exist within a browser. I then can control whether people are accessing the ones I want or the ones I don't, right? It also will ultimately allow me to see: what are they doing in there?
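The kind of pre-upload screening a browser DLP performs can be illustrated with a minimal pattern scan. The two patterns below are simplified stand-ins, not the actual classifiers a product like Purview uses, which are far richer (checksums, confidence levels, trainable classifiers):

```python
import re

# Simplified stand-ins for sensitive-data classifiers, for illustration only.
PATTERNS = {
    "ssn": re.compile(r"\b\d{3}-\d{2}-\d{4}\b"),
    "credit_card": re.compile(r"\b(?:\d[ -]?){13,16}\b"),
}


def scan_before_upload(text: str) -> list[str]:
    """Return the names of the sensitive-data types detected in the text."""
    return [name for name, pat in PATTERNS.items() if pat.search(text)]


doc = "Forecast Q3. Contact SSN 123-45-6789 on file."
assert scan_before_upload(doc) == ["ssn"]
assert scan_before_upload("Nothing sensitive here.") == []
```

A real DLP policy would then block, warn, or log the upload depending on what was detected; the sketch only shows the detection step that makes those decisions possible.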
775 00:22:44,229 –> 00:22:45,464 Even in the trusted ones, 776 00:22:46,998 –> 00:22:48,667 if I am or the device in the Purview, 777 00:22:48,834 –> 00:22:50,035 I can see some other 778 00:22:50,235 –> 00:22:51,603 data flows within there too. 779 00:22:51,603 –> 00:22:54,005 And what’s happening with them, with my sensitive information 780 00:22:54,506 –> 00:22:55,207 inside of there 781 00:22:55,207 –> 00:22:56,508 as well, just provides 782 00:22:56,508 –> 00:22:58,810 a lot of good visibility here, helps me evolve 783 00:22:58,810 –> 00:23:00,145 my policies, helps me 784 00:23:00,145 –> 00:23:01,546 have all my protection mechanisms. 785 00:23:01,847 –> 00:23:03,048 It’s a great place to start. 786 00:23:03,281 –> 00:23:05,283 Pretty low lift, to be honest, right? 787 00:23:05,283 –> 00:23:08,387 This is not a difficult thing to do, but it is important. 788 00:23:09,788 –> 00:23:10,522 The other thing I would do 789 00:23:10,522 –> 00:23:11,757 in this ties in with being able 790 00:23:11,757 –> 00:23:13,458 to monitor this from the DLP 791 00:23:14,025 –> 00:23:15,827 is be able to turn on Data 792 00:23:15,827 –> 00:23:17,896 Security Posture Management for for AI. 793 00:23:18,797 –> 00:23:19,431 This is something 794 00:23:19,431 –> 00:23:21,466 that’s part of the Purview E5 suite. 795 00:23:21,967 –> 00:23:24,035 It’s a great visibility 796 00:23:24,035 –> 00:23:26,071 tool, provides AI observability in there 797 00:23:26,505 –> 00:23:27,539 and you can start seeing 798 00:23:27,539 –> 00:23:28,740 what’s happening. 799 00:23:29,007 –> 00:23:31,209 Use this in concert with Insider Risk 800 00:23:31,510 –> 00:23:32,878 to create policies, 801 00:23:32,878 –> 00:23:34,846 which is another part of of Purview. 802 00:23:35,680 –> 00:23:37,349 And use this to create policies 803 00:23:37,349 –> 00:23:39,718 to monitor for certain generative AI use. 
And you can see what's being shared in prompts, the information being shared in that. If nothing else, it's providing observability at this point, so you can start to say: okay, we've got to tune this a little bit, I don't really like this happening, and be able to manage that.

That's public AI. Let's start getting into when we're starting to pay for this a little more: using AI systems. One, obviously, use something that you trust; don't pay for things you don't trust. That's just common sense. So say you're using something like Copilot. Be aware of what you're exposing there, not so much to the outside world, because Copilot is going to protect you from a lot of that, but in what you're sharing internally, right? There are a lot of threats that start to get exposed.
If you don't manage your environment properly and open up Copilot to it, you're going to expose all kinds of things that you never thought somebody would find or see, because Copilot is searching all of that and surfacing it in ways you wouldn't even anticipate. So it's important to put some protections on there against that unexpected internal information sharing. So here's an example of what can happen.

"Hey, Daryl. Did you hear the boss's story about Project Phoenix?"
"I haven't heard of that, no."
"You want to put it in Copilot and see if we get anything?"
"Oh, I'd love to know more."
"Yeah. Ooh, Copilot's got some hits. Project Phoenix is a strategic acquisition."
"Acquisition? Wait wait wait. Are we being acquired, or are we acquiring? Copilot, what is the new company org chart?"
"Wait, this acquisition is going to close next week."
"Nice."
"Alright. Sweet. Could be a big payday. Stock?"
"Daryl, wait. We're not in this. You're not in this org chart."
"No, there must be a mistake. I'm not. I'm not. And this is straight from our boss. I'm not in this org chart either."
"Are we... We're not going to get fired, are we?"
"This is it. This is ridiculous. God!"

And now you have an HR problem that didn't have to happen, right? This is why it's important to be careful of what you're sharing internally. Innocent enough, right? They're not doing anything wrong; they're curious. But they don't have any business knowing that, given where they're at in the organization right now. There are reasons that that's managed, right? This is why data governance and protection is so important.
When you enter the world particularly of agentic AI systems, it's going to expose data out there that you had no idea was still there, right? Or it's going to expose that you have permission problems you weren't aware existed: things that have been shared over time that are still open and now maybe should have been locked down.

So when we're talking about data governance and protection, it begins with understanding the data that's out there, right? What do I have that I'm going to surface to Copilot, so that it's able to surface that on to your employees? How do I protect that data? How am I able to secure it in a way that people aren't seeing things maybe they shouldn't within different parts of the organization? And finally, and this isn't always the most obvious one: you want to keep the data you need and remove the data you don't, right? This is important for cleaning this up.
This gets into the quality of the Copilot results you're going to get, the quality of the AI search results you're going to get out of this, right? If I have a bunch of old junk out there, it's going to surface that the same as it surfaces the new, and make inferences off of it: inferences that maybe are outdated, that are based on bad information. So it's important that you're cleaning up your environment and making sure that you only have the information in there that you should.

First thing here: you've got to define a data governance and protection policy. It starts with sensitive data definitions, right? What am I defining as different levels of sensitive data within my organization? With Purview, Microsoft's got some great schemes for this and a great progression for evolving this over time. This is a big, big thing. This takes time to get right.
But you can get started small, start to realize some of the benefits, and evolve that practice over time. You don't have to solve this all at once, but you do need to pay attention to it, right?

The other side of that is having retention policies. This is what governs what's there and what gets removed: what needs to be kept, persistently archived, and what needs to be deleted after a time. I will tell you, this has been something that legal counsel within certain organizations have been driving for some time, because the more data that's in there, the more data is ultimately discoverable, and in any legal challenge that becomes a bigger issue: more costly, and it ultimately exposes more risk to the organization. So it's important, again, to clean up the data and mark the data so you know what's sensitive and what's not, right? Not everything is sensitive.
Some things, while they're corporate data, aren't going to cause any problems if somebody else sees them, right? People know that already. An attendee notes there are other things, compliance records and the like, that you have to retain. Yes, there are. So when you're designing this, particularly the retention policies, you want to make sure you have legal counsel as part of it. Absolutely. A lot of organizations will have their retention policies and their archiving policies, and there's certain data where it depends on your industry. One size does not fit all here. Some companies need to keep data forever, and it's certain types of data, not everything, right? Some need to keep it for seven years. There are different regulatory timelines on all of that. So it's important, but we're not going to drill through all of this today.
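The retention decisions discussed here (keep forever, keep seven years, delete after a period) reduce to evaluating each record's age against a per-category schedule. A minimal sketch of that evaluation, with made-up category names and timelines; your actual schedule must come from legal counsel and your regulators:

```python
from datetime import date

# Hypothetical retention schedule: category -> years to retain (None = keep forever).
RETENTION_YEARS = {
    "tax-record": 7,
    "contract": None,       # retain indefinitely
    "meeting-notes": 2,
}

def disposition(category: str, created: date, today: date) -> str:
    """Return 'retain' or 'delete' for a record of the given category and age."""
    years = RETENTION_YEARS.get(category, 2)  # hypothetical default for unlabeled data
    if years is None:
        return "retain"
    age_years = (today - created).days / 365.25
    return "retain" if age_years < years else "delete"
```

A tool like Purview applies this kind of schedule automatically via retention labels; the sketch just shows why the schedule itself, not the tooling, is the part legal counsel needs to own.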
We can help with that, but we're not going to do all of that today. So it's important to be aware these are things to be considered. The next part of this: if I'm rolling out Copilot, I need to have an AI enablement plan. How am I going to make sure everyone's getting the most value for what you're paying for, right? What am I doing to help drive usage, to help drive the realization of the benefits we're hoping to get from it? Training, ongoing assistance, basic Copilot agent creation. If you decide you're going to start allowing agent creation, write guidance for what's allowed and what's not. And ensure that the data sources made available to Copilot are the ones you want, right? Could be SharePoint: you'd say, hey, we'll let you operate off of these select SharePoint sites, but not these, right? Allow for OneDrive and Teams.
It could be, hey, we're going to allow Copilot to interact with our Dynamics 365 environment. It could be some other external data source. You can expose a lot through there, but you want to be mindful of what you're allowing and what you're not; you don't want to just turn on everything. Finally, continue updating your AI policy as you're making these decisions; make sure they're captured within your AI policy. Education: if you're going to pursue data protection and retention, you need to educate people on it. You can tag all you want, but you need to tell people what's what, because they're going to have to help you with the tagging and tuning over time. Copilot use: there's a host of online training for this, a lot of great stuff out there. Again, something we can help with. But you want to make sure people understand how to use the tool, and get them excited about it, get them using it, but doing it the right way.
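The "these SharePoint sites but not those" decision is effectively an allow-list check in front of whatever grounding sources an assistant may read. A minimal sketch of that idea; the source identifiers and the gatekeeper function are hypothetical, not the actual Copilot configuration surface:

```python
# Hypothetical allow-list of grounding sources an assistant may read from.
ALLOWED_SOURCES = {
    "sharepoint:/sites/hr-policies",
    "sharepoint:/sites/engineering-wiki",
    "onedrive",
    "teams",
}

def permitted_sources(requested: list[str]) -> list[str]:
    """Filter a requested source list down to only the approved ones."""
    return [s for s in requested if s in ALLOWED_SOURCES]
```

The design choice the talk is pushing: default-deny, then explicitly approve sources, rather than turning everything on and trying to claw it back later.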
And this is about enablement at the end of the day, but doing it in a managed fashion. You also may want to consider creating a center of excellence, a community of some kind: people that are champions for AI, that understand it maybe a little better than everybody else, and can kind of help the organization find its way forward. Right. It's a great practice. It can be hard to execute, but it is a great practice when done right. An attendee asks: if you have Copilot integrated and you're just using it to help build your own work, is that 100% safe from other people searching it, the way Daryl did in the video? So, I'm going to qualify that answer a little bit, but with M365 Copilot, the one you paid for, you have your own session.
That information is contained within the session. The other thing M365 Copilot does is run off the permissions within the Microsoft platform. So if you can't see it, if you're not able to find it in search anyway, somebody else's OneDrive, a SharePoint site you don't have permissions for, it will restrict that, for sure. That is core to what it's providing. That's why it's such a great tool: it handles that for you, so you don't have to manage it the same way. Right. Your data also is not shared outside of the organization at all. It's designed so that your Copilot instance, physically... well, none of it's physical, but there are boundaries to what that Copilot instance can do. It's not tied in with anybody else's, so your data can't flow outside of it unless you actually share it with somebody else. Okay. Great.
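The behavior described here, that Copilot only surfaces what your existing permissions already let you find in search, is the security-trimming pattern. A minimal sketch of the concept; the document store, ACLs, and function are invented for illustration, not how the Microsoft Graph implements it:

```python
# Hypothetical ACLs: document id -> set of users allowed to read it.
ACLS = {
    "budget.xlsx": {"alice"},
    "handbook.pdf": {"alice", "daryl", "levi"},
}

def trimmed_search(user: str, query: str, docs: dict[str, str]) -> list[str]:
    """Return matching document ids, but only those the user can already read.

    Trimming happens inside the search, so a user never learns that a
    document they can't open even exists.
    """
    return [
        doc_id
        for doc_id, text in docs.items()
        if query.lower() in text.lower() and user in ACLS.get(doc_id, set())
    ]
```

Because the permission check sits inside retrieval, the same query yields different results for different users, which is exactly why an assistant built on top of it can't leak a document just because it was asked nicely.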
Great question, by the way. So let's talk about protection and monitoring; this speaks to what we were just talking about. One of the first things I would do is review your SharePoint permissions. Anybody with a single M365 Copilot license also has rights to SharePoint Advanced Management. It's a suite of tools Microsoft's provided that allows you to review permissions, review your top 100 SharePoint sites to see what's getting hit the most, and govern what's shared within Copilot. Very, very useful. It's eye-opening for most organizations. Everyone thinks they've got it locked down; you'd be surprised what's happened over time, I'll be honest. So that's the first place to start. Permissions stop new access. What they don't stop is people that have access sharing it in ways they shouldn't.
That's where sensitive data management and Purview come in. That takes it down to the document level. Say I have access to something, and I say, oh, that's great. My friend Levi back there says, hey, can you share that with me? Oh, sure, not thinking about it. Here you go. But he shouldn't have that. I'm just transferring that document over, and nothing stops that. If it's a sensitive document, you want to be able to protect it at the document level, so when he goes to get that file, he can't open it. The same thing is respected within Copilot, which will say: you don't have rights to this; I'm not going to infer things off of that information. So it provides that underlying protection for those documents, even if one ends up someplace it shouldn't be. It's that next level down from controlling permissions. Retention policies, again: cleaning up the data.
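Document-level protection means the access check travels with the document itself, so forwarding the file doesn't grant access. A hedged sketch of that idea; this models the concept only, not Purview's actual label-driven encryption mechanics:

```python
class ProtectedDocument:
    """A document whose read check is enforced by the object itself,
    so possession of the file is not the same as access to its contents."""

    def __init__(self, content: str, allowed_readers: set[str]):
        self._content = content
        self._allowed = allowed_readers

    def open(self, user: str) -> str:
        if user not in self._allowed:
            raise PermissionError(f"{user} has no rights to this document")
        return self._content

# Alice's protected file: handing the object to Levi doesn't help him,
# because the check runs again on every open.
doc = ProtectedDocument("Q3 acquisition plan", {"alice"})
```

This is the distinction the talk draws between site permissions (which stop new access) and document protection (which still holds after a file has been shared onward).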
So you're getting good results: it's doing analysis and inferences on a quality set of data, a clean set of data, not data that's old, outdated, and unimportant. Make sure you refine your access policies for Copilot with Entra and Intune. You want to make sure you're driving MFA in the right way, because if somebody is accessing Copilot, it's just like drilling right into SharePoint from wherever they are. You want to make sure you're protecting Copilot like you would any other way of accessing your corporate information. And then I would start monitoring the Copilot prompts. Purview again has a suite of tools here; Communication Compliance is a great one. I can start monitoring the prompts people are putting through and start realizing, hey, we're not meeting this need, or, we need to put a stop to this, right? You gain a lot of information from that. There's also insider risk auditing, Data Security Posture Management, and then eDiscovery.
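Prompt monitoring of the kind Communication Compliance provides amounts to screening each prompt against policy patterns and routing hits to reviewers. A minimal sketch; the categories and patterns are illustrative, not Microsoft's policy templates:

```python
import re

# Illustrative policy patterns: category -> regex that flags a prompt for review.
POLICY_PATTERNS = {
    "credentials": re.compile(r"password|api key", re.I),
    "financial": re.compile(r"earnings|forecast", re.I),
}

def flag_prompt(prompt: str) -> list[str]:
    """Return the policy categories a prompt trips; empty means no review needed."""
    return [cat for cat, rx in POLICY_PATTERNS.items() if rx.search(prompt)]
```

The flagged output serves both purposes Joe names: spotting misuse you need to stop, and spotting legitimate needs the current tooling isn't meeting.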
eDiscovery allows you to search all of this too. There are some things you're going to find in Copilot that you have to put on hold, right? That's now an information exchange that has to be captured there too. So it's important to have these tools in place so you can actually capture that. Testing: now, since we're actually using AI internally in a managed fashion, you need to be testing this periodically, running test prompts and running M365 search. Understand that the M365 Graph and search data is the foundation underneath M365 Copilot. If I can search for it, I can prompt for it; it's all the same, right? So I think the interesting part there is that you can actually start bringing in other data sources just by incorporating them into the Microsoft 365 search fabric. I can start bringing in on-premises data sources and start surfacing insights off of them because of this fact.
So it's an important aspect of what's happening behind the scenes. Now, as we go past productized AI and start to develop agents, building our own, a shared responsibility model starts coming into play. Just using Copilot, I'm managing for usage; that's what we've been talking about so far, right? Once I start building, I have further responsibilities. I'm now offering a product to my own internal customers, possibly even external customers, so I need to start managing things differently, including managing the underlying model. Right. You're not building Copilot; you're building some agents on top of it. But if I'm taking that further and developing my own models on top of there, I have a responsibility to manage that the right way. So let's take the last, final stage: starting to build AI offerings. Here, again, you have to assume responsibility for what you're building.
You also need to protect your organization against unexpected or misleading information sharing. This is probably a little different at this point, because I know I have a responsibility to provide a trusted tool to anybody that's using it. I need to be managing that far differently now than I did with Copilot, where a lot of that's handled within the Copilot fabric. If I'm going to do this myself, I need to make sure I'm providing quality results in order for the tool to be useful and to realize, again, the benefits I'm ultimately expecting from it. Let's see what these guys are up to now. Hey, Darryl. Yeah, this is the new HR agent they just released. Yeah, he spent a lot of time working on that one. Yeah, it's really cool, a minor unveil. Here's the deal: basically, this answers different questions about policies that we have at the company, so I don't have to read a really big manual. Oh, sweet.
1330 00:38:40,217 –> 00:38:42,586 So let me, I always forget 1331 00:38:42,586 –> 00:38:44,455 how many vacation days. 1332 00:38:44,455 –> 00:38:46,524 Yeah. You know. All right. 1333 00:38:46,524 –> 00:38:48,059 Okay, I might as well just ask 1334 00:38:48,059 –> 00:38:49,260 the new agent there. 1335 00:38:49,794 –> 00:38:52,330 How many GTO days 1336 00:38:52,330 –> 00:38:54,632 do I get in a given year? 1337 00:38:54,865 –> 00:38:56,801 And this is, like, 100% correct, right? 1338 00:38:56,834 –> 00:38:57,535 Yeah. Yeah. 1339 00:38:57,535 –> 00:38:58,336 Everything comes about. 1340 00:38:58,336 –> 00:38:59,603 Data is all sweet. 1341 00:39:00,304 –> 00:39:01,505 All right. 1342 00:39:01,505 –> 00:39:02,773 Oh, wow. 1343 00:39:03,174 –> 00:39:06,110 You get 80 days of PTO, 80 days off. 1344 00:39:06,143 –> 00:39:07,878 Are you kidding me with this one? 1345 00:39:07,945 –> 00:39:09,580 Darryl. That’s awesome. Wow. 1346 00:39:09,613 –> 00:39:12,049 All right, I need to plan a trip right now. 1347 00:39:12,483 –> 00:39:12,883 Where? 1348 00:39:12,883 –> 00:39:14,185 I’m ready to use this new agent. 1349 00:39:14,251 –> 00:39:15,953 Why does this take, like, a month off? 1350 00:39:16,354 –> 00:39:17,688 I love this agent. All right? 1351 00:39:17,722 –> 00:39:19,457 I mean, I mean, I reached out to my girlfriend, 1352 00:39:19,457 –> 00:39:21,125 actually, she could be so excited about this. 1353 00:39:21,525 –> 00:39:23,094 You know, I’m putting my PTO on right now. 1354 00:39:25,229 –> 00:39:26,630 you said, oh, I love this agent. 1355 00:39:26,630 –> 00:39:28,933 This is great, but it’s not giving you good information. 1356 00:39:28,933 –> 00:39:29,233 Right. 1357 00:39:29,233 –> 00:39:30,468 And they’re about to make some bad 1358 00:39:30,468 –> 00:39:32,303 decisions based on on 1359 00:39:32,303 –> 00:39:33,504 bad information. 
Hence the responsibility as you're building these agents; the company built this agent, in this case, in a theoretical example for these gentlemen. So as you're diving into building your own agents, it's important, first of all, to ask: do you have a broader AI strategy? You're now taking a further step into what you're doing with AI. You need to have a plan. You need to have a vision for yourself. You need to establish some priorities. You need a model for how you evaluate different AI use cases; Brian highlighted a lot of this this morning, right? How do I manage a portfolio of agents? What's that going to look like? How am I going to make decisions? Be able to define approved technology and data, or how you're going to approve technology and data over time, and have a governance plan and requirements. And this is probably the part that gets forgotten sometimes.
Here it's: what do I need to do within the organization to drive this and be able to perform it the right way, right? What are some of the organizational changes that have to occur so that I can both develop this, use it, and then manage it? This isn't just a productized tool now; I'm creating something, and that involves a little extra thought process. The other thing you should do here is plan an AI framework policy. This can be challenging. This is an area where the tooling is evolving as we speak, right? Right now there are a lot of good things coming into this space, because the need for it has been recognized, but the major vendors are still catching up to everything that's going to have to happen here. There are a lot of tools out there. I expect that over time that's going to consolidate, and you're going to have a much simpler framework for this.
But let's talk about what needs to happen in there today. You need to manage lifecycle management and security, along with the organizational policies that need to be factored in. Cost management: AI development can get really costly really fast, and you need to keep an eye on that, right? Make sure you're defining what models and tooling everyone should use, again driving the security and operational aspects of it, so you're doing this in a smart way. This is just about driving responsible use: not trying to stop anything, just trying to protect it so that you can keep going down the road and you don't end up in the ditch later. Again, continue to update the AI policy. Education and enablement here are really focused on, right, how do I create this framework for how we're publishing agents? I'm creating an AI product out there. Right.
Some of the great tools out there: Microsoft has a host of these, and it depends on the tooling you're using. I think you're going to see Agent 365 slowly take over this space more and more, but right now you have the Copilot Control System; if I'm doing things with Copilot Studio, I have kind of a control plane there. AI Foundry: if I'm doing things there, maybe using some Azure services for AI, I can use the control plane there. Agent 365 is emerging as kind of that central framework for all of this, but it's maturing as we speak; it was just announced a few months ago at Ignite, so it's in development. Right. It's on its way. Organizationally, you want to consider creating an AI product group now too, right? I need to have somebody that owns these things, that's going to be responsible for what they're doing and for the ongoing management, and that's able to lead and manage the development efforts.
Agent 365: like we said, this is something we're really excited about. I think it's really going to help a lot of organizations get past the, okay, we'd like to do this, but we're not quite sure how to manage it. I think it's going to help a lot of organizations do that very quickly. For now, the way to do this: the first thing you need to do is assign an identity to each agent. Microsoft Entra agent identities allow you to do this. I can then monitor what that agent is doing, who's able to interface with it, and who's not. It allows me that whole host of controls. That identity plane is so important, and throughout the cloud services within Microsoft it's really going to help out here too, right? You get ongoing visibility into who's interacting with this, what this thing's doing, the data it's accessing; all kinds of great information there.
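Giving each agent its own identity, as Entra agent identities do, means every action can be attributed and audited per agent rather than lumped under a shared service account. A minimal sketch of that idea; the registry shape and audit format here are invented for illustration:

```python
import uuid
from dataclasses import dataclass, field

@dataclass
class AgentIdentity:
    """A distinct, auditable identity for one agent."""
    name: str
    owner: str
    agent_id: str = field(default_factory=lambda: str(uuid.uuid4()))

audit_log: list[tuple[str, str]] = []

def record_action(agent: AgentIdentity, action: str) -> None:
    """Attribute every action to the agent's own identity, not a shared account."""
    audit_log.append((agent.agent_id, action))

# Each agent gets its own identity and a named owner accountable for it.
hr_agent = AgentIdentity(name="hr-policy-agent", owner="people-ops")
record_action(hr_agent, "read:hr-handbook")
record_action(hr_agent, "answer:pto-question")
```

With a per-agent identity in place, the visibility Joe describes (who interacts with the agent, what data it touches) becomes a query over the audit trail instead of guesswork.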
You also want to start here: you're probably going to need to manage your structured data estate more. Fabric is certainly core to this; you heard earlier about what Fabric can do for you. Purview also has aspects here that you need to put into place. Purview Data Catalog can be great for helping to surface the data sources out there that are trusted, particularly if you're applying endorsements there to say, hey, use this data; we don't trust that one so much. It'll speed up development drastically.

The other thing you want to make sure of is that you've got the right permissions on that data. If I create an agent that says, hey, here's all our earnings data that anybody in the organization can use, we may be surfacing information we shouldn't be.
There may be information I don't want everybody within the organization seeing. If I've got the right permissions on that data, then even if I expose it through an agent, it'll actually stop itself from being surfaced, because the permissions and the identities will control for that.

You want to apply a management control system. Ultimately I think this is going to be Agent 365. Right now you're probably using some amount of the Copilot Control System or the Foundry control plane to put ongoing management in place for the agents being developed, being able to keep tabs on them. Again, identity is very important for that.

You want to be defining access policies for agents, right? So if I made that agent available, where can people access it from? Remember, as people are interfacing with it, they're extracting data from it. Where do I want that data to be able to go?
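The permission point above, that correctly permissioned data "stops itself from being surfaced," can be illustrated with a toy sketch. This is not a real Microsoft API; the document names, group names, and helper are invented. The idea is simply that an agent's retrieval step should be trimmed to what the asking user is allowed to read:

```python
# Illustrative sketch (hypothetical data, not a real API): the agent only
# returns documents whose access list overlaps the requesting user's groups,
# so data the user can't read never reaches the agent's answer.

DOCS = [
    {"name": "earnings-q3.xlsx", "allowed_groups": {"finance", "executives"}},
    {"name": "handbook.pdf", "allowed_groups": {"all-employees"}},
]

USER_GROUPS = {
    "alice": {"finance", "all-employees"},
    "bob": {"all-employees"},
}

def permission_trimmed_search(user, docs=DOCS):
    """Return only the documents whose ACL intersects the user's groups."""
    groups = USER_GROUPS.get(user, set())
    return [d["name"] for d in docs if d["allowed_groups"] & groups]

print(permission_trimmed_search("alice"))  # ['earnings-q3.xlsx', 'handbook.pdf']
print(permission_trimmed_search("bob"))    # ['handbook.pdf']
```

In the Microsoft stack this trimming comes from the underlying permissions on the data sources themselves, which is why getting those permissions right matters before wiring data into an agent.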
If I have somebody that says, hey, I can log into this, but they're on their home PC, I may not want them to be able to get that data. Or I may need to control that on different devices to say, hey, you can't copy information out of this to share with anybody else. It's the host of controls that you want to have in place to protect yourself.

Managing agent compliance. Here we're starting to say, okay, the agent can do certain things; what are some things we should and shouldn't be surfacing, right? HIPAA data, different regulatory data: making sure that's controlled and managed within the underlying framework, so you can protect people from making mistakes there. There's a lot of tooling there to help you with that.
Finally, you need to be able to secure agents: make sure you're not allowing for unnecessary cyber threats, make sure you're managing the broader security posture, and make sure you're not exposing yourself in ways you don't intend. So it's a combination of security posture management through Defender for Cloud, right? We've got data security posture management within Purview and within Defender for Cloud. There's AI security posture management to help control the underlying security threats, guarding against prompt injections and all kinds of things. It will protect your agents from that.

You can use the Foundry security baseline and red teaming. So there's a host of other tools there too that can help with determining: do I have threats here that maybe I'm not aware of?
Testing and development. You certainly want to be monitoring the audit logs. You need to establish some kind of incident management and support process around the tooling you're providing, right? You're providing a product; you need to be able to support it, whether that's inside the organization or outside. How am I going to manage that?

Establish a DevOps lifecycle around this, right? This isn't going to be an "all right, it's all done" one-time thing. This is going to evolve over time. So how am I putting some lifecycle management on this, so I'm continuing to update, make those changes, and evolve this as I go?

As part of that, ongoing testing and monitoring is important. Having people just go at it, testing tools, all kinds of things out there. Foundry and Copilot have a host of tools built in to help you with this. There's also the Microsoft Responsible AI Toolkit.
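The ongoing-testing idea above can be sketched as a minimal regression-style evaluation loop. The `agent` function and test cases here are stand-ins invented for the example; Foundry and Copilot ship richer built-in evaluation tooling. The lifecycle point is the same either way: rerun a fixed test set on every release and fail the pipeline if quality regresses.

```python
# Minimal sketch of a regression-style eval loop for an agent.
# `agent` is a hypothetical stand-in for a real agent call.

def agent(question: str) -> str:
    """Stand-in agent: canned answers keyed by question."""
    answers = {
        "vacation days": "Full-time employees accrue 20 vacation days per year.",
        "expense policy": "Submit expenses within 30 days with receipts.",
    }
    return answers.get(question, "I don't know.")

TEST_CASES = [
    {"question": "vacation days", "must_contain": "20"},
    {"question": "expense policy", "must_contain": "30 days"},
]

def run_eval(cases):
    """Return the fraction of cases whose answer contains the expected phrase."""
    passed = sum(1 for c in cases if c["must_contain"] in agent(c["question"]))
    return passed / len(cases)

score = run_eval(TEST_CASES)
print(f"pass rate: {score:.0%}")  # pass rate: 100%
```

Wiring a check like this into CI, with a threshold that blocks deployment, is one simple way to give an agent the DevOps lifecycle the session describes.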
There's a host of tools inside of there that you can leverage toward this, something Microsoft uses themselves for their own internal AI development and makes available to you. Fairlearn helps with controlling for biases within the AI models and products you're developing, and InterpretML will help when you're dealing more with machine learning models and doing deeper data analysis, making sure that's done the right way.

All right. This is the spectrum of what we've talked about today. The purpose of this has really been to make sure you're anticipating what you're going to be dealing with next. It's not about opposing the coming waves of AI development; that's going to happen. It's how do you enable that, but enable it in a responsible way through governance.

So this is kind of a summary of everything we've talked about across the different categories. Hopefully this was beneficial to you.