Introducing generative AI into your organization is a multi-step process that, if implemented correctly, can have a significant impact on efficiency and the bottom line. Monica Livingston leads the AI Center of Excellence at Intel. In this video, she outlines the initial steps required to assess opportunity, gather resources, and deploy infrastructure when building a generative AI strategy.
Transcript
00:00 Some of the things one should consider when evaluating an AI strategy: first is the
00:07 cost versus the return on investment.
00:09 What are you actually hoping to achieve with AI?
00:13 There has to be a business need.
00:14 There has to be a business outcome.
00:16 It's not just that we're doing it better.
00:19 There are brand-new types of applications that we've never been able to build before.
00:28 I'm Monica Livingston and I lead the AI Center of Excellence at Intel.
00:32 So a big question in AI is: do you build from the ground up?
00:36 Do you have the skill set, the data, and the need to build an application from the
00:42 ground up?
00:43 Or do you purchase off the shelf, or even customize an off-the-shelf application?
00:48 And the cost of that model or of that application needs to be such that you have a return on
00:54 investment.
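As a rough illustration of that cost-versus-ROI framing, here is a minimal back-of-the-envelope sketch in Python; the licensing cost, hours saved, and hourly rate are hypothetical placeholders, not figures from the video.

```python
# Back-of-the-envelope ROI check for an off-the-shelf AI application.
# All numbers below are hypothetical placeholders for illustration.

annual_license_cost = 120_000        # yearly cost of the application or model
annual_hours_saved = 4_000           # estimated staff hours saved per year
fully_loaded_hourly_rate = 55        # average cost per staff hour

annual_benefit = annual_hours_saved * fully_loaded_hourly_rate
roi = (annual_benefit - annual_license_cost) / annual_license_cost

print(f"Annual benefit: ${annual_benefit:,.0f}")
print(f"ROI: {roi:.0%}")  # a positive ROI means the business outcome covers the cost
```

If the resulting ROI is negative or marginal, the build-versus-buy decision and the scope of the application are worth revisiting before going further.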
00:56 The other part of getting started with AI altogether is understanding your data.
01:02 What data do you have?
01:04 Because in order to train AI models, you need to have your own data sets, have access to
01:08 data sets, or license data sets.
01:11 But one way or another, you have to get data that is usable.
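To make "data that is usable" a little more concrete, here is a minimal data-audit sketch assuming the pandas library; the file name, columns, and checks are illustrative, not a prescribed workflow.

```python
import pandas as pd

# Quick usability audit of a candidate training data set.
# "support_tickets.csv" is a hypothetical example file.
df = pd.read_csv("support_tickets.csv")

print(f"Rows: {len(df)}, Columns: {len(df.columns)}")
print("Missing values per column:")
print(df.isna().sum())
print(f"Duplicate rows: {df.duplicated().sum()}")

# A data set riddled with gaps or duplicates will need cleaning, augmenting,
# or licensing a better source before it is worth training a model on.
```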
01:17 When you get to the point where you understand what your workload is, whether it's built
01:23 in-house or developed externally, you know what workload you want to run.
01:28 Then you start looking at infrastructure: okay, what do I run this on?
01:32 For these smaller models, for running inference, we recommend the fourth generation Intel Xeon
01:37 Scalable Processor.
01:39 Most data centers have Xeon processors in their install base, so they're already there,
01:44 as well as our Core processors, which are for client devices like notebooks and desktops.
01:49 So our role in making AI accessible is to add AI functionality to these product lines.
01:56 So being able to run your AI applications on general purpose infrastructure is incredibly
02:03 important because then your cost for additional infrastructure is reduced.
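As one way to picture inference on general-purpose infrastructure, here is a minimal sketch assuming the Hugging Face transformers library running a small model on a CPU; the model name is illustrative and is not one named in the video.

```python
from transformers import pipeline

# Run a small text-generation model on a general-purpose CPU.
# "distilgpt2" is an illustrative small model; device=-1 selects the CPU.
generator = pipeline("text-generation", model="distilgpt2", device=-1)

result = generator(
    "Summarize the benefits of running inference on existing servers:",
    max_new_tokens=40,
)
print(result[0]["generated_text"])
```

CPU-optimized runtimes such as OpenVINO or the Intel Extension for PyTorch can speed this path up further, but the sketch keeps to the plain CPU backend for simplicity.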
02:09 And then you look at responsible AI.
02:12 Actually, responsible AI is a huge and growing area.
02:15 You should have a vendor checklist for responsible AI, specifically to be able to vet that your
02:21 vendor is using AI responsibly and that they have processes in place to correct any
02:26 issues that might come up.
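A vendor checklist can start as nothing more than a structured set of questions you score each vendor against; the items below are illustrative examples, not a checklist given in the video.

```python
# Illustrative responsible-AI vendor checklist; items are examples only,
# not an official or exhaustive list.
vendor_checklist = {
    "Documents training data sources and licensing": None,
    "Publishes model limitations and intended-use guidance": None,
    "Has a process for reporting and correcting harmful outputs": None,
    "Supports audit logging of model behavior": None,
    "Commits to notifying customers of material model changes": None,
}

# Record yes/no answers gathered during vendor review (hypothetical values).
vendor_checklist["Documents training data sources and licensing"] = True
vendor_checklist["Has a process for reporting and correcting harmful outputs"] = True

confirmed = sum(1 for answer in vendor_checklist.values() if answer)
print(f"Checklist items confirmed: {confirmed}/{len(vendor_checklist)}")
```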
02:29 So those are a couple of things that someone should consider when they're building their
02:33 AI models and really infusing AI into their applications, rather than just saying, "I'm
02:39 building AI and I'm building it from the ground up for AI."
02:43 So you have a business outcome, you have an application, and the question is, "Should
02:46 I start using some form of AI in that application?"