We’ve heard so much about the value and capabilities of generative AI over the past year, and we’ve all become accustomed to the chat interfaces of our preferred models. One of the main concerns many of us have had is privacy: is OpenAI keeping the data and information I give to ChatGPT secure? One touted solution is running LLMs locally on your own machine, but the hardware costs involved have put that out of reach for many of us. That might now be starting to change.
Nuri Canyaka is VP of AI Marketing at Intel. Before Intel, Nuri spent 16 years at Microsoft, starting out as a Technical Evangelist and leaving the organization as Senior Director of Product Marketing. He ran the GTM team that helped drive adoption of GPT in Microsoft Azure products.
La Tiffaney Santucci is Intel’s AI Marketing Director, specializing in their Edge and Client products. La Tiffaney has spent over a decade at Intel, focusing on partnerships with Dell, Google, Amazon, and Microsoft.
In the episode, Richie, Nuri, and La Tiffaney explore AI’s impact on marketing analytics, the adoption of AI in the enterprise, how AI is being integrated into existing products, the workflow for implementing AI into business processes and the challenges that come with it, the importance of edge AI for instant decision-making in use cases like self-driving cars, the emergence of AI engineering as a distinct field of work, the democratization of AI, what the state of AGI might look like in the near future, and much more.
About the AI and the Modern Data Stack DataFramed Series
This week we’re releasing four episodes focused on how AI is changing the modern data stack and the analytics profession at large. The modern data stack is often an ambiguous and all-encompassing term, so we intentionally wanted to cover the impact of AI on the modern data stack from different angles. Here’s what you can expect:
Links Mentioned in the Show:
New to DataCamp?