
695: NLP with Transformers, feat. Hugging Face's Lewis Tunstall

2023/7/11

Super Data Science: ML & AI Podcast with Jon Krohn


Shownotes

What are transformers in AI, and how do they help developers run LLMs efficiently and accurately? This is a key question in this week's episode, where Hugging Face ML Engineer Lewis Tunstall sits down with host Jon Krohn to discuss encoders and decoders, and the importance of continuing to foster democratic environments like GitHub for creating open-source models.

This episode is brought to you by the AWS Insiders Podcast, by WithFeeling.ai, the company bringing humanity into AI, and by Modelbit, for deploying models in seconds. Interested in sponsoring a SuperDataScience Podcast episode? Visit JonKrohn.com/podcast for sponsorship information.

In this episode you will learn:
• What a transformer is, and why it is so important for NLP [04:34]
• Different types of transformers and how they vary [11:39]
• Why it's necessary to know how a transformer works [31:52]
• Hugging Face's role in the application of transformers [57:10]
• Lewis Tunstall's experience of working at Hugging Face [1:02:08]
• How and where to start with Hugging Face libraries [1:18:27]
• The necessity of democratizing ML models in the future [1:25:25]

Additional materials: www.superdatascience.com/695
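The encoder/decoder distinction discussed in the episode largely comes down to attention masking: an encoder lets every token attend to every other token, while a decoder applies a causal mask so each token sees only earlier positions. As a minimal illustrative sketch (not code from the episode; function and variable names are our own), using NumPy:

```python
import numpy as np

def attention_weights(scores, causal=False):
    """Softmax over attention scores; a causal mask hides future tokens."""
    scores = scores.astype(float).copy()
    if causal:
        # Decoder-style: position i may only attend to positions <= i.
        n = scores.shape[0]
        future = np.triu(np.ones((n, n), dtype=bool), k=1)
        scores[future] = -np.inf
    e = np.exp(scores - scores.max(axis=-1, keepdims=True))
    return e / e.sum(axis=-1, keepdims=True)

scores = np.zeros((3, 3))  # uniform scores for a 3-token sequence
print(attention_weights(scores))               # encoder: each token attends to all 3
print(attention_weights(scores, causal=True))  # decoder: token 0 attends only to itself
```

With uniform scores, the encoder version spreads each row's weight evenly (1/3 each), while the causal version gives the first token all of its weight on itself, which is why decoder-style models like GPT can generate text left to right.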