
The future of computational linguistics

2023/5/5

The Future of Everything


Shownotes

Our guest, Christopher Manning, is a computational linguist. He builds computer models that understand and generate language using math. Words are the key component of human intelligence, he says, which is why generative AI, like ChatGPT, has caused such a stir. "We used to hope a model might produce one coherent sentence, and suddenly ChatGPT is composing five-paragraph stories and doing mathematical proofs in rhyming verse," Manning tells host Russ Altman in this episode of Stanford Engineering's The Future of Everything podcast.

Connect With Us:

Episode Transcripts >>> The Future of Everything Website

Connect with Russ >>> Threads / Bluesky / Mastodon

Connect with School of Engineering >>> Twitter/X / Instagram / LinkedIn / Facebook