Richard Socher, ex-Chief Scientist at Salesforce, joins us to talk about The AI Economist, language models for protein generation, and the biggest challenge in making ML work in the real world.
Richard Socher was the Chief Scientist (EVP) at Salesforce, where he led teams working on fundamental research (einstein.ai/), applied research, product incubation, CRM search, customer service automation, and a cross-product AI platform for unstructured and structured data. Previously, he was an adjunct professor in Stanford's computer science department and the founder and CEO/CTO of MetaMind (www.metamind.io/), which was acquired by Salesforce in 2016. In 2014, he received his PhD from the CS Department at Stanford. He likes paramotoring and water adventures, traveling, and photography. More info:
Research:
Google Scholar Link(https://scholar.google.com/citations?user=FaOcyfMAAAAJ&hl=en)
The AI Economist: Improving Equality and Productivity with AI-Driven Tax Policies Arxiv link(https://arxiv.org/abs/2004.13332), blog(https://blog.einstein.ai/the-ai-economist/), short video(https://www.youtube.com/watch?v=4iQUcGyQhdA), Q&A(https://salesforce.com/company/news-press/stories/2020/4/salesforce-ai-economist/), Press: VentureBeat(https://venturebeat.com/2020/04/29/salesforces-ai-economist-taps-reinforcement-learning-to-generate-optimal-tax-policies/), TechCrunch(https://techcrunch.com/2020/04/29/salesforce-researchers-are-working-on-an-ai-economist-for-more-equitable-tax-policy/)
ProGen: Language Modeling for Protein Generation: bioRxiv link(https://www.biorxiv.org/content/10.1101/2020.03.07.982272v2), blog
Dye-sensitized solar cells under ambient light powering machine learning: towards autonomous smart sensors for the internet of things (Chemical Science 2020, Issue 11). Paper link(https://pubs.rsc.org/en/content/articlelanding/2020/sc/c9sc06145b#!divAbstract)
CTRL: A Conditional Transformer Language Model for Controllable Generation: Arxiv link(https://arxiv.org/abs/1909.05858), code pre-trained and fine-tuning(https://github.com/salesforce/ctrl), blog(https://blog.einstein.ai/introducing-a-conditional-transformer-language-model-for-controllable-generation/)
Genie: a generator of natural language semantic parsers for virtual assistant commands: PLDI 2019 pdf link(https://almond-static.stanford.edu/papers/genie-pldi19.pdf), https://almond.stanford.edu
Topics Covered:
0:00 Intro
0:42 The AI Economist
7:08 The objective function and Gini coefficient
12:13 On growing up in Eastern Germany and cultural differences
15:02 Language models for protein generation (ProGen)
27:53 CTRL: conditional transformer language model for controllable generation
37:52 Businesses vs. academia
40:00 What ML applications are important to Salesforce
44:57 An underrated aspect of machine learning
48:13 Biggest challenge in making ML work in the real world
Visit our podcast homepage for transcripts and more episodes! www.wandb.com/podcast
Get our podcast on Soundcloud, Apple, Spotify, and Google! Soundcloud: https://bit.ly/2YnGjIq Apple Podcasts: https://bit.ly/2WdrUvI Spotify: https://bit.ly/2SqtadF Google: http://tiny.cc/GD_Google
Weights and Biases makes developer tools for deep learning.
Join our bi-weekly virtual salon and listen to industry leaders and researchers in machine learning share their research: http://tiny.cc/wb-salon
Join our community of ML practitioners: http://bit.ly/wb-slack
Our gallery features curated machine learning reports by ML researchers. https://app.wandb.ai/gallery