Connor Tann is a physicist and senior data scientist at a multinational energy company, where he co-founded and leads a data science team. He holds a first-class degree in experimental and theoretical physics from Cambridge University and a master's in particle astrophysics. He specializes in the application of machine learning models and Bayesian methods. Today we explore the history, practical utility, and unique capabilities of Bayesian methods. We also discuss the computational difficulties inherent in Bayesian methods, along with modern methods for approximate solutions such as Markov Chain Monte Carlo. Finally, we discuss how Bayesian optimization in the context of AutoML may one day put data scientists like Connor out of work.
Panel: Dr. Keith Duggar, Alex Stenlake, Dr. Tim Scarfe
00:00:00 Duggar's philosophical ramblings on Bayesianism
00:05:10 Introduction
00:07:30 Small datasets and prior scientific knowledge
00:10:37 Bayesian methods are probability theory
00:14:00 Bayesian methods demand hard computations
00:15:46 Uncertainty can matter more than estimators
00:19:29 Updating or combining knowledge is a key feature
00:25:39 Frequency or Reasonable Expectation as the Primary Concept
00:30:02 Gambling and coin flips
00:37:32 Rev. Thomas Bayes's pool table
00:40:37 Ignorance priors are beautiful yet hard
00:43:49 Connections between common distributions
00:49:13 A curious Universe, Benford's Law
00:55:17 Choosing priors, a tale of two factories
01:02:19 Integration, the computational Achilles heel
01:10:24 Frequentist methods as a first approximation
01:13:13 Driven to Bayesian methods by small sample size
01:18:46 Bayesian optimization with AutoML, a job killer?
01:25:28 Different approaches to hyper-parameter optimization
01:30:18 Advice for aspiring Bayesians
01:33:59 Who would Connor interview next?
01:35:25 Bayesian social context in the ML community
Connor Tann: https://www.linkedin.com/in/connor-tann-a92906a1/