Fei-Fei Li, a self-described shy person, prioritized the scientific aspects of AI and questioned the relevance of her personal experiences to the broader topic. She was persuaded by John Etchemendy, her co-director at the Stanford Human-Centered AI Institute, who argued that her unique perspective as a young woman, immigrant, and AI scientist would be valuable and inspiring.
Fei-Fei Li's mother, recognizing the limitations and lack of freedom in China, desired a better future for her daughter. Fei-Fei Li's intellectual curiosity and non-conformity, coupled with her mother's own unfulfilled aspirations, motivated the family's decision to immigrate.
Math transcended the language barrier, offering a subject where her lack of English fluency was less of an obstacle. However, even as a child in China, Fei-Fei Li had a strong affinity for math and physics, viewing math as a tool and finding greater fascination in the audacious thinking required by physics.
Mr. Sabella, recognizing Fei-Fei Li's potential beyond her ESL status, connected with her through a shared love of literature and science fiction. He provided not only academic support, including extra calculus lessons during lunch, but also emotional support and encouragement, becoming a crucial mentor in her life. He even lent her family a significant sum of money to start a dry-cleaning business, demonstrating extraordinary generosity.
Following the AlphaGo success and the rise of AI hype, Fei-Fei Li chose Google for two primary reasons: to gain experience with cutting-edge industrial AI and to understand AI's real-world impact. Her role at Google Cloud provided insights into AI's application across diverse sectors, confirming her belief in its transformative potential.
Amidst growing concerns about AI bias, data privacy, and the societal impact of technologies like facial recognition and self-driving cars, Fei-Fei Li felt compelled to address the human element of AI. She emphasized the importance of aligning technological development with human values and considering the broader societal implications of AI.
Fei-Fei Li advocates for a framework that prioritizes human dignity, well-being, and societal values in the creation, deployment, and governance of AI. This involves interdisciplinary research, ethical review processes, data fairness considerations, and thoughtful regulation to ensure that AI benefits humanity.
ImageNet is a large-scale dataset of labeled images created by Fei-Fei Li and her team to address the challenge of object recognition in computer vision. Inspired by WordNet and the need for large-scale data, ImageNet revolutionized the field of AI by providing a comprehensive resource for training and evaluating object recognition algorithms. Its creation involved overcoming numerous technical and logistical hurdles, including the use of Amazon Mechanical Turk for crowdsourcing image labeling.
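The crowdsourced labeling mentioned above relied on collecting several independent annotations per image and keeping only labels the annotators agreed on. A minimal sketch of that consensus idea in Python (the function name, threshold, and vote data are illustrative assumptions, not ImageNet's actual pipeline):

```python
from collections import Counter

def aggregate_labels(votes_by_image, min_agreement=0.5):
    """Majority-vote consensus over crowdsourced labels.

    votes_by_image maps an image id to the list of labels that
    independent annotators assigned to it. An image only receives
    a consensus label when the winning label clears min_agreement.
    """
    consensus = {}
    for image_id, votes in votes_by_image.items():
        if not votes:
            continue  # no annotations yet, skip
        label, count = Counter(votes).most_common(1)[0]
        if count / len(votes) >= min_agreement:
            consensus[image_id] = label
    return consensus

# Hypothetical annotator votes for three images.
votes = {
    "img_001": ["cat", "cat", "dog"],
    "img_002": ["tiger", "lion", "leopard"],  # no majority -> dropped
    "img_003": ["balloon", "balloon", "balloon"],
}
print(aggregate_labels(votes))
```

Real crowdsourcing pipelines typically layer more safeguards on top of simple voting, such as gold-standard test images and per-worker quality weighting.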
While acknowledging the potential risks of AI, Fei-Fei Li emphasizes its broad applicability and potential for positive impact across various fields like medicine, education, and environmental science. She argues that AI, like electricity, is an enabling technology with diverse beneficial applications, contrasting it with nuclear technology, which has more limited and potentially destructive uses.
Welcome, welcome, welcome to Armchair Expert, Experts on Expert. I'm Dan Rather and I'm joined by Modest Mouse. Hi. Hello. I've been talking about this book quite a bit over the last six months, The Worlds I See: Curiosity, Exploration, and Discovery at the Dawn of AI.
So,
We've had a lot of AI, but I'll say what makes this episode so special is that Dr. Fei-Fei Li's personal story is so compelling. I mean, the fact that people can land in this country, not speaking English, deep into school, and fucking pick it all up and then master all these fields. Become better than everyone. Oh my God. It's so impressive. Yeah. Oh, I loved her. Please enjoy Dr. Fei-Fei Li.
We are presented by Amazon Prime. It's more than just fast, free shipping. Whatever you're into, it's on Prime. So keep listening, keep watching, keep on keeping on to hear more about why we love Prime during the fact check. ♪ He's an armchair expert ♪ ♪ He's an armchair expert ♪ ♪ He's an armchair expert ♪
This is a very, very sinking couch. I know. We're off to a terrible start. You want to swap? You want to sit in this chair? You can. I feel like I'm going to be in therapy very soon. Yeah, well, that is the goal. We do want people to feel very relaxed. Too comfortable, really. Yeah, if the spirit moves you to lie down supine, you're invited to do so. I might. I woke up so early. Yeah.
What time did you wake up? Probably 5:40 something. That's early. Yeah. What time do you normally rise? Not that much later. My alarm is 6:20. Okay, mine's 6:40. Oh, okay. I'm aspiring. But you know what's funny? Today it was 6:20. You have kids. I have kids. How old are they? Nine and 11. Mine are eight and 12. You have eight and 12? Yeah, so we're in the same stage of life. Boys or girls? 12 is a boy, eight is a girl. How about you? Girl, girl. Oh.
Yes, I'm so lucky. Although 11 about to be 12, I'm starting to get an inkling of what's coming my way. Yeah, yeah. In a house with three ladies. In fact, yesterday was a very emotional day. I'm barely hanging on. What happened? I don't know what's going on with my three ladies, but all of them are in some kind of hormonal turmoil. And every variety, which is fun. Have we started?
Yeah, yeah, yeah, yeah, yeah. We're always recording. We call it ABR. Always be recording. Always be recording. Okay, so you have 12 and 8. And are they close? They're very close. He's a good big brother. He totally is. Yeah. He's sweet. Silvio's your husband, yeah? Yes.
And does one of them have Silvio's personality and one have yours? Well, I can tell you one of them has Silvio's hair. Okay, which is... A lot of curls. Curls. World of curls. Yeah, world of curls. Well, you're here because I read your book maybe two months ago. I was having dinner with Ashton Kutcher. Do you know who that actor is? Yes, he just texted me last night. He's like, you're seeing my friend.
Oh, good. Yeah, so we were at dinner and we were just chatting about people we thought were really interesting. And then he asked me if I had read your book and I hadn't. I went into it thinking I would get a history lesson on AI, which I did. I hope so. And a very thorough one.
But I would not have invited you for that. Your life story is so interesting and beautiful in the way you write about it. You're an incredible writer. Thank you. I do want to give credit to my friend Alex, who co-wrote the book with me. Okay, that's very big of you. Who's Alex? Alex.
Alex is a friend of mine. He does not claim to be a writer, but he's a very talented writer. And we've known each other for years and he loves AI and we talk about it. So when I was invited to write this book, I do feel like I want Alex to co-create this with me. So we became a creative team.
It was a really fun process. How do you know Alex? I know Alex through TED. 2015, I gave my first TED talk. I watched it. Yeah, thank you. Yeah, yeah, about image, how hard it is for a computer to see. Yes, and that's how I got to know Alex.
Because he worked with TED in some capacity? I think so, yeah. He was in some kind of partnership with TED, and he was helping me put together my slides. Since then, we've become friends, and we talk about AI, and he's also helped me with some of the Stanford HAI, Human-Centered AI Institute, stuff. You're kind of creative partners. Yes.
It's very interesting because book writing is a very different creation compared to doing science. We wrote for almost three years, and for two and a half of those years, during the day I did my science and some of the evenings I did the creative writing. It's such a different part of the brain. Yeah. Which one do you find more exhausting? Both. Both in different ways. No, but they are different.
very different. Of course, I've been a scientist for almost three decades, so I'm more familiar with being a scientist. But the creative writing journey, I loved every minute of it. When I say I loved it, it's not necessarily a happy love. There's a painful part of it. But I really loved it. That's what I want to start with, because I'm curious: when you sat down with Alex, I'm sure the historical part, the scientific part, that stuff was probably easy. But had you ever told your life story to anyone in that detail?
Yeah. Do you think that's a personal disposition or where you come from culturally? I think of the story of your father, which we'll get to, and how little he told you about his own childhood until the time was right. And I gleaned from that, well, this isn't a culture that is just divulging all this emotional trauma and baggage. Well, I...
I have to say, I think culture in the case of an individual sometimes is too broad a brushstroke. I think it's more individual. I'm a relatively shy person. And Alex and I wrote the first version. It was purely scientific. It was the first year of COVID.
We talked on the phone almost every night. And one of my best friends is a philosopher at Stanford called John Etchemendy. He's a very revered higher-education leader. He was Stanford's provost for 17 years. And he is co-director with me at the Stanford Human-Centered AI Institute. So I was really preoccupied
Oh.
I was like, what? That's the last thing I want to do here. He said, you're missing an opportunity to tell your story, to tell the AI story through your lens. And I was just so rejecting of that idea. I was like, who
wants to hear my story. I want to write about AI. I call him Etch. Etch said, there are many AI scientists who can write a so-called, quote-unquote, objective story of AI, but your angle would be so meaningful, your voice to the young people out there, the young women, immigrants, people of all kinds of backgrounds. And we were sitting in his backyard, three chairs in a triangle. And I looked at Alex and
he was almost jumping off his chair. - With excitement. - He said, "I told you." He said, "I told you so." Of course it only takes Etch to tell you that. - Well, let's jump to a really big philosophical question about that. I think when reading your story, you came here, this huge language barrier, such a fish out of water, but your work, if good enough, would speak for itself.
And it would be a meritocracy. And so it's not surprising to me that someone who got to where they wanted to go with that belief would have a hard time thinking, wait, I was trying to transcend this otherness, and this otherness is the thing that would be most interesting and worthy of attention and affection. What a gap. It's very subtle that you caught that.
Because when I go into the world of science, I don't think too much about many other things. I just follow that light, follow the curiosity. And to this day, even when I was writing the book, it's AI that fascinates me and I wanted to write about AI. So it was very strange that someone wants to read about me.
Yeah. Well, I think even the notion that you're struggling so hard... I've got to set up your story more. This is the last thing I'll say out of context. Monica's like, not everyone's read the book. But just, of course, math was appealing because math didn't have a language barrier. Yes, but I do want to
be honest, even when I was a young kid in China, I loved math and physics. I love physics. I would say even more than math itself. I saw math more as a tool. I saw more beauty and fascination in physics. Yeah, there's more philosophy. Yes. Okay, so let's start in China in 1976. You're the only child
of your parents. Yeah. And talk about your mother, because she's very interesting. She is very interesting. My mom came from a normal family. But as the book says, her family was in a difficult position because of the history. So she was a very good student. I think the intellectual intensity I have, a large part of it comes from my mom. She was a curious student. She was very intelligent, very
intense. But her dream was pretty much shattered when she was not able to go to a normal high school, even though she had a dream of college. And that carried her through. And then you arrive and you show this great aptitude. And now she has, in a sense, a second chance at this dream. But she starts recognizing pretty soon that your path is going to be stilted as well if you stay there. So what's happening? What is she noticing as you
start getting educated and show this aptitude? A lot of this is hindsight, because I didn't talk to my mom in this way, right? I think it was a combination: my mom had her own longing to be more free, maybe. And in hindsight, I don't know if she knew how to translate that
in the world she was living in. And the opportunity to go to a new world was as appealing to her for herself as it was for her on my behalf. It's also true she saw me as
a bit of a quirky kid. I think that blend of what she was longing for and what she was longing for on my behalf, without me realizing it, was the motivation for many of the changes, the decision of immigration. Well,
What would have been your trajectory had you stayed in China in 1988, when you were 12? Am I misremembering, or did your mom feel like they weren't giving you the attention and encouragement that she was hoping you would get? My mom was not looking for attention for me. My mom was looking for freedom for me. I'd
What do you mean by quirky? Yeah.
But also there is a part of me: why should girls not play soccer? Why should girls be told they are biologically less smart than boys? I was told, at least more than once, watch out, girls will in general be less smart by the time you hit your teenage years. This is what I'm remembering from the book, that you were explicitly told you're not as smart as boys. I was.
I wasn't told in the context of one-on-one, like, let me sit you down, Fei-Fei, and tell you. I was told in the way that teachers would say things to boys. Yeah. Or the context. Society had a whole different expectation for boys. I was very lucky my own family protected
me, but they can only protect me so much as soon as you enter a school system, as soon as you interact with society, all that came through. From that point of view, I was not following the normal path. I was reading different books. You know, I was so passionate about
UFOs, physics, special relativity. I would grab my classmates to talk about that, but that was just not normal. Yeah. Who was exposing you to all that stuff? That's a great question. I tried to
ask myself that question when I was writing the book and I still don't have a strong answer. I think the early curiosity, the exposure came from both my parents. My dad loved nature. My mom loved books and literature. But how did I
fall in love with physics and UFOs and all that? I'm not totally sure. It could be my dad: before he came to New Jersey, he was ordering me some magazines outside of the school reading, and that exposed me to those topics. And because my parents protected my curiosity, when I say
protected, it really just meant they left it alone. They didn't meddle with it. I kind of followed it through myself. So your dad leaves when you're 12. He goes to New Jersey. He's there for three years on his own. And he is setting up a landing for you and your mother. Yeah. Yeah. Do you remember those three years, missing him terribly? How was that experience? It was
tough. I mean, it was early teenagehood. There was no internet. Phone calls were extremely expensive, to the point of being prohibitively expensive. So it was mostly letters every couple of months. But then,
I was a teenager, so I had my own world to explore as well. So it wasn't like I was sitting in the room crying or anything. So then you come your sophomore year. Yes. You start a public high school in New Jersey. Parsippany High School. One of the experiences you had, I came in and told Monica immediately about
and you were in a class, some kind of a study hall or something. Library. And you were with a group of other ESL kids, English as a second language kids. And you saw a very, well, no, I want to say how insignificant this first interaction was, like benign brushing up against a kid's backpack or something. Right.
And what happened? A group of ESL students were in the library and then the bell rang or something. We had to file out of the library door. And I remember it was crowded. I honestly did not see what happened to that boy. But all I knew was my ESL
friend was on the floor. By the time I realized, there was some commotion; he was being kicked and punched. I think his nose was bleeding and he was holding his head. Yeah, he said he got a concussion and a broken nose. And there were two boys kicking him. Yeah. And that's not even maybe the most traumatic part. It's that after he's gone for a couple of weeks...
He comes back and he's just not the same boy. Yeah, I mean, nobody would be the same after going through something like that. Definitely, it's a huge impact. It was an experience that was definitely pretty intense for all ESL students. Nobody felt safe.
for a long while. Yeah, I think it changes your worldview on a dime, which is, ooh, this new place I'm in can get pretty violent and a little out of control. And if you're other, this could happen. I have to imagine, yeah, it's an incredibly scary recognition of where you're at. Yes, but also I do want to give more
color, right? I love that your show focuses on the messiness of being human. Being messy is being multidimensional. But it was also an environment where there was so much support. There was so much friendliness. And there was also so much opportunity. So it was very confusing. I'm not trying to say that experience itself is not heavy. I don't feel lucky about that experience. I mean, there was anger and all that. But
But in the meantime, the fuller context of that community was also quite a supportive community. So it was very confusing. It gave me the multidimensionality of the new country I landed in. Everything's happening. A lot of opportunity is happening as promised. And then a lot of xenophobia and violence is happening. Right. Yeah. Did you feel like you had to, after that, sort of like keep your head down? Maybe it's...
It's just my own personality. I always felt I had to keep my head down. Right. Especially as an immigrant. Sometimes I feel that way even now, especially given the AI world we live in. I feel I need to keep my head down to do work. Of course, that particular event probably added a layer of complication, at least for a while. But it also taught me you have to stand up for yourself. It did also
open different insights to me. I don't know if you would rank these things in your life of like serendipitous things happening, but meeting Mr. Sabella has to be minimally in the top 10 and I would hope in the top three. Yeah, it's possibly in the top three for sure. Meeting Mr. Sabella was so lucky for me. Yeah, I find this to be one of the sweetest stories I've ever read about and kind of makes me hopeful for the
people, how generous they can be. But in a nutshell, minimally, you're thinking I'm going to do good at math. I don't have to go to my dictionary back and forth like I do in every other class. And you're in math and you're getting problems wrong. And you yourself cannot identify any pattern in this
You don't know what's going on. And you go to see Mr. Sabella in his office. That's your teacher? Yes. Mr. Sabella was my math teacher. I got into calculus, and Parsippany High School didn't have Calculus BC. We only had AP Calculus AB. So he had to teach me BC during his lunch hour. But this story you're talking about was earlier. It was during some pre-calculus stuff. And it turned out I was...
using a broken calculator. That they had gotten at a garage sale. Her father loves garage sales. It was his favorite thing in the world. I know. Every weekend they'd go to... He still does. He still does. I love garage sales.
I love garage sales. I don't have time to go to garage sales, but I love them. Mr. Sabella was tough. He was a tough-love kind of teacher. So even though I was ESL, this mousy girl, he didn't think I needed any extra. He didn't pity you. No, not at all. For one quarter of the semester, I got 89.3%.
I still remember that. I was like, oh God, 0.6 more and I would get to at least an A. And he would not give me that A. You asked about extra credit and he was like, get real. How about getting a good grade on the test? Yeah, he would say there are many smart kids in the class. You just have to work hard.
But it sounds in the retelling like the breakthrough. And I think this scene would be in a movie if I were writing the movie. You're there. He discovers the tan key on your calculator is malfunctioning. He helps her figure this out, because he too can't figure out the pattern of all these errors. And then somehow you guys start talking about books. And he asks if you've read a certain science fiction writer. You try to tell him you haven't read that one, but you really loved
a million kilometers under the sea. You can't translate it and you can't pronounce Jules Verne, but he figures out you've read Jules Verne. Yes. And he is, like, shook. He's like, you've read Jules Verne? Yes. And then you go on to say, yes, and you've read Hemingway and you've read everything. Well, I've read everything my mom gave me, which was a lot. Which was extensive. Yeah. If I were him in this moment, this
young girl from China comes in and she has read most of the classics. That's a real, like, what am I dealing with here? I've got to imagine, for him at least, that was a moment where he's like, okay, I'm betting on this horse. I think he saw a person he could befriend, just the way I saw one in him. Later on, I realized, again,
and this is hindsight, that he did that with so many students. He used this way of opening up, in different ways, not necessarily science fiction or classic literature, to really get to know them. It was so helpful, for him and for me, beyond math and the calculator. When we started talking about science fiction and the English classics, he was seeing me as more than an ESL kid at that point. And he's also a shy person himself. Later I called him Bob, and I got to know his wife, Jean. We became such good friends over many, many years. Jean said he's such a bookworm; even during his family parties, he'll be off by himself reading. So he's totally an introvert, in a way that we just had chemistry. Yeah. But this is one thing I was not able to fit into the book,
is that for years he kept a diary. And his diary talks about just his teaching life. And I know in this diary there are so many stories about different students he helped, not in the sense of bragging. It's just, he's a writer, right? So years later, before he passed away, and we didn't know he was going to pass away, I told Bob, I said, Bob, you've got to turn this into a book. Of course, we could anonymize it.
But this is an American teacher's story of so many students. Many of them are immigrant students, because they lack the support. They lack the family. Some of them are in high school by themselves; family is overseas. Many of them are like me: parents are so busy that the students don't have that emotional support. And he supported so many students. I can sit here and tell you endless stories. He wanted to translate that into a book, but somehow he just couldn't bring himself to do it. Maybe he's too shy. Maybe he's too humble. Yeah, I think he was struggling with the same issue you were struggling with. You don't feel entitled to tell this story. Right. I feel so strongly he needed to write this book. I almost felt like one day I would write it for him. But of course,
He passed away so suddenly because of a brain tumor. So when I was writing my book, I realized, let me tell the Bob Sabella story. Let me tell the story on behalf of so many American public school teachers. They don't have much of a voice. Nobody knows their names, but they work above and beyond every day for the students in their community.
They don't care which part of the world they come from, which kind of family background they come from. But they invested so much in these students and they changed lives. Yeah, they're very unsung heroes. They're not tenured professors at elite universities. They're totally unsung heroes. And they're the ones that get the people to those destinations. Yeah. It's a really beautiful story. How instrumental was he in you finding your way to Princeton? He was instrumental.
I think you're underplaying your story. If you came here in seventh grade and ended up at Princeton, that's one story. You had two years to get yourself ready, to learn English, to start Princeton, and you didn't speak any English. You're very much underplaying it, which is fine; I think your teacher would, too. You feel maybe that's self-indulgent or something, but that's really bonkers. AI aside, to land and go, okay: if you dropped me in Russia and told me I had two years to land at their most elite university, it's not going to happen for 99.999% of people. Let's talk for a second about you going to Princeton. This is another fun moment for me in the book, because there's something so much more important about Einstein than the theory of special relativity.
And I can't really articulate what it is, but I know you have a good dose of it. So what was it like going there and seeing the statue of Albert Einstein and imagining that you would in some way be touching that reality? So the first time I saw the statue of Albert Einstein, before I was applying for college, it was probably early junior year. My dad continued to find things for us to do that were free. It's very important. It's free.
Princeton's Natural History Museum was free. That's why we went there. Garage sales, free. Exactly. Museums, free. Seeing Einstein's statue was kind of symbolic for me, that I was getting back to where my soul really wanted to be. Because as a teenager, landed in a new country, trying to learn the language, dealing with all the messiness, you know, the Chinese restaurant, walking dogs. You're working a ton of hours. Yeah, exactly. I kind of forgot about physics. I was taking a physics class in school, but I forgot about that sense of love. Romanticism. Yes, it really is that first love. And it kind of got me back to that, rekindled something. Well, don't you think it took this person out of an imaginary world and put him into your own three-dimensional reality? Yes. Suddenly I felt so much closer to that person, and that person symbolizes
the entire world of physics. I feel so much closer. I was literally in Princeton, right? Yeah, yeah. That felt very different. And he lived there for what, 30 some years? Yeah. Maybe more. I think that would be a special moment as well. I'm sure you watched the movie Oppenheimer. Yeah, yeah. Do you remember?
the opening scene was Einstein in front of that little pond. Yep, yep. Talking with Oppenheimer. Right. He was first there by himself. Yeah. I call that my pond. That pond literally exists. It was very close to my dorm by the time I got to Princeton. And I would go there a lot, because I knew it was close to the Institute for Advanced Study, where Einstein worked. Yeah. So when the scene came on, Silvio was sitting next to me. I'm like, Silvio!
You have such a full circle. Yeah, yeah. I'm currently stuck in a rut where I'm learning a lot about physicists, historical physicists. And I'm wondering, have you read When We Cease to Understand the World? Have you read that book? No. Or have you read The Maniac? Either of those? No. No.
The Maniac's all about John von Neumann. I'm reading a different bio of him, but not The Maniac. Which one? Oh, it's in my phone. Yeah, same. Don't worry about it. This one's fun because it has the perspective of a million different people in his life: the student he was friends with at school, one of his wives, people who worked with him. And you get this really comprehensive view. Another Princeton guy. Yeah, I'm obsessed with all these guys. And then When We Cease to Understand the World is about many of these physicists who were so brilliant at the time and who ultimately became crazy. And how many of their breakthroughs in the math of quantum mechanics came to this guy in a nine-day, 106-degree fever, writing down the matrices and not understanding the math when he comes out of it, but it holds. There's a lot of weird magic in this space, I think, where people have these breakthrough thoughts and they touch some understanding while they're in a compromised state mentally. It's just fascinating to me. It's like mystical. Yeah. Physics
is absolutely the discipline that pushes you to think so audaciously that you have to transcend the immediate reality. Yes. That's what I loved about Einstein, what I loved about modern physics. Even Newton, classical Newtonian
physics, you have to think so far beyond the immediate reality. All those stories of him getting asked a question and then answering it two and a half days later, and he hadn't left the chair and the person had left. Like he went away for two and a half days and then came back with the answer. Or just the notion, I think one of the most intriguing parts is, like, you're going to have thoughts that cannot be expressed in language, but can only exist in math.
That already is like, what? There is actually something even beyond math. Right. And then there's a realm beyond math. Yes. It's the closest thing I think we have to magic. Where it's completely outside of our grasp, but for a handful of people. I love it.
I love it. You call it magic. It is also the thing furthest from AI: that humanity in us, that magic, that creativity, that intuition, that almost ungraspable kind of intelligence. Yes. We should keep that in mind. So you're at Princeton. You're also working a ton, right? Yes. When do your parents start the dry cleaner? So we started very
quickly, right after my freshman year started, because my mom's health was getting so bad. They were working in Newark, New Jersey. I don't know if you guys know that part of New Jersey. From Parsippany to Newark is a very difficult drive. My mom's health was
bad, and it was long working hours. I was really worried about them. The doctor was worried. We finally decided that if we could do a local thing in Parsippany, it would be better for the family. And it was very important for me that the business be a weekend business, because that way I could do the lion's share of the work. But there are pretty much three kinds of weekend businesses for immigrant families like us: open a restaurant, open a grocery store, or open a laundry. Restaurants require very late working hours, and groceries very early, because you have to go to Chinatown to get supplies. So neither of those worked for my mom's health. Whereas dry cleaning was actually perfect, because it's a daytime business. It's very long hours during the weekend, but it's at least daytime. And for a lot of my mom's work, especially when it comes to alterations, she can sit comfortably
in front of the sewing machine. Because your mother had had rheumatic fever as a child, and it greatly degenerated some of her heart valves. So she was really struggling with heart issues. Yes. She carried that illness with her all her life. And there's no money in the dry cleaning. There's only money in the seamstressing, whatever we call it. The tailoring. The tailoring. I mean, there's no
money in any of this. But having that tailoring ability was nice, because it helped a little bit. And my mom is incredible. She never learned this. She was a bookworm and she's kind of brainy. She should have done what you did. Right. Yeah. I don't think she would love physics.
But you know what I mean? She should have probably been an academic. Yes, she would have been an academic. But then she just kind of figured out tailoring by herself. I still don't know how. Like, I tried. I could not. The only thing I can do is sit there and unstitch things for her. Sure, I think a chimp can do that. Thank you.
Yeah, exactly. I say that because I know how to remove stitches from garments. Yeah. And I don't have more skills than a chimp. Yeah. So we opened a dry cleaner shop during the middle of my freshman year. And that became my entire memory of my undergraduate experience.
Here's a fun fact. Princeton is organized by residential colleges. I lived in one of them, called Forbes. It turned out Forbes is very famous for its Sunday brunch. Oh. I didn't know there was a Sunday brunch because I was home doing dry cleaning. You said you didn't go to a single party. But then when I went back to Princeton as a faculty member, Forbes was very kind. They made me a faculty fellow and I discovered Sunday brunch. That was fun.
That was like 15 years later. Instead of the freshman 10, you gained the faculty 10. Yeah, exactly. The faculty 10. So I felt so good. I finally got my Sunday brunch. And I think it's worth mentioning, when you guys were trying to open that dry cleaner, you were trying to raise $100,000 and you were $20,000 short. And again, Mr. Sabella. Monica. He gave the money.
Yeah, it was a total shock. To this day, actually, as a 19-year-old, as much as I appreciated Jean and Bob, I did not realize the extent. We're talking about the late 1990s. They were two public school teachers with two kids about to go to college. Wow.
It's unimaginable. He and Jean decided to do that. I mean, at that moment, I was very, very grateful. But now, after I became a grown-up, this is unimaginable. It's impossible that someone would do that. Especially because he later told me, I think when I was returning the money, he said, I didn't realize you'd be able to return it. Right. I was like, what? What?
Of course, you have to give it thinking you'll never get it back. I guarantee he and his wife were like, we're giving this money away. I did not know that. He did use the word lend. And of course, in my mind, I was like, of course I'm going to return it. Like, I'll do anything to return it. But Jean and Bob,
they could not have assumed that. So the money was being raised to help your mom? No, to help my family. To start the business. Yeah, we as a family. I still consider myself the CEO of the dry cleaner. I live in Silicon Valley. You have to claim yourself to be a CEO of something. A C-suite. Yeah, exactly. Bob and Jean. It's incredible. I don't even think their kids knew about this till they read my book. Wow.
Oh my God, how proud I'd be of my dad. Okay, so you graduate from Princeton and you have a degree in physics.
As well as some kind of computational minor. Yeah. So Princeton is a quirky school. It didn't have minors. It has these certificates, but they're basically minors. I had a computational mathematics certificate as well as an engineering physics one. And when you were there, unless I'm misremembering, you had a very singular focus on being a physicist. But while you're there, you start realizing you're maybe open to something different. It's actually
really interesting. I never necessarily thought I would be a physicist, but I wanted to be a scientist. That was almost a sacred calling for me. It was an identity. Yeah, it was an identity. For some reason, this girl who worked in a dry cleaner just wanted to be a scientist. And I loved physics, but I loved
physics for its audacity and curiosity. I didn't necessarily feel I was married to a big-telescope inquiry. So I was just reading a lot. And what really caught my attention was that the physicists I admired so much, Einstein, Schrödinger, Roger Penrose,
actually were curious beyond just the atomic world. They were curious about other things, especially life, intelligence, minds. And that was immediately an eye-opener for me. I realized I love that. Yeah. Understanding how this brain works. How intelligence works.
It's crazy, the overlap that has now been proven. But at that time, that's not obvious. We haven't figured out neural pathways and we're not going to map that onto computers yet. So these seem, on the surface, very different fields. One's biology and one is, you know. Right. But for me, it was the science of intelligence. I always believed
it's the science of intelligence that will unite our understanding of both the brain and computers. Right. OK, so then you choose Caltech to go to graduate school. Yes. What did you think of California? I mean, my God, what a place, right? I know we're 15 minutes away from Caltech here. Yeah, yeah, yeah. So I was choosing among MIT, Stanford and Caltech. And honest to God, I almost chose Caltech because of the weather. Yeah.
It was so balmy. And the vibe. Yes, the turtles, the garden-like campus. And of course, I walk into this building and
I think it was the Moore building at Caltech. And guess whose photo was there? It was Albert Einstein. And I was like, what? It turned out he had been a visitor there. And of course, there was Richard Feynman, the Feynman Lectures. So I just followed these physicists, apparently. And New Jersey was cold. And also, I really have an issue with cold, because my mom's illness
is exacerbated by cold. So every winter she suffers a lot. So I have this negative affinity to coldness, coming from taking care of my mom. So coming to Southern California, I was like, oh my God, I love this place. Did your parents come with you? Later they did. In the middle of my grad school, they did. Were you worried about leaving them? I had to switch from being on site to remotely running the dry cleaning. The dry cleaning was such
that the customers were all returning customers. So my mom would be able to handle it with one part-time worker. And Bob Sabella was doing the bills for my mom. Oh my God. Yeah.
He was just helping me. And another thing he helped me with: as a young graduate student, I would be entering the world of writing scientific articles. That's pretty intense. He would still proofread my English for me, all my papers. Tell me about North Star and how you discovered yours, because this happens at Caltech. Yes. The prelude to the North Star was my education in college:
physics is always about asking the right questions. If you go to the Nobel Museum in Stockholm, there is an Einstein quote about how much of science is asking the right questions. Once you ask the right questions, solutions follow. You'll find a way to the solutions. Some people call it hypothesis-driven thinking. I've always been thinking this way. So as I was studying computational neuroscience as well as artificial intelligence at Caltech,
I was always kind of seeking: what is that audacious question I want to ask? And of course my co-advisors, Pietro Perona and Christof Koch, were great mentors guiding me. But many things started to converge, not just my own work, but the field. People working on visual intelligence, from neuroscience, from AI, started to
orbit around this idea: that the ability to recognize all kinds of objects is so critical for human visual intelligence. When I say all kinds of objects, I really mean all kinds. I'm sitting here in your beautiful room. There's a table,
bottles, couch, pillow, a globe, books, flower, vase, plants. T-Rex skeleton. Okay, that's behind you. It's about to eat you. That's great, yeah. Shirts and skirts and boots and TV. So the ability for humans
to be able to learn such a complicated world of objects. Millions and millions of objects. It's so fascinating. And I started to believe, along with my advisors, that this is a critical problem for the foundation of intelligence. And that really started to
become the North Star of my scientific pursuit: how do we crack the problem of object recognition? Okay, so now I think is a great point to just go through a couple of the landmark events that take us to where the technology is at that time. So I guess we could start with Turing. We could start in 1956. Give us a couple of things that have happened in computing up to that point. Right. So that's the parallel story I was writing in the book.
Now that I have people hooked into you as an individual, now we can get a little protein in this and learn some stuff. Right. Well, the field of computing, thanks to people like Alan Turing and John von Neumann, was starting during World War II, basically. Of course, for the world of AI, a very important moment was 1956,
when what we now call the founding fathers of AI, like Marvin Minsky, John McCarthy, Claude Shannon, got together under, I believe, a U.S. government grant. DARPA funded it or something? DARPA funded a summer-long workshop at Dartmouth with a group of computer scientists. At that point, the field of AI was barely kind of born, not born yet.
They got together and wrote this memo or this white paper about...
artificial intelligence. In fact, John McCarthy, one of the group leaders, was responsible for coining the term artificial intelligence. I think we could get even more rudimentary, right? So up until that point, a computer was something that could solve a problem. It could do computations. It could calculate. And this notion of artificial intelligence, what it really meant is: could we ever ask a computer questions
that it hadn't been pre-programmed to answer? What are the hallmark things that separated, at that time, artificial intelligence from just computing? Because I think we've just fast-forwarded to everyone saying AI, and I don't think they really even take a second to think of what that step is between computing and computation and thinking. Right. Up to that point, no matter how powerful the computer was,
it was used for programmed calculation. So what was the inflection concept? I think two intertwined concepts. One is reasoning. Like you said, if I ask you a question, can you reason with it? Could you deduce: if a red ball is bigger than a yellow ball, and the yellow ball is bigger than a blue ball, then the red ball must be bigger than the blue ball?
Right. Without having been programmed with that. Without directly saying the red ball is bigger than the blue ball. Yes. So that's reasoning. That's one aspect. A very, very intertwined aspect of that is learning. A calculator doesn't learn, whether you have a good one or not. It just does what it does. You had a bad one.
I had a bad one. So artificial intelligence software should be able to learn. That means if I learn to see tiger one, tiger two, tiger three, at some point when someone gives me tiger number five, I should be able to tell, oh, that's a tiger, even though that's not tiger one, two, three. So that's learning. But even before the Dartmouth workshop,
there were early inklings, like Alan Turing's daring question to humanity: can you make a machine that can converse with people, Q&A with people, question and answer, so that you don't really know if it's a machine or a person? It's this curtain setup that he conjectured. So it was already there, but I think the founding fathers kind of formalized the field. Of course,
what's interesting is, for the first few decades, they went straight to reasoning. They were less about learning. They were more about reasoning, more about using logic to deduce the red ball, yellow ball, blue ball question. So that was one branch
of computer science and AI that went on during those years. It predated my birth and ran through my formative years without me knowing. I wasn't in there. Right. But there was a parallel branch. That branch was messier. It took longer to prove to be right. But as of last week, we had the Nobel Prize awarded to that, which was the neural network.
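The logic-based branch she describes can be sketched in a few lines: stated facts plus one deduction rule produce a conclusion that was never programmed in. The facts and the rule here are purely illustrative, not any historical system.

```python
# A minimal sketch of rule-based "reasoning": deduce new facts from
# stated ones via the rule bigger(a,b) and bigger(b,c) => bigger(a,c).
# The facts are invented for illustration.

def transitive_closure(facts):
    """Repeatedly apply the transitivity rule until no new facts appear."""
    known = set(facts)
    changed = True
    while changed:
        changed = False
        for (a, b) in list(known):
            for (c, d) in list(known):
                if b == c and (a, d) not in known:
                    known.add((a, d))
                    changed = True
    return known

# Only two facts are stated: red > yellow, yellow > blue...
facts = {("red", "yellow"), ("yellow", "blue")}
deduced = transitive_closure(facts)

# ...but red > blue follows by reasoning, without being programmed in.
print(("red", "blue") in deduced)  # True
```

This is the flavor of the symbolic approach: the answer to the red/yellow/blue question is derived, not stored.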
So that happened again in a very interesting way. Even in the 50s, neuroscientists were asking questions, nothing to do with AI, about how neurons work. And again, my own field, vision, was the pioneering study, about the mammalian visual system in cats. Hubel and Wiesel in the 1950s and 60s were sticking electrodes into cats' visual cortexes
to learn about how cat neurons work. Details aside, what they learned and confirmed was that our brain, the mammalian brain, is filled with neurons that are organized hierarchically, layered. They're not just thrown into a salad bowl. Right. And that means information travels in a hierarchical way. Up these columns.
Yes. For example, light hits our retina. Our retina sends neural information back to our primary visual cortex. Our primary visual cortex processes it, sends it up to, say, another layer, and then it keeps going up. And as the information travels, the neurons process this information in somewhat different ways. And that hierarchical processing gets you to
complex, intelligent capabilities. That's a mouse I'm seeing, if I'm a cat. Or this tiger sneaking up on me. And I think this could be a bad analogy, but you might be misled to think, oh, well, a camera can take a picture and then the computer can show the picture, so the computer understands that's a photo. But really, the camera has broken what it's seen into thousands of pixels. They are coded with a numerical sequence. The computer reconstructs those colors. It's a grid. And virtually, that's what our eyes do.
Our eyes are just grabbing photons and they're sending back the ones and zeros. And then back here in the cortex, it's assembling it all. Yes. And how did evolution assemble us so that we can recognize all this beautiful world? Not only can we recognize it, we can reason with it. We can learn from it. Many scientists have used this example:
children don't have to see too many examples of a tiger to recognize a tiger. It's not like you have to show a million tigers to children. So we learn really fast. And as you point out in the book, it took us 540 million years of evolution to get this system. Exactly. So just to finish: the neuroscientists were studying the structure of the mammalian brain and how
visual information was processed. Fast forward, that study got the Nobel Prize in the 1980s because it's such a fundamental discovery.
But that inspired computer scientists. So there was a separate small group of computer scientists who started to build algorithms inspired by this hierarchical information-processing architecture. You build one algorithm at the bottom that's maybe generic? No, it's a whole algorithm, but you build mathematical functions that are layered. Okay.
So you can have one small function that processes brightness, another that processes curvature. I'm being schematic. And then you process the information. But what was really interesting about this approach is that in the early 80s, this neural network approach found a learning rule.
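The "layered mathematical functions" she sketches can be illustrated with a tiny forward pass: each layer computes weighted sums of the previous layer's outputs and squashes them, so raw "pixels" are transformed step by step into a higher-level summary. All weights, biases, and layer sizes below are invented for illustration.

```python
import math

# A toy hierarchical network, echoing the layered processing Hubel and
# Wiesel observed: each layer is a small mathematical function applied
# to the previous layer's output. Weights and biases are made up.

def layer(inputs, weights, biases):
    """One layer: weighted sums of inputs passed through a sigmoid."""
    outputs = []
    for w_row, b in zip(weights, biases):
        s = sum(w * x for w, x in zip(w_row, inputs)) + b
        outputs.append(1.0 / (1.0 + math.exp(-s)))  # squash to (0, 1)
    return outputs

# "Pixels" in at the bottom, a single higher-level value out at the top.
pixels = [0.0, 1.0, 1.0, 0.0]
hidden = layer(pixels,
               weights=[[1, -1, 1, -1], [0.5, 0.5, 0.5, 0.5]],
               biases=[0.0, -1.0])
output = layer(hidden, weights=[[2.0, -2.0]], biases=[0.0])
print(len(hidden), len(output))  # 2 1
```

Backpropagation, mentioned next, is the rule for adjusting those weights automatically from examples instead of hand-coding them.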
So suddenly it unlocked how to learn this automatically, without hand-coding. It's called backpropagation. And Jeff Hinton, along with others who discovered this, was awarded the Nobel Prize last week for it. But that
is the neural network algorithm. Could you think of it as almost a filtration device? Like, this data comes in, we filter out these three key points, that then filters up, and then we come to our conclusion at the top of this hierarchy. You could, actually. Because it's just like all this raw info at the bottom, and then we kind of recombine it into this layer, and then another process filters. Well, it's not a school bus. It's not this. You just
keep filtering it. Of course, you combine it in a mathematically very intricate way, but it is like layers of filtration a little bit. Okay, great. So now, also, when you find your North Star, another thing that's happening at the same time is WordNet, right? This is kind of a big breakthrough for early AI. For linguistics. So WordNet had nothing to do with AI. It had nothing to do with vision.
But what happened for my own North Star is that I was obsessed with the problem of making computers recognize millions of objects in the world. While I was obsessing over it, I was not satisfied, because my field was using extremely contrived data sets, like a data set of four objects
or 20 objects. I was really struggling with this discrepancy, because my hypothesis was that we need to learn the much more complex world. We need to solve that deeper problem, rather than focusing on a small handful of objects. But I
couldn't really wrap my head around that. And then, again in Southern California: the Biederman number I mention in my book. I read a paper by the psychologist Irv Biederman, who was, up till two years ago, a professor at the University of Southern California. He conjectured that humans can recognize tens of thousands of object categories.
So we can recognize millions of objects, but categories are a little more abstract. Animal, food, furniture. German Shepherd. Transportation. Yeah, sedan, fighter jet, and all that. Yeah. So he conjectured that, but
that conjecture didn't go anywhere. It was just buried in one of his papers. And I dug it out and I was very fascinated. I called it the Biederman number because I thought that number was meaningful. But I didn't know how to translate that into anything actionable. Because as computer scientists, we were all using data sets of 20 objects. That's it. And then I stumbled upon WordNet.
What WordNet was, was a completely independent study from the world of linguistics. It was George Miller, a linguist at Princeton. He was trying to organize a taxonomy of concepts, and he felt an alphabetically organized dictionary was unsatisfactory, because in a dictionary an apple and an appliance would be close to each other, when an apple should be closer to a pear
than to an appliance. So how do you organize that? How do you regroup concepts? So he created WordNet, which hierarchically organized concepts according to
meaning and similarity rather than alphabetical ordering. Does WordNet not lead to the machine that can read the zip codes? No, it doesn't. What's that called? That's what I meant to bring up. That was ConvNet, the convolutional neural network. That's happening as you're getting your idea about the images, right? We've trained a machine to read zip codes,
basically, handwritten zip codes. So that was Yann LeCun's work at Bell Labs. That was an early application of neural networks in the 1980s and 1990s, when neural networks were not very powerful.
But given enough training examples of digits, the scientists at Bell Labs were able to read the digits zero to nine, or the 26 letters. And with that, they created an application to read zip codes to sort mail. But its data set was, I forget, it was like a thousand or something. It wasn't that. It was a lot of
handwritten digits. Yeah, and common mistakes they would feed it. That data set was probably tens of thousands of examples, but we're talking about just letters and digits. What they had proved in concept, you're going to try to do in images, but the lift for images is so exponentially larger than getting the machine to read.
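The core idea of the zip-code reader, classify a new digit by what you learned from labeled examples, can be shown with a drastically simplified stand-in: a nearest-neighbor match on tiny hand-made bitmaps. The bitmaps and the matching rule are invented for illustration; the real system was a trained convolutional network.

```python
# Toy "learn from examples" digit reader: classify a 3x5 bitmap by
# finding the closest labeled training bitmap. Training bitmaps are
# invented; the real zip-code reader was a ConvNet trained on tens
# of thousands of handwritten digits.

TRAINING = {
    "0": "111"
         "101"
         "101"
         "101"
         "111",
    "1": "010"
         "110"
         "010"
         "010"
         "111",
}

def distance(a, b):
    """Number of pixels where two bitmaps differ (Hamming distance)."""
    return sum(1 for x, y in zip(a, b) if x != y)

def classify(bitmap):
    """Return the label of the closest training example."""
    return min(TRAINING, key=lambda label: distance(TRAINING[label], bitmap))

# A slightly corrupted "1" (one flipped pixel) is still recognized,
# even though this exact bitmap was never seen in training.
noisy_one = ("010"
             "110"
             "011"   # stray pixel
             "010"
             "111")
print(classify(noisy_one))  # 1
```

That generalization beyond the exact training examples is the "learning" she contrasts with a calculator's fixed program.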
Exactly. By a factor of what? I mean, when you lay out what it's going to take for you to prove this theory you have, and you figure out how long it's going to take, it's going to take like a decade of you feeding them, right? There's some moment where the amount of images you're going to have to feed this computer to train it almost can't be done by the group of you. So I think what you were referring to is the process of making ImageNet. Yes. And that process started once we realized,
thanks to the inspiration of WordNet and Biederman's number and many other previous inspirations, that what computers really need is big data.
And that sounds so common today, because everybody talks about big data; OpenAI talks about big data. But back in 2006, 2007, that was not a concept. But we decided that was the missing piece. So we needed to create a big data set. How big is big? Nobody knows. My conjecture went with Biederman's number: why don't we just map out the entire world's visual concepts? Oh, my God. Yeah.
Yeah. Why don't we? And you wrangled someone in for whom this wasn't even really their North Star. Okay, so Professor Kai Li at Princeton, he was very supportive of me. He was a senior faculty member. But what was really critical was
he recommended his student to join my lab, Jia Deng. And Jia was just a deer in the headlights as a young first-year graduate student. He didn't know what was going on. He got this crazy assistant professor, me, who told him that we're going to create a data set that maps out the whole world's visual concepts. He's like, sure.
You know, I don't know what you're talking about, but let's get started. So he and I went through the journey together. I mean, he's a phenomenal computer scientist, and many hoops we jumped through together. It was sheer resolve that got us through. This level of plodding that you were able to take on
is unique to you. And I think it's moving here in 10th grade and looking at that fucking dictionary back and forth and back and forth and back and forth. That kind of really unique dedication and unwavering plodding. A million other scientists could have had your idea, but I think it's that
thing right there that makes you capable of creating ImageNet. That's an interesting observation. Yeah, it's not. I think we like to think of these things very simplistically, like, oh, you had a great idea. Who gives a shit? A lot of people had great ideas in graduate school. I do tell my kids ideas are cheap. Exactly. Hollywood. Someone's like, that was my idea. Oh, really? Did you write the script? Did you execute it? Did you cast it correctly? Did you motivate everyone? Your idea is one percent of the equation of a great movie. Yeah.
Because when I'm reading your thing and the data's coming in, it feels like, and tell me if I'm mischaracterizing it,
The deeper you got into this experience, you were just learning every day it was going to be harder than you originally anticipated. It just kept getting worse and worse and worse and worse for years, right? It was pretty bad. When I'm reading it, I'm like, I would have quit a trillion times. I'd be like, maybe computing will get to a point where this job will be made easy, but right now it's too hard. How do you even start something like that? Do you literally just look around the room and you're like, okay.
Okay. Here we go. Yeah, I'll start with this room and write everything down. Well, okay. So first of all, I've had years of training as a scientist. So after you formulate a hypothesis, you do have to come up with a plan. My PhD thesis had a mini version of ImageNet, so I got a little bit of practice.
But yeah, our idea was to create a data set and a benchmark to encompass all the visual concepts in the world. So we had to start with WordNet. We had to figure out what is visual. We had to figure out what are the concepts we need, and then where to get the source images and how to curate them. Every step of the way, like, Dax, you were saying, we were just
way too optimistic at the beginning. Naivete is the best asset you can have. Yeah, I was just fearlessly stupid. Yeah, it's a great gift. And then we started to hit all these walls, Jia and I and other students, but Jia was the main student. We had to just deal with every obstacle that came. Now,
this is a funny thing, right? Sometimes serendipity makes a world of difference. What was really critical was the Amazon Mechanical Turk, the crowdsourcing platform. Amazon, nothing to do with us. They're like, oh, we have
all these servers sitting in our data centers and we have nothing better to do. Let's make an online worker platform so people can just trade little tasks. A marketplace for that labor. Exactly. Which
I didn't know existed. I was in New Jersey, at Princeton, trying to pull my hair out. And then some student who did his master's at Stanford came to Princeton and just mentioned it casually and said, do you know this thing? That was really, really quite a moment for me. Yeah, that cut this process down by 80% or something. Yeah, 10x. That was one of the
technical breakthroughs that really carried this whole project. They're years down the path and they're calculating how much further it's going to be. And they know they have years and years ahead until this moment. Not only years and years, the budget. Hiring undergrads or whatever just doesn't cut it. The budget was not going to cut it. My tenure was on the line. It was a dicey few moments.
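The crowdsourcing step can be sketched as a labeling-plus-consensus loop: several workers label the same image, and a vote decides the final label. The worker answers and the simple majority rule below are invented for illustration; ImageNet's actual consensus criteria were more involved.

```python
from collections import Counter

# Crowdsourced-labeling sketch: multiple workers label one image,
# and a majority vote decides whether to accept a label. Worker
# answers and the threshold are invented for illustration.

def majority_label(worker_labels, min_agreement=0.5):
    """Return the winning label if it clears the agreement threshold,
    otherwise None (meaning: send the image out for more labels)."""
    counts = Counter(worker_labels)
    label, votes = counts.most_common(1)[0]
    if votes / len(worker_labels) > min_agreement:
        return label
    return None

print(majority_label(["cat", "cat", "dog", "cat", "cat"]))  # cat
print(majority_label(["cat", "dog"]))                       # None
```

Distributing millions of such micro-tasks to online workers is what turned a decade-scale labeling job into something a small lab could finish.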
So to fast forward to the end, you create ImageNet and you can feed in a picture of a boy petting an elephant and the computer knows that's a boy and that's an elephant. Might be a different size than the other elephant I saw, but I know that's an elephant. And this is huge. This earns you the title of godmother of AI.
I know you don't have to comment. I know you don't want that. And I want to fast forward. Now, you've accomplished this incredible thing. You teach at Princeton for a while, as you say, and then you take up a teaching position at Stanford where you still currently are. You become one of these people that undergrads would then study about, which is fascinating. And you go to work for Google during a sabbatical for like a year and a half. Yes.
And there's a moment where part of your job is to go meet with the new recruits that are going to start their employment at Google. Is it fair to say this is one of your, I don't want to call it a crisis of conscience because that would be too strong, but how would you say it? You have an opportunity to talk to those people and it sounds to me like you went rogue a little bit. Yes, I did go rogue a little bit. So it's
very important to call out the year. My sabbatical at Google was 2017 and 2018. That was my first sabbatical. I finally had a sabbatical. And it was a conscious decision for me to go to Google, because this was right after AlphaGo. So AI was
having its first hype wave, at least its first public moment, and Silicon Valley, of course, was ahead of the curve and knew AI was coming. So I had multiple choices, but I really wanted to go to a place for two reasons. One was to learn the most advanced industrial AI, and Google was by far the best. But also to go to a place where I could see
how AI will impact the world. And Google Cloud was a perfect place, because the cloud business serves all businesses. So at Cloud, being the chief scientist, I was able to see
the technology translating into products, and products impacting healthcare, hospitals, financial services, insurance companies, oil and gas companies, entertainment, agriculture, governments, and all that. And in the meantime, it was confirming my hypothesis that this technology
has come of age and will impact everyone. It was the first techlash. 2017 was right after Cambridge Analytica. Let's remind people. So Cambridge Analytica figured out how to maximize Facebook politically.
And people were very upset by that. Yeah, social media's algorithmic impact can drive societal changes. It was also around the time facial recognition bias was being publicized, for the good reason of calling out bias. It was also around the time that self-driving car accidents started to happen. So before that, tech
was a darling. The media didn't report tech as a force of badness. But I do want to point out, because I heard you point it out, that in the early advancements AI had all these peaks and valleys. And there was a moment in the 70s when it looked promising, and immediately people went to: robots are going to take over the world. So we do have this immediate sense. We do jump to that. They jumped to it in the 70s. It's worth pointing out. That's true. Hollywood is always ahead of the curve on that.
Yeah, yeah, yeah. Well, we sell fear and excitement, so. So it was a techlash that came at us very fast. Google has had its own share. I was actually also witnessing the struggle that Google was having
coming to terms with defense. Yeah, they had taken a contract to develop some drone face-recognition stuff. And the people at Google were told that they were only working on nonprofit stuff. There was a bit of a revolt. You were there during all that. Yes. In hindsight, it was a mixture of many things. It wasn't a single event. I remember it was summer of 2018 and we were just coming off
turmoil. In hindsight, it was small, but at that point... And I was just like, I'm about to speak to, maybe my memory is wrong, but I thought it was 700 interns from all over the world who worked at Google that summer. And they're the brightest from the whole world. And they were hand-selected by Google. You know, Google is really a machine of talent. And what do they want to hear from me? Of course, I could talk about
come work at Google. That's my job as someone who was working at Google. But I felt there was more I should share. Really coming from the bottom of my heart at that point, something that you will appreciate is that the math behind technology is clean, but the human impact is messy. Technology is
so much more multidimensional than equations. Yeah, they're all benign. It's how we implement all that. Neutral, there we go. But once they start to interface with the world, the impact is not necessarily neutral at all. And there is so much humanness in everything we do in technology. And how do we connect that?
I decided to talk about that with the interns. And is this the first time you articulate that you want a human-centered development of AI? Yeah, it was around that time. In
March 2018, I published the New York Times op-ed. I laid out my vision for human-centered AI. So let's parallel your speech to the interns and then also getting to go in front of Congress. So what is your overarching sense of how we keep this technology going in a direction that does serve humans? My overarching thesis is that we must
center the values of technology's development, deployment,
and governance around people. Any technology, AI or any other, should be human-centered. As I always say, there are no independent machine values; machine values are human values. There's nothing artificial about artificial intelligence. It's deeply human. So what are the practical things we do? What are the legislative things? What does that mean? How do we do that?
So human-centered AI should be a framework, and that framework can be applied to fundamental research and education. That's what Stanford does.
Or to creating businesses and products, as Google and many other companies do. Or to the legislation and governance of AI, which is what governments do. So that framework can be translated in multiple ways. Fundamentally, it is to put human dignity, human well-being,
and the values that a society cares about into how you create AI, how you create AI products and services, and how you
govern AI. So, concrete examples. Let me start from the very basic science upstream. At Stanford, we created this Human-Centered AI Institute. We try to encourage cross-pollinating, interdisciplinary study and research and teaching about different aspects of AI, like AI for drug discovery, AI for developmental studies, or AI for economics, and all that.
But we also need to keep in mind, we need to do this with the kind of norm that reflect our values. So we have actually a review process of our grants. We call it ethics and society review process, where even when researchers are proposing a research idea to receive funding from HAI, they have to go through a study or a review about
What is the social implication? What is the ethical framework? And are you bringing in philosophers and anthropologists and psychologists? This is the interdisciplinary aspect. That's the very fundamental research example. Now, translate to a company. When we think about an AI product, let's say I would love for AI to detect skin condition for diseases. That's a great idea. But
Starting from your data, where do you curate data? How do you ensure data fairness? So if I play out that experiment, it's like, yes, I would love to take my phone, scan my face and know if I have a melanoma. That all sounds great. Where does the results of that get stored? Does my insurance provider have access to that? What all happens? It's not just me that's going to find out I have this melanoma. Exactly. What about the scan of the face? And also the algorithm that detects melanoma
is it trained on... Just white folks? Right, exactly. A narrow type of skin, or all skins? What's the impact of that algorithm? Will it disproportionately help some groups and alienate others? And do you have to pay? Because if you pay, you'll probably get a certain group more than you'll get another group. Right. So all those are messy human elements. And then you ask about legislation. Then we come to government. Of course,
there is always a tension: how much regulation, how do you regulate? Is good policy only about regulating? For example, I firmly believe we actually should have good policy to rejuvenate our AI ecosystem, to make our AI ecosystem really healthy. For example, right now, the gravitational pull is that all the resources, the data, the computation, and the talent are all concentrated in a small number of large companies. It's all for commerce. Yeah. Universities can't really compete at the level of Meta or Google. My own lab at Stanford has zero NVIDIA H100 chips. There you go. Yeah. That's always been the good corrective mechanism we've had societally,
the world of academia. And it has competed pretty robustly with the private sector. And it's not just competition. It's that the problems we work on are curiosity-driven, and sometimes they're really public goods. For example, my own lab is collaborating with hospitals to prevent seniors from falling. That is not necessarily a commercially lucrative technology, but it's humanistically important. Universities do all kinds of work like that. Now our universities in the age of AI are so under-resourced that we cannot do this kind of work. I have been working really hard over the past five years with HAI, with Washington, D.C., with Congresspeople, senators, and White House agencies to try to encourage the resourcing of AI through a National AI Research Cloud and data. And then there's legislation and
regulation: how do you thoughtfully put guardrails so that individual lives and dignity and well-being are protected, but the ecosystem is not harmed? So all of this, I'm always on board with. I love it. I'm so grateful there are people like you pushing us in that direction. But we just had Yuval Harari on to talk about his take on it. And
What I ultimately get so discouraged and defeated by is we're not doing this on an island. We're doing this while many other countries do this simultaneously. So how do you see us dealing with the competitive nature of these AI technologies emerging and us maybe proposing we're going to do it in this way, but
being realistic and saying, well, Russia might not have those guidelines and China might not have those guidelines. And if they have a product that people like, we can't compete with it now. So do you believe there could be cooperation? We could outlaw faking humans. Okay, so say the U.S. has outlawed faking humans. No one else has.
And those fake humans are really convincing and entertaining and all these things. And then that industry takes off somewhere else. Like, how do we do this in a world where there are no barriers to this technology? I was also chatting with Yuval. Did he give the C-minus grade to humanity? Did he say that? I didn't get the C-minus out of him. He said that humanity has gotten a C-minus. And I was like, Yuval, you know, I'm a teacher and a mom. When a kid comes home with a C-minus, you don't throw the kid out. We help the kid get better. So first of all, you're right. We're not living in a vacuum. And AI also is not living in a vacuum. AI is just
one technology among many. So I absolutely do believe that there can be cooperation. How exactly we cooperate, who we cooperate with, and what the parameters of cooperation are is much, much more complicated. Look at humanity. We have gone through this so many times. I mean, you're right. We have many, many
messy chapters, even nuclear technology. But we have gotten to a state where there is a fine balance at this point of nuclear powers. I'm not saying that's necessarily acceptable. I think it is. And then I think what's really important, and I only know this because I'm on my second Von Neumann book, but Von Neumann was employed in the wake of the Manhattan Project to deal with how this proliferation was going to work. And he was so analytical and so... that he said, mutually assured annihilation is the solution. He knew that was the only outcome. It felt sociopathic to say it and to commit to it. But he's like, look, I'm modeling this out. The only way this works is mutually assured annihilation. That's what we ended up with. And so I'm having a little Von Neumann-y feeling about this, like, no, I think it's a race to who can win until everything gets neutralized. I don't know another comp other than the nuclear arms race. Sure.
The difference between AI and nuclear technology is that AI is so much more an enabler of so many good things. True. So that's very different from nuclear. Of course, nuclear can be an energy source. We're coming back around to it. Right. But AI can help discover drugs, and AI can help break through in fusion. AI can personalize education. AI can help farmers. AI can map out biodiversity. So AI is much more like electricity than it is like nuclear physics. So that's the difference. So from that point of view, the viewing angle of AI, at least I do not think it has to be only through the competitive lens, because it should also be through the enabling lens,
the enabling of so many good things that can happen in our world. And that's the challenge: how do we keep the dark uses of AI at bay? How do we create that kind of balance somehow? But in the meantime,
encourage the development of AI so that we can do so many things that are good for people. So I accept that the nuclear analogy falls short in that there are so many benefits to this. Totally agree. But I will say again, to parallel the nuclear arms race, I think this is only the second moment in time where international cooperation is this badly needed. We have got to recognize this as a moment where we have to be getting closer to all these places, our competitors, our geopolitical adversaries, and not further away. If ever there were a time where everyone stands to gain, other than the nuclear arms race, this is the time where we've got to really figure out how to cooperate a bit,
because everyone will experience the downside if we don't. Yeah. Climate, too, would be the other more recent thing. There's the Paris Accord, and there are things that globally people have come together on. I agree with you, but I will just say that climate to me is a little dicier, simply because you have all of these burgeoning industrial economies that we would be slapping rules on. It's easy for us to adopt a lot of things that it's not for Sri Lanka. It's not totally fair. There actually should be areas of the world where they are allowed to pollute more as they pull themselves up. You know, like... I mean, I think that's part of it. It's just an acknowledgement globally that we're all going to have to do something. And especially the superpowers do need to take on more than others. But it's just getting on the same page that I think we've done okay at.
And at least there's some consensus there. So there could be some consensus here potentially. Yeah, I just hope that we recognize this is a moment to be making friendships a lot better.
and not doubling down. I do think we must always recognize cooperation is one of the solutions. Do you get to the guardrail point in the conversation with the legislators? Do you have certain guardrails that you believe should be... Like, I like Yuval. Yuval said we shouldn't ever be able to fake humans. And I also think there should be a disclaimer on all AI-generated things so that you at least know it came from that source. I do think we should pay a lot of attention to where the rubber meets the road.
Because AI can sound very fancy, but at the end of the day, it impacts people. So if you use AI...
in medicine, then there is a regulatory framework. For example, my mom, again, gets imaging all the time, because the doctors have to use MRI, ultrasound, you name it, to monitor her. Honestly, do I care if that MRI is fully automated, or operated by humans, or a mixture? As a patient's family, I probably care more about the outcome. If the AI reads the MRI accurately 78% of the time and a human does it at 40%... It's a no-brainer. Exactly. But all I care about are two things. One is that it's the best result my mom can get. Second is that it's safe, right? I don't care if it's that kind of mixture. So that regulatory framework is there. I'm not saying the FDA is perfect, but it is a functioning regulatory framework. So if there's an AI product
that goes into that MRI, I would like it to be subject to the regulatory framework. There we go. Yeah, yeah. Right. So that's where the rubber meets the road. The same with finance, environment, transportation, you name it. That's a very pragmatic approach. It's also needed, because we have AI products entering the consumer market. And it takes away from, in my opinion, the science-fiction rhetoric about existential crisis and machine overlords. That can stay with Hollywood. Yeah, yeah, yeah, yeah. I believe the downstream applications are where we should put our guardrail attention. Right. I really want to encourage people, even if people have only a cursory or no interest in AI,
I really think your book is one of my favorites I've read. It's just your personal story, as reluctant as you are to embrace it or talk about it, is a really special story. Thank you. I mean, what ground you've covered. Do you give yourself any moments where you go,
Goddamn girl, we got here. That's very sweet. That's the problem of always chasing after a North Star. I try to look forward. One thing I do reflect back on is how grateful I am. I'm not here by myself. I'm here because of the Bob Sabellas, the Jean Sabellas, the advisors, the students, the colleagues. For that I feel very, very lucky. Yeah, there's a lot of sweet people in the world still. Yeah, it's good. Yeah. It's helpful.
Oh, well, Fei-Fei, this has been a delight. I hope everyone gets your book, The Worlds I See: Curiosity, Exploration, and Discovery at the Dawn of AI. And boy, those lucky people that get to have you as a teacher. Oh, man, so jealous. I also love the narrator of your book. Have you listened to it on tape? A little bit. I didn't finish the audio. You didn't finish. Right. Yeah, it's hard to listen to your own stuff. Well, it's not her. I know, but your own stuff. You spent so much time writing it. Right. I'm like, do I have time? I should finish my Von Neumann book. Yeah, yeah. And you should read Maniac. Yeah, you've got a couple new books to read. I'm so grateful you like the book. Oh, I love it. It's just really beautiful. I love the narrator, but I was having the moment where I was like, I was only introduced to you through this book. I was completely ignorant about you. And then there's a narrator. When I was doing research on you, I'm like, oh, we're going to find out what the real voice is. Oh, come on.
I feel a little self-conscious because of my accent. Oh, really? Because I considered whether I should narrate my own book, but I feel like my accent is probably too strong for that. That wouldn't be the reason I'd advise you not to do it. I think it's way, way harder than people think. And there's a lot more acting involved. I've heard some writers narrate their own books. You've got to be a performer. Right. Forget your accent. There's a performance to be done. Right. And that's how many hundred pages. Yeah. Yeah.
You'd also probably need to put a lot of time into it. You have a lot of other stuff. Don't waste your time. Well, I hope you come back and see us again sometime. And I'll be following everything you do. And thank you for trying with all your might to make this a human-centered development. Thank you. It's so important. And I do think
creators and creators' voices are so important, because we started this conversation with what's different between human intelligence and AI, and that creativity, that insight, is a huge part of it. And now that we have generative AI trying to create things, I think the collaboration with humans is so important. Yeah. All right. Well, be well, and thanks for coming. Thank you.
Hi there, this is Hermium Permium. If you like that, you're going to love the fact check with Miss Monica. Hi, Monty. Hi. We had so much fun yesterday, didn't we? We did. I did. I did. We shot a commercial. So much fun. So much fun. Yeah. I had a really fun full circle moment. Okay, yes, please tell. Because I
I got out of the car. You know, I haven't acted in a while. Sure. We're all rusty. Yeah. I got out of the car and I started recognizing some of the people on set. And I realized I had worked with a lot of that crew on some previous commercials in my day. Sure. One of the many, many thousands of commercials you had done. It felt so nice and exciting.
And cool. Like, you know, I had done these commercials as just this actor auditioning and doing this thing. And now we're doing a commercial. Where they asked you to be in it. Yeah. And we're doing it together. Yeah.
Not for this podcast, but... But because of this podcast. Yeah. And there was something really cool about it. I agree. I liked it. And I think it's because my ring is fixed. I have some housekeeping. Okay, great. You know, I read the comments. And this is so embarrassing. I read it a couple of times and I'm like, these people are crazy. That's... I didn't say that. So people were like, you said the wrong voice of Darth Vader in the Morgan Freeman intro. Yeah.
And I thought they were saying I had said Morgan Freeman was the voice of Darth Vader. And I'm like, I know I didn't say that because I know he's not. And James Earl Jones was the voice of Darth Vader. And I said Edward James Olmos. So I did say it wrong. It was another three-name actor.
with an Edward in it. I see. Yeah, that's hard. So I fucked that up, and my apologies. Oh, and then the other thing was we had coitus interruptus, because we were chatting and I was going to give a Daniel Ricciardo update, because I had ridden motorcycles with him. Oh. But I guess then we got sidetracked and I never did. So all these people who are rightly concerned about our sweetheart, Daniel Ricciardo: how's he doing? We're left hanging. And I'm here to report that he's so happy. Yeah, he's doing great. He's so, so happy. We were riding motorcycles all day long and we chatted a bunch, and he said he's just very happy. I'm glad. Yeah, he's just doing really, really good. So people should rest assured that Danny Ric is thriving. Yay. Yay. Love to hear that. Yeah. Do you want to tell people what Toto texted you?
It was so funny. I had texted him to say, hey, people really love the episode, and me in particular, I really loved it. Thanks for doing it. And he said...
How are the numbers? You know I am a lap time guy. Oh, so playful. Oh, God. God. I got to say, I want to say out loud, that really put a lot of wind in my sails. That made me so happy to have that episode come out. It really...
right-sized my perspective. As I vocalize on here, it's been a challenging transition. I've been really stressed. There's been bad news and challenges. And this came out and I was like, oh, right, dumbass. You get to meet people that you are obsessed with and in love with. Holy lottery. Yeah, I was just beaming all day Wednesday from it. Yeah, it was a great episode. And so, yeah, just so cool. We get to talk to people.
Anyone we want to talk to. Not anyone. I still have a list. We still got Tay. Liquid death. I'm just pointing to objects. Monkey with huge balls. No, we already had machine gun Kelly. We did. We did. Okay. There's another fun update, but this I'm starting, I'm getting worried that people are going to be afraid to text me. I guess these people should know. I run it through my analysis. Okay. And I would never say anything that was in a text that I didn't think was just lovely. You know what I'm saying? I get worried about it. Don't you?
Like, you know, someone's got a private exchange with me. Yeah. And then I'm reporting on it. There's an ethical dilemma here. Sure. But sometimes they're so funny and I think the person would like it anyways. So I sent Pitt the clip of Toto talking about me telling him that Pitt said he was a good dancer and then Toto talking about him coming to dinner. Yeah. And then he said, I made up the thing about him being a good dancer. And I said...
Oh, no. I said, I can't believe you made that up. In fact, I don't believe you made that up. I still believe he's a great dancer. Yeah, me too. But he did say, because Toto was like, when did he see me dance? I know. But then he just had to go, well, I don't understand how that happened, but I'm going to take that. He was just. He's being funny. He was doing a yes and. He was doing a bit. He was like, you're not going to believe this. He's also a phenomenal dancer. But he was just with him. I believed it. Yeah.
I think the crux of that story is I'm gullible. I think he is a great dancer. Can we talk about Christmas a little bit? Sure. I got the fever as much as I've ever had it. As hard as I've ever had it. Let me tell you what's happening. So, so far, from our homework: Yeah. We watched Christmas Vacation already. Home Alone 1 and 2. Side note, I've never heard Delta laugh harder in my life
than at the 27-minute set piece in Home Alone 2 where he's hitting the guys with bricks. Oh, sure. She was laughing uncontrollably for like 27 minutes. She said at one point...
It doesn't get old. Like they threw a fifth brick or whatever, and she's like, it doesn't get old. And I got so much joy out of watching her have that big of a laugh at something. So cute. Okay, so Home Alone 2, we did Gremlins, another Christmas favorite for us. Last night we did the original Grinch Who Stole Christmas cartoon. And I want to go out and say, for the record,
it's the number one Christmas cartoon ever made. It is the most creative. We all watched it. How many more Christmas cartoons are there? There's a lot. You've got Rudolph. You've got the Charlie Brown. Oh, yeah. You've got... there's a bunch. Okay.
But I'm saying maybe even Christmas anything. It ended and I said, you know, Dr. Seuss should really be regarded like Salvador Dali. He had such a unique, imaginative world he created in the words, in the set pieces. I mean, that's one of the most creative people to ever live. Of course. I think he is given his due props. Yeah. You know, there's a Seuss land. There is? Yeah, at one of the parks.
There is? I think. Yeah, Seuss Landing in Orlando, Florida. In Orlando, Florida. I should go. You should go. You should pay your respects. I like when people use the term Seussian. Did you ever hear anyone use that? No, but I like it. Yeah, it's cool, right? Yeah. Yeah, like Newtonian, like it's a paradigm. But it kind of sounds like Sisyphusian, which is my favorite word. Yes.
Which is not my favorite word anymore. You taught me that word. Yeah. And I thank you for that. To remind people, Sisyphus pushed the rock up the hill every day. People usually interpret that as a story of wasted effort. You know what I'm saying? Yeah. Yeah. A fool's errand. Yeah. But there's a Buddhist way of looking at it, which is that this person had purpose every single day, all day long, and was probably not suffering. Yeah.
It's a story of suffering. It was a huge rock. Well, first of all, he's probably jacked beyond belief. So strong. But that's an interesting way to reframe it. That, like, no, this person every day of their life had purpose. Yeah. Probably very happy. That's a lovely way to look at it. Yeah. It's actually Sisyphean. Sisyphean? Yeah. I like Sisyphean. Me too. And I maintain it. Yeah. Okay. So you're in the Christmas spirit. Yes. And I wake the girls up every morning. I wake up about
20 minutes before the girls to meditate. And so now I'm, they wake up to me playing from my phone over the Sonos Christmas music. Wow.
And I want to make a great recommendation to people who are using Spotify. You can make a station: go to the Charlie Brown Christmas album, and then go specifically to the song Christmastime Is Here. Make a station out of Christmastime Is Here. It's the best Christmas mix I've ever had. Ooh, that sounds lovely. And it's on all the time. And so, you know, the tree is fantastically over-decorated. Yeah. You know, we get one tree, and Kristen gets a tree in the kitchen, and hers is artistic. And this year it's Wicked. Oh, cute. Yeah. And our tree is a throw-up of color.
And I have those old-fashioned bulbs where the water bubbles up in them. They're almost impossible to get to sit vertical on your tree. I've spent most of my free time positioning all of them. And then I pull the cord and they all fall down. It's a Sisyphean task. Wow. Ding, ding, ding. I didn't expect it to come around that quick. I had all this anxiety about presents, but I knocked a bunch of presents out the other day. Nice. You used a little bit of my gift guide. I used your gift guide
almost exclusively. There were good gifts on there. One complaint about your gift guide, though. You make things sell out. Your gift guide is moving markets. Ha ha ha.
Yeah, well, I pick great items. Yeah. I have to say. You do. You have exquisite taste. Thank you. Some of your recommendations were so good that I found myself dancing around on the websites. Yes, that's the goal. Yeah. Yep. And yeah. There's fun stuff abound. There's fun stuff. So. So, and let's just, so your tree has colored lights, right? Yes.
So many. Yeah. I have four strands, really long strands, and four of those bubbly light strands. Sure. And the tree's touching the ceiling. It's a Clark Griswold. It's too big. Uh-huh. And I had to cut a foot off it. But I just want to talk about lights. Oh, okay. Okay.
My apologies, Miss Monica. Miss Monica, I'm sorry. I get so carried away sometimes when the spirit moves me. I don't leave my apartment much, so I really enjoy decorating it, get all those colors. It makes me optimistic. I wonder how Hermium, does he have a delivery service? How does he get his tree? I have a cousin who's not working at the moment, and he loves going to department stores and
plazas and shopping malls and strip malls. Wow. And I'll call him on the landline. That's what I have, Miss Monica. I pick up the phone and I call him. His name is Bert. Oh. Yeah, he's my... did I say my brother-in-law or my cousin? You said cousin. Yeah, he's my cousin. I just remembered.
Weirdly enough, he's also my brother-in-law, but it's my stepsister. Okay, so it's all on the up and up. Everything's on the above board, as they say. I call up Bert and I say, here's what I need, Bert. Six water weenies, 10 spatulas, and Bert. It takes him a while, sometimes four or five days, and then he comes over. He does charge me a little more, but that's okay. Sure. And then I have to call him back up and ask him to deliver the presents. Wow. That's okay, though. He charges me for that, too. Okay. Okay.
Getting a little taken advantage of, but that's, you seem fine with it. Okay, great. Mom? Remember, I'm not your mom. Okay, Miss Monica. Mom Monica. Um. Mrs. Mom. Okay.
Color lights. Yes, the lights, because Kristen, I assume, has white lights on her nice tree. Yep. Yeah. Yeah. And this is, you know, this is a big thing. I don't know if it's a... Rob. Yes. What color lights do you have? Well, first of all, do you have the lights you want? Yes. Okay. I do. I like the yellowy colors.
White light. Warm gold. Yeah. White lights. That's white. I mean, there's shades of white, too. He's trying to walk in the middle and be nice, but really he has white lights and he likes them. I have white lights, but they're kind of yellowy.
Yeah, I know what you mean. There's like a warm and a cool. Now, listen, sometimes you complain about there being two boys, one girl in this situation. But you have to admit, Rob is a perfect middle ground. Like if Aaron was here, it would suck. He disproves my gender stereotypes quite a bit. Yeah, yes, because Aaron grew up exactly like you. So it doesn't fit. You just assume it's men because you and Aaron believe it. That's right, Monica. That's right, Miss Monica.
So, yes, Rob did not grow up with you and like you. Oh, he's from the big windy. So I don't think it's gender, but I do think some people love the nostalgic colored lights. Yeah. And then other people who care about aesthetics.
Love the white light. I could really get on my high horse about it. I used to have a really strong stance on it. And it's all my class warfare stuff. Yeah, which is... It's so tired. Is that what you're going to say? No, I wasn't going to say that. Your life does not match that mentality anymore. Doesn't at all. But did you see Chris Rock's latest standup? He said, I am rich, but I identify as poor. Yeah, that's fine. Okay, for him. Yeah. Okay, but for me...
Yeah, you aren't, you're of the highest class in this country. Yeah, well, there's people with a lot more money than me, but I do have- You're of the highest class. Okay, okay. And you also hobnob with the highest class. Yeah, but you know what? I act like myself and I have color. Here's what I'll say. The white, all white Christmas tree- Yeah, yeah.
is like, occasionally I'd see that at people's houses that had an extra living room that no one went into, and you weren't allowed to go in there, you know, take off all your shoes, you get in a fucking Intel outfit to go in the room. And all of it seems stuffy and not playful and fun and colorful. It felt very presentational.
And where's your tree? But I used to be judgmental of that. I still don't like it, but I'm not as judgmental. Because your second tree is in your second living room.
Okay. Okay. You know, I got to keep you. I got to just remind you. I know I'm spoiled. I know I'm spoiled. Yeah, okay. I'm really spoiled. It's just, to me, the class warfare thing, I would hope you now see. That it wouldn't be fair for a stranger to hate me just because I have money? Yeah. Yeah. Yeah, I would feel that way on the other side of it, but I wouldn't.
I wouldn't expect anyone to feel that way, not be on the other side of it because I get it. Okay. So I have white lights, obviously. Yeah, I know that. I would know that. Yeah, everyone would know that. You don't need to tell me that. I know that. And I'm not judgmental of you. I'm so glad you're having the Christmas you've always wanted. Thank you. Yeah. Yeah.
Jess and I had pick day and we went to Home Depot. We just missed you, I guess, because it seems like the timing... Yeah, because we were there at like 11 a.m. on a Saturday, and you were there at 11 a.m. But I gotta say, this is my record of all time. I was so fast and there was no fighting. This is like the first year in a few.
That day is very triggering for our family. I think it's hard for families that have to decide. You got to compromise. And everyone has their things they care about. And luckily, Jess and I have the same thing. We don't like bald. You call the tree a bald puss? Bald pussy. Bald pussy. If there's bald spots. Okay, great. And we don't like that. Okay. You like more of a Brazilian tree? No, Brazilian is... That's shaped and full. Brazilian? Yeah.
Yeah. Isn't a Brazilian like you have a landing strip? I thought Brazilian is... Clean? Clean. Rob? Do you want me to Google? Yes, I do. Just definition of Brazilian wax. And pictures? And pictures. You can do that on your own time. It removes most or all of the hair from the pubic region, including the front, sides, back, and often the area around the anus. Yeah. Okay.
I'm glad I... What's the landing? The landing strip's just the landing strip? Yeah, there's like, you can just get different kinds, but Brazilian generally means all hair. Do you think any dudes get a landing strip? I was just thinking I want to go do that just as a bit. I've done that as a joke. You have? For Natalie. Oh! Oh my God. That's so funny. Did she laugh? Did it make her horny? No. It was not meant to be. That's really funny.
Stay tuned for more Armchair Expert, if you dare. Okay, let's take a break from the fact check to thank our presenting sponsor, Amazon Prime. Prime has you covered with movies, music, and everything you could possibly need to make the holidays perfect.
Whatever you're into, it's on Prime. This is very exciting. It's holiday party season. Yes, it is. That time of year. Work parties, family parties, parties with friends. Party parties. Parties with your animals.
If you're as popular as Monica, you're hitting the party circuit. It's a great reason to shop for new clothes or accessories and really like spice up your wardrobe. Make it fancy. Prime can help with that, especially if you decide last minute you want to buy something new. You're set with Prime's fast, free shipping. And hey.
What you're buying for holiday parties depends on whether you're a guest or a host. If you're hosting, then you're going deep on Prime to find everything you need to make your home feel fun and festive and perfectly like you. Oh, tell me about it. I really like to make my house feel very me during the holidays.
You could be decorating the outside of the house, getting some lights, something for the windows. Grab some new holiday towels, some festive hand soap. Oh, I love a good festive hand soap. Candles. You really can do it all. Absolutely. And you can get all those things on Prime. Oh, and one other thing. Amazon Music is here to help with the playlist. Curating the party playlist now...
It's an art. Amazon Music will get the vibe right. Listen, what we're saying is anything you need for a holiday party is on Prime. Nice sweaters, goofy sweaters for the ugly sweater party, holiday decor, gifts for the host, or fun small stuff for a gift exchange at work. The sky is the limit when Prime's fast, free shipping is at your fingertips.
From streaming to shopping, it's on Prime. Visit Amazon.com slash Prime to get more out of whatever you're into.
Now, we were also so quick. So quick. In fact, it was almost eerie. We walked in and we were doing just like a quick look, and Jess just beelined. He knew his daughter. Like in Christmas Vacation, there was a beam of light shining down on it. Yes. And he knew his daughter. Yeah. And it was the one. Are you his daughter? Because I think
you're viewed more as his mom. No, the tree is our daughter. Okay, okay. That makes sense. We co-parent. Okay. But she lives at my house. Yeah, so I got you. So he's a little bit of a deadbeat dad, but whatever. And she's really pretty. She's so nice. She's...
We said, because last year our tree was a boy, and he was a model. Oh. He was gorgeous. Striking. Striking and very perfect. Angular. Exactly. Not round features. Very angular. Not round features. This girl, she's not a model, but she's a star. Oh.
Oh, yeah. That's the kind I like. Exactly. And I've been trying some different hats on her. Toppers. Oh, okay. Hats. I haven't decided yet. Is there no part of you that...
Like, what I really... the softest spot in my heart is for A Charlie Brown Christmas, when they get that really bad tree. Charlie Brown did a bad job and they hated it. They're yelling at Charlie because of the tree. But then they decide to love it, and it's a good little tree. It's a sweet story. And I'm always drawn to the shitty tree there, because I think, no one wants this tree, and we'd have a great Christmas with this tree. I have a real... I get emotional about it.
Wow. Yeah, I want to like rescue the shitty tree. Oh my God, the way you feel about the trees is like how Kristen feels about the dogs. That's right, that's right. And all because of Charlie Brown, I think. Wow. So yeah, so the girls have one agenda, which is to never like the same tree, I think is their agenda. Of course.
And then mom has an agenda. Mom's very aesthetic. You know, it's very important to her. So for her, trees are not dogs. She wants a pretty one. Yeah. Yeah. She's got something in her mind she's looking for. Yeah. My singular goal is when you pull into Home Depot, you can either park...
And then go buy the tree and then enter the line to pull up where they'll put the tree in. And I want to just pull into the line and know that they can get that tree fast enough that by the time it inches up to the front, we'll have gotten a tree. So my only objective is to get...
The trees in time, by the time I'm pulling up the truck. Because Kristen stays in the car? No, in previous years, they go in and I wait in the car. This year, I went too. So what'd you do with the truck? I just parked it and I'm like, I'm going to run in. I'm going to see if I find a tree. It's not going to move up that fast. They got to load a tree. I didn't hold anyone up. Okay. And then we got the trees by the time I pulled up. So that was my goal.
Mine's way less aesthetic and way more time management. Yeah, I don't feel bad for the... You don't? I don't. How could you not? Well... A tree that no one wants, Monica? It's already... Dax, it's already dead.
I always get a tree that has a little bit of personality. And by personality, I mean missing parts. Bald puss? Miss Monica, I don't know what you just said, but please don't say it again. You want to talk about that? I'm here to buy a big old Christmas tree. Tell me about your tree. Does it have a Brazilian name?
What? Stop. What do you put under your tree? Make him go away. Make him go away. I can only take, I can't, I kind of forgot. It sounds though like you guys are very Frito-esque when you're shopping for this tree. You're just not doing the voice. Talking about bald, I don't even want to say it either. My God. And you're saying it's your daughter?
This is twisted. Certainly don't want Jess talking about his daughter in that fashion. No. Last update. It was time for a crop, a harvest. Everyone already knows that. I feel like people are going to have a bunch of judgment about this. I guess fuck them. Delta's like, I want to shave my legs. Will you shave my legs? Yeah. I'm like, okay. People are going to be like, you shouldn't shave your kid's legs. I can already feel that coming. But I don't give a fuck. She wants me to shave her legs. Yeah, why not? She feels left out. I did it.
Her leg hair is also cashmere. It is. So we now have two fields in rotation. And so I want you to see what an enormous. Are you combining? Yes. It's now father-daughter cashmere. And I want you to, you remember how much we had just two days ago. Yeah, practically none.
Look at the amount of cashmere we now have. Wow. It's like quadrupled in size. I was making a joke that we might get a mitten or a scarf in 10 years, but I actually think that's a real possibility now. Look at the amount in there now. Do you want to feel her? I do. I want to touch it, but also last time we touched it. Some of it disappeared. That's okay. Now we got two growers. Wow. There is so much. Yeah. Yeah.
Now you have two growers. We got basically a mink farm. Are they separated? There's no real. Yeah, it's just I think it's separated. Wow. It's so soft. Yeah, I think hers might even be softer than my back hair. Oh, my God. But that's got a time limit. Her leg hair will turn into shitty hair like our leg hair. But currently she is growing cashmere. Oh, my God. You think I need to get a work permit for her?
Because she is now kind of actively... You're probably illegal. It's like, yeah, it's illegal. I don't want to out her because she did such a great job. Lincoln shaved my back, did a great job. Yeah. But she was... She thought she had some cashmere on the razor and she emptied a little bit into our pouch. And then I discovered, no, some of that was beard hair. So I had to actually go in and pull it. Now I'm getting embarrassed. It sounds like a bit, but then you realize, no, it's not a bit. He's really...
That happened. Did you use tweezers? No, I just, I could feel and I'd pull that out. I probably lost a lot of really good product. That's okay. Yeah, we live and learn. This is an R&D situation. It was only the second harvest, so. Wow. Still learning a lot. Exciting. All right. Oh, one more thing. One, one.
cool thing that happened that I want to put out there in the world because I think it's good for me to manifest this. When Callie and I were shopping, we went into one store and I bought some cute little boxer shorts. Okay. As we were leaving, Callie was in front of me and someone had held the door open for
Oh, oh, oh.
Oh, okay. And- It looked like the look on your face was that it was a famous person. It was the most- Gorgeous. Gorgeous person I've ever seen. Really? Male or female? Male. Give me age, height, describe. Build it for me. But now he's sort of a haze. Oh. Like, I don't remember- I don't.
I like that part. I know, I know. But part of it was, it happened so fast. He took my breath away. Yeah. And...
And I think it read, you know, it read. Okay. Your face betrayed you. Yeah. And he smiled and I don't remember if he showed teeth or not. Flirting. But no, he just like, that's who he is. Okay. And I turned, you know, I turned and I said, oh my God, that guy was so hot. And she said, I know. Callie was fucked up too. Yes. So we, so this is an undeniable situation. You should have gone back inside to talk to the third woman.
Who entered. Well, I think they were together. Oh. Well, I don't know. Okay. There's no way to know. So this is a lost persons report. Exactly. If you open a door for Callie and Monica at the farmer's market. Brentwood Country Mart. Brentwood Country Mart. Yeah. On Black Friday. On Black Friday. Probably around noon. Okay. Yeah. Contact, I guess, comment in this. I'll read it. I'll read them all. Comment or. Don't. No catfishes.
Exactly. I guess you'll be able to see the photo though and you'll know. Oh, I'll know. And no one could fake it. No, because, you know, when I walk through the world, I'm extremely unobservant. I don't notice people. You're blind, basically. I really am. And speaking of blind, I got some soap in my eye this morning and it was- Blinding? I thought I did some permanent damage. Of course. Okay. So anyway, I walk around so unobservant and yet this person was strong. He pulled me out.
It was shocking. He's like a lifeline. He was so attractive. How many more times did you think about him? That day? A lot of times. A ton. Yeah. Did you like whip up fantasies? I know you're prone to fantasies. I am prone to fantasies. I didn't actually. I was more just like- Thunderstruck. I was just taken. Love at first sight. A little bit. Oh my God, Monica. And I don't even believe in that bit, like maybe. Anyway, that was a big mystery. Yeah.
Wow. And how often have you thought of him since then? Daily or once every few days? No, it's starting to dissipate. And I don't remember him at all. I sure hope he reaches out in the comments. Me too. Also, no bullshit, no catfishing. Yeah, guys, seriously. Stop catfishing, everybody. Seriously.
Okay. Anyway, so that, um, we'll add that to the mystery pile with the guy I met at, uh, in New York, the restaurant guy. Oh, right. That mystery is so delicious. Did you ever eat a big catfish sandwich, Monica? I don't give him permission to say my name. What do you want him to call you? I don't want him. Don't leave it. Don't let, let him decide. Okay. This is for Fei-Fei Li. Oh,
And a ding, ding, ding, we just interviewed someone who knows her intimately. Yeah. Not intimately. Yeah. I mean, sexually. Yeah.
The colleague. We just interviewed a colleague. And he was giving her a lot of props and reverence that she really deserves. She was, I loved her so much. I loved her so much too. She was a delight. Yeah. Now some facts for her. How long did Einstein live at Princeton? He lived in Princeton, New Jersey for 22 years.
from 1933 until his death in 1955. He purchased a house at 112 Mercer Street, which became his home until his death. The house was for him, his wife Elsa, stepdaughter Margot, and secretary Helen Dukas. His secretary lived with him? I guess so. Interesting. I bet it's more like an assistant. Nowadays, we'd call it an assistant. You're probably right. Yeah, I guess secretaries were just assistants. Okay. Oh, we talk about Cambridge Analytica. Yeah.
which was the whole thing that happened with Facebook. I encourage people to listen to Acquired, the podcast Acquired. They do an episode on Meta, fantastic episode. Yeah.
And they talk about what happens with the Facebook Cambridge Analytica scandal. And a lot of it's very misunderstood. A lot of what the public thinks, we're all missing a ton of information. It's kind of like a Martha Stewart thing. Exactly. Where we all think she traded her company. Exactly. And that's not what it was. And she didn't. Nor did she even do any insider trading. I know. But she still went to prison.
But yeah, the nefarious activity was on Cambridge Analytica, not Meta. Right, but also they- And they were just using existing tools that anyone could have been using. But they were using old information from an old quiz or something that Facebook did a long time ago. And that's what they used. They weren't using current information. And yeah, they, like Facebook didn't sign off on-
They didn't hand over this information. Everyone should listen to Acquired just period. It's such a good podcast. It is. It is. I'm always shocked. Yeah. If you like a deep dive, that's the show for you. In the business world, like learning. I mean, you listen. I mean, they're four hours long. Some are six. Yeah. They spend a month researching a company and then they just tell you everything about the business and how it came to be and all of it. And you do leave feeling like you went to
Like you took a course in business. Oh, big time. Yeah, yeah, yeah. I recommend Acquired. And that's it. That was it? Yeah. Okay. I just adore her. I wish her the best. Me too. I'm grateful for her. Me too. Yeah. That's the line we learned from the Lisa Kudrow fact check that we say to people that I'm just grateful for you. I'm grateful you exist. Oh, yeah, yeah, yeah. Now we know. And I'm grateful for her existence. Yeah, me too. Okay. And Toto's a great dancer. Don't listen to anybody else.
He's great. Love you. Love you.