
Runway AI with Joel Kwartler

2024/11/27

Software Engineering Daily

People
Joel Kwartler
Topics
Joel Kwartler: Runway AI is a full-stack applied AI research company building multimodal AI systems, model deployment infrastructure, and products that use AI for multimedia content creation. The company recently released its Gen-3 Alpha model, which is trained jointly on videos and images and will power text-to-video, image-to-video, and text-to-image tools. Runway's tools are designed to augment human creativity, not replace it. The company works closely with artists and technologists, and its products offer distinctive features, such as the multi-motion brush and camera controls, built for the needs of creative professionals. Runway's users come from many fields and industries, with use cases ranging from large enterprises to freelancers, and even well-known artists. The company takes data privacy and security seriously, employing multiple measures to protect user data, and actively addresses challenges such as deepfakes. Its future direction is building general world models that understand the entire visual world and its dynamics. Gregor Vand: As host, Gregor Vand guides the conversation, poses questions, and responds to and builds on Joel Kwartler's answers. He approaches Runway's products and technology from a security and technical angle, asking about data privacy, deepfake safeguards, and technical architecture. His questions help the audience better understand Runway's technology and products, and its position in the industry.

Deep Dive

Key Insights

What is Runway's primary focus in the AI space?

Runway focuses on generative AI diffusion models rather than large language models (LLMs). They build multi-modal AI systems and tools for multimedia content creation.
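To make the LLM-versus-diffusion distinction concrete, here is a toy numerical sketch of the core diffusion idea: data is progressively mixed with Gaussian noise, and generation inverts that process. This is purely illustrative (an oracle noise prediction stands in for a trained network) and says nothing about Runway's actual architecture.

```python
import numpy as np

def forward_noise(x0, alpha_bar, rng):
    """Forward process: sample x_t ~ q(x_t | x_0)
    = sqrt(alpha_bar) * x0 + sqrt(1 - alpha_bar) * eps."""
    eps = rng.standard_normal(x0.shape)
    xt = np.sqrt(alpha_bar) * x0 + np.sqrt(1.0 - alpha_bar) * eps
    return xt, eps

def denoise(xt, eps_pred, alpha_bar):
    """Invert the forward step given a noise prediction.
    A real diffusion model learns eps_pred; here the caller supplies it."""
    return (xt - np.sqrt(1.0 - alpha_bar) * eps_pred) / np.sqrt(alpha_bar)

rng = np.random.default_rng(0)
x0 = rng.standard_normal((8, 8))       # stand-in for an image or video frame
xt, eps = forward_noise(x0, 0.5, rng)  # heavily noised version
x0_hat = denoise(xt, eps, 0.5)         # recovered exactly with oracle noise
```

With a trained network predicting the noise from `xt` alone, the same inversion run over many small steps turns pure noise into an image or frame; LLMs, by contrast, generate one discrete token at a time.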

What was Joel Kwartler's journey before joining Runway?

Joel had a background in both creative and technical fields, studying computer science and English. He worked on ML projects for comedy writing and in creative tooling startups like Figma and Sourcegraph before joining Runway in 2023.
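The "ML for comedy writing" of that era typically meant models far simpler than today's: for instance, word-level Markov chains behind predictive-text keyboards, which pick the next word from the words that followed the current context in the training text. A minimal illustrative sketch (not the group's actual tooling):

```python
import random
from collections import defaultdict

def train(corpus: str, order: int = 1):
    """Build a word-level Markov chain: map each `order`-word context
    to the list of words observed to follow it."""
    words = corpus.split()
    chains = defaultdict(list)
    for i in range(len(words) - order):
        chains[tuple(words[i:i + order])].append(words[i + order])
    return chains

def generate(chains, seed, n_words=10, rng=None):
    """Sample a continuation of `seed` by repeatedly picking a random
    observed successor of the current context."""
    rng = rng or random.Random(0)
    out = list(seed)
    for _ in range(n_words):
        successors = chains.get(tuple(out[-len(seed):]))
        if not successors:
            break  # context never seen in training; stop generating
        out.append(rng.choice(successors))
    return " ".join(out)
```

Trained on enough headlines, chains like this produce the semi-coherent output that human comedians then curated by hand, which is the workflow Joel describes later in the episode.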

What is Joel Kwartler's role at Runway?

Joel leads the product team at Runway, overseeing collaboration across teams, including researchers, engineers, designers, and users, to ensure the development of creative AI tools.

Where is Runway headquartered?

Runway is a remote-first company with offices in New York and San Francisco. Employees can work from anywhere, but the company gathers for in-person events like film festivals and offsites.

What is Runway's mission?

Runway aims to enable anyone to tell any story they can imagine at the highest quality possible, eliminating the need for large VFX budgets.

What are some unique features of Runway's tools?

Runway offers features like multi-motion brush and camera controls, allowing users to have granular control over animations and effects, such as zooming while moving or making objects float.

Who are Runway's users?

Runway's users span industries, including Fortune 500 companies, freelancers, marketers, film studios, and even artists like Madonna and A$AP Rocky, who use the platform for music videos and live performances.

How does Runway handle data privacy and security?

Runway has enterprise-grade security measures, including automatic moderation systems, C2PA authentication for media provenance, and user controls for data deletion. They prioritize privacy and security as model capabilities grow.
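The C2PA mechanism mentioned here can be pictured as a signed manifest that binds a hash of the media bytes to claims about its origin; if either the bytes or the claims change, verification fails. The sketch below is a simplified illustration only: real C2PA uses X.509 certificate signatures and embedded JUMBF metadata, whereas an HMAC with a made-up key stands in here.

```python
import hashlib
import hmac

SIGNING_KEY = b"demo-signing-key"  # hypothetical key, not a real credential

def make_manifest(media: bytes, claim: str) -> dict:
    """Bind a content hash and an origin claim together under a signature."""
    digest = hashlib.sha256(media).hexdigest()
    payload = f"{digest}|{claim}".encode()
    signature = hmac.new(SIGNING_KEY, payload, hashlib.sha256).hexdigest()
    return {"content_hash": digest, "claim": claim, "signature": signature}

def verify_manifest(media: bytes, manifest: dict) -> bool:
    """Check that the media is unmodified and the claim is authentic."""
    digest = hashlib.sha256(media).hexdigest()
    if digest != manifest["content_hash"]:
        return False  # media bytes were altered after signing
    payload = f"{digest}|{manifest['claim']}".encode()
    expected = hmac.new(SIGNING_KEY, payload, hashlib.sha256).hexdigest()
    return hmac.compare_digest(expected, manifest["signature"])
```

The point of the design is that provenance travels with the file: any downstream viewer holding the verification key can detect tampering without contacting the generator.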

What is Runway's approach to preventing deepfakes?

Runway employs visual and text moderation systems, C2PA authentication, and ongoing investment in alignment and safety measures to prevent the misuse of its tools for creating harmful content like deepfakes.

What is Runway's vision for future models?

Runway is focused on building general world models that understand the entire visual world and its dynamics. Their latest Gen-3 Alpha model is a step toward this goal, with ongoing improvements in understanding and simulation.

Chapters
Joel Kwartler's background blends computer science, English, and experience in creative tooling startups. His work with ML for comedy writing at Botnik Studios and roles at Figma and Sourcegraph led him to Runway, where he saw the potential of generative AI to enhance creativity.
  • Joel Kwartler's interdisciplinary background in computer science and English.
  • His experience with ML for comedy writing at Botnik Studios.
  • His roles at Figma and Sourcegraph in creative tooling spaces.
  • His decision to join Runway in 2023 due to the potential of generative AI.

Shownotes Transcript


Runway is an applied AI research company building multimodal AI systems, model deployment infrastructure, and products that leverage AI for multimedia content. They're among a handful of high-profile video generation startups and have raised impressive amounts of funding from investors such as Google, NVIDIA, and Salesforce Ventures.

The company recently released their Gen-3 Alpha model, which is trained jointly on videos and images and will power text-to-video, image-to-video, and text-to-image tools. Joel Kwartler is Runway's group product manager. He joins the podcast with Gregor Vand to talk about Runway and the technology the company is developing.

Gregor Vand is a security-focused technologist and is the founder and CTO of MailPass. Previously, Gregor was a CTO across cybersecurity, cyber insurance, and general software engineering companies. He has been based in Asia Pacific for almost a decade and can be found via his profile at vand.hk.

Hi Joel, welcome to Software Engineering Daily. Thank you for having me. Yeah Joel, it's great to have you here today. You come in with the company Runway, the platform Runway. We're going to hear all about Runway very soon. It's in the LLM AI space, just to kind of cover that one off. And I only say that now because I think hearing about sort of your history before Runway will be kind of interesting. Sure. What was the journey to joining Runway?

Yeah, of course. And we'll cover this more in depth later, but Runway is not in the LLM space. It's more in, sort of, the generative AI diffusion model space. That's a great distinction. So yeah, thank you for clarifying that. No worries.

My journey before Runway really sort of led me directly to Runway in a couple of weird ways. I'd always been one foot in the creative world and one foot in the more technical products and tooling worlds. In college, I studied basically computer science and English and was always back and forth between those two fields.

So I'd actually become part of this group that was doing, like, ML for comedy writing back in, like, 2018-ish, called Botnik Studios. It was a mix of, like, ClickHole and Onion writers and, you know, ML PhDs. And we were just sort of playing with the generation of ML models back then that were, like, you know, Markov chains, adversarial neural nets, things like that, to see if we could generate, like, anything that was funny, basically. And we trained, like,

predictive text keyboards and match tone or, you know, get a bunch of outputs from a neural net and go through them as comedians and try and pick out the funny ones. And so that was, I guess, sort of part of the realm in which I'd always been paying attention to like, when is machine learning or how does machine learning sort of accelerate creativity and the stuff that you might want to create and

And at the same time, sort of in my career, I'd been working at a bunch of startups that were sort of in these creative tooling spaces. I was at Figma. I was at Sourcegraph for a while. Sort of always building things for, in my definition, you know, creatives, which includes engineers, includes designers, because I was a little bit selfish. You know, tools that I enjoyed using and products that I enjoyed using, and for people who I thought liked their tools. And that's the coolest thing you can do. And so building tools for people

was always fun. And so those sort of started to combine, or really started to cross paths. In late 2022, I'd gotten the feeling that, like, you know, I wanted to maybe step back from just, like, traditional tool building on the product side. As a result of the Botnik work, I'd been paying attention to, like, you know, GPT-3. It was getting really interesting. Some of the image models, too. It felt like, oh, we're at, like, the precipice of, you know, what might be a huge step up in suddenly what it means to be creative using technology. And

Ultimately, you know, sort of like, OK, well, there's got to be a company already in the space, right? Like doing interesting things. And of course there was and it was Runway. And so it was sort of just this perfect combination of like, well, suddenly I don't have to be sort of like one foot on the creative comedy side. I live in Los Angeles. I do some stand up on the side and one foot in the like, you know, tech startup side. I could just be like both feet all in. And so I joined Runway at the start of 2023.

Yeah, that's fascinating. I guess I never thought about kind of ML and comedy coming into like the same space. So and especially back in 2018, that clearly is sort of very early to any of this kind of thing. So super interesting. What is your role at Runway today?

Yeah, at Runway, I lead the product team. So I have the very exciting and very fun job of getting to work really across every other team we have at Runway, and getting to work with even folks outside Runway, like our users, our customers, you know, the folks that we're taking feedback from, that are trying this stuff and giving us feedback, all the way to, you know, the researchers, the engineers, the designers working on building it into the product, making sure then we're, you know, communicating effectively via our sales and marketing teams and our community teams. And so Runway is, you know, maybe like a traditional startup team

squared, in that we have, like, all the traditional teams, and we have all these, like, additional very exciting teams, like our creative team and our research team, that you're not going to find at an early-stage startup unless you're at a place like Runway.

And I mean, you're sitting in L.A. today, and I believe the company is based in L.A. Is that correct?

Runway is actually a remote-first company based in New York. So we have an office in New York, and then we also have an office in San Francisco. And then remote-first just means that, like, folks, even if you're in New York or San Francisco, there's no, like, days-in-the-office requirement. You're welcome to come in whenever you want. But it's been, you know, it's fun. You know, once or twice a year, we all get together at our office for, like, our film festival or for our offsite, and sort of get to meet everyone and work together in person for a week or two.

Got it. I guess I was curious on the L.A. connection in case, you know, ultimately a lot of film production, et cetera, happens out that way, and whether there was sort of a strategic help there. Yes, it's definitely been useful living here, I think. And I think we have a couple other folks who are in L.A., and it was just, again, sort of natural: people who are really interested in the space were in L.A., and they were also interested in the ML side. And so they joined Runway. Awesome.

So let's dive into the product. As you called out, it is not an LLM. It is Gen AI. Thank you for the correction there. Let's hear about Runway. Like what is Runway? Yeah. Runway is sort of a rare company in that it's like a full stack applied AI research company. And so we both like invent and build these AI models. And then we also invent and build the tools on top of them.

that really unlock the new forms of creativity and streamline the entire creative process from concept to finished product for really pretty much every use case you can imagine. And so it's a really unique place because those are, I think, two of the maybe juiciest or most interesting areas to work on right now in technology in the world. It is the research side of like what's possible. And then there's, okay, if it's possible, a way for us to interact with it.

And the nice thing about having both of those sort of in-house, Runway being full stack, is you don't even have to, like, work on them separately, you know. They're both there, and they actually can inform each other. And so we get to bring, you know, learnings from our product directly back to the research team. And likewise, we can bring sort of research experiments directly into new product experiments in ways that, you know, we couldn't if those were separate companies.

So I mean, if I'm a user-- I'm just trying to paint the picture for our listeners. If I was to jump into Runway, what's the first thing I'll be doing? And what do the outputs look like to me as a user? Yeah, great question. Often the first thing people do, they jump into Runway and they jump into our Gen-3 Alpha model, which is the latest version of our video generation models. And then you can start with either a quick text sentence or an image of, let's say, a dog running through a field made of balloons,

and it will generate sort of a video of that. Or, you know, often we see people taking images that are actual photos they took, you know, and then adding fun VFX to that that you'd never get otherwise. So, you know, a photo of, say, my kitchen filling up with balloons, and it basically sort of creates the effect without needing a traditional pipeline that would take a long, long time to do. Yeah. And I think, sort of from what I read, one of your positionings is tools for human imagination. And, you know, I think a lot of

critique on anything in the Gen AI space, whether that's, you know, for example, music or art effectively, you know, I think a lot of people have opinions around, sort of, well, who's doing the imagination bit now. So how do you guys sort of look at that in terms of, I guess, assisting imagination without kind of removing the human's need to think about things?

Yeah, I think one of the nicest things about Runway is that that was, you know, effectively a solved problem from the start, because the co-founders sort of all met at this, like, art-tech program at NYU, and they were all sort of in both the art and tech worlds already. And so there was never a question of, like,

oh, some technologists have this model, they could do interesting things, so how do you involve artists? It was, like, you know, from day one, the DNA of the company was artists and technologists together. And we really see that as we grew. We hired a lot of people who, naturally, might be working as engineers but are visual artists on the side, and we have our direct creative team, a very sizable in-house creative team. And so there was never... you know, one of the things that I was evaluating when I was considering joining Runway was, like, is there ever a chance

that the artist and tech sort of needs, like, diverge. And it was clear that was never going to happen at Runway, because it was founded by artists, for artists, effectively. And it was going to be focused on, you know, building tools for humans, as opposed to building tools where the humans are not involved. Got it. Okay. And let's sort of, I guess, look at how the output actually happens, in a sense. We're just talking about speed versus quality here, I guess,

which is one of the biggest considerations, again, when people are using any kind of Gen AI tool. Sure, you can get things fast, but does it sort of feel like something incredible? How do you guys balance that from a product perspective?

Yeah, great question. I think that even goes back to the question of, like, how does it, you know, enhance human creativity? And the speed is a big part of it, right? Because it dramatically increases sort of how fast you're able to iterate on your ideas. And when you see something, you know, that's real, it's very different than, like, imagining how it might look, and it can spark new ideas. And so that really accelerates things.

But likewise, the quality has to be really good, right? For it to actually be interesting to go through all these ideas, you've got to actually see and have a reaction, like, oh, that shot, you know, is really not what I'm looking for, I'm not going to go in that direction. And so we really focus on both. Both are really important to the creative process and the way that humans create. And so we do a good job, I think, collecting a lot of feedback from customers to make sure, you know, we're balancing those needs effectively. But ultimately, we've seen over the past year, year and a half of our research that both just improved dramatically.

So, you know, Runway produces content. Like, if I just sort of look at Runway's website, for example, and look at all the examples, to me, I haven't seen anything quite like that. And at the same time, again, people listening today might kind of think, oh, well, I already use this other tool for Gen AI in the, whether it's image or video, space. How would you, I guess, start to really be able to describe the differentiation? I mean, I'm aware there's sort of some

features that are kind of put forward, things like multi-motion brush and camera control. Could you maybe speak a bit to those, and then, you know, anything else in terms of how you sort of position this? Yeah, good question. I'm happy to talk about the features. I think I would say, broadly, the features that are different are almost just, like,

you know, an effect of the cause, which is that Runway is the most focused on, sort of, building tools, again, for creatives. And so we work really, really closely with creatives to understand what they need and where their workflows are going, for example. And so we have some controls, like you mentioned, multi-motion brush and camera controls, which give you, you know, very direct, like, camera-level control. Like, I want to zoom in while I'm moving to the right, or I want, you know, these three marbles to fall off the table, but I want these two marbles to float into the sky,

and sort of those kinds of granular controls that you need to really create interesting and unique content. But more broadly, you know, Runway has had a very stable vision, you know, the entire time that I've been here, since the company was founded, which is, ultimately, we think the technology is going to enable anyone to tell any story they could imagine at, like, the highest quality imaginable, right? You're not going to need a $100 million VFX budget to tell the sci-fi story that you envision, or the commercial that you envision. And so as a result,

that drives a lot of our research and a lot of our product updates. And so you see, you know, the effect of that as a user, you end up with all these features that are going to be very unique because we know you need those controls, but it also, I think, drives our research vision and it drives, you know, how we approach building products, which is like we release stuff as fast as it's ready so that we can get it to the hands of users and learn, you know, what's it useful for and how should we continue to improve it.

Yeah, I mean, just sort of taking a sort of side off the pure product for a second, you know, you've talked a couple of times now about users and customers and feedback. So like, who at the moment would you say are like, in what sort of spaces and, you know, especially commercially, obviously don't need names exactly, but commercially, who is kind of using this kind of tool? Yeah, just me, actually. I think I'm the only user we have. I just sort of spend all day on different machines trying to pretend it's, you know, I'm kidding. I think that what's amazing about Runway is...

We have a lot of users pretty much from every domain, every industry, every vertical that's really led us to double down on our philosophy of let's release this so we can discover these cases we wouldn't have even thought about because we only have experience in this industry or that industry or we're only talking to that customer this week.

And so we really see, like, Runway's tools are used by everyone from, like, Fortune 500, Global 2000-type companies, to freelancers, to marketers, you know, film studios telling new types of stories, streamlining their workflows. But even beyond the sort of, like, traditional, okay, longer-form video content, we have folks using it for previs and storyboarding to just explore all sorts of different directions much, much faster than you would be able to with traditional tools.

We have editors who generate videos in Runway that they then composite into existing footage so they can do like that last mile of tricky VFX that really puts the sparkle and polish on something that otherwise would have taken them a long time. We even have artists like Madonna, A$AP Rocky creating music videos or visuals for their show with Runway. And so it's really sort of expanded everywhere. I would say there's no like these people use Runway. It's like everybody uses Runway.

Yeah, okay. That's interesting. I like the examples, especially the music examples where you realize actually a lot of what is on, I guess, on the screens behind them when they perform are kind of these like looping

visuals, which probably used to take a long time to kind of figure out. And now I can imagine, like, a lot more kind of creativity, effectively, where you can just kind of see a whole bunch of, like, ideas and actually almost finished product, and kind of pick one. Is that sort of a good example? Yeah, that's a way in which we see people use Runway. Yeah. And, like, away from the music example, I mean, I think you just mentioned things like storyboarding.

Obviously, that traditionally has been something that people have really kind of, like, had careers around. Do you see there kind of being a movement where people that have already been in that industry are actually, like, transferring over to kind of becoming sort of masters of Runway? Maybe not, like, today, today, but is that sort of a path? Yeah. Yeah. I mean, I think what we see is there are a lot of folks in the traditional entertainment and VFX world who were

some of the earliest and most excited adopters of Runway, because for them, it sped up the stuff that maybe felt slower about their workflows. Or they would be working with someone who'd give them a little feedback on something they have to change, and it would take a long time to make that change, which sort of broke the flow. And so I do think that we see that.

And what we're excited by is sort of the ability to like speed up the fun parts, which is like the ideas, the stories you want to tell, you know, getting into the details, creating your own vision and less worried about, okay, now I've got to like manually, you know, create this effect somehow using a particle editor or whatever that might be. Yeah, that makes sense. So back to the product kind of,

First, let's stick on the features for a second. Then we might just go a little bit more into some of this technical side. So with the product feature, I'm just curious, from a roadmap perspective, how do you guys even figure out where you... What I'm thinking is that to develop a feature for something like Runway must just take a long time. And so it can't be this maybe super fast iteration. It has to be maybe more considered, but...

Yeah, you tell me, how do you guys figure out? Yeah, I hear you say that. And in my head, I'm like, man, I can't imagine my life would be very different if it was slow. It is very fast, actually. It's fast and it's very exciting as we grow and that we have a bunch of fast things that are then stacked on top of each other. So it feels like there's always something big going out that week, which is great. I think what we found over time is that like that vision that I mentioned, having this sort of like we're building tools for humans so that anything you can imagine you can create.

And the end goal, right, is like, you know, top level production level quality for every possible thing you'd want to create.

has been really helpful in actually allowing us to be a little more flexible with our short-term roadmap, which I think is necessary given the types of stuff that we work on and the ways in which things maybe sometimes speed up or maybe sometimes don't. And so we are able to be a lot more flexible in the short term in terms of like, okay, what's the next thing that seems most valuable based on what we just last released that we should be getting in the hands of users next?

And sometimes that changes because, you know, it allows you to learn a lot from like, oh, we released this, you know, for example, like motion brush, like suddenly it was a huge hit. People really like that control and then they wanted more of it. And so, okay, well now we've got to develop like more in that direction versus just assuming like, you know, you've got that one thing and now let's get back to the roadmap that we planned at the beginning of the year, nine months ago. So I think that we approach things with a much more flexible opportunity focus that lets us move so quickly as a result.

Yeah, I think my question had come from, I guess, a place where at the end of the day, I don't work on Gen AI at all. I'm just a consumer. So I think to me, it feels like, how can I move fast? But I think that's great to hear. That's kind of sounds like the only way it can work. So that's kind of fun and fascinating.

This episode of Software Engineering Daily is brought to you by Leanware. Struggling with development teams that say yes to everything but deliver on nothing? Leanware offers a refreshing approach. They're a Colombia-based team delivering top-tier software development with full transparency and world-class engineering standards. They've honed their craft over nearly five years, sticking to technologies where they have senior expertise. This means no compromises on quality ever.

Their C-level executives are always accessible, ensuring seamless communication and a genuine partnership. Plus, being in a similar time zone to the U.S. makes collaboration effortless. Don't settle for less. Partner with Leanware for software development done reliably. Visit leanware.co or see the show notes to get started. That's leanware.co. Leanware, redefining software development with exceptional quality and realistic expectations.

Let's go a bit more into just the technical side. I appreciate you're on the product side, but can you kind of explain anything towards, say, just the technical architecture behind the models or just the platform in general?

Yeah, I mean, I think that our approach in general, you know, comes from some of that product and research perspective on, like, being user-first, where we want to build very robust, very stable, and very usable, you know, products. And so as a result, our approach is to make sure that, you know, the stuff that we're releasing fits into all those categories: that, like, you know, we're not going to have some spike on a release and then go down, and we're not going to have an issue where people can't understand how to use the products.

And so that's sort of, I think, our technical approach: to make it very accessible. I guess kind of hand-in-hand with technical and product is data privacy and security. I'm from the security side, which often overlaps with privacy as well. But, you know, this has obviously been quite a hot topic in the maybe more LLM space, but the Gen AI space in general.

How are you handling that? I mean, in terms of the inputs from users, like maybe could you just kind of run through like what kind of inputs can a user even give? And then like, how is that sort of then considered from a privacy perspective?

Yeah. And almost to reverse that order, you know, I think we were very early at Runway in sort of scaling up the maturity of our security and data infrastructure teams and tooling, with sort of the knowledge that, if things continued to progress at the pace we expected, we would want that already in place. And so, in terms of what users can provide, it changes, and it often grows as these models become more powerful. So initially, you know, with our first text-to-video model, it was just text.

Like, you could just provide some text. And then, you know, after that, you could provide text and an image, either separately or together. And then after that, text and image and sort of different directions, other style modifications on top of that. And so I think that as the models improved, there are more things that you can provide. And as a result, we wanted to make sure we were well ahead of the curve in terms of standard best practices for security and privacy, and even, you know, some additional systems that we added that we felt like we wanted to have. Yeah. I mean, so, like, I don't know, if I was to,

When I use Runway, can I upload photos, videos? Is that kind of an input I can give? Yeah, exactly. We have a video-to-video model that you can use as well. You can give us photos, videos, text. Those are sort of the main categories. We have audio models where you can give us either transcripts or audio to sync to as well.

I guess like, yeah, so if I was to like upload, call it a video, like do I have controls like after that to kind of, I guess, remove that video off the platform? Like not maybe as part of content I've created, but like do I have sort of the ability to then remove that? Exactly. We've got all of the sort of like enterprise grade deletion, data protection, security things you'd expect as a user, you know, if you wanted to. Although I personally hope you wouldn't, especially now that we have this conversation. If you wanted to go delete your Runway account, you know, you could do that as well.

Nice. It's just one of these considerations now that almost just comes hand in hand if anyone is going in the direction of producing a commercially available LLM or anything in the Gen AI space. It is something that just comes with the territory now, and a very difficult thing to still navigate around, but it's great to hear you guys sound like you had that baked in from the start, so it makes a ton of sense.

If we also look at like how the content, I guess, continues. So I think one thing I'm quite intrigued about, and like both from the technical side and the product side is if I have already created something and then I'm wanting to go back like a week later and like I want now five more of these things in exactly the same style, but just with some tweaks. How challenging is that?

Yeah, it used to be, you know, I'd say a year ago, more challenging. And then we heard from users, you know, who brought up the same thing, like yourself. And so we made it very easy. Now, I think it's literally one button: you can go back to a video that you've created with Runway and jump right back into reusing all the settings, all the inputs, you know, to create more. And then you can tweak those settings, you can tweak those inputs, you can extend it in different directions through time, to, you know, do what you wanted to, but maybe didn't finish doing a week ago.

Can you sort of describe the process behind, sort of, you know, if I type words, how do those words, I guess, match up with something visual, just in layman's terms for me? Yeah. Yeah. Sort of the overall concept. You can almost think of it in terms of, like, a new type of camera that you control differently, right? Than a traditional camera, where you've only got, you know, what you can physically point to in the real world that you would see.

And here it's much more like these models have like an understanding of the world. Like they've got the world inside of them and you're using maybe your image or maybe your video, depending on the model you're using to direct that

And then it's effectively returning to you what you've directed. And so the difference is just instead of, you know, the camera only capturing the part of the world that you can show it at the moment, it actually has knowledge about the world. And you're just the director using text or motion brush or different controls that we have in the product to pull that content out and create it. I think one example would be interesting on the front page of Runway. There's something that mentions like an ox. And actually the image is a...

I'm from Scotland, and it's what we call a Highland cow. And so I'm really curious, like, how that sort of matches up. If I type Highland cow, would I get a Highland cow? Or, like, I'm just curious, not exactly where it comes from, but, like, where the content comes from. Because I know that can be a sort of a topic that's difficult to discuss, but yeah, if you kind of see where I'm going with how words match up to images, or, say, you know, the content. Yeah. So, yeah. Yeah. You're right that there are still cases, you know, these are still extremely early

models. We expect many generations of improvement before you can give it the distinction between an ox and a Highland cow and get that every single time. Part of that comes from the models still developing and building their understanding of the world. Sometimes, working at Runway especially, you're like, oh, this thing, it's not working, why isn't it working yet? It's helpful to just take a step back and remember:

two years ago, if we'd shown this to anyone, you would have run down the street screaming, oh my gosh, this is amazing, you guys have to see this. So it's fun and exciting to see how people raise their standards and expectations, because they should. That's ultimately the vision we're driving towards.

But I think it's helpful to remember that we're very early, and this is the worst it's ever going to be. Oh, for sure. Yeah. I wasn't in any way criticizing the ox versus Highland cow. It is amazing. When I saw the example, I had exactly the reaction you just talked about; I kind of ran down the street, very excited.

And I was just curious about the, I don't even want to call it a library, because library is such a terrible word, but how that matches up. How does this system know, if I were to type "mug," where that comes from? Yeah, that just comes from the understanding of the world the model has built. And I would say, especially for

unique things, we work with a lot of enterprise customers who have many concepts they've literally just invented, or even individual creators who've just created something that isn't going to be in any understanding of the world. That's where some of the customization tools and pipelines we've built

at Runway, and that we work on with enterprise customers especially, become helpful. Let's say you're doing a sci-fi piece and you've got this sort of cow-like thing, but you just created it, so there's no way it would be in the model's world. Being able to customize the models further based on your creative vision is a big, important part of what we focus on as well.

Got it. Yeah, that's super cool. That's very fun. I maybe should have touched on this with the data privacy and security piece: probably one other question people have in that area is the topic of deepfakes. What are you able to do to prevent that happening, to prevent generating content that could be deemed a deepfake?

Yeah, for sure. We have a whole bunch of measures in the product and a whole bunch of dedicated folks who work on this. We have new and improved visual and text moderation systems with automatic oversight, filtering what we deem to be inappropriate or harmful content. We have C2PA authentication, if you're familiar with that, which is sort of a provenance certificate

showing that the media was created with our Gen-3 models, in this case. And it's always been the case that, as model capabilities and the ability to generate high-fidelity content increase, we continue to invest

ahead of that curve on the alignment and safety side. Two years ago, when you were getting smaller, pixelated, jerkier footage, it wasn't as much of a concern. But knowing where things were going, which we had insight into given that we do the research in house, we've always been able to get ahead of it: before we release the next model, we're going to need this new level of systems in place. That's always been our approach, making sure those are in place before we release. Yeah, great to hear.
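For context on the C2PA mechanics Joel mentions: C2PA Content Credentials travel inside the media file itself; in JPEGs, the signed manifest is carried in APP11 (0xFFEB) marker segments as JUMBF boxes. The sketch below only scans a JPEG's header segments for an APP11 marker to show where the credential lives. It is not a real verifier: a proper check (for example, with the open-source c2patool) also parses the manifest and validates its cryptographic signatures.

```python
# Illustrative only: detect whether a JPEG byte stream contains an
# APP11 (0xFFEB) segment, the carrier for C2PA/JUMBF manifests.
# Presence of APP11 does not by itself prove a valid Content Credential.

def has_app11_segment(data: bytes) -> bool:
    """Scan JPEG marker segments for an APP11 (0xFFEB) segment."""
    if not data.startswith(b"\xff\xd8"):  # must begin with SOI marker
        return False
    i = 2
    while i + 4 <= len(data):
        if data[i] != 0xFF:
            break  # not a marker; header scan is over
        marker = data[i + 1]
        if marker == 0xEB:  # APP11: where C2PA/JUMBF payloads live
            return True
        if marker == 0xDA:  # SOS: start of scan, no more header segments
            break
        seg_len = int.from_bytes(data[i + 2:i + 4], "big")
        i += 2 + seg_len  # skip marker bytes plus segment payload
    return False
```

A real pipeline would hand the whole file to a full C2PA validator rather than stop at this byte-level check.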

So looking ahead, from what you can share, where is Runway going? What kind of things do you guys see on the horizon that would probably make it into Runway?

Yeah, we're really focused on building general world models, which are effectively systems that understand the whole visual world and its dynamics. The model we released about a month ago is a major step towards this goal, but it's still very early; we're still a couple of steps, maybe many steps, from that goal. This is the first and smallest of our upcoming models. It can still struggle with certain complexities, to your point: it can still confuse subspecies of, well, I don't know if they're subspecies, I shouldn't say that, but different types of cows.

So our approach has been to basically build up to that full general world understanding. And we've found, even with the Gen-3 models, that building that up teaches the models all sorts of other interesting properties. We've seen a lot of very fun physics and texture simulations that people have been doing with some of the models; the way it animates water is a lot of fun to play with. And so

those capabilities naturally come from our goal of building general world models. Yeah. Very exciting. A couple of questions I tend to ask at the end of episodes. One is: what's a typical day for you as the PM at Runway? What does a typical day look like?

Yeah, the sort of shabby answer to that is there's really no typical day, which is what makes the role so fun. The product team at Runway, but really a lot of the teams at Runway, get to be involved in so many cool, different areas: from working with creators, professional creators all the way down to hobbyist creators, to working with researchers, or doing the research on these models.

So for myself, the typical day, to the extent that it exists, is a mix of: talking with users about both their use cases and the user experience and interfaces; working with our researchers to evaluate experiments and bring that user feedback back to the research team; working with our engineers and designers on the product side to actually piece things into the product and make sure they're stable and ready for a big release; and reviewing our metrics and quantitative signals, making sure that

we're releasing things that are valuable to people, and working with our sales, finance, and marketing teams to make sure we're telling the stories and building the business that we want to be building.

Yeah, cool. Typically the answer I get is there is no typical day. Sorry. No, of course, that's just what's fun about technology. But at the same time, when I talk to CTOs, often it's hiring, hiring, hiring. So it's always fun to get a sense of what someone is actually doing day to day, and I think our listeners really appreciate hearing that for roles they're probably thinking about going into as well.

And the final question, on the same track: knowing what you know now, what advice would you give yourself starting out in this field? Yeah. I mean, starting out in this field at all, it would have been: go join Runway, whatever year this timeline starts in; they're up to some really cool things, and they're a great group of people. But realistically, when I first started at Runway, what would my advice have been at that point? I think coming in,

I was comfortable having this background on the creative side and also this background on the tech startup, SaaS tooling side. But I felt an innate sense of, okay, I have this experience building SaaS products: how the business models work, the best way to interact with users, the best way to plan for releases and roadmaps. And I think it took months early on to adjust.

But Runway is of this new generation, a different type of company, where having a research team in house, having a creative team in house, and being able to totally shatter expectations of what's possible a couple of times a year very much changes the traditional playbooks. So I learned to use those playbooks as certainly an input in deciding what's the best thing for us to be focused on, what's the best thing for myself to be doing on the product team at Runway,

but to be comfortable jumping into, well, actually, let's just try this thing, because first principles lead you to believe it might be an experiment that could pan out. I think we've done a good job building that culture across the company now, where people certainly bring in experience from their other roles, but we try a lot of very interesting things, and a lot of them work, which is really exciting. I think that's a really good point. Fear of failure today can

mean that people don't experiment. And, to quote, well, probably everybody knows who I'm quoting, stay curious. So I think that's a really good place to leave it: tools like Runway would never even be conceived if people weren't able to just experiment. That thing, that project, it might not go somewhere, but at the same time,

a project can be for yourself, or it can be for many others. Have that curiosity and don't fear failing, so to speak. It's not failure; it's just trying stuff out. Yeah, exactly. So Joel, it's been great to have you here today. I really appreciate the time and you sharing

about Runway, and I definitely won't call it an LLM ever again, so I apologize. That's good. Yeah, well, thank you for having me, and I'm going to take that feedback on types of cows directly back and make sure we're testing for that in the future. Yeah, I want to see a Highland cow on the website. All right, we'll see what I can do. Thank you so much, really appreciate the time, and hope we can catch up again in the future.