AWS Inferentia2-based Amazon EC2 Inf2 instances can help you deploy your 100B+ parameter generative AI models at scale. Inf2 instances deliver up to 40% better price performance than comparable Amazon EC2 instances. Tune in to learn more about this new launch that helps you increase performance, reduce costs, and improve energy efficiency when deploying your ML applications.

Inf2 product detail page: https://go.aws/44oez5T
Neuron documentation: https://bit.ly/44oLmrz
AWS Inferentia: https://go.aws/3NAyhFr
AWS Trainium: https://go.aws/3nkivnH