• Day: Day 2


Generative AI has become one of the most exciting and rapidly growing areas of machine learning. As demand for more sophisticated models grows, robust and scalable platforms for training and deploying them become essential. In this session, we will explore strategies for scaling ML platforms to support foundation models, including distributed training techniques such as model parallelism, cloud-based solutions, and model compression. I will also discuss the challenges and limitations of these approaches, as well as best practices for applying them in real-world scenarios.
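To make one of the named techniques concrete: model parallelism partitions a model's layers across devices, with each device computing its shard and handing the activations to the next. The framework-free sketch below is purely illustrative (the layer functions and two-shard split are hypothetical, not from any specific platform):

```python
# Toy illustration of model parallelism: a "model" is a list of layer
# functions; each shard would live on its own device in a real system,
# with activations handed off at the shard boundary.

def make_layer(weight):
    # Hypothetical layer: scale each activation by a fixed weight.
    return lambda xs: [v * weight for v in xs]

layers = [make_layer(w) for w in (2, 3, 5)]

# Partition the layers across two (imaginary) devices.
shard_a, shard_b = layers[:2], layers[2:]

def run_shard(shard, activations):
    for layer in shard:
        activations = layer(activations)
    return activations

x = [1.0, 2.0]
hidden = run_shard(shard_a, x)       # runs on "device 0"
output = run_shard(shard_b, hidden)  # "device 1" receives the activations
print(output)  # -> [30.0, 60.0]
```

In practice the hand-off between shards is a device-to-device transfer, which is why pipeline scheduling and communication overlap matter at scale.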

Associated Speakers:

Animesh Singh

Executive Director, AI and Machine Learning Platform


Associated Talks:

10:15AM - Day 2

Presentation: Scaling Machine Learning Platforms in the age of Generative Models
