Yet Another Infra Podcast

Yet Another Infra Podcast - Ep14 - Vitaly Gordon, Diego Oppenheimer, Alex Clemmer, and Yevgeny Pats explore the future of AI infrastructure, the potential of large language models as a mass-market offering, open source in AI and more.

May 17, 2023

Summary

In this episode, hosts Vitaly Gordon, Diego Oppenheimer, Alex Clemmer, and Yevgeny Pats delve into the future of AI infrastructure, emphasizing the specific needs of MLOps as it relates to large language models (LLMs) versus traditional machine learning models. They discuss the advantages of open source in the AI field, focusing on its role in developing efficient ETL solutions and how it promotes innovation without vendor lock-in. The conversation also explores the implications of foundational models for scaling machine learning operations, particularly the significance of having a modular software architecture that supports extensibility. Furthermore, the hosts examine the evolving competitive landscape, where smaller teams leveraging open source tools can challenge well-established companies. Lastly, they highlight the importance of operational efficiency in delivering successful AI applications to the market.

Key Takeaways

  • MLOps processes need to be adapted significantly to cater to large language models, indicating a departure from traditional machine learning practices.
  • Open source tools are becoming essential in AI, fostering innovation while offering customizable ETL solutions that better serve diverse client needs.
  • The rise of foundational models is shifting the industry's focus from hardware capabilities to the efficiency of software distribution and accessibility.
  • Concerns about the total cost of ownership of open-source software highlight the balance between perceived savings and potential hidden costs.
  • The competitive landscape is changing, allowing smaller teams leveraging open-source technologies to innovate and disrupt traditional market players.
  • A software architecture that supports plugin development is vital for long-term business viability and adaptability.
  • Rapid iteration and effective deployment of AI solutions are essential for success, highlighting the need for operational excellence.

Notable Quotes

"Diego Oppenheimer states, 'If you think about what MLOps is and what really it encompasses, it's really like the automation to getting to faster, better, higher quality kind of machine learning throughput pipelines.' This emphasizes the importance of automation in achieving quality in ML operations."

"Oppenheimer also mentions, 'I think we’re still in the phase where a thousand flowers are blooming and some of those are not going to work out, but I don’t think we actually know what this space is going to look like in 10 years.' This reflects uncertainty and potential in the evolving AI landscape."

"Seeing open source is truly having its amazing moment in AI, with tools and communities flourishing to tackle various applications."

"The total cost of ownership of open source software might turn out to be higher than companies anticipate due to the lack of support and the need for maintenance."

"This idea of building a strong foundational layer and allowing for extensibility through plugins is not just a technical challenge but a business strategy for long-term viability."

"The letter from Google stated that they don't have AI boats, nor does OpenAI, which highlights the questionable landscape of AI development and competition in the open-source realm."

"It's interesting how the narrative shifts when you consider that the technology is available for everyone, yet the models produced are dependent on the community's contribution and innovation."

"What I think the author means to say is that Google is capturing foundational work and is capturing essentially none of the value and is not on track to capture any of the value if things continue going this way."

"The better the distribution mechanism for software, the easier it is to build the stuff. So the faster that you can ship that stuff to people, the higher fidelity, the more that you can learn from it, the better off, like all things equal that you're going to be."

"And I think that it is almost certainly the case that's going to be true for pretty much any revolution in software, but especially like AI where the factors that you use to get into production quicker, the accessibility of that stuff, the more people that can actually use the technology, the more places that it's practical means that the more use cases are going to be, it's going to wedge itself into more use cases."