I was lucky enough to participate in this panel at IaCConf on AI and ML in the world of Infrastructure as Code. In the spirit of using AI tools to make my life easier, here's an AI-generated summary of the panel.
This panel discussion explores the impact of AI and ML on infrastructure as code (IaC). It features Robert Hafner, distinguished engineer at Comcast and author of "Terraform in Depth," and Alan Hilton, ecosystem engineer at Momento and AWS Hero. The panel weighs the pros and cons of relying on AI for foundational infrastructure layers, touching on the current state of AI adoption in cloud operations, where 17% of teams are using AI-driven capabilities and another 41% are actively exploring AI solutions.
A central theme is the importance of "human-in-the-loop" systems. Panelists argue against fully automated AI solutions in IaC, emphasizing the need for human oversight to verify AI-generated code and configurations, particularly in addressing the "long tail problem" where AI models struggle with less frequent scenarios.
The discussion covers whether MLOps or AIOps is the future, with agreement that the average engineer won't likely train models but will instead need to become a stronger analyst: reviewing AI-generated code and communicating effectively with AI tools. Both experts underscored that understanding code remains necessary, even with AI assistance, for effective pull request reviews and bug identification.
The panel also addresses whether AI will replace jobs, answering that AI is better viewed as a tool for smarter autocomplete and research. Alan gave the example of refactoring code with an AI agent that morning: the line count dropped from 850 to 180, but he still had to check the result for code smells.
The panelists debated the relative importance of Retrieval Augmented Generation (RAG) data versus the LLM itself, with Robert arguing they are equally important because the LLM cannot do anything useful without the retrieved context. High-quality RAG data is what ensures AI-generated output adheres to organizational standards.
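The RAG pattern the panelists describe can be sketched minimally: retrieve the organizational standards most relevant to a request, then prepend them to the prompt sent to the LLM. Everything here is illustrative, not a real retrieval stack: the keyword-overlap scoring stands in for an embedding search, and the function names and sample standards are invented for the example.

```python
# Minimal RAG sketch: augment a user's request with the organization's
# own standards before the prompt ever reaches an LLM. The scoring is a
# toy word-overlap measure standing in for real embedding similarity.

def score(query: str, doc: str) -> float:
    """Toy relevance score: fraction of query words that appear in the doc."""
    q = set(query.lower().split())
    d = set(doc.lower().split())
    return len(q & d) / len(q) if q else 0.0


def retrieve(query: str, corpus: list[str], k: int = 2) -> list[str]:
    """Return the k most relevant documents for the query."""
    return sorted(corpus, key=lambda doc: score(query, doc), reverse=True)[:k]


def build_prompt(query: str, corpus: list[str]) -> str:
    """Prepend retrieved organizational standards to the user's request."""
    context = "\n".join(retrieve(query, corpus))
    return f"Context:\n{context}\n\nTask: {query}"


# Hypothetical organizational standards the retriever draws from.
org_standards = [
    "All S3 buckets must enable server-side encryption and block public access.",
    "Terraform modules must pin provider versions.",
    "Use tags for team, environment, and cost-center on every resource.",
]

prompt = build_prompt("Write Terraform for an S3 bucket", org_standards)
```

This is why the panel treats RAG data quality as co-equal with the model: whatever lands in `org_standards` is what the LLM is steered by, regardless of how capable the model itself is.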
When asked how engineers will adapt to increased AI usage, the panel stressed adaptability in the face of industry change. Robert made the point that the biggest skill is learning how to learn, constantly diving deep into new concepts.
Regarding the question of open-source versus proprietary AI models, Robert predicted that open-source models are the future and will surpass the proprietary ones. He went as far as saying OpenAI will go bankrupt within 5 years, with open-source options like Llama and Qwen taking over. Alan agreed, but emphasized that specific tooling will need to be developed around these open-source models to create personalized AI assistant experiences.
In conclusion, neither Robert nor Alan believes AI capability has peaked yet.