Generative artificial intelligence (AI) platforms will soon become widely integrated across DevOps workflows in ways that will enable developers to build more secure code faster while also making it simpler for software engineers to maintain large codebases.
The latest example of generative AI being applied to DevOps comes from Google, which is previewing Duet AI for Google Cloud. In addition to surfacing code recommendations in real time, this latest addition to the Google Cloud platform promises to enable DevOps teams to use custom code to train the large language models (LLMs) that Google provides.
Google is allying with GitLab and Tabnine to integrate LLMs with a continuous integration/continuous delivery (CI/CD) platform and testing tools, respectively.
Of course, Google is playing catch-up with Microsoft and OpenAI, the research organization that created ChatGPT. Microsoft is integrating that generative AI technology into every application it provides, most notably the GitHub Copilot code-writing tool offered by its subsidiary.
Other providers of DevOps platforms and tools that are providing integrations with ChatGPT include New Relic, Honeycomb, Endor Labs and ClickUp. More ambitiously, Atlassian has been using the data it collects to train LLMs created using the OpenAI platform to, for example, identify classes of incidents that have previously been resolved and then generate recommendations for fixing similar issues.
Kubiya has even gone so far as to employ generative AI to create entire DevOps workflows based on data it pulls from platforms such as Notion and Confluence, and other sources of technical documentation.
It’s still early days as far as usage of generative AI within DevOps workflows is concerned, but one thing is clear: The amount of high-quality code being created by developers is about to increase exponentially. Hopefully, AI technologies will also one day help software engineers find ways to manage that volume of code moving across their DevOps pipelines. On the plus side, at least, the quality of that code should dramatically improve thanks to recommendations from LLMs that, for example, will identify vulnerabilities long before an application is deployed in a production environment.
Generative AI platforms are fundamentally changing the way humans interact with machines. Instead of requiring a developer to create a level of abstraction to communicate with a machine, it’s now possible for machines to understand the language humans use to communicate with each other. Via a natural language interface, developers will soon be asking generative AI platforms not only to surface suggestions but also to debug applications and, eventually, write the code used to create an application.
At this point, like it or not, the generative AI genie is out of the proverbial bottle. Just about every job function imaginable will be impacted to varying degrees. In the case of DevOps teams, the ultimate impact should involve less drudgery as many of the manual tasks that conspire to make managing DevOps workflows tedious are eliminated.
There are other forms of AI being applied to DevOps as well. PagerDuty, for example, is applying machine learning algorithms to the data it collects to automate incident response. Regardless of the type of AI employed, the level of automation being brought to bear on DevOps workflows is about to substantially increase. The challenge now is determining how to refocus the efforts of the DevOps teams still needed to manage those workflows on higher-value tasks that are not only more complex but that, until now, they lacked the time to tackle. In fact, generative AI may ultimately help bring back some joy to DevOps teams that, if anything, have always been committed to ruthlessly automating as many IT processes as possible.