
Microsoft to Set Pace of AI Innovation for DevOps in 2024

Microsoft's investments in generative artificial intelligence (AI) are set to transform DevOps workflows in the coming year in ways that will dramatically accelerate the pace at which applications can be built and deployed on the Azure cloud.

At the core of those efforts are two AI tools that are currently available in preview. Microsoft Copilot for Azure leverages large language models (LLMs) to enable IT teams to use natural language to create, configure, discover and troubleshoot Azure services. It also enables IT teams to create complex commands, ask questions and optimize costs.

The second major AI initiative for DevOps comes from Microsoft's GitHub arm, which is previewing a Copilot Workspace tool that leverages generative AI to automatically propose a plan for building an application using natural language descriptions typed into the GitHub Issues project management software. It then generates, via a single click, editable documents that developers can visually inspect, edit and validate before code is generated. Any errors in the code discovered by application developers or the Copilot Workspace platform itself can also be automatically fixed. In addition, summaries of the project can automatically be created and shared across an application development team.

It's not clear how much of an AI advantage Microsoft will be able to maintain as rivals make similar investments, but it's now all but certain DevOps workflows will never be the same. Many of the manual tasks that create bottlenecks in DevOps workflows today are about to be eliminated. Those advances, however, will not eliminate the need for software engineers so much as they will enable DevOps teams to build and deploy applications at a pace that a year ago would have seemed unimaginable. The challenge now is determining how DevOps roles will evolve as more tasks become automated.

Right now, AI already enables developers to write code faster than ever. Unfortunately, a lot of that code may be suspect. For every developer that writes better code with the aid of a generative AI platform, there is another that might inadvertently be injecting vulnerabilities into their code. A generative AI platform such as ChatGPT, based on the LLMs created by OpenAI, is trained using code of varying quality collected from across the Web. Developers with little to no cybersecurity expertise are prone to cut and paste code generated for them without much appreciation for its quality. Naturally, it will fall to DevOps teams to review any and all code before it is uploaded into a build, at least until LLMs trained on more closely vetted code become more widely available.
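To make that review burden concrete, here is a minimal, hypothetical sketch of one vulnerability class reviewers frequently catch in pasted snippets: a query built by string interpolation, shown next to the parameterized version a code review would demand. The function names and table schema are illustrative assumptions, not taken from any specific tool or codebase.

```python
import sqlite3

def find_user_unsafe(conn, username):
    # Pattern sometimes seen in generated snippets: string interpolation
    # builds the SQL, so crafted input can rewrite the query (SQL injection).
    query = f"SELECT id FROM users WHERE name = '{username}'"
    return conn.execute(query).fetchall()

def find_user_safe(conn, username):
    # Parameterized query: the driver treats the input strictly as data.
    return conn.execute(
        "SELECT id FROM users WHERE name = ?", (username,)
    ).fetchall()

conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE users (id INTEGER, name TEXT)")
conn.executemany("INSERT INTO users VALUES (?, ?)", [(1, "alice"), (2, "bob")])

malicious = "x' OR '1'='1"
print(len(find_user_unsafe(conn, malicious)))  # 2 -- every row leaks
print(len(find_user_safe(conn, malicious)))    # 0 -- input matched as data only
```

Both functions look equally plausible at a glance, which is exactly why automated generation without review is risky: the difference only shows up under hostile input.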

It remains to be seen, however, just how AI might be used to automate code reviews. But as the overall size of the codebase that needs to be managed continues to increase, DevOps teams will need to rely more on predictive, causal and generative AI tools to cope with it all.

Ultimately, LLMs embedded within DevOps platforms will democratize software engineering. Instead of having to write and test a script, for example, an IT administrator will simply request one via a natural language interface. It's now only a matter of time before more organizations are able to embrace the best DevOps workflows to build and deploy applications, simply because the level of expertise required has been greatly reduced. Rather than being an approach to application development that only a limited number of organizations can financially afford to embrace, DevOps becomes an IT discipline that even the smallest of organizations, with help from AI, can employ.
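As an illustration of the kind of routine script an administrator might request in plain language ("find the log files that haven't been touched in 30 days") rather than write by hand, here is a short sketch. Everything in it, including the function name and the 30-day cutoff, is an assumption made for the example, not a feature of any Microsoft product.

```python
import time
from pathlib import Path

def stale_logs(directory, max_age_days=30):
    """Return .log files in directory not modified within max_age_days."""
    cutoff = time.time() - max_age_days * 86400
    return [p for p in Path(directory).glob("*.log")
            if p.stat().st_mtime < cutoff]
```

The point is less the script itself than the shift in workflow: the human states intent, the tooling produces a candidate, and validation becomes the administrator's main job.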

In the meantime, IT leaders should prepare their teams today for a new wave of AI tools and platforms in 2024 that will make most existing DevOps workflows seem nothing less than antiquated by comparison.
