OpenAI’s GPT models are a powerful tool that can be used in many different applications. But one of the areas where they make the most sense to me is the DevOps platform. For those who don’t know what DevOps is: it can briefly be defined as a combination of software development and IT operations that aims to automate and streamline the software delivery process.
In this blog post, we will explore how to leverage OpenAI GPT models within DevOps to improve the software development process.
What are OpenAI GPT models?
OpenAI GPT (Generative Pre-trained Transformer) models are machine learning models that use deep learning techniques to generate natural language. These models are trained on large datasets of text and can generate human-like responses to prompts.
The GPT models are pre-trained on large amounts of text and can be fine-tuned on specific tasks. This allows the models to generate natural language responses to specific prompts, such as questions or requests for information.
How can OpenAI GPT models be used in DevOps?
One way to use OpenAI GPT models in DevOps is to improve the software development process. Here are a few ways that GPT models can be applied within DevOps:
- Code commenting and documentation quality analysis: Use GPT-3 to analyze the quality of comments in the codebase and suggest improvements to make them clearer, more concise, and more informative. Also use GPT-3 to analyze the quality of the documentation in the wiki. This process helps ensure that the documentation accurately reflects the functionality of the code and the user stories.
- Test automation: Use GPT-3 to automatically generate test automation scripts based on natural language descriptions of the desired tests. This process helps streamline the testing process by automating the generation of test scripts and reducing the need for manual testing.
- Code style enforcement: Use GPT-3 to enforce a consistent code style across the codebase by suggesting corrections or reformatting the code. This process helps ensure that the codebase follows consistent formatting and style guidelines, making it easier to read and maintain.
- Project estimation: Use GPT-3 to estimate the time and resources required to complete a project based on natural language descriptions of the requirements and constraints. This process helps ensure accurate project planning and resource allocation.
- Code standardization: Use GPT-3 to standardize the codebase by suggesting common programming practices, coding standards, and design patterns. This process helps ensure that the codebase follows consistent coding practices and design patterns, making it easier to read and maintain.
- Improving commit messages: Use GPT-3 to suggest better commit messages based on the changes made to the code. This process helps ensure that commit messages accurately reflect the changes made to the codebase.
- Enhancing natural language search: Use GPT-3 to analyze code and generate descriptions and tags for functions and classes to improve your DevOps platform’s natural language search capabilities. This process helps improve the discoverability of code and makes it easier to find specific code snippets.
- Code generation: Use GPT-3 to generate code snippets based on natural language prompts from user stories and acceptance criteria. This process helps automate the code writing process, reducing the need for manual coding.
- Automated testing: Use GPT-3 to generate test cases based on the code and user stories, which can then be automatically executed to test your code. This process can dramatically speed up test case automation.
- Project management: Use GPT-3 to generate reports and dashboards based on project data. This process helps automate project reporting, ensuring accurate and timely reporting.
- Natural language interface: Use GPT-3 to create a natural language interface for your DevOps platform. This process helps improve the platform’s usability by allowing users to interact with it using natural language commands. For example, ask DevOps to create a user story to resolve the feature gap from bug 123 or ask it to check which pipelines are currently running in the organization.
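To make one of the ideas above concrete, here is a minimal sketch of the commit-message suggestion, assuming the legacy `openai` Python SDK (pre-1.0); the model name, prompt wording, and parameter values are illustrative assumptions, not a prescription:

```python
def build_commit_message_prompt(diff: str) -> str:
    """Wrap a git diff in an instruction asking GPT-3 for a commit message."""
    return (
        "Suggest a concise, imperative-mood git commit message "
        "(subject line under 72 characters) for the following diff:\n\n"
        + diff
    )

def suggest_commit_message(diff: str, api_key: str) -> str:
    """Ask GPT-3 for a commit message; assumes the legacy `openai` SDK (<1.0)."""
    import openai  # pip install "openai<1.0"
    openai.api_key = api_key
    response = openai.Completion.create(
        model="text-davinci-003",  # illustrative model choice
        prompt=build_commit_message_prompt(diff),
        max_tokens=60,
        temperature=0.2,  # low temperature keeps suggestions consistent
    )
    return response.choices[0].text.strip()
```

The same two-step shape – build a prompt from DevOps data, send it to a completion endpoint – carries over to most of the other items in the list.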
Any one of these warrants a blog post of its own, but this post is more about the art of the possible than the in-depth possibilities.
Implementation takes forever?!
You might think it takes a huge effort to accomplish any of these features within Azure DevOps. This is, of course, not true. You can easily start with a proof of concept by copying DevOps text into any GPT-based chat application, such as ChatGPT, or by using the GPT playgrounds available on, for example, the OpenAI website.
A few examples using a manual approach
GPT-3 as the Azure DevOps API, where the prompt is the requirement and the output is an API call you can copy and paste into Postman:
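A prompt in this style might look like the following sketch (the organization and project names are made up for the example; the endpoint mentioned in the comment is the real Azure DevOps Work Item Query API):

```python
# Illustrative prompt for the "GPT-3 as the Azure DevOps API" idea.
prompt = (
    "Translate the requirement into an Azure DevOps REST API call.\n"
    "Requirement: list all active bugs in the 'Fabrikam' project "
    "of the 'contoso' organization.\n"
    "API call:"
)
# A plausible completion, using the Work Item Query (WIQL) endpoint:
#   POST https://dev.azure.com/contoso/Fabrikam/_apis/wit/wiql?api-version=7.0
#   {"query": "SELECT [System.Id] FROM WorkItems
#              WHERE [System.WorkItemType] = 'Bug'
#              AND [System.State] = 'Active'"}
```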
GPT-3 as a business consultant writing user stories, where the prompt is the requirement description from a customer workshop:
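A prompt for this variant could be sketched as follows (the workshop note is invented purely for illustration):

```python
# Illustrative prompt for the "GPT-3 as business consultant" idea.
workshop_note = (
    "Customers complain they cannot export their monthly invoices to PDF "
    "and have to copy the numbers by hand."
)
prompt = (
    "Act as a business consultant. Rewrite the following workshop note as a "
    "user story with acceptance criteria, in the form "
    "'As a <role>, I want <goal>, so that <benefit>':\n\n"
    + workshop_note
)
```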
This is not the best long-term solution, but it will quickly give you an idea of the feasibility of using this approach.
Integrating GPT-3 programmatically
More automated approaches would integrate GPT-3 directly into Azure DevOps as Azure DevOps extensions, Logic Apps, Power Automate flows, or Azure Functions.
I prefer to consume the GPT model through Azure OpenAI and reuse the API/custom connector across multiple DevOps organizations, or any other application requiring a similar service. This also lets me manage everything on one platform, but it works just as well using the OpenAI APIs directly. There is also a cost perspective: the prices differ between the two and keep changing.
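The Azure OpenAI route can be sketched as a small wrapper around the REST completions endpoint; the deployment name and environment variable names below are placeholders for your own Azure OpenAI resource, and the API version reflects the GPT-3 era:

```python
import json
import os
import urllib.request

def completion_url(endpoint: str, deployment: str,
                   api_version: str = "2022-12-01") -> str:
    """Build the REST URL for a completions call on an Azure OpenAI deployment."""
    return (f"{endpoint.rstrip('/')}/openai/deployments/"
            f"{deployment}/completions?api-version={api_version}")

def azure_openai_complete(prompt: str,
                          deployment: str = "my-davinci-deployment") -> str:
    """POST a completion request; endpoint and key come from the environment.

    The deployment name is a placeholder for your own deployment, so this
    call only works against a real Azure OpenAI resource.
    """
    url = completion_url(os.environ["AZURE_OPENAI_ENDPOINT"], deployment)
    body = json.dumps({"prompt": prompt, "max_tokens": 256}).encode()
    req = urllib.request.Request(
        url,
        data=body,
        headers={"Content-Type": "application/json",
                 "api-key": os.environ["AZURE_OPENAI_KEY"]},
    )
    with urllib.request.urlopen(req) as resp:
        return json.load(resp)["choices"][0]["text"].strip()
```

Wrapping this in an Azure Function or a custom connector is what makes it reusable across multiple DevOps organizations.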
OpenAI GPT models are a powerful tool that can be leveraged within DevOps to improve the software development process. Whether it’s automating customer support, generating code, automating testing, or writing documentation, GPT models can save time and improve the overall quality of the software. I don’t foresee a future without GPT-3 or similar models augmenting the work we do, but I equally have no idea how much our DevOps engagement will change. Only time will tell, but I am sure we have just seen the beginning of it all.
If you have other cool ideas on using GPT-3 within your DevOps process, please let me know!