GPT-4 in my hands…I don’t think I will sleep much tonight!
Introduction
As an AI enthusiast, I was beyond excited when OpenAI announced the release of GPT-4, the newest iteration of their groundbreaking language model. Given OpenAI's proven track record of pushing the boundaries of natural language processing, I knew I had to give GPT-4 a spin and share my experiences with you all. So, without further ado, let's embark on a journey to explore the capabilities of GPT-4 and discover how this powerful tool has the potential to revolutionize the way we interact with technology.
Impressive Conversational Skills
One of GPT-4’s primary selling points is its conversational prowess, so I put it to the test by engaging in a simple Q&A session. To my amazement, the model provided detailed, informative, and coherent answers to my questions. I tested its knowledge on various topics, from historical events and scientific concepts to pop culture and current events. GPT-4 handled them all with ease, delivering responses that were human-like in their depth and accuracy.
Creative Writing and Storytelling
I wanted to see if GPT-4 could unleash its creative potential, so I asked it to generate a short story based on a simple prompt. The result was astonishing – the model crafted a captivating narrative with well-developed characters, a coherent plot, and a satisfying conclusion. The storytelling abilities of GPT-4 are leaps and bounds ahead of its predecessors, making it a valuable tool for writers in need of inspiration or assistance in crafting their stories.
Coding Assistance
As a developer, I was particularly curious about GPT-4’s ability to generate and understand code. I asked the model to help me with some Python and JavaScript snippets, and it provided accurate, functional code in both cases. This feature has the potential to revolutionize the way we approach programming tasks, making it easier for developers to solve complex problems and streamline their workflows.
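If you want to try this kind of coding assistance yourself, a minimal sketch of the request could look like the following. The prompt and the plain HTTP call are only illustrative; any OpenAI client library works the same way against the chat completions endpoint.

```python
import os
import requests

# Minimal sketch: ask GPT-4 for a small Python snippet via the chat completions endpoint.
# The prompt below is illustrative, not the exact one used in my tests.
response = requests.post(
    "https://api.openai.com/v1/chat/completions",
    headers={"Authorization": f"Bearer {os.environ['OPENAI_API_KEY']}"},
    json={
        "model": "gpt-4",
        "messages": [
            {"role": "system", "content": "You are a helpful coding assistant."},
            {"role": "user", "content": "Write a Python function that flattens a nested list."},
        ],
    },
    timeout=60,
)
print(response.json()["choices"][0]["message"]["content"])
```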
Limitations and Ethical Considerations
Despite its incredible abilities, GPT-4 is not without limitations. It occasionally provided inaccurate information or misunderstood the context of my questions. However, these instances were relatively rare compared to the overall quality of its responses. Additionally, it’s essential to consider the ethical implications of using AI systems like GPT-4, including issues like potential biases in training data, malicious use, and user privacy.
Conclusion
My experience with GPT-4 has been nothing short of remarkable. Its conversational skills, creative writing abilities, and coding assistance are game-changers in the realm of AI and natural language processing. While it’s essential to acknowledge its limitations and approach its use with ethical considerations in mind, GPT-4 offers immense potential for revolutionizing the way we interact with technology. As an AI enthusiast, I’m thrilled to witness the continued evolution of these groundbreaking models and can’t wait to see what the future holds for AI-powered language systems.
Use ChatGPT to automatically update your team chat messages for clarity.
As we continue to work remotely in the wake of Covid, communication through text is more common than ever. With messages flying back and forth, there is a real chance of misinterpretation: a message can easily be read with the wrong intent or meaning, which can cause serious issues. To minimize this risk, I created a Power Automate flow that checks my Teams messages for any risk of misinterpretation.
To achieve this, I used the newly released ChatGPT API. ChatGPT is a large language model trained by OpenAI to understand natural language and generate human-like responses. By leveraging ChatGPT's capabilities, I created a Power Automate flow that analyzes the text of a message and checks for potential misunderstandings. If the message was deemed to contain ambiguous wording, too many alternative meanings, or obvious spelling mistakes, the flow would update my Teams chat message.
To accomplish this, I used:
A custom connector to OpenAI ChatGPT API
A custom connector to Microsoft Graph API
A Power Automate flow
The first step was to set up the Power Automate flow. I started by connecting it to my Microsoft Teams account and setting up a trigger that detects new messages. To avoid checking every message during the proof of concept, I added a condition so that only messages containing a code are checked: putting # at the start of a message marks it for review. I then added an action to get the text of the message.
Next, I created an action to call the ChatGPT API through my custom connector. I passed in the text of the message together with an instruction, so ChatGPT knew exactly what to do with the content. The API then returned a response with the analysis, including any potential risks of misinterpretation and a suggested rewrite.
Finally, I added a condition to check whether the analysis indicated a risk of misinterpretation. I let ChatGPT rate the risk between 1 and 100, and along the way I found that anything above 75 was a good threshold for having ChatGPT do a rewrite. I also set the temperature parameter low to minimize creativity in the responses, making them more predictable.
Sending the Teams chat message to the ChatGPT API with an instruction prompt.
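To make the instruction prompt concrete, here is a minimal sketch in Python of the kind of request the custom connector makes. The prompt wording and the JSON response format are my own illustration, while the 1–100 rating, the 75 threshold, and the low temperature mirror the setup described above.

```python
import os
import json
import requests

message_text = "Can you fix the numbers before the meeting?"  # example Teams message

# Instruction prompt (illustrative wording): rate the risk of misinterpretation 1-100
# and suggest a clearer rewrite, returned as JSON so the flow can parse it in a condition.
instruction = (
    "Rate the risk that the following chat message is misinterpreted on a scale "
    "from 1 to 100 (ambiguity, alternative meanings, obvious spelling mistakes) "
    "and suggest a clearer rewrite. Answer only with JSON: "
    '{"risk": <number>, "rewrite": "<text>"}'
)

response = requests.post(
    "https://api.openai.com/v1/chat/completions",
    headers={"Authorization": f"Bearer {os.environ['OPENAI_API_KEY']}"},
    json={
        "model": "gpt-3.5-turbo",   # the ChatGPT API model
        "temperature": 0.2,          # low temperature = more predictable output
        "messages": [
            {"role": "system", "content": instruction},
            {"role": "user", "content": message_text},
        ],
    },
    timeout=60,
)

# Assumes the model follows the instruction and returns plain JSON.
analysis = json.loads(response.json()["choices"][0]["message"]["content"])
if analysis["risk"] > 75:            # my threshold for triggering a rewrite
    print("Update message to:", analysis["rewrite"])
```

In the actual flow, this request body sits behind the custom connector, and the final condition plus the Graph API call to update the chat message take the place of the print statement.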
OpenAI's GPT models are powerful tools that can be used in many different applications, but one of the areas where they made the most sense to me was within the DevOps platform. For those who don't know what DevOps is, it can briefly be defined as a combination of software development and IT operations that aims to automate and streamline the software delivery process.
In this blog post, we will explore how to use OpenAI GPT models within DevOps to improve the software development process.
What are OpenAI GPT models?
OpenAI GPT (Generative Pre-trained Transformer) models are machine learning models that use deep learning techniques to generate natural language. These models are trained on large datasets of text and can generate human-like responses to prompts.
The GPT models are pre-trained on large amounts of text and can be fine-tuned on specific tasks. This allows the models to generate natural language responses to specific prompts, such as questions or requests for information.
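In practice, "responding to a prompt" is just an API call. As a rough illustration (the model name and prompt are only examples), a plain GPT-3 completion request looks something like this:

```python
import os
import requests

# Minimal sketch of prompting a GPT-3 model with a plain completion request.
response = requests.post(
    "https://api.openai.com/v1/completions",
    headers={"Authorization": f"Bearer {os.environ['OPENAI_API_KEY']}"},
    json={
        "model": "text-davinci-003",
        "prompt": "Explain in one sentence what a pull request is.",
        "max_tokens": 60,
    },
    timeout=60,
)
print(response.json()["choices"][0]["text"].strip())
```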
How can OpenAI GPT models be used in DevOps?
One way to use OpenAI GPT models in DevOps is to improve the software development process. Here are a few ways that GPT models can be applied within DevOps:
Code commenting and documentation quality analysis: Use GPT-3 to analyze the quality of comments in the codebase and suggest improvements to make them clearer, more concise, and more informative. GPT-3 can also analyze the quality of the documentation in the wiki. This process helps ensure that the documentation accurately reflects the functionality of the code and the user stories.
Test automation: Use GPT-3 to automatically generate test automation scripts based on natural language descriptions of the desired tests. This process helps streamline the testing process by automating the generation of test scripts and reducing the need for manual testing.
Code style enforcement: Use GPT-3 to enforce a consistent code style across the codebase by suggesting corrections or reformatting the code. This process helps ensure that the codebase follows consistent formatting and style guidelines, making it easier to read and maintain.
Project estimation: Use GPT-3 to estimate the time and resources required to complete a project based on natural language descriptions of the requirements and constraints. This process helps ensure accurate project planning and resource allocation.
Code standardization: Use GPT-3 to standardize the codebase by suggesting common programming practices, coding standards, and design patterns. This process helps ensure that the codebase follows consistent coding practices and design patterns, making it easier to read and maintain.
Improving commit messages: Use GPT-3 to suggest better commit messages based on the changes made to the code. This process helps ensure that commit messages accurately reflect the changes made to the codebase (a minimal sketch of this one follows after the list).
Enhancing natural language search: Use GPT-3 to analyze code and generate descriptions and tags for functions and classes to improve your DevOps platform’s natural language search capabilities. This process helps improve the discoverability of code and makes it easier to find specific code snippets.
Code generation: Use GPT-3 to generate code snippets based on natural language prompts from user stories and acceptance criteria. This process helps automate the code writing process, reducing the need for manual coding.
Automated testing: Use GPT-3 to generate test cases based on the code and user stories, which can then be automatically executed to test your code. This can dramatically speed up test case automation.
Project management: Use GPT-3 to generate reports and dashboards based on project data. This process helps automate project reporting, ensuring accurate and timely reporting.
Natural language interface: Use GPT-3 to create a natural language interface for your DevOps platform. This process helps improve the platform’s usability by allowing users to interact with it using natural language commands. For example, ask DevOps to create a user story to resolve the feature gap from bug 123 or ask it to check which pipelines are currently running in the organization.
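To give a feeling for how small these integrations can be, here is a minimal sketch of the commit message idea from the list above: feed the staged diff to GPT-3 and ask for a one-line commit message. The prompt wording, the model name, and the wrapper around git are my own illustration, not a finished extension.

```python
import os
import subprocess
import requests

# Sketch of the "improving commit messages" idea: summarize the staged diff
# into a one-line commit message. Prompt and model name are illustrative.
diff = subprocess.run(
    ["git", "diff", "--cached"], capture_output=True, text=True, check=True
).stdout

response = requests.post(
    "https://api.openai.com/v1/completions",
    headers={"Authorization": f"Bearer {os.environ['OPENAI_API_KEY']}"},
    json={
        "model": "text-davinci-003",
        "prompt": "Write a concise one-line commit message for this diff:\n\n" + diff[:6000],
        "max_tokens": 60,
        "temperature": 0.2,
    },
    timeout=60,
)
print(response.json()["choices"][0]["text"].strip())
```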
Any one of these warrants a blog post of its own, but this post is more about the art of the possible than an in-depth exploration of each.
Implementation takes forever?!
You might think it takes a huge effort to accomplish any of these features within Azure DevOps. This is, of course, not true. You can easily start with a proof of concept by copying DevOps text into any GPT-based chat application such as ChatGPT, or by using the GPT playgrounds available on, for example, the OpenAI website.
A few examples using a manual approach
GPT-3 as the Azure DevOps API, where the prompt is the requirement and the output is an API call you can copy and paste into Postman:
GPT-3 as a business consultant writing User Stories, where the prompt is the requirement description from a customer workshop:
This is not the best long-term solution, but it will quickly give you an idea of the feasibility of using this approach.
Integrating GPT-3 programmatically
More automated approaches include integrating GPT-3 directly into Azure DevOps through Azure DevOps extensions, Logic Apps, Power Automate flows, or Azure Functions.
I prefer to consume the GPT model through Azure OpenAI and reuse the API/custom connector across multiple DevOps organizations or any other application requiring a similar service. This also gives me the ability to manage everything on one platform, but it works just as well using the OpenAI APIs directly. There is also a cost perspective: prices differ between the two and keep changing.
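For illustration, the main practical difference between the two is the endpoint and the authentication header. A rough sketch against Azure OpenAI could look like this, where the resource name, deployment name, and api-version are placeholders for whatever you have provisioned in your subscription:

```python
import os
import requests

# Same kind of completion request as before, but against Azure OpenAI.
# Resource name, deployment name, and api-version are placeholders.
resource = "my-openai-resource"     # hypothetical Azure OpenAI resource
deployment = "text-davinci-003"     # hypothetical deployment name
url = (
    f"https://{resource}.openai.azure.com/openai/deployments/"
    f"{deployment}/completions?api-version=2022-12-01"
)

response = requests.post(
    url,
    headers={"api-key": os.environ["AZURE_OPENAI_KEY"]},
    json={"prompt": "Summarize this user story in one sentence: ...", "max_tokens": 60},
    timeout=60,
)
print(response.json()["choices"][0]["text"].strip())
```

The same request can sit behind a custom connector and be reused across DevOps organizations, just like in the Teams example earlier.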
Conclusion
OpenAI GPT models are a powerful tool that can be used within DevOps to improve the software development process. Whether it's automating customer support, generating code, automating testing, or improving documentation, GPT models can save time and improve the overall quality of the software. I don't foresee a future where we don't use GPT-3 or similar models to amplify the work we do, but I equally have no idea how much our DevOps engagement will change. Only time will tell, but I am sure we have just seen the beginning of it all.
If you have other cool ideas on using GPT-3 within your DevOps process, please let me know!