Enhancing Business Efficiency: Integrating Azure OpenAI with Custom Software Applications

Integrating Azure OpenAI’s (AOAI) powerful language models, such as GPT-3.5, with custom software applications has become increasingly popular as businesses seek innovative ways to enhance user experiences and automate tasks. Azure OpenAI, with its GPT (Generative Pre-trained Transformer) models, offers a robust foundation for natural language processing. Businesses across a wide range of industries stand to benefit from AOAI’s capabilities. Concurrency works with clients from backgrounds ranging from manufacturing to retail, and our teams have implemented AI solutions that automate meaningful, recurring tasks, such as reading, interpreting, and acting on quote requests, or predicting inventory levels so teams can respond proactively. In this article, we explore the key considerations and the overall process of integrating Azure OpenAI with a custom software application, and we showcase a small Azure application architecture as one possible implementation.

It is common for innovative leaders in various organizations to take an all-or-nothing approach to AI. One frequent misconception is that OpenAI, and specifically its GPT models, is the only tool that can solve a business process problem. Concurrency’s Modern Applications and Data & AI teams are ready to discuss from the outset whether Azure OpenAI is the best route to success for an organization. The set of artificial intelligence models available to our development teams continues to grow, and AOAI is only one of the tools available to our consultants.

One of the most important first steps when discussing an OpenAI integration is deciding where in an organization’s workflow to integrate it and how best to do so. Defining a clear use case is a key step toward success; whether the solution takes the form of a chatbot or an automated process built into an organization’s Outlook, this technology must be targeted well from the outset. The natural next step is to set a boundary on the breadth of knowledge an initial application should be exposed to. A manufacturing company, for example, might target a single product type, or a few, within its catalog. A single product type may still cover thousands of products, so the impact remains meaningful while keeping the first Azure OpenAI integration scoped for success. Artificial intelligence within a company’s daily business flow is best described as a snowball: it starts small, and as it continues to roll it becomes larger and exponentially more useful within the organization.

Among Concurrency’s long list of satisfied clients is a manufacturing company that engaged our teams to explore how its technical support representatives could deliver the highest level of assistance to customers more efficiently. Concurrency’s Modern Applications and AI teams envisioned, architected, implemented, and handed off a robust Teams bot that used Azure OpenAI’s large language models to answer industry- and product-specific questions drawn from technical documents provided by the client. The Azure Bot was integrated with a Teams channel, where it answered questions and provided hyperlinks to the source documentation that informed each answer. This solution not only helped support representatives respond to customers promptly; it also lowered the threshold of specialized knowledge a representative needs to help a customer effectively. Another benefit was that the more the system was used, the more knowledgeable it became, drawing on training data supplied by subject matter experts. The knowledge ingested into the model was then used to train newer representatives. This use case is only one of several successful engagements where Concurrency has helped clients take advantage of artificial intelligence to streamline business processes.

Data security and privacy should be a focal point for every organization. Building an application to intercept, scan, and act on emails that a sales representative receives sounds like a wonderful idea, and it is, but it can also pose challenges for keeping the company in compliance with security and privacy best practices. Concurrency’s development teams have repeatedly faced the challenge of maximizing the effectiveness of a custom application integrated with OpenAI while minimizing the potential for data exposure or privacy violations. Stakeholders need to identify what data should and can be stored, and what should not be, as that decision can determine what type of application is built and how it is integrated with Azure OpenAI. The sketch below illustrates one way to minimize what is sent to the model.
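As an illustration of data minimization, the following sketch strips obvious personally identifiable information, here just email addresses and phone numbers, from a message before it would be forwarded to Azure OpenAI. The regular expressions and the `redact` helper are hypothetical examples rather than part of any specific implementation; a production system would apply the organization’s own compliance rules or a dedicated PII-detection service.

```python
import re

# Hypothetical patterns for obvious PII; a real deployment would use the
# organization's approved redaction rules or a dedicated PII-detection service.
EMAIL_PATTERN = re.compile(r"[\w.+-]+@[\w-]+\.[\w.-]+")
PHONE_PATTERN = re.compile(r"\+?\d[\d\s().-]{7,}\d")

def redact(text: str) -> str:
    """Replace email addresses and phone numbers with placeholders
    before the text ever leaves the organization's boundary."""
    text = EMAIL_PATTERN.sub("[EMAIL]", text)
    text = PHONE_PATTERN.sub("[PHONE]", text)
    return text

# Example: only the redacted body would be sent to the Azure OpenAI deployment.
incoming_email = "Please quote 500 units. Reach me at jane.doe@example.com or 414-555-0100."
print(redact(incoming_email))
# -> "Please quote 500 units. Reach me at [EMAIL] or [PHONE]."
```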

Custom applications integrated with Azure OpenAI need to be designed to maximize the effectiveness of the AI model while minimizing its use so the solution remains fiscally responsible. A well-trained or well-prompted model can handle large volumes of ingested data, but doing so is often not cost-effective. Development teams should establish what data is actually necessary to complete a task before sending it to Azure OpenAI. We recommend using the Azure Pricing Calculator to estimate costs for different volumes of data. A custom application sends in text, and Azure OpenAI breaks that text into tokens. The table below is a helpful approximation of how text maps to tokens; we recommend pairing it with the Azure Pricing Calculator, as prices are subject to change.

1-2 sentences: ~30 tokens
1 paragraph: ~100 tokens
1,500 words: ~2,048 tokens
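As a sketch of how a team might sanity-check these estimates before committing to a design, the snippet below counts tokens with the open-source tiktoken library and applies an assumed per-1,000-token price. The price constant is a placeholder, not a quoted Azure rate; actual pricing should come from the Azure Pricing Calculator.

```python
import tiktoken  # pip install tiktoken

# cl100k_base is the encoding used by the GPT-3.5 Turbo and GPT-4 model families.
encoding = tiktoken.get_encoding("cl100k_base")

# Placeholder rate; look up the real per-1,000-token price in the Azure Pricing Calculator.
ASSUMED_PRICE_PER_1K_TOKENS = 0.002  # USD, hypothetical

def estimate_cost(text: str) -> tuple[int, float]:
    """Return the token count and an estimated cost for a single prompt."""
    tokens = len(encoding.encode(text))
    return tokens, tokens / 1000 * ASSUMED_PRICE_PER_1K_TOKENS

sample = "Please send a quote for 500 units of part A-1234 delivered to our Milwaukee plant."
count, cost = estimate_cost(sample)
print(f"{count} tokens, approximately ${cost:.5f} per request")
```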

A positive user experience is the hallmark of a well-designed application. While automation is the goal of an Azure OpenAI integration, ambiguity about where the integration sits in a process often determines whether end users are satisfied with the implementation. A well-designed user experience not only enhances user satisfaction but also contributes to increased engagement, reduced bounce rates, and positive word-of-mouth recommendations. By prioritizing user needs, preferences, and expectations, developers can create an application that not only meets functional requirements but delights users, ensuring its relevance and success in a competitive digital landscape. An application should therefore use tools that inform users where they are in the process: an Outlook add-in might send feedback as a toast notification or update the email’s category, whereas a fully customized application might use progress bars or other notifications.

All of the above are important topics to discuss with stakeholders during the foundational stages of a successful project. Once the team is ready to begin the implementation stage, it is important to consider the technical design of the application. Below is a high-level architectural diagram of a bare-bones custom Teams bot integrated with Azure OpenAI and hosted in Azure.

The above diagram details an implementation in which an Azure Function is the logic layer of the system and serves as the traffic cop for all interactions. As a user sends messages to the bot, the bot’s exposed Teams channel, via Azure Bot, routes every message to an awaiting application programming interface (API), which in turn integrates with a database where needed and, in this case, an Azure OpenAI instance. These resources are most often hosted in the organization’s own tenant to ensure that all data routed through the system remains within that organization’s cloud domain.
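As a minimal sketch of the logic layer described above, the Azure Function below (Python v2 programming model) accepts a message payload from the bot layer, forwards it to an Azure OpenAI chat deployment, and returns the model’s reply. The environment variable names and the deployment name `gpt-35-turbo` are assumptions for illustration; a real implementation would also add authentication, conversation state, and error handling.

```python
import os
import azure.functions as func
from openai import AzureOpenAI

app = func.FunctionApp()

# Endpoint, key, and API version come from application settings;
# the setting names shown here are illustrative, not a required convention.
client = AzureOpenAI(
    azure_endpoint=os.environ["AOAI_ENDPOINT"],
    api_key=os.environ["AOAI_KEY"],
    api_version="2024-02-01",
)

@app.route(route="chat", methods=["POST"])
def chat(req: func.HttpRequest) -> func.HttpResponse:
    """Receive a user message from the bot layer and return the model's reply."""
    user_message = req.get_json().get("message", "")

    completion = client.chat.completions.create(
        model="gpt-35-turbo",  # hypothetical deployment name in the AOAI resource
        messages=[
            {"role": "system", "content": "You answer product support questions."},
            {"role": "user", "content": user_message},
        ],
    )
    return func.HttpResponse(completion.choices[0].message.content)
```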

A clean slate is often helpful for velocity; however, organizations are often keener to modernize existing applications to minimize cost and shorten time to market. While the Azure OpenAI portion of the initiative would be entirely new, it is entirely possible to integrate an existing application with a new Azure OpenAI deployment. In most cases, this only calls for building a new service within the existing API, along with secrets management to keep the communication secure. Another consideration applies to applications that use older model deployments within Azure OpenAI. For example, if an application currently uses GPT-3 and the organization would like to upgrade to GPT-3.5 Turbo or GPT-4, there are cost implications along with technical considerations; in particular, how newer models process requests and how hyperparameters are tuned can vary between model deployments. For this reason, developers should expect that changing model deployments will require some refactoring of existing implementations.
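One way to limit that refactoring, sketched below under assumed setting names, is to keep the deployment name and the tunable parameters in configuration rather than scattered through the code, so moving from a GPT-3.5 Turbo deployment to a GPT-4 deployment becomes a settings change plus targeted testing. The setting names and default values are illustrative assumptions, not prescribed by Azure OpenAI.

```python
import os
from dataclasses import dataclass

@dataclass(frozen=True)
class ModelSettings:
    """Deployment-specific knobs kept in one place so a model upgrade
    touches configuration, not business logic."""
    deployment: str
    temperature: float
    max_tokens: int

# Values come from app settings; the defaults here are purely illustrative.
settings = ModelSettings(
    deployment=os.environ.get("AOAI_DEPLOYMENT", "gpt-35-turbo"),
    temperature=float(os.environ.get("AOAI_TEMPERATURE", "0.2")),
    max_tokens=int(os.environ.get("AOAI_MAX_TOKENS", "512")),
)

def build_request(messages: list[dict]) -> dict:
    """Assemble the keyword arguments passed to client.chat.completions.create()."""
    return {
        "model": settings.deployment,
        "temperature": settings.temperature,
        "max_tokens": settings.max_tokens,
        "messages": messages,
    }
```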

Integrating OpenAI with a custom software application involves careful planning and consideration of many factors, from use cases and data security to cost and user experience. By following a systematic approach and staying informed about Azure OpenAI’s capabilities, developers can harness the power of AI to enhance their applications and deliver innovative solutions to users. Concurrency’s development teams have deep experience with a wide range of business problems and are among Microsoft’s most engaged partners for custom business solutions that leverage Azure OpenAI. Being at the forefront of artificial intelligence implementations, Concurrency is well equipped to help lead its clients to solutions for their business problems.