Powering AI Applications
The purpose of this document is to offer an end-to-end view of how to materialize value from data, seen through an integration lens. To cover this broad scenario, the post is segmented into the following sections:
One of the most common challenges that we see with our customers today has to do with data activation. When we speak about creating value from our business data, we can find two major use cases: Monetization (external/internal) & Data Activation (internal).
Within the current business landscape, one of the primary enablers of monetization and data activation is the application programming interface (API). APIs give businesses the capability to expose their technological assets securely and effectively, in a controlled and governed manner. The technology comes in many variations and flavors, so multiple aspects must be considered to ensure a successful implementation.
Before diving into data activation as a whole, it's important to understand where businesses stand today. Let's look at the figure below:
The reality is that every business's information is distributed across siloed islands, completely disjointed and disconnected. In this context it is difficult, tedious and expensive to make data globally accessible to our teams, and doing so requires complex integration efforts.
The data, platforms, applications and systems support business processes. In a way, we are trying to build bridges between teams and islands. These bridges act as a foundation for value creation in record time: automating business processes that require multi-platform interactions, enabling native connectivity between systems, giving our IT departments an agile way of creating complex orchestrations and, in the case of this specific post, activating data and making it broadly accessible.
As the journey starts, businesses will stumble upon multiple challenges. At the end of the day, what is needed is technology that lets us easily and securely share data throughout the organization. This technology should enable seamless connectivity across clouds, SaaS and legacy apps, breaking down the data silos and islands and automating business processes.*

\* https://www.accenture.com/us-en/insights/technology/interoperability
Based on the image above, we can observe the composition of most of our customers' technological ecosystems.
Here is where we start to see some of the major challenges. Let's elaborate on some of the most common ones and how Application Integration and Apigee X can help:
Data is of utmost importance and value, and ensuring that it is accessible only to authorized individuals is key. To achieve this, we must identify our consumers and comprehensively assess the resources they have access to. Authentication, authorization and access control are indispensable components in safeguarding data. Apigee X empowers the organization to seamlessly and systematically apply various security mechanisms to protect sensitive information and guard against prevalent API vulnerabilities. The Apigee security stack, coupled with Cloud Armor, serves as a robust defense against the OWASP Top 10, helping ensure the integrity and confidentiality of our data.
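To make the access-control idea concrete, here is a minimal Python sketch of the gatekeeping an API gateway performs before a request ever reaches the backend. This is a conceptual analogue of Apigee's API-key verification, not its actual implementation; the key store and the `protect` decorator are illustrative names invented for this example.

```python
# Conceptual sketch of gateway-side API key verification.
# VALID_KEYS and protect() are illustrative, not an Apigee API.
from functools import wraps

VALID_KEYS = {"demo-key-123": "analytics-team"}  # credentials issued to known consumers

def protect(handler):
    """Reject calls without a recognized API key, the way a gateway
    policy would, before the backend handler is invoked."""
    @wraps(handler)
    def wrapper(api_key, *args, **kwargs):
        consumer = VALID_KEYS.get(api_key)
        if consumer is None:
            return {"status": 401, "error": "invalid or missing API key"}
        return {"status": 200, "consumer": consumer, "body": handler(*args, **kwargs)}
    return wrapper

@protect
def get_orders():
    # Backend logic runs only for authenticated consumers.
    return [{"order_id": 1, "total": 40.0}]
```

In the real product, this check lives in the proxy layer as configuration, so backend teams never have to reimplement it per service.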
Having a centralized hub to discover, document, catalog and manage the lifecycle of APIs is not a simple task. It's one of the most common challenges organizations face when they need to expose their assets, externally or internally. In the context of data activation, governance plays a key role: the data you are exposing, monetizing, activating or sharing will eventually be consumed through an API, which requires a standardized interface, internal visibility and discoverability, as well as management of its different stages of development and alignment with compliance and security requirements. For more information related to API Hub: API Hub Docs
Integrating two applications that are not designed to work together can be a daunting task, involving significant time, budget, and complexity. Application integration offers a solution to this challenge, enabling seamless data and functionality exchange. Native connectors in Application Integration Platform streamline the integration process, reducing complexity and providing a user-friendly interface. This approach allows for rapid integration, saving time and resources while ensuring a seamless user experience.
Explore Connectors
For more information check out our doc pages:
Exposing data that rests (in some cases) in platforms and warehouses that do not support a "transactional paradigm" can result in degraded performance, high latencies and significant backend computation costs. Use cases that require pagination, complex queries or unstructured data heighten the need to optimize the load. Some of the systems and platforms where our core data resides are not optimized for this access pattern, which can also hurt the consumer experience. By employing Apigee's caching policies, we can significantly enhance the consumer experience, reducing latency and minimizing the computational load on backend systems. The volatility and temporal nature of data present challenges, but these can be addressed in a straightforward manner by leveraging the core capabilities of Apigee API Management.
"How about making a query only when we need to, and not with every API call?"
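The effect of response caching can be sketched in a few lines of Python. This is a simplified, in-memory analogue of what a gateway cache policy does (Apigee's actual policy is declarative configuration, not code); `ResponseCache`, `expensive_backend_query` and the TTL value are illustrative.

```python
# Conceptual sketch of TTL-based response caching at the API layer.
import time

class ResponseCache:
    """Tiny in-memory TTL cache, mimicking the effect of a gateway cache policy."""
    def __init__(self, ttl_seconds):
        self.ttl = ttl_seconds
        self._store = {}

    def get(self, key):
        entry = self._store.get(key)
        if entry and time.time() - entry[1] < self.ttl:
            return entry[0]
        return None

    def put(self, key, value):
        self._store[key] = (value, time.time())

calls = {"count": 0}

def expensive_backend_query(q):
    calls["count"] += 1  # imagine a heavy warehouse scan here
    return f"results for {q}"

cache = ResponseCache(ttl_seconds=300)

def handle_request(q):
    # Hit the backend only on a cache miss; every other call within
    # the TTL is served from the cache.
    cached = cache.get(q)
    if cached is not None:
        return cached
    result = expensive_backend_query(q)
    cache.put(q, result)
    return result
```

Two identical requests inside the TTL trigger a single backend query, which is exactly the cost and latency saving the paragraph above describes.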
The exposed data will often not arrive from its origin in the final format or structure needed. This can be because of defined exposure standards, or because consumers require a different structure for consumption. Mediations are very important for taking the raw data and transforming it into clean, readable, standards-compliant response bodies. Application Integration makes data mapping, transformations, mashups and formatting a breeze, and Apigee can add a last-mile layer of transformation if needed with its comprehensive mediation policies.
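A mediation like the one described is, at its core, a mapping from a raw backend record to a standardized response body. The sketch below shows the shape of that transformation in Python; the legacy field names (`CUST_ID`, `FIRST_NM`, and so on) and the target schema are invented for illustration, and in practice this mapping would be configured in Application Integration's data-mapping tasks rather than hand-coded.

```python
# Conceptual sketch of a mediation: raw legacy record -> standardized response.
def mediate(raw):
    """Map an illustrative legacy record to a clean, consumer-friendly body."""
    return {
        "customerId": raw["CUST_ID"],
        "fullName": f'{raw["FIRST_NM"].title()} {raw["LAST_NM"].title()}',
        "balance": {
            "amount": float(raw["BAL_AMT"]),   # stringly-typed amount -> number
            "currency": raw.get("CCY", "USD"), # default when the source omits it
        },
    }

legacy_record = {
    "CUST_ID": "C-001",
    "FIRST_NM": "ADA",
    "LAST_NM": "LOVELACE",
    "BAL_AMT": "1250.50",
}
```

The consumer never sees the cryptic upstream field names or string-typed numbers, only the standardized contract.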
In determining the parameters of data sharing, several considerations must be addressed. First, how often consumers are permitted to query our data must be established: daily, weekly or monthly limits all require careful consideration. It is also essential to monitor the consumption of our data and implement a regulated framework to govern its utilization. In internal applications, limits can be defined according to specific organizational needs; in external monetization scenarios, this challenge becomes even more significant. Ultimately, a well-defined approach to data consumption lets us establish an effective and transparent billing process for our customers and consumers, while also safeguarding against excessive consumption when credits are exhausted. Apigee provides a simplified way to configure and manage quotas, facilitating the implementation of these measures.
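The simplest form of such a limit is a fixed-window counter per consumer. The Python sketch below illustrates the idea; it is a conceptual stand-in for what Apigee's quota enforcement achieves through configuration, and the `Quota` class, limits and consumer names are all illustrative.

```python
# Conceptual sketch of per-consumer quota enforcement (fixed window).
import time

class Quota:
    """Allow at most `limit` calls per consumer within each time window."""
    def __init__(self, limit, window_seconds):
        self.limit = limit
        self.window = window_seconds
        self._counts = {}  # consumer -> (window_start, count)

    def allow(self, consumer):
        now = time.time()
        window_start, count = self._counts.get(consumer, (now, 0))
        if now - window_start >= self.window:
            window_start, count = now, 0  # window expired, start a new one
        if count >= self.limit:
            return False  # quota exhausted: reject (e.g., HTTP 429)
        self._counts[consumer] = (window_start, count + 1)
        return True
```

Because consumption is counted per consumer, the same mechanism that protects the backend also produces the usage figures a billing process needs.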
Things move fast; that much is obvious. The business will want to move at the velocity of its customers and will require its IT departments to keep pace. Application Integration's visual designer, native connectors and varied task repertoire let businesses move quickly and with confidence, with rich mediations, native integration capabilities and an easy-to-use interface. Apigee will govern the exposure, enabling fast security implementations, internal visibility and regulatory compliance.
An example of the integration designer:
Having visibility into the way data is being consumed is key to operating the technology that makes all of this possible. With Monitoring, Analytics and Logging, Application Integration and Apigee give you insight into the most important metrics a business needs to operate, analyze and trace its integration interactions. If customized metrics are needed, custom logs can be configured to persist transactional and business KPIs for further analysis.
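A custom log of that kind is typically just a structured record emitted per transaction. The sketch below shows one way to shape such a record in Python; the KPI field names and the logger name are illustrative assumptions, not a product API, and in Apigee or Application Integration this would be configured through their logging features rather than written by hand.

```python
# Conceptual sketch of a structured per-transaction log record for KPIs.
import json
import logging

logging.basicConfig(level=logging.INFO, format="%(message)s")
logger = logging.getLogger("integration-kpis")

def transaction_record(consumer, endpoint, latency_ms, status):
    """Build a structured KPI record for one API call and emit it as JSON."""
    record = {
        "consumer": consumer,      # who called (from the API credential)
        "endpoint": endpoint,      # which asset was consumed
        "latency_ms": latency_ms,  # end-to-end latency observed
        "status": status,          # HTTP status returned
    }
    logger.info(json.dumps(record))
    return record
```

Because each record is machine-readable JSON, the same stream feeds both operational dashboards and business analysis downstream.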
After delving into the significance and difficulties of data activation - essentially making data accessible - we can now explore the implications for generative AI. The capabilities, accuracy and quality of Large Language Models (LLMs) depend directly on the data we feed them. This is a hot topic, with businesses facing challenges in feeding contextual data from CRM, ERP and core platforms into these models to enhance back-office operations. Throughout the article, the theme of automating processes and orchestrating complex business scenarios through native connectivity and data activation was prominent. Now, I'd like to add another layer to the conversation.
We now have easy-to-consume, governed data, ready to be used for business value creation. This information can be fed into AI applications and help businesses be more agile and efficient, and find data that was lost in the depths of the technology stack. Let's see the following example:
This is the invitation I'm making. We have enabled our business to access data across all our platforms. We can now enrich our back-office operations, employee experience and efficiency through AI applications that answer questions about your operational, organizational and enterprise context.
"How about we have a chat with our core platforms and systems?"
Pen and paper are easy. Let's go one step further. Below is a link to a technical article that shows how to bring this idea to life. This technical demo will enable your organization to prove the value of native connectivity in a short period of time. After that, it's all about dreaming up and extrapolating multiple use cases and potential scenarios.
The first demonstration will cover the following:
Technical Demo Link: Technical Demonstration - Using Application Integration & Vertex Agents to activate BigQuery Data
For our Spanish speakers, here's a YouTube video of a very similar demo!