AI UX: Transforming the user experience with Vertex AI and Discovery AI

Lauren_vdv
Community Manager


AI UX (AI-driven User Experience) is an emerging term that represents the new way humans interact with computers that’s driven by AI. 

In the past, the way we interacted with computers was limited by the way backend systems, such as databases or search engines, organized data. 

However, as advances in AI technology become available, we’re seeing new ways for humans to interact with computers, driving operational efficiencies and improved user experiences. This “AI UX” is quickly becoming the industry standard for new consumer-facing web services and apps.

In this article, we’ll dive into real-life examples of AI UX, how technology has changed to enable AI UX, and how you can transform your user experiences using Google Cloud’s Vertex AI and Discovery AI solutions. 

This article is based on a recent session from the 2023 Cloud Technical Series. Register here to watch on demand.

If you have any questions, please leave a comment below and someone from the Community or Google Cloud team will be happy to help. 

What is AI UX? 

As mentioned above, AI UX is an emerging term used to describe AI-driven user experiences. 

What are examples of AI UX?

You’ve probably already had many AI-driven user experiences without even realizing it! To get a better idea of what this looks like in action, let’s consider a few AI UX examples. 

Google Lens

Google Lens is a tool where you can use your camera or an image to search for and find products, resources, and other content online. You can also add text to define and clarify your search. 

For example, your bike has a broken thingamajig, and you need some guidance on how to fix it. Instead of poring over catalogs of parts and then looking for a tutorial, you can use the point-and-ask mode of searching with Google Lens to quickly find the exact moment in a video that can help.

[Image: searching for a bike part with Google Lens]

This example of AI UX is known as multimodal search, which is made possible by Google’s AI model, Multitask Unified Model (MUM).  

YouTube

Another AI UX example is YouTube. YouTube aims to identify the videos you like and the subjects that interest you by analyzing your viewing history, feedback on past videos, and other indicators like the channels you subscribe to. This process goes beyond simple personalization - the data and AI help computers develop a more profound understanding of you as a user. 

Gmail

Gmail also uses AI to analyze your sent and received messages to offer the most suitable, recommended word options as you compose a new email. This helps accelerate and enhance your writing experience. 

Most recently, Duet AI for Google Workspace takes AI UX even further with Google Workspace solutions like Gmail, Docs, Sheets, and Slides. 

With Duet AI for Google Workspace, you can create original images from text, right within Google Slides or use the “Help me write” feature in Google Docs - simply enter a topic you’d like to write about, and a draft will instantly be generated for you, including smart chips for information like location and status. Learn more about Duet AI for Google Workspace.


How has technology changed to enable AI-driven user experiences?

So what has changed that makes AI UX possible today? 

In traditional IT systems, most data is organized as structured or tabular data - using simple keywords, labels, and categories in databases and search engines. 

[Image: traditional IT systems organizing data with keywords, labels, and categories]

In contrast, AI-powered services arrange data into a data structure known as “embeddings.” 

[Image: AI-powered services organizing data in an embedding space]

Embeddings are a way for AI to organize data based on its meaning. 

Once trained with specific content (e.g. text, images, tweets, etc), AI creates a space called an “embedding space,” which is essentially a map of the content’s meaning, where contents with similar meaning are closer together in the space. 
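The idea can be sketched in a few lines. This is a toy illustration: the 2-D vectors below are hand-picked stand-ins for what a trained model would produce, chosen so that items with similar meaning sit close together in the space.

```python
import numpy as np

# Toy 2-D "embedding space": these vectors are hypothetical, hand-picked so
# that items with similar meaning point in similar directions.
embeddings = {
    "bicycle": np.array([0.9, 0.1]),
    "bike":    np.array([0.85, 0.15]),
    "guitar":  np.array([0.1, 0.9]),
}

def cosine_similarity(a, b):
    """Similarity of two embeddings: near 1.0 = similar meaning."""
    return float(np.dot(a, b) / (np.linalg.norm(a) * np.linalg.norm(b)))

# "bike" sits far closer to "bicycle" than to "guitar" in this space.
sim_bike_bicycle = cosine_similarity(embeddings["bike"], embeddings["bicycle"])
sim_bike_guitar  = cosine_similarity(embeddings["bike"], embeddings["guitar"])
```

Distance in the embedding space stands in for similarity of meaning, which is what makes "search by meaning" possible.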

The two-tower model, illustrated below, is a specific type of embedding-based search where queries and database items are mapped to the embedding space by two respective neural networks. In this example, the model responds to natural-language queries for a hypothetical literary database.


The two-tower model can be customized to support various business cases, such as determining the best items for users, suggesting the next video to watch, providing the best answer to a question, or even finding the best images for a text query as a multimodal search.
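A minimal sketch of the two-tower idea follows. The two "towers" here are untrained random linear maps, purely for illustration; a real system trains both networks jointly so that matching query-item pairs land close together in the shared space.

```python
import numpy as np

rng = np.random.default_rng(0)

# Two separate "towers" (here just random linear maps, standing in for
# trained neural networks) project queries and items into one shared
# embedding space of dimension EMBED_DIM.
QUERY_DIM, ITEM_DIM, EMBED_DIM = 8, 12, 4
query_tower = rng.normal(size=(QUERY_DIM, EMBED_DIM))  # query encoder weights
item_tower  = rng.normal(size=(ITEM_DIM, EMBED_DIM))   # item encoder weights

def embed(features, tower):
    """Map raw features into the shared space and unit-normalize."""
    v = features @ tower
    return v / np.linalg.norm(v)

query_vec = embed(rng.normal(size=QUERY_DIM), query_tower)
item_vecs = np.stack([embed(rng.normal(size=ITEM_DIM), item_tower)
                      for _ in range(5)])

# Retrieval = nearest neighbor of the query among items in the shared space.
scores = item_vecs @ query_vec
best_item = int(np.argmax(scores))
```

The key property is that queries and items can have completely different raw features, yet become directly comparable once both towers map them into the same space.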

AI UX in action with Vertex AI Matching Engine

Let’s take a look at a customer use case. Mercari is a thriving marketplace service that has around 5.3 million active users in the US and 20 million active users in Japan.

Mercari launched a new service called Mercari Shops in Japan that allows small business owners and individuals to quickly open their own online shops, allowing shop operators to sell items directly to consumers. 

At the core of the new service, Mercari is using Vertex AI Matching Engine to power a crucial part of the shopping experience: displaying relevant, recommended items from different shops together on the same page. 

Previously, this wasn’t possible - shoppers had to visit each shop individually to browse its items. By incorporating AI UX, shoppers can find the items they’re looking for more easily and quickly, with an improved user experience. 

How does this work exactly? 

Mercari applied the text description of each item to a word2vec model combined with TF-IDF to extract the embedding of that item, representing the meaning of the product.
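A common way to combine these two techniques, and a plausible reading of the approach described above, is to take a TF-IDF-weighted average of the word2vec vectors for the words in an item's description. The word vectors and IDF weights below are made up for illustration; in practice both come from models trained on real data.

```python
import numpy as np

# Hypothetical word2vec vectors (2-D for readability) and IDF weights.
# Rare, informative words get higher IDF weight in the average.
word_vectors = {
    "vintage": np.array([0.8, 0.1]),
    "leather": np.array([0.7, 0.3]),
    "wallet":  np.array([0.2, 0.9]),
}
idf = {"vintage": 2.0, "leather": 1.5, "wallet": 1.0}

def item_embedding(description):
    """TF-IDF-weighted average of word vectors, unit-normalized."""
    words = [w for w in description.lower().split() if w in word_vectors]
    weights = np.array([idf[w] for w in words])
    vecs = np.stack([word_vectors[w] for w in words])
    v = (weights[:, None] * vecs).sum(axis=0) / weights.sum()
    return v / np.linalg.norm(v)

emb = item_embedding("Vintage leather wallet")
```

The result is one fixed-length vector per item that captures the meaning of its description, ready to be indexed for vector search.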


Then they introduced Vertex AI Pipelines for building the entire MLOps setup. The first pipeline extracts embeddings from the raw item data, while the second pipeline executes a vector search using Vertex AI Matching Engine to find similar items for a query.
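The core operation of that second pipeline is a nearest-neighbor lookup over item embeddings. Matching Engine performs this approximately at massive scale; the brute-force equivalent below (with random vectors standing in for real item embeddings) shows the underlying computation.

```python
import numpy as np

rng = np.random.default_rng(42)

# Stand-in item index: 1,000 random unit vectors in place of real item
# embeddings. Matching Engine answers the same query approximately,
# over billions of vectors, in milliseconds.
item_embeddings = rng.normal(size=(1000, 64))
item_embeddings /= np.linalg.norm(item_embeddings, axis=1, keepdims=True)

def find_similar(query_vec, k=5):
    """Return the indices and scores of the k items nearest to the query."""
    scores = item_embeddings @ (query_vec / np.linalg.norm(query_vec))
    top = np.argsort(-scores)[:k]   # indices of the k highest dot products
    return list(top), scores[top]

query = rng.normal(size=64)
ids, scores = find_similar(query, k=5)
```

In production, the first pipeline keeps this index fresh as new items are listed, and the second pipeline serves lookups like `find_similar` against it.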


By introducing Matching Engine, Mercari Shops was able to build the production vector search service within a couple of months. Learn more about Mercari’s story here.

AI UX in action with Discovery AI for Retail

Let's take a look at another example of how AI UX can be applied in the retail industry using Discovery AI for Retail solutions.

As part of Discovery AI for Retail, Retail Search is a Google Cloud service that lets retailers apply Google Search-like capabilities to their own product catalogs. Organizations can provide Google-quality search, browse, and recommendation experiences that are customizable and built upon Google’s understanding of user intent and context - helping to increase conversions and reduce search abandonment.


For instance, if a user searches for "date night dress," a traditional keyword search engine would display dresses with “date” patterns, simply because the word “date” appears in the product name or description. 

On the other hand, Retail Search understands the query's intention. In this case, the user is looking for a dress to wear for a date, and therefore, Retail Search returns dresses that are highly likely to be relevant and purchasable for the user.
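The contrast can be made concrete with a toy two-item catalog. The embeddings below are hand-picked to illustrate the idea, not produced by a real model, and the matching logic is a deliberately simplified stand-in for both approaches.

```python
import numpy as np

# Two catalog items with hypothetical embeddings: one matches the *word*
# "date", the other matches the *intent* behind "date night dress".
catalog = {
    "dress with date palm print": np.array([0.2, 0.9]),      # literal "date"
    "elegant evening cocktail dress": np.array([0.9, 0.2]),  # date-night intent
}

def keyword_best(query, catalog):
    """Keyword search: pick the item sharing the most words with the query."""
    terms = query.lower().split()
    return max(catalog, key=lambda name: sum(t in name.split() for t in terms))

def semantic_best(query_vec, catalog):
    """Intent-aware search: pick the item whose embedding is closest."""
    return max(catalog, key=lambda name: float(catalog[name] @ query_vec))

query_vec = np.array([0.95, 0.1])  # hypothetical embedding of "date night dress"
kw_pick = keyword_best("date night dress", catalog)  # matches the word "date"
sem_pick = semantic_best(query_vec, catalog)         # matches the intent
```

The keyword matcher picks the date-palm-print dress, while the embedding-based matcher surfaces the evening dress the shopper actually wants - the same shift in behavior Retail Search delivers at catalog scale.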


The new user experience powered by AI has already made a significant impact on businesses. Before implementing Retail Search, 15% of queries on Lowe's site returned no search results. Now, relevant results are found for many of these queries. 

Macy's e-commerce site also experienced a significant impact thanks to Retail Search: the conversion rate went up by 2% and revenue per visit increased by 1.3%. More importantly, they can now provide a much more sophisticated user experience as an e-commerce site. Learn more.

What’s next for AI UX

Generative AI is a type of machine learning that has received enormous attention as of late. It learns patterns and structures in training data that can be used to analyze information and create new data (e.g. content such as text, images, music, audio, and video).

Generative AI builds on existing Google Cloud data, AI, and ML services, and we’ve introduced a number of technologies to make new levels of innovation possible, including:

  • Generative AI support in Vertex AI, our machine learning development platform. This includes Model Garden, a single place to search, discover, and interact with a wide variety of models from both Google and our partners, as well as Generative AI Studio which provides user-friendly tools to tune and customize models, all with enterprise-grade control over data security.
  • Generative AI App Builder, which delivers Enterprise Search and next-generation chatbots to help automate business processes and build engaging customer experiences. 
  • Duet AI for Google Cloud, which assists cloud users with contextual code completion, reviews, inspections and more. 
  • New foundation models, including Codey for code generation, Imagen for text-to-image creation, and Chirp, our state-of-the-art speech model, to help supercharge the capability and productivity of software teams.
  • Duet AI for Google Workspace, to provide an always-on collaborator for creativity and productivity.

Although generative AI is an emerging technology, organizations are already exploring a range of use cases spanning industries and applications.

Trending use cases include accelerating content creation and software development, improving customer service through personalized self-help interactions and chatbots, and unlocking new ways to search, synthesize, and analyze information.

Learn more about how generative AI is revolutionizing AI UX and how you can get started with Generative AI on Google Cloud solutions here.

AI UX Resources 

Vertex AI Matching Engine

Discovery AI/Retail Search

Generative AI 

Special thank you to Kaz Sato (@kazsato) for delivering the original content during the 2023 Cloud Technical Series.
