Search tools are critical to the shopping experience. The search box on any ecommerce site remains the single largest contributor to transactions, and not having an efficient product search tool can be catastrophic: over 37% of shoppers who abandon a retail site after a poor search experience say they are unlikely to return.
Search itself isn’t new; it has been around for a long time. Yet a perennial challenge for retailers is that searching for products is still not as easy or intuitive as customers expect it to be.
The way product search is currently architected, relying on heavy indexing and annotation of attributes (e.g., keywords, search terms), has likely reached its limits as we move into a semantic and visual web where intent and meaning matter more than the search terms themselves. Inspiration, discovery, and purchase are three very different stages for a shopper, and current search tools have no way of differentiating the shopper’s intent across each stage or accurately personalizing quality results for them. To overcome this, retailers need to move beyond the search box.
Beyond the Single-Modality Approach
Most search tools today are single-modality: text, image, voice, or natural language alone. To improve the product search experience, we need to move beyond this single-modality approach and embrace a new way of searching for products using keywords, images, and even natural language in the way shoppers would most naturally express themselves in the real world. Tools like Google’s new multi-search, among others, are leading the way in building a smarter search experience for products across various modes. According to Gartner, consumers favor voice and visual search, and brands can increase digital commerce revenue by 30% by adding these capabilities to their existing search experience, especially on mobile.
The multi-modality approach aims to gain a deeper understanding of the search (including its intent) rather than just the query itself. For instance, if you are looking for a coffee table to go with your Zen-styled living room, you can upload an image of your living room, type “Zen-styled coffee table” into the search tool, and let the AI behind multi-search generate the most relevant and precise set of results. It mirrors the way a shopper walks into a furniture store and starts a conversation with the store associate, describing what they are looking for, perhaps with an inspirational image they pinned on Pinterest or found on Google.
Multi-search tools take all these inputs and convert them into “vectors”, matching them against the closest vectors of products in the catalog. The key difference is the ability of multi-search tools to vectorize any combination of inputs – keywords, images, videos, or natural language – instead of keywords or images alone.
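Under the hood, this matching can be pictured as nearest-neighbor search over embedding vectors. The sketch below is a minimal, illustrative example only: the product names, embedding values, and the simple weighted-average fusion are all hypothetical stand-ins; production systems use learned joint text–image encoders (e.g., CLIP-style models) and approximate nearest-neighbor indexes over millions of catalog items.

```python
import math

def dot(a, b):
    return sum(x * y for x, y in zip(a, b))

def cosine(a, b):
    # Cosine similarity: 1.0 means the vectors point the same way.
    return dot(a, b) / (math.sqrt(dot(a, a)) * math.sqrt(dot(b, b)))

# Hypothetical catalog: each product is represented by an embedding
# computed offline by a joint text/image encoder (values invented here).
CATALOG = {
    "zen-coffee-table": [0.9, 0.1, 0.3],
    "industrial-bar-stool": [0.1, 0.8, 0.5],
    "rattan-armchair": [0.4, 0.6, 0.2],
}

def fuse(text_vec, image_vec, text_weight=0.5):
    """Combine the text and image query embeddings into one vector.
    A simple weighted average; real systems learn this fusion step."""
    return [text_weight * t + (1 - text_weight) * i
            for t, i in zip(text_vec, image_vec)]

def search(query_vec, catalog=CATALOG, top_k=2):
    """Rank catalog products by cosine similarity to the query vector."""
    ranked = sorted(catalog,
                    key=lambda name: cosine(query_vec, catalog[name]),
                    reverse=True)
    return ranked[:top_k]

# "Zen-styled coffee table" text + living-room photo, as query embeddings
# (illustrative numbers, not from a real model).
query = fuse([1.0, 0.0, 0.2], [0.8, 0.2, 0.4])
results = search(query)  # most similar products first
```

The same ranking loop serves text-only, image-only, or combined queries: only the way the query vector is produced changes, which is what lets one index answer all modalities.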
With over 81% of retail shoppers conducting online research before buying, eCommerce merchants must deliver a deeper understanding of the search queries on their sites. That means adding multi-search capabilities to product search, going beyond keywords or images alone, and using vector-based algorithms, all of which yield more relevant, precise results for the shopper. Having worked closely with global brands and retailers to bring these capabilities to market through product innovations, we know that search and recommendations are fundamental at every stage of a shopper's journey. This means advanced features that fashion retailers can deploy, such as “search the look”, “shop the look”, or “complete the look”, on product pages, in mobile applications (via the camera lens or gallery), or even within a social platform. eCommerce merchants should also look for search tools built on a decentralized commerce architecture that enables extensible, personalized, friction-free commerce experiences and integrated, optimized back-office operations.
Customer Journey: Connecting the Fragments

As the world moves into a post-COVID environment, the retail scene as we know it has changed. We are ‘phygital’, more omnichannel, personalized, mobile, and definitely more social. Retailers therefore face perhaps their biggest challenge: a fragmented customer journey in which inspiration, search, discovery, and purchase can happen at different moments, or the same moment, across in-store, web, mobile, and social channels. The challenge for merchants is being present at the right moment with the relevant products at the right price. For instance, 73% of retail consumers use multiple channels to shop, and 59% of shoppers use their mobile devices in-store to compare prices or research coupons. Many retailers today are failing badly to connect this fragmented journey; beyond search and recommendations, fragmentation is probably the next biggest challenge in addressing the customer’s journey.

We have seen interest from retailers in exploring more sophisticated types of recommendations, including personalization. It is one of the more proven areas where AI-powered solutions have made a significant impact, processing the voluminous data generated every day – from search terms to click and transaction data. In the world of fashion, where inspiration plays a big part, retailers are using AI-powered recommendation tools, including visually powered ones, to recommend products alongside model- or influencer-styled images that “complete the look” for an outfit. For home furnishing retailers, a similar “Shop-the-Room” experience provides not just relevant but thoughtful, helpful recommendations that fit the buyer’s budget and desires.

The Dawn of Social eCommerce

Gen Z shoppers are the next group of consumers that retailers cannot ignore. Disruptive and distinctive, Gen Z shoppers are calling for innovations in the shopping and social ecommerce experience. They consume content insatiably from Instagram, TikTok, and other social platforms, all of which deliver personalized content feeds.

Merchants need to embrace these social ecommerce platforms so that their products can be seen, discovered, and promoted. For instance, TikTok allows merchants to discover and connect with thousands of creators through its affiliate program, directly accessible on TikTok Shop. TikTok is even testing a “Shop Now” button for influencer videos. While it is still early days, merchants should continue to experiment with reaching this group of consumers across different platforms, using AI-powered tools that uncover and leverage trustworthy user-generated content (UGC) to engage these shoppers.
Oliver Tan, Chief Executive Officer
Oliver Tan is the CEO of ViSenze, an artificial intelligence company powering visual commerce at scale for brands and publishers. As CEO, Oliver oversees the company’s overall growth strategy and leads global business development, covering strategic partnerships, corporate development, and investor relations. Prior to founding ViSenze, Oliver spent five years with a pioneering cybersecurity startup, recognized in the Gartner Magic Quadrant, until its successful acquisition and exit.