TBH, typing is passé.
You’re scrolling through Instagram when you see an influencer rocking a pair of pants. Now you’re not sure whether they’re bootcut or bell-bottoms. To satisfy your curiosity (and buy them for yourself), you turn to a traditional text-based search engine and awkwardly try to describe what they’re wearing. After several failed descriptions and one misspelled word, you’re deep in a rabbit hole of internet searches.
Seems like a lot of effort just to find out what kind of pants someone on Instagram is wearing!
Wouldn’t it be a lot easier if you could just take a screenshot and have the results show up automatically in your search?
That’s visual search pulling off some serious wizardry.
“An image is worth a thousand words, and a million bucks.”
The internet has revolutionized the way we consume information. Two of the most noticeable innovations fostered by the World Wide Web are voice and visual search. While voice search has dominated the search landscape for about a decade now, visual search seems to be the new sheriff in town!
Brought to life by CBIR (content-based image retrieval), also popularized as QBIC (query by image content), visual search has been hiding in plain sight for quite some time. In fact, it was right under our noses for almost 10 years before evolving into mainstream internet searching. It may not be as prevalent as voice search just yet, but it is likely to become a gamechanger in the coming years. Pinterest and Google were the first brands to adopt the functionality; others have since followed suit.
In fact, stats show that nearly 62% of millennials and Gen Z prefer visual search over all other types of internet search.
Visual Search, on paper.
Today’s consumers have an attention span of 8 seconds – a second less than a goldfish. In such an era, visual search capabilities play an important role in capturing the imagination and interest of the consumer.
By definition, visual search uses images as the query for searches in place of text-based search. Simply put, visual search turns your phone’s camera into a search engine – just point your smartphone camera to any object, take a picture, and get search results in a jiffy.
However, visual search goes way beyond matching images with search results. Photo apps can read text on Post-its and turn it into Excel sheets, extract and translate text from images, scan QR codes, and recognize logos, celebrities, handwriting, and landmarks. So much already, and we’re only getting started.
At the moment, visual search is taking fashion e-retail by storm. With well-optimized content, this search type and eCommerce sites are like a match made in heaven. When customers perform a visual search, they look for a product using an image – players like Myntra have tapped into AI-powered recommendation tools that suggest close matches to the product, based on the visual search.
If that isn’t powerful AI, I don’t know what is.
Visual Search vs. Image search
Although both fall under the CBIR umbrella, visual search is more of a “sensory search”. The main difference is the starting point: a classic image search starts with text typed into the search field, leading to a SERP (Search Engine Results Page) that shows a number of images based on the typed prompt, while with visual search an image itself is enough to conduct the search. The results can be narrowed down further by selecting smart filters on the search bar.
|Fun fact: did you know Google was the first search engine to introduce image search, back in July 2001? Only to make it easier for people to find images of Jennifer Lopez in the green dress that broke the internet!|
The technology that drives visual search
Visual search is powered by two elements: computer vision and machine learning. Using these, the search engine can return results for any uploaded picture, just as it does when you type words into the search bar.
Computer vision is the technology that teaches a computer to see and interpret what it is seeing. In a sense, computer vision tries to get machines to understand the world the way we humans see it. Although the technology has been around for several years, advancements in machine learning are what brought it to the forefront for visual search. Machine learning provides the training that computer vision systems need to recognize and interpret all the elements of an image.
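Under the hood, the retrieval step usually boils down to nearest-neighbor search over feature vectors (“embeddings”) that a trained vision model extracts from each image. Here is a toy sketch of that idea, with made-up four-dimensional vectors standing in for the hundreds of dimensions a real model would produce; the catalog names and numbers are purely illustrative:

```python
import numpy as np

def cosine_similarity(a, b):
    # Cosine similarity: 1.0 means the vectors point the same way.
    return float(np.dot(a, b) / (np.linalg.norm(a) * np.linalg.norm(b)))

def visual_search(query_vec, catalog):
    # Rank every catalog item by similarity to the query image's embedding.
    scored = [(name, cosine_similarity(query_vec, vec))
              for name, vec in catalog.items()]
    return sorted(scored, key=lambda pair: pair[1], reverse=True)

# Hypothetical embeddings a vision model might have produced for products.
catalog = {
    "bootcut_jeans": np.array([0.9, 0.1, 0.3, 0.0]),
    "bell_bottoms":  np.array([0.8, 0.2, 0.9, 0.1]),
    "office_chair":  np.array([0.0, 0.9, 0.1, 0.8]),
}

# Embedding of the screenshot the user just took.
query = np.array([0.85, 0.15, 0.85, 0.05])
results = visual_search(query, catalog)
print(results[0][0])  # the closest catalog match
```

Real systems swap the hand-written vectors for CNN or transformer embeddings and the linear scan for an approximate nearest-neighbor index, but the shape of the problem is the same.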
Industries dominating the Visual Search game
The industry using visual search to its fullest potential is, of course, retail and eCommerce. The clothes people wear, a pair of shoes, a trending watch, the furniture in a restaurant: all can be yours with the snap of a camera.
Another area experimenting with visual search is the travel industry. Imagine walking down the streets of Italy without a tour guide. With visual search, you can just point at a historic building and instantly get information about it.
Major industries and companies are delving into visual search for their products and technologies. In fact, according to these stats, the global visual search market will reach USD 28,470 million by 2027, growing at a CAGR of 17.5%. Further, we’ll be seeing a boom in visual search results from the automotive, healthcare, food & beverage, and education industries.
Visual search also offers enormous opportunities when it comes to reducing language barriers. With Google Lens, a user can simply point their camera at a sign in a foreign language and it will be translated automatically on their phone. Visual search can be extremely useful in foreign markets as a result, especially in countries that use different scripts, like Russia, the Arab world, China, and Japan.
OG players of visual search
- Pinterest Lens
Launched in 2017, Pinterest Lens uses your phone’s camera through the Pinterest app to take pictures of objects and surface similar Pins that match the picture.
Just a year after launch, Pinterest reported that 600 million visual searches were taking place on its app every month, with home decor among the most popular Lens search categories.
Capitalizing on its users’ cravings for more, Pinterest went on to release ‘Shop the Look’ pins, which add little white dots to images when they are expanded. The dots are clickable, so users can discover more details about selected products in the images. These dots, also adopted by Instagram, use elements of visual search to add extra cues to click through and buy a selected product.
- Google Lens
Just a few months after the launch of Pinterest Lens, Google launched its own visual search app called Google Lens. In May 2018, it was embedded in the camera, Google Assistant, and Google Photos apps on Android devices. In October 2018, Google released Lens within Google Images. The new feature allowed users to click on an object within a photo and then pull up similar products or pages based on the image. For instance, if you see a photo of a coffee shop and you really like the bookshelf in it, you can click on the bookshelf in the photo and Google will pull up links to similar products and places where you can buy them.
- Instagram Shopping
On September 17, 2018, Instagram announced plans to launch a new shopping feature that would incorporate elements of visual search. The feature lets users tap a product that appears in an Instagram Story or post and get a link to either buy the product or learn more about it.
- Amazon Rekognition
We expected Amazon to take the Pinterest and Google route and develop its own visual search engine. Instead, Amazon has made its visual search technology available to third parties. This is how it works: Amazon Rekognition allows companies to add image and video analysis to their applications. These third parties provide the image or video to Amazon, and the Rekognition API then identifies the objects, people, or text in the provided media. Rekognition’s features include facial recognition (similar to Facebook suggesting friends to tag in uploaded pictures), the ability to identify celebrities, and the type of scene captured in a photo.
- Snapchat Camera Search
On September 24, 2018, Snapchat hopped on the visual search bandwagon by introducing the option in its app. Instead of developing its own technology, Snapchat partnered with Amazon, using the Rekognition API as the basis of its visual search feature. The feature allows users to take a photo of an item and immediately be transferred to Amazon’s app with a link to the product they photographed so they can complete the purchase.
- eBay Image Search
In 2017, eBay announced it would be joining Pinterest and Google in the visual search game to help users shop better for products with just a ‘click’ on its platform. But eBay does things a bit differently. It offers two visual search options. The first is image search, which lets users take a photo and have the app pull up similar content on eBay. The second is ‘Find It On eBay’, which lets users take a photo they find on the web and submit it to eBay, which then uses its visual search technology to find similar products.
- Bing Visual Search
On June 21, 2018, Bing launched its Visual Search, giving users the ability to search for, shop, and find information about the world around them through the photos they take on their phones. Take a photo of a flower you like, and the Bing app will identify the species and provide links to more information, like a Wikipedia page. The Bing app can also be used as a shopping app with visual search. With features similar to Google Lens, it provides fierce competition to Google’s offering.
How can brands and marketers leverage visual search?
As search expands beyond its traditional forms, new technologies continue to come to the forefront, and it is important to understand and take advantage of them as soon as possible.
To get the most out of visual search, brands and businesses should concentrate on providing a seamless customer experience and integrating visual search into their platforms. But what are the primary benefits of optimizing for visual search?
- Get discovered by the next gen
60% of Gen Z discover brands solely through social applications, and 69% of them look to purchase directly off the back of those platforms. This is the right time for brands to get discovered outside of traditional search fields. For example, marketers are seeing Pinterest’s potential for reaching consumers as they’re considering products, inserting their brand into the interaction by delivering relevant content, ads, and shopping experiences.
- Create a connection with new customers
Potential customers look for any kind of personal connection with a brand, and trust is critical in driving them towards a purchasing decision. Visual search allows the consumer to form a more emotional connection, which translates into less price sensitivity.
- Reach customers who have already made a decision
With properly optimized content for visual search, brands can interact with people who have already made up their minds about a purchase, especially if they are using Pinterest Lens. Furthermore, with the advent of immersive AR technology, consumers can not only see products in 3D but also project them onto real-world surfaces. That makes products compelling and personal, and bridges the gap between buying in a store and buying online. Often referred to as “spearfishing”, visual search cuts down the number of steps between a customer searching for an item and buying it from a website.
Some tactics marketers can deploy right away to make the most of visual search:
- Integrating with intelligent chatbots: Having a chatbot initiate a conversation based on a picture a user takes is a good starting point, and partially solves the problem of intent. For example, say you buy a foreign-labeled product and want to read the description. You snap a picture, and a chatbot pops up asking whether you would like a translation of the label or want to find similar products.
- Build an image library of products: Make sure to have an image that accompanies all products you sell so that they can be picked up by visual search engines in the future.
- Combining visual search with text: Rather than view visual search as competing with text search, marketers should find ways to integrate the two together.
- Increase presence on image-centric social media: Google, Pinterest, Instagram, and Snapchat are leading the way when it comes to integrating visual search. So marketers should consider organizing photos on these platforms in a structured way that will help them become more readily identifiable.
- Identify items in pictures: Marketers should make it easier for different items in images to be identified. This will help users click on products in photos that they want to learn more about.
- Make product information easily available: Information about the price and availability of a product that gets pulled up by a visual search should be easily available and clear to read for users.
Optimizing visual search with image SEO
Image SEO is now more important than ever as we use more and more images to search for stuff. In the past, image SEO has been an afterthought but with visual search becoming a game-changer, it is now gaining much-needed attention. (Check out our DIY Guide on SEO Audit) You can win a lot if you just use high-quality, relevant images and optimize them thoroughly. It’s not hard to get your images ready for a quality visual search. Use these tips to get a headstart:
Size matters, but that’s not all: to make it easier for visual applications to process your images, ensure that your images can be displayed easily. Pay attention to size, file type, appropriate tags and titles, relevant keywords, and device types when working with images.
Include images in your sitemaps: Sitemaps help Google discover and index your images. To feed Google image info, add details about the images at each URL, including subject matter, caption, title, geo-location, and license, and validate the result with Google’s Structured Data Testing Tool.
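As a rough illustration, a sitemap entry that attaches an image to a page follows Google’s image sitemap extension and might look like this (all URLs are placeholders):

```xml
<?xml version="1.0" encoding="UTF-8"?>
<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9"
        xmlns:image="http://www.google.com/schemas/sitemap-image/1.1">
  <url>
    <loc>https://example.com/products/bell-bottoms</loc>
    <!-- One or more images that live on this page -->
    <image:image>
      <image:loc>https://example.com/images/bell-bottoms.jpg</image:loc>
    </image:image>
  </url>
</urlset>
```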
Schema markup for images: Add product markup that meets the basic metadata requirements. Ensure your product has a name, price, description, currency, and availability. The more details you add about your product, the better: it will make your product more appealing and your results more clickable.
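A minimal schema.org Product snippet covering those basics could look like the following JSON-LD block (product details and URLs are placeholder values):

```html
<script type="application/ld+json">
{
  "@context": "https://schema.org",
  "@type": "Product",
  "name": "Bell-bottom jeans",
  "image": "https://example.com/images/bell-bottoms.jpg",
  "description": "High-waisted bell-bottom jeans in stretch denim.",
  "offers": {
    "@type": "Offer",
    "price": "49.99",
    "priceCurrency": "USD",
    "availability": "https://schema.org/InStock"
  }
}
</script>
```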
Alt-text is important: Descriptive alt-text helps images get indexed in search engines. Crawlers use alt text to understand the meaning of your images, so when a user submits a real image to the search engine, the crawlers know where to look for results.
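In practice this is just the `alt` attribute on the image tag; a hypothetical example:

```html
<!-- Specific, descriptive alt text helps crawlers map the image to queries -->
<img src="/images/bell-bottoms.jpg"
     alt="High-waisted dark-wash bell-bottom jeans, front view">
```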
Image badges: Similar to alt-text, badges are added over image thumbnails to help users identify the type of content associated with the image. Image badges were introduced by Google in 2017.
For now, the badges only work on mobile devices and include the following categories: recipes, products, videos, and animated images. While Google algorithms can automatically detect animations (GIFs), if you want your products to get noticed in other categories, you’ll need to add special markups and structured data fields.
The perils of visual search
Not all is glorious with visual search just yet. Being a gold-tiered HubSpot partner agency, we know it certainly has its shortcomings that require further refinements in the coming years. Here are some of the problems with visual search.
Technical complications: No doubt it is much easier to take a picture of an item than to type out a request on a search engine. From a technical standpoint, however, visual search is quite complicated. It requires complex AI programs and enormous amounts of data to work effectively. So far, improvements on the AI front have been slow, but we can hope to see better results as more and more people begin using images for visual search.
Unclear intent: With visual search, the intent of the user who took the photo is open to interpretation. A photo of a labeled product in a grocery store could be a request to see similar food products, or it could be a request for a translation. The user’s intent when taking a picture is not immediately clear with visual search, the way it is with text.
Scanty usage: Even though visual search has a lot of potential, its usage still pales in comparison to traditional text-based search.
When is visual search most effective?
Visual search is most effective when it answers questions that are hard to put into words: “show me chairs kinda like this hipster egg chair, but different”, or “I don’t know how to describe it, but I’ll know what I want when I see it”. This is most common when you don’t know the name of the product.
Let’s check out some examples:
- At a grocery store, you see a fruit that looks like a sea urchin and you’re very curious to know its name. Simply taking a photo of the fruit is much more convenient than typing out “what is that?”
- When visiting a city as a tourist, a historic monument piques your interest. A text-based search like “a tall red building with many windows in Bangalore” will not work, but a picture of the building will.
- You go to a restaurant where they serve your meal in beautiful bowls that you just have to have in your kitchen. Instead of typing “serving bowls with purple flowers” and hoping for the best, snap a picture and find the exact set online.
Does this mean the demise of text-based search?
Despite the attractiveness of visual search, it suffers from the same problems as voice and video search. The technology is not there yet, and until massive advances are made in AI, users will likely continue to prefer text over visual searches. This means that text-based search will still reign over the search landscape. However, while visual search is in its early stages of development, marketers that position themselves now to capitalize on its growth could see major payoffs down the road. And from a trend forecast standpoint, the two are likely to co-exist peacefully. You’ll still need the magic of traditional search and SEO optimization techniques to get ranked in visual search.
So, what’s in store for visual search?
First of all, the technology will keep improving in accuracy.
We should also be expecting Google to put a lot of resources into integrating visual search with its other products, like Google Maps and Shopping. So, we can expect to see a more accurate interpretation of images and, therefore, more varied and useful results.
My bet is that it will take the launch of a smart device with dedicated visual search capabilities to bring visual search into the mainstream, just like the launch of smart speakers led to the advent of voice recognition and interfaces. The same device could remove the friction currently engulfing visual search – the need to download an app, launch it every time you want to perform a search (the slower the UI, the merrier the search :/), point your smartphone at something (and probably offend someone), and search – making the whole flow natural and intuitive. All of this points to smart glasses, also known as AR glasses, which have the potential to define the future of visual search.
Check Out: Content Curation Tactics for SEO
How can marketers monetize visual search?
Visual search has immense potential for monetization. In fact, it is hard to imagine a type of search that’s more commercially friendly. This is because visual search is the most intuitive way to search for things you’d like to buy – or things you’d like to buy to go with things you already own.
With “regular” text-based search as well as voice search, queries are more likely to be informational, with a relative minority of searches having purchase intent. Whereas visual search, while it can also be used to find out information about an object, is much more likely to be used for product searches – or for product inspiration.
Augmented reality opens up a whole new world of possibilities for visual search. Companies like IKEA have already developed tools that use augmented reality to place furniture within your home, so you can visualize it before you buy it. With LensKart, you can even try spectacle frames on a 3D image of yourself and decide which frame suits your face best.
With accurate visual search technology, the suggestions have the potential to be helpful without being intrusive. Machine learning could be used to learn the consumer’s behaviors over time.
So, how should marketers be preparing for this future? In actuality, we’re a few years away from visual search becoming the face of internet search, and the technology that will usher it in hasn’t arrived yet. So there’s a limit to how much marketers can do at this stage.
But having said that, experimentation leads to breakthroughs. Keeping a close eye on developments will put marketers in a strong position to move in on visual search when the time comes.