Earlier this month, Google I/O 2021 came to our screens with a huge amount of content, news, updates, and advancements. Here, we take a look at some of the most significant announcements across Shopping and Search and what they mean for digital strategies.
1. Shopping.
Late last year, Google Shopping launched unpaid listings in the UK, so we were expecting to see further developments in the Shopping space, especially as tracking the performance of shopping results becomes crucial to all ecommerce strategies, not just Paid Media.
Expanding the inventory.
This year, Google explained how their Shopping Graph is used to show users the most relevant shopping information through a “comprehensive, real-time dataset about products, inventory and merchants”. As part of this, Google announced they are expanding their partnership with Shopify to allow Shopify’s 1.7 million merchants to easily feature their products across Google.
Listing products on Google was previously a hassle for Shopify merchants, so this could be huge for ecommerce sites that run on the platform, with their products now much more easily discoverable across Search, Shopping, YouTube, Images and beyond.
Shopping through screenshots.
In one of the more advanced developments, Google announced the capability for users to now shop through the screenshots they take. Recognising that screenshots are one of the most popular, and easiest, ways for people to save something for later, Google will now let you save these screenshots to Google Photos. Whenever those photos are then viewed, users will be prompted to search the photo with Lens, surfacing search results that help them find the products, or related items, in the photo.
This is huge for online shopping and could signify a shift in how people search and shop for products. The best way to make this work? Optimise your images with alt tags and schema to give Google as much information as possible about what is being displayed. This is crucial for product photos, where the information you provide about them, including price, image location, stock availability, reviews and so on, can be used by Google to display the information directly in Search, Images, and Shopping.
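As a rough sketch of what that markup can look like, the snippet below pairs a descriptive alt tag with schema.org Product structured data in JSON-LD. The product, URLs, and values are all hypothetical placeholders, and the properties shown are a minimal subset rather than a definitive implementation:

```html
<!-- Hypothetical product page snippet: the product, URLs, and values
     are placeholders – swap in your own data. -->
<img src="https://www.example.com/images/red-trainers.jpg"
     alt="Red canvas trainers with white rubber soles, side view">

<script type="application/ld+json">
{
  "@context": "https://schema.org",
  "@type": "Product",
  "name": "Red Canvas Trainers",
  "image": "https://www.example.com/images/red-trainers.jpg",
  "description": "Lightweight red canvas trainers with white rubber soles.",
  "offers": {
    "@type": "Offer",
    "url": "https://www.example.com/products/red-trainers",
    "price": "59.99",
    "priceCurrency": "GBP",
    "availability": "https://schema.org/InStock"
  },
  "aggregateRating": {
    "@type": "AggregateRating",
    "ratingValue": "4.6",
    "reviewCount": "128"
  }
}
</script>
```

The more of these properties you can populate accurately, the more Google has to work with when matching your products to visual searches.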
Shop until you (don’t) drop.
Along with other modules being released, Google will be introducing a new feature in Chrome that allows you to see your open carts when you open a new tab.
Think of it like a to-do list browser extension, but instead of all your tasks for the day, you see all of your baskets. Regardless of whether you go to another site or get distracted by other things, when you open a new tab you'll be able to pick up exactly where you left off.
How big is this? Time will tell. I personally wouldn’t want reminding of how many open carts I have dotted around my favourite websites, but it could be big for encouraging conversions. Whether this will impact things like basket abandonment email campaigns, retargeting activity, or even how these conversions are then tracked (e.g. will they be attributed to Direct traffic, even if that wasn’t the original source?) remains to be seen.
Shopping takeaways.
- Google is continuously expanding the Shopping Graph, building up its merchant information and product inventory
- Google Shopping is growing in importance for ecommerce sites – get your feeds right (see the sample feed item after this list), get your schema set up, and optimise your product pages
- The focus on Lens and Cart Reminders leans heavily towards Google’s mission of building a more helpful web for everyone, but it also leans towards keeping people in the SERPs, using Google’s features. Monitor your traffic, monitor your platforms, and stay up to speed on how the changes impact your performance.
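To make the “get your feeds right” takeaway concrete, here is a minimal sketch of a single product item in the RSS 2.0 XML format that Google Merchant Center accepts. The store, URLs, and values are hypothetical, and a real feed would typically carry more attributes (GTINs, shipping details, and so on):

```xml
<?xml version="1.0" encoding="UTF-8"?>
<!-- Hypothetical single-item product feed: all values are illustrative. -->
<rss version="2.0" xmlns:g="http://base.google.com/ns/1.0">
  <channel>
    <title>Example Store</title>
    <link>https://www.example.com</link>
    <description>Example Store product feed</description>
    <item>
      <g:id>SKU-001</g:id>
      <g:title>Red Canvas Trainers</g:title>
      <g:description>Lightweight red canvas trainers with white rubber soles.</g:description>
      <g:link>https://www.example.com/products/red-trainers</g:link>
      <g:image_link>https://www.example.com/images/red-trainers.jpg</g:image_link>
      <g:price>59.99 GBP</g:price>
      <g:availability>in_stock</g:availability>
      <g:condition>new</g:condition>
      <g:brand>ExampleBrand</g:brand>
    </item>
  </channel>
</rss>
```

Keeping feed values like price and availability consistent with your on-page schema gives Google a single, trustworthy picture of each product.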
2. Search.
LaMDA & the shift from text to conversation.
In one of the more nuanced advancements, Google announced their latest research breakthrough, LaMDA – Language Model for Dialogue Applications – which can engage in free-flowing conversations.
This is a massive development in how Google can understand, interpret, and relay the information a user is asking for. The example application they gave was chatbots, allowing them to move from narrow, pre-defined paths into free-flowing conversations about a seemingly endless range of topics – for example, you might start talking to a chatbot about an issue on a site, and go down a path that leads you to asking about services, product information, or delivery timescales.
Conversations are not linear, and this development takes that into account, allowing Google to gain a better understanding of how conversations develop.
Now, while they explained the development with chatbots, it would be naïve to think that it would only apply there. If Google gains a greater understanding of how conversations develop, they expand their understanding of how topics might relate to each other. If that happens, they can give you better results in Search through greater semantic understanding of what you, and other people, are searching for.
This could be massive for interpreting the intent of search queries and deciding what information should be displayed for users. Our recommendation? Monitor this, closely.
But MUM, what’s the answer?!
Everyone has been that annoying kid at some point pestering their parents with countless variations of the same question trying to get a different result. Turns out, Google is no exception.
As part of their advancements in making Search work better for users, they announced they are working on a new technology, aptly called MUM (Multitask Unified Model), to help with answering complex needs. In the future, you’ll need fewer searches to find the information you need.
Deemed 1,000 times more powerful than BERT, “MUM not only understands language but generates it” – according to Google, it is trained across 75 different languages and many different tasks at once which allows it to develop a more comprehensive understanding of information and world knowledge.
The real kicker? It’s multimodal. That means it doesn’t just understand information in text, as other models do, but in images too, and in future it will expand to video and audio as well.
That, honestly, is mind-blowing. A single model that can understand and interpret queries across a variety of modalities to give you the most comprehensive results. Not only will it be able to serve text-based results, but also topical hub results featuring images, videos, podcasts, and more to give you the most complete answer, with related content to explore further through sub-topics.
This, combined with their Shopping development of searching through screenshots, could mean you conduct a screenshot-based image search and are taken to a result page with products, videos, and related articles about the best options.
Search takeaways.
- The move towards conversation is a huge step, both for companies using chatbots, and the general search space.
- Improved understanding of context, conversational triggers, and semantic relationships will lead to results that better match what searchers actually mean.
- Combine the improved contextual understanding with the development of models that analyse and provide results across multiple modalities, and Search is suddenly a one-stop-shop.
- More and more features bringing users to Google, and keeping them in Search, could both surface additional opportunities and limit the potential of some companies to compete effectively without changing their strategies.
If you’re looking for an agency that can help you identify and capitalise on your biggest Shopping and Search opportunities, driving business performance through digital activation, you’re in the right place. Get in touch with our team today to see how we can help you achieve your goals.