Google's AI Search Generative Experience gets videos and photos

Google’s generative AI-powered search experience is getting a big new feature: images and video. If you’ve enabled the AI-based SGE feature in Search Labs, you’ll now start to see more multimedia in the colored summary box at the top of your search results. Google is also making that summary box appear faster and adding more context to the links it places inside it.

SGE may still be in its “experimental” phase, but it’s very clearly the future of Google Search. “It really gives us an opportunity, now, to not always be constrained by the way search worked before,” CEO Sundar Pichai said on Alphabet’s latest earnings call. “It allows us to think outside the box.” He went on to say that “over time, that’s just going to be the way search works.”

SGE raises huge, thorny questions about the future of the web, but it’s also just a tough product to get right. Google is no longer simply trying to find good links for you every time you search; it’s trying to aggregate and generate useful, valid, and relevant information. Video in particular can go a long way here: Google has integrated YouTube more and more deeply into its search results over the years, often pointing to a specific chapter or moment within a video that might help with your “why is my dryer making that noise” query.

You can already see publication dates and photos starting to appear in SGE summaries.
Image: Google/David Pierce

Surfacing and contextualizing links is still critical for Google if SGE is going to work. Google said in a blog post announcing the new features that it will now display publication dates next to the three articles in the summary box in an effort to “help you better understand how recent the information from these web pages is.” 9to5Google also noticed that Google has experimented with adding links inline in the AI summary, though that appears to have been only a test so far. Finding the right balance between giving you the information you’re looking for and helping you find it on your own, with all the implications of those two outcomes, has always been one of Google Search’s hardest problems.

Making SGE faster is also going to take Google some time. All of these tools based on large language models, from SGE and Bing to ChatGPT and Bard, take a few seconds to generate answers to your questions, and in the world of search, every millisecond counts. In June, Google said it had cut loading times in half, though I’ve been using SGE for a few months and can’t say I’ve noticed much difference before and after. SGE is still very slow; it’s consistently the last thing to load on the page, by a wide margin.

However, I have been continually impressed with how helpful SGE can be in my searches. It’s especially useful for all kinds of “where should I go” and “what should I watch” questions, where there’s no single right answer and I’m just looking for ideas and options. Armed with more sources, more media, and more context, SGE may start to usurp the ten blue links even further.
