Google shows how its AI-powered Bard is making Search better – Times of India

ChatGPT vs Bard: The AI-powered future is here. Google announced its conversational AI service Bard for Search hours ahead of Microsoft’s new Bing, which is powered by an upgraded ChatGPT language model. While Microsoft opened a limited preview of its product, Google kept the details of how its ‘new’ Search will work with AI under wraps. At an event in Paris, the company finally demonstrated the “new ways” of searching.
“Since the early days of Search, AI has helped us with language understanding, making results more helpful. Over the years, we’ve deepened our investment in AI and can now understand information in its many forms — from language understanding to image understanding, video understanding and even understanding the real world. Today, we’re sharing a few new ways we’re applying our advancements in AI to make exploring information even more natural and intuitive,” the company said in a blog post.
Search using Google Lens
Google says that Lens is now used more than 10 billion times per month. The product lets users search using a smartphone camera or with photos from the Search bar. In an upcoming update, Google will also let users search what’s on their mobile screen.
“In the coming months, you’ll be able to use Lens to ‘search your screen’ on Android globally. With this technology, you can search what you see in photos or videos across the websites and apps you know and love, like messaging and video apps — without having to leave the app or experience,” the company said.
For example, a friend sends you a video of a trip to Delhi. The video shows a monument, and you want to find out more about it. You can long-press the power or home button on your Android phone to invoke Google Assistant and then tap “search screen.” Google Lens will identify the monument, and you can tap through to learn more about it.

AI on multisearch
Google launched multisearch in April last year. The feature allows users to search using both text and images at the same time. It is available globally on mobile, in all languages and countries where Lens is available.
The company later expanded the feature by adding “near me” functionality to help users find what they need locally. Google says this capability, currently available in the US in English, will be coming to more countries. Furthermore, users can now use multisearch globally on any image they see on the search results page on mobile.

For example, you are searching for “wall furniture” and come across a piece you like. However, that product is in black, and you specifically want a more natural finish to match your room. You can use multisearch to add the text “natural colour” and find a product that matches your needs.
“We’re creating search experiences that are more natural and visual — but we’ve only scratched the surface. In the future, with the help of AI, the possibilities will be endless,” Google said.

