Google Lens now uses AI to identify skin conditions


Google is expanding the capabilities of its computer vision-powered app, Google Lens, with a range of new features.

The app, which uses artificial intelligence to identify objects and provide relevant information, can now identify and surface skin conditions, such as moles and rashes. Users can upload pictures through Lens, initiating a search for visual matches.

The feature is not as advanced as the AI-driven app for diagnosing skin, hair and nail conditions that Google launched in the European Union in 2021.

The diagnostic app faced obstacles in the United States, where it would have required approval from the Food and Drug Administration, a step Google chose not to pursue. However, the Lens feature can still be valuable for individuals who are unsure whether to seek medical attention or try over-the-counter treatments.

Additionally, Google announced that Lens is being integrated with Bard, the company's AI-powered chatbot experience. Users can now include images in their prompts to Bard, and Lens will analyze them to help Bard understand the query and respond accurately. For instance, if a user shows Bard a photo of a pair of shoes and asks what they're called, Bard, informed by Lens' analysis, will provide an answer.

These updates are part of Google's ongoing investment in generative AI technologies, with a focus on enhancing the capabilities of Bard.

Last week, Google introduced a new capability that allows Bard to write, execute, and test its own code, improving its problem-solving and programming abilities. In May, Google collaborated with Adobe to enable art generation within Bard.
