Google develops AI tool to help diagnose skin conditions

Google is launching an artificial intelligence tool to help people self-diagnose skin conditions and decide whether they need to visit a doctor. The tech giant unveiled plans to launch Derm Assist in Europe this year at its annual developer conference. It said that 2 billion people worldwide suffered from dermatologic issues but that there was a “shortage of specialists.” As a result, many people were attempting to find out about their conditions through Google searches.

The new AI tool, which has also been classed as a medical device in the EU, will let users take pictures of their skin, hair or nails from different angles and answer questions about other symptoms to receive a list of potentially matching conditions, such as acne. Google will then show commonly asked questions associated with those conditions, along with other dermatologist-reviewed information, which the company said would help people make “a more informed decision about your next step”.

AI systems have in the past come under fire for failing to be inclusive, with many struggling to identify non-white faces because of biases in models built largely by white, male teams. In 2015, the AI in Google’s Photos app labelled a black couple as “gorillas”, prompting the company to ban the word from its labels. Peggy Bui, a product manager at Google Health, said the team wanted to ensure that the model accounted for factors such as age, sex, race and skin type. She said the model had been developed by analyzing images and case data of diagnosed skin conditions, as well as examples of healthy skin, from different demographic groups.

The launch is part of Google’s wider push into healthcare, which gathered pace after the company merged its AI-focused DeepMind Health division into Google Health in 2019. Google said the new AI tool would unlock ways for people to stay better informed about their health.


Photo Credit: Uladzik Kryhin / Shutterstock.com