
The most common form of cancer in the United States is, without a doubt, skin cancer. Every year, doctors diagnose over 70,000 melanomas, a process that requires examining thousands, possibly even millions, of moles, freckles, and skin spots. Doctors determine whether a blemish should be biopsied to confirm the presence of melanoma by examining it under a variety of magnifying instruments, and they typically err on the side of caution and order more biopsies than necessary. In fact, only one out of 10 biopsied lesions turns out to be melanoma. While no one can blame the doctors for their prudence, the sheer volume of unnecessary biopsies raises the question: Is there a better way to detect this disease?

As it turns out, there is. Doctors are now turning to artificial intelligence (AI) programs that use deep learning algorithms to help detect melanomas and other diseases that require image interpretation. At Stanford University, for example, a team of computer scientists and doctors trained a deep learning algorithm to identify over 2,000 skin diseases using a dataset of nearly 130,000 images. According to their study, the algorithm identified skin diseases as accurately as 21 board-certified dermatologists, which illustrates AI’s potential to help doctors identify potentially life-threatening skin lesions and other conditions. And as the algorithms become more advanced, they may even outperform doctors by detecting cancers and diseases earlier and with fewer unnecessary biopsies.
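To give a rough sense of what that kind of training looks like, here is a minimal sketch of the general approach: fine-tuning an image classifier that was pretrained on everyday photos so it can sort skin images into disease categories. This is an illustration of the technique, not the Stanford team’s actual code, and the folder layout and class labels are assumptions made for the example.

```python
# A minimal transfer-learning sketch in PyTorch (illustrative only).
# It assumes a hypothetical folder of dermatology images organized one
# directory per diagnosis, e.g. skin_images/train/melanoma/..., skin_images/train/nevus/...
import torch
import torch.nn as nn
from torch.utils.data import DataLoader
from torchvision import datasets, models, transforms

# Standard ImageNet-style preprocessing for a pretrained backbone.
preprocess = transforms.Compose([
    transforms.Resize(256),
    transforms.CenterCrop(224),
    transforms.ToTensor(),
    transforms.Normalize(mean=[0.485, 0.456, 0.406],
                         std=[0.229, 0.224, 0.225]),
])

# ImageFolder infers class labels from the subdirectory names.
train_set = datasets.ImageFolder("skin_images/train", transform=preprocess)
train_loader = DataLoader(train_set, batch_size=32, shuffle=True)

# Start from a CNN pretrained on general images and replace its final
# layer with one sized to the number of skin-disease classes.
model = models.resnet50(weights=models.ResNet50_Weights.DEFAULT)
model.fc = nn.Linear(model.fc.in_features, len(train_set.classes))

criterion = nn.CrossEntropyLoss()
optimizer = torch.optim.Adam(model.parameters(), lr=1e-4)

# One fine-tuning pass over the labeled images.
model.train()
for images, labels in train_loader:
    optimizer.zero_grad()
    loss = criterion(model(images), labels)
    loss.backward()
    optimizer.step()
```

The key idea is reusing a network that has already learned general visual features, so a comparatively modest set of labeled medical images is enough to adapt it to a new diagnostic task.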

“Outfitted with deep neural networks, mobile devices can potentially extend the reach of dermatologists outside of the clinic,” the researchers noted in the study. “It is projected that 6.3 billion smartphone subscriptions will exist by the year 2021 and can therefore potentially provide low-cost universal access to vital diagnostic care.”

Stanford’s researchers aren’t the only team teaching AI to identify diseases. Watson, IBM’s AI engine, can analyze CT scans for blood clots and track the motion of heart walls during echocardiograms, and it may soon be able to identify skin cancers; Watson has studied over 30 billion images, giving it an incredible breadth of knowledge across various fields of medicine. Arterys, a cloud-based medical imaging company, recently partnered with GE Healthcare and secured FDA approval for an algorithm that accurately measures the amount of blood that flows through a patient’s heart with each contraction, and it takes only 15 seconds.

The tech community has been excited by the possibilities of AI for years, and now doctors and healthcare providers can share in that excitement. New advances in AI can improve patient outcomes by enabling faster, more efficient, and less invasive methods of identifying disease, leading to earlier detection and lower mortality rates. Greater incorporation of AI can also free doctors to spend more time on patient care and less time studying images of potential diseases. Better diagnostic practices? Now, it seems, there’s an app for that.