This may change medicine forever

By: Rachelle Dragani

How AI and machine intelligence are impacting doctors’ diagnoses.


If you’ve ever had a headache or fever and tumbled down a rabbit hole of Internet-inspired worry, you know how difficult a medical diagnosis can be. Even doctors who’ve spent years studying biology can struggle to correctly identify patients’ illnesses: about 10 percent of patient deaths are caused in part by diagnostic errors, according to the Institute of Medicine at the National Academies of Sciences, Engineering, and Medicine.

While there’s no miracle cure for this complex problem, champions of artificial intelligence (AI) and machine learning are hoping new technology can help increase the accuracy of medical diagnoses.

In the coming years, technology will likely impact medical diagnoses across several specialties. One of the most exciting developments: AI can scan physical samples that were previously too time-consuming for humans to review thoroughly. Plus, a scanning program’s findings can be more accurate than a human reviewer’s.

For example, the WATS-3D diagnostic platform developed by CDx Diagnostics identifies precursors to esophageal cancer (the fastest-growing cancer in the United States) at a higher accuracy rate than human review alone. Unlike screening for some other cancers (such as cervical cancer), it’s nearly impossible to sweep the esophagus in a way that gives physicians a sample they can accurately review for abnormal or cancerous cells.

“You often miss it during an upper endoscopy, because cells are extremely scattered and small,” said Dr. Mark Rutenberg, founder and chief scientific officer of CDx Diagnostics. “I’ve had GIs tell me that some tests only get about three or four percent of the esophagus. It’s like they work for the TSA, but they’re only allowed to check every 30th bag that comes through security.”

The tissue samples collected are also too thick to observe clearly under standard microscopes, making them difficult for human technicians to inspect for abnormalities. Rutenberg and his team created WATS-3D to address that challenge. They developed a 3-D image analysis system and trained AI-based neural networks to spot abnormal cells, helping physicians perform procedures that remove those cells before they can turn into cancer. In August, after about 10 years of development and testing, the American Society for Gastrointestinal Endoscopy added WATS-3D to its approved standard-practice screening guidelines for Barrett’s esophagus (BE), a known precursor to esophageal cancer.
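CDx hasn’t published the internals of WATS-3D, but for readers curious about the general technique, here is a minimal sketch of how a neural network can be trained to flag abnormal cells in image patches. The architecture, image size, folder layout and training settings below are illustrative assumptions, not details of the actual system.

```python
# Minimal, hypothetical sketch of the general technique: training a small
# convolutional network to classify cell-image patches as normal vs. abnormal.
# This is NOT CDx Diagnostics' actual WATS-3D model; everything here is an
# illustrative assumption.
import torch
import torch.nn as nn
from torch.utils.data import DataLoader
from torchvision import datasets, transforms

# Assumed folder layout: patches/normal/*.png and patches/abnormal/*.png
transform = transforms.Compose([
    transforms.Resize((64, 64)),
    transforms.ToTensor(),
])
train_data = datasets.ImageFolder("patches", transform=transform)
train_loader = DataLoader(train_data, batch_size=32, shuffle=True)

# A deliberately tiny CNN: two conv blocks followed by a binary classifier head.
model = nn.Sequential(
    nn.Conv2d(3, 16, kernel_size=3, padding=1), nn.ReLU(), nn.MaxPool2d(2),
    nn.Conv2d(16, 32, kernel_size=3, padding=1), nn.ReLU(), nn.MaxPool2d(2),
    nn.Flatten(),
    nn.Linear(32 * 16 * 16, 64), nn.ReLU(),
    nn.Linear(64, 2),  # logits for [normal, abnormal]
)

optimizer = torch.optim.Adam(model.parameters(), lr=1e-3)
loss_fn = nn.CrossEntropyLoss()

for epoch in range(5):  # a real system would train far longer, with validation
    for images, labels in train_loader:
        optimizer.zero_grad()
        loss = loss_fn(model(images), labels)
        loss.backward()
        optimizer.step()
```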

But that’s not the only advance in medical diagnosis. A team of Stanford researchers has trained AI to spot skin cancers. The tool could potentially be marketed as a smartphone app that would allow users to take pictures of questionable moles and receive advice on whether or not to see a physician, minimizing diagnostic errors and making healthcare more accessible.
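The details of any eventual app are speculative, but the basic flow would look something like the sketch below: run a photo through a trained classifier and translate its confidence into plain-language advice. The model interface, label order and cautious 0.3 threshold are assumptions for illustration only, not features of the Stanford work.

```python
# Illustrative sketch only: turning a mole-photo classifier's output into advice.
import torch
from torchvision import transforms
from PIL import Image

def advise(model, photo_path, threshold=0.3):
    preprocess = transforms.Compose([
        transforms.Resize((224, 224)),
        transforms.ToTensor(),
    ])
    image = preprocess(Image.open(photo_path).convert("RGB")).unsqueeze(0)
    with torch.no_grad():
        probs = torch.softmax(model(image), dim=1)[0]
    p_suspicious = probs[1].item()  # assumed label order: [benign, suspicious]
    if p_suspicious >= threshold:   # err on the side of caution
        return f"This looks suspicious (p={p_suspicious:.2f}); please see a dermatologist."
    return f"This looks low-risk (p={p_suspicious:.2f}); keep monitoring it."
```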

Accessibility is also aided by startups like Ada and K Health, which have developed chatbot-like apps that let patients list their symptoms and receive feedback far more customized than a panicked read of the Internet will provide.
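Production symptom checkers rely on large, clinically curated knowledge bases and far more sophisticated models, but the underlying idea (score possible conditions by how well they match the reported symptoms) can be sketched in a few lines. The tiny dictionary below is purely illustrative and is not how Ada or K Health actually work.

```python
# Hypothetical toy knowledge base mapping conditions to typical symptoms.
KNOWLEDGE_BASE = {
    "common cold": {"runny nose", "sore throat", "cough", "sneezing"},
    "influenza": {"fever", "cough", "body aches", "fatigue", "headache"},
    "migraine": {"headache", "nausea", "light sensitivity"},
}

def rank_conditions(reported_symptoms):
    reported = set(reported_symptoms)
    scores = {}
    for condition, known_symptoms in KNOWLEDGE_BASE.items():
        overlap = reported & known_symptoms
        if overlap:
            # Fraction of the condition's known symptoms that the user reported.
            scores[condition] = len(overlap) / len(known_symptoms)
    return sorted(scores.items(), key=lambda item: item[1], reverse=True)

print(rank_conditions(["headache", "fever", "cough"]))
# e.g. [('influenza', 0.6), ('migraine', 0.333...), ('common cold', 0.25)]
```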

Machine intelligence is also being used to cut down on misdiagnosis in cardiology and other specialties. Companies like Smart Blood Analytics are using AI tools to develop technologies that allow users to diagnose everything from rare blood cancers to brain tumors with nothing but an at-home blood sample and a smartphone. These tools may also generate statistics that give new insight to both doctors and data scientists.
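Smart Blood Analytics hasn’t published this exact pipeline, but the general approach (train a classifier on routine blood-panel values and flag patients who may need follow-up) can be sketched as follows. The feature names, data file and model choice are assumptions for illustration.

```python
# Hypothetical sketch: a classifier trained on blood-panel values.
import pandas as pd
from sklearn.ensemble import RandomForestClassifier
from sklearn.model_selection import train_test_split
from sklearn.metrics import classification_report

# Assumed CSV with columns like hemoglobin, wbc_count, platelets, ..., diagnosis
data = pd.read_csv("blood_panels.csv")
X = data.drop(columns=["diagnosis"])
y = data["diagnosis"]

X_train, X_test, y_train, y_test = train_test_split(
    X, y, test_size=0.2, random_state=42, stratify=y
)

model = RandomForestClassifier(n_estimators=200, random_state=42)
model.fit(X_train, y_train)

# Report per-class precision/recall; in a clinical setting, numbers like these
# would need rigorous prospective validation before the tool touched a patient.
print(classification_report(y_test, model.predict(X_test)))
```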

While all of this technology is impressive and exciting, we’re not quite at the stage where anyone with an iPhone can become a doctor. Many of these applications need more testing. The most successful AI applications have been created when scientists and developers narrow their focus to identifying a specific malady. Those projects tackle the problems that humans and current technology can’t, improve access to healthcare, and save some lives in the process, said Rutenberg.

“The human brain is unmatched by any computer,” he explained. “But computers and trained neural networks can help human brains see things they might have missed, and help us figure out the next steps to prevent cancer or anything serious. It’s better than a cure for a disease that you already have; it’s making sure you never get that disease in the first place.”

For more information, see:

Stanford team trains AI to spot skin cancers

WATS-3D by CDx Diagnostics

For related media inquiries, please contact story.inquiry@one.verizon.com

About the author:

Rachelle Dragani is a freelance writer based in Brooklyn who regularly covers science, technology and innovation. Her work has appeared in TIME Magazine, Gizmodo, and Popular Mechanics.
