In his office at the VA hospital in Seattle, Dr. Nadeem Zafar needed to settle a debate. Zafar is a pathologist, the kind of doctor who runs clinical lab tests on bodily fluids and tissues to diagnose conditions like cancer. It’s a specialty that often operates behind the scenes, but it’s a crucial backbone of medical care.

Late last year, Zafar’s colleague consulted him about a prostate cancer case. It was clear that the patient had cancer, but the two doctors disagreed about how severe it was: Zafar believed the cancer was more aggressive than his colleague did. Zafar turned to his microscope – the beloved tool that pathologists rely on to make their diagnoses. But this device is no ordinary microscope. It’s an artificial intelligence-powered microscope built by Google and the U.S. Department of Defense.

The pair ran the case through the special microscope, and Zafar was right. In seconds, the AI flagged the exact part of the tumor that Zafar believed was more aggressive. After the machine backed him up, Zafar said his colleague was convinced. “He had a smile on his face, and he agreed with that,” Zafar told CNBC in an interview. “This is the beauty of this technology, it’s kind of an arbitrator of sorts.”

The AI-powered tool is called an Augmented Reality Microscope, or ARM, and Google and the Department of Defense have been quietly working on it for years. The technology is still in its early days and is not yet being used to help diagnose patients, but initial research is promising, and officials say it could prove to be a useful tool for pathologists without easy access to a second opinion.

There are currently 13 ARMs in existence, and one is located at a Mitre facility just outside of Washington, D.C. Mitre is a nonprofit that works with government agencies to tackle big problems involving technology.
Researchers there are working with the ARM to identify vulnerabilities that could cause issues for pathologists in a clinical setting.

At first glance, the ARM looks a lot like a microscope you might find in a high school biology classroom. The device is beige with a large eyepiece and a tray for examining traditional glass slides, but it’s also connected to a boxy computer tower that houses the AI models. When a glass slide is prepared and fixed under the microscope, the AI outlines where the cancer is located. The outline appears as a bright green line that pathologists can see through the eyepiece and on a separate monitor. The AI also indicates how severe the cancer is, generating a black-and-white heat map on the monitor that shows the boundary of the cancer in pixelated form.

Patrick Minot, a senior autonomous systems engineer at Mitre, said that because the AI is overlaid directly onto the microscope’s field of view, it doesn’t interrupt pathologists’ established workflow. That ease of use is an intentional design choice.

In recent years, pathologists have been contending with workforce shortages, just like many other corners of health care. At the same time, their caseloads have been mounting as the general population grows older. It’s a dangerous combination for the specialty: if pathologists are stretched too thin and miss something, it can have serious consequences for patients.

Several organizations have been trying to digitize pathologists’ workflows to increase efficiency, but digital pathology comes with its own host of challenges. Digitizing a single slide can require over a gigabyte of storage, so the infrastructure and costs associated with large-scale data collection can balloon quickly. For many smaller health systems, digitization is not yet worth the hassle. Full details are posted on OUR FORUM.
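To see why digitization costs balloon, the "over a gigabyte per slide" figure can be sketched with a quick back-of-the-envelope calculation. The pixel dimensions, compression ratio, and annual slide volume below are illustrative assumptions for the sketch, not specs from the article or any vendor:

```python
# Rough estimate of whole-slide image (WSI) storage needs.
# Assumptions (illustrative, not vendor specs): a digitized slide at high
# magnification is on the order of 100,000 x 100,000 RGB pixels, and
# image compression achieves roughly 30:1.

def slide_storage_gb(width_px=100_000, height_px=100_000,
                     bytes_per_pixel=3, compression_ratio=30):
    """Estimated compressed size of one digitized slide, in gigabytes."""
    raw_bytes = width_px * height_px * bytes_per_pixel  # ~30 GB uncompressed
    return raw_bytes / compression_ratio / 1e9

per_slide = slide_storage_gb()   # ~1 GB, consistent with the article's figure
annual = per_slide * 20_000      # hypothetical lab scanning 20,000 slides/year
print(f"{per_slide:.1f} GB per slide, ~{annual / 1000:.0f} TB per year")
```

Even under these modest assumptions, a single mid-size lab accumulates tens of terabytes a year, which is why the article notes that infrastructure costs put full digitization out of reach for many smaller health systems.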