Privacy Not Included

Healthcare apps should be more focused on privacy. And we should use them only if they are.


Last week was a holiday for me, so I decided to wait with the next email until today. This is somewhat of a special one as Medical Notes got its first ever sponsor! Read on to find out more.

Apart from that, this one includes:

  • Why and how AI is failing to help radiologists.
  • A combination of blockchain and AI might be perfect for training AI models.
  • Some mental health apps are terrible at privacy.

We should stop training radiologists

☢️ Occasionally, I try to show a different perspective on AI.

A computer scientist, Geoffrey Hinton (one of the godfathers of AI), declared in 2016 that we should stop training radiologists. He assumed AI would completely replace them within five years. Obviously, it hasn’t.

The number of radiologists in the US increased by 7% between 2015 and 2019. Only 33% of them used any kind of AI in 2020. And just 40 of the 80 FDA-cleared AI algorithms are in use. What happened?

It’s all great when we’re reading scientific articles about how the newest AI algorithm is a reliable alternative to a radiologist. But putting it into practice seems to be causing some trouble. Those same AI models might be completely ineffective in a different setting, with a slightly different image, system, or protocol.

All of AI, not just healthcare, has a proof-of-concept-to-production gap.

So, no, it looks like AI won’t replace radiologists anytime soon.


Together with

Capillary is transforming routine capillaroscopy practice with artificial intelligence. Whether you want to learn capillaroscopy or improve your practice, we have the perfect tool for you. Start using it today.

Introduce your company/startup to 230+ medical students and doctors. Become a sponsor.


Swarm learning

🙇‍♀️ Training AI algorithms and deep learning models requires large datasets, which are usually centralised. There are some “legal and logistical obstacles” when we’re sending data from one computer to another, especially if that’s happening between different countries.

Federated learning (FL) is a little different. Each computer trains an AI model independently on its own dataset. Then they share only what they learned, not the data itself. But the downside is that there’s still a “central coordinator”, which monopolises control.

The next level is blockchain-based FL and swarm learning (SL). The core principle of using multiple computers remains the same, and the models are still merged, but no central coordinator is required. Control is spread across the participating computers, which decentralises the process. A nice combination of AI and blockchain.
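The difference between the two setups can be sketched with a toy example. This is a minimal illustration in plain Python with made-up numbers: real FL/SL frameworks exchange full model weights over a network, and the round-robin “elected leader” here is just one hypothetical way to rotate coordination.

```python
def local_train(dataset):
    # Stand-in for real training: a "model" here is just one number,
    # the average of the local data. Real systems train neural networks.
    return sum(dataset) / len(dataset)

# Three hospitals, each with its own private dataset that never leaves the site.
hospitals = {
    "A": [1.0, 2.0, 3.0],
    "B": [2.0, 4.0],
    "C": [10.0],
}

# Federated learning: every site trains locally, then a central
# coordinator collects the local models and averages them
# (FedAvg-style, unweighted for simplicity).
local_models = {name: local_train(data) for name, data in hospitals.items()}
federated_model = sum(local_models.values()) / len(local_models)

# Swarm learning: no fixed coordinator. Each round a different peer
# is elected to do the merge, so control rotates instead of sitting
# with a single server.
peers = list(local_models)
for round_no in range(3):
    merger = peers[round_no % len(peers)]      # this round's elected leader
    merged = sum(local_models.values()) / len(local_models)
    local_models = {name: merged for name in peers}  # everyone syncs up

print(federated_model, merged)
```

Note that both schemes end up with the same merged model; what changes is who holds the power to do the merging, which is exactly the point of the swarm approach.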

A recent article in Nature Medicine showed that SL can be used for training AI models for pathology.

In the context of healthcare data analysis, SL leads to equality in training multicentric AI models and creates strong incentives to collaborate without concentrating data or models in one place.

Privacy Not Included

🕵️ Do you ever read the privacy policy of an app before you start using it? Me neither. Luckily, Mozilla created a guide called Privacy Not Included. It follows and analyses the privacy practices of different consumer products.

But what does this have to do with digital health? There are many interesting apps and products on the list. But mental health apps really shocked me, as I assumed privacy would be their top priority. Apparently not. Even Calm, one of the most popular meditation apps out there, has questionable privacy practices. But Headspace looks like a safe alternative, according to the guide.