this post was submitted on 16 Feb 2024
181 points (95.9% liked)

Privacy

29872 readers
1318 users here now

A place to discuss privacy and freedom in the digital world.

Privacy has become a very important issue in modern society, with companies and governments constantly abusing their power, more and more people are waking up to the importance of digital privacy.

In this community everyone is welcome to post links and discuss topics related to privacy.

Some Rules

Related communities

Chat rooms

much thanks to @gary_host_laptop for the logo design :)

founded 4 years ago
MODERATORS
 

As a medical doctor I extensively use digital voice recorders to document my work. My secretary does the transcription. As a cost-saving measure, the process is soon to be replaced by AI-powered transcription, trained on each doctor's voice. As I understand it, the model created is not stored locally and I have no control over it whatsoever.

I see many dangers, as the model is trained on biometric data and could possibly be used to recreate my voice. Of course I understand that there are probably other recordings of me on the Internet, enough to recreate my voice, but that's beside the point. Also, my question is about educating them, not a legal one.

How do I present my case? I'm not willing to use a non-local AI to transcribe my voice, but I don't want to be perceived as a paranoid nutcase. Preferably, I want my bosses and colleagues to understand the privacy concerns and dangers of using a "cloud solution". Unfortunately they are totally ignorant of the field of technology, and the explanations/examples need to translate for the lay person.

[–] Boozilla@lemmy.world 2 points 4 months ago (2 children)

I had another idea. You might be able to use something that distorts your voice so that it doesn't sound anything like you, but the AI can still transcribe it to text. There are some cheap novelty devices on Amazon that do this, and also some more expensive pro audio gear that does the same thing. Just a thought.
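To make the idea concrete: the simplest of these devices just shift the pitch of the signal. Here's a minimal, purely illustrative Python sketch (stdlib only) of the crudest form of pitch shifting — resampling, which raises the pitch the way playing a tape faster does. Real devices do this in DSP hardware without changing the speed; in practice you'd reach for audio tooling like SoX rather than anything hand-rolled. All names here are my own, not from any product.

```python
import math

def naive_pitch_shift(samples, factor):
    """Crude pitch shift by resampling: factor > 1 raises the pitch
    (and shortens the clip, like playing a tape faster)."""
    out = []
    pos = 0.0
    while pos < len(samples) - 1:
        i = int(pos)
        frac = pos - i
        # Linear interpolation between neighbouring samples.
        out.append(samples[i] * (1 - frac) + samples[i + 1] * frac)
        pos += factor
    return out

def zero_crossings(samples):
    """Count sign changes -- roughly proportional to the dominant frequency."""
    return sum(1 for a, b in zip(samples, samples[1:]) if a * b < 0)

# One second of a 440 Hz sine tone at an 8 kHz sample rate,
# standing in for a voice recording.
rate, freq = 8000, 440.0
tone = [math.sin(2 * math.pi * freq * n / rate) for n in range(rate)]

# Shift the "voice" up by 50%.
shifted = naive_pitch_shift(tone, 1.5)
```

After the shift, the zero-crossing rate (crossings per sample) goes up by the same factor of 1.5, confirming the pitch moved. A hardware voice changer does the equivalent in real time while preserving timing, which is why speech stays intelligible to a transcription model.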

[–] possiblylinux127@lemmy.zip 5 points 4 months ago (1 child)

Voice cloning is honestly the least of your concerns, as you are sending people's private information to the cloud.

[–] FlappyBubble@lemmy.ml 1 points 4 months ago

Sure, that's another problem, but this data is already sent beyond the hospital. We have a national system in place gathering all medical records.

[–] FlappyBubble@lemmy.ml 5 points 4 months ago (2 children)

Sure, but what about my peers? I want to get the point across and build an understanding of the privacy implications. I'm certain that this is just the first of many reforms made without proper analysis of the privacy implications.

[–] Boozilla@lemmy.world 1 points 4 months ago

I agree that getting the point across and having them rethink this whole thing is a much better way of handling this than using a tech solution. I'm just pessimistic that you can change their minds, so you might need a plan B.

[–] possiblylinux127@lemmy.zip 0 points 4 months ago (1 child)

Honestly, I would be way more concerned about your patients' privacy. You shouldn't just ship medical data to some third party; that leads to massive data breaches.

[–] Boozilla@lemmy.world 0 points 4 months ago

I agree with you, but that ship has sailed. I work with big medical data, and it's shocking what gets stored and passed around. The really big players like PBMs and major insurance providers are supposed to abide by HIPAA, but they do not fear enforcement at all. Only the small fish — doctors and the like — need fear HIPAA.