Who’s in the Room? Siri, Alexa, and Confidentiality
Curt and Katie chat about how therapists can maintain confidentiality in a world of AI assistants and smart devices. What duty do clinicians have to inform clients? How can we balance confidentiality with the reality of how commonly these devices are present during therapy? Can telehealth therapy be completely confidential and data-secure? We discuss the shift in clinical responsibility, best practices, and how we can minimize exposure of clinical data to ensure the confidentiality our clients expect and deserve.
In this podcast episode we talk about something therapists might not consider: smart devices and AI assistants
We received a couple of requests to talk about the impact of smart devices on confidentiality and their compliance with HIPAA within a therapeutic environment. We tackle this question in depth:
What are best practices for protecting client confidentiality with smart devices?

Turning the phone off, or placing it in “airplane mode”

Warning clients about their own smart devices and confidentiality risks

The ethical responsibilities to inform about limits of confidentiality and take precautions

It’s all about giving clients choice and information

What should therapists consider when smart devices and AI assistants are in the room?

Whistle-blower reports on how often these devices are actually listening

Turning off your phone is a lot cheaper than identity theft

Consider your contacts, geolocation, and Wi-Fi connection

Some of this, as we progress into a more technological world, might be unavoidable

How do Alexa and Siri impact HIPAA compliance for therapists?

The importance of end-to-end encryption for all HIPAA activities (and your smart device may not be compliant)

The cost of HIPAA violations if identity theft can be traced back to your practice

Understand the risks you are taking, do what you can, and remember no one is perfect

What can modern therapists do with their smart devices?

GPS location services can be left on for safety reasons, since emergency services use GPS to locate you

Adjusting settings for voice activation, data sharing, when apps are running, locations, etc.

Turning the device off or using airplane mode are also options

Always let the client know the limits of confidentiality

Resources for Modern Therapists mentioned in this Podcast Episode:
We’ve pulled together resources mentioned in this episode and put together some handy-dandy links. Please note that some of the links below may be affiliate links, so if you purchase after clicking below, we may get a little bit of cash in our pockets. We thank you in advance!
Psychotherapy in Ontario: How Confidential is my Therapy? by Beth Mares, Registered Psychotherapist
The Privacy Problem with Digital Assistants by Kaveh Waddell
Hey Siri and Alexa: Let's Talk Privacy Practices by Elizabeth Weise, USA Today
Patient and Consumer Safety Risks When Using Conversational Assistants for Medical Information: An Observational Study of Siri, Alexa, and Google Assistant, 2018
Hey Siri: Did You Break Confidentiality, or Did I? by Nicole M. Arcuri Sanders, Counseling Today
Alexa, Siri, Google Assistant Not HIPAA Compliant, Psychiatry Advisor
Hey Alexa, are you HIPAA compliant? 2018
Person-Centered Tech