Alexa, am I losing my voice?

Cathal McGloin, CEO of ServisBOT, offers his views on the recently announced partnership between NHSx and Amazon Alexa.

Opinion is divided on the NHSx partnership with Amazon, which will allow people to use Alexa to get NHS responses to common medical queries.

While the Alexa announcement has gained a lot of media attention, voice is not a new phenomenon in this sector. Telemedicine has been around for many years, enabling clinicians to provide remote care to patients, and the NHSx Alexa partnership is not designed to replace it.

Cost-saving exercise

In a healthcare context, voice assistants such as Alexa can be used to schedule GP appointments, deliver prescription reminders, and answer basic FAQ queries. Used in this way, they augment telemedicine, support patient-centred care, and help improve patient outcomes while reducing healthcare spending.

Take the example of using voice assistants to remind patients of their doctor or clinic appointments. Missed appointments cost NHS England an estimated £216 million, with patients failing to show up for one in every twenty appointments. If voice assistants can interact with patients before their visit to confirm, cancel, or reschedule their attendance, this represents a huge opportunity to reduce waste.

Where people currently log into provider portals or use mobile apps to schedule appointments and set medication reminders, or where they receive email or text reminders for appointments, voice assistants give patients and clinicians a more convenient way to do the same things. Rather than typing or reading, these tasks can be done hands-free using natural language.

Combatting misinformation

While the NHSx partnership with Amazon Alexa has ruffled some feathers, consumers already use the web to search for information about certain medications or illness symptoms. Some industry commentators point out that through this partnership, people will be provided with a direct channel to find genuine NHS responses to their medical queries, rather than being directed to less reliable sources of information through an online search provider.

Improving access

Consider an elderly or disabled person using a voice assistant to find information or to receive reminders about their medication or appointments, without the need to visit a clinic or navigate a web portal. There’s no doubt that voice assistants can play a role in different aspects of healthcare, offering greater convenience and a better patient experience. However, voice assistants are not the cure for all healthcare-related interactions, so their use needs careful consideration to avoid overstepping the boundaries of patient privacy and care.

While it’s still early days for widespread implementations of voice assistants in healthcare, the number and range of health and wellness use cases are steadily growing. However, concerns over data privacy and how personal information will be used by the technology vendors cannot be ignored.

Conversations need data protection too

As with any digital technologies that handle our personal information, there are security measures that need to be implemented to protect patients and healthcare providers from breaches and unauthorised use of patient information. When it comes to voice, this information is in the form of conversations, which sometimes leads to a perception that it is even more vulnerable than information that we provide over the phone, in a web portal, or in a Google search. This is not the case. Conversations are simply data in a different form and with the same implications for how they are stored, managed and used. The fact is that recent data breaches and misuse of consumer information, especially by some large and powerful technology companies, generate warranted fear and apprehension.

Securing conversational data

We need to consider how voice conversations will be stored and protected from security breaches. Whether patient conversations are voice or text-based, the challenge is the same: patient information is being handled in the form of recorded conversations. However, by using a blended approach of voice and text, any personal data can be handled as text and kept in a message history that can be encrypted and managed in the same way as other sensitive information. Personal data can also be redacted from messages to prevent unauthorised access to sensitive personally identifiable information.
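As a rough illustration of that redaction step, the sketch below masks personal data in a message before it is written to the stored history. The patterns and labels are purely illustrative assumptions, not taken from any NHS or ServisBOT system; a real deployment would rely on a vetted PII-detection service rather than hand-rolled regular expressions.

```python
import re

# Illustrative patterns only; production systems should use a vetted
# PII-detection service, not hand-written regexes like these.
PATTERNS = {
    "EMAIL": re.compile(r"[\w.+-]+@[\w-]+\.[\w.]+"),
    "NHS_NUMBER": re.compile(r"\b\d{3}[ -]?\d{3}[ -]?\d{4}\b"),  # 10-digit NHS number
    "PHONE": re.compile(r"\b0\d{9,10}\b"),  # simplistic UK phone shape
}

def redact(message: str) -> str:
    """Replace detected personal data with labelled placeholders
    before the message is appended to the encrypted history."""
    for label, pattern in PATTERNS.items():
        message = pattern.sub(f"[{label} REDACTED]", message)
    return message

# The stored history then contains placeholders, not the raw identifiers.
history = [redact("My NHS number is 943 476 5919, email pat@example.com")]
```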

It is important to strike a balance between the hype surrounding voice-based chatbot solutions and the need for security and governance in protecting end users and providers, especially in areas like healthcare.

Securing access to patient data: The voice authentication dilemma

Unlike the current arrangement, in which Alexa is used to search publicly available National Health Service information, there are other situations where patients would have to be properly and securely authenticated before their data could be accessed.

Where a voice assistant will be accessing patient data in order to process a specific request, for example, seeking information on private healthcare entitlements or scheduling an x-ray appointment, the data required could include the patient’s address, hospital record number, and date of birth.

While there are AI advances being made in recognising voice patterns as a means of authentication, the technique has not yet been widely adopted. So in a voice channel today, a user can be authenticated in the same way as they would be on a web portal or by a contact centre: by speaking their password, PIN or other identifiers to the device. Where voice-pattern recognition is enabled, speaking a PIN adds a second factor of authentication.

An alternative approach is to blend voice with secure text-based messaging where the user is asked to authenticate on the provider’s mobile app or on their web portal and then continue the conversation with the voice assistant, but manage sensitive information via the secure messaging channel. This blended approach of augmenting voice with messaging, or more traditional email channels, is an effective way of using the best of voice in a healthcare interaction while simultaneously engaging via the secure text-based channel that supports a more private setting.
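The blended approach can be pictured as a simple routing decision: steps that touch personal data go to the secure, authenticated messaging channel, while everything else stays on voice. The step names, fields and channel labels below are hypothetical, chosen only to make the flow concrete.

```python
from dataclasses import dataclass

# Hypothetical step names; in practice these would come from the
# provider's own conversation design.
SENSITIVE_STEPS = {"share_entitlements", "confirm_identity", "send_record"}

@dataclass
class Session:
    authenticated: bool  # set once the user signs in on the app or portal

def choose_channel(step: str, session: Session) -> str:
    """Route each conversation step to voice or secure messaging."""
    if step in SENSITIVE_STEPS:
        # Sensitive data never travels over the open voice channel.
        return "secure_messaging" if session.authenticated else "authenticate_first"
    return "voice"
```

For example, booking an appointment stays on voice, while a request involving entitlements is answered over secure messaging only once the user has authenticated on the app or portal.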

Misconceptions about voice assistants for healthcare

Finally, it’s important to note what voice assistants like Alexa will and won’t be permitted to do. After all, they are subject to the same healthcare and data privacy regulations as any other channel. So, what applies today in how you interact with healthcare providers applies for the voice channel too.

Voice assistants are not a replacement for telemedicine, nor are they a means for giving patients diagnostic or any other highly sensitive information. However, they have some very powerful applications in areas of healthcare such as answering common queries about minor ailments and improving access to NHS information while reducing wasted appointments and the burden on GPs. Used in this way, with the appropriate data protection measures in place, voice assistants can play a significant role in reducing stress, cost and waste in the National Health Service.



© 2019 Rapid Life Sciences Ltd, a Rapid News Group Company. All Rights Reserved.
