We all witnessed the NHS falling victim to the crippling WannaCry ransomware attack, which took out major IT systems in up to 45 sites across the country. In the following days and weeks, clinical personnel were unable to access information on the central network, including patient records, x-rays, blood results and appointment information.
In order to reduce disruption and delays in treatment, doctors sought alternative ways to communicate with each other, with some choosing the consumer messaging service WhatsApp. At the Royal London Hospital, staff were told to use the application to share pictures of scans to help with diagnosis, on the suggestion that it was a ‘secure and infallible way’ to decide on the best course of treatment.
The prevalence of WhatsApp and other messaging apps in healthcare settings has been well documented. According to one survey published in the British Medical Journal, ‘98.9 per cent of UK clinicians now have smartphones, with about a third using WhatsApp or similar messaging tool’. On the one hand, it’s easy to see why. Apps like WhatsApp have been designed with consumer convenience in mind. With almost every clinician owning their own smartphone – and the rise of telecommunications standards like 4G improving the speed and performance of such messaging day by day – apps like WhatsApp provide a quick and reliable way of sharing information and making plans. By enabling clinicians to collaborate and communicate effectively they can, in a very real sense, enhance patient care.
On the other hand, this enhancement comes with some serious caveats – which meant that in this particular case, the use of WhatsApp was determined by the Trust to be ‘not in line with their policies’. Whilst the Trust did not disclose the precise failings of the app, we can take some educated guesses. Indeed, several alarming issues arise when consumer messaging apps are used to transmit sensitive personal information:
There are few, if any, mechanisms in place to determine whether patients are happy for their data to be transmitted via apps like WhatsApp. For example, another survey found that nearly all (97 per cent) of surveyed doctors routinely send sensitive patient information on the app without explicitly gaining patient consent, even though 68 per cent are worried about this information sharing. The problem is, gaining genuinely informed consent is likely to be impossible. The typical patient – indeed, the typical medical practitioner – is not likely to understand the technical nuances of the app and the implications for the transmission and storage of their personal data, much less communicate these in the high-octane environment of a hospital.
Most mobile devices offer ways to secure the handset, whether through passwords, number locks or fingerprints. However, not all medical practitioners protect their devices, and since WhatsApp does not require credentials to log in, a third party who found an unprotected phone or tablet could access personal patient data at the tap of a button.
WhatsApp allows users to sync messages through cloud services such as Google Drive and iCloud, a function that benefits the average user. In sensitive situations, however, it means patient data can surface on other devices linked to the same account – a shared family computer, for example. Imagine a doctor’s home where children play in front of a computer whose screen is displaying sensitive or graphic clinical information. This can happen all too easily.
WhatsApp automatically generates a backup of users’ conversations, as the app considers it a bad user experience if someone cannot look over historic messages. In the case of medical information, this means clinicians inadvertently allow WhatsApp to keep copies of patient information on its servers – something neither party is authorised to do. Additionally, it is unclear what security measures are actually in place to protect these legacy conversations, but reports have revealed vulnerabilities.
Many clinicians choose to anonymise data manually when sending material via WhatsApp in a bid to protect their patients’ confidentiality. However, this can come at the expense of patient care, where it is vital to have no doubt that the patient is getting the correct diagnosis and course of treatment based on the correct data. If you remove identification from scans or blood tests, there is a risk that they may be accidentally associated with someone else.
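Done properly, anonymisation need not break the link between a patient’s artefacts. The sketch below is purely illustrative, not a sanctioned clinical workflow: the secret key and the patient identifiers are invented for the example. It shows how a keyed pseudonym (an HMAC of the patient identifier) lets a scan and a blood result stripped of names still be matched to each other, without the tag itself revealing who the patient is.

```python
import hmac
import hashlib

# Hypothetical secret held locally by the trust, never shared with the
# messaging app or embedded in the transmitted files.
SECRET_KEY = b"trust-local-secret"

def pseudonymise(patient_id: str) -> str:
    """Derive a stable pseudonym from a patient identifier.

    The same patient always yields the same short tag, so related
    artefacts remain linkable, but the tag cannot be reversed to the
    identity without the secret key.
    """
    digest = hmac.new(SECRET_KEY, patient_id.encode(), hashlib.sha256)
    return digest.hexdigest()[:12]

# Two artefacts for the same (invented) patient carry the same tag...
assert pseudonymise("NHS-943-476-5919") == pseudonymise("NHS-943-476-5919")
# ...while different patients get different tags.
assert pseudonymise("NHS-943-476-5919") != pseudonymise("NHS-123-456-7890")
```

In practice a purpose-built medical messenger would handle this linkage behind the scenes; the point of the sketch is simply that ad-hoc manual anonymisation discards exactly the linkage that a keyed scheme preserves.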
All this is not to suggest that WhatsApp doesn’t have a vital role to play in healthcare and emergency settings. In the aftermath of the Croydon tram crash in 2016, for example, doctors at the scene used the app to communicate with colleagues at nearby hospitals so that they knew what to expect. Similarly, first responders to the Manchester Arena bombing in 2017 used the app to communicate with each other and work more effectively. Since adding end-to-end encryption, WhatsApp has been considered the gold standard for consumer messaging and secure enough for general consumers.
However, a recent white paper authored by legal firm Mishcon de Reya detailed the legal implications of mobile messengers in clinical settings, which, together with the technical caveats outlined above, raises significant concerns about WhatsApp’s suitability for the NHS.
The clear answer is to deploy applications that are specifically designed for use in medical settings, delivering the benefits but removing the data protection, security and compliance risks associated with WhatsApp. Siilo, for example, has been designed from the ground up to deliver secure messaging for medical practitioners, and incorporates a number of additional features that consumer apps cannot support, such as the ability to export PDFs of case discussions. In this way, consumer technology has paved the way for a truly industry-specific application.
The shortfall in adult social care funding is predicted to be £5 billion by 2024/25. Money and staff alone – both of which are in increasingly short supply – cannot fix the problem. But technology might be able to. Look out for our upcoming article on tech in social care by Helen Dempster of Karantis360.