Digital therapist – data protection for depression apps
Interview with Prof. Andre Döring, Professor of Business Informatics and data protection expert
The number of people suffering from depression worldwide is steadily increasing – as is digitization in all areas of life. A wide variety of applications is designed to make it easier to cope with this disorder. However, patients have to provide very sensitive information when they use these digital therapists. And in doing so, they often divulge more than they realize.
In this MEDICA-tradefair.com interview, Professor Andre Döring explains what depression apps are generally able to accomplish, describes what app users should look out for to protect their data and reveals the changes the new Federal Data Protection Act will bring on that score.
Professor Döring, what can apps achieve when it comes to depression?
Prof. Andre Döring: With the app as their digital companion, patients are meant to structure their everyday life to identify depressive episodes at an early stage and get help or alerts when their situation worsens. It's crucial for patients to choose the right app from a reputable and reliable provider. After all, you are dealing with a very serious illness. The findings that are disseminated as part of the app must also be medically verified at all times.
What makes data that is being collected via health apps and specifically depression apps so unique?
Döring: You are dealing with very sensitive data, and every user should be very mindful of the specific data he or she has to provide. Sometimes patients are able to use apps anonymously. One drawback is that patients can still be identified via their cell phones, but it is much more difficult to do so in this case. Another issue is whether patients have to give details about their diagnosis and emotional state. Needless to say, things become very personal very quickly. Patients provide information they would potentially only entrust to a physician. We are talking about personal data that must be protected by the app provider.
What can I do as a user to protect my data?
Döring: On the whole, this is a very complex subject. For example, it's key to find out whether the data remains on the device or whether it is transferred to third parties. The country in which the provider is located is also highly relevant. Not all countries have a data protection standard as high as that within the EU, even though international providers, from the U.S. for example, must also guarantee a comparable level of data protection. Another step for the user would be to find out whether another company, located in yet another country, is involved in the development. Data could be exchanged at this stage as well.
How could these types of apps put your privacy in danger?
Döring: Needless to say, there is always the danger that this data falls into the hands of unauthorized third parties. If users exclusively manage the data on one device and this data is not transferred, they are still somewhat in control – as long as they don't lose their cell phone. Having said that, the data must always be encrypted. In that case, there is a relatively low risk that third parties can access it. Data must also be encrypted if it is being transferred, so third parties are unable to access it. Obviously, if third parties already have access to the data, users have almost no control over what happens to their information. This is especially true if they "pay" with their data, meaning they have to provide their information to benefit from the app. Needless to say, this is often hard for the ordinary person to discern. Many people don't read the data protection guidelines. When it comes to depression apps, it makes sense to use information sources like the German Depression Relief Foundation (Stiftung Deutsche Depressionshilfe). In their search for reputable providers, depression sufferers can also look to recommendations from health insurance companies, physicians, foundations or independent websites.
Which important changes will take effect with the new Federal Data Protection Act?
Döring: In Germany, data protection has essentially been part of the legal framework for the past 40 years, and the central idea has not changed much. Basically, users have to agree to the use and processing of their personal data: they must first consent, or there must be a legal basis, such as a contractual relationship. The major difference from before is that the data controller can now indicate a "legitimate interest" to process the data. Of course, this is a very open concept – a "legitimate interest" can pertain to just about anything. That's why an app provider can process the personal data of affected parties in many cases. On the flip side – and this is a new aspect as well – depression app users must be fully informed about any data processing. Every time data is processed, the app provider must now proactively inform the user about which data is being collected, why, and how long it is being stored. App providers must also indicate when data is erased. Managing directors are liable for data processing. There is now more transparency on that score. Users are also able to request information, though this was an option before the new changes took effect as well. For example, the user of a depression app can request details about the data that a provider processes.
What information does the provider need to supply in this case?
Döring: The provider must indicate the data categories that he or she processes – for example, first and last name, location, the user's entries or an activity log. In addition, users can request a digital record of their data from the provider. Facebook, for example, offers an automated service where users can download their profile and all the stored data. Theoretically, this is now also an option with all other apps that store data: I can contact the provider and request this information. Needless to say, the question is whether all providers are actually technically able to do this – but users are now generally entitled to this information.
The same applies to so-called profiling. Users are entitled to know which personal data the provider used to create a dedicated profile about them. Oftentimes, score values are calculated based on certain algorithms. In the case of depression sufferers, for example, the app could use location information to calculate how often they leave the house. Under the new Federal Data Protection Act, providers are now also required to reveal the logic, for instance the algorithm, used to calculate these score values.
What's more, users now officially also have a "right to be forgotten", a right to erasure of their data – in practice this often equates to blocking the data. In this instance, providers must confirm that they entirely erased or blocked the data, including on servers in data processing centers. The overall rights of users have been reinforced.
The interview was conducted by Julia Unverzagt and translated from German by Elena O'Meara. MEDICA-tradefair.com