Part 2 of this blog series: Overview of the DiGA data protection criteria

In the first part of our blog series we looked at when an app idea is DiGA compliant and gave a first rough overview of the privacy requirements for a DiGA. In the second part of the blog series, we would like to give you a deeper overview of the most important criteria that a DiGA must fulfill from a data protection perspective. The presentation is not exhaustive.

Practical tip: You can find a comprehensive overview in the data protection criteria of the Federal Institute for Drugs and Medical Devices (BfArM), available at https://www.bfarm.de/DE/Medizinprodukte/Aufgaben/DiGA-und-DiPA/Datenschutzkriterien/_node.html (hereinafter also referred to as the “BfArM data protection criteria”).

If you would like to develop a digital care application (DiPA), please read on as well, because most of the data protection criteria published by the BfArM also apply to DiPA. Whether you want to develop a DiGA or a DiPA, we recommend that you take a close look at the BfArM’s data protection criteria and seek advice on data protection law. Below, we provide a more detailed overview of the BfArM’s data protection criteria:

Lawfulness and purpose limitation (Part 2, Sections 2 and 4 of the BfArM data protection criteria)

The digital health application may only process data for certain purposes specified in the Ordinance on Digital Health Applications (DiGAV) and only on the basis of the user’s consent. The DiGAV provides for data processing  

  • for the intended use of the DiGA,
  • to demonstrate positive care effects,
  • to determine performance-related price components and
  • to permanently ensure the technical functionality, user-friendliness and further development of the DiGA.

The storage and processing of data for one of the above-mentioned legitimate purposes of a DiGA must be technically separated from the storage and processing of data for other purposes (guarantee of non-linkability).

When the digital application is activated, a user account must be created for the user. All data processed for the user must be linked to this user account. User accounts must be pseudonymous. All consents must also be linked to this account. The activation code used as proof of reimbursement allows for authentication without identification. Granting and revoking consent is limited to electronic means through the digital application in order to maintain pseudonymization.
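To illustrate what this could look like technically, the following TypeScript sketch models a pseudonymous user account with its linked consents. It is only an illustration of the principle described above; the type names, the random account identifier and the hashing of the activation code are our own assumptions, not requirements of the DiGAV or the BfArM.

  import { createHash, randomUUID } from "node:crypto";

  // Purposes permitted by the DiGAV (see the list above); the labels are our own.
  type ProcessingPurpose =
    | "intended_use"
    | "proof_of_positive_care_effects"
    | "performance_related_pricing"
    | "technical_operation_and_development";

  interface ConsentRecord {
    purpose: ProcessingPurpose;
    grantedAt: Date;   // consent is granted electronically in the app
    revokedAt?: Date;  // revocation must also be possible in the app
  }

  interface PseudonymousAccount {
    accountId: string;          // random pseudonym; no name, e-mail or address
    activationCodeHash: string; // proves reimbursement without identifying the user
    consents: ConsentRecord[];  // every consent is linked to this account
  }

  // Create an account from the activation code: authentication without identification.
  function createAccount(activationCode: string): PseudonymousAccount {
    return {
      accountId: randomUUID(),
      activationCodeHash: createHash("sha256").update(activationCode).digest("hex"),
      consents: [],
    };
  }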

The requirement to link all data to a pseudonymous user account will be a challenge for many DiGA manufacturers. Typical questions include:

  • How much data can be retrieved from a DiGA without re-identifying the user?
  • How can the hardware needed to use the DiGA be shipped if only pseudonymous user data may be processed? A data trustee who knows the user’s name and address solely for the purpose of shipping the hardware could be the solution here.
  • May the DiGA address the user personally? According to the BfArM, this is possible if it is clearly recognizable that only a first name or a pseudonym is requested.
  • How can it be technically ensured that other smartphone functions required by the DiGA, such as the address book or the camera, can be used without the user being identified directly or via third parties?

Processing in fairness and in good faith (Part 2 Section 3 of the BfArM data protection criteria)

The digital application must take into account the reasonable expectations of data subjects with respect to the processing of personal data. In other words, the digital application must not engage in data processing that a reasonable data subject would not expect. The reasonable expectations of the data subject are based on what is customary in the industry in general and for presumably similar applications. Advertising promises of the DiGA manufacturer also influence the expectations of the users. In the opinion of the BfArM, a manufacturer who advertises a digital application with encrypted data storage suggests that access to the stored data is technically impossible. Although encrypting only the hard disk does not contradict the claim of encrypted storage, it is not the solution that a data subject can reasonably expect on the basis of the advertising claims. However, the BfArM recognizes that it must be possible for innovative solutions to deviate from the industry standard and pursue disruptive approaches. In this case, the DiGA manufacturer is subject to increased information obligations.

Transparency (Part 2 Section 4 of the BfArM data protection criteria)

DiGAs must comply with the transparency requirements of Art. 13, 14 GDPR. Users must be clearly and comprehensibly informed about the controller and its contact details, the contact details of the data protection officer, the purposes of the processing, the personal data processed in the DiGA, the recipients, the storage period of their data, and their rights as data subjects. In addition, the controller must provide a concrete and binding assurance in the privacy policy of the digital application and/or on the application’s website as to the period within which written inquiries regarding data protection will be answered, and must state how it can be reached for telephone inquiries about data protection. For written inquiries, the BfArM considers a response time of 2 working days after immediate return of an acknowledgement of receipt to be appropriate. Manufacturers of a DiGA must also be available by telephone for 8 hours on working days; in the opinion of the BfArM, a waiting time of 10 minutes is reasonable for users. The DiGA manufacturer must therefore provide sufficiently trained personnel and align its data protection organization with these DiGA-specific requirements.

The data controller must refer to the possibility of lodging a complaint with the competent data protection authority in the privacy policy and in responses to written inquiries. At a minimum, the postal and e-mail addresses of the competent data protection authority must be provided. According to Art. 13 GDPR, “only” information on the right to lodge a complaint with a supervisory authority must be provided (see also Art. 77 GDPR on the right to lodge a complaint with a supervisory authority). To meet this requirement, an overview of all German supervisory authorities with postal and e-mail addresses could be linked in the privacy policy.

Practical tip: The Federal Commissioner for Data Protection and Freedom of Information (BfDI), for example, provides an overview of the contact details of the German state data protection authorities at https://www.bfdi.bund.de/EN/Service/Anschriften/Laender/Laender-node.html.

The affected user of a DiGA must explicitly agree to a unilateral change of the privacy policy. Without this consent, the change will not take effect. The data of the user who has not yet consented to a change must be blocked and may not be processed further for the time being. Users must be informed of the changes made and their impact on the processing of personal data at least 14 days before the unilaterally changed privacy policy comes into effect. In the opinion of the German supervisory authorities, privacy policies are not contractual terms per se, but rather data protection information or factual notices. Therefore, the data subject does not have to agree to the privacy policy or even read it. Rather, the data subject must merely be given the opportunity to take note of the privacy policy. In this respect, the DiGAV appears to go beyond the requirements of the GDPR.

Practical tip: Therefore, DiGA manufacturers are advised to carefully formulate privacy policies and consider processing operations in order to avoid complex changes.

Data minimization and storage limitation (Part 2 Section 6 of the BfArM data protection criteria)

The DiGA manufacturer must be able to justify the contribution of all categories of processed personal data to the legitimate purposes and prove that these purposes or requirements cannot be fulfilled without these data. If this proof cannot be provided, the data may not be processed. This criterion is also intended to emphasize the importance of pseudonymization of personal data in DiGA. Processing that allows the user to be identified must be avoided. Medical and health care data may only be collected if the algorithms and processing based on them are proven to be medically and technically useful in order to achieve positive health care effects.

The DiGA manufacturer must create a deletion concept for the application based on the requirements of DIN 66398 and must be able to prove that the deletion rules defined in the deletion concept are lawful and effective. The DiGAV specifies some deletion periods: system logs written for the purpose of secure operation must be deleted after three months at the latest. In addition, a grace period of a maximum of three months is provided to ensure continuity of care for subsequent prescriptions. The DiGA manufacturer is therefore not obliged to delete the data assigned to a user account immediately after the end of use, but may retain it for a maximum of three months. However, the data must be blocked during this period. The use of this grace period must be disclosed in the DiGA’s privacy policy.
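To make the deletion concept more tangible, the following sketch expresses the two periods mentioned above as retention rules. The structure is our own illustrative assumption and is not taken from DIN 66398 or the DiGAV.

  // Illustrative retention rules derived from the periods named above;
  // the structure is not prescribed by the DiGAV or DIN 66398.
  interface RetentionRule {
    dataClass: string;
    maxRetentionDays: number;
    blockDuringRetention: boolean;       // data is blocked but not yet erased
    trigger: "creation" | "end_of_use";  // event that starts the retention clock
  }

  const retentionRules: RetentionRule[] = [
    {
      dataClass: "system_logs",        // logs written for secure operation
      maxRetentionDays: 90,            // delete after three months at the latest
      blockDuringRetention: false,
      trigger: "creation",
    },
    {
      dataClass: "user_account_data",  // data assigned to a user account
      maxRetentionDays: 90,            // grace period for subsequent prescriptions
      blockDuringRetention: true,      // must be blocked during the grace period
      trigger: "end_of_use",
    },
  ];

  // A record must be erased once its retention period has expired.
  function deletionDue(rule: RetentionRule, triggerDate: Date, now: Date): boolean {
    const ageInDays = (now.getTime() - triggerDate.getTime()) / 86_400_000;
    return ageInDays > rule.maxRetentionDays;
  }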

Intervenability (Part 2 Section 7 of the BfArM data protection criteria)

The DiGA must be technically designed in such a way that it complies with the principle of privacy by design and enables the fulfillment of the rights of the data subjects such as access, correction and deletion as with any other application. For example, the DiGA can provide a data list and, if necessary, a search function so that the user can find personal data and correct and/or delete it. The user must be able to update his or her data at any time via his or her user account. 
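A minimal sketch of such a data list with a search function is shown below; the record shape and the function names are hypothetical and only illustrate how a user could locate and delete his or her own entries.

  // Hypothetical shape of a record shown in the user's data list.
  interface StoredDatum {
    id: string;
    category: string;  // e.g. "symptom_diary" or "device_measurement"
    value: string;
    recordedAt: Date;
  }

  // Free-text search over the user's own data so that individual entries can be found.
  function searchOwnData(data: StoredDatum[], query: string): StoredDatum[] {
    const q = query.toLowerCase();
    return data.filter(
      (d) => d.category.toLowerCase().includes(q) || d.value.toLowerCase().includes(q),
    );
  }

  // Deletion (and, analogously, correction) operates on the records found via the search.
  function deleteOwnDatum(data: StoredDatum[], id: string): StoredDatum[] {
    return data.filter((d) => d.id !== id);
  }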

If the DiGA has written personal data into the data subject’s electronic patient record and this data is subsequently deleted, blocked, corrected or supplemented by the data subject, the digital application must inform the data subject that he or she can upload an updated data extract to the electronic patient record. It must be possible to configure the DiGA in such a way that a change to the data automatically leads to an update of the data stored in the electronic patient record, taking deletion, blocking, correction and supplementation of the data into account. The digital application must also inform the service providers concerned of the deletion, blocking or correction of personal data by the data subject if personal data has been transmitted or made available to them via the DiGA.

Accuracy, integrity and confidentiality (Part 2 Section 8 of the BfArM data protection criteria)

The DiGA manufacturer must establish measures to regularly verify that the processed data are accurate, complete and up to date with regard to the requirements of the legitimate purposes for which they are processed. The DiGA manufacturer must be able to prove that these measures correspond to the state of the art and are sufficient with regard to the risks identified in the data protection impact assessment (DPIA).

Technical measures to ensure integrity can include hash functions, digital signatures, version control of data, securing relationships, and transaction logs. Data confidentiality can be achieved through encryption, access control, and audit logs. Automated tests that use predefined test cases to ensure correct processing can be used to verify accuracy. Manual spot checks can also be used to detect deviations.
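As a simple example of one of these measures, the sketch below stores a SHA-256 fingerprint alongside a record and recomputes it on read to detect integrity violations. It illustrates only the hash-based approach mentioned above and is not a complete integrity concept; the example data is hypothetical, and in practice a canonical serialization and protected hash storage would be needed.

  import { createHash } from "node:crypto";

  // SHA-256 fingerprint over the serialized record; in practice a canonical
  // serialization should be used so that key order cannot change the hash.
  function fingerprint(record: object): string {
    return createHash("sha256").update(JSON.stringify(record)).digest("hex");
  }

  // On write: store the fingerprint alongside the record (hypothetical example data).
  const record = { accountId: "a1b2", systolic: 120, diastolic: 80, recordedAt: "2024-05-01" };
  const storedHash = fingerprint(record);

  // On read: recompute and compare; a mismatch indicates an integrity breach,
  // after which the affected data must be blocked or erased (see below).
  function verifyIntegrity(current: object, expectedHash: string): boolean {
    return fingerprint(current) === expectedHash;
  }

  console.log(verifyIntegrity(record, storedHash)); // true as long as the record is unchanged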

In any event, measures must be taken to detect any breach of the integrity or confidentiality of the data processed. These measures must also include the blocking or erasure of data for which it cannot be established that their integrity or confidentiality is sufficient to fulfill the purposes of the processing and/or to meet the need for protection.

Accountability (Part 2 Section 9 of the BfArM data protection criteria)

To meet accountability requirements, an audit trail must be created for the digital application. The audit trail must include documentation of accesses, data transfers and changes, proof of the origin of data collected through the digital application, and logging of (even automated) blocking and deletion of data. The audit trail must record all access to personal data by administrative, operational and support personnel, including access to the audit trail itself, in order to detect internal data breaches.
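One conceivable way to implement this is an append-only log of structured entries in which reads of the audit trail itself are also recorded. The entry fields below are our own illustrative assumptions, not a prescribed format.

  // Illustrative structure of an append-only audit trail entry.
  interface AuditEntry {
    timestamp: Date;
    actorRole: "admin" | "operations" | "support";
    actorId: string;          // internal staff identifier, not the end user
    action: "read" | "transfer" | "change" | "block" | "delete" | "audit_trail_read";
    targetAccountId: string;  // pseudonymous account concerned
    detail?: string;          // e.g. recipient of a transfer or origin of the data
  }

  const auditTrail: AuditEntry[] = [];

  // Every access, transfer, change, blocking and deletion is appended here.
  function logAudit(entry: AuditEntry): void {
    auditTrail.push(entry);
  }

  // Reading the audit trail is itself logged so that internal misuse can be detected.
  function readAuditTrail(actorRole: AuditEntry["actorRole"], actorId: string): AuditEntry[] {
    logAudit({
      timestamp: new Date(),
      actorRole,
      actorId,
      action: "audit_trail_read",
      targetAccountId: "*",
    });
    return [...auditTrail];
  }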

Conclusion

The DiGA has undeniable benefits for patients, and the app on prescription creates financial incentives for entrepreneurs planning to develop a health app. However, the effort required for development and operation should not be underestimated. Not only must a number of data protection requirements be met: a DiGA must also first be certified as a medical device, and its positive effect on care must be proven by means of studies. The BfArM supports applicants in the application process. Applicants should definitely take advantage of the consultation offered by the BfArM to discuss questions and critical points of the DiGA idea openly with the BfArM.

We are happy to support you in all IT and data protection issues relating to DiGA. Thanks to our technical expertise, we can also advise you on whether your planned implementation meets the BfArM data protection criteria from a technical perspective.

Please contact us. We look forward to hearing from you!


Dr. Anna-Kristina Roschek

Lawyer

Email: anna.roschek@planit.legal
Phone: +49 (0) 40 609 44 190