But why should such precautions be taken, even in times of crisis? In practice, although health data are intangible, their disclosure can have very tangible consequences: identity theft, fraud, financial penalties, and more.
Therefore, the best way for a company or public body to protect its data and business is to comply with the GDPR.
Let’s put the GDPR (General Data Protection Regulation) in context before going further. The two pillars of the GDPR are the notions of personal data and data processing. But what do they mean in practice?
According to the French Data Protection Authority, the CNIL (Commission Nationale de l’Informatique et des Libertés), personal data include any data, even nominally anonymous, that can be cross-referenced to identify a specific individual (e.g. fingerprints, DNA, or information such as “the son of the doctor living at 11 Belleville St. in Montpellier does not perform well at school”).
For example: a name, a photo, a fingerprint but also an IP address, a computer login identifier, etc.
Processing personal data means “carrying out an operation or a set of operations involving personal data, regardless of the process used (collection, storage, modification, transmission, etc.).”
For example: keeping the register of subcontractors, managing payrolls, managing information of marketing prospects, etc.
However, let’s be clear: the GDPR is not so much the birth of a regulation on personal data protection, but rather the culmination of a process that began several years ago.
In France, the legal framework was set up as early as the late 1970s, with the Law on Information Technology and Freedom of 6 January 1978. That law, for example, gave birth to the CNIL, an independent administrative authority whose main role is precisely to ensure personal data protection. French legislators were already addressing the issue more than 40 years ago.
In Europe too, the issue has been debated for several decades, and it was in the 1990s that the foundations of the current legal framework were laid. Faced with rapidly changing technologies and the Internet, the EU recognised the need to legislate on these new subjects. In 1995, it therefore passed a European text on data protection: the European Data Protection Directive. This text established minimum standards for data confidentiality and security.
Those examples of national and European legislation will serve as the basis for the development of the GDPR that we know today.
Although several European countries, like France, had legislation on personal data protection, the issue was handled at the national level, and there was no consensus on all aspects of personal data protection. The process of creating the GDPR began when the European Commission decided to take up this important subject in January 2012.
After consultation rounds, the first draft of the regulation was published in November 2013. Then the legislative back-and-forth started: the text evolved over the course of negotiations between the European Commission, the European Parliament and the Council of the European Union. The final version of the text we know today was only adopted in April 2016, more than two years after the first draft. This delay reflects the natural inertia of Europe’s cumbersome lawmaking process, as well as the importance of the text and the participation of various actors: states, of course, but also companies and citizens.
The GDPR came into application on 25 May 2018. The general objective of the Regulation is to establish a regulatory framework for personal data protection, extended and applied equally across all EU member states. It makes it easier for all EU citizens to understand how their data are used and, if necessary, to lodge a complaint about their processing. The CNIL summarizes this objective in three key points:
As mentioned above, the GDPR aims in particular to make data processing actors accountable. To achieve this, it standardizes their obligations, which is why it applies de facto to everyone.
Any organization, regardless of its size, country of location or activity, may be affected. If an organisation processes personal data on its own behalf, meeting either of the following two criteria is enough to make it subject to the GDPR:
For example, a foreign company with an e-commerce site in French and delivering products in France will be affected and must therefore comply with the GDPR.
There is only one exception: the GDPR only applies to legal persons and excludes “any processing by a natural person in the course of a strictly personal or domestic activity.” This prevents the regulation from becoming a legal stranglehold on, for example, the publisher of a small personal website.
One of the major flaws of the GDPR could have been to ignore the case of subcontractors. Indeed, many companies and organizations of all kinds rely on other companies to process and collect data on their behalf. And many companies have tried to shift their responsibility onto these subcontractors.
This context has been well understood, and the GDPR takes outsourcing into account: it also applies to subcontractors. The Regulation defines a subcontractor (data processor) as “any natural or legal person who processes personal data on behalf of the controller in the context of a service or provision.”
The CNIL provides a guide to support them (Source) and gives us concrete examples of subcontractors:
Finally, any subcontractor must comply with clear obligations in terms of information security, confidentiality and even responsibility.
If you are a subcontractor, data protection should be taken into account “by design”, from the very conception of the service or product. Doing so helps avoid many potential issues down the line. It is a matter of putting in place the measures that will guarantee optimal data protection.
For example, if a company processes data, having a ROPA (Record of Processing Activities) in place is a basic measure necessary for optimal data protection. In specific cases, subcontractors must also appoint a Data Protection Officer (DPO), a requirement similar to that of their clients.
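To make the idea concrete, a ROPA entry can be as simple as a structured record per processing activity. The sketch below uses an illustrative subset of the fields that Article 30 of the GDPR requires (purposes, categories of data and data subjects, recipients, retention); the example values are invented:

```python
from dataclasses import dataclass, asdict

@dataclass
class ProcessingActivity:
    """One entry of a Record of Processing Activities (Art. 30 GDPR).
    Fields shown here are an illustrative subset of what Art. 30 lists."""
    purpose: str
    data_categories: list
    data_subjects: list
    recipients: list
    retention: str

# A minimal register: one entry per processing activity
ropa = [
    ProcessingActivity(
        purpose="payroll management",
        data_categories=["identity", "bank details"],
        data_subjects=["employees"],
        recipients=["accounting department"],
        retention="5 years after contract end",
    )
]

print(asdict(ropa[0])["purpose"])  # payroll management
```

Keeping the register as structured data rather than free text makes it straightforward to export for an audit or a DPA inquiry.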
To conclude, subcontractors processing personal data on behalf of other companies have a duty to advise and assist them in implementing certain GDPR obligations.
The health sector as a whole is closely scrutinized by Data Protection Authorities such as the CNIL. The reason is simple: the industry is the leading producer and user of particularly critical health data. But what exactly counts as health data?
According to the CNIL, health data is “personal data relating to the physical or mental health, past, present or future, of a natural person (including the provision of health care services) that reveal personal information about the state of health of that person.” Therefore, some measurement data that allow information about an individual’s health status to be deduced are also included.
We can then distinguish three broad categories of personal health data:
Making an appointment on a platform like Doctolib, consulting a doctor, getting a prescription, going to the pharmacy, etc. Each of these seemingly innocuous gestures leaves traces. The list is long if one wishes to list all the acts that create health data. With one observation: much of this information is collected without us being fully aware of it.
Health data is a key issue today. According to the article Sizing up big data, published in Nature Medicine in January 2020, their total volume has increased tenfold since 2013, and this growth is far from trivial. The whole area of health monitoring is growing rapidly, and it relies essentially on the availability of large volumes of health data and their analysis by artificial intelligence.
Legally obtained health data are therefore a treasure. And so are their cousins, obtained by unscrupulous methods. This is evidenced by a report by the American cybersecurity company VMware Carbon Black published on June 5, 2019. It shows that, on average, health data (medical records, prescriptions, etc.) sell for three times as much as “classic” personal data (name, first name, telephone number, etc.). To give an idea, according to researchers of the PrivacyAffairs website (Source), credit card numbers with owner IDs sell on the dark web for an average of 125 to 200 euros, depending on the balance available on the account.
The first explanation for this price difference lies in the nature of the data concerned. A stolen credit card is easy to cancel, but a medical record contains information that cannot be changed, such as a patient’s history of illness. Health data are also a gateway to scams and identity theft aimed at obtaining fraudulent insurance reimbursements. Of course, the stakes become even higher when it comes to research data, laboratory intellectual property and the like, stolen in sophisticated attacks.
The value of the stolen data alone does not explain the steady increase in cyber attacks against health sector actors. The current context is also particularly conducive to attempts to steal and resell data.
As the FAIR (Factor Analysis of Information Risk) method rightly points out, for a potential attacker the probability of taking action depends on the value of the targeted asset (in this case, health data), but also on the level of effort required to achieve their goals.
Digital transformation is in full swing, and the whole health sector is on the front line because of the Covid-19 crisis. In such a context, the health sector’s actors are, paradoxically, in a very fragile state of health. A quick overview confirms this:
The bottom line is that none of the companies or organizations in the health sector excels at protecting the health data it creates, uses and maintains. And the massive spread of remote working caused by the pandemic will not prove otherwise: it further expands the attack surface of companies and organizations.
Innovation is vital for the health sector, and the Covid-19 crisis is a stark reminder of how a strong medical industry and resilient health care structures are prime concerns for countries. These two elements necessarily require in-depth consideration of data protection since data, as we have seen, are the sinews of war in the health sector. So what is the role of the GDPR in this reflection?
Ms. Dixon, president of the Irish Data Protection Authority, admits in a New York Times article: “the GDPR has not brought about a fundamental change in the way data is collected and used by large companies.”
GDPR compliance is mostly seen by companies as a significant effort to make. To caricature a little, the dominant idea remains to do the minimum to avoid being sanctioned. With such a starting point, however, we are as far as ever from the in-depth reflection that needs to be undertaken. It is the whole paradigm of seeing the GDPR as a barrier that needs to evolve.
One of the ways to move beyond this idea of a “GDPR stranglehold” is perhaps to see the latter as a lever. Why not see the GDPR as a lever for innovation, for example? Being compliant is not an end in itself, and some have already understood this. Companies that want to continue sharing or using their customers’ data are now looking to keep it anonymous. While there is no consensus today on how to achieve this objective, innovations are emerging in all sectors, in particular the one we are interested in: the health sector. The Wall Street Journal reported as early as February 2019 that the sector was at the heart of innovations in this regard. For example, health care and pharmaceutical companies are already anonymizing the data they collect from clinical trials before sharing it with researchers and other companies. Large American hospital groups have decided to set up their own specialized companies, whose objective is to collect and sell their anonymized data for research and drug development purposes.
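In practice, a common first step toward that goal is pseudonymization: replacing direct identifiers with a keyed hash. The sketch below is purely illustrative (the key, field names and record are invented for the example). Note the important nuance: pseudonymized data, unlike truly anonymized data, remain personal data under the GDPR, because whoever holds the key can still re-link records to individuals.

```python
import hashlib
import hmac

def pseudonymize(identifier: str, secret_key: bytes) -> str:
    """Replace a direct identifier with a keyed hash (HMAC-SHA256).
    The same identifier always maps to the same token, so records can
    still be joined, but the identity is hidden from anyone without the key."""
    return hmac.new(secret_key, identifier.encode("utf-8"),
                    hashlib.sha256).hexdigest()

# Illustrative key and record: in practice the key lives in a secrets manager
key = b"example-secret-key"
record = {"patient": "Jane Doe", "diagnosis": "J45.0"}
record["patient"] = pseudonymize(record["patient"], key)

print(record["patient"][:16])  # a hex token, no longer the name
```

Because the mapping is deterministic, two datasets pseudonymized with the same key can still be linked for research purposes without ever exposing the underlying identities.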
The paradigm shift with respect to GDPR and innovation will never be the sole responsibility of the private sector. This line of thought is still largely underestimated. There is therefore also a need for real awareness and support from the public authorities on these issues. The GDPR must be the starting point, not the finish line.
In France, the issue is already being taken seriously by the competent authorities. The CNIL, in particular, has decided to make health data security one of its priority themes in 2021, along with cyber security, for example. In practical terms, this means that it will carry out a minimum number of formal control procedures in connection with this issue, in continuity with its 2020 strategy. But as we said earlier, controls and sanctions alone are not enough; we also need to support thinking and innovation on these issues. Here, too, the CNIL intends to play a role. The Commission wants to help e-health actors combine innovation and compliance with the GDPR, through privileged contacts with its legal and technical teams. Only the three winners of a call for projects will be able to benefit from it, but the objective is clear: to implement “privacy by design”, i.e. to take the protection of health data into account from the development phase of a product or service.
But it must also be underlined that GDPR compliance cannot be considered without a strong cyber security strategy. The two are closely linked.
ANSSI (National Agency for the Security of Information Systems) defines cybersecurity as “the desired state of an information system that enables it to withstand cyberspace events that may compromise the availability, integrity or confidentiality of the data stored, processed or transmitted, and of the related services provided or made available by those systems.”
On the one hand, we have the GDPR, which regulates personal data protection; on the other, cybersecurity, which by its very nature pursues a data security objective. Data security is thus one of the pillars of personal data protection.
Cybersecurity and GDPR are closely linked. But how is this relationship reflected in the law? The Regulation requires “the implementation of appropriate technical and organizational measures to ensure a level of security appropriate to the digital risk.” Specifically, such measures may take the form of data encryption or, more broadly, means of ensuring the confidentiality, integrity and availability of data.
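As an illustration of the “integrity” part of these measures, one of the simplest technical controls is a checksum comparison: any tampering with a stored record changes its digest. A minimal sketch, with invented record contents:

```python
import hashlib
import hmac

def sha256_digest(data: bytes) -> str:
    """Return the SHA-256 digest of a record, stored alongside it at write time."""
    return hashlib.sha256(data).hexdigest()

def is_intact(data: bytes, expected_digest: str) -> bool:
    """Verify a record on read; compare_digest avoids timing side channels."""
    return hmac.compare_digest(sha256_digest(data), expected_digest)

# Store the digest when the record is written...
stored_digest = sha256_digest(b"patient-record-v1")

# ...and verify it on every read
print(is_intact(b"patient-record-v1", stored_digest))        # True
print(is_intact(b"patient-record-TAMPERED", stored_digest))  # False
```

A real deployment would pair this with encryption at rest for confidentiality and with backups for availability, covering the three properties the Regulation names.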
Again, let us remember that all this must be supported by the public authorities. In France, ANSSI provides everyone with a “data security kit” (Source). It brings together best practices, solutions and other recommended tools to enhance the security of personal data.
Not being compliant with the GDPR means that, in the event of data theft, an investigation by a Data Protection Authority or a formal complaint, the data controller concerned may face sanctions. They may be non-financial: a call to order, restriction of data processing, suspension of data flows, etc. But financial sanctions can amount to up to €20 million or, in the case of a company, up to 4% of annual global turnover. What is the actual situation regarding the application of these sanctions since the implementation of the GDPR?
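The cap itself is simple to compute: the higher fine tier (Article 83(5) GDPR) is the greater of €20 million or 4% of annual global turnover, so the 4% rule only bites above €500 million in turnover. A quick sketch:

```python
def max_gdpr_fine(annual_global_turnover_eur: float) -> float:
    """Upper bound of the higher GDPR fine tier (Art. 83(5)):
    the greater of a flat EUR 20 million or 4% of annual global turnover."""
    return max(20_000_000.0, 0.04 * annual_global_turnover_eur)

# A company with EUR 1 billion turnover faces up to EUR 40 million
print(max_gdpr_fine(1_000_000_000))  # 40000000.0

# Below EUR 500 million turnover, the flat EUR 20 million cap applies
print(max_gdpr_fine(100_000_000))  # 20000000.0
```

Actual fines are set case by case by the DPA, well below this ceiling in most of the decisions discussed here.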
If some still doubted it, enforcement is becoming a priority for DPAs. This is evidenced by the total amount of fines, which more than doubled in one year according to data from enforcementtracker.com. In 2020, the total amount of sanctions for GDPR violations reached €171 million, up from €72 million in 2019. However, these figures should be put into perspective: out of more than 280,000 complaints, 613 fines were imposed, 8 of which exceeded €10 million. France, which used to impose the heaviest fines (a record €50 million imposed on Google in 2019), was only sixth in 2020 with approximately €3 million in fines, while Italy led with €58.16 million in sanctions. But these declining figures in France are not synonymous with a drop in vigilance. The record fines against Amazon (€35 million) and Google (€60 million) related to data protection (cookie policies, in these cases) but were imposed under the French data protection law (the Law on Information Technology and Freedom), not under the GDPR.
Finally, we have already noted the attention paid to the protection of health data. It is therefore only logical that the GDPR and its sanctions apply equally to health actors, and the figures compiled by C-Risk confirm this. Since the implementation of the GDPR, 52 fines have been imposed on industry players, enough to rank in the top 5 of the most sanctioned sectors, though still far behind the leader, the media and telecom sector, with 123 fines. Moreover, between 2018 and 2020, the minimum sanction for a health actor was €510 and the maximum €1.24 million. Hospitals remain the most frequently sanctioned health actors.
The approach favoured by C-Risk is to consider the GDPR from the perspective of risk scenarios (operational and cyber security) in order to:
The FAIR method quantifies the risks faced by a company or organisation wishing to comply with the GDPR. This is done in several steps:
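Those steps can be illustrated with a minimal Monte Carlo sketch in the spirit of FAIR: annualized loss exposure is modelled as loss event frequency times loss magnitude, each drawn from a distribution. All numeric ranges below are invented for the example; a real FAIR analysis would calibrate them with the organisation's own loss data and expert estimates.

```python
import random
import statistics

def simulate_annual_loss(n_trials: int = 10_000, seed: int = 42) -> list:
    """Illustrative FAIR-style Monte Carlo: annualized loss exposure =
    loss event frequency (LEF) x loss magnitude (LM), each modelled here
    as a triangular distribution (low, high, most-likely are assumptions)."""
    rng = random.Random(seed)
    losses = []
    for _ in range(n_trials):
        lef = rng.triangular(0.1, 4.0, 0.5)              # events per year
        lm = rng.triangular(50_000, 2_000_000, 250_000)  # EUR per event
        losses.append(lef * lm)
    return losses

losses = simulate_annual_loss()
print(f"mean annualized loss exposure: EUR {statistics.mean(losses):,.0f}")
```

The output is a distribution of annual losses rather than a single number, which is what lets stakeholders compare the cost of a control against the financial risk it reduces.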
We explored some ideas on the relationship between the GDPR and health data processing, as well as potential solutions to help comply with the requirements of the regulation. Finally, more than questioning sanctions or compliance, the GDPR should also lead us to rethink our vision of what’s considered a “good” processing of health data.
Today, it’s hard to deny the need to collect and process health data. The migration of paper records to electronic healthcare records is probably the most concrete example. It means, among other things, better coordination of care between different healthcare professionals, greater flexibility, and easier access to this information for patients themselves. Health data processing addresses specific needs for transparency and information, as well as efficiency and health safety.
In the same vein, Covid-19 clearly demonstrated the importance of health data for research. Vaccines were developed in record time (10 months instead of 10 years). There are several reasons for this: the virus was not completely unknown, huge investments, new vaccine technologies, etc. But all of this required researchers to have access to older SARS-CoV-1 data, as well as the collection of volunteer data, the rapid sharing of results with supervisory authorities, and so on. In short: it required efficient processing of health data.
The health sector, which is changing and under pressure during crises such as the Covid-19 pandemic, undoubtedly has concrete needs for effective and large-scale data processing. And to meet these needs, the digital giants are eager to offer their solutions.
Google launched its Google Health Studies app in December 2020. Its objective? To assist researchers in collecting a large and representative body of health data. Every Android smartphone owner can now register and provide the data requested by researchers. In France, the Health Data Hub was created on November 30, 2019 with the same goal: to facilitate the sharing of health data to promote research. The cloud hosting the data? Microsoft’s.
More recently, it was the partnership between the Doctolib platform and the French state that made the headlines, with Doctolib becoming an official appointment-booking platform for Covid-19 vaccination. Physician and patient associations argued that Doctolib’s cloud provider, Amazon Web Services, had access to insufficiently protected health data.
It’s therefore easy to see from these examples that the processing and storage of health data go far beyond the scope of the GDPR. They feed the debate on the principles of data protection and raise major geopolitical issues, first and foremost national sovereignty. Are we condemned to rely on the GAFAMs to collect and store the precious health data of our fellow citizens? Is the GDPR an effective deterrent against bad corporate practices, especially when some companies are subject to extraterritorial laws that are inherently opposed to the GDPR (the US Cloud Act in particular)?
We do not have clear answers to these questions yet, but this debate is far from over. One thing is certain: Europe is leading the way with a regulation that inspires others (cf. California Consumer Privacy Act). The GDPR is under close scrutiny because the issue of data protection is likely to develop everywhere.
The GDPR is first and foremost a set of requirements for data controllers, designed to guarantee the protection of every European citizen’s data. These requirements apply both to companies and to the subcontractors responsible for their data processing. Cybersecurity, for its part, is undeniably a major issue and risk for businesses and individuals alike. The consequences of a cyber security breach are diverse, from “simple” financial loss to reputational damage and beyond. GDPR compliance and cybersecurity concern us all, especially if you or your company process sensitive personal data vital to your business.
The GDPR, as its name suggests, focuses on data protection: its primary objective is to regulate the use of data. Cybersecurity encompasses all the means by which data protection can be ensured. This is why the two issues are closely linked. We can even observe a virtuous circle: improving one often has positive ripple effects on the other. That is why the GDPR requires “the implementation of appropriate technical and organizational measures to ensure a level of security appropriate to the digital risk.”
There are a number of ways to improve data protection and GDPR compliance. But the same problem comes up again and again: how do you efficiently justify the implementation of a particular tool or project to someone not well versed in these technical subjects? Using FAIR means quantifying, in financial terms, the risks we are exposed to: specifically, the risks associated with GDPR non-compliance or a lack of data protection. After risk assessment and quantification, the solutions put in place and their return on investment can be justified in a language understood by all stakeholders.