Application of the principle of proactive responsibility to the processing of personal data within the scope of the Securhome project
This report is divided into three parts. The first part analyses the theoretical framework governing the obligations of the controller through a study of the regulatory regime, taking into account the guidelines of the supervisory authority (the Spanish Data Protection Agency, AEPD), as well as the reports and opinions of the European Data Protection Board (formerly the Article 29 Working Party, hereinafter WP29). In the second part, these legal principles are applied to the specific processing operations necessary for the development of the Securhome device. The third part is devoted to how the requirement of informed consent should be adapted to the research participants and, where appropriate, at a later stage of marketing of the devices, on the basis of the age of the participants and potential users or of other circumstances such as a possible disability.
Part 1: Theoretical framework and requirements of personal data protection legislation
i. The principle of proactive responsibility
One of the most relevant novelties of Regulation (EU) 2016/679, the General Data Protection Regulation (GDPR), is the inclusion of the principle of proactive responsibility (accountability). It is set out in the second paragraph of Article 5 and establishes that the data controller shall be responsible for compliance with the principles relating to processing (Article 5.1) and must be able to demonstrate that compliance. This principle is developed in Article 24 of the GDPR and in Articles 28 and following of the LOPDGDD,1 which establish the general obligation of the controller to implement appropriate technical and organisational measures to ensure, and to be able to demonstrate, that processing is carried out in compliance with the Regulation, taking into account the nature, scope, context and purposes of the processing and the risks of varying likelihood and severity for the rights and freedoms of natural persons.
The principle of proactive responsibility implies a paradigm shift from a reactive system, built around responding to non-compliance, to a preventive and proactive model.2 It requires the data controller to analyse the data it processes and the other circumstances of the processing, and then to take real and effective measures to ensure that the processing complies with the legal requirements. In doing so, the controller must take into account "the nature, scope, context and purposes of the processing, as well as the risks of varying likelihood and severity to the rights and freedoms of natural persons".
Prior to the adoption of the GDPR, WP29 (now the European Data Protection Board)3 had already proposed the introduction of a new principle of accountability "which would require data controllers to implement appropriate and effective measures to ensure compliance with the principles and obligations laid down in the Directive and to demonstrate it when requested by the supervisory authorities".4
For WP29, this principle has two main elements:
"(i) The need for the controller to take adequate and effective measures to implement the data protection principles;
(ii) The need to demonstrate, if required, that adequate and effective measures have been taken; thus, the data controller shall provide evidence of (i)."5
WP29 considers that, for example, the following measures could be implemented: internal review and evaluation procedures; the establishment of written and binding data protection policies to ensure compliance with data quality criteria; procedures ensuring the correct identification of all data processing operations and the maintenance of an inventory of processing operations; the appointment of a Data Protection Officer; the conduct of privacy impact assessments in specific circumstances; the training of staff members, in particular human resources managers and IT administrators; the establishment of an internal complaints handling mechanism; and so on.6
In practice, the change of model means that a series of measures must be implemented to guarantee that the processing of data complies with the legislation in force, establishing procedures that make it possible to prove such compliance,7 since mere compliance with the rules is a necessary but not sufficient condition8 for concluding that the processing does not infringe the legislation in force. In this sense, recital 74 of the GDPR recalls that the data controller must be able to prove the conformity of the processing activities with the Regulation, as well as the effectiveness of the measures taken to ensure it, taking into account the nature, scope, context and purposes of the processing and the risk to the rights and freedoms of natural persons.
The principle of proactive responsibility is directly related to an approach based on the risks that the processing of personal data may imply for the data subjects. As WP29 has pointed out, this approach should not imply a weakening of the rights of individuals; on the contrary, it incorporates new guarantee instruments such as the implementation of privacy principles from the design stage or the carrying out of data protection impact assessments. A risk-based approach will require additional measures when specific risks are identified9 and, therefore, adequate diligence on the part of the controller.10
The GDPR, in recitals 75 and 76, refers to the possible risks and their severity for the rights and freedoms of individuals deriving from the processing of their data, which may give rise to "physical, material or immaterial damage, in particular where the processing may give rise to problems of discrimination, identity theft or fraud, financial loss, damage to reputation, loss of confidentiality of data subject to professional secrecy, unauthorised reversal of pseudonymisation or any other significant economic or social harm"; where data subjects are deprived of their rights and freedoms or prevented from exercising control over their personal data; where sensitive data are disclosed; where personal aspects are assessed for the purpose of creating or using personal profiles; or where "personal data of vulnerable persons, in particular children, are processed; or where the processing involves a large amount of personal data and affects a large number of data subjects" (recital 75).
The likelihood and seriousness of the risk to the rights and freedoms of the data subject should be assessed on the basis of an objective evaluation, taking into account 'the nature, extent, context and purpose of the processing'.
The following obligations of the data controller, deriving from the principle of proactive responsibility, are of particular interest here:11
- The obligations derived from the principles and rights established in the personal data protection regulations:
- Respect for the principles relating to the processing (data quality)
- To determine the basis for the legitimacy of the processing;
- To guarantee the processing of sensitive data in accordance with the legislation in force;
- Ensuring the principle of transparency and establishing the procedure for correctly fulfilling the obligations arising from the rights of the data subjects;
- Respect the guarantees for international data transfers.
- The obligations involved in taking technical and organisational measures under the principle of proactive responsibility:
- Keeping the register of processing activities;
- Adoption and implementation of data protection measures by default and from design;
- Establishment of adequate security measures;
- Carrying out, where appropriate, impact assessments and prior consultation;
- Contracting with data processors offering adequate and sufficient guarantees;
- Notification of security breaches;
- Where appropriate, appointment of a data protection officer and development of codes of conduct and certification schemes.
ii. The principle of privacy by design and default
One of the novelties of the GDPR was the introduction of the principle of privacy by design. This principle must be integrated into the development of devices, applications or business models that are based on, or require, the processing of personal data. Before analysing this principle as such, however, it is advisable to reflect briefly on its relationship with another, broader principle: the principle of technological neutrality.
Recital 15 of the European Regulation states that "in order to avoid a serious risk of circumvention, the protection of individuals should be technologically neutral and should not depend on the techniques used". The principle of technological neutrality was already included in recital 46 of Directive 2002/58 on privacy in electronic communications, in the following terms: "The protection of personal data and privacy of the user of publicly available electronic communications services should be independent of the configuration of the different components required to provide the service and of the distribution of the necessary functionalities between these components. Directive 95/46/EC covers any form of processing of personal data regardless of the technology used. The existence of specific rules for electronic communications services, alongside general rules for other components necessary for the provision of such services, may not facilitate the protection of personal data and privacy in a technology-neutral way. It may therefore be necessary to adopt measures requiring manufacturers of certain types of equipment used for electronic communications services to construct their products in such a way as to incorporate safeguards to ensure that the personal data and privacy of the user and subscriber are protected."
The principle of technological neutrality arises to counterbalance the imbalance between the user of technological products, who lacks specific knowledge, and the developers of the technology, who predetermine how the systems process personal data. The principle of neutrality is intimately connected with the principles of privacy by design and by default, since what is required is greater responsibility in the planning and design of technology in order to guarantee the right to privacy of the users of that technology. This principle seeks to counteract the opposite technique, so common in many internet services and applications, which consists of "disabling the privacy of applications by their owners to the maximum".13 In the field of industrial design, it is not uncommon for designs to incorporate a series of technical requirements with the aim, for example, of ensuring the physical integrity of those who use a particular machine, nor is it unusual to require assessments and certifications guaranteeing the safety of the end products. The principles of technological neutrality and privacy by design reflect a similar philosophy: although the physical integrity or health of the user is not at stake, their private life is, together with the risk that their personal information may end up being used by third parties for purposes that may be harmful to them and, ultimately, to their autonomy and dignity.
The principle of technological neutrality should therefore translate into allowing the maximum autonomy of the user of the Securhome device, establishing appropriate measures to counteract the imbalance between the user and the developer of the system, who is ultimately responsible for the procedures for processing personal data. If the technology is not neutral, it conditions the autonomy of its recipient and imposes systems that may damage the rights and freedoms of individuals (in our case, above all, but not exclusively, the protection of personal data). It is clear that the design of technology always conditions the user: firstly, because of the limits of the technology itself, which determines what can and cannot be done; but also, and this is the important point, because it conditions (guides) the user's behaviour. For example, one of the objectives of social network services is for users to spend as much time as possible on the platform, to expose as much personal information as possible about themselves and others, and to maintain as many interactions as possible so that their online behaviour can be monitored, and their design is geared towards achieving these objectives. Such a design is not transparent, insofar as users do not know how they are being tracked, who is accessing that data, or how the algorithm works that aims to condition their behaviour and "seduce" them so that advertising on the platform is more effective. That is precisely the point: the more subtle the system, the more effective it will be.14
In the data protection regulations, the principles of technological neutrality and privacy by design and by default require the developer of the technology to assume greater responsibility in technological planning and design in order to guarantee the right to privacy of users, and to do so from the initial moment, "at the time the means of processing are being determined".15
The principle of privacy by design emerged in the 1990s, promoted by Ann Cavoukian, Ontario's Information and Privacy Commissioner, and would be extended to a "trilogy" of applications encompassing information technology systems, responsible business practices, and physical design and network infrastructure:16
1. Proactive, not reactive; preventive not corrective, i.e. proactive measures should be taken to prevent risks and not reactive in the form of remedies once the invasion of privacy has occurred.
2. Privacy as the default setting.18 This principle starts from the observation that most users keep the "default" configuration of a system, service or application, and requires that personal data be automatically protected, by default, in any given information system or business practice, so that no action is required from the user.
3. Embedded privacy. Privacy must be part of the design and architecture of information technology systems, becoming an essential component of the system, without diminishing its functionality.
4. Total functionality - "everyone wins", not "if someone wins, someone else loses". This principle seeks to balance all legitimate interests.
5. End-to-end security. Privacy by design extends throughout the entire life cycle of the data involved, ensuring its security from the time it is obtained until it is destroyed.
6. Visibility and transparency. Whatever business or technological practice is involved, clear information must be provided, subject to independent verification. All parts of the process and operations must remain transparent to users and providers.
7. Respect for user privacy, which requires developers and operators to keep the interests of individuals paramount.
Privacy by design is presented as "an essential way to exercise self-determination, the tool to facilitate the application of the law in accordance with its principles",19 assessing "all processes and information flows envisaged in the system, analysing their privacy implications from a holistic, preventive point of view and with a focus beyond the current legal framework".20 In short, "the objectives of privacy by design are focused on ensuring privacy and on gaining personal control over one's information".21
Recital 78 states that the protection of fundamental rights with regard to the processing of personal data requires "the adoption of appropriate technical and organisational measures in order to ensure compliance with the provisions of this Regulation". By way of example, some possible measures are set out in that recital. Thus, among others, they may consist of "minimising the processing of personal data, pseudonymising personal data as soon as possible, ensuring transparency of the functions and processing of personal data, allowing data subjects to monitor the processing and the controller to create and improve security features". The Regulation intends that, when developing, designing or using applications that are based on the processing of personal data or that process personal data to fulfil their function, "the right to data protection is taken into account when developing and designing these products, services and applications, and that it is ensured, with due regard to the state of the art, that data controllers and processors are able to fulfil their data protection obligations". Article 25 sets out the specific obligations in this area, but makes them dependent on the existing technology and the cost of implementation for the controller.22 Under this principle, it is incumbent on the controller:
1. To assess the risks, of varying likelihood and severity, to the rights and freedoms of natural persons which the processing entails.
2. Similarly, in order to apply the principle of privacy by design, to assess not only the risks of the processing, but also the state of the art, the cost of implementation and the nature, scope, context and purposes of the processing.
3. Once the above assessment has been carried out, to take appropriate technical and organisational measures, such as pseudonymisation (see the sketch after this list).
4. The measures adopted must serve to implement the data protection principles effectively, such as data minimisation, and must integrate all the guarantees necessary to meet the requirements of the GDPR and protect the rights of the data subjects. For example, organisational and technological measures must be put in place to ensure the integrity of the information and the confidentiality of the data, that only data necessary and appropriate for the purpose justifying their collection are collected and processed, that they are kept only for the time necessary for that purpose, and so on.
5. When should this be done? At the time of determining the means of processing and during the processing of the personal data itself.23
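To make the kind of measure referred to in point 3 more concrete, the following is a minimal pseudonymisation sketch in Python, assuming a keyed hash (HMAC) over a participant identifier. The function name, key handling and record fields are illustrative assumptions, not the project's actual implementation.

```python
# Minimal pseudonymisation sketch (illustrative only).
import hmac
import hashlib

# The secret key should be stored separately from the pseudonymised data set
# (e.g. under the sole control of the controller), so that re-identification
# is only possible by combining both.
SECRET_KEY = b"replace-with-a-key-managed-by-the-controller"

def pseudonymise(participant_id: str) -> str:
    """Replace a direct identifier with a keyed, non-reversible pseudonym."""
    return hmac.new(SECRET_KEY, participant_id.encode("utf-8"), hashlib.sha256).hexdigest()

record = {
    "participant": pseudonymise("participant-042"),  # direct identifier replaced
    "medication_taken": True,
    "timestamp": "2019-11-05T09:30:00",
}
```

The same approach can be applied to any direct identifier before the data leave the device or are shared for research purposes.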
Article 25 of the GDPR also includes another principle related to the previous ones, that of privacy by default, which requires of the data controller:
1. The application of appropriate technical and organisational measures so that, by default, "only personal data which are necessary for each specific purpose of the processing are processed". This obligation applies to the amount of personal data collected, the extent of their processing, their storage period and their accessibility.
2. With regard to accessibility, a system must be established whereby, by default, "the data are not accessible to an indefinite number of persons",24 but only to those who must have access to them in order to fulfil the purposes of the processing.
Privacy by default complements, and pursues the effectiveness of, the data quality principles of Article 5 of the GDPR. The collection and processing of any personal information must respect the above principles in order to guarantee sufficient quality of the data and of their processing; through the implementation of these principles, we establish "a preventive system of protection of the individual against the processing of his personal data, establishing a balance between the advances of the information society and respect for the freedom of citizens".25
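By way of illustration of privacy by default, the following sketch shows what a most-restrictive initial configuration of a device of this kind could look like. The settings, field names, retention period and roles are assumptions made for this example, not the Securhome specification.

```python
# Illustrative "privacy by default" configuration: optional data sources are
# disabled unless the user opts in, and access is restricted by default.
from dataclasses import dataclass, field
from typing import List

@dataclass
class DeviceSettings:
    # Data minimisation: optional data sources start disabled.
    voice_commands_enabled: bool = False
    share_profile_picture: bool = False
    location_tracking_enabled: bool = False
    # Storage limitation: retention bounded by default (illustrative value).
    retention_days: int = 90
    # Accessibility: by default, alerts go only to the designated carer,
    # not to an indefinite number of persons.
    authorised_recipients: List[str] = field(default_factory=lambda: ["designated_carer"])

settings = DeviceSettings()  # the default object is the most privacy-protective one
```

Any broadening of these defaults would then require an explicit action by the user, in line with Article 25.2 GDPR.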
Part 2: Application of the principle of proactive responsibility to the Securhome project
As provided for in the GDPR, data protection legislation applies to public authorities. Public administrations not infrequently act as data controllers and are responsible for applying the principle of proactive responsibility. The LOPDGDD establishes a list of public authorities and public bodies to which the legislation applies, expressly mentioning public universities.26
In the case of research projects linked to universities that involve the processing of personal data, it is the university that acts as data controller. Carlos III University will be the controller of the data processing linked to Project 1 (the sensory device for households), while the University of Aveiro will be the controller of the processing linked to Project 2 (the mobile application responsible for domestic interaction through the television). If the programme is understood as a whole, as it appears from the call for applications, and not as separate projects, we would be in a situation of joint controllership.27 The legal basis for the processing lies in the fulfilment of the function of creation, development, transmission and criticism of science, technology and culture, and of dissemination, valorisation and transfer of knowledge at the service of culture, quality of life and economic development, which Organic Law 6/2001 on Universities attributes to them (Article 1.2, letters a and d).28 The controller is, under Article 4.7 of the GDPR, the natural or legal person, public authority, service or other body which, alone or jointly with others, determines the purposes and means of the processing. It is the responsibility of the controller, according to Article 25.2 of the GDPR, to implement appropriate technical and organisational measures to ensure that, by default, only personal data necessary "for each specific purpose of the processing" are processed.
The Securhome project aims to develop a non-invasive device which, by means of AI techniques applied to the analysis of daily activity, can detect whether a person over 65 is at risk, collecting and processing the data in order to prevent any eventuality. The obligations that the GDPR lays down for this data processing are analysed below and detailed with regard to the Securhome project, according to the data available at the time of preparing this report.
i. The register of processing activities
Article 5 of the GDPR establishes the principles relating to the processing of personal data, the respect of which is essential in order to comply with the rules: lawfulness, fairness and transparency; purpose limitation; data minimisation; accuracy; storage limitation; and integrity and confidentiality.
On 25 May 2018, the obligation to register files was replaced by the need to keep a record of processing activities. Article 30 establishes, in general terms, the obligation of controllers and processors to keep such a record, which must be in writing, in order to be able to prove compliance with the GDPR.29 The objective is to identify the threats and risks to which the processing of data is exposed in order to minimise them. This represents a change of perspective in the area of data protection, moving from a catalogue of specific measures to the adoption of those most suitable for privacy.
In the Securhome project, the record of processing activities appears to be obligatory, taking into account the information provided by the programme coordinator and the principal investigator of Project 2, due to the processing of certain sensitive data, so that the exception contained in Article 30.5 of the GDPR does not apply. Although the Spanish Data Protection Agency has made a tool available to controllers with the aim of determining whether a processing operation involves little risk, it has not been possible to use it for the needs of the project. In order to generate the basic documentation, risk management will be carried out by the data controllers in order to comply with the provisions of the GDPR and the LOPDGDD. As indicated in Article 30, each controller and, where applicable, its representative shall keep a record of the processing activities carried out under its responsibility. With respect to the Securhome project, this record must contain the information indicated below:
(a) Details of the persons involved
Data of the controller: Carlos III University. Data of the joint controller: University of Aveiro. Data of the Data Protection Officer: dpd@uc3m.es.
(b) Legal basis of the processing:
Article 6.1 of the GDPR provides, in generic terms, for different conditions that may determine the legal basis of the processing. However, taking into account that, according to the information provided by the project coordinator, it will be necessary, in order to implement the device, to process certain biometric data (such as voice identification) as well as certain medical data, Article 9.2(a) of the Regulation applies: the explicit consent of the data subject for the purposes specified in the following paragraph.30
Taking into account the legal basis of the processing, and in order to guarantee free, specific, informed and unequivocal consent, it is essential to provide the information at the time of collecting the data from the data subjects, adapting it to the specific needs of the persons involved. The right to information is an essential faculty of the individual, since it conditions the subsequent control of the data and the exercise of the rights of access, rectification and erasure. In relation to explicit informed consent in the project, it seems necessary to take into account not only the type of data but also the category of persons involved in the research, which is aimed at persons over 65 years of age who may, in some cases, have some degree of disability. Each case must be specifically assessed when seeking explicit consent, adapting the request procedure and making, if necessary, the corresponding reasonable adjustments, especially with regard to understanding the information. These measures are aimed at avoiding a possible imbalance between the data subject and the data controller and at ensuring free consent in relation to the processing of personal data by public bodies (recital 43 GDPR).31
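Since the controller must also be able to demonstrate that explicit consent was obtained, a minimal sketch of a consent record is given below. All field names, formats and values are hypothetical assumptions and would need to be defined by the researchers responsible for the projects.

```python
# Illustrative consent record kept to demonstrate explicit, informed consent.
from dataclasses import dataclass
from datetime import datetime
from typing import Optional, Tuple

@dataclass
class ConsentRecord:
    data_subject_pseudonym: str            # no direct identifier stored here
    information_sheet_version: str         # which adapted information sheet was used
    accessible_format_used: str            # e.g. easy-read leaflet, oral explanation with support
    purposes: Tuple[str, ...]              # the specific purposes consented to
    given_at: datetime
    withdrawn_at: Optional[datetime] = None  # consent can be withdrawn at any time

consent = ConsentRecord(
    data_subject_pseudonym="a3f9...",
    information_sheet_version="v0.3-easy-read",
    accessible_format_used="easy-read leaflet plus oral explanation",
    purposes=("behaviour-pattern analysis for risk alerts",),
    given_at=datetime(2019, 10, 1, 10, 0),
)
```

Recording the information-sheet version and the format used helps show that the consent request was adapted to the person concerned, as discussed in Part 3 of this report.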
This issue is specifically studied and analysed in the third part of this report, which details the particular considerations to be taken into account when dealing with people with disabilities, in view of the possible impact on their rights and freedoms.
Thus, although a consent form template is attached (Annex I), it is provisional and will have to be revised in each case by the researchers responsible for Projects 1 and 2, especially with regard to safeguarding the information that conditions its validity.
(c) Purposes of the processing:
According to the data provided by the project coordinator, the purpose is the implementation of a non-invasive device that uses the recognition of user behaviour patterns to send warnings or alerts to carers and alert users to their medical needs.
This is a very specific purpose that requires a differentiated analysis, taking into account that the attribution of behavioural patterns to a subject can lead to profiling. In this case, according to the information provided, the data can lead to behavioural profiling.32
The data may not be processed in a way that is incompatible with the purposes described. However, since this is a research project, it should be remembered that the GDPR states that further processing for "scientific or historical research purposes or statistical purposes" is not considered incompatible with the purposes initially foreseen (Article 5.1.b GDPR).
(d) Categories of data subjects
Persons over 65, as a prerequisite of the project.
(e) Categories of personal data
According to the data provided for the preparation of this study, the categories of personal data that will be handled in the project are the following (a structured sketch of this inventory follows the list):
1. User identification data:
- User's data.
- User's profile (name, sex, date of birth, location, profile picture, interests and skills)
- Postal address.
2. Carer's identification details (person who will receive the alerts)
- Caregiver profile data (name, sex, date of birth, location, profile picture, interests and skills)
- List of dependents
- Messages sent
- Postal address, e-mail and mobile phone
3. Biometric data
- Voice data (basic command analysis and speaker detection and identification)
4. Health-related data:
- List of medicines and dosage
- List of future medical consultations
- List of marked medical tests
- Medication history with time and status (take/no take)
5. Other information (access by address)
- Login List
- Time of the last data sent.
- Personal data relating to movement, light intensity, infrared, consultation of consumption and interaction with television (through sensors).
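By way of illustration, the categories listed above could be kept as a structured inventory within the record of processing activities. The following sketch is merely indicative; the keys and labels are assumptions derived from the list above.

```python
# Illustrative structured inventory of the data categories listed above.
DATA_INVENTORY = {
    "user_identification": ["name", "sex", "date_of_birth", "location",
                            "profile_picture", "interests", "skills", "postal_address"],
    "carer_identification": ["name", "sex", "date_of_birth", "location", "profile_picture",
                             "list_of_dependents", "messages_sent",
                             "postal_address", "email", "mobile_phone"],
    "biometric": ["voice_commands", "speaker_identification"],        # special category (Art. 9)
    "health": ["medication_list", "dosage", "future_consultations",
               "scheduled_tests", "medication_history"],              # special category (Art. 9)
    "other": ["login_list", "last_data_sent_time",
              "movement", "light_intensity", "infrared", "tv_interaction"],
}

SPECIAL_CATEGORIES = {"biometric", "health"}  # these trigger the stricter safeguards discussed below
```

Keeping the inventory in this structured form makes it easier to check, as the project evolves, that no new category of data is processed without updating the record.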
(f) Categories of recipients for communication
The data communications that are planned to be made are directly related to the purpose of the processing activity, aimed at research.
Although further processing for "scientific or historical research purposes or statistical purposes" is not incompatible with the purposes initially envisaged (Article 5.1.b GDPR), in order to guarantee respect for the rights of users it is recommended that an information clause be added indicating the intended communications and seeking consent to them.
(g) International data transfers
In principle, none are foreseen.
(h) Time-limits for the deletion of data.
The data will be kept for the duration of the project, until July 2020. In addition, as stated in Article 89.1 of the GDPR, further use for research purposes is possible provided that appropriate technical and organisational measures are in place to respect the principle of data minimisation. In the case of the Securhome project, pseudonymisation for research purposes seems possible.
(i) Overview of technical and organizational measures
(If possible)
The controllers listed in Article 77.1 of the LOPDGDD, which include public universities, must apply to the processing the corresponding security measures provided for in the National Security Scheme.
A general description of the technical and organisational security measures referred to in Article 32(1) is not possible at this stage of project implementation.
This basic documentation must be completed and analysed by the controller, especially as the project progresses, in order to demonstrate at all times that the processing is carried out in accordance with the requirements laid down in the GDPR. The regulatory compliance checklist is included as Annex II to this work.
ii. Basic risk analysis
Taking into account that the data to be processed include biometric data33 (voice data, with basic command analysis and speaker detection and identification) and medical or health-related data (type of medication, medical appointments and medication history),34 and that automated individual decisions, including profiling, are made on the basis of special categories of personal data with the data subject's consent, a risk analysis has been carried out in order to obtain appropriate guidelines for safeguarding the rights, freedoms and legitimate interests of the data subjects. The organisation of the record, as a structured set of data, has followed the practical guide for risk analysis in the processing of personal data subject to the GDPR prepared by the AEPD. The identification of possible threats and the assessment of the risks involved in their processing is attached as Annex III.
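To illustrate the kind of evaluation followed, a minimal likelihood-by-severity sketch is shown below. The scales, thresholds and example threats are assumptions for illustration only and do not reproduce the values of the risk analysis attached as Annex III.

```python
# Minimal likelihood x severity risk evaluation sketch (illustrative values).
LIKELIHOOD = {"low": 1, "medium": 2, "high": 3}
SEVERITY = {"low": 1, "medium": 2, "high": 3}

def risk_level(likelihood: str, severity: str) -> str:
    score = LIKELIHOOD[likelihood] * SEVERITY[severity]
    if score <= 2:
        return "acceptable"
    if score <= 4:
        return "acceptable with mitigation measures"
    return "high - consider a data protection impact assessment"

threats = {
    "unauthorised access to medication history": ("low", "high"),
    "re-identification of pseudonymised research data": ("low", "medium"),
    "loss of availability of emergency alerts": ("medium", "high"),
}

for threat, (likelihood, severity) in threats.items():
    print(f"{threat}: {risk_level(likelihood, severity)}")
```

The actual thresholds used in the project would have to follow the AEPD guide and the documentation in Annex III rather than the illustrative values above.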
iii. Impact assessment
The data protection impact assessment is set out in Article 35 of the GDPR in the following terms:
"1. Where a type of processing operation, in particular where it involves the use of new technologies, is likely, by its nature, its scope, its context or its purposes, to present a high risk to the rights and freedoms of natural persons, the controller shall, prior to the processing, carry out an assessment of the impact of the processing operations on the protection of personal data.
2. The controller shall seek the advice of the Data Protection Officer, if appointed, when carrying out the data protection impact assessment.
3. The data protection impact assessment referred to in paragraph 1 shall be required in particular in the case of
(a) a systematic and thorough assessment of personal aspects of natural persons based on automated processing, such as profiling, on the basis of which decisions producing legal effects for natural persons or significantly affecting them in a similar way are taken;
(b) large-scale processing of special categories of data referred to in Article 9(1) or of personal data relating to convictions and criminal offences referred to in Article 10; or
(c) systematic monitoring of a publicly accessible area on a large scale."
The Spanish Data Protection Agency has recently published, in accordance with Article 35(4) of the GDPR, a list of the types of processing operations that require a data protection impact assessment, which should be understood as non-exhaustive:
1. Processing operations involving profiling or assessment of data subjects, including the collection of data from the subject in multiple areas of their life (work performance, personality and behaviour), covering various aspects of their personality or habits.
2. Processing operations involving automated decisions or contributing significantly to such decisions, including any kind of decision preventing a data subject from exercising a right or having access to a good or service or from entering into a contract.
3. Processing operations involving the observation, monitoring, supervision, geolocation or control of the data subject in a systematic and comprehensive way, including the collection of data and metadata through networks, applications or in publicly accessible areas, as well as the processing of unique identifiers allowing the identification of users of Information Society services such as web services, interactive TV, mobile applications, etc.
4. Processing involving the use of special categories of data referred to in Article 9(1) of the GDPR, data relating to convictions or criminal offences referred to in Article 10 of the GDPR, or data making it possible to determine the financial situation or creditworthiness of individuals or to deduce information about them relating to special categories of data.
5. Processing operations involving the use of biometric data for the purpose of uniquely identifying a natural person
6. Processing operations involving the use of genetic data for any purpose.
7. Processing operations involving the use of data on a large scale. In determining whether a processing operation can be considered on a large scale, the criteria set out in the Article 29 Working Party's 'Guidelines on Data Protection Officers' shall be considered.
8. Processing operations involving the association, combination or linking of database records from two or more processing operations carried out for different purposes or by different controllers.
9. Processing of data of vulnerable subjects or subjects at risk of social exclusion, including data of minors under 14 years of age, elderly persons with some degree of disability, disabled persons, persons accessing social services and victims of gender-based violence, as well as their descendants and persons under their care and custody.
10. Processing operations involving the use of new technologies or the innovative use of established technologies, including the use of technologies on a new scale, for a new purpose or in combination with others, in a way that involves new forms of data collection and use which put at risk the rights and freedoms of individuals.
11. Processing of data that would prevent data subjects from exercising their rights, using a service or performing a contract, such as processing operations where the data have been collected by a controller other than the one that is going to process them and where one of the exceptions to the information to be provided to data subjects under Article 14.5 (b), (c) and (d) of the GDPR applies."35
As has been pointed out, the data protection impact assessment (DPIA) is not always necessary; its need must first be evaluated by analysing the lists of processing operations provided for in the Regulation and by determining the nature, scope, context and purposes of the processing:36
(a) Analysis of the lists of processing operations provided for in the Regulation
Article 35.3 of the GDPR establishes the three cases, already mentioned, in which it is obligatory to carry out the impact assessment. With regard to the Securhome project, on the basis of the information currently available, it should be noted that none of them applies. The development of the project does not involve the systematic large-scale observation of a publicly accessible area or the large-scale processing of so-called "sensitive data". Although certain medical information is collected, this is limited to appointment dates and the regular intake of medication by a normally low number of subjects. Furthermore, although the application produces, through an artificial intelligence mechanism, a certain behavioural profile of the subject, the objective of this profile is not the taking of decisions with legal effects; it has a very specific and determined purpose, specified in the record of processing activities.
Greater doubts as to whether or not a DPIA is needed are raised by the processing of the above-mentioned sensitive data in the light of the list published by the AEPD, bearing in mind that it includes two types of processing that may involve serious risks for the rights and freedoms of the data subject: "processing operations involving the use of special categories of data referred to in Article 9.1 of the GDPR, data relating to convictions or criminal offences referred to in Article 10 of the GDPR or data making it possible to establish the financial situation or creditworthiness of individuals or to infer information on individuals concerning special categories of data", and "processing operations involving the use of biometric data for the purpose of uniquely identifying a natural person". With regard to the first scenario, it should be recalled that the Securhome project deals with certain health-related information, that is, data relating to the physical or mental health of a person, including the provision of health care services, which reveal information about his or her health status, such as the taking of medication.37 The second scenario concerns the use of biometric data. The GDPR defines biometric data as personal data obtained through specific technical processing, relating to the physical, physiological or behavioural characteristics of a natural person, which allow or confirm the unique identification of that person.38 In this regard, the project coordinator confirms the need to use voice data as a biometric element distinguishing each user, for the purpose of performing basic command analysis and speaker detection and identification.
On this point, it should be noted that, although the lawfulness of the processing in the Securhome project is essentially based on the consent of the user of the system, the processing of the data indicated here could be partially covered by Article 6.1(d) of the GDPR: the protection of vital interests and the need to guarantee the safety of persons. Moreover, at this stage of the project, the use of sensitive data is incidental rather than the main processing, implying a residual risk. In this sense, and especially with regard to the use of biometric systems that identify the person, it is essential to study whether such a system is adequate, relevant and limited to what is necessary in relation to these data, particularly as regards the recording of the data. It is likewise relevant to consider whether the same result could be achieved with a technical mechanism that is equally effective but less intrusive for privacy.39 From the data initially provided, the voice command appears to have a clear purpose, namely the emergency alert in the presence of a health risk, with a prior assessment of whether only a "panic button" is installed or whether the emergency can also be activated by voice command. This assessment has been taken into account in the risk analysis, which resulted in an acceptable risk. Therefore, at this phase of the project, the impact assessment is not carried out.
(b) Analysis of the nature, scope, context and purposes of the processing
The second phase of analysis focuses on evaluating the above aspects as outlined in Article 35.1 of the GDPR.
The following issues have been taken into account:
- Nature of the processing: a question already analysed in the previous section. At this stage of the project, according to the information provided, certain sensitive data are processed on a small scale, without involving massive monitoring of people. With respect to the collection of data regarding especially vulnerable people, a report has been drawn up on the specific risks to be taken into account as the project evolves.
- However, for the future, it is necessary to evaluate certain risks with possible legal effects, especially those related to discrimination, as indicated in the following section of the report by Professors Migle Laukyte and Rafael de Asís.
- Context of the processing: although within the Securhome project the collection of data is carried out by means of new technologies, efforts have been made to ensure that the device is as minimally invasive as possible for the privacy of individuals.
- Purpose of the processing: the purpose of the data processing in the project is concrete and clear, and the adoption of the measures necessary to minimise the risks in relation to profiling and monitoring must be taken into account as the research progresses.
One issue that will have to be analysed subsequently and exhaustively, if the device is finally marketed and in view of the possible need to carry out an impact assessment, is the processing of data relating to vulnerable data subjects, because of the possible increase in the imbalance of power between the data subject and the controller (recital 75 GDPR) and because the elderly are considered part of the population subject to special protection. If, in the future, the evolution of the project makes it necessary to carry out a DPIA, account must be taken of the prior consultation of the supervisory authority before the start of high-risk processing, as provided for in Article 36 of the GDPR; this procedure is addressed exclusively to the data controller who finds that the processing would involve a high risk to rights even after the corresponding guarantees and security measures have been implemented.
As indicated by the AEPD, the following are essential conditions for having recourse to this prior consultation:
- being the data controller;
- having completed a DPIA;
- the DPIA showing a high risk to the rights and freedoms of individuals even after measures to mitigate it have been applied.
v. Security of personal data
With regard to security, it should be recalled that the GDPR introduces the concepts of data protection by design and by default, already analysed (Article 25.2 GDPR); the controller must establish control procedures that guarantee these principles.
With the objective of establishing the relationship between the concepts of privacy by design and by default, and following the Spanish Data Protection Agency's practical guide on risk analysis for processing operations subject to the GDPR, a workflow has been followed whose purpose is to identify threats and to evaluate and treat the risks.
In the Securhome project, the initial data of the processing activity have been defined. It has not been possible to use the tool provided by the AEPD, essentially for two reasons: the use of certain sensitive data, in particular health data, and the need for a risk analysis derived from the creation of behavioural profiles. This initial analysis has resulted in an "acceptable risk", so it does not seem appropriate, at this time, to carry out an impact assessment, as mentioned in the previous section. However, the progress of the project and the development of the device may make this assessment necessary in the future.
Article 32 of the GDPR regulates the security of processing, establishing in paragraph 1 the need for the controller and the processor to implement the technical and organisational measures required to ensure a level of security appropriate to the risk. These measures should include (see the sketch after this list):
- Pseudonymisation and encryption of personal data.
- Mechanisms to guarantee the confidentiality, integrity, availability and permanent resilience of the system.
- Mechanisms to restore the availability of information in the event of a technical or physical incident.
- A process for verifying and evaluating the effectiveness of the technical and organisational security measures.
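As an illustration of the first of these measures, the following sketch encrypts a record at rest using the third-party Python package cryptography (Fernet). The key handling and record format are assumptions for this example, not the project's actual design.

```python
# Illustrative encryption-at-rest sketch using symmetric (Fernet) encryption.
from cryptography.fernet import Fernet

key = Fernet.generate_key()      # in practice the key lives in a separate key store
cipher = Fernet(key)

plaintext = b'{"participant": "a3f9...", "medication_taken": true}'
token = cipher.encrypt(plaintext)    # the encrypted token is what gets stored
restored = cipher.decrypt(token)     # only possible with access to the key

assert restored == plaintext
```

Combined with the pseudonymisation sketch shown earlier, this kind of measure helps ensure confidentiality even if the stored data are accessed without authorisation.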
Taking into account the data provided and the status of the project, the risk analysis document provisionally carried out is attached.
With regard to the subsequent use of the data obtained for research purposes, the application of anonymisation techniques and measures should be noted. A periodic re-evaluation of the residual risk is therefore recommended, in order to incorporate parameters that improve the quality of the anonymisation.41
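By way of example, a periodic re-evaluation of residual risk could include a simple check of quasi-identifier group sizes (a k-anonymity style test). The threshold and fields below are assumptions for illustration only.

```python
# Illustrative residual-risk check: flag quasi-identifier combinations that
# appear fewer than k times in an anonymised research extract.
from collections import Counter

K = 5  # minimum acceptable group size in this sketch
records = [
    {"age_band": "65-70", "sex": "F", "postcode_prefix": "280"},
    {"age_band": "65-70", "sex": "F", "postcode_prefix": "280"},
    {"age_band": "71-75", "sex": "M", "postcode_prefix": "281"},
]

groups = Counter((r["age_band"], r["sex"], r["postcode_prefix"]) for r in records)
risky = [combo for combo, count in groups.items() if count < K]
print(f"{len(risky)} quasi-identifier combinations below k={K}; "
      "generalise or suppress them before further research use.")
```

Checks of this kind can be repeated each time the anonymised data set is updated or shared, so that the residual re-identification risk is re-evaluated periodically as recommended above.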
Part 3: Study on the special conditions for informed consent by category of data subject and specific risk analysis
i. Whether it would be necessary to adapt the request for informed consent to the category of research subjects for the design of the device (and subsequently for its marketing), taking into account that these are persons over 65 years of age who, in some cases, may have some degree of disability.
Yes, it is necessary to adapt the request for informed consent to the category of research subjects, both for the design of the device and, subsequently, for its marketing.
As is well known, in the field of data protection consent is understood as "any free, specific, informed and unequivocal expression of will by which the person concerned accepts, either by means of a declaration or a clear affirmative action, the processing of personal data concerning him or her" (Article 6 of Organic Law 3/2018, of 5 December, on the protection of personal data and the guarantee of digital rights, the LOPDGDD; in the same sense, Article 4.11 of Regulation (EU) 2016/679 of the European Parliament and of the Council of 27 April 2016 on the protection of natural persons with regard to the processing of personal data and on the free movement of such data, the GDPR).
This is a requirement deriving from the duty to inform and from the principle of autonomy of the will which, in certain areas such as health,42 has become a genuine right.
In any case, as we have seen, there are four requirements for valid consent: it must be free, specific, informed and unequivocal. Although these four elements imply general requirements (such as the demand for sufficient, intelligible and easily accessible information, in clear and simple language), they must be adapted to the specific circumstances and to the people involved.
Thus, in relation to the processing of data, the LOPDGDD, in Article 28 (general obligations of the controller and the processor), provides that "appropriate technical and organisational measures must be taken and implemented in order to ensure and prove that the processing is in accordance" with the GDPR, the Organic Law and other applicable rules. The same article establishes that controllers and processors must take into account the greater risks that arise in a series of cases, among which are those in which "the processing of data of groups of affected persons in a situation of special vulnerability and, in particular, of minors and persons with disabilities, is carried out".
This means that the giving of informed consent, when dealing with persons with disabilities, acquires its own connotations. Before setting out some of them, it is important to clarify what the rule means when it refers to persons with disabilities.
Two main regulatory references can be used for this purpose. One of them is the Convention on the Rights of Persons with Disabilities (hereinafter CRPD); the other is the General Law on the rights of persons with disabilities and their social inclusion of 2013 (hereinafter LGDPD).
Under the first, disability is the sum of two factors that accompany the person and that we can understand as a condition and a situation.43 We speak of a condition to refer to personal traits or, in the terms of the LGDPD, impairments. We speak of a situation to refer to the barriers that people encounter when it comes to satisfying their rights; these are environmental barriers, but also attitudinal and social barriers. Thus, Article 1 of the CRPD states: "persons with disabilities include those who have long-term physical, mental, intellectual or sensory impairments which, in interacting with various barriers, may hinder their full and effective participation in society on an equal basis with others".
For its part, the LGDPD, in Article 4.1, adopts a similar vision of a person with a disability when it states: "persons with disabilities are those who present physical, mental, intellectual or sensory impairments, foreseeably permanent, which, in interaction with various barriers, may prevent their full and effective participation in society on equal terms with others". However, point 2 of the same provision states: "in addition to what is established in the previous paragraph, and for all purposes, those persons who have been recognised as having a degree of disability equal to or greater than 33 per cent shall be considered persons with disabilities...". In this sense, recognition as a person with a disability is linked to the possession of a degree that is determined mainly by medical factors that have to do with traits or, if you will, with the condition.44
Between the two paths, it seems that the first fits better with the meaning of informed consent. As we have pointed out, informed consent must be adapted to the particularities of each person, and this does not seem compatible with the determination of general degrees of disability. Such a determination may make sense when establishing administrative measures, but hardly when we refer to the satisfaction of rights.
We can therefore affirm that, when the LOPDGDD refers to persons with disabilities, it does not only include those with a recognised degree of 33% or more, but any person who encounters barriers, in this case, at the time of giving consent.
From all of the above, it can be deduced that the elderly can be included within the reference to persons with disabilities made by the LOPDGDD, as they often face sensory, intellectual or environmental barriers. Elderly persons have more difficulty in developing computer skills, in understanding how different devices work and in managing them. In this sense, Article 81 of the LOPDGDD, when referring to internet access, states: "access to the internet will seek to overcome the generation gap through actions aimed at training and access for the elderly".45
Thus, in relation to these persons, the giving of consent must be adapted to overcome these barriers, for which both the CRPD and the LGDPD provide for the possibility of general accommodations and individual accommodations: the former within universal design and the latter as reasonable accommodations.46 The LGDPD, in practically the same terms as the CRPD, establishes in Article 2 that universal design is understood as "the activity by which environments, processes, goods, products, services, objects, instruments, programmes, devices or tools are conceived or designed from the outset, and whenever possible, in such a way that they can be used by all persons, to the greatest extent possible, without the need for adaptation or specialised design", and continues by stating that universal design "shall not exclude assistive devices for particular groups of persons with disabilities, where needed". The same article defines reasonable accommodation as "necessary and appropriate modifications and adjustments to the physical, social and attitudinal environment to meet the specific needs of persons with disabilities which do not impose a disproportionate or undue burden, where needed in a particular case in an effective and practical way, to facilitate accessibility and participation and to ensure to persons with disabilities the enjoyment or exercise on an equal basis with others of all rights".
Of the four elements of consent mentioned above, it is surely the requirement that consent be informed that has the most implications for people with disabilities. This element requires, firstly, complete information and transparency and, secondly, information that is understandable. Thus, when it is a person with a disability who must give consent, the information will be offered in appropriate formats, according to the rules laid down by the principle of design for all, so that it is accessible and understandable, and the relevant support measures will be provided.47 The giving of informed consent may require adaptations so that it can be given correctly by any person, adaptations that may be all the more necessary in the case of persons with disabilities.
Not surprisingly, Article 6 of the LGDPD, which refers to respect for the autonomy of persons with disabilities, states:
"1. The exercise of the rights of persons with disabilities shall be carried out in accordance with the principle of freedom to make decisions.
2. Persons with disabilities have the right to make decisions freely, for which information and consent must be given in appropriate formats and in accordance with personal circumstances, following the rules laid down by the principle of universal design or design for all persons, so that they are accessible and understandable. In all cases, account must be taken of the person's individual circumstances and his or her ability to make the specific type of decision in question, and the provision of support for decision-making must be ensured."
Thus, the process of giving consent, and everything related to the processing of their data, must be carried out in such a way that the person understands the information and can form their own judgement, if necessary with the appropriate supports.
In this way, we guarantee the exercise of the right to informed consent, we limit possible discrimination (Article 14 of the Spanish Constitution, Article 5 CRPD, Article 63 LGDPD) and we protect other rights, such as privacy (and it should not be overlooked that lack of respect for the privacy of an elderly person, which forms part of the right to data protection and privacy, is also one of the forms of abuse).48
Some techniques that can facilitate older people's understanding of informed consent, further adapting it to the different levels and forms of disability that may be present among the older people selected to participate in the research, are the following:
- Use graphic material: drawings, photographs, colours;
- Use simple, everyday language without technical terms;
- Use practical examples to explain what will be done with the data that is collected and what the data is for and how the device works. For example, it may not be clear why a person's profile photo and data on their interests and abilities are needed.
- Emphasise that there is data that will never be requested for this device and that if someone asks, for example, for a bank account number or other data that is irrelevant to this device, the person must immediately inform those responsible.
- Explain clearly which parameters of the device can be modified and how (for example, the time control between interactions with the device).
- Establish who in particular will be alerted if the sensors of the device notice something strange.
- Create a space for dialogue: explain the things that the interlocutors do not understand and answer their questions, for example how long their data will be kept, or explain that the voice recording of the elderly person is necessary in order to send the emergency request through the device, etc.;
- Create a continuous channel of dialogue (also by email or telephone) before, during and after the processing, based on open-ended questions and other measures to ensure the older person's understanding: for example, ask what they have understood about how their data will be used, create a device web page with options for older people or people with disabilities, take into account that older people, with or without disabilities, may need more time to read, understand, activate or do other things, or identify particular risks for older people and prepare a plan to manage them.
- Collaborate with the elderly person's family, their representative or trusted assistance services, and with third parties involved in their life, such as volunteers from different associations; this also covers elderly people in a situation of dependency as defined in Law 39/2006, of 14 December, on the promotion of personal autonomy and care for dependent persons;
- Use easy-read techniques (images, large print, short sentences, etc.) in written material, such as a leaflet about the device explaining what the device is for, how it may affect the person (can the device be heard? does it take up space?), whom to call if there is a problem with the device, and what to do if something happens to it (it falls down, etc.).
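To make this kind of explanation concrete, the following minimal Python sketch illustrates the sort of user-modifiable parameters and alert routing referred to in the list above. All names and values are hypothetical assumptions for illustration only; they do not describe the actual Securhome configuration.

from dataclasses import dataclass, field
from typing import List

# Hypothetical sketch: these names do not come from the Securhome project.
# They only illustrate the kind of settings the consent material should
# explain in plain language to the elderly person.

@dataclass
class AlertContact:
    name: str      # e.g. a relative, carer or emergency service
    channel: str   # "phone", "sms" or "email"
    address: str   # telephone number or email address used for the alert

@dataclass
class DeviceSettings:
    # Time allowed between interactions with the device before a check is triggered
    interaction_timeout_minutes: int = 120
    # Who is alerted, and in which order, if the sensors notice something unusual
    alert_contacts: List[AlertContact] = field(default_factory=list)
    # Whether voice recordings are used only to send emergency requests
    voice_used_for_emergencies_only: bool = True

# Example: a shorter interaction timeout and a single reference carer to alert
settings = DeviceSettings(
    interaction_timeout_minutes=90,
    alert_contacts=[AlertContact("Reference carer", "phone", "+34 600 000 000")],
)
print(settings)

Walking through a structure like this with the participant (which parameter does what, and who receives an alert) is one way of meeting the requirement that the information given be complete, transparent and understandable.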
However, the fact that the information element has the greatest impact on people with disabilities, understood in the broad sense we are using, does not mean that the other elements have no impact at all: the requirements that consent be free, specific and unequivocal also have their own importance and particular features.
Thus, for example, with regard to free consent, we must not overlook the fact that we are dealing with people in a situation of vulnerability who face barriers, in many cases of a social nature, rooted in discrimination. This means that in many cases their relationship with other people cannot be regarded as a relationship between equals, and there is a difference in power that must be taken into account.
ii. Specific risks for the rights and freedoms of these persons
This analysis is all the more necessary insofar as the Spanish data protection agency has included, among the processing operations requiring a data protection impact assessment, the "processing of data of vulnerable subjects or those at risk of social exclusion, including data of minors under 14 years of age, elderly persons with some degree of disability, disabled persons, persons accessing social services and victims of gender violence, as well as their descendants and persons under their guardianship and custody".
The specific risks to the rights and freedoms of these persons include:
1. Risk of manipulation or exploitation by third parties: e.g. asking the elderly person for data that are not relevant to the operation of the device in order to access their bank account;
2. Risk to the right to privacy: e.g. how to ensure these persons can exercise the right of access to their personal data, how to facilitate the withdrawal of informed consent, the accessibility of information about the device on the company website, etc. (see the sketch after this list);
3. Risk to the right to personal autonomy and to the freedom to make decisions about their life and their physical and mental integrity. For example, since the device relies on the profiles it creates for each elderly person and on their patterns of behaviour, there is a risk of building a profile of the person that does not correspond to reality (a risk also for the right of rectification; see point 2 of this list);
4. Risk to human dignity: the person may be treated as an object (a generator) of data rather than as a human person; furthermore, older people run the risk of being regarded as a burden rather than a resource for technological research. In other words, IT research has to empower older people and treat them (and help them to see themselves) as active rather than passive subjects of research;
5. Risk of not ensuring universal access to information society services: there is a danger of marginalising older people simply because they have difficulty in adapting to new digital and computer media and resources;
6. Risk to the right not to be subject to discrimination, not only because they have a disability but also because they belong to another group in a situation of vulnerability (for example, a woman with a disability: in this case we can speak of multiple or intersectional discrimination);
7. Risk to the right not to be discriminated against on the basis of age ("ageism");
8. Risk to the right to (medical) assistance: with this device there is a risk of providing medical supervision only at a distance, reducing contact between the elderly person and specialised medical staff. In this way, we run the risk of failing to guarantee the elderly person the right to "live with dignity through adequate and person-centred medical care, in the long term and accessible to all", as the age manifesto (the network of European associations of the elderly) warns;
9. Risk of increasing the person's isolation and loneliness if they come to be visited or cared for in person by human professionals only when something bad happens or only when the device signals irregularities;
10. Risk to their individual autonomy, e.g. if they are not informed in an understandable way about the scope of the project or of the data processing.
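As a purely illustrative complement to risks 2 and 3 above, the following minimal Python sketch (an assumption, not the Securhome implementation) shows how a consent record could keep the evidence needed to honour an accessible withdrawal of consent and to document the format and supports with which the information was given.

from dataclasses import dataclass, field
from datetime import datetime
from typing import Optional

# Hypothetical sketch only: field names are illustrative assumptions and do not
# describe the Securhome system. The aim is to show that withdrawal of consent
# and the accessibility of the information given can be recorded explicitly.

@dataclass
class ConsentRecord:
    data_subject_id: str
    purpose: str                          # the specific purpose the consent covers
    information_format: str               # e.g. "easy-read leaflet", "audio", "pictograms"
    support_person: Optional[str] = None  # person who assisted the decision, if any
    given_at: datetime = field(default_factory=datetime.utcnow)
    withdrawn_at: Optional[datetime] = None

    def withdraw(self) -> None:
        # Withdrawing consent should be as easy as giving it (Article 7(3) gdpr)
        self.withdrawn_at = datetime.utcnow()

    @property
    def is_active(self) -> bool:
        return self.withdrawn_at is None

# Example: consent given with an easy-read leaflet and later withdrawn
record = ConsentRecord("participant-01", "behaviour-pattern monitoring", "easy-read leaflet")
record.withdraw()
print(record.is_active)  # False

Keeping this kind of record is one way for the controller to demonstrate, under the principle of proactive responsibility, both that understandable information was provided and that a withdrawal request was acted upon.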
1 ley orgánica 3/2018, de 5 de diciembre, de protección de datos personales y garantía de los derechos digitales (lopdgdd).
2 lorenzo cabrera, sara: “posición jurídica de los intervinientes en el tratamiento de datos personales. medidas de cumplimiento”, en aa.vv.: protección de datos, responsabilidad activa y técnicas de garantía, reus, madrid, 2018, p. 123 y ss.
3 grupo de trabajo del artículo 29.
4 dictamen 3/2010 sobre el principio de responsabilidad del gt29, p. 2.
5 ibídem, p. 9.
6 ibídem, pp. 12 y 13.
7 núñez garcía, josé l.: “responsabilidad y obligaciones del responsable y del encargado del tratamiento”, en rallo lombarte, artemi (dir.): tratado de protección de datos, tirant lo blanch, valencia, 2019, p. 355.
8 lópez álvarez, luis f.: “la responsabilidad del responsable”, en piñar mañas, josé l. (dir.): reglamento general de protección de datos. hacia un nuevo modelo europeo de privacidad, editorial reus, madrid, 2016, p. 291.
9 statement on the role of a risk-based approach in data protection legal frameworks, adopted on 30 May 2014.
10 vid. lópez álvarez, luis f.: “la responsabilidad del responsable”, ob. cit., p. 291.
11 The scheme proposed by sara lorenzo cabrera is followed: “posición jurídica de los intervinientes en el tratamiento de datos personales. medidas de cumplimiento”, en aa.vv.: protección de datos, responsabilidad activa y técnicas de garantía, ob. cit., pp. 126 y 127.
12 llácer matacás, maría rosa: la autodeterminación informativa en la sociedad de la vigilancia: ubiquitous computing, en llácer matacás, maría rosa (coord.): protección de datos personales en la sociedad de la información y la vigilancia, la ley, madrid, 2011, p. 89.
13 touriño, alejandro: el derecho al olvido y a la intimidad en internet, catarata, madrid, 2014, p. 23.
14 On this topic, used as an example because it is a very popular and well-known service, see, among others, lanier, j.: diez razones para borrar tus redes sociales de inmediato, editorial debate, barcelona, 2018.
15 miralles lópez, ramón: “protección de datos desde el diseño y por defecto (art. 25 rgpd. art. 28 lopdgdd)”, en lópez calvo, josé (coord.): la adaptación al nuevo marco de protección de datos tras el rgpd y la lopdgdd, wolters kluwer, segunda edición, madrid, 2019, p. 421.
16 cavoukian, ann: privacy by design. the 7 foundational principles. implementation and mapping of fair information practices, information and privacy commissioner of ontario, canadá, 2010. puede consultarse en: https://www.iab.org/wp-content/iab-uploads/2011/03/fred_carter.pdf.
17 ibídem.
18 megías terol, javier: “privacy by design, construcción de redes sociales garantes de la privacidad”, en rallo lombarte, artemi y martínez martínez, ricard (coords.): derecho y redes sociales, civitas, madrid, 2010, p. 320.
19 llácer matacás, maría rosa: la autodeterminación informativa en la sociedad de la vigilancia: ubiquitous computing, ob. cit., p. 90.
20 megías terol, javier: “privacy by design, construcción de redes sociales garantes de la privacidad”, ob. cit., p. 320.
21 miralles lópez, ramón: “protección de datos desde el diseño y por defecto (art. 25 rgpd. art. 28 lopdgdd)”, ob. cit., p. 424.
22 arenas ramiro, mónica: reforzando el ejercicio del derecho a la protección de datos personales, en rallo lombarte, artemi y garcía mahamut, rosario: hacia un nuevo derecho europeo de protección de datos, tirant lo blanch, valencia, 2015, p. 360.
23 Article 4(2) of the gdpr: "processing" means any operation or set of operations which is performed on personal data or on sets of personal data, whether or not by automated means, such as collection, recording, organisation, structuring, storage, adaptation or alteration, retrieval, consultation, use, disclosure by transmission, dissemination or otherwise making available, alignment or combination, restriction, erasure or destruction.
24 lorenzo cabrera, sara: “posición jurídica de los intervinientes en el tratamiento de datos personales. medidas de cumplimiento”, ob. cit., p. 149.
25 lesmes serrano, carlos (coord.): la ley de protección de datos. análisis y comentario de su jurisprudencia, ob. cit., p. 139.
26 artículo 77 lopdgdd.
27 artículo 26 rgpd.
28 On the challenges of data protection in the university context, some of which remain unresolved today, vid. martínez martínez, ricard: “la protección de datos en la universidad: retos para el 25 de mayo de 2018”, en la articulación de la gestión universitaria a debate, thomson reuters aranzadi, 2018.
29 The exception in Article 30(5) of the gdpr has been considered not to apply in this case.
30 The lopdgdd, in its Article 9, also regulates the special categories of data, although it makes no express reference to biometric data in particular, thereby omitting the long-standing proposal of the comisión de libertades e informática (cli), which pointed to the need, in a future reform of the lopd, to include clear references to genetic and biometric data in order to effectively protect the right to privacy. comisión de libertades e informática: “la necesidad de una reforma de la ley orgánica 15/1999 de protección de datos personales”, datospersonales.org, la revista de la agencia de protección de datos de la comunidad de madrid, núm. 15, 2005.
31 In the words of the Spanish Constitutional Court, this is one of the characteristic elements of the constitutional configuration of the right to informational self-determination, which allows the individual to specify which data about him or her may be obtained or processed, by whom and for what purposes. Legal ground 8, stc 292/2000, de 30 de noviembre.
32 On the legal consequences of profiling, vid. garriga domínguez, ana: “la elaboración de perfiles y su impacto en los derechos fundamentales. una primera aproximación a su regulación en el reglamento general de protección de datos de la unión europea”, derechos y libertades: revista del instituto bartolomé de las casas, núm. 38, 2018, pp. 107-139.
33 According to Article 4(14) of the gdpr, “biometric data” means personal data resulting from specific technical processing relating to the physical, physiological or behavioural characteristics of a natural person, which allow or confirm the unique identification of that natural person, such as facial images or dactyloscopic data.
34 According to Article 4(15) of the gdpr, “data concerning health” means personal data related to the physical or mental health of a natural person, including the provision of health care services, which reveal information about his or her health status.
35 agencia española de protección de datos (aepd): listas de tipos de tratamientos de datos que requieren evaluación de impacto relativa a protección de datos (art 35.4). https://www.aepd.es/media/criterios/listas-dpia-es-35-4.pdf. consultada el 3 de junio de 2019.
36 On some aspects of the data protection impact assessment, see davara fernández de marcos, elena: “la evaluación de impacto en protección de datos: aspectos de interés”, en actualidad administrativa, núm. 4, 2018.
37 Data protection legislation uses a broad concept of data concerning health; it is irrelevant whether the person concerned is in good or poor health. This category includes information on alcohol abuse or drug use, that is, any data with a clear and close relationship to health. In this respect, vid. Recommendation 5/1997, of 13 February 1997, of the Committee of Ministers of the Council of Europe to the member states on the protection of medical data. This broad interpretation of the concept of medical data has been endorsed by the cjeu in its judgment of 6 November 2003 (Lindqvist case).
38 Thus, the gdpr appears to include in this category only biometric data intended to identify a person uniquely, and not those that involve a certain degree of probability, which are still biometric data but are not classified as sensitive.
39 informe 0065/2015 aepd.
40 The workflow provided for in the guía práctica de análisis de riesgo en los tratamientos de datos personales sujetos al rgpd.
41 On this process, the guidance and safeguards for personal data anonymisation procedures (orientaciones y garantías en los procedimientos de anonimización de datos personales) published by the Spanish data protection agency are of interest, as is Opinion 05/2014 of the Article 29 Working Party on anonymisation techniques, of 10 April 2014. Likewise, in view of the current legislative framework, the Spanish data protection agency has published a document aimed at organisations undertaking anonymisation processes on data sets, “la k-anonimidad como medida de la privacidad”.
42 ver sentencia del tribunal constitucional 37/2011, de 28 de marzo de 2011; sentencia del tribunal supremo de 12 de enero de 2001, sala de lo civil.
43 ver de asís, r., sobre discapacidad y derechos, dykinson, madrid 2013.
44 ver real decreto 1971/1999, de 23 de diciembre, de procedimiento para el reconocimiento, declaración y calificación del grado de minusvalía.
45 Furthermore, it is worth recalling that advanced age is one of the criteria used to determine whether a person may be considered vulnerable in the field of criminal law. On this topic see perianes lozano, a. y alia ramos, m.j.: “medidas jurídicas de prevención y protección”, en personas mayores vulnerables: maltrato y abuso (dirs. c. ganzenmuller y carmen sánchez carazo), consejo general del poder judicial, centro de documentación judicial, 2009, disponible en: https://www.cermi.es/sites/default/files/docs/colecciones/personasmayor… .
46 On universal design and adjustments, see sentencia del tribunal constitucional 3/2018, de 22 de enero.
47 ver barranco, m.c., cuenca, p. y ramiro, m.a., “capacidad jurídica y discapacidad: el artículo 12 de la convención de derechos de las personas con discapacidad”, en anuario de la facultad de derecho, universidad de alcala, v, 2012, pp. 53 y ss.
48 san segundo manuel, t.: “cómo se detecta el maltrato”, en personas mayores vulnerables: maltrato y abuso, cit., disponible en https://www.cermi.es/sites/default/files/docs/colecciones/personasmayor…