Privacy

This article has been written by Viswajit Srinivasan pursuing Diploma in International Data Protection and Privacy Laws and edited by Shashwat Kaushik.

This article has been published by Sneha Mahawar.

Introduction 

Artificial intelligence and machine learning technologies are developing rapidly across virtually every sector of the global economy. As they do, privacy becomes all the more imperative. With technology evolving at a breakneck pace, the collection and processing of personal information, especially in the realm of emotion recognition, poses challenges in the ethical and privacy domains. Emotion recognition, a technique by which artificial intelligence identifies human emotional responses, is gaining traction in areas such as personalised environments, employment, security, marketing, and healthcare. This article delves into the intersection of data privacy and emotion recognition technologies and probes the advantages, disadvantages, and regulatory frameworks that shape this field.


What is emotion recognition

Emotion recognition technology is a rapidly evolving field that identifies and analyses human emotions through various methods, including facial recognition, voice analysis, and biometric sensors. Facial emotion recognition analyses sentiment from sources such as pictures and videos. Voice analysis detects even slight changes in a subject's nervous system, respiration, and muscle tension, which in turn affect how the voice is produced.

An overview of data privacy and emotion recognition technology

The emotion recognition technology market is estimated to be worth almost $56 billion by 2024, which shows how widely the technology is applied in the contemporary world. It is not far-fetched to presume that various governments, corporate entities, and tech giants are subjecting us to this very technology at this minute. For instance, the Lucknow Police announced their intention to deploy emotion recognition technology to track expressions of "distress" on the faces of women who come within the range of AI-enabled surveillance cameras in public places. The cameras would automatically alert the nearest police station even before the alleged victim herself takes any action to report an issue.

How emotion recognition works

As discussed above, emotion recognition technology is employed in various sectors with the help of different emotion recognition mechanisms. The following are the primary emotion recognition methods:

Facial recognition

Facial expressions are a form of non-verbal communication captured from sources such as pictures and videos. Facial emotion recognition typically involves three steps: (a) face detection; (b) facial expression detection; and (c) classification of the expression into an emotional state. Depending on the algorithm, facial expressions are classified into basic emotions (e.g., anger, disgust, fear, joy, sadness, and surprise).
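As a rough illustration of how these three steps fit together, the Python sketch below uses OpenCV's bundled Haar cascade for face detection; the classify_expression function is a hypothetical placeholder for whatever trained expression model a vendor would supply, not the API of any specific product.

```python
# Minimal sketch of the detect -> classify pipeline described above.
# Assumes OpenCV (cv2) is installed; classify_expression is a hypothetical
# placeholder for a trained expression classifier.
import cv2

BASIC_EMOTIONS = ["anger", "disgust", "fear", "joy", "sadness", "surprise"]

# Step (a): face detection using OpenCV's bundled Haar cascade.
face_detector = cv2.CascadeClassifier(
    cv2.data.haarcascades + "haarcascade_frontalface_default.xml"
)

def classify_expression(face_pixels):
    """Hypothetical classifier: a real system would run a trained model here
    and return a probability for each basic emotion."""
    return {emotion: 1.0 / len(BASIC_EMOTIONS) for emotion in BASIC_EMOTIONS}

def recognise_emotions(image_path):
    image = cv2.imread(image_path)
    grey = cv2.cvtColor(image, cv2.COLOR_BGR2GRAY)
    labels = []
    # Steps (b) and (c): crop each detected face and classify its expression.
    for (x, y, w, h) in face_detector.detectMultiScale(grey, 1.1, 5):
        face = grey[y:y + h, x:x + w]
        scores = classify_expression(face)
        labels.append(max(scores, key=scores.get))
    return labels
```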

Voice analysis

By analysing the tone, pitch, and cadence of a person's voice, this method can detect emotions such as joy, frustration, and fear. With devices listening everywhere we go, privacy concerns are inherent in the advancing technology. A typical example is a mobile application or virtual assistant designed to recognise the user's emotions in real time and adapt to their mood.
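To make this concrete, the sketch below (assuming the open-source librosa library) extracts a pitch contour and loudness from a recording; infer_mood is a hypothetical rule-of-thumb stand-in for the trained classifier a real application or virtual assistant would use.

```python
# Sketch of voice-based emotion features, assuming the librosa library.
# infer_mood is a hypothetical rule of thumb, not a real API.
import librosa
import numpy as np

def extract_voice_features(audio_path):
    y, sr = librosa.load(audio_path)
    # Pitch contour (fundamental frequency) and loudness (RMS energy):
    f0 = librosa.yin(y, fmin=65.0, fmax=400.0, sr=sr)
    rms = librosa.feature.rms(y=y)[0]
    return {
        "mean_pitch_hz": float(np.mean(f0)),
        "pitch_variability": float(np.std(f0)),  # rough proxy for intonation/cadence
        "mean_loudness": float(np.mean(rms)),
    }

def infer_mood(features):
    """Hypothetical threshold; a real system would use a trained model."""
    return "agitated" if features["pitch_variability"] > 40 else "calm"
```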

Biometric sensors

This is by far the most accurate method of recognition, as it relies on direct physiological signals, such as heart rate and skin conductance, to infer emotional states. Combining the other two methods with biometric signals improves accuracy and decreases the chances of wrong inferences.
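Purely as an illustration (not drawn from any particular product), the sketch below shows one simple way such a combination could work: late fusion of per-modality emotion scores, with the biometric channel weighted more heavily because it reflects direct physiological signals.

```python
# Illustrative late fusion of per-modality emotion scores. The weights and
# emotion set are assumptions for this sketch, not values from any product.
import numpy as np

EMOTIONS = ["anger", "fear", "joy", "sadness"]

def fuse_modalities(face_scores, voice_scores, biometric_scores,
                    weights=(0.25, 0.25, 0.5)):
    """Each argument is a probability vector over EMOTIONS; the biometric
    channel gets the largest weight."""
    stacked = np.vstack([face_scores, voice_scores, biometric_scores])
    fused = np.average(stacked, axis=0, weights=weights)
    return EMOTIONS[int(np.argmax(fused))], fused

# Example: the biometric channel pulls the fused decision towards "fear".
label, fused = fuse_modalities([0.1, 0.2, 0.6, 0.1],
                               [0.2, 0.3, 0.4, 0.1],
                               [0.1, 0.7, 0.1, 0.1])
```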

Applications of emotion recognition technology

This technology is employed by various industries and has a plethora of applications. We will discuss some of the prominent industries that use this technology.

Marketing and advertising

Marketers use it to analyse customers' emotions while they shop for particular goods or respond to how those goods are arranged, and to capture a customer's reaction to a particular product for targeted advertising.

Healthcare

It can help assess and monitor mental health conditions, offering a more nuanced understanding of a patient's emotional well-being. It can also aid in detecting autism or neurodegenerative diseases, predicting psychotic disorders or depression, and, above all, supporting suicide prevention.

Education

Emotion recognition technology can be employed to monitor student behaviour and engagement in class, both of which shape the learning experience.

Security

Security covers both public safety and crime detection. The technology is used as a lie detector in border control and for predictive screening in public places to identify emotional triggers. It also aids crime detection by monitoring changes in emotion.

Data privacy concerns

Emotion recognition technology is groundbreaking and has immense potential, yet, however effective it may be in many scenarios, it comes bundled with significant privacy and data protection concerns. The collection and processing of personal data, especially emotional data, raise ethical and regulatory concerns that cannot be neglected. One evident concern is bias, which can result from the design and training of the platform or from the inadequate data set used to develop the technology.

Data protection

The storage and transmission of emotional data must be secure to prevent unauthorised access. Emotional data would fall under the category of sensitive personal data, which, if leaked or breached, could lead to a violation of privacy, i.e., of confidentiality, integrity, and availability. Recently, the mental health records of almost 2,000 students, including 60 current students of an educational institution in Los Angeles, were uploaded to the dark web after the institution fell victim to a ransomware attack.
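A minimal sketch of one common safeguard, encryption of emotional data at rest, is shown below, assuming the open-source Python cryptography package; real deployments would also need proper key management, access controls and secure transport.

```python
# Sketch of encrypting emotional data at rest, assuming the open-source
# `cryptography` package. Key management (secure storage, rotation, access
# control) is out of scope here but essential in practice.
import json
from cryptography.fernet import Fernet

key = Fernet.generate_key()          # in production, load from a key vault
cipher = Fernet(key)

record = {"subject_id": "pseudonymised-123", "inferred_emotion": "distress"}
token = cipher.encrypt(json.dumps(record).encode("utf-8"))

# Only holders of the key can recover the plaintext record:
restored = json.loads(cipher.decrypt(token).decode("utf-8"))
```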

Necessity and proportionality

Converting human emotional expression into data to increase sales or improve the user experience is quite often not well received by society. The use of such technology ought to be balanced against the principles of necessity and proportionality. Failing to adopt such safeguards often leads to the exploitation of data subjects (data principals) and the violation of privacy rights. Proportionality, moreover, depends on the types of data collected, how long the data is retained and any potential further processing.

Accuracy and fairness

Emotion recognition technology produces data that, in isolation, may not be accurate. Emotional expressions can often be feigned, leading to inaccurate inferences. Further, emotion recognition technology relies on basic emotion theory, a set of non-scientific assumptions claiming that facial expressions correspond to internal emotions and that such emotions are distinct and consistent across all cultures.

It is also widely understood that societal and cultural differences have a prominent impact on facial expression, which in turn affects accuracy. For instance, a person from a particular ethnicity might have a facial expression that the system maps to a particular emotion and so be wrongfully profiled as angry. Unless adequate and diverse data is used to train the technology, inaccurate and unfair inferences remain possible.
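One straightforward way to surface this kind of skew is to audit accuracy per demographic group on a labelled evaluation set; the sketch below assumes pandas and a hypothetical column layout, not any standard schema.

```python
# Illustrative fairness audit: compare accuracy across demographic groups in
# a labelled evaluation set. The column names are assumptions for this sketch.
import pandas as pd

def accuracy_by_group(df: pd.DataFrame) -> pd.Series:
    """df needs columns: 'group', 'true_emotion', 'predicted_emotion'."""
    correct = df["true_emotion"] == df["predicted_emotion"]
    return correct.groupby(df["group"]).mean()

# A large gap between groups (e.g., 0.90 vs 0.60) signals the culturally
# skewed training data discussed above.
```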

A peculiar situation in which emotion recognition technology might fail is when it tries to infer the emotions of a person diagnosed with alexithymia, a condition in which a person has difficulty identifying and describing the emotions they are experiencing.

Surveillance and privacy

Thanks to the ubiquity of cameras and microphones, we are constantly under surveillance of which we are unaware. These devices capture not only facial expressions but also a person's voice. This raises serious concerns about transparency in the collection and processing of such sensitive data. The subjects of such surveillance remain blissfully unaware of the purpose and means of processing their emotional data, which amounts to processing personal data without a legal basis recognised by data protection legislation.

Regulatory framework

The General Data Protection Regulation (GDPR)

The GDPR, which took effect across the European Union (EU) in 2018, is one of the most comprehensive data protection regulations globally. It requires transparent data processing and informed consent, and grants the right to be forgotten. Companies whose emotion recognition technology processes the emotional data of EU residents must comply with the Regulation. It is pertinent to mention that, despite the sensitive nature of emotional data, the GDPR does not categorise it as special category data, resulting in a lack of comprehensive protection.

The Digital Personal Data Protection Act, 2023

The DPDPA received presidential assent on August 11, 2023, marking the birth of a novel regulation governing digital personal data. However, the legislation is silent on sensitive personal data and biometric data. The Indian instrument that addresses this concern is the Information Technology (Reasonable Security Practices and Procedures and Sensitive Personal Data or Information) Rules, 2011. Rule 2(1)(b) of those Rules defines "biometrics" as the technologies that measure and analyse human body characteristics, such as fingerprints, eye retinas and irises, voice patterns, facial patterns, hand measurements and DNA, for authentication purposes. Additionally, Section 43A of the Information Technology Act, 2000 merely prescribes a relatively feeble requirement for private data controllers to maintain "reasonable security practices and procedures."

Ethical AI guidelines

Organisations such as the IEEE (Institute of Electrical and Electronics Engineers) and the European Commission, through its AI ethics guidelines, have released guidance for ethical AI development. These guidelines emphasise transparency, fairness, and accountability in the development and deployment of emotion recognition technology.

Conclusion

To conclude, emotion recognition technology has immense potential and can be utilised across various industries. It is deployed to infer human emotion in real time and adapt the user experience accordingly. However, the collection and processing of emotional data raise significant data privacy and ethical concerns. The development of such technology ought to be guided by an ethical and regulatory framework that encompasses informed consent, data security, fairness and accountability.
