“Privacy is power. What people don’t know, they can’t ruin.”
The human right to privacy (ECHR Article 8) incorporates the concepts of individual autonomy, personal freedom and self-determination. When we are able to manage our identity and how we present ourselves to the world, we are empowered. Being able to exercise control over information about us is a fundamental part of this power.
As the adoption of revolutionary and undoubtedly useful technology increases, so too do the privacy risks associated with the collection of vast data sets and the use of artificial intelligence, invisible processing, profiling and automated decision making.
Safeguarding citizen information is incredibly important, particularly information that falls within the special categories of (sensitive) personal data, such as details of sexual orientation or sexual life.
As a special category of personal data, information about our sexual orientation or sexual life can be susceptible to misuse, discrimination and harm if not properly protected. This is made clear by the inclusion of sexual orientation as a protected characteristic under the Equality Act 2010. Below I have outlined some of the risks associated with data protection and sexual orientation:
As our world is increasingly digital, we risk being reduced to a digital profile which can be dehumanising. These profiles can be used to discriminate and marginalise individuals, including in relation to their sexual life or gender identity. Particularly in the world of Artificial Intelligence, there is a significant risk of allocative harm affecting people from specific communities and groups. We have been working with our customers to complete Algorithmic Risk Assessments to prevent bias and discrimination. We hope to see these becoming more commonplace for organisations using ‘socially consequential’ technology, in particular.
Big Data, Big Risks
The modern existence of vast data sets and varied and expanding technological means of surveillance and profiling, including the arbitrary use of facial recognition technology, poses its own risks. Alongside the issues around reduced anonymity, as we collect and share more information about citizens' identities and orientations, the risk of data breaches increases. This is particularly key at the moment: the conflict in Ukraine has placed the UK on high alert for state-sponsored attacks, with cyber-attackers seeking to pollute training data, attack algorithms and disrupt national services.
“It is not data that is being exploited,
It is people.”
The application of data analysis techniques, algorithms, or machine learning processing can make logical connections and draw conclusions from existing data. These processes can create new information that is derived or deduced from other data, rather than being explicitly provided or directly observed.
Where this derived data is not known to the individual, there is no opportunity to object to it or correct it, and its use could result in the unintended disclosure of information relating to sexual life or orientation.
Respect and Protect
Despite the challenges, there are steps that can be taken to protect citizens' sexual lives and other sensitive data in the digital age:
Data Minimisation: Organisations should practice data minimisation, collecting the minimum amount of data necessary to provide their services. They should avoid collecting unnecessary information, especially regarding sensitive attributes like sexual orientation or anything that may act as a proxy (online behaviour, social media activity or group affiliations). Data should also be subject to a strict retention schedule.
Data Encryption: All data that companies hold should be encrypted both in transit and at rest, so that even if a company is hacked or shared data is intercepted, it remains unreadable. The level of encryption should match the risk associated with the data.
Transparency: Companies should ensure that they are open and transparent about any information being collected, how it is used and who it is shared with. This should include information about profiling or inferred data, providing citizens with an opportunity to raise objections or correct information.
Anonymisation and Pseudonymisation: Personal data, especially sensitive information, should be anonymised or pseudonymised whenever possible to protect individuals’ identities.
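To make the first and last of these measures concrete, here is a minimal sketch of data minimisation and pseudonymisation in Python. The field names, the allow-list and the key handling are illustrative assumptions, not a prescribed implementation; a real system would manage the secret key in a separate, access-controlled store.

```python
import hmac
import hashlib

# Hypothetical allow-list: only the fields genuinely needed to provide
# the service are retained (data minimisation).
ALLOWED_FIELDS = {"user_id", "email", "postcode_area"}

def minimise(record: dict) -> dict:
    """Drop any field not on the explicit allow-list."""
    return {k: v for k, v in record.items() if k in ALLOWED_FIELDS}

def pseudonymise(identifier: str, secret_key: bytes) -> str:
    """Replace a direct identifier with a keyed hash (pseudonymisation).

    Without the separately stored key, the original identifier cannot
    be recovered from the stored value.
    """
    return hmac.new(secret_key, identifier.encode(), hashlib.sha256).hexdigest()

# Example record containing a sensitive field that should never be kept.
raw = {
    "user_id": "u123",
    "email": "jo@example.com",
    "postcode_area": "BN1",
    "sexual_orientation": "...",  # sensitive - discarded by minimise()
}

safe = minimise(raw)
safe["user_id"] = pseudonymise(safe["user_id"], b"server-side-secret-key")
```

Note that pseudonymised data is still personal data under UK GDPR, because the key holder can re-identify individuals; only full anonymisation takes data out of scope.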
Here at Kafico we work hard to help our clients implement data protection measures, working towards compliance with legislation while creating a safer and more inclusive environment, where everyone's privacy is respected and protected.
If there were any topics discussed in this article that you’d like to discuss further, or you’d like to learn more about how we at Kafico can help your organisation or business, please email us at email@example.com
Maddy, Governance Consultant.