Emma Cooper, LLM Information Rights Law and Practice

5. The Power Imbalance
When the civil society group Big Brother Watch lobbied the United Nations Special Rapporteur on extreme poverty prior to his visit to the United Kingdom in 2018[1], they cited the absence of transparency in relation to the use of ADM, in the case of RBV and other initiatives by public authorities, as a concern. They complained that ‘vulnerable citizens are not … afforded an opportunity to consent to their data being shared or accessed and have no way of knowing exactly how, when or why their personal information is being used’[2].
In Denmark, the Gladsaxe model was also met with strong public, political and academic criticism when a national newspaper exposed the intention of three local authorities to apply the automated points-based approach. The incredulity is evident in the colloquial framing: if a single parent missed a dental appointment for their child, their family life might be investigated by the state[3].
Le Grand and New suggest that paternalism arises where the government intervenes in the lives of citizens, seeking to address their failures of judgement for their own good. When perceived as excessive, these interventions are regarded as being symptomatic of a ‘Nanny State’: a government that ‘tries to give too much advice or make too many laws about how people should live their lives’[4].
Given the nature of ADM deployment by public authorities – spanning health, policing, welfare and troubled families – it seems conceivable that profiling and automated decision making used as part of a paternalistic democracy are more likely than their private sector counterparts to engage all four interests specifically stated in Article 8 – private life, family life, home and correspondence.
The risk is compounded by the availability of vast data sets and the existence of varied and expanding technological means of surveillance and profiling, including wholesale interception of and access to communications, arbitrary use of facial recognition technology and the indiscriminate tracking and surveillance of demonstrators through the use of mobile phones[5].
Modern engagement with social media has been characterised by some as a form of ‘voluntary servitude’ where personal data is knowingly surrendered as a kind of ‘entry fee’ to online society[6]. Citizens, on the other hand, have been labelled a ‘captive clientele’: those unhappy with a service being provided by public authorities are not in a position to use a different provider[7] and have ‘no realistic alternatives’ to accepting the processing carried out by those in authority[8].
The chilling effect resulting from surveillance is conceivably intensified when encountered as a result of public authority surveillance. Here, how we are perceived and profiled has the potential to impact our financial welfare, our family and our future opportunities rather than simply which advertisements we might be presented with as we browse online.
Outcomes of decisions made or supported by these automated systems can deprive citizens of welfare, determine custodial sentences, suppress political demonstration or award citizenship, and the ‘clear imbalance between the data subject and the controller’[9] manifestly renders citizens a vulnerable group[10] who require particular consideration when it comes to the impact of such technology.
Public authorities in democratic nations are increasingly introducing policies that seek to amend the behaviours of citizens, driven by a greater understanding of the cost of those behaviours for both the citizen and society[11].
The recent COVID-19 pandemic has raised early questions around the ethical and long-term impact of the use of AI and surveillance technologies that ‘amass personal data and share for community control and citizen safety motivations’[12].
The idea that the erosion of public expectations of liberty, in the wake of significant events, is part of a mechanism to acquire wealth or power has been termed, somewhat conspiratorially, the ‘Shock Doctrine’[13]. However, initiatives like Test and Trace[14] certainly affect family life, liberty and autonomy, and discussions are already taking place about the potential for them to feature in society beyond any state of emergency[15].
In considering whether Article 8 is engaged, the courts will often consider whether the matter at hand would be ‘highly offensive to a reasonable person of ordinary sensibilities’ and whether the person should have expected to enjoy privacy in the circumstances. The concern raised by the human rights group Liberty is the potential for ‘normalisation’ of technologies that can exercise control and limit freedom[16].
6. GDPR Article 22
Initially this chapter will draw out some of the contention surrounding certain elements of the General Data Protection Regulation, intending to demonstrate the ambiguity of Article 22 and therefore the potential weakness of the Regulation in providing safeguards for the fundamental rights of individuals impacted by automated decision making.
The remainder of the chapter will be limited to identifying what duties and safeguards the regulations offer to protect the rights of citizens subject to automated decisions rendered by public authorities more specifically. This will naturally take the discussion outside of the wider debate around Article 22 due to the presence of an exemption that applies to processing that is ‘authorised by law’.
It is noted that the GDPR has been retained in UK law and the ‘UK GDPR’ operates in combination with an amended Data Protection Act 2018[17]. The text may refer to the Data Protection Act 2018 (DPA 2018) specifically where appropriate to highlight any divergence.
In October 2017, the Article 29 Working Party (now the European Data Protection Board) adopted Guidelines on GDPR Article 22[18], which concern profiling and automated decision making and are intended to ‘address’ the risks posed by these activities[19].
The General Data Protection Regulation[20] (GDPR) sparked much debate about how effective the Regulation would be in offering safeguards to protect the rights and freedoms of data subjects (individuals whose data is being processed in the use of ADM) and to what extent duties for Controllers (those exercising control over the purpose and manner of processing) were clear[21].
GDPR Article 22 provides that ‘The data subject shall have the right not to be subject to a decision based solely on automated processing, including profiling, which produces legal effects concerning him or her or similarly significantly affects him or her’.
Articles 13 and 14 of the GDPR require the provision of ex ante information by the Controller about ‘the existence of automated decision-making, including profiling… and, at least in those cases, meaningful information about the logic involved, as well as the significance and the envisaged consequences’ and, under Article 15, the data subject may request access to the same.
The presence of a number of qualifiers or conditions within the regulations underpins the debate as to whether the provisions of the GDPR collectively offer effective duties and safeguards in relation to profiling and automated decision making.
Whilst there was some initial debate about whether the ‘right not to be subject’ to automated decision making requires individuals to actively engage such a right[22], the Article 29 Working Party (now the European Data Protection Board) guidance helpfully clarifies that the provision amounts to a general prohibition where such processing is solely automated and produces significant legal effects[23].
GDPR Recital 71 provides that ‘In any case, such processing should be subject to suitable safeguards, which should include specific information to the data subject and the right to obtain human intervention, to express his or her point of view, to obtain an explanation of the decision reached after such assessment and to challenge the decision’[24].
Although they argue that appropriate duties and safeguards are indeed present, Gianclaudio Malgieri and Giovanni Comandé abridge the concerns succinctly as:
‘businesses processing personal data having just a general duty to disclose information on the general functionality of profiling algorithms, in very limited cases (only in absolute absence of human intervention in the decision-making and only when it produces legal effects or very relevant similar effects) and anyway just as long as it does not affect trade secrets of data controllers’[25].
Contrary to the above sentiment, Malgieri and Comandé argue that suitable safeguards and duties are found if those articles and recitals are applied systematically. They contend that ‘significant legal effects’ can be interpreted broadly, including effects that might otherwise have been excluded as not meeting this threshold[26]. They assert that the ‘right not to be subject to solely automated decision making’ can be given substance by interpreting the requirement for human involvement to mean more than ‘nominal’ intervention[27].
Sandra Wachter et al asserted that the absence of precise language, combined with the need to sew together an explanation right from different articles and recitals, renders the regulations at best ambiguous and at worst at risk of being ‘toothless’[28].
Regardless of one’s position in the above debate, the Regulations are indisputably less reassuring for citizens subject to automated processing performed by public authorities. This is because the general prohibition does not apply if the decision is ‘authorised by law’[29].
The UK Information Commissioner clarifies that ‘If you have a statutory or common law power to do something, and automated decision-making/profiling is the most appropriate way to achieve your purpose, then you may be able to justify this type of processing as authorised by law and rely on Article 22(2)(b)’[30]. The caveat is then offered that Controllers must demonstrate that it is ‘reasonable to do so in all the circumstances’.
Big Brother Watch complained about the dilution of the safeguards found in the EU GDPR through the enactment of DPA 2018, asserting that, despite their lobbying during passage, the 2018 Act does not provide comparable safeguards[31].
When automated decision making is not prohibited by Article 22(1), because it is authorised by law, Article 22(2)(b) requires that there are ‘suitable measures to safeguard the data subject’s rights and freedoms and legitimate interests’. This provision can be read in an unrestricted way, as requiring exploration of the rights set out in the European Charter, for example[32].
The DPA 2018 does not include this reference and ss 50(2) and (3) require merely that Controllers make the individual aware of automated decisions, after the fact, and allow one month for challenge. It does certainly appear that the qualifying elements of Art 22(2)(b) have lost some potency in the enactment of the DPA 2018.
Big Brother Watch also complained that the human involvement applied by case workers overseeing RBV appeared tokenistic: the authorities had suggested the prohibition did not apply to RBV because the system was merely ‘advisory’, when Big Brother Watch scrutiny revealed the system appeared to be regarded as ‘decisive’ in nature[33].
It is more likely that the prohibition simply does not apply because activities related to welfare payments and fraud reduction are governed by UK law, and systems involving solely automated decision making and profiling that result in significant legal effects, when ‘authorised by law’, are not subject to the prohibition of Article 22(1) and are expressly permitted under Article 22(2)(b) and s 49(1) DPA 2018.
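To make the structure of that test explicit, the following minimal sketch (in Python) encodes this chapter’s reading of when the Article 22(1) prohibition bites. It is an interpretive simplification for illustration, not a statement of law, and each boolean flag stands in for what is in reality a contested legal question.

```python
# A minimal sketch of the Article 22 applicability logic as described above.
# Interpretive simplification only: each flag stands in for a contested legal test.
def art22_prohibition_applies(solely_automated, significant_effect,
                              necessary_for_contract=False,
                              authorised_by_law=False,
                              explicit_consent=False):
    # Article 22(1): the general prohibition is only engaged by solely automated
    # decisions producing legal or similarly significant effects.
    if not (solely_automated and significant_effect):
        return False
    # Article 22(2)(a)-(c): contract necessity, authorisation by law, explicit consent.
    if necessary_for_contract or authorised_by_law or explicit_consent:
        return False
    return True

# Welfare ADM of the kind discussed above: solely automated, significant effects,
# but 'authorised by law', so the prohibition does not apply.
print(art22_prohibition_applies(True, True, authorised_by_law=True))  # False
```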
The imprecise language of the ICO guidance, and the dilution in the DPA 2018 itself, fall short of the comparable Article 22(2)(b) safeguard, referring to internal justifications of purpose rather than wider consideration of the ‘data subject’s rights and freedoms and legitimate interests’[34].
In summary, the majority of ADM deployments by UK public authorities would appear not to be captured by the Art 22 prohibition since they are largely linked to statutory duties, meaning that:
- There is no prohibition on citizens being subject to solely automated processing producing significant effects.
- There is no requirement for routine human intervention as part of general system operation.
The apparent safeguards provided by the DPA 2018 are:
- Public bodies are required to provide ex ante information about the existence, logic and consequences of the processing.
- Public authorities must justify the processing as an appropriate and reasonable way to achieve the purpose.
- Citizens are entitled to seek and receive human intervention ex post.
- Citizens can challenge the decision ex post.
- Citizens can request a new decision that is not automated ex post.
Essentially, UK public authorities are provided with the imprecise notion of a requirement to ‘justify’ the processing without an obvious prerequisite to consider the impacts on the rights and freedoms of individuals in a broader sense.
Under DPA 2018 s 50 (2)(b)(ii), individuals have the opportunity to challenge and may request that the Controller reconsider or ‘take a new decision that is not based solely on automated processing’. However, there is no requirement to subject any justification to external scrutiny which would allow for challenge and public participation.
As lamented by Algorithm Watch, despite the socially consequential nature of the use of ADM by public authorities, transparency rules within the legislation ‘do not include mechanisms for an external deep look into the ADM systems necessary to protect group-related and societal interests such as non-discrimination, participation or pluralism’[35].
The burden for redress in the event of abuse or malfunction of these systems appears to be largely on the citizen; to seek human intervention, raise a challenge or request a new decision.
Swee Leng Harris proposes that the Data Protection Impact Assessment (DPIA) might be considered a remedy[36]. S 64 DPA 2018 requires that Controllers identify the purposes and anticipated consequences of processing and make an ‘assessment of the risks to the rights and freedoms of data subjects’.
In setting out the proposal, Harris draws on Janssen’s position that ‘rights and freedoms’ should be read as referring to the rights set out in the European Charter rather than just privacy rights as suggested by the ICO guidelines on completion of DPIAs[37].
Harris also suggests that DPIAs could be integrated with assessments of the impact on equality, given the overlap with the Public Sector Equality Duty under s 149 (discussed further in Chapter 7), optimistically offering that, by combining the activities, ‘proper equality analysis of the potential for direct and indirect discrimination could be informed by technical information on the data processing, and assessment of the impact of the data processing on rights and freedoms could be informed by expertise in equality’.
However, as the following chapters will suggest, locating the requisite expertise within the public sector and then encouraging collaboration is another challenge entirely.
Harris provides a compelling illustration of how ‘to improve the conformity of government data processing systems with rule of law principles’ through systematic assessment of human rights implications and environmental information regulation-inspired transparency and engagement[38].
The key point made by Harris is the importance of publishing such assessments, which is not currently a requirement of the UK GDPR. Harris points out that, despite public authorities already being subject to frameworks that mandate transparency, ‘the use and operation of data processing systems by government is not transparent at present’[39].
7. The Rule of Law
In his 2013 speech, the Attorney General described his disquiet at observing that ‘some countries publicly proclaim adherence to the Rule of Law and Human Rights, whilst at the same time eroding those very same standards behind the cover of legislative processes’[40].
Accessibility is one of the principles set out by Lord Bingham as underpinning the core principle of the rule of law, which he defines as: ‘all persons and authorities within the state, whether public or private, should be bound by and entitled to the benefit of laws publicly and prospectively promulgated and publicly administered in the courts’[41].
In summary, the principles are:
1. The law must be accessible, understandable and predictable.
2. Questions of legal right and liability should be resolved by the application of the law and not discretion.
3. The laws should apply equally to all.
4. Ministers and public officials must not exceed the limits of their powers and exercise them in good faith.
5. The law must afford adequate protection of fundamental Human Rights.
6. The state must provide routes for resolving disputes which the parties cannot resolve themselves.
7. State adjudicative procedures should be fair.
8. The rule of law requires state compliance with international as well as national laws[42].
During the drafting of the Equality Act 2010, ‘Framework for a Fairer Future’ described the important role that public sector organisations play in promoting equality in society[43]. The paper notes that public sector bodies are in a prime position to effect change from the position of employer, commissioner and procurer of services for citizens. The Equality Act 2010 placed positive duties on public bodies that ‘focus on the way their spending decisions, employment practices and service delivery affect local people whatever their race, disability or gender’[44].
The Equality Act 2010, which consolidated various earlier equality Acts, was intended to clarify equalities law and strengthen protections; however, the passing of the Act removed the explicit requirement for public bodies to publish ‘equality analysis, information, and engagement’[45].
Some community and voluntary groups understandably raised concerns that proper assessment of equality impacts would not be undertaken as a result. The government disagreed, asserting that the implicit Public Sector Duty under s 149 would be sufficient[46]. S 149 requires that, inter alia, authorities have due regard to eliminating ‘discrimination, harassment [and] victimisation’[47].
Scholars have positioned equalities law as a route by which the impact of ADM on fundamental rights might be managed. Since algorithmic decision making is a rule-based process, Cloisters place it squarely within the meaning of ‘provision, criterion or practice’ (PCP) and therefore within section 19(1) of the Equality Act 2010. This means that activities that result in Prohibited Conduct[48] (direct discrimination, indirect discrimination, victimisation and harassment) fall foul of the legislation, insofar as the Protected Characteristics under Section 4 are concerned[49].
Many ADMs are applied unilaterally[50] and so will ostensibly satisfy Section 19(2)(a). However, Section 19(2)(b) and (c) require assessment as to whether particular individuals having a protected characteristic are put at a disadvantage and then, further, whether that disadvantage is permissible under s 19(2)(d) because it is ‘a proportionate means of achieving a legitimate aim’.
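To illustrate the shape of the group-level comparison that the s 19(2)(b) and (c) assessment implies, the following minimal Python sketch compares adverse-outcome rates across a protected characteristic. The field names, outcomes and figures are hypothetical and do not describe any real public sector system; a genuine assessment would, of course, also address justification under s 19(2)(d).

```python
# Illustrative sketch only: a minimal group-level disadvantage check.
# Field names and records are hypothetical.
from collections import defaultdict

def disadvantage_rates(decisions, characteristic):
    """Rate at which each group sharing `characteristic` receives the adverse outcome."""
    totals, adverse = defaultdict(int), defaultdict(int)
    for record in decisions:
        group = record[characteristic]
        totals[group] += 1
        if record["outcome"] == "adverse":  # e.g. flagged for manual fraud review
            adverse[group] += 1
    return {group: adverse[group] / totals[group] for group in totals}

# Invented records, purely to show the shape of the comparison.
decisions = [
    {"sex": "F", "outcome": "adverse"},
    {"sex": "F", "outcome": "favourable"},
    {"sex": "M", "outcome": "favourable"},
    {"sex": "M", "outcome": "favourable"},
]
print(disadvantage_rates(decisions, "sex"))  # {'F': 0.5, 'M': 0.0} -> prima facie group disadvantage
```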
Arguably, completion of ‘Equality Impact Assessments’ (EIAs) allows the public sector body to demonstrate the extent to which particular PCPs are ‘equal before the law’ in accordance with the Rule of Law and, certainly in the case of s 19(2)(d), that they are exercising their power ‘in good faith’[51].
Perhaps ironically, the importance of being able to ‘draft [EIAs] with openness and candour’ has been offered as the reason for not routinely publishing them[52]. Consequently, Big Brother Watch and Cloisters both found that, for the ADM models they researched, these equality assessments were generally not available for review, even where indirect discrimination was, prima facie, present[53].
In R (Hurley & Moore) v Secretary of State for Business, Innovation and Skills[54], Lord Justice Elias found that ‘[T]he duty of due regard under the statute requires public authorities to be properly informed before taking a decision. If the relevant material is not available, there will be a duty to acquire it’[55].
Seemingly contrary to this duty, in the cases where such assessments are available, they appear somewhat tokenistic, simply indicating that the application of the system to all claims means that no impact should be present[56]. For RBV, some of the published assessments appear to have been completed by finance managers[57], individuals unlikely to have knowledge of the potential for breaches of equalities law arising from complex machine learning, for example.
Despite the supposed duty placed upon the state to scrutinise PCPs (including the deployment of ADMs) prior to deployment, much like GDPR Article 22, the Equality Act 2010 places vague obligations on the state to make ‘assessments’ or give ‘due regard’ without requiring them to be externally scrutinised.
Although the courts have found that the Equality Act 2010 ‘imposes a heavy burden upon public authorities’[58] to effectively discharge their public sector equality duties and ensure that evidence is available, the law does not explicitly require that public authorities carry out Equality Impact Assessments, with the determination as to whether the Duty has been satisfied instead being made at Judicial Review[59].
The onus is again placed upon the citizen, barring actions brought by the Equality and Human Rights Commission[60], to obtain the necessary information to scrutinise the ADM such that public duty failures might be found, and then to seek redress through judicial review but, conceivably only after a violation and therefore potential harm, has occurred.
The rule of law, as Lord Bingham asserted, provides that ‘Questions of legal right and liability should ordinarily be resolved by the application of the law and not the exercise of discretion’. And yet the law itself affords considerable room for discretion to public authorities in the deployment of ADM systems, despite the risk posed to the fundamental rights of citizens.
Cloisters explored the ADM technology applied to the Settled Status Scheme implemented by the UK Home Office, whose intention was to streamline the management of applications from individuals wishing to remain in the UK post Brexit and to reduce fraud and error[61].
To verify that a person has been resident for five years, in accordance with the required threshold, the Home Office uses their National Insurance number to analyse DWP and Her Majesty’s Revenue and Customs (HMRC) data. Whilst the exact data and processing parameters are unclear, the decision uses some of the thirteen categories of data held by the DWP to make its assessment. The HMRC case worker will then use the data to make a final determination and, where required, supplementary information will be sought[62].
Cloisters highlighted, inter alia, the absence of an equalities assessment exploring the impact that the processing might have on women[63]. The system excluded child benefit or credit information, of which women are much more likely to be the recipients. This makes it possible that women could suffer the disadvantage of having to produce supplementary information more regularly, a process that is described as ‘extremely time consuming’ and that can make citizens ‘nervous’.
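Purely to illustrate how the exclusion of particular data categories can translate into such a differential burden, the following is a minimal sketch of a hypothetical rules-based residence check. The source names, record format and threshold are assumptions made for illustration and do not reflect the Home Office’s actual, unpublished, processing logic.

```python
# Hypothetical sketch of a rules-based residence check; not the real Home Office logic.
REQUIRED_YEARS = 5
AUTOMATED_SOURCES = {"paye", "self_assessment", "state_pension"}  # child benefit / tax credits omitted

def assess_residence(records):
    """Count the distinct years evidenced by the automated data sources alone."""
    evidenced_years = {record["year"] for record in records
                       if record["source"] in AUTOMATED_SOURCES}
    if len(evidenced_years) >= REQUIRED_YEARS:
        return "automatic pass"
    return "supplementary evidence required"  # the burden shifts to the applicant

# An applicant whose footprint in government data is mainly child benefit falls through:
applicant = [{"year": year, "source": "child_benefit"} for year in range(2016, 2021)]
print(assess_residence(applicant))  # supplementary evidence required
```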
The second Rule of Law principle laid out earlier in the chapter, that ‘questions of legal right and liability should be resolved by the application of the law and not discretion’, makes clear that decisions made in the exercise of statutory power should not be arbitrary. However, on being questioned in Parliament about the exclusion of Child Benefit / Credit data from the Settled Status scheme (affecting around 60,000 people), Caroline Nokes MP stated ‘It was simply not one of the functionalities included. There is no hidden reason’[64]. Moreover, it seems there was no reason at all.
ECHR provides that in managing human rights obligations, the state recognises a ‘fair balance that has to be struck between the competing interests of the individual and of the community as a whole’[65].
In Pretty v UK[66], for example, the notion of allowing a person to end their own life was weighed against the state’s positive obligation to ensure that the legislation provides suitable safeguards for the wider community. The decision found that permitting individuals to exercise such self-determination, without state interference, would fail to protect vulnerable individuals, including those not able to make informed decisions about ending their life[67].
In Bridges[68], South Wales Police argued that the public interest in the prevention of disorder or crime was a legitimate exception to their negative obligations, set against the countervailing argument of Mr Bridges’ privacy rights.
On appeal[69], deficiencies in the legal framework were highlighted: the data protection legislation and the Surveillance Camera Code of Practice[70] require only the presence of a law enforcement purpose and a determination that the surveillance is considered necessary.
The appeal judgement found that the scope for AFR had been set “impermissibly wide” allowing for too much discretion with regards to who was targeted and where the technology should be deployed[71].
The discussion put forward the need for a framework permitting less discretion in these areas, but the court disagreed with the 2015 dissenting judgement of Lord Kerr in Beghal v Director of Public Prosecutions[72]. Kerr asserted that authorities exercising self-restraint, where legal constraints are insufficient, is not enough to establish legality and that legality should be judged by its potential reach[73].
Instead, Lord Justice Singh said, ‘we consider that what must be examined is the particular interference with Article 8 rights which has arisen in this present case and in particular whether that interference is in accordance with the law[74]’. He drew on Munjaz v United Kingdom[75] to establish that locally drawn policies can offer sufficient clarity to protect individuals from interference with their Article 8 rights[76].
The judgement did not appear to benefit from any substantial weighing of a ‘pressing social need’[77] that might be found in considering privacy interests of the community or society as a whole and reflecting on the framework for legal application.
This predominantly ‘individualistic’ framing of privacy expectations is explored and challenged by Mead[78] whose study of UK Misuse of Private Information (MoPI) cases illuminates the propensity for decisions to focus more on wider social value of competing ECHR elements such as public safety, rarely recognising the value of privacy as a comparable social instrument, rather than purely a personal concern.
Encouragingly though, following a somewhat scathing response[79] to the appeal judgement from the Surveillance Camera Commissioner, who accused the Home Office and Secretary of State of being ‘asleep on the watch’ despite his having ‘fruitlessly and repeatedly been calling upon’ them to update the Surveillance Camera Code[80], a new code was published in 2020. The code includes safeguards intended to limit police discretion when considering who will be placed on a watchlist and where and when deployments are to take place. The code acknowledges that ‘state intrusion in such matters [FRT] is significantly increased by the capabilities of algorithms which are in essence’[81].
Data protection legislation, human rights legislation and equalities law all provide public authorities with tailor-made exemptions, caveated with abstract duties of assessment, justification and balancing exercises. This appears to deepen the power imbalance between citizen and state and risks a ‘system of oppression and tyranny camouflaged by what purports to be a legal framework’[82].
These self-justifications and echo-chamber assessments are carried out behind closed doors and generally only subject to external scrutiny in the courts. Mr Justice Sales is reported to have indicated that the courts are somewhat burdened by having to ‘spell out the practical content of the duty’[83] which again points to ambiguity and room for discretion by public authorities.
The application of paternalistic discretion can result in the arbitrary application[84] of privacy-inhibiting technologies across the private and public domains, resulting in individual intrusion that is often weighed against significant countervailing interests without consideration of collateral intrusion or societal privacy interests.
It has been rather optimistically suggested that a manifest advantage arising from the use of ADM systems in public sector is that they have placed authorities’ decision making into the public domain, shining a light on decisions that ‘up to now, had been taken far out of citizens’ sight’[85].
This suggestion, put forward in a study by the European Parliament[86], that the disquiet caused by widespread deployment of state sanctioned ADMs is somehow a ‘silver lining’ to the implied ‘cloud’ of entrenched state opacity, appears somewhat specious.
At the very least it appears to minimise the conceivable impact that these systems already have on ordinary citizens and, to the contrary, this paper argues that the responsibility placed on the state is characterised by guarded self-regulation and that the burden for intervention and scrutiny rests with vulnerable citizens.
As demonstrated by Harris’s suggested remedy, potential self-regulatory or regulatory remedies to the risks to fundamental rights posed by state sanctioned ADM rest on a requirement for public authorities to conform to the rule of law, which is manifestly absent under the existing framework.
The court also found that the public must not be ‘vulnerable to public officials acting on any personal whim, caprice, malice, predilection or purpose other than that for which the power was conferred’[87], and yet UK citizens have been impacted by the deployment of ADM systems whose scope for discretion was unreasonably wide[88], whose operators withheld meaningful information about the logic applied without comprehensible justification[89], whose Controllers disregarded advice and complaints about significant system errors[90] and whose impact assessments were tokenistic or incompetently drafted[91].
The following chapters will consider whether non-conformance with the Rule of Law, rather than being the result of ambiguous legislation, complex technology or the absence of a framework within which to operate, is the result of an entrenched culture or attitude that is unique to UK public authorities.
8. Opacity in Public Authority Use of ADM
Aside from accessibility and understandability being a key principle in the Rule of Law, it is indisputable that being able to comprehend how an ADM system works is an important part of managing the risk that decisions pose to citizens[92].
The EU Parliament Report “Understanding algorithmic decision-making: Opportunities and challenges”[93] explores a number of interchangeable and overlapping terms in relation to how stakeholders might achieve such understanding.
The report defines ‘understandability’ as ‘the possibility to provide understandable information about the link between the input and the output of the ADS’, comprising Transparency and Explainability[94].
Transparency is about the availability of information, such as the ADM code, to internal or external stakeholders and, perhaps, the public. Explainability requires the provision of information beyond the ADS itself, ideally tailored to the recipient such that the information is meaningful[95].
The ICO provides that explanations around AI decisions fall into two categories: ‘process-based explanations which give you information on the governance of your AI system across its design and deployment; and outcome-based explanations which tell you what happened in the case of a particular decision’[96].
Meaningful explanations can help stakeholders improve the system, demonstrate compliance with regulations and allow individuals that are subject to its decisions to express their views[97].
Explainable ADM systems can support understanding of the ‘cause and effect relationship’, demonstrate compliance with regulatory frameworks and assist stakeholders to identify and correct any bias (See Competency and Authority) [98].
In 2016, Jenna Burrell identified three potential types of algorithmic opacity[99], which have since been explored and contextualised further by Jennifer Cobbe[100]:
· Technical Illiteracy / Illiterate Opacity, which occurs when systems are understandable only by those who can read and write computer code[101].
· Intentional Opacity, a method of self-protection by corporations seeking to protect trade secrets and competitive advantages[102].
· Intrinsic Opacity, which arises when even experts or system developers struggle to understand the system, given the complexity of the technology, particularly where machine learning is involved[103].
Subsequently, The Legal Education Foundation published the aforementioned Cloisters Joint Opinion, in which the authors describe the deletion of the FRT images by South Wales Police as another type of opacity to which they do not give a name[104].
Cloisters commend the attempt to minimise data collection and storage to comply with the data protection principles. As previously discussed, however, it creates problems with being able to assess how the system has behaved and how decisions were made, including any bias or discrimination that may have occurred[105].
This type of opacity could be referred to as:
· Compliance Opacity, where information that could support understanding of the system or its output is destroyed, deleted or unavailable in order to comply with data protection principles.
Also in 2016, Stohl et al discussed:
· Strategic Opacity, resulting from providing inappropriate or superfluous information such that information the actor wishes to conceal may be hidden in plain sight[106].
· Inadvertent Opacity, where the appropriate information is available ‘but it is rendered meaningless because of recipients’ cognitive limitations—or … information overload’[107].
Most recently, in 2021 Cobbe et al introduced the concept of:
· Unwitting Opacity, whereby it simply does not occur to stakeholders to record pertinent information about ADM processes because they are potentially ‘unaware of their relevance for meaningful accountability’[108].
In relation to their aforementioned lobbying of the Special Rapporteur for Extreme Poverty, Big Brother Watch undertook a Freedom of Information (FOI) campaign, ‘asking every local authority in the UK … for information about their uses of artificial intelligence (AI), algorithms and automated decision-making tools in the provision of their services’[109].
This involved sending many hundreds of initial requests and subsequent clarifying or refining requests[110]. Big Brother Watch identified the various obstacles they faced in obtaining the information they required, including Intentional Opacity (‘reluctance to disclose’) and some Illiterate Opacity (‘many replies asking for definitions’).
However, many authorities’ FOI Officers were ‘unfamiliar with the practice or even the concept of automated decisions’ and despite providing definitions and explanations, many claimed that no such technology was in place, only for Big Brother Watch to discover that they were in fact deployed[111].
This opacity appears, prima facie, to be neither strategic, intentional nor intrinsic. The issue here is not that the parties are unable to explain the machinations of ADM or are seeking to conceal its existence. Rather, they do not appear conscious of its existence, and there appear to have been no internal communications to ensure that staff, even those tasked solely with facilitating transparency (FOI Officers), are suitably prepared to provide the necessary information. This scenario appears to conflict with the well-established transparency obligations and processes for public sector bodies[112].
There could be an argument that the opacity is unwitting, that the authorities simply did not realise the importance of making the information available. Big Brother Watch observed, however, that key information was often scattered across different departments or held by private companies rendering it unavailable. This, combined with an alleged ‘wilful blindness’ and ‘apathy’ towards the detail of the system[113] and the impact on the rights of citizens suggests something altogether different.
The results of the Big Brother Watch campaign could point to a type of opacity, perhaps particular to public sector; and one that might best be referred to as “Neutral Opacity”. Here, due diligence has been apathetically disregarded, or perhaps accountability has been shifted to a non-visible or inaccessible actor. This type of opacity appears distinct from Intentional Opacity in that it is characterised by an absence of any intention, rather a kind of indifference to or evasion of accountability.
The following chapters will explore a number of factors that could underpin this type of opacity, including whether it could be rooted in entrenched public sector culture.
9. A Culture of Indifference
In discussing modern technology and the legal and regulatory frameworks[114] within which it advances, Nemitz describes the de-centralisation of power away from government that has its roots in the youth movements of the 1960s. His description of the eventual re-centralisation of power into the hands of the five modern-day key technology providers (the ‘Frightful Five’) makes for a chilling read[115].
This power is found, he explains, in their development of technology in black box form, making it difficult for legislators to regulate, in their ability to finance and lobby for regulation that suits their aims, and in their apparent enduring disruption of the democratic process through legislative action. He warns of the ‘fiascos of the Internet, in the form of spreading of mass surveillance, recruitment to terrorism, incitement to racial and religious hate and violence as well as multiple other catastrophes for democracy’[116], which were preceded, he explains, by a failure to attribute or assume responsibility.
Arguably, ADM use by public authorities who do not adequately assume responsibility, as suggested by the proposition of Neutral Opacity in Chapter 8, risks a further concentration of power in the hands of technology providers.
Of some concern are reports that UK technology companies are exploiting the lack of accountability within the public sector. The West Midlands Police and Crime Commissioner’s Strategic Adviser reportedly told the Guardian, when discussing the ‘quiet’ scrapping of programmes like RBV, that he had concerns about businesses ‘pitching algorithms to police forces knowing their products may not be properly scrutinised’. The presence of a West Midlands Police ethics committee, which scrutinises AI projects, ‘may have deterred some data science organisations from getting further involved with us’, he claims[117].
While the consequences for UK public authorities may not necessarily equate to a ‘catastrophe for democracy’, at the very least they already pose a risk to the Rule of Law, and the remainder of this chapter will explore the notion that the Neutral Opacity coined above finds its roots in an entrenched public sector culture, the impact of which extends beyond algorithmic opacity and results in avoidable failures, violations of fundamental rights and community disenfranchisement.
There are, of course, examples of technological innovations being used by public authorities to empower citizens, for example by supporting them to care for themselves at home rather than in a healthcare setting[118].
Whilst this is arguably true, it cannot be ignored that the inherent paternalism of modern democracies means that significant decisions, automated or otherwise, are routinely made over which individuals are able to exercise little control[119].
Personal autonomy has been categorised as having three main elements:
1. The individual is competent and able to assess information and identify options.
2. The individual has efficiency; the ability to actually select an option and achieve their goals.
3. The individual is able to express authentic desires that are free from coercion or manipulation[120].
Drawing on this bioethical framework, the right to autonomy can be a negative one, in the sense that the individual should not be forced into something. It can also be a positive action where individuals are actively supported to exercise autonomy.
Whilst paternalism may often be considered the antithesis of autonomy[121], it could be argued that some ADM systems are ‘paternalism for the sake of autonomy’[122]. FRT, for example, could be argued to support the autonomy of the community: locating and apprehending suspects identified through the technology allows for greater enjoyment of public spaces, made safer and more accessible through these means.
Nevertheless, in the case of ADM systems deployed when authorised by law, it is clear that individual negative autonomy cannot be exercised. Citizens do not have the right not to be subject to automated decision making and so are subject to a kind of ‘weak paternalism’.
‘Strong paternalism’ is said to occur where a person clearly has competence and efficiency and has expressed desires that are overridden for their own benefit[123]. By contrast, ‘weak paternalism’ is characterised by the individual’s lack of autonomy, where ‘an agent intervenes on grounds of beneficence or nonmaleficence only to prevent actions that are substantially nonautonomous’[124]. This might be, for example, where a person is not competent or does not have sufficient information to be efficient[125].
Crime reduction and the administration of justice or welfare certainly have ‘grounds of beneficence or nonmaleficence’ such as to qualify as paternalistic interventions. The absence of information available to citizens (see Opacity in Public Authority Use of ADM) renders them incompetent and inefficient and therefore unable to exercise positive autonomy, having already had negative autonomy removed by Article 22(2)(b).
Without a full understanding of the technology and its uses and potential impacts, it is conceivable that citizens are rendered incapable of exercising autonomy through the expression of authentic desires. It follows then that constitutional democracy, which serves to ‘express the will of the people in a form obligatory for everyone’, is undermined: how can the people establish their ‘will’ or ‘authentic desires’ in the absence of comprehension?
The disquiet around state sanctioned ADM and its effect on citizens as ‘vulnerable stakeholders’ has resulted in calls for a holistic approach that extends beyond the technology itself and includes the socio-technological framework for the model and the ‘political and economic environment surrounding its use’[126]. The debate includes contemplating how accountability and ethical principles might be incorporated into the design and deployment of ADM systems, and it has been suggested that this might include allowing individuals to ‘shape its design and operation’[127].
Previous chapters have demonstrated the burden placed on the individual to raise challenge and appeal decisions made about them using this technology. As part of the Algorithmic Fairness and Opacity Working Group (AFOG) Workshop at UC Berkeley School of Information, Jenna Burrell asked ‘how can those who are subject to algorithmic classification be better supported to understand how these systems work and the role they play within [them]?’[128].
Burrell explores a number of mechanisms by which commercial organisations engage ‘users’ in the development of systems, such as ‘flagging’, where email account holders are able to flag inappropriate classifications themselves by recategorizing emails labelled as spam[129].
The paper recognises the weakness of this approach, since the user is effectively just being ‘put to work’, thereby removing the burden on operators to improve the system themselves. The individual is not affecting the policy of labelling, only managing its effect[130].
One could argue that a similar approach is manifest in the current UK legislative framework, placing the onus on the citizen to ‘flag’ the effects of ADM, seek judicial review and develop legal precedent, rather than obligating the authorities to external scrutiny before the fact.
The paper does suggest that a ‘user advocacy function embedded in business teams help steer decisions in ways that preserve autonomy’[131]. On this reasoning, the rise of co-production in UK government would appear to represent an encouraging move from paternalistic service delivery to a more collaborative relationship between citizens and public authorities. Co-production is described by the Local Government Association as:
‘… focused around a relationship in which professionals and citizens share power to plan and deliver support together … people are no longer passive recipients of services, but are equal partners in designing and delivering activities to improve outcomes’[132]
In the case of state sanctioned ADM, this approach could increase citizen competence and support the expression of authentic citizen desires and goals (based on acquired understanding), underpinning the concept of positive autonomy described above.
Certainly, in Finland, the government-developed ‘Elements of AI’ course is an integral part of the Finnish AI programme. It is an online course that explains the basic concepts and some of the social consequences of AI, with almost 100,000 Finns having enrolled by 2019[133].
In the UK, however, there appears to be some way to go to fully engage citizens. Concerns about ‘tokenism’[134], although raised predominantly in relation to health research co-production, conceivably apply in a broader sense to the public sector. The imbalanced power relations between those in authority and the public can render co-production a merely symbolic effort to engage citizens and make the transition from ‘a consultative paternalistic model to a collaborative partnership model’ a difficult one[135].
In fact, far from being co-produced, ADMs deployed by UK public authorities appear to be designed and developed with little or no engagement from those whom they impact. In the case of RBV, local authorities have unquestioningly concealed the inner workings of the ADM from the public at the request of the DWP, ostensibly to prevent ‘gaming’ or manipulation of the system. Burrell notes that ‘preventing ‘gaming’ [of the system] may not necessarily mean maximizing concealment’ and gives the example of Wikipedia as a fully transparent platform that demonstrates this notion[136].
The suggestion that the public sector is shaped to respond to the directives of Westminster rather than the people is not a new one, and rests on the legacy of Margaret Thatcher’s public sector reforms, under which curtailments issued to local authorities by Westminster in relation to their organisation and management resulted in services that are driven by providers rather than citizens[137].
Moreover, it has been considered that the ‘captivity’ of citizens, unable to choose alternative services to those delivered by public authorities, creates a ‘chronic lack of incentives for the public sector to become more …responsive to the wishes of [citizens]’ and that this absence of competition leads to ‘arrogance and atrophy’[138].
In 2018, it was discovered that the DWP had been underpaying an estimated 70,000 benefits recipients for years. The Public Accounts Committee noted that the DWP failed to create a process that implemented its own legislation and then failed to subject that process to scrutiny[139]. It disregarded staff, the public and experts when concerns were raised and appeared to completely ignore the ‘painfully obvious’ errors that were being made, taking more than six years to correct them. These inactions amount to what the Committee described as a ‘culture of indifference’.
Some might consider the experience of Big Brother Watch and the findings of the Public Accounts Committee to be symptomatic of what the media have referred to as ‘institutional indifference’, blamed for the failures and suffering of Windrush and Grenfell[140] and categorised by a ‘lack of official interest in what happens to people’ who are sanctioned, disregarded, unprotected, delayed or underpaid; particularly if they are poor and / or part of the BME community[141].
In 2010, the 2020 Public Services Trust, in making the case for a more integrated public service, attributed indifference to a far more innocuous cause: a by-product of ‘government fragmentation’[142]. The report describes the ‘highly siloed professional or organisational compartments’ that are endemic to the public sector. It describes a structure that allows cases to ‘fall through the cracks’, allows responsibility and accountability to be avoided, and allows ‘boutique bureaucracy’ to prevent systematic application of processes and create gaps in provision[143].
‘Managerialism’ is considered another legacy of Thatcher’s ‘New Public Management’[144]. Its features include increased strategic and operational management and a focus on assessing the output of public services against performance criteria and standards. In his exploration of the Thatcher legacy, Dorey highlights the reported[145] burden of Managerialism as preventing public service professionals, academics and nurses for example, from attending to essential services. It is conceivable that this ‘endless cycle’ of box-ticking, form-filling and audits could result in the kind of apathy and reticence encountered by Big Brother Watch.
In 2020, bureaucracy was still being reported as burdensome for public sector workers, suggesting an entrenched culture. The 2020 ‘Busting Bureaucracy’ report cited overly administrative processes for procurement, complex regulations, information management and data requests as causes of malaise and, ironically, proposed the use of AI to reduce bureaucracy (in the form of remote monitoring software that prevents unnecessary patient appointments[146]).
A lack of accountability is highlighted as a cause for concern in the 2019 report by the National Audit Office exploring the ‘Challenges in Data Use Across Government’[147]. The report identifies a ‘culture of tolerating and working around poor-quality data’[148] and of ‘silo working’[149], and notes that ‘[w]ell-publicised misuse of data has increased concerns and undermined efforts to communicate benefits’[150]. The report bemoans a lack of leadership, illustrated not least by the commitment to appoint a UK chief data officer in 2017 that had yet to be fulfilled when the report was issued[151]; in fact, the role was not filled until January 2021[152].
Reflecting, then, on the observations presented above, the experiences of Big Brother Watch during its FOI campaign are perhaps symptomatic of an entrenched public sector culture. The absence of completed impact assessments, the lack of familiarity or concern regarding the impact of ADM, the distribution of key information across various departments and the absence of any real public engagement are conceivably suggestive of a culture characterised by a lack of leadership and integration, poor data quality, indifference, bureaucracy, apathy and paternalism.
10. Competency and Authority
ADM is often deployed by public authorities with the ambition of removing human bias or preconceptions from the process and creating something more fair, accurate and consistent[153]. Studies have shown that profiling models can be accurate[154], but there is evidence that errors can occur with significant effect.
Caruana et al described a health sector ADM system which sought to predict risk levels in patients with pneumonia and therefore whether they should remain at home or go to hospital for treatment[155]. The system predicted that asthmatic patients were at lower risk of dying from pneumonia. The data set was biased because those patients generally received intensive care, which reduced their risk of dying from pneumonia.
Doctors, applying their own experience, were able to identify this anomaly in the system output and correct the bias[156].
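As an illustrative aside, the following toy simulation (using invented figures rather than Caruana et al’s data) shows how this kind of treatment confounding produces training data in which asthma appears protective.

```python
# Toy simulation of treatment confounding; all probabilities are invented.
import random

random.seed(0)

def simulate_patient():
    asthmatic = random.random() < 0.2
    base_risk = 0.25 if asthmatic else 0.10      # asthmatic patients are truly higher risk here
    treated_intensively = asthmatic              # ...but routinely receive intensive care
    observed_risk = base_risk * (0.2 if treated_intensively else 1.0)
    died = random.random() < observed_risk
    return asthmatic, died

patients = [simulate_patient() for _ in range(100_000)]

def observed_mortality(flag):
    group = [died for asthmatic, died in patients if asthmatic == flag]
    return sum(group) / len(group)

print(f"asthmatic: {observed_mortality(True):.3f}, non-asthmatic: {observed_mortality(False):.3f}")
# A model fitted to these observed outcomes alone would 'learn' that asthma lowers risk.
```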
Guido Noto La Diega argues that human trust is misplaced in the use of algorithms, which ‘dehumanise’ decision making[157], and notes as an example a UK case involving 20,000 divorced couples whose financial terms for their divorce were potentially miscalculated by software[158]. Noto La Diega puts forward the argument that human decision making can be trusted because human beings tend to emulate one another and so their decisions are consistent and predictable[159].
Of course, consistency does not always equal fairness. There is also the argument that consideration of the circumstances of particular cases lends itself to genuine fairness. In his discussion on the fairness of international law, John Tasioulas considers the application of certain frameworks as consisting of ‘culture-specific, value-constructs’ foisted upon adherents[160]. He asserts that unilaterally applying values in a world that is characterised by diversity risks ethnocentrism.
An example of how ADM can apply unilateral rules to the detriment of individuals can be found in a report published by the Association for Computational Linguistics in 2019. The computer scientists undertaking the investigation found that automatic toxic language identification tools used in social media to flag and remove offensive content were biased towards removing the social media posts of African American individuals. Common phrases in the African American English dialect (AAE) were labelled by one particular toxicity detection tool as far more toxic than general American English equivalents, regardless of their being regarded as non-toxic by AAE speakers[161].
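Purely as an illustration of how such a disparity might be surfaced in practice, the sketch below compares a classifier’s average scores across dialect-annotated posts. The scorer, posts and figures are invented placeholders rather than the tool studied in the report.

```python
# Illustrative dialect-bias audit: compare mean toxicity scores across groups.
# The scorer and posts are hypothetical stand-ins.
from statistics import mean

def audit_by_dialect(posts, score_toxicity):
    """Group posts by annotated dialect and compare mean toxicity scores."""
    by_dialect = {}
    for post in posts:
        by_dialect.setdefault(post["dialect"], []).append(score_toxicity(post["text"]))
    return {dialect: mean(scores) for dialect, scores in by_dialect.items()}

SAMPLE_SCORES = {"post a": 0.8, "post b": 0.7, "post c": 0.1, "post d": 0.2}

def placeholder_scorer(text):
    return SAMPLE_SCORES[text]  # stands in for a real classifier's scoring call

posts = [
    {"dialect": "AAE", "text": "post a"},
    {"dialect": "AAE", "text": "post b"},
    {"dialect": "General American English", "text": "post c"},
    {"dialect": "General American English", "text": "post d"},
]
print(audit_by_dialect(posts, placeholder_scorer))
# roughly {'AAE': 0.75, 'General American English': 0.15} -> a disparity warranting human review
```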
These cases make plain the value of human oversight and scrutiny for ADM systems but, as Working Party 29 (now European Data Protection Board) Guidelines suggest, it should be carried out by someone who has the ‘authority and competence to change the decision’[162].
Competence and authority appear particularly important given the psychological phenomenon of ‘Automation Bias’ described by Jennifer Cobbe. She describes the wealth of evidence to suggest that humans tend to trust decisions made by machines and defer to them willingly and generally without challenge[163].
The Working Party 29 Guidelines warn that human involvement cannot be ‘fabricated’, but the apparent ‘tokenism’ of human oversight was at the core of concerns raised by Big Brother Watch regarding York Council’s approach to RBV. Big Brother Watch described the use of the tool as ‘decisive’ rather than ‘advisory’ because the main risk identified through their impact assessment, which found no legal or equalities implications, was that staff should be trained to ensure they trust the risk scores produced[164].
Whilst systems like RBV are ‘authorised by law’ and therefore not subject to the Article 22(1) prohibition on solely automated decision making (by virtue of Article 22(2)(b)), Recital 71 does allow individuals to ‘obtain human intervention’, to express their viewpoint, to obtain an explanation of the decision reached and to raise challenges.
The second requirement for ‘meaningful assessment’ provided through human involvement, according to the WP29 guidelines, is that the human should be competent.
In 2016, Jenna Burrell explored the opacity of machine learning algorithms, describing the reading and writing of computer code and the development of algorithms as a ‘specialised skill’ that is ‘inaccessible to the majority of the population’ since it involves language that differs notably from human language[165]. More recently, in 2021, a partnership of employment and skills specialists that included the Learning and Work Institute produced a report on the digital skills gap. The report included software coding in its definition of Advanced Digital Skills that are in shortage in the UK and noted that the average age of those holding Advanced Digital Skills is below thirty[166]. Despite the demand for these types of skills increasing, the number of those training in this specialist area is declining[167].
In their 2019 report, the Institute for Employment Studies found that, possibly as a result of austerity measures reducing investment in the workforce, young people are half as likely to be employed in public sector roles as their older counterparts[168].
It is conceivable then that, in the context of the UK skills gap affecting these specialist skills, public authorities deploying ADM systems may find it more difficult than private sector bodies to engage individuals who are suitably skilled to support the design, assessment and oversight of the ADM systems being deployed.
Another feature of modern public services is ‘marketisation’. Originally part of the New Public Management[169], marketisation has been extended by subsequent reforms that place pressure on services to reduce operating costs and meet quality targets through a number of measures, including outsourcing services to private providers. The assertion is that encouraging competition drives ‘efficiency, effectiveness and economy’[170] and that the ‘involvement of private firms in public services … results in the best allocation and delivery of services at any given cost’.
It is clear from the provisions of Article 22 and Recital 71 that the GDPR confers an ex post opportunity to request human intervention in the use of ADM systems where the purpose of the system is authorised by law. There is also clear evidence that human review of the output of ADM systems can serve to flag inappropriate classifications resulting from the unilateral application of rules.
It is possible that a digital skills gap, potentially felt more acutely in the public sector, combined with public sector outsourcing policies, means that those working with ADM systems simply do not have the competence and authority required to challenge or explain them, potentially resulting in tokenistic human involvement and uncritical deference to automated decisions.
There is a spatial chasm between those deploying and operating the ADM system and its, likely outsourced, developers. Those who understand the system well enough to recognise anomalies are not those tasked with assessing the risks to the fundamental rights of those subject to its decisions. This potentially deepens opacity and blurs the lines of accountability.
11. Conclusion
The enduring appetite of the state for registration of its citizens and modification of their behaviour has resulted in vast data sets drawn from all facets of citizen life including health, welfare, crime, political activity, travel, military service, childcare, marriage and divorce.
It is manifest in the emerging case law and academic discussion that the use of ADM by public authorities can engage fundamental rights on multiple fronts and has both individual and societal privacy implications, affecting anonymity, autonomy, identity and self-determination.
The profiling of citizens can result in predictions of behaviour that, when underpinning decisions made by public authorities, can deprive the individual of moral reflection and identity revision, even when accurate.
Profiling and automated decisions can be based on structural bias or the preconceptions of the designer or operator and can perpetuate stereotypes and penalise marginalised communities.
The pervasive ‘observation’ of citizens, whether through the creation of digital profiles or surveillance in the physical sense, can modify behaviour and force identity management, providing little opportunity for anonymity in public.
The power imbalance between the state and the citizen renders the citizen a vulnerable stakeholder whose rights and freedoms warrant careful consideration with regards to the deployment of ADM by public authorities.
The opacity surrounding the design and operation of ADM systems has been attributed to a number of potential factors, including the complexity of the technology, a lack of awareness and strategic concealment for commercial reasons.
There is evidence, as presented above, of barriers to accountability for the design, operation and impact of ADM use by UK public authorities which are distinct from the opacity types hitherto proposed.
The current legal framework leaves much room for public authorities to exercise discretion and to make decisions that are arbitrary and not expertly evaluated; where rationales are produced, the process does not include the involvement of suitably competent individuals. Furthermore, such decisions are generally not subject to public scrutiny before the technology is deployed. What public engagement there is remains potentially tokenistic and hampered by a pervasive power imbalance and paternalism.
The burden of holding public authorities accountable appears to rest largely with vulnerable citizens who do not have the requisite information to express authentic desires and exercise the necessary autonomy to effect challenge.
Contrary to the Rule of Law, discretion is left to authorities that are potentially entrenched in a culture characterised by a lack of leadership and skills, poor integration, indifference, bureaucracy, apathy and paternalism.
As a result, any effective remedies or safeguards proposed must consider the potential predisposition for inertia and, as such, include mandated transparency and engagement, co-production, a collectivistic approach to privacy and upskilling of both the public and the authorities to allow for effective challenge and exercise of autonomy.
References
[1] Big Brother Watch, ‘SUBMISSION TO THE UN SPECIAL RAPPORTEUR ON EXTREME POVERTY AND HUMAN RIGHTS AHEAD OF UK VISIT NOVEMBER 2018’ (Ohchr.org, 2021) <https://www.ohchr.org/Documents/Issues/EPoverty/UnitedKingdom/2018/NGOS/BigBrotherWatch.pdf> accessed 25 May 2021. [2] Ibid 7 para 3. [3] Automating Society(n 29). [4] ‘Nanny State’ (Dictionary.cambridge.org, 2021) <https://dictionary.cambridge.org/dictionary/english/nanny-state> accessed 25 May 2021. [5] Privacy International, ‘Mass Surveillance | Privacy International’ (Privacyinternational.org, 2021) <https://privacyinternational.org/learn/mass-surveillance> accessed 25 May 2021. [6] Alberto Romele and others, ‘Panopticism Is Not Enough: Social Media As Technologies Of Voluntary Servitude’ (2017) 15 Surveillance & Society. <https://ojs.library.queensu.ca/index.php/surveillance-and-society/article/view/not_enough> accessed 1 May 2021. [7] Peter Dorey, ‘The Legacy Of Thatcherism – Public Sector Reform’ [2015] Observatoire de la société britannique para 17 < https://journals.openedition.org/osb/1759> accessed 12 May 2021. [8] European Commission, ‘Guidelines On Consent Under Regulation 2016/679 (Wp259rev.01)’ (2016) 6 para 2 <https://ec.europa.eu/newsroom/article29/items/623051> accessed 25 May 2021. [9] Ibid. [10] Dr David Leslie, Understanding Artificial Intelligence Ethics And Safety A Guide For The Responsible Design And Implementation Of AI Systems In The Public Sector (The Alan Turing Institute 2019) <https://doi.org/10.5281/zenodo.3240529> accessed 25 May 2021. [11] Parliament of Australia, ‘Paternalism In Social Policy When Is It Justifiable?’ (2010) <https://www.aph.gov.au/About_Parliament/Parliamentary_Departments/Parliamentary_Library/pubs/rp/rp1011/11rp08> accessed 13 May 2021. [12] Mark James Findlay and others, ‘Ethics, AI, Mass Data And Pandemic Challenges: Responsible Data Use And Infrastructure Application For Surveillance And Pre-Emptive Tracing Post-Crisis’ [2020] SSRN Electronic Journal. <https://papers.ssrn.com/sol3/papers.cfm?abstract_id=3592283> 46 para 3 [13] Naomi Klein, ‘Naomi Klein: How Power Profits From Disaster’ (the Guardian, 2021). <https://www.theguardian.com/us-news/2017/jul/06/naomi-klein-how-power-profits-from-disaster> accessed 25 May 2021. [14] ‘If You’re Told To Self-Isolate By NHS Test And Trace’ (nhs.uk, 2021) <https://www.nhs.uk/conditions/coronavirus-covid-19/self-isolation-and-treatment/if-youre-told-to-self-isolate-by-nhs-test-and-trace-or-the-covid-19-app/> accessed 25 May 2021. [15] Mark James Findlay and others (n 189). [16] ‘Liberty: Pandemic Must Not Be Used To Normalise Technology That Threatens Our Rights’ <https://www.libertyhumanrights.org.uk/issue/liberty-pandemic-must-not-be-used-to-normalise-technology-that-threatens-our-rights/> accessed 27 May 2021. [17] ‘The UK GDPR’ (Ico.org.uk, 2021) <https://ico.org.uk/for-organisations/dp-at-the-end-of-the-transition-period/data-protection-now-the-transition-period-has-ended/the-gdpr/> accessed 25 May 2021. [18] Article 29 Data Protection Working Party, ‘Guidelines On Automated Individual Decision-Making And Profiling For The Purposes Of Regulation 2016/679’ (Article 29 Data Protection Working Party 2017) <https//file:///C:/Users/Emma/Downloads/wp251rev_01_en_A754F3E1-FB46-9E76-C0A919864E4B6641_49826%20(5).pdf> accessed 27 May 2021. [19] Ibid 6 para 2. 
[20]Council Regulation (EU) 2016/679 on the protection of natural persons with regard to the processing of personal data and on the free movement of such data, and repealing Directive 95/46/EC (General Data Protection Regulation) [2016] OJ L119/1. [21] Sandra Wachter, Brent Mittelstadt and Luciano Floridi (n 78); Gianclaudio Malgieri and Giovanni Comandé, ‘Why A Right To Legibility Of Automated Decision-Making Exists In The General Data Protection Regulation’ (2017) 7 International Data Privacy Law. < https://academic.oup.com/idpl/article-abstract/7/4/243/4626991?redirectedFrom=fulltext> access 3 May 2021. [22] Sandra Wachter, Brent Mittelstadt and Luciano Floridi (n 78) 94 para 3 [23]Article 29 Data Protection Working Party, Guidelines on Automated individual decision-making and Profiling for the purposes of Regulation 2016/679 (Wp251rev.01)’ (2018) 9 para 3 <https://ec.europa.eu/newsroom/article29/items/623051> accessed 20 May 2021. [24]Council Regulation (EU) 2016/679 on the protection of natural persons with regard to the processing of personal data and on the free movement of such data, and repealing Directive 95/46/EC (General Data Protection Regulation) [2016] OJ L119/1 [25]Malgieri and Comandé (n 198) 7 para 3 [26] Such as marketing manipulation and price discrimination. [27] Malgieri and Comandé (n 198). [28] Sandra Wachter, Brent Mittelstadt and Luciano Floridi (n 78) 1 para 1. [29] Article 22 (b) and DPA 2018 s 49. [30] ‘When Can We Carry Out This Type Of Processing?’ (Ico.org.uk, 2021) <https://ico.org.uk/for-organisations/guide-to-data-protection/guide-to-the-general-data-protection-regulation-gdpr/automated-decision-making-and-profiling/when-can-we-carry-out-this-type-of-processing/> accessed 1 April 2021. [31] Big Brother Watch (n 178) 4 para 3 [32] Heleen L Janssen, ‘An Approach For A Fundamental Rights Impact Assessment To Automated Decision-Making’ (2020) 10 International Data Privacy Law <https://papers.ssrn.com/sol3/papers.cfm?abstract_id=3302839> accessed 20 May 2021. [33] Big Brother Watch (n 178) 8 para 4. [34] GDPR Article 22 (2) (b). [35]AlgorithmWatch (n 28) 28 para 2. [36]Harris (n 68). [37] Harris (n 68) 9 para 1; Janssen(n 209) 18. [38] Harris (n 68) 17 para 3. [39]Harris (n 68) 17 para 2 [40] The Rt Hon Dominic Grieve QC, ‘The Rule Of Law And The Prosecutor’ (18th Annual Conference and General Meeting of the International Association of Prosecutors (IAP), 2021) <https://www.gov.uk/government/speeches/the-rule-of-law-and-the-prosecutor> accessed 12 May 2021. [41] Tew Y, “The Rule of Law. By Tom Bingham. [London: Allen Lane, 2010. 213] (2011) 70 The Cambridge Law Journal 481 <https://www.cambridge.org/core/journals/cambridge-law-journal/article/abs/rule-of-law-by-tom-bingham-london-allen-lane-2010-213-pp-hardback-2000-isbn-9781846140907/66725BA80C6FC0C4636CDAB5BC394E94> Accessed 20 May 2021. [42]Ibid [43] Government Equalities Office, Framework for a Fairer Future – The Equality Bill(Cmd 7431, 2008) 12 para 1 <https://assets.publishing.service.gov.uk/government/uploads/system/uploads/attachment_data/file/238713/7431.pdf> accessed 1 May 2021, [44] Ibid 12 para 3. [45] Government Equalities Office, Explanatory Memorandum to the Equality Act 2010 (Specific Duties) Regulations(2011) para 8.12 < https://assets.publishing.service.gov.uk/government/uploads/system/uploads/attachment_data/file/85297/EM.pdf> accessed 1 May 2021. [46] Ibid para 8.13. [47] Equality Act 2010. [48] Ibid ch 2. 
[49] Age, disability, gender reassignment, marriage and civil partnership, pregnancy and maternity, race, religion or belief, sex and sexual orientation. [50] AI Law and Cloisters (n 33) para 87. [51] Text to n 217. [52] Government Equalities Office, Explanatory Memorandum to the Equality Act 2010 (Specific Duties) Regulations(2011) para 8.12 < https://assets.publishing.service.gov.uk/government/uploads/system/uploads/attachment_data/file/85297/EM.pdf> accessed 1 May 2021. [53] AI Law and Cloisters (n 33) 30 para 90. [54] [2012] EWHC 201 EWHC 201 (Admin). [55] Ibid [89]. [56] Mandy Emery, ‘Audit Committee: Housing Benefit And Council Tax Reduction Risk Based Verification’ (South Northamptonshire Council 2019) <https://modgov.southnorthants.gov.uk/documents/s23509/SNC%20Audit%20Committee%2014%2003%2019%20Report%20on%20RBV%20Final.pdf> accessed 25 May 2021. [57] Joanne Stanton, ‘Pre-Screening Equality Impact Assessment’ (Borough Council of Kings Lynn and West Norfolk 2005) <https://democracy.west-norfolk.gov.uk/documents/s5323/CAB111%20EIA.doc.pdf> ; Fern Silerio, ‘Equality Impact Assessment Template’ (Harrow Council 2004) <https://www2.harrow.gov.uk/documents/s128005/Appendix%20B%20EqIA.pdf>; accessed 25 May 2021. [58] Stuart Bracking and Ors v Secretary of State for Work and Pensions [2013] EWCA Civ 1345, [2014] Eq LR 60 [59] Doug Pyper, The Public Sector Equality Duty and Equality Impact Assessments(Briefing Paper, 06591) 26 para 1 < https://commonslibrary.parliament.uk/research-briefings/sn06591/> accessed 20 February 2021. [60] The Equality and Human Rights Commission are able to bring cases for Judicial Review as well as those affected. [61] AI Law and Cloisters (n 33) para 96. [62] Ibid. [63] AI Law and Cloisters (n 33) 27 para 83. [64] AI Law and Cloisters (n 33) 34 para 103. [65] Guide on Article 8 of the European Convention on Human Rights(n 80)8 para 2. [66] (Application no. 2346/02) (2002) ECHR 235. [67] Pretty v UK (Application no. 2346/02) (2002) ECHR 235. [68] Bridges (n71). [69] R (Bridges) v The Chief Constable of South Wales Police & Others [2020] EWCA Civ 1058 [70] Surveillance Camera Code of Practice(n 111). [71] Bridges Appeal (n 242) [54]. [72] Ibid [55], per Beghal v Director of Public Prosecutions [2015] UKSC 49 [102]. [73] Ibid [102]. [74] Bridges(n 71). [75] Bridges Appeal (n 242) [61], per Munjaz v United Kingdom (Application no. 2913/06) [2012] ECHR 1704. [76] (Application no. 2913/06) [2012] ECHR 1704 [83]-[95]. [77] Guide on Article 8 of the European Convention on Human Rights(n 80)8 para 2. [78]David Mead, ‘A Socialised Conceptualisation Of Individual Privacy: A Theoretical And Empirical Study Of The Notion Of The ‘Public’ In UK Mopi Cases’ (2017) 9 Journal of Media Law 3 para 1 .<https://doi.org/10.1080/17577632.2017.1321227> accessed 2 May 2021. [79]‘Surveillance Camera Commissioner’s Statement: Court Of Appeal Judgment (R) Bridges V South Wales Police – Automated Facial Recognition’ (2020) <https://www.gov.uk/government/speeches/surveillance-camera-commissioners-statement-court-of-appeal-judgment-r-bridges-v-south-wales-police-automated-facial-recognition> accessed 20 April 2021. [80]Surveillance Camera Code of Practice(n 111). [81]Facing the Camera(n 114). [82] The Rt Hon Dominic Grieve QC (n 217). [83] Doug Pyper (n 236) per Mr Justice Sales, The Public Sector Equality Duty, Lecture to the Employment Law Bar Association and Administrative Law Bar Association, 13 December 2010. [84] Bridges Appeal (n 242) [54]. 
[85] Understanding algorithmic decision-making: Opportunities and challenges (n 38) 14. [86] Understanding algorithmic decision-making: Opportunities and challenges (n 38) viii para 3. [87] R (Gillan) v Commissioner of Police for the Metropolis [2006] UKHL 12 AC 307 [34] [88] Bridges Appeal (n 242) [54]. [89]Department of Work and Pensions, ‘Housing Benefit And Council Tax Benefit Circular, HB/CTB S11/2011 [90] ‘Thousands Of Claimants Failed By DWP ‘Culture Of Indifference’ (UK Parliament 2018) para 2 <https://committees.parliament.uk/committee/127/public-accounts-committee/news/98330/thousands-of-claimants-failed-by-dwp-culture-of-indifference/> accessed 26 May 2021 . [91] Text in n 233 ch 7 [92] Explainable AI: the basics (n 1) 11 para 1. [93] Understanding algorithmic decision-making: Opportunities and challenges (n 38). [94] Ibidii – iii. [95] Understanding algorithmic decision-making: Opportunities and challenges (n 38). [96] What goes into an explanation?(Guidance, Information Commissioner’s Office 2020) 20 para 1 < https://ico.org.uk/media/for-organisations/guide-to-data-protection/key-data-protection-themes/explaining-decisions-made-with-artificial-intelligence-1-0.pdf> accessed 20 May 2021. [97] Explainable AI: the basics (n 1) 9. [98] Ibid. [99] Jenna Burrell, ‘How The Machine ‘Thinks’: Understanding Opacity In Machine Learning Algorithms’ (2016) 3 Big Data & Society <https://journals.sagepub.com/doi/full/10.1177/2053951715622512> accessed 10 May 2021. [100] Jennifer Cobbe, ‘Big Data, Surveillance, and the Digital Citizen (2019) Queen’s University Belfast 15 para 1 <https://pureadmin.qub.ac.uk/ws/portalfiles/portal/153330563/Thesis_Complete.pdf > accessed 10 May 2021. [101] Burrell (n 270) 3 para 7. [102] Burrell (n 270) 3 para 7. [103] Burrell (n 270) 4 – 5; Cobbe (n 268) 15. [104] AI Law and Cloisters (n 33) para 12. [105] AI Law and Cloisters (n 33). [106] Cynthia Stohl, Michael Stohl and Paul M. Leonardi, ‘Managing Opacity: Information Visibility and the Paradox of Transparency in the Digital Age’ (2016) 10 International Journal of Communication < https://ijoc.org/index.php/ijoc/article/viewFile/4466/1530> accessed 26 May 2021. [107] Ibid (n 274) 133 para 1 [108] Jennifer Cobbe, Michelle Seng Ah Lee and Jatinder Singh, ‘Reviewable Automated Decision-Making’ [2021] Proceedings of the 2021 ACM Conference on Fairness, Accountability, and Transparency < https://arxiv.org/pdf/2102.04201.pdf> accessed 26 May 2021. [109] Big Brother Watch (n 179) 9 para 4. [110] Ibid. [111] Ibid 10 – 11. [112] ‘The Seven Principles Of Public Life’ (GOV.UK, 2021) <https://www.gov.uk/government/publications/the-7-principles-of-public-life> accessed 26 May 2021. [113] Big Brother Watch (n 179) 10 – 11. [114] Paul Nemitz, ‘Constitutional Democracy And Technology In The Age Of Artificial Intelligence’ (2018) 376 Philosophical Transactions of the Royal Society A: Mathematical, Physical and Engineering Sciences 2 ch 2 <https://royalsocietypublishing.org/doi/10.1098/rsta.2018.0089> accessed 20 May 2021 [115] Ibid. [116] Nemitz (n 282) 6 para 5. [117] Sarah Marsh, ‘Councils Scrapping Use Of Algorithms In Benefit And Welfare Decisions’ (the Guardian, 2021) <https://www.theguardian.com/society/2020/aug/24/councils-scrapping-algorithms-benefit-welfare-decisions-concerns-bias> accessed 26 May 2021. 
[118] Cosima Gretton and Matthew Honeyman, ‘The Digital Revolution: Eight Technologies That Will Change Health And Care’ (The King’s Fund, 2016) <https://www.kingsfund.org.uk/publications/digital-revolution> accessed 26 May 2021. [119] Text to n 186 in ch 5. [120] M. Sjostrand and others, ‘Paternalism In The Name Of Autonomy’ (2013) 38 Journal of Medicine and Philosophy 711 – 712 <https://academic.oup.com/jmp/article-abstract/38/6/710/910917?redirectedFrom=fulltext> accessed 26 May 2021. [121] Ibid 713 para 2. [122] Ibid. [123] Tom L Beauchamp and James F Childress, Principles Of Biomedical Ethics (5th edn, OUP). [124] M. Sjostrand and others(n 288) 714 para 1. [125] Ibid 712 para 2 page 712 para 2. [126] AlgorithmWatch (n 28) 9 para 3. [127] Explainable AI: the basics (n 1) 23 para 6. [128] Jenna Burrell, ‘ALGORITHMIC FAIRNESS AND OPACITY WORKING GROUP (AFOG) Workshop Panel 3: Human Autonomy And Empowerment’ (UC Berkeley School of Information 2018) 3 para 3 <https://sites.ischool.berkeley.edu/afog/files/2018/08/AFOG_workshop_panel3_report.pdf> accessed 26 May 2021. [129] Ibid 4 para 1. [130] Ibid. [131] Ibid 9 para 3 [132] ‘Co-Production’ (Local.gov.uk, 2021) <https://www.local.gov.uk/topics/devolution/devolution-online-hub/public-service-reform-tools/engaging-citizens-devolution-7#:~:text=NEF%20defines%20co%2Dproduction%20as,their%20families%20and%20their%20neighbours> accessed 26 May 2021. [133]AlgorithmWatch (n 28) 14 para 4. [134] Doreen Tembo and others, ‘Is Co-Production Just A Pipe Dream For Applied Health Research Commissioning? An Exploratory Literature Review’ (2019) 4 Frontiers in Sociology <https://www.frontiersin.org/articles/10.3389/fsoc.2019.00050/full> accessed 14 May 2021. [135] Ibid [136] Jennifer Burrell (n 296) 11 para 3. [137] Bill Coxall and Lynton Robins, Contemporary British Politics (2003). [138] Peter Dorey, ‘The Legacy Of Thatcherism – Public Sector Reform’ [2015] Observatoire de la société britannique ch 17 <https://journals.openedition.org/osb/1759> accessed 10 May 2021. [139] ‘Thousands Of Claimants Failed By DWP ‘Culture Of Indifference’ (UK Parliament 2018) para 2 <https://committees.parliament.uk/committee/127/public-accounts-committee/news/98330/thousands-of-claimants-failed-by-dwp-culture-of-indifference/> accessed 26 May 2021 . [140] Ruth Lister, ‘From Windrush To Universal Credit – The Art Of ‘Institutional Indifference’’ (openDemocracy, 2018) <https://www.opendemocracy.net/en/opendemocracyuk/from-windrush-to-universal-credit-art-of-institutional-indifference/> accessed 26 May 2021. [141] Ibid para 12. [142] Patrick Dunleavy, The Future Of Joined-Up Public Services (Economic and Social Research Council 2010) 9 para 1 <https://eprints.lse.ac.uk/28373/1/The_Future_of_Joined_Up_Public_Services.pdf> accessed 10 May 2021. [143] Ibid 17. [144] Dorey (n 306). [145] Ibid 9. [146] ‘Busting Bureaucracy: Empowering Frontline Staff By Reducing Excess Bureaucracy In The Health And Care System In England’ (GOV.UK, 2020) <https://www.gov.uk/government/consultations/reducing-bureaucracy-in-the-health-and-social-care-system-call-for-evidence/outcome/busting-bureaucracy-empowering-frontline-staff-by-reducing-excess-bureaucracy-in-the-health-and-care-system-in-england> accessed 26 May 2021. [147] The Comptroller and Auditor General, ‘Challenges In Using Data Across Government’ (National Audit Office 2019) <https://www.nao.org.uk/wp-content/uploads/2019/06/Challenges-in-using-data-across-government.pdf> accessed 26 May 2021. [148] Ibid 5 para 2. [149] Ibid 11 para 2. 
[150] Ibid 10 para 4. [151] Ibid 23 para 3. [152] Alex Chisholm, ‘New Year, New Ddat Leadership’ <https://gds.blog.gov.uk/2021/01/13/new-year-new-ddat-leadership/> accessed 26 May 2021. [153] Text in n 132 at ch 4.5. [154] Youyou and Others (n 12). [155] Rich Caruana and others, ‘Intelligible Models For Healthcare’ [2015] Proceedings of the 21th ACM SIGKDD International Conference on Knowledge Discovery and Data Mining <https://dl.acm.org/doi/pdf/10.1145/2783258.2788613> accessed 12 May 2021. [156] Understanding algorithmic decision-making: Opportunities and challenges (n 38) 15 para 3. [157] Guido Noto La Diega, ‘Against The Dehumanisation Of Decision-Making-Algorithmic Decisions At The Crossroads Of Intellectual Property, Data Protection, And Freedom Of Information’ (2018) 9 JIPITEC <https://papers.ssrn.com/sol3/papers.cfm?abstract_id=3188080> accessed 26 May 2021. [158] Will Grice, ‘A Tiny Government Error Has Affected 20,000 Divorced People’ (The Independent, 2015) <https://www.independent.co.uk/news/uk/home-news/ministry-justice-software-glitch-could-see-thousands-revisiting-painful-divorce-settlements-a6777851.html> accessed 26 May 2021. [159] Ibid para 24. [160] Article 29 Data Protection Working Party, ‘Guidelines On Automated Individual Decision-Making And Profiling For The Purposes Of Regulation 2016/679’ 21 para 2 (Article 29 Data Protection Working Party 2017) <https://ec.europa.eu/newsroom/article29/item-detail.cfm?item_id=612053> accessed 27 May 2021. [161] Maarten Sap and others, ‘The Risk Of Racial Bias In Hate Speech Detection’ [2019] Proceedings of the 57th Annual Meeting of the Association for Computational Linguistics 1168 <https://www.aclweb.org/anthology/P19-1163/> accessed 27 May 2021. [162] Article 29 Data Protection Working Party, ‘Guidelines On Automated Individual Decision-Making And Profiling For The Purposes Of Regulation 2016/679’ 21 para 2 (Article 29 Data Protection Working Party 2017) < https://ec.europa.eu/newsroom/article29/item-detail.cfm?item_id=612053> accessed 27 May 2021. [163] Cobbe 2019 (n 26) 1 – 20 [164] Big Brother Watch (n 179) 8 – 11. [165] Burrell (n 270) 4. [166] ‘Stack Overflow Developer Survey 2016 Results’ (Stack Overflow, 2021) <https://insights.stackoverflow.com/survey/2016#developer-profile-age> accessed 27 May 2021. [167] World Skills UK; Learning and Work Institute; and Enginuity., ‘Disconnected? Exploring The Digital Skills Gap.’ (2021) <https://learningandwork.org.uk/resources/research-and-reports/disconnected-exploring-the-digital-skills-gap/> accessed 27 May 2021. [168] Institute for Employment Studies, ‘Young People’S Future Health Inquiry The Quality Of Work On Offer To Young People And How It Supports The Building Blocks For A Healthy Life’ (2019) 6 para 9 <https://www.employment-studies.co.uk/system/files/resources/files/532_2.pdf> accessed 27 May 2021. [169] Text in n 315, ch 9 [170] Peter Dorey, ‘The Legacy Of Thatcherism – Public Sector Reform’ [2015] Observatoire de la société britannique para 17 < https://journals.openedition.org/osb/1759> accessed 12 May 2021.
12. Bibliography
Table of Cases: UK
Beghal v Director of Public Prosecutions [2015] UKSC 49
CTB v News Group Newspapers [2011] EWHC 3099 (QB)
R (Gillan) v Commissioner of Police for the Metropolis [2006] UKHL 12 AC 307
R (Bridges) v The Chief Constable of South Wales Police [2019] EWHC 2341 (Admin)
R (Bridges) v The Chief Constable of South Wales Police & Others [2020] EWCA Civ 1058
R (Hurley and Moore) v Secretary of State for Business, Innovation and Skills [2012] EWHC 201 (Admin)
S and Marper v United Kingdom [2009] 48 EHRR 50
Table of Cases: Other Jurisdictions
Laird v. Tatum, 408 U.S. 1 (1972)
ECtHR Cases
Aksu v. Turkey (Application Nos. 4149/04 & 41029/04) (2012) ECHR
Munjaz v. The United Kingdom (Application no. 2913/06) [2012] ECHR 1704
Treaties
Convention for the Protection of Human Rights and Fundamental Freedoms (European Convention on Human Rights)
Table of Legislation
Council Regulation (EU) 2016/679 on the protection of natural persons with regard to the processing of personal data and on the free movement of such data, and repealing Directive 95/46/EC (General Data Protection Regulation) [2016] OJ L119/1
Data Protection Act 2018
Equality Act 2010
Secondary Sources
____, ‘Busting Bureaucracy: Empowering Frontline Staff By Reducing Excess Bureaucracy In The Health And Care System In England’ (GOV.UK, 2021) <https://www.gov.uk/government/consultations/reducing-bureaucracy-in-the-health-and-social-care-system-call-for-evidence/outcome/busting-bureaucracy-empowering-frontline-staff-by-reducing-excess-bureaucracy-in-the-health-and-care-system-in-england> accessed 26 May 2021
____, ‘Census-Taking In The Ancient World – Office For National Statistics’ (Ons.gov.uk, 2021) <https://www.ons.gov.uk/census/2011census/howourcensusworks/aboutcensuses/censushistory/censustakingintheancientworld> accessed 5 January 2021
____, ‘Coalition Letter To Amazon Urging Company Commit Not To Release Face Surveillance Product’ (American Civil Liberties Union, 2021) <https://www.aclu.org/coalition-letter-amazon-urging-company-commit-not-release-face-surveillance-product> accessed 25 May 2021
____, ‘Co-Production’ (Local.gov.uk, 2021) <https://www.local.gov.uk/topics/devolution/devolution-online-hub/public-service-reform-tools/engaging-citizens-devolution7#:~:text=NEF%20defines%20co%2Dproduction%20as,their%20families%20and%20their%20neighbours.> accessed 26 May 2021
____, ‘Domesday Book : Britain’s Finest Treasure | The National Archives’ (Nationalarchives.gov.uk) <https://www.nationalarchives.gov.uk/domesday/> accessed 27 May 2021
____, European Parliament, Connected Digital Single Market: White Paper on Artificial intelligence Including Follow Up (White Paper, Cm 2018-5) < https://www.europarl.europa.eu/legislative-train/theme-connected-digital-single-market/file-white-paper-artificial-intelligence-and-follow-up/07-2019> accessed 20 February 2021
___, ‘Risk Stratification And Predictive Analytics – Digital Marketplace’ (Digitalmarketplace.service.gov.uk, 2021) <https://www.digitalmarketplace.service.gov.uk/g-cloud/services/198405605081528> accessed 23 May 2021
____, ‘SUBMISSION TO THE UN SPECIAL RAPPORTEUR ON EXTREME POVERTY AND HUMAN RIGHTS AHEAD OF UK VISIT NOVEMBER 2018’ (Ohchr.org, 2021) <https://www.ohchr.org/Documents/Issues/EPoverty/UnitedKingdom/2018/NGOS/BigBrotherWatch.pdf> accessed 25 May 2021
____, ‘Taking Stock Of Automated Decision Making In The EU’ (A report by AlgorithmWatch in cooperation with Bertelsmann Stiftung, supported by the Open Society Foundations 2019) <https://algorithmwatch.org/de/wp-content/uploads/2019/02/Automating_Society_Report_2019.pdf> accessed 4 February 2021
____, ‘THIM Monitoring Service’ (Surrey and Borders Partnership NHS Foundation Trust) <https://www.sabp.nhs.uk/tihm> accessed 20 May 2021
‘EU Declaration on Cooperation on Artificial Intelligence’ [2018] < https://ec.europa.eu/jrc/communities/en/community/digitranscope/document/eu-declaration-cooperation-artificial-intelligence> accessed 3 May 2021
‘Expert Group On AI’ (Shaping Europe’s digital future) <https://digital-strategy.ec.europa.eu/en/policies/expert-group-ai#:~:text=on%20artificial%20intelligence-,High%2Dlevel%20expert%20group%20on%20artificial%20intelligence,academia%2C%20civil%20society%20and%20industry> accessed 20 April 2021
___, ‘Summary of Proceedings: Automated Voting and Election Observation’ (The Carter Center 2005) < https://www.cartercenter.org/documents/nondatabase/automatedsummary.pdf> accessed 3 March 2021
____, ‘Thousands Of Claimants Failed By DWP ‘Culture Of Indifference’’ (UK Parliament 2018) <https://committees.parliament.uk/committee/127/public-accounts-committee/news/98330/thousands-of-claimants-failed-by-dwp-culture-of-indifference/> accessed 26 May 2021
algo:aware, ‘Raising Awareness On Algorithms: State-Of-The-Art Report’ (algo:aware 2018) <https://actuary.eu/wp-content/uploads/2019/02/AlgoAware-State-of-the-Art-Report.pdf> accessed 1 May 2021
Agre P, and Rotenberg M, Technology And Privacy (MIT Press 1997)
Allen AL, ‘Taking Liberties: Privacy, Private Choice, and Social Contract Theory’ [1987] Faculty Scholarship at Penn Law 1337 <https://scholarship.law.upenn.edu/cgi/viewcontent.cgi?article=2337&context=faculty_scholarship> accessed 25 May 2021
Artificial Intelligence For Europe (European Economic and Social Committee 2019)
Article 29 Data Protection Working Party, ‘Guidelines On Automated Individual Decision-Making And Profiling For The Purposes Of Regulation 2016/679’ (Article 29 Data Protection Working Party 2017) <https://ec.europa.eu/newsroom/article29/item-detail.cfm?item_id=612053> accessed 27 May 2021
Association for Progressive Communication, ‘The Rights To Freedom Of Peaceful Assembly And Association And The Internet: Submission To The United Nations Special Rapporteur On The Rights To Freedom Of Peaceful Assembly And Association By Association For Progressive Communication (APC).’ (APC 2019) <https://www.apc.org/sites/default/files/APC_Submission_FoA_Online.pdf> accessed 2 May 2021
Beauchamp T, and Childress J, Principles Of Biomedical Ethics (5th edn, OUP)
Becker H, Outsiders Studies In The Sociology Of Deviance. Free P (Free 1973)
Binns R, and Gallo V, ‘Human Bias And Discrimination In AI Systems’ <https://ico.org.uk/about-the-ico/news-and-events/ai-blog-human-bias-and-discrimination-in-ai-systems/> accessed 18 March 2021
Blunn S, ‘The Public Sector And Data Literacy’ (Government Technology) <https://www.governmenttechnology.co.uk/features/public-sector-and-data-literacy> accessed 25 May 2021
Burrell J, ‘ALGORITHMIC FAIRNESS AND OPACITY WORKING GROUP (AFOG) Workshop Panel 3: Human Autonomy And Empowerment’ (UC Berkeley School of Information 2018) <https://sites.ischool.berkeley.edu/afog/files/2018/08/AFOG_workshop_panel3_report.pdf> accessed 26 May 2021.
Burrell J, ‘How The Machine ‘Thinks’: Understanding Opacity In Machine Learning Algorithms’ (2016) 3 Big Data & Society
Boldyreva EL, Grishina NY and Duisembina Y, ‘Cambridge Analytica: Ethical and Online Manipulation with Decision Making Process’ (18th PCSF 2018) < https://www.researchgate.net/publication/330032180_Cambridge_Analytica_Ethics_And_Online_Manipulation_With_Decision-Making_Process> accessed 3 March 2021
Caruana R and others, ‘Intelligible Models For Healthcare’ [2015] Proceedings of the 21th ACM SIGKDD International Conference on Knowledge Discovery and Data Mining
Chisholm A, ‘New Year, New Ddat Leadership’ <https://gds.blog.gov.uk/2021/01/13/new-year-new-ddat-leadership/> accessed 26 May 2021
Cloisters and AI Law, ‘In The Matter Of Automated Data Processing In Government Decision Making: Joint Opinion’ <https://www.cloisters.com/wp-content/uploads/2019/10/Open-opinion-pdf-version-1.pdf> accessed 24 April 2021
Cobbe J, ‘Administrative Law And The Machines Of Government: Judicial Review Of Automated Public-Sector Decision-Making’ (2018) 39 SSRN Electronic Journal <https://www.cambridge.org/core/journals/legal-studies/article/abs/administrative-law-and-the-machines-of-government-judicial-review-of-automated-publicsector-decisionmaking/09CD6B470DE4ADCE3EE8C94B33F46FCD> accessed 3 April 2021
Cobbe J, ‘Big Data, Surveillance, and the Digital Citizen’ (2019) Queen’s University Belfast <https://pureadmin.qub.ac.uk/ws/portalfiles/portal/153330563/Thesis_Complete.pdf> accessed 10 May 2021
Cobbe J, Lee M, and Singh J, ‘Reviewable Automated Decision-Making’ [2021] Proceedings of the 2021 ACM Conference on Fairness, Accountability, and Transparency
Copeland B, ‘Alan Turing | Biography, Facts, Computer, Machine, Education, & Death’ (Encyclopedia Britannica, 1972) <https://www.britannica.com/biography/Alan-Turing> accessed 23 May 2021
Council of Europe, ‘Convention For The Protection Of Human Rights And Fundamental Freedoms (European Convention On Human Rights)’ (Council of Europe 2020) <https://www.echr.coe.int/documents/guide_art_8_eng.pdf> accessed 1 April 2021
Coxall B, and Robins L, Contemporary British Politics (2003)
Department of Economic and Social Affairs, Digital Government in the Decade of Action for Sustainable Development(E-Government Survey 2020) <https://www.un.org/development/desa/publications/publication/2020-united-nations-e-government-survey> accessed 2 May 2021
Department of Work and Pensions, ‘Housing Benefit And Council Tax Benefit Circular, HB/CTB S11/2011’ (2011)
Dorey P, ‘The Legacy Of Thatcherism – Public Sector Reform’ [2015] Observatoire de la société britannique
Dunleavy P, The Future Of Joined-Up Public Services (Economic and Social Research Council 2010)
Edwards L, and Veale M, ‘Slave To The Algorithm? Why A Right To Explanation Is Probably Not The Remedy You Are Looking For’ (2017) 16 SSRN Electronic Journal <https://papers.ssrn.com/sol3/papers.cfm?abstract_id=2972855> accessed 25 January 2021
Emery M, ‘Audit Committee: Housing Benefit And Council Tax Reduction Risk Based Verification’ (South Northamptonshire Council 2019) <https://modgov.southnorthants.gov.uk/documents/s23509/SNC%20Audit%20Committee%2014%2003%2019%20Report%20on%20RBV%20Final.pdf> accessed 25 May 2021
European Commission, ‘Communication from the Commission to the European Parliament, The European Council, The Council, The European Economic and Social Committee and the Committee of the Regions: Artificial Intelligence for Europe’ COM (2018) 237 final < https://digital-strategy.ec.europa.eu/en/library/communication-artificial-intelligence-europe> accessed 20 April 2021
European Commission, ‘Guidelines On Consent Under Regulation 2016/679 (Wp259rev.01)’ (2016) <https://ec.europa.eu/newsroom/article29/items/623051> accessed 25 May 2021
European Parliament: Panel for the Future of Science and Technology, ‘Understanding Algorithmic Decision-Making: Opportunities And Challenges’ (European Parliament 2019) <https://www.europarl.europa.eu/RegData/etudes/STUD/2019/624261/EPRS_STU(2019)624261_EN.pdf> accessed 15 April 2021
‘Facing the Camera: Good Practice and Guidance for the Police Use of Overt Surveillance Camera Systems Incorporating Facial Recognition Technology to Locate Persons on a Watchlist in Public Places in England & Wales’(Guidance, Surveillance Camera Commissioner 2020) <https://assets.publishing.service.gov.uk/government/uploads/system/uploads/attachment_data/file/940386/6.7024_SCC_Facial_recognition_report_v3_WEB.pdf> accessed 3 April 2021
Findlay M and others, ‘Ethics, AI, Mass Data And Pandemic Challenges: Responsible Data Use And Infrastructure Application For Surveillance And Pre-Emptive Tracing Post-Crisis’ [2020] SSRN Electronic Journal
Gillespie T, Boczkowski PJ, and Foot KA, ‘The Relevance of Algorithms’ [2014] Media Technologies: Essays on Communication, Materiality, and Society 11 para 3 < https://mitpress.universitypressscholarship.com/view/10.7551/mitpress/9780262525374.001.0001/upso-9780262525374-chapter-9> accessed 18 May 2021
Gretton C, and Honeyman M, ‘The Digital Revolution: Eight Technologies That Will Change Health And Care’ (The King’s Fund, 2016) <https://www.kingsfund.org.uk/publications/digital-revolution> accessed 26 May 2021
Grice W, ‘A Tiny Government Error Has Affected 20,000 Divorced People’ (The Independent, 2015) <https://www.independent.co.uk/news/uk/home-news/ministry-justice-software-glitch-could-see-thousands-revisiting-painful-divorce-settlements-a6777851.html> accessed 26 May 2021
Grieve QC T, ‘The Rule Of Law And The Prosecutor’ (18th Annual Conference and General Meeting of the International Association of Prosecutors (IAP), 2021)
Goffman E, The Presentation of Self in Everyday Life, (Doubleday 1959)
GOV.UK, ‘Government Digital Service’ <https://www.gov.uk/government/organisations/government-digital-service> accessed 27 May 2021
Government Equalities Office, Explanatory Memorandum to the Equality Act 2010 (Specific Duties) Regulations(2011) < https://assets.publishing.service.gov.uk/government/uploads/system/uploads/attachment_data/file/85297/EM.pdf> accessed 1 May 2021
Halbert C, ‘Tough Enough And Woman Enough’ (1997) 21 Journal of Sport and Social Issues
Hampton J, ‘The Moral Education Theory of Punishment’ (1984) 13 Philosophy & Public Affairs 208 <https://www.jstor.org/stable/pdf/2265412.pdf?refreqid=excelsior%3A54672fe4976a6ace3faeda0ba77354dd> accessed 24 May 2021
Harris S, ‘Data Protection Impact Assessments As Rule Of Law Governance Mechanisms’ (2020) 3 Data & Policy <https://www.cambridge.org/core/journals/data-and-policy/article/data-protection-impact-assessments-as-rule-of-law-governance-mechanisms/3968B2FBFE796AA4DB0F886D0DBC165D> accessed 5 April 2021
Hildebrandt M, ‘Profiling And The Rule Of Law’ (2008) 1 Identity in the Information Society
Hildebrandt, M and Gutwirth, S, Profiling the European Citizen: Cross Disciplinary Perspectives (1st edn, Springer, Dordrecht 2018)
HL Deb 13 March 2018, Paper 100 <https://publications.parliament.uk/pa/ld201719/ldselect/ldai/100/100.pdf> accessed 23 May 2021
HM Government, Regulation for the Fourth Industrial Revolution (Cp 111, 2019) <https://assets.publishing.service.gov.uk/government/uploads/system/uploads/attachment_data/file/807792/regulation-fourth-industrial-strategy-white-paper-web.pdf> accessed 21 February 2021
Home Office, Surveillance Camera Code of Practice (Code of Practice, Home Office 2013) < https://assets.publishing.service.gov.uk/government/uploads/system/uploads/attachment_data/file/282774/SurveillanceCameraCodePractice.pdf> accessed 2 April 2021
‘If You’re Told To Self-Isolate By NHS Test And Trace’ (nhs.uk, 2021) <https://www.nhs.uk/conditions/coronavirus-covid-19/self-isolation-and-treatment/if-youre-told-to-self-isolate-by-nhs-test-and-trace-or-the-covid-19-app/> accessed 25 May 2021
Information Commissioner’s Office, ‘Big Data, Artificial Intelligence, Machine Learning And Data Protection’ (Ico.org.uk, 2017) <https://ico.org.uk/media/for-organisations/documents/2013559/big-data-ai-ml-and-data-protection.pdf> accessed 18 March 2021
Information Commissioner’s Office, ‘The UK GDPR’ (Ico.org.uk) <https://ico.org.uk/for-organisations/dp-at-the-end-of-the-transition-period/data-protection-now-the-transition-period-has-ended/the-gdpr/> accessed 25 May 2021
Information Commissioner’s Office, ‘What Is Automated Individual Decision-Making And Profiling?’ (Ico.org.uk, 2018) <https://ico.org.uk/for-organisations/guide-to-data-70 protection/guide-to-the-general-data-protection-regulation-gdpr/automated-decision-making-and-profiling/what-is-automated-individual-decision-making-and-profiling/> accessed 11 March 2021
Information Commissioner’s Office, ‘When Can We Carry Out This Type Of Processing?’ (Ico.org.uk) <https://ico.org.uk/for-organisations/guide-to-data-protection/guide-to-the-general-data-protection-regulation-gdpr/automated-decision-making-and-profiling/when-can-we-carry-out-this-type-of-processing/> accessed 25 May 2021
Information Commissioner’s Office, ‘When do we need to do a DPIA?’(Ico.org.uk) < https://ico.org.uk/for-organisations/guide-to-data-protection/guide-to-the-general-data-protection-regulation-gdpr/data-protection-impact-assessments-dpias/when-do-we-need-to-do-a-dpia/#when10> accessed 10 March 2021
Institute for Employment Studies, ‘Young People’s Future Health Inquiry: The Quality Of Work On Offer To Young People And How It Supports The Building Blocks For A Healthy Life’ (2019) <https://www.employment-studies.co.uk/system/files/resources/files/532_2.pdf> accessed 27 May 2021
Janssen H, ‘An Approach For A Fundamental Rights Impact Assessment To Automated Decision-Making’ (2020) 10 International Data Privacy Law
Joshi I, and Morley J, ‘Artificial Intelligence: How To Get It Right Putting Policy Into Practice For Safe Data-Driven Innovation In Health And Care’ (NHS X 2019) <https://www.nhsx.nhs.uk/media/documents/NHSX_AI_report.pdf> accessed 18 April 2021
Klein N, ‘Naomi Klein: How Power Profits From Disaster’ (the Guardian, 2021) <https://www.theguardian.com/us-news/2017/jul/06/naomi-klein-how-power-profits-from-disaster> accessed 25 May 2021
Knuth D, ‘Ancient Babylonian Algorithms’ (1972) 15 Communications of the ACM <https://dl.acm.org/doi/10.1145/361454.361514> accessed 14 January 2021
Koops, B, ‘Forgetting Footprints, Shunning Shadows. A Critical Analysis Of The “Right To Be Forgotten” In Big Data Practice’ (Script-ed.org, 2021)
Lampson M, ‘Technology And Privacy: The New Landscape, Edited By Philip E. Agre And Marc Rotenberg’ (1999) 50 Journal of the American Society for Information Science
Leslie D, Understanding Artificial Intelligence Ethics And Safety A Guide For The Responsible Design And Implementation Of AI Systems In The Public Sector (The Alan Turing Institute 2019) <https://doi.org/10.5281/zenodo.3240529> accessed 25 May 2021
‘Liberty: Pandemic Must Not Be Used To Normalise Technology That Threatens Our Rights’ <https://www.libertyhumanrights.org.uk/issue/liberty-pandemic-must-not-be-used-to-normalise-technology-that-threatens-our-rights/> accessed 27 May 2021
Lister R, ‘From Windrush To Universal Credit – The Art Of ‘Institutional Indifference’’ (openDemocracy, 2018) <https://www.opendemocracy.net/en/opendemocracyuk/from-windrush-to-universal-credit-art-of-institutional-indifference/> accessed 26 May 2021
Local Government Association, ‘Digital Transformation Programme’ (Local.gov.uk, 2014) <https://www.local.gov.uk/digital-transformation-programme> accessed 23 May 2021
Malgieri G, and Comandé G, ‘Why A Right To Legibility Of Automated Decision-Making Exists In The General Data Protection Regulation’ (2017) 7 International Data Privacy Law <https://papers.ssrn.com/sol3/papers.cfm?abstract_id=3088976> accessed 10 May 2021
Marsh S, ‘Councils Scrapping Use Of Algorithms In Benefit And Welfare Decisions’ (the Guardian, 2021) <https://www.theguardian.com/society/2020/aug/24/councils-scrapping-algorithms-benefit-welfare-decisions-concerns-bias> accessed 26 May 2021
Mann M, and Matzner T, ‘Challenging Algorithmic Profiling: The Limits Of Data Protection And Anti-Discrimination In Responding To Emergent Discrimination’ (2019) 6 Big Data & Society <https://journals.sagepub.com/doi/pdf/10.1177/2053951719895805> accessed 22 May 2021
Mead D, ‘A Socialised Conceptualisation Of Individual Privacy: A Theoretical And Empirical Study Of The Notion Of The ‘Public’ In UK Mopi Cases’ (2017) 9 Journal of Media Law
Mikians J, Gyarmati L, Erramilli J and Laoutaris N, ‘Detecting price and search discrimination on the internet’ (Conference paper prepared for Proceedings of the 11th ACM Workshop on Hot Topics in Networks, October 2012) <https://dl.acm.org/doi/10.1145/2390231.2390245> accessed 10 May 2021
‘Nanny State’ (Dictionary.cambridge.org, 2021) <https://dictionary.cambridge.org/dictionary/english/nanny-state> accessed 25 May 2021
National Health Service, ‘Joining up health and care data’ <https://www.england.nhs.uk/digitaltechnology/connecteddigitalsystems/health-and-care-data/joining-up-health-and-care-data/> accessed 20 May 2021
Nemitz P, ‘Constitutional Democracy And Technology In The Age Of Artificial Intelligence’ (2018) 376 Philosophical Transactions of the Royal Society A: Mathematical, Physical and Engineering Sciences
Noto La Diega G, ‘Against The Dehumanisation Of Decision-Making-Algorithmic Decisions At The Crossroads Of Intellectual Property, Data Protection, And Freedom Of Information’ (2018) 9 JIPITEC <https://papers.ssrn.com/sol3/papers.cfm?abstract_id=3188080> accessed 26 May 2021
Parliament of Australia, ‘Paternalism In Social Policy When Is It Justifiable?’ (2010) <https://www.aph.gov.au/About_Parliament/Parliamentary_Departments/Parliamentary_Library/pubs/rp/rp1011/11rp08> accessed 25 May 2021
Penney JW, ‘Chilling effects: online surveillance and Wikipedia use’ (2016) 31 Berkeley Technology Law Journal <https://papers.ssrn.com/sol3/papers.cfm?abstract_id=2769645> accessed 24 May 2021
Privacy International, ‘Data Is Power: Profiling And Automated Decision-Making In GDPR’ (Privacy International 2021) <https://privacyinternational.org/report/1718/data-power-profiling-and-automated-decision-making-gdpr> accessed 6 January 2021
Privacy International, ‘Mass Surveillance’ (Privacyinternational.org, 2021) <https://privacyinternational.org/learn/mass-surveillance> accessed 25 May 2021
Publications Office of the European Union, ‘Handbook On European Data Protection Law’ (Publications Office of the European Union 2018) <https://fra.europa.eu/sites/default/files/fra_uploads/fra-coe-edps-2018-handbook-data-protection_en.pdf> accessed 11 February 2021
Romele A, Emmenegger C, Gallino F, and Gorgone D, ‘Panopticism Is Not Enough: Social Media As Technologies Of Voluntary Servitude’ (2017) 15 Surveillance & Society
Rovatsos M, Mittelstadt B, and Koene A, ‘Landscape Summary: Bias in Algorithmic Decision-Making: What is bias in algorithmic decision-making, how can we identify it, and how can we mitigate it?’ (UK Government 2019) <https://www.research.ed.ac.uk/en/publications/landscape-summary-bias-in-algorithmic-decision-making-what-is-bia> accessed 24 May 2021
Rustad ML, and Kulevska S, ‘Reconceptualizing the Right to Be Forgotten to Enable Transatlantic Data Flow’ (2015) 28 Harvard Journal of Law & Technology 349 <https://papers.ssrn.com/sol3/papers.cfm?abstract_id=2627383> accessed 10 May 2021
Samuel A, ‘Some Studies In Machine Learning Using The Game Of Checkers’ (1959) 3 IBM Journal of Research and Development <https://ieeexplore.ieee.org/document/5392560> accessed 11 February 2021
Silverio F, ‘Equality Impact Assessment Template’ (Harrow Council 2004) <https://www2.harrow.gov.uk/documents/s128005/Appendix%20B%20EqIA.pdf> accessed 25 May 2021
Sjostrand M and others, ‘Paternalism In The Name Of Autonomy’ (2013) 38 Journal of Medicine and Philosophy
Smiley C, and Fakunle D, ‘From “Brute” To “Thug:” The Demonization And Criminalization Of Unarmed Black Male Victims In America’ (2016) 26 Journal of Human Behavior in the Social Environment
Song G, and Yang P, ‘The Influence of Network Real-name System on the Management of Internet Public Opinion’ [2013] Advances in Intelligent Systems Research <https://doi.org/10.2991/icpm.2013.9> accessed 20 May 2021
‘Stack Overflow Developer Survey 2016 Results’ (Stack Overflow, 2021) <https://insights.stackoverflow.com/survey/2016#developer-profile-age> accessed 27 May 2021
Stanton J, ‘Pre-Screening Equality Impact Assessment’ (Borough Council of Kings Lynn and West Norfolk 2005) <https://democracy.west-norfolk.gov.uk/documents/s5323/CAB111%20EIA.doc.pdf> accessed 25 May 2021
South Wales Police, ‘What Is AFR?’ (What is AFR? South Wales Police) <https://afr.south-wales.police.uk/> accessed 27 May 2021
Stohl C, Stohl M, and Leonardi PM, ‘Managing Opacity: Information Visibility and the Paradox of Transparency in the Digital Age’ (2016) 10 International Journal of Communication <https://ijoc.org/index.php/ijoc/article/viewFile/4466/1530> accessed 26 May 2021
Tavani HT, ‘Informational privacy, data mining, and the Internet’ (1999) 1 Ethics and Information Technology <https://link.springer.com/content/pdf/10.1023/A:1010063528863.pdf> accessed 3 May 2021
Tembo D and others, ‘Is Co-Production Just A Pipe Dream For Applied Health Research Commissioning? An Exploratory Literature Review’ (2019) 4 Frontiers in Sociology
The Comptroller and Auditor General, ‘Challenges In Using Data Across Government’ (National Audit Office 2019) <https://www.nao.org.uk/wp-content/uploads/2019/06/Challenges-in-using-data-across-government.pdf> accessed 26 May 2021
The Royal Society, ‘Explainable AI: The Basics’ (2019) <https://royalsociety.org/-/media/policy/projects/explainable-ai/AI-and-interpretability-policy-briefing.pdf> accessed 19 May 2021
‘The Seven Principles Of Public Life’ (GOV.UK, 2021) <https://www.gov.uk/government/publications/the-7-principles-of-public-life> accessed 26 May 2021
Thommesen J, and Boje Andersen H, ‘Privacy Implications of Surveillance Systems’ [2009] Mobile communication and social policy <https://orbit.dtu.dk/en/publications/privacy-implications-of-surveillance-systems> accessed 21 May 2021
Tiku N, ‘ACLU Says Facebook Ads Let Employers Favor Men Over Women’ Wired (18 September 2018) <https://www.wired.com/story/aclu-says-facebook-ads-let-employers-favor-men-over-women/> accessed 20 April 2021
Turek M, ‘Explainable Artificial Intelligence (XAI)’ (Darpa.mil) <https://www.darpa.mil/program/explainable-artificial-intelligence> accessed 23 May 2021
Ur B and others, ‘Smart, Useful, Scary, Creepy’ (2012) 4 Proceedings of the Eighth Symposium on Usable Privacy and Security – SOUPS ’12 <https://dl.acm.org/doi/10.1145/2335356.2335362> accessed 9 March 2021
van den Bosch K, ‘Human-AI Cooperation To Benefit Military Decision Making’ (The North Atlantic Treaty Organization (NATO) and S&T Organisation 2018) <https://www.researchgate.net/publication/325718292_Human-AI_Cooperation_to_Benefit_Military_Decision_Making> accessed 23 May 2021
Wachter S, Mittelstadt B, and Floridi L, ‘Why A Right To Explanation Of Automated Decision-Making Does Not Exist In The General Data Protection Regulation’ (2017) 7 International Data Privacy Law
Westin AF, Privacy and Freedom (New York: Atheneum 1967)
White K, ‘A Leading AI Ethics Researcher Says She’s Been Fired From Google’ (MIT Technology Review, 2020) <https://www.technologyreview.com/2020/12/03/1013065/google-ai-ethics-lead-timnit-gebru-fired/> accessed 27 May 2021
Youyou W, Kosinski M, and Stillwell D, ‘Computer-Based Personality Judgments Are More Accurate Than Those Made By Humans’ (2015) 112 PNAS <https://www.pnas.org/content/112/4/1036> accessed 15 February 2021
Zheng S, Trott A, Srinivasa S, Naik N, Gruesbeck M, Parkes DC, and Socher R, ‘The AI Economist: Improving Equality and Productivity with AI-Driven Tax Policies’ (Harvard University 2020) <https://arxiv.org/abs/2004.13332> accessed 23 May 2021