On this page we’ll be adding stories about wider themes in computing and society, including justice, ethics and discrimination. These might be used as case studies or vignettes in the classroom, for example to highlight unconsidered consequences of technology. We’re starting with a particular focus on the Black Lives Matter movement.
Copies of some of our articles are also published on our blog.
We also have a page on Positive Stories about Computing.
Black Lives Matter
- Facing up to the problems of recognising faces (27 July 2020) – how the use of facial recognition technology caused the wrong Black man to be arrested [link to page] [link to blog post]
- Pulse oximeters are small medical devices used in hospitals (and at home) which shine beams of light (including infrared light) through the skin to measure how much oxygen is in someone’s blood. Differences in skin tone can affect the reading, and perhaps the clinical decisions that follow: “How a popular medical device encodes racial bias” (5 August 2020)
- Twitter apologises for ‘racist’ image-cropping algorithm (21 September 2020) – images shared on Twitter are usually cropped to show a preview in the tweet. It appears that when images of a Black person and a white person are shared side by side, only the white face is visible in the preview.
- Why are memes of black people reacting so popular online? (8 July 2018) – article looking at ‘digital blackface’, where white people use memes of Black people. See also Are you guilty of emoji blackface? (1 December 2017), which discusses the use of darker skin tone emojis by white people. More generally, white people have tended to prefer the ‘default’ yellow emojis 👍 to the lighter skin tone emojis 👍🏻 – see Why white people don’t use white emoji (9 May 2016), and the short code sketch below for how skin tone emojis are built.
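For anyone curious about the mechanics behind that last story: skin tone variants are not separate pictures. Each one is the base emoji followed by one of five Unicode ‘Fitzpatrick modifier’ code points (U+1F3FB to U+1F3FF). A minimal, purely illustrative Python sketch:

```python
# Skin tone emoji = base emoji + a Unicode "Fitzpatrick modifier"
# code point (U+1F3FB, lightest, through U+1F3FF, darkest).
THUMBS_UP = "\U0001F44D"  # 👍 renders in the default yellow on its own

MODIFIERS = {
    "light": "\U0001F3FB",         # Fitzpatrick type 1-2
    "medium-light": "\U0001F3FC",  # type 3
    "medium": "\U0001F3FD",        # type 4
    "medium-dark": "\U0001F3FE",   # type 5
    "dark": "\U0001F3FF",          # type 6
}

print("default:", THUMBS_UP)  # 👍
for tone, modifier in MODIFIERS.items():
    # Appending the modifier changes the rendered skin tone: 👍🏻 … 👍🏿
    print(f"{tone}: {THUMBS_UP + modifier}")
```

So choosing a skin tone is literally a one-character change to the text of a message, which is part of why the ‘default yellow’ choice discussed in the articles above is itself a meaningful decision.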
Algorithms
- Channel Five news clip with Timandra Harkness (21 August 2020) – short news clip discussing the use (or misuse) of algorithms in education and pensions, and the decisions that result
- See “Twitter apologises for ‘racist’ image-cropping algorithm” in the Black Lives Matter section above.
- Universal Credit Algorithm Is ‘Pushing People Into Poverty,’ Report Finds (29 September 2020) – a “poorly designed” algorithm may overestimate how much money someone is earning, and so (wrongly) adjust their Universal Credit payments downwards.
Data handling
Excel spreadsheet limitations apparently affected Covid-cases tracking (October 2020) – the limitations of Excel as a tool for handling the volume of positive coronavirus tests came to light after several tens of thousands of cases were missed. This is claimed to be due to the maximum number of columns being exceeded (which would be a surprising use of columns instead of rows in itself), meaning that data could not be properly stored. The interim solution was to split the spreadsheet into multiple sheets. This will also have resulted in lost contact-tracing, as people were not ‘on the system’ – a problem for a system designed to record and limit the spread of coronavirus – and it also means that the spread of the disease was worse than first thought. Excel spreadsheets are not databases (Twitter thread).
See also Covid: how Excel* may have caused the loss of 16,000 test results in England (5 October 2020), which also notes other erroneous uses of Excel that led to problems. There seem to be plenty!
*Technically the problem was the inappropriate use of Excel, rather than the software itself – it simply should not have been used for this task.
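To make the failure mode concrete: a single Excel worksheet has hard size limits – 65,536 rows by 256 columns in the legacy .xls format, 1,048,576 rows by 16,384 columns in modern .xlsx – and records beyond those limits simply don’t fit. A minimal Python sketch of the kind of pre-export check that would catch this (the function and figures are our own illustration, not the actual Public Health England pipeline):

```python
# Hard size limits for a single Excel worksheet.
XLS_MAX_ROWS, XLS_MAX_COLS = 65_536, 256          # legacy .xls format
XLSX_MAX_ROWS, XLSX_MAX_COLS = 1_048_576, 16_384  # modern .xlsx format

def fits_in_one_sheet(n_rows: int, n_cols: int, legacy: bool = False) -> bool:
    """Return True if an n_rows x n_cols table fits in a single worksheet."""
    max_rows = XLS_MAX_ROWS if legacy else XLSX_MAX_ROWS
    max_cols = XLS_MAX_COLS if legacy else XLSX_MAX_COLS
    return n_rows <= max_rows and n_cols <= max_cols

# Illustrative: 70,000 test results stored one record per row.
if not fits_in_one_sheet(n_rows=70_000, n_cols=5, legacy=True):
    print("Won't fit in a legacy .xls sheet - rows past 65,536 risk being silently dropped.")
```

The safer design, as the Twitter thread above says, is a proper database, which has no such ceiling on the number of records.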
Facial recognition technology
- See our article ‘Facing up to the problems of recognising faces’ in the Black Lives Matter section above.
- Woman, 35, arrested in UK first after new live facial recognition cameras rolled out (27 February 2020) Evening Standard – the ethnicity of the woman arrested is not given.
Cyber security and privacy fails
- Never post photos of your flight’s boarding pass publicly, eg on Instagram. It can include your passport number and your flight number – that can be enough to let someone else log into your account (they probably have your name too), uncover other information and commit identity fraud. “When you browse Instagram and find former Australian Prime Minister Tony Abbott’s passport number” from “Alex” @mangopdf is a great teaching tool on how to uncover (‘hack’) information, but also on how to be a helpful member of society by letting people know that there’s an information breach. It’s a very long but very entertaining blog post (if you don’t have time to read the whole thing, the main ideas are in the first couple of paragraphs).
- Strava (1) can overshare runners’ activities if they don’t (or don’t know how to) switch to the relevant privacy settings (14 September 2020). Writing on Twitter, runner Andrew Seward shared a data leak in which someone he didn’t know was tagged in his run after he ran past her (a ‘flyby’) while out. It gave him information about the routes she’d recently run, which could give away her movements. Replying to his thread, Cindy Gallop highlighted the need for diverse design teams – “The white male founders of giant tech are not the primary targets (online or offline) of harassment, abuse, racism, sexual assault, violence, rape, revenge porn. So they didn’t, and they don’t, proactively design for the prevention of any of those things” (read more in the article she links to, How to design safe and inclusive social media [14 August 2020]). Although users can switch various settings off (or on), it might be better for the default setting not to be public sharing, so that people actively choose what to share – see the sketch after this list for what ‘private by default’ might look like. See also Internet sleuths name wrong man in police appeal (9 June 2020).
- Strava (2) can overshare military secrets! Military personnel, working at otherwise-secret military bases, were unwittingly giving away the location of those bases through wearing devices that shared information about their running routes. Read more at Strava suggests military users ‘opt out’ of heatmap as row deepens and The Strava Heat Map and the End of Secrets, both from 29 January 2018.
- #PlaneBae: A cautionary tale about privacy when a story goes viral (13 July 2018) – a young woman chatted to a young man who had been seated next to her on a flight, and other passengers concocted a budding romance, sharing the pair’s conversation and blurred images on Twitter; their followers then uncovered the pair’s identities. While the man enjoyed the subsequent attention, the young woman was subjected to doxxing and harassment and found the episode very unpleasant.
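Picking up the ‘defaults’ point from the Strava items above: ‘private by default’ means a new account shares nothing until the user actively opts in. A minimal, hypothetical Python sketch of the idea (the setting names are ours, not Strava’s):

```python
from dataclasses import dataclass

@dataclass
class PrivacySettings:
    # Hypothetical setting names; every default is the most private
    # option, so any sharing is an explicit opt-in by the user.
    share_activities_publicly: bool = False
    appear_in_flybys: bool = False
    include_in_heatmaps: bool = False

    def opt_in_to_public_sharing(self) -> None:
        """Called only after the user explicitly confirms a prompt."""
        self.share_activities_publicly = True

settings = PrivacySettings()      # a new account shares nothing
print(settings.appear_in_flybys)  # False until the user opts in
```

The design choice is the interesting part for discussion: with opt-in defaults, the cost of user inaction is privacy rather than exposure, which would have avoided both the flyby leak and the military heatmap story.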