Computing and society – useful vignettes for teaching about online safety and better, inclusive design ^JB

We have a newish page on our site called Computing and Society. Whenever we come across stories in the news that might be useful when teaching people about keeping safe online, or about designing better and more inclusive systems, we’ll add them there.

There are numerous examples of how better design of a piece of software or technology – or more diversity on the design team, or consideration of diverse users – could have resulted in a better or safer product. These stories touch on the wider themes of computing in society, including personal safety, justice, ethics and discrimination.

We’re also adding our own articles on this blog, some of which will make it into a future edition of CS4FN magazine.

This blog post is a snapshot of what’s on the page so far.

Black Lives Matter

  • Facing up to the problems of recognising faces (27 July 2020) – how the use of facial recognition technology caused the wrong Black man to be arrested [link to page] [link to blog post]
  • Pulse oximeters are small medical devices, used in hospitals (and at home), which shine beams of light (including infrared) through the skin to measure how much oxygen is in someone’s blood. Differences in skin tone can affect the reading, and perhaps the clinical decisions based on it: “How a popular medical device encodes racial bias” (5 August 2020). There’s a sketch of how these readings are calculated after this list.
  • Twitter apologises for ‘racist’ image-cropping algorithm (21 September 2020) – images shared on Twitter are usually cropped automatically to show a preview in the Tweet. It appeared that when an image containing both a Black person’s face and a white person’s face was shared, only the white face was visible in the preview.
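
As background to the pulse oximeter story above: these devices typically compare how much red and infrared light the pulsing blood absorbs, combine the two into a single ‘ratio of ratios’, and convert that into an oxygen percentage using a calibration curve fitted to data from volunteers. Here’s a minimal sketch in Python, using a commonly cited textbook approximation (SpO2 ≈ 110 − 25R) rather than any real device’s curve, of why a calibration that didn’t account for skin tone can misread:

    # A minimal sketch (not a real device's algorithm) of how a pulse
    # oximeter turns light measurements into an oxygen estimate (SpO2).

    def ratio_of_ratios(ac_red, dc_red, ac_ir, dc_ir):
        # Compare the pulsing (AC) part of the signal to the steady (DC)
        # part, at red and at infrared wavelengths.
        return (ac_red / dc_red) / (ac_ir / dc_ir)

    def spo2_estimate(r):
        # Empirical calibration curve: the textbook approximation
        # SpO2 ~ 110 - 25*R, fitted from volunteer data. If that data
        # under-represented darker skin tones, the same curve can
        # systematically misread for some patients.
        return 110.0 - 25.0 * r

    r = ratio_of_ratios(ac_red=0.013, dc_red=1.0, ac_ir=0.025, dc_ir=1.0)
    print(spo2_estimate(r))         # 97.0
    # If extra light absorption (e.g. by melanin) shifts the measured R
    # slightly and the calibration doesn't allow for it:
    print(spo2_estimate(r - 0.08))  # 99.0 – a two-point difference

The point isn’t the exact numbers – it’s that if the volunteers used to fit the curve weren’t diverse, the same formula quietly works less well for some patients.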

Algorithms

  • Channel Five news clip with Timandra Harkness (21 August 2020) – short news clip discussing the use (or misuse) of algorithms, and the decisions resulting from their use, in education and pensions
  • See “Twitter apologises for ‘racist’ image-cropping algorithm” in the Black Lives Matter section above.

Facial recognition technology

  • See “Facing up to the problems of recognising faces” in the Black Lives Matter section above.

Cyber security and privacy fails

  • Never post photos of your flight’s boarding pass publicly, eg on Instagram. It can include your booking reference and flight number – together with your name, that can be enough to let someone else log into your airline account, uncover other information (such as your passport number) and commit identity fraud. “When you browse Instagram and find former Australian Prime Minister Tony Abbott’s passport number” from “Alex” @mangopdf is a great teaching tool on how to uncover (‘hack’) information, but also on how to be a helpful member of society by letting people know there’s been an information breach. It’s a very long but very entertaining blog post (the main ideas are in the first couple of paragraphs if you don’t have time to read the whole thing).
  • Strava (1) can overshare runners’ activities if they don’t (or don’t know how to) switch on the relevant privacy settings (14 September 2020). Writing on Twitter, runner Andrew Seward described how a stranger was tagged in his run after he ran past her (a ‘flyby’), giving him information about the routes she’d recently run – which could give away her movements. Replying to his thread, Cindy Gallop highlighted the need for diverse design teams: “The white male founders of giant tech are not the primary targets (online or offline) of harassment, abuse, racism, sexual assault, violence, rape, revenge porn. So they didn’t, and they don’t, proactively design for the prevention of any of those things” (read more in the article she links to, How to design safe and inclusive social media [14 August 2020]). Although users can switch various settings off (or on), it might be better for the default not to be public sharing, so that people actively choose what to share – see the sketch after this list for what ‘private by default’ might look like. See also Internet sleuths name wrong man in police appeal (9 June 2020)
  • Strava (2) can overshare military secrets! Military personnel, working at otherwise-secret military bases, were unwittingly giving away the location of those bases through wearing devices that shared information about their running routes. Read more at Strava suggests military users ‘opt out’ of heatmap as row deepens and The Strava Heat Map and the End of Secrets, both from 29 January 2018.
  • #PlaneBae: A cautionary tale about privacy when a story goes viral (13 July 2018) – a young woman chatted to the man seated next to her on a flight, and other passengers concocted a budding romance, sharing their conversation and blurred images on Twitter and leading followers to uncover the pair’s identities. While the man enjoyed the attention that followed, the young woman was subjected to doxxing and harassment and found the episode very unpleasant.
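
On that ‘better defaults’ point: here’s a minimal sketch in Python of what ‘private by default’ settings could look like. The option names are made up for illustration (they’re not Strava’s actual settings) – the idea is simply that every sharing option starts switched off, so sharing is always an active, informed choice rather than something a user has to discover and undo:

    from dataclasses import dataclass

    # A sketch of "private by default": nothing is shared until the
    # user explicitly opts in. (Illustrative names, not a real API.)

    @dataclass
    class PrivacySettings:
        share_activities_publicly: bool = False  # only you see your runs
        appear_in_flybys: bool = False           # strangers can't tag/see you
        show_start_location: bool = False        # hide where routes begin
        contribute_to_heatmaps: bool = False     # stay out of aggregate maps

    settings = PrivacySettings()       # a new account: everything private
    settings.appear_in_flybys = True   # sharing only happens by choice
    print(settings)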


Image credit: Social Media Digitization Faces image by Gerd Altmann from Pixabay