Access denied

UX Designer Sam Yu

Most privacy and security mechanisms are designed only with a majority, ‘average’ population in mind, excluding a variety of diverse groups with unique and specific needs. Inclusion and accessibility are critical to the design of all things, but when it comes to security and privacy the stakes are raised. People in traditionally underserved communities are already more likely to live in vulnerable positions on the peripheries of society, and neglecting their privacy and security places them at even greater risk. 

Inclusive security and privacy means designing mechanisms that are considerate of different human abilities, characteristics, needs, identities, and values, so that protective measures cater to the widest possible range of users. 

Here are two examples where privacy and security mechanisms are not inclusive.

‘Real name’ policies 

In an attempt to protect platforms from fake accounts and to prevent people from causing harm under pseudonyms, many online services have implemented ‘real name’ policies. These policies require everyone using the platform to provide their ‘real’ name, which is analysed by software and occasionally by human judgement. Accounts are flagged as fake by detecting characteristics such as names with too many words, symbols, too many capital letters, repeating characters, first names given as initials, and more. Each platform has its own set of rules. 
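
As a rough illustration, the sketch below shows how a handful of such rules might look in code. The thresholds, patterns, and example names are assumptions made for illustration rather than any platform’s actual policy, but even this toy version flags perfectly legitimate names.

```typescript
// A toy version of the kind of name rules described above. The specific
// thresholds, patterns, and example names are illustrative assumptions,
// not any platform's actual policy.
const suspicionRules: Array<(name: string) => boolean> = [
  (n) => n.trim().split(/\s+/).length > 3,            // "too many" words
  (n) => /[^\p{L}\s'’-]/u.test(n),                    // symbols or digits
  (n) => (n.match(/\p{Lu}/gu) ?? []).length > 4,      // "too many" capital letters
  (n) => /(.)\1\1/u.test(n),                          // repeating characters
  (n) => /\b\p{L}\.(\s|$)/u.test(n),                  // first name given as an initial
];

const isFlaggedAsFake = (name: string): boolean =>
  suspicionRules.some((rule) => rule(name));

// Rules like these catch some throwaway accounts, but they also flag
// perfectly genuine names:
console.log(isFlaggedAsFake("J. Smith"));             // true – initial with a full stop
console.log(isFlaggedAsFake("Nguyễn Thị Minh Khai")); // true – more than three words
console.log(isFlaggedAsFake("Jane Smith"));           // false
```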

Despite the good intentions of these policies, the result is that many groups of people are prevented from using their genuine names. The issue has recurred throughout the history of the internet and social media, in disputes colloquially known as the ‘nymwars’. 

Some of the groups affected have been: 

  • Non-Western cultural and ethnic communities, 
  • Non-binary and transgender people, as well as drag performers, 
  • Victims of abuse and harassment who need their identity protected. 

The decision-making tools used to evaluate names have prevented countless people from signing up to platforms and have caused existing accounts to be suspended. Naming policies and requirements can cause major problems such as banned accounts, but they can also amount to microaggressions through disaffordances.

I am personally of Chinese background, with a short, two-character surname. On several occasions I have been unable to register for services online because the name input required a minimum of three characters.  
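
A minimum-length rule like that takes only a line or two to write, which is perhaps why it is so common. The field name and error message below are assumptions, but the effect is exactly what I keep encountering.

```typescript
// A minimal sketch of a three-character minimum on a surname field
// (field name and message are illustrative assumptions).
function validateSurname(surname: string): string | null {
  if (surname.trim().length < 3) {
    return "Surname must be at least 3 characters long.";
  }
  return null; // passes validation
}

console.log(validateSurname("Yu"));    // rejected, along with Ng, Li, Ho and many others
console.log(validateSurname("Smith")); // null – accepted
```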

CAPTCHA 

CAPTCHAs are a security measure used to determine whether a user is a person, protecting platforms from bots and other automated programs. The most common form of CAPTCHA is text-based: a series of distorted characters is displayed, which the user must interpret and type out. Because a computer’s optical character recognition struggles to read the distorted text, this is a reasonably effective way to separate people from machines. Other common forms of CAPTCHA are image-based, where you are asked to identify objects in a set of images, and audio-based, where you are asked to listen for a specific word.  

Text-based CAPTCHAs are inaccessible to some people with disabilities, such as people who are blind or have low vision (and use a screen reader), or people with dyslexia. Screen readers cannot read the distorted text, which is usually presented as an image, and providing alt text would defeat the purpose of the security procedure. Image-based CAPTCHAs present similar vision-related accessibility issues, and audio-based CAPTCHAs cannot be used by people who are Deaf or hard of hearing. Most platforms only offer one form of CAPTCHA, which excludes entire groups of people from using their services. Providing multiple options is best practice for accessibility, as sketched below. 
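
One way to offer that choice is to let the user pick the challenge modality up front. The sketch below is an assumption-laden outline rather than any real CAPTCHA provider’s API: the challenge contents are hard-coded stubs, and a production system would generate distorted text, image sets, or audio clips on the server.

```typescript
// A sketch of serving more than one CAPTCHA modality. Challenge contents
// are hard-coded stubs for illustration; this is not a real provider's API.
type ChallengeKind = "text" | "image" | "audio";

interface Challenge {
  kind: ChallengeKind;
  instructions: string;                   // announced by screen readers as well
  verify: (answer: string) => boolean;
}

function issueChallenge(preferred: ChallengeKind): Challenge {
  switch (preferred) {
    case "audio":
      return {
        kind: "audio",
        instructions: "Listen to the clip and type the word you hear.",
        verify: (a) => a.trim().toLowerCase() === "harbour", // stub answer
      };
    case "image":
      return {
        kind: "image",
        instructions: "Select every tile that contains a bicycle.",
        verify: (a) => a === "1,4,7",                         // stub answer
      };
    default:
      return {
        kind: "text",
        instructions: "Type the characters shown in the picture.",
        verify: (a) => a.trim().toUpperCase() === "X7KQ2",    // stub answer
      };
  }
}

// A person who cannot see the distorted text can request the audio version,
// and vice versa for a person who cannot hear the audio clip.
const challenge = issueChallenge("audio");
console.log(challenge.instructions);
console.log(challenge.verify("Harbour")); // true
```

Even a small change like this means no single impairment rules a person out of the sign-up flow entirely.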

In response to this general lack of inclusive consideration, inclusive security and privacy has emerged as a field of research and practice. By building an understanding of the diversity of user needs and challenges, it creates opportunities to re-examine current ways of working and to develop new, innovative practices. Designing inclusive mechanisms not only benefits underserved communities but can also benefit everyone. 

Inclusive security and privacy focuses on inclusion beyond instrumental accessibility: it is concerned not only with developing accessible mechanisms, but also with addressing wider social and cultural issues. 

To speak with the team at CfID about security, Inclusive Design or accessibility, contact info@cfid.org.au.