
Discriminatory design of information systems

  • Biased computer software design
  • Biased computerized information displays

Nature

Discriminatory design of information systems refers to the creation or implementation of digital technologies that, deliberately or inadvertently, disadvantage certain groups on the basis of attributes such as race, gender, age, or disability. The problem arises from biased data, algorithms, or design choices that perpetuate social inequalities, producing unfair outcomes in areas such as hiring, lending, and access to services. Discriminatory design undermines trust, reinforces stereotypes, and can lead to legal and ethical violations. Addressing it requires inclusive design practices, diverse development teams, and ongoing evaluation to ensure equitable and just information systems for all users.

This information has been generated by artificial intelligence.

Background

The discriminatory design of information systems emerged as a global concern in the late 20th century, as researchers and advocacy groups began documenting systemic biases embedded in software, databases, and algorithms. High-profile cases, such as racially biased facial recognition and exclusionary digital services, drew international attention and prompted interdisciplinary studies and policy debates. The worldwide proliferation of digital technologies has since intensified scrutiny of how design choices perpetuate social inequalities, spurring calls for ethical and inclusive system development.

Incidence

Two American airlines rigged their computerized reservation systems, on which travel agency business depends, so that their own flight information received more display-screen prominence than competitors' flights.
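The mechanism described above can be made concrete with a minimal sketch. All carrier names, prices, and field names below are hypothetical; the point is only that the bias can live entirely in the display's sort key while the underlying flight data remains accurate.

```python
# Hypothetical illustration of display-screen bias in a reservation system.
HOST_CARRIER = "HostAir"  # the airline that owns the reservation system

flights = [
    {"carrier": "RivalAir", "price": 180, "departs": "09:00"},
    {"carrier": "HostAir",  "price": 210, "departs": "09:15"},
    {"carrier": "RivalAir", "price": 195, "departs": "10:30"},
]

def neutral_rank(flight):
    # A fair ordering: cheapest flight first.
    return flight["price"]

def biased_rank(flight):
    # Same data, but host-carrier flights are bumped to the top of the
    # screen regardless of price -- the bias is in the sort key, not the data.
    return (0 if flight["carrier"] == HOST_CARRIER else 1, flight["price"])

print([f["carrier"] for f in sorted(flights, key=neutral_rank)])
# ['RivalAir', 'RivalAir', 'HostAir']
print([f["carrier"] for f in sorted(flights, key=biased_rank)])
# ['HostAir', 'RivalAir', 'RivalAir']
```

Because travel agents tend to book from the top of the screen, such a ranking rule disadvantages competitors without any visible falsification of the displayed information.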

Claim

The discriminatory design of information systems is a critical and urgent problem. When technology embeds bias, it perpetuates inequality, marginalizes vulnerable groups, and undermines trust in digital solutions. Ignoring the issue allows injustice to be coded into the very systems shaping our lives. We must demand accountability and inclusive design, or risk technology becoming a tool for oppression rather than empowerment. This is not just a technical flaw; it is a moral crisis.

Counter-claim

Concerns about the so-called "discriminatory design of information systems" are vastly overstated. Technology is inherently neutral; any perceived bias reflects user input, not system design. Focusing on this issue distracts from real technological progress and innovation. Instead of obsessing over hypothetical discrimination, we should prioritize efficiency and functionality. Worrying about discriminatory design is an unnecessary distraction, not a significant problem demanding attention or resources.

Broader

Narrower

Aggravated by

Related

Strategy

Showing bias
Interworking

Value

Misinformation
Information
Disinformation
Deformation
Bias

SDG

Sustainable Development Goal #10: Reduced Inequality

Metadata

Database
World problems
Type
(D) Detailed problems
Biological classification
N/A
Subject
  • Communication » Exhibitions
  • Cybernetics » Systems
  • Design » Design
  • Informatics, classification » Informatics
  • Information » Information
  • Societal problems » Imbalances
Content quality
Unpresentable
Language
English
1A4N
D7450
DOCID
11474500
D7NID
139907
Last update
May 19, 2022