In case you missed it, last week (November 30), the National Telecommunications and Information Administration (NTIA) announced that it would convene a series of virtual listening sessions on privacy, equity, and civil rights. According to the NTIA, the sessions (scheduled for December 14, 15, and 16) will provide data for a report on how commercial data flows of personal information can lead to disparate impacts and outcomes for marginalized or disadvantaged communities.
The NTIA cites the following examples to illustrate how data collection, “even for legitimate purposes,” leads to disparate impacts:
- Digital advertising offers content and opportunities based on indirect indicators of race, gender, disability, and other characteristics, perpetuating historical patterns of discrimination.
- Insurance companies use information such as neighborhood safety, bankruptcies, and gun ownership to infer who will need expensive health care, justifying higher premiums.
- Universities predict which students will have academic difficulties based on factors such as race.
Why is this news?
As our readers may have noticed, the NTIA is not the first agency or constituency to make the connection between data collection and discrimination. In 2013, Harvard professor Latanya Sweeney published a groundbreaking study showing racial discrimination and stereotyping in online search and ad serving. In 2014, the FTC hosted a workshop, followed by a report (Big Data: A Tool for Inclusion or Exclusion?) detailing the issue and making recommendations for businesses and researchers. In recent years, dozens of studies and conferences have examined the discriminatory assumptions embedded in algorithms and artificial intelligence (AI). Civil rights groups have raised concerns for years, and in 2019 they secured a landmark settlement with Facebook to end discrimination on its online advertising platform.
The NTIA announcement is nevertheless significant for two reasons. First, by its own description, the NTIA is the President’s principal adviser on information policy matters, charged with assessing the impact of technology on privacy and the adequacy of existing privacy laws. Further, its announcement states that the listening sessions are designed to “build the factual record for further policy development in this area.” For these reasons, the notice has been touted as the administration’s “first step” on privacy and a possible attempt to revive stalled efforts by Congress to enact federal privacy law.
Second, the NTIA announcement confirms that the link between privacy and civil rights is now a widely accepted policy position and will remain at the center of any debate over whether to pass a comprehensive federal privacy law. Whereas it was once debated whether civil rights provisions should be “added on” to privacy legislation, they are now considered essential elements.
This is true not only among Democrats, but also among Republicans. For example, provisions addressing discrimination and/or algorithmic decision-making appear in recent privacy legislative proposals not only from Representative Eshoo and Senator Cantwell, but also from Senator Wicker and Republican Members of the House Energy and Commerce (E&C) Committee. The Republican E&C bill is particularly notable for the weight it gives the issue, banning data practices that “discriminate or make economic opportunity unavailable on the basis of race, color, religion, national origin, sex, age, political ideology, or disability or class of persons.”
But what does this mean for businesses today?
You might be wondering what this means for businesses now, while Congress is still (endlessly) debating whether to pass federal privacy legislation. It means that:
- Data discrimination will be on everyone’s radar if and when Congress finally decides to pass federal privacy legislation.
- Businesses should expect stricter enforcement – even now, under existing laws – challenging data practices that lead to discriminatory results. These laws include the FTC Act (recently used to challenge racial profiling by a car dealership), state UDAP laws, the Fair Credit Reporting Act, the Equal Credit Opportunity Act, and (of course) civil rights laws.
- To avoid discrimination (and any allegation of discrimination), companies should test their data systems and their use of algorithms and AI for accuracy and fairness before deploying them in the real world.