Battling algorithmic bias at TC Sessions: Justice

At TC Sessions: Justice on March 3, we’re going to dive head-first into data discrimination, algorithmic bias and how to ensure a more just future, as technology companies rely more on automated processes to make decisions.

Algorithms are sets of rules that computers follow in order to solve problems and make decisions about a particular course of action. But there is an inherent problem with algorithms that begins at the most basic level and persists throughout their adoption: human bias that is baked into these machine-based decision-makers.

Algorithms driven by bad data are what lead to biased arrests and imprisonment of Black people. They’re also the same kind of algorithms that led Google to label photos of Black people as gorillas and that turned Microsoft’s Tay bot into a white supremacist.

At TC Sessions: Justice, we’ll hear from three experts in this field. Let’s meet them.

Dr. Safiya Umoja Noble

An associate professor at the University of California, Los Angeles, previously a professor at the University of Southern California, and author of “Algorithms of Oppression: How Search Engines Reinforce Racism,” Noble has become known for her analyses of the intersection of race and technology.

In her aforementioned book, Noble discusses the ways in which algorithms are biased and perpetuate racism. She calls this data discrimination.

“I think that the ways in which people get coded or encoded particularly in search engines can have an incredible amount of harm,” Noble told me back in 2018 on an episode of TC Mixtape, formerly known as CTRL+T. “And this is part of what I mean when I say data discrimination.”

Mutale Nkonde


It’s important to explicitly call out race in order to create just technological futures, according to Nkonde. In her research paper, “Automated Anti-Blackness: Facial Recognition in Brooklyn, New York,” Nkonde examines the use of facial recognition and the history of the surveillance of Black people in New York, and presents potential ways to regulate facial recognition in the future.

Nkonde is also a United Nations adviser on race and artificial intelligence and is currently working with Amnesty International to advance a global ban on facial recognition technology.

Haben Girma


The author of the memoir “Haben: The Deafblind Woman Who Conquered Harvard Law” and a human rights lawyer, Girma focuses on advancing disability justice.

At Sight Tech Global last month, Girma spoke about how discussions of algorithmic bias as it pertains to race have become somewhat normalized, but those conversations too often exclude the effects of algorithms on disabled people. Girma told me that when it comes to robots, for example, the topic of algorithmic bias is lacking among developers and designers.

“Don’t blame the robots,” she said. “It’s the people who build the robots who are inserting their biases that are causing ableism and racism to continue in our society. If designers built robots in collaboration with disabled people who use our sidewalks and blind people who would use these delivery apps, then the robots and the delivery apps would be fully accessible. So we need the people designing the services to have these conversations and work with us.”

If you’ve made it this far in the post, you’re probably wondering how to attend. Well, you can snag your ticket right here for just $5.

Source: Tech Crunch
