Racial Bias Observed In Hate Speech Detection Algorithm From Google
About the organization Google: Type: Business. Sub-Types: Website, Search Engine, Computer Software, Application Software, Telecommunication Software

Notable Organizations: Google, TechCrunch

Understanding what makes something offensive or hurtful is difficult enough that many people can’t figure it out, let alone AI systems. And people of color are frequently left out of AI training sets. So it’s little surprise that the Alphabet/Google-spawned Jigsaw manages to trip over both of these issues at once, flagging slang used by black Americans as toxic.
To be clear, the study was not specifically about evaluating the company’s hate speech detection algorithm, which has faced issues before. Instead, the algorithm is cited as a contemporary attempt to computationally dissect speech and assign a “toxicity score,” one that appears to fail in a way indicative of bias against black American speech patterns.
The researchers, at the University of Washington, were interested in the idea that databases of hate speech currently available might have racial biases baked in, like many other data sets that suffered from a lack of inclusive practices during formation.
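For context, a toxicity-scoring service of this kind is typically queried with a short piece of text and returns a probability-like score between 0 and 1. The sketch below is a minimal illustration of such a query, assuming Jigsaw's publicly documented Perspective API (the Comment Analyzer endpoint) and a placeholder API key; the sample comments are generic placeholders, and this is not the researchers' evaluation code.

```python
import requests

# Assumption: Jigsaw's publicly documented Comment Analyzer endpoint (v1alpha1).
PERSPECTIVE_URL = "https://commentanalyzer.googleapis.com/v1alpha1/comments:analyze"
API_KEY = "YOUR_API_KEY"  # placeholder; a real key comes from a Google Cloud project


def toxicity_score(text: str) -> float:
    """Return the TOXICITY summary score (0.0-1.0) the API assigns to `text`."""
    payload = {
        "comment": {"text": text},
        "languages": ["en"],
        "requestedAttributes": {"TOXICITY": {}},
    }
    resp = requests.post(
        PERSPECTIVE_URL, params={"key": API_KEY}, json=payload, timeout=10
    )
    resp.raise_for_status()
    data = resp.json()
    return data["attributeScores"]["TOXICITY"]["summaryScore"]["value"]


if __name__ == "__main__":
    # The study's concern: phrasings associated with black American dialects
    # can receive systematically higher scores than equivalent standard English.
    for sample in ["example comment in standard English", "example comment in dialect"]:
        print(f"{sample!r} -> {toxicity_score(sample):.2f}")
```

The score returned is a single number, which is exactly why dataset bias matters: if annotated training data over-labels one dialect as toxic, that bias is reproduced in every downstream score.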