Lynne Peskoe-Yang is a freelance science and science fiction writer.

STEM Saturdays: Grief and grit fuel development of 'Yelp for police interactions'

After losing his partner to police violence, Brandon D. Anderson - a queer, formerly homeless army veteran turned community organizer - took his pain to work.

When Brandon D. Anderson first enlisted in 2003, military protocol forbade him from discussing his love life with his peers and superiors. Under the now-abolished policy of Don't Ask, Don't Tell, Anderson was obliged to keep his relationship with his partner, whom he had loved since high school, a secret, until the day he had to ask for permission to see him on his deathbed. Anderson's partner was lying in a hospital bed in Oklahoma City, where he had been brutally beaten by police after a traffic stop in which officers accused him of stealing a car (he hadn't). He died from his injuries soon afterward.

As teenagers, the two young men had been homeless but happy, fearful for their lives and yet utterly devoted to one another. Anderson's mother was frequently incarcerated for nonviolent substance abuse offenses, and his grandparents eventually kicked him out of their home. The relationship survived, and Anderson had just returned from a tour in Iraq as a satellite engineer when it was abruptly cut short by police violence. In the aftermath, Anderson worked odd jobs in the area before enrolling at a community college in Seattle, from which he transferred to Georgetown. After graduation, he set out to dismantle the forces that had caused him so much pain in his youth.

What started as a vision of automating the police complaint system soon shifted toward something less official but far more expansive. Anderson's app, Raheem.ai (formerly SWAT, for "Safety With Accountability and Transparency"), is still in beta, but it already promises a revolution in the relationship between law enforcement and communities of color. The chatbot invites users to contribute accounts of their interactions with police via Facebook Messenger, then aggregates those accounts into an interactive map along with details on the race, gender presentation, and income level of the civilians involved.

Despite their meticulous record-keeping practices, police departments are notoriously stingy about sharing their data - especially when it comes to information that paints officers of the law in a negative light. Most departments have no obligation to strive for transparency with the populations they serve, and the communities historically preyed upon by racist policing have no reason for optimism about the future.

“We’re all on board with creating trust in community government, but we’re not going to wait around for them,” Anderson told The Atlantic. “We’re gonna build this shit ourselves.”

What happens when AI gets too smart for its own rules? [The Technoskeptic]

Repost: The Non-Binary Brain - Why it's so unhelpful to talk about male or female brains [Aeon]
