Wednesday, December 31, 2014

Algorithmic cruelty

By now, most of us know about Facebook’s algorithmic retrospectives and, of course, how some people found them cruel. Indeed, posts about painful events, such as divorces or deaths, can get a lot of “likes” (where the “like” button means something other than “like”) and comments, and therefore get flagged as worthy of a retrospective by whatever algorithm the Facebook data scientists came up with.

There are a lot of issues here. When someone “likes” a post, they do not necessarily mean they “like” the event the post is about. It could mean a number of different things, such as “I hear you,” or “I’m empathizing with you,” or even “Hang in there.” However, the algorithms treat all likes as equal.
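
Just to make that concrete, here is a toy sketch (in Python, and certainly not Facebook’s actual code) of what an engagement-only ranking might look like. The Post class, the score_post function, and the weights are all made up for illustration:

    # A toy sketch (not Facebook's actual code) of an engagement score that
    # treats every like identically, regardless of what the liker meant.
    # Post, score_post, and the weights are all hypothetical.

    from dataclasses import dataclass

    @dataclass
    class Post:
        text: str
        likes: int      # "like", "I hear you", and "hang in there" all count as 1
        comments: int

    def score_post(post: Post, like_weight: float = 1.0, comment_weight: float = 2.0) -> float:
        """Rank posts for the retrospective purely by engagement volume."""
        return like_weight * post.likes + comment_weight * post.comments

    posts = [
        Post("Beach vacation photos!", likes=40, comments=5),
        Post("We lost our dog this week.", likes=120, comments=60),  # sympathy engagement
    ]

    # The condolence post wins on raw engagement and gets picked for the retrospective.
    print(max(posts, key=score_post).text)

Under any scoring along these lines, the post that attracted the most sympathy is exactly the one that gets surfaced.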

Comments, of course, carry much more sophisticated meaning, but they are much harder to analyze, especially in the presence of sarcasm. And algorithms that do analyze comments (or any free text) for sentiment require a large training set of hand-coded comments. (Which I suppose Facebook does have the resources to generate.)
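
This is roughly what I mean by needing hand-coded training data: a toy sentiment classifier (every comment and label below is invented, and a real system would need a far larger labeled corpus) that learns only from the literal words, which is exactly why sarcasm trips it up:

    # A toy sketch of sentiment analysis over comments, assuming a hand-labeled
    # training set (every comment and label below is invented for illustration).
    # A real system would need a far larger labeled corpus.

    from sklearn.feature_extraction.text import CountVectorizer
    from sklearn.naive_bayes import MultinomialNB
    from sklearn.pipeline import make_pipeline

    train_comments = [
        "So sorry for your loss",
        "Sending you strength, hang in there",
        "Congratulations, so happy for you!",
        "What a great year, love this photo",
    ]
    train_labels = ["painful_event", "painful_event", "happy_event", "happy_event"]

    model = make_pipeline(CountVectorizer(), MultinomialNB())
    model.fit(train_comments, train_labels)

    # A bag-of-words model only sees the literal words, so a sarcastic comment
    # like the one below can easily be scored as a happy event.
    print(model.predict(["Oh great, just what I needed this year"]))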

Which leaves a few ways of handling this problem:

  • Do nothing different. Which is probably my favorite solution, because I’d like to look back on the good, the bad, and the ugly. It’s my life, and I want to remember it. Besides, an event that really sucked at the time (say, a torn ACL leading to surgery) may lead to good things.
  • Add an “I don’t want to see this” button. Which Facebook arguably already did with the “X” button, though maybe not obviously enough.
  • Eliminate the retrospective, which I don’t think anybody would agree is a good solution.

I suppose one day Facebook’s algorithm will be smart enough to withhold posts it knows people don’t want to review, but then that will open up another can of worms.