Every day there are many calls for Facebook to take down, leave up, or restore some piece of content on its platform.
And every day Facebook tries to strike the right balance between safeguarding free speech and protecting people’s safety, between what is and what is not acceptable on our platforms across many countries and continents. The question then is: where do we draw the line? And more to the point: who decides? How does Facebook ensure its decisions are fair, transparent, and free from our own biases? And what recourse should people have when their content is taken down?