On Monday, Taylor Lorenz posted a telling story about how Meta has been suppressing access to LGBTQ content across its platforms, labeling it as “sensitive content” or “sexually explicit.” Posts wi…
This is the same platform that helped coordinate violent actions against the Rohingya people in Myanmar. Facebook had essentially no non-English-language moderation there, so the leaders of the violence campaign posted their plans openly on Facebook, and Facebook's recommendation systems picked that content up and pushed it to more people, because anti-Rohingya content was bringing in the most ad dollars and platform engagement.
From Amnesty International:
It revealed that even Facebook’s internal studies dating back to 2012 indicated that Meta knew its algorithms could result in serious real-world harms. In 2016, Meta’s own research clearly acknowledged that “our recommendation systems grow the problem” of extremism.