How do we reconcile social media's toxicity with its utility?

Social media encourages us to observe conflicts and pick sides on topics about which we would otherwise hold few opinions. But what is good for capturing human attention is often bad for the humans whose attention is captured. Emotional reactions like outrage are encouraged because they are strong indicators of engagement, which means any controversy generates more attention. Moral outrage equals virality, and virality keeps you on these platforms. But how could we change that?

https://s3-us-west-2.amazonaws.com/secure.notion-static.com/3252d577-8261-4732-a3a3-148e40d7120f/Bildschirmfoto_2021-01-30_um_16.05.29.png

1: Change the Algorithm

https://s3-us-west-2.amazonaws.com/secure.notion-static.com/8549a61f-3a1e-4d3e-8dc4-8da31dc6db28/Bildschirmfoto_2021-01-30_um_16.05.15.png

https://s3-us-west-2.amazonaws.com/secure.notion-static.com/f41d520b-a237-4c4b-b60d-cb17d97d50f0/Bildschirmfoto_2021-01-30_um_16.05.20.png

2: Measure the Bad Things

For example: train algorithms (and human reviewers) to find the content we don't want to see. Facebook already trains its News Feed algorithm around the metric of what is considered "meaningful" to its users. The problem: many strong human reactions, like outrage, disgust, and anger, are broadly registered as meaningful and are therefore surfaced prominently in the feed.
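A minimal sketch of what such a re-ranking could look like, assuming a hypothetical classifier that scores posts for outrage-style reactions. The field names, scores, and penalty factor below are illustrative assumptions, not a description of any real platform's system:

```python
from dataclasses import dataclass

@dataclass
class Post:
    text: str
    engagement_score: float  # predicted likelihood of interaction (click, react, share)
    outrage_score: float     # hypothetical classifier output in [0, 1]

def rank_feed(posts, outrage_penalty=0.8):
    """Rank posts by predicted engagement, but down-weight content
    that mainly triggers outrage-style reactions.

    outrage_penalty is an illustrative knob, not a real platform parameter.
    """
    def adjusted_score(post):
        return post.engagement_score * (1.0 - outrage_penalty * post.outrage_score)
    return sorted(posts, key=adjusted_score, reverse=True)

feed = [
    Post("Cute dog compilation", engagement_score=0.6, outrage_score=0.05),
    Post("You won't BELIEVE what they said!", engagement_score=0.9, outrage_score=0.85),
]
for post in rank_feed(feed):
    print(post.text)  # the outrage-bait post now ranks below the harmless one
```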

Idea: Define unhealthy content

https://s3-us-west-2.amazonaws.com/secure.notion-static.com/c4c76b12-82cc-41d6-bbbd-d07238348ccc/Bildschirmfoto_2021-01-30_um_16.09.28.png
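One way to make "unhealthy content" concrete is to define an explicit taxonomy and score posts against it. The categories and threshold below are assumptions made up for this sketch, not an established standard:

```python
# Illustrative taxonomy of "unhealthy" content; the categories and
# threshold are assumptions for the sketch, not a platform standard.
UNHEALTHY_CATEGORIES = ("outrage_bait", "harassment", "misinformation")

def is_unhealthy(category_scores, threshold=0.7):
    """category_scores: dict mapping category name -> classifier score in [0, 1].
    A post counts as unhealthy if any category crosses the threshold."""
    return any(category_scores.get(cat, 0.0) >= threshold
               for cat in UNHEALTHY_CATEGORIES)

print(is_unhealthy({"outrage_bait": 0.9}))  # True
print(is_unhealthy({"harassment": 0.2}))    # False
```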

Give users a menu of choices that accurately represent their preferences, not just what they will click on.

https://s3-us-west-2.amazonaws.com/secure.notion-static.com/47441296-19c5-4e14-b09e-5aecbfc2cc37/Bildschirmfoto_2021-01-30_um_16.10.05.png
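A sketch of how such stated preferences could temper click-based signals, assuming users rate topics on a simple more/neutral/less menu. All names and weights here are hypothetical:

```python
# Hypothetical stated-preference weights chosen by the user in a settings
# menu, as opposed to weights inferred from what they click on.
PREFERENCE_WEIGHTS = {"more": 1.5, "neutral": 1.0, "less": 0.3}

def preference_adjusted_score(click_probability, topic, user_menu):
    """Blend the predicted click probability with the user's stated
    preference for the post's topic.

    user_menu maps topic -> 'more' | 'neutral' | 'less'.
    """
    stated = user_menu.get(topic, "neutral")
    return click_probability * PREFERENCE_WEIGHTS[stated]

user_menu = {"politics": "less", "science": "more"}
print(preference_adjusted_score(0.9, "politics", user_menu))  # 0.27: damped despite high clicks
print(preference_adjusted_score(0.4, "science", user_menu))   # 0.6: boosted by stated interest
```

The design point: what people click on under provocation and what they say they want to see are different signals, and the menu lets the second one win.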