Social media platforms such as Facebook, YouTube, and Twitter have millions of users logging in every day, using these platforms for communication, entertainment, and news consumption. These platforms adopt rules that determine how users communicate and thereby limit and shape public discourse.2
Platforms need to deal with large amounts of data generated every day. For example, as of October 2021, 4.55 billion social media users were active, each using an average of 6.7 platforms per month.3 As a result, platforms were compelled to develop governance models and content moderation systems to deal with harmful and undesirable content, including disinformation. In this study:
• ‘Content governance’ is defined as a set of processes, procedures, and systems that determine how a given platform plans, publishes, moderates, and curates content.
• ‘Content moderation’ is the organised practice by which a social media platform pre-screens, removes, or labels undesirable content to reduce the damage that inappropriate content can cause.