Community Moderation: A Beginner’s Guide to Keeping a Community Healthy

You know that a community is thriving when its members actively kickstart discussions and participate in activities. It is a sign of a healthy relationship between members, one where veterans encourage and welcome the opinions of newcomers.

After all, no one wants to enter a place where their opinions are not considered, right?

For brand communities, it is important to create a platform where users feel comfortable initiating or responding to a conversation. They should feel free to talk about what is on their mind.

At the same time, you need to ensure that they do not engage in any activity that affects the community negatively. Rude posts and promotional spam can quickly pollute the environment of a community and erode its overall well-being.


Community Guidelines

The first step is to create Community Guidelines that lay out the policies of the community. Community Guidelines explain the purpose of the community and how members are expected to behave. Most communities ban content that is offensive, harassing, or provocative, or that promotes a third-party product or service.

At the same time, it is important to pick the right tone and rules depending on the community’s purpose and goals so that users do not feel lost.

For example, in a community where developers come together to share and learn code, it is safe to set a polite and relatively formal code of conduct so that activities can proceed without a hitch.

On the other hand, a community aimed at bringing sports lovers together may prefer a relatively flexible set of community guidelines that encourages animated discussion during the fever of major sporting events.


Community Moderation

While the community guidelines clarify which actions are considered appropriate, it is the job of community moderation to enforce them. This means being on the lookout for, and removing, content that violates the Community Guidelines. The moderation team is responsible for ensuring that discussions and activities in the community run smoothly.

They need to follow a fixed moderation process to reduce confusion between the members of the community and the moderation team.

Like the guidelines, it is important to decide on a process that best suits the community's goals.


Types of Community Moderation

1. Self-Moderation

In this type of moderation, anyone is free to post content in the community. Members are given the option to flag content that they consider inappropriate. This alerts the moderation team, which can then decide whether to remove the content or keep it.

This method is ideal when looking to encourage community users to take initiative. It provides them with the freedom to communicate as they wish while giving them the responsibility to maintain the health of the community.

2. Full Moderation

No content can be posted without consent from the moderation team. This method is commonly used in threads and groups where it is important to maintain clean content. These are effective for company announcements or pages that contain knowledge resources.

3. AI-Based Moderation

Artificial intelligence can be integrated to make the moderation process faster and more efficient. A certain set of keywords can be flagged as inappropriate, according to the community guidelines.

Similarly, posts containing suspicious media (such as violent or pornographic material) and personal identifiers (such as phone numbers) can also be auto-detected and flagged by the system. If any such content is posted, it gets added to the moderation queue for the moderators to resolve. It helps identify violating posts as soon as they arrive, saving the time needed to look for them manually.
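As a minimal sketch of this idea (the keyword list and the phone-number pattern below are illustrative assumptions, not Glynk's implementation; a real system would load them from the community's guideline configuration):

```python
import re

# Hypothetical banned-keyword list, stand-ins for whatever the
# community guidelines actually prohibit.
BANNED_KEYWORDS = {"freemoney", "buynow"}

# Rough pattern for phone-number-like personal identifiers:
# an optional "+" followed by a long run of digits, spaces, or dashes.
PHONE_PATTERN = re.compile(r"\+?\d[\d\s-]{7,}\d")

def auto_flag(post_text: str) -> list[str]:
    """Return the reasons a post should enter the moderation queue.

    An empty list means the post passes the automatic checks.
    """
    reasons = []
    words = {w.lower() for w in re.findall(r"\w+", post_text)}
    if words & BANNED_KEYWORDS:
        reasons.append("banned keyword")
    if PHONE_PATTERN.search(post_text):
        reasons.append("personal identifier (phone number)")
    return reasons
```

Anything that `auto_flag` returns a non-empty list for would be routed to the moderation queue instead of (or alongside) going live, depending on the moderation attitude discussed below.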

4. Audit-Based Moderation

In some select cases, moderators may want to go through posted content proactively and take spontaneous action. For example, if a conversation thread has already served its purpose and a parallel conversation between a few members has started on it, the moderator may decide to close that thread.

Tool Tip: Glynk’s platform notifies moderators when content is flagged as inappropriate by users, or when it contains a word or type of media that is listed as unacceptable. Moderators can also directly supervise and/or rectify any content from the admin console.

In most of the above cases, flagged posts or users are added to the moderation queue. Depending on the attitudes towards community moderation, as dictated by community guidelines and code of conduct, appropriate action is to be taken.


Attitudes of Community Moderation

1. To err is human

All posts in the moderation queue are kept active until they are reviewed and, if needed, modified. This attitude encourages conversation but requires a degree of self-moderation from the members of the community.

For example, in an alumni community, the standards of decency expected in conversations are quite high. More so because any form of professional network incentivizes good behavior on the part of members.

2. Better safe than sorry

Posts in the moderation queue are hidden from members till they are approved or modified by the team. This attitude helps in ensuring that sensitive content is immediately removed from the feed, reducing the chances of friction between members.

For example, in an LGBTQ+ community, a person who looks like a woman but identifies as a man may post a topless photograph taken on a beach. This form of self-expression may be within the rules, since the same photo would not classify as nudity if a cis man had posted it. However, there is a chance of polarizing reactions, which need to be proactively managed by the moderator.
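The two attitudes boil down to a single visibility decision for flagged-but-unreviewed posts. A minimal sketch of that decision (the class and function names are illustrative, not Glynk's API):

```python
from dataclasses import dataclass
from enum import Enum

class Policy(Enum):
    KEEP_VISIBLE = "to err is human"        # queued posts stay live until reviewed
    HIDE_FIRST = "better safe than sorry"   # queued posts are hidden until approved

@dataclass
class Post:
    text: str
    flagged: bool = False    # True once a user or the system flags the post
    approved: bool = False   # True once a moderator has cleared it

def is_visible(post: Post, policy: Policy) -> bool:
    """Decide whether a post shows in the feed under the given attitude."""
    if not post.flagged or post.approved:
        return True
    # Flagged and not yet reviewed: the attitude decides.
    return policy is Policy.KEEP_VISIBLE
```

Everything else about the queue (review order, notifications) can stay the same; only this one predicate changes between the two attitudes.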


Corrective Measures & Disciplinary Action

When violating content is posted in the community, it is the moderation team's role to make the right call. Depending on the seriousness of the violation and how frequently the member violates the guidelines, the team can take corrective measures of varying severity.

They include:

1. Rejecting a post

Users are notified when their post has been rejected along with clear communication on the grounds for rejection. They are given the option to modify the content to make it appropriate and post it again.

2. Deleting a post

If the content of a comment or post is found in extreme violation of the community’s policies and adds no real value to the conversation, it may be deleted. Rejected posts that are not modified are deleted as well.

3. Temporary ban on the user’s account

If a user repeatedly goes against the community guidelines, the moderation team can warn them about their violations before banning their account temporarily. Users cannot perform activities until the ban is lifted.

4. Permanent ban

Permanent bans are a strict measure. They should be reserved for users who repeatedly misbehave despite warnings or temporary bans. Such accounts are frozen indefinitely unless the user appeals for a reversal and is approved.

5. Account Deletion

Account deletion is the last resort in content moderation. The member's profile and all of their activity are deleted from the community.
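The five measures above form an escalation ladder. A minimal sketch of how a team might map a member's violation history to an action (the thresholds here are illustrative assumptions, not a prescription):

```python
# Measures ordered from mildest to most severe, matching the list above.
MEASURES = [
    "reject_post",
    "delete_post",
    "temporary_ban",
    "permanent_ban",
    "delete_account",
]

def corrective_measure(violation_count: int, severe: bool = False) -> str:
    """Pick a corrective measure from the escalation ladder.

    violation_count counts this member's guideline violations, including
    the current one; severe marks an extreme violation of policy.
    """
    if severe:
        # Extreme violations skip the rejection step: the post is deleted
        # outright, and repeat severe offenders are banned permanently.
        return "delete_post" if violation_count < 3 else "permanent_ban"
    # Otherwise escalate one step per violation, capped at the last measure.
    index = min(violation_count - 1, len(MEASURES) - 1)
    return MEASURES[max(index, 0)]
```

The exact thresholds matter less than writing them down: a fixed, shared ladder is what keeps moderators consistently objective from one case to the next.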

Tool Tip: Glynk can send notifications to users when their content is being moderated. The notification can explain what the user did wrong so that they do not repeat it. It also maintains transparency with members, ensuring that they are not alienated.


Important tips to note when setting the moderation process

  1. Community guidelines should always be easy to understand and easily accessible to the users. Sending a copy of the guidelines during user onboarding is a good practice. Users should also be notified whenever any revision is made so that they stay updated.
  2. The moderation team can include members of the community as well, which lightens the workload and builds a sense of ownership. It is important to select members who have shown their value to the community and who are happy to moderate conversations.
  3. It is necessary to make a clear moderation process and share the guidelines with every moderator. The key is to be consistently objective & empathetic.
  4. When working on a large scale, it is alright to distribute the flagged content in the moderation queue between the team, as long as there is a supervisor who can make the final call.

While managing a community, look for what the members want. It is important to listen to your members. While you should not stray from the policies of the community without reason, you can always come to a decision that will satisfy your members.

At the same time, to run a community well, you need a platform that can support and enable routine tasks like administration and moderation. That is why the platform on which a community is built requires a certain level of technical and quality-oriented infrastructure.

That’s why many brands have shifted to ready-made, fully functional community platforms. These reduce the time, cost, and resources required to create a customer community. The logic is simple: rather than spend resources developing a platform, spend them on a strategy that enriches the experience of the members.

Manage your community easily with Glynk. Every feature you need is accessible from one place.

Natarajan H K

Founder & CEO at Lovemarriage.app

The goal of community moderation should be to define the right boundaries and set up the right systems, so that users can engage freely. A mix of manual and auto moderation is ideal. Moderation is not just about getting rid of bad posts, but also an opportunity to reiterate and remind every member of the purpose of the community. So the idea should be to educate members rather than restrict them. What they see is what they create, so the initial moderation should ensure that the right content is delivered to the members. They will start adhering to and enforcing the guidelines on their own. An ideal community is one where users voluntarily choose to become moderators. That's the end state any moderation strategy or system should aim for.
