Social Pinpoint's moderation system screens content contributed by your participants, ensuring it stays appropriate and respectful so you can engage with confidence. It uses both automated and human processes to review and remove undesirable content on your behalf.

Certain participation tools, including the Social Map, Gather, Visioner, Conversation, and Forum, allow your participants to post content such as comments and images that are visible to all visitors of your site.

Our moderators monitor your site 24/7 and review content posted by your participants, usually checking it within a few minutes after it's submitted.

If they think the content might violate one of our Moderation Rules, they'll refer it to you for further action and temporarily remove it from your site. You can then decide to either permanently remove the content from your site or reinstate it if appropriate.

You also have the option to 'self-moderate' your content, which removes our moderators from the process, giving you complete control of (and responsibility for) managing your participants' public content.

Automated processes such as a Banned Words filter and anti-SPAM technology also help keep undesirable content from appearing on your site.

Moderation rules

Our moderators review content against the nine rules contained in our Moderation Policy:

  1. Content may not contain personal attacks on other participants, staff or public figures;
  2. Content may not intentionally or inadvertently disclose sensitive personal and/or confidential information about any other participants, staff or public figures (including their own, unless it is specifically requested; for example, the Social Map may ask participants to identify a specific address so an issue can be located);
  3. Content may not contain discriminatory language relating to the gender, race, religion, culture, sexual preference, appearance or background of a person;
  4. Content may not contain language perceived as threatening, inciting violence or encouraging, endorsing, approving or recommending the performance of dangerous or illegal acts;
  5. Content may not contain language perceived as offensive including profanity (not included in the Banned Words list);
  6. Content is considered duplicate if it appears multiple times in a perceivably inappropriate way;
  7. Content is considered off-topic or SPAM;
  8. Content contains links to advertising or illegal material; and
  9. Image content is considered graphic if it contains portrayals of (for example) acts of violence, hate speech, terrorism, pornography, racism and self-harm.

The moderation rules are set, and you cannot add or modify them.

The moderation process

Both Social Pinpoint and the Customer share the responsibility of moderation, so it's important to understand how the moderation process works.

First, our moderators perform an initial assessment of any content submitted by your participants against our standard Moderation Rules. If they believe the content doesn't meet our Moderation Rules or feel uncertain whether the content is appropriate, they'll refer it to you for further review.

When our moderators refer content to you, it is temporarily removed from public view. At the same time, any users you assign to the 'moderator' role will receive an email notification letting them know of the referral.

Our moderators do not 'reject' content outright, so you are responsible for making the final decision to approve or reject it, which you can do through the provided moderation tools.

You can nominate moderators for your whole site or specific projects only. Moderators assigned at the project level can only receive notifications and perform moderation actions related to that project.

When our moderators refer content to you, the email notification indicates the reason for their referral:

  • SPAM - Content appears to be machine-generated or contains advertising.
  • Duplicate - Content is identical or similar to other contributions by the same contributor.
  • Inappropriate - Content is off-topic, reveals personal details, or contains illicit or vulgar content.
  • Discrimination - Content is considered discriminatory towards a particular individual or group.
  • Other - Content is deemed unsuitable for any other reason.

The email also includes a link to the Moderation area of the Dashboard where you can review the content and take action to 'approve' or 'reject' it (see instructions for using the moderation interface below).

Our moderators do not review video content, so you will need to screen this content internally. Your nominated moderators will receive an email notification when video content is posted (via the Gather tool).

Moderation methods

You can control how the moderation process works on your site by adjusting the moderation method. There are two options to choose from:

  • Pre-moderation - Content is reviewed and approved by the moderators before it is published. This method is useful when activities are considered high-risk or when issues with inappropriate content reoccur.
  • Post-moderation - Content is immediately published and is subsequently reviewed by the moderators. This method is considered best practice, and we strongly recommend using it by default in most cases.
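The practical difference between the two methods comes down to whether content is visible before review. A minimal sketch (illustrative only; not Social Pinpoint's actual implementation):

```python
from enum import Enum

class ModerationMethod(Enum):
    PRE = "pre-moderation"
    POST = "post-moderation"

def is_visible_before_review(method: ModerationMethod) -> bool:
    """Whether a new contribution appears publicly before moderators see it.

    Hypothetical sketch: pre-moderated content stays hidden until approved;
    post-moderated content is published immediately and reviewed afterwards.
    """
    return method is ModerationMethod.POST

print(is_visible_before_review(ModerationMethod.POST))  # True
print(is_visible_before_review(ModerationMethod.PRE))   # False
```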

You can set the moderation method you prefer to use at a default site level. However, you also have the option to override this default setting in the participation tools if necessary, so you can choose the method that best supports your project.

[Diagram showing the moderation workflows for processing contributions]

Two different moderation methods give you control over how your content is reviewed and approved.

Self Moderation

You can remove our moderators from the screening process by enabling the self-moderation feature. This option makes you fully responsible for reviewing all content.

You can still choose whether content is pre-moderated or post-moderated.

Typically, self-moderation is only necessary where the engagement collects sensitive information, or there is a need for a greater level of control.

Please note: With this option, you are fully responsible for moderation, and you should still consider user experience when planning your moderation timeline. For your community to continue a conversation with you, comments and submissions need to be moderated in a timely manner. We recommend that you follow similar guidelines to our own moderators and ensure that all comments and submissions are moderated within 2 hours.


Automated content screening

Social Pinpoint uses automated processes to screen content before a contribution can enter the moderation system. These features prevent participants from posting obvious profanity and detect automated 'bot' activity, reducing the need for human review.

Banned words

Social Pinpoint uses a 'banned words' filter to screen your participants' content. Participants who try to post content containing a word from the Banned Words list will have their contribution refused on submission.

You can customise the Banned Words list, which contains hundreds of words out of the box (see 'Key actions' below). You should review this list to make sure it is appropriate for your organisation.
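To illustrate how a filter like this typically works, here is a minimal sketch of whole-word matching against a banned list. The word list and matching logic below are assumptions for illustration, not Social Pinpoint's actual implementation:

```python
import re

# Hypothetical stand-ins; the real Banned Words list ships with hundreds
# of entries and can be customised by Site Administrators.
BANNED_WORDS = {"badword", "spamword"}

def contains_banned_word(text: str) -> bool:
    """Return True if any whole word in the text matches the banned list."""
    words = re.findall(r"[a-z']+", text.lower())
    return any(word in BANNED_WORDS for word in words)

print(contains_banned_word("this comment includes a badword here"))  # True
print(contains_banned_word("this comment is perfectly fine"))        # False
```

A contribution that trips the filter would be refused outright, before it ever reaches the moderation queue.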

Anti-Spam protection

Social Pinpoint uses anti-SPAM technology to detect and prevent suspicious activity that attempts to 'SPAM' content to your site. Usually, this results from automated 'bot' activity: programs that find and complete web-based forms, sometimes at high volumes.

Social Pinpoint uses Google's reCAPTCHA technology to detect non-human behaviour and block suspicious activity before it becomes a problem. You can set up this feature to always work across your site by default or configure it in relevant participation tools on a case-by-case basis (see 'Key actions' below).

You can also enable this feature on your site's registration form to prevent bots from creating user accounts, but you'll need our help for this (contact customer support).

When you enable this option, a CAPTCHA field appears on the input form of the participation activity and asks the participant to confirm they are not a bot before submitting their content (in most cases).
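Social Pinpoint manages the reCAPTCHA integration for you, but for context, server-side verification generally works by posting the token from the browser to Google's standard 'siteverify' endpoint. The endpoint and form fields below are Google's documented API; the function names are hypothetical:

```python
import json
import urllib.parse
import urllib.request

# Google's standard server-side verification endpoint for reCAPTCHA.
SITEVERIFY_URL = "https://www.google.com/recaptcha/api/siteverify"

def build_payload(secret: str, token: str) -> dict:
    """Form fields the siteverify endpoint expects."""
    return {"secret": secret, "response": token}

def is_human(secret: str, token: str) -> bool:
    """POST the token to Google and read the 'success' flag (network call)."""
    data = urllib.parse.urlencode(build_payload(secret, token)).encode()
    with urllib.request.urlopen(SITEVERIFY_URL, data=data) as resp:
        return json.load(resp).get("success", False)

print(build_payload("site-secret", "token-from-browser"))
```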

Moderation interface

The Moderation area of the Dashboard provides a view of all moderation activity occurring across a site. It facilitates a number of actions relating to the approval, review and removal of public contributions.

Each time a participant makes a public contribution, a moderation ticket is created and added to the Moderation interface with the following details:

  • the date and time of submission
  • the content of the contribution (text and/or image)
  • which participation activity it was collected through (e.g. Social Map, Visioner, Conversation, etc.)
  • its current moderation status
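The ticket details above could be modelled as a simple record. This is a hypothetical sketch for illustration, not Social Pinpoint's internal data model:

```python
from dataclasses import dataclass
from datetime import datetime

@dataclass
class ModerationTicket:
    submitted_at: datetime   # date and time of submission
    content: str             # the contribution's text and/or image reference
    tool: str                # e.g. "Social Map", "Visioner", "Conversation"
    status: str              # current moderation status

ticket = ModerationTicket(
    submitted_at=datetime(2024, 5, 1, 9, 30),
    content="A comment from a participant",
    tool="Social Map",
    status="Awaiting Approval (Published)",
)
print(ticket.tool)  # Social Map
```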

Moderation tickets appear in three columns, each corresponding to a step in the moderation process:

  • Awaiting Moderation - Content yet to be reviewed by the moderators.
  • Recently Moderated - Content reviewed by the moderators and either 'approved' or 'rejected'.
  • For your Review - Content referred by our moderators and awaiting further action by you.

The moderation ticket also shows a moderation status, indicating whether the content is live on your site (or not):

  • Awaiting Approval (Published) - Content is currently live and appears to all visitors of your site, but the moderators haven't yet reviewed it (post-moderation).
  • Awaiting Approval (Unpublished) - Content is not yet live on your site and is waiting for review by the moderators (pre-moderation).
  • Approved - The moderators have reviewed and accepted the content which appears to all visitors of your site.
  • Referred - The moderators have referred the content to you for further review and action because they believe it may breach the Moderation Rules; the content is temporarily removed from your site.
  • Rejected - Content has been reviewed and rejected by you and is no longer visible to the visitors of your site.

You can perform three key actions by clicking the 'action buttons' on each moderation ticket:

  • Approve - Publishes the content to your site (pre-moderation) or confirms that the already published content is appropriate (post-moderation).
  • Refer - Temporarily removes the content from your site and notifies your moderators that further review is required.
  • Reject - Permanently removes the content from your site.
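Taken together, the statuses and actions form a small state machine. The transition table below is a hypothetical sketch assembled from the descriptions above, not Social Pinpoint's actual implementation:

```python
# Which actions are valid from which statuses, and the resulting status.
TRANSITIONS = {
    ("Awaiting Approval (Published)", "Approve"): "Approved",
    ("Awaiting Approval (Unpublished)", "Approve"): "Approved",
    ("Awaiting Approval (Published)", "Refer"): "Referred",
    ("Awaiting Approval (Unpublished)", "Refer"): "Referred",
    ("Referred", "Approve"): "Approved",
    ("Referred", "Reject"): "Rejected",
    ("Approved", "Reject"): "Rejected",
}

# Statuses in which the content is visible to visitors of the site.
LIVE = {"Awaiting Approval (Published)", "Approved"}

def apply_action(status: str, action: str) -> str:
    """Return the ticket's new status after an action, or raise if invalid."""
    key = (status, action)
    if key not in TRANSITIONS:
        raise ValueError(f"Cannot {action} a ticket in status {status!r}")
    return TRANSITIONS[key]

new_status = apply_action("Referred", "Reject")
print(new_status, new_status in LIVE)  # Rejected False
```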

To view the history of any moderation actions, click the 'View Details' button on the moderation ticket.

Key actions

Site Administrators can view all moderation activity in the Moderation area of the Dashboard and perform all moderation actions on a site. However, they will not receive referral notifications unless their email address is nominated.

To adjust the list of email addresses which will receive referral notifications:

  1. Go to the 'Social Pinpoint Settings' area of the Dashboard, then click the 'Moderation Method' link under the 'Moderation' heading
  2. Add an email address into the 'Enter email address' field, separating multiple addresses with a comma
  3. Click the blue 'Save' button at the bottom-right of the page.

Only users with appropriate permissions can perform moderation actions, regardless of whether they receive referral notifications or not.

You can also nominate Site Users to act as moderators at the project level. These users will receive email notifications when our moderators refer content for review. They can also perform moderation actions, but only for content collected on the projects to which they are assigned.

To assign a user to act as a moderator for a project:

  1. Click the 'Settings' button in the top-left of the toolbar on the relevant project page.
  2. Add the user to the 'Permitted Users' section at the bottom of the project Details, giving them the appropriate User Role.
  3. Select the checkbox under the 'Moderator' column next to the user
  4. Save or publish your changes as appropriate.

You can select between two moderation methods to determine whether content is pre-screened before it is published (pre-moderation), or it is immediately published and only removed if there is an issue (post-moderation).

You can set the default moderation method at a site level but can override the default within each participation tool.

To change the moderation workflow of a participation tool:

  1. Open the settings of the participation tool
  2. Click the 'Advanced' tab, and under the 'Moderation Method' setting, choose between 'Pre-moderation' or 'Post-moderation'
  3. Click the 'Save' button.

You can reject any content referred to you, or remove content already approved by the moderators if you do not want it to appear on your site.

To reject or remove content:

  1. Go to the 'Moderation' area of the Dashboard and find the moderation ticket for the contribution you want to remove
  2. Click the 'Reject' button at the bottom of the moderation ticket, then select a reason for removing the contribution from the dropdown menu
  3. Choose whether to notify the author of the submission of the reason for the rejection (this only works if you have also collected their email details)
  4. Click the 'Moderate' button to confirm the action.

When you reject a contribution, you can notify the participant who made it to explain the reasons for the rejection, which is generally considered good practice.

To use this feature, you need to have registration enabled on the tool or use a preset email question, so the software knows where to send the notification.

The notification email sent to the participants will include the reason for the rejection and can optionally include a custom message to provide further details.

To send a rejection email notification to the participant:

  1. Go to the 'Moderation' area of the Dashboard and find the moderation ticket for the contribution you want to remove
  2. Click the 'Reject' button and select the most relevant reason for rejection from the dropdown menu
  3. Click the 'Notify the author of the submission' checkbox
  4. Type a custom message to the participant (optional)
  5. Click the 'Moderate' button to confirm the action and send the notification.

You can take full control of the moderation process and exclude our moderators from it. Enabling the self-moderation feature makes you solely responsible for monitoring and reviewing content contributed by your participants.

To enable self-moderation:

  1. Open the settings of the participation tool
  2. Click the 'Advanced' tab, and turn the 'Self Moderation' setting to on.
  3. Change the 'Moderation Method' to 'Pre-moderation' or 'Post-moderation' as desired
  4. Click the 'Save' button.

The Banned Words list can be customised by Site Administrators to include additional, or remove unwanted, words.

To customise the Banned Words list:

  1. Go to the 'Social Pinpoint Settings' area of the Dashboard, then click the 'Banned Words' link under the 'Moderation' heading
  2. To add a new word to the list click the 'Add Word' button at the top, enter the new word, and click 'Save'
  3. To modify an existing word, click the 'Edit' button next to the word, enter the new word, and click 'Save'
  4. To delete an existing word, click the 'Delete' button next to the word, and click 'OK' in the pop-up window to confirm

You can permanently or temporarily disable the Banned Words filter on your site by clicking the 'Disallow posts that include banned words?' checkbox at the top of the page.

The anti-spam features of Social Pinpoint help protect your site from bots that automatically submit content through your participation activities or registration form.

These features add a CAPTCHA field to the input form of the participation activity, which participants need to complete to confirm they are not a bot.

You can enable the anti-spam features at a site level, and relevant participation tools will use these default settings when added to your site.

Usually, we configure these features when setting up your site, but our customer service team can assist you with changes to your default settings.

You can also turn the anti-spam features on or off for a specific participation tool to override the site's default setting.

To change the anti-spam settings of a tool:

  1. Open the settings of the participation tool
  2. Enable the 'Activate Anti-Spam' toggle under the Advanced tab
  3. Click the blue 'Save' button at the bottom right.