Accessing detailed Form Reports

The reporting for your forms and surveys provides a summary of results along with the associated data and metadata.

Steps to access a Form Report

  1. Go to the "Dashboard" > "Reports"
  2. Select "Results" in the left-hand menu
  3. Find your survey in the list and click the 'View Results' link
  4. Select the desired date range from the dropdown menu at the upper-right of the report. By default, the last seven days are shown.

The form report has four sections:

  • Summary - Shows the results of your forms and surveys using summary tables and dynamic visualisations.
  • Overview - Shows meta-information about the activity, including when participants left contributions, what devices they used, and known demographic information for registered members.
  • Data - Shows results in a tabular format that can be exported in .xlsx, .csv and .pdf formats.
  • Analysis - Lets you view text-based responses to open-ended questions and use text analysis tools.

Summary

The Summary section shows the results for each question in a form or survey, illustrating them through dynamic visualisations and summary tables.

You can export the report as a .pdf by clicking the 'Export' button at the top-right of the page.

Responses to quantitative questions such as multiple-choice, rating, ranking, and matrix questions are automatically aggregated and analysed, while responses to open-ended questions can be summarised with Social Pinpoint's text-analysis tools (see below).

The way responses are analysed depends on the type of question asked in the form or survey.

Multi Choice questions let participants choose one or more answers from a configured list of options.


Single Selection: If participants can only select one option, a summary table displays the total count and percentage of each response. The percentages will sum to 100%. A doughnut chart visualises the percentage breakdown of all responses.


Multiple Selections: If participants can select more than one option, the summary table still shows the total count and percentages, but these may not sum to 100%. In this case, a bar chart visualises the number of times each option was selected, with longer bars representing more frequently chosen options.
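To illustrate why multiple-selection percentages need not sum to 100%, here is a small sketch using hypothetical option names and counts (not taken from the product). Each percentage is calculated against the number of respondents, and a respondent can tick several options:

```python
# Hypothetical multiple-selection results: 50 respondents,
# each of whom may tick more than one option.
respondents = 50
selection_counts = {"Parks": 30, "Transport": 25, "Housing": 20}

# Each option's percentage is its count divided by the number of
# respondents, so the percentages can add up to more than 100%.
percentages = {opt: 100 * n / respondents for opt, n in selection_counts.items()}

print(percentages)                # per-option share of respondents
print(sum(percentages.values()))  # 150.0 (more than 100%)
```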


Rating questions ask participants to respond to a question on a visual, stepped scale. A weight is assigned to each step in the scale to determine a weighted average in the results.

A summary table shows the number of times each step of the rating scale (e.g. 1-5, 1-10) was selected, the percentage for each step (percentages always total 100% across responses), the total count of responses, and the weighted average score across all responses.

A 'column' visualisation shows the percentage breakdown for each rating option, along with the average overall rating score at the upper-right.
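The weighted average described above can be sketched as follows, using hypothetical counts for a 1-5 rating scale (an illustration of the general calculation, not Social Pinpoint's actual implementation):

```python
# Hypothetical 1-5 rating question: how many participants
# selected each step of the scale.
counts = {1: 2, 2: 3, 3: 10, 4: 15, 5: 20}

total = sum(counts.values())  # total number of responses

# Each step acts as its own weight: multiply step by its count,
# sum the products, and divide by the total responses.
weighted_avg = sum(step * n for step, n in counts.items()) / total

# Per-step percentages always total 100% for a rating question.
percentages = {step: 100 * n / total for step, n in counts.items()}

print(total, weighted_avg)
```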

Slider questions are similar to rating questions in that they ask respondents to respond to a question on a stepped, numerical scale. However, this is done by dragging an interactive slider element along a scale bar rather than selecting the value directly.

Slider questions also allow for negative values, so participants can move the slider in either direction along the scale rather than in just one.

The results of a Slider question are shown as a 'column' visualisation similar to the Rating question.

A summary table shows the number of times each step of the scale was selected along with the percentage of times it was chosen. Additionally, it shows the total count (which should correspond to the number of times the question was answered), the average and median scores across all responses, and the minimum and maximum values that were selected on the scale.

If the number of steps in the numerical scale exceeds 11, the column visualisation will automatically combine responses into a smaller number of breakpoints so the graph remains legible. The full, detailed results remain visible in the summary table.
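As a rough sketch of how such binning might work, the following combines hypothetical responses on a 0-100 slider into 10 breakpoints and computes the summary statistics mentioned above. The bin count, scale range, and response values are all assumptions for illustration, not the product's actual logic:

```python
from statistics import mean, median

# Hypothetical slider responses on a 0-100 scale (101 steps,
# far more than 11), binned for a legible column chart.
responses = [5, 12, 18, 25, 40, 47, 55, 63, 70, 70, 88, 95]

n_bins = 10
lo, hi = 0, 100
width = (hi - lo) / n_bins

bins = [0] * n_bins
for v in responses:
    i = min(int((v - lo) / width), n_bins - 1)  # clamp the max value into the last bin
    bins[i] += 1

# Statistics shown in the summary table alongside the chart.
print(bins)
print(mean(responses), median(responses))
print(min(responses), max(responses))
```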

Ranking questions ask participants to order a list of items in order of their preference. A weighted average 'score' is calculated for each answer choice, helping you determine the most and least preferred answer choices.

The ‘score’ is calculated by:

  • multiplying the weight of each ranked position by the response count for that position,
  • summing these products, and
  • dividing by the total number of responses to the question.

The 'score' weights the results by the number of contributions so that outliers do not bias it. This is particularly useful in cases where participants are not required to rank every option.

Weights are applied in reverse order. In other words, a participant's most preferred choice (which they rank as #1) has the largest weight and their least preferred choice (which they rank in the last position) has a weight of 1.
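The score calculation above can be sketched like this, using hypothetical option names and position counts for a three-option ranking question (an illustration of the described formula, not the product's code). With three positions, position 1 gets weight 3 and the last position gets weight 1:

```python
# Hypothetical ranking question with 3 options. counts[option][pos]
# is how many participants ranked the option in position pos (1 = top).
counts = {
    "Bike lanes": {1: 10, 2: 6, 3: 4},
    "Bus routes": {1: 6, 2: 8, 3: 6},
    "Footpaths":  {1: 4, 2: 6, 3: 10},
}
n_positions = 3

def score(option_counts):
    # Weights are applied in reverse order: position 1 gets the
    # largest weight (n_positions), the last position gets weight 1.
    total = sum(option_counts.values())
    weighted = sum((n_positions - pos + 1) * n for pos, n in option_counts.items())
    return weighted / total

for option, c in counts.items():
    print(option, score(c))
```

A higher score indicates a more preferred option.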

In addition to the ‘Score’ calculation, the data report also includes a column showing the ‘Average Rank’ of each choice. The ‘Average Rank’ is calculated by:

  • multiplying each ranked position by the response count for that position,
  • summing these products, and
  • dividing by the ‘Count’ of the choice.
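The average rank can be sketched as follows, using hypothetical position counts for a single choice (an illustration of the described formula, not the product's code). Note that unlike the score, a lower average rank indicates a more preferred choice:

```python
# Hypothetical choice ranked by 20 participants; counts[pos] is how
# many placed the choice in position pos (1 = most preferred).
counts = {1: 10, 2: 6, 3: 4}

total = sum(counts.values())  # the 'Count' of the choice

# Multiply each position by its response count, sum the products,
# and divide by the choice's total count.
average_rank = sum(pos * n for pos, n in counts.items()) / total

print(total, average_rank)
```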

This change has been applied retroactively to all previous ranking questions, and results reports will be updated.

Matrix questions ask participants to evaluate one or more questions against a common response scale.

A summary table shows the results of the questions, with each question listed down the first column and the answer choices listed across the first row. Each cell shows the total number of responses for that answer choice, along with percentage values that total 100% across each question's row.

Additionally, the summary table shows the total count (the number of times the question has been answered) and the average score for each question. To determine the average score, each answer choice is assigned a numeric value based on its position in the scale. For example, if a question asks participants to respond on a five-step scale, the choices are assigned values from 1 to 5.
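The matrix average score can be sketched as follows, using a hypothetical sub-question rated on a five-step agreement scale (the labels and counts are assumptions for illustration, not product data):

```python
# Hypothetical matrix row: one sub-question rated on a 5-step scale,
# with each step mapped to a numeric value 1-5 in scale order.
counts = {"Strongly disagree": 1, "Disagree": 4, "Neutral": 10,
          "Agree": 20, "Strongly agree": 15}

values = {label: i + 1 for i, label in enumerate(counts)}  # 1..5

total = sum(counts.values())
average = sum(values[label] * n for label, n in counts.items()) / total

print(total, average)
```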

A 'bar' visualisation plots the average score to indicate where responses fell along the answer scale.

Short Text and Long Text questions collect open-ended, text-based responses from participants.

Before seeing the results for these questions, users must first analyse responses, either manually or automatically through Social Pinpoint's Analysis Assistant. Following this process, the aggregated results are available in the Summary report, shown under three key sections.

Sentiment indicates the overall emotional tone of your responses with results displayed as a 'horizontal' bar visualisation showing the distribution across four classifications: positive, mixed, negative, and neutral.

Tags are words or phrases manually assigned to responses that indicate a particular topic, theme or other categories. Results are displayed in a summary table showing the total count and percentage for each tag and visualised as a 'word cloud' providing a visual representation of the most common tags in your responses.

Featured Contributions displays selected responses to the question chosen by the user. They can be used to highlight exemplary responses and provide more context and depth to the analysis.

The summary report for a map question displays the map with all markers placed by participants. Users can click the "View Contributions" button below the map to access detailed responses for that map question.

  • Important: When building in Form, use the Preview button within the tool to test the survey or form you are building. Do not test on the live page.

  • Important: Social Pinpoint staff will not delete data from Form reports without good reason. This includes test data, as a preview function is available.

  • Important: The Form tool does not allow Users to remove, change or manipulate submitted data. This is for openness and transparency reasons.