Surveys collect feedback from users in the Learner Portal, helping admins better understand the learner experience and gauge their needs.
Surveys are multi-use and may be added to multiple Lessons from the Content and Activities tab, or as the Final Course Survey from the Advanced Settings tab in Courses. When Learners access the same Survey across different Courses and Lessons, each completion is recorded as a unique Survey Result.
This article outlines the following concepts:
Create a Survey
Go to Content Creation > Click Surveys > Click New
Details tab
In the Details tab, give the survey a Title and Description. Mark whether the survey is published (active).
Survey Questions
In the Questions/Statements tab, select the Question Type. Each type includes different options. Question Types include:
- Rating Scale
- Comment Box
- Single Textbox
- Select List (single-select)
- Select List (multi-select)
Click the green “+” icon to include additional questions in this survey.
Settings tab
In the Settings tab, configure Rating Scale settings. If applicable, customize a Completed Message that shows to Users when they complete the Survey.
Selecting the Customize option for the rating scale labels changes only the wording of the scale; it does not change the point value of each option. The scale metrics in reports are calculated as follows.
NPS:
1 - Strongly Disagree - Detractor and Unsatisfied
2 - Disagree - Detractor and Unsatisfied
3 - Neutral - Detractor and Neutral
4 - Agree - Passive and Satisfied
5 - Strongly Agree - Promoter and Satisfied
Net Promoter Score (NPS) is calculated as: (percentage of promoter responses − percentage of detractor responses). In this example, with 5 responses: a detractor is a score of 3 or less, and a promoter is a score of 5. The user answered 1 out of 5 as a promoter (Strongly Agree) and 0 out of 5 as a detractor (Neutral or lower), so: NPS = ((1 ÷ 5) × 100) − ((0 ÷ 5) × 100) = 20% (or 20)
CSAT Score is calculated as: (satisfied responses ÷ total responses) × 100. A satisfied response is a 4 or 5 (Agree or Strongly Agree). In this example: (5 ÷ 5) × 100 = 100%
CSAT Average is the user's raw score ÷ maximum possible score. The maximum possible points for 5 questions is 25. The user answered 4 × “Agree” (4 points each) and 1 × “Strongly Agree” (5 points): User total = (4 × 4) + 5 = 21. CSAT Average = 21 ÷ 25 = 84%
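The three calculations above can be sketched in code. This is an illustrative sketch only, assuming responses are stored as integer scores from 1 to 5; the function names are hypothetical and not part of the product:

```python
def nps(scores):
    """Net Promoter Score: % promoters (score 5) minus % detractors (score <= 3)."""
    promoters = sum(1 for s in scores if s == 5)
    detractors = sum(1 for s in scores if s <= 3)
    return (promoters - detractors) * 100 / len(scores)

def csat_score(scores):
    """CSAT Score: percentage of satisfied responses (score 4 or 5)."""
    satisfied = sum(1 for s in scores if s >= 4)
    return satisfied * 100 / len(scores)

def csat_average(scores, max_points=5):
    """CSAT Average: total points earned divided by maximum possible points."""
    return sum(scores) * 100 / (len(scores) * max_points)

# The worked example above: four "Agree" (4) answers and one "Strongly Agree" (5).
example = [4, 4, 4, 4, 5]
print(nps(example))           # 20.0
print(csat_score(example))    # 100.0
print(csat_average(example))  # 84.0
```

The sketch reproduces the worked example: 1 promoter and 0 detractors yields an NPS of 20, all 5 responses are satisfied for a CSAT Score of 100%, and 21 of 25 possible points gives a CSAT Average of 84%.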
Click Save.
Survey Reporting
When accessing Survey Data in an Education Report in the Report Builder, each survey result is tracked separately and may be filtered in the Run Report.
The “Course Title”, “Lesson Title”, and “Lesson Activity Title” data points can be included in a Report to distinguish results for a Survey that is used in multiple Lessons.
Duplicate a Survey
Duplicating surveys makes quick work of creating new surveys with slight differences.
Check the box of the survey(s) to be copied > click Duplicate Survey(s) > the new survey will appear marked as (copy)
Use Cases
The following are possible general use cases for utilizing this feature:
Gathering course-level feedback to improve content and instruction
A training coordinator wants to understand how learners feel about specific courses and instructors so they can improve materials, adjust pacing, or introduce new topics.
This feature is useful because:
Surveys can be embedded directly in Lessons or at the end of Courses to capture impressions while content is still fresh.
Results are tracked separately per activity or course, even when the same survey is reused.
Admins can generate reports that break down feedback by course title, lesson title, and activity.
Example Use Case
The coordinator adds a 5-question “Course Feedback” survey to every lesson in a new product training course. Learners rate content clarity, instructor engagement, and knowledge gained. Filtering reports by course reveals that Lesson 3 consistently receives lower clarity scores, flagging it for revision.
Assessing learning environment satisfaction across departments
An HR department uses Surveys to gauge how different departments feel about the learning experience in the LMS—interface usability, course relevance, and instructor-led events.
This feature is useful because:
Multi-select and comment boxes allow both structured and open-ended feedback.
Surveys can be duplicated and tailored slightly for each department or training type.
Admins can compare feedback across business units to identify training needs or LMS improvement areas.
Example Use Case
The Admin duplicates a “Learner Satisfaction” survey and assigns one to Sales, another to Engineering, and another to HR—each with one or two tailored questions. Reporting shows that Sales finds the LMS navigation confusing, while HR praises content relevance. These insights guide both system updates and content prioritization.
Using surveys as part of certification or milestone checkpoints
A professional certification program requires feedback collection at key points in a learning path—for example, after completing required modules or before unlocking a final assessment.
This feature is useful because:
Surveys placed at the end of Courses or Lessons can act as soft gates or reflective checkpoints.
Completed Messages can affirm next steps, offer encouragement, or provide instructions.
Results help ensure quality assurance across multiple certification cohorts.
Example Use Case
At the end of each certification module, learners complete a “Module Reflection Survey” asking about confidence, clarity, and remaining questions. The survey ends with a custom message guiding users to schedule their final assessment. This flow gathers feedback and supports learners while maintaining forward momentum.