In July, we released a new version of the Adoption dashboard. This release marks an important milestone: it finalizes the transition of our Insights Portal dashboards to the new user interface we began last year, and it expands their functionality to better align with the needs of our users.
These updates are based on our continuous research, as we listen to your feedback and continue to invest in and improve our analytics offerings.
In this article, I’d like to do a few things:
- Review what the new Adoption dashboard offers and how to begin taking advantage of the new experience.
- Refresh our memory on the other dashboards in the Insights Portal and the use cases they address.
The new Adoption Dashboard experience
Note: The new experience is currently opt in (turned off by default). To use it, your administrator must set the d2l.Tools.Insights.Adoption.EnableNewExperience configuration variable to ON.
This dashboard is designed for administrative roles to understand the degree of consistent system use over time. It reflects both the breadth (coverage of organizational units) and the depth (coverage of the necessary tools and user roles) of usage.
As we try to understand how the system is being used and in what ways, the dashboard helps answer some specific questions around adoption. For example:
- Are there organizational units with little to no activity?
- Are organizational units consistently using the tools required?
- Are all necessary user roles performing their expected tasks in the system?
- Are there sudden spikes in withdrawals for specific parts of the organization?
The dashboard presents aggregate data at every level, from the organization as a whole down to specific course offerings.
Figure: An example of the Insights Adoption Dashboard.
What's changed in this update:
- Look and feel aligns with other dashboards
- Login Trend graph is replaced by System Access, which also includes Pulse access. Please refer to the Adoption Dashboard documentation for information on how System Access differs from Logins.
- Ability to filter the dashboard by date range, organizational unit, role and tool
- Card-to-card interactions – selecting something (a specific role, for example) on one card affects other cards that share the same dimension.
- The ability to drill into children of organizational units selected on a graph is preserved. Drilling into an organizational unit on a specific card now applies the selected unit as a filter to the entire dashboard (not just the card being drilled into).
- Accessibility: full screen reader and keyboard navigation support for all filters, graphs, and drill-downs
Keep an eye on the 90-day preview for further updates to the Adoption dashboard.
The Engagement Dashboard
This dashboard was researched and built primarily for roles that are invested in learner success and intervention. The main question it helps answer is:
- Who are the learners that might require proactive help or might be at risk? (based on learner data available in Brightspace)
What you can do with the engagement data:
- You can gauge the overall health of a course of interest: whether learners are doing well, the degree of their participation, and so on.
In the example below, we can see that even though there are no overdue assignments, the course has a number of students at the low end of the current grade distribution, and most learners last accessed the system more than two weeks ago. This could prompt a review of additional data or a follow-up with the instructor to see whether the course is progressing as intended or whether interventions are needed with specific students.
Figure: The Summary View with the Current Grade and Course Access highlighted.
- You can also identify and follow up with learners meeting specific conditions, for example:
- Learners with low or no access to Brightspace and courses
- Learners with low to no activity in content, discussions, or assignments
- Learners at the low end of the grade distribution
- Learners with declining performance or engagement
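Conditions like these amount to simple rules over per-learner data. As a minimal sketch of the idea, assuming a hypothetical record shape (the field names and thresholds below are illustrative, not the actual Brightspace data model):

```python
from datetime import date

# Hypothetical learner records; field names and values are illustrative.
learners = [
    {"name": "Avery", "last_access": date(2024, 5, 1), "current_grade": 48.0},
    {"name": "Blake", "last_access": date(2024, 6, 10), "current_grade": 91.5},
    {"name": "Casey", "last_access": None, "current_grade": 55.0},
]

def at_risk(learner, today=date(2024, 6, 14), grade_cutoff=60.0, stale_days=14):
    """Flag learners with no recent course access or a grade at the low end."""
    last = learner["last_access"]
    no_recent_access = last is None or (today - last).days > stale_days
    low_grade = learner["current_grade"] < grade_cutoff
    return no_recent_access or low_grade

flagged = [l["name"] for l in learners if at_risk(l)]
# Avery (stale access, low grade) and Casey (never accessed) are flagged.
```

The dashboard surfaces the same kind of condition interactively through its filters and cards, so no scripting is required in practice.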
What data is available here:
- Course access trends over time and learners with no access at all
- Content views over time and learners with no views at all
- Current grade distribution, with the ability to view the learners behind each band
- Scatterplot (correlation) between time in content and current grade
- Learners with overdue assignments
- Learners with high or low discussion activity (including posts, replies and reads) to indicate whether they are passively or actively engaged
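The scatterplot card pairs time in content with current grade; the relationship it visualizes is an ordinary correlation. A minimal sketch with illustrative numbers (not real Brightspace values):

```python
from statistics import mean, pstdev

# Hypothetical (minutes_in_content, current_grade) pairs for one course.
data = [(30, 55.0), (120, 72.0), (240, 88.0), (15, 48.0), (180, 81.0)]

def pearson(pairs):
    """Pearson correlation coefficient between the two columns."""
    xs, ys = zip(*pairs)
    cov = mean(x * y for x, y in pairs) - mean(xs) * mean(ys)
    return cov / (pstdev(xs) * pstdev(ys))

r = pearson(data)  # close to 1.0: more time in content tracks higher grades
```

A strong positive correlation in the dashboard's scatterplot suggests time in content is a useful engagement signal for that course; outliers (high time, low grade, or the reverse) are often the learners worth a closer look.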
Figure: An example of the Engagement Dashboard.
Read more about the Engagement Dashboard.
From here you can also select an individual learner and view their performance across courses and over time, such as:
- Their active and past courses
- Engagement with content
- Current grade performance
- Overdue assignments
Figure: An example of the Summary View with learner performance over time displayed in line graphs.
Read more about the Learner Engagement Dashboard.
Assessment Quality Dashboard
Next is our Assessment Quality Dashboard, researched and built for roles that develop and maintain high-stakes examinations. It is especially valuable for quizzes such as mid-terms, finals, certification exams, and so on.
It helps answer the question: what is the quality of the assessments in my part of the organization?
- Can learners consistently answer individual quiz questions correctly and achieve the expected overall quiz scores?
- Are quizzes too difficult or unclear for learners?
- Are there questions/concepts with gaps in learner comprehension?
What can you do with this data:
- Ensure quiz questions reflect the material actually taught in the course
- Align the actual level of difficulty with your and your learners' expectations
- Identify questions that consistently bring learner scores down or are too easy to answer
What data is available here:
- Learner quiz completion consistency (via the quiz reliability score distribution and average score)
- Quiz questions that produce consistently low or undesirable results for learners (via individual quiz and question metrics):
- Large variation in learner results
- Low average scores
- High-performing learners answering incorrectly, and vice versa
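The question-level metrics listed above are classic item-analysis quantities. As a minimal sketch using illustrative response data and standard textbook formulas (item difficulty as the proportion correct; discrimination as the point-biserial correlation with the total score), not the dashboard's actual computation:

```python
from statistics import mean, pstdev

# Toy per-question scores (1 = correct, 0 = incorrect); rows are learners,
# columns are questions. Purely illustrative data.
responses = [
    [1, 1, 0],
    [1, 0, 0],
    [1, 1, 1],
    [0, 0, 0],
    [1, 1, 0],
]

totals = [sum(row) for row in responses]

def difficulty(q):
    """Proportion of learners answering question q correctly (higher = easier)."""
    return mean(row[q] for row in responses)

def discrimination(q):
    """Point-biserial correlation between question q and the total score.
    Near-zero or negative values suggest the question does not separate
    high and low performers as expected."""
    scores = [row[q] for row in responses]
    sx, st = pstdev(scores), pstdev(totals)
    if sx == 0 or st == 0:
        return 0.0
    cov = mean(s * t for s, t in zip(scores, totals)) - mean(scores) * mean(totals)
    return cov / (sx * st)
```

A very low difficulty value flags a question most learners miss; a low or negative discrimination value flags a question where strong learners fare no better than weak ones, which often indicates an unclear or miskeyed item.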
Figure: An example of the Assessment Quality Dashboard.
Read more about the Assessment Quality Dashboard.