Adoption Planning: Setting Desirable and Realistic Brightspace Adoption Goals
Authored by: Holly Whitaker, PhD, Learning Strategy Consultant at D2L.
If you have participated in any professional development or strategic goal setting experiences in the last few years, you're probably familiar with the SMART method of goal setting. The "R" in SMART typically stands for Realistic.
As campus-based change leaders, we often have some grand expectations thrust upon us from key leaders at our institutions. For example, I've heard of numerous institutions whose leaders expect 100% adoption of the LMS for posting a syllabus for every course.
At first glance, this goal sounds reasonable, but in practice it is often a Teaching Assistant posting the syllabus for the few professors who never actually log in to the LMS. Meanwhile, your support team and trainers are overwhelmed with help tickets and training requests, without the capacity to meet the demand. That is a situation you probably want to avoid.
Setting adoption goals that your institution can realistically support can be tricky. This article will give you step-by-step instructions for how to customize attainable and realistic goals for your campus.
First: Define ideal adoption
Defining best practice before looking for real-life examples will help you recognize ideal use when you see it. Work with your instructional design team to brainstorm the ideal, highest quality use of the tool you've identified. This is a concept-first thought process, rather than an example-driven one.
Suggestions for setting up this brainstorming session:
- Select a tool - we'll use Rubrics for this example (you can choose a tool from your MVLE or a new tool)
- Ask your instructional designers to envision an ideal course that uses Rubrics
- Imagine what the ideal student experience of Rubrics would be
- Imagine what the ideal instructor experience of Rubrics would be
Second: Document the ideal
Using your notes from this brainstorming session, create a checklist with detailed definitions. Alternatively, create a flowchart, diagram or document that describes the ideal use in great detail. The sample below is designed to get you thinking in the right direction - it is not intended to be an exhaustive example of how to use Rubrics.
- Instructor easily creates, edits and adds Rubric to an assignment
- Student is able to view Rubric before submitting an assignment
- Student can view instructor feedback after grading
- Instructor uses Rubrics for more than one assignment
- Instructor uses consistent ordering of high and low ratings across all Rubrics
- Instructor uses Rubric to give in-depth feedback
To wrap up this stage, cite any research or external standards you find for your quality definition. Include accessibility requirements in your definition.
Third: Benchmark current adoption
Use DataHub to look at Tool Usage reports on Rubrics. Let's say your campus has 1000 courses and that your DataHub Tool Usage report tells you that 300 of your courses use a Rubric in some form.
Task your instructional design team with determining how many of those courses use Rubrics in the ideal manner you identified. Reviewing all 300 courses can take a lot of time, so select a strategy for examining a representative sample of those courses. Let's say that extrapolating from your sample suggests 100 of the 300 courses use Rubrics in an ideal manner. This means that 200 courses that currently use Rubrics could use some help or guidance to improve their use of the tool.
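The sampling arithmetic above can be sketched in a few lines of Python. The function name and the sample figures (a review of 60 courses finding 20 ideal) are illustrative assumptions for this article's hypothetical campus, not anything pulled from DataHub itself:

```python
# A minimal sketch of the benchmarking arithmetic. The figures mirror the
# hypothetical example in this article; swap in your own DataHub numbers.

def estimate_ideal_use(total_courses, courses_using_tool, sample_size, sample_ideal):
    """Extrapolate ideal tool use from a reviewed sample of courses."""
    ideal_rate = sample_ideal / sample_size              # fraction of sample judged ideal
    est_ideal = round(courses_using_tool * ideal_rate)   # projected ideal-use courses
    needs_help = courses_using_tool - est_ideal          # courses that could improve
    not_using = total_courses - courses_using_tool       # courses not using the tool at all
    return est_ideal, needs_help, not_using

# Example: 1000 courses, 300 use Rubrics; a reviewed sample of 60 finds 20 ideal.
ideal, needs_help, not_using = estimate_ideal_use(1000, 300, 60, 20)
print(ideal, needs_help, not_using)  # 100 200 700
```

The same three numbers drive the rest of this article: the 100 "ideal" courses you can hold up as exemplars, the 200 that need guidance, and the 700 that are not yet using the tool at all.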
Next: Are your current levels of adoption satisfactory?
Not all courses can effectively use all tools. That goes for Rubrics, too. In large enrollment courses that rely on exams, Rubrics are not practical. However, in the example we're looking at in this article, we already identified 200 courses that could use some help with their use of Rubrics.
The question is: Are you happy that 100 courses are using Rubrics ideally, or do you think the other 200 need help? Do you think there are more than 300 courses that could use Rubrics well?
- If you are happy with your 100 courses, then no additional resources need to be invested in the adoption of Rubrics.
- If you determine that you want to help the 200 courses that are not using Rubrics ideally, invest focused resources for those specific courses to help them raise the bar. Use the guidance in the next section of this article to help you determine what to do next.
- If you think there are more than 300 courses that could use Rubrics well, use the guidance in the next section of this article to help you determine how many courses you'd like to be using Rubrics.
Fourth: Forecast implications of increased adoption
Every change creates ripple effects across your institution. Any desired increase in Rubrics will ripple out toward your training team, help desk staff, instructional designers and data people. The ripple effects may even go out as far as your provost's office, the office of graduate studies, the dean of faculties, faculty senate and other politically charged environments. Evaluate these impacts before determining your desired level of adoption.
The higher you set your goals for adoption, the larger the ripple effects. Gather a team with representatives from training, help desk, instructional design and data to think through the ripple effects of various levels of adoption:
- Does your training team have the capacity to train 100% of your instructors on Rubrics in the desired time frame? Could they train 50% or 25% in that time? What if you doubled the time frame?
- What would need to change about help desk operations if a 100% adoption mandate occurred?
- What capacity would you need to make available in your instructional design team if 50% adoption of Rubrics was your goal? What about 100%? 25%?
- How much capacity do your data people have to closely monitor this initiative?
- What kind of political interest might this initiative draw from your campus leaders?
Think through these and other questions that emerge from the discussion, considering the ripple effects. Think in percentages, but also in absolute numbers. Fifty percent of your instructors could mean 100 or 1000, depending on the size of your institution. That is a big difference when it comes to training seats, help desk tickets and instructional designer availability.
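The percentages-versus-headcount point can be made concrete with a small sketch. All of the parameters here (instructor counts, seats per training session, tickets per instructor) are illustrative assumptions; substitute your own institution's figures:

```python
# A quick sketch translating an adoption percentage into concrete workload.
# The default parameters are illustrative, not benchmarks.

def workload(instructor_count, adoption_pct, seats_per_session=20, tickets_per_instructor=3):
    """Convert an adoption-percentage goal into rough support workload numbers."""
    instructors = round(instructor_count * adoption_pct / 100)
    sessions = -(-instructors // seats_per_session)  # ceiling division: training sessions needed
    tickets = instructors * tickets_per_instructor   # rough help desk ticket estimate
    return instructors, sessions, tickets

# The same 50% goal at a small and a large institution:
print(workload(200, 50))   # (100, 5, 300)
print(workload(2000, 50))  # (1000, 50, 3000)
```

Running the same percentage through both institution sizes makes the capacity conversation much easier: 5 training sessions is a semester side project, while 50 sessions is a staffing decision.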
Finally: Optimize your adoption goal
Based on your discussion from the previous section, you should have a good sense of each team's capacity to handle the ripple effects of any increase in adoption.
Let's say your ideal was to help all 200 courses move toward the ideal Rubric use you identified, but your forecasting discussion from the previous step revealed that this would overwhelm the capacity of your training team, your help desk and your instructional designers. You cut the goal in half to improving 100 courses already using Rubrics, and the group still felt that was not realistic to achieve with high quality. Fifty courses felt a little too low to make any kind of impact at the institution level, so the group agreed on improving 75 courses as a realistic goal. To align around that goal:
- Your trainers determined that they could train the 75 instructors over the course of one semester, considering their availability and current training commitments.
- The help desk group determined that they would need one month to get up to speed to support Rubrics used at the ideal level, and that they needed to add to their internal knowledge base as a part of that ramping up process. They asked that the training team begin training sessions after the ramp-up period.
- The instructional design team decided that they could reserve several slots of available project time to provide in-depth help to 15 of those 75 instructors this semester. They asked the training team to help identify the instructors who need the most help getting up to that level.
- The data team agreed to help with adoption reporting, adding it to the monthly reports they already run.
The team also determined that this initiative would impact the teaching & learning office and the assessment office, and would filter up to the provost's office. Therefore, an immediate next step is to schedule meetings with each of those groups to talk through your plans.
What's Next?
Once you've optimized your adoption goal and aligned your support teams to achieve it, it's time to create a plan for how and when you'll roll out the initiative to instructors.
Check out the rest of the Adoption Planning Series.
Looking for more ideas and inspiration on how to build and grow the usage and adoption of Brightspace? Check out the Higher Education Adoption Playbook!
Not in Higher Education?
If you are a Corporate or K-12 Customer, we have an Adoption Playbook just for you!