Once you have fully investigated your Course using the Content Dashboard and isolated a poorly performing exercise, you will need to decide what edits to make.
The CHECK Method
Here’s a handy acronym for the steps to take when interpreting Code Diffs and Learner Feedback:
C - Correct Code: Does the code submitted actually work?
H - How Many: How many learners submitted this type of error? Is it many, or only a few?
E - Examine the Error: Make sure you really understand what the learner did wrong.
C - Context: Did the learner submit any feedback that provides context on why they got it wrong?
K - Knowledge: Is the error rooted in a misunderstanding of the material, or is it syntax-based?
More Things to Consider:
Try to think like a DataCamp learner
- Remember when you were first learning this, what did you struggle with?
- Are you asking the learner to make a leap of logic, or are your assumptions about their level of knowledge in other areas correct?
How can you best help guide the learner to the answer without giving it away?
It can be tempting to rewrite the code with more scaffolding to make it easier for the learners - but wait!
We WANT our exercises to be challenging and really test learner knowledge!
Before making the answer easier, think about the instructions, context, and hints you are providing.
Examine the learner’s journey to this exercise
Have they learned everything they would need up to this point to solve the exercise?
- Video: Was the code properly covered or illustrated in the video, and if so, are learners struggling to connect the material shown with the exercise question?
- Instructions: Are they unclear or ambiguous? Could they be more specific about which function, object, etc. to use?
- Hints: Do they actually address the questions that learners might be having?
- Code Scaffolding: Is the scaffolding clear enough?