What does a high hint rate mean?
In general, we aim for exercises to have an asked-hint rate below 35%. While some challenging exercises are encouraged (e.g. exercises at the end of a chapter), too many exercises with a high hint rate indicate either that students are failing to absorb the information provided in the lessons or that the exercises are not fairly testing the material.
How can I figure out why so many students are asking for hints?
To answer this question, you can use the Course Dashboard to get an idea of where students might be going wrong. Here are some steps you can take to identify the issue:
If the course is in Python, R, or SQL, you can view the most frequent incorrect responses in the CQ Dashboard (for more information, refer to this article) to see what students most often submit when they answer incorrectly.
Student feedback can be another valuable source of information. While not all feedback will be instructive, it is often worth checking whether anyone has explicitly stated why the exercise is so difficult. For more information, refer to this article.
Take the exercise again. Reread the text, and try to solve the exercise from a student’s perspective. Do the instructions and tasks make sense? Is everything clear?
Watch the preceding video. Does the lesson equip students with everything they need to solve the exercise? In some cases, students may also be struggling to recall skills taught in preceding lessons.
I’ve located the issue with the exercise. What should I do?
Students do not seem to be clear about what they should do:
Be more precise about the steps necessary to complete the exercise. Oftentimes this requires greater granularity in the instructions.
Students do not always read thoroughly, and streamlining text and instructions, as well as styling (e.g. bold fonts), can help ensure the most relevant instructions are visible to students.
Students are struggling to remember a particular code element (e.g. a function name):
Be clearer in the instructions about what they should do (without explicitly giving them the code).
Remind them of it in the context by showing them a use case or example.
Explicitly lay out the code in the instructions (i.e. using code formatting).
As a last resort, provide some additional help via scaffolding.
Note: we often get complaints from students that our courses hold their hands too much. What we are aiming for is a fair challenge that encourages students to think for themselves, so additional code scaffolding should be avoided when possible!
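As a purely hypothetical illustration (the dataset, the `____` placeholder convention, and the choice of `head()` are assumptions for this sketch, not taken from any specific course), scaffolding might pre-fill everything except the one element students keep forgetting:

```python
import pandas as pd

# Scaffolded sample code as the student might see it, with only the
# forgotten function name blanked out:
#
#     sales.____(3)   # Preview the first three rows
#
# Full solution for the same hypothetical exercise:
sales = pd.DataFrame({"month": ["Jan", "Feb", "Mar", "Apr"],
                      "revenue": [120, 95, 143, 110]})

preview = sales.head(3)  # head() is the element students had to recall
print(preview)
```

The point of scaffolding at this level is that the blank narrows the challenge to exactly the element students are forgetting, while the rest of the exercise stays intact — which is also why it should remain a last resort.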
SCT-related issues (such as correct answers being rejected or unhelpful SCT feedback):
File an issue on GitHub, tag @datacamp-contentquality, and describe the problem. The Content team will look into the exercise and make the necessary changes as quickly as possible.
Preceding video(s) are unclear, contain a mistake, or would benefit from additional information:
If the edits can be made with no changes to the audio, make the necessary adjustments in the Teach editor and file a pull request.
If the edits require changes to the audio, file an issue on GitHub, tag @datacamp-contentquality, and describe the problem. Please note that changes to videos are resource-intensive, and it is worth exploring alternative solutions if they are available.