The Diff Viewer tool lets you see students' code submissions for each exercise and how each attempt differs from the official solution. Once you see what students commonly get wrong, you can update the instructions and hints to account for those mistakes.
Before you begin
- Create a course.
- See Accessing the Diff Viewer for instructions on accessing and logging in to the tool.
- Find your Course ID.
Retrieving the course details
- In the "course ID" text box near the top left, enter your course ID.
- Click the "fetch course" button.
Retrieving the submissions
- Click the "select exercise" dropdown and choose an exercise that you want to inspect. Multi-step (Iterative and Sequential) exercises are given one entry per step, numbered from 0.
- Click the "fetch attempts" button. ("Attempts" is synonymous with "Submissions.")
Exploring the submissions
The submission explorer is a table on the right-hand side of the page. Each row corresponds to a distinct submission, and the rows are sorted from most common to least common.
The table contains the following columns.
- N: The number of students submitting this answer.
- Prop: The proportion of students submitting this answer.
- Correct: 1 if the answer was considered correct; 0 if not.
- Compare: Click the button to see the students' answer.
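The table above can be thought of as a simple aggregation over the raw attempts: identical answers are grouped, counted, and sorted by frequency. Here is a minimal Python sketch of that idea; the input format and field names are assumptions for illustration, not the Diff Viewer's actual data schema.

```python
from collections import Counter

def build_submission_table(attempts):
    """Group identical answers and sort from most to least common.

    `attempts` is a list of (code, correct) pairs -- an assumed input
    format, not the Diff Viewer's real schema.
    """
    counts = Counter(attempts)
    total = len(attempts)
    table = []
    for (code, correct), n in counts.most_common():
        table.append({
            "N": n,                   # number of students submitting this answer
            "Prop": n / total,        # proportion of all students
            "Correct": int(correct),  # 1 if marked correct, 0 if not
            "code": code,
        })
    return table
```

For example, with three attempts of which two are the same correct answer, the first row would have N = 2 and Prop of roughly 0.67.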
- Step through each of the submission rows in the table, starting with the most common.
- Green lines indicate "this line appeared in the student submission but not the official solution." Red lines indicate "this line appeared in the official solution but not the student submission."
- If the answer was marked correct, check that it is a genuine solution to the exercise. If it is, no further work is needed. If it is incorrectly marked as correct, file an issue on the GitHub repo for the course, and notify the Content Developer or Content Quality Analyst who is working with you.
- If the answer was marked incorrect, check that it is a genuine mistake by the student. If it is, consider editing the instructions, hints, or code comments to make them clearer to students. If it is incorrectly marked as incorrect, file an issue on the GitHub repo for the course, and notify the Content Developer or Content Quality Analyst who is working with you.
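The green/red convention matches a standard line diff computed from the official solution to the student submission. The following Python sketch, using the standard library's difflib, shows the mapping; the function and variable names are my own, not part of the tool.

```python
import difflib

def classify_diff_lines(solution, submission):
    """Label differing lines the way the Diff Viewer colours them.

    Diffing from solution to submission means '+' lines appear in the
    student submission but not the solution (shown green), and '-'
    lines appear in the solution but not the submission (shown red).
    """
    labelled = []
    diff = difflib.unified_diff(
        solution.splitlines(), submission.splitlines(), lineterm=""
    )
    for line in diff:
        # Skip the "---"/"+++" file headers; keep only changed lines.
        if line.startswith("+") and not line.startswith("+++"):
            labelled.append(("green", line[1:]))
        elif line.startswith("-") and not line.startswith("---"):
            labelled.append(("red", line[1:]))
    return labelled
```

For instance, if the solution is `library(NHANES)` followed by `library(ggplot2)` and a student instead wrote `library(ggplot2)` followed by `data(NHANES)` (a hypothetical pair of answers), the `library(NHANES)` line would come back red and the `data(NHANES)` line green.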
Here are some examples from an exercise in the R course "Foundations of Inference."
In the following example, students load two packages in a different order from the one specified in the instructions. There's nothing wrong with this, so the answer is correctly marked as correct.
In the following example, a student thought that NHANES was an object inside the ggplot2 package, not a package in its own right. The answer is correctly marked as incorrect. It's worth rereading the instructions for the exercise to make sure they state clearly that NHANES is an R package.