Why are we updating our existing projects?

We are currently in the process of updating our existing library of Guided Projects (previously just known as Projects). Why? Through user interviews and reviewing learner feedback, we know that many of our learners desire a more challenging experience on DataCamp, and this applies to both our courses and our projects.

However, until mid-Summer of 2020, while our projects did offer learners a hint should they get stuck, there was no additional help beyond this (such as an ‘Ask for Solution’ button). This had two consequences:

  1. Learners who got stuck and found the hint unhelpful simply had to abandon the project.
  2. Projects were authored defensively to prevent learners from ever reaching that dead end. As a result, many of the hints and instructions in existing projects provide far more hand-holding than would otherwise be necessary.

However, with the introduction of the new “Show Answer” button in projects (available after learners have already taken a hint), we are now free to create and adapt projects with a greater degree of difficulty, safe in the knowledge that learners can request a solution should they need it.

[Image: hint module with a “Show Answer” button]

How do you update a Guided Project?

When updating a guided project to align with our user goals (as well as to adjust for the new solution button), there are a few things you will want to look out for:

Hints that give away too much or act as an interim solution.

Many of our existing projects were designed at a time when learners had no other way to get through a project if they got stuck. This led to many project hints that provide complete or nearly complete code for difficult tasks.

Try to write hints that can help learners arrive at the solution while providing them as little code as possible.

  1. External links to documentation are one way to help, as this helps simulate how someone might solve a problem outside of DataCamp.
  2. As a rule of thumb, in our courses, we aim to develop hints that help learners get halfway to the solution.

Example from before

Before, the hint simply provided a block of fill-in-the-blank code.

sets = pd.___(___)
elements_by_year = sets[['year', 'parts']].\
    ___(___, as_index=False).\
    ___()
elements_by_year.plot(x=___, y=___)

Example after:

Afterward, the hint is broader: while it still names the functions and methods needed to solve the task, it requires learners to write the code themselves and figure out the correct parameters.

- You will again want to use a .groupby(), this time in combination with a different aggregation function.
- The .plot() method can be used to visualize a DataFrame, and will use a line plot by default.
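For reference, this is the kind of solution such a hint nudges learners toward. The sketch below is hypothetical: the dataset and column names are assumed from the fill-in-the-blank hint above, and a tiny made-up DataFrame stands in for the project's real data.

```python
import pandas as pd

# Made-up data standing in for the project's dataset
sets = pd.DataFrame({
    'year': [1950, 1950, 1960, 1960],
    'parts': [10, 20, 30, 50],
})

# Group by year and aggregate with a different function (here, the mean)
elements_by_year = sets[['year', 'parts']].groupby('year', as_index=False).mean()
print(elements_by_year)

# .plot() draws a line plot by default (requires matplotlib):
# elements_by_year.plot(x='year', y='parts')
```

Note that the hint never spells out these exact calls; it only names .groupby() and .plot() and leaves the parameters to the learner.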

Instructions that already give away too much.

For the same reason that hints are often too easy, the instructions in many of our projects are too quick to provide the names of functions, arguments, and other forms of assistance, which prevents learners from first trying on their own. Where possible, tell learners what they should do in a step rather than how to do it (the hint is there to help them figure out the how).

Example from before:

In this modified example from an existing project, the exact method needed to examine the DataFrame is provided, rather than letting the learner try to remember it themselves.

Display and inspect the summaries of the DataFrames for issues. 
- Use the .info() method to inspect the first DataFrame.
- Use the .info() method to inspect the second DataFrame.

Example after:

In this fictional adaptation, the instruction is revised to simply ask learners to inspect the DataFrames themselves, with only a reminder that they should check for missing values.

Display and inspect the two DataFrames for issues such as missing values.
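In response to an instruction like this, a learner might write something along the following lines. This is only a sketch: the two small DataFrames are made up here so the snippet runs standalone, whereas the real project supplies its own data.

```python
import pandas as pd

# Made-up DataFrames standing in for the project's data
df_1 = pd.DataFrame({'id': [1, 2, 3], 'total': [80.0, None, 20.0]})
df_2 = pd.DataFrame({'id': [1, 2, 3], 'name': ['a', 'b', 'c']})

# Inspect the structure, dtypes, and non-null counts of each DataFrame
df_1.info()
df_2.info()

# Count missing values per column
print(df_1.isna().sum())
print(df_2.isna().sum())
```

Crucially, the instruction did not name .info() or .isna(); recalling the right tools is part of the exercise.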

Too much scaffolding/sample code.

It is important that our projects give learners a chance to apply their knowledge independently, without assistance. Sample code can be helpful, but it also risks depriving learners of the practice they would get from writing the code on their own. Use sample code sparingly, and reduce it where possible.

  • It is best used when you need to funnel learners into a particular syntax for testing purposes, or to reduce repetition.

Example from before

In this modified example from an existing project, the precise syntax for subsetting is provided in the first line, leaving learners to simply fill in the blanks in the second.

# Subset the DataFrame for when the `total` was above 75 and below 25
display(df_1[df_1['total'] > 75])
display(df_1[... < ...])

Example after:

In this fictional adaptation, the code comment remains, but the code is stripped of all subsetting syntax. Ideally, learners will already know how to subset DataFrames, so they are given an opportunity to recall the correct approach themselves.

# Subset the DataFrame for when the `total` was above 75 and below 25
display(...)
display(...)
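Filled in, the learner's answer would look something like the snippet below. The data is made up here so the code runs standalone (the project provides its own df_1), and print() stands in for display(), which is only available in notebooks.

```python
import pandas as pd

# Made-up data; the project provides its own df_1
df_1 = pd.DataFrame({'total': [80, 50, 10, 90, 24]})

# Subset the DataFrame for when the `total` was above 75 and below 25
high = df_1[df_1['total'] > 75]
low = df_1[df_1['total'] < 25]
print(high)
print(low)
```

With the scaffolding removed, learners must recall boolean-mask subsetting on their own rather than pattern-match a worked line.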

Alignment between projects and prerequisites.

Although all of our projects contain course prerequisites, they were often designed independently and were not created to be a direct application of these courses. This is now changing as we aim to position projects as a step learners take after completing courses (e.g. putting projects in Tracks). Thus, here are some considerations/questions you can ask about your project.

  1. Does the project make use of a sufficient number of skills from the prerequisite courses?
  2. Does the project make use of skills taught outside of (especially beyond) the prerequisite courses (e.g. seaborn plots when there is no expectation that a learner has encountered seaborn yet)? Note: even if the unfamiliar code is provided to learners, it is best to swap it for packages/skills they will have already learned.

Learner feedback and hint rates.

Many of our projects have received little maintenance since their launch. As a result, there are many unaddressed feedback messages and lingering issues. You can reference this article to help you with this step: https://instructor-support.datacamp.com/en/articles/3358583-content-dashboard-project-maintenance. While updating your project, it is a perfect time to look for any critical issues such as:

  1. Tests that are too strict or provide unhelpful feedback.
  2. Tasks that learners get consistently stuck on.
  3. Typos.
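Point 1 can be illustrated with a generic sketch. This is not DataCamp's actual checking framework; the data and function names below are made up purely to show the contrast between a check that compares incidental details (column order, index labels) and one that checks only the values that matter.

```python
import pandas as pd

# The expected answer for a hypothetical task
answer = pd.DataFrame({'year': [1950, 1960], 'parts': [15.0, 40.0]})

# Too strict: rejects any submission whose column order or index labels
# differ from the reference, even when the substance is correct
def strict_check(df):
    return list(df.columns) == ['year', 'parts'] and list(df.index) == [0, 1]

# More forgiving: checks only the year-to-parts values themselves
def lenient_check(df):
    return (sorted(df.columns) == ['parts', 'year']
            and df.set_index('year')['parts'].to_dict() == {1950: 15.0, 1960: 40.0})

# A correct learner submission with reordered columns and a custom index
submission = answer[['parts', 'year']].copy()
submission.index = [5, 6]

print(strict_check(submission))   # fails the correct answer
print(lenient_check(submission))  # accepts it
```

When reviewing a project's tests, look for comparisons like the strict one above and relax them so that any genuinely correct answer passes.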

In terms of process, you can treat the update as you would any other maintenance on a project. Simply create a branch locally (for clarity, name it ‘update’) and make your revisions. When you are finished, submit a pull request and tag @datacamp-contentquality on GitHub for review!
