Test Insights Dashboard

The test insights dashboard gives instructors detailed analytics about student performance on autograded tests, helping you identify common challenges, track learning progress, and improve assignment design.

Dashboard Overview

Access the test insights dashboard from an assignment’s management page to view comprehensive analytics about student test performance.

Key Analytics

Test Performance Metrics

The dashboard displays performance data for each test case:
  • Pass rate: Percentage of students passing each test
  • Failure patterns: Common errors and failure modes
  • Attempt distribution: How many attempts students make before passing
  • Time to completion: How long students take to pass tests
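The first two metrics above can be derived directly from raw submission records. The sketch below is illustrative only; the record shape and field names are assumptions, not the platform's actual data model:

```python
# Hypothetical submission records: (student, test, attempt_number, passed),
# assumed to be listed in attempt order.
submissions = [
    ("alice", "test_sort", 1, False),
    ("alice", "test_sort", 2, True),
    ("bob",   "test_sort", 1, True),
    ("carol", "test_sort", 1, False),
    ("carol", "test_sort", 2, False),
    ("alice", "test_merge", 1, True),
    ("bob",   "test_merge", 1, False),
]

def pass_rate(submissions, test):
    """Fraction of students whose latest attempt on `test` passed."""
    latest = {}
    for student, t, attempt, passed in submissions:
        if t == test:
            latest[student] = passed  # later records overwrite earlier ones
    return sum(latest.values()) / len(latest)

def attempts_to_pass(submissions, test):
    """Attempt number of each student's first passing run, if any."""
    first_pass = {}
    for student, t, attempt, passed in submissions:
        if t == test and passed and student not in first_pass:
            first_pass[student] = attempt
    return first_pass

print(pass_rate(submissions, "test_sort"))         # 2 of 3 students end passing
print(attempts_to_pass(submissions, "test_sort"))  # {'alice': 2, 'bob': 1}
```

Students with no passing attempt simply do not appear in the attempt-to-pass map, which is itself a useful signal.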

Student Performance Tracking

Monitor individual and cohort-level performance:
  • Student progress: Track which students are struggling with specific tests
  • Performance trends: Identify improvement or decline over time
  • Outlier detection: Spot students who may need additional support
  • Cohort comparison: Compare performance across sections or groups
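Outlier detection of the kind described above often reduces to a simple z-score over per-student pass counts. A minimal sketch, assuming hypothetical data (the dashboard's actual method is not specified here):

```python
# Hypothetical per-student pass counts out of 20 tests
pass_counts = {"alice": 19, "bob": 18, "carol": 17, "dave": 4, "erin": 18}

def low_outliers(pass_counts, z_cutoff=-1.5):
    """Flag students whose pass count sits well below the cohort mean,
    measured in standard deviations (z-score)."""
    vals = list(pass_counts.values())
    mean = sum(vals) / len(vals)
    var = sum((v - mean) ** 2 for v in vals) / len(vals)
    std = var ** 0.5
    return [s for s, v in pass_counts.items()
            if std > 0 and (v - mean) / std < z_cutoff]

print(low_outliers(pass_counts))  # ['dave']
```

A flagged student is a prompt for a conversation, not a verdict; low pass counts can also reflect late starts or submission issues.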

Test Case Analysis

Understand how your test cases are performing:
  • Test difficulty: Identify tests that are too easy or too hard
  • Discriminating power: See which tests best differentiate levels of student understanding
  • Common failures: Analyze the most frequent error messages
  • Edge cases: Identify tests that only a few students fail
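One common way to quantify discriminating power is the upper-lower discrimination index from classical item analysis: the pass rate among high-scoring students minus the pass rate among low-scoring students. This sketch uses invented data and a simple top-half/bottom-half split; it is not necessarily how the dashboard computes the metric:

```python
def discrimination_index(results, totals):
    """Upper-lower discrimination index for one test:
    pass rate among the top-scoring half of students minus
    pass rate among the bottom-scoring half.
    results: {student: passed?} for one test; totals: {student: overall score}."""
    ranked = sorted(results, key=lambda s: totals[s], reverse=True)
    half = len(ranked) // 2
    top, bottom = ranked[:half], ranked[-half:]
    rate = lambda group: sum(results[s] for s in group) / len(group)
    return rate(top) - rate(bottom)

# Hypothetical data: a test that strong students pass and weak students fail
results = {"a": True, "b": True, "c": False, "d": False}
totals  = {"a": 95,   "b": 88,   "c": 60,    "d": 42}
print(discrimination_index(results, totals))  # 1.0 — strongly discriminating
```

An index near 0 means the test tells you little about who understands the material; a test everyone passes (or everyone fails) scores exactly 0.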

Dashboard Features

Visual Analytics

The dashboard provides multiple visualization options:
  • Heat maps: Show pass/fail patterns across students and tests
  • Trend lines: Display performance changes over time
  • Distribution charts: Visualize score distributions
  • Error frequency graphs: Highlight most common failure types
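The heat-map idea is easy to prototype outside the dashboard for a quick sanity check. The sketch below renders a text-only pass/fail grid from hypothetical data; the real dashboard's visualizations are richer than this:

```python
# Hypothetical pass/fail data: rows are students, columns are tests
grid = {
    "alice": {"test_sort": True,  "test_merge": True,  "test_edge": False},
    "bob":   {"test_sort": True,  "test_merge": False, "test_edge": False},
    "carol": {"test_sort": False, "test_merge": False, "test_edge": False},
}

def text_heatmap(grid):
    """Render a minimal pass/fail grid: '#' = pass, '.' = fail."""
    tests = sorted(next(iter(grid.values())))
    lines = ["        " + " ".join(t[-5:] for t in tests)]  # truncated headers
    for student in sorted(grid):
        row = " ".join(("#" if grid[student][t] else ".").center(5) for t in tests)
        lines.append(f"{student:8}{row}")
    return "\n".join(lines)

print(text_heatmap(grid))
```

Even in this crude form, an all-dots column (a test nobody passes) or an all-dots row (a student failing everything) jumps out immediately.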

Filtering and Segmentation

Customize your analysis:
  • Filter by student section, group, or tag
  • Segment by submission timing (early, on-time, late)
  • Focus on specific test suites or categories
  • Compare performance across assignment versions
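Segmentation by submission timing amounts to bucketing each submission relative to the deadline. A minimal sketch, with the deadline, the one-day "early" margin, and all records being assumptions for illustration:

```python
from datetime import datetime, timedelta

# Hypothetical deadline and submission records: (student, submitted_at)
deadline = datetime(2024, 3, 15, 23, 59)
records = [
    ("alice", deadline - timedelta(days=2)),
    ("bob",   deadline - timedelta(hours=1)),
    ("carol", deadline + timedelta(hours=6)),
]

def timing_segment(submitted_at, deadline, early_margin=timedelta(days=1)):
    """Bucket a submission as early, on-time, or late relative to the deadline."""
    if submitted_at <= deadline - early_margin:
        return "early"
    if submitted_at <= deadline:
        return "on-time"
    return "late"

segments = {s: timing_segment(t, deadline) for s, t in records}
print(segments)  # {'alice': 'early', 'bob': 'on-time', 'carol': 'late'}
```

Comparing pass rates across these buckets often shows whether last-minute submitters are disproportionately the ones failing tests.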

Drill-Down Capabilities

Investigate specific issues in detail:
  1. Click on a test case to see all student attempts
  2. View error messages and stack traces
  3. Access student code that failed the test
  4. Compare successful vs. failed implementations

Using the Dashboard

Identifying Learning Gaps

Use test insights to understand where students struggle:
  1. Review tests with low pass rates
  2. Analyze common error messages
  3. Identify conceptual misunderstandings
  4. Plan targeted interventions or clarifications
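Step 2 above, analyzing common error messages, is essentially a frequency count over the leading error type of each failure. A sketch with invented messages (the dashboard surfaces this for you; this just shows the idea):

```python
from collections import Counter

# Hypothetical error messages captured from failed test runs
errors = [
    "NullPointerException at Solution.java:42",
    "NullPointerException at Solution.java:17",
    "AssertionError: expected [] got None",
    "NullPointerException at Solution.java:42",
    "TimeoutError: test exceeded 5s",
]

def top_error_types(errors, n=3):
    """Group messages by their leading error type and count frequencies."""
    types = (msg.split()[0].rstrip(":") for msg in errors)
    return Counter(types).most_common(n)

print(top_error_types(errors))
# [('NullPointerException', 3), ('AssertionError', 1), ('TimeoutError', 1)]
```

A single dominant error type usually points at one shared misconception, which is exactly the situation where a class-wide clarification pays off most.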

Improving Assignment Design

Leverage analytics to enhance future assignments:
  • Adjust test difficulty based on pass rate data
  • Clarify instructions for frequently failed tests
  • Add hints or scaffolding for challenging concepts
  • Remove or revise tests that don’t discriminate well

Supporting Student Success

Proactively help students who are struggling:
  1. Identify students with consistently low test pass rates
  2. Reach out with targeted support or resources
  3. Create discussion posts addressing common errors
  4. Adjust office hours focus based on common challenges

Calibrating Grading

Ensure fair and appropriate assessment:
  • Verify test weights align with difficulty
  • Identify tests that may need point adjustments
  • Ensure test suite comprehensively covers learning objectives
  • Balance between basic and advanced test cases
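The first two checks above can be approximated with a simple heuristic: a hard test (low pass rate) carrying a below-median point weight may deserve more points. The threshold, data, and rule here are all assumptions for illustration:

```python
# Hypothetical test metadata: point weight and observed pass rate
tests = [
    {"name": "basic_io",   "weight": 10, "pass_rate": 0.95},
    {"name": "edge_cases", "weight": 5,  "pass_rate": 0.30},
    {"name": "big_input",  "weight": 20, "pass_rate": 0.85},
]

def flag_miscalibrated(tests, threshold=0.5):
    """Flag hard tests (pass rate below `threshold`) whose weight is
    below the median weight — candidates for point adjustments."""
    weights = sorted(t["weight"] for t in tests)
    median = weights[len(weights) // 2]
    return [t["name"] for t in tests
            if t["pass_rate"] < threshold and t["weight"] < median]

print(flag_miscalibrated(tests))  # ['edge_cases']
```

The inverse check (an easy test with an outsized weight) is symmetric and equally worth running before finalizing grades.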

Integration with Other Features

The test insights dashboard is designed to work alongside the platform's other assignment management and grading features.

Best Practices

  • Review early: Check insights soon after assignment release to catch issues
  • Act on patterns: Address common failures with announcements or discussion posts
  • Iterate designs: Use insights to improve test suites for future offerings
  • Balance difficulty: Aim for a range of pass rates across tests
  • Communicate findings: Share aggregate insights with students to guide their learning
  • Protect privacy: When discussing patterns, avoid identifying individual students

Example Use Cases

Scenario 1: High Failure Rate

A test has a 20% pass rate. Investigation reveals:
  • Common error: “NullPointerException”
  • Pattern: Students aren’t handling edge cases
  • Action: Post a discussion thread about defensive programming and edge case testing

Scenario 2: Bimodal Distribution

Half the class passes all tests, half fails most tests. Analysis shows:
  • Gap: Students missing prerequisite knowledge
  • Pattern: Correlation with prior assignment performance
  • Action: Create supplementary materials and targeted office hours
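The "correlation with prior assignment performance" in this scenario can be checked with a Pearson correlation over per-student scores. All scores below are invented to illustrate the pattern:

```python
# Hypothetical per-student scores: current assignment vs. the prior one
current = {"a": 95, "b": 90, "c": 20, "d": 15, "e": 88, "f": 25}
prior   = {"a": 92, "b": 85, "c": 40, "d": 30, "e": 90, "f": 35}

def pearson(xs, ys):
    """Pearson correlation coefficient between two equal-length samples."""
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    cov = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    sx = sum((x - mx) ** 2 for x in xs) ** 0.5
    sy = sum((y - my) ** 2 for y in ys) ** 0.5
    return cov / (sx * sy)

students = sorted(current)
r = pearson([current[s] for s in students], [prior[s] for s in students])
print(round(r, 2))  # a strong positive r supports the prerequisite-gap reading
```

A high r is consistent with a prerequisite gap but does not prove it; pairing the number with a look at the failing students' actual code is the safer interpretation.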

Scenario 3: Late Improvement

Students initially fail tests but improve significantly over time:
  • Insight: Assignment is appropriately challenging
  • Pattern: Learning is occurring through iteration
  • Action: Maintain current design, possibly add more scaffolding for future offerings