AI-powered candidate report analysis

Use AI to enhance your analysis of a candidate's coding challenge submissions

Every candidate who completes an Assessment has a report that you can open from that Assessment's dashboard.

If you have opted in and enabled the Project grading & report analysis AI feature, you will see an automatically generated summary of the candidate's coding challenge submissions and, if the assessment includes them, grading for any take-home project challenges.

This appears under the AI analysis section of the candidate's report.

Report analysis

The summary reviews the candidate's performance within each individual challenge on the following topics:

  • Accuracy & Completeness
  • Efficiency
  • Documentation & Best Practices

This is followed by a general analysis of a candidate's technical abilities and understanding of programming languages.

Multiple-choice, open-ended, and other question types are not considered. Please note that candidate submissions should always be reviewed by domain experts before any decision is made.

Project grading

If AI grading is enabled on take-home projects, candidates' submissions are automatically assessed against a custom rubric designed to evaluate best practices and technical coding abilities. The AI-generated analysis displays numerical scores in addition to a summary.