AI Coding
The GenAI Report provides a comprehensive, data-driven view of how AI coding tools, such as GitHub Copilot and Cursor, are influencing your engineering team's efficiency and workflow. This feature helps you move beyond simple adoption rates to understand the true impact on key DevOps metrics such as cycle time and throughput.
Set up the AI Coding Report for Cursor
Set up the AI Coding Report for GitHub Copilot
AI Usage
The Usage tab focuses on how your team is engaging with AI tools. It gives you a clear picture of adoption and utility.

Adoption Rate: This metric shows the percentage of developers actively using the AI tool. You can view this data By User or By Code to understand how widespread AI usage is across your team and codebase.
Acceptance Rate: This metric tracks the percentage of AI-generated suggestions that are accepted. A high acceptance rate indicates the AI is providing useful and accurate code. You can view this Overall or filter by specific programming Language.
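For intuition, here is a minimal sketch of how these two Usage metrics can be computed. The function and field names are illustrative assumptions, not the report's actual schema or API.

```python
# Illustrative calculation of the two Usage metrics described above.
# All names and sample numbers are hypothetical.

def adoption_rate(active_users: int, enabled_users: int) -> float:
    """Percentage of developers with access who actively use the AI tool."""
    return 100.0 * active_users / enabled_users if enabled_users else 0.0

def acceptance_rate(accepted_suggestions: int, total_suggestions: int) -> float:
    """Percentage of AI suggestions that developers accepted."""
    return 100.0 * accepted_suggestions / total_suggestions if total_suggestions else 0.0

print(adoption_rate(active_users=34, enabled_users=50))                   # 68.0 (%)
print(acceptance_rate(accepted_suggestions=420, total_suggestions=1500))  # 28.0 (%)
```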
The table at the bottom of the page provides a detailed breakdown for each team and developer (a simplified calculation sketch follows the list), including:
Enabled Users: The total number of developers who have access to the AI tool.
Active Users: The number of developers who have actively used the tool.
AI Ratio: The proportion of code generated by AI compared to code written by the developer.
Suggestions: The total number of suggestions the AI has provided.
Acceptance Rate: The percentage of suggestions that were accepted.
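As a rough illustration of how these columns relate, the sketch below derives one row from hypothetical per-developer counts. The data structure and field names are assumptions for illustration, not the report's data model.

```python
# Hypothetical rollup of one developer into the table columns described above.
from dataclasses import dataclass

@dataclass
class DeveloperUsage:
    name: str
    suggestions: int   # total AI suggestions shown
    accepted: int      # suggestions the developer accepted
    ai_lines: int      # lines of accepted AI-generated code
    human_lines: int   # lines written without AI assistance

def table_row(dev: DeveloperUsage) -> dict:
    total_lines = dev.ai_lines + dev.human_lines
    return {
        "developer": dev.name,
        "active": dev.suggestions > 0,
        "ai_ratio": round(dev.ai_lines / total_lines, 2) if total_lines else 0.0,
        "suggestions": dev.suggestions,
        "acceptance_rate": round(100.0 * dev.accepted / dev.suggestions, 1) if dev.suggestions else 0.0,
    }

print(table_row(DeveloperUsage("alex", suggestions=1200, accepted=300, ai_lines=900, human_lines=2100)))
# {'developer': 'alex', 'active': True, 'ai_ratio': 0.3, 'suggestions': 1200, 'acceptance_rate': 25.0}
```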
AI Impact
The Impact tab helps you understand the direct effect of AI on your team's performance metrics. The report compares metrics for AI-assisted PRs against metrics for PRs written without AI to show the trend for each metric and the overall effect of AI coding on your team's performance; a minimal sketch of this comparison follows the list below.

PR Cycle Time Change: This chart shows how PR cycle time differs for PRs created with the AI tool versus those created without it. A positive number indicates an increase, while a negative number shows a decrease.
PR Review Time Change: This metric tracks the change in the average time it takes for AI-assisted PRs to be reviewed compared with PRs written without AI coding assistance.
PR Throughput Change: This shows the change in the total number of merged PRs.
PRs/Developer Change: This chart helps you understand if the AI tool is helping each developer complete more PRs.
Code Quality Change: This shows the impact on the number of severe issues reported in code health.
PR Size Change: This metric tracks the change in the average size of your PRs, which can indicate whether the AI is helping to produce more concise code.
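Each of the "Change" metrics above can be read as the percentage difference between the AI-assisted cohort and the non-AI cohort. The sketch below shows that comparison with made-up numbers; it is not the report's exact computation.

```python
# Percentage change between AI-assisted PRs and PRs written without AI.
# Sample values are illustrative only.

def percent_change(ai_value: float, non_ai_value: float) -> float:
    """Positive = higher for AI-assisted PRs; negative = lower."""
    if non_ai_value == 0:
        return 0.0
    return 100.0 * (ai_value - non_ai_value) / non_ai_value

# Example: average cycle time of 22 hours for AI-assisted PRs vs 28 hours without AI.
print(round(percent_change(ai_value=22, non_ai_value=28), 1))  # -21.4, i.e. cycle time dropped
```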
Team-Level Impact Table
This table provides a comprehensive summary of how AI is impacting each of your teams. It compares key metrics for each team to show the direct effect of AI on their performance; a sketch of how one row can be assembled follows the list below.
PR Cycle Time: The change in the average time it takes for a PR to be merged.
PR Review Time: The change in the average time a PR spends in review.
PR Throughput: The change in the total number of merged PRs.
PRs/Developer: The change in the number of PRs merged per developer.
Code Quality: The change in the quality of the code, measured by your pre-defined quality metrics.
PR Size: The change in the average size of PRs.
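The sketch below assembles one hypothetical row of this table by applying the same percentage-change comparison per team. All inputs are invented for illustration.

```python
# One hypothetical team-level row: percentage change per metric,
# comparing AI-assisted PRs against PRs written without AI.

def percent_change(ai_value: float, non_ai_value: float) -> float:
    return 100.0 * (ai_value - non_ai_value) / non_ai_value if non_ai_value else 0.0

team_metrics = {
    # metric: (average for AI-assisted PRs, average for non-AI PRs)
    "pr_cycle_time_hours": (20.0, 26.0),
    "pr_review_time_hours": (6.0, 8.0),
    "pr_throughput": (48, 40),
    "prs_per_developer": (6.0, 5.0),
}

team_row = {metric: round(percent_change(ai, non_ai), 1)
            for metric, (ai, non_ai) in team_metrics.items()}
print(team_row)
# {'pr_cycle_time_hours': -23.1, 'pr_review_time_hours': -25.0,
#  'pr_throughput': 20.0, 'prs_per_developer': 20.0}
```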