AI Coding

The GenAI Report provides a comprehensive, data-driven view of how AI coding tools such as GitHub Copilot, Cursor, and Claude Code influence your engineering team's efficiency and workflow. This feature helps you move beyond simple adoption rates to understand the true impact on key DevOps metrics such as cycle time and throughput.

Set up the AI Coding Report for Cursor
Set up the AI Coding Report for GitHub Copilot
Set up the AI Coding Report for Claude Code

AI Usage

The Usage tab focuses on how your team uses AI tools. It gives you a clear picture of adoption and utility.

Adoption Rate

Adoption Rate represents the percentage of developers actively using the AI tool within your organization. This helps you understand how widely AI is being adopted across your teams and codebase.

You can view this metric in two ways:

  • By User – Displays the number of active users (as tracked in Typo) out of the total enabled users within the selected time range.

  • By Code – Shows the proportion of code changes made using AI compared to the total code changes during the selected time period.
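Both views reduce to a simple ratio. The sketch below illustrates the arithmetic with invented counts; the function and parameter names (`active_users`, `enabled_users`, `ai_loc`, `total_loc`) are assumptions for illustration, not Typo's actual API.

```python
# Hypothetical illustration of the two Adoption Rate views.
# All names and numbers here are invented for this sketch.

def adoption_rate_by_user(active_users: int, enabled_users: int) -> float:
    """Percentage of enabled users who were active in the time range."""
    return 100 * active_users / enabled_users

def adoption_rate_by_code(ai_loc: int, total_loc: int) -> float:
    """Share of code changes made with AI out of all code changes."""
    return 100 * ai_loc / total_loc

print(adoption_rate_by_user(32, 40))      # 80.0
print(adoption_rate_by_code(1200, 4800))  # 25.0
```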

Acceptance Rate

Acceptance Rate measures the percentage of AI-generated suggestions that are accepted by developers. A higher acceptance rate indicates that the AI is generating relevant and useful suggestions.

You can analyze this metric in the following ways:

  • Overall – Displays the total accepted suggestions as a percentage of all suggestions generated.

  • By Language – Breaks down generated and accepted suggestions at the programming language level.

  • By Terminal (Claude) – For Claude, shows the breakdown of generated and accepted suggestions at the terminal level.

  • By IDE (Copilot) – For Copilot, displays the breakdown of generated and accepted suggestions at the IDE level.
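The Overall and By Language views can be sketched as follows, using made-up suggestion counts; the per-language data structure is an assumption for illustration only.

```python
# Sketch of the Acceptance Rate breakdowns with invented counts.
suggestions = {
    "python":     {"generated": 500, "accepted": 210},
    "typescript": {"generated": 300, "accepted": 96},
}

def acceptance_rate(accepted: int, generated: int) -> float:
    """Accepted suggestions as a percentage of generated suggestions."""
    return 100 * accepted / generated

# Overall: total accepted as a percentage of all generated suggestions.
total_gen = sum(s["generated"] for s in suggestions.values())
total_acc = sum(s["accepted"] for s in suggestions.values())
print(f"overall: {acceptance_rate(total_acc, total_gen):.2f}%")  # overall: 38.25%

# By Language: the same ratio at the programming-language level.
for lang, s in suggestions.items():
    print(f"{lang}: {acceptance_rate(s['accepted'], s['generated']):.1f}%")
```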

The table at the bottom of the page provides a detailed breakdown for each team and developer, including:

  • Enabled Users: The total number of developers who have access to the AI tool.

  • Active Users: The number of developers who have actively used the tool in the selected time range.

  • AI Ratio: The proportion of code generated by AI compared to code written by the developer.

  • Suggestions: The total number of suggestions the AI has provided.

  • Acceptance Rate: The percentage of suggestions that were accepted.

AI Impact

The Impact tab helps you measure the direct effect of AI usage on your team’s performance metrics.

We compare AI-generated PR metrics with non-AI PR metrics to identify trends and highlight how AI coding influences individual metrics, as well as overall team performance.

Enabling the Impact Module

To enable this module, please reach out to us at hello@typoapp.io and share how your team tags AI-related PRs. Based on your tagging method, we will configure the setup accordingly.

  • PR Cycle Time Change: This chart shows how your PR cycle time has changed when using the AI tool versus not using it. A positive number indicates an increase in cycle time, while a negative number shows a decrease.

  • PR Review Time Change: This metric tracks the change in the average time it takes for an AI-assisted PR to be reviewed, compared with PRs written without AI coding assistance.

  • PR Throughput Change: This shows the change in the total number of merged PRs.

  • PRs/Developer Change: This chart helps you understand if the AI tool is helping each developer complete more PRs.

  • Code Quality Change: This shows the impact on the number of severe code health issues.

  • PR Size Change: This metric tracks the change in the average size of your PRs, which can indicate whether the AI is helping to produce more concise code.
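Each of these "change" metrics can be read as a signed percentage difference between the AI-assisted and non-AI PR averages. The sketch below shows that arithmetic with invented values; the function name and numbers are assumptions, not Typo's internal implementation.

```python
# How an Impact-tab "change" metric might be derived: a signed
# percentage difference between AI and non-AI PR averages.
# Positive = increase for AI PRs; negative = decrease.

def pct_change(ai_value: float, non_ai_value: float) -> float:
    return 100 * (ai_value - non_ai_value) / non_ai_value

# e.g. average PR cycle time in hours (invented numbers):
print(pct_change(ai_value=30.0, non_ai_value=40.0))  # -25.0  (cycle time dropped 25%)
```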

Team-Level Impact Table

This table provides a comprehensive summary of how AI is impacting each of your teams. It compares key metrics for each team to show the direct effect of AI on their performance.

  • PR Cycle Time: The change in the average time it takes for a PR to be merged.

  • PR Review Time: The change in the average time a PR spends in review.

  • PR Throughput: The change in the total number of merged PRs.

  • PRs/Developer: The change in the number of PRs merged per developer.

  • Code Quality: The change in the quality of the code, measured by your pre-defined quality metrics.

  • PR Size: The change in the average size of PRs.


To enable this report for your account, navigate to Settings > Integrations and connect the AI coding tool used by your team.

You can add multiple coding assistant tools from the Integrations section as needed.
