Data Analyst Performance Review Phrases: 75+ Examples for Every Rating Level


TL;DR: 75+ ready-to-use data analyst performance review phrases for managers and employees, organized by competency area and rating level. Covers the full range from SQL craft to strategic insight delivery.

A data analyst who builds ten dashboards that nobody acts on contributes less than one who delivers a single analysis that changes a decision. Reviews should measure the latter.


How to Write Effective Data Analyst Performance Reviews

Data analyst reviews get stuck on outputs. “Built X dashboards, wrote Y queries, produced Z reports” is a workload description, not a performance assessment. The question that distinguishes a good data analyst review from a great one is whether the work changed anything. Did the analysis inform a decision? Did the dashboard get used, or did it become another tab nobody opens? Did the analyst translate what the data was saying into something the business could act on?

The insight quality dimension is where strong data analysts separate themselves — and it is chronically underrepresented in reviews. A data analyst who can identify the right question to ask, not just answer the question they were given, is operating at a senior level. A data analyst who anticipates what a business partner needs to see before being asked is creating genuine leverage. These behaviors show up in the work but rarely show up in reviews unless the manager is specifically looking for them.

Technical skill matters and deserves honest assessment. SQL proficiency, dbt modeling discipline, Looker or Tableau dashboard design, and Snowflake or BigQuery query optimization are all real and differentiable competencies. But technical skill in service of bad analysis framing produces technically correct answers to the wrong questions. The best reviews assess both: the craft of the analysis and the judgment that directed it.

For employees: the review language pattern to aim for is “analyzed [what], discovered [finding], which led to [business outcome].” If you cannot fill in the last clause, the phrase describes work but not value. Prov helps you capture that final clause at the moment it happens — when you present a finding that leads to a decision, not six months later when you are trying to remember what happened.


How to Use These Phrases

For Managers

These phrases work best when you replace the generic descriptions with specifics from your direct report’s actual work: the project name, the business unit they supported, the number they moved, the decision they informed. The goal is a review that the analyst could not have received if they worked at a different company — that specificity is what makes feedback credible and motivating.

For Employees

Read the “Exceeds Expectations” phrases for each competency and ask whether your work from the past year has examples that fit. If it does, bring them to your self-assessment. If it does not, those phrases describe the level you are developing toward. The “Needs Development” phrases are written to be growth-framed — they name the gap and point at what would close it.

Rating Level Guide

| Rating | What it means for Data Analysts |
| --- | --- |
| Exceeds Expectations | Analysis drove decisions; stakeholders seek out this analyst proactively; technical work is trusted, well-documented, and reusable by others |
| Meets Expectations | Analysis is accurate and delivered on time; stakeholders are served reliably; technical work is solid and maintainable |
| Needs Development | Analysis requires more oversight than the role warrants; outputs describe data without interpreting it; technical debt is accumulating in owned work |
[Figure: STAR method framework for performance review examples]

Analysis Quality & Insight Performance Review Phrases

Exceeds Expectations

  1. Reframed a stakeholder's data request — originally scoped as a reporting task — into a hypothesis-driven analysis that identified a $900K revenue recovery opportunity the business had not known to look for.
  2. Delivered a customer segmentation analysis using SQL and Python that surfaced three distinct behavioral cohorts, directly shaping the product team's roadmap prioritization for the following two quarters.
  3. Proactively identified a seasonality pattern in support ticket volume that the operations team had attributed to staffing issues, providing the data foundation for a scheduling change that reduced overtime costs by 22%.
  4. Produced a cohort retention analysis in BigQuery that diagnosed the point in the user journey where churn was concentrated, giving the growth team a specific, testable intervention hypothesis instead of a general "improve retention" mandate.
  5. Synthesized findings from four disparate data sources into a unified executive briefing that answered a strategic question the leadership team had been debating for two quarters without resolution.

Meets Expectations

  1. Delivered accurate, clearly framed analyses on all assigned projects, presenting findings in a format appropriate to the audience and including actionable interpretation alongside the data.
  2. Completed ad-hoc analysis requests within agreed timelines, communicating proactively when scope or data complexity affected delivery estimates.
  3. Produced analysis outputs that stakeholders could act on independently, including clear methodology notes and assumptions documentation alongside findings.
  4. Consistently distinguished between correlation and causation in findings presentations, maintaining analytical credibility with technically sophisticated stakeholders.

Needs Development

  1. Analysis outputs tend to present data without interpretation — developing the habit of leading every analysis delivery with the key finding and its business implication, before the supporting detail, would increase the influence of this work significantly.
  2. There is an opportunity to move from answering the question asked to interrogating whether it is the right question; practicing stakeholder conversations that explore the business decision being made before scoping the analysis would increase the strategic value of this work.
  3. Analysis assumptions are not always documented, which has required follow-up conversations to re-establish methodology context; building a consistent practice of including an assumptions section in every analysis deliverable would reduce this friction.

SQL & Technical Skill Performance Review Phrases

Exceeds Expectations

  1. Refactored a critical reporting query that had been running for forty minutes into an optimized BigQuery job completing in under three minutes, improving the reliability of a daily executive dashboard and reducing compute costs by an estimated 60%.
  2. Built a suite of dbt models that standardized metric definitions across three business teams, eliminating a persistent discrepancy in revenue reporting that had required monthly reconciliation effort.
  3. Developed reusable SQL libraries and Jinja templates in dbt that reduced new analysis development time by approximately 30% for the full analytics team.
  4. Identified and resolved a Snowflake schema design issue causing fan-out joins that had been skewing aggregated metrics for six months, restoring accuracy to four production dashboards that leadership relied on weekly.
  5. Built a Python-based data pipeline that automated a previously manual data preparation process consuming twelve analyst-hours per week, freeing capacity for higher-value work across the team.

Meets Expectations

  1. Wrote accurate, well-structured SQL consistently throughout the year, producing queries that were readable, maintainable, and documented well enough for other team members to modify without assistance.
  2. Applied dbt best practices to all model development work, including appropriate testing, clear documentation, and consistent naming conventions aligned with team standards.
  3. Worked effectively in Snowflake and BigQuery across a range of analysis types, demonstrating solid understanding of cost and performance implications when writing queries against large tables.
  4. Used Python appropriately to extend analysis capabilities beyond SQL, applying it to statistical operations and data manipulation tasks where it added meaningful efficiency or capability.

Needs Development

  1. SQL outputs are functionally correct but often structured in ways that are difficult to maintain or build on; investing in query readability practices — CTE structure, consistent naming, inline comments — would improve the team's ability to leverage and trust this work.
  2. dbt model development has not consistently followed team testing conventions, creating gaps in data quality coverage that have surfaced in production; working through a structured dbt testing review with a senior analyst would close this gap.
  3. Performance optimization is an area for growth; queries on large datasets have occasionally caused cost and latency issues that a query review habit before production deployment would prevent.
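The readability practices named above — CTE structure, consistent naming, inline comments — are concrete enough to sketch. A minimal before-and-after illustration, using BigQuery-style syntax since the article references it; the table and column names (`orders`, `region`, `amount`) are hypothetical:

```sql
-- Harder to maintain: the aggregation logic is buried in a nested,
-- unnamed subquery that the next analyst has to reverse-engineer.
--   SELECT region, MAX(total)
--   FROM (SELECT region, ... , SUM(amount) AS total FROM orders GROUP BY ...) t
--   GROUP BY region;

-- More maintainable: each step is a named CTE with a comment stating intent.
WITH monthly_revenue AS (
    -- Step 1: aggregate order amounts by region and calendar month
    SELECT
        region,
        DATE_TRUNC(order_date, MONTH) AS order_month,
        SUM(amount) AS revenue
    FROM orders
    GROUP BY region, order_month
)
-- Step 2: report each region's best month
SELECT
    region,
    MAX(revenue) AS peak_monthly_revenue
FROM monthly_revenue
GROUP BY region;
```

The second form does the same work, but a reviewer can verify each named step independently, which is what makes the query "reusable by others" in the sense the Exceeds Expectations phrases describe.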

Dashboarding & Reporting Performance Review Phrases

Exceeds Expectations

  1. Redesigned the company's core revenue dashboard in Tableau, replacing a twelve-chart sprawl with a focused five-metric executive view that the CFO adopted as the primary board reporting tool.
  2. Built a self-service Looker reporting layer for the marketing team that reduced ad-hoc data requests to the analytics team by 35%, enabling marketers to answer routine questions independently.
  3. Established a dashboard governance framework that consolidated eleven conflicting metric definitions in Looker into a single canonical definition per metric, eliminating cross-team reporting disputes that had consumed significant analyst time.
  4. Developed a real-time operational dashboard in Tableau that gave the customer success team visibility into account health signals for the first time, enabling proactive outreach that improved retention on the monitored segment by 18%.
  5. Conducted a reporting audit that identified and deprecated forty-seven unused dashboards, reducing maintenance overhead, improving query performance, and creating a cleaner, trusted reporting environment for new analysts joining the team.

Meets Expectations

  1. Built and maintained dashboards in Looker and Tableau that were accurate, well-documented, and used consistently by the business teams they were designed for.
  2. Responded to reporting requests with well-structured outputs that required minimal follow-up clarification, demonstrating solid understanding of what each stakeholder needed to see.
  3. Applied consistent design practices to dashboard development — appropriate chart types, clear labeling, accessible color use — resulting in outputs that stakeholders could interpret without guidance.
  4. Maintained owned dashboards proactively, catching and correcting data issues before stakeholders encountered them and communicating scheduled changes in advance.

Needs Development

  1. Dashboards often contain more charts than the audience needs; practicing a "one question, one chart" discipline before publishing would produce more focused, actionable reports and reduce the cognitive load on stakeholders.
  2. There have been several instances where dashboard data was not validated before sharing with leadership; establishing a personal pre-publish checklist — metric definition review, edge case validation, time range check — would prevent the credibility risk of distributing inaccurate data.
  3. Documentation on owned Looker content is sparse, making it difficult for other analysts to maintain or build on this work; adding field descriptions and dashboard context notes is a clear and achievable near-term goal.

Stakeholder Enablement Performance Review Phrases

Exceeds Expectations

  1. Partnered with the product team as an embedded analyst for two quarters, developing deep enough domain knowledge to anticipate analysis needs before they were requested and reducing the team's decision cycle time measurably.
  2. Ran a series of "data literacy" sessions for the marketing team that built their ability to interpret and challenge analytical outputs independently, reducing the volume of misinterpreted analyses escalated to the data team.
  3. Translated a complex multi-variable churn analysis into a one-page decision memo that gave a non-technical VP the clear, action-oriented summary they needed to approve a $500K retention investment.
  4. Built a relationship with the sales operations team that resulted in being included in strategic planning discussions, giving the analytics function early visibility into questions that historically arrived as urgent ad-hoc requests.
  5. Developed an analysis intake process with the growth team that clarified business questions before analysis began, reducing rework from misaligned scope by approximately 40% and improving stakeholder satisfaction with turnaround time.

Meets Expectations

  1. Worked effectively with multiple business teams throughout the year, building credibility as a reliable analytical partner who communicates clearly and delivers on commitments.
  2. Presented analysis findings in a format appropriate to the audience — executive summaries for senior stakeholders, detailed methodology for technical partners — without needing guidance on which format to apply.
  3. Managed stakeholder expectations well on complex or data-constrained requests, communicating limitations clearly and proposing alternative approaches when original requests were not feasible.

Needs Development

  1. Stakeholder relationships are largely transactional at present; investing in regular check-ins with key business partners between project requests would build the trust and domain knowledge that make analysis more targeted and more influential.
  2. There is an opportunity to communicate more confidently in cross-functional meetings; analysis work is strong, but findings are often under-presented — practicing a "lead with the finding" communication structure in stakeholder meetings would increase the impact of the work.
  3. Requests from unfamiliar stakeholders are sometimes taken at face value without exploring the underlying business question; developing a habit of asking "what decision will this inform?" before beginning analysis would improve scope alignment and reduce rework.

Data Quality Stewardship Performance Review Phrases

Exceeds Expectations

  1. Identified a data pipeline bug causing a systematic 8% undercount in a key activation metric that had gone undetected for three months, diagnosed the root cause in Snowflake, and led the remediation effort that restored accurate reporting.
  2. Implemented a dbt testing suite covering forty-two production models that reduced data quality incidents reaching business stakeholders by over 60% compared to the prior year.
  3. Developed a data quality monitoring dashboard that gave the analytics team real-time visibility into pipeline health, transforming the team's approach to data quality from reactive firefighting to proactive detection.
  4. Authored a data dictionary for the company's core business metrics that resolved persistent definitional conflicts between finance, product, and marketing, becoming the canonical reference the business now uses for board reporting.
  5. Championed a data governance initiative that established ownership, documentation standards, and review cadences for all tier-one data assets, reducing the risk of undetected data issues reaching executive reports.

Meets Expectations

  1. Maintained high data quality standards in all owned dbt models and pipelines, applying appropriate testing and catching issues before they affected downstream consumers.
  2. Documented data sources, transformation logic, and metric definitions consistently, ensuring that owned work could be understood and maintained by other team members.
  3. Reported data quality issues promptly when discovered, communicating impact clearly and following through on remediation without requiring management escalation.
  4. Applied a healthy skepticism to incoming data, validating source data before beginning analysis and flagging anomalies rather than presenting them as findings without investigation.

Needs Development

  1. Several data quality issues in owned pipelines were discovered by stakeholders rather than caught proactively; establishing a regular self-audit habit — reviewing dbt test results and anomaly patterns weekly — would shift this dynamic before it affects analytical credibility.
  2. Metric definitions in owned Looker content have diverged from the agreed definitions in some cases; a scheduled quarterly review of owned metrics against the data dictionary would prevent these inconsistencies from compounding.
  3. Data quality concerns are sometimes raised informally rather than tracked; using the team's issue tracking process consistently would improve visibility, prioritization, and accountability for resolution.

How Prov Helps Build the Evidence Behind Every Review

The gap between a data analyst who gets “meets expectations” and one who gets “exceeds” is rarely about the quality of the analysis — it is about whether the outcome was documented. When you deliver a finding that changes a product decision, that is a career-defining moment. When you fix a metric definition that had been causing cross-team confusion for months, that is an organizational contribution. But if those moments are not captured when they happen, they disappear into the noise of the next request, and review time becomes a memory exercise with declining returns.

Prov is a career achievement capture app that turns rough notes into polished achievement statements in real time. After a stakeholder presents your analysis to the exec team and acts on it, you open Prov, describe what happened in a sentence or two, and Prov transforms it into a structured achievement with the relevant skills and impact extracted. By review time, the evidence is already organized — specific, outcome-focused, and ready to inform a self-assessment or strengthen a manager’s review language.

Ready to Track Your Wins?

Stop forgetting your achievements. Download Prov and start building your career story today.

Download Free on iOS. No credit card required.