
Data Analyst Job Description Template (2026)

A free, copy-ready Data Analyst job description covering responsibilities, must-have skills, tools, seniority variants, and KPIs. Written for hiring managers, not for SEO filler.

Key facts

Role: Data Analyst
Reports to: Head of Data
Must-have skills: 7 items
Seniority tiers: Junior / Mid / Senior
KPIs defined: 6 metrics
Starting price (offshore): $2,000/month

Role summary

A Data Analyst owns the numbers our product, marketing, and finance teams make decisions on: writing SQL against the warehouse, modeling metrics in dbt or LookML, building dashboards in Looker/Tableau/Metabase, running funnel and cohort analyses, designing and reading A/B tests, and writing up findings so non-technical stakeholders actually make a decision instead of asking for another pull. This role is about defensible answers and stakeholder judgment, not pipeline engineering.

Must-have skills

  • 3+ years writing SQL in production against a cloud warehouse — Snowflake, BigQuery, Redshift, or Postgres.
  • Fluent with window functions, CTEs, correlated subqueries, and recognizing fan-out from incorrect join grain.
  • Hands-on experience in at least one BI tool (Looker/LookML, Tableau, Power BI, Mode, or Metabase) with dashboards in production for a real team.
  • Comfortable designing and reading A/B tests — significance, power, sample size, guardrail metrics.
  • Cohort and retention analysis in SQL or in a product analytics tool (Mixpanel, Amplitude, PostHog, Heap).
  • Working knowledge of Python (pandas, matplotlib) or R for analysis that SQL cannot reach cleanly.
  • Strong written English — can turn a messy Slack thread into a one-page brief with a recommendation.
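The fan-out bullet above is the single most common way a "simple join" produces a wrong number. A minimal pandas sketch with made-up figures: joining an order-grain table onto a payment-grain table silently duplicates revenue rows.

```python
import pandas as pd

# Hypothetical toy data: one row per order (order grain).
orders = pd.DataFrame({
    "order_id": [1, 2],
    "revenue": [100.0, 50.0],
})

# Payments land at payment grain: order 1 was paid in two installments.
payments = pd.DataFrame({
    "order_id": [1, 1, 2],
    "payment_method": ["card", "card", "paypal"],
})

# Joining order-grain revenue onto payment-grain rows fans order 1
# out into two rows, so summing revenue afterwards double-counts it.
fanned = orders.merge(payments, on="order_id")
wrong_total = fanned["revenue"].sum()      # 250.0, not 150.0

# Fix: aggregate the many-side back to order grain before joining.
pay_per_order = payments.groupby("order_id", as_index=False).agg(
    n_payments=("payment_method", "size")
)
correct = orders.merge(pay_per_order, on="order_id")
right_total = correct["revenue"].sum()     # 150.0

print(wrong_total, right_total)
```

QA'ing the joined total against a known total (here, 150.0 from the order-grain table) is exactly the check the skill bullet asks for.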

Nice-to-have skills

  • dbt experience modeling marts owned by the analytics team.
  • LookML development including Explores, derived tables, and persistent derived tables.
  • Marketing attribution experience with GA4, platform APIs (Meta, Google Ads), and warehouse reconciliation.
  • Hex, Deepnote, or Jupyter notebooks for reproducible ad-hoc analysis.
  • Statistical literacy beyond A/B: regression, clustering, Prophet/statsmodels forecasting.
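For the A/B testing bar in the must-have list (significance, power, sample size), the standard two-proportion sample-size arithmetic looks like this. This is a textbook normal-approximation sketch using only the Python standard library, not any particular experimentation platform's method.

```python
from math import ceil
from statistics import NormalDist

def sample_size_per_arm(p_base, p_variant, alpha=0.05, power=0.8):
    """Per-arm sample size for a two-sided two-proportion z-test,
    using the normal approximation (a standard textbook formula)."""
    z_alpha = NormalDist().inv_cdf(1 - alpha / 2)   # critical value
    z_beta = NormalDist().inv_cdf(power)            # power quantile
    variance = p_base * (1 - p_base) + p_variant * (1 - p_variant)
    effect = p_variant - p_base
    return ceil((z_alpha + z_beta) ** 2 * variance / effect ** 2)

# Detecting a 10% -> 12% conversion lift at alpha = 0.05 and 80% power
# needs a few thousand users per arm; halving the lift roughly
# quadruples the requirement.
n = sample_size_per_arm(0.10, 0.12)
print(n)
```

This is the "up-front power analysis" the KPI section holds concluded tests to: the sample size is fixed before the test starts, not read off after.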

Reporting structure

Reports to the Head of Data, Analytics Manager, or VP Growth depending on org. Partners daily with product managers, growth marketing, finance, and the data engineering team that owns the warehouse and dbt project.

Seniority variants

How responsibilities shift across junior, mid, and senior levels.

Junior (1-2 years)

  • Answer scoped ad-hoc SQL requests under review from a senior analyst.
  • Maintain existing dashboards — fix broken tiles, add requested filters, refresh stale docs.
  • Write up weekly KPI summaries using an existing template.
  • Shadow A/B test readouts and learn the metrics dictionary.

Mid (3-5 years)

  • Own a domain (growth, product, or finance analytics) end-to-end.
  • Design and read A/B tests independently and present results to product leadership.
  • Build new dashboards and LookML / dbt models with tests and docs.
  • Run stakeholder intake and triage — kill the bad requests, scope the good ones.

Senior (6+ years)

  • Set the metrics standard across the company — definitions, ownership, and governance.
  • Mentor mid and junior analysts, run hiring loops, and set code-review standards for SQL and LookML.
  • Partner with execs on pricing, retention, and forecasting work that moves roadmap decisions.
  • Own experimentation platform choice and A/B testing practice across product teams.

Full JD (copy-ready)

Paste this into your ATS or careers page. Edit the company name and any bracketed placeholders.

# Data Analyst — Job Description

## Role summary
A Data Analyst owns the numbers our product, marketing, and finance teams make decisions on: writing SQL against the warehouse, modeling metrics in dbt or LookML, building dashboards in Looker/Tableau/Metabase, running funnel and cohort analyses, designing and reading A/B tests, and writing up findings so non-technical stakeholders actually make a decision instead of asking for another pull. This role is about defensible answers and stakeholder judgment, not pipeline engineering.

## Responsibilities
- Write production-quality SQL against Snowflake, BigQuery, or Redshift — CTEs, window functions (LAG, ROW_NUMBER, SUM() OVER), correct grain on joins, QA'd against known totals.
- Model metrics in dbt or Looker LookML with clear grain, primary key tests, and documentation so numbers agree across every dashboard.
- Build and maintain dashboards in Looker, Tableau, Power BI, Mode, or Metabase tuned to the actual decisions stakeholders make weekly.
- Run funnel, cohort, and retention analysis in SQL or in Mixpanel/Amplitude against event tables; quantify drop-off in revenue terms.
- Design A/B tests with up-front power analysis, pick primary and guardrail metrics, and read results without p-hacking the segment.
- Own marketing attribution — first touch, last touch, multi-touch — and reconcile GA4, ad platform, and warehouse numbers when they disagree.
- Turn vague "can you pull the numbers" requests into a specific, answerable business question before touching SQL.
- Write weekly and monthly business reviews in Notion or Slides with the chart, the bottom line, and the recommended action up front.
- Maintain a metrics dictionary so "active user" and "revenue" mean the same thing in finance, product, and marketing.
- Run self-serve enablement so PMs and growth leads can answer their own questions in Looker or Metabase instead of filing tickets.
- Flag data quality issues upstream to the data engineering team with a reproducible query, not a Slack complaint.
- Push back when a request would produce a misleading number and propose a better framing.
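The attribution responsibility above is concrete enough to sketch. A minimal first-touch vs last-touch comparison in pandas; the channels, users, and values are invented for illustration, and a real version would read from warehouse touchpoint tables.

```python
import pandas as pd

# Hypothetical touchpoint log: one row per (user, channel touch).
touches = pd.DataFrame({
    "user_id": [1, 1, 1, 2, 2],
    "channel": ["paid_search", "email", "direct", "social", "paid_search"],
    "ts": pd.to_datetime(["2026-01-01", "2026-01-03", "2026-01-05",
                          "2026-01-02", "2026-01-04"]),
    # Conversion value lands on the converting touch, 0 elsewhere.
    "conversion_value": [0, 0, 100, 0, 40],
})

value_per_user = touches.groupby("user_id")["conversion_value"].sum()

ordered = touches.sort_values("ts")
first_touch = ordered.groupby("user_id")["channel"].first()
last_touch = ordered.groupby("user_id")["channel"].last()

# Credit each user's full conversion value to one channel per model.
# The two models disagree by design; reconciling that disagreement
# (and both against GA4 / ad-platform numbers) is the actual job.
first_credit = value_per_user.groupby(first_touch).sum()
last_credit = value_per_user.groupby(last_touch).sum()
print(first_credit)
print(last_credit)
```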

## Must-have skills
- 3+ years writing SQL in production against a cloud warehouse — Snowflake, BigQuery, Redshift, or Postgres.
- Fluent with window functions, CTEs, correlated subqueries, and recognizing fan-out from incorrect join grain.
- Hands-on experience in at least one BI tool (Looker/LookML, Tableau, Power BI, Mode, or Metabase) with dashboards in production for a real team.
- Comfortable designing and reading A/B tests — significance, power, sample size, guardrail metrics.
- Cohort and retention analysis in SQL or in a product analytics tool (Mixpanel, Amplitude, PostHog, Heap).
- Working knowledge of Python (pandas, matplotlib) or R for analysis that SQL cannot reach cleanly.
- Strong written English — can turn a messy Slack thread into a one-page brief with a recommendation.
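For screening on the window-function bullet, the same operations map onto pandas one-for-one. A toy order history with illustrative column names; each line comments the SQL it mirrors.

```python
import pandas as pd

# Hypothetical per-user order history, the kind of table LAG and
# ROW_NUMBER are typically run against in the warehouse.
orders = pd.DataFrame({
    "user_id": [1, 1, 1, 2, 2],
    "order_ts": pd.to_datetime(["2026-01-01", "2026-01-10", "2026-02-01",
                                "2026-01-05", "2026-01-20"]),
    "amount": [20.0, 35.0, 15.0, 50.0, 10.0],
}).sort_values(["user_id", "order_ts"])

g = orders.groupby("user_id")
# ROW_NUMBER() OVER (PARTITION BY user_id ORDER BY order_ts)
orders["order_n"] = g.cumcount() + 1
# LAG(order_ts) OVER (PARTITION BY user_id ORDER BY order_ts)
orders["prev_ts"] = g["order_ts"].shift(1)
# SUM(amount) OVER (PARTITION BY user_id ORDER BY order_ts), a running total
orders["running_amount"] = g["amount"].cumsum()
print(orders)
```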

## Nice-to-have skills
- dbt experience modeling marts owned by the analytics team.
- LookML development including Explores, derived tables, and persistent derived tables.
- Marketing attribution experience with GA4, platform APIs (Meta, Google Ads), and warehouse reconciliation.
- Hex, Deepnote, or Jupyter notebooks for reproducible ad-hoc analysis.
- Statistical literacy beyond A/B: regression, clustering, Prophet/statsmodels forecasting.

## Tools and technology
- SQL (Snowflake / BigQuery / Redshift)
- Looker / LookML
- Tableau
- Metabase / Mode
- dbt
- Python (pandas)
- Google Analytics 4
- Mixpanel / Amplitude
- Hex
- Excel / Google Sheets

## Reporting structure
Reports to the Head of Data, Analytics Manager, or VP Growth depending on org. Partners daily with product managers, growth marketing, finance, and the data engineering team that owns the warehouse and dbt project.

## Success metrics (KPIs)
- Stakeholder decisions shipped per quarter backed by analyses they can cite — not number of dashboards built.
- Metrics agreement: zero conflicting numbers between finance, product, and marketing on shared KPIs.
- Dashboard reliability: greater than 99% of scheduled dashboards refresh green weekly.
- A/B test quality: every concluded test has up-front power analysis and a documented guardrail read.
- Self-serve adoption: queries run by non-analysts in Looker / Metabase trending up quarter-over-quarter.
- Turnaround on ad-hoc requests: median under 2 business days for scoped questions.

Frequently asked questions

What does a Data Analyst do day-to-day?

A Data Analyst owns the numbers our product, marketing, and finance teams make decisions on: writing SQL against the warehouse, modeling metrics in dbt or LookML, building dashboards in Looker/Tableau/Metabase, running funnel and cohort analyses, designing and reading A/B tests, and writing up findings so non-technical stakeholders actually make a decision instead of asking for another pull. This role is about defensible answers and stakeholder judgment, not pipeline engineering.
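A typical cohort-retention pull from that day-to-day, sketched in pandas on a toy event log. The months and users are invented; in practice the input would be a deduplicated user-month activity table from the warehouse or a product analytics export.

```python
import pandas as pd

# Hypothetical event log: one row per user per active month.
events = pd.DataFrame({
    "user_id": [1, 1, 1, 2, 2, 3],
    "active_month": pd.PeriodIndex(
        ["2026-01", "2026-02", "2026-03", "2026-01", "2026-03", "2026-02"],
        freq="M"),
})

# Cohort = each user's first active month.
events["cohort"] = events.groupby("user_id")["active_month"].transform("min")

# Whole months since the cohort month (0 = the signup month itself).
events["period"] = (
    events["active_month"].dt.year * 12 + events["active_month"].dt.month
    - events["cohort"].dt.year * 12 - events["cohort"].dt.month
)

# Distinct users active per cohort per period, as a share of cohort size.
counts = events.pivot_table(index="cohort", columns="period",
                            values="user_id", aggfunc="nunique")
retention = counts.div(counts[0], axis=0)
print(retention.round(2))
```

The same shape comes out of a SQL query with a MIN() window for the cohort and a GROUP BY on (cohort, period); the pandas version is just easier to show self-contained.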

How many years of experience should a mid-level Data Analyst have?

A mid-level Data Analyst typically has 3-5 years of experience. At that level they should own a domain (growth, product, or finance analytics) end-to-end.

Which KPIs should I hold a Data Analyst accountable to?

The most important KPIs for a Data Analyst are: stakeholder decisions shipped per quarter backed by analyses they can cite (not number of dashboards built); metrics agreement, meaning zero conflicting numbers between finance, product, and marketing on shared KPIs; dashboard reliability, with more than 99% of scheduled dashboards refreshing green weekly; and A/B test quality, with every concluded test backed by an up-front power analysis and a documented guardrail read.

Can you match our BI tool (Looker, Tableau, Power BI, Metabase, Hex)?

Yes, and we match on recent production experience. Our shortlist only includes analysts whose last 12 months of work were on your exact tool. A Tableau analyst and a Looker analyst write code that looks nothing alike because the modeling layers are different, and we would rather wait an extra week than send you someone who has to learn LookML on your dime. For teams migrating between tools (say Tableau to Looker) we can match analysts who have done that specific migration before.

How do they handle ambiguous stakeholder requests?

They push back before writing a single line of SQL. Standard practice is to ask three questions in the ticket: what decision will this number drive, what time window are we comparing against, and what does "good" look like. Most requests that start as "can you pull the numbers" turn into a different question once those three are answered, and the request is usually closed without producing a dashboard at all. This is not laziness, it is what keeps the analytics team from drowning in one-off pulls that nobody uses.

Written by Syed Ali

Founder, Remoteria

Syed Ali founded Remoteria after a decade building distributed teams across 4 continents. He has helped 500+ companies source, vet, onboard, and scale pre-vetted offshore talent in engineering, design, marketing, and operations.

  • 10+ years building distributed remote teams
  • 500+ successful offshore placements across US, UK, EU, and APAC
  • Specialist in offshore vetting and cross-timezone team integration

Last updated: April 12, 2026