Guide

Client‑Proof ROI Dashboard Kit (Stripe→Looker Studio→Notion)

Copy‑paste schemas, SQL, and settings to ship a secure Stripe→Looker Studio dashboard with conservative ROI tiles, plus a one‑click Notion embed. Built for solo operators who want clients to see value without manual reports.

Ship a self‑serve, view‑only ROI dashboard your clients will check without you nudging them. This kit gives you the exact schemas, connection options, security settings, and ROI math—so you can stand it up today and migrate cleanly from Sheets to BigQuery later without breaking charts.

What this kit delivers

Use this when you sell ongoing automation and need clients to see revenue and time‑savings without weekly screenshots. The dashboard pulls Stripe revenue and subscription data, overlays your operations time‑savings, and renders CFO‑friendly ROI tiles. All steps assume <50K Stripe events/month—perfect for a solo shop.

Choose your Stripe→Looker Studio route (with trade‑offs)

Pick based on your current volume, budget, and security posture. Our default for solo operators: start on Sheets (fast, cheap), then migrate to BigQuery (secure, scalable) once the client count grows.

  • Option A — Direct connector (Supermetrics/Windsor, etc.)
  • Pros: Fastest to live; no staging; managed refresh.
  • Cons: Ongoing license; vendor lock‑in.
  • Good fit: 1–3 clients, you value speed over cost.
  • Option B — Sheets staging (Make/Zapier → Google Sheets → Looker Studio)
  • Pros: Lowest cost; easiest to debug; great for <50K events/mo.
  • Cons: Sheets quotas require batching; the schema must stay lean; Looker Studio extracts over 100 MB fail.
  • Good fit: Proving value quickly; single client; “learn the shape” phase.
  • Option C — BigQuery (Stripe → BigQuery via Transfer/Airbyte/Coupler → Looker Studio)
  • Pros: Best security (service account), partitioning, RLS, pennies at this scale.
  • Cons: Initial setup time; GCP project required.
  • Good fit: Multi‑client portal; long‑term reliability; growth.

Recommendation for most: Start with Option B this week; migrate to Option C within 30–60 days once the first client renews. Keep your field names identical so you can use Looker Studio’s “Replace data source” to swap Sheets→BigQuery without re‑building charts.
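
The "identical field names" constraint is easy to enforce mechanically before you migrate. A minimal sketch (field names are the skinny schema from this kit; `schema_matches` is an illustrative helper, not a Looker Studio feature):

```python
# Canonical field list shared by the Sheets 'events' tab and the BigQuery table.
# Keeping both sources identical lets "Replace data source" remap charts cleanly.
CANONICAL_FIELDS = [
    "event_id", "occurred_at", "event_type", "object_type",
    "invoice_id", "subscription_id", "customer_id",
    "amount_gross", "amount_fee", "amount_net",
    "currency", "product", "plan_interval", "client_email", "ingested_at",
]

def schema_matches(columns: list[str]) -> bool:
    """True when a source exposes exactly the canonical fields, in order."""
    return columns == CANONICAL_FIELDS
```

Run it against the Sheets header row and the BigQuery column list before swapping sources; any mismatch is a chart you will have to re-map by hand.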

Option B quick start — Sheets staging with idempotent appends

Stand up a working pipeline in under an hour. The pattern: capture Stripe events daily, transform into a skinny fact table, and append idempotently using a unique key so retries don’t double‑count.

  1. Create a Google Sheet named “stripe_events_staging”. Add a tab “events”. Paste this header row (CSV):

```
event_id,occurred_at,event_type,object_type,invoice_id,subscription_id,customer_id,amount_gross,amount_fee,amount_net,currency,product,plan_interval,client_email,ingested_at
```

Notes:

  • Use cents for amounts (integers). Format in Looker Studio.
  • client_email maps to your client’s billing email if you operate a multi‑tenant shop. For single‑client, hardcode it in your automation step.
  2. Build the idempotent daily append in Zapier or Make.

Pseudo‑blueprint (Zapier):

  • Trigger: Stripe → “New Event” (polling or webhook). Filter to event types you care about (e.g., invoice.payment_succeeded, charge.refunded, customer.subscription.updated).
  • Formatter: Map values to your skinny schema; coerce amounts to integers (cents) and timestamps to ISO 8601.
  • Search: Google Sheets → “Lookup Row” in ‘events’ where event_id = {{event.id}}.
  • Path A (found): Do nothing.
  • Path B (not found): Google Sheets → “Create Row” with event_id={{event.id}} and ingested_at={{now}}.

The Make scenario is analogous: Webhook (Stripe) → Array aggregator (batch daily) → Router: Search row by event_id → Append if missing.
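
The lookup-then-append logic is the same in either tool. A minimal Python sketch of the idempotency guarantee, using an in-memory list as a stand-in for the ‘events’ tab (the sheet API calls are what Zapier/Make handle for you):

```python
from datetime import datetime, timezone

def append_if_missing(rows: list[dict], event: dict) -> bool:
    """Append a mapped Stripe event unless its event_id already exists.

    Returns True when a row was appended, False on a duplicate retry.
    `rows` stands in for the Sheets 'events' tab.
    """
    if any(r["event_id"] == event["id"] for r in rows):
        return False  # Path A: found -> do nothing
    rows.append({
        "event_id": event["id"],
        "event_type": event["type"],
        "amount_gross": int(event["amount"]),  # cents, as an integer
        "ingested_at": datetime.now(timezone.utc).isoformat(),
    })
    return True  # Path B: not found -> create row
```

Replaying the same event (a webhook retry, a re-run scenario) leaves exactly one row, which is what keeps the revenue tiles from double-counting.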

  3. Quotas and performance guardrails you must keep:
  • Batch daily (or hourly) to stay well under Sheets’ per‑minute quotas.
  • Keep the schema skinny; offload joins/derived metrics to Looker Studio.
  • If extracts grow, use Looker Studio’s “Extract Data” cache on the data source; keep under 100 MB.
  4. Add a “Last updated” timestamp for the dashboard:
  • Create a separate tab “meta” with cells: A1 = last_updated, B1 = =MAX(events!O:O), where column O stores ingested_at.
  • In Looker Studio, add a data source for the meta tab and place a scorecard bound to B1.
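
The freshness value is just the maximum of the ingested_at column. A sketch of what the =MAX(...) cell computes (timestamps are illustrative):

```python
def last_updated(ingested_at_column: list[str]) -> str:
    """Return the latest timestamp, as the meta tab's =MAX(...) cell does.

    ISO-8601 UTC strings in a uniform format sort lexicographically,
    so the string max() is also the chronological max.
    """
    return max(ingested_at_column)
```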

Result: You’ll have a reliable, near‑free pipeline with duplicate protection and a visible freshness chip.

Option C — BigQuery setup, security, and clean migration from Sheets

Use BigQuery when you want durable security, easy multi‑client scoping, and painless scale. Keep the same column names so you can swap sources in Looker Studio without re‑designing the report.

  1. Minimal BigQuery table (partitioned by date):

```
CREATE TABLE PROJECT.DATASET.stripe_events
(
  event_id STRING,
  occurred_at TIMESTAMP,
  event_type STRING,
  object_type STRING,
  invoice_id STRING,
  subscription_id STRING,
  customer_id STRING,
  amount_gross INT64,
  amount_fee INT64,
  amount_net INT64,
  currency STRING,
  product STRING,
  plan_interval STRING,
  client_email STRING,
  ingested_at TIMESTAMP
)
PARTITION BY DATE(occurred_at)
OPTIONS (require_partition_filter = TRUE);
```

  2. Ingest Stripe → BigQuery:
  • Easiest: BigQuery Data Transfer Service (Stripe). Select objects (charges, invoices, payouts) and schedule daily.
  • Alternates: Airbyte/Coupler to BigQuery on a schedule. Map into the skinny schema above via a transformation step or view.
  3. Deduplicate safety net (even if upstream is idempotent):

```
CREATE OR REPLACE TABLE PROJECT.DATASET.stripe_events_dedup AS
SELECT * EXCEPT (rn) FROM (
  SELECT *, ROW_NUMBER() OVER (PARTITION BY event_id ORDER BY ingested_at DESC) AS rn
  FROM PROJECT.DATASET.stripe_events
  WHERE DATE(occurred_at) >= '1970-01-01'  -- any filter works; the table option requires one
) WHERE rn = 1;
```
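
The same keep-latest-per-key logic in Python, useful for eyeballing what the ROW_NUMBER() dedup does before you trust it (a sketch, not the production path):

```python
def dedup_latest(rows: list[dict]) -> list[dict]:
    """Keep one row per event_id, preferring the most recent ingested_at.

    Mirrors ROW_NUMBER() OVER (PARTITION BY event_id
                               ORDER BY ingested_at DESC) ... WHERE rn = 1.
    """
    latest: dict[str, dict] = {}
    for row in rows:
        kept = latest.get(row["event_id"])
        if kept is None or row["ingested_at"] > kept["ingested_at"]:
            latest[row["event_id"]] = row
    return list(latest.values())
```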

  4. Connect Looker Studio using a Service Account (recommended for shared client portals):
  • Create a service account (e.g., ls-reporter@PROJECT.iam.gserviceaccount.com).
  • Grant it BigQuery Data Viewer on the dataset.
  • In Looker Studio, add the BigQuery data source → Data credentials → “Service Account” → enter the service account email (your own account needs the Service Account Token Creator role on it).
  5. Per‑client scoping (two patterns):
  • Pattern A — Viewer’s credentials + BigQuery RLS (tightest row security, grants viewers dataset access):
  • Ensure your LS data source uses Viewer’s credentials.
  • Create a policy filtering on viewer identity:

```
CREATE OR REPLACE ROW ACCESS POLICY client_email_rls
ON PROJECT.DATASET.stripe_events_dedup
GRANT TO ("group:clients@yourdomain.com")  -- the viewer principals you invite
FILTER USING (LOWER(client_email) = LOWER(SESSION_USER()));
```

  • List the viewer principals (users/groups) you invite in the policy’s GRANT TO clause.
  • Pattern B — Service Account + LS Email Filter (no dataset grants to end viewers):
  • Keep Service Account credentials.
  • In the LS data source, add a filter where client_email equals “Viewer’s email”. LS resolves the viewer’s email at view time and applies it before rendering.

Pick one pattern per deployment; don’t combine.
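
Both patterns reduce to the same predicate: a viewer sees only rows whose client_email matches their identity. A sketch of that predicate (the actual enforcement lives in BigQuery RLS or the Looker Studio filter, not in your own code):

```python
def visible_rows(rows: list[dict], viewer_email: str) -> list[dict]:
    """Rows a given viewer may see: client_email matches, case-insensitively.

    Mirrors FILTER USING (LOWER(client_email) = LOWER(SESSION_USER())).
    """
    return [r for r in rows
            if r["client_email"].lower() == viewer_email.lower()]
```

This is also a handy shape for the pre-ship QA step: feed it a sample export and each client email, and confirm no cross-tenant rows appear.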

  6. Migrate Sheets → BigQuery without breaking charts:
  • Keep field names and types identical.
  • In Looker Studio: File → “Report settings” → “Replace data source” and select the BigQuery source (point to stripe_events_dedup).
  • Validate every chart’s “field mapping”; fix any type mismatches (e.g., numbers imported as text).

Looker Studio template — fields, tiles, and performance guardrails

You’ll build the same report once and reuse it for every client. Keep metric names stable so you can “Replace data source” later.

Data source(s):

  • Primary: Sheets ‘events’ tab OR BigQuery stripe_events_dedup.
  • Optional secondary: Sheets ‘meta’ tab or BigQuery query returning last_updated.

Calculated fields (examples):

  • Gross Revenue ($): amount_gross / 100
  • Fees ($): amount_fee / 100
  • Net Revenue ($): amount_net / 100
  • MRR ($): Sum of Net Revenue where object_type = 'invoice' and plan_interval IN ('month','year'), normalized to monthly.
  • Refund Rate (%): SUM(CASE WHEN event_type = 'charge.refunded' THEN amount_gross END) / SUM(amount_gross)
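
Because amounts are stored as integer cents, the dollar fields are plain divisions, and refund rate needs a zero-denominator guard. A Python sketch of the calculated-field math (Looker Studio’s formula syntax differs, but the arithmetic is identical):

```python
def dollars(cents: int) -> float:
    """Convert integer cents to dollars, matching e.g. amount_gross / 100."""
    return cents / 100

def refund_rate(events: list[dict]) -> float:
    """Refunded gross over total gross, as a fraction (0 when there is no revenue)."""
    total = sum(e["amount_gross"] for e in events)
    refunded = sum(e["amount_gross"] for e in events
                   if e["event_type"] == "charge.refunded")
    return refunded / total if total else 0.0
```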

Pages/tiles to include:

  • Scorecards: ARR, MRR, Net Revenue (30/90 days), Hours Saved $ (from ROI sheet), Last Updated.
  • Trends: Net Revenue by Month; New vs. Expansion vs. Churn (if you enrich events with simple tags).
  • Cohort: Subscription start cohort vs. 6‑month net revenue.
  • Filters: Date range; Product; Plan interval; Client (only for internal multi‑tenant admin views).

Performance tips:

  • Use aggregated charts; avoid table dumps with thousands of rows.
  • If on Sheets, enable “Extract Data” caching and keep under 100 MB.
  • Hide drill‑to‑detail for viewers; they don’t need raw rows.

Visual standards (reuse across clients):

  • Title: “[Client Name] — Revenue & ROI”.
  • Subtitle chip: “Data freshness: [Last Updated]”.
  • Color: Neutral base with one accent for “saves/impact” tiles so they pop in Notion.

Secure sharing — credentials, scoping, and QA

Hardening is not optional. Set it once, template it, and reuse.

Credentials mode (pick one per deployment):

  • Viewer’s credentials (pairs well with BigQuery RLS): Queries run as the viewer; you must grant dataset access per viewer/group.
  • Service Account (BigQuery‑only; best for portals): Queries run as a service account; end viewers don’t get dataset access.

Lock down sharing:

  • Report sharing: Specific people only; disable link sharing.
  • Data source sharing: Keep private to you and the service account (if used).
  • Disable: Download, print, and copy for viewers.

Per‑client data scoping:

  • With Viewer’s credentials: enforce BigQuery RLS using SESSION_USER() as shown above.
  • With Service Account: in LS, add a data source filter “Email equals Viewer’s email” and ensure client_email exists in your fact table.

Pre‑ship QA checklist:

  • Open in an incognito viewer test account; verify they only see their rows.
  • Confirm “Last Updated” shows a recent timestamp.
  • Attempt to export as viewer; ensure blocked.
  • Try a different viewer email; confirm zero data returns (expected) if not granted.
  • Kill public links everywhere (report and data source).

Embed in Notion — fast client portal setup

Clients live in Notion. Embed the dashboard where they already work.

  1. In Looker Studio: Share → “Invite” your client email (view only). Copy the viewer link.
  2. In Notion: “/embed” → paste the Looker Studio link. Set height ~900–1200px.
  3. Add a lightweight portal section above the embed:

```

[Client Name] Automation Portal

  • SLA: [link to your SLA doc]
  • What’s new: [2–3 bullet changelog]
  • Dashboard tips: Use the date filter for last 30/90 days.
  • Contact: [support@yourdomain.com]

```

  4. Optional: Add a Notion synced block that pulls a “Gotchas & Fixes” list from your internal wiki so clients see your steady improvements.
  5. On client onboarding: auto‑email the Notion page link with a one‑line explainer: “This page is your live revenue & ROI view—no logins beyond Google needed.”

ROI math — conservative, CFO‑friendly tiles

Your ROI tiles should be conservative and simple. Use a small helper sheet or BigQuery view and pipe the results into Looker Studio.

Inputs (per client):

  • hours_saved_month — Sum of hours your automation saved this month (from Notion timesheets or a simple form).
  • billable_rate — Client’s blended billable rate ($/hr).
  • productivity_recapture — Only a fraction of time saved converts to productive output. Default to 50% (0.5).
  • revenue_recovered_month — Revenue you recovered via fixes or automations.
  • monthly_fee — Your monthly fee for the engagement.

Outputs (formulas):

  • Cost savings ($/mo): hours_saved_month * billable_rate * productivity_recapture
  • Monthly impact ($/mo): cost_savings + revenue_recovered_month
  • Payback period (months): setup_cost / monthly_impact (use your actual setup cost)
  • Net ROI (%): (monthly_impact - monthly_fee) / monthly_fee
  • ROI multiple (x): monthly_impact / monthly_fee
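
The formulas above in one function, so you can sanity-check the spreadsheet cells (the numbers in the test are illustrative, not benchmarks):

```python
def roi_tiles(hours_saved_month: float, billable_rate: float,
              productivity_recapture: float, revenue_recovered_month: float,
              monthly_fee: float, setup_cost: float) -> dict:
    """Compute the conservative ROI tiles from the kit's inputs.

    Division-by-zero cases return None, matching the sheet's "—" guards.
    """
    cost_savings = hours_saved_month * billable_rate * productivity_recapture
    monthly_impact = cost_savings + revenue_recovered_month
    return {
        "cost_savings": cost_savings,
        "monthly_impact": monthly_impact,
        "payback_months": setup_cost / monthly_impact if monthly_impact else None,
        "net_roi": (monthly_impact - monthly_fee) / monthly_fee if monthly_fee else None,
        "roi_multiple": monthly_impact / monthly_fee if monthly_fee else None,
    }
```

For example, 40 hours saved at a $100/hr blended rate with the default 50% recapture, plus $500 recovered revenue, against a $1,000/mo fee and $5,000 setup, yields $2,500/mo impact and a two-month payback.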

Google Sheets helper tab (paste headers row 1):

```
metric,value
hours_saved_month,[ENTER NUMBER]
billable_rate,[ENTER NUMBER]
productivity_recapture,0.5
revenue_recovered_month,[ENTER NUMBER]
monthly_fee,[ENTER NUMBER]
setup_cost,[ENTER NUMBER]
```

Derived cells (examples):

  • B8 (cost_savings): =B2*B3*B4
  • B9 (monthly_impact): =B8+B5
  • B10 (payback_months): =IF(B9=0,"—",B7/B9)
  • B11 (net_roi): =IF(B6=0,"—",(B9-B6)/B6)
  • B12 (roi_multiple): =IF(B6=0,"—",B9/B6)

Expose B8–B12 to Looker Studio as a separate data source and show them as headline tiles labeled “Cost savings (conservative)”, “Monthly impact”, “Payback”, and “ROI multiple”.

Operations playbook — templating, costs, and upgrade triggers

Use this to speed future rollouts and reduce mistakes.

  • Keep one canonical Looker Studio template. For each new client, “Make a copy”, swap the data source, and confirm the ‘Last Updated’ chip.
  • Maintain a schema changelog. If you add a column (e.g., discount_amount), add it to both Sheets and BigQuery schemas before updating charts.
  • Cost sanity check at <50K events/mo: BigQuery query costs should be near $0/month (first 1 TiB free); your main cost is your connector if you use one.
  • Reliability bump: Move from Sheets to BigQuery when extracts near 100 MB, the events tab passes ~500K rows (≈7.5M cells on this schema, approaching Sheets’ 10M‑cell limit), or you manage >3 clients.
  • Security default: If you can avoid granting dataset access to clients, prefer Service Account + LS email filter in managed portals; otherwise use Viewer’s credentials + RLS and grant access deliberately.
  • Version your Notion portal block; clients appreciate a tiny “What changed this month” log.
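
The near-$0 cost claim is easy to verify with rough arithmetic (row size and scan frequency below are assumptions; adjust to your data):

```python
def monthly_scan_gib(events_per_month: int, bytes_per_row: int,
                     full_scans_per_month: int) -> float:
    """Estimate GiB scanned per month if every query re-reads the whole table."""
    return events_per_month * bytes_per_row * full_scans_per_month / 2**30

# At 50K events/month, ~200 bytes/row, and 30 full scans/month, the worst
# case is well under 1 GiB scanned -- far below BigQuery's 1 TiB/month
# free query tier, before partition pruning helps at all.
```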