gptsheet Documentation

BYOK LLM toolkit for Google Sheets and Excel. Pay once. Bring your own API key. Your data and keys never touch our servers.

Welcome

gptsheet adds AI to your spreadsheets through three surfaces: cell formulas like =GPT and =GPT_CLASSIFY, a sidebar with batch tools and an API fetcher, and a chat agent (Premium) that can read and edit your sheet conversationally.

There are two plans:

| Plan | Price | Includes |
| --- | --- | --- |
| Basic | $69 lifetime | All 13 core formulas, OpenAI + Anthropic, sidebar tools, 1 year of updates |
| Premium | $99 lifetime | Everything in Basic + chat agent + =GPT_VISION + =GPT_WEB + =GPT_IMAGE + all providers + lifetime updates |

Install

Google Sheets

  1. Open the Google Workspace Marketplace listing for gptsheet.
  2. Click Install and pick the Google account you want to use.
  3. Open any sheet → Extensions → gptsheet → Open sidebar.

Microsoft Excel

  1. Open the Microsoft AppSource listing for gptsheet.
  2. Click Get it now and sign in with the Microsoft account you want to use.
  3. Open Excel → Home → My Add-ins → gptsheet.

One license covers both platforms.

Activate your license

After purchase you'll get an email with your license key, formatted like GS-XXXX-XXXX-XXXX-XXXX.

  1. Open the sidebar → click the menu (top-left).
  2. Click License.
  3. Paste the key and click Activate.

One account per license

Your license binds to the Google or Microsoft account that activates it. You can sign in to that account on multiple devices, but the license itself can't be shared across users.

Set up API keys

gptsheet is BYOK — you bring your own keys from each LLM provider you want to use. Keys are stored only in your browser's localStorage for sidebar tools, and in your sheet's PropertiesService for cell formulas. Our server never sees them.

  1. Open the sidebar → ≡ menu → API keys.
  2. Click the ✎ pencil next to a provider (OpenAI, Anthropic, Gemini).
  3. Paste your key and click Save.

The first provider you save becomes the default — used by sidebar tools and as the default for cell formulas. You can change the default anytime by clicking Make default on another configured provider.

See the Providers section below for where to get each key.

Custom Prompt

The escape hatch for "do X to every row in this column" when no named formula fits.

  1. Select a range in the sheet (e.g. A2:A20).
  2. Sidebar → Prompt tab.
  3. Type an instruction. Click Run on selection.

The sidebar reads your selection's first column, calls the LLM once per row, and writes results into the column immediately to the right.

Example: select A2:A20 filled with company names → type "Find each company's industry, one word" → results land in B2:B20.
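The per-row loop described above can be sketched as plain JavaScript. The helper names here are hypothetical (the actual sidebar code isn't published); `callLlm` stands in for whatever provider call your key is configured for.

```javascript
// Sketch of the "Run on selection" loop: read the first column of the
// selection, call the LLM once per row, return the output column that
// would be written immediately to the right of the selection.
async function runOnSelection(rows, instruction, callLlm) {
  const out = [];
  for (const row of rows) {
    const input = row[0]; // only the selection's first column is read
    out.push(await callLlm(`${instruction}\n\nInput: ${input}`));
  }
  return out;
}
```

Because each row is an independent call, a failed row doesn't block the rest of the batch.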

Bulk Apply

Run a named formula across hundreds or thousands of rows with progress + stop.

  1. Sidebar → Bulk tab.
  2. Pick a formula from the dropdown (=GPT_CLASSIFY, =GPT_TRANSLATE, etc.).
  3. Fill in the formula-specific parameters (categories, target language, etc.).
  4. Set the input range and output column (auto-populated when you click Use selection).
  5. Click Run. Watch the progress bar. Click Stop anytime — partial results are preserved.

Why use Bulk Apply over cell formulas

Bulk Apply runs in your browser, bypassing Apps Script's 6-minute execution cap and its UrlFetchApp daily quota. Use it for large batches (500+ rows).

API Fetcher

Paste a REST URL → JSON data appears in your sheet. Auth tokens stay in your browser; gptsheet never sees them.

  1. Click into an empty area of your sheet (the data will land at the active cell).
  2. Sidebar → Fetch tab.
  3. Paste the URL.
  4. Optionally paste JSON headers (e.g. {"Authorization": "Bearer ..."}).
  5. Click Fetch to sheet.

The fetcher handles common JSON shapes: array-of-objects (becomes a table with headers), array-of-primitives (single column), or a single object (two columns: key, value).
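The three shapes above can be normalized into a sheet-ready 2D array with a small function like the following. This is an illustrative sketch of the logic, not gptsheet's actual implementation.

```javascript
// Normalize common JSON response shapes into a 2D array for a sheet range.
function toTable(data) {
  if (Array.isArray(data)) {
    if (data.length && typeof data[0] === "object" && data[0] !== null) {
      // Array of objects -> header row from the first object's keys,
      // then one row per object.
      const headers = Object.keys(data[0]);
      return [headers, ...data.map((obj) => headers.map((h) => obj[h] ?? ""))];
    }
    // Array of primitives -> a single column.
    return data.map((v) => [v]);
  }
  // Single object -> two columns: key, value.
  return Object.entries(data);
}
```

For example, `toTable([{id: 1, name: "a"}, {id: 2, name: "b"}])` yields a header row plus two data rows.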

Cell formulas overview

Each formula calls an LLM from Google Apps Script's UrlFetchApp using your saved provider key. You can use them like any other Sheets formula — composing with ARRAYFORMULA, fill-down, conditional refs, etc.
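A formula's provider call might look roughly like this. The request shape is OpenAI's Chat Completions API and the option names (`method`, `contentType`, `headers`, `payload`) are what `UrlFetchApp.fetch` expects; everything else — function names, defaults — is an assumption about gptsheet's internals, not its published code.

```javascript
// Build an OpenAI chat request in the shape UrlFetchApp.fetch expects.
// (Hypothetical helper; gptsheet's real internals are not published.)
function buildChatRequest(prompt, apiKey, model, temperature) {
  return {
    url: "https://api.openai.com/v1/chat/completions",
    options: {
      method: "post",
      contentType: "application/json",
      headers: { Authorization: "Bearer " + apiKey }, // key from PropertiesService
      payload: JSON.stringify({
        model: model || "gpt-4o-mini",
        temperature: temperature ?? 0.3, // matches the documented default
        messages: [{ role: "user", content: prompt }],
      }),
    },
  };
}
// Inside Apps Script, the call itself would be:
// const resp = UrlFetchApp.fetch(req.url, req.options);
```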

Common parameters

Every formula accepts these optional trailing arguments:

| Argument | Default | Notes |
| --- | --- | --- |
| temperature | 0.3 | 0–2, controls randomness |
| model | your saved default | Override per-call, e.g. "gpt-4o" |

=GPT Basic

Generic LLM prompt that returns a single value.

GPT(prompt, [value], [temperature], [model])
=GPT("Capital of France")          → Paris
=GPT("Capital of " & A2)            → Tokyo (when A2 is "Japan")

=GPT_LIST / =GPT_HLIST Basic

Returns a list — one item per cell. GPT_LIST fills vertically; GPT_HLIST fills horizontally.

GPT_LIST(prompt, [value])
=GPT_LIST("5 marketing channels for SaaS")

=GPT_TABLE Basic

Returns a 2D table with headers in the first row.

GPT_TABLE(prompt, [head])
=GPT_TABLE("Top 3 EV makers", "brand,model,price")

=GPT_CLASSIFY Basic

Picks one category from a list. Deterministic, taggable.

GPT_CLASSIFY(value, categories, [instructions])
=GPT_CLASSIFY(A2, "positive, negative, neutral")
=GPT_CLASSIFY(A2, "enterprise, smb, mid-market", "Use company size as a hint")

=GPT_EXTRACT Basic

Pulls specific elements out of messy text. Returns comma-separated values.

GPT_EXTRACT(text, what_to_extract)
=GPT_EXTRACT(A2, "phone numbers")
=GPT_EXTRACT(A2, "email addresses")
=GPT_EXTRACT(A2, "dates in ISO format")

=GPT_TRANSLATE Basic

Translates text. Returns only the translation.

GPT_TRANSLATE(text, [target_language], [source_language], [instructions])
=GPT_TRANSLATE(A2, "Japanese")
=GPT_TRANSLATE(A2, "Japanese", , "Keep brand names in English")

=GPT_EDIT Basic

Edits text — fix grammar, change tone, rewrite for an audience.

GPT_EDIT(text, [task])
=GPT_EDIT(A2)                          → fixes grammar/spelling by default
=GPT_EDIT(A2, "make it more concise")
=GPT_EDIT(A2, "rewrite for a 5th grader")

=GPT_FORMAT Basic

Standardize dates, addresses, names, currencies — without writing regex.

GPT_FORMAT(input, target_format, [source_format])
=GPT_FORMAT(A2, "ISO date")
=GPT_FORMAT(A2, "USD with 2 decimals")
=GPT_FORMAT(A2, "First Last")

=GPT_TAG Basic

Apply tags from your taxonomy, or auto-suggest tags.

GPT_TAG(value, [tags], [instructions], [top_k])
=GPT_TAG(A2, "urgent, bug, feature, docs")
=GPT_TAG(A2, , , 3)                    → auto-suggest 3 tags

=GPT_SPLIT / =GPT_HSPLIT Basic

Semantic split — by sentences, paragraphs, sections, etc. Returns one piece per cell (vertical or horizontal).

GPT_SPLIT(text, split_by)
=GPT_SPLIT(A2, "sentences")
=GPT_SPLIT(A2, "paragraphs")

=GPT_MATCH Basic

Semantic match between two ranges — VLOOKUP that understands meaning. Uses OpenAI embeddings under the hood.

GPT_MATCH(search_keys, lookup_range, [confidence], [stats], [top_k])
=GPT_MATCH(A2:A100, B2:B500)
=GPT_MATCH(A2:A100, B2:B500, 0.7, TRUE)   → with confidence score in adjacent column

Requires an OpenAI key

Embeddings aren't supported by Anthropic or Gemini's chat APIs, so =GPT_MATCH always uses OpenAI regardless of your default provider.
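The idea behind embedding-based matching can be shown with a toy example: embed both ranges, then for each search key pick the lookup row with the highest cosine similarity. The vectors below are made up for illustration; real ones would come from OpenAI's embeddings endpoint.

```javascript
// Cosine similarity between two equal-length vectors.
function cosine(a, b) {
  let dot = 0, na = 0, nb = 0;
  for (let i = 0; i < a.length; i++) {
    dot += a[i] * b[i];
    na += a[i] * a[i];
    nb += b[i] * b[i];
  }
  return dot / (Math.sqrt(na) * Math.sqrt(nb));
}

// For one search-key vector, find the most similar lookup vector.
function bestMatch(keyVec, lookupVecs) {
  let best = -1, idx = -1;
  lookupVecs.forEach((v, i) => {
    const s = cosine(keyVec, v);
    if (s > best) { best = s; idx = i; }
  });
  return { index: idx, score: best };
}
```

A confidence threshold (the third argument to =GPT_MATCH) would simply reject matches whose `score` falls below it.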

=GPT_CREATE_PROMPT Basic

Helper that concatenates cells and ranges into a single prompt string. Useful for building prompts from sheet data.

GPT_CREATE_PROMPT(arg1, [arg2], [...])
=GPT(GPT_CREATE_PROMPT("Topic:", A2, "Audience:", B2, "Write a tweet."))

=GPT_VISION Premium

Analyze an image given its URL — caption, classify, extract text, etc.

GPT_VISION(prompt, image_url, [high_res])
=GPT_VISION("Describe in one sentence", A2)
=GPT_VISION("Extract any text", A2, TRUE)

Uses OpenAI's gpt-4o-mini by default (vision-capable). Pass the prompt as the first argument and the image URL (a cell reference works) as the second.

=GPT_WEB Premium

Web-grounded answer with citations — uses OpenAI's Responses API with the web_search_preview tool.

GPT_WEB(prompt, [value])
=GPT_WEB("AAPL closing price today")
=GPT_WEB("Latest stable Node.js version. Just the number.")

=GPT_IMAGE Premium

Generate an image and return a URL. Wrap with =IMAGE(...) to render in a cell.

GPT_IMAGE(prompt, [size], [model])
=IMAGE(GPT_IMAGE("a green apple on white background"))
=IMAGE(GPT_IMAGE("photo of a sunset", "", "gemini-2.5-flash-image"))

Defaults to OpenAI dall-e-3. Pass gemini-2.5-flash-image or imagen-3.0-generate-002 as the third arg to route through Gemini instead (more permissive rate limits).

Agent overview Premium

A conversational agent that reads your sheet, plans, calls tools, and writes results. Multi-step, context-aware, and editable.

Open the sidebar → Agent tab. Your current cell selection is auto-attached as context — the agent sees what data you're working with before you even ask.

Agent tools

The agent has 7 tools it can call:

| Tool | What it does |
| --- | --- |
| get_selection | Read the user's currently selected range. |
| read_range | Read values from a specific A1 range. |
| write_range | Write a 2D array of values to an A1 range. |
| get_sheet_info | Get sheet metadata (name, last row/col, active range). |
| create_chart | Insert a native Sheets chart (COLUMN, BAR, LINE, AREA, PIE, SCATTER). |
| insert_formula | Insert a live formula at the active cell. |
| create_image | Generate an image and embed it as a native image object in the sheet. |

The agent picks the appropriate tool based on the request — you don't need to mention tool names. Use the model picker in the composer to switch between providers for that conversation.

Example tasks

Lead qualification

"For each company in A2:A50, look up the industry and employee count. Write industry to B, count to C. Then score each as 'enterprise', 'mid-market', or 'smb' in column D based on count."

Data viz from selection

"Compare these regions visually."   (with A1:B5 selected: Region, Revenue)

Formula composition

"Compute the running average of column B and put it in column C."   (uses insert_formula)

BYOK explained

BYOK = Bring Your Own Key. You pay LLM providers directly at their API price. gptsheet is the software layer; the actual LLM bill goes to OpenAI, Anthropic, or Google.

For a typical user running ~1,000 cells through gpt-4o-mini per month, the LLM bill is about $1–3/month. There's no markup or middleman.
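The arithmetic behind that estimate is straightforward: tokens in and out per cell, times the provider's per-million-token prices. The token counts and prices below are illustrative placeholders; plug in your provider's current published numbers.

```javascript
// Back-of-envelope monthly LLM cost for formula usage.
// All inputs are assumptions -- check your provider's pricing page.
function monthlyCost(cells, inTokensPerCell, outTokensPerCell, inPricePerM, outPricePerM) {
  const inputCost = (cells * inTokensPerCell / 1e6) * inPricePerM;
  const outputCost = (cells * outTokensPerCell / 1e6) * outPricePerM;
  return inputCost + outputCost;
}

// e.g. 1,000 cells/month at ~500 input / ~100 output tokens each,
// priced at $2.50 / $10 per million tokens (gpt-4o-class, illustrative):
// monthlyCost(1000, 500, 100, 2.5, 10) -> 2.25
```

Cheaper models like gpt-4o-mini drop this by an order of magnitude; larger prompts raise it proportionally.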

| Provider | Where keys are stored | When it's called |
| --- | --- | --- |
| OpenAI / Anthropic / Gemini | Your browser localStorage (for sidebar tools) | Sidebar tools call the provider directly from your browser |
| Same keys | Apps Script PropertiesService (synced on save) | Cell formulas call the provider via UrlFetchApp from your sheet's Apps Script runtime |

In neither case does the key (or your prompts/data) pass through gptsheet's servers.

OpenAI

Get a key from platform.openai.com/api-keys.

Tier 0 rate limits

New OpenAI accounts can hit Cloudflare rate limits (HTTP 429) on DALL·E 3 image generation. Use =GPT_IMAGE(..., "gemini-2.5-flash-image") or upgrade to a higher tier in your OpenAI dashboard.

Anthropic

Get a key from console.anthropic.com/settings/keys.

Not yet supported via Anthropic key alone: =GPT_VISION (uses OpenAI), =GPT_WEB, =GPT_IMAGE, =GPT_MATCH. These always use OpenAI keys.

Gemini

Get a key from aistudio.google.com/app/apikey. Free tier is generous (15 requests/min).

Recommended for image generation

Gemini's image-gen rate limits are far more permissive than OpenAI's DALL·E. If you generate images frequently, use a Gemini model.

Connectors — overview & setup model

Connectors let the Chat Agent (and direct Run-now buttons in the sidebar) pull data from your apps + databases into Sheets. Each one is configured once in Menu → Connectors; auth lives in your browser's localStorage and is never sent to our servers — only to the upstream provider (Stripe, Notion, GA4, etc.).

Connectors fall into three auth families:

Service Account 101 (for GA4 / GSC / BigQuery / YouTube / Google Ads)

All five Google connectors use a Google Service Account JSON key. The setup is the same for each — only the final access-grant step differs.

One-time setup (3 minutes)

  1. Open console.cloud.google.com → create or pick a project.
  2. Enable the API you need (BigQuery API, Google Analytics Data API, Search Console API, YouTube Analytics API, Google Ads API).
  3. Go to IAM & Admin → Service Accounts → Create service account. Name it gptsheet-reader, click Create.
  4. Skip role assignment (we'll grant access on the destination side). Click Done.
  5. Click the new service account → Keys tab → Add Key → Create new key → JSON. Browser downloads a JSON file.
  6. Open the JSON file. The whole contents go in the "Service Account JSON" field in the gptsheet connector.
  7. Note the client_email at the top of the JSON — you'll need it for the access-grant step below.

The JSON key never leaves your machine except in your direct calls to Google's APIs. gptsheet's server doesn't see it.

Stripe

Connect Stripe to pull customers, subscriptions, charges, invoices, and balance transactions. The agent can answer "what's my MRR this month?" or "list churned customers last week."

Setup

  1. Stripe Dashboard → Developers → API keys → Create restricted key.
  2. Give it read-only permissions for Customers, Subscriptions, Charges, Invoices, Balance.
  3. Copy the key (starts with rk_live_ or rk_test_).
  4. In gptsheet → Menu → Connectors → Stripe → paste the key → Connect.

Google Analytics (GA4)

Pull GA4 sessions, conversions, channels, top pages. Service-account-based.

Setup

  1. Complete Service Account 101.
  2. Enable the Google Analytics Data API in your GCP project.
  3. Get your GA4 Property ID: GA4 → Admin → Property Settings → Property details. It's a 9-digit number.
  4. Grant the service account access: GA4 → Admin → Property Access Management → Add user → paste the client_email from the JSON → role Viewer → Save.
  5. In gptsheet → Menu → Connectors → Google Analytics → paste Property ID + JSON → Connect.

Google Search Console

Top queries, top pages, country breakdowns. Service-account-based.

Setup

  1. Complete Service Account 101.
  2. Enable the Search Console API in your GCP project.
  3. Get your Site URL as it appears in Search Console (e.g. sc-domain:example.com or https://www.example.com/ — copy the exact format).
  4. Grant access: Search Console → Settings → Users and permissions → Add user → paste the client_email → permission Restricted.
  5. Paste site URL + JSON in gptsheet → Connect.

BigQuery

Run SQL against your warehouse, list datasets/tables. Service-account-based.

Setup

  1. Complete Service Account 101.
  2. Enable the BigQuery API in the project.
  3. Grant the service account these IAM roles on the project: BigQuery Data Viewer, BigQuery Job User.
  4. Find your GCP Project ID (the alphanumeric ID in the GCP console header).
  5. Paste Project ID + JSON → Connect.

YouTube Insights

Channel + video performance via YouTube Analytics API.

Setup

  1. Complete Service Account 101.
  2. Enable the YouTube Analytics API in the project.
  3. Find your Channel ID: YouTube Studio → Settings → Channel → Advanced settings.
  4. YouTube channels can't directly grant access to a service account. The standard workaround is to use the YouTube OAuth flow for a brand account the SA can impersonate, or use Domain-Wide Delegation for Workspace accounts. For most users we recommend the OAuth flow described in YouTube's docs.

YouTube's auth model is more restrictive than the other Google connectors. If this trips you up, ping support@gptsheet.app — we'll walk you through it.

Google Ads

Campaigns, keywords, spend.

Setup

  1. Complete Service Account 101.
  2. Enable the Google Ads API in the project.
  3. Get a Developer Token: Google Ads UI → Tools → API Center → Apply for a developer token. (Free, but takes Google 1-2 days to approve.)
  4. Find your Customer ID (10-digit number with no hyphens, top-right of the Ads UI).
  5. Link the service account: Google Ads → Tools → Account access & security → Users → Invite user → paste client_email → access level Standard or Read-only.
  6. Paste customer ID + developer token + JSON → Connect. If you're calling from a manager (MCC) account, also fill login_customer_id.

HubSpot

  1. HubSpot Settings → Integrations → Private Apps → Create a private app.
  2. Give it Read-only scopes: crm.objects.contacts.read, crm.objects.companies.read, crm.objects.deals.read.
  3. Create → copy the access token (starts with pat-) → paste in gptsheet → Connect.

Salesforce

  1. Find your Instance URL at the top of the Salesforce UI (e.g. https://yourorg.my.salesforce.com).
  2. Get an OAuth access token: easiest path is the SF CLI (sf org login web, then sf org display --json | jq -r .result.accessToken). Tokens expire in ~2h by default.
  3. Paste instance URL + token → Connect.

Tokens expire. Re-paste when SOQL queries start returning 401.

Zoho CRM

  1. Pick your data center suffix: com (US), eu, in, com.au, jp.
  2. Go to api-console.zoho.com → Add Client → Self Client → Generate Code → exchange the code for a refresh + access token (Zoho's flow).
  3. Paste data center suffix + the access token → Connect.

Apollo

  1. Apollo Settings → Integrations → API → Generate new key.
  2. Paste the key → Connect.

Meta Ads

  1. Get your Ad account ID (starts with act_) from Meta Business Suite → Ads Manager.
  2. Generate a long-lived Marketing API token: Business Suite → Settings → System Users → Add → grant Marketing API access on the ad account → Generate Token.
  3. Paste both → Connect.

LinkedIn Ads

  1. Apply for LinkedIn Marketing API access at linkedin.com/developers. Requires manual approval.
  2. Once approved, complete the 3-legged OAuth flow with scope r_ads_reporting.
  3. Find your numeric Ad account ID in Campaign Manager.
  4. Paste both → Connect.

Facebook Page Insights

  1. Get your Page ID from the Facebook Page → About → Page transparency.
  2. Use the Graph API Explorer to generate a Page Access Token with scopes pages_read_engagement + pages_read_user_content.
  3. Paste both → Connect.

Instagram Insights

  1. Your Instagram must be a Business Account linked to a Facebook Page.
  2. Find your IG Business ID: Graph API Explorer → query /me/accounts?fields=instagram_business_account.
  3. Generate a token with instagram_basic + instagram_manage_insights.
  4. Paste both → Connect.

PostHog

  1. PostHog → Settings → User → Personal API keys → Create. Grant project:read scope (and any others you want).
  2. Get your Project ID from Project Settings.
  3. Host is usually https://us.posthog.com (or https://eu.posthog.com for EU).
  4. Paste all three → Connect.

Amplitude

  1. Amplitude → Settings → Projects → your project → API Keys.
  2. Copy API Key and Secret Key.
  3. Paste both → Connect.

Ghost

  1. Your Ghost Site URL (e.g. https://yourblog.ghost.io, no trailing slash).
  2. Ghost Admin → Integrations → Add custom integration → copy the Admin API Key. It's in the format <id>:<secret>.
  3. Paste both → Connect.

Beehiiv

  1. Beehiiv → Settings → Integrations → API → Create new key.
  2. Find your Publication ID (starts with pub_) in Settings → Publication.
  3. Paste both → Connect.

Alpha Vantage

  1. Get a free API key at alphavantage.co/support/#api-key (5 calls/min, 100 calls/day on free).
  2. Paste the key → Connect.

Massive

  1. Sign up at massive.com → Dashboard → API Keys.
  2. Paste the key → Connect.

Notion

  1. Go to notion.so/my-integrations → New integration.
  2. Name it, copy the Internal Integration Token (starts with secret_).
  3. In each Notion database or page you want gptsheet to read, click … → Add connections → pick your integration.
  4. Paste the token → Connect.

Airtable

  1. Go to airtable.com/create/tokens → Create token.
  2. Scopes: data.records:read, schema.bases:read. Pick which bases to grant access to.
  3. Copy the token (starts with pat) → Paste → Connect.

GitHub

  1. GitHub → Settings → Developer settings → Personal access tokens → Fine-grained tokens → Generate new token.
  2. Scopes: repo (or scoped to specific repos) and read:org for org-level access.
  3. Copy the token (starts with github_pat_ or ghp_).
  4. Paste → Connect.

Linear

  1. Linear → Settings → API → Personal API keys → Create.
  2. Copy the key (starts with lin_api_).
  3. Paste → Connect.

Trello

  1. Go to trello.com/power-ups/admin → New → Generate a new API Key.
  2. On the same page, click manually generate a token → authorize.
  3. Paste both → Connect.

Calendly

  1. Calendly → Integrations → API & Webhooks → Personal access tokens.
  2. Generate a token, copy it.
  3. Paste the token → Connect. (Leave User URI blank; the connector resolves it automatically.)

Mailchimp

  1. Mailchimp Account → Extras → API Keys → Create A Key.
  2. Note the data center suffix at the end of the key (e.g. -us1, -eu1).
  3. Paste data center + key (username can be anything) → Connect.

Shopify

  1. Shopify Admin → Settings → Apps and sales channels → Develop apps → Create an app.
  2. Configure Admin API scopes: read_orders, read_products, read_customers, read_inventory.
  3. Install the app → copy the Admin API access token (starts with shpat_).
  4. Paste your shop subdomain (for mystore.myshopify.com, type mystore) + the token → Connect.

MySQL

  1. You need a publicly-reachable MySQL host (RDS, Cloud SQL, PlanetScale, your own VPS). Apps Script cannot reach hosts in private VPCs without a tunnel.
  2. Create a read-only DB user: CREATE USER 'gptsheet'@'%' IDENTIFIED BY '...'; GRANT SELECT ON yourdb.* TO 'gptsheet'@'%';
  3. Allow Google Apps Script's outbound IP ranges through your security group (docs).
  4. Paste host + port (3306) + database + user + password → Connect.

MS SQL Server

  1. Public-reachable Azure SQL or self-hosted SQL Server.
  2. Create a read-only login + map to a DB user with SELECT permissions.
  3. Allow Apps Script outbound IPs through firewall.
  4. Paste host + port (1433) + database + user + password → Connect.

Postgres

Apps Script doesn't have a Postgres JDBC driver, so this connector talks to a PostgREST-style HTTP layer in front of your DB.

  1. Stand up PostgREST in front of your Postgres (Lambda, Cloud Run, Fargate, or a self-hosted VPS).
  2. Alternatively, Supabase, Neon HTTP API, or a custom Lambda proxy work too.
  3. Mint a JWT or API key for the PostgREST role you want gptsheet to use.
  4. Paste the base URL + token → Connect.

Supabase

  1. Supabase Project Settings → API.
  2. Copy the Project URL (https://<ref>.supabase.co) + the service_role secret key.
  3. Paste both → Connect.

The service_role key bypasses RLS. Only paste it if you understand the implications. For RLS-respecting access, use the anon key + JWT instead.

DynamoDB

  1. AWS Console → IAM → Create user gptsheet-dynamo.
  2. Attach a custom policy granting dynamodb:GetItem, Query, Scan, ListTables on the tables you want exposed.
  3. Create an access key (Access Key ID + Secret).
  4. Paste region + access key + secret → Connect.

Snowflake

  1. Snowflake → Admin → Users → your user → Programmatic Access Tokens → Generate token.
  2. Find your account identifier (e.g. xy12345.us-east-1) from the URL.
  3. Note the warehouse + database you want gptsheet to query.
  4. Paste all four → Connect.

Redshift

  1. AWS Console → IAM → grant redshift-data:ExecuteStatement, GetStatementResult, DescribeStatement on your cluster.
  2. Create an access key for that IAM user.
  3. Paste region + cluster ID + database + DB user + access keys → Connect.

Databricks

  1. Databricks workspace → User Settings → Access tokens → Generate new token.
  2. Find your SQL Warehouse ID under SQL → Warehouses.
  3. Workspace URL is the https://dbc-...cloud.databricks.com from your browser.
  4. Paste workspace URL + token + warehouse ID → Connect.

MongoDB

Atlas's Data API was sunset in 2025. This connector talks to any HTTPS proxy you stand up in front of your cluster.

  1. Deploy a small proxy (Cloudflare Worker, Lambda, Cloud Run) that takes POST /find {collection, filter, limit} and runs it against your Mongo cluster.
  2. Mint a bearer token for the proxy.
  3. Paste base URL + token → Connect.
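The validation half of such a proxy is the part worth getting right: allowlist collections and clamp the result size before anything touches the database. The sketch below shows that half only (names and limits are illustrative); the handler would pass the parsed result to the MongoDB driver's `collection.find()`.

```javascript
// Validate and sanitize a POST /find body before it reaches Mongo.
// Illustrative sketch -- adjust the allowlist and maxLimit to taste.
function parseFindRequest(body, allowedCollections, maxLimit = 100) {
  if (!body || typeof body.collection !== "string") {
    throw new Error("collection required");
  }
  if (!allowedCollections.includes(body.collection)) {
    throw new Error("collection not allowed");
  }
  return {
    collection: body.collection,
    // Only accept a plain-object filter; default to match-all.
    filter: body.filter && typeof body.filter === "object" ? body.filter : {},
    // Clamp the requested limit so a bad query can't dump the collection.
    limit: Math.min(Number(body.limit) || maxLimit, maxLimit),
  };
}
```

Checking the bearer token and rejecting operators you don't want to expose (e.g. `$where`) would sit alongside this.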

Pricing

| | Basic | Premium |
| --- | --- | --- |
| Price | $69 lifetime | $99 lifetime |
| Formulas | 13 core | All 16 (+ vision, web, image) |
| Providers | OpenAI + Anthropic | + Gemini + Groq + others |
| Sidebar tools | Custom Prompt, Bulk Apply, API Fetcher | + Chat Agent |
| Updates | 1 year | Lifetime |
| Support | Email | Priority email |

One license = one account. 14-day money-back guarantee, no questions asked.

Privacy

gptsheet is built around a local-first data flow:

| Component | Sees your data? |
| --- | --- |
| Our server (Supabase) | Never. Only stores license key + bound email. |
| Your browser (sidebar tools) | Yes — sends data directly to your chosen LLM provider over HTTPS. |
| Google Apps Script runtime (cell formulas) | Yes — same trust boundary as your spreadsheet itself. |
| LLM providers (OpenAI, Anthropic, Gemini) | Yes — per their privacy policies. You're a direct customer. |

Read our full privacy policy.

Updates

Basic users get all updates released within 1 year of purchase. After that, your existing version keeps working forever — you just don't get new formulas or provider integrations released after the cutoff. Upgrade to Premium any time for lifetime updates.

Premium users get every update, forever. New formulas, new providers, new sidebar features — all included.

Refunds

14-day money-back guarantee. Email support@gptsheet.app within 14 days of purchase for a full refund. No questions, no clawbacks.

FAQ

Why is gptsheet cheaper than GPT for Work?

GPT for Work bundles the LLM cost into expiring credit packs. We don't — you pay providers directly. For a moderate user, that means $1–3/month in LLM costs vs $29–$299 in expiring GPT for Work credits.

Do you train on my data?

No — our server never sees your sheet data, prompts, or LLM responses. Your data flows directly to the LLM provider you chose; how they handle it is governed by their privacy policy (OpenAI, Anthropic, and Google all support API opt-out from training).

Can I use a local model with Ollama?

Premium tier supports custom OpenAI-compatible endpoints. Point at http://localhost:11434/v1 and run any model you've pulled. Useful when you want zero outbound LLM calls.

Does it work in Excel?

Yes. Same formulas, same sidebar UI, same chat agent. One license covers both Google Sheets and Microsoft Excel.

What happens after my 1-year Basic updates window ends?

Your existing version keeps working forever. You stop receiving new features released after the 1-year mark. Upgrade to Premium any time for lifetime updates, or stay on your current version indefinitely.

Troubleshooting

"Google hasn't verified this app" warning

Expected for newly-published add-ons before Google completes OAuth verification (usually 2-6 weeks after submission). Once verified, the warning disappears for all users.

To proceed in the meantime: click Advanced → Go to gptsheet (unsafe) → Allow. The "unsafe" label is generic — it does not mean the add-on is malicious; only that the verification review is still pending.

HTTP 429 from OpenAI when generating images

Cloudflare rate limit triggered by OpenAI's DALL·E 3 service on free / Tier 0 accounts. Switch to a Gemini image model:

=IMAGE(GPT_IMAGE("a green apple", "", "gemini-2.5-flash-image"))

"Authorization required" / "Specified permissions are not sufficient"

Apps Script triggers re-authorization when scopes change (e.g. we added Drive access for image generation). To re-grant:

  1. Open Extensions → Apps Script from your sheet.
  2. Pick any function in the toolbar and click Run.
  3. Approve the re-authorization dialog. New scopes are added to your existing grant.

Cell shows [gptsheet] OpenAI key not set

The cell formula needs its key synced to Apps Script. Open the sidebar → ≡ → API keys → edit OpenAI → Save. (Saving also syncs to formulas.)

Need more help?

Email support@gptsheet.app. Premium customers get priority response (typically within 24h).