Case Study · January 2026 · 5 min read

Landing IQ

Landing IQ turns a landing page screenshot into a structured UX audit — overall score, severity-rated issues, and per-issue conversion impact estimates — in under 60 seconds. Three subscription tiers, usage enforced at the database layer, billing maintained through Stripe's full webhook lifecycle. Built solo, from schema to production.

Landing IQ — cover image

[Solo Full-Stack Build] · [AI-Powered UX Analysis] · [Atomic Usage Enforcement]

Overview

Landing IQ serves marketing teams and designers who need specific, actionable feedback on a landing page — not a generic best-practice checklist. A user uploads a screenshot, provides context about their page type, primary CTA, target audience, and device focus, and receives a structured audit in under 60 seconds: a 1–10 score, severity-tiered issues, quick wins, and a conversion impact estimate for each finding.

The analysis is context-aware by design. An e-commerce product page is evaluated against different conversion signals than a SaaS trial signup. Gemini 2.0 Flash uses the injected context to assess visual hierarchy, fold positioning, CTA placement, WCAG contrast compliance, and typographic clarity — scoping its output to what the user is actually trying to diagnose. Every issue carries a severity level and an estimated conversion impact range, which drives the report's visual hierarchy: critical issues surface first, quick wins are separated into their own section.

The product runs three tiers: a free plan with 5 analyses per month, Pro at $19 per month with 50, and Business at $49 per month with no cap.

The Challenge

The core difficulty was not the AI call — it was everything that had to happen correctly around it. An analysis writes to three systems in sequence: file storage, the database, and Gemini's inference endpoint. Any of those can fail independently. A file that uploads successfully but whose database record never gets created leaves orphaned storage. A database row written before Gemini finishes and never cleaned up on failure leaves a record stuck in a state the user cannot resolve. Getting the failure paths right was the foundation the rest of the product depended on.

Subscription enforcement added a separate constraint. Usage limits had to be checked and incremented inside the same request that triggered the analysis, because the analysis was the billable event. A gap between the check and the increment — or either running in application code rather than the database — would allow a user to exhaust or bypass their quota by firing simultaneous requests. Quota enforcement had to be atomic at the database layer, not optimistic at the API layer.

The AI response required defensive handling at every stage. Gemini's structured output can arrive wrapped in markdown fences, contain control characters, or include partial JSON when the model hedges on confidence. Any of those breaks a naive parser and surfaces a broken report to a paying user. The response pipeline had to parse, sanitize, repair, and validate before anything reached the frontend — a valid report had to be guaranteed, not assumed.

My Role

I designed and built Landing IQ solo — no team, no contractor, no inherited codebase. I owned the database schema, the API layer, the auth and storage configuration, the Stripe integration from checkout through webhook handling, and the complete frontend: marketing site, dashboard, analysis flow, and report UI.

The constraint of solo ownership shaped the architecture. Every decision had to be defensible without a second opinion — which meant being explicit about failure states, atomic about side effects, and conservative about what the frontend was allowed to assume about backend state.

What I Built

Analysis Pipeline

The analysis endpoint orchestrates four steps in sequence: auth check, usage limit verification via a database function, screenshot upload to storage, and the Gemini call with the image and injected context. Each stage only proceeds if the previous one succeeded. On failure at any point after the storage write, I delete both the file and the database row — the system never leaves partial state visible to a user. Analysis status moves from pending through analyzing to completed or failed, and the client resolves against that state.
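The compensation logic above can be sketched as follows. This is a minimal illustration of the orchestration, not the production code — the `Deps` interface and every function name in it are hypothetical stand-ins for the real storage, database, and model clients:

```typescript
// Hypothetical dependency interface — names are illustrative, not the
// production API. Injecting the clients keeps the compensation visible.
interface Deps {
  checkQuota(userId: string): Promise<boolean>;
  uploadScreenshot(userId: string, file: Uint8Array): Promise<string>; // storage path
  deleteScreenshot(path: string): Promise<void>;
  insertAnalysis(userId: string, path: string): Promise<string>;      // row id
  deleteAnalysis(id: string): Promise<void>;
  runModel(path: string): Promise<unknown>;
  markCompleted(id: string, report: unknown): Promise<void>;
}

async function runAnalysis(deps: Deps, userId: string, file: Uint8Array): Promise<string> {
  // Each stage proceeds only if the previous one succeeded.
  if (!(await deps.checkQuota(userId))) throw new Error("quota exceeded");

  const path = await deps.uploadScreenshot(userId, file);
  let rowId: string | undefined;
  try {
    rowId = await deps.insertAnalysis(userId, path);
    const report = await deps.runModel(path);
    await deps.markCompleted(rowId, report);
    return rowId;
  } catch (err) {
    // Compensate: after the storage write, a failure anywhere deletes both
    // the file and the database row, so no partial state is ever visible.
    if (rowId) await deps.deleteAnalysis(rowId);
    await deps.deleteScreenshot(path);
    throw err;
  }
}
```

The try/catch boundary sits deliberately after the upload: the upload's own failure needs no compensation, while everything inside the block has at least one side effect to unwind.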

Before sending the image, I inject the user's page context directly into the prompt — page type, CTA copy, target audience, device focus, and an optional conversion rate baseline. That context shapes every evaluation the model makes.
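A minimal sketch of that context injection, assuming a context shape like the one described — the `PageContext` fields and prompt wording here are illustrative, not the production schema:

```typescript
// Hypothetical context shape; field names are assumptions for illustration.
interface PageContext {
  pageType: string;             // e.g. "saas-trial" or "ecommerce-product"
  ctaCopy: string;
  targetAudience: string;
  deviceFocus: "desktop" | "mobile";
  conversionBaseline?: number;  // optional current conversion rate, in percent
}

function buildAuditPrompt(ctx: PageContext): string {
  const lines = [
    `You are auditing a ${ctx.deviceFocus} landing page of type "${ctx.pageType}".`,
    `Primary CTA: "${ctx.ctaCopy}". Target audience: ${ctx.targetAudience}.`,
  ];
  // The baseline is only mentioned when the user supplied one, so the
  // model never anchors on a number that does not exist.
  if (ctx.conversionBaseline !== undefined) {
    lines.push(`Current conversion rate baseline: ${ctx.conversionBaseline}%.`);
  }
  lines.push(
    "Return a JSON audit: overall score (1-10), issues with severity",
    "(critical/major/minor) and estimated conversion impact ranges.",
  );
  return lines.join("\n");
}
```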

The response pipeline handles structural instability in model output explicitly. After receiving Gemini's reply, I strip markdown fences, scan for control characters, attempt JSON repair on truncated responses, and validate the result against a strict schema before writing anything to the database. A report only reaches the user if the full structure is valid.
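The shape of that parse-repair-validate step can be sketched like this. The repair heuristic shown — closing unbalanced braces and brackets on truncated output — is a simplified illustration of the idea, and the `Report` type is a stand-in for the real schema:

```typescript
// Illustrative report type; the production schema is stricter.
interface Report {
  score: number;
  issues: { title: string; severity: string }[];
}

// Close any braces/brackets (and an open string) left dangling by a
// truncated model reply. A sketch: it cannot fix every truncation shape.
function closeUnbalanced(text: string): string {
  const stack: string[] = [];
  let inString = false;
  for (let i = 0; i < text.length; i++) {
    const c = text[i];
    if (inString) {
      if (c === "\\") i++;              // skip the escaped character
      else if (c === '"') inString = false;
    } else if (c === '"') inString = true;
    else if (c === "{" || c === "[") stack.push(c);
    else if (c === "}" || c === "]") stack.pop();
  }
  let suffix = inString ? '"' : "";
  while (stack.length) suffix += stack.pop() === "{" ? "}" : "]";
  return text + suffix;
}

function parseModelReply(raw: string): Report {
  // 1. Strip markdown fences (```json ... ```)
  let text = raw.trim().replace(/^```(?:json)?\s*/i, "").replace(/```$/, "").trim();
  // 2. Drop control characters that break JSON.parse (keep \t, \n, \r)
  text = text.replace(/[\u0000-\u0008\u000B\u000C\u000E-\u001F]/g, "");
  // 3. Parse, falling back to repair on truncated output
  let parsed: unknown;
  try {
    parsed = JSON.parse(text);
  } catch {
    parsed = JSON.parse(closeUnbalanced(text));
  }
  // 4. Validate: nothing reaches the frontend unless the structure is whole
  const r = parsed as Partial<Report>;
  if (typeof r.score !== "number" || !Array.isArray(r.issues)) {
    throw new Error("invalid report structure");
  }
  return r as Report;
}
```

The important property is the order: sanitize before parsing, repair only as a fallback, and validate unconditionally — a reply that survives repair but fails validation is still rejected.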

Subscription and Usage Enforcement

Usage enforcement runs through two database functions: one fires before the analysis starts to check the limit, another fires after the screenshot upload succeeds to increment the counter. Moving this logic into the database rather than application code gives atomic semantics for concurrent requests — a user's monthly counter is checked and incremented inside the database, making quota bypass through simultaneous API calls structurally impossible rather than just unlikely.
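The semantics amount to a compare-and-set on the usage counter. Below is an in-memory model of that check-and-increment — not the production function, whose rough database-side equivalent is sketched in the comment:

```typescript
// In production the database performs this as one statement, roughly
// (assumed, not the actual function body):
//   UPDATE usage SET used = used + 1
//   WHERE user_id = $1 AND used < monthly_limit
//   RETURNING used;
// A row comes back only if there was headroom — check and increment happen
// atomically, so concurrent requests cannot both pass a nearly-spent quota.
interface Usage { used: number; limit: number }

const usage = new Map<string, Usage>();

function tryConsumeAnalysis(userId: string): boolean {
  const u = usage.get(userId);
  if (!u || u.used >= u.limit) return false; // no quota row, or exhausted
  u.used += 1;  // single mutation; the database makes this step atomic
  return true;
}
```

On the free plan's limit of 5 analyses per month, the sixth call in a billing period returns false regardless of how many requests arrive at once.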

The Stripe webhook handler processes the full subscription lifecycle through a security-definer database function that writes subscription state with elevated access — the webhook handler has no user session, so standard row-level security would block the write. The handler is idempotent by design: the same event delivered twice produces identical subscription state. Retries and out-of-order delivery from Stripe do not corrupt the billing record.
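The two properties — idempotence and tolerance of out-of-order delivery — can be sketched with in-memory state. Event and field names here are illustrative, not Stripe's payload shape or the production handler:

```typescript
// Illustrative event shape; real Stripe events carry far more structure.
interface SubscriptionEvent {
  id: string;                                  // e.g. "evt_123" — unique per event
  created: number;                             // event timestamp
  customerId: string;
  status: "active" | "canceled" | "past_due";
  priceId: string;
}

const processedEvents = new Set<string>();
const lastApplied = new Map<string, number>(); // customerId -> newest applied timestamp
const subscriptions = new Map<string, { status: string; priceId: string }>();

function handleWebhook(evt: SubscriptionEvent): void {
  // Duplicate delivery: already processed, so this is a no-op.
  if (processedEvents.has(evt.id)) return;
  processedEvents.add(evt.id);
  // Out-of-order delivery: a stale event never overwrites newer state.
  if ((lastApplied.get(evt.customerId) ?? 0) > evt.created) return;
  lastApplied.set(evt.customerId, evt.created);
  // The write is a pure function of the event payload, so applying the
  // same event twice converges on identical subscription state.
  subscriptions.set(evt.customerId, { status: evt.status, priceId: evt.priceId });
}
```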

Dashboard and Analytics

The dashboard computes per-user analytics at request time: total audits run, average issues per completed report, severity distribution, audit frequency over the trailing seven days, and remaining monthly quota against the plan limit. Server state is cached and deduplicated across the dashboard shell, so navigating between views does not re-fetch data already in memory.
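A sketch of that request-time rollup, with an assumed row shape — the `Analysis` fields are illustrative, not the production schema:

```typescript
// Illustrative analysis row; field names are assumptions.
interface Analysis {
  status: "pending" | "analyzing" | "completed" | "failed";
  issues: { severity: "critical" | "major" | "minor" }[];
  createdAt: Date;
}

function dashboardStats(analyses: Analysis[], planLimit: number) {
  const completed = analyses.filter(a => a.status === "completed");
  const severity = { critical: 0, major: 0, minor: 0 };
  let issueCount = 0;
  for (const a of completed) {
    for (const i of a.issues) { severity[i.severity]++; issueCount++; }
  }
  const weekAgo = Date.now() - 7 * 24 * 60 * 60 * 1000;
  return {
    totalAudits: analyses.length,
    avgIssuesPerReport: completed.length ? issueCount / completed.length : 0,
    severityDistribution: severity,
    auditsLastSevenDays: analyses.filter(a => a.createdAt.getTime() >= weekAgo).length,
    remainingQuota: Math.max(0, planLimit - analyses.length),
  };
}
```

Computing these at request time over a single fetched list keeps the dashboard a pure projection of stored rows — there is no analytics table to drift out of sync.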

The Plan Health card surfaces remaining quota and upgrade paths contextually — copy and destination link change depending on the current plan, keeping the upgrade conversation inside the product rather than routing to a static pricing page. The CTA switches between "Upgrade" and "Manage Subscription" based on plan state.

Auth, Profiles, and Notifications

Auth reads and writes session cookies server-side on every protected route. On signup, a database trigger automatically creates a user profile from the new user's metadata — full name, company, and job title — so the dashboard has profile data on first load without a separate API call. Route protection lives in the dashboard layout: an unauthenticated request hits a server-side redirect before any rendering begins.

A second database trigger on the analyses table writes a notification when an analysis reaches completed or failed status. A dropdown in the dashboard nav fetches unread notifications and marks them read on open — users get a direct link to completed reports without polling the analyses list.

Results

The parse-repair-validate pipeline in front of every Gemini response means no structurally broken report reaches a user. The guarantee is enforced in code, not monitored after the fact.

Atomic usage enforcement in the database means three subscription tiers behave correctly under concurrent load. Quota is a constraint on the system, not a suggestion to the API.

The idempotent Stripe webhook handler means billing state survives out-of-order delivery and duplicate events without manual reconciliation. Subscription state reflects what Stripe has confirmed, not what the UI optimistically assumed.

The notification system driven by database triggers means users learn when an analysis completes inside the product — without polling, without a background job, and without a third-party service.

Closing

One developer. One codebase. Auth, AI, atomic billing, and no broken reports in production.