Riley Brown · 27m

Why I’m Quitting Replit (I Built My Own in 4 Hours)

TL;DR

  • Riley rebuilt Replit’s mobile experience as a simpler Swift app called Jerry in about 4 hours — using OpenAI Codex, Xcode, Claude’s agent SDK, and the VibeCode CLI, he made an iPhone app that chats like iMessage and generates apps without writing code by hand.

  • The core bet is UX, not raw capability — Riley says Replit’s mobile app feels “small,” “cluttered,” and confusing, so Jerry strips the flow down to texting a smart developer named Jerry, then auto-opens generated app links inside the app.

  • He works with AI tools in parallel, not step-by-step — while Codex planned and coded the app, Paper.design generated UI mocks at the same time, and Riley kept “steering” both agents live with tweaks like full-screen preview, simpler home screen, and better chat formatting.

  • Most of the build was debugging agent plumbing, not product vision — early versions failed with missing data, stuck “Jerry is thinking” states, broken formatting, missing VibeCode CLI access, and link-handling bugs, which he fixed by feeding logs and screenshots back into Codex.

  • The final prototype actually shipped the key loop — Jerry could chat normally, call the VibeCode CLI to generate a landing page and a Notion-style todo app, open previews in-app, and even accept voice edits via an OpenAI Whisper-powered record button.

  • The most interesting interface idea came late: chat layered over live app preview — inspired by how Twitter/X opens links, Riley added a bottom chat overlay on top of the preview so you can keep prompting the builder while actively using the generated app.

The Breakdown

Why he bailed on Replit mobile

Riley opens with a pretty direct critique: after a weekend of testing Replit, he just doesn't like the mobile app. To him it feels cramped, cluttered, and kind of boring, so the whole video becomes a challenge to build a better mobile "vibe coding" app from scratch — one that feels more like texting than operating a tiny IDE.

The pitch: an app that builds apps

His concept is called Jerry, a Swift iPhone app where the user chats with an AI developer named Jerry to create web or mobile apps. Under the hood, Claude’s agent SDK handles the conversation, the VibeCode CLI does the actual sandboxed app generation and hosting, and the app automatically opens the returned URL inside an in-app preview.
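The video doesn't show Jerry's source, but the loop it describes (user text goes to the agent, any URL in the reply auto-opens in an in-app preview) has a natural Swift shape. Here's a minimal sketch; `AgentClient`-style wiring, the regex URL scrape, and all names are assumptions, not the real implementation:

```swift
import Foundation
import Combine

// One chat message in the Jerry conversation.
struct ChatMessage: Identifiable {
    let id = UUID()
    let text: String
    let isUser: Bool
}

@MainActor
final class ChatViewModel: ObservableObject {
    @Published var messages: [ChatMessage] = []
    @Published var previewURL: URL?    // non-nil drives the in-app preview
    @Published var isThinking = false  // backs a "Jerry is thinking" indicator

    // Injected backend call; stands in for the Claude agent SDK round-trip.
    let sendToAgent: (String) async throws -> String

    init(sendToAgent: @escaping (String) async throws -> String) {
        self.sendToAgent = sendToAgent
    }

    func send(_ text: String) async {
        messages.append(ChatMessage(text: text, isUser: true))
        isThinking = true
        defer { isThinking = false }
        guard let reply = try? await sendToAgent(text) else {
            messages.append(ChatMessage(text: "Something went wrong. Try again.",
                                        isUser: false))
            return
        }
        messages.append(ChatMessage(text: reply, isUser: false))
        // Auto-open the first generated app link instead of making the user tap it.
        if let match = reply.range(of: #"https?://\S+"#, options: .regularExpression) {
            previewURL = URL(string: String(reply[match]))
        }
    }
}
```

The design choice worth noting is that the view model only ever sees text in and text out; everything Jerry-specific (the agent SDK, the VibeCode CLI) lives behind the injected closure, which is also what makes the "auto-open the link" behavior a pure client-side concern.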

Starting in Xcode, then handing the wheel to Codex

Riley manually creates the barebones Swift project in Xcode — basically a Hello World shell — then opens the top-level project folder in OpenAI Codex. From there he pastes in a long prompt describing the product, the architecture, and the desired UX, then flips Codex into plan mode to map out the first build.

Designing in parallel with Paper while Codex codes

Instead of waiting around, he opens Paper.design — "basically a Figma but designed specifically for AI agents" — and has it generate the four main screens: inbox, conversation, preview, and build result. This is where you see his workflow philosophy clearly: he doesn't move linearly; he runs research, design, and implementation simultaneously, dropping "steer" messages into the threads to refine details as the agents work.

The first build: long waits, ugly failures, then a heartbeat

The initial Codex run takes so long he literally goes for a 30-minute walk. When he finally runs the app on his iPhone, it immediately throws a “data couldn’t be read because it’s missing” error, then later gets stuck on “Jerry is thinking through the build” without replying, so he starts the real game: copying Xcode logs and screenshots back into Codex and asking it to fix the app iteratively.

Getting chat to feel like chat

Once the agent finally responds — “Hey, I’m Jerry. What are you building today?” — the formatting is a mess, with broken multiline layout and sloppy alignment. Riley pushes it toward an iMessage feel, asking for cleaner bubbles, proper alignment, and the classic three-dot typing indicator, which is one of the first moments where the app starts feeling like a product instead of a hacked demo.

Wiring up the VibeCode CLI and proving it can generate apps

The next hurdle is giving the Claude-powered agent actual access to the VibeCode CLI. After a few rounds of “it’s not on your machine” and confusion over missing skills files, he gets it working: Jerry confirms it has CLI access, then successfully generates a simple landing page for Joe & The Juice smoothies and later a Notion-style todo app, with the preview opening directly inside the iPhone app.
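The plumbing behind that step lives on the agent's machine, not the phone: the agent gets a "build an app" tool that shells out to the CLI and hands back the hosted URL. A sketch of that server-side shape follows; the `vibecode` invocation and its arguments are placeholders (the real CLI's interface isn't shown in the video), so treat this as structure, not an actual command:

```swift
import Foundation

enum BuildError: Error {
    case cliFailed(String)  // non-zero exit; carries captured output
    case noURL              // build "succeeded" but printed no preview link
}

// Server-side tool the agent can call: run the CLI, scrape the hosted URL.
func generateApp(prompt: String) throws -> URL {
    let process = Process()
    process.executableURL = URL(fileURLWithPath: "/usr/bin/env")
    process.arguments = ["vibecode", prompt]   // placeholder invocation
    let pipe = Pipe()
    process.standardOutput = pipe
    try process.run()
    process.waitUntilExit()

    let output = String(decoding: pipe.fileHandleForReading.readDataToEndOfFile(),
                        as: UTF8.self)
    guard process.terminationStatus == 0 else {
        throw BuildError.cliFailed(output)
    }
    // Assumes the CLI prints the hosted preview URL somewhere in its output.
    guard let match = output.range(of: #"https?://\S+"#, options: .regularExpression),
          let url = URL(string: String(output[match])) else {
        throw BuildError.noURL
    }
    return url
}
```

Riley's "it's not on your machine" errors are exactly what this shape fails with when the binary isn't on the agent host's PATH, which is why most of his debugging here was environment setup rather than code.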

The late-stage UI idea that makes the whole thing click

The final stretch is all about preview UX. Riley doesn't just want links to auto-open; he wants the app preview to feel native. After noticing how Twitter/X opens links with the underlying app still visible, he borrows that pattern and asks Codex to keep the chat input overlaid at the bottom of the preview, so he can keep prompting while testing the app live. He caps it off with voice input using OpenAI Whisper, so you can literally speak changes like "make this app light mode" while the preview is open.
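That overlay pattern is simple to express in SwiftUI: a web view fills the screen and the chat input floats over it in a `ZStack`. Below is a hedged sketch of the idea (the view names and the `onPrompt` callback are made up for illustration; voice input is left out):

```swift
import SwiftUI
import WebKit

// Wraps WKWebView so the generated app can render inside SwiftUI.
struct PreviewWebView: UIViewRepresentable {
    let url: URL
    func makeUIView(context: Context) -> WKWebView { WKWebView() }
    func updateUIView(_ webView: WKWebView, context: Context) {
        webView.load(URLRequest(url: url))
    }
}

// The Twitter/X-style layering: live preview underneath, chat input on top,
// so you can keep prompting the builder while using the generated app.
struct PreviewWithChatOverlay: View {
    let url: URL
    let onPrompt: (String) -> Void   // forwards the message back to the agent
    @State private var draft = ""

    var body: some View {
        ZStack(alignment: .bottom) {
            PreviewWebView(url: url)
                .ignoresSafeArea()
            HStack {
                TextField("Keep editing, e.g. make this light mode", text: $draft)
                    .textFieldStyle(.roundedBorder)
                Button("Send") {
                    onPrompt(draft)
                    draft = ""
                }
                .disabled(draft.isEmpty)
            }
            .padding()
            .background(.ultraThinMaterial)   // chat floats over the live app
        }
    }
}
```

The nice property of this layering is that the preview and the conversation share no state: the overlay just re-enters the same send path as the main chat, and a rebuilt URL simply reloads the web view underneath.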