Noah Robison
Portfolio
Selected Projects
Calypso AI platform

Calypso

Product Design / 2026

COSMOS NASA data tools

COSMOS

Personal Project / 2025

Lexicon typographic mood boards

Lexicon

Personal Project / 2025

Iterate design brief generator

Iterate

Personal Project / 2025

Wavelength mood playlist curator

Wavelength

Personal Project / 2025

About / Noah Robison

Founder. Designer.
Builder.

I'm Noah Robison, a product designer and founder based in Bellevue, WA. I built Calypso, an AI platform that goes from natural language to a fully deployed, production-ready application. I have spent the last three years working across UX, design systems, and frontend.

Background

Product design is my focus, end-to-end, from the first user interview to the shipped component. I have spent the last few years working close enough to frontend to know exactly what I am handing off, and building Calypso from the ground up gave me a front-row seat to how design decisions play out in the real world.

At a Glance

Location: Bellevue, WA
Focus: Product & Interaction Design
Experience: Founder @ Calypso.build

What I work with

Design

End-to-end product design

Interaction and motion design

Design systems and component libraries

User research and usability testing

Visual design and art direction

Figma (prototyping, variables, dev mode)

Build & Ship

Frontend development (React, JavaScript, TypeScript, HTML, CSS)

Design-to-dev handoff

Consumer social product design

Cross-functional collaboration

Rapid iteration from real usage data

Experience

Founder & Product Designer

Calypso.build -- Jan 2026 to Present

Sole designer of an AI platform that ships full apps, websites, and games from natural language. Responsible for the end-to-end product experience, visual system, component library, and design-to-dev handoff. Ship multiple product updates weekly, driven by real usage data and rapid user feedback loops.

UX / Product Designer

Freelance -- May 2022 to Jan 2026

End-to-end UX design across web and mobile for multiple clients, including user research, wireframing, prototyping, and visual design. Designed accessible, responsive interfaces across iOS, Android, and web with a focus on clarity, hierarchy, and interaction quality.

Events Manager

Bierkeller Columbia -- Oct 2023 to Sep 2025

Managed all programming and community experience for a high-volume German brewery and biergarten. Developed instincts for engagement, retention, and experience design at real scale across events ranging from intimate gatherings to large-scale festivals.

Toolkit

Figma · FigJam · Framer · Webflow · Maze · Miro · Notion · React · HTML / CSS · GSAP · Cursor · VS Code

CV / Resume

Resume

Product designer with a background in interaction design, design systems, and frontend development. Founder of Calypso.build, an AI-powered platform for generating full-stack applications.

Download Resume

Bellevue, WA

noahrobisone@gmail.com  |  803-767-6009

Areas of Expertise

Product Design (End-to-End)
Visual Design
User Research & Synthesis
Figma & Design Systems
Interactive Prototyping
Design Quality & Standards
Interaction Design
Cross-functional Collaboration
Consumer Social Product Design

Professional Experience

Founder & Product Designer -- Calypso.build

Jan 2026 -- Present
  • Sole designer of an AI-powered platform that ships full apps, websites, and games from natural language, responsible for end-to-end product experience, visual system, and brand identity
  • Designed the full component library and multi-platform UI system in Figma; manage design-to-dev handoff directly
  • Ship multiple product updates weekly, driven by real usage data and rapid user feedback loops

Events Manager -- Bierkeller Columbia

Oct 2023 -- Sep 2025
  • Managed all programming and community experience for a high-volume German brewery and biergarten, across events ranging from intimate gatherings to large-scale festivals
  • Developed instincts for community engagement, retention, and audience-driven experience design at real scale

UX / Product Designer -- Freelance

May 2022 -- Jan 2026
  • Delivered end-to-end UX design across web and mobile products for multiple clients, spanning user research, wireframing, prototyping, and visual design
  • Conducted usability testing and qualitative research to inform design decisions, iterating based on real user feedback throughout each engagement
  • Designed accessible, responsive interfaces across iOS, Android, and web, with a focus on clarity, hierarchy, and interaction quality

Education

Google UX Design Certificate

Google / Coursera

Dec 2022 -- June 2023

USC

Columbia, SC

Jan 2019 -- May 2025

Contact

Say hello.

Whether it's about work, something I built, or anything else, I'm easy to reach.

Product Design / Founder / 2026

Calypso

From idea to deployed application.

https://calypso.build
Visit Calypso.build
Calypso home screen

Overview

Every AI design and development tool on the market produces the same thing. Same fonts, same layouts, same color choices, same generic output that looks like it came from the same template. And beyond the aesthetics, most of these tools stop at a landing page. There is no continuity, no path forward, and no way to build anything that functions as a real product.

Calypso was built to solve both problems. The goal was a platform where anyone, with no background in design or development, could take a raw idea and build a complete product from start to finish. Not a landing page. A real product, whether that is a website, an app, a game, or something else entirely, designed, built, and published in one place by one person.

Calypso project overview showing the six-step progress bar and step cards

The Workflow

The core of Calypso is a six-step project flow: Features, User Flows, Pages and Content, Design, Build, Publish. Each step builds directly on the last. By the time a user reaches Design, the AI already knows the features they want, the flows connecting them, and every page the product needs. The design it generates is specific to that project, informed by everything that came before it.

This sequential structure was a deliberate design decision. Calypso treats the process like a guided path, concrete and ordered, always moving forward. The AI can do more at each step because it carries context from every step before it. Users never face a blank canvas without knowing what to put in it.

Once every step prior to Build is complete, the AI generates the full frontend and backend codebase of the project live, building it out in real time for the user to watch as it happens. The user selects their preferred framework and styling approach, and the AI handles the rest, producing a production-ready codebase from everything defined in the steps before it.

Features step showing card grid
User Flows step showing diagram view

The Workspace

The biggest design challenge was the workspace. Every step in the flow needed to feel like it belonged to the same product while using the screen completely differently. Features shows a card grid. User Flows shows a diagram. Pages and Content shows structured summaries. Design shows a live rendering of the actual page being built, generated in front of the user in real time. Build generates the full codebase live. Publish handles deployment.

Navigation through the project lives in a persistent left sidebar. All six steps are always visible on the left side of the screen, giving the user a clear and constant view of where they are in the project, where they have been, and where they are going. The sidebar never changes in structure because the six steps never change. What changes is the entire right side of the screen, which adapts completely to whichever step is active.

The main canvas to the right is where the work happens. During the Design step, Calypso renders the actual page live as the AI builds it, and the user watches it construct in real time. An AI avatar is accessible at any point across the entire platform. It opens as a panel alongside the sidebar on the left side of the screen, and can make changes to the current page, move between steps, or take over the entire project workflow on the user's behalf.

Design step showing Generate Design modal over live page preview

The Design Problem

The tension throughout the whole project was the same: how creative to go versus how familiar to stay. A tool that looks completely unlike anything else is exciting but disorienting to use. The resolution was to draw that line at the workspace boundary.

The home screen, project library, and designs gallery all follow familiar, well-tested layouts. Rows and columns, clear navigation, sorting and filtering where expected. The workspace is where the creative decisions live, because that is where users are already in a focused, exploratory mindset, and watching a product build itself in real time earns a certain amount of unfamiliarity.

Designs gallery showing range of generated designs

Role

Sole designer. Responsible for the full product experience, visual system, component library, design-to-dev handoff, user research, QA, and marketing.

Live: calypso.build

Personal Project / Design and Development / 2026

COSMOS

Seven NASA data tools inside a living galaxy.

https://cosmos-zeta-steel.vercel.app/
Visit COSMOS
COSMOS galaxy homepage with navigation nodes visible

The Starting Point

I watched the Artemis II launch live. Space exploration carries a kind of hope that is almost impossible to find anywhere else. There is no cynicism in it, no agenda, just humanity looking to the stars and asking the question "what's out there?" and following up with "let's find out". That feeling has been a constant for a long time, through dark periods and bright ones, and watching the launch turned it into something that needed to become more than a feeling.

The first move was joining the NASA Citizen Science program, contributing to actual ongoing research. The second was COSMOS, a single place to access NASA's live data streams, built to be genuinely beautiful and genuinely useful, for anyone who has ever looked up and wanted to feel a little closer to what is out there.

What It Is

Seven tools, each pulling from a different NASA or NOAA data source.

An ISS tracker showing the station's live position as it orbits at 28,000 km/h. An asteroid tracker monitoring near-Earth objects by closest approach distance. A space weather monitor for solar flares, coronal mass ejections, and geomagnetic storms. A moon phase and planetary events tracker calculated for your location. An exoplanet atlas of thousands of confirmed planets beyond our solar system. An astronomy picture of the day. And a tool showing daily images of Earth taken by the DSCOVR satellite, 1.5 million kilometers from home. That last one had a personal angle: the satellite is operated by NOAA, one letter away from Noah, which felt like reason enough to make it part of the project.

All seven tools live inside a galaxy built from particle animations. The navigation nodes are clusters of particles orbiting the galaxy. Clicking one collapses the galaxy and opens the tool. Scrolling pulls everything inward and reshapes it.

ISS Tracker tool open
Asteroid Tracker tool open

The Technical Problem

The hardest part was the particle system, getting tens of thousands of points to move, form, and respond to interaction at a consistent frame rate. GLSL shaders handle the heavy lifting, with particle positions computed on the GPU rather than the CPU so the browser can keep up.
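The pattern described above can be sketched roughly as follows. This is an illustrative reconstruction, not COSMOS's actual code: base positions are generated once on the CPU, and a vertex shader (here assuming Three.js ShaderMaterial conventions, where `position`, `projectionMatrix`, and `modelViewMatrix` are built-ins) displaces them per frame on the GPU, so the CPU never touches individual particles during animation.

```javascript
// Generate base galaxy-spiral positions once, up front (CPU side).
// Arm count and radius are invented placeholder values.
function spiralPositions(count, arms = 4, radius = 10) {
  const positions = new Float32Array(count * 3);
  for (let i = 0; i < count; i++) {
    const t = i / count;
    const angle = t * Math.PI * 2 * arms + t * 6; // wind outward along an arm
    const r = t * radius;
    positions[i * 3 + 0] = Math.cos(angle) * r;
    positions[i * 3 + 1] = (Math.random() - 0.5) * 0.5; // slight vertical scatter
    positions[i * 3 + 2] = Math.sin(angle) * r;
  }
  return positions;
}

// GLSL vertex shader: per-particle motion driven by a single uTime uniform,
// so the per-frame animation cost lives entirely on the GPU.
const vertexShader = /* glsl */ `
  uniform float uTime;
  void main() {
    vec3 p = position;
    float swirl = uTime * 0.05 + length(p.xz) * 0.3;
    p.x = position.x * cos(swirl) - position.z * sin(swirl);
    p.z = position.x * sin(swirl) + position.z * cos(swirl);
    gl_Position = projectionMatrix * modelViewMatrix * vec4(p, 1.0);
    gl_PointSize = 2.0;
  }
`;
```

Uploading the positions once as a buffer attribute and updating only `uTime` each frame is what lets tens of thousands of points hold a consistent frame rate.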

There was also a full day that felt like a complete loss. Nothing was working, the NASA API was returning errors on every call, and hours went into rewriting and debugging code that turned out to be completely fine. NASA's servers had been down all day. Every fix had been unnecessary. Some problems have nothing to do with anything you wrote.

The Mobile Decision

COSMOS was designed for desktop. The galaxy experience, scroll to rotate and click to navigate, needs a mouse and a large screen to work the way it was intended. Every user who tested it tried to use it on their phone, and found the experience lacking when it did not translate.

Rather than adapting the desktop version, the mobile experience was rebuilt from scratch in a single day. On mobile, COSMOS becomes a swipeable card carousel. The galaxy sits behind it as atmosphere. Each card surfaces the key data point immediately without requiring navigation into the galaxy. It is a different experience, not a compressed version of the same one, and both versions work the way they are supposed to on the device they are on.

Why It Exists

COSMOS is completely free and fully open source. It was built over three to four consecutive days of locking in and not stopping, from the initial idea to a published, working product. There was no financial incentive, no external deadline, no brief. Just the thing itself, and the desire to make it good.

Role

Solo. Design, development, data integration, mobile adaptation. Built with Three.js, WebGL, GLSL shaders, and the NASA and NOAA public APIs.

Live: cosmos-zeta-steel.vercel.app

Personal Project / Design and Development / 2025

Lexicon

Typographic mood boards generated from a feeling.

https://lexicon-delta.vercel.app
Visit Lexicon
Lexicon gallery showing the full grid of vibe cards

The Problem

Font pairing is taught as a rule system but experienced as a feeling. Most tools make you browse endless lists looking for something that matches a vibe you can barely articulate. The search almost never starts with the typeface. It starts with a word.

Lexicon flips that. Type any word and get back typographic mood boards that match its mood. Not keyword tags, not dropdown categories. A real word, looked up in real time, matched against the emotional character of each font pairing.

How the Matching Works

The hard version of this problem is making it work for any word, not just a curated list. Someone typing melancholy is easy. Someone typing cozy, hellish, frostbite, or bored is harder.

Every vibe in Lexicon is tagged with a set of emotional and aesthetic attributes drawn from 40 signal categories covering temperature, luminance, energy level, emotion, aesthetic style, and more. Each category maps to an expanded list of signal words. When a user types something, Lexicon runs it against every signal list and activates the tags that match.

For words that do not hit any signal directly, the tool calls the Dictionary API and Datamuse in parallel. Datamuse returns adjectives commonly used to describe the word, words it triggers associations with, and close synonyms. Those get weighted and added to the scoring. Dictionary definitions get sorted by part of speech, with adjectives weighted heaviest since they carry the most mood signal. Antonyms from both sources are stripped before scoring so hot does not accidentally activate cold tags.
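A minimal sketch of the signal-matching step, with invented signal lists and tag names (the real 40-category data lives in the project). The key detail from above is that antonyms returned by the dictionary sources are stripped before any scoring happens:

```javascript
// Illustrative signal lists only -- not Lexicon's actual categories.
const SIGNALS = {
  warm: ['hot', 'cozy', 'sunny', 'toasty'],
  cold: ['frostbite', 'icy', 'chilly', 'arctic'],
  calm: ['serene', 'quiet', 'soft'],
};

// Activate tags for the typed word plus its dictionary/Datamuse expansions.
// Antonyms are removed first, so expanding "hot" never lights up cold tags.
function activateTags(input, expansions = [], antonyms = []) {
  const blocked = new Set(antonyms.map((w) => w.toLowerCase()));
  const words = [input, ...expansions]
    .map((w) => w.toLowerCase())
    .filter((w) => !blocked.has(w));
  const active = new Set();
  for (const [tag, signalWords] of Object.entries(SIGNALS)) {
    if (words.some((w) => signalWords.includes(w))) active.add(tag);
  }
  return active;
}
```

With this shape, a direct hit like `frostbite` activates its tag without any API call, while an expanded word that happens to be an antonym of the input is filtered out before it can mis-score.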

Lexicon search input with a word typed and the resulting filtered card grid below

The Design

Each vibe card is its own type specimen. The card background, the large-scale Aa at display size, and the sample heading and body copy all demonstrate the pairing rather than describe it. If a card is doing its job, you know immediately whether it is what you are looking for before you hover to see the details.

The hover overlay keeps the detail panel clean. The font names, the copy-CSS button, and the category label all live there. The card face stays uncluttered.

The Typefaces view documents all ten typefaces in the library with their weights, character samples, and cross-links back to the vibes that use them. Clicking a vibe link from the Typefaces view jumps to the gallery and highlights the matching card.

Lexicon Typefaces view showing a typeface specimen with weight samples

Role

Solo. Design, development, data curation. Built with HTML, CSS, JavaScript, Tailwind, the Dictionary API, Datamuse, and Google Fonts.

Live: lexicon-delta.vercel.app

Personal Project / Design and Development / 2025

Iterate

A daily design brief generator for designers who need a push.

https://designbriefgenerator.vercel.app
Visit Iterate
Iterate showing a generated brief card with client, deliverable, constraint, and time limit

The Problem

Design practice without a brief is hard to start. You sit down to work and the blank canvas is the first obstacle. Constraints are what make practice productive, and the right constraint is specific enough to start immediately and open-ended enough to go anywhere.

Iterate generates a randomized design brief in one click: a fictional client, a deliverable, a target user, a creative constraint, and a time limit. Five elements. Enough to start.

What It Is

A brief generator with a session archive. Every generated brief gets logged below the active card so you can scroll back through what you have worked on, search previous sessions by client, deliverable, or constraint, and pick up something you left unfinished. The copy button puts the full brief on your clipboard as formatted text.

Twenty-four fictional clients, twenty-four deliverables, eighteen user types, twenty constraints, and eight time limits give the generator enough range that repetition is unlikely in a normal session. The spacebar shortcut turns brief browsing into something closer to flipping through a deck.
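The generator itself reduces to picking once from each pool. A minimal sketch, with placeholder pool contents (the real twenty-four clients and friends are the project's own data):

```javascript
// Placeholder pools -- the actual project ships far larger, hand-written lists.
const POOLS = {
  client: ['a boutique hotel', 'a vinyl record shop', 'a bike courier co-op'],
  deliverable: ['a landing page', 'an app onboarding flow', 'a poster'],
  user: ['first-time visitors', 'power users', 'busy commuters'],
  constraint: ['two colors only', 'no photography', 'type-only layout'],
  timeLimit: ['30 minutes', '1 hour', '2 hours'],
};

// One uniform random pick per pool.
function pick(list) {
  return list[Math.floor(Math.random() * list.length)];
}

// Five elements, one click.
function generateBrief() {
  return {
    client: pick(POOLS.client),
    deliverable: pick(POOLS.deliverable),
    user: pick(POOLS.user),
    constraint: pick(POOLS.constraint),
    timeLimit: pick(POOLS.timeLimit),
  };
}
```

At the stated pool sizes (24 × 24 × 18 × 20 × 8) the combinatorics work out to over 1.6 million distinct briefs, which is why repetition within a session is unlikely.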

Iterate session archive showing a grid of previously generated brief cards

The Design Decision

The brutalist aesthetic was the right choice for a tool about discipline and practice. Sharp borders, high contrast, no border radius, a sans-serif stack. A tool that generates design challenges should feel like it means business.

The text scramble animation on generation is the most considered interaction in the product. The characters cycling through random glyphs before resolving to the real brief makes it feel earned rather than instant. It creates just enough anticipation that when the brief lands, you are already reading it.
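The core of that effect can be expressed as a pure function of animation progress, sketched here as an assumption about the mechanism rather than the project's actual code: characters up to the current progress are resolved to the real brief, and the rest still cycle through random glyphs.

```javascript
// Glyph pool for the unresolved characters (an invented placeholder set).
const GLYPHS = '!<>-_\\/[]{}=+*^?#';

// One frame of the scramble: `progress` runs 0 -> 1 over the animation.
function scrambleFrame(target, progress) {
  const resolved = Math.floor(target.length * progress);
  let out = '';
  for (let i = 0; i < target.length; i++) {
    out += i < resolved
      ? target[i] // already landed on the real character
      : GLYPHS[Math.floor(Math.random() * GLYPHS.length)]; // still cycling
  }
  return out;
}
```

Driving this with `requestAnimationFrame` and an easing curve produces the cycling-then-resolving feel; at progress 1 the full brief text has landed and the user is already reading it.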

Role

Solo. Design, development, data writing. Built with HTML, CSS, JavaScript, and Tailwind.

Live: designbriefgenerator.vercel.app

Personal Project / Design and Development / 2025

Wavelength

Describe how you feel. Receive a playlist tuned to it.

https://wavelength-lake-gamma.vercel.app
Visit Wavelength
Wavelength homepage showing the arcade aesthetic with the spectrum analyzer and input field

The Starting Point

The idea was a direct line from a feeling to music that fits. Not a playlist algorithm optimizing for engagement, not a mood tag buried in a streaming app. Type what you are feeling in your own words and get back music that matches the actual character of that feeling.

The harder problem was making it work for any word someone might type. Happy and sad are easy. Overwhelmed, burnt out, frostbite, hell. Those require something more than a keyword list.

How the Matching Works

Every playlist in Wavelength has a natural language description written to capture the emotional territory it covers. When a user submits a word, the tool looks it up in the Dictionary API and falls back to Urban Dictionary for slang. Every meaningful word in the input gets expanded into its definitions, and those definitions are tokenized and scored against every playlist description.

Words from the original input are weighted double so the literal typed word outweighs vocabulary that came from dictionary expansion. Playlists whose descriptions share the most content words with the expanded query win. The winning category gets front-loaded so results feel coherent rather than random.
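The scoring pass described above can be sketched as follows. The playlist descriptions, stopword list, and weights here are illustrative assumptions; the mechanism is the part taken from the text: literal input words count double, expansions count single, and playlists are ranked by shared content words.

```javascript
// Minimal stopword list for the sketch -- the real list would be longer.
const STOPWORDS = new Set(['a', 'the', 'of', 'and', 'or', 'to', 'is']);

function tokenize(text) {
  return text.toLowerCase().split(/\W+/).filter((w) => w && !STOPWORDS.has(w));
}

// inputWords: what the user literally typed; expandedWords: vocabulary
// pulled in from dictionary definitions. Input weighs double.
function scorePlaylists(inputWords, expandedWords, playlists) {
  const weights = new Map();
  for (const w of tokenize(expandedWords.join(' '))) {
    weights.set(w, (weights.get(w) || 0) + 1);
  }
  for (const w of tokenize(inputWords.join(' '))) {
    weights.set(w, (weights.get(w) || 0) + 2); // literal input counts double
  }
  return playlists
    .map((p) => ({
      name: p.name,
      score: tokenize(p.description).reduce((s, w) => s + (weights.get(w) || 0), 0),
    }))
    .sort((a, b) => b.score - a.score);
}
```

The top-scoring category is then front-loaded in the results so the returned playlists feel coherent rather than random.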

Wavelength definition panel showing a typed word and its matched playlist results below

The Spectrum Analyzer

The centerpiece of the interface is a VU-meter-style spectrum analyzer built on HTML Canvas. It displays a bank of frequency bars color-coded green through amber to red at peaks, the way analog hi-fi gear used to.

When a track preview plays, the analyzer reacts to real audio frequencies pulled through a Web Audio AnalyserNode. The implementation uses two audio elements: a primary element without crossOrigin that plays sound regardless of CORS headers, and a secondary element with crossOrigin that feeds the Web Audio graph silently. This lets the bars respond to real frequency data while the primary audio always plays.
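The dual-element wiring can be sketched like this. Function and variable names plus the `fftSize` value are assumptions, not Wavelength's actual code, and the constructors are injected so the wiring can be exercised outside a browser:

```javascript
// Sketch of the CORS-safe dual-element setup described above.
function createDualAudio(src, { AudioCtor, ContextCtor }) {
  // Primary element: no crossOrigin, so it always produces sound even
  // when the preview host sends no CORS headers.
  const primary = new AudioCtor(src);

  // Secondary element: crossOrigin set so its samples are allowed into
  // the Web Audio graph; muted because it exists only to feed the analyser.
  const secondary = new AudioCtor(src);
  secondary.crossOrigin = 'anonymous';
  secondary.muted = true;

  const ctx = new ContextCtor();
  const analyser = ctx.createAnalyser();
  analyser.fftSize = 64; // 32 frequency bins -> 32 meter bars (assumed value)
  ctx.createMediaElementSource(secondary).connect(analyser);
  // The analyser is never connected to ctx.destination, so the secondary
  // element stays silent while still exposing real frequency data.
  return { primary, secondary, analyser };
}
```

In the browser this would be called with the native `Audio` and `AudioContext`, and each animation frame would call `analyser.getByteFrequencyData(...)` to drive the bars.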

Track Previews

Each track has a play button that fetches a 30-second preview from the iTunes Search API and plays 5 seconds of it. The lookup uses a three-strategy approach: artist plus title first, then title alone to catch alternate artist credits, then artist alone to sweep the full catalog. Candidates are scored by how well the artist name and track title match, and only results above a threshold are accepted. Preview URLs are cached so repeat plays are instant.
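The candidate-scoring step might look something like this. The iTunes Search API does return `artistName`, `trackName`, and `previewUrl` on each result; the scoring weights and threshold here are invented placeholders, and the three fetch strategies themselves are omitted:

```javascript
// Loose normalization so punctuation and casing differences don't block a match.
function normalize(s) {
  return s.toLowerCase().replace(/[^a-z0-9 ]/g, '').trim();
}

// Score one API result against the requested artist and title.
// Exact matches outrank substring matches; weights are placeholder values.
function scoreCandidate(candidate, artist, title) {
  const a = normalize(candidate.artistName);
  const t = normalize(candidate.trackName);
  let score = 0;
  if (t === normalize(title)) score += 2;
  else if (t.includes(normalize(title))) score += 1;
  if (a === normalize(artist)) score += 2;
  else if (a.includes(normalize(artist))) score += 1;
  return score;
}

// Accept only the best candidate above a threshold; otherwise no preview.
const THRESHOLD = 3; // placeholder value
function pickPreview(results, artist, title) {
  const best = results
    .map((r) => ({ r, s: scoreCandidate(r, artist, title) }))
    .sort((x, y) => y.s - x.s)[0];
  return best && best.s >= THRESHOLD ? best.r.previewUrl : null;
}
```

The accepted `previewUrl` is what gets cached per query, which is why repeat plays are instant.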

The Aesthetic

Wavelength is designed to look like it exists inside the music it serves. The arcade cabinet reference, the CRT screen, the starfield background, the yellow-on-black color system, the spectrum analyzer. These are not decoration. The aesthetic is the experience.

Role

Solo. Design, development, data curation. Built with HTML, CSS, JavaScript, Tailwind, GSAP, the iTunes Search API, the Dictionary API, Urban Dictionary, and the Web Audio API.

Live: wavelength-lake-gamma.vercel.app