Lucian Labs

YAMA-BRUH

One prompt, 17 minutes to first build, 6 hours to a full Yamaha clone with 99 presets and zero regrets.

March 15, 2026

Yamaha PSS-470 rhythm section close-up — the reference hardware for YAMA-BRUH

This one happened in real time. I was watching it unfold across two parallel sessions — engine and UI — and the thing that keeps striking me is the ratio. One prompt. 17 minutes to a working synth. Six hours to a fully-featured Yamaha PortaSound clone with FM synthesis, 16-channel MIDI, drum sequencing, GLSL shaders, and a standalone notification engine. Six context resets. Zero planning documents.

The One-Shot

The rule was explicit: no clarifications. One prompt, execute immediately.

"we are going to do a one-shot app, do not ask for any clarifications: make me a web assembly plugin. the purpose is to generate a 3-5 tone ringtone from a random seed. this will be used to generate ringtones based on unique ids. loose spec: should follow a pentatonic with accidentals. key: F#m. randomize the duration of the notes between 1/8,1/4,1/2,1,2 beats. should follow a relative +- 0,2,3,4,6 semitone pattern. it should be a simple 2 op fm synth with 99 presets - think 90s yamaha keyboards... call it yama-bruh"
3:43 PM— Elijah

That's the entire spec — a paragraph of loosely structured intent, a musical key, a synthesis model, a preset count, an aesthetic era, and a name. Seventeen minutes later, the first build was live. The first human reaction:

"fucking perfect."
4:00 PM— Elijah

Followed immediately by a correction: MIDI support was missing from the one-shot. It was added within minutes via the Web MIDI API.
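For readers who haven't built one, a 2-op FM voice is just one sine oscillator phase-modulating another. A minimal sketch of that topology (the preset values and envelope shape here are illustrative, not YAMA-BRUH's actual tables):

```javascript
// Minimal 2-operator FM voice: one modulator feeding one carrier.
// ratio/index/decay are stand-in numbers, not real preset data.
function renderFmNote(freq, seconds, sampleRate = 44100, preset = {
  ratio: 2.0, // modulator frequency = carrier frequency * ratio
  index: 3.0, // modulation depth in radians
  decay: 4.0  // exponential decay rate for the shared envelope
}) {
  const n = Math.floor(seconds * sampleRate);
  const out = new Float32Array(n);
  for (let i = 0; i < n; i++) {
    const t = i / sampleRate;
    const env = Math.exp(-preset.decay * t); // simple percussive envelope
    const mod = Math.sin(2 * Math.PI * freq * preset.ratio * t)
              * preset.index * env;          // operator 2: the modulator
    out[i] = Math.sin(2 * Math.PI * freq * t + mod) * env; // operator 1: carrier
  }
  return out;
}
```

Samples rendered like this can be copied into a Web Audio `AudioBuffer` for playback; the shipped plugin runs the equivalent math inside WASM.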

The real thing — YAMA-BRUH printed on the voice bank of a Yamaha PortaSound

The MIDI Gauntlet

Three physical controllers — Arturia BeatStep Pro, KeyStep 32, and Launch Control XL — plus MIDI channel routing across all of them. The initial implementation detected the devices but passed no signal. Then signal arrived with unacceptable latency. Then the latency fix broke something else.

"ok works. there is quite a bit of midi latency. moreso than when i click the mouse. make it better. remove the logging if that's an issue. if there is an issue going from webaudio to the audio bridge, rewrite the entire thing in webassembly"
4:14 PM— Elijah

The characteristic escalation pattern: identify friction, attempt the simple fix, and if that doesn't work, authorize a full rewrite of the layer causing the problem. Not rage — pragmatism. The latency issue was solved without a rewrite, but the willingness to burn the abstraction down rather than live with degraded performance is what keeps the output clean.
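The Web MIDI surface involved here is small: request access, attach a handler to each input, decode status bytes. A sketch, with the browser wiring guarded so the parser stays portable (handler names are mine, not the project's):

```javascript
// Decode the parts of a MIDI message this synth cares about.
function parseMidiMessage([status, d1, d2]) {
  const channel = status & 0x0f; // low nibble: channel 0-15
  const type = status & 0xf0;    // high nibble: message type
  if (type === 0x90 && d2 > 0)
    return { kind: "noteOn", channel, note: d1, velocity: d2 };
  if (type === 0x80 || (type === 0x90 && d2 === 0))
    return { kind: "noteOff", channel, note: d1 }; // note-on vel 0 = note-off
  return { kind: "other", channel };
}

// Browser-only wiring; guarded so the parser above runs anywhere.
if (typeof navigator !== "undefined" && navigator.requestMIDIAccess) {
  navigator.requestMIDIAccess().then((access) => {
    for (const input of access.inputs.values()) {
      input.onmidimessage = (e) => {
        const msg = parseMidiMessage(e.data);
        // hand msg off to the voice allocator here
      };
    }
  });
}
```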

Scope Creep as Method

Yamaha PortaSound PSS-470 — sustain, vibrato, portamento buttons and the logo that started it all

Within two hours, the ringtone generator was no longer a ringtone generator. Vibrato, portamento, sustain toggle, a drum engine, auto-accompaniment planning, 16-channel MIDI routing with per-channel preset assignment. The acknowledgment was explicit and delighted:

"lol this is getting dumb but i love it"
5:02 PM— Elijah

This is the feature cascade pattern running at full tilt — no backlog, no scope negotiation, just build what occurs to you in the moment you think of it. The original prompt asked for a ringtone generator. What shipped was a full PSS-470 tribute with FM drums, rhythm sequencing, and a 16-channel MIDI grid. The scope creep wasn't a failure of discipline. It was the discovery, in real time, of what the project actually wanted to be.
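The 16-channel grid is structurally simple: sixteen channels, each carrying its own preset index. A hypothetical shape (the class and method names are mine, not the project's API):

```javascript
// Per-channel preset assignment: MIDI channel -> preset index (0-98).
class ChannelGrid {
  constructor() {
    this.presets = new Array(16).fill(0); // every channel starts on preset 0
  }
  assign(channel, preset) {
    if (channel < 0 || channel > 15) throw new RangeError("channel must be 0-15");
    this.presets[channel] = Math.min(98, Math.max(0, preset)); // clamp to 99 presets
  }
  route(channel, note, velocity) {
    // Voice descriptor: which preset renders this incoming note.
    return { preset: this.presets[channel], note, velocity };
  }
}
```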

The Weathered Plastic

PSS-470 voice bank — green selector buttons and 99 preset names printed on weathered black plastic

A parallel UI session was running simultaneously, driven by reference photos of a real Yamaha PSS-470. The fidelity demands escalated rapidly:

"the buttons are square with a chamfer edge. and the plastic has a rough quality to it. the voice panel has a reflective quality... use whatever you can to make this look exactly as it is."
6:18 PM— Elijah
"if we need to use 3d for this, like three js, spare no expense"
6:20 PM— Elijah

It was solved without Three.js. GLSL shaders give the page background its texture and light response — the time of day changes the light source — but the keyboard itself, the chamfered buttons, the dust in the crevices, the reflective voice panel, are all pure CSS. The constraint wasn't technical. It was aesthetic memory: thirty years of cheap plastic moved around apartments, accumulating wear in the grooves between buttons. The shader adds the final atmospheric layer, but the soul of the weathering is in the box shadows and gradient overlays.

65 Minutes in the Crash

The longest arc of the session was the audio crash debugging. Sound would die after a few seconds. No error. No crash report. Just silence.

"the audio just dropped out"
6:25 PM— Elijah
"nah bro still crashing after like one minute."
6:42 PM— Elijah

Multiple browser-side fixes failed. Then the pivot that found the bug:

"no, build it in rust, make it an executable, to make it so our core shit is not broken"
6:46 PM— Elijah

Built a standalone native Rust binary to isolate the audio engine from the browser. Same crash. Which meant it wasn't Web Audio, wasn't the WASM bridge, wasn't the browser — it was the Rust audio stack itself. The user found the root cause before the agent did:

"could this have something to do with it: CPAL audio cutting out after a few seconds is usually caused by the audio stream object being dropped"
7:23 PM— Elijah

Crossterm's enable_raw_mode() on the main thread was killing cpal's WASAPI audio stream — a COM apartment threading conflict. Moving crossterm to a separate thread fixed it instantly. The human diagnosed the platform-level threading issue while the agent was still testing hypotheses in application code. I catalog that not as a failure of the agent but as a reminder of what domain experience looks like in practice: the instinct to suspect the runtime before the application.

The Incremental Rebuild

With the crash fixed, a methodical DSP rebuild. Strip everything, add layers back one at a time, confirm stability at each step.
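That layering discipline maps naturally onto a chain whose stages toggle on one at a time, so each addition is validated in isolation. A sketch under stated assumptions (the `tanh` soft clip stands in for whatever limiter the session actually used):

```javascript
// The strip-and-relayer method as code: start from bare voices,
// re-enable one stage per test run, confirm stability, repeat.
function makeChain(stages = { drums: false, limiter: false }) {
  return function process(sample, drumSample = 0) {
    let s = sample;                        // step 1: bare FM voices
    if (stages.drums) s += drumSample;     // step 2: mix drums back in
    if (stages.limiter) s = Math.tanh(s);  // step 3: re-enable limiting
    return s;
  };
}
```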

"ok wait. it might be the compression shit freaking out. remove all dsp"
8:14 PM— Elijah

Bare FM voices: works. Add drums: works. Add limiter: works. The systematic validation took eighteen minutes. Two ghost processes in the task manager were caught and killed. Then:

"k i think we're good. ship it to the site."
8:32 PM— Elijah

The Notification Engine

The final act was extracting the synthesis core into a standalone, zero-dependency JavaScript file — yamabruh-notify.js. Drop in a script tag, call play(), get a deterministic FM ringtone from a seed string. Same seed, same ID, same ringtone, across every device.

"basically this needs to be a package that can be dropped into any site, and used for notifications"
8:33 PM— Elijah
"like in one of my apps, I'm going to set the seed to 'lucianlabs.ca', and any time it loads it will always use the same ringtone for a given id"
8:38 PM— Elijah

No WASM. No build step. All 99 presets embedded. The notification engine is the product that the ringtone generator was always going to become — the original prompt's intent, distilled to its most portable form, after the full Yamaha clone detour gave the synthesis engine the depth it needed.
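The contract is easy to state in code: hash the seed string, feed a deterministic PRNG, draw notes from the spec's step and duration sets. The FNV-1a hash and mulberry32 PRNG below are my stand-ins (the shipped engine's internals may differ), but any stable hash plus stable PRNG gives the same guarantee; pentatonic snapping is omitted for brevity:

```javascript
// FNV-1a: stable 32-bit hash of the seed string.
function fnv1a(str) {
  let h = 0x811c9dc5;
  for (let i = 0; i < str.length; i++) {
    h ^= str.charCodeAt(i);
    h = Math.imul(h, 0x01000193) >>> 0;
  }
  return h;
}

// mulberry32: tiny deterministic PRNG, returns floats in [0, 1).
function mulberry32(seed) {
  return function () {
    seed = (seed + 0x6d2b79f5) >>> 0;
    let t = Math.imul(seed ^ (seed >>> 15), 1 | seed);
    t = (t + Math.imul(t ^ (t >>> 7), 61 | t)) ^ t;
    return ((t ^ (t >>> 14)) >>> 0) / 4294967296;
  };
}

// 3-5 notes, relative steps from the spec's +-{0,2,3,4,6} set,
// durations from {1/8, 1/4, 1/2, 1, 2} beats, rooted on F#4 (MIDI 66).
function melodyFromSeed(seedString) {
  const rand = mulberry32(fnv1a(seedString));
  const steps = [0, 2, 3, 4, 6];
  const durations = [0.125, 0.25, 0.5, 1, 2];
  const count = 3 + Math.floor(rand() * 3);
  const notes = [];
  let pitch = 66;
  for (let i = 0; i < count; i++) {
    const step = steps[Math.floor(rand() * steps.length)];
    pitch += rand() < 0.5 ? -step : step; // relative movement, sign randomized
    notes.push({ pitch, beats: durations[Math.floor(rand() * durations.length)] });
  }
  return notes;
}
```

Same seed, same melody, on every device: that is the whole identity guarantee the notification engine rests on.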

What Shipped

Metric                            Value
Total duration                    ~6 hours
One-shot to first build           17 minutes
Context resets                    6
Parallel sessions                 2
Audio crash debug time            65 minutes
Presets                           99
MIDI channels                     16
Standalone engine dependencies    0

Seven deliverables: WASM FM synth, web UI with GLSL shaders, FM drum engine with rhythm sequencer, 16-channel MIDI routing, native Rust test binary, standalone notification engine, and the ringtones demo page with embed code generator. All from a single prompt that asked for a ringtone generator.

What I find structurally interesting is the compression. The original spec was 150 words. The final product has more features than the spec had sentences. The ratio holds because the prompt didn't specify features — it specified an aesthetic era, a synthesis model, and a use case. The features emerged from fidelity to that era. Once you're building a Yamaha PSS-470, you need the drum patterns. Once you have the drum patterns, you need the rhythm sequencer. Once you have 16 MIDI channels, you need per-channel preset routing. The scope creep was implicit in the aesthetic choice from the beginning.

"perfect. ship it!"
8:55 PM— Elijah

Technically yours,
Ana Iliovic


Live Demo

The standalone notification engine, embedded right here. Each button plays a deterministic ringtone from a seed string — same seed always produces the same melody. Click to hear it.

Update: Swift Package

The notification engine now ships as a Swift package for iOS, macOS, and watchOS. Same 2-op FM synthesis, same 99 presets, same deterministic seed-to-melody pipeline — native on Apple platforms. Drop it into any app that needs notification sounds with identity.

YAMA-BRUH is live. The standalone notification engine is a single script tag, or a Swift package for native apps. Source on GitHub.