payton@pschwarz:~/code$ ls pb_emu/
src/  public/  tests/  vite.config.js  package.json  README.md

payton@pschwarz:~/code$ cat pb_emu/README.md

Pixelblaze is a small dev board for driving addressable LEDs. You write patterns in a JS-ish language, send them to the board over Wi-Fi, watch them light up real pixels. It’s great, until you’re iterating on a 3D map you haven’t physically built yet, or you want to tweak a pattern on the train without lugging a strip around.

So I built pb_emu: a browser-based Pixelblaze pattern emulator. Load a .js or .epe pattern, point it at a pixel map (1D strip, 2D grid, 3D arbitrary layout), and it runs the pattern live against the map in a Three.js scene. No hardware in the loop.

How it works

Three layers, each boring on its own. The interesting part is that together they behave enough like real Pixelblaze firmware that patterns port over without edits.

  • Map layer parses Marimapper CSV, Pixelblaze JSON, or a raw mapper function, normalizes coords to [0, 1), and figures out dimensionality.
  • VM layer evaluates pattern code in a scoped Function sandbox with ~100 Pixelblaze built-ins wired in: waveforms, HSV/RGB color math, palettes, Perlin noise, the whole menagerie. It also handles quirks the firmware depends on: implicit globals zero-initialize the way the hardware’s fixed-point memory does; prng() uses the same Xorshift32 seed; dimensionality dispatch follows the 3D→2D→1D fallback cascade.
  • Render layer is Three.js: an InstancedMesh of sprite LEDs with an Unreal bloom pass so the glow reads as actual light instead of flat dots.
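The map layer’s normalization step can be sketched roughly like this (the helper name is mine, not pb_emu’s actual API): scale each axis independently so every coordinate lands in [0, 1), with degenerate (constant) axes collapsing to 0.

```javascript
// Minimal sketch, assuming per-axis min/max scaling; not pb_emu's real code.
function normalizeMap(points) {
  const dims = points[0].length;
  const out = points.map((p) => p.slice());
  for (let d = 0; d < dims; d++) {
    const vals = points.map((p) => p[d]);
    const min = Math.min(...vals);
    const span = Math.max(...vals) - min;
    for (const p of out) {
      // Scale by (1 - EPSILON) so the max point stays strictly below 1,
      // keeping the half-open [0, 1) range; a constant axis maps to 0.
      p[d] = span === 0 ? 0 : ((p[d] - min) / span) * (1 - Number.EPSILON);
    }
  }
  return out;
}
```

Dimensionality then falls out of the normalized data: however many axes have nonzero span is how many dimensions the map has.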
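The zero-initialized-globals quirk can be sketched with a `with`-scoped Proxy (this is illustrative; pb_emu’s actual sandbox plumbing isn’t shown here): every identifier lookup routes through the proxy, undeclared names read back as 0 like the hardware’s zeroed fixed-point memory, and built-ins resolve from a table we control.

```javascript
// Minimal sketch, not pb_emu's real API: a sandbox where undeclared
// globals read as 0 and only the supplied built-ins are visible.
function makeSandbox(builtins) {
  const store = Object.create(null); // the pattern's global variables
  const scope = new Proxy(store, {
    has: () => true, // capture every identifier lookup inside `with`
    get: (target, key) => {
      if (key === Symbol.unscopables) return undefined;
      if (key in builtins) return builtins[key];
      return key in target ? target[key] : 0; // undeclared -> 0
    },
    set: (target, key, value) => ((target[key] = value), true),
  });
  const run = (source) =>
    new Function("scope", `with (scope) { ${source} }`)(scope);
  return { run, store };
}

const vm = makeSandbox({ wave: (v) => (Math.sin(v * Math.PI * 2) + 1) / 2 });
vm.run("x = x + 1"); // x was never declared: it reads as 0, becomes 1
```

The `with` statement is sloppy-mode only, which is fine here since the sandboxed source is compiled through `new Function` rather than a module.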
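For the prng() quirk, a Xorshift32 generator looks like this (the classic 13/17/5 shift triple; pb_emu’s exact seeding and scaling are not shown, so treat the details as a sketch):

```javascript
// Illustrative Xorshift32: same seed in, same sequence out, which is
// what lets emulator runs reproduce hardware runs deterministically.
function xorshift32(seed) {
  let state = (seed >>> 0) || 1; // avoid the all-zero fixed point
  return function prng(max) {
    state ^= state << 13;
    state >>>= 0;
    state ^= state >>> 17;
    state ^= state << 5;
    state >>>= 0;
    return (state / 4294967296) * max; // uniform in [0, max)
  };
}
```

Because the sequence is fully determined by the seed, a pattern that sprinkles prng() calls renders identically frame-for-frame in the emulator and on the board, so long as the call order matches.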
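The fallback cascade itself is simple to state in code. The function names render3D/render2D/render are the real Pixelblaze convention; the dispatch helper below is just a sketch of the selection rule:

```javascript
// A 3D map prefers render3D, then degrades through render2D to render;
// a 2D map prefers render2D, then render; a 1D map always uses render.
function pickRenderer(pattern, mapDims) {
  if (mapDims >= 3 && typeof pattern.render3D === "function") {
    return pattern.render3D;
  }
  if (mapDims >= 2 && typeof pattern.render2D === "function") {
    return pattern.render2D;
  }
  return pattern.render;
}
```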

Around that, the usual quality-of-life features: CodeMirror 6 editor with live linting, auto-reload on disk change, an LED inspector (click a pixel, see its index/coords/color), control widgets for exported sliders and pickers that persist per-pattern in localStorage, screenshot export, a handful of keyboard shortcuts.
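The per-pattern control persistence can be sketched like this (the key scheme and helper names are assumptions for illustration, not pb_emu’s actual code):

```javascript
// Hypothetical key scheme: one localStorage entry per pattern id.
const controlsKey = (patternId) => `pb_emu:controls:${patternId}`;

function saveControls(patternId, values) {
  localStorage.setItem(controlsKey(patternId), JSON.stringify(values));
}

function loadControls(patternId) {
  const raw = localStorage.getItem(controlsKey(patternId));
  return raw ? JSON.parse(raw) : {}; // no saved state -> empty defaults
}
```

Keying by pattern id means switching patterns restores each one’s slider positions independently.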

What it’s not

It’s not bit-exact. Pixelblaze uses 16.16 fixed-point math; pb_emu runs Float64, so any pattern that intentionally exploits overflow at ±32,768 or relies on 32-bit bitwise tricks will diverge. The transform stack is tracked but not yet multiplied through per-pixel coords. Sensor globals (frequencyData, accelerometer, light) are zero-filled, so no audio reactivity. These limits are documented here, and the in-app linter flags the patterns most likely to bite.
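To make the overflow divergence concrete, here is an illustrative helper (mine, not pb_emu’s) that quantizes a value the way 16.16 fixed-point storage would, including signed 32-bit wraparound. pb_emu deliberately does not do this, which is exactly the gap:

```javascript
// Illustrative only: store x as a signed 32-bit integer holding
// 16 integer bits and 16 fractional bits. `| 0` forces the same
// signed 32-bit overflow the hardware's memory exhibits.
function toFixed16_16(x) {
  const raw = (x * 65536) | 0;
  return raw / 65536;
}

toFixed16_16(1.5);   // 1.5: representable exactly, no divergence
toFixed16_16(40000); // -25536: wraps past +32768, unlike Float64
```

A pattern that counts past 32,768 and depends on the wrap (a surprisingly common trick for cheap sawtooth waves) will therefore behave differently in the emulator.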

For the 90% case (developing on a map you don’t have built yet, tuning a pattern on a laptop, sanity-checking before deploying), it’s been worth it.

payton@pschwarz:~/code/pb_emu$ █