Quite a number of people have requested a way to keep track of the progress I'm making on my volumetric LED mapping application, and I figured this might be the easiest way to post a few videos without spamming LED Facebook groups, and hopefully give some of you the answers that I haven't had time to properly address.
Please do not spam my contact form just to tell me you have a question. This page will be updated with answers about distribution, function, and anything else you probably have questions about as I resolve issues and refine functionality. You can fill in your email address at the bottom of the page if you would like to be updated about the release date, etc.
Spatial Pixel Mapper is a desktop application for 3D spatial mapping of individually addressable LEDs. Using a pair of
stereo cameras and WLED-compatible controllers, it automatically scans each LED one by one, detects its position in both camera views, and triangulates its precise 3D location in space. The app features automatic WLED device
discovery, per-port LED range selection, adaptive speed and brightness testing to optimize for your specific setup, and a multi-scan workflow that lets you capture from different angles and merge the results into a single unified point cloud. A built-in refine tab provides tools for cleaning up outliers, editing points, and visualizing your mapped installation in an interactive 3D viewer with slice-plane inspection. Once finalized, the completed spatial map can be exported for use in LED mapping and visualization software like Resolume and MadMapper, turning any freeform LED installation — whether it's wrapped around a sculpture, strung across a ceiling, or shaped into a sphere — into a
precisely mapped 3D pixel canvas ready for spatial video and generative content.
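The app's internals aren't public, but the triangulation step it describes is a standard stereo-vision technique. Here's a minimal sketch of how one LED's 3D position can be recovered from its pixel coordinates in two calibrated cameras, using the linear DLT method; the function name and parameters are my own, and a real pipeline would get the projection matrices from a stereo calibration step.

```python
import numpy as np

def triangulate(P1, P2, uv1, uv2):
    """Triangulate one 3D point from its pixel coordinates in two
    calibrated camera views using the linear DLT method.

    P1, P2 : 3x4 camera projection matrices (from stereo calibration)
    uv1, uv2 : (u, v) pixel coordinates of the lit LED in each view
    """
    u1, v1 = uv1
    u2, v2 = uv2
    # Each view contributes two linear constraints on the homogeneous point X
    A = np.array([
        u1 * P1[2] - P1[0],
        v1 * P1[2] - P1[1],
        u2 * P2[2] - P2[0],
        v2 * P2[2] - P2[1],
    ])
    # The least-squares solution is the right singular vector
    # associated with the smallest singular value
    _, _, Vt = np.linalg.svd(A)
    X = Vt[-1]
    return X[:3] / X[3]  # dehomogenize to (x, y, z)
```

Running this once per LED, with the same LED lit and detected in both camera frames, yields the point cloud that the multi-scan merge and refine steps then operate on.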
The application has been streamlined into a logical, high-efficiency pipeline.
Here's a quick example of a simple scan and implementation using one single scan, because it is a relatively 2D object and I just wanted to demonstrate the functionality rather than achieve perfect results.
This 2D LED Mapper is a mobile Android app for mapping the physical 2D positions of individually addressable LED strips, currently driven by an inexpensive WLED controller. It automatically discovers WLED devices on your network, then uses your phone's camera to scan each LED one-by-one — lighting them up via WLED's Art-Net interface and detecting their position in the camera frame using real-time computer vision (VisionCamera + FastOpenCV). Multiple scans can be stitched together in a workspace viewer with automatic alignment, letting you map large installations in overlapping sections. The workspace provides a full suite of non-destructive correction tools: interpolation fills gaps in detection, straighten snaps jittery points onto best-fit lines, spacing lock smooths out outlier positions, and a select mode lets you drag-select groups of LEDs to move, evenly space, or straighten specific sections. A probe tool lights up LEDs on the physical strip to verify positions match reality. All corrections are reversible through an effects stack. The final mapped layout can be exported for use in LED animation software, including but not limited to Resolume, Madrix, ELM, and WLED, turning a physical layout into a precisely mapped pixel canvas.
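The core detection step — light one LED, find where the frame got brighter — can be sketched with plain frame differencing. This is my own illustrative version, not the app's actual VisionCamera/FastOpenCV code; the function name and threshold default are assumptions.

```python
import numpy as np

def detect_led(baseline, lit, threshold=60):
    """Locate one lit LED by differencing a frame against a baseline
    (all-LEDs-off) frame. Frames are HxW grayscale uint8 arrays.

    Returns the intensity-weighted (x, y) centroid of the pixels that
    brightened past the threshold, or None if nothing lit up.
    """
    # Signed difference so darkening pixels can't trigger detection
    diff = lit.astype(np.int16) - baseline.astype(np.int16)
    mask = diff > threshold
    if not mask.any():
        return None
    ys, xs = np.nonzero(mask)
    weights = diff[ys, xs].astype(float)
    cx = float((xs * weights).sum() / weights.sum())
    cy = float((ys * weights).sum() / weights.sum())
    return cx, cy
```

Repeating this for each LED index gives the raw point list that the interpolation, straighten, and spacing-lock tools then clean up.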

This 3D LED Effect Generator is a professional-grade, real-time LED show control application. It features a dual-deck mixing system with a crossfader and 10 blend modes, allowing seamless blending between 68+ built-in visual effects — ranging from classics like rainbow waves and strobes to advanced 3D volumetric effects like DNA helix, vortex, aurora, and even custom GLSL shaders. A 27-effect output FX chain provides post-processing, including trails, pixelation, axis mirroring, and beat flash. The app is deeply music-reactive, with real-time audio frequency analysis, beat detection, tap tempo, per-effect BPM sync, and full Ableton Link integration. Comprehensive MIDI control with learn mode supports mapping any controller to effect parameters, crossfader, BPM, palette colors, and more. Video clips can be played back from files or captured live via Spout, with frame-accurate in/out points, speed control, and BPM-synced playback. A show sequencer enables timeline-based automation with cued effect changes, crossfader moves, and BPM shifts. Presets are organized in an 8x6 grid browser with unlimited pages, and full show files persist every aspect of a session. Output is transmitted over industry-standard sACN and Art-Net DMX protocols, with a real-time Three.js 3D point-cloud visualizer providing live preview. The entire system runs at 60Hz on dedicated worker threads for reliable, low-latency performance in live installations and performances.
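To make the dual-deck idea concrete, here is a minimal sketch of crossfader blending between two per-LED frame buffers. Only two of the ten blend modes are illustrated, and the function name, buffer layout, and the "additive" gain curve are my own assumptions, not the app's actual implementation.

```python
import numpy as np

def mix_decks(deck_a, deck_b, fader, mode="crossfade"):
    """Blend two per-LED RGB frame buffers (N x 3 float arrays, 0-1)
    at a crossfader position: 0.0 = deck A only, 1.0 = deck B only.
    """
    a = np.asarray(deck_a, dtype=float)
    b = np.asarray(deck_b, dtype=float)
    if mode == "crossfade":
        # Linear mix: A fades out exactly as B fades in
        out = (1.0 - fader) * a + fader * b
    elif mode == "additive":
        # DJ-style additive: each deck holds full gain until past center,
        # so both decks are fully visible at fader = 0.5
        gain_a = min(1.0, 2.0 * (1.0 - fader))
        gain_b = min(1.0, 2.0 * fader)
        out = gain_a * a + gain_b * b
    else:
        raise ValueError(f"unknown blend mode: {mode}")
    return np.clip(out, 0.0, 1.0)
```

In a live engine this mix would run once per 60Hz tick, with the result handed to the sACN/Art-Net output stage and the 3D preview.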
For anyone interested in release dates etc., you can fill in your email address at the bottom of the page.