A nodal real-time video processing tool: put together pre-made "processing boxes" to generate interactive video. It runs on pretty much anything and uses a plugin architecture.
Say, plug in a camera, and it will blend two video streams using a silhouette detected on the camera, with various effects. It's very, very early, pre-alpha stuff, but it was already used for a demo by a customer.
GitHub: pestacle. Be warned, it's undocumented and at a larval stage.
Will it be something like TouchDesigner[1]? I've never used TD myself, but I follow a lot of creative types who use it for making music visualizations, art installations, etc.
I can't find it on GitHub though, maybe the repo is private?
Be warned, zero documentation, because things are at a larval stage and change often. I will include a couple of demos this week.
In spirit, yes, but targeting different hardware, audiences, and environments.
* It runs on Linux, Mac, Windows. Bare metal on RP2040 and RP2350 is planned.
* It's written in C, built with Make.
* It is meant to run on something like a Raspberry Pi, Latte Panda, etc.
* A setup is a text file, no fancy UI.
* The plan for live parameter fiddling is a web server. The web UI will be tailored to each setup, no one-size-fits-all UI. Typically I pay someone to do the UI.
* For now, it's video only, no sound output.
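Since a setup is just a text file, here is a guess at what one could look like. This syntax is entirely hypothetical (the format is undocumented), it's only meant to show the idea of declaring boxes and wiring them together:

```
# hypothetical setup syntax, not the actual pestacle format
node camera  source.v4l2
node mask    filter.silhouette
node blend   compose.mix

connect camera.out -> mask.in
connect camera.out -> blend.a
connect mask.out   -> blend.mask
```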
It will be used for several large interactive LED displays and object tracking systems. It's a way for me to factorize all those projects I was contracted for.