Zigverse Studio creates living visual works born from musical performance. Using an EWI wind controller, every expressive decision a performer makes — breath, pitch, attack, dynamics, and timbre — becomes visual data in real time. Color, light, depth, and motion are rendered as generative HTML and captured as 4K video for permanent installation.
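The mapping described above — breath, pitch, and attack becoming color, light, and depth — can be sketched as a simple transfer function. The parameter names, controller assignments, and ranges below are illustrative assumptions, not the studio's actual pipeline; an EWI typically emits MIDI-style 7-bit values (0–127) for breath and note data.

```typescript
// Hypothetical sketch of a performance-to-visual mapping.
// Names, CC assignments, and ranges are assumptions for illustration only.

interface VisualFrame {
  hue: number;        // degrees on the color wheel, driven by pitch class
  lightness: number;  // percent, driven by breath pressure
  depth: number;      // 0..1, driven by attack velocity
}

function mapPerformanceToVisual(
  note: number,      // MIDI note number, 0-127
  breath: number,    // breath controller value, 0-127
  velocity: number   // note-on velocity, 0-127
): VisualFrame {
  return {
    hue: (note % 12) * 30,            // spread the 12 pitch classes over 360 degrees
    lightness: (breath / 127) * 100,  // stronger breath -> brighter light
    depth: velocity / 127,            // harder attack -> deeper motion
  };
}

// Example: middle C (note 60) at half breath, firm attack.
const frame = mapPerformanceToVisual(60, 64, 100);
console.log(frame);
```

In a sketch like this, each incoming performance event yields one visual frame, which a render loop could then feed into the generative HTML layer.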
The data chain from performance to pixel is documented, repeatable, and auditable. The work is not AI-generated. It is AI-assisted, human-originated, and biologically sourced.
Current work is available for platform licensing, permanent installation, and private collection.