ROCKMUSIC
Rocks, Camera, Action!
Rockmusic turns ordinary rocks into an experimental musical playground. No instruments and no training are needed: just arrange stones, trigger sounds, and discover unexpected rhythms and melodies. By blending physical interaction with simple technology, the project reveals how everyday objects can become expressive creative tools, inviting curiosity, play, and a new way to discover music.
Course: Programming
Mentors: Jacob Remin, Dennis P Paul
Team: Vedant, Rohan
Music begins with a simple action: placing a rock on the platform. An overhead camera scans the surface and identifies each stone in real time using a computer-vision pipeline built in Python.
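The detection step can be sketched as follows. This is a minimal stand-in for the project's actual computer-vision pipeline, not its implementation: it thresholds a grayscale frame and groups dark pixels into connected blobs, one per rock. All function names, the threshold value, and the "rocks are darker than the platform" assumption are illustrative.

```python
import numpy as np

def detect_rocks(gray_frame, threshold=100):
    """Label dark blobs (rocks) in a grayscale frame and return their centroids.

    Pixels darker than `threshold` are treated as rock, and 4-connected
    regions are grouped with an iterative flood fill.
    """
    mask = gray_frame < threshold          # assumption: rocks darker than platform
    visited = np.zeros_like(mask, dtype=bool)
    centroids = []
    h, w = mask.shape
    for y in range(h):
        for x in range(w):
            if mask[y, x] and not visited[y, x]:
                # flood-fill one connected component
                stack, pixels = [(y, x)], []
                visited[y, x] = True
                while stack:
                    cy, cx = stack.pop()
                    pixels.append((cy, cx))
                    for ny, nx in ((cy-1, cx), (cy+1, cx), (cy, cx-1), (cy, cx+1)):
                        if 0 <= ny < h and 0 <= nx < w and mask[ny, nx] and not visited[ny, nx]:
                            visited[ny, nx] = True
                            stack.append((ny, nx))
                ys, xs = zip(*pixels)
                centroids.append((sum(xs) / len(xs), sum(ys) / len(ys)))
    return centroids

# Two dark "rocks" on a bright 8x8 platform
frame = np.full((8, 8), 255, dtype=np.uint8)
frame[1:3, 1:3] = 20   # rock A
frame[5:7, 4:7] = 30   # rock B
print(detect_rocks(frame))  # one (x, y) centroid per rock
```

In the real system a library such as OpenCV would handle this per frame far more efficiently; the sketch only shows the shape of the data the camera step produces.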
Every rock becomes an input: its position selects the instrument, while the total number of rocks shapes the rhythm and tempo. This spatial and numeric data is processed in Python and transmitted to VCV Rack's modular audio engine, where it is transformed into evolving sound and generative music.
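The position-to-instrument and count-to-tempo mapping could look something like this. The zone layout, instrument names, and tempo formula are all assumptions for illustration; the source does not specify the project's exact rules.

```python
# Hypothetical mapping: the platform is split into vertical zones, each zone
# selects an instrument, and the rock count scales the tempo.
INSTRUMENTS = ["kick", "snare", "marimba", "pad"]  # illustrative names

def rocks_to_music(centroids, platform_width=640):
    """Map rock centroids to instruments and a tempo in BPM."""
    zone_w = platform_width / len(INSTRUMENTS)
    instruments = [INSTRUMENTS[min(int(x // zone_w), len(INSTRUMENTS) - 1)]
                   for x, _y in centroids]
    tempo_bpm = 60 + 10 * len(centroids)   # assumption: more rocks, faster rhythm
    return instruments, tempo_bpm

instruments, bpm = rocks_to_music([(50, 100), (500, 200), (630, 40)])
print(instruments, bpm)  # ['kick', 'pad', 'pad'] 90
```

Keeping the mapping a pure function of the detected centroids makes each camera frame independently reproducible, which suits a generative system driven by live rearrangement.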
Early prototypes relied on a MIDI bus connection between Python and GarageBand, helping validate and refine the rock-recognition pipeline quickly. While effective for testing, GarageBand’s single MIDI input channel limited the system to one instrument, restricting musical scalability.
To unlock richer sound design, the system migrated to VCV Rack, an open-source modular synthesis platform that supports multiple parallel instrument channels (10+), enabling greater creative freedom.
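The per-channel routing that VCV Rack makes possible can be shown with raw MIDI bytes: the channel lives in the low nibble of the note-on status byte (0x90 | channel), so up to 16 instruments can share one stream. The channel assignments and note choices below are illustrative, not taken from the project.

```python
def note_on(channel, note, velocity=100):
    """Return the three raw bytes of a MIDI note-on message."""
    assert 0 <= channel < 16
    return bytes([0x90 | channel, note & 0x7F, velocity & 0x7F])

# One message per instrument channel. A single-input host like GarageBand
# plays all of these on one instrument; VCV Rack can route each channel
# to its own module, which is what enabled the 10+ parallel instruments.
messages = [note_on(ch, 60 + ch) for ch in range(10)]
print([m.hex() for m in messages][:3])  # ['903c64', '913d64', '923e64']
```

In practice a library such as mido would construct and send these messages over a virtual MIDI port; the byte-level view just makes the channel mechanism explicit.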
Stack: