---
license: mit
title: Arpeggiator
sdk: static
emoji: 🦀
colorFrom: purple
colorTo: pink
short_description: Hand-controlled arpeggiator, drum machine, and visualizer
---
# Hand Gesture Arpeggiator

Hand-controlled arpeggiator, drum machine, and audio-reactive visualizer. Raise your hands to raise the roof!

An interactive web app built with Three.js, MediaPipe computer vision, Rosebud AI, and Tone.js.

- Hand #1 controls the arpeggios (raise the hand to raise the pitch, pinch to change the volume)
- Hand #2 controls the drums (raise different fingers to change the pattern)

A rough sketch of the first control appears below the demo image.

[Video](https://youtu.be/JepIs-DTBgk?si=4Y-FrQDF6KNy662C) | [Live Demo](https://collidingscopes.github.io/arpeggiator/) | [More Code & Tutorials](https://funwithcomputervision.com/)

<img src="assets/demo.png">
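As a rough illustration of that first control, here is a minimal sketch (not the project's actual code) that maps the wrist height of one detected hand to a note in a scale and the thumb-to-index pinch distance to the synth volume. The scale, landmark indices, and thresholds are assumptions made for the example.

```javascript
// Illustrative sketch only: map hand height to arpeggio pitch and pinch to volume.
// The scale, thresholds, and wiring are assumptions, not the project's actual code.
import * as Tone from "tone";

const synth = new Tone.Synth().toDestination();
const scale = ["C4", "D4", "E4", "G4", "A4", "C5", "D5", "E5"]; // example scale

// Call this from the MediaPipe Hands onResults callback.
function handleHand(results) {
  const hand = results.multiHandLandmarks?.[0];
  if (!hand) return;

  // Landmark y is normalized with 0 at the top of the frame,
  // so a raised hand gives a small y and should map to a higher note.
  const wristY = hand[0].y; // landmark 0 = wrist
  const step = Math.min(scale.length - 1, Math.floor((1 - wristY) * scale.length));

  // Pinch: distance between thumb tip (landmark 4) and index tip (landmark 8).
  const pinch = Math.hypot(hand[4].x - hand[8].x, hand[4].y - hand[8].y);
  synth.volume.value = -30 + Math.min(pinch / 0.3, 1) * 30; // roughly -30 dB to 0 dB

  synth.triggerAttackRelease(scale[step], "16n");
}
```

In practice the notes would more likely be fired from a Tone.js Transport loop that reads the latest gesture state, rather than on every camera frame, so the arpeggio stays in time.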
## Requirements

- Modern web browser with WebGL support
- Camera access enabled for hand tracking
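For reference, a quick runtime check of both requirements might look like the following sketch; the error message and the handoff of the camera stream are assumptions, not code from this project.

```javascript
// Illustrative sketch only: verify WebGL support and request camera access.
async function checkRequirements() {
  // WebGL: try to obtain a WebGL context from a throwaway canvas.
  const canvas = document.createElement("canvas");
  const gl = canvas.getContext("webgl") || canvas.getContext("experimental-webgl");
  if (!gl) {
    throw new Error("WebGL is not supported in this browser.");
  }

  // Camera: getUserMedia prompts for permission (requires HTTPS or localhost).
  const stream = await navigator.mediaDevices.getUserMedia({ video: true });
  return stream; // hand the MediaStream to a <video> element for hand tracking
}
```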
## Technologies

- **MediaPipe** for hand tracking and gesture recognition
- **Three.js** for audio-reactive visual rendering
- **Tone.js** for synthesizer sounds
- **HTML5 Canvas** for visual feedback
- **JavaScript** for real-time interaction
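To show how these pieces can fit together, here is a minimal, self-contained sketch (names such as `analyser` and `cube` are assumptions, not taken from the project source) that routes a Tone.js synth through an analyser and uses the signal level to scale a Three.js mesh each frame.

```javascript
// Illustrative sketch only: drive a Three.js cube with the level of a Tone.js signal.
import * as THREE from "three";
import * as Tone from "tone";

// Route a synth both to the speakers and to an analyser we can read every frame.
const analyser = new Tone.Analyser("waveform", 256);
const synth = new Tone.Synth().connect(analyser).toDestination();

// Basic Three.js scene with a single cube.
const scene = new THREE.Scene();
const camera = new THREE.PerspectiveCamera(60, innerWidth / innerHeight, 0.1, 100);
camera.position.z = 3;
const renderer = new THREE.WebGLRenderer({ antialias: true });
renderer.setSize(innerWidth, innerHeight);
document.body.appendChild(renderer.domElement);
const cube = new THREE.Mesh(new THREE.BoxGeometry(), new THREE.MeshNormalMaterial());
scene.add(cube);

function animate() {
  requestAnimationFrame(animate);
  // RMS of the waveform buffer gives a rough loudness value to drive the cube's scale.
  const buffer = analyser.getValue();
  let sum = 0;
  for (const v of buffer) sum += v * v;
  const rms = Math.sqrt(sum / buffer.length);
  cube.scale.setScalar(1 + rms * 3);
  cube.rotation.y += 0.01;
  renderer.render(scene, camera);
}
animate();
```

Notes triggered on the synth (for example, from the gesture handler sketched earlier) then make the cube pulse with the audio.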
## Setup for Development

```bash
# Clone this repository
git clone https://github.com/collidingScopes/arpeggiator

# Navigate to the project directory
cd arpeggiator

# Serve with your preferred method (example using Python)
python -m http.server
```

Then navigate to `http://localhost:8000` in your browser.
## License

MIT License

## Credits

- Three.js - https://threejs.org/
- MediaPipe - https://mediapipe.dev/
- Rosebud AI - https://rosebud.ai/
- Tone.js - https://tonejs.github.io/
## Related Projects

I've released several computer vision projects (with code + tutorials) here:
[Fun With Computer Vision](https://www.funwithcomputervision.com/)

You can purchase lifetime access and receive the full project files and tutorials. I'm adding more content regularly 🪬

You might also like some of my other open-source projects:

- [3D Model Playground](https://collidingScopes.github.io/3d-model-playground) - control 3D models with voice and hand gestures
- [Threejs hand tracking tutorial](https://collidingScopes.github.io/threejs-handtracking-101) - basic hand-tracking setup with Three.js and MediaPipe computer vision
- [Particular Drift](https://collidingScopes.github.io/particular-drift) - turn photos into flowing particle animations
- [Video-to-ASCII](https://collidingScopes.github.io/ascii) - convert videos into ASCII pixel art
## Contact

- Instagram: [@stereo.drift](https://www.instagram.com/stereo.drift/)
- Twitter/X: [@measure_plan](https://x.com/measure_plan)
- Email: [stereodriftvisuals@gmail.com](mailto:stereodriftvisuals@gmail.com)
- GitHub: [collidingScopes](https://github.com/collidingScopes)

## Donations

If you found this tool useful, feel free to buy me a coffee.

My name is Alan, and I enjoy building open-source software for computer vision, games, and more. This would be much appreciated during late-night coding sessions!
[Buy me a coffee](https://www.buymeacoffee.com/stereoDrift)