A minimal, modular, and scalable toolkit for creating spatial vibrotactile feedback systems. Supports up to 128 synchronized actuators with per-unit microcontrollers and a clean BLE control pipeline. Built for haptics research, VR, robotics, accessibility, and rapid prototyping.
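As a rough illustration of what driving such a pipeline can look like, the sketch below sends one vibration command to a single actuator over BLE using the Python bleak library. The device name, characteristic UUID, and the command byte layout (actuator index, amplitude, duration) are illustrative assumptions, not the toolkit's actual protocol; see the VibraForge GitHub Repository for the real control software and API.

```python
"""Minimal sketch: send one vibration command over BLE.

The device name, UUID, and command byte layout below are assumptions
for illustration only -- consult the VibraForge repository for the
toolkit's actual protocol.
"""
import asyncio
from bleak import BleakClient, BleakScanner

# Hypothetical identifiers -- replace with values from your dev kit.
DEVICE_NAME = "VibraForge"
COMMAND_CHAR_UUID = "0000ffe1-0000-1000-8000-00805f9b34fb"


def make_command(actuator_id: int, amplitude: int, duration_ms: int) -> bytes:
    """Pack an (assumed) 4-byte command: actuator, amplitude, duration."""
    return bytes([
        actuator_id & 0x7F,          # up to 128 actuators -> 7 bits
        amplitude & 0xFF,            # 0-255 vibration strength
        (duration_ms >> 8) & 0xFF,   # duration in ms, big-endian
        duration_ms & 0xFF,
    ])


async def main() -> None:
    device = await BleakScanner.find_device_by_name(DEVICE_NAME)
    if device is None:
        raise RuntimeError(f"No BLE device named {DEVICE_NAME!r} found")
    async with BleakClient(device) as client:
        # Buzz actuator 3 at ~60% strength for 200 ms.
        await client.write_gatt_char(COMMAND_CHAR_UUID, make_command(3, 150, 200))


if __name__ == "__main__":
    asyncio.run(main())
```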
Minimal hardware, modular architecture, and expressive tactile capabilities, designed for rapid R&D and a premium user experience.
Mapping speech phonemes to tactile signatures.
Impact cues, directional hits, rhythmic vibration patterns.
Spatial perception via vibration — direction, distance, urgency.
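To make the spatial encoding concrete, here is one simple way such a mapping could work, assuming a ring of 16 actuators worn around the torso: direction selects which actuator fires, distance scales amplitude, and urgency sets the pulse rate. The layout, value ranges, and scaling are assumptions for illustration, not a prescribed encoding.

```python
from dataclasses import dataclass

NUM_ACTUATORS = 16      # assumed ring of 16 actuators around the torso
MAX_AMPLITUDE = 255     # assumed 8-bit drive strength


@dataclass
class TactileCue:
    actuator_id: int    # which actuator in the ring fires
    amplitude: int      # 0-255, stronger when the target is closer
    pulse_hz: float     # repetition rate, faster when more urgent


def encode_cue(direction_deg: float, distance_m: float, urgency: float) -> TactileCue:
    """Map a spatial event to a vibrotactile cue (illustrative encoding).

    direction_deg: bearing of the target, 0-360 degrees
    distance_m:    distance to the target; nearer means stronger vibration
    urgency:       0.0 (calm) to 1.0 (urgent); more urgent means faster pulses
    """
    actuator_id = round(direction_deg / 360.0 * NUM_ACTUATORS) % NUM_ACTUATORS
    # Amplitude falls off with distance, clamped to a perceptible floor.
    amplitude = int(MAX_AMPLITUDE / (1.0 + distance_m))
    amplitude = max(40, min(MAX_AMPLITUDE, amplitude))
    pulse_hz = 1.0 + 7.0 * max(0.0, min(1.0, urgency))  # 1-8 pulses per second
    return TactileCue(actuator_id, amplitude, pulse_hz)


# Example: something 2 m away, behind the right shoulder, fairly urgent.
print(encode_cue(direction_deg=135.0, distance_m=2.0, urgency=0.8))
```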
A fully featured editor for designing, previewing, and streaming spatial vibrotactile patterns — all in real time.
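A spatial pattern can be thought of as a timed sequence of per-actuator events. The sketch below shows one plausible in-memory representation and a minimal playback loop; the JSON schema, field names, and units are assumptions for illustration and are not the editor's actual file format.

```python
import json
import time

# Illustrative pattern: a pulse sweeping across three actuators.
# The schema (keys and units) is an assumption, not the editor's real format.
PATTERN_JSON = """
{
  "name": "sweep-right",
  "events": [
    {"t_ms": 0,   "actuator": 0, "amplitude": 200, "duration_ms": 120},
    {"t_ms": 150, "actuator": 1, "amplitude": 200, "duration_ms": 120},
    {"t_ms": 300, "actuator": 2, "amplitude": 200, "duration_ms": 120}
  ]
}
"""


def play_pattern(pattern: dict, send) -> None:
    """Step through events in time order, calling send() for each one.

    send(actuator, amplitude, duration_ms) is a placeholder for whatever
    transport you use (for example, the BLE command sketch above).
    """
    start = time.monotonic()
    for event in sorted(pattern["events"], key=lambda e: e["t_ms"]):
        # Wait until this event's timestamp relative to pattern start.
        delay = event["t_ms"] / 1000.0 - (time.monotonic() - start)
        if delay > 0:
            time.sleep(delay)
        send(event["actuator"], event["amplitude"], event["duration_ms"])


if __name__ == "__main__":
    pattern = json.loads(PATTERN_JSON)
    # Dry run: print instead of sending to hardware.
    play_pattern(pattern, lambda a, amp, d: print(f"actuator {a}: amp={amp}, {d} ms"))
```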
Follow these steps to assemble your VibraForge Dev Kit v1 and start playing spatialized vibrotactile patterns. For full details, see the VibraForge User Manual (PDF).
Assembly walkthrough video: Watch on YouTube →
Open-source software & examples: VibraForge GitHub Repository →
Huang, Ren, Luo, Cheng, Cai, Sang, Sousa, Dietz, Wigdor (2024)
Read Paper →
Huang, Wang, Cheng, Ren, Cai, Alvarez Valdivia, Mahadevan, Wigdor (2024)
Read Paper →
We're building a growing community of researchers, creators, engineers, and designers exploring spatial vibrotactile feedback. Contribute your ideas, experiments, or feedback for a chance to receive a FREE VibraForge Dev Kit as a thank-you for helping us push the field forward.
Fill out this short form to tell us about your interests, research area, or project ideas. Selected contributors will receive early-access hardware, software updates, and feature testing opportunities.
Apply to Receive a Free Toolkit →