Interaction Design Institute Ivrea :: Belmer Negrillo
On the Body
Professor :: Camille Norment

Final Project

















Concept Description
At nightclubs, the significance and quality of VJs are increasing every day, and VJing is becoming a respected job thanks to the maturing of the professionals involved, in the same fashion as happened with DJs in the 80's. At the same time, the audience increasingly perceives a lack of interaction, or misses the stage performance found in live concerts. The usual nightclub setup (DJ with turntables, VJ with laptop and mixer, and a large screen whose images dominate the visual attention) may give the impression that there are no humans using their skills to do things in real time.

On the other side, it is also frustrating for the VJ to be attached to a computer, constrained and in an introspective position, while having to generate an expansive, exciting and extroverted video output. Often there is a strong desire to step down from the stage and join the crowd on the dance floor, ideally still keeping control over the video, but from a different point of view. Exploring it even further, it would be nice to leverage the energy of the dancing audience and share the video control with the crowd for a moment.

The proposed solution, called GoDance, is a mixing tool for VJs that allows wireless, gesture-based control of video playback (instead of the usual keyboard, mouse and sliders). A secondary feature, a limited "audience mode", allows people in the audience to wear the tool and interact with the video on the big screen without elaborate instructions.

Features Specification

  • This tool should allow more natural or intuitive movements for controlling the different properties of the video/sound.
  • Be flexible enough to let the user wear a different garment each time, according to personal taste.
  • Allow free movement and remain comfortable during sustained use.
  • Have a simple version/user mode that can be easily used by anyone without further explanations.
  • Maybe have a professional version/user mode that allows more complex or specific controls.
  • Provide real-time feedback: a gesture generates a corresponding action in the video, and repeating the same gesture must produce the same action again.
  • Avoid getting lost in dozens of sensors and an overly complex algorithm trying to recognize every subtle nuance of human gesture.
  • Be a conceptual device at this stage, but keep in mind that the next step will be to build it for real use.

With these constraints in mind, and after some exploration (ex3, ex4, Dance Positions) and references, the Wrist Brace VJ arose as the best feasible solution.

The prototype consists of a wrist brace and 8 RFID buttons with icons that identify basic VJing commands. The buttons are attached to the VJ's garment using pins or Velcro. The Velcro is especially useful when attaching the buttons to the wrist, ankles, or other bare parts of the body. For the communication of commands, a wireless receiver box is connected by USB to the base computer, which sends MIDI signals to the software.

The professional user (VJ) wears the wrist brace tool and places the buttons in key positions on the body, according to convenience and personal style. The buttons represent video commands (like Play or Increase speed) that can trigger events or toggle states. The VJ can change the icons on the buttons and remap the MIDI signal interpretation in the software according to a personal mixing style. Besides the binary button commands, the VJ can use arm and body movements (which move the wrist brace) to produce analog, continuous changes of video properties.
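As a sketch of the analog side of this control, the mapping from a wrist-brace tilt to a continuous MIDI value could look like the following. This is illustrative only: the normalized range, the CC number, and the channel are assumptions, not the actual prototype values.

```python
def tilt_to_cc(tilt, cc_number=1):
    """Map a normalized tilt reading in [-1.0, 1.0] from the wrist-brace
    accelerometer to a 7-bit MIDI Control Change value in [0, 127].
    The CC number and the normalization range are illustrative assumptions."""
    tilt = max(-1.0, min(1.0, tilt))           # clamp out-of-range readings
    value = int(round((tilt + 1.0) / 2.0 * 127))
    return (0xB0, cc_number, value)            # Control Change, MIDI channel 1
```

Arm fully down would then send value 0, arm fully up value 127, with intermediate positions producing a smooth sweep that the video software can attach to speed, opacity, or any other continuous property.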

When the wrist brace touches a button, it reads the button's RFID tag. The wearable tool also continuously reads and calibrates the accelerometer signals. When events or significant changes are detected, the information is transmitted to the base.
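The "calibrate and transmit only significant changes" idea could be sketched as below. The sliding-window size and the change threshold are assumptions for illustration, not measured prototype parameters.

```python
class AccelTracker:
    """Track a running baseline for one accelerometer axis and report
    only significant changes, so the radio link is not flooded.
    Window size and threshold are illustrative assumptions."""

    def __init__(self, window=16, threshold=0.2):
        self.window = window
        self.threshold = threshold
        self.samples = []
        self.last_sent = None

    def read(self, raw):
        # Keep a sliding window of recent samples to estimate the baseline.
        self.samples.append(raw)
        if len(self.samples) > self.window:
            self.samples.pop(0)
        baseline = sum(self.samples) / len(self.samples)
        deviation = raw - baseline
        # Transmit only when the deviation moved enough since the last send.
        if self.last_sent is None or abs(deviation - self.last_sent) >= self.threshold:
            self.last_sent = deviation
            return deviation   # value to transmit to the base box
        return None            # suppressed: no significant change
```

Recalibrating against a running baseline also compensates for slow drift, such as the wearer standing at a slightly different angle.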

All combined, the VJ can leave his lonely role behind the computer screen and join the crowd, executing commands by dancing and touching parts of his body with the wrist brace.

Target User Description
This tool is targeted at professional and amateur VJs. These users are familiar with video software and with the core commands used for mixing. A considerable part is familiar with MIDI input devices, reducing the learning curve for this new tool. Shy VJs can use the tool to receive inputs from the dance floor, or simply to avoid being on the stage :-) .
It can also be used in audio/video performances, as it is simply a MIDI input device.


Technical Description
The Wrist Brace VJ has three main elements in the control workflow: the pin-buttons, the wrist brace and the base box that connects to the computer.
The pin-buttons are read-only RFID tags with unique numbers that are assigned to specific commands.
In the wrist brace, the ECM110-02 RFID reader reads the pin number and sends it to the PIC16F628 microcontroller, which passes the information on to the RF transceiver. The PIC also sends out the inputs from the ADXL210E accelerometer, which is sensitive to vertical and horizontal movements: the two axes defined by the standing body and the outstretched arms.
The data sent from the wrist brace transceiver arrives at another transceiver in the base box. There, the information is processed by software on a PIC16F876, interpreted and converted to MIDI signals.
The MIDI signals are then transmitted via USB to the computer and recognized by the software the VJ uses. VJs can use the suggested default assignment of control commands or configure it according to personal style.

Storyboard Scenario
The initial step was to map the possible commands that can be set with this tool. This exploration is documented in the Dance Positions Study page.

Video Scenario
Click here to see the video GoDance!, a wearable tool for VJs.

Process Pictures
Click here to see images of the creation process.



01 - Personal Items On and Near the Body
02 - Hybridization
03 - Considering Technological Constraints
04 - Assigning Technology to the Interaction



Creative Commons License
This work is licensed under a Creative Commons Attribution-NonCommercial 2.5 License.