It is surprisingly beautiful how binary digits can be transformed into pleasurable experiences. Watching colors blend, listening to music, and seeing visualizations animate make us feel happy. But I believe something is missing: an emotional connection in the way we interact to produce them. Simply clicking a mouse or scrolling on a trackpad lacks the connection that our intent and the resulting feedback deserve. If we add natural ways to control these reactions with our body movements, they become a delightful experience.
Illumination Pinch is inspired by the generative art techniques Etienne Jacob uses to create GIFs. Most of his artworks are illusions that represent dynamic systems. In this experiment, the visualization is nothing more than a normal distribution. Users move their thumb and index finger across the screen, and when they pinch, a circular grid of circles lights up at that point and ripples away radially. As the lights ripple outward, they change their size and opacity to create the illusion of the circles actually moving outwards.
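Below is a minimal p5.js sketch of how this pinch-and-ripple loop could look. It assumes ml5's handPose model with keypoints named 'thumb_tip' and 'index_finger_tip' (the exact API differs across ml5 versions), and the pinch threshold, ripple speed, and Gaussian width are illustrative values rather than the ones used in the experiment.

```javascript
let handPose, video;
let hands = [];
let ripples = [];        // each ripple: { x, y, t } where t counts frames since the pinch
let wasPinched = false;  // so one pinch spawns exactly one ripple

function preload() {
  handPose = ml5.handPose();   // ml5 v1-style constructor (an assumption)
}

function setup() {
  createCanvas(640, 480);
  video = createCapture(VIDEO);
  video.size(width, height);
  video.hide();
  handPose.detectStart(video, results => { hands = results; });
}

function draw() {
  background(0);

  // Pinch detection: thumb tip and index tip closer than a small threshold.
  let pinched = false;
  if (hands.length > 0) {
    const kp = hands[0].keypoints;
    const thumb = kp.find(p => p.name === 'thumb_tip');
    const index = kp.find(p => p.name === 'index_finger_tip');
    if (thumb && index && dist(thumb.x, thumb.y, index.x, index.y) < 25) {
      pinched = true;
      if (!wasPinched) {
        ripples.push({ x: (thumb.x + index.x) / 2, y: (thumb.y + index.y) / 2, t: 0 });
      }
    }
  }
  wasPinched = pinched;

  // Polar grid of dots; each ripple is an expanding ring with a Gaussian
  // (normal-distribution) profile that brightens and enlarges the dots it passes.
  noStroke();
  for (let r = 20; r < width / 2; r += 20) {
    for (let a = 0; a < TWO_PI; a += TWO_PI / 36) {
      const x = width / 2 + r * cos(a);
      const y = height / 2 + r * sin(a);
      let size = 4, alpha = 60;
      for (const rp of ripples) {
        const wave = exp(-pow(dist(x, y, rp.x, rp.y) - rp.t * 4, 2) / 400);
        size += 8 * wave;
        alpha += 195 * wave;
      }
      fill(255, alpha);
      circle(x, y, size);
    }
  }

  ripples.forEach(rp => rp.t++);
  ripples = ripples.filter(rp => rp.t < 200);   // drop ripples once they have faded
}
```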
Current design trends lean towards layered combinations of radial and/or linear gradients. Making gradients quickly is difficult; I kept looking for tools but always ended up using background images from a Google search. This experiment tries to solve that challenge. The interaction is also inspired by how artists create abstract art by splashing paint onto a canvas.
In this study, I worked on a way to generate new gradients by splashing colors onto the screen. Built with p5.js, it uses ml5.js's handPose model to track one hand's movements. When the user spreads their fingers (a "splash" on the screen), a new color is added to the canvas as a layered radial gradient that blends outwards.
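Here is a rough sketch of the splash detection and gradient stamping, under the same handPose keypoint-name assumptions as above. The spread thresholds and the concentric-circle gradient are my own approximation of a layered radial gradient, not the original implementation.

```javascript
let handPose, video;
let hands = [];
let splashed = false;   // re-armed when the hand closes, so one spread = one splash

function preload() {
  handPose = ml5.handPose();
}

function setup() {
  createCanvas(640, 480);
  video = createCapture(VIDEO);
  video.size(width, height);
  video.hide();
  handPose.detectStart(video, results => { hands = results; });
  background(20);       // gradients accumulate on top of this
}

function draw() {
  if (hands.length === 0) { splashed = false; return; }
  const kp = hands[0].keypoints;
  const wrist = kp.find(p => p.name === 'wrist');
  const tips = ['thumb_tip', 'index_finger_tip', 'middle_finger_tip',
                'ring_finger_tip', 'pinky_finger_tip']
    .map(n => kp.find(p => p.name === n))
    .filter(Boolean);
  if (!wrist || tips.length < 5) return;

  // "Spread" heuristic: average fingertip distance from the wrist.
  const spread = tips.reduce((s, t) => s + dist(t.x, t.y, wrist.x, wrist.y), 0) / tips.length;

  if (spread > 180 && !splashed) {   // threshold picked by eye, not from the original
    splashGradient(wrist.x, wrist.y, random(120, 220));
    splashed = true;
  } else if (spread < 140) {
    splashed = false;
  }
}

// Layer translucent concentric circles so a random color fades outward,
// approximating a radial gradient that blends into whatever is underneath.
function splashGradient(x, y, radius) {
  noStroke();
  const c = color(random(255), random(255), random(255));
  for (let r = radius; r > 0; r -= 2) {
    c.setAlpha(map(r, 0, radius, 60, 0));   // most opaque at the center
    fill(c);
    circle(x, y, r * 2);
  }
}
```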
This experiment aims to control the song visualizations we usually find in media players like VLC and Windows Media Player. The mechanics are based on sine wave structures and are inspired by a "Colorful Coding" video on YouTube I watched a while ago.
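As a point of reference, here is a generic sine-wave ring in p5.js, just to show the kind of structure the visualization builds on; it is not the exact construction from the Colorful Coding video.

```javascript
let angleOffset = 0;   // rotation of the whole structure
let phase = 0;         // position along the sine wave propagation

function setup() {
  createCanvas(640, 640);
}

function draw() {
  background(10);
  translate(width / 2, height / 2);
  rotate(angleOffset);

  // The ring's radius is modulated by a travelling sine wave.
  noFill();
  stroke(255);
  beginShape();
  for (let a = 0; a < TWO_PI; a += 0.02) {
    const r = 180 + 40 * sin(6 * a + phase);
    vertex(r * cos(a), r * sin(a));
  }
  endShape(CLOSE);

  angleOffset += 0.01;  // later driven by horizontal wrist speed
  phase += 0.05;        // later driven by vertical wrist position
}
```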
In this setup, the visualization reacts to the user's wrists. Using ml5.js's poseNet model, a swipe is detected when the user quickly moves both arms together to the left or right, and the rotation speed of the visualization depends on the speed of that hand movement. Users can also move their hands up and down, which makes the illustration loop through the sine wave propagation, providing a spring-like feedback.
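A sketch of how this wrist control could be wired in with ml5's poseNet (the 0.x-style API); it tracks a single wrist for simplicity, and the velocity threshold, scaling factors, and easing are guesses rather than the values used in the experiment.

```javascript
let video, poseNet;
let pose = null;
let prevWristX = null;
let rotationSpeed = 0.01;
let angleOffset = 0;
let phase = 0;

function setup() {
  createCanvas(640, 640);
  video = createCapture(VIDEO);
  video.size(width, height);
  video.hide();
  poseNet = ml5.poseNet(video);               // ml5 0.x-style poseNet
  poseNet.on('pose', results => {
    if (results.length > 0) pose = results[0].pose;
  });
}

function draw() {
  background(10);

  if (pose && pose.rightWrist.confidence > 0.3) {
    const wx = pose.rightWrist.x;
    const wy = pose.rightWrist.y;

    // Horizontal swipe: the faster the wrist moves, the faster the rotation.
    if (prevWristX !== null) {
      const vx = wx - prevWristX;             // pixels per frame
      if (abs(vx) > 5) rotationSpeed = constrain(vx * 0.002, -0.1, 0.1);
    }
    prevWristX = wx;

    // Vertical position scrubs through the wave propagation; lerp-ing toward
    // the target gives the springy, eased feel.
    phase = lerp(phase, map(wy, 0, height, 0, TWO_PI * 2), 0.1);
  }

  angleOffset += rotationSpeed;
  drawSineRing();
}

// Same sine-wave ring as in the previous sketch.
function drawSineRing() {
  push();
  translate(width / 2, height / 2);
  rotate(angleOffset);
  noFill();
  stroke(255);
  beginShape();
  for (let a = 0; a < TWO_PI; a += 0.02) {
    const r = 180 + 40 * sin(6 * a + phase);
    vertex(r * cos(a), r * sin(a));
  }
  endShape(CLOSE);
  pop();
}
```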