
Audio-Reactive Particles

This project showcases an interactive, visually dynamic particle system created in TouchDesigner that reacts to audio and sound in real time.

PROJECT BRIEF

CLIENT

N/A

YEAR

2024

SHOWCASES

Interaction Design  |  Visual Experience  |  TouchDesigner


The core idea behind this project was to explore the interaction between audio signals and visual elements. The particle system dynamically changes its behavior, shape, and color based on the audio frequencies and amplitudes it receives. Lower frequencies create smoother, flowing motions, while higher frequencies generate sharper and more energetic movements, resulting in a captivating visual symphony that mirrors the rhythm and tone of the sound.
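The frequency-to-motion relationship described above can be sketched outside TouchDesigner as well. The snippet below is a hypothetical Python/NumPy illustration, not the project's actual node network: it computes the spectral centroid (the amplitude-weighted mean frequency), a single number that could drive a smooth-versus-sharp motion parameter. Bass-heavy audio yields a low centroid; bright, percussive audio yields a high one.

```python
import numpy as np

def spectral_centroid(samples, sample_rate):
    """Return the spectral centroid in Hz: the amplitude-weighted mean frequency.

    A low centroid (bass-heavy audio) would map to smooth, flowing particle
    motion; a high centroid (bright audio) to sharper, more energetic movement.
    """
    spectrum = np.abs(np.fft.rfft(samples))
    freqs = np.fft.rfftfreq(len(samples), d=1.0 / sample_rate)
    if spectrum.sum() == 0:
        return 0.0
    return float((freqs * spectrum).sum() / spectrum.sum())

# One second of a pure tone: the centroid sits at the tone's frequency.
sr = 44100
t = np.arange(sr) / sr
low = spectral_centroid(np.sin(2 * np.pi * 100 * t), sr)    # bass-like input
high = spectral_centroid(np.sin(2 * np.pi * 4000 * t), sr)  # bright input
```

In the actual piece this role is played by TouchDesigner's audio-analysis operators rather than NumPy, but the mapping idea is the same.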

PROCESS

  • The system was built from several of TouchDesigner's node families: audio CHOPs for analysis, and particle and render operators for the visuals.

  • Audio analysis nodes extracted key data points, such as frequency ranges and amplitude levels, from the sound input.

  • This data was mapped to control parameters of the particle system, including speed, direction, color gradients, and density.

  • Optical flow and transform operations further enhanced the fluidity of the visuals, creating a seamless connection between the auditory and visual elements.
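The mapping step in the list above can be sketched as follows. This is a minimal, hypothetical Python/NumPy sketch of the logic (function names like `map_to_particles` are illustrative, not TouchDesigner operators): spectral levels in low/mid/high bands plus overall loudness are translated into the kind of particle parameters listed above.

```python
import numpy as np

def band_levels(samples, sample_rate):
    """Average spectral magnitude in low / mid / high frequency bands."""
    spectrum = np.abs(np.fft.rfft(samples))
    freqs = np.fft.rfftfreq(len(samples), d=1.0 / sample_rate)
    levels = []
    for lo, hi in ((20, 250), (250, 2000), (2000, 8000)):
        mask = (freqs >= lo) & (freqs < hi)
        levels.append(float(spectrum[mask].mean()) if mask.any() else 0.0)
    return levels  # [low, mid, high]

def map_to_particles(low, mid, high, rms):
    """Translate audio features into hypothetical particle-system parameters."""
    total = low + mid + high + 1e-9  # avoid division by zero on silence
    return {
        "speed": 0.5 + 4.0 * rms,           # louder audio -> faster particles
        "turbulence": high / total,         # bright audio -> sharper motion
        "size": 1.0 + 2.0 * (low / total),  # bass -> larger, smoother shapes
        "hue": mid / total,                 # mids shift the colour gradient
    }

# A bass-heavy test signal should yield large, smooth, low-turbulence particles.
sr = 44100
t = np.arange(sr) / sr
bass = np.sin(2 * np.pi * 80 * t)
low, mid, high = band_levels(bass, sr)
params = map_to_particles(low, mid, high, rms=float(np.sqrt(np.mean(bass ** 2))))
```

In TouchDesigner this mapping would live in CHOP networks feeding the particle system's parameters each frame, rather than in Python functions.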

Software Used

TouchDesigner is a node-based, real-time visual programming environment primarily used for creating interactive multimedia experiences. It is a versatile tool for artists, programmers, and designers to build performances, installations, and fixed-media works. Key applications include real-time 3D compositing, projection mapping, application building, and interactive media systems.

Project Images

Installation & Projection

The final result is a mesmerizing visual experience where the particles move, swirl, and interact in harmony with the audio input. This project demonstrates a deep understanding of procedural design, real-time data processing, and creative coding. It also highlights the potential of TouchDesigner as a powerful tool for audio-reactive visualizations.



