
Step On Me

2024

Experience · Physical Computing

I created a pair of shoes that lets the user step on their own face.

 

This system uses webcam-based image recognition to reconstruct the user's face in a Unity environment and display it on a monitor. When the user steps on the white shoe, the action is mirrored in Unity as a virtual foot movement, making it appear as though the face on the monitor is being stepped on and shattered. In addition, a box-shaped device spins physical shoelaces while shoelaces accumulate on the monitor; together, these visualize the total number of times the face has been stepped on.
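
The display-side flow described above can be sketched as follows. The actual project runs in Unity; this Python analogue, with hypothetical class and method names, only illustrates how a single step event drives both the shatter effect and the persistent shoelace count:

```python
class FaceDisplay:
    """Illustrative stand-in for the Unity scene logic (hypothetical names).

    Each step event shatters the on-screen face, signals the physical
    shoelace box, and adds one to the accumulated step total.
    """

    def __init__(self):
        self.step_count = 0  # total times the face has been stepped on
        self.effects = []    # log of triggered effects, for illustration

    def on_step(self):
        self.step_count += 1
        self.effects.append("shatter_face")       # play the shatter animation
        self.effects.append("spin_shoelace_box")  # signal the physical device
        return self.step_count


display = FaceDisplay()
for _ in range(3):
    display.on_step()
print(display.step_count)  # 3
```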

Concept

In Japan, there is a custom that says, "It's good to step on new shoes," while there is also a saying, "The person whose foot is stepped on will remember it for life." These expressions suggest that the act of stepping holds special significance. By using "stepping" as an action to reconnect with bodily awareness, which is often overlooked in daily life, I sought to express the experience of "feeling discomfort by stepping on something yourself."

Initial System Diagram

Technical Implementation

The entire system is divided into the following three parts:

 

  1. A shoe with an FSR (Force-Sensitive Resistor) switch: Sensor data is generated when the user steps on the shoe. The signal is sent to component 3 via Bluetooth using an ESP32.
  2. A box with shoelaces attached: The sensor data is converted into physical output. A stepper motor, 3D printer, and laser cutter were used to create this component.
  3. Unity-based software, monitor, and webcam: The user's face is recognized using OpenCV for Unity and the Dlib Face Landmark Detector. Combined with the sensor input from component 1, this data is converted into digital output, and signals are sent to component 2. Blender was used to create the shoe animations.

Overall System Diagram
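
The sensor side of component 1 can be sketched as follows. This is a Python analogue of the ESP32 firmware logic, not the project's actual code; the threshold value, debounce interval, and function name are assumptions. It converts raw FSR readings into discrete step events on the rising edge past a threshold, ignoring contact bounce:

```python
STEP_THRESHOLD = 2000  # assumed ADC value counted as a step (ESP32 ADC is 0-4095)
DEBOUNCE_MS = 200      # assumed minimum gap between distinct steps

def detect_steps(samples):
    """Convert (timestamp_ms, adc_value) samples into step-event timestamps.

    A step fires when the reading first rises past the threshold; repeat
    triggers inside the debounce window are ignored.
    """
    events = []
    pressed = False
    last_step_ms = -DEBOUNCE_MS
    for t_ms, value in samples:
        if value >= STEP_THRESHOLD and not pressed:
            pressed = True
            if t_ms - last_step_ms >= DEBOUNCE_MS:
                events.append(t_ms)  # would be sent to component 3 via Bluetooth
                last_step_ms = t_ms
        elif value < STEP_THRESHOLD:
            pressed = False
    return events

# Two clean presses plus one quick re-trigger that should be ignored:
samples = [(0, 100), (50, 3000), (100, 100), (120, 3000), (140, 100), (400, 3500)]
print(detect_steps(samples))  # [50, 400]
```

On the real device, each detected event would be transmitted to the Unity software (component 3) over Bluetooth, which in turn drives the stepper motor in the shoelace box (component 2).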