In the field of high-end equipment manufacturing, humanoid robots have long been in the spotlight. To date, humanoid robots have mainly been tested in physical tasks that assist humans in daily activities, such as carrying objects, sampling in hazardous environments, caring for the elderly, or serving as physical therapy assistants. However, their potential to perform expressive physical tasks in creative disciplines (such as playing musical instruments or participating in performing arts) has rarely been explored. Recently, researchers from SUPSI, IDSIA, and Politecnico di Milano have introduced a new humanoid robot named "Robot Drummer," opening a new path for the application of humanoid robots in creative performance.

The idea for "Robot Drummer" originated from a casual coffee conversation between the paper's first author, Asad Ali Shahid, and co-author Loris Roveda. They discussed how humanoid robots, although increasingly proficient in various tasks, rarely venture into creative and expressive domains, leading to an intriguing question: What if humanoid robots could take on creative roles, such as playing music? Drumming, with its strong sense of rhythm, flexible limb movements, and need for rapid coordination, became the perfect frontier for their exploration.
The main goal of Shahid and his colleagues was to develop a machine-learning-based system that enables a humanoid robot to play entire musical pieces and exhibit rhythmic skills similar to those of human drummers. Using the Unitree G1 humanoid robot as the platform, they successfully developed and simulated the "Robot Drummer" system. The system represents each piece of music as a sequence of precisely timed contact events, termed a rhythmic contact chain, in which each contact target tells the robot when to strike and which drum to hit. Through practice and refinement in a simulated environment, the robot gradually learned human-like behaviors, including dynamically switching drumsticks, crossing its arms to reach different drums, and optimizing its movements to match the rhythm.
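To make the idea of a rhythmic contact chain concrete, here is a minimal sketch of how such a score could be represented: a time-ordered list of (time, drum) contact targets, plus a simple reward that decays as a strike lands further from its target time. The class and function names, and the Gaussian reward shape, are illustrative assumptions for this article, not the paper's exact design.

```python
import math
from dataclasses import dataclass


@dataclass(frozen=True)
class ContactEvent:
    """One target in the rhythmic contact chain (names are illustrative)."""
    time_s: float  # when the strike should land, in seconds from song start
    drum: str      # which drum to hit, e.g. "snare", "hi-hat", "kick"


def contact_chain_from_score(score):
    """Sort raw (time, drum) pairs into a time-ordered contact chain."""
    return [ContactEvent(t, d) for t, d in sorted(score)]


def timing_reward(target_time, hit_time, sigma=0.05):
    """Assumed reward shape: 1.0 for a perfectly timed hit, decaying
    smoothly (Gaussian) as the timing error grows."""
    err = hit_time - target_time
    return math.exp(-(err * err) / (2.0 * sigma * sigma))


chain = contact_chain_from_score([(1.0, "snare"), (0.5, "hi-hat"), (1.5, "kick")])
```

During training, a reinforcement-learning policy would be rewarded at each target event according to how closely the robot's actual strike matches the event's time and drum; the sketch above only fixes the data representation and reward shape, not the learning algorithm itself.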
The researchers tested the simulated Unitree G1's ability to play popular songs spanning genres such as jazz, rock, and metal, including Linkin Park's In the End, Dave Brubeck's Take Five, and Bon Jovi's Livin' on a Prayer. The results showed that the robot could effectively learn complex rhythmic structures and perform the songs with high precision, with rhythmic accuracy typically exceeding 90%. Even more impressive, the robot also discovered human-like strategies, such as planning upcoming strikes, executing cross-arm hits, and dynamically reassigning drumsticks between hands.
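A rhythmic-accuracy figure like the one above implies some rule for deciding which strikes count as correct. One plausible way to compute it (the paper's exact metric may differ) is to match each target event to a strike on the same drum within a small timing tolerance, with each strike allowed to satisfy at most one target:

```python
def rhythmic_accuracy(targets, hits, tol=0.05):
    """Fraction of target (time, drum) events matched by a strike on the
    same drum within +/- tol seconds. Each strike can match only one
    target. An illustrative metric, not necessarily the paper's own."""
    remaining = list(hits)
    matched = 0
    for t_time, t_drum in targets:
        best = None
        for h_time, h_drum in remaining:
            close_enough = h_drum == t_drum and abs(h_time - t_time) <= tol
            if close_enough and (best is None or
                                 abs(h_time - t_time) < abs(best[0] - t_time)):
                best = (h_time, h_drum)
        if best is not None:
            remaining.remove(best)  # a strike may only be counted once
            matched += 1
    return matched / len(targets) if targets else 1.0
```

For example, if the robot lands the hi-hat and snare strikes a few hundredths of a second off but misses the kick entirely, this metric would report an accuracy of two out of three.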
The success of "Robot Drummer" not only demonstrates the potential of humanoid robots in creative performance but may also have significant value for the entertainment industry. Shahid stated that, in the long term, robot drummers could pave the way for robotic performers to accompany live bands on stage and provide a framework for teaching precise timing skills in fields beyond music. In addition, this latest research may inspire other research teams to develop machine-learning-based tools that enable humanoid robots to play musical instruments or engage in other performing arts.
Looking to the future, Shahid revealed the team's ambitious plans. Their next step is to bring "Robot Drummer" into the real world and transfer the learned skills to actual hardware. At the same time, they plan to teach the robot to improvise and adjust its playing style so that it can respond in real time to musical cues, "feeling" the music and reacting like a human drummer. The realization of this plan will undoubtedly bring even broader prospects for the application of humanoid robots in creative performance.
