Dentsu Lab's Project Humanity demonstrates how EMG and brainwave sensors can translate residual muscle signals and neural activity into digital expression, enabling people with severe physical disabilities to interact with software and perform live on stage.
There is a quiet assumption embedded in most of the products we build: that the person using them can move their hands. They can swipe, tap, type, click. The entire paradigm of human-computer interaction has been built around that premise. But for an estimated 200 million people worldwide living with serious physical disabilities, that's not the case.

The Signal in the Silence
ALS – amyotrophic lateral sclerosis – is a progressive neurodegenerative disease in which the motor neurons responsible for voluntary movement gradually degrade. The body becomes immobile, often entirely, while the mind remains fully intact. For most patients, the disease does not take cognition. It takes expression.
The engineers at Dentsu Lab saw this as an interface problem. If the output channel – the body – is compromised, then the solution is to find a new one. Their approach begins with electromyography (EMG) sensors attached directly to the patient's body. These sensors detect the electrical activity generated by even the smallest, most residual muscle contractions. That raw biological data is then processed and mapped onto a digital avatar – a full-body representation in virtual space that mirrors the user's intended movements.
The interface is not symbolic or abstract; it is embodied. The avatar moves because the person's muscles are still trying to move. The system simply makes that intention legible.
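Dentsu Lab has not published its signal-processing pipeline, but EMG-driven control systems typically follow the chain described above: rectify the raw electrical signal, smooth it into an amplitude envelope, then map that envelope onto a control value once it rises above the resting noise floor. A minimal sketch of that standard chain (all function names and thresholds here are illustrative, not Project Humanity's actual code):

```python
import numpy as np

def emg_envelope(raw, window=50):
    """Rectify a raw EMG signal and smooth it with a moving average
    to produce an amplitude envelope -- a standard first step in
    EMG-driven control (the real pipeline is unpublished)."""
    rectified = np.abs(raw - np.mean(raw))   # remove DC offset, full-wave rectify
    kernel = np.ones(window) / window        # moving-average smoothing kernel
    return np.convolve(rectified, kernel, mode="same")

def to_control_signal(envelope, rest_level, max_level):
    """Map the envelope onto a 0..1 control value for an avatar joint,
    clipping activity below the resting noise floor to zero."""
    scaled = (envelope - rest_level) / (max_level - rest_level)
    return np.clip(scaled, 0.0, 1.0)

# Synthetic example: a brief burst of muscle activity over background noise.
rng = np.random.default_rng(0)
signal = rng.normal(0, 0.05, 1000)
signal[400:600] += rng.normal(0, 0.5, 200)   # simulated residual contraction
env = emg_envelope(signal)
control = to_control_signal(env, rest_level=0.05, max_level=0.4)
```

Even a faint residual contraction produces a clear rise in the envelope, which is why this approach can work when voluntary movement is almost entirely gone.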
More Than Accessibility
What makes this technically interesting is that the avatar is not just an avatar – it is a control surface. Once mapped, it can be used to interact with any software: productivity tools, communication platforms, creative applications. The user is not confined to a bespoke assistive tool. They have a general-purpose body in digital space.
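One way to picture "avatar as control surface" is a dispatcher that watches pose features and fires software actions when they cross thresholds; swap the callbacks and the same body drives a chat client, a game, or a productivity tool. This is a hypothetical sketch of the idea, not Dentsu Lab's architecture; the pose features and bindings are invented for illustration:

```python
from dataclasses import dataclass

@dataclass
class AvatarPose:
    # Illustrative pose features; the real system's avatar model is not public.
    right_arm_raise: float   # 0 (lowered) .. 1 (fully raised)
    left_arm_raise: float
    head_turn: float         # -1 (left) .. 1 (right)

def make_dispatcher(bindings):
    """Turn (predicate, action) bindings into a function that fires every
    action whose predicate matches the current pose. Any application can
    be driven this way by supplying its own action callbacks."""
    def dispatch(pose):
        return [action(pose) for predicate, action in bindings if predicate(pose)]
    return dispatch

# Hypothetical bindings: the same gestures could drive a chat app or a game
# simply by changing the callbacks bound to them.
bindings = [
    (lambda p: p.right_arm_raise > 0.8, lambda p: "send_message"),
    (lambda p: p.head_turn < -0.5,      lambda p: "scroll_left"),
]
dispatch = make_dispatcher(bindings)
print(dispatch(AvatarPose(right_arm_raise=0.9, left_arm_raise=0.1, head_turn=0.0)))
# → ['send_message']
```

The point of the indirection is generality: nothing downstream of the dispatcher knows or cares whether the pose came from EMG, brainwaves, or a conventional motion-capture rig.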
Dentsu Lab has already demonstrated this in an e-sports context, running trial events where participants with physical disabilities played games online alongside people without disabilities. The result was instructive: the differences in physical ability became irrelevant at the layer of the interface.
This has direct implications beyond gaming. If the interface can be generalised, it becomes a pathway into the Metaverse for people who currently have no viable route in – opening up educational platforms, remote work environments, and social spaces that are otherwise inaccessible.

Amsterdam, December 2025: A Live Proof of Concept
If the e-sports trial was a proof of function, Waves of Will was a proof of expression. On the evening of 10th December 2025, in collaboration with NTT Inc., Dentsu Lab staged a live dance performance in Amsterdam featuring Breanna Olson – a professional dancer who has been living with ALS.
The technical mechanism here shifted from EMG to brainwave detection. Breanna's neural signals – the electrical activity corresponding to her intention to move, to dance – were captured and translated in real time into choreography performed by her digital avatar on stage.
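How Project Humanity decodes neural intent has not been published, but motor-intent brain-computer interfaces commonly extract band-power features from the EEG, since imagined or attempted movement modulates power in the mu (8–12 Hz) and beta (13–30 Hz) bands. A minimal sketch of that generic feature-extraction step, with synthetic data standing in for real recordings:

```python
import numpy as np

def band_power(eeg, fs, band):
    """Average spectral power of one EEG channel within a frequency band.
    Band-power features like this are a common input to motor-intent
    classifiers; the actual decoding used on stage is unpublished."""
    freqs = np.fft.rfftfreq(len(eeg), d=1.0 / fs)
    power = np.abs(np.fft.rfft(eeg)) ** 2 / len(eeg)
    mask = (freqs >= band[0]) & (freqs < band[1])
    return power[mask].mean()

# Synthetic one-second window at 250 Hz dominated by a 10 Hz (mu-band) rhythm.
fs = 250
t = np.arange(fs) / fs
eeg = np.sin(2 * np.pi * 10 * t) + 0.1 * np.random.default_rng(1).normal(size=fs)
mu = band_power(eeg, fs, (8, 12))
beta = band_power(eeg, fs, (13, 30))
```

In a real-time system, windows like this would be computed continuously and fed to a classifier whose output drives the avatar's choreography.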
In an interview with BBC News, Breanna described the experience as "exhilarating" and "magical" – watching herself, in virtual form, take to the stage once more.

The performance wasn't just a technical demonstration; it was a philosophical statement about what digital interfaces can become when we stop assuming everyone has the same physical capabilities. The technology doesn't just restore function – it creates new forms of expression that weren't possible before.
What's particularly compelling about this approach is how it reframes the entire conversation around accessibility. Instead of building separate tools for people with disabilities, Dentsu Lab is creating interfaces that work for everyone by design. The same system that allows someone with ALS to control a computer can also be used by someone without disabilities who wants a different interaction model.
This represents a fundamental shift from accessibility as an afterthought to accessibility as the primary design principle. When you build for the most constrained users first, you often end up with solutions that are better for everyone.
The implications extend far beyond individual empowerment. As we move toward increasingly immersive digital environments – whether we call it the Metaverse, spatial computing, or something else – the question of who gets to participate becomes crucial. Current interfaces exclude millions of people from these emerging spaces. Technologies like Dentsu Lab's create pathways for full participation, ensuring that the digital future isn't just built for those who can manipulate traditional input devices.
What makes this work powerful is that it restores something deeply human. Dance, expression, creativity – these aren't luxuries; they're fundamental to the human experience. When technology returns those capabilities to people who have lost them, it crosses from being merely useful to being transformative.
The question now isn't whether this technology works – the Amsterdam performance proved that it does. The question is how quickly we can scale it, refine it, and integrate it into the tools and platforms that people use every day. Because if we get this right, we won't just be building better interfaces for people with disabilities. We'll be building better interfaces for everyone.
