I recently led a project team at ThoughtWorks to create and open source a new Facial Expression Recognition (FER) toolkit named EmoPy. The system achieves accuracy comparable to the best results reported in FER, and is now available for anyone to use for free.
This article explains how the EmoPy system is designed and how it can be used. It will examine the architectures and datasets selected, and illustrate why those choices were made.
Interactions between humans and robots are becoming increasingly ubiquitous. Questions naturally emerge about how human-robot communications can be understood, now and in the future.
Where does communication flow, and where are there ‘gaps’? Further, what is not said in human-robot interactions? Movement and gesture are vital aspects of human-robot interaction (HRI), often conveying shades of unspoken meaning.
Machine learning systems can be trained to recognize emotional expressions from images of human faces, with a high degree of accuracy in many cases.
However, implementation can be a complex and difficult task. The technology is at a relatively early stage, high-quality datasets can be hard to find, and there are various pitfalls to avoid when designing new systems.
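To make the core idea concrete, here is a deliberately simplified sketch of expression classification: represent each face as a vector of pixel intensities, average the training examples for each emotion into a centroid, and label a new face with the nearest centroid. This toy method, the fabricated four-pixel "images", and all function names are illustrative assumptions for this article only; EmoPy itself uses convolutional neural network architectures, not nearest-centroid matching.

```python
import math

def centroid(images):
    """Average a list of equal-length pixel vectors into one vector."""
    n = len(images)
    return [sum(px) / n for px in zip(*images)]

def distance(a, b):
    """Euclidean distance between two pixel vectors."""
    return math.sqrt(sum((x - y) ** 2 for x, y in zip(a, b)))

def train(labelled_images):
    """labelled_images: dict mapping emotion -> list of pixel vectors."""
    return {emotion: centroid(imgs) for emotion, imgs in labelled_images.items()}

def predict(model, image):
    """Return the emotion whose centroid is closest to the input image."""
    return min(model, key=lambda emotion: distance(model[emotion], image))

# Tiny fabricated 4-pixel "images" purely for demonstration.
training = {
    "happiness": [[200, 180, 190, 210], [210, 190, 200, 220]],
    "anger":     [[40, 60, 50, 30], [50, 70, 60, 40]],
}
model = train(training)
print(predict(model, [195, 185, 195, 215]))  # nearest to the happiness centroid
```

Real FER systems replace the pixel-distance step with learned features, which is what makes them robust to lighting, pose, and individual facial differences.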
Artists working with emerging technologies frequently generate new insights on the future of culture, industry and society.
At ThoughtWorks we regularly engage with artists, collaborating on cutting edge technology projects, and enriching the perspectives we bring to our clients. Our recent explorations have taken us on journeys into cyborgism and transhumanism, bias in machine intelligence, movement in robotics, and more.
Waves of technology-driven change now regularly disrupt industry, culture and society. The challenge of navigating this turbulence grows increasingly complex, requiring deep examination of overlapping trajectories in search of hidden insights.
Research of this kind necessitates synthesis from varied disciplines, viewpoints and areas of expertise. However, as the speed of change accelerates, new forms of collaboration are required — forms capable of producing original, far-reaching perspectives from the cutting edge of cultural and technological transformation.
Adrianne has been exploring the expressive potential of the non-humanoid MekaMon robot, manufactured by Reach Robotics. During her residency, Adrianne will be initiating a project to develop a social and psychotherapeutic tool for nonverbal expression through gesture.