you to render digital touch for any kind of haptics device in the world,” explains the head of the company, Eric Vezzoli. “The problem we’re solving is that the haptics market is fragmented. You have several different technologies, each with their own way of doing the design and rendering. The way you code haptics for a phone is different from how you code for a controller, which is different from code for an exoskeleton. The market doesn’t develop, because it’s too much of an investment to create a piece of software that isn’t multiplatform.” Interhaptics is also working with standards bodies, including the Haptics Industry Forum, to try to define common practices for haptics content, but Vezzoli believes it will be years before they finally come to any
solid agreements on interoperability. “We have started defining the value chain in haptics – who does what and how – but it will take a few years to establish the standards, execute them and make them useful. The technology needs to be there first, so we’re offering this opportunity to device manufacturers and developers to create haptic-enabled content that works on multiple platforms.”

The Interhaptics project came out of academia and an attempt to build a language for haptics from scratch. Treading the same paths as audio and video technologists of past generations, the team tried to define the basic human perceptions around touch. “For video, they started with three colours and a single pixel that can represent any one of those colours. For touch, we asked: what are the important haptic sensations that need to be encoded for multiple haptics devices to work? It is not a settled argument, but it more or less came down to four perceptions – force, texture, vibration and temperature,” explains Vezzoli. Using these four parameters, Interhaptics software simplifies the haptics design process and allows simple export of haptic tracks for each parameter, which can be embedded and triggered as necessary, defined by location (eg texture) or time (eg vibration) and, importantly, can be easily tweaked and edited.

THE HUMAN-MACHINE-HUMAN INTERFACE
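To make the four-perception track model concrete, here is a minimal sketch of what such a data structure might look like — a hypothetical illustration, not Interhaptics’ actual API. All names (`Perception`, `HapticTrack`, `Keyframe`, `sample`) are invented for this example; the only assumptions taken from the text are the four perception types and the split between location-triggered and time-triggered tracks.

```python
# Hypothetical sketch of a four-perception haptic track model -- not the
# Interhaptics API. Each track encodes one perception and is triggered
# either by location (e.g. a texture mapped along a surface) or by time
# (e.g. a vibration envelope).
from dataclasses import dataclass, field
from enum import Enum

class Perception(Enum):
    FORCE = "force"
    TEXTURE = "texture"
    VIBRATION = "vibration"
    TEMPERATURE = "temperature"

@dataclass
class Keyframe:
    at: float         # position (spatial track) or seconds (temporal track)
    intensity: float  # normalised 0.0-1.0; a renderer maps this per device

@dataclass
class HapticTrack:
    perception: Perception
    spatial: bool  # True: triggered by location; False: triggered by time
    keyframes: list[Keyframe] = field(default_factory=list)

    def sample(self, at: float) -> float:
        """Linearly interpolate intensity at a given position or time."""
        if not self.keyframes:
            return 0.0
        kfs = sorted(self.keyframes, key=lambda k: k.at)
        if at <= kfs[0].at:
            return kfs[0].intensity
        for a, b in zip(kfs, kfs[1:]):
            if at <= b.at:
                t = (at - a.at) / (b.at - a.at)
                return a.intensity + t * (b.intensity - a.intensity)
        return kfs[-1].intensity

# A time-triggered vibration that ramps up, then fades:
buzz = HapticTrack(Perception.VIBRATION, spatial=False,
                   keyframes=[Keyframe(0.0, 0.0),
                              Keyframe(0.5, 1.0),
                              Keyframe(1.0, 0.2)])
print(buzz.sample(0.25))  # halfway up the ramp -> 0.5
```

Because each track is just keyframes over one perception, editing it means moving or re-weighting points rather than rewriting device code — which is what makes per-platform rendering from a single design plausible.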
Densitron has been in the media tech business for almost 40 years, primarily as a developer of display technology, producing early LED displays and mobile devices. But as its business has grown, so has its notion of what ‘display’ really means. Densitron is working to layer new technologies on to its display systems, including a full gamut of HMI – that’s human-machine interface – technologies. “We identified that HMI is going through kind of a rebirth,” says Matej Gutman, Densitron’s technical director for embedded tech. “A display used to be a simple device that would show you pieces of information. Displays then evolved into touch surfaces, then with mobile, you have tactile feedback now. There’s an evolution going on. Displays are becoming more and more intelligent and more things are being packed inside the word ‘display.’” Gutman points out that human-machine interfaces have been a reality for many years – since a human first used a rock as a hammer, one could argue – but what is newer, and more interesting in today’s world, are human-machine-human interfaces.