Up until now, our communication with machines has always been limited to conscious and direct forms. Whether it’s something simple like turning on the lights with a switch, or even as complex as programming robotics, we have always had to give a command to a machine, or even a series of commands, in order for it to do something for us. Communication between people, on the other hand, is far more complex and a lot more interesting, because we take into account so much more than what is explicitly expressed. We observe facial expressions and body language, and we can intuit feelings and emotions from our dialogue with one another. This actually forms a large part of our decision-making process. Our vision is to bring this whole new realm of human interaction into human-computer interaction, so that computers can understand not only what you direct them to do, but also respond to your facial expressions and emotional experiences. And what better way to do this than by interpreting the signals naturally produced by our brain, our center for control and experience.
So I’d like to show you a few examples, because there are many possible applications for this new interface. In games and virtual worlds, for example, your facial expressions can naturally and intuitively be used to control an avatar or virtual character. You can experience the fantasy of magic and control the world with your mind. And colors, lighting, sound, and effects can dynamically respond to your emotional state to heighten the experience you’re having, in real time. Moving on to some applications developed by developers and researchers around the world, with robots and simple machines, for example—in this case, flying a toy helicopter simply by thinking “lift” with your mind. The technology can also be applied to real-world applications—in this example, a smart home: from the user interface of the control system, to opening or closing the curtains, and of course to the lighting, turning it on or off. And finally, to real life-changing applications, such as being able to control an electric wheelchair. In this example, facial expressions are mapped to the movement commands.
[Video] Man: Now blink right to go right. Now blink left to turn back left. Now smile to go straight.
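The wheelchair demo boils down to a simple mapping from detected facial expressions to motion commands. A minimal sketch of that idea is below; it assumes a hypothetical detector that emits expression events as strings, and is not the actual headset API.

```python
# Minimal sketch (hypothetical names, not the real headset API): map detected
# facial-expression events to wheelchair motion commands, as in the demo.

# Mapping taken from the video: blink right -> turn right,
# blink left -> turn left, smile -> go straight.
EXPRESSION_TO_COMMAND = {
    "blink_right": "turn_right",
    "blink_left": "turn_left",
    "smile": "go_straight",
}

def handle_expression(expression: str) -> str:
    """Return the motion command for a detected expression; stop if unmapped."""
    return EXPRESSION_TO_COMMAND.get(expression, "stop")

# Example: a stream of detected expressions drives the chair.
for event in ["blink_right", "smile", "blink_left", "neutral"]:
    print(event, "->", handle_expression(event))
```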
TL: We really—Thank you. We are really only scratching the surface of what is possible today, and with the community’s input, and also with the involvement of developers and researchers from around the world, we hope that you can help us to shape where the technology goes from here. Thank you so much.