...an electronic robot includes a motor mounted inside its head. The robot's head consists of a display unit, which displays emotional expressions, and a motor control unit that can rotate the head clockwise or counter-clockwise about its axis.
The electronic robot decides which emotional expression to display on its panel based on information from various sensors, such as a camera, a pressure sensor, a geomagnetic sensor, and a microphone, which sense the user's motion and state. Using images captured by the camera, the robot tracks major facial feature points such as the eyes, nose, mouth, and eyebrows, and recognizes the user's emotional state from basic facial expressions conveying happiness, surprise, anger, sadness, and sorrow. Once the sensor data is received, the processor analyzes it and sends appropriate voltage signals to the display unit and the motor control unit to express the relevant emotion. The emotional states expressed by the electronic robot include anger, disgust, sadness, interest, happiness, impassiveness, surprise, agreement (i.e., "yes"), and rejection (i.e., "no").
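To make the sense-decide-actuate flow described above concrete, here is a minimal sketch in Python. All names (SensorReadings, classify_user_emotion, etc.) are invented for illustration, and the emotion classifier is a placeholder; the patent does not specify any implementation details like these.

```python
from dataclasses import dataclass
from typing import List

@dataclass
class SensorReadings:
    """Raw inputs the robot is described as using (field names are illustrative)."""
    camera_frame: bytes   # image from the camera sensor
    pressure: float       # pressure sensor reading
    heading_deg: float    # geomagnetic sensor reading
    audio: List[float]    # microphone samples

def classify_user_emotion(readings: SensorReadings) -> str:
    """Stand-in for facial-feature tracking (eyes, nose, mouth, eyebrows)
    and emotion recognition; a real system would run a vision model here.
    This sketch simply returns a fixed label."""
    return "happiness"

def display_expression(emotion: str) -> None:
    """Stand-in for driving the display unit."""
    print(f"display: showing '{emotion}' expression")

def rotate_head(emotion: str) -> None:
    """Stand-in for driving the motor control unit."""
    print(f"motor: executing head motion for '{emotion}'")

def control_step(readings: SensorReadings) -> None:
    """One sense -> decide -> actuate cycle, mirroring the flow in the text."""
    emotion = classify_user_emotion(readings)
    display_expression(emotion)
    rotate_head(emotion)

if __name__ == "__main__":
    control_step(SensorReadings(camera_frame=b"", pressure=0.0,
                                heading_deg=0.0, audio=[]))
```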
To express these emotional states, the electronic robot stores predefined head motion information for each state in a storage unit. The head motion information includes the motion type, angle, speed (or rotational force), and direction.
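A short sketch of how such a per-emotion lookup table might be organized, again in Python: the fields follow the text (motion type, angle, speed, direction), but the specific emotion-to-motion pairings and numeric values are invented for illustration and are not taken from the patent.

```python
from dataclasses import dataclass
from enum import Enum
from typing import Optional

class Direction(Enum):
    CLOCKWISE = 1
    COUNTER_CLOCKWISE = -1

@dataclass(frozen=True)
class HeadMotion:
    """Predefined head motion for one emotional state, using the fields
    named in the text: motion type, angle, speed, and direction."""
    motion_type: str   # e.g. "turn", "oscillate" (illustrative values)
    angle_deg: float   # how far to rotate the head
    speed_dps: float   # rotational speed, degrees per second
    direction: Direction

# Illustrative table standing in for the robot's storage unit.
HEAD_MOTIONS = {
    "agreement": HeadMotion("turn",      20.0, 60.0, Direction.CLOCKWISE),
    "rejection": HeadMotion("oscillate", 30.0, 90.0, Direction.COUNTER_CLOCKWISE),
    "surprise":  HeadMotion("turn",      15.0, 45.0, Direction.CLOCKWISE),
}

def motion_for(emotion: str) -> Optional[HeadMotion]:
    """Look up the stored head motion for an emotional state, if any."""
    return HEAD_MOTIONS.get(emotion)
```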
The technology disclosed in the patent document allows robots to express emotions, so they can communicate and interact with humans more effectively. Such robots could be used in applications that require interaction with humans, for instance communicating with patients in a hospital when staff are unavailable, or even keeping company with pets such as dogs or cats left alone while their owners are at work.
from Deric's MindBlog https://ift.tt/2INCLna