A robot visual and auditory system is provided that is capable of accurately performing sound source localization of a target by associating visual and auditory information about the target. The system comprises an audition module (20), a face module (30), a stereo module (37), a motor control module (40), an association module (50) for generating streams by associating events from these modules (20, 30, 37, and 40), and an attention control module (57) for conducting attention control based on the streams generated by the association module (50).
The association module (50) generates an auditory stream (55) and a visual stream (56) from an auditory event (28) from the audition module (20), a face event (39) from the face module (30), a stereo event (39a) from the stereo module (37), and a motor event (48) from the motor control module (40), as well as an association stream (57) that associates these streams.
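The event-to-stream association described above can be pictured with a minimal sketch, assuming simplified, hypothetical types (Event, Stream, AssociationStream, the associate function, and the angle tolerance are illustrative only and not part of the claimed system): events carrying timestamps and direction estimates are grouped into an auditory and a visual stream, and the two streams are bound into an association stream when their direction estimates agree.

```python
from dataclasses import dataclass, field
from typing import List, Optional

# Hypothetical event/stream types; names and fields are illustrative only.
@dataclass
class Event:
    source: str        # "audition", "face", "stereo", or "motor"
    time: float        # timestamp in seconds
    direction: float   # azimuth estimate in degrees

@dataclass
class Stream:
    kind: str                      # "auditory" or "visual"
    events: List[Event] = field(default_factory=list)

    @property
    def direction(self) -> float:
        # Latest direction estimate carried by the stream
        return self.events[-1].direction

@dataclass
class AssociationStream:
    auditory: Stream
    visual: Stream

def associate(auditory: Stream, visual: Stream,
              angle_tol_deg: float = 10.0) -> Optional[AssociationStream]:
    """Bind an auditory and a visual stream when their latest direction
    estimates agree within a tolerance (tolerance value is an assumption)."""
    if abs(auditory.direction - visual.direction) <= angle_tol_deg:
        return AssociationStream(auditory, visual)
    return None

# Example: an auditory stream built from auditory events and a visual
# stream built from face events, both referring to the same speaker.
auditory_stream = Stream("auditory", [Event("audition", 0.1, 32.0)])
visual_stream = Stream("visual", [Event("face", 0.1, 30.0)])
print(associate(auditory_stream, visual_stream) is not None)  # True
```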
In addition, the audition module (20) collects sub-bands whose interaural phase difference (IPD) or interaural intensity difference (IID) falls within a preset range, using an active direction-pass filter (23a) whose pass range, in accordance with auditory characteristics, is narrowest in the frontal direction and widens as the angle increases to the left and right, based on accurate sound source direction information from the association module (50), and performs sound source separation by reconstructing the waveform of the sound source.
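A rough sketch of the direction-dependent sub-band selection performed by such an active direction-pass filter might look as follows; the microphone spacing, the linear pass-range model, the IPD-only test (the IID criterion is omitted), and all function names are assumptions for illustration, not the patented implementation.

```python
import numpy as np

SPEED_OF_SOUND = 343.0   # m/s
MIC_DISTANCE = 0.18      # m, assumed inter-microphone spacing

def pass_range(theta_deg, min_deg=10.0, max_deg=30.0):
    """Direction-dependent pass width (degrees): narrowest at the front
    (0 deg) and widening toward +/-90 deg, mimicking auditory acuity.
    The linear form and the end-point values are assumptions."""
    return min_deg + (max_deg - min_deg) * abs(theta_deg) / 90.0

def expected_ipd(theta_deg, freqs):
    """Free-field IPD (radians) predicted for a source at azimuth theta."""
    delay = MIC_DISTANCE * np.sin(np.radians(theta_deg)) / SPEED_OF_SOUND
    return 2.0 * np.pi * freqs * delay

def direction_pass_filter(left, right, theta_deg, fs):
    """Keep only the sub-bands whose measured IPD is consistent with the
    target direction supplied by the association step, then reconstruct
    the waveform by inverse FFT (a sketch of the sub-band selection)."""
    L, R = np.fft.rfft(left), np.fft.rfft(right)
    freqs = np.fft.rfftfreq(len(left), 1.0 / fs)
    measured_ipd = np.angle(L * np.conj(R))
    # Convert the angular pass width into an IPD window per sub-band.
    lo = expected_ipd(theta_deg - pass_range(theta_deg) / 2.0, freqs)
    hi = expected_ipd(theta_deg + pass_range(theta_deg) / 2.0, freqs)
    low, high = np.minimum(lo, hi), np.maximum(lo, hi)
    mask = (measured_ipd >= low) & (measured_ipd <= high)
    # Zero out sub-bands outside the window and resynthesize one channel.
    return np.fft.irfft(L * mask, n=len(left))
```

Narrowing the pass range toward the frontal direction reflects the fact that binaural direction estimates are most reliable in front of the head, which appears to be the "auditory characteristics" the text refers to.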