Using Visual and Auditory Feedback for Instrument-Playing Humanoids. Maier, D., Zohouri, R., & Bennewitz, M. 2014.
Abstract: In this paper, we present techniques that enable a humanoid to autonomously play instruments like the metallophone. The core of our approach is a model-based method to estimate the pose of the instrument and of the beaters held by the robot using observations from the onboard camera. For accurate playing, we calibrate the kinematic parameters of the robot and find valid arm configurations for beating the individual sound bars of the instrument. To determine these, we rely on the estimated poses of the instrument and the beaters and apply inverse kinematics (IK). Here, we use precomputed forward kinematics solutions, represented as a reachability tree, to accelerate the IK computation and to compensate for local minima. The robot automatically validates the computed IK configurations based on visual and auditory feedback from its sensors and adapts its arm configurations if necessary. Our system parses MIDI files of whole songs, maps the notes to the corresponding arm configurations for beating, and generates trajectories in joint space to hit the sound bars. As we show in the experiments with a Nao humanoid presented in this paper, as well as in the accompanying video, our approach allows for clean and in-time playing of complete songs on a metallophone.
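The abstract mentions seeding IK from precomputed forward-kinematics solutions to speed up the computation and avoid local minima. A minimal sketch of that idea, not the authors' implementation: a toy planar two-link arm stands in for the Nao's kinematics, a sampled table stands in for the reachability tree, and the nearest precomputed end-effector pose supplies the joint configuration that would seed an iterative IK solver. All names here are hypothetical.

```python
# Illustrative sketch of reachability-seeded IK (toy 2-link planar arm,
# not the Nao's real kinematics or the paper's reachability tree).
import math

def fk(q):
    # Forward kinematics of a planar arm with two unit-length links.
    return (math.cos(q[0]) + math.cos(q[0] + q[1]),
            math.sin(q[0]) + math.sin(q[0] + q[1]))

def build_reachability_map(samples_per_joint=25):
    # Precompute (end-effector pose, joint configuration) pairs offline.
    table = []
    for i in range(samples_per_joint):
        for j in range(samples_per_joint):
            q = (-math.pi + 2 * math.pi * i / samples_per_joint,
                 -math.pi + 2 * math.pi * j / samples_per_joint)
            table.append((fk(q), q))
    return table

def best_seed(table, target):
    # The precomputed configuration whose end-effector pose is closest to
    # the target; an iterative IK solver started here is already near the
    # goal, so it converges quickly and is less likely to hit a bad local
    # minimum.
    return min(table,
               key=lambda e: (e[0][0] - target[0]) ** 2
                           + (e[0][1] - target[1]) ** 2)[1]
```

With 25 samples per joint the seed's end-effector pose lies within a few tenths of a unit of any reachable target, which is close enough for a local IK iteration to finish the job.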
@article{maier2014instrument,
  title  = {Using Visual and Auditory Feedback for Instrument-Playing Humanoids},
  author = {Maier, Daniel and Zohouri, Ramin and Bennewitz, Maren},
  year   = {2014}
}
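The abstract also describes parsing MIDI files of whole songs and mapping each note to a validated arm configuration for the corresponding sound bar. A minimal sketch of that mapping step, under the assumption that notes have already been parsed into (time, pitch) pairs and that each reachable sound bar has a precomputed, feedback-validated configuration; all names are hypothetical and this is not the authors' code.

```python
# Illustrative sketch: map parsed MIDI notes to precomputed arm
# configurations and order the resulting strikes in time.
from dataclasses import dataclass

@dataclass
class Strike:
    time: float    # when to hit, in seconds from song start
    arm: str       # which arm beats the bar
    joints: tuple  # validated joint configuration for that sound bar

def schedule_strikes(notes, bar_configs):
    """notes: list of (time_sec, midi_pitch) pairs parsed from a MIDI file.
    bar_configs: midi_pitch -> (arm, joint_configuration), precomputed and
    validated (e.g. via visual and auditory feedback)."""
    strikes = []
    for t, pitch in notes:
        if pitch not in bar_configs:
            continue  # note outside the metallophone's range: skip it
        arm, joints = bar_configs[pitch]
        strikes.append(Strike(time=t, arm=arm, joints=joints))
    # Joint-space trajectories would then be generated between
    # consecutive strikes; here we only return the time-ordered plan.
    return sorted(strikes, key=lambda s: s.time)
```

This separates the song-level scheduling from trajectory generation: the schedule fixes which bar is hit when and with which arm, and trajectory planning interpolates the arm between consecutive configurations.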
Paper: http://bibbase.org/service/mendeley/bca0fddf-79ea-3c29-93ed-6177ce521efd/file/0971e55a-e02e-f60f-14a6-dc523327e7b6/2014-Using_Visual_and_Auditory_Feedback_for_Instrument-Playing_Humanoids.pdf.pdf