Learning Perceptually Grounded Word Meanings From Unaligned Parallel Data. Tellex, S., Thaker, P., Joseph, J., & Roy, N.
In order for robots to effectively understand natural language commands, they must be able to acquire meaning representations that can be mapped to perceptual features in the external world. Previous approaches to learning these grounded meaning representations require detailed annotations at training time. In this paper, we present an approach to grounded language acquisition which is capable of jointly learning a policy for following natural language commands such as "Pick up the tire pallet," as well as a mapping between specific phrases in the language and aspects of the external world; for example, the mapping between the words "the tire pallet" and a specific object in the environment. Our approach assumes a parametric form for the policy that the robot uses to choose actions in response to a natural language command, factored according to the structure of the language. We use a gradient method to optimize model parameters. Our evaluation demonstrates the effectiveness of the model on a corpus of "pick up" and "go to" commands given to a robotic forklift by untrained users.
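The abstract describes a parametric policy that scores actions given a command, factored over the language's structure, and trained by gradient ascent. As a minimal illustrative sketch (not the authors' code), one common realization is a log-linear policy whose score sums phrase-grounding feature terms, with the gradient of the log-likelihood equal to observed minus expected features; all names and the toy data below are hypothetical:

```python
import numpy as np

def log_linear_probs(theta, features):
    """Softmax policy over candidate actions.

    features: (num_actions, num_features) array; each row aggregates the
    phrase-grounding features for one candidate action (hypothetical setup).
    """
    scores = features @ theta
    exp = np.exp(scores - scores.max())  # subtract max for numerical stability
    return exp / exp.sum()

def gradient_step(theta, features, chosen, lr=0.1):
    """One gradient-ascent step on log p(chosen action | command).

    Gradient of the log-likelihood under a log-linear model:
    observed features minus model-expected features.
    """
    probs = log_linear_probs(theta, features)
    grad = features[chosen] - probs @ features
    return theta + lr * grad

# Toy data: 3 candidate actions, 2 features (purely illustrative)
rng = np.random.default_rng(0)
features = rng.normal(size=(3, 2))
theta = np.zeros(2)

# Train toward a demonstrated action (index 1)
for _ in range(100):
    theta = gradient_step(theta, features, chosen=1)

probs = log_linear_probs(theta, features)
```

After training, the demonstrated action receives the highest probability under the learned policy; the paper's actual model additionally learns the phrase-to-object groundings jointly, which this sketch omits.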
