Seeing an object is a natural source of knowledge about the object's configuration. We show that language can also shape our knowledge of visual objects. We investigated sign language, which enables deaf individuals to communicate through hand movements with as much expressive power as any other natural language. Some signs represent objects in a specific orientation. Sign-language users (signers) recognized visual objects faster when the objects appeared in the orientation depicted by the sign, and this match in orientation elicited specific brain responses in signers, as measured by event-related potentials (ERPs). Further analyses suggested that signers' responsiveness to object orientation derived from changes in the visual object representations induced by the signs. Our results also show that language facilitates discrimination between objects of the same kind (e.g., different cars), an effect never before reported with spoken languages. By focusing on sign language, we could better characterize the impact of language (a uniquely human ability) on the visual processing of objects.