The World Wide Web has changed the music industry by making a huge amount of music available to both music publishers and consumers, including ordinary listeners or end users. Web 2.0 techniques for tagging music items by artist name, album title, or musical style/genre (technically termed syntactic metadata) have given rise to unstructured, free-form vocabularies. Music search based on such syntactic metadata requires the search query to contain at least one keyword from the vocabulary, and the match must be exact. The Semantic Web initiative by the W3C proposes machine-processable representations of information but does not stipulate how they can be applied to music items specifically. In this paper we present a novel approach: a semi-automatic semantic annotation tool that enables music producers to generate music metadata through a mapping between music consumers' free-form tags and the acoustic metadata that can be automatically extracted from music audio. The proposed tool supports an ontology-guided annotation process and uses an MPEG-7 Audio-compliant music annotation ontology represented in OWL 1.0, the dominant Semantic Web standard.