Teresa --
If I may judge by my own experience (when I first learned that I should say "Co-pen-hay-gen"): on the few occasions that the matter came up after I'd learned the correct pronunciation, I'd dutifully start off with "Co-pen-hay-gen" and then begin vacillating even though I now knew better, simply because I had a hard time persuading myself to stick to late-learned knowledge in a context that reminded me of earlier-formed habits. TV broadcasters have (and presumably sometimes consult) network-issued pronunciation guides for names in the news: the people who began a broadcast correctly with "Co-pen-haa-gen" and then slipped back into "Co-pen-hay-gen" may have looked up the name shortly before the broadcast, and then simply had a hard time sticking to this newly acquired knowledge in the face of earlier habit (just as I myself, 20 years ago, had a hard time sticking to a newly acquired "Co-pen-hay-gen" after years of thinking "Co-pen-haa-gen" whenever I saw "Copenhagen" in print).
Nevertheless, I agree with you that people often allow prestige (peer pressure) to govern the way they speak and write. Have you ever seen the maps (drawn by the linguist Peter Ladefoged, as I recall) showing how the distribution of a particular pronunciation of /r/ has changed in various European languages over the past 500 years? Half a millennium ago, French and German speakers apparently pronounced /r/ as Spanish speakers do to this day: the uvular pronunciation of /r/ (now characteristic of standard French and standard German) apparently arose about 400 years ago as a prestigious fad in Paris, whence prestigious speakers carried it outward (to other cities, and to the suburbs) until prestigious people normally said /r/ in this way and it thus became standard.

What criterion does one properly use to determine whether a change in pronunciation is, or is not, a horrid fad? As you may know, 200-300 years ago the English word "balcony" was universally pronounced "bal-CO-ny": when more and more people began saying "BAL-co-ny," this was at first condemned as a horrid fad, devoid of etymological or other warrant, perpetuated by those who had nothing better to do with their English than to warp it in order to ape the fashionable creators and perpetrators of fads. So are we, today, all being stupid faddists when we follow our great-great-great-grandparents, who got swept up in the fad of saying "BAL-co-ny" despite having no objective reason at all to change from the previous "bal-CO-ny"? What objective criterion properly determines when to regard some new pronunciation as a horrid, stupid fad -- and when to stop so regarding it?
(One might, of course, ask the same question about other sudden changes in usage, such as sudden widespread changes in the spelling of various words. In the first four-and-a-half decades of the 20th century, writers in the USA as well as the UK most often used the spelling "pyjama" -- but, _circa_ 1945, "pajama" suddenly took over in USA usage while "pyjama" remained the norm in the UK.)
Does a new pronunciation (or a new spelling for that matter) become okay simply because some particular percentage of speakers (5%? 25%? 80%? 100%?) are using it? Or should one assume that mere numbers of speakers (even if the numbers reach 100%) cannot confer correctness on an incorrect pronunciation?