Different electrical devices prefer different frequencies, and each frequency has its advantages. Early engineers calculated that any frequency between roughly 50 and 67 hertz was the best compromise among a list of pros and cons for higher or lower frequencies. In short, this range was about as low as early engineers could go, benefiting motors and transmission efficiency, without causing visible flicker in lights.
Read on for more details and history.

Benefits of higher frequency
- Less visible flicker in arc and incandescent lighting.
- Smaller, lighter transformers and generators for the same power rating.
Benefits of lower frequency
- More efficient long-distance transmission, since line reactance grows with frequency (a quick sketch follows this list).
- Better behavior from motors, especially the commutator motors used by early railways.
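To make the transmission point concrete, here is a minimal sketch, not from the original article, using the standard relation that a line's series reactance scales linearly with frequency (X = 2πfL). The inductance per kilometer and the line length are assumed, order-of-magnitude illustrative values, not figures from the article.

```python
# Sketch: why lower frequency helped long transmission lines.
# Series reactance scales with frequency (X = 2*pi*f*L), so a lower
# frequency means a smaller reactive voltage drop on the same line.
# The inductance (~1 mH/km) and 200 km length are illustrative assumptions.
import math

L_PER_KM = 1.0e-3   # line inductance in henries per km (assumed)
LINE_KM = 200       # line length in km (assumed)

for f in (16.7, 25, 50, 60):
    x = 2 * math.pi * f * L_PER_KM * LINE_KM  # total series reactance
    print(f"{f:>5} Hz -> series reactance ~ {x:6.1f} ohms")
```

Running this shows the 25 Hz favored by early railways has less than half the series reactance of a 60 Hz line of the same construction, which is the tradeoff the pro/con lists above describe.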
Early Frequencies

Early AC systems were isolated, unlike our connected grid today. Each system used its own frequency depending on the type of generator and load it was powering. Early loads were electric railways, city lighting, and motors, so the frequencies those loads preferred dominated the debate. Most railways preferred and used low frequencies, such as 25 Hz in the US and 16.7 Hz in Europe. Lighting standardized on 133 Hz because it was high enough to avoid flicker and *boring stuff warning* because an 8-pole machine operating at a comfortable 2,000 RPM outputs this frequency (the pole-speed relation behind this number is sketched at the end of the article). So at the time, frequencies were chosen by engineers to suit the load, not as a business ploy to limit competition, as some believe. However, nobody wanted separate distribution systems for each type of load, so a compromise frequency had to be found. That range was determined at the time to be anything between 50 Hz and 67 Hz.

Road to Standardization

As early as 1891, the high-volume manufacturer Westinghouse had standardized on 60 Hz and the German company AEG on 50 Hz, but many other frequencies continued to be produced and used. As the new technology, international trade, and the interconnected 'grid' grew, people saw a greater need for compatible electrical equipment. However, scrapping a large infrastructure of perfectly working equipment and buying all new equipment is easier said than done. WW2 marked a turning point for standardization because of the growth of affordable electrical equipment, and probably because most of the infrastructure had been blown up and needed to be rebuilt anyway.

The most probable theory for why European companies chose 50 Hz from the 50-67 Hz range is that it better fit their metric standards. The most plausible reason why Westinghouse (and therefore the US) chose 60 Hz over 50 Hz comes from the personal account of L.B. Stillwell, a principal engineer at Westinghouse, of an informal committee formed in 1890 to recommend the best frequency for the company to use. According to him, the committee was ready to standardize on 50 Hz, but the arc lights in widespread use at the time operated with less observable flicker at 60 Hz. The early electrical pioneers didn't document their decisions in an organized manner, so some of the finer points have been lost. Today we are left with conflicting personal accounts from the dead, who are unable to clarify themselves.

Rest of the World

The rest of the world began purchasing generators from Europe and the United States, and whom a country purchased from mostly decided which frequency it would standardize on. Some countries, like Brazil and Japan, purchased equipment from both. Brazil didn't fully standardize on 60 Hz until 1978. Japan never standardized, and to this day the country is split down the middle, with frequency-conversion stations connecting the two halves. Southern California Edison, which supplied power to most of southern California, used 50 Hz and didn't change its standard until 1948.
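As a quick check on the 133 Hz figure above, here is a minimal sketch of the standard synchronous-machine relation, frequency = poles × RPM / 120. The 8-pole, 2,000 RPM pairing is the one the article cites; the other pairings are common modern examples added here for comparison.

```python
# Sketch: electrical frequency of a synchronous generator,
# f = poles * RPM / 120. The 8-pole / 2,000 RPM row is from the article;
# the 60 Hz and 50 Hz rows are common modern pairings, for comparison.
def frequency_hz(poles: int, rpm: float) -> float:
    """Electrical output frequency of a synchronous generator."""
    return poles * rpm / 120

print(f"{frequency_hz(8, 2000):.1f} Hz")  # 133.3 -- early lighting standard
print(f"{frequency_hz(4, 1800):.1f} Hz")  # 60.0  -- common US pairing
print(f"{frequency_hz(4, 1500):.1f} Hz")  # 50.0  -- common European pairing
```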
Author: Brent is an electrical engineer specializing in utility power systems, with a master's in Energy Policy and Management, an MBA, a PMP, and a degree in Spanish.