Why is the mains voltage 220 (230) V and the frequency 50 Hz.

  • #1 4264210
    robokop
    VIP Meritorious for electroda.pl
    Hello. Yesterday, the youth returned from school and brought an unusual problem to be solved. Well, the teacher asked the class a puzzle to solve over the weekend, why the mains voltage is 220 (230) V, and the frequency is 50 Hz, and not, for example: 180V 70 Hz, what are (were) the conditions for these values?
  • #3 4285550
    Quarz
    Level 43  
    Hello,
    The (effective) voltage value of 220 V is a historical legacy, related to the first practical use of electricity, which at that time was generated by direct-current generators.
    That use was lighting (street lighting, probably in Paris) with arc lamps (the incandescent bulb had not been invented yet), in which the voltage across the electric arc is approx. 55 V; a single arc lamp, however, burned unstably because of its non-linear current-voltage characteristic and its negative (within a certain range) dynamic resistance.
    Therefore, the nominal voltage in the first electric networks was 110 V, and the arc lamps were connected in series in pairs.
    Soon after, Dolivo-Dobrowolski introduced his system, i.e. a series connection of two direct-current generators, 110 V each.
    Thus there was already 220 V between the outer wires of the three-wire network, used to power electric motors (higher power), and 110 V (relative to the middle wire) used for lighting.
    Electrical appliances, e.g. heaters, and later light bulbs (at first with carbon filaments), were therefore produced for a nominal voltage of 110 V.
    When Tesla began to promote alternating voltage in the US, because of the ability to transform it and transmit electricity over long distances from the first hydroelectric plants built at Niagara Falls, there was no longer any choice: the nominal voltage had to be the one for which the direct-current appliances already existed, i.e. 110 V.
    The value of 220 V was simply a consequence of minimizing energy losses in LV distribution lines.
    The voltage of 110 V is still used in America, though, where a different method of distributing electricity to the consumer is preferred than in Europe.
    Namely, every single-family house or flat has its own individual step-down transformer from (usually) 6 kV to 110 V, so over those relatively short distances the losses in the wire resistance are not large.
    In Europe, the model with one three-phase transformer stepping the voltage down from medium to low for the whole local network is commonly adopted, and hence doubling that voltage minimizes the losses.
    In fact, the Americans (with their individual transformers) have lower losses on the way from the medium-voltage switching station to the end user, for the same consumed power.
    The filtering action of the transformer also matters here, so the energy in the medium-voltage line is of better quality (less harmonic content in the current waveform).
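    A quick back-of-the-envelope sketch (my own illustration with arbitrarily chosen line resistance and load, not from the post) of why doubling the distribution voltage cuts the line losses to a quarter for the same delivered power:

    ```python
    # Same power delivered over the same line, once at 110 V and once at 220 V.
    # Line loss is I^2 * R and I = P / V, so doubling V halves I and quarters the loss.
    P = 5_000.0   # delivered power in watts (assumed)
    R = 0.5       # total resistance of the line conductors in ohms (assumed)

    for V in (110.0, 220.0):
        I = P / V                 # line current
        loss = I ** 2 * R         # power dissipated in the line
        print(f"{V:5.0f} V -> I = {I:5.1f} A, line loss = {loss:7.1f} W")
    # The 220 V case dissipates exactly one quarter of what the 110 V case does.
    ```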

    As you can see, the nominal voltage in LV grids is therefore raised in small steps (first to 230 V and then to 240 V), so that devices previously manufactured for the lower nominal value can still operate at the increased value for some time and eventually "die a natural death".
    Newly produced devices, on the other hand, are usually also able to work at the lower nominal voltage (except light bulbs, which then glow dimly).

    Now a few words about the frequency of the alternating voltage (although the previous post already linked to a page where it was explained well enough).
    In Europe, engineers (mainly German) came to the conclusion that 50 Hz would be a sufficient value (mainly because eddy-current losses in transformer and generator laminations increase with the square of the frequency).
    In the US, 60 Hz was adopted, which for a synchronous motor (or generator) with one pair of poles gives a synchronous speed of 3600 rpm (in Europe 3000 rpm), i.e. 60 rev/s; but (as I once read in the journals of Thomas Alva Edison) this was not the only reason for adopting that value, and Edison (who was self-taught) only mentioned it without justifying it exactly.
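    A small numeric sketch (my own, not from the post) of the two relations mentioned above: the synchronous speed n = 60·f/p for p pole pairs, and the roughly quadratic growth of eddy-current losses with frequency:

    ```python
    # Synchronous speed: n [rpm] = 60 * f / p, for frequency f and p pole pairs.
    def sync_speed_rpm(f_hz: float, pole_pairs: int) -> float:
        return 60.0 * f_hz / pole_pairs

    print(sync_speed_rpm(50, 1))   # 3000.0 rpm in Europe
    print(sync_speed_rpm(60, 1))   # 3600.0 rpm in the USA, i.e. 60 rev/s

    # Eddy-current losses in laminations scale roughly with f^2 (other factors equal),
    # so moving from 50 Hz to 60 Hz raises that loss component by about 44 %.
    print((60 / 50) ** 2)          # 1.44
    ```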

    greetings
  • #4 5573289
    lord_blaha
    Level 33  
    A colleague described it so nicely that it made the front page of Wykop :)
  • #5 5573309
    Quarz
    Level 43  
    Hello,
    lord_blaha wrote:
    A colleague described it so nicely that it made the front page of Wykop :)
    I would like some details ... :idea: :D

    greetings
  • #7 5574626
    Paweł Es.
    VIP Meritorious for electroda.pl
    Around the world (before unification) there were systems operating at frequencies from 16 2/3 Hz to 133 1/3 Hz. The spread resulted from the technical solutions adopted for the prime mover (steam turbines, water turbines) and from the design of the generator itself.

    The choice of 60 Hz in the USA was made by George Westinghouse, and that of 50 Hz in Europe by the German company AEG.

    60 Hz resulted from optimization of arc lighting: it was found to work more stably (with less flickering) on a 60 Hz supply. I also read somewhere that Nikola Tesla concluded that the optimal frequency for AC devices is 60 Hz.

    Edison was such an advocate of direct current that in his "handmade" electric chair he used alternating current, to make it clear that the alternating current promoted by Westinghouse was very dangerous. ;)

    Frequencies below 50 Hz cause unpleasant and tiring flicker of the lighting, especially with low-inertia light sources (fluorescent lamps, arc devices).

    In the early years of incandescent lighting, typical generators ran at 2000 rpm with 4 pole pairs, which gives a frequency of 133 1/3 Hz.

    Westinghouse Electric decided to standardize the grid frequency at a lower value in order to supply lighting equipment and induction motors from the same grid.
    As I wrote before, 60 Hz was selected because arc lighting gave better, less flickering light at this frequency.
    Frequencies much below 50 Hz caused visible flickering of the lighting (incandescent as well).

    AEG established 50 Hz as the network operating frequency also as a result of observing light flicker at lower frequencies (the 175 km Lauffen-Frankfurt transmission line operated at 40 Hz in 1891, and the higher frequency was settled on that year).

    In 1893, General Electric, affiliated with AEG in Germany, built a 50 Hz power plant at Mill Creek, California (so that lighting and the induction motors invented by Tesla in 1888 could run on the same line); it later raised the frequency to 60 Hz for market reasons (compatibility with the Westinghouse standard).

    The first (Westinghouse) generators at the Niagara Falls hydroelectric power plant operated at 25 Hz, but that was in 1895, before standardization (and the frequency followed from the choice of generator, which was made before the choice of transmission system).

    The oldest hydroelectric power station currently in operation (built in 1897), in Mechanicville, New York, runs at 40 Hz and is connected to the power grid via frequency converters (40 to 60 Hz). It was supposed to be closed in 1996, but I believe it was restored and still operates, although I do not know whether at full power or as a monument.

    Mechanicville Hydroelectric Power Station


    The voltage of 110 V was adopted because of the carbon-filament bulbs used at the time: at this voltage they lasted the longest. Later, when tungsten filaments were introduced, a higher voltage could be used, which at the same time reduced the material consumption of the supply lines (lower currents for transmitting the same power).
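    A small illustration of that last remark (my own, with assumed numbers): if the allowed relative loss on the line is kept fixed, the conductor cross-section needed to deliver the same power scales roughly as 1/V², so a higher lamp voltage directly saves copper.

    ```python
    # Copper cross-section needed to deliver power P over a line of one-way length L
    # while keeping the line loss below a fixed fraction k of P (assumed figures):
    # loss = (P/V)^2 * rho * 2L / A <= k*P   =>   A >= rho * 2L * P / (k * V^2)
    RHO_CU = 1.72e-8   # resistivity of copper, ohm*m
    P = 1_000.0        # delivered power, W (assumed)
    L = 100.0          # one-way line length, m (assumed)
    k = 0.05           # allowed loss fraction, 5 % (assumed)

    for V in (110.0, 220.0):
        A = RHO_CU * 2 * L * P / (k * V ** 2)       # required cross-section, m^2
        print(f"{V:5.0f} V -> {A * 1e6:.2f} mm^2 of copper per conductor")
    # The 220 V line gets by with a quarter of the copper cross-section.
    ```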

    In the States, they now seem to be using something like this:

    http://en.wikipedia.org/wiki/Split_phase
  • #8 5575552
    robokop
    VIP Meritorious for electroda.pl
    And here's my two cents: didn't cinematography by any chance influence the frequency of the current? I mean the frame rate, because I vaguely remember from my school days that the image from the projector was displayed at 24 frames per second, except that each frame was split (by the characteristic propeller-like shutter cutting the light beam in the projector)?
  • #9 5577842
    Paweł Es.
    VIP Meritorious for electroda.pl
    In cinematography it was 24 frames per second and in TV it is 50 Hz (in Europe).
    In the silent film era, 16 frames per second were filmed.

    But 50 Hz and the filming speed were simply not related to each other.


    The shutter in the movie camera does not split the frames. The film strip moves in steps through the camera, and the shutter blocks the light passing through the lens onto the film while the strip is moving. Only when the strip stops, with a film frame exactly in front of the lens, does the shutter uncover the lens, and the next frame is exposed to whatever is in front of the camera.
  • #10 13412520
    erroid
    Level 1  
    The 50 Hz in old TVs is the screen refresh rate, which has nothing to do with the frame rate in cinematography. If you look at such an old CRT TV out of the corner of your eye, you will notice a slight flicker. While 50 Hz may have been enough for a TV, it was not enough for comfortable work with a CRT monitor (cathode-ray-tube screen) that you stare at from about half a metre away; that is why it is accepted that the refresh rate should be at least 72 Hz (at that rate the human eye can barely register the flicker). Today CRT screens have been replaced with LCD screens, where the image is generated in a completely different way: it does not fade immediately after being drawn, as on a CRT, but persists until the next one is generated, so there is no flicker in that sense, although the refresh rate still matters for other reasons.

    While not much has changed in cinema (the 24-frame standard is still in force, and 48-frame technology is crawling along and is not welcomed by everyone with delight, which is a pity, because "The Hobbit" at 48 frames in 3D is impressive), in games the more frames the better. In addition, 3D doubles the number of generated images (in active 3D, a frame for each eye is shown alternately), so the refresh rate has to rise to keep up with the frame rate. Say we want 60 fps (frames per second) in a game plus 3D; it turns out that our monitor or TV should then have, with an adequate margin, a refresh rate of at least 200 Hz.
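    A quick check of that arithmetic (my own sketch; the 200 Hz figure above is the poster's rule of thumb with margin, the hard minimum comes from the doubling alone):

    ```python
    # Active-shutter 3D alternates a full frame for each eye, so the display must
    # show twice as many images per second as the game renders.
    game_fps = 60                  # desired frame rate in the game (assumed)
    eyes = 2                       # active 3D: left and right frames alternate
    min_refresh = game_fps * eyes  # bare minimum refresh rate
    print(min_refresh)             # 120 -> "at least 200 Hz" simply adds headroom
    ```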

    to sum up: 50 Hz in the socket has nothing to do with 50 Hz on the screen of the old TV.
  • #11 13412540
    jdubowski
    Tube devices specialist
    erroid wrote:
    to sum up: 50 Hz in the socket has nothing to do with 50 Hz on the screen of the old TV.


    Not really - 50 Hz (and in the USA 60 Hz) was once chosen as the TV frame frequency because it was assumed that the mains network would serve as an element of frame synchronization between the camera and the receivers. The idea was later abandoned, but the frequency was never changed.
  • #12 16679930
    tw84
    Level 17  
    jdubowski wrote:
    erroid wrote:
    to sum up: 50 Hz in the socket has nothing to do with 50 Hz on the screen of the old TV.


    Not really - 50 Hz (and in the USA 60 Hz) was once chosen as the TV frame frequency because it was assumed that the mains network would serve as an element of frame synchronization between the camera and the receivers. The idea was later abandoned, but the frequency was never changed.


    An urban legend, nothing more. The problem in television was line synchronization, not frame synchronization, so relying on the mains for that would achieve nothing, not to mention that each potential viewer can be supplied from a phase with a different shift, so for some of them the start of the frame would land in the middle of the screen :D

    On the other hand, the real reason why television uses the same frequency as the mains has basically already come up in this thread: the arc lamps used in the studio. They flicker at 100/120 Hz, and if the frame rate were anything other than 50/60 Hz, viewers would see a brightness ripple crawling across the screen, which would be a nuisance.
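    A small sketch of that beat-frequency argument (my own illustration, not from the post): lamps on 50 Hz mains flicker at 100 Hz; with the frame rate locked to the mains the ripple stands still, while a cinema-style 48 Hz rate would make it crawl visibly.

    ```python
    # Studio arc/discharge lamps on AC flicker at twice the mains frequency.
    # If the TV frame rate is not tied to the mains, the lamp flicker beats
    # against the frame rate and a slow brightness ripple walks over the picture.
    def ripple_hz(mains_hz: float, frame_hz: float) -> float:
        flicker = 2 * mains_hz                        # lamp flicker frequency
        nearest = round(flicker / frame_hz) * frame_hz
        return abs(flicker - nearest)                 # visible beat frequency

    print(ripple_hz(50, 50))   # 0.0 Hz  -> ripple is stationary, effectively invisible
    print(ripple_hz(50, 48))   # 4.0 Hz  -> a ripple crawls across the screen
    print(ripple_hz(60, 50))   # 20.0 Hz -> very objectionable flicker
    ```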

Topic summary

The discussion revolves around the historical and technical reasons for the standard mains voltage of 220 (230) V and frequency of 50 Hz. The voltage originated from early electrical systems using direct current generators, where 110 V was the initial standard due to the characteristics of arc lamps. The voltage was doubled to 220 V for three-wire systems to power electric motors. The frequency of 50 Hz was adopted in Europe, influenced by the stability of arc lighting and the design of generators, while the USA opted for 60 Hz. The relationship between frequency and lighting flicker was also noted, with lower frequencies causing undesirable effects. Additionally, the connection between television frame rates and electrical frequency was discussed, clarifying that while 50 Hz was used for TV refresh rates, it was not directly related to cinematography frame rates.
Summary generated by the language model.