There is no fixed number of DC voltages that computers use. Depending on the design of their logic circuitry, computers have used anywhere from a single DC voltage to dozens of different DC voltages.
From the mid-1960s (with the introduction of TI's 7400-series TTL integrated circuits) to the present, +5 VDC has been a nearly universal voltage in computers. From the early 1970s (with the introduction of NMOS microprocessors, e.g. the Intel 8080), +12 VDC and -12 VDC became common in microcomputers. In the 1990s (with the introduction of low-power, energy-efficient microprocessors), +3.3 VDC (and other, even lower voltages) became common in microcomputers, sometimes completely replacing +5 VDC, as it both reduces power consumption and permits higher operating speeds.
The modern power supply produces +3.3, +5, and +12 volts within the computer. The computer itself uses various other voltages beyond these, since even 3.3 volts would be too much for RAM modules, for example; stepping down to those other voltages is easily accomplished on the motherboard.
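As a rough illustration, the step-down the motherboard performs can be modeled as an ideal buck converter, whose output is the input voltage times the switching duty cycle. This is a simplified sketch, not a description of any particular board; the 12 V to 1.2 V figures are just typical example values:

```python
# Ideal buck-converter model: V_out = D * V_in (switching losses ignored).
def buck_output(v_in, duty_cycle):
    """Output voltage of an ideal buck converter for a given duty cycle."""
    return v_in * duty_cycle

def duty_for(v_in, v_out):
    """Duty cycle needed to reach v_out from v_in (ideal case)."""
    return v_out / v_in

# Example: a voltage regulator stepping the +12 V rail down to ~1.2 V
# for a CPU core (illustrative numbers, not from any specific CPU).
d = duty_for(12.0, 1.2)
print(f"duty cycle: {d:.0%}")              # 10%
print(f"{buck_output(12.0, d):.2f} V")     # 1.20 V
```

Real regulators adjust the duty cycle continuously under feedback, but the ratio arithmetic above is the core of how one DC rail becomes another.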
There are three stars on Washington DC's flag.
DC = District of Columbia
No. The district was named the Territory of Columbia, Columbia being a poetic name for the United States in use during the 1790s. Later this was renamed the District of Columbia, and then even later Washington, DC.
Washington District of Columbia. DC.
A generator's armature windings inherently generate AC. A DC generator uses a commutator (split rings) to convert that AC into DC at its output; if slip rings are fitted instead, the output remains AC.
yes
A fan can use either voltage, depending on what the manufacturer specifies on the motor's nameplate. AC is the most common, but some smaller fans, such as the power supply fans in computers, use DC. Check the fan motor's nameplate to supply the correct voltage.
Like AC, DC can be at any voltage.
24 volts DC
From the wall it could be 115 V AC, or 230 V AC in countries outside North America that use that standard. Internal voltages include 12 V DC, 5 V DC, and 3.3 V DC.
Positive and negative voltages not connected to earth (ground).
The specifications set out by the manufacturer determine what voltage is needed to make the device operate.
For measuring voltages, both AC and DC.
4/8
+10
When electricity supply was in its infancy, it was all DC. But DC could only travel a short distance, while AC was shown to travel far from the generator and still be useful. That is why most of the electricity supply we use today is AC.

The main reason is that the electricity supply system which feeds your house is an AC system. This is because, in order to operate, the system's voltage levels have to be raised and lowered (high voltages for transmission, low voltages for distribution), and this can only be done efficiently using transformers. Transformers only work with AC.

As far as your home itself is concerned, your appliances operate at different voltages and some require DC. For example, your lighting and power outlets likely operate at 230 V (Europe) or 120 V (North America). While some appliances (stoves, kettles, toasters, etc.) are designed to operate at these voltages, some are not. Internally, many appliances, such as computers, radios, DVD players, etc., operate at much lower voltages and often require DC. So these devices must have internal transformers to step the supply voltage down to the appropriate level, and rectifiers to change AC into DC.

In other words, AC is far more versatile than DC.
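The transformer behavior described above is simple ratio arithmetic: an ideal transformer's secondary voltage equals the primary voltage times the turns ratio. A minimal sketch; the winding counts here are made-up illustrative numbers, not from any real transformer:

```python
def secondary_voltage(v_primary, n_primary, n_secondary):
    """Ideal transformer relation: V_s = V_p * (N_s / N_p)."""
    return v_primary * n_secondary / n_primary

# Hypothetical step-down: 230 V mains across a 230-turn primary,
# with a 12-turn secondary, gives 12 V out.
print(secondary_voltage(230.0, 230, 12))  # 12.0
```

The same formula explains why transformers need AC: the voltage is induced only by a changing magnetic field, so a steady DC input produces no output at all.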
The rectifier is what changes AC into DC that the computer can use; the transformer raises or lowers the voltage. So you have 110 volts AC coming into your computer's power supply. The rectifier circuit in there changes the AC voltage to DC voltage, but it is still too high for the computer to use. The transformer then lowers that voltage down to the various voltages your computer needs.
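One piece of arithmetic worth making explicit: mains voltage is quoted as an RMS value, so the DC level a rectifier delivers (before any regulation) is roughly the peak of the sine wave, RMS times the square root of 2. A sketch that ignores diode drops and ripple:

```python
import math

def peak_from_rms(v_rms):
    """Peak of a sine wave given its RMS value: V_peak = V_rms * sqrt(2)."""
    return v_rms * math.sqrt(2)

# 110 V RMS mains rectifies to roughly 156 V peak before regulation.
print(round(peak_from_rms(110.0), 1))  # 155.6
```

This is why the rectified voltage inside a power supply is higher than the nameplate mains voltage, and why further conversion is needed before it reaches the computer's low-voltage rails.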