It depends on the type of activities you want to accomplish. For example, if you want to browse the net, check your email, or play a few simple games, then 1.6 GHz is more than enough. However, if you are using heavy-duty imaging software, editing video, or playing a high-end game, then it may or may not be enough; it also depends on other specifications, from the amount of RAM you have to the power of your GPU.
No, that's not right: ordinary desktop and laptop computers have been able to reach and exceed 1 gigahertz since around the year 2000. Multi-GHz clock speeds are standard today, not supercomputer territory.
2.08 GHz would be the faster speed. MHz = megahertz; GHz = gigahertz. 1 gigahertz = 1000 megahertz.
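To make the comparison concrete, here is a quick Python sketch; the 800 MHz figure is just an assumed example value for the other speed, since the question doesn't state it. It converts both speeds to the same unit before comparing:

```python
# Convert both clock speeds to MHz so they can be compared directly.
# 1 GHz = 1000 MHz, so multiply a GHz value by 1000.

speed_a_mhz = 800            # example speed given in MHz (assumed value)
speed_b_ghz = 2.08           # example speed given in GHz

speed_b_mhz = speed_b_ghz * 1000   # 2.08 GHz -> 2080 MHz

faster = max(speed_a_mhz, speed_b_mhz)
print(f"{speed_a_mhz} MHz vs {speed_b_mhz} MHz -> faster is {faster} MHz")
# Output: 800 MHz vs 2080.0 MHz -> faster is 2080.0 MHz
```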
Giga- is the metric prefix for one billion (10^9).
No. Hz (hertz) is the base unit. MHz is "megahertz," and mega- means 1,000,000. GHz is "gigahertz," and giga- means 1,000,000,000.
Hertz is the unit of measurement; mega- and giga- are prefixes that scale it, giving megahertz and gigahertz.
GHz (gigahertz)
A gigabyte (GB) is a unit of memory size, while a gigahertz (GHz) is a measure of frequency; there is no conversion factor between them.
Yes, that's right: the frequency is measured in gigahertz, as when you read 3.2 GHz. The base unit is the hertz, but prefixes such as mega-, giga-, and tera- are added to express large quantities.
The frequency of activity on a motherboard is measured in megahertz (MHz), or one million cycles per second. The processor operates at a much higher frequency than other components in the system, and its activity is measured in gigahertz (GHz), or one billion cycles per second.
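As a rough illustration of that gap in scale, here is a short Python sketch; the 100 MHz bus and 3.0 GHz CPU figures are assumed example values, not specs from the answer above:

```python
# Hertz means cycles per second, so a clock rate in GHz
# translates directly into billions of cycles each second.

bus_mhz = 100        # assumed example motherboard bus speed in MHz
cpu_ghz = 3.0        # assumed example CPU clock speed in GHz

bus_cycles_per_sec = bus_mhz * 10**6
cpu_cycles_per_sec = cpu_ghz * 10**9

print(f"Bus: {bus_cycles_per_sec:,.0f} cycles/second")
print(f"CPU: {cpu_cycles_per_sec:,.0f} cycles/second")
print(f"The CPU ticks {cpu_cycles_per_sec / bus_cycles_per_sec:.0f}x faster")
```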
Well, the list of prefixes in the base-10 (metric) system is: deka-, hecto-, kilo-, mega-, giga-, tera-, peta-, exa-, zetta-, yotta-. In computing units (bytes, hertz), deka- and hecto- are not used, so after gigahertz comes terahertz, then petahertz, and so on. The values: deka = 10, hecto = 100, kilo = 1,000, mega = 1,000,000, giga = 1,000,000,000, tera = 1,000,000,000,000, etc.
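A small Python sketch of the same ladder, just to show the powers of ten lining up (the loop and names are illustrative, not part of any standard library):

```python
# SI prefixes as powers of ten, from deka up to yotta.
prefixes = {
    "deka":  10**1,
    "hecto": 10**2,
    "kilo":  10**3,
    "mega":  10**6,
    "giga":  10**9,
    "tera":  10**12,
    "peta":  10**15,
    "exa":   10**18,
    "zetta": 10**21,
    "yotta": 10**24,
}

for name, value in prefixes.items():
    print(f"{name:>5}hertz = {value:,} Hz")
# e.g. gigahertz = 1,000,000,000 Hz and terahertz = 1,000,000,000,000 Hz
```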
There is no such thing. A giga*hertz* is 1 000 000 000 ("one billion") hertz, i.e. 1 000 000 000 electrical or electromagnetic cycles in a second.
Giga means 10^9 (10 to the power of 9), or 1,000,000,000 (1 billion). Gibi means 2^30 (2 to the power of 30), or 1,073,741,824. So the two are almost the same; the reason for the latter is that computers really like to use base-2. "Gigi" doesn't mean anything; maybe you meant gibi or giga.

A bit is the smallest amount of information a computer can work with, and a byte is 8 bits. "Gigahertz per sec" is a meaningless phrase: hertz basically means "cycles per second," so if you say your computer has a 2.4 GHz (gigahertz) processor, that means it takes 2,400,000,000 steps per second. Adding "per sec" after hertz is wrong because hertz already means that.
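The giga/gibi distinction is easy to check directly in Python; this sketch reuses the 2.4 GHz processor from the answer above as its example:

```python
# Decimal "giga" is a power of ten; binary "gibi" is a power of two.
giga = 10**9            # 1,000,000,000
gibi = 2**30            # 1,073,741,824

print(f"giga = {giga:,}")
print(f"gibi = {gibi:,}  ({gibi / giga:.1%} of a giga)")

# Hertz already means "cycles per second", so a 2.4 GHz processor
# performs 2.4 * 10^9 cycles each second -- no extra "per sec" needed.
cycles_per_second = 2.4 * giga
print(f"A 2.4 GHz CPU runs {cycles_per_second:,.0f} cycles per second")
```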