

I agree with most of Rtrahan's answer, except that he got the definitions of the terms wrong. Megahertz (MHz) is one million cycles per second, while gigahertz (GHz) is one billion cycles per second. These are not transactions; a transaction is something else entirely. Frequency measures the number of times a wave cycles from trough through peak and back to trough. The more cycles per second, the higher the frequency and the more that can be accomplished.
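To make the relationship between the prefixes concrete, here is a minimal Python sketch (the constants and function name are our own illustration, not part of the answer above):

```python
# Hedged sketch: converting MHz and GHz values to raw cycles per second.
MHZ = 1_000_000          # megahertz: one million cycles per second
GHZ = 1_000_000_000      # gigahertz: one billion cycles per second

def cycles_per_second(value, unit):
    """Return the number of clock cycles per second for a value in MHz or GHz."""
    return value * {"MHz": MHZ, "GHz": GHZ}[unit]

print(cycles_per_second(800, "MHz"))  # 800 million cycles per second
print(cycles_per_second(3.5, "GHz"))  # 3.5 billion cycles per second
```

So a 1 GHz processor clock ticks exactly 1000 times as often as a 1 MHz one.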


Wiki User

11y ago


Related Questions

Where can I get a computer speed test done?

You can visit speedtest.com to get a computer speed test done. It measures the speed between your computer and the Internet, but it does not measure the speed between your computer and the Network Interface Device.


How many gigabytes are in 1.6 gigahertz?

They are not a measure of the same thing so there is no comparison. A Gigabyte is a measure of size (for example - how much memory a computer has) and Gigahertz is a measure of speed (for example - how fast the computer's processor can operate).


What is a microprocessor measured by?

The processor, or CPU, of a computer is measured by the speed of the calculations it makes. This speed is presently measured in gigahertz.


What is the speed rating computers are measured in today?

Computer processor speed is measured in hertz, usually megahertz or gigahertz. Hertz is a measure of frequency; in the context of a computer's central processing unit (CPU), it counts the number of clock cycles performed per second.


What is the unit time used to measure the speed of Ram?

RAM speed is measured in MHz, by PC rating, or in nanoseconds (ns).
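The MHz and nanosecond figures are two views of the same quantity: the cycle time in nanoseconds is 1000 divided by the frequency in MHz. A small illustrative sketch (the function name is our own):

```python
def period_ns(mhz):
    """Clock period in nanoseconds for a clock frequency given in MHz."""
    # 1 MHz = 1e6 cycles/s and 1 s = 1e9 ns, so period_ns = 1e9 / (mhz * 1e6)
    return 1000 / mhz

print(period_ns(100))  # 10.0 -> classic PC100 SDRAM has a 10 ns cycle time
print(period_ns(400))  # 2.5
```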


The computer term PC stands for?

Personal Computer (PC).


Which computer device The unit KIPS is used to measure the speed of?

KIPS (kilo instructions per second) is used to measure the speed of a computer's processor (CPU): one KIPS equals one thousand instructions executed per second. The kip, an obsolete unit of force, is unrelated.


What is more, a GB or a GHz?

A GB and a GHz are completely different things. A GB is a measure of the memory of a computer, while GHz is a measure of frequency and is used to measure your processor's clock speed.


What is the definition and a sentence for the word gigahertz?

Gigahertz is a unit of frequency equivalent to one billion hertz, commonly used to measure the clock speed of a computer processor. Example: The new computer processor operates at a speed of 3 gigahertz, making it faster than the previous model.


What does PC stand for?

PC stands for Personal Computer.


Are picoseconds used to measure the speed of 5th gen computers?

Picoseconds (ps) are a unit of time used to measure very short intervals, typically in electronic devices and processes that operate at very high speeds, such as computer processors and other electronic components. However, it is not common to quote overall computer speed in picoseconds.

Instead, computer speed is usually expressed as a clock frequency in hertz (Hz) or gigahertz (GHz). For example, a 5th generation computer might be described as having a clock speed of 3.5 GHz, meaning its clock ticks 3.5 billion times per second; the number of instructions completed per second depends on how many instructions the processor can execute per clock cycle.

The clock speed of a computer is determined by the speed at which its processor (CPU) can operate, and it determines how quickly the processor can perform tasks such as executing instructions in a program or accessing data from memory.

It's worth noting that "5th generation computer" is not a well-defined term and does not refer to any specific generation of computer technology. Different sources may use it to refer to different types of computers or computer technologies.
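As a rough illustration of the distinction between clock cycles and instructions, here is a hedged sketch (the `ipc` parameter, instructions per cycle, is a simplifying assumption of ours; real processors vary widely with workload):

```python
def instructions_per_second(ghz, ipc=1.0):
    """Rough estimate of instructions executed per second.

    ghz: clock speed in gigahertz (billions of cycles per second)
    ipc: assumed average instructions completed per clock cycle
    """
    return ghz * 1_000_000_000 * ipc

print(instructions_per_second(3.5))         # 3500000000.0 at one instruction per cycle
print(instructions_per_second(3.5, ipc=4))  # 14000000000.0 with 4-wide execution
```

At ipc=1 the estimate matches the simplification in the answer above; higher ipc values show why clock speed alone does not fix the instruction rate.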


The term "PC" stands for what?

Personal computer.