Apples and Oranges.
Hertz is a measurement of frequency (electrical power, sound waves, etc.). Standard household power in the US runs at 60 hertz, or 60 cycles per second. A "megahertz" is 1,000,000 hertz.
The correct term is "byte", not "bite". A byte is a unit used in computer science and represents 8 bits (basically 8 switches, each representing a one or a zero). The units scale up by 1024, starting with kilobytes (1024 bytes), then megabytes (1024 kilobytes) and gigabytes (1024 megabytes). The largest in common use is the terabyte, which is 1024 gigabytes! One gigabyte is 1,073,741,824 bytes, and one terabyte is 1,099,511,627,776 bytes.
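The 1024-per-step scaling described above can be checked with a short Python sketch (the constant names here are just illustrative, not standard library names):

```python
# Binary (base-2) size units: each step is 1024 times the previous one.
KILOBYTE = 1024              # bytes
MEGABYTE = 1024 * KILOBYTE   # 1,048,576 bytes
GIGABYTE = 1024 * MEGABYTE   # 1,073,741,824 bytes
TERABYTE = 1024 * GIGABYTE   # 1,099,511,627,776 bytes

print(GIGABYTE)  # 1073741824
print(TERABYTE)  # 1099511627776
```

Note that the terabyte, not the gigabyte, is the unit that works out to 1,099,511,627,776 bytes.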
A gigabit is 1000 times bigger than a megabit.
A gigabyte is bigger; it is 1,024 megabytes.
MB = 1,000,000 bytes
GB = 1,000,000,000 bytes
Gigs
gigabit
megabit
It's not a question of "better", only of bigger. Giga is a lot bigger than mega.
In computers, giga is 1024 (2^10) times bigger than mega. In normal metric usage, giga is 1000 (10^3) times bigger than mega. The difference exists because computers use binary internally.
1 gigabyte = 1024 MB, so a gigabyte is bigger than a megabyte.
megabyte
Some common prefixes used for computers are "micro-" (e.g., microprocessor), "multi-" (e.g., multitasking), "bi-" (e.g., binary), and "tele-" (e.g., teletype). These prefixes help provide information about the computer's functionality or components.
1,048,576 bytes
Not enough; that's why they keep making bigger hard drives.
No, Super is bigger.
Megabats are bigger, because "micro" means small and "mega" means big.
megalodon
The biggest megalodon was about 85 ft long; the average was between 75 and 80 ft.