Look, that is a hard question to answer, buddy - web pages vary enormously in size, so there is no fixed conversion.
You can't convert gigabytes into time. Gigabytes are units of data storage.
That varies greatly depending on the size of the webpages you visit. If one hour of browsing used 15 MB of data, a 1 GB allowance would give you around 67 hours of surfing the web.
Hours and gigabytes are not the same thing and cannot be measured against each other. Hours measure elapsed time; gigabytes measure the amount of space available on a computer.
KB (kilobyte), MB (megabyte), and GB (gigabyte) are units of digital information storage, often used to quantify file sizes and data transfer rates. 1 KB equals 1,024 bytes, 1 MB equals 1,024 KB, and 1 GB equals 1,024 MB. While they primarily measure data volume, they can indirectly relate to time, as larger file sizes generally take longer to download or process depending on the speed of the network or device. For example, transferring a 1 GB file will typically take longer than a 1 MB file over the same internet connection.
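The transfer-time point above is just size divided by speed. A quick sketch in Python, using the binary units from the answer and an assumed 10 MB/s connection (the speed is made up for illustration):

```python
# Idealized transfer time: size / speed, ignoring protocol overhead.
def transfer_seconds(size_bytes, speed_bytes_per_sec):
    """Return how long a transfer takes at a constant speed."""
    return size_bytes / speed_bytes_per_sec

MB = 1024 ** 2
GB = 1024 ** 3
speed = 10 * MB  # assumed connection speed: 10 MB per second

print(transfer_seconds(1 * MB, speed))  # 0.1 seconds
print(transfer_seconds(1 * GB, speed))  # 102.4 seconds
```

Same connection, so the 1 GB file simply takes 1,024 times longer than the 1 MB file.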
There are 1,000,000 KB to the gig (using decimal units).
One terabyte equals 1,000 gigabytes, which equals 1,000,000 megabytes, which equals 1,000,000,000 kilobytes, which equals 1,000,000,000,000 bytes. You multiply by 1,000 at each step. And btw, it's spelled terabyte, not tegabyte.
GB, or gigabyte, is not a unit of time, but of capacity. A gigabyte is one billion bytes, according to the SI standard; or 1024^3 = 1,073,741,824 bytes, by the computing standard, where you go by powers of 2.
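The two conventions in that answer differ by about 7%, which you can check in a couple of lines (a small Python sketch of the arithmetic above):

```python
# SI / decimal convention: 1 GB = one billion bytes.
si_gb = 10 ** 9

# Computing convention: 1 GB = 1024^3 = 2^30 bytes.
binary_gb = 1024 ** 3

print(binary_gb)           # 1073741824
print(binary_gb - si_gb)   # 73741824 bytes of difference
```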
The limits placed on Internet usage by some ISPs relate to the amount of material you are downloading rather than to a specific time limit. The time it would take to use up a 2 Gigabyte limit depends on what you are doing. If you are just checking email and browsing the occasional web site, then a 2 Gigabyte limit may be sufficient. If you are viewing video, listening to music, or buying and downloading movies, music, software, etc., then you can use 2 Gigabytes in a day or two - a single movie from iTunes, for example, can be around 2 Gigabytes in size.
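To put rough numbers on that answer, here is a small Python sketch. The per-day usage figures are assumptions for illustration (only the ~2 GB movie comes from the answer itself); real usage varies widely:

```python
# Hypothetical daily usage in MB for different activities.
CAP_MB = 2 * 1000  # 2 GB cap, decimal units

usage_mb_per_day = {
    "email and light browsing": 50,    # assumed average
    "streaming music": 500,            # assumed average
    "downloading a movie": 2000,       # one ~2 GB iTunes movie, per the answer
}

for activity, mb in usage_mb_per_day.items():
    days = CAP_MB / mb
    print(f"{activity}: the cap lasts about {days:.1f} days")
```

So the same 2 GB cap can last over a month of light use, or a single day of heavy downloading.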
Gigabytes has no connection with time.
Gigabytes (GB) is not in any way related to time.
A bit rate or data transfer rate is the average number of bits, characters, or blocks per unit time passing between equipment in a data transmission system. This is typically measured in multiples of the unit bit per second or byte per second. A kilobyte is the smallest of these units and equals 1,024 bytes, a megabyte equals 1,024 kilobytes, and a gigabyte equals 1,024 megabytes. So if you have a 3 GB hard drive, it would equal 3,072 megabytes.
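The unit ladder in that answer is just repeated multiplication by 1,024, which a short Python sketch makes explicit:

```python
# Binary convention, as used in the answer above.
KB = 1024       # bytes in a kilobyte
MB = 1024 * KB  # bytes in a megabyte
GB = 1024 * MB  # bytes in a gigabyte

drive_gb = 3
print(drive_gb * GB // MB)  # 3072 megabytes in a 3 GB drive
```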
a long time