Assuming the 1G refers to 1 GB (one gigabyte): no, 256 MB is not more than that.
A MB, or megabyte, is equal to 1,000,000 bytes. Therefore, 256 MB is 256,000,000 bytes.
A GB, or gigabyte, on the other hand, is equal to 1,000,000,000 bytes.
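As a quick sanity check, here is the comparison as a minimal Python sketch, using the decimal definitions above:

    # Decimal (SI) byte units, as defined above.
    MB = 1_000_000            # 1 megabyte = 1,000,000 bytes
    GB = 1_000_000_000        # 1 gigabyte = 1,000,000,000 bytes

    print(256 * MB)           # 256000000 bytes
    print(1 * GB)             # 1000000000 bytes
    print(256 * MB > 1 * GB)  # False: 256 MB is not more than 1 GB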
How many bytes are there in 32 bits?
A byte is 8 bits, so 32 bits is 4 bytes. (Conversely, 32 bytes is 256 bits.) Each bit is a single 1 or 0, so 256 bits give 2^256 possible combinations; in other words, 32 bytes can represent any whole number from 0 up to 2^256 - 1, which is about 1.15792089 × 10^77.
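Here is that arithmetic as a small Python sketch (assuming, as above, 1 byte = 8 bits):

    # Bit/byte conversions, assuming 1 byte = 8 bits.
    bits = 32
    print(bits // 8)      # 4 -> 32 bits is 4 bytes

    nbytes = 32
    nbits = nbytes * 8    # 256 -> 32 bytes is 256 bits
    print(2 ** nbits)     # distinct values 256 bits can hold:
                          # a 78-digit number, about 1.1579e77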
How many kilobytes are in 1 millibyte?
There is no such thing as a "millibyte." However, if you mean "megabyte," there are 1,000 kilobytes in a megabyte: a kilobyte is 1,000 bytes, and a megabyte is 1 million bytes.
Describe the manager roles in information technology?
The job (role) of a project manager is extremely challenging, and therefore exciting. Depending on the structure of your organization, you may report to a functional manager, a program manager, a portfolio manager, or to some other manager or executive. Nevertheless, it is your responsibility to work with your team and other relevant individuals and groups, such as program managers and portfolio managers, to bring all the pieces together and make the project happen, i.e., to achieve the project objectives.
To do this, you need a range of skills and capabilities. They are:
1. Communication
2. Negotiation
3. Problem Solving
4. Influencing
5. Leadership
How much time does 1 GB give you on the internet?
It depends entirely on what you do, and how you view the internet.
Going to websites full of multimedia (images and videos) will obviously give you a lot less time on the web.
According to a website I just tried out, 1 GB gives you around 40 hours of internet browsing, but that's not accounting for music downloads, video streaming, etc.
Some websites are better suited to browsing on a small download limit than others, and the trick is finding them.
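That 40-hour figure works out to roughly 25 MB of data per hour; here is the back-of-the-envelope arithmetic as a Python sketch (the per-hour rate is an assumption, and real usage varies widely):

    # Rough, illustrative estimate only.
    allowance_mb = 1000            # 1 GB, in decimal megabytes
    browsing_mb_per_hour = 25      # assumed average for light browsing
    print(allowance_mb / browsing_mb_per_hour)  # 40.0 hours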
Is a square wave an analog or digital signal?
This question has become the point of much heated debate here on WikiAnswers, and I will attempt to pare the answers down to their simplest form while keeping all valid viewpoints intact. Any further debate should be put in the "Discussion" area.
The Case For Digital: With a few exceptions, a square wave in its theoretical form will most likely be used in digital applications.
A square wave may be "analog" during the time it is in transition from one stable state to another, and "digital" when it is in one state or the opposite state (i.e., voltage level), according to the type of digital inputs stimulated, but a transition is not an analog state function in the context of computers or digital logic. A square wave has two states, high and low, and the transition from high to low can be switched at up to gigahertz (GHz) frequencies; test equipment manufacturers are able to generate square wave signals well above 100 GHz. Function generators provide square wave outputs whose transitions can be controlled.

Practically all digital circuits use these states as the "1" and "0" (or true and false) that computers use to make decisions at the machine language level, and all modern computers use this signalling scheme. But digital signals (1's and 0's) are not necessarily "square" waves, strictly speaking. A clock circuit generates a square wave that is used as a timing reference for the address, data, and control circuits.
The Case For Analog: All theoretical waveforms look great on paper, but it is impossible to produce any perfect waveform. Therefore, all waveforms (including square waves) are inherently analog.
By Fourier analysis, a square wave is actually an infinite series: the summation of sine waves, in this case the odd harmonics, each divided by its harmonic number N. As a result, a square wave can actually be considered an analog signal. Looking at this another way, no pulse driver, conductor, or transmission line is perfect, so the rise and fall times of a square wave are not instantaneous, meaning the series is not really infinite; so, again, the square wave is analog: quite complicated, but still analog. We call it digital because we sense it with discriminators that decide the "value" of the wave based on thresholds with appropriate hysteresis.
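To make the Fourier point concrete, here is a small Python/NumPy sketch of the partial series (the number of harmonics is arbitrary; with finitely many terms, the edges always have a finite rise time and the Gibbs overshoot appears):

    import numpy as np

    # Square wave as a partial Fourier series:
    # square(t) ~ (4/pi) * sum of sin(n*t)/n over odd n.
    t = np.linspace(0, 2 * np.pi, 1000)
    approx = np.zeros_like(t)
    for n in range(1, 40, 2):        # odd harmonics 1, 3, 5, ..., 39
        approx += np.sin(n * t) / n
    approx *= 4 / np.pi

    print(approx.max())  # slightly above 1: the Gibbs overshoot near each edge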
The Case For Sanity: A waveform is not inherently digital or analog. "Digital" and "analog" are arbitrary words used to describe an application of waveforms. In other words, all digital signals are square waves, but not all square waves are digital.
A square wave is usually used in digital applications, but is also used in audio applications for a "distorted" guitar or vocal effect. A sine wave can become a square wave if the input signal on an amplifier is too great for the power of the circuit to amplify, "clipping" the high and low portions of the sine wave.
A square wave can be considered an analog or a digital signal. A perfect square wave (not possible to generate, per the Fourier analysis above) would not in general be considered an analog signal, because it only has two values, defeating the point of analog (in many cases, but not all). One can make a square wave oscillator to use as an alarm, with different frequencies used to convey different information. A square wave is simple to create, which is the reason for using it as opposed to a sine wave. (This is an example of an analog use of square waves: the information is not stored in the two values of the wave, but in the infinite range of frequencies made available by the extra circuitry to change frequencies.)
This point is moot, really, because it is the nature of the data (being analog or digital) that truly is important and defines whether the wave should be categorized as "analog" or "digital".
Oh-So-Clever Analogy: A wrench is the perfect tool for fixing trucks, but if I use the wrench to fix a desk, that doesn't make the desk a truck. In short, the application defines the state, but does not limit the tool.
Can you download a sound card?
No. Sound cards are pieces of hardware. You can't download one any more than you could download a new car. You can, however, download a driver for your existing sound card, as long as you know its make and model.
Explain how a cold boot is different from a warm boot?
Warm boot: not all the processes shut down. A warm boot does not take as long, and many times users do not have to be off the system. Example: a configurable parameter was changed, and the documentation states that a warm boot is required for it to take effect; the documentation for that particular system will list the warm boot's criteria.

Cold boot: everything shuts down. Shutdown scripts stop all the processes (so nothing is missed), and all users must be off the system. Then you run the startup script (checking for errors) and test the system to make sure the changes you made took effect and there are no problems. Example: an operating system upgrade on the server.
What is the difference between paragraph and line spacing?
Line spacing is the spacing between two consecutive lines when you do NOT press the enter key.
Paragraph spacing is the space between two lines when you DO press the enter key.
In short: line spacing < paragraph spacing.
What is an infrared port for a laptop computer?
An infrared, or IrDA, port is a wireless serial port which uses infrared (invisible) light for transmission and reception of data. (Infrared light is also used in TV remote controls.)
The infrared port on a laptop computer can be used to transfer data to and from another computer or a mobile device, such as a PDA or mobile phone, without wires over short distances. An infrared port needs a line-of-sight communication path, i.e., to establish a connection and communicate reliably, the infrared ports of the two devices should face each other.
Not all laptops have infrared ports. Some have one built into the machine; for others, if needed, you will have to add an infrared adapter on a USB port.
- Neeraj Sharma
My Computer -> System Properties will only tell you about the components that the operating system has device drivers for. A physical inspection, on the other hand, might reveal more components, but they will only operate correctly if device drivers for them are installed.
"A specification from the DVD Forum that certifies DVD drives for media compatibility. Drives with the DVD Multi logo can read and write DVD-RAM, DVD-RW and DVD-R discs as well as read DVD-Video and DVD-ROM. DVD Multi drives may also be able to play DVD-Audio discs. See dvdand dvd-audio."
taken from dvd-multi
10 basic flowcharting symbols?
The five common flowchart symbols are:
1. Ellipse: to denote the beginning or end.
2. Rectangle: to show a process.
3. Diamond: to show decision making or a condition.
4. Lines (with and without arrows): to show the links.
5. Parallelogram: to show an input and/or output operation.
PC3200 is another way of writing PC400 or DDR 400. All three of those mean that the bus speed of the RAM module is 400 MHz. In similar fashion, PC2700 is PC333 or DDR 333 (333 MHz), and PC2100 is PC266 or DDR 266 (266 MHz).

PC3200 is a type of RAM, or random access memory, for your computer. There are several different types of RAM: for example PC100, PC133, PC2100, PC2700 and now PC3200. Each number represents a specific megahertz bus system. If you're going to be upgrading your RAM, make sure to look for PC3200 on the chip to ensure that it is compatible with your bus system.

For regular SDRAM in the PCxxx designation, the xxx is the memory clock rate in MHz, i.e. 133 MHz for PC133. For DDR SDRAM in the PCxxxx designation, the xxxx is the memory bandwidth in MB/s, i.e. PC3200 is 3200 MB/s. This corresponds to a 200 MHz clock (400 MHz "data rate"). Since DDR SDRAM uses a 64-bit wide bus (8 bytes wide), 8 bytes * 400 MHz = 3200 MB/s. Note these aren't "real" megabytes; they are found using 1 MB = 1000 * 1000 B, not 1024 * 1024 B.

Since all the computer companies can't seem to agree on a standard, there are many types of RAM, especially with all the improvements in speed and number of pins allowing for faster access times and more storage. Finding the correct type of RAM is very difficult. You must know the following: the form factor (SIMMs, DIMMs, and RIMMs), the type (SD, DDR SD, RD, DDR, DDR2, DDR3), and the speed (333, 400, 667, etc.). Because RAM is the most commonly upgraded component in a computer system, some company (I forget who) developed a system for telling which type of RAM is which. So instead of having to know all the above information, you just need to know the PC rating (like PC3200) and you're good to go.
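The DDR naming rule in that last formula is easy to compute yourself; here is a minimal Python sketch (the function name is made up for illustration):

    # Peak bandwidth of a DDR module: bus width (8 bytes) * data rate,
    # using decimal megabytes (1 MB = 1,000,000 B), as noted above.
    def pc_rating(data_rate_mhz, bus_bytes=8):
        return bus_bytes * data_rate_mhz   # MB/s

    print(pc_rating(400))  # 3200 -> marketed as PC3200 (200 MHz clock, 400 MHz data rate)
    print(pc_rating(333))  # 2664 -> marketed as PC2700
    print(pc_rating(266))  # 2128 -> marketed as PC2100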
If your computer uses PC3200, by all means get it.
PC2700 stands for DDR SDRAM that operates at 333 megahertz.
Mega means 1 million, approximately.
Hertz stands for cycles per second.
I know this all probably sounds like 'Greek' to you.
Your processor and RAM operate at a frequency rate. The frequency rate is usually just referred to as 'speed':
"What 'speed' does your processor run at?" "What 'speed' of RAM do you have?" Like that.
The frequency rate is determined by cycles per second. It gets much deeper than this, but I won't go into it now.
PC3200 is DDR SDRAM that operates at 400 MHz.
That's why I say go for PC3200.
Some things you need to know:
1. What is the maximum TOTAL amount of RAM for your computer (motherboard)?
2. What is the maximum amount for EACH RAM slot? Do you know if your RAM slots have a maximum of 1 GB? It's no fun to buy a 1 GB stick of RAM and find out your computer can't use it!
The portion of the internal bus that connects the processor to the memory cache is called?
Back-side bus (BSB)
How is the data-processing cycle important?
The data-processing cycle describes how data is processed into information by the computer. The input stage is the first stage of the data-processing cycle: data is collected and entered into the computer. In the processing stage, the computer converts data into information according to given instructions. After processing, the information is presented to users in the output stage. Information is stored on different types of media in the storage stage. The stored information can be used later for a different data-processing cycle. In this way, the data-processing cycle continues.
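Purely as an illustration, here is a toy Python sketch of those four stages, with storage feeding a later cycle (all names and values are made up for the example):

    import json

    def input_stage():
        return [3, 1, 2]                 # data is collected and entered

    def processing_stage(data):
        return sorted(data)              # data converted per given instructions

    def output_stage(info):
        print("result:", info)           # information presented to the user

    def storage_stage(info, path="result.json"):
        with open(path, "w") as f:       # information stored for a later cycle
            json.dump(info, f)

    info = processing_stage(input_stage())
    output_stage(info)
    storage_stage(info)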
Has TCP/IP replaced that protocol in general usage, or did another protocol replace it?
TCP/IP is the protocol used for the Internet, and it will not be replaced anytime soon. It replaced a protocol called IPX/SPX, which was used by Novell. TCP/IP is here to stay.
Measured in hertz and refers to how quickly a processor can work?
CPU clock speed is the property that is measured in hertz (Hz) and represents how quickly a processor carries out instructions. Modern-day CPUs have speeds that are expressed in gigahertz (GHz).
800 MHz DDR2 is the assigned DIMM.