What are the five main parts of information processing?
Information processing involves five phases: input, process, output, storage, and retrieval. Each phase has devices associated with it: keyboards and mice for input, the CPU for processing, monitors and printers for output, and hard disks for storage and retrieval.
Short answer: no, it's not. Longer answer: "memory" usually refers to RAM, which is the exact opposite of that. When your computer powers off, the RAM is cleared and everything in it is erased. Anything you want to keep when you turn the computer back on has to be saved to your hard disk.
What has more memory a laptop or a desktop?
It depends on what the manufacturer put in and whether the buyer added any. The technical term for memory is RAM (Random Access Memory). You'll have to check the specs to see which computer has more. Popular stock RAM sizes are 512 MB and 1 GB. In general, desktops have the capacity to hold more RAM than laptops. Some laptops now hold up to 4 GB of RAM, but you'll likely have to buy extra RAM to max it out. Some desktops now hold up to 16 GB of RAM. RAM is particularly important for video and sound editing. If you simply need to run Microsoft Office and surf the web, then a laptop with 1 GB of RAM is probably all you'll need.
How do you make an o with the line through it on the keyboard?
Alt+0248 (on number pad) will make the ø.
Also, with MS Word: go to Insert> symbol> enter Font: Arial> scroll down to insert this symbol Ø or many others.
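That Alt code works because 248 is the character's code point (the same value in Windows-1252, Latin-1, and Unicode). A quick Python sketch, assuming a Unicode-aware terminal, shows the connection:

```python
# 248 (0xF8) is the code point for "ø", which is why Alt+0248 on the
# numeric keypad produces it on Windows. The capital form is 216 (Alt+0216).
lower = chr(248)   # 'ø'
upper = chr(216)   # 'Ø'
print(lower, upper)
print(lower.upper() == upper)  # True: they are case pairs
```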
By combining these layers, the functionality is performed by a single layer and overhead is reduced. The disadvantage is that more functions need to be performed by a single layer.
The TCP/IP model was adopted and condensed into 4 Layers.
1. Link layer (Layers 1 and 2 of the OSI model)
2. Internet layer (Layer 3 of the OSI model)
3. Transport layer (Layer 4 of the OSI model)
4. Application layer (Layers 5, 6, and 7 of the OSI model)
http://en.wikipedia.org/wiki/TCP/IP_model#Link_Layer
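The common textbook mapping (following RFC 1122 conventions, where the Application layer absorbs OSI 5-7) can be written out as a small table; a sketch in Python:

```python
# Rough mapping of the 4 TCP/IP model layers onto the 7 OSI layers.
tcp_ip_to_osi = {
    "Link":        [1, 2],     # Physical + Data Link
    "Internet":    [3],        # Network
    "Transport":   [4],        # Transport
    "Application": [5, 6, 7],  # Session + Presentation + Application
}

# Sanity check: every OSI layer 1-7 is covered exactly once.
covered = sorted(n for layers in tcp_ip_to_osi.values() for n in layers)
print(covered)  # [1, 2, 3, 4, 5, 6, 7]
```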
The difference between buffer and cache?
Buffer
A buffer is a data area shared by hardware devices or program processes that operate at different speeds or with different sets of priorities. The buffer allows each device or process to operate without being held up by the other. For a buffer to be effective, its size needs to be considered carefully by the designer. Like a cache, a buffer is a "midpoint holding place," but it exists not so much to accelerate the speed of an activity as to support the coordination of separate activities.
The term is used not only in programming but in hardware as well. In programming, buffering is sometimes needed to hold data away from its final intended destination so that it can be edited or otherwise processed before being moved to a regular file or database.
Cache Memory
Cache memory is a type of random access memory (RAM) that the computer's microprocessor can access more quickly than regular RAM. As the microprocessor processes data, it looks first in the cache memory; if it finds the data there from a previous read, it does not need to do the more time-consuming read from larger, slower memory.
Cache memory is sometimes described in levels of closeness and convenience to the microprocessor. An L1 cache is on the same chip as the microprocessor.
In addition to dedicated cache memory, RAM itself acts as a cache for hard disk storage: the operating system is loaded from the hard disk into RAM when you turn on your computer, and the same happens later when you start new applications and access new data. RAM also contains a special area called a disk cache that holds the data most recently read in from the hard disk.
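The "look in the cache first" behavior described above is the same idea as memoization in software. A minimal Python sketch (the sleep is just a stand-in for a slow memory or disk access):

```python
import time

# A tiny software cache: check the fast dict before doing the slow work,
# mirroring how a CPU checks its cache before going to main memory.
cache = {}

def slow_square(n):
    time.sleep(0.01)          # stand-in for a slow read from "larger memory"
    return n * n

def cached_square(n):
    if n in cache:            # cache hit: skip the slow path entirely
        return cache[n]
    result = slow_square(n)   # cache miss: do the slow read...
    cache[n] = result         # ...and remember the result for next time
    return result

print(cached_square(12))  # 144 (miss: slow)
print(cached_square(12))  # 144 (hit: fast)
```

(Python's standard library offers the same pattern ready-made as `functools.lru_cache`.)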
Unix (officially trademarked as UNIX®, sometimes also written as Unix or Unix® with small caps) is a computer operating system originally developed in 1969 by a group of AT&T employees at Bell Labs including Ken Thompson, Dennis Ritchie and Douglas McIlroy.
It is important because, counting all of the various flavors of UNIX (FreeBSD, OpenBSD, Solaris, HP-UX, etc., and in some sense Mac OS X) and all of the Linux flavors (which are descendants of UNIX in spirit), they make up a large portion of all of the computers in operation.
Unix is important in and of itself because it was designed to handle multiple stations connecting to a central hub which, in turn, may itself be connected to other hubs. This defines a network with a star topology which, amazingly enough, is the same as the basic structure of the entire internet.
In a nutshell, UNIX is an operating system that at one point in time was the most prevalent operating system in use, and it is still widely used in scientific and professional circles. It was created in 1969 at AT&T Bell Labs by Ken Thompson and Dennis Ritchie. Unix is a family of multi-user operating systems with a very strong security model and a relatively simple design, making it popular and fairly easy to implement. Many contemporary operating systems, such as Solaris, HP-UX, AIX, Linux, and Mac OS X, are either based on or modeled after the first Unix systems.
The back button on a browser is used to go to the last page you were viewing in that tab or window. If you go to Google, click something else, and then click the back button, you will go back to Google.
What is the purpose of the cache?
A cache is a special kind of fast memory used to store copies of frequently used data so the data can be accessed more quickly than from its original, slower location.
What OSI Layer ensures packet delivery and retransmission?
Layer 4 (the Transport layer) is responsible for reliable delivery and retransmission; TCP is the classic example.
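A minimal loopback sketch in Python illustrates the point: the application just sends and receives bytes, while TCP (a Transport-layer protocol) handles acknowledgements and retransmission under the hood.

```python
import socket
import threading

# Minimal TCP echo over loopback. The application sees a reliable byte
# stream; any lost packets are retransmitted by TCP, not by our code.
def echo_once(server_sock):
    conn, _ = server_sock.accept()
    with conn:
        conn.sendall(conn.recv(1024))  # echo back whatever arrived

server = socket.socket(socket.AF_INET, socket.SOCK_STREAM)
server.bind(("127.0.0.1", 0))          # port 0 = let the OS pick a free port
server.listen(1)
port = server.getsockname()[1]
threading.Thread(target=echo_once, args=(server,), daemon=True).start()

client = socket.create_connection(("127.0.0.1", port))
client.sendall(b"hello")
reply = client.recv(1024)
client.close()
server.close()
print(reply)  # b'hello'
```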
It's just another name for non-interactive graphics.
Interactive and non-interactive graphics are a classification of computer graphics based on whether the graphics system responds to input from the user.
What is the pathname in a URL?
A pathname is the location of a file or object in the context of a file system.
In a URL, the pathname is the part after the host name that identifies a specific file or object on the web server (for example, /images/logo.png in http://example.com/images/logo.png).
NOPE. A kilobyte (KB) is not the smallest size a file can be; bits and bytes are smaller still.
Your computer's memory is usually measured in gigabytes (GB), and most games take up gigabytes of storage.
A picture or a Word document may only take up a few kilobytes.
The correct spellings, by the way, are kilobyte and gigabyte.
Differentiate between volatile and non-volatile memory?
Non-volatile memory, nonvolatile memory, NVM or non-volatile storage, is computer memory that can retain the stored information even when not powered. Examples of non-volatile memory include read-only memory, flash memory, most types of magnetic computer storage devices (e.g. hard disks, floppy disk drives, and magnetic tape), optical disc drives, and early computer storage methods such as paper tape and punch cards.
Non-volatile memory is typically used for the task of secondary storage, or long-term persistent storage. The most widely used form of primary storage today is a volatile form of random access memory (RAM), meaning that when the computer is shut down, anything contained in RAM is lost. Unfortunately, most forms of non-volatile memory have limitations that make them unsuitable for use as primary storage. Typically, non-volatile memory either costs more or performs worse than volatile random access memory.
Volatile memory, also known as volatile storage or primary storage, is computer memory that requires power to maintain the stored information, unlike non-volatile memory, which does not require a maintained power supply.
Most forms of modern random access memory are volatile storage, including dynamic random access memory (DRAM) and static random access memory (SRAM). Content-addressable memory and dual-ported RAM are usually implemented using volatile storage. Early volatile storage technologies include delay line memory and the Williams tube.
RAM = volatile memory
ROM = non-volatile memory
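The distinction can be illustrated by analogy in Python: a variable lives in RAM (volatile: gone when the process or machine powers off), while a file lives on disk (non-volatile: it survives). This is only an analogy, not a hardware demonstration.

```python
import os
import tempfile

# "Volatile": held only in the running process's RAM.
ram_data = "draft text"

# "Non-volatile": written through to disk, where it would survive a power-off.
path = os.path.join(tempfile.gettempdir(), "persist_demo.txt")
with open(path, "w") as f:
    f.write(ram_data)

with open(path) as f:      # read it back from persistent storage
    restored = f.read()
os.remove(path)            # clean up the demo file
print(restored)  # draft text
```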
What is larger 32 GB or a 8 GB?
GB is short for gigabyte. Gigabytes are units of storage for a computer, phone, iPod, or other electronic devices like that. 32 GB is larger: it will hold four times as much as 8 GB.
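The arithmetic is simple enough to sketch, along with the conversion to bytes (using the decimal SI definition; operating systems sometimes report binary GiB instead):

```python
# Decimal (SI) definition: 1 GB = 10**9 bytes. (1 GiB = 2**30 bytes.)
def gb_to_bytes(gb):
    return gb * 10**9

print(gb_to_bytes(32))                      # 32000000000
print(gb_to_bytes(32) // gb_to_bytes(8))    # 4 -> 32 GB holds 4x as much as 8 GB
```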
Why computer components called hardware?
Hardware is the physical core of the computer system: the components you can actually touch. All input is given through hardware devices such as keyboards and mice. The name "hardware" distinguishes these physical parts from software, the programs that run on them.
SMTP stands for Simple Mail Transfer Protocol. This is the generally accepted protocol for sending e-mail messages between servers; most e-mail systems that send mail over the Internet use SMTP to send messages from one server to another; the messages can then be retrieved with an e-mail client using either POP or IMAP.
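A sketch of the client side in Python's standard library: the message is built with `email.message`, and the actual send via `smtplib` is left commented out because it needs a reachable SMTP server. All addresses and the server name are made-up placeholders.

```python
import smtplib  # used only in the commented-out send below
from email.message import EmailMessage

# Build a standard e-mail message; addresses here are placeholders.
msg = EmailMessage()
msg["From"] = "alice@example.com"
msg["To"] = "bob@example.com"
msg["Subject"] = "Hello over SMTP"
msg.set_content("Sent from one server to another via SMTP.")

# Actually sending requires a real SMTP server, e.g.:
# with smtplib.SMTP("smtp.example.com", 587) as s:
#     s.starttls()
#     s.login("user", "password")
#     s.send_message(msg)

print(msg["Subject"])  # Hello over SMTP
```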
Is processor speed measured in GHz?
yes it is,
The term hertz represents repetitions per second; i.e., a 3000 hertz processor makes 3000 calculations per second. The term hertz also refers to anything that follows a cycle: a computer screen refreshes at about 70 hertz, or updates what you see 70 times a second.
Absolutely false on both accounts! To answer the above question, "Is the speed of a CPU measured in hertz?": the answer is NO! The performance of a CPU is approximated by a multitude of different criteria, including specific testing programs, depending on what functionality of the CPU needs to be tested and measured. This overall performance can vary widely depending on the testing program, all supporting hardware, and the preconditions of the testing environment. Regarding the original answer, these are all-too-common misconceptions these days! Unfortunately, you'll see ill-advised references to this throughout the periodicals, both online and off, and even from marketing departments that want to make technical "jargon" more palatable, but not necessarily accurate.
The metric of "speed," which involves physical movement, has absolutely NOTHING to do with frequency measured in hertz. The two are NOT the same thing and are not interchangeable. A high school or college physics class will also prove this. You certainly don't get in your vehicle, look at the speedometer, and read 750 MHz or 3 GHz. Conversely, I've yet to see an actual laptop or desktop personal computer get up and physically move around the room or the house! It even sounds silly!
How about some on-line proof? Check out the definitions on Wikipedia and elsewhere as cited below:
http://en.wikipedia.org/wiki/Speed
(Notice that nothing is stated about computing)
vs.
http://en.wikipedia.org/wiki/Hertz
http://en.wikipedia.org/wiki/Clock_rate
http://en.wikipedia.org/wiki/Frequency
Also reference the following:
Megahertz, for example, is abbreviated as MHz, kilohertz as kHz, and hertz as Hz. It is, in fact, ALWAYS a capital "H," in homage to the German physicist Heinrich Hertz. Consider:
http://www.ideafinder.com/history/inventors/hertz.htm
or,
http://searchnetworking.techtarget.com/sDefinition/0,,sid7_gci214263,00.HTML
(Scroll down and notice the table) or,
http://tf.nist.gov/timefreq/general/glossary.htm
(Click on "M" or "J-K" - these folks should know the difference)
Want further proof? Take a look at www.fcc.gov and note their frequency references. In addition, simply take a look at a stereo dial, clock radio, or even your transistor radio and notice how the manufacturers abbreviate frequency. Notice that this has NOTHING to do with "speed." You don't tune your radio to a different "speed," nor do you head down the highway at 2 kHz or 3.2 GHz. Obviously, they're not interchangeable! The point is they're entirely different metrics.
In addition, the above claim that "a 3000 hertz processor makes 3000 calculations per second" is also false. Different processor manufacturers such as AMD, Cyrix (now defunct), NEC, TI, Intel, and others handle calculations and instruction throughput differently. Depending on what specific instruction is being executed, it may take anywhere from a few to many processor cycles to finish the execution of any particular instruction. Meaning, it is NOT a one-to-one ratio in relation to the clock rate! In other words, a 3 GHz microprocessor does NOT execute 3 billion instructions (or calculations) per second!
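The point can be made numerically. Throughput is roughly clock rate times IPC (instructions per cycle), and IPC varies by CPU design and workload; the IPC figures below are hypothetical, chosen only to show the spread.

```python
# Instructions per second ~= clock rate (Hz) * IPC (instructions per cycle).
# IPC depends on the CPU's design and the workload, so clock rate alone
# does not determine throughput.
def instructions_per_second(clock_hz, ipc):
    return clock_hz * ipc

three_ghz = 3_000_000_000

# Hypothetical IPC values: the same 3 GHz clock yields very different results.
print(instructions_per_second(three_ghz, 0.5))  # 1.5 billion (stall-heavy workload)
print(instructions_per_second(three_ghz, 4.0))  # 12 billion (wide superscalar core)
```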
It was created by a standards body (many people contributing).
See the related link below.