Q: Why does a computer understand only 0 and 1?

Best Answer

It is understood that programming languages are used to create programs, and a program is a sequence of instructions written to perform a specified task with a computer. The important thing to note is that there are many different tasks, and different types of tasks, to be performed with a computer, so the facilities offered by any single programming language are not enough to accomplish all of them.

In other words, the features or purpose of one programming language differ from those of another (for example, HTML is used to create web pages, while C or shell programming can be used for system programming). Different programming languages are also used on different platforms to perform the same task (for example, Visual C# or Visual C++ can be used to create an application on Windows, but Objective-C is used to create the same application on Mac OS X). So different programming languages are used to write programs for different types of tasks because of the limits of what any single language offers.

Regardless of which programming language has been used, the compiler for that language compiles the program and produces an equivalent executable (such as an .exe file) in machine language (0s and 1s), which is what the computer understands.
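
As a rough illustration, here is a small Python sketch of that last point: a compiled executable is nothing but bytes, and each byte is just eight 0s and 1s. The file name "program.exe" is only a placeholder; any compiled file on your machine would do.

    # Sketch: peek at the raw bits of a compiled program.
    # "program.exe" is a placeholder path, not a real file from this answer.
    with open("program.exe", "rb") as f:
        first_bytes = f.read(8)          # read the first 8 bytes

    for b in first_bytes:
        print(format(b, "08b"))          # each byte printed as eight 0s and 1s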

Wiki User (11y ago)

More answers

Wiki User (9y ago)

It is far easier and less expensive to design and build electronic circuitry (and magnetic data storage) that can reliably distinguish between two different states than to build circuitry that can reliably distinguish among more states.

In fact, back in the 1940s and 1950s, when decimal computers were more popular than binary computers for many purposes (e.g. accounting), users thought of the machine as operating on decimal digits rather than binary bits, yet the internal coding the machine used for each digit was still composed of several bits (anywhere from 4 to 10 bits per digit, depending on the computer and the coding; a few computers even used more than one digit code in different parts of the machine).
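
For example, here is a small Python sketch, not tied to any particular historical machine, of the common 4-bits-per-digit scheme (binary-coded decimal, or BCD), where each decimal digit keeps its own small group of bits:

    # Sketch: encode a decimal number one digit at a time, 4 bits per digit (BCD).
    def to_bcd(number: int) -> str:
        return " ".join(format(int(d), "04b") for d in str(number))

    print(to_bcd(1953))   # -> 0001 1001 0101 0011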

Wiki User (14y ago)

Because 00111010101111010101010111101000101010101.

But no, seriously: it's because God created computers on the 10th day, thus computers talk in 0s and 1s.

Wiki User (14y ago)

Usually it's converted to other representations for human use. As far as the computer "sees" it, the number is always stored in binary.
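
A small Python sketch of that idea: the stored value itself is one pattern of bits, and decimal or hexadecimal are just different ways of displaying it for people. The value 202 here is an arbitrary example.

    # The same stored value, shown in the forms humans usually read.
    value = 202
    print(format(value, "08b"))   # 11001010  (how it is stored, as bits)
    print(value)                  # 202       (decimal, for people)
    print(hex(value))             # 0xca      (hexadecimal, for programmers)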

Continue Learning about Engineering

Why does the computer only understand the language of 0 and 1?

In computer circuitry, 1 is the presence of voltage and 0 is the lack of voltage; thus, 1 is "on" and 0 is "off." In the interest of packing more information into smaller spaces, bipolar representations emerged, in which a -1 is also considered "on." Since there can be no condition other than the presence or absence of voltage, using any additional symbol, such as a 2, will not work. And, I'm not a whiz at computer components, but I believe it is the processor (CPU) that determines the "on" or "off" values and assembles the bits into bytes, which it then compares to predetermined responses.
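
A hedged Python sketch of that last step, assembling bits into a byte: each on/off state is treated as a 1 or 0 and eight of them are combined into one value. The list of states here is purely illustrative, not how any particular CPU is wired.

    # Eight on/off states (1 = voltage present, 0 = no voltage) packed into one byte.
    wires = [0, 1, 0, 0, 0, 0, 0, 1]          # illustrative inputs, most significant bit first

    byte_value = 0
    for state in wires:
        byte_value = (byte_value << 1) | state   # shift left, then add the next bit

    print(byte_value)                          # 65
    print(format(byte_value, "08b"))           # 01000001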


What are the differences between machine languages and high level languages?

Machine language is what the machine (computer) itself understands; it consists only of 0s and 1s, i.e. binary code. A high-level language is one that can be read by human beings, since it uses English-like words.
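
As a rough analogy, here is a Python sketch of that contrast. Python compiles to bytecode rather than to the CPU's native machine code, so this is only an illustration, but the idea is the same: a readable high-level line is turned into numbered low-level instructions before it runs.

    import dis

    def add(a, b):
        return a + b          # readable, high-level

    dis.dis(add)              # prints the low-level instructions the line becomes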


What are the bits of a computer system?

0 and 1


How can 'pictures' be converted into a language that computers understand?

Everything a computer does originates as binary code, which is 1s and 0s (on or off). These are called bits, and they are grouped into bytes (for example, 10101100). A computer can store anything and everything you are able to put into it; what matters is whether it knows where to put the data and which software should interpret it. That is what applications and device software are for.
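
A minimal Python sketch, with made-up pixel values, of how a 'picture' becomes bits: each pixel's brightness is stored as a number, and each number is just a byte of 0s and 1s.

    # A tiny 2x2 grayscale "image": each pixel is a brightness from 0 to 255.
    pixels = [
        [  0, 255],    # black, white
        [128,  64],    # mid grey, dark grey
    ]

    for row in pixels:
        print(" ".join(format(p, "08b") for p in row))
    # 00000000 11111111
    # 10000000 01000000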


Changing the computer's language of 1s and 0s into characters that a person can understand is called?

decode
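
Decoding can be sketched in Python like this, turning groups of eight 0s and 1s back into characters a person can read. The bit string below is just an example, not taken from the question.

    # Decode: split the bit string into 8-bit groups, then map each group to its character.
    bits = "010010000110100100100001"          # three 8-bit groups

    chars = [chr(int(bits[i:i + 8], 2)) for i in range(0, len(bits), 8)]
    print("".join(chars))                      # Hi!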

Related questions

What does a computer understand?

Computers understand binary, in which 0 is "off" and 1 is "on."


If you type any data, why is the computer's language 1s and 0s?

Computers only understand binary, in which 0 is "off" and 1 is "on."


If you have natural numbers, then why do you use binary numbers in a computer?

Humans understand natural numbers (1, 2, 3, etc.), but computers only understand binary (0 and 1), where 0 is "off" and 1 is "on."
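
A short Python sketch of how a natural number is turned into the binary the computer stores, by repeatedly dividing by 2 and keeping the remainders:

    # Convert a natural number to binary by repeated division by 2.
    def to_binary(n: int) -> str:
        if n == 0:
            return "0"
        bits = ""
        while n > 0:
            bits = str(n % 2) + bits   # remainder becomes the next bit (right to left)
            n //= 2
        return bits

    print(to_binary(13))               # 1101
    print(int("1101", 2))              # 13 (and back again)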


What are the 1s and 0s used in computer data called?

The "1's and 0's" are referred to as binary. Binary is actually the only language a computer can understand. Everything else has to be translated to binary for it to understand it. 1 is conidered closed and 0 is open.


Why is all data stored in a computer in binary form?

Because binary (0 or 1) is the only format that the computer can understand. A transistor is either off or on. There is no other state.


Which language can a computer easily understand and execute?

A computer doesn't actually understand any language; it just processes binary numbers.


Why is the system called binary?

The binary number system is used in computers because a computer can understand only a language made of two symbols, 0 and 1; "bi-" means two, hence "binary." Using just two states makes the computer's circuitry simpler.


Why is the binary system used for computer systems?

Because internally the computer uses circuits, and a circuit can only be read as ON or OFF (1 or 0), i.e. a closed or an open circuit.


How is text represented in a computer?

A computer is a very simple machine that can only understand 1s and 0s; we just put many simple building blocks together to make it quite powerful. Everything we want a computer to work with must be converted into 1s and 0s, and for text we use, by convention, an encoding called ASCII to do this.


What are the only two numbers that computers use for processing?

0 and 1. There are only 10 types of people: those who understand binary and those who do not! If you do not understand that sentence, you do not understand binary.


How do computers use binary codes?

Computers have zero IQ. A computer can only sense "high voltage" or "low voltage", or, you could say, on and off. Computers use '0' for low voltage and '1' for high voltage. By using combinations of '0' and '1', all numbers and characters are encoded. For example, to write 'A', the computer uses the ASCII code assigned to that letter and stores the code in binary.
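
Following that example in a short Python sketch: the letter 'A' is looked up in ASCII, giving 65, and 65 is what the computer actually stores, as the bit pattern 01000001.

    letter = "A"
    code = ord(letter)                 # ASCII code assigned to 'A'
    print(code)                        # 65
    print(format(code, "08b"))         # 01000001 - the bits actually stored
    print(chr(code))                   # A - decoded back for the human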