
Computer Science

Computer Science is the systematic study of algorithmic processes that describe and transform information. It includes the theoretical foundations of information and computation and the practical techniques of applying those foundations to computer systems. Among the many subfields of Computer Science are computer graphics, computer programming, computational complexity theory, and human-computer interaction. Questions about Computer Science, its terminology (such as algorithms and proofs), and its methodologies are encouraged in this category.


What is a buffer and what is the benefit of it?

A buffer is a chunk of fast memory that caches data from a slower IO device.

When reading data from a disk, a great deal of time is spent "seeking": physically moving the read head and waiting for the disk to spin to the right place. If the application reads one byte at a time, a lot of time is wasted on a seek for every byte.

A buffer solves this problem. When accessing the disk, the system seeks once, then reads thousands of bytes into memory at once. Now, when the application reads one byte at a time, each byte comes from the buffer rather than from the disk. Depending on the size of the buffer, the effective seek time per byte can be made negligibly small.

Generally, there may be buffers in both the hardware and the operating system, which makes the process transparent at the programming level. When the external device is writable, care must be taken to keep the buffer contents consistent with the data on the device, for example by flushing buffered writes.
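As a rough illustration of buffering at the programming level, here is a minimal Python sketch contrasting an unbuffered stream with a buffered wrapper; the file name data.bin and the buffer size are hypothetical choices:

    import io

    # Open the file with buffering disabled; every read() goes to the OS/device.
    # "data.bin" is a hypothetical file name used only for illustration.
    raw = open("data.bin", "rb", buffering=0)

    # Wrap it in a buffer: the first read fills 64 KiB of memory at once,
    # and later single-byte reads are served from that memory.
    buffered = io.BufferedReader(raw, buffer_size=64 * 1024)

    while True:
        b = buffered.read(1)   # cheap: usually no device access at all
        if not b:              # empty bytes object means end of file
            break
        # ... process one byte ...

    buffered.close()           # also closes the underlying raw stream

In practice, Python's plain open() already returns a buffered stream by default; the explicit wrapper above only makes the layering visible.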

Is computer science a vernacular language?

"Vernacular" means in the common language of a place or group. For example, a vernacular mass in an English-speaking country would be in English instead of Latin. By that definition, computer languages are not vernacular languages.

What is a bug in a computer?

A bug is an unintended operation of a computer program. Computers nearly always "Do what I say", but they may fail to "Do what I mean". Thus, most bugs are the fault of the programmer or system architect, not the computer.

An unintended operation can still occur when the programmer or architect fails to consider a particular input or state. Often, the complexity of a computer model makes it difficult to anticipate all possible states or inputs. Various design principles are aimed at reducing complexity and eliminating common types of bugs.

A "soft bug" is a functionality that was actually intended by the system analyst, but which seems incorrect or puzzling to the end-user. In this case, the fault lies with the system analyst, rather than the programmer.

According to computer folklore, the term "bug" originated from an actual insect interfering with the computing hardware. While there were actual cases of this, the term "bug" in this sense predates computers.

What is called the art of making a computer do what you want it to do?

The art of making a computer do what you want it to do is called programming.

Why is hashing all database inputs not considered encryption of the database, and what value does hashing database entries add between server and client?

Hashing is not encryption: encryption is reversible with a key, whereas a hash is a one-way digest that cannot be decrypted. Simply hashing database entries also does not, by itself, make them safe, because attackers can run brute-force and dictionary attacks against the hashes, or look them up in precomputed rainbow tables.

With today's sophisticated GPUs, millions of candidate hashes can be tested per second with programs like John the Ripper and Hashcat. Salting the hashed passwords and information is safer, since it defeats precomputed tables, but it is still far from foolproof; slow, iterated hash functions add further protection.
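As a sketch of the salting advice above, using Python's standard hashlib module (the function names, salt length, and iteration count are illustrative choices, not a prescribed standard):

    import hashlib
    import hmac
    import os
    from typing import Optional, Tuple

    def hash_password(password: str, salt: Optional[bytes] = None) -> Tuple[bytes, bytes]:
        # A fresh random salt per password defeats precomputed (rainbow-table)
        # attacks; a high iteration count slows brute-force and dictionary attacks.
        if salt is None:
            salt = os.urandom(16)
        digest = hashlib.pbkdf2_hmac("sha256", password.encode(), salt, 600_000)
        return salt, digest

    def verify_password(password: str, salt: bytes, expected: bytes) -> bool:
        _, digest = hash_password(password, salt)
        return hmac.compare_digest(digest, expected)  # constant-time comparison

    salt, stored = hash_password("correct horse battery staple")
    print(verify_password("correct horse battery staple", salt, stored))  # True
    print(verify_password("guess", salt, stored))                         # False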

What are the properties of a finite state machine?

A finite state machine is defined by five properties: a finite set of states, a finite input alphabet, a transition function that maps each state/input pair to a next state, a designated start state, and a set of accepting (final) states. At any moment the machine is in exactly one state, and it has no memory beyond that current state; see the sketch below.
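As an illustration, here is a minimal sketch of a deterministic finite state machine in Python; the state names, the accepts helper, and the chosen language (binary strings with an even number of 0s) are all illustrative:

    # The transition table, state names, and the chosen language are
    # illustrative, not part of any standard definition.
    def accepts(input_string: str) -> bool:
        start = "even"                 # designated start state
        accepting = {"even"}           # set of accepting (final) states
        # transition function: (current state, input symbol) -> next state
        delta = {
            ("even", "0"): "odd",  ("even", "1"): "even",
            ("odd", "0"): "even",  ("odd", "1"): "odd",
        }
        state = start
        for symbol in input_string:    # exactly one move per input symbol
            state = delta[(state, symbol)]
        return state in accepting

    print(accepts("1001"))  # True: two 0s
    print(accepts("10"))    # False: one 0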

What are the two popular coding systems to represent data?

Different types of computer have slightly different character sets. Many character sets also include codes for control characters; these non-printing characters are used for special purposes. Examples of control characters are end-of-record and end-of-file markers in a file, carriage return and line feed for a printer, begin and end transmission for a modem, and cursor movement on a screen. Over the years, different computer designers have used different sets of codes for representing characters, which has led to great difficulty in transferring information from one computer to another. Most computers nowadays use internationally agreed character sets. Unless the coding scheme (character set) is standard, it will not be possible to exchange textual data between computers.

ASCII

ASCII (American Standard Code for Information Interchange) is the most widely used coding system to represent data. It is used on many personal computers and minicomputers. ASCII is a 7-bit code that permits 2^7 = 128 distinct characters, which is plenty to allow for all the letters, numbers and special symbols. An eighth bit was later added, allowing an extra 128 characters to be represented; the extra 128 combinations are used for symbols such as Ç, ü, è, ©, ®, Æ, etc.

In ASCII, the codes for the alphabetical characters indicate their relative positions in the alphabet. This is known as the collating sequence; thus, sorting textual items can be transformed into sorting the corresponding character codes. Also, uppercase characters, lowercase characters and digits are each grouped together, so it is easy to map between upper- and lowercase characters.

ASCII Value  Character    ASCII Value  Character    ASCII Value  Character    ASCII Value  Character
000          NUL          032          blank        064          @            096          '
001          SOH          033          !            065          A            097          a
002          STX          034          "            066          B            098          b
003          ETX          035          #            067          C            099          c
004          EOT          036          $            068          D            100          d
005          ENQ          037          %            069          E            101          e
006          ACK          038          &            070          F            102          f
007          BEL          039          '            071          G            103          g
008          BS           040          (            072          H            104          h
009          HT           041          )            073          I            105          i
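As a small illustration of the collating sequence and the fixed case offset described above, here is a sketch in Python (the to_upper helper is purely illustrative):

    # Codes follow alphabet order (the collating sequence), and upper- and
    # lowercase letters sit a constant 32 positions apart.
    print(ord("A"), ord("B"), ord("Z"))   # 65 66 90
    print(ord("a") - ord("A"))            # 32

    def to_upper(ch: str) -> str:
        # Map a lowercase ASCII letter to uppercase by subtracting the offset.
        if "a" <= ch <= "z":
            return chr(ord(ch) - 32)
        return ch

    print(to_upper("d"))    # D
    print(sorted("bca"))    # ['a', 'b', 'c'] : sorting text sorts the codes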


Why are AND, OR, and NOT gates called logically complete?

Because every more complicated logical operator can be constructed by combining just those three. A set of operations with this property is called functionally (logically) complete; the sketch below shows one such construction.
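For instance, here is a minimal sketch, in Python for readability, that builds XOR purely out of AND, OR, and NOT; the function names are illustrative:

    def NOT(a): return not a
    def AND(a, b): return a and b
    def OR(a, b): return a or b

    def XOR(a, b):
        # a XOR b == (a AND NOT b) OR (NOT a AND b)
        return OR(AND(a, NOT(b)), AND(NOT(a), b))

    # Print the full truth table: only rows with exactly one True input yield True.
    for a in (False, True):
        for b in (False, True):
            print(a, b, XOR(a, b))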

Why was the concept of Autonomous Systems introduced?

In the early 1980s, the routers (gateways) that made up the ARPANET (predecessor of the modern

Internet) ran a distance vector routing protocol known as the Gateway-to-Gateway Protocol (GGP).

Every gateway knew a route to every reachable network, at a distance measured in gateway hops.

As the ARPANET grew, its architects foresaw the same problem that administrators of many growing

internetworks encounter today: Their routing protocol did not scale well.

Eric Rosen, in RFC 827[1], chronicles the scalability problems:

l With all gateways knowing all routes, "the overhead of the routing algorithm becomes

excessively large." Whenever a topology change occurs, the likelihood of which increases with

the size of the internetwork, all gateways have to exchange routing information and

recalculate their tables. Even when the internetwork is in a steady state, the size of the

routing tables and routing updates becomes an increasing burden.

l As the number of GGP software implementations increases, and the hardware platforms on

which they are implemented become more diverse, "it becomes impossible to regard the

Internet as an integrated communications system." Specifically, maintenance and

troubleshooting become "nearly impossible."

l As the number of gateways grows, so does the number of gateway administrators. As a

result, resistance to software upgrades increases: "[A]ny proposed change must be made in

too many different places by too many different people."

The solution proposed in RFC 827 was that the ARPANET be migrated from a single internetwork to a

system of interconnected, autonomously controlled internetworks. Within each internetwork, known

as an autonomous system (AS), the administrative authority for that AS is free to manage the

internetwork as it chooses. In effect, the concept of autonomous systems broadens the scope of

internetworking and adds a new layer of hierarchy. Where there was a single internetwork-a

network of networks-there is now a network of autonomous systems, each of which is itself an

internetwork. And just as a network is identified by an IP address, an AS is identified by an

autonomous system number. An AS number is a 16-bit number assigned by the same addressing

authority that assigns IP addresses.


Are user name and user ID the same?

Yes, they are the same; the name exists to authenticate you. It must be unique, and it can consist of letters, numbers, and a limited set of special characters (e.g., _ and .).

What are the rules of interface?

The rules of interface design typically emphasize clarity, consistency, feedback, and simplicity. Clarity ensures that users can easily understand how to interact with the interface. Consistency promotes familiarity by using uniform elements and behavior throughout the design. Feedback informs users about their actions, while simplicity focuses on minimizing complexity to enhance usability.

Can all computations and logic operations including Boolean logic be broken down and equated with elementary arithmetic?

They can all be broken down to three basic operations: AND, OR, and NOT. It is even possible to break them down to a single operation, for example NAND (NOT-AND), although the result is harder to read. Over the domain {0, 1}, these operations also correspond to elementary arithmetic: NOT x = 1 - x, x AND y = x * y, and x OR y = x + y - x*y. The sketch below verifies both claims.
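Here is a minimal sketch, assuming Python as the notation, that builds NOT, AND, and OR from NAND alone and checks the arithmetic identities over {0, 1}; all function names are illustrative:

    def nand(a, b):
        return 1 - a * b                       # NAND expressed arithmetically

    def not_(a):    return nand(a, a)              # NOT x   = x NAND x
    def and_(a, b): return not_(nand(a, b))        # x AND y = NOT (x NAND y)
    def or_(a, b):  return nand(not_(a), not_(b))  # x OR y  = (NOT x) NAND (NOT y)

    for a in (0, 1):
        for b in (0, 1):
            assert not_(a) == 1 - a            # NOT x   = 1 - x
            assert and_(a, b) == a * b         # x AND y = x * y
            assert or_(a, b) == a + b - a * b  # x OR y  = x + y - x*y
    print("all identities hold")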

What is MR-DC in 8086?

It is MRDC (Memory Read Command), a maximum-mode control signal of the 8086 microprocessor.

What is a computer science diploma?

It's a diploma or certificate stating that you are qualified to understand and repair computers and servers.

What is the importance of a unit distance code?

A unit distance code is one in which any two successive values differ in exactly one bit position; the Gray code is the classic example. (Plain BCD, by contrast, is a weighted code, not a unit distance code.) Its importance is that when a value changes to the next one, only a single bit switches, so there is never a transient moment where several bits are mid-change and a reader sees a spurious intermediate value. This makes unit distance codes useful in position encoders and other asynchronous interfaces; see the sketch below.
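As an illustration, here is a minimal Python sketch of the standard binary-to-Gray conversion and its inverse (the function names are illustrative):

    def binary_to_gray(n: int) -> int:
        return n ^ (n >> 1)          # standard binary-to-Gray conversion

    def gray_to_binary(g: int) -> int:
        n = 0
        while g:                     # fold the shifted copies back out
            n ^= g
            g >>= 1
        return n

    for i in range(8):
        print(i, format(binary_to_gray(i), "03b"))
    # 000 001 011 010 110 111 101 100 : each step flips exactly one bit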

Which thing should be learned by engineering students of computer science besides the course?

Every engineering student should make sure to stay aware of currently evolving technologies (which won't be taught in the course). Learn about basic web development and GUI-based application development; these will assist you when you are doing projects as part of the course.

Follow technology news, read about developments in open source communities, and always stay up to date on the latest computer hardware and software.