
Computer Science

Computer Science is the systematic study of algorithmic processes that describe and transform information. It includes the theoretical foundations of information and computation and the practical techniques of applying those foundations to computer systems. Among the many subfields of Computer Science are computer graphics, computer programming, computational complexity theory, and human-computer interaction. Questions about Computer Science, its terminology (such as algorithms and proofs), and its methodologies are encouraged in this category.

1,839 Questions

What is the difference between ICT and IT?

ICT stands for 'Information and Communication Technology'. IT stands for 'Information Technology'. The two are very similar. The main difference is that IT is more widely used within industry, whereas ICT is applied to the academic and education side and used in places of learning, such as schools, colleges and universities.

Why have developments in IT helped to increase the value of the data resources of many companies?

Developments in IT have helped increase the value of many companies' data resources because IT has enabled a new data resource management regime: one that emphasizes a high-quality, integrated, comparable data resource built within a common data architecture that remains stable across changing business needs and changing technology. Information technology supports both the current and future business information demands of an organization. In addition, IT allows data to be managed from a single location while being accessed and used at multiple locations at the same time. The result is greater efficiency and productivity and lower operating costs within the organization.

Can you do marine engineering after diploma in Computer engineering?

I'm sure you can, but consider:

  • Do you have any scuba diving qualifications or experience?
  • What kind of technology were you working on?
  • Do you know anyone in the business?

With a diploma in engineering, I'm pretty sure you can move into almost any engineering-based profession you want.

If you make it your goal, yes, you can.

2nd PUC Computer Science question papers 2008?

Many answers to many tests have been posted online. The PUC Computer Science test seems to have very specific answers to questions available online, although some answers cost money.

What is the difference between artificial intelligence and an artificial neural network?

The expression "man-made consciousness" traces all the way back to the mid-1950s, when mathematician John McCarthy, broadly perceived as the dad of AI, utilized it to portray machines that do things individuals may call shrewd. He and Marvin Minsky, whose work was similarly as compelling in the AI field, coordinated the Dartmouth Summer Research Project on Artificial Intelligence in 1956. A couple of years after the fact, with McCarthy on the personnel, MIT established its Artificial Intelligence Project, later the AI Lab. It converged with the Laboratory for Computer Science (LCS) in 2003 and was renamed the Computer Science and Artificial Intelligence Laboratory, or CSAIL.

Now a pervasive part of modern culture, AI refers to any machine that can reproduce human cognitive abilities, such as problem solving. Over the second half of the twentieth century, machine learning emerged as a powerful AI approach that allows computers to, as the name implies, learn from input data without being explicitly programmed. One technique used in machine learning is the neural network, which draws inspiration from the biology of the brain, passing data between layers of so-called artificial neurons. The very first artificial neural network was built by Minsky as a graduate student in 1951 (see "Learning Machine, 1951"), but the approach was limited at first, and even Minsky himself soon turned his focus to other approaches for making intelligent machines. In recent years, neural networks have made a comeback, particularly for a form of machine learning called deep learning, which can use very large, complex neural networks. In short: artificial intelligence is the broad goal of making machines behave intelligently, while an artificial neural network is one specific technique for pursuing it.
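To make the layered idea concrete, here is a minimal sketch in plain Python (the weights, inputs, and layer size are made up for illustration; this is not any particular library's API) of data passing through one layer of artificial neurons:

    import math

    def neuron_layer(inputs, weights, biases):
        # Each artificial neuron computes a weighted sum of its inputs
        # plus a bias, then applies a nonlinearity (the logistic sigmoid).
        outputs = []
        for w_row, b in zip(weights, biases):
            z = sum(x * w for x, w in zip(inputs, w_row)) + b
            outputs.append(1.0 / (1.0 + math.exp(-z)))
        return outputs

    # Two inputs feeding a layer of three neurons (illustrative weights).
    x = [0.5, -1.0]
    W = [[0.1, 0.8], [-0.3, 0.2], [0.7, -0.5]]
    b = [0.0, 0.1, -0.2]
    print(neuron_layer(x, W, b))

A deep network simply stacks many such layers, feeding each layer's outputs into the next.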


What is meant by the term multitasking on a computer?

Multitasking is an operating system's ability to run several programs at once by rapidly switching the processor between them, so that each program appears to run continuously (a short sketch of the idea follows the quiz below).

A related quiz asks which of the following is not a function of an operating system:

1. controls basic input and output

2. allocates system resources

3. manages storage space

4. carries out a specific task for the user

The answer is 4: carrying out a specific task for the user is the job of application software, not the operating system.
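Returning to multitasking itself, here is a minimal Python sketch (the task names and timings are made up): two independent processes are started, and the operating system schedules them concurrently, interleaving their output:

    import multiprocessing
    import time

    def count(label, n):
        # Each process runs independently; the OS switches the CPU
        # between them, so the two labels interleave in the output.
        for i in range(n):
            print(f"{label}: {i}")
            time.sleep(0.1)

    if __name__ == "__main__":
        a = multiprocessing.Process(target=count, args=("task A", 5))
        b = multiprocessing.Process(target=count, args=("task B", 5))
        a.start()
        b.start()
        a.join()
        b.join()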

What is the history of sixth sense technology?

SixthSense is a wearable gestural interface device that augments the physical world with digital information and lets people use natural hand gestures to interact with that information.

It was developed by Pranav Mistry, a PhD student in the Fluid Interfaces Group at the MIT Media Lab.

How do you think computers will be used in school in the future?

Educational software will continue to improve to the point where artificial personalities deliver lectures and instruction tailored to each student's personal needs. The computers themselves will be flexible touch screens one can roll up and store in a pocket, and clean by rinsing with warm water. They will come with cameras equipped with magnifying lenses for identifying objects and minerals, and giving students close-up views. They will also be able to patch into orbiting telescopes to provide real-time views of the planetary surface for maps, as well as celestial objects.

Imagine classrooms without walls. School will occur on the subway, park benches, lawns, up in a tree, while mountain climbing, or anywhere the student happens to be and can spend time interacting with the "magic scroll." Tests and coursework will be designed to incorporate the student's immediate environment. Virtually anything and everything can be used to create an educational atmosphere. The beauty of computers is that they never get bored, and can conduct math drills until they detect signs of weariness in the pupil.

Imagine each student progressing at his or her own pace, unencumbered by the knowledge limits of the professor or needs of other students. You want to learn about Galois Groups--the information is readily available, and presented at a level suitable for the individual student at that time. Real world examples will be presented, and the software can design suitable test questions on the fly.

Currently the man/machine interface consists of awkward input devices like keyboards, mice, touch screens and microphones for voice recognition. Further into the future, computers will be implanted and accessible by thought. The human/computer interface may involve artificial symbiotes--organisms designed to assist the human/machine connection. Education by then will not consist of rote memorization of information, but methods for compiling, accessing and using databases. Imagine a world where you can communicate with anyone, anywhere, at any time--and not only that but view the world through their eyes, hear what they hear, smell, feel and experience the world just as they do. All experiences can be readily shared. Problems will be solved by crowdsourcing. Instead of just one or two collaborators endeavoring to solve some particular problem in physics, tens of thousands or hundreds of thousands of people may weigh in, each offering their own unique ideas and suggestions. We expect the rate of technological progress to increase significantly under these circumstances.

What are the advantages of threads in Operating System?

In simple terms, threads let a single program do several things at once. Because threads within a process share the same memory and resources, they are cheaper to create and switch between than separate processes, and they let a program stay responsive while slow work (such as file or network I/O) happens in the background, as the sketch below illustrates.
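A minimal Python sketch (the task names and timings are made up): two simulated I/O operations run in separate threads, so the total wait is about two seconds rather than four:

    import threading
    import time

    def slow_io(name, seconds):
        # Simulates a blocking I/O operation (e.g., a network request).
        print(f"{name}: started")
        time.sleep(seconds)
        print(f"{name}: finished")

    # Each thread blocks independently, so the waits overlap.
    t1 = threading.Thread(target=slow_io, args=("download-1", 2))
    t2 = threading.Thread(target=slow_io, args=("download-2", 2))
    start = time.time()
    t1.start()
    t2.start()
    t1.join()
    t2.join()
    print(f"elapsed: {time.time() - start:.1f}s")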

Who is known as the father of computer science?

You may never have heard of him, but it is largely thanks to his genius that you are reading this on your computer.

Alan Turing, a pioneering 20th century mathematician, is widely considered to be the father of modern computer science. Turing was born in London in 1912 into an upper-middle class family and displayed a fascination for science throughout childhood. It was his idea of creating a machine to turn thought processes into numbers that was one of the key turning points in the history of electronic boxes and screens.

A tape machine

While reading maths at King's College, Cambridge, in the 1930s, Turing spent much time reworking earlier scientific principles and developing his most significant mathematical theories. Despite his brilliance, he suffered from a feeling of isolation, and found it difficult to make friends. After graduating, Turing was elected a fellow of King's, and worked at Princeton in the US, where he continued work on what became the theoretical blueprint for the digital computer - the "Turing machine".

His revolutionary idea was for a machine that would read a series of ones and zeros from a tape. These described the steps needed to solve a problem or task. But it was not until nine years later that technology had advanced sufficiently to transfer these ideas into engineering.
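To make the tape-machine idea concrete, here is a minimal sketch in Python (the rule table is made up for illustration; it simply inverts every bit on the tape, and is not one of Turing's own machines):

    # rules[(state, symbol)] = (symbol_to_write, head_move, next_state)
    rules = {
        ("scan", "0"): ("1", +1, "scan"),
        ("scan", "1"): ("0", +1, "scan"),
        ("scan", " "): (" ", 0, "halt"),  # blank cell: stop
    }

    def run(tape, state="scan", head=0):
        tape = list(tape) + [" "]  # blank-terminated tape
        while state != "halt":
            write, move, state = rules[(state, tape[head])]
            tape[head] = write
            head += move
        return "".join(tape).rstrip()

    print(run("10110"))  # -> 01001

The machine reads one symbol at a time, consults a fixed table of rules, writes a symbol back, and moves the head - exactly the read/write/step cycle Turing described.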

Cracking the code

Turing's work is credited with helping Britain win World War II by deciphering encrypted German communications, helping the Allies remain one step ahead. The German Enigma machine generated a constantly changing code which was all but impossible for people to decipher. Turing's electromechanical Bombe machines at Bletchley Park managed to crack Enigma's settings, giving the Allies the break they desperately needed in fighting Germany. (Colossus, one of the first steps toward a digital computer, was built by Tommy Flowers to attack the separate Lorenz cipher used by the German high command.)

After WWII, Turing took up long-distance running to relieve stress, and obtained record times in races for the Walton Athletic Club. He went to work for the National Physical Laboratory and continued research into digital computers, including developing the Automatic Computing Engine.

Intelligence without life

His research overlapped with philosophy, raising questions about the relationship between computers and nature. He wrote a paper called Intelligent Machinery, which was later published in 1969. This was one of the first times the concept of artificial intelligence was raised. Turing believed that machines could be created that would mimic the processes of the human brain. He acknowledged the difficulty people would have accepting a machine to rival their own intelligence, a problem that still plagues artificial intelligence today.

He likened new technology devices such as cameras and microphones to parts of the human body, and his views often landed him in heated debates with other scientists. Turing believed an intelligent machine could be created by following the blueprints of the human brain. He wrote a paper in 1950 describing what is now known as the Turing Test. The test consists of a person posing questions via keyboard to both a hidden person and a machine. Turing argued that if the computer's answers could not be distinguished from those of the person after a reasonable amount of time, the machine could be considered somewhat intelligent. This test has become a standard benchmark in the artificial intelligence community.

Rebel with a cause

Turing was accustomed to being a nonconformist. At boarding school, he refused to adapt and ignored subjects that did not interest him. He was an atheist, and felt marginalised because of his homosexuality.

Turing's life came to a sad end when he committed suicide by taking potassium cyanide in June 1954.

The official explanation was that it was a "moment of mental imbalance". But his mother said he used to experiment with household chemicals, trying to create new substances and became careless. Others claimed he was embarrassed about his sexuality.

When he died, Turing left the world a permanent legacy. Computers have revolutionised so many aspects of our world that today it is hard to imagine life without them.

What is the similarity of a human brain and a computer?

The main similarity is that both take in information, process it, store it, and produce output. Beyond that, the differences dominate: computers cannot think for themselves the way humans can. Computers are fabricated; humans are born. Humans are breathing, living creatures, while computers are just wires and programs. Humans have an intelligent brain; a computer does not.

You want UML diagrams for library management system?

An Online Public Access Catalog (OPAC) is an e-library website that is part of an Integrated Library System (ILS), also known as a Library Management System (LMS), and is managed by a library or group of libraries.

Patrons of the library can search the library catalog online to locate various resources - books, periodicals, audio and visual materials, or other items under the control of the library. Patrons may reserve or renew items, provide feedback, and manage their accounts.

See the example UML diagram in the link.
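As a textual stand-in for a class diagram, here is a minimal Python sketch of the domain model described above (all class and attribute names are made up for illustration; a UML class diagram would show the same classes and their associations):

    from dataclasses import dataclass, field
    from typing import Dict, List, Optional

    @dataclass
    class CatalogItem:
        item_id: str
        title: str
        kind: str                      # "book", "periodical", "audio", ...
        reserved_by: Optional[str] = None

    @dataclass
    class Patron:
        patron_id: str
        name: str
        loans: List[str] = field(default_factory=list)

    class Catalog:
        # OPAC-style catalog: search and reserve mirror the
        # patron operations described above.
        def __init__(self):
            self.items: Dict[str, CatalogItem] = {}

        def add(self, item):
            self.items[item.item_id] = item

        def search(self, text):
            # Locate resources by case-insensitive title match.
            return [i for i in self.items.values()
                    if text.lower() in i.title.lower()]

        def reserve(self, item_id, patron):
            item = self.items.get(item_id)
            if item is not None and item.reserved_by is None:
                item.reserved_by = patron.patron_id
                return True
            return False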

When was the digital computer invented?

Some believe the first such invention occurred in 1944. However, there is strong evidence that the Atanasoff-Berry Computer was invented and "reduced to practice" in 1939 at Iowa State University, and that portions were copied by the ENIAC inventors. This was determined in 1973 by a federal court presiding over a patent case between Honeywell and Sperry Rand. In a parallel development, Colossus, built by the British, was the first electronic computer used to crack the Lorenz codes of the German high command. The existence of Colossus remained a secret long after WWII. Until recently ENIAC was thought to be the first (although later than the Atanasoff-Berry), but once the secrecy around Colossus was finally lifted we learned that it (and nine others) were first operational in January 1944, while various portions of ENIAC became operational between June 1944 and October 1945. Yet others believe earlier calculators qualify as the first "digital computers".

Why was the Atanasoff-Berry computer invented?

Because Dr. John Vincent Atanasoff had too many physics problems to solve that required computing systems of simultaneous equations much too large for the manual methods of the time, using either slide rules or mechanical desk calculators.
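The standard method for such systems is Gaussian elimination, which the machine mechanized; here is a minimal Python sketch of the idea (written for clarity, not as a reconstruction of the machine's actual procedure):

    def solve(a, b):
        # Solve a.x = b by Gaussian elimination with partial pivoting.
        n = len(b)
        m = [row[:] + [bi] for row, bi in zip(a, b)]  # augmented matrix
        for col in range(n):
            # Pivot: swap in the row with the largest entry in this column.
            piv = max(range(col, n), key=lambda r: abs(m[r][col]))
            m[col], m[piv] = m[piv], m[col]
            # Eliminate this column from the rows below the pivot.
            for r in range(col + 1, n):
                f = m[r][col] / m[col][col]
                for c in range(col, n + 1):
                    m[r][c] -= f * m[col][c]
        # Back-substitution from the last row upward.
        x = [0.0] * n
        for r in range(n - 1, -1, -1):
            s = sum(m[r][c] * x[c] for c in range(r + 1, n))
            x[r] = (m[r][n] - s) / m[r][r]
        return x

    # Example: 2x + y = 5 and x + 3y = 10 give x = 1, y = 3.
    print(solve([[2.0, 1.0], [1.0, 3.0]], [5.0, 10.0]))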

Is a video camera an input or output device?

A video camera is an input device: it captures video and sends the information to a computer or a storage device, which can in turn display the captured images on a TV or monitor. A component that merely plays back a recording to a display, such as a DVD/VHS player, acts as a video source for that display rather than as a computer input device.

When was the first modern computer invented?

In the '50s.

-tshay


That depends on how you define 'modern-day computers'. The first electronic computers were developed in the mid-20th century (1940-1945), and were the size of a large room. Personal computers (PCs) only came about in the 1970s, after the microprocessor was introduced.


It's impossible to pin down an exact date of invention or inventor, as multiple companies worked on developing the computer as we know it. You could check out the release dates for certain milestone computers, such as the Scelbi, Mark-8, Altair, and IBM 5100 (1974-75), but all in all, the development of the computer was a progressive collaboration.

Who invented the first computer and what year?

It is believed that the ancient Greeks built an analog computer, the Antikythera mechanism, circa 100 BC; it has sometimes been attributed to the school of Archimedes. An example was found by sponge divers in the early 1900s on a shipwreck.

Charles Babbage designed a digital computer, the Analytical Engine, in the 1830s. This machine was mechanical but was never built, mostly because Babbage could not convince anyone to finance it.

John Vincent Atanasoff conceived the electronic digital computer in 1937, famously sketching out the key ideas over a drink in a roadside bar. The machine was complete and operational in early 1942, but saw little use, as Atanasoff and his assistant Clifford Berry were soon called away to war-related work. Iowa State University soon scrapped it to make room for more classrooms: although it was roughly the size of a desk and on casters, it had mistakenly been built too wide to fit through a doorway.

Tommy Flowers invented the programmable electronic digital computer in 1943. Ten of these machines were used at Bletchley Park to break the German high command "fish" cypher.

The inventor of the modern stored program electronic digital computer is unknown and the exact date is unknown, but it was probably sometime in 1944. The idea came up in discussions on Project PX, the WW2 project to build ENIAC, but it was decided not to use it in ENIAC as this would delay construction. John von Neumann is often incorrectly credited with this invention as he wrote the first paper on it, which was prematurely released in draft form before proper credits could be given. The first such computer was not built until 1948 in Manchester, England.

What is a mini computer used for?

The wonderful thing about mini computers is that they can be used just about anywhere. Long battery life allows the unit to be used at the beach, at the pool, or anywhere power is not available.

Who invented the small computer?

This is usually credited to Ted Hoff for the Intel 4004.

However, a secret military microcomputer built for the F-14 fighter predates his work by roughly three years. It was not declassified until the late 1990s. It has no single identified inventor, as it was a team project.