Q: What is the difference between pervasive and ubiquitous computing?

Best Answer

I just found out that ubiquitous computing and pervasive computing aren't the same thing. "What?!?" you're saying. "I'm shocked." Yes, brace yourselves. This time it appears to be the scientists, not the marketers, who adopted everyday terms to describe their once-futuristic technology, making things very confusing now that other folks are using those ordinary words -- sometimes interchangeably -- without their particular nuances in mind.

Now, I'm not going to blame anybody here -- they're a lot smarter than I am, and they started their research a long time ago -- but I'm going to suggest that things have come far enough that there are easier ways to explain what is meant by these terms. First, let's look at what they mean.

Ubiquitous means everywhere. Pervasive means "diffused throughout every part of." In computing terms, those seem like somewhat similar concepts. Ubiquitous computing would be everywhere, and pervasive computing would be in all parts of your life.

That might mean the difference between seeing kiosks on every street corner and finding that you could -- or need to -- use your Palm handheld to do absolutely every information-based task.

And, in fact, that's where the difference between these two types of computing lies. Pervasive computing involves devices like handhelds -- small, easy-to-use devices -- through which we'll be able to get information on anything and everything. That's the sort of thing that Web-enabled cell phones promise. Ubiquitous computing, though, eschews our having to use computers at all. Instead, it's computing in the background, with technology embedded in the things we already use. That might be a car navigation system that, by accessing satellite pictures, alerts us to a traffic jam ahead, or an oven that shuts off when our food is cooked.
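
To make the contrast concrete, here's a minimal sketch of that oven example in Python. It's purely illustrative: the sensor and relay calls are made-up names, simulated so the snippet runs on its own. The point is what "computing in the background" looks like -- a control loop embedded in the appliance, with no user-facing computer anywhere.

```python
import time

TARGET_TEMP_C = 74.0   # shut-off threshold; the value is an arbitrary example
_sim_temp = 20.0       # simulated oven probe so the sketch runs as-is

def read_probe_temp() -> float:
    # Hypothetical sensor read; simulated here so the example is runnable.
    global _sim_temp
    _sim_temp += 3.0   # pretend the food heats a few degrees per poll
    return _sim_temp

def heater_off() -> None:
    # Hypothetical actuator; a real appliance would switch a relay instead.
    print("heater off -- food is done")

def control_loop(poll_seconds: float = 0.1) -> None:
    # The "ubiquitous" part: nobody operates a computer. The computation is
    # embedded in something people already use, and it acts on its own.
    while read_probe_temp() < TARGET_TEMP_C:
        time.sleep(poll_seconds)
    heater_off()

if __name__ == "__main__":
    control_loop()
```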

Where IBM is a leader in the pervasive computing universe -- it has a whole division, aptly called the Pervasive Computing division, devoted to it -- Xerox started the ubiquitous thing back in 1988.

Ubiquitous computing "helped kick off the recent boom in mobile computing research," notes its inventor, Mark Weiser, who came out with the concept at Xerox's Palo Alto Research Center, "although it is not the same thing as mobile computing, nor a superset nor a subset." That means that people who use ubiquitous computing to mean computing anytime, anyplace -- to describe hordes on a street corner checking their stock prices until the "walk" light comes on or efforts to dole out laptops to all students on a college campus -- aren't using the rightterm.

We don't really need to use either one. I'd be happy to call pervasive computing mobile computing, and to call ubiquitous computing embedded or invisible or transparent computing -- or even just built-in functions.

Besides, until either ubiquitous or pervasive computing is anywhere and everywhere, those alternatives seem more accurate.

Related questions

What is the difference between Grid computing and peer-to-peer Computing?

Grid computing links many separately administered machines and coordinates them, usually through central scheduling, to work on a common task. Peer-to-peer computing has no central coordinator: equal nodes share resources and communicate with each other directly.


What is the difference between manual computing device and automatic computing device?

A manual computing device, such as an abacus or a slide rule, needs a human to carry out every step of a calculation. An automatic computing device, such as a computer, executes stored instructions on its own once it is started.


What is the difference between supercomputer and distributed computing?

A supercomputer concentrates enormous computing power in a single, tightly coupled machine, while distributed computing spreads work across many networked machines. Supercomputers can run both parallel and distributed workloads.


Difference between super and mini computer?

The difference between a supercomputer and a minicomputer is their computing power: a supercomputer has vastly more of it than a minicomputer.


What is the difference between the newest and the oldest computer?

They differ mainly in hardware and in the computing languages they support.


Difference between ignore all and ignore in computing?

In a spelling checker, Ignore skips the flagged word once, while Ignore All skips every occurrence of that word for the rest of the document.


Difference Between Cloud Computing and SAAS?

There is a difference between cloud computing and SaaS, which stands for Software-as-a-Service. Cloud computing is the broader concept of using the Internet to access networks and services, and it is extremely scalable. SaaS is narrower: software that a provider manages remotely and delivers on a subscription basis. Like cloud computing, though, SaaS is massively scalable, since it is typically delivered over cloud infrastructure.


What is the difference between hard computing and soft computing?

Soft computing differs from conventional (hard) computing in that, unlike hard computing, it is tolerant of imprecision, uncertainty, partial truth, and approximation. In effect, the role model for soft computing is the human mind.
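
As a toy illustration in Python, here's the same question -- "is the room warm?" -- answered both ways. The fuzzy ramp below is a standard textbook membership function; the 18-26 degree breakpoints are arbitrary choices for the example, not anything from the answer above.

```python
def is_warm_hard(temp_c: float) -> bool:
    # Hard computing: a crisp, all-or-nothing answer.
    return temp_c >= 22.0

def is_warm_soft(temp_c: float) -> float:
    # Soft computing: a graded degree of truth in [0, 1], tolerant of the
    # imprecision in a word like "warm".
    if temp_c <= 18.0:
        return 0.0
    if temp_c >= 26.0:
        return 1.0
    return (temp_c - 18.0) / (26.0 - 18.0)

print(is_warm_hard(21.9), is_warm_soft(21.9))  # False, about 0.49
```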


What is the difference between information and computing?

Information is meaningful data. Computing is the process of doing something with data to produce information.
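
A small Python illustration of that distinction (the readings themselves are invented): raw numbers are just data, and computing over them produces information -- statements that mean something.

```python
readings = [18.2, 18.9, 19.4, 20.1, 20.8]   # raw data: hourly temperatures

average = sum(readings) / len(readings)      # the computing step...
rising = readings[-1] > readings[0]

# ...and the resulting information: meaningful facts about the data.
print(f"average {average:.1f} C, trend {'rising' if rising else 'falling'}")
```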


What is the difference between mobile and portable cellular phones?

Mobile computing: the user can move from one location to another and keep computing while moving. Portable computing: the user moves to another location, connects the laptop to a port there, and then performs computing.


Is there a website that shows the difference between cloud computing versus virtualization?

The ERP Software Blog has a helpful guide that distinguishes between cloud computing and virtualization. Tech Target is another website that breaks down the differences between virtualization, SaaS, and cloud computing.