I just found out that ubiquitous computing and pervasive computing aren't the same thing. "What?!?" you're saying. "I'm shocked." Yes, brace yourselves. This time it appears to be the scientists, not the marketers, who adopted everyday terms to describe their once-futuristic technology, making things very confusing now that other folks are using those ordinary words -- sometimes interchangeably -- without their particular nuances in mind.

Now, I'm not going to blame anybody here -- they're a lot smarter than I am, and they started their research a long time ago -- but I'm going to suggest that things have come far enough that there are easier ways to explain what is meant by these terms. First, let's look at what they mean.

Ubiquitous means everywhere. Pervasive means "diffused throughout every part of." In computing terms, those seem like somewhat similar concepts. Ubiquitous computing would be everywhere, and pervasive computing would be in all parts of your life.

That might mean the difference between seeing kiosks on every street corner and finding that you could -- or need to -- use your Palm handheld to do absolutely every information-based task.

And, in fact, that's where the difference between these two types of computing lies. Pervasive computing involves devices like handhelds -- small, easy-to-use devices -- through which we'll be able to get information on anything and everything. That's the sort of thing that Web-enabled cell phones promise. Ubiquitous computing, though, eschews our having to use computers at all. Instead, it's computing in the background, with technology embedded in the things we already use. That might be a car navigation system that, by accessing satellite pictures, alerts us to a traffic jam ahead, or an oven that shuts off when our food is cooked.
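To make that distinction concrete, here's a minimal sketch in Python of the ubiquitous, computing-in-the-background idea -- an oven controller that shuts the heat off on its own. The Oven class, the sensor reading, and the thresholds are all hypothetical, not any real appliance's API.

```python
# A sketch of "invisible" (ubiquitous) computing: the computer is embedded
# in an everyday appliance and acts without the user touching a computer.
# The Oven class, sensor reading, and thresholds are all hypothetical.

class Oven:
    def __init__(self, target_temp_c, cook_minutes):
        self.target_temp_c = target_temp_c
        self.seconds_left = cook_minutes * 60
        self.heating = True

    def read_core_temp(self):
        return 74.0  # placeholder for a real temperature probe

    def tick(self, elapsed_seconds):
        """Called periodically by the embedded controller."""
        self.seconds_left -= elapsed_seconds
        done = self.seconds_left <= 0 or self.read_core_temp() >= self.target_temp_c
        if done and self.heating:
            self.heating = False  # shut off automatically; no user action needed
            print("Oven: food is cooked, heat turned off.")

oven = Oven(target_temp_c=74, cook_minutes=45)
oven.tick(elapsed_seconds=2700)  # simulate 45 minutes passing
```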

Where IBM is a leader in the pervasive computing universe -- it has a whole division, aptly called the Pervasive Computing division, devoted to it -- Xerox started the ubiquitous thing back in 1988.

Ubiquitous computing "helped kick off the recent boom in mobile computing research," notes its inventor, Mark Weiser, who came out with the concept at Xerox's Palo Alto Research Center, "although it is not the same thing as mobile computing, nor a superset nor a subset." That means that people who use ubiquitous computing to mean computing anytime, anyplace -- to describe hordes on a street corner checking their stock prices until the "walk" light comes on or efforts to dole out laptops to all students on a college campus -- aren't using the rightterm.

We don't really need to use either one. I'd be happy to call pervasive computing mobile computing, and to call ubiquitous computing embedded or invisible or transparent computing -- or even just built-in functions.

Besides, until either ubiquitous or pervasive computing is anywhere and everywhere, those alternatives seem more accurate.


Related Questions



What is the difference between Grid computing and peer-to-peer Computing?

Grid computing harnesses many networked computers, usually under some central scheduler, to work together on one large task. Peer-to-peer computing has no central coordinator: each node acts as both a client and a server, sharing resources directly with the other nodes.
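As a rough illustration of the peer-to-peer side, here's a minimal Python sketch (the class, method, and file names are all made up for illustration) in which every node acts as both client and server, with no central coordinator:

```python
# P2P sketch: every node is both a client and a server (no central coordinator).
class Peer:
    def __init__(self, name):
        self.name = name
        self.files = {}

    def serve(self, filename):            # acting as a server
        return self.files.get(filename)

    def fetch(self, other, filename):     # acting as a client
        data = other.serve(filename)
        if data is not None:
            self.files[filename] = data   # now this peer can serve it too
        return data

a, b = Peer("a"), Peer("b")
b.files["song.mp3"] = b"...bytes..."
print(a.fetch(b, "song.mp3"))  # a downloads directly from b
```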


What is the difference between supercomputer and distributed computing?

A supercomputer is a single, very powerful machine, while distributed computing spreads work across many networked machines. Supercomputers support both parallel and distributed computing.
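Here's a minimal sketch of the parallel side of that claim, using Python's standard multiprocessing module to split one computation across local CPU cores; distributed computing would spread the same kind of work across separate machines over a network instead.

```python
# Parallel computing on one machine: divide a job across local CPU cores.
# Distributed computing would divide the same job across networked machines.
from multiprocessing import Pool

def square(n):
    return n * n

if __name__ == "__main__":
    with Pool(processes=4) as pool:
        results = pool.map(square, range(10))  # work shared by 4 worker processes
    print(results)
```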


Difference between super and mini computer?

The difference between a supercomputer and a minicomputer is their computing power: a supercomputer has vastly more computing power than a minicomputer.


Difference between ignore all and ignore in computing?

In a spell checker, "Ignore" skips only the currently flagged instance of a word, while "Ignore All" skips every occurrence of that word in the document.


What is the difference between the newest and the oldest computer?

The main differences are in hardware and in computing languages: newer computers have far faster, smaller hardware and support higher-level programming languages.


What is the difference between information and computing?

Information can be defined as meaningful data. Computing is the process of operating on data to produce information.
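As a toy illustration (the readings are invented), computing is what turns raw data into information:

```python
# Raw data: a list of temperature readings (just numbers, no meaning yet).
readings = [21.5, 22.0, 23.1, 22.8, 21.9]

# Computing: processing the data.
average = sum(readings) / len(readings)

# Information: a meaningful statement derived from the data.
print(f"Average room temperature today: {average:.1f} °C")
```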


What is the difference between hard computing and soft computing?

Soft computing differs from conventional (hard) computing in that, unlike hard computing, it is tolerant of imprecision, uncertainty, partial truth, and approximation. In effect, the role model for soft computing is the human mind.
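A tiny sketch of the soft-computing idea, using a hand-rolled fuzzy membership function (the thresholds are made up): instead of a hard true/false answer, the program returns a degree of truth between 0 and 1.

```python
def warmth(temp_c):
    """Fuzzy membership for "warm": 0.0 = not warm, 1.0 = fully warm.
    Hard computing would return only True or False."""
    if temp_c <= 15:
        return 0.0
    if temp_c >= 25:
        return 1.0
    return (temp_c - 15) / 10  # partial truth in between

for t in (10, 18, 22, 30):
    print(t, "->", round(warmth(t), 2))
```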


What is the difference between mobile and portable cellular phones?

Mobile computing: the user can move from one location to another and keep computing while moving. Portable computing: the user moves to another location, connects the laptop to a port there, and then performs computing.


Is there a website that shows the difference between cloud computing versus virtualization?

The ERP Software Blog has a helpful guide that distinguishes between cloud computing and virtualization. Tech Target is another website that breaks down the differences between virtualization, SaaS, and cloud computing.


What is the difference between cloud computing and cloud computing ETF?

Cloud computing is the technology itself: computing resources delivered as services over the Internet. A cloud computing ETF is an exchange-traded fund, a financial product that invests in a basket of cloud computing companies rather than in the technology directly.


Difference between cloud computing and grid computing?

Grid computing is a method of having multiple computers work together to solve a single problem. Cloud computing delivers applications and storage as services over a network rather than from a local hard drive or storage utility.
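As a rough sketch of that contrast (the function names and the URL are hypothetical): grid computing divides one problem among many workers, while cloud computing consumes the computation as a service and doesn't care which machine runs it.

```python
from concurrent.futures import ProcessPoolExecutor

# Grid-style: one large problem divided among many workers.
def partial_sum(chunk):
    return sum(chunk)

def grid_total(numbers, workers=4):
    size = len(numbers) // workers or 1
    chunks = [numbers[i:i + size] for i in range(0, len(numbers), size)]
    with ProcessPoolExecutor(max_workers=workers) as pool:
        return sum(pool.map(partial_sum, chunks))

# Cloud-style: consume the computation as a service; the caller neither
# knows nor cares which machine runs it. (The URL is hypothetical.)
# import requests
# total = requests.post("https://api.example.com/sum", json=numbers).json()

if __name__ == "__main__":
    print(grid_total(list(range(1000))))  # 499500
```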