
The internet, so powerful for sharing text, images, sound and video, is now weakest at the task it was originally designed for: exchanging raw data between researchers. Last month, the National Center for Data Mining at the University of Illinois at Chicago launched the first version of an infrastructure called the Data Space Transfer Protocol, or DSTP, for creating the next-generation web of data.

The current Web provides an infrastructure for working with distributed multimedia documents, which are exchanged using the familiar "http," or Hypertext Transfer Protocol. But although a massive amount of data is available online, it is stored in so many different formats that it has become difficult, if not impossible, to analyze and use in research, says Georg Reinhart, visiting research scientist in mathematics, statistics, and computer science at UIC.

"Doctors, for example, often need to share information and data, but each doctor stores and uploads data in his or her own format," Reinhart said. "Astronomers, physicists and other researchers often face the same problem."

According to Reinhart, who developed DSTP with Emory Creel, a colleague at the National Center for Data Mining, the new transfer protocol will unify the way data is stored online. Downloading data from different sites via high-speed networks and analyzing that data in real time will become possible for the first time.

"DSTP will standardize the way data is shared, the same way HTTP revolutionized the way documents are shared," Reinhart said. "Researchers will be able to search, analyze and mine databases simultaneously, even if the databases contain different types of data." Reinhart predicts DSTP will motivate more researchers to post data globally and lead to "an avalanche" of new and existing data accessible and useful to a wider audience.
