About the Grid
The Grid is a new Information Technology (IT) concept of a "super Internet"
for high-performance computing: worldwide collections of high-end resources,
such as supercomputers, storage systems, advanced instruments and immersive environments.
These resources and their users are often separated by great distances
and connected by high-speed networks. The Grid is expected to bring together
geographically and organisationally dispersed computational resources,
such as CPUs, storage systems, communication systems, real-time data sources
and instruments, and human collaborators.
The "plumbing" of the Grid is essentially in place: there already exist
large-scale networks of distributed computers, connected by (comparatively)
reliable networks using commonly agreed and widely used data communication
protocols (TCP/IP etc.). However, many of the enabling technologies have
not yet been developed, so the challenge of projects like GriPhyN lies in
developing the software to drive the grid. This software will eventually
become part of a global infrastructure that makes computing power at many
different locations available to next-generation computer users as easily
as the electricity utility grid delivers power.
Among Grid pioneers, the Globus
development team, for example, has created a set of underlying Grid services
and a software toolkit for using the geographically distributed resources
on Grids.
Part of the original motivation for grid computing came from the problems
in processing scientific data, where the use of dedicated supercomputers is expensive and
frequently infeasible. The Grid will allow scientists worldwide to view and analyze
the huge amounts of data flowing from experiments in high-energy and nuclear
physics, gravitational waves, astronomy, biology and other areas. Large networks
of much cheaper and less powerful processors have long been touted as a
natural alternative to such dedicated devices, but there has never been
a technology capable of exploiting such distributed computational resources.
The aim of grid computing is to provide such technologies.
The aim of the Grid Physics Network
is to develop a global software infrastructure that will make use of worldwide
distributed computing power to solve problems in processing huge amounts
of scientific data flowing from experiments in high-energy and nuclear
physics, gravitational waves, astronomy, biology and other areas. Such
a powerful information distribution network is needed because 21st-century
physics, biology, astronomy and engineering increasingly depend on the ability
to manage and access huge quantities of very complex data. For example,
scientists using high-energy colliders to probe the origins of matter must
record the effects of billions of proton collisions per year. Biologists
or chemists studying proteins, meanwhile, work with exceedingly complex
data gathered from many types of experiments. Among other large-scale experiments,
the international Virtual Data Grid Laboratory will serve as a
unique computing resource for the Laser Interferometer Gravitational-wave
Observatory (LIGO) and CERN, the world's largest particle physics center near
Geneva in Switzerland, for testing new GriPhyN computational paradigms
at the Petabyte scale and beyond.
The GriPhyN (Grid Physics Network) collaboration is a team of
experimental physicists and information technology (IT) researchers who
plan to implement the first Petabyte-scale computational environments for data-intensive
science in the 21st century. Driving the project are unprecedented
requirements for geographically dispersed extraction of complex scientific
information from very large collections of measured data. To meet these
requirements, which arise initially from the four physics experiments involved
in this project but will also be fundamental to science and commerce in the
21st century, GriPhyN will deploy computational environments called Petascale
Virtual Data Grids (PVDGs) that meet the data-intensive computational needs
of a diverse community of thousands of scientists spread across the globe.
"Powering up the Grid" by Mark Ward (BBC News online, June 28, 2000)
"Big science gets a hand from home computers"
by David L. Chandler (Boston Globe, November 3, 2000).
"Computing and the Emerging Grid" by Ian Foster
(Nature, December 7, 2000).
"Grid: The Next-Gen Internet?" by Douglas Heingartner (The Matrix, March 8, 2001).
"NSF announces plan for universities to create global data grid" by Paul Avery
(University of Florida News, September 25, 2001).
"Scientists On Four Continents Linking To Data Grid" by Paul Avery and Aaron Hoover
(Daily University Science News, September 28, 2001).
"Machine-Made Links Change the Way Minds Can Work Together"
by Katie Hafner (NY Times, November 5, 2001).
"Ein neues, schnelles Super-Internet" ("A New, Fast Super-Internet")
by Hans-Arthur Marsiske (Telepolis Magazine, December 12, 2001).
"Wissen: Willkommen im World Wide Grid" ("Knowledge: Welcome to the World Wide Grid")
by Hans-Arthur Marsiske (Financial Times Deutschland, December 13, 2001).