Sciweavers

31 search results (page 4 of 7) for "Data-Intensive Computing in the 21st Century"
ICASSP 2010, IEEE
Random attributed graphs for statistical inference from content and context
Coping with Information Overload is a major challenge of the 21st century. Huge volumes and varieties of multilingual data must be processed to extract salient information. Previo...
Allen L. Gorin, Carey E. Priebe, John Grothendieck
CAD 2005, Springer
Computer-aided design of porous artifacts
Heterogeneous structures represent an important new frontier for 21st century engineering. Human tissues, composites, `smart' and multimaterial objects are all physically man...
Craig A. Schroeder, William C. Regli, Ali Shokoufa...
WSC 2007
High-performance computing enables simulations to transform education
This paper presents the case that education in the 21st Century can only measure up to national needs if technologies developed in the simulation community, further enhanced by th...
Dan M. Davis, Thomas D. Gottschalk, Laurel K. Davi...
CAMP 2005, IEEE
Virtual Astronomy, Information Technology, and the New Scientific Methodology
All sciences, including astronomy, are now entering the era of information abundance. The exponentially increasing volume and complexity of modern data sets promise to transfor...
S. George Djorgovski
AC 2008, Springer
DARPA's HPCS Program: History, Models, Tools, Languages
The historical context surrounding the birth of the DARPA High Productivity Computing Systems (HPCS) program is important for understanding why federal government agencies launche...
Jack Dongarra, Robert Graybill, William Harrod, Ro...