Scientific data sets continue to increase in both size and complexity. In the past, dedicated graphics systems at supercomputing centers were required to visualize large data sets,...
Background: New technologies are enabling the measurement of many types of genomic and epigenomic information at scales ranging from the atomic to nuclear. Much of this new data i...
Thomas M. Asbury, Matt Mitman, Jijun Tang, W. Jim ...
Data cleaning is the process of correcting anomalies in a data source that may, for instance, be due to typographical errors or duplicate representations of an entity. It is a cruc...
Modeling is a severe bottleneck for computer graphics applications. Manual modeling is time-consuming and fails to capture the complexity of real-world scenes. Automated modeling b...
Scientific applications often perform complex computational analyses that consume and produce large data sets. We are concerned with data placement policies that distribute dat...
Ann L. Chervenak, Ewa Deelman, Miron Livny, Mei-Hu...