Background: Normalization is the process of removing non-biological sources of variation between array experiments. Recent investigations of data in gene expression databases for ...
Timothy Lu, Christine M. Costello, Peter J. P. Cro...
To construct an efficient overlay network, information about the underlay is important. We consider using end-to-end measurement tools such as traceroute to infer the underlay topolog...
We present a practical approach to generate stochastic anisotropic samples with Poisson-disk characteristic over a two-dimensional domain. In contrast to isotropic samples, we unde...
Louis Feng, Ingrid Hotz, Bernd Hamann, Kenneth I. ...
The visual nature of geometry applications makes them a natural area where visualization can be an effective tool for demonstrating algorithms. In this paper we propose a new mode...
James E. Baker, Isabel F. Cruz, Giuseppe Liotta, R...
This paper investigates the potential information provided to the user by the uncertainty measures applied to the possibility distributions associated with the spatial units of an ...