Sciweavers

294 search results (page 39 of 59) for "Optimized fixed-size kernel models for large data sets"
ICDE 2010, IEEE
Hive - a petabyte scale data warehouse using Hadoop
The size of data sets being collected and analyzed in the industry for business intelligence is growing rapidly, making traditional warehousing solutions prohibitively expensiv...
Ashish Thusoo, Joydeep Sen Sarma, Namit Jain, Zhen...
BIBM 2009, IEEE
Identifying Gene Signatures from Cancer Progression Data Using Ordinal Analysis
A comprehensive understanding of cancer progression may shed light on genetic and molecular mechanisms of oncogenesis, and it may provide much needed information for effective d...
Yoon Soo Pyon, Jing Li
LISA 2004
More Netflow Tools for Performance and Security
Analysis of network traffic is becoming increasingly important, not just for determining network characteristics and anticipating requirements, but also for security analysis. Sev...
Carrie Gates, Michael Collins, Michael Duggan, And...
CORR 2008, Springer
Expressing OLAP operators with the TAX XML algebra
With the rise of XML as a standard for representing business data, XML data warehouses appear as suitable solutions for Web-based decision-support applications. In this context, i...
Marouane Hachicha, Hadj Mahboubi, Jérô...
CIKM 2010, Springer
FacetCube: a framework of incorporating prior knowledge into non-negative tensor factorization
Non-negative tensor factorization (NTF) is a relatively new technique that has been successfully used to extract significant characteristics from polyadic data, such as data in s...
Yun Chi, Shenghuo Zhu
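
As a rough illustration of the underlying technique (plain non-negative tensor factorization, not the FacetCube framework from the paper), here is a minimal Python sketch using the TensorLy library; the tensor shape, rank, and random data are placeholder assumptions.

    import numpy as np
    import tensorly as tl
    from tensorly.decomposition import non_negative_parafac

    # Placeholder polyadic data: a random 3-way tensor
    # (e.g. author x venue x year counts in a bibliographic setting).
    rng = np.random.default_rng(0)
    data = tl.tensor(rng.random((20, 15, 10)))

    # Plain NTF: approximate the tensor by `rank` non-negative
    # rank-one components along each of the three modes.
    cp = non_negative_parafac(data, rank=3, n_iter_max=200)

    # Reconstruct the approximation and report the relative error.
    approx = tl.cp_to_tensor(cp)
    rel_err = tl.norm(data - approx) / tl.norm(data)
    print(f"relative reconstruction error: {rel_err:.3f}")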