The knowledge discovery process encounters difficulties when analyzing large amounts of data. Indeed, some theoretical problems related to high-dimensional spaces then appear and de...
The multi-period newsvendor problem describes the dilemma of a newspaper salesman--how many papers should he purchase each day to resell, when he does not know the demand? We d...
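The abstract above refers to the classic newsvendor model. As background, the standard single-period solution orders up to the critical fractile of the demand distribution; a minimal sketch follows, assuming normally distributed demand (the function name and parameters are illustrative, not taken from the paper):

```python
from statistics import NormalDist

def newsvendor_order_qty(price, cost, salvage, demand_mean, demand_sd):
    """Classic single-period newsvendor: order q* = F^{-1}(cu / (cu + co)),
    where cu is the underage cost and co the overage cost, and F is the
    (here assumed normal) demand CDF."""
    cu = price - cost      # margin lost per unit of unmet demand
    co = cost - salvage    # loss per unsold leftover unit
    critical_ratio = cu / (cu + co)
    return NormalDist(demand_mean, demand_sd).inv_cdf(critical_ratio)
```

For example, with equal underage and overage costs the critical ratio is 0.5, so the optimal order equals mean demand; a higher margin pushes the order above the mean. The multi-period variant studied in the paper couples these daily decisions through carried-over considerations, which the single-period formula does not capture.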
Obtaining large volumes of inference knowledge, such as entailment rules, has become a major factor in achieving robust semantic processing. While there has been substantial resea...
In a seminal paper, Amari (1998) proved that learning can be made more efficient when one uses the intrinsic Riemannian structure of the algorithms' spaces of parameters to po...
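The result cited here is the natural gradient: instead of following the plain gradient, the update is preconditioned by the inverse Fisher information matrix, which respects the Riemannian geometry of parameter space. A minimal sketch of one such update step (variable names and the explicit-inverse formulation are illustrative):

```python
import numpy as np

def natural_gradient_step(theta, grad, fisher, lr=0.1):
    """One natural-gradient update (Amari, 1998):
    theta <- theta - lr * F^{-1} grad,
    where F is the Fisher information matrix at theta.
    Solving the linear system avoids forming F^{-1} explicitly."""
    return theta - lr * np.linalg.solve(fisher, grad)
```

When the Fisher matrix is the identity, this reduces to ordinary gradient descent; in general it rescales the step along directions in which the model's output distribution is more or less sensitive to the parameters.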
Knowledge discovery systems are constrained by three main limited resources: time, memory and sample size. Sample size is traditionally the dominant limitation, but in many present...
In designing learning algorithms, it seems quite reasonable to construct them in such a way that all data the algorithm has already obtained are correctly and completely reflected...
An architecture is described for designing systems that acquire and manipulate large amounts of unsystematized, or so-called commonsense, knowledge. Its aim is to exploit to the fu...
A method is proposed for inducing Bayesian networks from data that overcomes some limitations of other learning algorithms. One of the main features of this method is a metric to evalua...