Statistical Machine Translation (SMT) systems have achieved impressive results in recent years, due in large part to the increasing availability of parallel text for system trainin...
Zhiyi Song, Stephanie Strassel, Gary Krug, Kazuaki...
Tokenization is one of the initial steps in almost any text-processing task. It is not generally regarded as challenging for English monolingual systems, but it r...
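The abstract is cut off before any method details; as a hypothetical illustration of what an English tokenizer's first pass often looks like (the regex and conventions here are illustrative assumptions, not taken from the paper), a minimal sketch:

```python
import re

def tokenize(text):
    # Split off punctuation as separate tokens, while keeping
    # word-internal apostrophes and hyphens attached -- a common
    # convention for English (illustrative, not the paper's method).
    return re.findall(r"\w+(?:[-']\w+)*|[^\w\s]", text)

print(tokenize("Don't stop; it's state-of-the-art."))
# -> ["Don't", 'stop', ';', "it's", 'state-of-the-art', '.']
```

Even this simple scheme shows why tokenization gets harder outside English: clitics, compounds, and scripts without whitespace defeat whitespace-and-punctuation rules.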
WCTAnalyze is a tool for storing, accessing and visually analyzing huge collections of temporally indexed data. It is motivated by applications in media analysis, business intelli...
Sebastian Gottwald, Matthias Richter, Gerhard Heye...
The accurate quantification of disease patterns in medical images allows radiologists to track the progression of a disease. Various computer vision techniques are able to automatica...
The latency perceived by a user surfing the Internet is the target of a transparent, speculative algorithm that relies on a user behavior model. The model is based on past use...
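The listing does not say how the behavior model is built; one common way to realize such a model (a hypothetical sketch, not the paper's algorithm) is a first-order Markov model over page visits that speculates only when its confidence is high enough:

```python
from collections import defaultdict, Counter

class BehaviorModel:
    """First-order Markov model over page visits: estimates P(next | current)
    from past navigation and speculates only above a confidence threshold.
    Illustrative sketch; names and threshold are assumptions."""

    def __init__(self):
        self.transitions = defaultdict(Counter)

    def observe(self, current, nxt):
        # Record one observed navigation from `current` to `nxt`.
        self.transitions[current][nxt] += 1

    def predict(self, current, threshold=0.5):
        counts = self.transitions[current]
        total = sum(counts.values())
        if total == 0:
            return None  # no history for this page
        page, n = counts.most_common(1)[0]
        # Speculate (e.g. prefetch) only when the most likely next
        # page clears the confidence threshold.
        return page if n / total >= threshold else None

model = BehaviorModel()
for a, b in [("home", "news"), ("home", "news"), ("home", "mail")]:
    model.observe(a, b)
print(model.predict("home"))  # -> news
```

The threshold keeps the scheme transparent to the user: a wrong guess only wastes bandwidth, so speculation is suppressed when the model is unsure.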