Search engines largely rely on robots (i.e., crawlers or spiders) to collect information from the Web. Such crawling activities can be regulated from the server side by deploying ...
Yang Sun, Ziming Zhuang, Isaac G. Councill, C. Lee...
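Server-side regulation of crawlers is commonly implemented via the Robots Exclusion Protocol (robots.txt); assuming that is the mechanism the abstract refers to, a minimal sketch using Python's standard `urllib.robotparser` shows how a crawler checks whether a path is permitted (the policy lines and crawler name here are hypothetical):

```python
from urllib import robotparser

rp = robotparser.RobotFileParser()
# A sample robots.txt policy, parsed directly rather than fetched from a server.
rp.parse([
    "User-agent: *",
    "Disallow: /private/",
])

# A well-behaved crawler consults the policy before requesting each URL.
print(rp.can_fetch("MyCrawler", "http://example.com/private/page.html"))  # False
print(rp.can_fetch("MyCrawler", "http://example.com/public/page.html"))   # True
```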
Abstract— Following the widespread availability of wireless Internet, a wide range of applications with differing QoS requirements must share the wireless access. The objective of th...
Dynamic software bug detection tools are commonly used because they leverage run-time information. However, they suffer from a fundamental limitation, the Path Coverage Problem: t...
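The Path Coverage Problem can be illustrated with a toy example (hypothetical, not from the paper): a dynamic tool observes only the paths the test inputs actually exercise, so a bug on an unexecuted branch goes unreported.

```python
def buggy(flag, data):
    if flag:
        # Latent out-of-range access: triggers IndexError whenever data has
        # fewer than 11 elements, but only if this branch is executed.
        return data[10]
    return len(data)

# A dynamic analysis driven by this input never executes the flag=True path,
# so the faulty access is invisible to it.
print(buggy(False, [1, 2, 3]))  # → 3
```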
Microprocessor vendors have provided special-purpose instructions such as psadbw and pdist to accelerate the sum-of-absolute differences (SAD) similarity measurement. The usefulne...
Asadollah Shahbahrami, Ben H. H. Juurlink, Stamati...
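The SAD measure itself is simple: for two equal-length pixel blocks it sums the absolute per-element differences. A scalar reference sketch (this is the generic definition, not the paper's SIMD implementation):

```python
def sad(block_a, block_b):
    """Sum of absolute differences between two equal-length pixel blocks."""
    return sum(abs(a - b) for a, b in zip(block_a, block_b))

# Comparing two 4-pixel blocks: |10-12| + |20-18| + |30-33| + |40-40| = 7
print(sad([10, 20, 30, 40], [12, 18, 33, 40]))  # → 7
```

Instructions like psadbw compute exactly this loop over packed 8-bit elements in a single operation, which is why it is a common hardware acceleration target for motion estimation.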
Packet sampling is widely used in network monitoring. Sampled packet streams are often used to determine flow-level statistics of network traffic. To date there is conflicting e...
Bruno F. Ribeiro, Donald F. Towsley, Tao Ye, Jean ...
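The basic setup behind such studies can be sketched as independent (Bernoulli) packet sampling followed by inverting the sampling rate to estimate per-flow packet counts; the function names are illustrative, and this is only the simplest estimator, not the paper's method:

```python
import random

def sample_packets(packets, p, rng):
    """Bernoulli-sample: keep each packet independently with probability p."""
    return [pkt for pkt in packets if rng.random() < p]

def estimate_flow_size(sampled_count, p):
    """Invert the sampling rate to estimate the flow's original packet count."""
    return sampled_count / p

# Sample a 10000-packet flow at rate 1%, then estimate its original size.
rng = random.Random(1)
flow = ["pkt"] * 10000
sampled = sample_packets(flow, 0.01, rng)
print(estimate_flow_size(len(sampled), 0.01))
```

The estimate is unbiased but its variance grows as the sampling rate shrinks, which is one reason inferring flow-level statistics from sampled streams is contentious.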