Abstract--In this note, we consider the robust stability of sampled-data systems with respect to continuous-time delay. We argue that many results in the literature implicitly assume that the delay is a multiple of the sampling period. We demonstrate that this approach might be misleading. Namely, we show that there are systems that are destabilized by small continuous-time delays despite being delay-independent stable with respect to "integer" delays (a phenomenon that has no continuous-time counterpart). We also propose a robustness analysis procedure based on embedding continuous-time delays into unstructured analog uncertainty, with the subsequent reduction of the problem to a standard sampled-data H∞ problem. Toward this end, a novel nominal model for the uncertain delay is put forward. It yields a tighter unstructured uncertainty covering than in the existing approaches, thus having the potential to reduce the conservatism of the method. This might be advantageous in the p...
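For orientation, the following is a minimal sketch of the classical covering of an uncertain delay by unstructured uncertainty, i.e., the "existing approaches" the abstract contrasts against; the notation (the delay bound \bar\tau, weight W, and uncertainty \Delta) is introduced here for illustration only and is not the novel nominal model proposed in this note.

% Sketch (assumed standard construction, not the note's method): an uncertain
% delay \tau \in [0, \bar\tau] is absorbed into norm-bounded uncertainty around
% the trivial nominal model e^{-\tau s} \approx 1,
\[
  e^{-\tau s} \;=\; 1 + W(s)\,\Delta(s), \qquad \|\Delta\|_{\infty} \le 1,
\]
% which forces the weight W to cover the worst-case deviation from the nominal:
\[
  |W(j\omega)| \;\ge\; \sup_{0 \le \tau \le \bar\tau} \bigl| e^{-j\omega\tau} - 1 \bigr|
  \;=\;
  \begin{cases}
    2\,\bigl|\sin(\omega\bar\tau/2)\bigr|, & \omega\bar\tau \le \pi,\\[2pt]
    2, & \omega\bar\tau > \pi.
  \end{cases}
\]
% A (slightly conservative) rational weight is then fitted above this envelope,
% after which robust stability against the delay reduces to a small-gain,
% H-infinity-type condition on the weighted closed loop.

A tighter nominal model, as advocated in the note, shrinks the right-hand side of this covering inequality and hence the required weight, which is the sense in which the conservatism of the resulting sampled-data H∞ test may be reduced.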