Interval-gradient cuts are (nonlinear) valid inequalities for nonconvex NLPs, defined for constraints $g(x) \le 0$ with $g$ continuously differentiable on a box $[\underline{x}, \overline{x}]$. In this paper we define interval-subgradient cuts, a generalization to the case of nondifferentiable $g$, and show that no-good cuts (which have the form $\|x - \hat{x}\| \ge \varepsilon$ for some norm and some positive constant $\varepsilon$) are a special case of interval-subgradient cuts whenever the $1$-norm is used. We then briefly discuss what happens when other norms are used.
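
To make the stated connection concrete, here is a sketch of how the $1$-norm no-good cut can be recovered; it assumes the interval-subgradient cut keeps the same form as the classical interval-gradient cut, with the gradient enclosure replaced by a componentwise enclosure $[\underline{d}, \overline{d}]$ of all subgradients of $g$ over the box (this notation is introduced here for illustration). If every subgradient $d$ of $g$ on $[\underline{x}, \overline{x}]$ satisfies $d \in [\underline{d}, \overline{d}]$, then by a mean-value argument every feasible point $x$ (i.e., with $g(x) \le 0$) satisfies, for any reference point $\hat{x}$ in the box,
$$ g(\hat{x}) + \sum_{j} \min\bigl( \underline{d}_j (x_j - \hat{x}_j),\; \overline{d}_j (x_j - \hat{x}_j) \bigr) \le 0. $$
Taking $g(x) = \varepsilon - \|x - \hat{x}\|_1$, which is nondifferentiable with subgradients contained in $[-1, 1]^n$, the cut specializes to
$$ \varepsilon + \sum_{j} \min\bigl( -(x_j - \hat{x}_j),\; x_j - \hat{x}_j \bigr) \;=\; \varepsilon - \|x - \hat{x}\|_1 \;\le\; 0, $$
which is exactly the no-good cut $\|x - \hat{x}\|_1 \ge \varepsilon$.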