We further develop the idea that the PAC-Bayes prior can be informed by the data-generating distribution. We prove sharp bounds for an existing framework of Gibbs algorithms and derive new insights into function class complexity in this model. In particular, we consider controlling capacity with respect to the unknown geometry of the data-generating distribution. Finally, we extend the localized PAC-Bayes analysis to more practical learning methods, in particular RKHS regularization schemes such as SVMs.
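To fix ideas, a classical PAC-Bayes bound (stated here in a representative form; constants and logarithmic factors vary across versions in the literature) asserts that, with probability at least $1-\delta$ over an i.i.d. sample $S$ of size $n$, simultaneously for all posteriors $Q$ over the hypothesis class,
\[
L(Q) \;\le\; \hat{L}_S(Q) + \sqrt{\frac{\mathrm{KL}(Q \,\|\, P) + \ln(2\sqrt{n}/\delta)}{2n}},
\]
where $L$ and $\hat{L}_S$ denote the population and empirical risks and the prior $P$ must be fixed independently of $S$. The Gibbs posterior $Q_\beta(h) \propto P(h)\, e^{-\beta \hat{L}_S(h)}$, with inverse temperature $\beta > 0$, minimizes the linearized objective $\beta\, \hat{L}_S(Q) + \mathrm{KL}(Q \,\|\, P)$ over all $Q$, trading empirical risk against divergence from the prior; the localized analysis relaxes the independence requirement by letting $P$ depend on the data-generating distribution itself.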