Abstract: Structure learning of dynamic Bayesian networks provides a principled mechanism for identifying conditional dependencies in time-series data. This learning procedure assumes that the data are generated by a stationary process. However, there are interesting and important circumstances where that assumption does not hold and potential non-stationarity cannot be ignored. Here we introduce a new class of graphical models called non-stationary dynamic Bayesian networks (nsDBNs), in which the conditional dependence structure of the underlying data-generating process is permitted to change or evolve over time. Examples of evolving networks include transcriptional regulatory networks during development, neural pathways during learning, and traffic patterns during the day. We define the nsDBN model, present an MCMC sampling algorithm for efficiently learning the structure of an nsDBN and the times of non-stationarities (transition times) under different assumptions, and demonstra...
Joshua W. Robinson, Alexander J. Hartemink