Search engines typically return a long list of hundreds or even thousands of videos in response to a query topic. Navigating these videos efficiently is difficult, and users often need to painstakingly explore the search list to get the gist of the results. This paper addresses the challenge of topical summarization by providing a timeline-based visualization of videos through the matching of heterogeneous sources. To overcome the so-called sparse-text problem of web videos, auxiliary information from Google context is exploited. Google Trends is used to predict the milestone events of a topic, while the typical scenes of web videos are extracted by visual near-duplicate threading. Visual-text alignment is then conducted to match scenes from videos with articles from Google News. The outcome is a set of scene-news pairs, each representing an event mapped onto the milestone timeline of a topic. The timeline-based visualization provides a glimpse of the major events of a topic. We conduc...
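To make the milestone-prediction step concrete, below is a minimal sketch, assuming milestone events correspond to prominent bursts in a Google Trends interest-over-time series; the function name, thresholds, and sample data are illustrative assumptions, not the paper's actual implementation.

```python
# Sketch (assumed, not the authors' method): treat prominent bursts in a
# Google Trends interest-over-time series as candidate milestone events.
from scipy.signal import find_peaks

def milestone_events(dates, interest, prominence=20, distance=4):
    """Return the dates whose search interest forms a prominent burst.

    dates:    time points (e.g., weeks), one per sample
    interest: Google Trends interest values (0-100) for the topic
    prominence, distance: illustrative thresholds controlling how
        pronounced and how far apart bursts must be to count as events
    """
    peaks, _ = find_peaks(interest, prominence=prominence, distance=distance)
    return [dates[i] for i in peaks]

# Hypothetical usage: weekly interest for a query topic
weeks = [f"2012-W{w:02d}" for w in range(1, 13)]
interest = [5, 8, 60, 90, 40, 12, 10, 55, 70, 30, 9, 6]
print(milestone_events(weeks, interest))  # -> ['2012-W04', '2012-W09']
```

Each detected burst would anchor one point on the milestone timeline, to which scene-news pairs from the later alignment step can be attached.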