The massive data volumes acquired, simulated, processed and analyzed by globally distributed scientific collaborations continue to grow exponentially. One leading example is the LHC program, now at the start of its second three-year data-taking cycle in search of new particles and interactions in a previously inaccessible range of energies, which has experienced a 70% growth in peak data transfer rates over the last 12 months alone. Other major science programs such as LSST and SKA, and other disciplines ranging from earth observation to genomics, are expected to have needs similar to or greater than those of the LHC program within the next decade. The development of new methods for fast, efficient and reliable data transfers over national and global distances, and of a new generation of intelligent, software-driven networks capable of supporting multiple science programs with diverse needs for high-volume and/or real-time data delivery, is essential if these programs are to continue to progress, a...
Harvey B. Newman, Azher Mughal, Dorian Kcira, Iosi