Several parallel programming languages, libraries, and environments have been developed to ease the task of writing programs for multiprocessors. Proponents of each approach often point to language features designed to provide the programmer with a simple programming interface. However, virtually no data exists that quantitatively evaluates the relative ease of use of different parallel programming languages. This paper borrows techniques from the software engineering field to quantify the complexity of three predominant programming models: shared memory, message passing, and High Performance Fortran (HPF). It is concluded that traditional software complexity metrics are effective indicators of the relative complexity of parallel programming languages. The impact of complexity on run-time performance is also discussed in the context of message passing versus HPF on an IBM SP2.
Steven P. Vanderwiel, Daphna Nathanson, David J. L