Supporting data decomposition in parallel programming

Author: Anne Meade

Abstract: Parallelising serial software systems presents many challenges. In particular, the task of decomposing large, data-intensive applications for execution on distributed architectures is described in the literature as error-prone and time-consuming. The Message Passing Interface (MPI) specification is the de facto industry standard for programming such architectures, but it requires low-level knowledge of data distribution details, and programmers must explicitly invoke inter-process communication routines. This research reports findings from empirical studies conducted in industry that explore and characterise the challenges associated with performing data decomposition. Findings from these studies culminated in a list of derived requirements for tool support, encompassing automation of grid indexing, generation of data structures and communication calls, and provision of assistance when changing an implemented decomposition strategy. Additional requirements include the need to be MPI focused, to initially target structured grids, and to have minimal impact on application code. These requirements were subsequently buttressed to address gaps in the state-of-the-art and provided the motivation for the development of a tool named MPIGen. MPIGen provides an abstraction over MPI, encapsulating the details involved in exchanging messages between processors. Users can express the parallel intent of their application through input parameters and then generate code containing wrapper functions that encompass the required communication functionality. These functions are invoked within the application, resulting in a semi-automated parallelised solution, and the programmer is relieved of the burden of deciphering memory locations. MPIGen was evaluated in two studies involving both students and High Performance Computing (HPC) practitioners as subjects. It was concluded that MPIGen is efficient and that it satisfies the empirically derived requirements. Parallel programming is a difficult skill for developers to learn, yet the nature of current specifications is an adverse factor in its adoption. MPIGen makes it easier to adopt this skill-set and offers effective support for undertaking inter-process communication.
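To illustrate the kind of explicit, low-level communication code the abstract describes as error-prone, the sketch below shows a hand-written 1-D block decomposition of a structured grid with a halo (ghost-cell) exchange in MPI. This is not MPIGen's actual output or API; the grid size, variable names, and divisibility assumption are hypothetical, and the example is only a minimal illustration of the boilerplate such a tool would aim to encapsulate.

```c
/*
 * Minimal sketch (hypothetical, not MPIGen output): a 1-D block
 * decomposition of a structured grid with a halo exchange, showing
 * the explicit MPI communication a programmer would otherwise write.
 */
#include <mpi.h>
#include <stdio.h>
#include <stdlib.h>

#define GLOBAL_N 1024   /* hypothetical global grid size */

int main(int argc, char **argv)
{
    int rank, size;
    MPI_Init(&argc, &argv);
    MPI_Comm_rank(MPI_COMM_WORLD, &rank);
    MPI_Comm_size(MPI_COMM_WORLD, &size);

    /* Each process owns a contiguous block of the grid plus two ghost
       cells; assumes GLOBAL_N is divisible by the process count. */
    int local_n = GLOBAL_N / size;
    double *u = calloc(local_n + 2, sizeof(double));

    int left  = (rank > 0)        ? rank - 1 : MPI_PROC_NULL;
    int right = (rank < size - 1) ? rank + 1 : MPI_PROC_NULL;

    /* Exchange boundary values with neighbours (halo/ghost cells). */
    MPI_Sendrecv(&u[1],           1, MPI_DOUBLE, left,  0,
                 &u[local_n + 1], 1, MPI_DOUBLE, right, 0,
                 MPI_COMM_WORLD, MPI_STATUS_IGNORE);
    MPI_Sendrecv(&u[local_n],     1, MPI_DOUBLE, right, 1,
                 &u[0],           1, MPI_DOUBLE, left,  1,
                 MPI_COMM_WORLD, MPI_STATUS_IGNORE);

    if (rank == 0)
        printf("halo exchange complete on %d processes\n", size);

    free(u);
    MPI_Finalize();
    return 0;
}
```

Per the abstract, MPIGen generates wrapper functions that encapsulate exchanges of this kind, so the programmer states the parallel intent through input parameters rather than hand-coding neighbour ranks, ghost-cell indices, and send/receive pairings.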
