HPC and the Big Data challenge

Authors: Violeta Holmes, Matthew Newall

DOI: 10.1080/09617353.2016.1252085

Keywords:

Abstract: High performance computing (HPC) and Big Data are technologies vital for advancement in science, business and industry. HPC combines the power of supercomputers and computer clusters with parallel and distributed processing techniques to solve complex computational problems. The term Big Data refers to the fact that more data is being produced, consumed and stored than ever before. This results in datasets too large, complex and/or dynamic to be managed and analysed by traditional methods. Access to HPC systems, and the ability to model, simulate and manipulate massive data, are now critical for research and innovation. In this paper an overview of HPC and Big Data technology is presented. It outlines the advances enabling Peta- and Exa-scale energy-efficient computing, and the challenges of extracting meaning and new information from data. As an example of the synergy between HPC, Big Data and risk analysis, a case study...
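The parallel and distributed processing the abstract describes can be illustrated with a minimal sketch: a dataset is partitioned into chunks, each chunk is processed by a separate worker, and the partial results are combined. The function names (`process_chunk`, `parallel_sum_of_squares`) and the toy computation are illustrative assumptions, not anything from the paper itself.

```python
# Minimal sketch of chunked parallel processing, assuming a simple
# sum-of-squares workload stands in for a real HPC analysis task.
from multiprocessing import Pool

def process_chunk(chunk):
    """Toy per-chunk computation standing in for real analysis."""
    return sum(x * x for x in chunk)

def parallel_sum_of_squares(data, n_workers=4):
    # Partition the dataset into roughly equal chunks, one per worker.
    size = max(1, len(data) // n_workers)
    chunks = [data[i:i + size] for i in range(0, len(data), size)]
    with Pool(n_workers) as pool:
        partials = pool.map(process_chunk, chunks)  # fan out to workers
    return sum(partials)                            # reduce partial results

if __name__ == "__main__":
    data = list(range(1000))
    print(parallel_sum_of_squares(data))
```

The same fan-out/reduce pattern scales from a multicore workstation to a cluster, where frameworks such as Hadoop (cited below) distribute the chunks across nodes rather than processes.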

References (6)
Hans Meuer, E. Strohmaier, J. Dongarra, Horst Simon, "Top500 Supercomputer Sites", University of Tennessee (1997).
Peter Marksteiner, "High-performance computing — an overview", Computer Physics Communications, vol. 97, pp. 16-35 (1996). DOI: 10.1016/0010-4655(96)00018-5
Stephen Bonner, Grigoris Antoniou, Laura Moss, Ibad Kureshi, David Corsair, Illias Tachmazidis, "Using Hadoop To Implement a Semantic Method Of Assessing The Quality Of Research Medical Datasets", Proceedings of the 2014 International Conference on Big Data Science and Computing (BigDataScience '14), p. 7 (2014). DOI: 10.1145/2640087.2644163
Ian Foster, Yong Zhao, Ioan Raicu, Shiyong Lu, "Cloud Computing and Grid Computing 360-Degree Compared", Grid Computing Environments Workshop, pp. 1-10 (2008). DOI: 10.1109/GCE.2008.4738445
Peter Hughes, Miguel Figueres-Esteban, Coen van Gulijk, "Learning from text-based close call data", Safety and Reliability, vol. 36, pp. 184-198 (2016). DOI: 10.1080/09617353.2016.1252083