Big Data and Their Epistemological Challenge

Author: Luciano Floridi

DOI: 10.1007/s13347-012-0093-4

Keywords:

Abstract: It is estimated that humanity accumulated 180 EB of data between the invention of writing and 2006. Between 2006 and 2011, the total grew ten times and reached 1,600 EB. This figure is now expected to grow fourfold approximately every 3 years. Every day, enough new data are being generated to fill all US libraries eight times over. As a result, there is much talk about “big data”. This special issue on “Evolution, Genetic Engineering and Human Enhancement”, for example, would have been inconceivable in an age of “small data”, simply because genetics is one of the data-greediest sciences around. This is why, in the USA, the National Institutes of Health (NIH) and the National Science Foundation (NSF) have identified big data as a programme focus. One of the main NSF–NIH interagency initiatives addresses the need for core techniques and technologies for advancing big data science and engineering (see NSF-12-499). Despite the importance of the phenomenon, it is still unclear what exactly the term “big data” means and hence refers to. The aforementioned document specifies that: “The phrase ‘big data’ in this solicitation refers to large, diverse, complex, longitudinal, and/or distributed data sets derived from instruments, sensors, Internet transactions, email, video, click streams, and/or all other digital sources available today and in the future.” You do not need to be an analytic philosopher to find this both obscure and vague. Wikipedia, for once, is also unhelpful. Not because the relevant entry is unreliable, but because it reports the common definition, which is unsatisfactory: “data sets so large and complex that they become awkward to work with using on-hand database management tools”. Apart from the circular problem of defining “big” through “large”, the definition suggests that data are too big or large only in relation to our current computational power. This is misleading. Of course, “big”, like many other terms, is a relational predicate: a pair of shoes may be too big for you, yet fine for me. And it is trivial to acknowledge that we tend to evaluate things non-relationally, as absolutely big, whenever the frame of reference is obvious enough to be left implicit.
