User:Arcpkl/sandbox

Copied from article: "Big data/Lead"

Big data is a field that treats ways to analyze, systematically extract information from, or otherwise deal with data sets that are too large or complex to be handled by traditional data-processing application software. Big data was originally associated with three key concepts: volume, variety, and velocity, with veracity being included later on. Big data often includes data with sizes that exceed the capacity of traditional software to process within an acceptable time.

Current usage of the term big data tends to refer to the use of advanced data analytics methods that extract value from data, and seldom to a particular size of data set. Analysis of data sets can find new correlations to "spot business trends, prevent diseases, combat crime and so on." Scientists, business executives, medical practitioners, advertisers, and governments alike regularly meet difficulties with large data sets in areas including Internet searches, fintech, urban informatics, and business informatics. Scientists encounter limitations in e-Science work, including meteorology, genomics, connectomics, complex physics simulations, biology, and environmental research.

Data sets grow rapidly, in part because they are increasingly gathered by cheap and numerous information-sensing Internet of things devices such as mobile devices, aerial (remote sensing) equipment, software logs, cameras, microphones, radio-frequency identification (RFID) readers, and wireless sensor networks. A question large enterprises face is determining who should own big-data initiatives that affect the entire organization.

Relational database management systems and desktop packages for statistics and data visualization often have difficulty handling big data. The work may require "massively parallel software running on tens, hundreds, or even thousands of servers". What qualifies as being "big data" varies depending on the capabilities of the users and their tools, and expanding capabilities make big data a moving target. "For some organizations, facing hundreds of gigabytes of data for the first time may trigger a need to reconsider data management options. For others, it may take tens or hundreds of terabytes before data size becomes a significant consideration."
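
To make the "massively parallel" requirement concrete, the following is a minimal sketch (not from the article) of the split-apply-combine pattern such software uses, run here with Python's multiprocessing module on local worker processes; a real big-data system distributes the same pattern across many servers, and the sample input is made up.

    # Minimal sketch of the split-apply-combine pattern behind
    # "massively parallel" processing; a real system distributes the
    # chunks across many servers instead of local worker processes.
    from collections import Counter
    from multiprocessing import Pool

    def count_words(chunk):
        # "Map" step: count the words in one chunk of the data set.
        return Counter(chunk.split())

    def parallel_word_count(lines, workers=4):
        # Split the data round-robin, count the chunks in parallel,
        # then combine the partial counts ("reduce").
        chunks = [" ".join(lines[i::workers]) for i in range(workers)]
        with Pool(workers) as pool:
            partial = pool.map(count_words, chunks)
        return sum(partial, Counter())

    if __name__ == "__main__":
        sample = ["big data needs parallel tools", "parallel tools scale out"]
        print(parallel_word_count(sample))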

Editing notes:

Changes:
 * Lead contains unique information that does not appear elsewhere.
 * Overall, the Lead contains too many details. Instead of summing up the entirety of the article, it reads as its own section of information.


 * Delete: Data with many cases (rows)...
 * Edit: Big data challenges -> Challenges of big data
 * Delete: When we handle... Therefore
 * Delete: "There is little doubt..."
 * Edit: One question for large enterprises -> A question large enterprises face
 * Add: …with veracity being included later on.
 * Edit & citation: as of 2012 -> as of 2017; "Domo Resource - Data Never Sleeps 5.0"
 * Delete: "Big data challenges include..."
 * Edit: within an acceptable time and value. -> within an acceptable time.
 * Remove: The world's technological per-capita...
 * Edit: "predictive analytics...certain other"

Copied from article: "Big data/Definition"

Definition
The term "big data" has been in use since the 1990s, with some giving credit to John Mashey for popularizing the term. Big data usually includes data sets with sizes beyond the ability of commonly used software tools to capture, curate, manage, and process data within a tolerable elapsed time. Big data philosophy encompasses unstructured, semi-structured and structured data, however the main focus is on unstructured data. Big data requires a set of techniques and technologies with new forms of integration to reveal insights from data-sets that are diverse, complex, and of a massive scale.

"Variety", "veracity" and various other "Vs" are added by some organizations to describe big data, a revision challenged by some industry authorities.

Big data "size" is a constantly moving target, ranging from a few dozen terabytes to many zettabytes of data as of 2012. The world's technological per-capita capacity to store information has roughly doubled every 40 months since the 1980s;, every day 2.5 exabytes (2.5×260 bytes) of data are generated. Based on an IDC report prediction, the global data volume was predicted to grow exponentially from 4.4 zettabytes to 44 zettabytes between 2013 and 2020. By 2025, IDC predicts there will be 163 zettabytes of data.

A 2018 definition states "Big data is where parallel computing tools are needed to handle data", and notes, "This represents a distinct and clearly defined change in the computer science used, via parallel programming theories, and losses of some of the guarantees and capabilities made by Codd's relational model."

The growing maturity of the concept more starkly delineates the difference between "big data" and "Business Intelligence":


 * Business Intelligence uses applied mathematics tools and descriptive statistics on data with high information density to measure things, detect trends, etc.
 * Big data uses mathematical analysis, optimization, inductive statistics and concepts from nonlinear system identification to infer laws (regressions, nonlinear relationships, and causal effects) from large sets of data with low information density, in order to reveal relationships and dependencies or to predict outcomes and behaviors (see the illustrative sketch after this list).
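
The contrast can be illustrated with a toy example (made-up numbers; statistics.linear_regression requires Python 3.10+): descriptive statistics summarize the data at hand, while inductive statistics fit a model and use it to predict an unseen outcome.

    # Illustrative contrast using made-up numbers (Python 3.10+).
    from statistics import linear_regression, mean, stdev

    ad_spend = [1.0, 2.0, 3.0, 4.0, 5.0]   # hypothetical predictor
    revenue  = [2.1, 3.9, 6.2, 8.1, 9.8]   # hypothetical outcome

    # Business-Intelligence style: describe and measure what happened.
    print(f"mean revenue {mean(revenue):.2f}, spread {stdev(revenue):.2f}")

    # Big-data style: infer a law from the data, then predict a new case.
    slope, intercept = linear_regression(ad_spend, revenue)
    print(f"predicted revenue at spend 6.0: {slope * 6.0 + intercept:.2f}")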

Changes:


 * Move: "The world's technological per-capita... By 2025, IDC predicts there will be 163 zettabytes of data." from Lead to Definition
 * New paragraph: Big data "size" is a constantly...
 * Reorder: "Big data requires a set of techniques and tech..."
 * Edit: describe it -> describe big data
 * Edit: The term has been in use -> The term "big data" has been in use