Data processing

Data processing is the collection and manipulation of digital data to produce meaningful information. Data processing is a form of information processing, which is the modification (processing) of information in any manner detectable by an observer.

Functions
Data processing may involve various processes, including:

 * Validation – Ensuring that supplied data is correct and relevant.
 * Sorting – "arranging items in some sequence and/or in different sets."
 * Summarization – reducing detailed data to its main points, whether statistically or automatically.
 * Aggregation – combining multiple pieces of data.
 * Analysis – the "collection, organization, analysis, interpretation and presentation of data."
 * Reporting – listing detail or summary data, or computed information.
 * Classification – separation of data into various categories.
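The functions above are often applied in sequence. As an illustration only (the records and names are invented for this sketch), a small Python pipeline might validate, sort, classify, aggregate, and report on a handful of sales records:

```python
# Invented sample data for illustration: each record is one sale.
records = [
    {"region": "north", "amount": 120.0},
    {"region": "south", "amount": -5.0},   # invalid: negative amount
    {"region": "north", "amount": 80.0},
    {"region": "south", "amount": 200.0},
]

# Validation: keep only records whose amount is correct (non-negative here).
valid = [r for r in records if r["amount"] >= 0]

# Sorting: arrange the records in some sequence -- by amount, largest first.
ordered = sorted(valid, key=lambda r: r["amount"], reverse=True)

# Classification: separate the data into categories by region.
by_region = {}
for r in ordered:
    by_region.setdefault(r["region"], []).append(r)

# Aggregation / summarization: combine each category into a single total.
totals = {region: sum(r["amount"] for r in rs)
          for region, rs in by_region.items()}

# Reporting: list the summary data.
for region, total in sorted(totals.items()):
    print(f"{region}: {total:.2f}")
```

Each step corresponds to one of the functions listed above; a real system would typically combine several of them in a single pass over much larger volumes of data.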

History
The United States Census Bureau history illustrates the evolution of data processing from manual through electronic procedures.

Manual data processing
Although widespread use of the term data processing dates only from the 1950s, data processing functions have been performed manually for millennia. For example, bookkeeping involves functions such as posting transactions and producing reports like the balance sheet and the cash flow statement. Completely manual methods were augmented by the application of mechanical or electronic calculators. A person whose job was to perform calculations manually or using a calculator was called a "computer."

The 1890 United States Census schedule was the first to gather data by individual rather than household. A number of questions could be answered by making a check in the appropriate box on the form. From 1850 to 1880 the Census Bureau employed "a system of tallying, which, by reason of the increasing number of combinations of classifications required, became increasingly complex. Only a limited number of combinations could be recorded in one tally, so it was necessary to handle the schedules 5 or 6 times, for as many independent tallies." "It took over 7 years to publish the results of the 1880 census" using manual processing methods.

Automatic data processing
The term automatic data processing was applied to operations performed by means of unit record equipment, such as Herman Hollerith's application of punched card equipment for the 1890 United States Census. "Using Hollerith's punchcard equipment, the Census Office was able to complete tabulating most of the 1890 census data in 2 to 3 years, compared with 7 to 8 years for the 1880 census. It is estimated that using Hollerith's system saved some $5 million in processing costs" in 1890 dollars even though there were twice as many questions as in 1880.

Computerized data processing
Computerized data processing, or electronic data processing, represents a later development, with a computer used instead of several independent pieces of equipment. The Census Bureau first made limited use of electronic computers for the 1950 United States Census, using a UNIVAC I system, delivered in 1952.

Other developments
The term data processing has mostly been subsumed by the more general term information technology (IT). The older term "data processing" is suggestive of older technologies. For example, in 1996 the Data Processing Management Association (DPMA) changed its name to the Association of Information Technology Professionals. Nevertheless, the terms are approximately synonymous.

Commercial data processing
Commercial data processing involves a large volume of input data, relatively few computational operations, and a large volume of output. For example, an insurance company needs to keep records on tens or hundreds of thousands of policies, print and mail bills, and receive and post payments.
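The shape of a commercial workload (a large volume of input, little computation per record, and a large volume of output) can be sketched as follows. This is a hypothetical illustration: the policy records, the flat premium, and the 5% tax rate are all invented for the example.

```python
# Hypothetical input: 10,000 insurance policies, each with a flat premium.
policies = [("P%05d" % i, 250.00) for i in range(1, 10001)]

# One simple computation per record, one output line per record.
bills = []
for policy_id, premium in policies:
    tax = premium * 0.05  # assumed 5% tax -- illustrative only
    bills.append(f"{policy_id}: amount due {premium + tax:.2f}")

print(len(bills), "bills produced")
print(bills[0])
```

The point is the ratio: thousands of records flow in and out, while the arithmetic applied to each is trivial, which is the opposite of the analysis-heavy workloads described in the next section.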

Data analysis
In science and engineering, the terms data processing and information systems are considered too broad; the term data processing is typically used for the initial stage, followed by data analysis as the second stage of overall data handling.

Data analysis uses specialized algorithms and statistical calculations that are less often seen in a typical general business environment. For data analysis, software suites like SPSS or SAS, or their free counterparts such as DAP, gretl, or PSPP, are often used. These tools can process very large data sets and perform extensive statistical analysis.

Systems
A data processing system is a combination of machines, people, and processes that, for a given set of inputs, produces a defined set of outputs. The inputs and outputs are interpreted as data, facts, information, etc., depending on the interpreter's relation to the system.

A term commonly used synonymously with data processing system is information system. With regard particularly to electronic data processing, the corresponding concept is referred to as an electronic data processing system.

Simple example
A very simple example of a data processing system is the process of maintaining a check register. Transactions (checks and deposits) are recorded as they occur, and the transactions are summarized to determine a current balance. Each month the data recorded in the register is reconciled against the list of transactions processed by the bank, which should be identical.

A more sophisticated record-keeping system might further classify the transactions, for example deposits by source or checks by type, such as charitable contributions. This information could then be used to obtain figures such as the total of all contributions for the year.

The important point of this example is that it is a system: all transactions are recorded consistently, and the same method of bank reconciliation is used each time.
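The check register above can be sketched as a tiny program. This is a minimal illustration, not a real bookkeeping system; the function names and sample transactions are invented:

```python
register = []

def record(description, amount):
    """Record a transaction as it occurs: deposits positive, checks negative."""
    register.append({"description": description, "amount": amount})

def balance():
    """Summarize the register to determine the current balance."""
    return sum(t["amount"] for t in register)

def reconcile(bank_amounts):
    """Compare our transactions against the bank's list; True if they match."""
    return sorted(t["amount"] for t in register) == sorted(bank_amounts)

# Sample transactions, invented for the example.
record("opening deposit", 500.00)
record("check #101 - charitable contribution", -75.00)
record("check #102 - utilities", -120.00)

print(balance())                              # current balance
print(reconcile([500.00, -75.00, -120.00]))   # monthly reconciliation
```

The system property the text describes shows up here as consistency: every transaction goes through the same `record` function, and `reconcile` applies the same comparison every month.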

Real-world example
Flowchart of a data processing system combining manual and computerized processing to handle accounts receivable, billing, and the general ledger.