User:FGuerino/Information technology industry

The information technology industry is made up of a broad grouping of enterprises and individuals who are involved in the production or consumption of information technology products and services. A myriad of different enterprises and individual participants span the supply and demand sides of the industry. On the supply side, there are enterprises and individual professionals who practice, produce, sell and provide education about information technology products, services and concepts. On the demand side there are enterprises and individuals who consume information technology products and services.

While the exact size of the industry is unknown, it is estimated to consist of billions of people around the world, based on the aggregation of all individuals who supply, support, own or use technology-based products and services, including those in the software, telephone, mobile device, radio, television and computer industries, to the extent those industries represent, are composed of, or leverage information technology.



Differing uses of the phrase information technology
The phrase information technology (or IT) may be used in one of four common contexts:
 * 1) Information technology as it relates to actual technology or products, as in the case of a computer or a software language. For example, "a mobile device is a product of information technology" (see information technology for more information);
 * 2) Information technology as it relates to a profession or a discipline. For example, someone can study information technology, get a degree in it, and practice it (see information technology profession for more information);
 * 3) Information technology as it relates to organizations or groups that provide information technology solutions and/or services (as in the case of an IT organization). For example, "information technology is running a project to deliver a solution" (see information technology organization for more information); and
 * 4) Information technology in the context of an industry that is composed of all people, organizations, products and services related to information technologies in the product context, as in the case of the IT industry. For example, "spending on information technology will grow to approximately $2.7 trillion in 2013."

This article is about the last meaning: information technology, as an industry.

Related industries
The information technology industry ties together multiple smaller technology-related industries. Examples include but are not limited to:
 * Computer industry
 * Data processing industry
 * Informatics and computing (IC) industry (also known as the computing industry)
 * Information systems (IS) industry
 * Management information systems (MIS) industry
 * Software industry

Independent industry research sources
Sources of independent research about the information technology industry include Gartner Research, Forrester Research, Thomson Reuters, Dun & Bradstreet and Moody's, which explicitly develop and sell their research results for profit. There are also notable banking, investment and securities enterprises that perform and publish research on the IT industry for shareholders as a means of explaining and supporting their business strategies and operations, such as Barclays, Citigroup, Goldman Sachs and Morgan Stanley.

Such research organizations are recognized for shaping the definition of the industry through classification of its segments and through tracking, measuring and analyzing the IT industry, so that consumers of such information can use it to understand industry growth, performance and behavior.

Manner of classification within the IT industry
Enterprises (public, private or otherwise) are classified by independent industry research sources as part of the IT industry based on their primary or majority business functions and purposes, or on their largest sources of revenue. For example, if 51% of an enterprise's business activities are related to selling information technology-related products and services, or if an enterprise earns 51% of its revenue from selling IT-related products and services, that enterprise will be classified as an IT industry enterprise, regardless of what comprises the remaining 49% of its business. Research and analysis companies responsible for tracking, measuring and identifying trends within the IT industry nevertheless incorporate the minor IT-related business functions of enterprises that are not primarily classified as IT companies into the broader IT industry, as a means of maintaining accuracy about the industry.
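The majority-revenue rule described above can be sketched as a small function. This is an illustrative sketch only: the segment label "IT products and services" and the example revenue figures are hypothetical, and real research firms apply considerably more nuanced criteria.

```python
# Illustrative sketch of the majority-revenue classification rule:
# an enterprise is classified as an IT industry enterprise when more than
# half of its revenue comes from IT-related products and services.

def classify_enterprise(revenue_by_segment: dict) -> str:
    """Return 'IT industry' if IT-related revenue is the majority share."""
    total = sum(revenue_by_segment.values())
    it_share = revenue_by_segment.get("IT products and services", 0.0) / total
    return "IT industry" if it_share > 0.5 else "non-IT industry"

# 51% of revenue from IT-related sales -> classified as IT industry
print(classify_enterprise({"IT products and services": 51.0, "other": 49.0}))
# A manufacturer with a small IT arm -> classified as non-IT industry
print(classify_enterprise({"IT products and services": 20.0, "other": 80.0}))
```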

Examples of enterprises that are considered part of the IT industry
The information technology industry is extensive, comprising thousands of enterprises. While it is impossible to maintain a complete list of IT industry enterprises, as the industry is constantly changing, the NASDAQ stock exchange maintains a list of 655 public information technology companies that appear on its exchange.

Financial quantification of the industry
Estimates of the total finances involved in the industry vary across research institutions (such as Gartner Research and Forrester Research). According to the American information technology research and advisory firm Gartner, the general estimated valuation of all aspects of IT as of 2013, including people, organizations, products and services, is in the range of $2.7 to $3.7 trillion U.S. dollars (USD).

Industry related products and services


The IT industry is driven by supply and demand. The supply side of the industry delivers products and services to its consumers, who generate and control demand for such products and services.

IT products are segmented into two broad categories: hardware products (or just hardware) and software products (or just software), each with an associated service segment.


 * Hardware products represent all information technology items that have a physical presence (i.e. devices) that can be touched by a human, such as a computer, a mobile telephone, a video game console, or a hard disk. Hardware services represent those services performed by individuals, on behalf of themselves or the enterprises they represent, which exist to support any part of a hardware technology, its lifecycle or its use.


 * Software products represent those items that are digital in form and which run on computing devices, such as desktop computers, laptop computers, mobile telephones and mainframe computers. Software services represent those services performed by individuals, on behalf of themselves or the enterprises they represent, which exist to support any part of a software technology, its lifecycle or its use.

Classifications as a means of understanding the industry
Given the information technology industry's size, those who research and track it use a variety of methodologies in an attempt to make sense of it. The most common method of doing so is to compartmentalize the industry into labeled sections that have clearer meaning or purpose (i.e. categorize, organize and classify).

The information technology industry invests a great deal of time and money trying to predict, measure and analyze industry performance. In order to do so, categorizations or classifications of people, enterprises, products and services within the industry are used as a means of segmenting both the providers and consumers within it. Some classifications are common to other industries, such as classifications by consumer type or by enterprise size, while others are unique to the IT industry, such as deliverable types and the use of technology adoption lifecycles. Some classifications are specific to the supply side of the industry (i.e. provider specific), while others are specific to the demand side (i.e. consumer specific).

Such classifications are applied against each other and used by professionals such as researchers, marketers, product developers and sales staff to predict, track, measure and understand both the supply side of the industry and the demand side of the industry. For example, a researcher or marketer might want to understand the expected and actual flow of industry deliverable types that originate in and are supplied by different sized enterprises, over time, against the consumption by different sized enterprises in various vertical industries, by technology adoption lifecycle phases, so as to understand certain supply versus demand trends and patterns.

Such classifications are also used by educators within the industry to teach students and professionals about the industry. For example, to answer such questions as who exists on the supply side of the industry, what they develop and deliver, what they consume, how they behave when they consume and what drives that consumption.

General industry classifications
Those who study, track or work within the IT industry invest a great deal of effort in categorizing and classifying it by traits so that they can perform functions such as strategic planning, development, marketing, sales, delivery, operations, support and decommissioning of products and services.

Individuals versus enterprises
The simplest classification of the industry, applying to both the supply side and the demand side of the industry, is to break it down into the broad grouping of individuals versus enterprises.

In the case of the supply side, this is represented by individual suppliers (or individual providers), denoting the pool of human beings who build, deliver and support information technology related products and services to or for others. This is contrasted against enterprise suppliers (or enterprise providers), denoting the pool of enterprises that build, deliver and support information technology related products and services to or for others.

In the case of the demand side, this is represented by individual consumers, denoting the pool of human beings who purchase and consume IT related products and services for themselves and other individual consumers, such as their family members and friends. This is contrasted against enterprise consumers, referring to entities (including private and public companies, non-profit and charitable organizations, governments and educational institutions) that purchase and consume IT related products and services for their own operations.

Industry by enterprise size
One of the most common general forms of classification is the segmentation of the industry by the size of an enterprise. For example:
 * 1) Small-sized enterprises (also known as small enterprises)
 * 2) Mid-sized enterprises
 * 3) Large-sized enterprises (also known as large enterprises)

While the defined size of these categories varies across sources, the largest research institutions, such as Gartner Research and Forrester Research, commonly use small-sized enterprises to refer to a range between one and a few hundred people, mid-sized enterprises to refer to a few hundred to a few tens of thousands of people, and large-sized enterprises to refer to tens of thousands to hundreds of thousands of people.

Further sub-classification may be used to achieve greater granularity. For example, the large enterprises category may be broken down into large and super large (or jumbo) enterprises, for ranges capped at about one hundred thousand people and those above this amount, such as in the case of conglomerates. However, most industry research entities, such as Gartner Research, Forrester Research, Thomson Reuters and Dun & Bradstreet, tend to work with the three tiers above.
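The size bands described above can be sketched as a simple bucketing function. The exact thresholds vary by research firm, so the cutoffs below (500 and 30,000 people) are assumptions chosen for illustration only, loosely matching "a few hundred" and "a few tens of thousands".

```python
# Illustrative sketch: bucketing enterprises into the three common size tiers
# by headcount. Thresholds are assumed for illustration; real research firms
# define their own cutoffs.

def enterprise_size(headcount: int) -> str:
    if headcount < 500:          # "one to a few hundred people"
        return "small"
    if headcount < 30_000:       # "a few hundred to a few tens of thousands"
        return "mid-sized"
    return "large"               # "tens to hundreds of thousands"

print(enterprise_size(120))      # small
print(enterprise_size(5_000))    # mid-sized
print(enterprise_size(150_000))  # large
```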

Classifications specific to supply
The IT industry attempts to classify both the people and enterprises who provide IT products and services and the different types of products and services delivered by them.

Classification of industry by deliverable types
One common classification method is by type of product sold or consumed and the producing industry actor. Such ordering is often done by research organizations, such as industry research companies and marketing organizations within companies. Examples include:


 * By device: Physical products that are directly used by human consumers.
 * By datacenter systems: Products that are targeted at, exist for, or are run within datacenters, which are enclosed rooms or facilities of grouped computing equipment, often intentionally secured from and made off limits to end users.
 * By information technology services: Industry offerings that are not physical products, but rather represent services performed to plan for, deliver or support physical IT products.
 * By enterprise software (a/k/a Software): Products that are virtual, not physical, and that run on computing devices.
 * By telecommunications services (a/k/a Telecom Services): Services which focus, specifically, on the enabling, execution and support of communicating data and information between devices.

An alternative representation that is published and followed by Forrester Research takes the form of:


 * By communications equipment: Equipment that is dedicated to all aspects of communicating data and information between devices.
 * By computer equipment: Computers, computing devices and supporting peripherals.
 * By IT consulting and systems integration services: Services performed by people for the setting of strategy, delivery, operations and support of information technology devices.
 * By IT outsourcing and hardware maintenance: Third party solutions.
 * By software: Products that are virtual, not physical, and that run on computing devices.

Professionals who perform functions like predicting, tracking, or analyzing industry performance use such segmentation to quantify who delivers what, the costs to develop such deliverables, the revenue generated by each deliverable type, and who is consuming such deliverables within and across different lifecycles, such as the market maturity lifecycle or the technology product lifecycle. This allows for an understanding of trends and patterns that highlight where investments are flowing within and across the industry, where demand is high or low, and where supply is high or low.

Classification of industry by hosting type
An emerging classification is that of self hosting services versus cloud services. This includes but is not limited to cloud computing.

Self hosting services describe enterprises that consume IT solutions and take full control of the procurement, delivery, installation, execution and support of many of their IT solutions. In other words, they look to themselves to provide and maintain internal IT solutions, regardless of the industry they are a part of. For example, a bank or a vehicle manufacturer may choose to deliver, operate and support its own IT solutions itself.

Cloud services describe enterprises that consume IT solutions by looking to other enterprises, external to their own boundaries, for the procurement, delivery, installation, execution and support of their IT solutions. For example, a bank or a vehicle manufacturer may look to an external third party to deliver, operate and support IT solutions on its behalf.

Another common way of looking at the two is internal cloud services or internal cloud, which represents self hosted IT solutions, versus external cloud services or external cloud, which represents third party hosted IT solutions.

Classification of industry by technology lifecycle maturity
Industry maturity is often broken down into product and service lifecycles. (Note: The technology lifecycle should not be confused with the systems development lifecycle (SDLC), which is specifically about controlling product development and delivery pipelines.)

Although there may be multiple labels for such stages, they are usually classified into one of the five key areas:




 * 1) Pre-emerging: Technologies, products or services in the market that are still in research and which have not been released to consumers for general availability (GA);
 * 2) Emerging: Technologies, products or services that have been developed and are in their initial phases of introduction to and penetration of the market, in anticipation of sales to the broader market;
 * 3) Mature: Referring to technologies, products or services that have been delivered, are established and are currently in use by the masses;
 * 4) Declining: Technologies, products or services  that are nearing the end of their lifecycle and will soon be replaced by emerging solutions; and
 * 5) Declined: Technologies, products or services that are either no longer in use or have very limited use because of modern replacements, persisting only where they are heavily established or entrenched.

Industry trackers, such as research institutions and providers of IT products and services, use such lifecycles and their individual phases as a means of setting strategy for products and services, planning for and controlling their development and delivery, estimating costs and revenue recuperation within each phase, marketing to consumers who buy within specific phases, and understanding how long such products and services will remain relevant and useful to their consumers after delivery to the market.

Classifications specific to demand
The IT industry categorizes and classifies those people and enterprises who consume products and services. Doing so allows those who track and measure the industry, as well as those who participate within it, to perform functions like setting strategy, planning, marketing, developing, selling, delivering, operating, supporting and decommissioning for products and services that are driven by industry demand.

Classification of vertical industries that consume technology
Another means of categorizing the broader IT industry is to break it down into what are called vertical industries, which represent categories or groupings of purchasers or consumers with common traits, usually related to the purpose of their existence. Examples of such vertical industries include those commonly published by Gartner:


 * Banking and Securities
 * Communications, Media and Services (CMS)
 * Education
 * Government
 * Healthcare
 * Insurance
 * Manufacturing and Natural Resources
 * Retail
 * Transportation
 * Utilities
 * Wholesale

Classification of individual consumer personality traits
In 1991, Geoffrey A. Moore published a classification of technology industry consumers that characterized each consumer type by traits highlighting how they react to discontinuous or disruptive technology. These traits broke consumers into five distinct categories falling within areas of a bell curve: innovators, early adopters, the early majority, the late majority and laggards. This representation is known throughout the industry as the technology adoption lifecycle and represents a standard for technology marketing and sales.

Moore's work was based on an earlier body of work called The Diffusion Process, published in 1957. However, The Diffusion Process had addressed only the concept's application to agriculture and home economics, giving Moore the opportunity to extend it to information technology.

These five traits, as described by Moore, include:



Innovators – technology consumers who aggressively pursue new technologies and technology related products for a wide range of reasons, including obsessive interest, curiosity, intrigue, pleasure and ego (i.e. the competitive need to be seen as leaders or groundbreakers); technology is a central interest in their lives, regardless of its purpose. As a result, innovators seek out new technologies before others do, often before the public is ever informed of such technologies. The pool of innovators is small in comparison to all other technology consumer types but critical, because winning over innovators early yields advocates who help drive products to market: their endorsement helps educate and reassure other market consumers that the product is viable for use.

Early adopters – similar to innovators, with the exceptions that these consumers are not technologists and are not as aggressive about seeking out new technologies and technology related products. Instead, early adopters are people who can easily imagine and appreciate the potential benefits that come with the application or use of new technologies; they see new technologies as a means of solving real problems, long before others see such potential. Early adopters also tend to buy and apply such technologies on intuition, rather than on reference, because they are buying long before a technology or product is established in the marketplace. As a result, they are considered critical to spearheading the market segments they represent. This segment is slightly larger than the innovator segment.

The early majority – technology consumers who, like early adopters, have some ability to relate technologies to problems that need solving but, being more practical, have less tolerance for risk and more patience for stability. Early majority individuals see many new technologies as passing fads and only want to interface with technologies and technology related products that are considered stable and lasting, reducing the need for significant investment to replace such solutions every time a similar technology is introduced to market. Early majority individuals seek well-established market references before they buy into technology solutions. This segment is one of the largest, representing approximately one third of the market, and because of its size, winning business in this segment is critical to developing substantial growth and profit in the market.

The late majority – technology consumers who are very much like the early majority, with the exception that they fear technology and tend to avoid it until they see adoption by the masses. This market is so risk averse that highly established references are often not good enough to convince members of this class to purchase technology products; they prefer to wait until they see that a majority of the consumer base, including highly established, large enterprises with very well-known brands, is already using such solutions. Like the early majority, this segment is roughly one third of the market and is considered highly profitable, because selling into it helps maintain profits while technology products are moving towards the end of their lifecycles, where all investments to develop and deliver them have been fully amortized.

The laggards – technology consumers who are either totally disinterested in or terrified of technology. This segment will only intentionally buy technology when they feel they have to or are forced to do so. Laggards are considered a market segment not worth pursuing by technology sales organizations because of their small market footprint, because they are difficult to sell to, and because they buy very little on the rare occasions that they make purchases.
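The five segments above can be sketched as shares of a bell curve. The percentages used below are the commonly cited segment shares from the diffusion research that Moore built upon (roughly 2.5%, 13.5%, 34%, 34% and 16%); they are approximations for illustration, not figures stated in this article.

```python
# Illustrative sketch: mapping a consumer's position in cumulative adoption
# order onto the technology adoption lifecycle segments. Segment shares are
# the commonly cited approximations from diffusion research.

SEGMENTS = [
    ("innovators", 0.025),
    ("early adopters", 0.135),
    ("early majority", 0.34),
    ("late majority", 0.34),
    ("laggards", 0.16),
]

def segment_for(cumulative_share: float) -> str:
    """Return the segment containing the given cumulative adoption share."""
    running = 0.0
    for name, share in SEGMENTS:
        running += share
        if cumulative_share <= running:
            return name
    return SEGMENTS[-1][0]

print(segment_for(0.01))  # among the very first adopters
print(segment_for(0.50))  # the middle of the market
print(segment_for(0.99))  # the tail end of the market
```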

Moore's extension of The Diffusion Process principles to the IT industry is now widely used by IT research and marketing professionals, allowing such work to be applied to IT product and service strategy development, research, design, development, marketing and selling.

Because human consumer traits (i.e. technology adoption lifecycle traits) are closely related to market maturity traits, they can be used by professionals who perform marketing and sales functions to identify the specific types of consumers they want to attract and sell their own products and services to, in each of the market maturity phases, as well as to understand things like the psychology and behavior patterns of such consumers, within each phase.

History and important events
The information technology (IT) industry, or what many view as the modern computing era, has evolved and established itself through the occurrence of many important events in history, over the span of a few centuries.

The storage of electricity as a foundation for batteries
In 1800 (specific date unknown), Alessandro Volta developed a battery made from copper and zinc (credited as the first electrochemical cell) that allowed the storage of electricity, beginning the race to develop more powerful and stable sources of electricity. Such technology would eventually evolve into the modern electronic batteries that help power devices like mobile phones and laptop computers.

Data over long distance wire in the form of codes


In the mid-1800s, the industry saw the birth of two key concepts of information technology that included the ability to communicate electronic signals (i.e. data) for long distances over a medium such as wire, and the ability to use levers and buttons (via the human sense of touch) to control data entry and transmission, which would later drive the evolution of solutions such as the teletypewriter and the electronic keyboard.

In 1832 (specific date unknown), Pavel Schilling (also known as Paul Schilling) invented an early electrical telegraph, improving upon previous optical telegraph designs by allowing transmission lengths that exceeded 1,200 meters, far surpassing its optical predecessors.

In 1833 (specific date unknown), Carl Friedrich Gauss and Wilhelm Weber invented their own communications code which could be transmitted over Schilling's electrical telegraph, and which later became the foundation for Morse code and, ultimately, digital signal processing.

Transmission of data and information evolves to handle voice and audio
In March of 1876, the United States Patent and Trademark Office (USPTO) granted Alexander Graham Bell a patent for a device that allowed the transmission of analog audio signals over wire – i.e., the telephone. While there is dispute over who first invented the telephone, Bell was the first to receive a patent for such work, allowing him to secure commercial rights for its development and sale.

The award of the patent became the foundation for the Bell Telephone Company, founded on July 9, 1877, and all its derivative Bell companies, which evolved to take significant roles in the development of the information technology industry as they created competition around telephony and, more specifically, the advancement of analog (and later digital) data and information over various forms of transmission media.

Wireless data transmission is born


In 1879 (specific date unknown), David E. Hughes is credited with having transmitted the first radio signals over a few hundred yards, without a physical medium such as a cable, by means of what was described as a clockwork keyed transmitter. Hughes's work would go on to become a foundation for wireless computing networks and wireless mobile communications.

In addition to the work performed by Hughes, Thomas Edison used a vibrating magnet to develop induction transmission of signals. Based on this work, in 1888 he delivered a simple communications system that allowed for the transmission of signals for the Lehigh Valley Railroad, earning him a patent for his work, in 1891.

In 1888, Heinrich Hertz proved the existence of electromagnetic waves, which is the underlying basis of most wireless technology.

While Michael Faraday and James Clerk Maxwell had predicted the theory of electromagnetic waves in earlier research, Hertz was able to prove that electromagnetic waves traveled through space in direct paths and could be transmitted as well as received by an electromagnetic transmitter and receiver, respectively.

As wireless data and information transmission progressed, practical applications of wireless radio communication and remote control technology were implemented by later inventors, such as Nikola Tesla.

Video establishes itself as a means of data and information transmission
In 1897 (specific date unknown), German physicist Ferdinand Braun published his work on the Braun tube, which would later drive experimentation for the development of the cathode ray tube (CRT).

Visual transmission of signals would ultimately become the foundation for the television and, later, the computer monitor, both prevalent in many of the products sold in the IT industry that leverage video as a means of data and information transmission.

Transistors become the foundation for semiconductors
In 1925 (specific date unknown), the physicist Julius Edgar Lilienfeld, working in Canada at the time, filed a patent for a field-effect transistor (FET), which was intended to be a solid-state replacement for the triode. In 1926 and 1928 Lilienfeld also filed U.S. patents, further laying the foundation for what would become a booming semiconductor industry, even though he never published any formal research on the topic. The magnitude of his work is evident in the fact that almost every modern electrical device that relies on, manipulates, or transmits data contains one or more semiconductors.



Storage as a means of persisting data and information
In 1932 (specific date unknown), what was called drum memory was invented by Gustav Tauschek, in Austria. It became the first form of a magnetic data storage device and the foundation for computer related data storage work.

Data and information explosion is presented to the world
In 1944 (specific date unknown), Fremont Rider, a Wesleyan University librarian, published a paper called "The Scholar and the Future of the Research Library," in which he estimated that American university libraries were doubling in size approximately every sixteen years. Rider speculated that, at this growth rate, the Yale Library of 2040 would contain "approximately 200,000,000 volumes, which would also occupy over 6,000 miles of book shelves... also requiring a cataloging staff that was estimated to consist of over six thousand persons." Unbeknownst to Rider, his paper is now considered the first published work to highlight the problem of data explosion, or big data.
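Rider's doubling estimate can be checked with simple arithmetic: from 1944 to 2040 is 96 years, which at one doubling every 16 years is six doublings, a 64-fold increase in collection size.

```python
# Worked example of Rider's 1944 growth estimate: library collections
# doubling every 16 years, projected from 1944 out to 2040.

doubling_period = 16               # years per doubling, per Rider
years = 2040 - 1944                # 96 years
doublings = years / doubling_period
growth_factor = 2 ** doublings

print(doublings, growth_factor)    # 6.0 64.0
```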

Big information companies start to evolve
On July 9, 1877, the Bell Telephone Company was established as a common law joint stock company in Boston, Massachusetts by Alexander Graham Bell's father-in-law Gardiner Greene Hubbard, who also helped organize a sister company, the New England Telephone and Telegraph Company. The two companies merged on February 17, 1879 to form two new entities, the National Bell Telephone Company of Boston and the International Bell Telephone Company, which was established by Hubbard and later headquartered in Brussels, Belgium.

On March 20, 1880, the National Bell Telephone Company subsequently merged with others to form the American Bell Telephone Company, also of Boston, Massachusetts.

In 1911 (specific date unknown), the Computing Tabulating Recording Company (CTR), which would later rebrand itself as International Business Machines Corporation (IBM), was established through a merger of three companies: the Tabulating Machine Company, the International Time Recording Company and the Computing Scale Company. CTR adopted the name International Business Machines in 1924, using a name previously designated to CTR's subsidiary in Canada and later South America. Securities analysts nicknamed IBM "Big Blue" in recognition of IBM's common use of blue in products, packaging and logo. IBM went on to establish itself as one of the most dominant and long-lasting forces in electronic computing by delivering IT products and services.

In 1925 (specific date unknown), Bell Labs, formally Bell Telephone Laboratories, Inc., was established as a separate legal entity through the consolidation of the Western Electric Research Laboratories and part of the engineering department of the American Telephone & Telegraph Company (AT&T).

Electronic computers are born


On April 2, 1943, John Mauchly and J. Presper Eckert, both of the Moore School of Electrical Engineering at the University of Pennsylvania (UPenn), submitted a formally documented proposal that represented their ideas for building an "Electronic Calculator" to the U.S. Army’s Ballistic Research Laboratory.

On April 9, 1943, the resulting contract was signed by both parties. Out of this partnership, the now historic Electronic Numerical Integrator And Computer (ENIAC) was born, the first electronic general-purpose computer.

On June 30, 1945, John von Neumann published the First Draft of a Report on the EDVAC, and he is credited as the inventor of the von Neumann architecture. The report is considered the first formally documented discussion of what is now known as the stored-program concept, which remains the foundation of general computer architecture to this day.

On February 14, 1946, almost three years after John Mauchly and J. Presper Eckert signed their agreement with the U.S. Army’s Ballistic Research Laboratory, ENIAC was finally completed, and the first fully electronic computer was delivered to the laboratory. To this day, ENIAC is recognized as a foundation for modern computers.

Together, these two milestones merged electronic computing devices with stored programs, combining software (firmware at the time) with hardware.

In February 1951 (about five years after the delivery of the ENIAC), the Ferranti Mark 1, also known as the Manchester Electronic Computer in its sales literature, was delivered to the University of Manchester, just ahead of the UNIVAC I, which was delivered to the United States Census Bureau a month later. The Ferranti Mark 1 is recognized by many as the first commercially available electronic computer, one that could be purchased outside of government-funded research programs.

Integrated circuits and Moore's Law


In 1957, building on foundational research by Geoffrey W.A. Dummer, who had conceived of an integrated circuit in 1952 but was unable to successfully implement one, Jack Kilby proposed to the United States Army his idea of creating small ceramic squares, called wafers, that would contain miniaturized components.

Kilby later worked for Texas Instruments, where he demonstrated the first working integrated circuit on September 12, 1958, and he was later awarded the Nobel Prize in Physics for his contributions to the development of semiconductor-based integrated circuits. In his patent filing, Kilby described his design as "a body of semiconductor material ... wherein all the components of the electronic circuit are completely integrated."



In 1965, Gordon E. Moore, who would later cofound Intel Corporation, published a paper describing an observed trend: over the documented history of computing hardware, the number of transistors that could be co-located on a semiconductor-based integrated circuit had doubled approximately every two years.

In this publication, Moore also predicted that the number of transistors on a chip would continue to double approximately every year for at least a decade, a figure later adjusted to approximately every two years. While unknown at the time, Moore's prediction became a significant factor in the rapid growth of the IT industry, as enterprises continued to plan and forecast based on Moore's law long after his initial ten-year horizon had passed. More than half a century after his observations, the industry continues to revolve around his work. Because transistors became the foundation for most information technology, Moore's law would eventually prove accurate even in specific areas of technology, such as computer processing, storage and persistence, communications and visualization.
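The doubling described by Moore's law can be sketched as a simple exponential projection. The sketch below is purely illustrative; the 1971 baseline figure (roughly 2,300 transistors, often cited for the Intel 4004) is used only as an example input, not as part of Moore's paper.

```python
# Illustrative sketch of Moore's law: the transistor count doubles
# once per doubling period (commonly quoted as two years).

def projected_transistors(base_count, base_year, target_year, doubling_period=2):
    """Project a transistor count assuming one doubling per period."""
    doublings = (target_year - base_year) / doubling_period
    return base_count * 2 ** doublings

# Ten years at a two-year doubling period gives 2**5 = 32x growth.
print(projected_transistors(2300, 1971, 1981))  # -> 73600.0
```

Changing `doubling_period` to 1 reproduces Moore's original one-year prediction, which grows far faster: the same decade would yield a 1,024-fold increase instead of 32-fold.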

Compression helps make data smaller and faster
In November 1967, B. A. Marron and P. A. D. de Maine published an article called "Automatic data compression" in the Communications of the ACM (Volume 10, Issue 11, November 1967, pages 711-715), stating that "The information explosion noted in recent years makes it essential that storage requirements for all information be kept to a minimum." In the article, the authors describe their compression algorithm as "a fully automatic and rapid three-part compressor which can be used with any body of information to greatly reduce slow external storage requirements and to increase the rate of information transmission through a computer." Their work would go on to influence the development of now massive areas such as digital storage, data reception and data transmission, all of which rely heavily on data compression algorithms for their success.
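To illustrate the general idea of compression discussed above, the sketch below shows run-length encoding, one of the simplest compression schemes. It is not the three-part compressor described by Marron and de Maine; it only demonstrates how redundancy in data can be exploited to reduce storage.

```python
# A minimal run-length encoder: collapse runs of repeated characters
# into (character, count) pairs. Highly repetitive input shrinks a lot;
# input with no repeats can actually grow, which is why practical
# compressors are far more sophisticated.

def rle_encode(text):
    """Return the input as a list of (char, run_length) pairs."""
    if not text:
        return []
    runs = []
    current, count = text[0], 1
    for ch in text[1:]:
        if ch == current:
            count += 1
        else:
            runs.append((current, count))
            current, count = ch, 1
    runs.append((current, count))
    return runs

print(rle_encode("aaabbc"))  # -> [('a', 3), ('b', 2), ('c', 1)]
```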

Computers connect through what is to become the internet
On October 29, 1969, the first two nodes of what would soon become the ARPANET (and later the Internet) were interconnected between two separate physical locations: Leonard Kleinrock's Network Measurement Center at UCLA's School of Engineering and Applied Science, and Douglas Engelbart's NLS system at SRI International (SRI) in Menlo Park, California. Later, a third site, the Culler-Fried Interactive Mathematics center at the University of California, Santa Barbara, and a fourth site, the University of Utah Graphics Department, were added to ARPANET as the network began its expansion toward what we now call the modern internet.

Young adults and children are exposed to video game computing


Until the 1970s, IT products were predominantly marketed to adults, either as private consumers or through the enterprises they worked for, with the limited exception of lower-end consumer technologies such as record players, radios and cassette tape recorders. In the 1970s, with the introduction of low-cost video games, information technology marketers began to realize just how influential young adults and children were in driving demand for new and more modern information technology products. The establishment and marketing of video games quickly showed that young adults and children represented a very large and influential set of innovators and adopters on the technology adoption lifecycle curve.

The first such introduction occurred in September 1971, when Galaxy Game, the first coin-operated video game, was installed on the campus of Stanford University. Only a single unit was built, using a DEC PDP-11 computer and vector display terminals to present visuals to the player. Soon afterwards, in 1972, the installation was expanded to support four to eight consoles.

Also in 1971, Nolan Bushnell and Ted Dabney created their own coin-operated arcade game, calling it Computer Space. The rights to the system were purchased by Nutting Associates, which manufactured more than 1,500 Computer Space machines following the game's initial release in November 1971. The game is considered a major landmark in the video game industry for two specific reasons: it was the first mass-produced and commercially available video game, and it provided the first public, mass exposure of computing devices to young adults, who make up a significant consumer segment of the information technology industry, long before adults were exposed to home use of computers.

In 1972, Bushnell and Dabney founded Atari, Inc., and soon after released their next game: Pong. While Computer Space was a commercial failure because of its steep learning curve, Pong was met with widespread success; Atari eventually sold over 19,000 Pong machines, which led to an explosion in the industry as imitators jumped to try to replicate Atari's success.

Transmission media becomes commercial and robust
On May 22, 1973, Robert Metcalfe published a memorandum at Xerox Palo Alto Research Center (PARC) that is considered the documented invention of Ethernet. Ethernet cable became the transmission medium of choice for commercial computing applications and is still widely used, positioning Xerox to become an industry leader in computer networking equipment.



Introduction of the first commercial personal computers
In 1973 (specific date unspecified), the Micral N became the first mass-produced personal computer. What set the Micral N apart was that it was based on the Intel 8008 microprocessor and was the first of its kind that did not need to be assembled by the purchaser from a kit. This model of pre-constructed and, later, even pre-configured systems would become the baseline for what is now the vast personal computer industry, encompassing desktop computers, laptop computers, notebook computers, mobile devices and even video game consoles.

Commercial mainstreaming of data and information on the internet
In March 1989, while at CERN, Tim Berners-Lee published the paper "Information Management: A Proposal," in which he outlined a global hypertext system, the foundation of what as of 2013 is one of the most influential means of transmitting data between computing devices. In this publication, he also discussed his vision of computers interoperating with each other in a fashion where they could interpret and understand one another, an idea he would later develop into the semantic web (also referred to as Web 3.0).

In late 1992 (specific date unknown), the Mosaic web browser, also known as NCSA Mosaic, was created by the National Center for Supercomputing Applications (NCSA) at the University of Illinois Urbana-Champaign as a means of discovering data shared by other computers across the Internet. The browser provided a graphical user interface on the user's computer and acted as a software client for early communications protocols such as FTP, NNTP and Gopher.

In 1993, NCSA released the browser to the general public. What made Mosaic important is that it was the first browser to display images inline with text rather than in a separate window. This made it user-friendly and easy for the general public to install and learn, which further helped solidify its popularity within commercial web applications. Mosaic later became the foundation for more advanced browsers, as many of the original NCSA developers went on to work on other browsers, such as Netscape Navigator and Mozilla Firefox. Many of the most notable web browsers, including Google Chrome, Internet Explorer and Mozilla Firefox, retain features and traits of the original Mosaic browser, such as inline image display, the back button and the page refresh concept.

Interactive applications become the norm on the internet
On September 30, 2005, Tim O'Reilly published his formal definition of Web 2.0 in the article "What Is Web 2.0: Design Patterns and Business Models for the Next Generation of Software." O'Reilly recounted how the concept of Web 2.0 began in a conference brainstorming session between O'Reilly & Associates (now O'Reilly Media) and MediaLive International. Contrary to public chatter that Web 2.0 represented a set of technology features or even specific types of technologies, O'Reilly clarified that Web 2.0 represents a set of traits or characteristics common to enterprises that thrive, or wish to thrive, on the internet or web. In summary, the Web 2.0 traits are:


 * 1) Using the Internet web as a backbone infrastructure for enterprise class solutions;
 * 2) Static publishing and page views by individuals are replaced or enhanced with dynamic collaboration by groups;
 * 3) Applications on the web will harness and share collective intelligence;
 * 4) Elimination of traditional HTML as it is replaced with more dynamic web pages and dynamic links;
 * 5) Replacement of static content with more transactional databases;
 * 6) Elimination of traditional software development, deployment and maintenance with more managed solutions; and
 * 7) Rich user experiences.

According to O'Reilly, a successful web application was never required to exhibit all of the traits listed above; rather, it needed to display some combination of most of them.

Market indices as a means of tracking industry growth and performance
The growth and performance of the IT industry are commonly tracked and monitored through stock market indices: price- or capitalization-weighted groupings of similar technology stocks that can be compared against each other as well as against other market indices.
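As a hypothetical illustration of the simplest weighting scheme, a price-weighted index level is the sum of its constituents' share prices divided by an index divisor. The tickers, prices and divisor below are invented for the example and do not correspond to any real index.

```python
# Sketch of a price-weighted index: higher-priced stocks move the
# index more, regardless of company size. The divisor is adjusted in
# real indices to keep the level continuous across stock splits.

def price_weighted_index(prices, divisor):
    """Return the index level for a mapping of ticker -> share price."""
    return sum(prices.values()) / divisor

prices = {"TECH_A": 150.0, "TECH_B": 90.0, "TECH_C": 60.0}
print(price_weighted_index(prices, divisor=3.0))  # -> 100.0
```

By contrast, a capitalization-weighted index would multiply each price by shares outstanding before summing, so larger companies dominate the level instead of higher-priced shares.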

Examples of such indices include but are not limited to: