Talk:History of computing hardware (1960s–present)

General
I'm no designer, but would it be possible to clean up the table for the "History of Computing" series? It is a bit of an eyesore. --Small business 18:21, 4 May 2004 (UTC)

Should something be said in this article about the rudimentary quantum computers being built today? Just a thought. Lupin 22:16, 15 Jun 2004 (UTC)

"15-inch printed circuit board": Is inches a standard measurement of circuit board sizes? A link to a page that explains how to convert from this archaic measurement system would be helpful. Rakshasa 10:36, 23 Jun 2004 (UTC)

Reworked the Apple II section. It sounded great but was wildly inaccurate. Alatari 11:23, 22 June 2007 (UTC)

Mainframes

 * With the development of storage area networks and server farms of thousands of servers, by the year 2000 the minicomputer had all but disappeared, and mainframes were largely restricted to specialised uses. The Google server farm is thought to be the largest, with a total calculation rate three times that of Earth Simulator or Blue Gene, as of 2004.

This is bullshit. Mainframe is still IBM's most profitable branch (literally billions of dollars earned), because banks and governments would never accept the inherent unreliability of PC farms. Also, PC-based clusters simply cannot have the vast I/O performance, traditional to mainframes, which these big brothers require. Without tremendous I/O bandwidth (e.g. serving 15,000 interactive users and six fully loaded ATM circuits at the same time) the CPU power is of little real-world use. — Preceding unsigned comment added by 195.70.48.242 (talk) 15:00, 25 January 2005

Amen. Definition of mainframe: an obsolete device still used by thousands of obsolete companies serving billions of obsolete customers and making huge obsolete profits for their obsolete shareholders. And this year's run twice as fast as last year's. Signed, an enterprise architect for one of the US's largest banks.

65.25.216.35 03:59, 8 January 2006 (UTC)

Tandy's Dominance
I worked in a software store in Saint Louis, Missouri from 1979 to 1981 and the TRS-80 was the dominant business machine. Small businesses were paying $300–$500 in 1979 dollars for database and accounting packages plus $50 per hour for custom programming for their TRS-80. The TRS-80's success in the custom business accounting and database markets got IBM's attention. Review the article on TRS-80 and its continuation, even into emulators being written for Windows. The number of clones also testifies to its success, making it the foreshadow of what was to come in the IBM market. We need national sales figures for Apple and TRS-80 cited, and figures for total sales of software products for both machines. Working on a separate section for TRS-80. Alatari 13:58, 20 June 2007 (UTC)

Cloning
Clipped the details of the cloning legal challenge and reverse engineering... interesting stuff, but in essence a digression:

"Because they expected a well-funded legal counter-attack, Compaq engineers developed a special reverse-engineering method called "clean-room" development, in which all reverse-engineered code was provably written from an English specification, and therefore could not possibly be an illegal copy of the copyrighted IBM code.

"Legal battles established the legitimacy of the machines, and the lower prices made them popular. Some introduced new features that the popular brands didn't have — the Franklin, for example, had lowercase display that the Apple II lacked, and Compaq's first machines were portable (or "luggable" in the terminology later developed to distinguish their quite heavy suitcase-sized machines from laptops)."

— Preceding unsigned comment added by Nkedel (talk • contribs) 20:41, 7 June 2005 (UTC)

Timeline
This is something you may find useful for summarising the article (see source to see how it works):

Regards, Samsara contrib talk 15:28, 22 February 2006 (UTC)

An invitation/plea
To contributing authors of this article, I'm currently working on rewriting personal computer to make it something respectable. My hope is that it can be a featured article in the near future. This article has a pretty good history of the home computer and personal computer, and any editors who worked on it that would like to help on personal computer would be much appreciated. Thanks in advance! -- uberpenguin 19:21, 14 March 2006 (UTC)

Working on that article Alatari 11:22, 22 June 2007 (UTC)

VDUs
Which was the first computer to utilise a cathode-ray tube to display information? (I discount the Williams tube used by the Manchester 'Baby' (SSEM) because in that machine the CRT was actually the random-access memory device.) When I first started using computers in the late 60s, results were always output either on punched tape, punched card or paper printout. The idea of hooking up a TV screen, now so obvious, was very rare. --82.138.200.126 16:13, 21 March 2006 (UTC)

In the early sixties we in the Marconi Radar Division designed a generator system for use in air defence systems that displayed alphanumeric characters at a writing rate of 50,000 per second, on both PPIs as target tags and on a separate tabular display as rows and columns of data. There was also a scan conversion system for output to TV projection units.

Marconibod 16:44, 1 February 2007 (UTC)

Grammar?
The line - "Throughout the mid 1970s to late 1980s, hundreds of computer hardware companies were founded, most out of business." Wouldn't that line make more sense as "Throughout the mid-1970s to late 1980s, hundreds of computer hardware companies were founded, of which most went out of business." or something to that effect?

not complete
The article is not complete. Some computers are still missing, e.g. the FM Towns.
 * Then be bold and add them yourself. :-) — Wackymacs 08:27, 5 September 2006 (UTC)

Triumph of the Nerds
What does Triumph of the Nerds have to do with this article? Who knows, maybe it does have something to do with this article, but I'm just not quite sure. -- This unsigned comment was added on 03:25, 2006 September 5 (UTC) by Aceofspades1217 (Talk)
 * Triumph of the Nerds is a 1996 documentary about the history of the personal computer revolution; it covers quite a bit of old hardware including the Altair, IBM PC, Xerox Alto and Macintosh. — Wackymacs 08:27, 5 September 2006 (UTC)


 * Aceofspades1217 has obviously never seen the documentary. C'mon man it's a classic 2600:1700:5F81:22C0:4931:DA09:967B:B762 (talk) 01:59, 23 March 2022 (UTC)

More than bitty boxes
We jump right into the Apple II and IBM PC? C'mon, let's make this a little deeper - there's a lot of interesting stuff happening in the gradual transition from vacuum tube computers and discrete transistors to the slow progression through SSI and LSI integrated circuits, and *then* Intel takes over the Earth. That's the trouble with an encyclopedia written by teenagers... no perspective. --Wtshymanski 22:51, 29 September 2007 (UTC)
 * You referring to the intro paragraph or the entire article? Don't assume everyone watching this page is under 40.  Alatari 15:31, 30 September 2007 (UTC)
 * The whole article needs a tune-up. We have perfectly good articles on home computers and personal computers, we don't need to re-cap them here. We need more about the revolution that solid-state, ICs, and LSI made in big iron. Is there *anyone* over 40 watching the article, or at least someone under 40 who can crack open a book or two? --Wtshymanski 05:38, 28 October 2007 (UTC)

I agree with the assessment. However:

--Aleksandar Šušnjar 18:58, 28 October 2007 (UTC)
 * I am one of those "just under 40" who has a lot of information but just a tad bit short on that part of the history - while I know what happened, I can't quote or provide verifiable references.
 * In fact, I happen to know a lot about "before" and "after" the "middle ages", but not much about the transition from tubes to transistors.
 * In (slight!) defense of the article, this article is "after" the 1960s. There is another article for "before the 1960s" that, interestingly enough, includes the 60s as well... So some restructuring may be required as well.
 * If the pre-1960 article has 1960s info then it does need cleaning. Alatari (talk) 03:20, 14 January 2008 (UTC)

Article clean-up
I'm organizing by date wherever possible, especially the pictures, and adding pictures to make the article interesting. Alatari (talk) 03:08, 14 January 2008 (UTC)

The Commodore section was a tough choice: put in their first machine or the best seller of all time. I went with the most notable. Alatari (talk) 03:20, 14 January 2008 (UTC)

If we put in the truly most notable or popular pieces of computer hardware from 1960 till 2000 this article will get very large. We probably should split it into decades. Definitely we should split the 1990s and later off into a new article. Alatari (talk) 04:47, 14 January 2008 (UTC)

The Amiga, with its design of a dedicated coprocessor each for audio and video, is another milestone that needs adding. Alatari (talk) 06:18, 14 January 2008 (UTC)
 * Histories of individual brands of home computers belong in their respective articles - this article should be more of an overview and should NOT get bogged down describing how the Binford 64 shown at the June CES was replaced by the Binford 128 at the fall Comdex, until the company went bankrupt the following spring. Overview! Important historical developments! Encyclopedia, remember, not trivia manual. This article should acknowledge personal computers as becoming important especially after the mid-1990s, but there's a whole lifetime in the period where most people never saw anything closer to a computer than a punch card for their electric bills. More about big iron, less on gameboxes. --Wtshymanski (talk) 15:16, 14 January 2008 (UTC)

Picture call
Need a 1977 original Apple ][ pic, which had a tape deck for I/O. The current pic is from 1978. Alatari (talk) 03:08, 14 January 2008 (UTC)

This article
Is now nearly empty. No mention of landmark machines during the 1970s and beyond, no mention of the usage of cartridges, CD-ROM, DVD, SD cards, etc. There are so many milestones in computing history through the last 2 decades that this article should be 100kb in size. The personal computer article is not the History of personal computers article and doesn't have the space for all those landmark machines. I strongly disagree with User:Wtshymanski's edits and will have to build some consensus on the WikiProjects discussion page on how to handle this. Alatari (talk) 13:22, 18 January 2008 (UTC)
 * The article is hardly empty. As I said above, a trivial roll call of microcomputers is not really in keeping with what an article with the grand title of "History of computing hardware" should have in its contents. The IEEE Encyclopedia of Computer Science devotes 6 pages to the period 1965-1993, and less than 1/4 of that - one page - is a recounting of personal computers. We shouldn't be focussing on brand names in this article, but on the fundamental technologies that have been introduced. Introduction of the microprocessor and its impact on making home computers possible is a milestone and deserves discussion; a list of all the 6502 and Z80 and 6809 machines is trivial (the evolution of word processing from Electric Pencil on a TRS-80 to Word 2008 running on a Windows box is of some interest but not a milestone for inclusion in *this* article). How you get the program into the box (Blu-ray, DVD-ROM or cartridge or punched paper tape) is a trivial variation on the theme of the user interacting with the computer.
 * There were no landmark machines or at least not as many as were in this article - once you've noticed that you can build a $1000 box that conceivably could be sold to a consumer under the pretense it would be useful, the rest of the personal computer development is just different flavors of Coke - marketing, not engineering. The landmark idea was getting the box into the home in the first place.
 * The article should be about hardware and how the massive decreases in cost have affected the application of computers. This article doesn't even mention Moore's Law! Introduction of a GUI over the character-mode interface is a significant topic, but what flavor of GUI runs on your brand of hardware is not significant for this article. Massively parallel multiple-instruction multiple-data supercomputers are fundamental; a new gadget for a Windows box is not a milestone. Speech recognition may be a milestone though it's still fairly limited. Wide-area networking might be a topic worth a paragraph. Has the desktop personal computer actually improved white-collar productivity? I don't know, anyone have a good scholarly article they'd care to paraphrase for the Wikipedia? The *massive* effect of embedded systems on everyday life is a milestone - you can scarcely flush a toilet today without involving computer power that would have stunned Alan Turing. Cellular telephony was conceived in the '70s but completely impractical until you could run a whole computer on a few hundred microamps at 3 volts - and cell phones have a social impact far greater than ROM cartridge video games. SETI@home. Cryptography as a feature of everyday life. ATMs! Desktop publishing. The Internet itself, for goodness' sake! Web 2.0 and this very encyclopedia of which we are all part! C'mon, the *real* milestones are much bigger than the introduction of the Binford 64! --Wtshymanski (talk) 16:11, 18 January 2008 (UTC)
 * We're not in complete disagreement. That's why I proposed the History of personal computers article because placing all the machines into the Personal computer article (like you have done) is extremely objectionable.  What constitutes a landmark is of some debate it seems and I'm not comfortable with us being the only two taking part in a major rewrite of this article.  Breaking the article into the time period 1965 to 1993 along the IEEE lines is a possibility.  —Preceding unsigned comment added by Alatari (talk • contribs) 16:32, 18 January 2008 (UTC)
 * I don't understand why putting the history of various personal computers into the personal computer history section is objectionable, as opposed to sticking them in *this* article. Personal computers are a branch off the history of computing hardware - yes, an important segment, but not the whole of the computing hardware field. The 1993 end date of the IEEE encyclopedia is meaningless since it happens to be the date the book was printed - I wouldn't suggest that as a demarcation point at all. 1960 has some significance because at that point you could buy transistorized computers from more than one vendor; 1965 has less significance (early ICs) and I'm not sure what the next significant year would be. The personal computers article doesn't need to have the history of every bitty box brand ever sold, since most of them were unimportant - a few representatives would be sufficient. --Wtshymanski (talk) 17:46, 19 January 2008 (UTC)

I think the idea of establishing a History of personal computers article is the best approach. This article is better with all that pc history detail removed; it needs some more summary material about PCs, but not all that had been here. That detail is also relevant to PC, but really shouldn't overstuff that article either, whose focus should be a light touch over all aspects of PCs, not concentrating on their history. Hence, History of personal computers is very appropriate now that WP has a considerable amount of info on that area. -R. S. Shaw (talk) 21:53, 19 January 2008 (UTC)
 * Go for it. I prefer history in-line with an article and the trivia booted out to a "List of XXXX" format, but a stand alone history article could work. --Wtshymanski (talk) 16:36, 20 January 2008 (UTC)

Fair use rationale for Image:Data General Super Nova.jpg
Image:Data General Super Nova.jpg is being used on this article. I notice the image page specifies that the image is being used under fair use but there is no explanation or rationale as to why its use in this Wikipedia article constitutes fair use. In addition to the boilerplate fair use template, you must also write out on the image description page a specific explanation or rationale for why using this image in each article is consistent with fair use.

Please go to the image description page and edit it to include a fair use rationale. Using one of the templates at Fair use rationale guideline is an easy way to ensure that your image is in compliance with Wikipedia policy, but remember that you must complete the template. Do not simply insert a blank template on an image page.

If there is other fair use media, consider checking that you have specified the fair use rationale on the other images used on this page. Note that any fair use images lacking such an explanation can be deleted one week after being tagged, as described on criteria for speedy deletion. If you have any questions please ask them at the Media copyright questions page. Thank you.

BetacommandBot (talk) 21:20, 13 February 2008 (UTC)

Mobile phones
This article has no section on cellphones. This is a huge omission!--greenrd (talk) 21:26, 26 May 2012 (UTC)


 * You are so right, how can there be an omission this big? Who is running this page? 2600:1700:5F81:22C0:4931:DA09:967B:B762 (talk) 01:57, 23 March 2022 (UTC)

Rename
I propose renaming this article from History of computing hardware (1960s–present) → History of computing hardware since 1960 as more concise czar ♔  16:42, 24 May 2014 (UTC)

what about the amiga computer? — Preceding unsigned comment added by 200.92.223.52 (talk) 23:22, 17 March 2015 (UTC)

Bias on the Computer Systems and Important Hardware Timeline section
Hi all, in my opinion the section "Computer Systems and Important Hardware Timeline" is quite biased towards Apple Inc. It is not true that the main advances in computer systems from 1998 on are only Apple Inc. products. This section is not neutral. Samsung, LG and Motorola have great products that could be considered as the annual hardware reference as well. --Xbosch (talk) 16:10, 14 May 2015 (UTC)


 * Fair point. Feel free to add other processors/hardware to the tables.  nagual  design   19:23, 14 May 2015 (UTC)


 * Apparently the only advances in computing between 2000 and now are from Apple - how anyone could write that and consider it in the spirit of neutrality is baffling. iPhone releases past the first do not need to be included. New laptop models from Apple do not need to be included. The first release of Android should be. The release of the Raspberry Pi should be. It's not a section for every 'computing' release, it's for significant events. 92.0.104.159 (talk) 01:16, 1 June 2015 (UTC)
 * It's probably best to try and add other hardware to the list for now, and leave the trimming for later.  nagual  design   02:25, 1 June 2015 (UTC)

External links modified
Hello fellow Wikipedians,

I have just modified 5 external links on History of computing hardware (1960s–present). Please take a moment to review my edit. If you have any questions, or need the bot to ignore the links, or the page altogether, please visit this simple FaQ for additional information. I made the following changes:
 * Added archive https://web.archive.org/web/20020209154232/http://members.fortunecity.com/pcmuseum/gernelle.htm to http://members.fortunecity.com/pcmuseum/gernelle.htm
 * Added archive https://web.archive.org/web/20160303210719/http://febcm.club.fr/english/chronoa10.htm to http://febcm.club.fr/english/chronoa10.htm
 * Added archive https://web.archive.org/web/20010605232405/http://www.ox.compsoc.net/~swhite/history.html to http://ox.compsoc.net/~swhite/history.html
 * Added archive https://web.archive.org/web/20090705015058/http://dir.yahoo.com/Computers_and_Internet/History/ to http://dir.yahoo.com/Computers_and_Internet/History/
 * Added archive https://web.archive.org/web/20041115011247/http://davidguy.brinkster.net/computer/ to http://davidguy.brinkster.net/computer/

When you have finished reviewing my changes, you may follow the instructions on the template below to fix any issues with the URLs.

Cheers.— InternetArchiveBot  (Report bug) 02:16, 5 November 2017 (UTC)

Title is wrong/misleading
We shouldn't have to read into the article to find out it's about 1966 onwards. I propose that the title be changed to "...(mid-1960s - present)" (although I'd also argue that "present" is never a good term to use in an encyclopedia, but don't want to detract from my main point). 72.16.99.93 (talk) 20:36, 26 December 2018 (UTC)
 * Actually the earliest date is 1963 so I personally don't have a problem with the title. There probably should be links to articles on 1st and 2nd generation computers. Tom94022 (talk) 06:54, 27 December 2018 (UTC)

How long were 3rd Gen Computers Built?
I think the article can note that 3rd generation computers were offered well into the 1990s. For example, the bipolar IBM ES9000 9X2 announced April 1994 used 5,960 chips to make a 10-way processor - clearly not a "microprocessor" even though microprogrammed. Even the 1997 CMOS 9672-RX5 used 31 chips to make a 10-way processor. So when did the 3rd generation end, if ever? Tom94022 (talk) 20:41, 1 March 2019 (UTC)


 * Unless you consider a mainframe using multi-core single-chip microprocessors as CPUs to be "3rd generation", it appears to have ended, at least in IBM mainframe land. If you consider the existence of support chips, such as the SMP Hub Chip for the z10, to be sufficient to render a machine "3rd generation", then the laptop on which I'm typing this is 3rd generation, even though the CPU is a 4-core single-chip microprocessor.  Maybe my smartphone finally achieves 4th generation status by that criterion, as its CPU is part of a system-on-chip. Guy Harris (talk) 22:01, 1 March 2019 (UTC)
 * I suspect a multi-core single-chip microprocessor qualifies as at least 4th generation, maybe 5th, but what generation is the IBM ES9000 9X2, announced April 1994, which used 5,960 chips to make a 10-way processor? Tom94022 (talk) 23:49, 1 March 2019 (UTC)


 * Generally 3rd gen ends when 4th gen starts :) (1971-1980 range, WP:OR).
 * It may be necessary to define in what technology sense the 3rd gen "ended" (if it ended), and then what "end" means (last computer, market share below 50%, general knowledge, ...). --MarMi wiki (talk) 23:16, 1 March 2019 (UTC)
 * Any gen ends when they stop producing computers whose logic is primarily of the type required. The 1st did not end when the first transistor computer shipped. Thus the 3rd gen ends when they stop producing computers whose logic is primarily LSI and not microprocessors. BTW the idea is not to declare an end (which might be WP:OR) but instead to note the fact that "3rd generation computers were offered well into the 1990s," citing perhaps IBM mainframes as above. Tom94022 (talk) 23:49, 1 March 2019 (UTC)


 * OK, so clearly "if ever" was just a rhetorical gesture, as nobody's building non-microprocessor-based CPUs any more (evidence required for any claims to the contrary).


 * Machines with multi-chip CPUs, such as the VAX 9000, were being built in the early 1990s. Somewhere between then and 2008, when the IBM z10 came out, the 3rd generation ended. The IBM mainframes may have been the last major line of computers to go all-microprocessor. Guy Harris (talk) 00:57, 2 March 2019 (UTC)


 * Sorry for the rhetorical gesture; I would add Cray ECL computers to the list of 3rd gen computers built into the 1990s. And I agree that it needs an existence proof to claim any such computers were being built this century. Tom94022 (talk) 02:07, 2 March 2019 (UTC)


 * I have, in front of me, "A high-frequency custom CMOS S/390 microprocessor", from the IBM Journal of Research and Development, Volume 41, Issue 4/5, July/September 1997. It says


 * "The S/390® Parallel Enterprise Server Generation 4 processor is an implementation of the IBM ESA/390™ architecture on a single custom CMOS chip."


 * so it appears that the 3rd generation ended at IBM in 1996-1997, or perhaps after that if they continued to make and ship the older machines.


 * It appears that Unisys also were making single-chip A-series and 2200 machines.


 * DEC Alpha came out in 1992, so the writing was on the wall for VAXes at that point; I don't know when the last non-MicroVAX-based VAX came out, but I doubt it survived the 1990s.


 * The Cray J90, first out in 1994, had two-chip CMOS processors - one scalar chip, one vector chip; that sounds pretty 4th-generation to me, just as an 80386+80387 does.


 * So I wouldn't be surprised to hear that the 3rd generation was completely dead by 2000, if not sooner. It may have died when the last ECL machine of any sort shipped (although the MIPS R6000 was an ECL microprocessor). Guy Harris (talk) 02:29, 2 March 2019 (UTC)


 * FWIW the IBM ES9000 9X2 announced April 1994 used 5,960 ECL chips to make a 10-way processor, and its lease, rental, and maintenance agreements were withdrawn effective June 30, 2003.
 * At this point I think it is appropriate to add a sentence to the section that 3rd gen computers were built well into the 1990s, or words to that effect. Tom94022 (talk) 06:33, 2 March 2019 (UTC)

(Yes, we know, it had 5,960 ECL chips....)

The existence of maintenance services in 2003 doesn't necessarily mean they were still manufacturing them. Lease and rental services might mean that, but were they still manufacturing and leasing/renting new machines at that point, or just leasing/renting "used" models returned by people who were previously leasing or renting them?

They were obviously making that particular model in the mid-1990s, given that it came out in the mid-1990s; the question is how long did they continue to make it - or how long did they continue to make the last bipolar S/390, which was probably not microprocessor-based, whatever that model might be.

The G3 model that came out in 1996, however, had a single-chip processing unit (PU), according to "S/390 Parallel Enterprise Server Generation 3: A balanced system and cache structure", in the same IBM Journal of Research and Development issue as the JRD papers you and I cited, so they were going with microprocessors at least that far back. I couldn't find anything obvious indicating whether the 1994 G1 or the 1995 G2 had a single-chip processing unit or not.

"Into the 1990s" probably suffices. Guy Harris (talk) 08:38, 2 March 2019 (UTC)


 * FWIW Exploring IBM's New Age Mainframes - Tech Insider suggests the G1 and G2 also had single-chip PUs. I suppose if the article used the word "offered" instead of "built" it could say offered (or available) into this century :-). Regardless, we agree that the April 1994 announcement of the IBM ES9000 9X2 amounts to "mid-1990s", so I would prefer either "well into the 1990s" or "as late as the mid-1990s". We don't know its production life, but it's highly unlikely it was announced and stopped in the same year. I'm going to edit the article at this point, mainly based upon IBM, but will probably include pointers to Cray and DEC. My thanks to Guy Harris for all his research. Tom94022 (talk) 18:46, 2 March 2019 (UTC)


 * IMHO it's not an important variable. Each time IBM introduced a new processor complex it revisited technological tradeoffs. Using the definition in the wiki, IBM switched back and forth between "3rd generation" and "4th generation". Since they marketed the systems as black boxes, there was no marketing advantage in any particular division of labor, so their designs were driven strictly by engineering considerations. IAC, the FRU has been much larger than a single chip for decades. Shmuel (Seymour J.) Metz Username:Chatul (talk) 17:18, 17 March 2019 (UTC)


 * The FRU on my laptop isn't a single chip, either; the chips are mostly, if not completely, pretty solidly attached to the motherboard.


 * IBM had multiple independent implementations of various flavors of the S/3x0 ISA; they may have, at one point, offered, at the same time, lower-end machines with single-chip processors and higher-end machines with multi-chip processors, just as DEC did with the PDP-11 and VAX. It probably took the final victory of CMOS over bipolar to finish off the third generation (although MIPS had the R6000, which was a 3-chip ECL microprocessor - one chip was the non-floating-point part of the CPU, one was the FPU, and one was the bus interface, so that was a fourth-generation ECL processor). Guy Harris (talk) 19:36, 17 March 2019 (UTC)

Missing all of the players in the 1960s
The 1960s were dominated by IBM and the Seven Dwarfs, yet I can find no mention, outside the timeline, of major product lines from Burroughs, Control Data Corporation (CDC), General Electric, Honeywell, IBM, NCR, RCA or UNIVAC, to say nothing of, e.g., the Digital Equipment Corporation PDP-6 and PDP-10, Philco TRANSAC S-2000, or Scientific Data Systems (SDS) 9 and Sigma series.

The 1960s was an era of radical innovation, in which, e.g., I/O channels and interrupts changed from rarities to standard features, and in which concepts such as digital communications, display terminals, interactive computing, operating systems, paging, segmentation, time sharing and virtual machines either originated or came into their own. Shmuel (Seymour J.) Metz Username:Chatul (talk) 05:05, 31 May 2020 (UTC)


 * It's not entirely surprising that a single article covering a 60-year span in a field that has advanced as quickly as computing might give short shrift to some topics. Perhaps this article should be split into multiple articles. Guy Harris (talk) 00:01, 1 June 2020 (UTC)


 * Given the focus on generations and the components making up the hardware, it is not surprising that most of the companies listed in this talk didn't make the article, as for the most part they were followers. They all do have separate articles, as do most of the radical innovations also listed, BTW, including Channel I/O. It does seem like the business aspects of the computing industry are not covered in any one Wikipedia article, but that would be a huge article. Tom94022 (talk) 06:43, 1 June 2020 (UTC)
 * After further thought, it's probably a good idea both to rationalize the two articles, this one and its precursor History of computing hardware, and to include for each generation brief statements about the business aspects and key technological innovations beyond components. Rationalization should include moving much of the 3rd and 4th generation material from the precursor article into this one. The lists above are interesting, but it might be difficult to find reliable sources to support inclusion of any one company or technology in any one generation (IBM and the BUNCH being a notable exception as to available RSs). Tom94022 (talk) 17:23, 1 June 2020 (UTC)


 * There's a lot of information in conference proceedings, house organs, e.g., Bell System Technical Journal, IBM Systems Journal, IBM Journal of Research and Development, and professional society publications, e.g., IEEE Annals of the History of Computing; unfortunately not all are publicly available. Shmuel (Seymour J.) Metz Username:Chatul (talk) 17:57, 1 June 2020 (UTC)

Need dates for timeline
Does anybody have dates for the following? Shmuel (Seymour J.) Metz Username:Chatul (talk) 05:33, 11 August 2020 (UTC)
 * GE: 605, 625, 635
 * Honeywell: DDP-516
 * RCA: 301, 501, 601
 * Sylvania: S 9400

Misleading sentence: "By 1959 (...)"
@User:Tom94022: So, you made this revert. The original version was:

But "by 1959" means "not later than 1959", see here, and I don't think it was the intended meaning. So I proposed the following version:

You rejected my version, and wrote a very unclear explanation. What do you mean by "the overlap btw 2nd and 3rd gen"? Could you, please, be more specific? Is the Oxford Dictionary wrong? 85.193.228.103 (talk) 15:36, 1 March 2021 (UTC)
 * Sorry if my summary was unclear. I was referring to the overlapping years when new models of both vacuum tube (2nd gen) and transistor (3rd gen) computers were being introduced:
 * List of vacuum tube computers for 2nd generation computers showing 6 models in 1958, 6 models in 1959 and 6 models in 1960.
 * List of transistorized computers for 3rd generation computers showing early production 1953-56 (6 units total) and then 6, 9 and 11 (1959) models in the years thereafter
 * There is no evidence of anything happening in 1959 that supports an "in 1959" characterization; the fact that 1959 was the first year there were more new models of 3rd gen than 2nd gen supports the original "by" (i.e., not later than) language, since the relevant shipments and announcements likely began prior to 1959. Tom94022 (talk) 18:14, 1 March 2021 (UTC)
 * @User:Tom94022:: Wow, fantastic explanation! Finally I understood. Thank you :)
 * But (in my opinion) the original sentence is misleading because it can be understood to mean either that transistors were considered reliable before 1959 or that they became so in the course of the year 1959. So it says nothing about what happened after 1959. Were transistors still considered reliable a year later? Just because we know what really happened does not mean that the sentence is correct. The intended meaning was completely different: transistors started being considered reliable in the course of the years 1953 - 1959. So, how about:
 * "" 85.193.228.103 (talk) 18:56, 1 March 2021 (UTC)
 * "I was referring to the overlapping years when new models of both new vacuum tube (2nd gen) and new transistor (3rd gen) computers were being introduced" Presumably meaning "...when new models of both vacuum tube (1st gen) and new transistor (2nd gen) computers were being introduced". Guy Harris (talk) 21:56, 1 March 2021 (UTC)
 * AFAIK the 1st generation were electro-mechanical. It sounds like Guy and I agree that "by", meaning not later than 1959, accurately describes "being considered reliable in the course of the years 1953 - 1959." Tom94022 (talk) 22:14, 1 March 2021 (UTC)
 * "AFAIK the 1st generation were electro-mechanical." That's... not what the article says. It speaks of the 2nd generation as being computers built from discrete transistors.
 * And, yes, I'm not sure what's wrong with "not later than 1959". Guy Harris (talk) 22:36, 1 March 2021 (UTC)
 * All of the computer literature, at least in the US, lumped relay and vacuum tube computers together as first generation and referred to transistorized computers as second generation; I know of no published source that referred to vacuum tube computers as second generation. Shmuel (Seymour J.) Metz Username:Chatul (talk) 01:18, 2 March 2021 (UTC)
 * Sorry for misuse of the term generation, but that doesn't change the validity of the use of "by" in this context. Tom94022 (talk) 07:32, 2 March 2021 (UTC)

Decimal or binary?
The article says "The second-generation computers were mostly character-based decimal computers, ". Weren't just about all second-generation computers binary (or octal or hexadecimal)? Bubba73 You talkin' to me? 05:38, 15 December 2021 (UTC)
 * No, they were all the different types stated, but you do have a good point; perhaps the word "mostly" should be dropped? Tom94022 (talk) 06:59, 15 December 2021 (UTC)


 * From decimal computer, it seems like the IBM 7000 series were about the last of the decimal computers. Bubba73 You talkin' to me? 07:14, 15 December 2021 (UTC)


 * Maybe, but there were a lot of second-generation computers, so absent an RS we have to be careful. I wordsmithed the sentence to remove the ambiguous "mostly."  Tom94022 (talk) 17:42, 15 December 2021 (UTC)


 * IBM 1401 and 1620 were also decimal, and Univac Solid State, LARC, and Burroughs B2500. Bubba73 You talkin' to me? 00:25, 16 December 2021 (UTC)


 * The 7070 was the last of the pure decimal machines, but there were also hybrid machines of various types. Typically an address contained 6-bit characters, with zone bits extending a decimal address. Decimal computers after the 7070/7072/7074 include
 * Burroughs B2x00/B3x00/B4x00
 * IBM 1400 series
 * IBM 1620
 * IBM 1710
 * IBM 7010
 * UNIVAC II
 * UNIVAC LARC
 * The UNIVAC Solid State 80/90 may have been earlier than the 7070; they were both 1958, and I don't know the exact dates. --Shmuel (Seymour J.) Metz Username:Chatul (talk) 15:11, 16 December 2021 (UTC)


 * Decimal machines were most popular on the small end and binary machines were most popular on the large end, but there was a lot of overlap. --Shmuel (Seymour J.) Metz Username:Chatul (talk) 15:11, 16 December 2021 (UTC)

History outside of US and USSR
The article is very much US-centric. I believe that it should mention companies and computers in, e.g., England, France, Israel, Japan. Although some European companies sold re-branded computers from US companies, Europe had a robust computer industry of its own, as did Japan.

As a secondary issue, the article is missing the history of university computers, e.g., Golem at the Weizmann Institute, ILLIAC at multiple universities. -- Shmuel (Seymour J.) Metz Username:Chatul (talk) 14:51, 22 January 2024 (UTC)