Talk:History of Computers

The role of Ada Lovelace (see [Ada_Lovelace]) has been disputed. Lord Byron left Ada at an early age and did not work with Babbage. What is known is that Ada wrote an article about the Analytical Engine.

Any other influential languages?
A few fairly influential languages seem to have been forgotten like Lisp and C, for example. Should these be included? Tinned Tuna (talk) 16:18, 9 January 2009 (UTC)

Small restructuring...
I've done a small restructuring to enable the creation of small chapters and comply with the request to fragment the text. A linear approach was not followed in the development of the content, so rearrangements are not extremely useful, but I've taken a bottom-up approach and reworked the section titles a bit to provide more context and structural consistency. If no one opposes the changes and no better section names appear in 7 days, I will try to split the content based on the created structure. --Panic (talk) 08:56, 8 April 2010 (UTC)
 * Finished the proposed/requested implementation/separation in sections. --Panic (talk) 00:51, 17 April 2010 (UTC)

Time to get this moving again.
It looks like this Wikibook was largely if not completely abandoned, which is a serious shame because there's so much to say. I have an academic interest in historical computing, and unless objections show up, I intend to start working on it. Hexadecima (discuss • contribs) 19:03, 5 August 2013 (UTC)

Time to get this moving again, once again.
This looks to have been started and then left mostly alone, apart from tweaks, for the last decade. Looking at the book content, and based on the last 60 years of experience I have had in computing, software, and related engineering, this omits a LOT of history and computing legacy. It reads more like an enumeration than a history. As written, the history is essentially:
 * - Early Origins: Pascal, Jacquard, Babbage and Lovelace created the early concepts. [covering period of 1700s-1800s]
 * - Early Innovations: Components, tubes, transistors, IC microchips [covering a period of roughly the 1920s to 1960s]. This seems like a big gap: what of the Hollerith machine, the comptometer, even the typewriter as a keyboard influence? What about the CPU?
 * - Rise and Fall of the Mainframe - Not written, but a poor title, IMHO. The world of computing owes a great debt to the rise and predominance of mainframe computing. Certainly it was replaced by other computing paradigms as time went on, but it is neither dead nor completely obsolete. The title suggests failure, but it was more a matter of economics, better throughput, and better marketing in some cases. Perhaps a better title would be "The Age of the Mainframe". In this, the period of ENIAC, IBM, UNIVAC, and others would be highlighted.
 * - A major section is missing, that is the "Evolution of the Minicomputer". From the 60s to the 90s came the growth and influence of the minicomputer: DEC, Data General, Wang Laboratories, Apollo Computer, and Prime Computer, among many others. It was this technology, and the shrinking of the mini, that provided the impetus to create the desktop. I saw this evolution primarily through Digital Equipment Corp (DEC). The journey started after, and somewhat in parallel with, mainframes, but converged with the evolution of desktop computing into the server/storage/network model used by government, enterprise, and internet users. Tandem, DG Eclipse, Xerox, Honeywell, Burroughs, ...
 * - Rise of the Microcomputer looks like a good core. The parallel development of the microcomputer and the desktop seemed to center on the chipset used; that would be a good way to organize it. The sections seem to be roughly in alphabetical order, but ordering by evolution timeline or by CPU might be better. The impact of the hobbyist and the open-design concept seems to have evolved here. Designs from BYTE magazine and hobby groups influenced Apple hugely. Timex had a Zilog-based computer; the Zilog chip was far superior to the 8086 (original x86) chip, but the latter was far cheaper and was chosen for the IBM. Lots of untold stories and history here.
 * - Operating Systems - Gee, it looks like IBM has a monopoly on OSes. Although certainly notable and influential, there is so much more: all the minicomputer and evolutionary/revolutionary OSes... DEC VAX/VMS, Unix, Linux, ... The evolution and effect of the internet is grossly missing.
 * - Application History - looks like an early sketch. Lots of others: process systems, CAD, CAM, warehouse management, manufacturing systems; enumerations by business use, personal use, government, IT management tools, AI, ...
 * - Missing is a section on Later and Future Evolutions of Computers - utility computing, cloud computing, embedded systems, mobile, edge computing, robotics, AI, ...
 * Sorry if this looks like a rant, but this could be interesting and useful. Is anyone spearheading this?