The demise of the generation idea in the Computer Industry

Computer Genesis: Perspective 1975 – 2016

How technology evolution changed the way we understood the generation idea

In the opening of the Genese do Computador (Computer Genesis) I stated my intention to give a perspective for further investigation and ended up with the generation idea to figure out the development occurred. I also ended up showing up the leading technologies then used by IBM and unfortunately it didn’t become clear but the size to get an x number of bytes per square inch was the key issue, because, taking the same amount of computing power, the cubic (and the square) feet to have such a machine was and is a direct function of that. Not to mention the cost, specially the initial investment, installation and maintenance. This is extremely well discussed in detail for the IBM 1401, which is the first “mass-produced” computer and is the logical basis ta take into account when deciding to put to the market a machine, any size.

And it is one piece of a correct frame of mind (among others, listed below) for understanding what happened and is still happening. Although in the pointer file which brought you here I still frame it under the generation concepts, I want to branch out to a more proper way of seeing what happened, to justify my point that there are now better ways to understand what happened and what is going on.

Not necessarily in order of importance, the following cross-sectioning concepts are better for understanding the evolution of computers and the like from 1975 to this day (2016):

1 – Figure out what is involved in the creation of a modern computer, and the example given here is the IBM 1401

2 – The cost factor: the money involved, the accessibility of the equipment given economic constraints, and its influence also on manufacturers and manufacturing.

3 – The technology involved: the escalation in power and the shrinking in size, their effects on the products that can be created, and their influence on mankind and its endeavours.

4 – The new products and the ergonomic factor

5 – The shape of things to come: Android, Google, Apple, etc.

How does that fit into a concept to replace the generation idea?

1st. When the product is well established and recognized, it should be classified by models, and a good example is the Apple iPhone, or the smartphone in general.

2nd. The category of what is under discussion, especially its design purpose and its development.

Products of special interest to us, because of their connection with computers, and which should be discussed in their own context, are:

Tablets, iPads, iPhones, smartphones, the Internet

The picture (see Gallery: Before Internet & Smartphone)

From the inception until 1975, or even a few years before, the functions today associated with the list above either did not exist or were performed in a limited way with the help of the following:

  • The telephone
  • The post office
  • Radio
  • Home appliances of the time
  • Cinema
  • TV
  • Sound reproduction equipment
  • Stationery and office supplies
  • Books and magazines (encyclopedias and dictionaries)
  • Trips to the library to research essays
  • Typewriters
  • Maps
  • Photo cameras (including celluloid film, photo development, etc.)
  • Paper money and credit cards
  • Recorders (sound and image)
  • Fax machines
  • Travel agencies
  • Watches to tell the time
  • Trips to the bank to do banking
  • Phone books and yellow pages
  • Pagers

At the beginning – The fifties

It is not the purpose of this essay to cover everything, but simply to give an idea through our eyes of today (2016). I recommend the first Back to the Future movie, because the script emphasizes how the fifties looked in the eyes of a teenager from the mid eighties. Although smartphones and the Internet were not yet realities, the picture is well covered there.

Although powerful computers were designed mostly for aircraft design, nuclear development and munitions manufacture, in 1957 an event occurred that would change all that: Sputnik.

The sixties

It saw the introduction of the System 360, which really became mass-produced, and computers became commonplace. The pioneer days of the 700 & 7000 family of computers, which started with tubes and ended with transistors, were over. The IBM 7030 Data Processing System — or “Stretch” computer — was delivered in April 1961, offering a performance that was 200 times faster than the IBM 701, 40 times faster than the IBM 709 and seven times faster than the IBM 7090. Although the 7030 was the industry’s fastest computer in 1961, its performance was far less than originally predicted. IBM cut its price from $13.5 million to $7.8 million and offered the 7030 to only eight customers.

The idea of generations made a lot of sense, not only because the replacement of tubes by transistors, and of transistors by solid-state circuits, meant a lot less space and energy, but also because computing power went strongly up.

Seventies and eighties

As pointed out elsewhere, slightly above the entry level, introduced in September 1970, the Model 145 was the first IBM computer to have a main memory made entirely of monolithic circuits on silicon chips (previous 370 models used magnetic core main memories). It operated at speeds up to five times the Model 40’s and up to 11 times the Model 30’s. It would later be replaced by the Model 148, which incorporated one novelty that is very significant to the advent of the Personal Computer: it was operated from a “dumb” workstation. Stated more simply, this was a display screen with a keyboard attached; see the IBM archives. The 3270 terminal had only rudimentary communications capabilities and was text-based. One of the earliest 3270 terminal displays (the 3278 model 1) consisted of 12 rows and 80 columns of text characters, no more and no less. Eventually, a 24 x 80 screen size became the standard, with some alternate sizes available. This is exactly the point in history where IBM lost the chance to occupy the place that Microsoft, with its Windows, eventually did. To understand what is involved, you have to take a look at the control panels of the 145 and 148 models of the IBM family (and the earlier 158).

IBM 145 console panel


IBM 148 Console Panel


Before the introduction of the 327x terminal, you had to follow the lights to read whatever you wanted in the computer memory (remember, everything in a computer is a memory position allocation) and turn knobs, all set up in hexadecimal, meaning whatever. The conversion to plain English, or to logic, or both, you made in your mind. The 327x terminal eliminated that and put it all in plain English. Since the terminal was “dumb”, there had to be some “intelligence” somewhere to do it. It didn’t happen instantly, but to shorten the story, IBM engineering ended up providing a processor (with the help of Intel and others) with the intelligence to do it. This processor was an extremely small computer: it started as a 4-bit computer, going to 8 and 16 bits, and ending up in the 32- and 64-bit machines of today. These little processors needed an operating system and software to do what they had to do, which was to interpret the mainframe in plain English for the operator or the customer engineer. One third-party company contracted to do that was led by Bill Gates. These small “engines” led to the creation of home computers. Before the IBM PC was introduced, the home computer market was dominated by systems using the 6502 and Z80 8-bit microprocessors, such as the TRS-80, Commodore PET and Apple II series, which used proprietary operating systems, and by computers running CP/M. After IBM introduced the IBM PC, by 1984 the IBM PC and its clones had become the dominant (home) computers. This is explored in more detail at The Missing Link 1975 2016 Personal Computers.

All that interacted with the outside world (read: us, normal human beings) through text.

It is, or was, what can be called, instead of user-friendly, a user-enemy process…

Bill Gates tried to convince IBM to overcome that, with no interest from IBM, which strangely believed that home computers would some day run a sort of minimized version of the 360/370 architecture (especially VM), which is totally text-driven and quite user-enemy, and declined Bill Gates’s offer. After that he started his first operating system around the IBM PC DOS, which he later left aside to create Windows. As we know, it is graphical, and the rest is history; it could be summarized as the essence of the nineties and early two thousands.

Currently (2016)

Today Windows is perhaps fading out, being replaced by Android and Apple iOS. As a matter of fact, PCs, as such, are also fading out.

After almost going bankrupt, IBM reinvented itself in the mid nineties to become a service-oriented company, completely changing the way it did business, especially toward its workforce. Hardware became a commodity, with lots of competition to choose from. The one and only basic reason was that all those machines priced in the millions of dollars fell into the $100,000 range, and it was difficult to make money the way IBM was accustomed to. Technology made and killed IBM. Although it maintained its position in the Fortune 500, it is an entirely new company, and this is a story yet to be told, but it can be summarized at

1993–present: IBM’s near disaster and rebirth

Wrap up

Generations of IBM 360 -> 370 -> 390

Mainframe hardware and software stabilized, i.e., always more of the same, eventually from somebody other than IBM. Microsoft grew to become more valuable than IBM, only to be surpassed by Apple, and although the picture changes systematically, today it looks like that.

The funny thing, though, is that, as a brand name, IBM is still running strong.

Perhaps, besides the model and purpose of a product, the market value of its parent company, and the paradox between that and its brand-name value, should also be brought into the discussion; I leave it to social media to decide…


FORECASTING

Forecasting, or Reliability, Availability, Serviceability (RAS) and Rental Cost

Having worked most of my professional life at IBM in connection with Manufacturing and Machine Development, I failed to realize how much the existence of any equipment depended on the forecast for that equipment. It does not matter how brilliant or how clever it is, or whatever other justifications exist, if it does not meet the forecast expectations of whoever foots the bill: IBM. I could say that any successful or responsible company paying such expenses by its own means does the same. At IBM, forecasting, when deciding to bring a product into existence, is called Market Planning. The Market Planning Manager for the 1401 was Sheldon Jacobs, and in his speech at the Computer History Museum he stated that Market Planning, or simply Planning, has three main aspects:

What do users look like?

There is a basic question that very seldom, if ever, is raised by historians and analysts, but which is basic to IBM’s decision on creating a machine: it has to be a consumable product.

Commercial users

They used a lot of punched cards, keypunches, punched card input/output and unit record equipment. IBM didn’t sell its equipment, it rented it, and the punched cards had to come from IBM. From 1930 to 1950, some 25% of profits came from punched cards. The US Supreme Court ruled against this in 1936. Later on, in another case, heard in 1955, IBM signed a consent decree requiring, among other things, that IBM would by 1962 have no more than one-half of the punched card manufacturing capacity in the United States. Tom Watson Jr.’s decision to sign this decree, in which IBM saw the punched card provisions as the most significant point, completed the transfer of power to him from Thomas Watson, Sr. In 1955 a box with 1,000 punched cards cost US$0.99, or roughly a dollar, and that was the basic reference for comparing business interest, i.e., a dollar was a point, and the rentals of the machines were all defined in terms of points. Shel Jacobs was very skillful at quantifying that, and after extensive research he came to the conclusion that the prospective customer of a “small” computing system, one capable of replacing such a customer’s existing installation, would break even at around 2,000 points. So he set 2,500 points as the reference for the 1401, or US$2,500 monthly rental. He initially estimated that it would sell 2,000 machines over its life cycle, raised that to 5,000 after the enormous response from customers, and the 1401 ended up selling 15,000 machines in total. This gave IBM the money it needed to face the IBM 360 challenge, whose development is considered to have cost roughly the same as the A-bomb in 1945, and I would imagine it is comparable with the cost of putting a man on the moon in 1969. William Rodgers, in his best seller Think, discussed in more detail below, put the IBM 360 development cost, in 1967 dollars, at $5,345,000,000. Such was the money the 1401 brought to IBM.
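To make the “points” yardstick concrete, here is a minimal sketch of the arithmetic. The installation and its point values are invented for illustration; only the dollar-per-point reference, the roughly 2,000-point break-even figure and the 1401’s 2,500 points come from the account above.

```python
# Hypothetical illustration of the "points" yardstick: one point is one dollar
# of monthly rental, the price of a box of 1,000 punched cards. The machine
# names and point values below are invented; only the 2,000-point break-even
# estimate and the 1401's 2,500 points come from the account above.

unit_record_installation = {     # an imaginary customer's existing equipment
    "keypunches": 300,
    "verifiers": 250,
    "sorters": 450,
    "collators": 400,
    "accounting machines": 700,
}

IBM_1401_POINTS = 2500           # i.e. US$2,500 monthly rental
BREAK_EVEN_POINTS = 2000         # Shel Jacobs's estimated threshold

current_points = sum(unit_record_installation.values())
print(f"current installation: {current_points} points "
      f"(US${current_points:,} per month)")
print("worth quoting a 1401:", current_points >= BREAK_EVEN_POINTS)
```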

How did the punched card fit into the operation? Let’s take a look at the typical sequence a punched card goes through:

1 – Key punch

2 – Verify

3 – Sort

4 – Collate

5 – Calculate

6 – Tabulate

7 – Summary punch

Shel Jacobs found out which were the most complex operations and what they were all about, and went to these customers to ask them what they wanted. “More of the same,” they replied. The control panels were carefully studied, and Frank Underwood came to the conclusion that with some 3,000 memory positions he could emulate any such control panel, provided that one of his most successful insights was put in place: the variable-length instruction. Machine programs previously tended to use only a portion of each memory word, because instructions were fixed-length, leaving portions of the memory empty and wasting precious and costly memory positions. The idea of delimiting each instruction with a “word mark” set on a character made it possible to use the empty space fully, which in practical terms meant more memory for more sophisticated programs. After Frank Underwood’s findings and advances in programming, he came to the conclusion that a memory of 4K positions would be enough, and so that was defined as the size of the 1401’s memory. Frank also introduced “user-friendly” mnemonics for the instructions, such as “A” for adding, “M” for multiplying, “B” for branching, “E” for editing, and so on. It ended up with two languages: Autocoder, when using tapes, and the symbolic language, when using only punched cards. RPG, or Report Program Generator, was also an innovative feature, and together with the Input/Output Control System (IOCS) it saved the time and trouble of programming. (Goodbye, control panels!…) The control panels were hell to program and prone to errors; they were not only eliminated but replaced with a much more accessible way of making the machine do what was wanted or needed. On top of that, everything was to be transistorized! Thomas Watson Jr. foresaw where electronics technology was heading, simply discarded tubes, and ordered that everything from then on should be transistorized! For those more familiar with mainframes, take a look at the 1401 Reference Manual.
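As a rough illustration of the word-mark idea, the toy sketch below (not the actual 1401 instruction set or circuitry, and with made-up instruction strings) stores one character per memory position plus a word-mark bit, and reads each instruction from a word-marked position up to, but not including, the next word mark, so instructions and fields can be whatever length the program needs.

```python
# Toy model of 1401-style "word marks" (not the real 1401 instruction set):
# each core position holds one character plus a word-mark bit; an instruction
# begins at a word-marked position and runs until just before the next word
# mark, so instructions and data fields need not have a fixed length.

class Position:
    def __init__(self, char, word_mark=False):
        self.char = char             # the stored character, e.g. "A" for add
        self.word_mark = word_mark   # True marks the start of an instruction

def load(image):
    """Build core storage from (character, word_mark) tuples."""
    return [Position(c, wm) for c, wm in image]

def next_instruction(core, start):
    """Collect characters from `start` until the next word mark (exclusive)."""
    chars = [core[start].char]
    i = start + 1
    while i < len(core) and not core[i].word_mark:
        chars.append(core[i].char)
        i += 1
    return "".join(chars), i         # the instruction text, next start address

# A made-up three-instruction program: note the differing instruction lengths.
core = load([("A", True), ("1", False), ("0", False), ("0", False),
             ("B", True), ("2", False), ("0", False), ("0", False), ("5", False),
             ("H", True)])

addr = 0
while addr < len(core):
    insn, addr = next_instruction(core, addr)
    print(insn)                      # prints: A100, B2005, H
```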

Scientific Users

One thing that also never occurred to me is that, behind all the charm connected to computers, which is always linked with the most complicated things, as was the case with IBM’s initial entry into the world of computing, namely the Mark I and the IBM SSEC, the real question was what mankind as a whole really needed at an affordable cost. Anyway, those initial sophisticated machines became instrumental to the development of the first family of “commercial”, albeit really “scientific”, computers, the IBM 700 family. How did this happen?

Thomas Watson Senior & Jr

IBM is the brainchild of Thomas Watson, although a creation of Charles R. Flint, who invented the system of combining corporations into trusts. Some consider Thomas Watson the Henry Ford of computers, but IBM was really a creation of his son, Thomas Watson Jr. His father grew up and lived with a line of products, namely meat slicers, grocery store scales, time clocks and primitive tabulators, all promoted as “business machines”, as can be seen below:

Vintage IBM product line (scales, time recorders, slicers and tabulators), c. 1935

In 1911, Charles R. Flint, a noted trust organizer, engineered the merger of Hollerith’s Tabulating Machine Company with two others – Computing Scale Company of America and International Time Recording Company. The combined Computing-Tabulating-Recording Company (C-T-R) manufactured and sold machinery ranging from commercial scales and industrial time recorders to meat and cheese slicers, along with tabulators and punched cards. Based in New York City, the company had 1,300 employees and offices and plants in Endicott and Binghamton, New York; Dayton, Ohio; Detroit, Michigan; Washington, D.C.; and Toronto, Ontario.

 


William Rodgers, Think (1969)

As you can see in the Amazon reviews for the William Rodgers book, and I quote:

“William Rodgers was a gifted writer for many newspapers and magazines, including the New York Herald Tribune and Harper’s. This book was a huge best-seller when it was published in the late 1960s and very accurately describes the early dominance that IBM had in the computer industry. Although at the time IBM resented the close look its company was given in these pages, later on a member of the Watson family said it was the best biography of the company’s founding genius, Thomas Watson. Try to track down an out-of-print copy, as it makes for fascinating reading, still today.”

I joined IBM Brasil in December 1970, and my first job was to do the layout of the Sumare Plant and to install it, partly from IBM Rio de Janeiro, partly from the provisional installation in Campinas, SP. An old-timer IBMer was sent to help us, and he had a copy of this book. When he went away he left the book, among other things, on his desk, and I took it and have it to this day (2016). I read the book avidly, and the saga of IBM described in it fascinated me and fueled and fed my youthful dreams of going to the USA and seeing how it really was. I moved from the Industrial Installation department to the Product Engineering department to pursue my dream, which was permitted as some sort of prize or recognition for the good job I did in solving the layout problem of the Sumare Plant.

Incidentally, on the very first day I came to work at IBM I was taken to visit the Sumare Plant, which was empty except for a 1401, which had been deposited there, waiting to be installed.

The account William Rodgers gives of how Thomas Watson Sr. got involved with scientific calculations is one of those things. It explains how IBM got involved with Columbia and Harvard, namely, how the Think motto was instrumental in involving IBM with heavy players in the scientific arena. It was, and is, obvious that to think constructively, to become expert in one’s work, one has to be educated. And this was strongly encouraged within IBM anywhere it did business, as a consequence of Thomas Watson Sr.’s Think perception. Although William Rodgers’s observation is a little bit poisonous, and I quote, “Watson felt little need for academic learning that might have broadened his judgment, or allowed him to view the world with something more encompassing than copybook maxims and bromidic verities,” he was enormously helped in that by Benjamin D. Wood, and I quote once again from the book:

“Watson had the good fortune to have the vision of his own corporation substantially widened by a diffident, scholarly intellectual at Columbia University. It was good fortune for which he was ready, and to which he was immediately responsive. His benefactor in what amounted to a step in Watson’s own education was Benjamin D. Wood, a US Army psychologist in World War I and, in 1928, a thirty-four-year-old professor of Collegiate Educational Research at Columbia.”

Ben Wood’s doctoral dissertation revolved around E. L. Thorndike’s principle, which, simply put, states that “whatever exists at all, exists in some amount”, and applied it to the measurement of human intelligence.

How IBM got involved with scientific calculations

If you read the two articles above involving Ben Wood and Columbia, you will also want to know the human story behind the scenes, as told by William Rodgers; I transcribe it from the book above:

“He was appointed assistant to Dean Herbert Edwin Hawkes of Columbia College, with the task of advising third-year students entering business, journalism, and law. To do this, he needed some form of prediction of their probable performance in professional schools. Because of previous experience with Thorndike, who was called “the big chief” at Columbia-and who devised the measurement tests which still bear his name-and because he was appalled by the fact that only two out of every seven boys entering college ever graduated, Wood began to devise testing and scoring methods that turned up some truly astonishing findings. In particular, Wood demonstrated a disparity in the quality of education so extreme that seniors achieving the highest possible test scores in some colleges were still below students who got the lowest scores at, for example, Haverford College. His findings seemed incredible, and Wood was pretty roundly denounced as a fake for “mechanizing” measurement best left to human evaluation.
Dean Hawkes weathered the criticism, however, and supported Wood in his investigations. Grants from the Carnegie Foundation, the Commonwealth Fund, and a half-million dollars from the General Education Board subsidized his efforts, and before long the New York Board of Regents and the educational system in Pennsylvania began to reevaluate their objectives in relation to Wood’s revelations.
The monumental task of recording and processing test results became a physical impossibility. Handling 35,000 tests at one time costing five dollars each to process was too costly even with large and generous grants. When Wood had millions of tests to deal with, he was harassed beyond anything even his own tests could measure. He was given floor space in Hamilton Hall, which he furnished with wooden boards and sawhorse tables scrounged from secondhand stores and lumber yards, and staffed with “two acres of girls” reading and classifying test answers. A labor problem arose when the girls, numbed by the tedium of the work, gave vent to their wrath over snagging their stockings and clothes on the improvised furniture. Burlap was tacked to splintered surfaces and the uprising quelled. In desperation intensified by the knowledge that the work was enormously important, Wood culled from directories the names of chief executives of ten corporations in the equipment manufacturing business and sent out a call for design and engineering help. Nine answers were “brush-offs from the secretaries of third vice-presidents.” The tenth was a crisp telephone call:
‘I’m Thomas Watson. I’m very busy and can spare only an hour. Be at the Century Club promptly at twelve; I have an engagement at one.’ Wood had scarcely said a word, partly out of shyness but largely out of inopportunity.”

The meeting lasted almost five hours and started one of the most fruitful relationships ever between industry and academia.

Watson was eager to get government contracts, especially military ones, and Ben Wood, based on his thesis around Thorndike’s ideas, told him what was music to Watson’s ears: according to Thorndike and his philosophy of quantitative measure, Watson should bear in mind that in psychology and elsewhere, whatever exists at all exists in some quantity. If it is zero, it is nonexistent. But all the existing aspects of earth, science, nature, life, intelligence could be reduced to measure. Money and inventories, the hours a man worked, and the price and profit of goods were not the only things that could be manipulated quantitatively. Everything from virus to super-galaxy – from microcosm to macrocosm – was a matter of quantity. The only way man could ever learn all the things he needed to know, to keep civilization ahead in the race between education and catastrophe, was to recognize the quantitative basis of all phenomena. The machines of IBM could extend all measure, whether for the doctor, who took a temperature, counted a pulse, took a blood sample, counted cells, or for any other field of endeavor in which the need for knowledge preceded judgment or activity. In summary, there is no aspect of life to which these IBM machines cannot make a basic and absolutely essential contribution. According to William Rodgers, the connection was made this way.

Ben Wood was the first PhD to become IBM’s consultant for life and to have access to an enormous amount of equipment and money to do whatever he deemed right, and this is how Columbia became one of the most advanced computer users for scientific research. It is very interesting to note that from 1927 to 1932 Ben Wood secured a couple of million dollars in subsidies from the typewriter industry, in which IBM itself later became dominantly involved. His interest was to demonstrate the value of the typewriter as a learning machine. Wood always made an explicit distinction between a teaching machine, used to convey facts and ideas, and a learning machine, “an instrument to stimulate creativity and independent, self-initiated and success-motivated learning, and to expedite absorption and comprehension.” He didn’t, as far as I know, say it, but he could have said that this is basically the difference between training and education, as I would discover later in life.

The results can be better perceived at the Columbia site entry on the SSEC

and the Columbia Thomas J. Watson Scientific Computing Laboratory, 612 West 116th Street (1945-1953).

Two perfect examples of scientific calculation, and of what such users look like when government-sponsored:

I – The Battle of the Atlantic

“Le bon Dieu est dans le détail” (the good God is in the detail) is generally attributed to Gustave Flaubert (1821–1880), and although it was later inverted to “The devil is in the detail”, if there is anywhere this applies, it is in war. And such was the case of the Battle of the Atlantic. The big picture can be seen above, but William Rodgers, in his aforementioned Think, gives an account of a detail that, depending on which side you look at it from, can be either God’s or the devil’s. Let’s take a look and figure out how this brings an understanding of what scientific calculation was about, from the point of view of perhaps its most important user: the US government.

As told by William Rodgers, in Think, 1969:

“Long before Pearl Harbor and the United States declaration of war, merchant tonnage losses of U.S., British, and Canadian bottoms increased to catastrophic proportions. Tankers, in particular, were torpedoed and left in flames within sight of American coastal citizens. The ports at Halifax and the Gulf Coast were graveyards for vessels. With war declared, the situation grew so ominous (although at the time the real danger was not made public) that German submarines along the Great Circle route of the North Atlantic operated nearly at will.
Working in deepest secrecy, Dr. Wallace J. Eckert (not to be confused with Dr. J. Presper Eckert, co-inventor of ENIAC) and his staff at the Naval Observatory were charged with carrying out an operation that meant, almost literally, the difference between life and death to the Allied cause, since on it depended the use of the sea lanes to Murmansk and the United Kingdom. Before such technological miracles as radar and sonar were extensively in use, priority was given to means more readily at hand to disperse the murderous submarines which had sent to their deaths half of the pre-war U.S. maritime tanker personnel and the ships they manned.
Effective air assault against submarines was inevitably delayed by navigation techniques requiring as much as thirty minutes to determine a ship’s or a submarine’s position, a preliminary requisite to a radio call for help. Manual calculations, even with available tables and a sextant, were too time-consuming and cumbersome, leaving supply laden ships exposed to torpedoes and destruction in the interim.
Working from Professor Brown’s tables, their accuracy confirmed by the IBM equipment at Columbia, Dr. Eckert, a supervisor at the Observatory named Jack Belzer, and a group of young especially trained women began calculating and producing nautical almanacs for air and sea navigators. Limiting the operation to cover a ten-degree band of the heavens over the North Atlantic-it would have required a million pages of condensed type to provide complete almanacs for the navigable waters over the full north-south range-they produced the printed calculations that constituted the first and oldest scientific computer output in the world. It had been done, experimentally, at the old Pupin computing laboratory at Columbia, where fire control equipment calculations for the B-29 aircraft were later done under subcontracts to the General Electric Corporation.
The almanacs were modified to carry very small slugs of type, printed a line at a time. The type was so tightly condensed that every other digit was printed in the initial operation, then the platen on the machine was shifted one-half a notch to open up alternate spaces for the remaining half of the data on each page. This was done to compress the data and reduce the bulk of the document. Calculations were related to specific dates, and production schedules of the almanacs sometimes left only a few hours during which their delivery was absolutely crucial to navigators in the spotter aircraft, in the assault planes, and on the ships. In Washington, planes stood by to fly them to waiting navigators. Eckert and the staff were never more than a week ahead of delivery. But with the data, navigators on North Atlantic patrol and on ships in transit could determine a fix often in a single minute after sighting a submarine, and radio its position to every craft within range. Corvettes from Canada, destroyers, and anything capable of carrying a gun or a depth bomb could then converge on the spot at maximum speed.
In a matter of weeks, loss of lives and tonnage in the North Atlantic diminished; in time, and with new sensing technology, the sea lanes were brought under Allied control.
It was the sequence of events started by the cooperative Watson, in response to the requests of Ben Wood and Wallace Eckert for IBM machines to pursue their work and scholarship, that led directly to these accomplishments. Watson’s Astronomical Computing Bureau at Columbia, established after Eckert had read the articles written by scientists at England’s Nautical Almanac Office at Greenwich, was the seminal instrument for giving birth to the new machines to measure the phenomena of the universe.
The newest and most apocalyptic phenomena to which Watson’s machines were applied were the nuclear fission and atomic bomb projects; Eckert and Grosch labored on these, not altogether certain of the objectives of their research until word filtered back to them about the Manhattan Project and the awesome accomplishments of Dr. Robert Oppenheimer and his associates, who were dispatching instructions by mail postmarked Los Alamos, New Mexico.
Eckert and Wood remained in close touch with Watson until age, failing health, and his son’s ascension elevated him to emeritus status. Following his years at the Naval Observatory, Eckert was preeminent among scientists of the world in his field, and thus precisely suitable as Watson’s choice to direct, at $30,000 a year, the Thomas J. Watson Laboratory, endowed and established at Columbia after the world entered the nuclear age. A good and true scientist, he served in the post for twenty-two years and found his own heaven where mystics always said it was, among the galaxies and the stars.”

If you go to the pointers above you will have a very good overview not only of computer genesis, but of the IBM computer genesis history. This is the missing link to get to the quantum leap that the System 360 was, not only for IBM, but for the world. If you want or need to know what the von Neumann concept is all about, you should take a careful look at the 1401 Reference Manual, because it is all there. Frank Underwood made perfect use of the stored program concept invented by von Neumann. His remarks about von Neumann, who was in Endicott acting as a consultant, are curious: von Neumann came as a consultant to answer questions on deep mathematical problems; he would sit back, you would think he was sleeping, then spring up with “this is the way it is.”

II – Ballistics

By 1949, the general picture for this kind of scientific computing looked like this.

How it started for private companies, or bridging the gap

 

II) Business Planning

III) Announcement Preparation

The missing link: 1975 – 2016 Personal Computers

(See also The missing link: 1975 – 2016 Main Frames)

(See also: Internet)

Following the generation idea, which to me is no longer valid, or a good idea for coping with the subject, let’s try to fit it in accordingly. In the discussion of why generation is no longer a valid idea, I focus on IBM mainframes; here I will focus on the Apple Macintosh PC, sprinkled with examples of Windows-oriented PCs, mostly because there is no example that compares back to back with Apple. The main difference is that if you go Macintosh, you go Apple, and when you go Windows, you have an enormous list of manufacturers’ options. In the case of Windows it makes more sense to discuss the processor in terms of bit width, clock speed, RAM memory, direct-access memory interfacing, etc. Whereas in the Macintosh, although it also makes sense to do that, the machine can be one and only one, from Apple. Historically, it looks like that, and summarizing, for Windows, it is the following. Obviously, well before 2016, Android, Google and Apple were making strides, and they are discussed under The Shape of Things to Come.

It seems to me more useful to try to compare Macs and PCs.

It is neither easy nor obvious, but simply put, a study conducted by market research firm NPD found that 79 percent of all computers bought at U.S. retail stores in October 2010 were Windows PCs. However, of those that sold for $1,000 or more, 88 percent were Macs [source: McCracken, above in Time Magazine].

I try to fit the pieces together in the discussion about technology, under the article Computer Genesis: Perspective 1975 – 2016.

Third Generation Computers

From 1964 to 1971 computers changed mostly in terms of speed and of becoming smaller. This was because the hardware now used integrated circuits, or semiconductor chips: large numbers of miniature transistors packed onto silicon chips. Prices went down, and the I/Os (input/output systems), keyboards and monitors took over from the previous punched cards and printouts.

Fourth Generation Computers

The circuits, as the name says, were put together with LSI, Large Scale Integration, making it possible to place thousands, and eventually millions, of transistors on a single chip. This was called monolithic integrated circuit technology. One such chip was the Intel 4004, which in 1971 became the first commercially available microprocessor. That is when the Personal Computer comes into existence. As was pointed out recently in a special edition of Newsweek, the trend was the following:

Apple, Macintosh & Others

1975

On the cover of the January issue of Popular Electronics, Micro Instrumentation and Telemetry Systems announces its Altair 8800 personal computer kit. Upon its release, MITS co-founder Ed Roberts coins the term “personal computer”.

Popular Electronics, January 1975 cover

Altair 8800

1976

The Apple I is completed by Steve Wozniak and presented at a Homebrew Computer Club meeting. After receiving 50 orders for the Apple I from a computer store called the Byte Shop, Wozniak and Steve Jobs found Apple Inc.

Apple I computer

1977

Spanning beyond the hobbyist community, the Apple II swiftly gains popularity and is given to schools by the company. It’s sold with a switching power supply, keyboard, main logic board, case, game paddles and a copy of the arcade game Breakout.

Commodore PET was also successful in the education area.

1981

IBM announces its first PC, which sees success based more on name value than memory capacity. Microsoft creates MS-DOS for IBM’s PC. The Osborne 1, the first mass-produced portable microcomputer, weighs in at 23 pounds and becomes available for $1,795.

IBM PC 5150

1983

Apple introduces the first commercial PC with a graphical user interface, the Lisa. The GUI proves to be a milestone in the PC’s evolution. Lisa, as well as its GUI, were created based on a concept from Xerox’s Palo Alto Research Center (PARC).

1984

The Apple Macintosh is introduced to the world in a critically acclaimed, Orwellian Super Bowl commercial. As the first successful mouse-driven computer with a GUI, the unit is sold for $2,500. Mouse-utilizing applications include MacPaint and MacWrite.

Apple’s “1984” Super Bowl commercial

1998

Shortly after Jobs’s return to Apple, the company launches the iMac, a series of desktop computers. Praise for the machines stems from their easy usability. The product is seen as the reason for Apple’s emergence from near bankruptcy in the mid-’90s.

iMac

Time Line of Apple Products

MacBook Air (mid-2012)

2008

Apple continues to simplify its own designs, leading to the release of the MacBook Air. The leaner, lighter laptop boasts a long-lasting battery. To achieve the thinner design, it swaps out the hard drive for a solid state disk, the first computer to boast this feature.

2012

The U.K.’s Raspberry Pi Foundation creates a credit card-sized single board computer sold for the low price of $25. Known as the Raspberry Pi, the compact machine has everything—so long as you have a monitor, keyboard and mouse to hook it up to.

The Fifth Generation of Computers

It is still in the future, but it is safe to say that computer users can expect even faster and more advanced computer technology. Fifth-generation computing has yet to be truly defined, and it is doubtful that the future can be discussed under this concept. It is more suitable to discuss what will result from the marriage of the computer, as it has been discussed here, and the smartphone. Or what will result from the marriage of the smartphone and the automobile. Or the home. Or the office. Or whatever…

The Missing link 1975 2016 – Main Frames

Computer Genesis: Perspective 1975 – 2016

(See what I am talking about)

(See also: The Missing Link 1975 2016 Personal Computers)

(See also: Internet)

In the opening of the Genese do Computador (Computer Genesis) I stated my intention to give a perspective for further investigation, and I ended up using the generation idea to make sense of the development that occurred. I also ended up showing the leading technologies then used by IBM, and unfortunately it did not become clear that density, the number of bytes per square inch, was the key issue, because, for the same amount of computing power, the cubic (and square) feet needed to house such a machine was and is a direct function of that. Not to mention the cost, especially the initial investment, installation and maintenance. This is discussed in great detail for the IBM 1401, which was the first “mass-produced” computer and is the logical basis to take into account when deciding to put a machine, of any size, on the market.

And it is one piece of a correct frame of mind (among others, listed below) for understanding what happened and is still happening. Although in the pointer file which brought you here I still frame it under the generation concepts, I want to branch out to a more proper way of seeing what happened, to justify my point that there are now better ways to understand what happened and what is going on.

Not necessarily in order of importance, the following cross-sectioning concepts are better for understanding the evolution of computers and the like from 1975 to this day (2016):

1 – Figure out what is involved in the creation of a modern computer, and the example given here is the IBM 1401

2 – The cost factor: the money involved, the accessibility of the equipment given economic constraints, and its influence also on manufacturers and manufacturing.

3 – The technology involved: the escalation in power and the shrinking in size, their effects on the products that can be created, and their influence on mankind and its endeavours.

4 – The new products and the ergonomic factor

5 – The shape of things to come: Android, Google, Apple, etc.

What, then, should be used in place of the generation idea?

Basically two things:

1st. If we are discussing the same product, the reference is the model. The example anyone understands is the iPhone, which runs by models.

2nd. What kind of product it is and for what purpose it is designed.

Under the above criteria, i.e., models and purpose, you can see all of them (mainframes) and their details at the IBM site. What follows is a kind of summary or wrap-up to make a point, because the full catalogue is not only very difficult to figure out, it is also extremely boring.

The picture

From the inception until 1975, or even a few years before, the emphasis was on the hardware. But price was still not really an objection: although the IBM 1401 rented for 2,500 dollars a month (it was not sold), and large systems typically rented for more than 10,000 dollars a month, the direct cost of the hardware in these industry-oriented machines was a fraction of that, and this enabled IBM to pay the expenses of developing the IBM System 360 and those that would follow. But always with significant price reductions, i.e., prices have since then always come down. The 1401 was withdrawn in 1971.

It is quite a mess to try to cover all the prices of IBM’s equipment. Although it could be done, it suffices to point out price ranges along a timeline, better yet for a period or a family of machines, because this will be indirectly discussed later under the technology involved, which makes more sense.

At the beginning – The fifties

These big systems, before the 360 era, amounted to a couple of dozen installations, if that much. For instance, an IBM 7094, withdrawn in 1969, was sold for $3,134,500 with a complete package: not only the machine programs, but also Fortran, Cobol and all the input/output systems, including sorting.

Coming of age – Early sixties

At the entry-level, the monthly rental for a basic RAMAC was $3,200, of which $650 was for the disk storage unit, $1,625 for the processing unit and power supply, and $925 for the console, printer and card punch. More than a thousand of these vacuum tube-based computers were built before production ended in 1961.

The IBM 7030 Data Processing System — or “Stretch” computer — was delivered in April 1961, offering a performance that was 200 times faster than the IBM 701, 40 times faster than the IBM 709 and seven times faster than the IBM 7090. Although the 7030 was the industry’s fastest computer in 1961, its performance was far less than originally predicted. IBM cut its price from $13.5 million to $7.8 million and offered the 7030 to only eight customers.

System 360 – Late sixties early seventies

IBM really got started toward the giant it became with the System 360. They sold by the hundreds, even by the thousands for the smaller models. The 360 family was composed of the following machines:

The IBM 1130 Computing System was announced in February 1965 as the “lowest-priced stored program computer ever marketed by IBM.” Capable of performing 120,000 additions a second, the system was offered for lease for as little as $695 a month and for sale at $32,280. The 1130 used microelectronic circuits employing IBM’s Solid Logic Technology similar to those used by the IBM System/360. It was manufactured in San Jose, Calif., and Greenock, Scotland.

But the entry point, and the equivalent of the IBM 1401, was the System 360 Model 40, which became the IBM 145, and later the 148, in the 370 family.

Thomas J. Watson, Jr., said of the System/360 when it was introduced in April 1964 that it was “the most significant product announcement in IBM history.” The word “system” was chosen to signify that the new product line was an interchangeable family of processors and peripherals with programming compatibility between all models. The Model 40 had a maximum memory of 256K, a cycle time of 2.5 microseconds and it transferred 16 bits per cycle. It was withdrawn from marketing in October 1977.

The most powerful was the System 360 model 75.

The IBM System/360 Model 75 was introduced in April 1965, with the first delivery, to the NASA Institute of Space Study, following in January 1966. A powerful processor for integrated data management and processing, the Model 75 had a storage capability of up to 1,048,576 bytes. The machine had a memory cycle time of 750 nanoseconds, and it featured four-way interleaving of memory for faster effective access. (Interleaving is a technique in which the computer’s memory is implemented by two or more electronically independent units, any one of which can be accessed while the others are still responding to previous requests.) The Model 75 was withdrawn from marketing in March 1977. A console from one of the machines has been preserved in the IBM Collection of Historical Computers.
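A minimal sketch of what interleaving buys, under idealized assumptions (a toy model, not the Model 75’s actual memory controller): consecutive addresses fall in different banks, so a run of sequential accesses can overlap instead of each one waiting out a full 750-nanosecond cycle.

```python
# Idealized sketch of four-way memory interleaving (a toy model, not the
# Model 75's actual memory controller). Consecutive addresses map to
# different, independent banks, so sequential accesses can overlap.

NUM_BANKS = 4          # the Model 75 used four-way interleaving
CYCLE_TIME_NS = 750    # memory cycle time quoted for the Model 75

def bank_of(address):
    """Consecutive addresses land in different banks."""
    return address % NUM_BANKS

def sequential_read_time(n_words):
    """Rough best-case comparison for reading n consecutive words:
    without interleaving each access waits out a full cycle; with it,
    up to NUM_BANKS accesses are in flight at once."""
    plain = n_words * CYCLE_TIME_NS
    interleaved = CYCLE_TIME_NS + (n_words - 1) * CYCLE_TIME_NS / NUM_BANKS
    return plain, interleaved

for addr in range(8):
    print(f"address {addr} -> bank {bank_of(addr)}")
print(sequential_read_time(8))   # (6000, 2062.5) nanoseconds
```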

The 360 family was superseded by the 370.

System 370 – Early seventies late eighties

Slightly above the entry level, introduced in September 1970, the Model 145 was the first IBM computer to have a main memory made entirely of monolithic circuits on silicon chips (previous 370 models used magnetic core main memories). Its system storage ranged from 112K to 512K, twice that available with the IBM System/360 Model 40. It operated at speeds up to five times the Model 40’s and up to 11 times the Model 30’s. Model 145 users were able to run their System/360 programs with little or no reprogramming. Monthly rental for typical configurations of the IBM System/370 Model 145 ranged from about $14,950 (with 112,000 characters of main memory) to $37,330 (512,000 characters), with purchase prices ranging from about $705,775 to $1,783,000. Initial customer shipments were scheduled for late the following summer. The Model 145 was withdrawn from marketing in November of 1971.

The most powerful IBM computer of its time, in the 370 family, the 3090 high-end processor of the IBM 308X computer series incorporated one-million-bit memory chips, Thermal Conduction Modules to provide the shortest average chip-to-chip communication time of any large general purpose computer, and the industry’s most advanced operating systems. Announced in February 1985, the Model 200 (entry-level with two central processors) and Model 400 (with four central processors) IBM 3090 had 64 and 128 megabytes of central storage, respectively. At the time of announcement, the purchase price of a Model 200 was $5 million, and the machine was available in November 1985. The Model 400 was available only as a field upgrade from the Model 200 at a cost of $4.3 million beginning in the second quarter of 1987. A later six-processor IBM 3090 Model 600E, using vector processors, could perform computations up to 14 times faster than the earlier four-processor IBM 3084.

System 390 – early nineties mid nineties

IBM reinvented itself in the mid nineties to become a service-oriented company, completely changing the way it did business, especially toward its workforce. Hardware became a commodity, with lots of competition to choose from. The one and only basic reason was that all those machines priced in the millions of dollars fell into the $100,000 range, and it was difficult to make money the way IBM was accustomed to. Technology made and killed IBM. Although it maintained its position in the Fortune 500, it is an entirely new company, and this is a story yet to be told, but it can be summarized at

1993–present: IBM’s near disaster and rebirth

IBM Main Frame Wrap-up

Generations of IBM 360 -> 370 -> 390

The original 360 family was announced in 1964, and the lower midrange model 40 was the first to ship a year later. The most interesting version was model 67 (first shipped June 1966) which had hardware to support virtual memory. IBM had planned a special operating system for it (TSS/360), which they never managed to get to work well enough to be usable. Within IBM, model 67 was used with a system known as CP-67, which allowed a single 360/67 to simulate multiple machines of various models. This turned out to be very useful for developing operating systems.
In the summer of 1970, IBM announced a family of machines with an enhanced instruction set, called System/370. These machines were all designed with virtual hardware similar to 360/67, and eventually all the operating systems were enhanced to take advantage of it in some way.

When System/360 was successful, other companies started making their machines similar to IBM’s, but not close enough to actually run the same software. In 1970, however, Gene Amdahl (who had been the chief architect for the 360 family) started a company to build a series of machines that were direct clones of the 360-370 architecture, and later Hitachi followed suit. (The first Amdahl machine was shipped in 1975.)

Big, fast disk drives were one of the strengths of IBM. In 1973, the big mainframe disk drive was model 3330-11: 400 MB for $111,600 or $279/MB. By 1980, you could get the 3380: 2.5GB for $87,500 or $35/MB. DRAM prices were dropping, too: In 1979 the price was cut from $75,000/MB to $50,000/MB.

Through the 1970’s and 1980’s, the machines got bigger and faster, and multi-processor systems became common, but the basic architecture did not change. Around 1982, addresses were extended from 24 bits to 31 bits (370-XA), and in 1988 extensions were put in to support multiple address spaces (370-ESA). In 1990, the ES/9000 models came out with fiber-optical I/O channels (ESCON), and IBM began using the name System/390.
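A quick bit of arithmetic makes the jump from 24-bit to 31-bit addressing concrete, assuming byte-addressable storage as in System/370:

```python
# Byte-addressable storage (as in System/370): how much each address width covers.
for bits in (24, 31):
    size = 2 ** bits
    print(f"{bits}-bit addresses -> {size:,} bytes ({size // 2**20:,} MB)")
# 24-bit addresses -> 16,777,216 bytes (16 MB)
# 31-bit addresses -> 2,147,483,648 bytes (2,048 MB)
```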

In 1999, IBM released a new generation of the S/390. A special issue of the IBM Systems Journal describes its technology.