Amazon’s Jeff Bezos recently expressed the view that a dominant position in technology is transient, adding that Amazon may not escape that fate. The pattern that includes AT&T, Xerox, DEC and IBM indicates an intrinsic difficulty for a technological leader in identifying the natural next stage in its field. The functional overlap among desktop computers, laptops, tablets and cell phones suggests that information technology (IT) is overdue to embark on its next stage. Current technology leaders seem confronted with this challenge, without a clear direction emerging thus far.
In the late 1960s, while I was a doctoral student in philosophy at the Graduate Center of the City University of New York, I noted that the hardware of fax machines, electronic calculators and word processors could be replaced by application software on a general-purpose computer. I combined this observation with my awareness of the rapid increase in transistor density per unit area and the corresponding drop in the cost per transistor. I concluded that user-dedicated information technology would replace both time-sharing computing and the above-mentioned special-purpose equipment.
At that time I felt pressed to prove, to myself at least, that philosophy provides powerful problem-solving means that are not available elsewhere. I decided to use that situation as a test. I formed the company Q1 Corporation, dropped out of school, and recruited a core technical team.
In December 1972, Q1 delivered to Litcom, a division of Litton Industries on Long Island, the world’s first microprocessor-based personal computer. It utilized an 8-bit single-chip microprocessor, the Intel 8008, the forerunner of Intel’s x86 microprocessor family. In 1975 Microsoft was formed; it focused on developing software for Intel’s microprocessors, and its products came to include the Windows operating system. By the end of the 1980s, “Wintel” (the Windows/Intel combination) dominated personal computing worldwide.
Intel did not originate the design of the central processing unit (CPU) of the 8008. The CPU was designed in 1969 at Computer Terminal Corporation (CTC), after I urged CTC to develop a microprocessor-based, user-dedicated personal computer. Instead, CTC designed an intelligent terminal with a CPU. After asking Intel to implement that CPU as a single chip, CTC decided against using such a chip, and Intel shelved the project.
I then met with Robert “Bob” Noyce, who was at the time the president of Intel. I pleaded with him to resume the development of the 8-bit microprocessor chip. He said that he would do so after concluding an ongoing project (a 4-bit processor), provided that Intel obtained from CTC permission to sell such a chip on the general market. I obtained that permission, and the Intel 8008 was introduced in April 1972. Q1 delivered the first microprocessor-based personal computer in December 1972.
The account below provides a somewhat fuller chronology. The events proved to me that philosophy provides unique and unrecognized problem-solving methods, whereas accepted analytic philosophy seeks to clarify problems rather than solve them. The best way for me to establish that philosophy contributed to my actions and decisions is to undertake the challenge of specifying aspects of the next phase in information processing as it applies to the present situation.
The transistor was invented at Bell Telephone Laboratories in 1947. In 1958 the integrated circuit was invented at Texas Instruments (TI). During the 1960s, there were five stages of halving the linear distance between transistors. Each halving halved the travel time for electrons (at a given voltage) and quadrupled the number of transistors per unit area. Thus, the five stages resulted in more than a thousand-fold (4^5 = 1,024) increase in the number of transistors per unit area. To the extent that the cost per unit area remained the same, the production cost per transistor dropped by the same thousand-fold factor. Furthermore, computing cost can be taken as the cost per transistor multiplied by processing time, and processing time itself fell thirty-two-fold (2^5). Thus, the 1960s represented roughly a thirty-thousand-fold drop in computing cost.
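The arithmetic above can be sketched as a short back-of-the-envelope calculation. This is only an illustration of the reasoning; the stage count and per-stage factors are the round figures cited in the paragraph, not measured data:

```python
# Back-of-the-envelope scaling arithmetic for the 1960s, as described above.

stages = 5  # five halvings of the linear distance between transistors

# Each halving of linear distance quadruples the transistor count per unit area.
density_gain = 4 ** stages  # 1024, i.e. more than a thousand-fold

# At constant cost per unit area, cost per transistor falls by the same factor.
cost_per_transistor_drop = density_gain  # 1024

# Each halving of distance also halves electron travel time (at a given voltage),
# so processing time falls by a factor of 2 per stage.
speed_gain = 2 ** stages  # 32

# Computing cost ~ (cost per transistor) x (processing time):
computing_cost_drop = cost_per_transistor_drop * speed_gain  # 32768

print(density_gain, speed_gain, computing_cost_drop)  # → 1024 32 32768
```

The product of the two factors, 1,024 × 32 = 32,768, is the “about thirty-thousand-fold” drop in computing cost mentioned above.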
IBM was the world’s dominant computer company during the 1950s and 1960s. Its main product line was the IBM System/360 series of time-shared mainframes. In 1967 I met with Jacques Maisonrouge, who was at that time the president of IBM World Trade Corporation. I conveyed to Maisonrouge my conviction that the rapid increase in transistor density per unit area and the corresponding decline in cost per transistor would transform the computer field through the emergence of user-dedicated personal computing devices. I proposed that IBM explore this new area. Maisonrouge arranged for me to meet J. C. R. Licklider at the Thomas J. Watson Research Center in Yorktown Heights, New York. I did not know at the time that Licklider was a key promoter of the time-sharing approach to computing at MIT, DARPA and IBM, and that he was about to return to MIT. His book Libraries of the Future (1965) reflects his perspective. Our meeting proved pointless.
Advanced Memory Systems
In 1968, Advanced Memory Systems (AMS) of Santa Clara, California, developed the first random-access semiconductor memory chip with 1,024 transistors. It was a significant milestone. First, at that transistor density, it could replace the older technology of magnetic-core memories. Furthermore, a chip with 1,024 transistors was a stage or so away from having the number of transistors necessary to implement the central processing unit (CPU) of a computer on a single chip.
AMS contacted Philips, Appel & Walden, at that time a Wall Street firm focused on technology, seeking to raise funds through an initial public offering (IPO). James “Jim” Walden, the managing partner, asked me to evaluate AMS. First, I discussed the state of the technology with Professor Carver Mead at the California Institute of Technology (Caltech). Specifically, I sought his view on how small a transistor may become and how quickly the rate of increase in transistor density is likely to decline. I was impressed with the technical competence of the AMS team. I found myself overlooking their less-than-adequate marketing orientation and recommended to Jim Walden that the underwriting of the AMS IPO go ahead.
Computer Terminal Corporation (CTC)
In 1969, CTC of San Antonio, Texas (later renamed Datapoint) developed a computer terminal, the Datapoint 3300. CTC contacted Philips, Appel & Walden seeking to raise $4 million through an IPO. Jim Walden asked me to evaluate their technology. In San Antonio, I had an extended discussion with Austin “Gus” Roche, the vice president for research and development. I repeated to him what I had conveyed two years earlier to Jacques Maisonrouge of IBM: that rising transistor density and single-chip CPU implementation would make computing local and user-dedicated. I urged CTC to consider developing a personal computer.
I recommended that CTC:
- Develop a computer CPU.
- Make the computer user-dedicated.
- Locate the computer where the user is.
- Seek to have the CPU implemented on a single silicon chip.
Roche responded that their next product would contain a computer. He thus accepted my first recommendation, and their next planned product, the Datapoint 2200, did contain a CPU.
CTC also acted on my last recommendation and asked both Intel and TI to submit proposals for a single-chip implementation of the Datapoint 2200 CPU. But after receiving microprocessor chip samples from Intel and TI, CTC rejected my recommendation and implemented the CPU using existing technologies instead.
I later learned that the Datapoint 2200 was designed to be an intelligent terminal to a remote computer, not a personal computer: it did not include a disk drive to provide direct-access storage; it did not provide a high-level programming language; and its 2 kilobytes of internal memory were insufficient for it to function as a general-purpose computer. Thus, my main recommendation was not accepted.
On my return to New York, I told Jim Walden that I liked the company, its technical competence, and its “can do” attitude. I added that their product philosophy, as reflected by their initial product (a “dumb” computer terminal), was conceptually obsolete. I mentioned to Walden that I had conveyed to CTC my views about their next product.
Intel was formed to develop, manufacture and market semiconductor memory chips. Bob Noyce, the president of Intel at the time, and Gordon Moore, a co-founder, were reportedly concerned that if Intel produced CPU chips it would seem to be competing with the customers for its memory chips. This consideration must have played a role in Intel’s decision to shift focus to the development of a chip for Busicom, a Japanese electronic calculator consortium. Intel stopped working on the development of the CTC chip in July 1970.
On hearing this, I flew to California to meet with Bob Noyce. I conveyed to Noyce my view that since the chip Intel was developing for Busicom was 4-bit, it was a limited-purpose device with limited market appeal. As an example, I noted that 4 bits are insufficient for representing alphabetic characters; in contrast, an 8-bit single-chip microprocessor would unleash a technological revolution. I concluded that if Intel completed the development of the 8-bit microprocessor chip, Q1 would be Intel’s first customer for that chip.
Noyce said that Intel would resume the development of the 8-bit single-chip microprocessor but would first need to obtain CTC’s consent to produce and sell a chip based on the Datapoint 2200 CPU. I told Noyce that I would provide Intel with the required consent. I flew to San Antonio, met with Phil Ray, who was the company’s president at the time, obtained the consent for Intel to produce and sell the 8-bit single-chip microprocessor based on the Datapoint 2200 CPU, and so informed Noyce.
As Bob Noyce had told me when we met, Intel resumed the development of the 8-bit single-chip implementation of CTC’s CPU after completing the 4-bit chip for Busicom. Busicom went out of business soon thereafter, and Intel obtained the rights to offer the 4-bit chip to the general market. As I had predicted to Noyce, the 4-bit chip proved to have a limited market. In April 1972 Intel introduced the 8-bit single-chip microprocessor, the 8008.
In December 1972, the first Q1 personal computer was installed at Litcom, a division of Litton Industries on Long Island. This was the world’s first installation of a microprocessor-based general-purpose computer.
The Q1 computer was:
- A user-dedicated, general-purpose computer system
- Built around an 8-bit single-chip microprocessor
- Equipped with random-access external information storage
- Supplied with the PL/1 high-level programming language
Early in 1973, Heinz Nixdorf, the president of Nixdorf Computer, invited me to visit his facility in Paderborn, Germany. I went to Paderborn with Dr. Ron Sommer, who was at the time vice president of Q1. Sommer, having received his PhD in mathematics from the University of Vienna, was fluent in German. The meeting resulted in a $40k/month software development agreement covering the Intel 8008 and the anticipated next-generation microprocessor, the Intel 8080.
Later in 1973, Q1 received an order, subject to acceptance tests, from the Israel Supply Mission in New York City for four Q1 systems, to be based on the anticipated successor to the Intel 8008. A Q1 computer with a pre-production Intel 8080 microprocessor was delivered during the first quarter of 1974 and later replaced by a computer with a production-level 8080 chip.
In 1975, the National Aeronautics and Space Administration (NASA) ordered Q1/Lite systems for all its eleven worldwide bases. Also in 1975, the Institute of Electrical and Electronics Engineers (IEEE) organized its first international conference on the microcomputer revolution, which took place in New York City. I understand that, on the recommendation of Bob Noyce, the IEEE invited me to organize and chair the opening session. It felt strange: I am neither an electronics engineer nor a computer scientist, and I had no prior association with the IEEE. This has been, to me, a validation of the unique problem-solving power of philosophy.
In 1979, the British National Enterprise Board (NEB) invested over ten million dollars to form a company for the production, marketing, and service of Q1 computers in Europe.
Returning to my core interest
I then recruited a president to replace me and returned to my main interest. I believe that I solved some basic problems of knowledge (e.g., my 2010 and 2012 US patents), but this claim awaits further confirmation and acceptance.
The subsequent growth of x86 personal computers
The introduction of the Intel 8080 prompted Bill Gates to leave Harvard and, with Paul Allen, form Microsoft. Microsoft developed software for the Intel 8080 and subsequent members of that family, the x86; its products came to include the Windows operating system. By the end of the 1980s, Wintel (the Intel x86/Microsoft Windows combination) had become the dominant personal computer engine in the world. Until then, Intel’s revenues had come from selling semiconductor memory chips; by the mid-1980s, Intel discontinued the memory business, and the x86 became its main source of revenue.
Re-testing the problem-solving capacity of philosophy
Currently, there is considerable functional overlap among user-dedicated computing devices, including desktop computers, laptops, tablets, cell phones, and smart wristwatches. This is a situation where the optimal next step cannot be reached through bottom-up research in computer science or electronics; it requires top-down conceptual reasoning. This situation presents another opportunity to confirm or disconfirm the methods used in my initial philosophical experiment.