Philosophy, Gene Editing and The Next Phase of Human Evolution

We can now control future human evolution. It is the most far-reaching technological development since humans branched from other primates some six million years ago. There does not exist at present a conceptual framework with which to address this (or any other) long-term global issue. Typically, a long-term global issue is a looming disaster. Consider a few examples: climate change, pollution of the oceans and air, nuclear proliferation and demographic upheavals. This record suggests that the worst outcome is also the most likely in the present case: that during this century, some national entities will introduce heritable enhancements to the human genome in their country – making the last millennium Homo sapiens’ last.

This book identifies philosophy as the root problem. It then outlines how current science requires updating the 300-year-old foundation of knowledge. It concludes by indicating how such reconstruction provides the ground for formulating normative social policies.

De-nuclearizing North Korea

What has prevented atomic conflict since the Second World War is the doctrine of Mutually Assured Destruction (MAD). Kim Jong-il discovered that this formula is inapplicable to a potential atomic conflict between a superpower and a small country. Instead, the superpower, having more to lose, is at a military disadvantage. This fact confers a negotiating advantage on the smaller country. However, such an advantage is limited to negotiations. In an atomic conflict, neither side wins.

In the interim, North Korea is subject to a punishing embargo. It desperately needs a source of income. It has one thing that many entities desire, so naturally, North Korea is in the business of selling atomic know-how. Some well-funded terrorist entities that seek to obtain atomic weapons are not geographically locatable. As a result, there is no way to counter an attack by such entities. China may be among the initial targets for such unilateral attacks. Such prospects are utterly unacceptable and would force China to prevent the opening of this Pandora’s box: this means the de-nuclearization of North Korea.

Some notes relating to the forthcoming publication of the revised The New Foundation of Knowledge (2017)

A. Philosophy

A1. The current state of affairs.

Philosophy is the most basic and most troubled field of knowledge. Present-day knowledge is still based on assumptions about human nature, introduced some 300 years ago, that are now known to be false. These assumptions underlie the normative disciplines, including ethics, law, politics and economics. As a result, human institutions are guided by policies that appear inconsistent with long-term survival.

A2. Bringing the foundation of knowledge up to date

A2.1. Psychological attributes are heritable. The theory of evolution led Darwin to conclude that heritability applies to biological as well as psychological attributes. Present-day science proved Darwin right on this point. Specifically, humans possess innate sensations, emotions and cognitions. For example, the newborn human (or rodent) likes sweet and dislikes bitter. This shows that both the sensations of taste and the associated likes and dislikes are innate. Furthermore, the innateness of the preference constitutes knowledge of the world prior to personal experience.

A2.2. The denial of heritable psychological attributes. Empiricism is the theory of knowledge based on the denial that sensations, emotions or cognitions are innate. Empiricism underlies all present-day theories of knowledge. By and large, the philosophic community has proved unable to set aside the 300-year epistemological legacy and does not acknowledge the scientific evidence.

A2.3. Truth and consequences. Innate commonalities of human nature are the ground for deriving universals of human conduct. The Universal Declaration of Human Rights, adopted by the United Nations, is a manifestation of the view that some moral principles are universal. But this is an exception. Most law is called “positive,” which means non-universal. In contrast, the legal doctrine of natural law is based on the view that laws should not be relativistic. Relativistic ethics and laws make it impossible to bridge the cultural chasm separating East and West in trying to address the long-term global issue of the future of humanity.

A3. Toward a dawn of a new day.

Updating the foundation of knowledge is the most important and most urgent problem confronting humanity now. The philosophic community ought to undertake the long-term challenge of making explicit the implications of the scientific evidence about biology, mind and brain. Once it does its job, philosophy would gain the recognition and authority it deserves.

The 1951 UN Refugee Convention is inconsistent with the US Constitution

Nations are sovereign: they have exclusive authority over a territory and its borders. A sovereign entity controls entry into and stays within its borders. The 1951 United Nations Convention Relating to the Status of Refugees, commonly known as the Refugee Convention, undermines sovereignty by creating the legal right of persons to claim asylum in countries other than their own. Such a claim is subject to two or three stages of review during which it is examined by the selected country. During this period, persons claiming asylum are entitled to the following rights:

  1. The right not to be punished for illegal entry
  2. The right to be issued identity and travel documents
  3. The right to freedom of movement in that country
  4. The right to access the courts
  5. The right to work
  6. The right to housing
  7. The right to education
  8. The right to public assistance.

In the US, children born to asylum claimants become citizens under the 14th Amendment. Such children are not deportable even if the parents’ claim for asylum is denied. Separating a child from his or her parents is not a humane or realistic option.

Apart from these considerations of principle, there is a looming reality. Colonialism in sub-Saharan Africa came to an end in the period following the Second World War. The United Nations introduced several programs aimed at improving health and economic self-sufficiency. These programs proved successful in the first aim but failed in the second.

The improved health led to a sharp drop in child mortality, producing explosive population growth. Food production did not keep up. As a result, migrating to more developed countries appears to be the best option. Some states in sub-Saharan Africa are not democratic, and their populations are deprived of human rights. Thus, they satisfy the UN criteria for persons entitled to asylum.

It is projected that by the end of the current century the population of sub-Saharan Africa will grow by some three billion persons. Many of them, if not most of them, would migrate to more developed countries.

This human flood could make citizens in developed countries into minorities. This extreme development is as grave as climate change. Like climate change, it is a manifestation of human un-wisdom.


Why this century is unlike any other

T. Philosophy and survival
T1. The state of the world.
T1.1. Climate change. From the evolutionary perspective species come and go; Homo sapiens is still a work-in-progress. Climate change exemplifies the fact that the long-term global consequences of technology are generally toxic and irreversible within this century, and it raises concerns about the challenge of surviving convergent natural disasters.

T1.2. Demographic trends. Some other long-term trends suggest that Western philosophy may be the implicit cause of self-destructive policies. For example, there is a question whether Western culture can survive the white populations of Europe and the United States becoming minorities this century (e.g. non-Hispanic whites in the US would become the minority among the newborn within a year).

T1.3. The prospect of controlling our future evolution. Biotechnology now makes it possible to introduce heritable enhancements in the human genome. If any national entity undertakes to do that, then, whether or not others follow, the last millennium would prove Homo sapiens’ last.

T2. Philosophy
T2.1. Updating the foundation of philosophy is a priority. Philosophy is the only part of knowledge that could have served as a survival manual in confronting the looming upheavals. But philosophy is not only the most fundamental part of knowledge but also the most troubled. This makes bringing the foundation of knowledge up to date a priority.

T2.2. A single factual issue. Neuroscience has recently established the fact that sensations are innate. For the last 300 years, the most basic assumption at the foundation of knowledge was the direct opposite – that no sensation is innate. John Locke (1689) introduced that assumption and concluded that the brain of the newborn is like a blank slate (tabula rasa).

T2.3. The challenge. It is now necessary to make explicit the epistemological implications of replacing the tabula rasa assumption with its direct opposite. This would be the most basic change in the foundation of knowledge since Locke introduced his factually false assumption.

T2.4. The philosophic community. Updating the foundation of knowledge would establish the central role of philosophy in guiding social policy. But it would take time before the philosophic community is ready to set aside its 300-year epistemological legacy. In the interim, the most pressing philosophical issue confronting humanity now is virtually terra incognita.

T2.5. The forthcoming revision of my 2017 book. The forthcoming revision of The New Foundation of Knowledge (2017) reviews the evidence for the innateness of sensations and provides an initial glimpse of the new epistemological landscape.

T3. Sensations are innate.
T3.1. The sensation of sound. The electrical stimulation of the cochlea elicits sensations of sound in normal-hearing and deaf subjects alike. The heard pitch is determined by the cochlear locus stimulated. This shows that heard sound is not a property of air vibration. Some children are born with a dysfunctional auditory nerve. They can be made to hear by an implant that electrically stimulates hearing-related brain loci (e.g. brainstem, thalamus, or cortex). This proves that heard sound is innate and elicited by the brain: it is neither a property of air vibration nor does it originate in the ears.

T3.2. Any sensation. In every sensory modality (e.g. vision, hearing, touch, taste or smell), the same type of electrical stimulus elicits the modality-specific sensation, as determined by the modality-specific area stimulated. This shows that the electrical stimuli themselves do not contribute the qualitative character of the resulting sensation; it is the stimulated brain loci that determine the qualitative aspect of the sensation. Thus, sensations are innate and are elicited by the brain.

T4. Innateness of sensations and consciousness: the first empirical proof that consciousness exists. Evolution stumbled on consciousness, and natural selection let it be. From an evolutionary perspective, the role of conscious knowledge is to improve survival. Yet, to date, all attempts to account for what consciousness is and what it does have failed. The reason for this failure is the denial of the fact that sensations are innate.
The physical is publicly observable. Innate sensations are private – alternatively termed subjective, phenomenal, or mental. Thus, our knowledge of the physical is an inference from the phenomenal. This conclusion confers epistemological priority on the phenomenal relative to the physical. The hope that physicalism could account for consciousness is not realizable.

T5. Spatiality and Ubiquity. Physical objects, such as triangular tiles, are locatable in space. The concept of triangularity is not. The phenomenal cannot be said to be located in space. It is ubiquitous.

T6. Some top-down implications
T6.1. Pain. The tabula rasa assumption presumes that pain originates in the body and is imported into the brain by afferent C-fibers. In fact, pain, like all sensations, is innate and is elicited by the brain. Based on the tabula rasa misconception, neurosurgeons performed numerous operations to disconnect the presumed source of pain from the brain, hoping to stop the pain. In many of those cases, the pain returned with a vengeance. The continued failure of medicine to effectively address chronic pain stems from this philosophical error.

T6.2. Light. Like all sensations, the sensation of light is innate. The electrical stimulation of the visual cortex elicits the visual sensation of spots of light, called phosphenes, both in normally seeing subjects and in the blind. On the basis of this fact, visual cortical prostheses were developed. Such prostheses are about to become available for the born blind. As in the case of auditory prostheses, it is best to implant the prosthesis during childhood. I expect that visual prostheses for the born blind will be demonstrated within five years.

Applied Philosophy and the Advent of the Personal Computer

Revised and extended 2018, October 7

K. An epistemological challenge
K1. Philosophy. Philosophy is the most fundamental and most troubled knowledge area. While I was a doctoral student in philosophy at the Graduate Center of the City University of New York, I found philosophy to be in a unique position to:

  • Identify aspects that are common to the different sciences
  • Resolve issues involving the interrelations among sciences
  • Provide unique top-down conclusions for particular fields
  • Derive conclusions of what ought to be done

There does not exist any single scientific discipline that can do any of the above. My view was outside the mainstream. Currently, analytic philosophy seeks merely to clarify problems rather than solve them. I therefore felt pressed to find a down-to-earth example to demonstrate the problem-solving power and plain utility of such top-down inferences. I chose information technology. The account below is mostly chronological and alludes only briefly to epistemological considerations. These considerations are addressed in the forthcoming revision of The New Foundation of Knowledge.

K2. Semiconductor technology. Following the invention of the transistor (1947), the integrated circuit was invented (1958). During the decade of the 1960s, there were five stages of halving the linear distance between transistors, each of which quadrupled the number of transistors per unit area. Thus, the five stages resulted in more than a thousand-fold increase in the number of transistors per unit area.

Transistors: 1 → 4 → 16 → 64 → 256 → 1,024

To the extent that the cost per unit area remained the same, the production cost per transistor dropped by the same thousand-fold factor. In 1968, a California company named Advanced Memory Systems was the first to develop a 1,024-transistor (1 kilobit) memory chip.
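The scaling arithmetic above can be sketched in a few lines of code. This is an illustrative sketch only; the function name `density_after` is mine, not drawn from any source.

```python
# Halving the linear distance between transistors doubles the count along
# each of the two dimensions, i.e. quadruples the count per unit area.

def density_after(stages: int, initial: int = 1) -> int:
    """Transistors per unit area after `stages` halvings of the linear pitch."""
    return initial * 4 ** stages

# The five stages of the 1960s reproduce the sequence in the text:
print([density_after(s) for s in range(6)])  # [1, 4, 16, 64, 256, 1024]

# With a constant cost per unit area, the cost per transistor falls
# by the same factor: 4**5 = 1024, i.e. more than a thousand-fold.
print(density_after(5))  # 1024
```

The same reasoning, continued for decades, is what is popularly summarized as Moore's law.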

K3. Time-sharing computer systems. During the 1960s, computers were used as a shared facility. The accepted wisdom was that data processing, like the generation of electricity, ought to be centralized and then distributed to end-users. In addition, electronic engineers shared the view that the solution to a given problem must be customized – that there is no panacea. This view led to the development of application-specific products such as word processors and scientific calculators.

K4. The advantage of generalizing problems and solutions. My view, to the contrary, was that a problem must be generalized before a solution is sought, and that the solution ought then to be generalized prior to any customization. Word processors, scientific calculators, fax machines and printers were, to me, conceptually outdated. By analogy, I believed that time-telling should be based on a user-dedicated wristwatch rather than a clock in the center of town – and ditto for computing.

K5. IBM. At that time, IBM was the dominant company in the computer field. Therefore, in 1967 I met with Jacques Maisonrouge, who was at the time the president of the IBM World Trade Corporation. I conveyed to Maisonrouge my conviction that there would be a transition from shared computing facilities to user-dedicated computers. I then proposed that IBM explore doing so. Maisonrouge arranged for me to meet J. C. R. Licklider. I did not know it at the time, but Licklider was a key promoter of the time-sharing approach to computing. The meeting proved pointless. IBM eventually entered the personal computer market in 1981. It was too late. By then, it had lost its leadership in the computing field. Subsequently, IBM sold its personal computer business to the Chinese computer company Lenovo. In his book Inside IBM – a personal story (1985), Maisonrouge acknowledged the importance of the personal computer, writing: “The most important change in the last few years has been the introduction of personal computers in homes, offices, and schools.” (p. 284). He did not, however, explain IBM’s failure to enter the personal computer field during the 1970s and its resulting loss of leadership.

K6. Datapoint.
K6.1. The Datapoint 3300 computer terminal. In 1969, Computer Terminal Corporation (CTC) of San Antonio, Texas, (later renamed Datapoint) developed a computer terminal, the Datapoint 3300. Seeking to raise $4 million by an initial public offering, they contacted the Wall Street firm Philips, Appel & Walden, which was at that time specializing in funding high technology companies. James (Jim) Walden, the managing partner, asked me to go to San Antonio to evaluate the technology of that company.

K6.2. My visit. In San Antonio, I had an extended discussion with Austin (Gus) Roche, who was the vice president for research and development at Datapoint. I suggested that the revolutionary advance in semiconductor technology would drive changes in how computing is done. I urged that CTC consider developing a personal computer.

K6.3. My recommendations. Specifically, I recommended that Datapoint:

  • Develop a computer to be located where the user is.
  • Check if the processor can be implemented on a single chip.
  • Make the computer user-dedicated.

K6.4. The Datapoint 2200 intelligent terminal. Roche responded saying that their next product would contain a computer. He thus accepted my first recommendation. Datapoint acted on my second recommendation and asked Texas Instruments (TI) and Intel for proposals for implementing the central processing unit (CPU) of the next product, the Datapoint 2200, on a single silicon chip. But the Datapoint 2200 was designed to be an intelligent terminal for processing data on a remote computer. Thus my third recommendation, that the next product be a personal computer, was not accepted at that time.

K7. Philips, Appel & Walden. On my return to New York, I told Jim Walden that I liked the company, their technical competence, and “can do” attitude but found that their product philosophy was conceptually obsolete. To my surprise, Jim Walden challenged me by saying that if I could do it the right way, then Philips, Appel & Walden would fund me. Despite misgivings, I accepted the offer. I formed a company naming it Q1 Corporation and recruited a core team.

K8. Intel
K8.1. Putting off the development of the Datapoint 2200 CPU chip. As Gus Roche told me when we met, Datapoint did develop a computer CPU and did ask TI and Intel for proposals to implement it on a single chip. Initially, Intel was uncertain whether it would be in its interest to design a chip implementing the Datapoint CPU. Intel was at that time in the memory-chip business, and there was a concern that producing and marketing a CPU chip would lead Intel to be viewed as a competitor by its memory-chip customers. Intel then shifted priorities from the CPU for the Datapoint 2200 to the development of an electronic calculator chip for Busicom of Japan.

K8.2. Shelving the development of the Datapoint 2200 CPU chip. TI, for its part, filed a patent application for the single-chip implementation of the 2200 CPU, which was eventually granted. However, it was unable to deliver usable chips to Datapoint in time. As a result, the Datapoint 2200 intelligent terminal was produced with its CPU implemented using discrete components. This, in turn, led Intel to shelve the project of developing the 8-bit CPU chip for Datapoint.

K8.3. My meeting with Robert (Bob) Noyce, the president of Intel. On hearing this, I flew to California to meet with Bob Noyce, who was the president of Intel at that time. I conveyed to Noyce my view that since a 4-bit chip is insufficient for representing alphabetic characters, the chip Intel was developing for Busicom would have a limited market. In contrast, I said that the 8-bit CPU that Datapoint had developed at my urging would revolutionize information technology, and that Q1 would be Intel’s first customer for that chip.

K8.4. Getting Intel the rights to a single-chip processor based on the 2200 CPU. Noyce said that Intel would resume the development of the single-chip 8-bit Datapoint 2200 CPU, except that it would first need to obtain the consent of Datapoint to do so. I told Noyce that I would provide Intel with the required Datapoint consent. I flew to San Antonio, met with Phil Ray, who was president of Datapoint at the time, obtained consent for Intel to develop and sell the single-chip processor based on the Datapoint 2200 CPU, and so informed Noyce. I expected that in the subsequent formalization of such an agreement Datapoint would receive some percentage royalty on the sale of the future 8-bit chip. This did not happen. Intel then developed the 8-bit single-chip processor, calling it the Intel 8008. As I had promised Noyce, Q1 became the first customer for the Intel 8008 single-chip 8-bit microprocessor.

K9. Q1
K9.1. The world’s first 8008 and 8080 computer installations. The Q1 computer was designed to be a user-dedicated general-purpose computer system. It was the first personal computer. It was also the first computer system to incorporate a single chip 8-bit microprocessor. In December 1972, the first Q1 personal computer was installed at Litcom, a division of Litton Industries in Long Island.

K9.2. Nixdorf Computer, Paderborn, Germany. Early in 1973, Heinz Nixdorf, the president of Nixdorf Computer, invited me to visit his facility in Paderborn, Germany. I went there with Dr. Ron Sommer, who was a vice president of Q1 and fluent in German. The visit resulted in a $40k/month software-development agreement for the Intel 8008 and the anticipated 8080 microprocessor.

K9.3. The New York City Israel Supply Mission. Later in 1973, Q1 received an order, subject to acceptance tests, from the Israel Supply Mission in New York City for four Q1/Lite systems, to be based on the expected second generation of the Intel 8008, later named the Intel 8080. A Q1/Lite computer with a pre-production 8080 microprocessor was delivered late in 1973 and later replaced by a unit with a production-level 8080 microprocessor.

K9.4. NASA. In 1975, the National Aeronautics and Space Administration (NASA) ordered Q1/Lite systems for all its eleven worldwide bases.

K9.5. IEEE. Also in 1975, the Institute of Electrical and Electronics Engineers (IEEE) organized its first international conference on the microcomputer revolution, which took place in New York City. I understand that, on the recommendation of Bob Noyce, the IEEE invited me to organize and chair the opening session. It felt strange: I am neither an electronic engineer nor a computer scientist, and had no prior association with the IEEE. Despite being an outsider, or perhaps because of that fact, I was able to see where the field was going – and where it could go – in a way that people inside the industry could not.

K9.6. The UK National Enterprise Board. In 1979, the National Enterprise Board of the British government invested over $11 million for the right to represent Q1 in Europe.

K9.7. Exit. I then recruited a President to replace me and returned to my main interest in making explicit the epistemological implications that sensations are innate – the direct opposite of the tabula rasa assumption.

K10. World dominance
K10.1. Microsoft. The introduction of the Intel 8080 prompted Bill Gates and Paul Allen to quit Harvard and form Microsoft. Microsoft developed software, eventually including the Windows operating system, for the Intel 8080 and subsequent members of that family, the x86. In time, Wintel (the Intel x86 together with Microsoft Windows) became the dominant personal-computer engine in the world.

K10.2. $1 billion? Until then, Intel’s revenues came from selling semiconductor memory chips. Intel later discontinued the memory business, and the x86 became its main source of revenue. Lamont Wood, in his book Datapoint (2012), wrote of the 8008 microprocessor that “In hindsight, it’s clear that this chip, through its direct descendants, was the foundation of the digital world.” In chapter 9, titled The Worst Business Decision in History, Wood writes that relinquishing the intellectual property rights for the Datapoint 2200 CPU to Intel was a $1 billion giveaway.

K11. A personal perspective
K11.1. Confirmation. To me, the sequence of events confirmed the power of abstract top-down reasoning. Some would argue that it was just improbable happenstance – that I merely happened to be in the right place at the right time. This can be put to a test. Currently, there is considerable functional overlap among user-dedicated computing devices, including desktop computers, laptops, tablets, cell phones and smart wristwatches. This is a situation where the optimal next step cannot be reached by bottom-up research in computer science or electronics. It requires conceptual top-down reasoning. If any computer company is interested, I would be willing to make explicit how my top-down considerations outline the next phase in information technology.

K11.2. Next. The more important challenge is to apply top-down thinking to more fundamental issues. I try to do this in the forthcoming revision of my book about the foundation of knowledge (2017).

Alroy, Daniel. The New Foundation of Knowledge. 2017.
Maisonrouge, Jacques. Inside IBM. 1985.
Wood, Lamont. Datapoint. 2012.

This century whites are due to become minorities in the US and EU

It is projected that whites in the EU and US will become the minority by the end of the century. In the EU, that transition is expected in the second half of the century. In the US, (non-Hispanic) whites will be a minority within a generation and, among the newborn, within a year.

In the EU, that prospect is a subject of intense discussion; see, for example, The Strange Death of Europe by Douglas Murray (2017). In contrast, the US is still in denial of the inevitable transition. For example, the media address immigration issues on a daily basis, but discussion of the impending transition is avoided.

The EU is the primary destination of asylum seekers. States in Sub-Saharan Africa cannot feed their rapidly growing populations. Many are under dictatorships and are known for human-rights abuses. Thus, immigrants from these countries satisfy the asylum-seeker criteria. The United Nations projects that by the end of the current century the Sub-Saharan African population will grow by some billions. At present, there is no real or conceivable mechanism to stem the expected tidal wave of migration. Add to this the fact that immigrants double in number within the initial two generations, while Europeans, like Americans, do not even reach replacement levels: the demographic shifts will be massive.

Murray in his book addresses only Europe, and only from the perspective of a journalist. However, the issue applies to Western culture on both sides of the Atlantic, and the root cause is not political, in that people did not vote for this consequence. Western culture has been dominant for the last 100 years. The only explanation for its current predicament is that it is the consequence of principles implicit in the philosophy subscribed to in the West. Philosophy is the only knowledge area that can address this failure of self-preservation. With authority comes responsibility. The philosophic community has let us down by retreating to the position of a spectator on vital issues of survival.

It is now imperative that the philosophic community confront the challenge of bringing the foundation of knowledge up to date, recognize for the first time the innate commonalities of human nature, and derive survival imperatives to guide policy.