Essays

June 23, 2016

 

Full Employment and the Struggle for Human Dignity       

 

In recent years innumerable studies, books, articles, conferences and seminars have been devoted to the topic, “The Future of Work”. One reason is the prospect that in the foreseeable future robots will be able to replace human beings in most activities presently carried out by human labor, leading to the question of how mass unemployment can be avoided. Another reason is the growing consciousness of the fact that the political instability in countries around the world is rooted to a large extent in the worsening objective and subjective situation of their working populations. The prospect of a “robotized” economy still lies far in the future, but it already influences the way people think about the present crisis.

 

There is no doubt that the rapid improvement of robotic technologies will have a profound effect on the role of human labor in practically all sectors of the economy. In contrast to traditional automation, which is mainly limited to a small repertoire of repetitive operations, robots are characterized by vastly greater flexibility. Robotic systems are becoming able to match many of the tactile and motor abilities of human beings: to orient themselves spontaneously and move about in various environments; to utilize a variety of optical, acoustical and mechanical sensors to recognize and manipulate objects and arrays of objects with respect to their positions, orientations and motions in three-dimensional space; to recognize and respond to human speech; and to improve their performance, integrate new information and adapt to new situations through various forms of “artificial learning”.

 

In a number of recent conference presentations former U.S. Secretary of Labor Robert Reich has highlighted his view of the economic challenges posed by the robotic revolution in the following way: Imagine a huge machine – Reich calls it the “iEverything” – which is capable of producing everything you could possibly desire. For example, you tell the machine that you want a car, and describe the kind of car you want. The car is immediately produced and delivered to you automatically, with no human labor involved. What is the problem? Reich asks the audience. His answer: Nobody will be able to buy the cars, because no one will have any means of earning money! Robotics will have eliminated nearly all industrial and service sector jobs. Reich notes that this scenario is futuristic, “but when more and more can be done by fewer and fewer people, the profits go to an ever-smaller circle of executives and owner-investors.”

 

This last remark points to the immediate, burning issues which are motivating much of the present discussion about the “Future of Work”, even when they are not explicitly mentioned. They have nothing to do with futuristic scenarios, but rather with the epidemic of political crises afflicting the world today, not only in the so-called developing sector, but also in the United States and Europe.

 

The recent political earthquakes in the United States are particularly revealing. Evidently the U.S. establishment – which thought it had firm control of the country – was not prepared for the degree of popular support that the anti-establishment candidates Donald Trump and Bernie Sanders have been able to mobilize in the population. As it became more and more clear that Trump was going to win the Republican nomination and that large parts of the Democratic Party base – especially young people – supported the “Sanders revolution” against Hillary Clinton and Wall Street, articles and commentaries began to appear, calling attention to the miserable situation of much of the working-age population in the United States.

 

Already on February 22, 2016, for example, an article appeared in the Huffington Post entitled “Hidden Unemployment Explains Rise of Trump and Sanders”. Author Alan Singer recalled that the official unemployment rate in the United States, according to the Bureau of Labor Statistics, is currently under 5 percent. But in his New Hampshire primary victory speech Donald Trump declared that the official unemployment numbers were phony, and said “The number is probably 28, 29, as high as 35.” Many journalists laughed at this statement. “But,” writes Singer, “the Trump unemployment numbers are not ridiculous and that may explain the rise of both Trump and Bernie Sanders in the Republican and Democratic Party Presidential primaries.” Singer notes that the official unemployment rate only includes people who do not have a job but want one and have actively sought a job in the last four weeks. “But if you include people who want to work and looked for a job during the previous year, discouraged workers who stopped looking because of economic conditions, and people who are working part-time but want to work full-time, (then) the unemployment rate rises to 9.9 percent. (But) even this number may be artificially low. In 1999, 85 percent of Americans age 25 to 54 were working. But the figure now is only 81 percent. If we count those missing workers as unemployed, the unemployment rate rises to over 12 percent. And when you add college students who want to work, older people who were forced into retirement, people on disability who would work if they could, and women with children who would work if there were adequate day care, the hidden unemployment rate for the United States is probably over 30 percent.”
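
To make the arithmetic behind these broader measures explicit, the following short Python sketch shows how each successively wider definition raises the measured rate. The figures are illustrative placeholders chosen only to mimic the orders of magnitude in Singer’s account; they are not official Bureau of Labor Statistics data.

# Illustrative sketch of how successively broader unemployment measures are
# built. All counts are rough placeholders, not official BLS statistics.

employed = 150_000_000                    # has any job
unemployed_official = 7_800_000           # jobless, searched in the last 4 weeks
discouraged_or_marginal = 1_800_000       # want work, searched within the last year
involuntary_part_time = 6_000_000         # working part-time, want full-time work
missing_prime_age = 5_000_000             # would work if the 1999 employment ratio still held

def rate(extra_jobless, extra_labor_force):
    """Unemployment rate after widening the jobless count. Groups that are
    already working (involuntary part-timers) widen only the numerator."""
    jobless = unemployed_official + extra_jobless
    labor_force = employed + unemployed_official + extra_labor_force
    return jobless / labor_force

print(f"Official rate:                {rate(0, 0):.1%}")
print(f"+ discouraged/marginal:       {rate(discouraged_or_marginal, discouraged_or_marginal):.1%}")
print(f"+ involuntary part-time:      {rate(discouraged_or_marginal + involuntary_part_time, discouraged_or_marginal):.1%}")
print(f"+ missing prime-age workers:  {rate(discouraged_or_marginal + involuntary_part_time + missing_prime_age, discouraged_or_marginal + missing_prime_age):.1%}")

With these placeholder counts the four figures come out near 5, 6, 10 and over 12 percent, which shows the mechanism behind the progression Singer describes without attempting to reproduce his exact numbers.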

 

The situation of the many Americans who are “privileged” to have a job is often not much better. A March 21, 2016 commentary in the Boston Globe by Roland Merullo focuses on the subjective condition of the masses of working poor in the U.S.: “Trump’s appeal is not primarily grounded in racism or anger. It’s primarily grounded in humiliation… I’m close to people who are working, and poor. There is a particular kind of humiliation involved in their lives, though many of them are too proud to use that word. They’re not hungry… But every hour of every day they’re shown images of people who have things they will never have. Virtually every TV show and Internet site offers ads featuring relaxed families … that own a home, have new cars, take cruise-line vacations, and use the kind of electronic gadgetry that would bankrupt the working poor...” Referring to the “political correctness” of the media, which seems more interested in the rights of transsexuals than in the problems of the majority of citizens, Merullo writes: “Imagine what it’s like to come home from working a job (or two jobs) you hate, that exhausts you, that leaves you five dollars at the end of the week for a child’s birthday gift, and hear someone call you ‘privileged’… So when they see protesters disrupting the speech of the candidate they hope can change their lives, and when they hear him say, ‘I’d punch that guy in the face’ – the kind of language they grew up with – and when they listen to him talk about the decent-paying jobs that were moved to China (something Trump says more often than any other candidate), is it really a surprise that these people go into the voting booth and cast a ballot for Donald?”

 

In a recent speech Robert Reich – who became a prominent supporter of Bernie Sanders – pointed to widespread economic insecurity as a key cause of the political upheaval in the United States. Two thirds of employed persons in the United States have no long-term work contracts. Many of them are working “on demand” on a short-term basis and do not know what they will be doing a year from now. Trade unions, which once played an important role in the fight for job security, now represent less than 7% of employees in the private sector. At the same time, the large section of the workforce that does not have the high-level education and specialized qualifications needed to get secure, well-paying jobs has no other option than low-paying jobs in retail shops, restaurants, elderly care, truck driving and the like.

 

In the 1950s and 1960s the situation was totally different. The “G.I. Bill” enacted at the close of World War II granted free college education and many other forms of support to returning soldiers. The expansion of U.S. industry provided the opportunity for a majority of working-class families – including black families – to rise into the middle class and to share in the so-called “American dream”. But today, after decades of deindustrialization, tens of millions of formerly middle-class working families are living under ghetto-like conditions in the notorious “Rust Belt” and elsewhere, impoverished and largely unemployed, without any prospects for the future. Formerly prosperous communities are now totally run down, their physical and social infrastructure collapsing, their youth devastated by drugs and crime.

 

Economic insecurity has drastically increased even among people who still belong to the middle class. It has an especially heavy impact on members of the younger generation, who find it difficult or impossible to plan their lives. Under pressure to obtain the best possible academic qualifications for competing on the job market, many of them are incurring huge debts to pay for their college or university education. The average debt of American college graduates has constantly increased, reaching a level of over $37,000 last year. But in many cases the debt is much higher – over $50,000 for graduates with a master of science degree and over $160,000 for graduates in the fields of medicine and the health sciences. This has led to a situation where, in 2015, over half of outstanding student loans in the U.S. were in deferral, delinquency or default. Many young people enter the job market in a condition of financial ruin. Where will they live? How will they pay their bills? How are they going to start a family?

 

All of this is happening in the nation which is supposed to be the richest in the world, whose economy has supposedly recovered from the financial crisis and now enjoys “robust growth”. But in fact the decline or even extinction of the traditional working middle class, and the drastic increase in economic insecurity, have been going on in all industrial countries. Needless to say, the situation of the so-called developing countries is much worse. Practically every nation of the world is today in a state of profound economic, social and political crisis. Where is the paradise promised by the advocates of financial globalization and other neoliberal policies? Nearly everywhere the crisis is related, directly or indirectly, to the failure to provide secure employment at modern, middle-class wage levels to the majority of the working-age population.

 

In this situation there is growing interest in the long-standing proposal that governments should provide some sort of unconditional, guaranteed income to all citizens. As is well-known, this idea has already been implemented, in one form or another, in a number of regions and countries, including of course Brazil. Although it was rejected in a recent referendum in Switzerland, the deepening crisis has broadened its base of support among “progressives” as well as conservatives, the latter seeing it as a way to eliminate the huge state bureaucracy connected with existing social systems. 

   

It is not my purpose here to discuss the pros and cons of a “guaranteed basic income”. Instead I want to ask a different question: What has happened to the goal of achieving full employment? Why can’t governments adopt policies that ensure – by direct and indirect means – that there will be a sufficient demand for labor and a sufficient supply of well-paying, productive jobs for the entire labor force? Why don’t governments adopt emergency measures, including the use of productive credit generation in combination with large-scale state investment in infrastructure and other key sectors of the economy, to create millions of productive jobs, to reverse the disastrous policies of deregulation, privatization and dismantling of social systems, and to restore the economic security of the population?

 

In this context it is unfortunate that large parts of the “left” internationally, including parties with a social democratic orientation, who are supposed to represent the interests of working people, have de facto abandoned the traditional struggle for full employment. One reason for this is evidently the belief that there is no realistic possibility today of overturning the basic “rules of the game” which have been established by financial globalization and other neoliberal reforms. At a conference on “The Future of Work”, held a week before the Swiss referendum on guaranteed basic income, the former Minister of Finance of Greece, Yanis Varoufakis, expressed this by declaring – in agreement, ironically, with the neoliberals on this point – that “the social democratic, New Deal paradigm is finished. It can never be revived.” According to Varoufakis, capitalism has entered a new objective stage of development in which the old-fashioned solutions have become obsolete. But no doubt the biggest reason why the idea of full employment has disappeared from the agenda is the belief, articulated by Robert Reich at the same conference, that in the foreseeable future the large-scale use of robots and robotic systems will cause a collapse in the demand for human labor. Only a small fraction of the workforce will be needed to supply the economy. In such a situation the idea of ever returning to full employment would obviously be a complete fantasy, and Reich does not even mention it.

 

Reich appears to be a decent person, but in my opinion his conclusions are based on a fundamentally flawed conception of economics. The problem is not only typical of professional economists like Reich, but is shared by most politicians and ordinary people. It becomes apparent when we ask ourselves: what is the goal of an economy? What is the purpose of economic activity? The obvious answer is that the purpose of economic activity is to fulfill human needs. But what are human needs? Taken seriously, this question demands a completely different sort of answer than (for example) listing types and amounts of goods and services corresponding to some chosen “minimum standard of living”. Contrary to the biases of nearly all “establishment” as well as leftist economists, it is impossible to define human needs without taking account of Man’s essential nature as a spiritual being – a being possessing creative mental powers of the sort that distinguish Man absolutely from all other living species. To live as a human being in the fullest sense requires that each individual’s creative powers be constantly exercised, realized “in actu”. “Human needs” therefore include everything required in order for people in society to live as human beings in the fullest sense.

 

Obviously the task of fulfilling human needs in this sense can never be achieved all at once, but only in an open-ended process of development. It also goes beyond the domain of what an economy by itself can accomplish. But what are the implications for economics, when we postulate that the purpose of an economy is to contribute as much as possible to that process? We begin to see clearly the fallacy underlying the idea that “in the future fewer and fewer people will be needed to produce what society needs”. In fact, what society needs are not only goods and services, but above all certain forms of human activity – activities through which the members of society can develop and realize their potential as beings “created in the image of God”. Hence the primary task of the economy is to “produce” those sorts of activities. How can this be done? By employing people to do them! More precisely: by guiding the economy – through suitable economic policies – along a trajectory of economic development which ensures a constant expansion of employment in the desired forms of activities, as well as the supply of the goods and services (including especially education) necessary to support that expansion.

 

I specify such a trajectory in my book on “The Physical Economy of National Development”. Its essential feature is to drastically increase the percentage of the workforce employed in scientific research and related technological development, in part through state sponsorship of science-intensive projects such as the long-term exploration and colonization of space. Far from being a threat to employment, “robotization” will help to accelerate this process, by freeing the labor force from routine, uncreative activities.  Supplying the hardware and infrastructure for scientific research – for the study of living Nature, for the exploration of the Universe on all scales -- will account for an increasing portion of the total demand for manufactured goods. Employment in manufacturing will be concentrated more and more on the development, engineering and production of “one of a kind” products required for scientific experiments and for prototypes of new technologies intended for use in various sectors of the economy. Such work is a labor-intensive and science-intensive activity demanding large amounts of creative problem-solving and innovation. This is an area in which small and medium-sized enterprises have unique advantages, and their economic importance will grow.

 

With the economy structured in the indicated way, the growing expenditure of manpower and resources for science-related activity will be compensated by the increases in physical productivity which will be constantly generated as a byproduct of the growth of scientific knowledge and its applications in technology. I call this the “Knowledge Generator Economy”. Naturally, a revival of classical music, art and poetry has an essential role to play, especially for young people, because of its spiritual content and because it makes people more creative, rather than more stupid as does present-day popular culture. In summary: The goal is to transform the development of human knowledge and human creative capabilities into the main generator of employment, investment and demand!

 

The neoliberal world order is collapsing, and with it the authority of the ideologies and institutions upon which that world order has been based. The political earthquakes occurring around the world make possible radical changes in policies. Therefore it is especially urgent to correct the false assumptions and habits of thinking about economics which block the way toward solving the crisis, and could make it even worse. The goal I have sketched above may seem very far away at the moment, but it can function as an indispensable “star on the horizon” for guiding economic policies in the right direction.

 

                                                                                                                                                                    Jonathan Tennenbaum

 

  

May 22, 2016

 

Mass Participation in Scientific Research – Antidote against Cultural Decadence?

 

In my forthcoming book on the “Physical Economy of National Development” I propose that governments and other relevant public and private institutions should adopt a policy of engaging all, or a large percentage, of the population directly in scientific research. Most people would participate as volunteers or “hobby scientists”, taking part in important large-scale scientific projects in areas such as astronomy (including planetary research), biology, medicine and the Earth sciences. If organized in a suitable way, this policy can have enormous beneficial effects in cultural, social and economic terms.

 

Perhaps the most important benefit, in the longer term, is to reorient society away from the accumulation of material wealth per se, toward the expansion of human knowledge as a goal in itself. Ironically, such a reorientation does not at all mean that the economy will converge on a state of “zero growth”, as many in the environmentalist movement demand. On the contrary! The pursuit of knowledge requires more and more human exploration of the Universe, more and more human beings to participate in this process, and an economy which can sustain a rapid increase in the scale of scientific activity in every direction. Conversely, the acceleration of scientific and technological progress, which could be realized on the basis of a much larger participation of the population in scientific activity, would provide the basis for a new era of real physical-economic development. The expansion of scientific activity – including space exploration, for example – will become the main driver for investment, demand and employment.

 

At first glance this idea might seem to be just a utopian dream. But strong evidence for its feasibility can be seen in the remarkable successes of recent “Citizen Science” initiatives in the United States and a number of other countries, as well as the growth of popular interest in projects such as the colonization of Mars. In the context of Citizen Science, millions of people have contributed, via the internet, to the analysis and evaluation of scientific data in a variety of fields. In astronomy, for example, “citizen scientists” are helping to analyze images taken by the Hubble and Spitzer space telescopes, by interplanetary probes and by ground-based astronomical instruments. The tasks include, for example, searching for pulsars, planetary discs and extra-solar planetary systems; classifying galaxies, star clusters and other objects; and identifying specific structures on the surfaces of the Moon and Mars. The second main area of Citizen Science today is biology and medicine, where activities include characterizing cancerous and precancerous cells in micrographs of tissue samples, and identifying the species and forms of behavior of animals, insects and microorganisms from video recordings.

 

From the side of the scientific community, recruiting large numbers of volunteers to assist in scientific work is becoming more and more a necessity, because the amount of data generated by present-day scientific instruments is so enormous that it cannot be adequately processed and evaluated by the professional scientists themselves, even with the help of modern computers. Despite all the progress in artificial intelligence, the visual abilities and judgment of human beings still cannot be replicated by computers, even at the level of discovering certain sorts of patterns. Another reason why millions of volunteers are needed is that the number of professional scientists is much too small to constantly monitor various processes in Nature over long periods; as a result, many important events are missed. This is particularly true in astronomy and the Earth sciences.

 

The most common form of Citizen Science is organized using special interactive websites. Volunteers register themselves, choose the project they wish to participate in and then are given a short introduction to the relevant scientific area, the goal of the research project and how they can contribute to it. After an initial tutorial, participants are presented with real data, typically in the form of interactive images and videos, and then answer questions in an interactive questionnaire.  The answers are processed, stored in data banks, and utilized by professional scientists for research. According to the results, new Citizen Science projects can be launched.
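
As a rough sketch of the data flow just described, the workflow can be pictured as follows. The names, fields and voting threshold in this Python sketch are purely hypothetical assumptions and do not correspond to any particular Citizen Science platform.

# Hypothetical sketch of a Citizen Science workflow: a volunteer completes a
# tutorial, classifies real data items, and the independent answers are
# aggregated into a consensus that professional researchers can use.
# All names and the voting threshold are illustrative assumptions.

from dataclasses import dataclass, field
from collections import Counter
from typing import Optional

@dataclass
class Task:
    image_url: str                 # e.g. a galaxy image or a tissue micrograph
    question: str                  # e.g. "Is this galaxy spiral or elliptical?"
    answers: Counter = field(default_factory=Counter)

@dataclass
class Volunteer:
    name: str
    completed_tutorial: bool = False

def classify(volunteer: Volunteer, task: Task, answer: str) -> None:
    """Record one volunteer's answer to one task."""
    if not volunteer.completed_tutorial:
        raise ValueError("The introductory tutorial must be completed first.")
    task.answers[answer] += 1

def consensus(task: Task, min_votes: int = 5) -> Optional[str]:
    """Return the majority answer once enough independent volunteers have responded."""
    if sum(task.answers.values()) < min_votes:
        return None                # not yet enough responses for the researchers
    return task.answers.most_common(1)[0][0]

# Example usage
task = Task("https://example.org/galaxy_0001.png", "Spiral or elliptical?")
alice = Volunteer("Alice", completed_tutorial=True)
classify(alice, task, "spiral")
print(consensus(task))             # None until at least five volunteers answer

The essential design point is redundancy: because each task is answered independently by many volunteers, the aggregated classifications can reach a reliability useful for professional research even though no single answer is guaranteed to be correct.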

 

The number of projects is constantly increasing, and the Citizen Science websites are becoming more and more sophisticated, but so far the tasks carried out are extremely simple and engage the cognitive abilities of the participants only to a very limited extent, typically remaining at the level of identifying and classifying various sorts of objects in video images. My proposal is much more ambitious in terms of the intellectual level and intended scale of participation.

 

As far as the number of participants is concerned, the potential is virtually unlimited. This is most obvious in the case of astronomy: more than an estimated 12 billion galaxies are presently within the range of observation of the Hubble Space Telescope alone. I wrote in my book: “Thus, there are more than enough galaxies for each living human being on Earth to ‘adopt’ one of them for study. (They could also be dedicated to a child at birth, or as a birthday present!) The same applies to the approximately 100 billion stars within our own galaxy, the Milky Way…” I go on to note that “with the establishment of a large network of astronomical instruments utilizing the most advanced technologies and detector systems, it will become possible not only to investigate each of these objects in detail, but also to carry out observations and measurements on millions of objects in parallel… In this way millions of people on Earth can individually study their choice of objects via data links to space observatories.”

 

Going beyond the very simple-minded tasks of today’s internet-based Citizen Science programs poses a more difficult challenge in many respects. But wonderful models are provided by the long history of amateur science and the activity of millions of amateur (or “hobby”) scientists in the world today, many of whom have attained a high level of scientific knowledge and technical expertise, and are “networked” with professional scientists in a variety of projects. Countless amateur science clubs exist, in which resources are pooled to purchase sophisticated equipment, and information is shared and discussed in meetings and seminars. Thanks to technological developments such as CCD detectors and CCD cameras, amateur astronomers today can carry out observations with their own instruments which would have been difficult or impossible for professional astronomers even 20 years ago. The passionate enthusiasm of amateur astronomers is legendary. A recent example, from the technical side, is a truck driver from Utah, USA, Mike Clements, who in 2013 constructed the world’s largest amateur telescope, more than 10 meters tall and with a 180 cm mirror. Another case is the discovery and imaging of anomalous structures in the Martian atmosphere – so-called “plumes” – by Wayne Jaeschke, who works as a patent lawyer during the day and spends most evenings making astronomical observations with his own telescope. (Jaeschke has co-authored a scientific paper on this subject together with professional astronomers.)

 

Another area with a long tradition of impassioned amateurs is the study of living nature, including botany, zoology, entomology, ornithology, as well as microbiology. In recent years, the rapid advances in biotechnology and related instrumentation have made it possible for amateurs to build sophisticated home laboratories able to carry out experiments in molecular biology, biophysics and other fundamental areas of biology. The prices of sophisticated equipment such as polymerase chain reaction (PCR) thermocyclers have dropped to a level where they are affordable to many amateur biologists.

 

The growing sophistication of equipment available to amateur scientists in all fields, and the growing possibilities of networking with professional scientists, create a situation in which amateurs require – and desire! – more and more advanced scientific knowledge and technical skill. This is a very important phenomenon, because here the motivation to study is not to earn money or make a career, but rather a passionate interest in scientific activity per se. At the same time, the demand for scientific equipment intended for use by amateurs can become an increasingly significant economic factor – potentially even more important than many traditional categories of consumer goods, if the percentage of amateur scientists in the population grows sufficiently.

 

In my opinion there is no doubt that, with appropriate policies and actions by governments and relevant institutions, the number of amateur scientists and people involved in various improved forms of interactive citizen science could be very rapidly expanded, to a level of 50% or more of the population over the next 20 years. In future articles I intend to make specific suggestions for how this goal could be achieved. But in the remainder of this article I want to briefly call attention to its political implications.

 

A key motivation for promoting mass participation in scientific research today is to address the problem I identified in an earlier commentary entitled “Stupidity is the Enemy”. In practically all societies up to now, most of the population has existed in a condition of cultural backwardness, while serious intellectual activity has been limited to a relatively tiny minority. Today this problem has taken on monstrous dimensions due to the pervasive influence of electronic media, the entertainment and sports “industry” and other forms of commercial mass culture. The problem embraces every section and every social class of the population.  Symptomatic is the growing addiction of youth and even young children to video games, action films, drug-like states induced by rap and heavy metal music, and other types of intense sensual stimulus provided through digital technologies. Another symptom is the fascination with aggression, brutality and violence and the growing hegemony of an “anti-aesthetic” of the ugly, the grotesque and the perverse, in contrast to the classical orientation to natural harmony and beauty.  

 

There is much debate about how much danger these tendencies pose to today’s society and even to the future of civilization. The key problem, from my standpoint, arises from the inseparable relationship between emotionality and thinking. To think rationally, to distinguish between reality and fantasy, to search for the truth, to distinguish between good and evil and to seek to do good – all of these things require an investment of emotional energy, and require that people attach positive emotion to these tasks. But in today’s dominant cultural environment, people’s emotional energies are channeled in the completely opposite direction. The history of the Roman empire teaches us about the political consequences when popular culture is dominated by “panem et circenses” (“bread and circuses”). In this condition the masses of the population become complicit in their own oppression and exploitation. Michael D’Antonio’s recent biography of Donald Trump (“Never Enough”, 2015) provides valuable insights into the effects of today’s version of “panem et circenses”. Among other things the book highlights the profound transformation of U.S. society which occurred from the 1960s on under the impact of the mass media and entertainment industry. The phenomenon of Donald Trump cannot be understood without looking at the increasingly close interconnection between the media and entertainment industry, politics, real estate and finance in the post-1960 period. This occurred against the background of “post-industrial” economic policies which drastically reduced the proportion of the U.S. workforce involved in physical production and related technical activity. The deindustrialization process greatly weakened the population’s orientation to reality and causality, while more and more emotion became attached to mindless consumerism and the fantasy world of movies and television.

 

 How could mass participation in scientific research change this situation?  Mainly by helping to reorient the emotional life of the population toward the real Universe, toward their own intellectual development, and restoring a sense of personal value and dignity by giving people the chance to contribute something substantial to the knowledge and future of humanity; generating a shared experience of the joy of scientific discovery which could rapidly spread to embrace a large part of the population, across national borders; reawakening natural curiosity; evoking a sense of the inexhaustible richness and beauty of Nature, far beyond any “virtual reality” that human beings could ever produce or imagine. 

The passionate enthusiasm of amateur scientists, and the enthusiastic response of hundreds of thousands of ordinary people to the first internet-based Citizen Science projects – which greatly astonished the scientists who launched them – give us a first glimpse of the cultural transformation that could be unleashed by the policy I have proposed.

 

The biggest obstacle to accomplishing the desired cultural transformation, in my opinion, is not the attitude of the population, but rather the problematic situation within science itself. Put very briefly: since the early decades of the 20th century (or in some respects even earlier) European science began to lose its orientation toward classical philosophy, and became more and more pragmatic and technical in character. The German philosopher Edmund Husserl characterized this process as “Sinnentleerung” – an “emptying-out of meaning”. Connected with this is a serious weakening of the revolutionary spirit of science, the tendency to make exaggerated claims about the extent of present-day knowledge, and the emergence of a “scientific establishment” which tends to suppress the work of creative thinkers who challenge the so-called “scientific consensus” (see my article, “The First Detection of Gravitational Waves: A Triumph of Science and Technology – But What Does It Mean?”). Among other things this situation has led to false and misleading forms of popularization of science. The result is to make it much more difficult to transmit the true spirit of science to the general population.

 

Nevertheless I am optimistic. In my book on the “Physical Economy of National Development” I emphasize the classical work of Alexander von Humboldt, “Cosmos – Sketch of a Physical Description of the Universe”, as a key reference-point for the approach needed to recruit masses of the population to scientific work. I wrote:

 

“‘Cosmos’ was enormously popular in its time and one of the most influential books ever written on science. It was directed both to scientists as well as to a broad educated audience. A literary masterpiece embodying much of the spirit of Renaissance humanism, Alexander von Humboldt’s ‘Cosmos’ conveys a conception of science, Nature and Man diametrically opposed to the alienated thinking prevailing in popular culture and among most scientists today… The book is unique in that it combines a great part of the detailed scientific knowledge of his time in the fields of astronomy, geography, geology, zoology and botany, together with a historical account of the development of human conceptions of Nature. This development is portrayed both in terms of the striving for objective knowledge as well as the subjective/aesthetic dimension expressed in poetry, literature and the relationship of culture and history to the natural environment. ‘Cosmos’ is written with the boundless enthusiasm of a great scientific thinker and world explorer striving to awaken the thirst for knowledge and the enjoyment of Nature in the minds of its readers.”

 

                                                                                                                                                                          Jonathan Tennenbaum

 

 

 

March 31, 2016

 

The First Detection of Gravitational Waves: A Triumph of Science and Technology – But What Does It Mean?

 

The February 11 announcement of the first successful detection of gravitational waves by the LIGO Observatory in the United States is joyous news, and a welcome contrast to the rather depressing panorama of world events in the preceding period. This breakthrough provides a good occasion to reflect upon the strengths and weaknesses of science in our present era.

 

Let us look first at the strong side. Thanks to the decades-long efforts of physicists, astronomers and engineers from countries around the world, a revolutionary new type of instrument has been created for exploring the Universe – an instrument able to “see” a form of radiation whose physical nature and characteristics are fundamentally different from those of electromagnetic waves and cosmic rays, which have been the basis for practically all astronomical observations until now. In the long term this breakthrough promises to have an impact on the development of astronomy comparable to that of the emergence of optical telescopes four centuries ago. Indeed, now that LIGO has provided a “proof of principle” for the detection of gravitational waves, more measuring stations can be added, on the Earth and later in space, creating more and more powerful “gravitational telescopes”.

 

A great deal of information is available concerning how the first gravitational wave signal was detected, and concerning the presumed source of the signal at an estimated distance of 1.3 billion light years from the Earth. The signal, which lasted only about two-tenths of a second, was identified through the analysis of data from two independent detectors located in the U.S. states of Louisiana and Washington. The relevant signal was recorded on the morning of September 14 last year, but the formal announcement was made only after a painstaking process of analysis and efforts to reduce the probability of error to as close to zero as possible.

 

Happily, in contrast to many developments in modern physics, which are virtually inaccessible to people who lack specialized knowledge and mathematical training, the basic ideas behind LIGO are relatively simple and can be understood by a much wider audience. This includes the principle of interferometry; the method of comparing signals from interferometers at distant locations; and the basic strategy for “picking out” the extraordinarily weak signal of a gravitational wave from raw data containing large amounts of noise caused by a variety of physical mechanisms and sources inside and outside the detectors. A crucial role was played by the use of physical models to predict the characteristics of the gravitational wave signals that would be generated by relevant types of astrophysical events. As a simple analogy: it is much easier to identify the voice of a specific person in a crowded and noisy room if we know in advance what the voice sounds like and what the person is saying.
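
The “known voice” analogy corresponds to the technique of matched filtering: cross-correlating the noisy detector output with a predicted waveform template and looking for a pronounced peak. The following toy Python sketch illustrates the idea with made-up numbers; it is only a conceptual illustration and in no way reproduces the actual LIGO analysis pipeline, which works with whitened data, large template banks and measured detector noise spectra.

# Toy illustration of matched filtering: recover a weak, known waveform
# ("template") buried in much stronger noise by cross-correlating the noisy
# data with the template. Purely illustrative; not the real LIGO pipeline.

import numpy as np

rng = np.random.default_rng(0)
n = 8192                                   # samples of simulated "detector output"

# A chirp-like template: an oscillation of rising frequency, loosely
# resembling the waveform expected from two merging compact objects.
m = 1000
k = np.arange(m)
template = np.sin(2 * np.pi * (0.01 + 0.00005 * k) * k) * np.hanning(m)

# Simulated detector output: the template, weak and hidden at an unknown
# offset, plus noise whose amplitude is much larger than the signal's.
true_offset = 3000
signal = np.zeros(n)
signal[true_offset:true_offset + m] = 0.6 * template
data = signal + rng.normal(0.0, 1.0, n)

# Matched filter: cross-correlate the data with the template at every offset
# and take the location of the strongest response.
corr = np.correlate(data, template, mode="valid")
estimated_offset = int(np.argmax(np.abs(corr)))
print("true offset:", true_offset, " estimated offset:", estimated_offset)

With the template known in advance, the correlation peak stands out even though the signal is invisible in the raw time series – which is exactly why model-based predictions of the expected waveforms were so important for the detection.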

 

For readers with a basic knowledge of physics and astronomy I would very much recommend the article “LIGO and the Detection of Gravitational Waves” by Barry C. Barish and Rainer Weiss, two of the leading scientists in the project (http://www.fcaglp.unlp.edu.ar/~observacional/papers/PDFs/obs_grav-waves_ligo_barish_b.pdf). This article, written back in October 1999, explains the basic approach and strategy behind the design of LIGO. The “official” scientific report on the first observation of gravitational radiation, just published in Physical Review Letters (116, 2016), can be found at the address http://www.ligo.org/science/Publication-GW150914/index.php.

 

It is important to point out that, although the successful detection of gravitational waves was a brilliant achievement of technical ingenuity, no new scientific discovery was involved. At most, the detection of gravitational waves can be seen as a welcome corroboration (although not a proof) of the scientific theories which were the basis for designing the LIGO observatory and for processing and interpreting the data. This includes above all the General Theory of Relativity and Einstein’s original 1916 prediction of the existence of gravitational waves.

 

On the other hand we have reason to expect that future gravitational astronomy will lead to the discovery of anomalies that do not fit into present physical theories. How will the scientific community react to such anomalies? This question brings up the weaker side of science in our present era.

 

There is unfortunately a great discrepancy between the spectacular development of technology over the last 100 years, and the relative lack of progress at the fundamental level of physics. The last true revolution in physics occurred nearly a century ago, with the discoveries of quantum mechanics and the theory of relativity. Despite a vast development of theoretical physics over the last 100 years, the basic foundations of present-day physics were already established by the 1930s and have remained essentially the same since then.

 

The clearest symptom of stagnation at the fundamental level is the tendency for many scientists to overestimate the scope of validity of present-day scientific theories and to make exaggerated claims concerning how much we really know today (with any degree of certainty) about our Universe. This tendency is particularly strong in astrophysics and in cosmology -- two fields most directly concerned with gravitational waves and the future interpretation of data obtained from detection of gravitational waves.

It is not uncommon to hear astrophysicists and cosmologists describe details of the birth and age of the Universe, about what happened in the first billionths of a second after the Big Bang – when the Universe supposedly consisted of a plasma of quarks and gluons – as if they were talking about established facts. Sometimes the impression is given, especially in presentations to general audiences, that most of the fundamental questions about the origin and development of the Universe have already been answered, and that only details remain to be worked out. It is not sufficiently emphasized that the various assertions concerning the history of the Universe, concerning black holes, dark matter, quarks and gluons etc. all have a strongly hypothetical character; they are derived from theoretical models which involve far-reaching assumptions having only a limited basis in empirical reality. Terms such as “Standard Model” or “consensus view” tend to distract attention from the fact that some of the most distinguished scientists in the relevant fields have rejected these theoretical models and assumptions.

 

In their book “A Different Approach to Cosmology” the celebrated astronomers Fred Hoyle, Geoffrey Burbidge and Jayant Narlikar made fun of the conformist behavior of scientists, with a picture of a crowd of geese all running in the same direction. Speaking at a conference in 2005, Geoffrey Burbidge described the situation with reference to theories of the origin of the Universe:

“I hope that by now that I have provided enough evidence for a reasonable person to conclude that there is no particularly compelling reason why one should so strongly favor a standard model universe … apart from the fact that it is always easier to agree with the majority rather than to disagree. This sociological effect turns out to be actually extremely powerful in practice, because as time has gone on young cosmologists have found that if they maintain the status quo they stand a much better chance of getting financial support, observational facilities and academic positions, and can get their (unobjectionable) papers published…”

The French astrophysicist Jean-Claude Pecker warned against the hubris of modern cosmologists:

 

“And we would pretend to understand everything about cosmology, which concerns the whole Universe? We are not even ready to start to do that. All that we can do is to enter in the field of speculations. So far as I am concerned, I would not commit myself to any cosmological theory… Actually, I would like to leave the door wide open.”

 

A similar situation applies to a certain extent to the subject of black holes. The presumed source of the signal detected by LIGO last September is a pair of black holes orbiting each other, losing energy by radiating gravitational waves and finally merging into a single object. The form of the resulting signal was predicted with the help of mathematical models, and the prediction was used in looking for the “signature” of a gravitational wave within the noisy raw data generated by LIGO. The fact that a signal with exactly the expected form was found adds to the existing body of astronomical evidence for the real existence of black holes in our Universe. But the famous cosmologist Stephen Hawking, for example, believes that their physical nature differs fundamentally from what is predicted by “standard theory”. In a 2014 paper Hawking writes: “There are no black holes – in the sense of regimes from which light can't escape to infinity.” The Indian physicist Abhas Mitra, head of Theoretical Astrophysics at the famous Bhabha Atomic Research Centre, states in a 1999 paper: “Thus we confirm Einstein’s and Rosen’s idea … (that) Schwarzschild Singularities (i.e. Black Holes – JT) are unphysical and cannot occur in Nature.” Mitra considers that the “candidate black holes” identified by astronomers are simply massive compact objects of a certain type. In an interview with the Times of India Mitra remarked: “Many Nobel laureates too have been struggling to resolve this paradox (the so-called Black Hole Information Paradox of Hawking – JT), but they want to keep black holes alive. …You see, black hole is one of the biggest physics paradigms for almost 100 years with thousands of celebrity professors, researchers, Nobel Laureates having personal stake. … If my papers were wrong, they would have torn me apart and feasted like vultures on me. … Many Indian academicians desire that maybe someday somebody from the West would do that and they would be relieved of their moral guilt of ignoring me.”

 

The purpose of this commentary is not to take sides in scientific controversies. Our main point is this: Scientists must never forget that the real Universe is not constrained to follow the laws of physics as we know them today. On the contrary, we can be sure that every real process in Nature involves physical principles which Man has not yet discovered. In the progress of human knowledge we are always at the beginning! Thus, when attempting to interpret data from the detection of gravitational waves, scientists must keep in mind that present-day physical theories will inevitably become obsolete in the future, as a result of new, fundamental scientific revolutions. The most important task of the emerging gravitational astronomy is exactly to help lay the basis for such revolutions -- rather than merely filling in facts in some “standard theory” which future generations of physicists will laugh at.

 

                                                                                                                                                               Jonathan Tennenbaum

 

 

 

Note: The following text was written in September 2016. Since then SpaceX has had a series of successful launches. Nevertheless, I think the essential points in the article, concerning the present state of space technology, are correct. – JT

 

Remarks on the Recent Explosion of a Falcon 9 Rocket at Cape Canaveral

 

On September 1 a Falcon 9 rocket of the company SpaceX exploded on the launch pad during fueling for a static firing test. The spectacular explosion completely destroyed the rocket and its satellite payload, whose launch had been planned for September 4. This is the latest in a string of failures which have afflicted the ambitious efforts of SpaceX to greatly reduce the costs of space missions through the development of new generations of launch vehicles. On June 28 last year a Falcon 9 exploded in flight as a result of a structural failure. Earlier that year, on February 11, a Falcon 9 first stage was destroyed in a failed attempt to land vertically on an ocean platform after completing its boost mission. The explosion on September 1 is a particularly hard blow to the plans to use Falcon 9 vehicles for launching manned missions into space.

 

Spectacular failures are of course not uncommon in the history of rocket technology, and the lessons learned from them have contributed much to improving the reliability of launch systems. Also, despite the recent failures, the Falcon 9 has carried out 28 successful unmanned missions, including the transport of supplies to the International Space Station as well as launching communication satellites into orbit. Nevertheless, the difficulties experienced by SpaceX provide a useful occasion to reflect upon the present state of space technology, and of science and technology generally.

 

There is no reason to doubt that travel into space will one day become as normal and routine as passenger air travel today. Overall, human beings have already spent a total of more than 133 man-years in orbit. The Russian cosmonaut Gennady Padalka has spent a record total of 879 days – nearly two and a half years – in orbit, including a mission on the Russian space station Mir in 1998, followed by four longer stays on the International Space Station in 2004, 2009, 2012 and 2015. Three cosmonauts have remained in orbit continuously for a year before returning to Earth. The problems of living and working for substantial periods in an orbiting station under weightless (or microgravity) conditions have to a large extent been solved. The International Space Station (ISS) has been continuously manned since the year 2000. A total of 244 people have lived on the ISS, many of them visiting the space station two or more times. Up to now there have been over 80 flights carrying passengers from the Earth to the ISS. Since the retirement of the U.S. Space Shuttle in 2011, travel to and from the space station has been done exclusively with Russian Soyuz launchers and reentry vehicles, now operating almost with the regularity of a “space railroad”. There have been 94 unmanned missions delivering supplies and equipment to the station. SpaceX has already carried out 7 of these unmanned resupply missions with its Falcon 9 vehicle, and is striving to become a main transporter of passengers as well as cargo to the ISS in the future. That includes capturing a projected commercial market for “space tourism”. Already seven “space tourists” have spent time on the ISS, taken there by Russian Soyuz vehicles and paying $20–40 million for the exciting trip.

 

All of this could easily lead to the impression that human travel into Earth orbit has already become a routine affair. But in reality this is very far from being true. The critical issue for space travel today -- and the main reason why it is still very far from “routine” – is transport to and from orbit, which today still involves a combination of very high cost and significant risks to the passengers.

Both problems are connected above all with the extraordinary complexity of orbital launch vehicles and the fact that critical components of these systems must operate flawlessly under extreme, rapidly changing conditions – often near the farthest limit of what can be technically mastered – in terms of extremes of temperature, pressure, mechanical loads and stresses, vibration, aerodynamic effects and so on. The design constraints are extremely severe, especially given the need to minimize the weight of the vehicle. There are many points in the system where a serious malfunction will lead to catastrophic results within seconds or fractions of a second, with little or no time for correction. Given that a manned orbital launch vehicle is launched with hundreds of tons of fuel and liquid oxygen on board, in close proximity to each other, it is not surprising that malfunctions tend to end in explosions. Naturally there tends to be an inverse relationship between cost and risk. Given all these factors, the progress of human space flight from Yuri Gagarin’s flight 55 years ago up to the International Space Station today has been a very, very remarkable achievement. There has been a price in human lives, however. 546 human beings have been in orbit, but 18 people have died during transport to or from orbit. From a purely statistical point of view that would imply a probability of more than 3% of not surviving the trip. Would you decide to go?

 

Russia’s Soyuz orbital launch vehicles are doubtless the most reliable that have been developed so far, functioning presently as “workhorses” for manned space activity internationally. Their high reliability is connected with the very long experience in building and operating rockets with the same basic design – going back to Soviet ICBM development in the 1950s – and the unparalleled number of launches of vehicles utilizing that same basic design. The total of more than 1700 manned and unmanned missions includes over 870 launches of vehicles belonging to the official “Soyuz” family, including the ones that have transported all crew members to the ISS since 2010, plus a large part of the cargo supplies to the station. Since the first launch of a Soyuz vehicle in 1966 there have been 22 failed missions, amounting to a success rate of about 97.4%. Only one failure occurred with a manned mission, when a Soyuz vehicle exploded during countdown in 1983. Fortunately, the crew survived, thanks to a rocket-powered launch escape system which dragged their capsule away from the launch vehicle just 2 seconds before it exploded.
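
The reliability figures quoted in the last two paragraphs reduce to simple ratios; the following few lines of Python make the arithmetic explicit. The counts are the rounded figures from the text, so the exact percentages depend on the precise launch counts used.

# Reliability arithmetic behind the figures quoted above, using the rounded
# counts given in the text (exact percentages depend on the precise counts).

people_in_orbit = 546
deaths_in_transit = 18
loss_rate = deaths_in_transit / people_in_orbit
print(f"Historical loss rate per person flown: {loss_rate:.1%}")      # about 3.3%

soyuz_family_launches = 870
soyuz_failed_missions = 22
success_rate = 1 - soyuz_failed_missions / soyuz_family_launches
print(f"Soyuz-family launch success rate: {success_rate:.1%}")        # roughly 97.5%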

 

Paradoxically, the high degree of reliability of the Soyuz launchers has much to do with the fact that there have been no major changes in their basic technology and robust design for more than 40 years. Reliability through technological stagnation? At the same time, without major innovations there is hardly any hope of substantially reducing the cost of orbital missions compared to the Soyuz or similar launch vehicles, which have been optimized in the course of long experience. But how can spaceflight become routine if it requires a giant, 45-meter-long vehicle, and costs over $10 million per person, to transport half a dozen people into orbit?

This is what SpaceX is trying to change. An obvious strategy is to create launch vehicles that can be recovered and used again many times – as opposed to the Soyuz and other so-called expendable, “throwaway” systems, in which the entire hardware, including the engines and other extremely expensive components, is discarded during the ascent trajectory, burns up in the atmosphere or is destroyed by ground impact. Building reusable or at least partially reusable launch vehicles is an old idea, of course, but it has so far proved difficult and risky in practice. The most serious effort in this direction, prior to SpaceX’s development of the Falcon launchers, was the U.S. Space Shuttle. Much can be learned from the history of the Shuttle.

 

Like the Falcon 9 today, the Space Shuttle was created with the main goal of drastically reducing the cost of orbital missions compared with conventional rockets. It is important, as background, to keep in mind that following the first successful manned landing on the Moon in 1969, the budget of the U.S. space agency NASA began to be drastically reduced. (The last Apollo mission to the Moon was in 1972, and Man has still not returned!) By 1975 NASA’s budget was less than 50% of what it had been at the height of the Apollo program. NASA was under enormous pressure to cut costs. The hope was that this could be done by creating a launch vehicle which could return to Earth, landing like a glider, and be used again many times. Although the basic facts about the Shuttle are widely known, it is useful to briefly review some of them here.

For various reasons a compromise was made with respect to reusability. The Space Shuttle consisted of the so-called Orbiter, an external liquid fuel tank and two solid-fuel booster rockets. The winged Orbiter carried the crew, a cargo bay and the powerful main engines. It was designed to reenter the atmosphere, glide to the Earth and land on a runway. The Orbiter was intended to be reused more than 50 times. The two solid-fuel booster rockets provided extra thrust for the lift-off and initial phase of the launch trajectory. After the solid fuel was used up, the boosters were jettisoned and returned to the ground by parachute, where they could be recovered and used again. The external fuel tank, which was needed to feed the main engines up to the point when orbital parameters were achieved, was then jettisoned and burned up in the atmosphere. It was the major non-reusable component.

 

Today SpaceX is pursuing a similar strategy in a different way. The Falcon 9 is a two-stage rocket, whose first stage – which accounts for a large part of the cost of rocket-based launching systems – returns to Earth, landing vertically on a platform. Both approaches pose enormous technical challenges, and also demand very careful maintenance of the rocket engines and other systems before they can be reliably re-used.

 

The Space Shuttle was a masterpiece of engineering. During the 30 years from 1981 to 2011 – when the Shuttle program was discontinued -- five Orbiters flew a total of 133 successful orbital missions. Unfortunately, there were also two major disasters: the explosion of the Shuttle “Challenger” 73 seconds after its launch on January 28, 1986, and the disintegration of the Shuttle “Columbia” during reentry into the atmosphere on February 1, 2003. In each case the entire crew was lost, 14 deaths in all.

Investigations of the background and causes of these disasters cast light on the dangers inherent in the use of rockets for space travel in general, while revealing the risks involved in attempting to reduce costs, and the fateful effects of errors and biases in the decision-making process at all levels of such a complex undertaking. In a deeper sense, the causes of the two accidents are inseparably connected with a fundamental weakness in the whole mode of economic growth which has prevailed in the United States and most other nations of the world since the 1970s.

 

It is particularly instructive, in this respect, to read the short document “Personal observations on the reliability of the Shuttle” written by the famous physicist Richard Feynman and presented on June 11, 1986, one day after the release of the official Rogers Commission report on the Challenger disaster. Although Feynman himself was a member of the Rogers Commission, he carried out his own investigation, including discussions with engineers and others directly involved in the design, launching, operation and maintenance of the Shuttle. Feynman is sharply critical of the way the project was managed, and of the way NASA management downplayed the risks in order to “sell” the Shuttle project to the government and the public. Here I will quote some particularly interesting passages from his 13-page report, which is well worth reading in its entirety. Feynman begins his discussion of the Challenger disaster with the following observation:

 

“It appears that there are enormous differences of opinion as to the probability of a failure with loss of vehicle and of human life. The estimates range from roughly 1 in 100 to 1 in 100,000. The higher figures come from the working engineers, and the very low figures from management. What are the causes and consequences of this lack of agreement?”

 

As is now well known, the immediate cause of the Challenger explosion was the failure of an elastic sealing ring (“O-ring”) in a joint on one of the two solid-fuel booster rockets which supplemented the thrust of the Shuttle’s main engines in the initial phase of the flight. A stream of hot gas escaped from the booster (so-called “blow-by”) and damaged the structure of the adjacent external fuel tank, leading to the breakup and explosion of the whole vehicle. There had long been controversy about the risks of using solid-fuel rockets in manned flights, but the designers of the Shuttle system had nevertheless decided to use them. Feynman wrote:

 

“An estimate of the reliability of solid rockets was made … by studying the experience of all previous rocket flights. Out of a total of nearly 2,900 flights, 121 failed (1 in 25). This includes, however, what may be called early errors, rockets flown for the first few times in which design errors are discovered and fixed. A more reasonable figure for the mature rockets might be 1 in 50. With special care in the selection of parts and in inspection, a figure of below 1 in 100 might be achieved, but 1 in 1,000 is probably not attainable with today's technology. Since there are two rockets on the Shuttle, these rocket failure rates must be doubled to get Shuttle failure rates from Solid Rocket Booster failure. NASA officials argue that the figure is much lower. They point out that these figures are for unmanned rockets but since the Shuttle is a manned vehicle ‘the probability of mission success is necessarily very close to 1.0.’ ”
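The arithmetic behind the “doubling” Feynman mentions can be sketched as follows (a minimal illustration using his own figures, and assuming the two boosters fail independently). If each booster fails with probability p on a given flight, the probability that at least one of the two fails is

\[ 1 - (1 - p)^2 = 2p - p^2 \approx 2p \quad \text{for small } p, \]

so a per-booster rate of 1 in 50 translates into roughly 1 in 25 per Shuttle flight, and 1 in 100 into roughly 1 in 50.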

 

Feynman asks: How could NASA be sure about such an estimate? “Previous NASA experience had shown, on occasion… near accidents, and accidents, all giving warning that the probability of flight failure was not so very small.” He goes on to discuss the specific fact that erosion in seals used to join parts of the shuttle had already been found in previous flights. “The acceptance and success of these flights is taken as evidence of safety. But erosion and blow-by are not what the design expected. They are warnings that something is wrong. The equipment is not operating as expected, and therefore there is a danger that it can operate with even wider deviations in this unexpected and not thoroughly understood way.”

 

Skipping over Feynman’s detailed discussion of the O-ring problem, I want to quote from the section in which he discusses the Shuttle main engine. This crucial component was not involved in the shuttle disasters, but his discussion points to the complexities of the system and the risks of trying to cut costs:

 

“The Shuttle Main Engine was designed and put together all at once with relatively little detailed preliminary study of the material and components. Then when troubles are found in the bearings, turbine blades, coolant pipes, etc., it is more expensive and difficult to discover the causes and make changes. For example, cracks have been found in the turbine blades of the high pressure oxygen turbopump. Are they caused by flaws in the material, the effect of the oxygen atmosphere on the properties of the material, the thermal stresses of startup or shutdown, the vibration and stresses of steady running, or mainly at some resonance at certain speeds, etc.? How long can we run from crack initiation to crack failure, and how does this depend on power level? Using the completed engine as a test bed to resolve such questions is extremely expensive. One does not wish to lose an entire engine in order to find out where and how failure occurs. Yet, an accurate knowledge of this information is essential to acquire a confidence in the engine reliability in use. Without detailed understanding, confidence can not be attained.

 

“The Space Shuttle Main Engine is a very remarkable machine. …. It is built at the edge of, or outside of, previous engineering experience. Therefore, as expected, many different kinds of flaws and difficulties have turned up. Because, unfortunately, it was built in the top‐down manner, they are difficult to find and fix. The design aim of a lifetime of 55 missions equivalent firings (27,000 seconds of operation, either in a mission of 500 seconds, or on a test stand) has not been obtained. The engine now requires very frequent maintenance and replacement of important parts, such as turbopumps, bearings, sheet metal housings, etc. The high‐pressure fuel turbopump had to be replaced every three or four mission equivalents (although that may have been fixed, now) and the high pressure oxygen turbopump every five or six. This is at most ten percent of the original specification. But our main concern here is the determination of reliability. In a total of about 250,000 seconds of operation, the engines have failed seriously perhaps 16 times.”
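Taking the figures in this last passage at face value (the following back-of-the-envelope calculation is mine, not Feynman’s): 250,000 seconds of operation at the nominal 500 seconds per mission corresponds to

\[ 250{,}000 / 500 = 500 \]

engine-mission-equivalents, so 16 serious failures amount to roughly one serious failure per 30 engine-missions, a far cry from the 1-in-100,000 end of the range of estimates Feynman cites at the beginning of his report.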

 

Feynman concludes his report with a now-famous remark:

 

“For a successful technology, reality must take precedence over public relations, for nature cannot be fooled.”

The experience with the Space Shuttle program and the difficulties of SpaceX today point to a fundamental issue in economics. Every basic technology has inherent limits. Attempting to increase its productivity by gradual improvements and clever tricks – i.e. trying to “fool nature” – inevitably leads to diminishing returns and growing risks.

 

Fundamentally speaking, for example, all the launch vehicles used up to today for space travel are limited by two main realities: firstly, the “rocket equation” derived by Konstantin Tsiolkovsky in 1903, and secondly the use of chemical fuels, carried on the vehicle, as the source of energy for propulsion. In practice these two realities are reflected in the fact that in all the systems used up to now, the payload is only a small percentage of the total mass of the launch vehicle (including fuel) at the moment of launch. The Falcon 9, for example, requires a starting mass of 550 tons in order to transport less than 22 tons into low Earth orbit (LEO). More than 90% of the takeoff mass is propellant, so that a large part of the propellant, and of the physical work performed by the engines, is expended simply to lift and accelerate the propellant that is consumed in the process of placing the payload into orbit. This puts enormous demands on the engines, whose performance is limited by the energy density and power density attainable with chemical reactions. The only practical way to “break out” of these limitations is to utilize a different principle than that of a rocket, and/or a different source of energy (e.g. nuclear reactions). That in turn requires a revolution in space technology.
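To make this limitation concrete, the rocket equation in its standard form reads

\[ \Delta v = v_e \ln\!\left(\frac{m_0}{m_f}\right), \]

where \(\Delta v\) is the velocity change the rocket can deliver, \(v_e\) the effective exhaust velocity of its engines, \(m_0\) the initial mass including propellant, and \(m_f\) the final mass after the propellant is burned. The numbers that follow are rough textbook estimates for illustration, not figures taken from any particular vehicle: reaching low Earth orbit requires roughly \(\Delta v \approx 9.5\) km/s once gravity and drag losses are included, while chemical propellants give \(v_e\) of only about 3 to 4.5 km/s. The equation then demands a mass ratio \(m_0/m_f\) on the order of \(e^{9.5/3.5} \approx 15\), which, after accounting for the mass of tanks, engines and structure, leaves only a few percent of the lift-off mass for payload even when the vehicle is divided into stages.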

 

Tsiolkovsky himself proposed one possibility back in 1895: the so-called “space elevator”, which is based on a completely different principle than rocket-based transport. There are a number of other very interesting possibilities for breaking free of the problems and limitations of chemical-rocket-based space transport. But realizing Tsiolkovsky’s idea, or any of the other proposed alternatives, will require revolutionary breakthroughs in science and technology.

 

Travel into space will surely become a normal and routine matter in the future, just as flying in airplanes is today. Man is destined to colonize Mars and other regions of the solar system. But this will not be achieved by strategies focused primarily on cutting costs. Austerity policies lead to disasters. As I have discussed in my book on Physical Economy, we need instead a transition to what I call strongly nonlinear development: economic development “propelled” by an ongoing succession of fundamental discoveries and ensuing technological revolutions. We require this form of economic development not only for Man’s activity in space, but even more for the future of human civilization on Earth.

 

                                                                                                                                                                Jonathan Tennenbaum