
Their fearful speculation was dismissed by physicists who were confident that such an event was absurdly implausible. And of course it was. But other questions focused on deeper and more relevant concerns were not adequately dealt with at all. Would this thermonuclear explosion contribute significantly to the diversion of trillions of dollars into weaponry and further accelerate a dangerous nuclear arms race that threatened the survival of human civilization?

For the most part, the fears of microbial kudzu (or of their microscopic mechanical counterparts: self-replicating nanobots, or so-called gray goo) are now often described as probably overblown, although the executive director of GeneWatch, an NGO watchdog organization, Helen Wallace, told The New York Times Magazine, "It's almost inevitable that there will be some level of escape. The question is: Will those organisms survive and reproduce? I don't think anyone knows."

But what about other questions that may seem less urgent but may be more important in the long run: if we robosource life itself, and synthesize life-forms that are more suited to our design than the pattern that has been followed by life for 3.5 billion years, how is this new capability likely to alter our relationship to nature? How is it likely to change nature? Are we comfortable forging full speed ahead without any organized effort to identify and avoid potential outcomes that we may not like?

One concern that technologists and counterterrorism specialists have highlighted is the possibility of a new generation of biological weapons. After all, some of the early developments in genetic engineering, we now know, were employed by the Soviet Union in a secret biological weapons program forty years ago. If the exciting tools of the Digital Revolution have been weaponized for cyberwar, why would we not want some safeguards to prevent the same diversion of synthetic biology into bioweapons?

The New and Emerging Science and Technology (NEST) high-level expert group of the European Commission wrote in 2005 that "The possibility of designing a new virus or bacterium à la carte could be used by bioterrorists to create new resistant pathogenic strains or organisms, perhaps even engineered to attack genetically specific sub-populations." In 2012, the U.S. National Science Advisory Board for Biosecurity attempted to halt the publication of two scientific research papers-one in Nature and the other in Science-that contained details on the genetic code of a mutated strain of bird flu that had been modified in an effort to determine what genetic changes could make the virus more readily transmissible among mammals.

Citing concerns about releasing the detailed design of a virus that was only a few mutations away from a form that could be spread by human-to-human transmission, the biosecurity officials tried to dissuade scientists from publishing the full genetic sequence that accompanied their papers. Although publication was allowed to proceed after a full security review, the U.S. government remains actively involved in monitoring genetic research that could lead to new bioweapons. Under U.S. law, the FBI screens the members of research teams working on projects considered militarily sensitive.

HUMAN CLONING.

Among the few lines of experiments specifically banned by the U.S. government are those involving federally funded research into the cloning of human beings. As vice president, not long after the birth of the first cloned sheep, Dolly, in 1996, when it became clear that human cloning was imminently feasible, I strongly supported this interim ban pending a much fuller exploration of the implications for humanity of proceeding down that path, and called for the creation of a new National Bioethics Advisory Commission to review the ethical, moral, and legal implications of human cloning.

A few years earlier, as chairman of the Senate Subcommittee on Science, I had pushed successfully for a commitment of 3 percent of the funding for the Human Genome Project to be allocated to extensive study of the ethical, legal, and social implications of the research (these are now referred to as ELSI grants), in an effort to ensure careful study of the difficult questions that were emerging more quickly than their answers. This set-aside has become the largest government-financed research program into ethics ever established. James Watson, the co-discoverer of the double helix, who by then had been named to head the Genome Project, was enthusiastically supportive of the ethics program.

The ethics of human cloning has been debated almost since the very beginning of the DNA era. The original paper published by Watson and Crick in 1953 included this sentence: "It has not escaped our notice that the specific pairing we have postulated immediately suggests a possible copying mechanism for the genetic material." As chairman of the Science Investigations Subcommittee in the U.S. House of Representatives, I conducted a series of hearings in the early 1980s about the emerging science of cloning, genetic engineering, and genetic screening. Scientists were focused at that stage on cloning animals, and fifteen years later they succeeded with Dolly. Since then, they have cloned many other livestock and other animals.

But from the start of their experiments, the scientists were clear that all of the progress they were making on the cloning of animals was directly applicable to the cloning of people-and that it was only ethical concerns that had prevented them from attempting such procedures. Since 1996, human cloning has been made illegal in almost every country in Europe, and the then director-general of the World Health Organization called the procedure "ethically unacceptable as it would violate some of the basic principles which govern medically assisted reproduction. These include respect for the dignity of the human being and the protection of the security of human genetic material."

Nevertheless, most anticipate that with the passage of time, and further development and refinement of the technique, human cloning will eventually take place-at least in circumstances where a clear medical benefit can be gained without causing a clear form of harm to the individual who is cloned or to society at large. In 2011, scientists at the New York Stem Cell Foundation Laboratory announced that they had cloned human embryos by reprogramming an adult human egg cell, engineering it to return to its embryonic stage, and then created from it a line of identical embryonic stem cells that reproduced themselves. Although the DNA of these cells is not identical to that of the patient who donated the egg cell, they are identical to one another, which facilitates the efficacy of research carried out on them.

Several countries, including Brazil, Mexico, and Canada, have banned the cloning of human embryos for research. The United States has not done so, and several Asian countries seem to have far fewer misgivings about moving forward aggressively with the science of cloning human embryos-if not actual humans. From time to time, there are reports that one or another fertility doctor working at some secret laboratory, located in a nation without a ban on human cloning, has broken this modern taboo against human cloning. Most if not all of these stories, however, have been suspected of being fraudulent. There has as yet been no confirmed birth of a human clone.

In general, those in favor of proceeding with experiments in human cloning believe that the procedure is not really different from other forms of technological progress, that it is inevitable in any case, and is significantly more promising than most experiments because of the medical benefits that can be gained. They believe that the decision on whether or not to proceed with a specific clone should-like a decision on abortion-be in the hands of the parent or parents.

Those who oppose cloning of people fear that its use would undermine the dignity of individuals and run the risk of "commoditizing" human beings. Cloning does theoretically create the possibility of mass-producing many genetic replicas of the same original-a process that would be as different from natural reproduction as manufacturing is from craftsmanship.

Some base their argument on religious views of the rights and protections due to every person, though many who oppose human cloning base their views not on religion, but on a more generalized humanist assertion of individual dignity. In essence, they fear that the manipulation of humanity might undermine the definition of those individuals who have been manipulated as fully human. This concern seems to rest, however, on an assumption that human beings are reducible to their genetic makeup-a view that is normally inconsistent with the ideology of those who make the protection of individual dignity a top priority.

Both the temporary delay in the public release of details concerning how to create dangerous mutations in the H5N1 bird flu virus and the temporary ban on government-funded research into human cloning represent rare examples of thoughtful-if controversial-oversight of potentially problematic developments in order to assess their implications for humanity as a whole. Both represented examples of U.S. leadership that led to at least a temporary global consensus. In neither case was there a powerful industry seeking to push forward quickly in spite of the misgivings expressed by representatives of the public.

ANTIBIOTICS BEFORE SWINE.

Unfortunately, when there is a strong commercial interest in influencing governments to make a decision that runs roughshod over the public interest, business lobbies are often able to have their way with government-which once again raises the question: who is the "we" that makes decisions about the future course of the Life Sciences Revolution when important human values are placed at risk? In the age of Earth Inc., Life Inc., and the Global Mind, the record of decision making includes troubling examples of obsequious deference to the interests of multinational corporations and a reckless disregard of sound science.

Consider the shameful acquiescence by the U.S. Congress in the livestock industry's continuing absurd dominance of antibiotic use in the United States. In yet another illustration of the dangerous imbalance of power in political decision making, a truly shocking 80 percent of all U.S. antibiotics are still allowed to be legally used on farms in livestock feed and for injections in spite of grave threats to human health. In 2012, the FDA began an effort to limit this use of antibiotics with a new rule that will require a veterinarian's prescription.

Since the discovery of penicillin in 1929 by Alexander Fleming, antibiotics have become one of the most important advances in the history of health care. Although Fleming said his discovery was "accidental," the legendary Irish scientist John Tyndall (who first discovered that CO2 traps heat) reported to the Royal Society in London in 1875 that a species of Penicillium had destroyed some of the bacteria he was working with, and Ernest Duchesne wrote in 1897 on the destruction of bacteria by another species of Penicillium. Duchesne had recommended research into his discovery but entered the army and went to war immediately after publication of his paper. He died of tuberculosis before he could resume his work.

In the wake of penicillin, which was not used in a significant way until the early 1940s, many other potent antibiotics were discovered in the 1950s and 1960s. In the last few decades, though, the discoveries have slowed to a trickle. The inappropriate and irresponsible use of this limited arsenal of life-saving antibiotics is rapidly eroding their effectiveness. Pathogens that are stopped by antibiotics mutate and evolve over time in ways that circumvent the effectiveness of the antibiotic.

Consequently, doctors and other medical experts have urged almost since the first use of these miracle cures that they be used sparingly and only when they are clearly needed. After all, the more they are used, the more opportunities the pathogens have to evolve through successive generations before they stumble upon new traits that make the antibiotics impotent. Some antibiotics have already become ineffective against certain diseases. And with the slowing discovery of new antibiotics, the potency of the ones we use in our current arsenal is being weakened at a rate that is frightening to many health experts. The effectiveness of our antibiotic arsenal-like topsoil and groundwater-can be depleted quickly but regenerated only at an agonizingly slow rate.

One of the most serious new "superbugs" is multidrug-resistant tuberculosis, which, according to Dr. Margaret Chan, director-general of the World Health Organization, is extremely difficult and expensive to treat. At present, 1.34 million people die from tuberculosis each year. Of the 12 million cases in 2010, Chan estimated that 650,000 involved strains of TB that were multidrug-resistant. The prospect of a "post-antibiotic world" means, Chan said, "Things as common as strep throat or a child's scratched knee could once again kill." In response to these concerns, the FDA formed a new task force in 2012 to support development of new antibacterial drugs.

But in spite of these basic medical facts, many governments-including, shockingly, the United States government-allow the massive use of antibiotics by the livestock industry as a growth stimulant. The mechanism by which the antibiotics cause a faster growth rate in livestock is not yet fully understood, but the impact on profits is very clear and sizable. The pathogens in the guts of the livestock are evolving quickly into superbugs that are immune to the impact of antibiotics. Since the antibiotics are given in subtherapeutic doses and are not principally used for the health of the livestock anyway, the livestock companies don't particularly care. And of course, their lobbyists tendentiously dispute the science while handing out campaign contributions to officeholders.

Last year, scientists confirmed that a staphylococcus germ that was vulnerable to antibiotics jumped from humans to pigs whose daily food ration included tetracycline and methicillin. Then, the scientists confirmed that the same staph germ, after becoming resistant to the antibiotics, found a way to jump back from pigs into humans.

The particular staph germ that was studied-CC398-has spread in populations of pigs, chickens, and cattle. Careful analysis of the genetic structure of the germ proved that it was a direct descendant of an antibiotic-susceptible germ that originated in people. It is now present, according to the American Society for Microbiology, in almost half of all meat that has been sampled in the U.S. Although it can be killed with thorough cooking of the meat, it can nevertheless infect people if it cross-contaminates kitchen utensils, countertops, or pans.

Again, the U.S. government's frequently obsequious approach to regulatory decision making when a powerful industry exerts its influence stands in stark contrast to the approach it takes when commercial interests are not yet actively engaged. In the latter case, it seems to be easier for government to sensitively apply the precautionary principle. But this controversy illustrates the former case: those who benefit from the massive and reckless use of antibiotics in the livestock industry have fought a rearguard action for decades and have thus far been successful in preventing a ban or even, until recently, a regulation limiting this insane practice.

The European Union has already banned the use of antibiotics as growth promoters in livestock feed, but in a number of other countries the practice continues unimpeded. The staph germ that has jumped from people to livestock and back again is only one of many bacteria that are now becoming resistant to antibiotics because of our idiotic acceptance of the livestock industry's insistence that it is perfectly fine for them to reduce some of their costs by becoming factories for turning out killer germs against which antibiotics have no effect. In a democracy that actually functioned as it is supposed to, this would not be a close question.

Legislators have also repeatedly voted down a law that would prevent the sale of animals with mad cow disease (bovine spongiform encephalopathy, or BSE)-a neurodegenerative brain disease caused by eating beef contaminated during the slaughtering process by brain or spinal cord tissue from an animal infected by the pathogen (a misfolded protein, or prion) that causes the disease. Animals with later stages of the disease can carry the prions in other tissues as well. When an animal on the way to the slaughterhouse begins stumbling, staggering, and falling down, it is fifty times more likely to have the disease.

The struggle in repeated votes in the Congress has been over whether those specific animals manifesting those specific symptoms should be diverted from the food supply. At least three quarters of the confirmed cases of mad cow disease in North America were traced to animals that had manifested those symptoms just before they were slaughtered. But the political power and lobbying muscle of the livestock industry has so intimidated and enthralled elected representatives in the U.S. that lawmakers have repeatedly voted to put the public at risk in order to protect a tiny portion of the industry's profits. The Obama administration has issued a regulation that embodies the intent of the law rejected by Congress. However, because it is merely a regulation, it could be reversed by Obama's successor as president. Again, in a working democracy, this would hardly be a close question.

THE INABILITY OF Congress to free itself from the influence of special interests has implications for how the United States can make the difficult and sensitive judgments that lie ahead in the Life Sciences Revolution. If the elected representatives of the people cannot be trusted to make even obvious choices in the public interest-as in the mad cow votes or the decisions on frittering away antibiotic resistance in order to enrich the livestock industry-then where else can these choices be made? Who else can make them? And even if such decisions are made sensitively and well in one country, what is to prevent the wrong decision being made elsewhere? And if the future of human heredity is affected in perpetuity, is that an acceptable outcome?

EUGENICS.

The past record of decisions made by government about genetics is far from reassuring. History sometimes resembles Greek mythology in that, like the gods, our past mistakes often mark important boundaries with warnings. The history of the eugenics movement 100 years ago provides such a warning: a profound misunderstanding of Darwinian evolution was used as the basis for misguided efforts by government to engineer the genetic makeup of populations according to racist and other unacceptable criteria.

In retrospect, the eugenics movement should have been vigorously condemned at the time-all the more so because of the stature of some of its surprising proponents. A number of otherwise thoughtful Americans came to support active efforts by their government to shape the genetic future of the U.S. population through the forcible sterilization of individuals who they feared would otherwise pass along undesirable traits to future generations.

In 1922, a "model eugenical sterilization law" (originally written in 1914) was published by Harry Laughlin, superintendent of the recently established "Eugenics Record Office" in New York State, to authorize sterilization of people regarded as (1) Feeble-minded; (2) Insane, (including the psychopathic); (3) Criminalistic (including the delinquent and wayward); (4) Epileptic; (5) Inebriate (including drug-habitues); (6) Diseased (including the tuberculous, the syphilitic, the leprous, and others with chronic, infectious and legally segregable diseases); (7) Blind (including those with seriously impaired vision); (8) Deaf (including those with seriously impaired hearing); (9) Deformed (including the crippled); and (10) Dependent (including orphans, ne'er-do-wells, the homeless, tramps and paupers.) Between 1907 and 1963, over 64,000 people were sterilized under laws similar to

What makes the list of traits in Laughlin's "model law" bizarre as well as offensive is that he obviously believed they were heritable. Ironically, Laughlin was himself an epileptic; thus, under his model legislation, he would have been a candidate for forced sterilization. Laughlin's malignant theories also had an impact on U.S. immigration law. His work on evaluating recent immigrants from Southern and Eastern Europe was influential in forming the highly restrictive quota system of 1924.

As pointed out by Jonathan Moreno in his book The Body Politic, the eugenics movement was influenced by deep confusion over what evolution really means. The phrase "survival of the fittest" did not originate with Charles Darwin but with Herbert Spencer, who popularized it and whose rival theory of evolution was based on the crackpot ideas of Jean-Baptiste Lamarck. (It was Darwin's cousin, Sir Francis Galton, who founded the eugenics movement itself.) Lamarck argued that characteristics developed by individuals after their birth were genetically passed on to their offspring in the next generation.

A similar bastardization of evolutionary theory was also promoted in the Soviet Union by Trofim Lysenko-who was responsible for preventing the teaching of mainstream genetics during the three decades of his reign over Soviet science. Geneticists who disagreed with Lysenko were secretly arrested; some were found dead in unexplained circumstances. Lysenko's warped ideology demanded that biological theory conform with Soviet agricultural needs-much as some U.S. politicians today insist that climate science be changed to conform with their desire to promote the unrestrained burning of oil and coal.

Darwin actually taught that it was not necessarily the "fittest" who survived, but rather those that were best adapted to their environments. Nevertheless, the twisted and mistaken version of Darwin's theory reflected in Spencer's formulation helped to give rise to the notion of Social Darwinism-which led, in turn, to misguided policy debates that in some respects continue to this day.

Some of the early progressives were seduced by this twisted version of Darwin's theory into believing that the state had an affirmative duty to do what it could to diminish the proliferation of unfavorable Lamarckian traits that they mistakenly believed were becoming more common because prior state interventions had made life easier for these "undesirables," and had enabled them to proliferate.

The same flawed assumptions led those on the political right to a different judgment: the state should pull back from all those policy interventions that had, in the name of what they felt was misguided compassion, led to the proliferation of "undesirables" in the first place. There were quite a few reactionary advocates of eugenics. At least one of their organizations survives into the twenty-first century: the Pioneer Fund, described as a hate group by the Southern Poverty Law Center. Incidentally, its founding president was none other than Harry Laughlin.

Eugenics also found support, historians say, because of the socioeconomic turmoil of the first decades of the twentieth century-rapid industrialization and urbanization, the disruption of long familiar social patterns, waves of immigration, and economic stress caused by low wages and episodic high unemployment. These factors combined with a new zeal for progressive reform to produce a wildly distorted view of what was appropriate by way of state intervention in heredity.

This episode in the world's history is now regarded as horribly unethical, in part because thirty years after it began, the genocidal crimes of Adolf Hitler discredited all race-based, and many genetics-based, theories that were even vaguely similar to those of Nazism. Nevertheless, some of the subtler lessons of the eugenics travesty have not yet been incorporated into the emerging debate over current proposals that some have labeled "neo-eugenics."

One of the greatest challenges facing democracies in this new era is how to ensure that policy decisions involving cutting-edge science are based on a clear and accurate understanding of the science involved. In the case of eugenics, the basic misconception traced back to Lamarck, concerning what is inheritable and what is not, contributed to an embarrassing and deeply immoral policy that might have been avoided if policymakers and the general public had been debating policy on the basis of accurate science.

It is worth noting that almost a century after the eugenics tragedy, approximately half of all Americans still say they do not believe in evolution. The judgments that must be made within the political system of the United States in the near future-and in other countries-are difficult enough even when based on an accurate reading of the science. When this inherent difficulty is compounded by flawed a.s.sumptions concerning the science that gives rise to the need to make these decisions, the vulnerability to mistaken judgments goes up accordingly.

As will be evident in the next chapter, the decisions faced by civilization where global warming is concerned are likewise difficult enough when they are based on an accurate reading of the science. But when policymakers base arguments on gross misrepresentations of the science, the degree of difficulty goes up considerably. When gross and willful misunderstandings of the science are intentionally created and reinforced by large carbon polluters who wish to paralyze the debate over how to reduce CO2 emissions, they are, in my opinion, committing a nearly unforgivable crime against democracy and against the future well-being of the human species.

In a 1927 opinion by Justice Oliver Wendell Holmes Jr., the U.S. Supreme Court upheld one of the more than two dozen state eugenics laws. The case, Buck v. Bell, involved the forcible sterilization of a young Virginia woman who was allegedly "feeble-minded" and sexually promiscuous. Under the facts presented to the court, the young woman, Carrie Buck, had already had a child at the age of seventeen. In affirming the state's right to perform the sterilization, Holmes wrote that "Society can prevent those who are manifestly unfit from continuing their kind.... Three generations of imbeciles are enough."

A half century after the Supreme Court decision, which has never been overturned, the director of the hospital where Buck had been forcibly sterilized tracked her down when she was in her eighties. He found that, far from being an "imbecile," Buck was lucid and of normal intelligence. Upon closer examination of the facts, it became obvious that they were not as represented in court. Young Carrie Buck was a foster child who had been raped by a nephew of one of her foster parents, who then committed her to the Virginia State Colony for Epileptics and Feebleminded in order to avoid what they feared would otherwise be a scandal.

As it happens, Carrie's mother, Emma Buck-the first of the three generations referred to by Justice Holmes-had also been committed to the same asylum under circumstances that are not entirely clear, although testimony indicated that she had syphilis and was unmarried when she gave birth to Carrie. In any case, the superintendent of the Virginia Colony, Albert Priddy, was eager to find a test case that could go to the Supreme Court and provide legal cover for the forced sterilizations that his and other institutions already had under way. He declared Buck "congenitally and incurably defective"; Buck's legal guardian picked a lawyer to defend her in the case who was extremely close to Priddy and a close friend since childhood to the lawyer for the Colony, a eugenics and sterilization advocate (and former Colony director) named Aubrey Strode.

Historian Paul Lombardo of Georgia State University, who wrote an extensively researched book on the case, concluded that the entire proceeding was "based on deceit and betrayal.... The fix was in." Buck's appointed defense counsel put forward no witnesses and no evidence, and conceded the description of his client as a "middle-grade moron." Harry Laughlin, who had never met Carrie Buck, her mother, or her daughter, testified to the court in a written statement that all three were part of the "shiftless, ignorant, and worthless class of anti-social whites of the South."

As for the third generation of Bucks, Carrie's daughter, Vivian, was examined at the age of a few weeks by a nurse who testified: "There is a look about it that is not quite normal." The baby girl was taken from her family and given to the family of Carrie's rapist. After making the honor roll in school, Vivian died of measles in the second grade. Incidentally, Carrie's sister, Doris, was also sterilized at the same institution (more than 4,000 sterilizations were performed there), though doctors lied to her about the operation when it was performed and told her it was for appendicitis. Like Carrie, Doris did not learn until much later in her life why she was unable to have children.

The "model legislation" put forward by Laughlin, which was the basis for the Virginia statute upheld by the Supreme Court, was soon thereafter used by the Third Reich as the basis for their sterilization of more than 350,000 people-just as the psychology-based marketing text written by Edward Bernays was used by Goebbels in designing the propaganda program surrounding the launch and prosecution of Hitler's genocide. The n.a.z.is presented Laughlin with an honorary degree in 1936 from the University of Heidelberg for his work in the "science of racial cleansing."

Shamefully, eugenics was supported by, among others, President Woodrow Wilson, Alexander Graham Bell, Margaret Sanger, who founded the movement in favor of birth control-an idea that was, at the time, more controversial than eugenics-and by Theodore Roosevelt after he left the White House. In 1913, Roosevelt wrote in a letter: "It is really extraordinary that our people refuse to apply to human beings such elementary knowledge as every successful farmer is obliged to apply to his own stock breeding. Any group of farmers who permitted their best stock not to breed, and let all the increase come from the worst stock, would be treated as fit inmates for an asylum. Yet we fail to understand that such conduct is rational compared to the conduct of a nation which permits unlimited breeding from the worst stocks, physically and morally, while it encourages or connives at the cold selfishness or the twisted sentimentality as a result of which the men and women who ought to marry, and if married have large families, remain celibates or have no children or only one or two."

Sanger, for her part, disagreed with the methods of eugenics advocates, but nevertheless wrote that they were working toward a goal she supported: "To assist the race toward the elimination of the unfit." One of Sanger's own goals in promoting contraception, she wrote in 1919, was, "More children from the fit, less from the unfit-that is the chief issue of birth control."

The United States is not the only democratic nation with a troubling history of forced sterilization. Between 1935 and 1976, Sweden forcibly sterilized more than 60,000 people, including "mixed-race individuals, single mothers with many children, deviants, gypsies and other 'vagabonds.' " For forty years, from 1972 to 2012, Sweden required sterilization before a transgendered person could officially change his or her gender identification on government identification documents. However, the Stockholm Administrative Court of Appeal found the law unconstitutional in December 2012. Sixteen other European countries continue to have similar laws on the books, including France and Italy. Only a few countries are considering revisions to the laws, despite the lack of any scientific or medical basis for them.

In Uzbekistan, forced sterilizations apparently began in 2004 and became official state policy in 2009. Gynecologists are given a quota of the number of women per week they are required to sterilize. "We go from house to house convincing women to have the operation," said a rural surgeon. "It's easy to talk a poor woman into it. It's also easy to trick them."

In China, the issue of forced abortions has resurfaced with the allegations by escaped activist Chen Guangcheng, but the outgoing premier Wen Jiabao has publicly called for a ban not only on forced abortion, but also on "fetus gender identification." Nevertheless, many women who have abortions in China are also sterilized against their will. In India, although forcible sterilization is illegal, doctors and government officials are paid a bonus for each person who is sterilized. These incentives apparently lead to widespread abuses, particularly in rural areas where many women are sterilized under false pretenses.

The global nature of the revolution in biotechnology and the life sciences-like the new global commercial realities that have emerged with Earth Inc.-means that any single nation's moral, ethical, and legal judgments may not have much impact on the practical decisions of other nations. Some general rules about what is acceptable, what is worthy of extra caution, and what should be prohibited have been tentatively observed, but there is no existing means for arriving at universal moral judgments about these new unfolding capabilities.

CHINA AND THE LIFE SCIENCES.

As noted earlier, China appears determined to become the world's superpower in the application of genetic and life science analysis. The Beijing Genomics Institute (BGI), which is leading China's commitment to genomic analysis, has already completed the full genomes of fifty animal and plant species, including silkworms, pandas, honeybees, rice, soybeans, and others-along with more than 1,000 species of bacteria. But China's principal focus seems to be on what is arguably the most important, and certainly the most intriguing, part of the human body that can be modified by the new breakthroughs in life sciences and related fields: the human brain and the enhancement and more productive use of human intelligence.

Toward this end, in 2011 the BGI established China's National Gene Bank in Shenzhen, where it has been seeking to identify which genes are involved in determining intelligence. It is conducting a complete genomic analysis of 2,000 Chinese schoolchildren (1,000 prodigies from the nation's best schools, and 1,000 children considered of average intelligence) and matching the results with their achievements in school.

In the U.S., such a study would be extremely controversial, partly because of residual revulsion at the eugenics scandal, and partly because of a generalized wariness about linking intelligence to family heritage in any society that values egalitarian principles. In addition, many biologists, including Francis Collins, who succeeded James Watson as the leader of the Human Genome Project, have said that it is currently scientifically impossible in any case to link genetic information about a child to intelligence. However, some researchers disagree and believe that eventually genes associated with intelligence may well be identified.

Meanwhile, advances in mapping the neuronal connections of the human brain continue to come significantly faster than the progress measured by Moore's Law in the manufacturing of integrated circuits. Already, the connectome of a species of nematode, which has only 302 neurons, has been completed. Nevertheless, with an estimated 100 billion neurons in an adult human brain and at least 100 trillion synaptic connections, the challenge of fully mapping the human connectome is a daunting one. And even then, the work of understanding the human brain's functioning will have barely begun.

In that regard, it is worth remembering that after the completion of the first full sequencing of the human genome, scientists immediately realized that the map of genes was only their introduction to the even larger task of mapping all of the proteins that are expressed by the genes-which themselves adopt multiple geometric forms and are subject to significant biochemical modifications after they are translated by the genes.

In the same way, once the connectome is completed, brain scientists will have to turn to the role of proteins in the brain. As David Eagleman, a neuroscientist at the Baylor College of Medicine in Houston, puts it, "Neuroscience is obsessed with neurons because our best technology allows us to measure them. But each individual neuron is in fact as complicated as a city, with millions of proteins inside of it, trafficking and interacting in extraordinarily complex biochemical cascades."

Still, even at this early stage in the new Neuroscience Revolution, scientists have learned how to selectively activate specific brain systems. Exploiting advances in the new field of optogenetics, scientists first identify opsins-light-sensitive proteins from green algae (or bacteria)-and place their corresponding genes into cells, where they become optical switches for neurons. By also inserting genes for other proteins that glow in green light, scientists can then switch a neuron on and off with blue light and observe its effects on other neurons under green light. The science of optogenetics has quickly advanced to the point where researchers are able to use the switches to manipulate the behavior and feelings of mice by controlling the flow of ions (charged particles) to neurons, effectively turning them on and off at will. One of the promising applications may be the control of symptoms associated with Parkinson's disease.

Other scientists have inserted multiple genes from jellyfish and coral that produce different fluorescent colors-red, blue, yellow, and gradations in between-into many neurons in a process that then allows the identification of different categories of neurons by having each category light up in a different color. This so-called "brainbow" allows a much more detailed visual map of neuronal connections. And once again, the Global Mind has facilitated the emergence of a powerful network effect in brain research. When a new element of the brain's intricate circuitry is deciphered, the knowledge is widely dispersed to other research teams whose work in decoding other parts of the connectome is thereby accelerated.

WATCHING THE BRAIN THINK.

Simultaneously, a completely different new approach to studying the brain-functional magnetic resonance imaging (fMRI)-has led to exciting new discoveries. This technique, which is based on the more familiar MRI scans of body parts, tracks the flow of blood in the brain to neurons when they fire. When neurons are active, they take in blood containing the oxygen and glucose needed for energy. Since there is a slight magnetization difference between oxygenated blood and oxygen-depleted blood, the scanning machine can identify which areas of the brain are active at any given moment.

By correlating the images made by the machine with the subjective descriptions of thoughts or feelings reported by the individual whose brain is being scanned, scientists have been able to make breakthrough discoveries about where specific functions are located in the brain. This technique is now so far advanced that experienced teams can actually identify specific thoughts by seeing the "brain prints" associated with those thoughts. The word "hammer," for example, has a distinctive brain print that is extremely similar in almost everyone, regardless of nationality or culture.
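
To make the idea of a "brain print" concrete, here is a minimal, purely illustrative sketch (not taken from the research described above) of how a pattern-matching decoder of this kind can work: each candidate word is represented by an average activation pattern across brain regions, and a new scan is labeled with whichever stored pattern it correlates with most strongly. The region counts, words, and numbers below are invented for illustration.

```python
import numpy as np

# Hypothetical "brain print" templates: average activation across six brain
# regions recorded while subjects thought about each word (synthetic data).
templates = {
    "hammer": np.array([0.9, 0.1, 0.4, 0.7, 0.2, 0.3]),
    "house":  np.array([0.2, 0.8, 0.6, 0.1, 0.7, 0.4]),
    "tennis": np.array([0.5, 0.3, 0.9, 0.2, 0.1, 0.8]),
}

def decode_thought(scan, templates):
    """Return the word whose stored pattern best matches the new scan,
    using Pearson correlation as the similarity measure."""
    best_word, best_r = None, -np.inf
    for word, pattern in templates.items():
        r = np.corrcoef(scan, pattern)[0, 1]
        if r > best_r:
            best_word, best_r = word, r
    return best_word, best_r

# A new, noisy scan from someone thinking about a hammer.
rng = np.random.default_rng(0)
new_scan = templates["hammer"] + rng.normal(0, 0.1, size=6)

word, similarity = decode_thought(new_scan, templates)
print(f"Decoded thought: {word} (correlation {similarity:.2f})")
```

Real decoders operate on tens of thousands of voxels and use more sophisticated statistical models, but the underlying logic is the same: match a new activation pattern against a library of previously learned patterns.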

One of the most startling examples of this new potential was reported in 2010 by neuroscientist Dr. Adrian Owen, when he was at the University of Cambridge in England. Owen performed fMRI scans on a young woman who was in a vegetative state with no discernible sign of consciousness and asked her questions while she was being scanned. He began by asking her to imagine playing tennis, and then asking her to imagine walking through her house. Scientists have established that people who think about playing tennis demonstrate activity in a particular part of the motor cortex portion of the brain, the supplementary motor area. Similarly, when people think about walking through their own home, there is a recognizable pattern of activity in the center of the brain in an area called the parahippocampal gyrus.

After observing that the woman responded to each of these questions by manifesting exactly the brain activity one would expect from someone who is conscious, the doctor then used these two questions as a way of empowering the young woman to "answer" either "yes" by thinking about playing tennis, or "no" by imagining a stroll through her house. He then patiently asked her a series of questions about her life, the answers to which were not known by anyone on the medical team. She answered virtually all of the questions correctly, leading Owen to conclude that she was in fact conscious. After continuing his experiments with many other patients, Owen speculated that as many as 20 percent of those believed to be in vegetative states may well be conscious with no way of connecting to others. Owen and his team are now using noninvasive electroencephalography (EEG) to continue this work.
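
The yes/no protocol can be thought of as a simple two-region comparison: if the "tennis" region (supplementary motor area) is markedly more active than the "house" region (parahippocampal gyrus), record a "yes," and vice versa. The sketch below is a hypothetical simplification of that decision rule, not Owen's actual analysis pipeline; the signal values and threshold are invented.

```python
def decode_answer(sma_signal, phg_signal, margin=0.2):
    """Hypothetical decision rule for an Owen-style yes/no fMRI protocol.

    sma_signal: mean activation in the supplementary motor area
                (imagining tennis -> intended "yes")
    phg_signal: mean activation in the parahippocampal gyrus
                (imagining walking through the house -> intended "no")
    margin:     minimum difference required before committing to an answer
    """
    diff = sma_signal - phg_signal
    if diff > margin:
        return "yes"
    if diff < -margin:
        return "no"
    return "no clear answer"  # difference too small to call either way

# Synthetic trial data: (SMA activation, PHG activation) for three questions.
trials = [(1.4, 0.6), (0.5, 1.3), (0.9, 0.95)]
for i, (sma, phg) in enumerate(trials, start=1):
    print(f"Question {i}: {decode_answer(sma, phg)}")
```

The margin serves the same purpose as requiring a clear, repeatable difference in brain activity before counting a response as a deliberate answer rather than noise.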

Scientists at Dartmouth College are also using an EEG headset to interpret thoughts and connect them to an iPhone, allowing the user to select pictures that are then displayed on the iPhone's screen. Because the sensors of the EEG are attached to the outside of the head, the system has more difficulty interpreting the electrical signals inside the skull, but the researchers are making impressive progress.

A LOW-COST HEADSET developed some years ago by an Australian game company, Emotiv, translates brain signals and uses them to empower users to control objects on a computer screen. Neuroscientists believe that these lower-cost devices are measuring "muscle rhythms rather than real neural activity." Nevertheless, scientists and engineers at IBM's Emerging Technologies lab in the United Kingdom have adapted the headset to allow thought control of other electronic devices, including model cars, televisions, and switches. In Switzerland, scientists at the École Polytechnique Fédérale de Lausanne (EPFL) have used a similar approach to build wheelchairs and robots controlled by thoughts. Four other companies, including Toyota, have announced they are developing a bicycle whose gears can be shifted by the rider's thoughts.

Gerwin Schalk and Anthony Ritaccio, at the Albany Medical Center, are working under a multimillion-dollar grant from the U.S. military to design and develop devices that enable soldiers to communicate telepathically. Although this seems like something out of a science fiction story, the Pentagon believes that these so-called telepathy helmets are sufficiently feasible that it is devoting more than $6 million to the project. The target date for completion of the prototype device is 2017.

"TRANSHUMANISM" AND THE "SINGULARITY"

If such a technology is perfected, it is difficult to imagine where more sophisticated later versions of it would lead. Some theorists have long predicted that the development of a practical way to translate human thoughts into digital patterns that can be deciphered by computers will inevitably lead to a broader convergence between machines and people that goes beyond cyborgs to open the door on a new era characterized by what they call "transhumanism."

According to Nick Bostrom, the leading historian of transhumanism, the term was apparently coined by Aldous Huxley's brother, Julian, a distinguished biologist, environmentalist, and humanitarian, who wrote in 1927, "The human species can, if it wishes, transcend itself-not just sporadically, an individual here in one way, an individual there in another way-but in its entirety, as humanity. We need a name for this new belief. Perhaps transhumanism will serve: man remaining man, but transcending himself, by realizing new possibilities of and for his human nature."

The idea that we as human beings are not an evolutionary end point, but are destined to evolve further-with our own active participation in directing the process-is an idea whose roots are found in the intellectual ferment following the publication of Darwin's On the Origin of Species, a ferment that continued into the twentieth century. This speculation led a few decades later to the discussion of a new proposed endpoint in human evolution-the "Singularity."

First used by Teilhard de Chardin, the term "Singularity" describes a future threshold beyond which artificial intelligence will exceed that of human beings. Vernor Vinge, a California mathematician and computer scientist, captured the idea succinctly in a paper published twenty years ago, entitled "The Coming Technological Singularity," in which he wrote, "Within thirty years, we will have the technological means to create superhuman intelligence. Shortly after, the human era will be ended."

In the current era, the idea of the Singularity has been popularized and enthusiastically promoted by Dr. Ray Kurzweil, a polymath, author, inventor, and futurist (and cofounder with Peter Diamandis of the Singularity University at the NASA Research Park in Moffett Field, California). Kurzweil envisions, among other things, the rapid development of technologies that will facilitate the smooth and complete translation of human thoughts into a form that can be comprehended by and contained in advanced computers. Assuming that these breakthroughs ever do take place, he believes that in the next few decades it will be possible to engineer the convergence of human intelligence-and even consciousness-with artificial intelligence. He recently wrote, "There will be no distinction, post-Singularity, between human and machine or between physical and virtual reality."

Kurzweil is seldom reluctant to advance provocative ideas simply because many other technologists view them as outlandish. Another close friend, Mitch Kapor, also a legend in the world of computing, has challenged Kurzweil to a $20,000 bet (to be paid to a foundation chosen by the winner) involving what is perhaps the most interesting long-running debate over the future capabilities of computers, the Turing Test. Named after the legendary pioneer of computer science Alan Turing, who first proposed it in 1950, the Turing Test has long served as a proxy for determining when computers will achieve human-level intelligence. If after conversing in writing with two interlocutors, a human being and a computer, a person cannot determine which is which, then the computer passes the test. Kurzweil has asserted that a computer will pass the Turing Test by the end of 2029. Kapor, who believes that human intelligence will forever be organically distinctive from machine-based intelligence, disagrees. The potential Singularity, however, poses a different challenge.

More recently, the silicon version of the Singularity has been met by a competitive challenge from some biologists who believe that genetic engineering of brains may well produce an "Organic Singularity" before the computer-based "Technological Singularity" is ever achieved. Personally, I don't look forward to either one, although my uneasiness may simply be an illustration of the difficult thinking that all of us have in store as these multiple revolutions speed ahead at an ever accelerating pace.

THE CREATION OF NEW BODY PARTS.

Even though the merger between people and machines may remain in the realm of science fiction for the foreseeable future, the introduction of mechanical parts as replacements for components of the human body is moving forward quickly. Prosthetics are now being used to replace not only hips, knees, legs, and arms, but also eyes and other body parts that have not previously been replaceable with artificial substitutes. Cochlear implants, as noted, are used to restore hearing. Several research teams have been developing mechanical exoskeletons to enable paraplegics to walk and to confer additional strength on soldiers and others who need to carry heavy loads. Most bespoke in-ear hearing aids are already made with 3D printers. The speed with which 3D printing is advancing makes it inevitable that many other prosthetics will soon be printed.

In 2012, doctors and technologists in the Netherlands used a 3D printer (described in Chapter 1) to fabricate a lower jaw out of titanium powder for an elderly woman who was not a candidate for traditional reconstructive surgery. The jaw was designed in a computer with articulated joints that match a real jaw, grooves to accommodate the regrowth of veins and nerves, and precisely designed depressions for her muscles to be attached to it. And of course, it was sized to perfectly fit the woman's face.

Then, the 3D digital blueprint was fed into the 3D printer, which laid down titanium powder, one ultrathin layer at a time (thirty-three layers for each millimeter), and fused them together with a laser beam each time, in a process that took just a few hours. According to the woman's doctor, Dr. Jules Poukens of Hasselt University, she was able to use the printed jaw normally after awakening from her surgery, and one day later was able to swallow food.

The 3D printing of human organs is not yet feasible, but the emerging possibility has already generated tremendous excitement in the field of transplantation because of the current shortage of organs. However, well before the 3D printing of organs becomes feasible, scientists hope to develop the ability to generate replacement organs in the laboratory for transplantation into humans. Early versions of so-called exosomatic kidneys (and livers) are now being grown by regenerative medicine scientists at Wake Forest University. This emerging potential for people to grow their own replacement organs promises to transform the field of transplantation.

Doctors at the Karolinska Institute in Stockholm have already created and successfully transplanted a replacement windpipe by inducing the patient's own cells to regrow in a laboratory on a special plastic "scaffolding" that precisely copied the size and shape of the windpipe it replaced. A medical team in Pittsburgh has used a similar technique to grow a quadriceps muscle for a soldier who lost his original thigh muscle to an explosion in Afghanistan, by implanting into his leg a scaffold made from a pig's urinary bladder (stripped of living cells), which stimulated his stem cells to rebuild the muscle tissue as they sensed the matrix of the scaffolding being broken down by the body's immune system. Scientists at MIT are developing silicon nanowires a thousand times smaller than a human hair that can be embedded in these scaffolds and used to monitor how the regrown organs are performing.

As one of the authors of the National Organ Transplant Act in 1984, I learned in congressional hearings about the problems of finding enough organ donors to meet the growing need for transplantation. And having sponsored the ban on buying and selling organs, I remain unconvinced by the argument that this legal prohibition (which the U.S. shares with all other countries besides Iran) should be removed. The potential for abuse is already obvious in the disturbing black market trade in organs and tissues from people in poor countries for transplantation into people living in wealthy countries.

Pending the development of artificial and regenerated replacement organs, Internet-based tools, including social media, are helping to address the challenge of finding more organ donors and matching them with those who need transplants. In 2012, The New York Times's Kevin Sack reported on a moving example of how sixty different people became part of "the longest chain of kidney transplants ever constructed." Recently, Facebook announced the addition of "organ donor" as one of the items to be updated on the profiles of its users.

Another 3D printing company, Bespoke Innovations of San Francisco, is using the process to print more advanced artificial limbs. Other firms are using it to make numerous medical implants. There is also a well-focused effort to develop the capacity to print vaccines and pharmaceuticals from basic chemicals on demand. Professor Lee Cronin of the University of Glasgow, who leads one of the teams focused on the 3D printing of pharmaceuticals, said recently that the process they are working on would place the molecules of common elements and compounds used to formulate pharmaceuticals into the equivalent of the cartridges that feed different color inks into a conventional 2D printer. With a manageably small group of such cartridges, Cronin said, "You can make any organic molecule."

One of the advantages, of course, is that this process would make it possible to transmit the 3D digital formula for pharmaceuticals and vaccines to widely dispersed 3D printers around the world, so that drugs could be manufactured on site and tailored to each individual patient at negligible incremental cost.

The pharmaceutical industry relied historically on large centralized manufacturing plants because its business model was based on the idea of a ma.s.s market, within which large numbers of people were provided essentially the same product. However, the digitization of human beings and molecular-based materials is producing such an extraordinarily high volume of differentiating data about both people and things that it will soon no longer make sense to lump people together and ignore medically significant information about their differences.

Our new prowess in manipulating the microscopic fabric of our world is also giving us the ability to engineer nanoscale machines for insertion into the human body-with some active devices the size of living cells that can coexist with human tissue. One team of nanotechnologists at MIT announced in 2012 that they have successfully built "nanofactories" that are theoretically capable of producing proteins while inside the human body when they are activated by shining a laser light on them from outside the body.

Specialized prosthetics for the brain are also being developed. Alongside pacemakers for hearts, comparable devices can now be inserted into brains to compensate for damage and disorders. Doctors are already beginning to implant computer chips and digital devices on the surface of the brain and, in some cases, deeper within the brain. By cutting a hole in the skull and placing a chip that is wired to a computer directly on the surface of the brain, doctors have empowered paralyzed patients with the ability to activate and direct the movement of robots with their thoughts. In one widely seen demonstration, a paralyzed patient was able to direct a robot arm to pick up a cup of coffee, move it close to her lips, and insert the straw between her lips so she could take a sip.

Experts believe that it is only a matter of time before the increased computational power and the reduction in size of the computer chips will make it possible to dispense with the wires connecting the chip to a computer. Scientists and engineers at the University of Illinois, the University of Pennsylvania, and New York University are working to develop a new form of interface with the brain that is flexible enough to stretch in order to fit the contours of the brain's surface. According to the head of R&D at GlaxoSmithKline, Moncef Slaoui, "The sciences that underpin bioelectronics are proceeding at an amazing pace at academic centers around the world but it is all happening in separate places. The challenge is to integrate the work-in brain-computer interfaces, materials science, nanotechnology, micro-power generation-to provide therapeutic benefit."

Doctors at Tel Aviv University have equipped rats with an artificial cerebellum, attached to the brain stem to interpret information from the rest of the body. By using this information, doctors are able to stimulate motor neurons to move the rat's limbs. Although the work is at an early stage, experts in the field believe that it is only a matter of time before artificial versions of entire brain subsystems are built. Francisco Sepulveda, at the University of Essex in the U.K., said that the complexity of the challenge is daunting but that scientists see a clear pathway to success. "It will likely take us several decades to get there, but my bet is that specific, well-organized brain parts such as the hippocampus or the visual cortex will have synthetic correlates before the end of the century."

Well before the development of a synthetic brain subsystem as complex as the hippocampus or visual cortex, other so-called neuroprosthetics are already being used in humans, including prosthetics for bladder control, relief of spinal pain, and the remediation of some forms of blindness and deafness. Other neuroprosthetics expected to be introduced in the near future will, according to scientists, be able to stimulate particular parts of the brain to enhance focus and concentration, or, with the flip of a switch, to stimulate the neural connections associated with "practice" in order to enhance the ability of a stroke victim to learn how to walk again.


