Free Agent

Wernher von Braun’s journey from Nazi scientist to U.S. hero.

November 01, 2007

Von Braun: Dreamer of Space, Engineer of War

By Michael J. Neufeld

Dr. Wernher von Braun, director of NASA’s Marshall Space Flight Center, points to a television screen in the Saturn blockhouse at Cape Kennedy on February 16, 1965. The screen showed the Saturn I vehicle carrying the Pegasus satellite into orbit.

Knopf; 608 pages; $35.

The story of Wernher von Braun (pronounced “brown”) is the curious adventure of a German-turned-American hero who transformed fantasies of space travel into reality. However, that story is framed by the often blurred boundaries of good and evil. Despised by some as the Nazi engineer primarily responsible for the V-2 rockets that killed 7,000—mostly in London and Antwerp near the end of World War II—von Braun followed whatever route was available to fulfill his childhood aspirations of space flight. He had dreamed of men one day flying to the Moon and finally realized his ambitions with the development of the Saturn V rocket that launched astronauts into lunar orbit in 1968.

Searching for von Braun’s soul, which is embedded in a history haunted by the Third Reich, author Michael Neufeld has penned a brutally honest, in-depth biography. It chronicles the life of a pioneering rocketeer and one-time Nazi SS officer who became an icon by seducing the American public (thanks to Walt Disney) with notions of space exploration.

Von Braun’s harshest critics insist that he was guilty of war crimes, not only for his primary role in creating the V-2 ballistic missile that intimidated Europe but also because he used prisoners of war laboring in deplorable conditions to build the weapons. More than 20,000 POWs enslaved indirectly under von Braun died at the Mittelwerk rocket facility and its Mittelbau-Dora concentration camp. The underground rocket factory where prisoners lived and worked was a maze of cold, damp, and poorly lit tunnels fouled with excrement and infested with lice and fleas. Prisoners wore rags, and toilets were large metal oil drums cut in half and never cleaned. Disease and malnourishment were rampant, and POWs dropped dead at a rate of 20 per day.

To his defenders, von Braun is a victim of Adolf Hitler’s oppressive authority, a serf of sorts who had no options other than to bow to the Führer’s commands. According to Neufeld, von Braun’s own words in a 1950 New Yorker profile reveal the engineer’s mercenary nature. One afternoon, during a gathering of his amateur rocket club in the early 1930s, a black sedan drove up carrying three German military personnel who made von Braun’s group an offer they could not refuse. Von Braun recalls: “They were in mufti [civilian clothes], but mufti or not, it was the Army . . . That was the beginning. The Versailles Treaty [which disarmed Germany after World War I] hadn’t placed any restrictions on rockets, and the Army was desperate to get back on its feet. We didn’t care much about that, one way or the other, but we needed money, and the Army seemed willing to help us. In 1932, the idea of war seemed to us an absurdity. The Nazis weren’t yet in power. We felt no moral scruples about the possible future use of our brainchild. We were interested solely in exploring outer space. It was simply a question of how the golden cow could be milked most successfully.”

Von Braun claimed no knowledge of the Nazi extermination of Jews. In the 1960s, he told his good friend, science fiction writer Arthur C. Clarke (2001: A Space Odyssey), “I never knew what was happening in the concentration camps. But I suspected it, and in my position I could have found out. I didn’t and I despise myself for it.” Commenting on the confession to Clarke, Neufeld is skeptical about von Braun’s defense: “Knowing what we know now about his direct encounter with SS prisoners starting in mid-1943, the first sentence of his statement could be interpreted as a bald-faced lie.” Quoting historian Ian Kershaw, Neufeld adds, “The road to Auschwitz was built by hate, but was paved with indifference.”

In his sworn affidavit to the U.S. Army in 1947, von Braun said that he was forced to join the National Socialist Party in 1939. In actuality, he had joined the Nazi Party in 1937, though he was no doubt pressured to do so.

Later in life, von Braun often bolstered his claim that he was not a true Nazi by telling of his and a few associates’ arrest by the Gestapo in 1944. The rocket engineer was a “heavy social drinker.” One night he and his intoxicated comrades had talked loudly at a party about the war not going well, wishing that their rocket development could be used to build spaceships instead of weapons. They were arrested within days. Problems that delayed final production of the V-2 had prompted speculation that von Braun had actually been arrested for suspected sabotage. There was even some talk that he and the others might be executed. They were freed after a couple of weeks because Hitler desperately needed them to finish the V-2. Von Braun knew he had to produce a successful rocket quickly or face the consequences, which drove him to order more POW slave labor for the Mittelwerk. (Peenemünde had been the principal rocket facility before it was bombed by the British in 1943.)

When he surrendered to U.S. forces in 1945, von Braun exhibited the same charisma, self-confidence, and luminary quality that would later charm the American public. He and his fellow engineers had been hiding out at a ski resort in the mountains on the German-Austrian border at war’s end, trying to decide what to do. Two days after Hitler’s suicide, they drove to an Allied-occupied Austrian town to turn themselves in; there von Braun boasted to his captors that he was the “founder and guiding spirit” of the Peenemünde rocket facility, all the while acting like a dignitary.

“One member of the 44th [Infantry Division, to which von Braun surrendered] later said that ‘[von Braun] treated our soldiers with the affable condescension of a visiting congressman,’” writes Neufeld, adding that von Braun posed “for endless pictures with individual GIs, in which he beamed, shook hands, pointed inquiringly at [American soldiers’] medals and otherwise conducted himself as a celebrity rather than a prisoner.” Von Braun even bragged to a reporter for the Beachhead News “that if he had been given two more years, the V-2 bomb he invented could have won the war for Germany.”

• • •
In America, von Braun soon became frustrated that he could not interest the U.S. government in space travel; he had been brought to the United States to develop missiles as weapons. Von Braun decided he would have to get the American public excited about space flight himself, prompting him to write a novel called Mars Project that he tried to get published in 1950. The book was rejected by 18 publishing companies because it was too technical and had little in the way of a storyline. One publisher said that all the novel was good for was to “build a rocket ship.” Eventually, a publisher in West Germany became interested after it was rewritten as a drama by a former Nazi propaganda writer.

Von Braun’s space exploration articles for Collier’s magazine in the early 1950s, illustrated with futuristic renderings of rockets, caught the public’s attention. This led Walt Disney to ask von Braun in 1954 to appear on Disney’s ABC network television show “Man in Space.” The rocketeer’s narration of a segment in 1955 was the first time America heard his voice. Von Braun and a couple of German rocket engineers were prominently featured in the series, but the show’s producers questioned whether it was wise for the program to be dominated by German accents. “The Disney crew had in fact discussed whether it was a problem that all three experts were German,” Neufeld writes. “But their very accents fit an American cliché of scientific gravity, and as for the Nazi issue, Walt Disney was the quintessential conservative, Midwestern middle American and seems to have given it little thought.”

One month after the first broadcast of “Man in Space,” von Braun legally became an American citizen in Birmingham, along with a hundred of his German colleagues and their spouses. Von Braun told the press gathered for the occasion, “This is the happiest and most significant day of my life . . . Somehow we sensed that the secret of rocketry should only get into the hands of people who read the Bible.” However, to his parents he reported, “It was a terrible circus, with film crews, television, press people and the usual misquotations.”

Profiles in Time and its West German equivalent, Der Spiegel, did not mention von Braun’s Nazi Party membership. Reporters did not have a clue. Instead, it was a film about his life, one von Braun agreed to participate in, that began the unraveling of his past. I Aim at the Stars began filming in 1959. Von Braun was paid $24,000, and Columbia Pictures kicked in another $25,000 plus 7% of the net profits. With his newfound wealth, he traded in his American car for a Mercedes-Benz. The movie was initially predicated on the image of von Braun as “a space dreamer persecuted by the Nazis and given a second chance by the United States,” though the script was later changed to portray him more accurately as striking a Faustian bargain to go into space. Still, the film was considered a whitewash job. Ironically, the screenwriter was a 1933 refugee from the Nazis who introduced fiction into the script to make the story palatable for an American audience.

At the Munich premiere of the film, three unarmed tactical nuclear missiles were on display in front of the theater. U.S. military brass attended in full uniform. Ban-the-bomb demonstrators were also on hand. At a press conference, von Braun answered British critics of his American success: “I have very deep and sincere regret for the victims of the V-2 rockets, but there were victims on both sides . . . A war is a war, and when my country is at war, my duty is to help win that war.” The film was panned and poorly attended. Antwerp, which suffered more V-2 rocket hits than London, banned the movie. Comic Mort Sahl coined the greatest putdown of von Braun’s career when he quipped that I Aim at the Stars should have been subtitled But Sometimes I Hit London.

A year after NASA was created in 1958, von Braun was appointed chief of the Marshall Space Flight Center (MSFC) in Huntsville and was no longer working for the Army. NASA pressured von Braun to hire more black engineers and technicians, but many were reluctant to move to Alabama at that time. Von Braun did not appear eager to get involved when Governor George Wallace stood in a schoolhouse door to prevent a black student from registering at the University of Alabama, yet he publicly condemned segregation when a black MSFC employee enrolled without incident at the University of Alabama in Huntsville.

Not long afterward, Governor Wallace visited MSFC and witnessed a rocket test. Von Braun addressed an audience that included the governor, and stressed that it was imperative that Alabama move on from its segregationist past. After the speech, he chatted with Wallace and asked the governor if he wanted to be the first person on the Moon. Wallace replied, “Well, better not. You fellows might not bring me back.”

Dead Folks 2005, Authors, Inventors, and Astronauts

A look back at the notable names and personalities who called it quits last year.

February 24, 2005

Authors

Susan Sontag

Once viewers catch on that Woody Allen’s 1983 comedy Zelig is a fake documentary about a man who never actually existed, the joke is in how extensively Allen creates a pastiche of the documentary form. The requisite pauses in the story for comments by observers, analysts, and sundry talking heads are the funniest part of Allen’s method, and the funniest talking head is Susan Sontag. That’s not because she has any funny lines. It’s because she doesn’t. So influential, profound, and brilliant are Sontag’s critical views on all matters cultural that her very presence in the film signifies the ultimate commentary. The scene is equivalent to Gertrude Stein, Edmund Wilson, or Jean-Paul Sartre making a cameo appearance in a Bob Hope comedy.

After entering college at age 16 and fairly blowing away everyone at Berkeley, University of Chicago, Harvard, and the Sorbonne, the groovy brunette with a bride-of-Frankenstein streak in her mane decided to share with the world her innumerable ideas about art and life (for her they were indistinguishable). An article published in Partisan Review in 1964 called “Notes on Camp” was, in literary circles, akin to The Rolling Stones appearing on “The Ed Sullivan Show,” or maybe even the premiere of Citizen Kane. With that essay, and subsequent “assaults” in The Atlantic Monthly, Granta, The New York Review of Books, and various other intellectually inclined periodicals, Sontag provided a brand new way of discussing significant ideas in Western culture and minor ideas in popular culture. The new Bob Dylan album, Godard’s latest film, William James, and Freud were all part of the same story—or critique. Each was crucial to understanding the human creative experience. Yet during her explosion onto the arts and literary scene of the 1960s, what was most exciting for the hipsters, bohemians, and New York intellectuals who embraced/feared her was that Sontag made feasible the notion that one could read everything and know everything that mattered. She simultaneously demonstrated that no one could do it better. In that context, it’s extremely revealing that Sontag once defined the term “polymath” as “a person who is interested in everything, and nothing else.”

The publication of Sontag’s collection of essays titled Against Interpretation (1966) was virtually tectonic in its impact. Here she argues that understanding any work of art starts from intuitive response and not from analysis or intellectual considerations. “A work of art is a thing in the world, not just text or commentary on the world.” Other important works such as On Photography and Illness as Metaphor brought challenging ideas about contemporary culture out of the academy and into popular discourse. Not on Johnny Carson’s show, of course, or in the daily newspapers, but Sontag did to some extent prop open the doors to formerly exclusive salons. That’s mainly because her lucid, confident writing style, which is reinforced by a devastating (and yet somehow celebratory) wealth of intellectual inquiry and research, remains free of academic jargon and postmodern tics.

Such a position as a cultural critic implies a certain amount of controversy, which Sontag always could generate with a few comments. The left-leaning, radical thinker might be famously wrong at times, but one feather in her cap was confronting her lefty pals and declaring that communism is “fascism with a human face.” She was also right about Sarajevo. But regarding her notorious claim that September 11 was the result of U.S. international policies and actions, well, remain on the far left long enough and you’re bound to self-destruct. —David Pelfrey

Daniel Boorstin

The Librarian of Congress from 1975 to 1987, Boorstin loved books and couldn’t understand why anyone else might not; he coined the term “aliterate” to describe those who could read but chose not to. During his tenure, appropriations for the Library of Congress rose from $116 million to more than double that figure, the vast holdings were opened to the public, and Boorstin established the Mary Pickford Theater to call attention to (and utilize) the library’s huge archive of motion pictures. He was the nation’s top cheerleader for libraries in general. Boorstin’s deepest interest was in history, although he was fond of pointing out that he was an amateur and not a professionally trained historian. That’s actually not worth pointing out, however, as he taught history at the University of Chicago for 25 years, held a post as director and senior historian at the Smithsonian Museum of History and Technology, and wrote a Pulitzer Prize-winning trilogy on American history and a subsequent four-volume history of the world. —D.P.

Elisabeth Kubler-Ross

“Whoever has seen the horrifying appearance of the postwar European concentration camps would be similarly preoccupied.” That’s Swiss psychiatrist Elisabeth Kubler-Ross (78) speaking of her obsession with changing the treatment of dying patients. Kubler-Ross was greatly disturbed by what she witnessed in New York hospitals when she visited the U.S. in 1958. Her interest in death and her intensive study of the behavior of the terminally ill led to the publication of On Death and Dying in 1969. In less than a decade the book was a standard reference text for medical ethics and hospital policy. Her celebrated theory of the five stages of grief (denial, anger, bargaining, depression, and acceptance) remains a valuable model of human behavior not only for patients, but also for loved ones, medical professionals, and caregivers. —D.P.

Olivia Goldsmith

The film version of The First Wives Club was a jaunty celebration of older women getting revenge on the thoughtless husbands who abandoned them for younger women. There were also plenty of jibes at cosmetic surgery, as in the source novel by Olivia Goldsmith (54). Too bad the author didn’t take her pro-aging stance more seriously. Instead, Goldsmith died from complications related to anesthesia during cosmetic surgery. —J.R.T.

Norris McWhirter

Along with twin brother Ross, Norris McWhirter (78) founded the Guinness Book of Records. Its first edition was printed in 1955, and among its earliest records was a Russian woman who gave birth to 16 sets of twins, seven sets of triplets, and four sets of quadruplets from 1725 to 1765. According to its own records, the Guinness Book of Records is the world’s best-selling copyrighted book, with more than 100 million sold. The McWhirter twins personally crammed 70 people into a compact car just to set a record. Ross was murdered in 1975 after posting a 50,000-pound reward for information leading to the arrest of Irish Republican Army terrorists. —E.R.

Inventors and Innovators

Estee Lauder

Growing up in an apartment above her father’s hardware store in Queens, Josephine Esther Mentzer was a nice Jewish girl with an ambitious spirit and an intense fascination with the lotions and potions her chemist uncle prepared in a little shop. She liked them so much that in 1946 she began selling skin creams at beach resorts and hotels. The determined Esther expanded her product line and practically bullied her way onto some counter space at Saks Fifth Avenue two years later, by which time she and her husband Joseph Lauder had created a “nice little company.” The products were fine, but the sales program was outstanding: exquisitely attired staff, sophisticated sales patter, and, by the way, madam . . . here’s a free sample (a.k.a. “the gift”). By 1953 the company was a well-recognized force in the cosmetics industry.

Its success was due to Lauder’s making certain that those free gifts and samples found their way into the handbags of the hottest celebrities, the social elite, and the otherwise well-to-do. If that meant entertaining guests on a lavish scale (plenty of fine wine, fine cuisine, and cartons of free cosmetics), well, that was just part of the sales game; “If I believe in something, I sell it, and I sell it hard,” she was fond of saying. A more famous, and certainly more profit-generating, quote was “There are no ugly women.” It was that attitude, along with Lauder’s sheer force of will, that helped create a $10 billion enterprise with locations in 130 countries and a daunting product line that includes MAC, Aveda, Clinique, Aramis, and Prescriptives, the sum of which currently constitutes a stunning 45 percent share of the cosmetics business in the United States. Estee Lauder is the only woman on Time‘s list of the 20 most influential business figures of the 20th century. She was 97. —D.P.

Al Lapin, Jr.

In 1958, Lapin and his brother Jerry invested $25,000 and founded the International House of Pancakes. “Rooty Tooty Fresh and Fruity” was an early marketing slogan for ridiculously sweet fruit-topped pancakes and waffles drenched in blueberry, boysenberry, strawberry, or maple syrup. Lapin (76) later owned the Orange Julius chain. His first venture was Coffee Time, carts that delivered urns of hot coffee to offices. Attractive presentation was of utmost importance, both in his personal attire and in his restaurants. Among his favorite sayings were “People eat with their eyes before they eat with their hands,” and “You have to look like a dollar to borrow a dime.” —Ed Reynolds

Francis Crick

Everyone who ever suffered through sophomore biology classes in high school has sketched (or traced) in their lab notebook the double helix, that famous twisted ladder of deoxyribonucleic acid, more commonly known as DNA. Today the term has made a complete transition from scientific jargon to the popular lexicon. Disparaging remarks about the origin of someone’s DNA or gene pool are common, as are police investigations (in the real world or in television dramas) that rely on DNA evidence. The business of bio-engineering and gene therapy is a huge industry now. Half a century ago, however, the very structure of DNA was a great mystery.

It certainly intrigued the British-born biologist Francis Crick (88) and his young American-born colleague James Watson, both of whom were grappling with this puzzle at the Cavendish Laboratory in Cambridge, England, during the early 1950s. The pair finally arrived at the double helix, although confirmation of their results did not come for years. Crick nonetheless announced to friends at the university that they had “discovered the secret to life.” His approach was more subtle when it came time to publish their results in Nature in 1953, yet one particular understatement may resonate for all time: “It has not escaped our notice that the specific pairing we have postulated immediately suggests a possible copying mechanism for the genetic material.” In 1962 Crick and Watson were awarded the Nobel Prize. —David Pelfrey

Sir Godfrey Hounsfield

In the 1960s, British electrical engineer Sir Godfrey Hounsfield created the computerized axial tomography scanner, better known as the CAT scan, which uses X-rays to create three-dimensional images of the body’s interior, revolutionizing medical care. —E.R.

Dr. William Dobelle

William Dobelle (62) developed an experimental artificial vision system for the sight-impaired: a tiny camera attached to the user’s glasses feeds images to a portable computer, which transmits electrical signals to electrodes implanted in the brain’s visual cortex. Four years before his death, his creation restored navigational vision to a blind volunteer. “I’ve always done artificial organs,” Dobelle told the New York Times. “I’ve spent my whole life in the spare-parts business.” —E.R.

Tom Hannon

The “father of the automated teller machine,” Tom Hannon pioneered the use of ATMs in locations other than banks. In the early 1990s he had machines in four Southern states. By the time he sold his U.S. operation in 2002 to enter the British market, he had 2,500 machines in 40 states. —E.R.

Samuel M. Rubin

Popcorn was probably reasonably priced when Sam Rubin (85) began selling it in movie theaters during the Depression. He’d already built an empire with assorted New York City locations, but Sam changed the way we enjoy movies when he took his popcorn stands into theater chains such as RKO and Loews. His empire signaled the end of vending machines as the preferred mode of movie snacking. Rubin can also claim credit for inventing those oversized boxes of candy that sell for five times what you’d pay outside a movie theater. —J.R. Taylor

Red Adair

During the Gulf War in 1991, Iraqi troops retreating from Kuwait set fire to oil wells in the high-producing Ahmadi and Magwa fields, creating a potentially monumental economic and environmental disaster. All the task forces and experts, along with the team working for legendary oil well firefighter Paul “Red” Adair (89), agreed that extinguishing these mammoth fires would take three to five years. Thanks to the consultation, logistical support, and special equipment provided by Adair’s organization, the task was accomplished in nine months. This was a stunning feat, but observers familiar with Adair’s history were not really shocked. At the time, Adair already had more than 40 years of experience battling wild wells, blowouts, and other conflagrations in the deserts and on the high seas. (Adair’s amazing story is told in Hellfighters, starring John Wayne.)

Throughout the 1960s and ’70s, any time an oil rig exploded Adair’s team was called into action; the media coverage of these events justifiably portrayed Red Adair as an American hero. One of his more spectacular deeds involved the huge oil-well fire in the Sahara known as “the Devil’s Cigarette Lighter.” Today the highly specialized devices designed by Red Adair Service and Marine Company, Inc., are regarded as the Rolls-Royces of firefighting equipment. —D.P.

Space is the Place

Gordon Cooper

One of the original seven Mercury astronauts, Gordon Cooper (77) was perhaps the most controversial for his belief that the U.S. government was keeping secrets about UFOs. In 1951, Cooper was part of a squadron scrambled into the air over Germany after metallic objects resembling saucers were spotted flying in formation. Cooper also maintained that he saw a UFO crash at Edwards Air Force Base in California. He filmed the incident, but the film was confiscated by government officials. While orbiting the earth in Gemini 5, Cooper infuriated federal authorities when, shooting outer-space photos as part of a Pentagon film experiment, he inadvertently photographed the top-secret Nevada military base known as Area 51.

Cooper was the first American to remain in space for an entire day when he flew the last Mercury mission in 1963. Despite his controversial UFO fascination and associated conspiracy theories, he was the backup commander for the Apollo 10 mission that flew to within 50,000 feet of the moon. During his Mercury mission, the electrical system failed, and Cooper had to pilot the spacecraft manually back to earth for splashdown. Cooper’s belief in UFOs was so strong that he testified about them to the United Nations in 1978 in the hope that the U.N. would become a repository for UFO sightings. He also wrote a book urging the government to tell what it knew about UFOs. Most, however, probably remember Cooper through Dennis Quaid’s portrayal of the astronaut in The Right Stuff. —E.R.

Maxime Faget

While scientists were designing rockets to launch astronauts into outer space, Maxime Faget’s job was to bring space travelers home in one piece. As designer of the Mercury space capsule, which ushered the U.S. into the age of manned space flight, Faget had to protect a spacecraft and its occupants from the heat of re-entering the earth’s atmosphere. (Astronauts return at 17,000 miles per hour in a craft that reaches temperatures of 4,000 degrees Fahrenheit.) Early theories called for a needle-nosed spaceship to cut down on air resistance, but Faget (83) scoffed at such Buck Rogers notions and designed a blunt-bodied craft that entered blunt end first to deflect most of the heat away from the capsule. —E.R.

Fred Whipple

Originator of the “dirty snowball” concept, comet expert Fred Whipple (97) introduced the idea in 1950 that comets were balls of ice, breaking from the popular notion that comets were wads of sand held together by gravity. Whipple recognized that a comet’s arrival at a particular point in its orbit could not be predicted from gravitational pull alone. He theorized that as a comet approached the sun, sunlight vaporized ice in its nucleus, producing jets of particles that functioned like a rocket engine to speed up or slow down the comet. Close-up photos of Halley’s Comet in 1986 proved Whipple correct. Whipple was also responsible for the idea of cutting aluminum foil into thousands of pieces and releasing the fragments from Allied aircraft over Germany; the tiny bits of foil confused enemy radar, making it appear that thousands of planes were attacking. Some speculate that this is where the phrase “foiled again” originated. —E.R.

William H. Pickering

Director of the Jet Propulsion Laboratory in California from 1954 to 1976, Pickering (93) was in charge of the United States’ first robotic missions to the moon, Venus, and Mars. Three months after Russia put the first satellite, Sputnik, into orbit in 1957, America launched Explorer I, its first orbiting spacecraft. A New Zealand-born electrical engineer, Pickering was a central figure in the Ranger and Surveyor landings on the moon, precursors to the Apollo flights that landed men there. Initially, the Army oversaw Jet Propulsion Lab activity but turned it over to NASA after the Russians launched Sputnik. —E.R.