Star Wizards
How a handful of desperate innovators took special effects to new heights in two 1977 movies— Star Wars and Close Encounters
By late February of 1977 Richard Edlund, a visual-effects cinematographer, knew he was in trouble. With his budget long exhausted and an inflexible release date looming, the risky new adventure movie he’d spent nearly two years working on still had no opening shot. His boss, a modest young filmmaker named George Lucas, had banked his career and a sizable chunk of 20th Century–Fox’s money on a movie the studio wasn’t sure anyone wanted to see, especially one hobbled with a title the marketing people had hated from the beginning: Star Wars .
The storyboards described the movie starting with an enormous “Imperial Star Destroyer” pursuing a much smaller spacecraft into the frame right over the audience’s head, hauling them into the adventure. “We knew how important this was,” Edlund explains. “If the audience didn’t buy that shot right away, they might not buy into what follows, but the only Star Destroyer we had was barely three feet long. We’d discussed plans to build a huge model that would stretch out along the side of the building, talking endlessly about how we’d light and shoot it while we kept banging away on the easier shots, always pushing that massive opening shot off, until eventually we started running out of stuff to shoot. By then the money was getting thin, and time was pretty much up.” And so, just as they’d always done, Edlund and John Dykstra, the movie’s visual-effects supervisor, and their crew at the fledgling visual-effects house Industrial Light & Magic, tackled this latest problem with sheer inventive pluck.
“We made a tiny test model of the rebel blockade runner, the first ship you see in the movie,” Edlund recalls, “and I stuck it on the nose of the undersize Star Destroyer with a paper clip. We fitted the camera with a 24-mm lens that was basically scraping the bottom of the model—I think we were only about a thirty-second of an inch away—and after working out an appropriate speed, I spent the afternoon shooting test passes. The next day we all saw the results and knew we were there. But man it was close.”
Three months later, on May 25, 1977, audiences were pressed into the backs of their seats by the power and sheer exuberant fun of that opening flyover, and a legend was born. Star Wars , George Lucas’s modest little space fantasy, went on to pleasantly surprise 20th Century–Fox’s dubious marketers by grossing more than any film before it.
It may be hard to believe now, but immediately before 1977, the watershed year that witnessed the release of both Lucas’s Star Wars and Steven Spielberg’s UFO epic Close Encounters of the Third Kind , big-budget effects-laden movies were incredibly rare. In the near-decade between the 1968 release of 2001: A Space Odyssey and Star Wars , barely a handful had appeared. Stanley Kubrick and Arthur C. Clarke’s masterpiece had overwhelmed audience expectations and inspired a new generation of filmmakers (myself among them), but the slow breakup of the old Hollywood studio system, together with new lightweight cameras and faster lenses that simplified location shooting, was already moving the industry away from large stage-bound projects. The generation that landed on the moon had seemingly abandoned interest in science fiction, a genre that had thrived almost since the first flicker of the projection lamp.
In the earliest days of cinema the Lumière brothers, the first filmmakers to charge public admission, quickly discovered that to thrill an 1895 Paris audience, all their movies really had to do was show movement. Scenes of trains pulling into stations or workers leaving a factory created a startling new reality. But the Lumières foresaw no future in the business. The renowned magician Georges Méliès, however, began transferring his classic stage tricks to film, gradually incorporating stop-motion, jump cuts, and multilayered imaging to enhance the illusions. These displays proved more intriguing to his audience than the live tricks bracketing them, ultimately causing him to abandon the stage entirely and focus on his new calling.
The world’s first visual-effects maestro produced several hundred highly accomplished short films between 1896 and 1912. Unfortunately, the French government undervalued this work and destroyed most of it during the First World War, reclaiming silver from the emulsion while recycling the cellulose base into soldiers’ boot heels. Today only a few of Méliès’s films survive, including his 1902 masterpiece A Trip to the Moon . But an industry had been born, and it reliably thrilled audiences for decades until the genre slowed in the late 1960s, just when the Hollywood studio system was enduring its own death throes.
Douglas Trumbull, who was one of the effects supervisors for 2001 , holds a close family connection to the old studio system. His father, Don Trumbull, worked as an in-house effects technician for MGM in the 1930s. His jobs included rigging monkey-suited actors for flight and manning a modified fishing rod to swish the Cowardly Lion’s tail on the set of The Wizard of Oz . Then, as now, visual-effects technicians went to work whenever the script called for a shot too dangerous, difficult, or expensive to capture through standard photographic means. As Douglas Trumbull explains, “In the studio days, visual effects were used to save money on sets and keep productions on the lot rather than on location. Process photography could place actors in front of projected ‘plates’—moving background images such as traffic or countryside—while matte paintings and hanging miniatures could add distant skies, mountains, etc. Eventually blue-screen and optical printers made it possible to superimpose one image on another, which replaced cumbersome and time-consuming rear projection.”
The optical printer Trumbull refers to was the main workhorse of the visual-effects trade for decades. In its most basic configuration it consisted of a rock-solid 35-mm motion-picture camera linked optically and mechanically to one or more precision projectors on a heavy base laden with controls. Normally used to add titles, fades, and dissolves, it could also blend separately photographed elements onto a fresh negative, creating scenes that could never have existed otherwise.
The idea was to shoot the foreground subject—actor, vehicle, etc.—against an optically pure background, such as an illuminated blue screen, which was then removed photochemically. By rephotographing the resulting image together with an appropriate background plate, the actor could be inserted into the new scene. Creating complex, multilayered opticals was as much art as science, and the masters of the craft constantly invented new tricks to blend various elements into one. There was only one test that mattered: Would the audience buy the shot? No matter how many layers were employed or how much tweaking was required, the final composite had to intercut with the surrounding main unit footage as seamlessly as if it had been shot through the same lens on the same day.
Opticals are created in a sequence of incremental steps, gradually building to the final composite one layer at a time. Since just piling up two transparent strips of film—foreground and background—would result in bleed-through, with overlapping images, a key step was the creation of protection, or “hold-out” mattes. By reprinting the foreground blue-screen image on black-and-white film, the optical supervisor could generate matching positive and negative elements, a black foreground “hole,” or matte, against a clear background, and a clear foreground matte against a black background. Rephotographing the background on the printer with the appropriate matte elements in place created a brand-new background layer, complete with a “traveling matte,” an unexposed area matching the foreground element, which would be added on a later pass through the printer. The process had to be repeated for every separate element in the shot, with both positive and negative mattes inserted to protect the various layers through each of the many exacting steps.
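The same layering arithmetic survives almost unchanged in digital compositing, which can make the photochemical steps easier to picture. Here is a minimal sketch, assuming simple normalized image arrays; the optical printer did all of this with film, one protected pass per element:

```python
import numpy as np

def printer_pass(foreground, matte, background):
    """One optical-printer pass, in digital form: the hold-out matte
    leaves a black 'hole' in the background, and the foreground is
    then printed into that unexposed area."""
    held_out = background * (1.0 - matte)   # background with the foreground held out
    return held_out + foreground * matte    # foreground element fills the hole

def build_composite(background, elements):
    """Repeat the pass for every separately photographed element,
    farthest first, just as each layer needed its own trip through
    the printer with mattes protecting all the others."""
    frame = background
    for foreground, matte in elements:
        frame = printer_pass(foreground, matte, frame)
    return frame

# Tiny illustration: a gray "ship" matted over a starfield
h, w = 64, 96
stars = np.random.rand(h, w) ** 8                 # sparse bright specks
ship = np.zeros((h, w)); ship[28:36, 40:70] = 0.7
matte = (ship > 0).astype(float)                  # 1.0 wherever the element lives
frame = build_composite(stars, [(ship, matte)])
```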
Throughout the 1940s and 1950s most Hollywood visual-effects departments purchased new optical printers whose uses had been pioneered by a master cameraman named Linwood Dunn ( King Kong , Citizen Kane , and many more) and began generating their own traveling mattes to insert actors into exotic locations. Traditionally, actors were shot against a brightly lit blue screen, though Walt Disney’s technicians came up with their own proprietary process based on sodium-vapor lighting, which emits strongly in a narrow yellow frequency band.
The Disney process involved shooting a scene against a white background illuminated with sodium-vapor light, which made it appear yellow. The action in the foreground was lit as usual. Inside the camera, a beam-splitter prism separated the sodium-yellow light from the rest. The “beauty pass,” in which the actors appeared, was shot using color film that was insensitive to the sodium lamp’s yellow light (since the prism did not filter out all of that light). The matte shot, which was an image of the yellow background, was recorded simultaneously on black-and-white film that was sensitive to the sodium light. Rumor has it that only a single, priceless camera prism existed to make that split. It never left the Disney lot and was transported daily between stages in a heavily padded wooden box on a bicycle.
Cumbersome and light-greedy as it was, the patented Disney system, by generating its own protection mattes, avoided the misaligned-matte pitfalls of the finicky blue-screen process, a technique only the most talented and patient (often trained by Linwood Dunn) could master. “I remember seeing my first ‘matte line’ in The Robe ,” Richard Edlund recalls, “Victor Mature on top of a mountain with this big green glowing line around him. I thought, ‘What is that? Some sort of religious thing?’”
Edlund refers to this predigital period as the photochemical era, during which “we were endlessly trying to bend nature to our will.” A problematic visual-effects shot has its own special clunk. Whether it fails in design, execution, or the final composite doesn’t matter. The audience doesn’t necessarily know exactly what’s gone wrong, just that something has, and it can drop them right out of the movie. One of the worst offenders, after those glowing misaligned matte lines, was the sudden and noticeable quality loss that occurred when second-generation opticals were intercut with first-generation main-unit footage. As Edlund recalls, “Audiences had gotten used to the fact that when the screen got fuzzy or the contrast changed and graininess accrued, something was about to happen.” One possible solution lay in the large and cumbersome film cameras invented in the early 1950s to challenge the new threat from television. “I had the thought that shooting our elements on larger film formats like VistaVision could minimize dupe loss by providing double the negative size of normal film,” Edlund explains. The film’s greater resolution would prevent “the subliminal tip-off that there was an effect coming up.”
Fortunately for Edlund and his fellow scavengers, the breakup of the old studios’ in-house effects departments led to a wealth of first-rate surplus camera gear, often dumped at yard-sale prices. In the mid-1970s Edlund was shooting award-winning TV commercials at the Los Angeles effects house Robert Abel & Associates when he heard of a large-format optical printer being offloaded by Paramount. “It was an ancient VistaVision printer that hadn’t been used since The Ten Commandments , and it was dirt cheap. We’d been doing most of our shots in-camera with bi-pack mattes and other examples of photographic masochism, where if you screwed up even once, you’d get to shoot the whole thing again tomorrow. I thought if we had a VistaVision printer and could operate in that big format, we might be able to avoid the dupe loss, so I begged Bob Abel to grab it—and he refused! Right about that time I got the call to go work on Star Wars , so I gave Bob three months’ notice before heading off and taking my big-format printer plans with me.”
The ambitious shots planned for Star Wars and Close Encounters of the Third Kind would have challenged any shop in the mid-seventies. The bar-setting work done for 2001 had taken years to achieve; the film was in active production from 1965 almost until the day of its 1968 release and would require many more years to earn back its costs, in no small part because of its unprecedented visual effects. Yet even the multiple breakthroughs achieved in order to set those majestic, slow-moving spacecraft drifting against the stars wouldn’t translate easily into massive space battles, let alone dozens of brilliantly lit UFOs darting about realistically through an ordinary evening sky. As Trumbull explains, “A major problem was the inability to composite multiple elements of a shot while moving the camera at varying speeds. On 2001 the stars, spacecraft, and planets moved at continuous speeds, achieved by having a large synchronous motor drive a series of mechanical gearboxes connected to motors that relayed continuous moves to the camera, the model, etc.”
As soon as Star Wars ’ writer-director, George Lucas, and producer, Gary Kurtz, got their green light from Fox, they went looking for the most experienced and innovative visual-effects supervisor available, a search that quickly brought them to Trumbull. He was having none of it. “When Star Wars was offered, I politely refused, feeling kind of ‘spaced out,’ as it were, and perhaps a little arrogant, having worked on 2001 , which I felt was a far more mature and scientifically worthy project.”
Trumbull had directed the science-fiction adventure film Silent Running in 1972, and he gave Kurtz his blessing to offer the job to his special-effects protégé from that movie, John Dykstra. Brilliant, inventive, and enthusiastic, Dykstra jumped on board as Star Wars ’ visual-effects supervisor, with Richard Edlund signing on as director of photography. The neophyte team they assembled—average age, about 25—reads like a roster of future visual-effects stars: Dennis Muren, Phil Tippett, Ken Ralston, Harrison Ellenshaw, Peter Kuran. They were mostly kids from art or tech school who had assembled a good sample reel, usually in their basement. The industry they would build and refine over the years got its start in a run-down Van Nuys warehouse when Dykstra announced the new company’s name: Industrial Light & Magic.
“God, what a playground it was!” Edlund recalls. “It took nearly a year just to build and debug our systems while George was over in Africa and England shooting the actors. The original ILM was next to the Van Nuys airport in Los Angeles, and right across the street was this huge technology surplus warehouse. They had stuff from airplanes, old electronics. We’d dig around and find big 30-inch-diameter bearings for $100 that would normally cost $10,000.”
One of the first decisions concerned the basic matting technique to be employed for the hundreds of visual-effects shots the script called for. The venerable blue-screen process had been rejected by Stanley Kubrick and his team during 2001 , but Dykstra and Edlund felt there was no other option. “Shooting actors, explosions, and so on meant we had to resurrect blue-screen and find a way to make it work with very little time and money. Of course the only blue-screen shots George had ever seen were the ones where you saw all those bad matte lines, so it really was pretty dangerous territory for him.”
While writing his script, Lucas had envisioned a very different approach. Edlund recalls the day the director laid his ideas out for the team: “George initially wanted to use guys dressed in black in front of black curtains, running around carrying models on sticks, sort of like Bunraku puppet theater, which wouldn’t have produced the results he was going for.”
The script and storyboards described action that had never been attempted with miniatures, including shot after shot of multiple fighters darting across the screen as the camera panned to keep up. As a visual aid, Lucas sent rough cuts of the climactic Death Star battle sequence, with World War II aerial dogfight footage standing in for the shots the team would have to create. Images from Hell’s Angels and Tora! Tora! Tora! —as well as authentic World War II gun camera footage—demonstrated the speed and motion he was going for. As much as this material helped, it also laid bare the raw challenge facing the ILM team once their systems were up and running.
The spaceships would have to be photographed with multiple, precisely repeatable passes accurate to within a few thousandths of an inch, using a computer-driven robotic camera and crane in order to record the many elements the optical compositors required: beauty pass (the central shot capturing the model itself), matte pass (often necessary to generate solid-black protection elements, with all main light sources switched off), engine light pass (no illumination other than the small lights glowing inside engine nacelles), interior lighting pass (no illumination other than the small lights representing windows, portholes, etc.), interactive lighting pass (no illumination other than external timed bursts of light to add reflections from explosions, ray blasts, etc.), and so on. Shooting so many different passes enabled the optical-printer operator to dial the degree of exposure required from each element, thus providing the greatest possible control over the final composite. Even with that, there was little margin for error and a huge cost in production time. The multiple-pass approach would have to be repeated for every model in each of the film’s 365 visual-effects shots, and some of the more complex shots could take days or even weeks.
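In digital terms, dialing each element’s exposure amounts to summing the passes with adjustable weights before the result is clipped back to printable range. A rough sketch follows, with invented weights standing in for values the printer operator actually set by eye, shot by shot:

```python
import numpy as np

h, w = 270, 480
# Stand-ins for the separately photographed passes of one model
beauty      = np.random.rand(h, w, 3) * 0.8                                  # fully lit model
engine_glow = np.zeros((h, w, 3)); engine_glow[130:140, 300:330] = [1.0, 0.5, 0.2]
interiors   = np.zeros((h, w, 3)); interiors[100:104, 120:124] = 1.0         # lit portholes
interactive = np.zeros((h, w, 3)); interactive[:, :60] = [0.2, 0.2, 0.4]     # e.g. explosion flash

# Hypothetical exposure weights -- on the printer these were dialed per shot
weights = {"beauty": 1.0, "engine": 0.6, "interior": 0.9, "interactive": 0.35}

frame = (weights["beauty"] * beauty
         + weights["engine"] * engine_glow
         + weights["interior"] * interiors
         + weights["interactive"] * interactive)
frame = np.clip(frame, 0.0, 1.0)   # keep the composite within exposure range
```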
Driving a motion-picture camera through a repeatable series of precision moves had long been the Holy Grail for visual-effects designers. While Douglas Trumbull was working on one approach, John Dykstra was exploring some different ideas with two electronics designers, Al Miller and Jerry Jeffress. The computer technology they had to work with was very primitive by today’s standards. “There were PDP-8’s and the first hints of microcomputers,” Miller recalls, “but the one thing that allowed development of the Star Wars system was solid memory. 1K chips had only recently become commercially available at about $20 each.” Edlund further observes: “It was all in the timing. We were adapting solid-state electronics originally built for the Apollo program. It wasn’t lost on us that we were basically borrowing NASA technology to shoot a space movie.”
In Japan the first computer-controlled robots were appearing on automobile assembly lines, and in America similar devices were driving milling machines. But as Edlund points out, their capabilities didn’t easily translate to the far more sensitive demands of an ultra-slick camera move. “When we built the first motion-control platform for Star Wars we were using software modified from CNC (computer numerical control) milling, which controlled lathes and cutters using very simple software that drove stepper motors using square-wave on-off pulses. It was the square wave that gave us the greatest problems. We were basically hammering the armature of the motor to the next spot, where it would stop abruptly, giving you these incredibly jerky motions over 200 steps per revolution. So Jerry Jeffress came up with the micro-step concept, figuring out a way to make the stepper motor do its normal 200 steps over 3,200 steps, using a sine wave instead of a square wave, so each step had a slow in and out. It gave us a very smooth movement. Unfortunately that technology was never patented, mostly because we were too busy making the movie. So it found its way back into the CNC market, and then of course everybody grabbed onto it.”
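The gist of Jeffress’s micro-step idea can be sketched in a few lines. Assuming a common two-phase stepper and 16 micro-steps per full step (the 3,200 positions Edlund cites, spread over the motor’s native 200 steps), the coil currents follow sine and cosine curves instead of hard on-off switching:

```python
import math

FULL_STEPS_PER_REV = 200   # the motor's native resolution
MICROSTEPS = 16            # 200 x 16 = 3,200 positions per revolution

def coil_currents(microstep_index, peak=1.0):
    """Phase currents for one micro-step position. Full stepping slams
    each coil fully on or off (a square wave), hammering the armature
    from detent to detent; micro-stepping feeds the two coils sine- and
    cosine-weighted currents so torque hands off gradually -- the 'slow
    in and out' that smoothed the camera moves."""
    cycle = 4 * MICROSTEPS   # one electrical cycle spans 4 full steps
    theta = 2 * math.pi * (microstep_index % cycle) / cycle
    return peak * math.sin(theta), peak * math.cos(theta)

# Sweep one full step's worth of micro-steps to see the gentle ramp
for i in range(MICROSTEPS + 1):
    a, b = coil_currents(i)
    print(f"micro-step {i:2d}:  phase A {a:+.3f}  phase B {b:+.3f}")
```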
Some of the new generation’s most innovative tools were hand-built by an original Hollywood effects master, now enjoying a late second career. “I was basically designing the cameras, motion-control cranes, and optical systems on a four-by-five yellow pad,” Edlund explains. “But Don Trumbull—Douglas’s father—was the guy who would actually execute the designs for us. None of this stuff existed until Don built it. The modified VistaVision camera movement he came up with was maybe 5 inches high, 10 inches wide, and 12 inches long, including a rotating mount so we could tilt the lens as we went flying through those very small miniatures. It was the only way to hold focus as we were shooting at f/22 to get the exposures we needed.”
Even as new effects were being born in Van Nuys, 20th Century–Fox executives were stewing in uncomprehending silence, waiting month after month without seeing a single shot emerge. The frustrating process of debugging such complex systems was also taking its toll on the ILM staffers. Edlund recalls an incident in which “one guy brought in one of those commercial-aircraft escape slides that he rigged from the roof of the shop into a homemade eight-foot hot tub. So everybody’s sliding down this thing into the water just as a carload of Fox execs drive up on a surprise visit to see how we’re spending their money. We survived that, barely, but the studio started thinking of us as some sort of weird country club after that, which was pretty ironic considering how hard everybody was working to get this show up and running.”
Eventually Fox sent in a team of consultants led by the effects expert Linwood Dunn to assess the odds of the ILM crew’s pulling it off. Edlund recalls: “John Dykstra was a great evangelist for all the new systems we were putting together, going on and on about the computer controllers, the electronics, which were mostly just a mess of wires at that point. I think when John finished, he expected applause. What we were doing certainly was amazing, but they just sort of walked away and left him standing there. Linwood later took me aside and said, ‘You know, Richard, the most important problem you’re going to have here is scheduling the work.’ And he was so right.”
Programming shots into Miller’s prototype motion-control camera system, now nicknamed the DykstraFlex, was a slow and often frustrating process. “There was no easy way to edit a programmed move,” Miller explains. “The operator could save the move for each axis while he flew the next one, but he couldn’t smooth or edit the move. He would just have to redo it if he didn’t like it.” As Edlund describes the process, “You had a handheld stick with a bunch of dials on it, and each one was trimmed for a specific function: boom, pan, camera tilt, sideways traverse, track forward, model roll, model pitch, etc. You’d build a combination of these various functions one by one until you had the full move for that shot. Then you’d light it and shoot a test pass on a special black-and-white stock that we could develop in-house in a little Kodak processor. We sacrificed an old Technorama camera so that Don Trumbull could build us a Moviola viewer. That way, if we had multiple elements, which we always did, we could literally stack the film clips and run the shot to make sure the ships would miss each other and were flying in proper formation. I often had packs of six or seven elements running simultaneously.”
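One way to picture the result of all that dial work: the finished move is simply one curve per axis, sampled once per frame, which the rig could then step through identically on every pass. The sketch below uses made-up curves purely for illustration; the real moves were flown by hand and, as Miller notes, could not be edited afterward:

```python
import math

FRAMES = 100   # elements commonly ran on the order of 100 frames

# One curve per axis, as a function of normalized time t in [0, 1].
# Axis names follow Edlund's list; the curves themselves are placeholders.
move = {
    "boom":     lambda t: 0.0,
    "pan":      lambda t: 12.0 * t,                     # slow pan, in degrees
    "tilt":     lambda t: 2.0 * math.sin(math.pi * t),  # gentle tilt and return
    "traverse": lambda t: 0.0,
    "track":    lambda t: 40.0 * t,                     # push forward along the track
    "roll":     lambda t: 0.0,
    "pitch":    lambda t: -5.0 * t,
}

def sample_move(move, frames=FRAMES):
    """Frame-by-frame motor targets: the same list drives every repeat
    pass, which is what lets the separate elements line up later."""
    return [{axis: fn(f / (frames - 1)) for axis, fn in move.items()}
            for f in range(frames)]

targets = sample_move(move)
```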
Once the system was finally debugged, it went into nearly 24-hour service to complete the necessary elements. “I was shooting during the day, and Dennis Muren had the night shift,” Edlund recalls. “So I had to get all my elements in the one day: shoot the various passes, process and check the tests, make adjustments, finish the shot, and break down the setup so Dennis had a clear stage. Same thing the next day, and every next day.”
Despite Dykstra and Edlund’s early efforts to regulate the work for efficiency’s sake, no two shots in Star Wars were exactly alike, a situation that called for constant innovation. Even Lucas’s puppetry concepts came up for reconsideration. “There were weird setups we did that came pretty close to guys with models on sticks,” Edlund remembers. “It was a case of whatever was necessary to get the shot, but almost every day we’d be doing something that had never been done before. We’d keep painting ourselves into corners technologically, and then we’d have to invent our way out.”
The film’s climactic dogfight sequence, in which dozens of X-wing spaceships battle enemy fighters above the surface of the moon-sized Death Star, required solutions both high-tech and low. Edlund shot the background plates of the Death Star flyovers from the bed of a slow-rolling pickup truck in the ILM parking lot, employing the California sun to illuminate the huge model of the Death Star’s surface. Things got a bit dicier when he had to show Luke Skywalker’s view as he flew his X-wing fighter into a narrow trench girdling the Death Star surface. “Our longest motion-control camera crane track was only 42 feet. In order to hit a speed that worked in the trench, I had to use the entire length just to get a single second, 24 projected frames of film. Well, even in Star Wars no shot was that short. In fact Luke’s POV dive into the trench starts on a matte painting of the Death Star and transits into a lengthy move down the model. We shot that in reverse, pulling the camera backward through the miniature, up and out of the trench, and then we’d stop the shot and complete the move by pulling back off the painting. We cut in a full-frame flash from one of the Death Star’s laser cannon to cover the transition from model to painting. It was kind of funky, but it worked.”
As the crew spent long months perfecting shots, Edlund daydreamed about the modifications he’d make should there ever be a sequel. “On Star Wars the optical printers weren’t motorized, so we had to carefully track every element by hand or else we’d end up with an ugly matte line. We’d massage every ship in the scene through shots that were maybe 100 frames long. Those ships were less than a quarter-inch wide on film, so each increment was in the millionths of an inch. It was photographic masochism of the highest order. I’d worked on blue-screen for years, but we’d never taken it that far or asked that much of the process before.”
The film hit the theaters filled with composites Edlund and the team longed for just one more crack at. “Part of knowing what we were doing was knowing when to stop. There was a certain moment on every shot where we had to say, ‘Look, that’s it. That’s as good as we can get it right now. Let’s move on. Maybe we’ll come back to it later’—which, of course, we never did. That’s one reason I appreciated the work George did in 1997, when his digital team went back to the original movies and cleaned up a number of the more embarrassing shots. He may have gone too far on some of that stuff, but there was one landspeeder shot in particular that used to haunt my dreams at night, and it’s finally corrected, so I’m eternally grateful for that.”
Meanwhile, across town another challenging project was gearing up. As ILM continued to develop and test its systems, Douglas Trumbull was figuring out what he’d need to help visualize the UFOs in Close Encounters of the Third Kind . “It required a completely different look,” he says. “ Star Wars had lots of action in deep space while Close Encounters took place within Earth’s atmosphere, calling for difficult, challenging effects brought into a realistic environment in a seamless way.” Having seen the motion-control rig being assembled at ILM, Trumbull contacted Al Miller and Jerry Jeffress to design one for Close Encounters . They built the Star Wars system by day and Trumbull’s Close Encounters one at night.
Steven Spielberg, the director of Close Encounters , and Trumbull intended to shoot actors in close proximity to multiple glowing UFOs. This meant that moves that followed the actors would have to be precisely repeated months later in order to insert the effects. For this they designed the world’s first real-time digital-motion recorder. “It consisted of a special camera mount and dolly rig equipped with optical shaft encoders that fed digital data to a processor and digital tape-drive recorder,” says Trumbull. “There were 12 simultaneous channels, including focus, zoom, pan, tilt, lift, dolly track, etc.” A modified audiocassette system was chosen as the storage medium. Trumbull often flew home to Los Angeles from the spaceship landing-pad set in Alabama with that week’s tape in his shirt pocket, nervously avoiding stray magnetic fields. He says, “The system worked quite well but was troubled by unreliable mechanics, often eating tapes at the worst possible moment. Thus the two tape drives were named ‘Jaws I’ and ‘Jaws II.’”
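A rough modern sketch of what that recorder was doing, assuming one sample of every axis per film frame and a plain CSV file standing in for the audiocassette data; the channel names beyond those Trumbull lists are placeholders:

```python
import csv
import random

CHANNELS = ["focus", "zoom", "pan", "tilt", "lift", "dolly_track",
            "axis_7", "axis_8", "axis_9", "axis_10", "axis_11", "axis_12"]

def record_take(read_encoder, frames, path="take.csv"):
    """Log every axis once per frame so the identical live-action camera
    move can be replayed months later over the miniature elements."""
    with open(path, "w", newline="") as f:
        writer = csv.writer(f)
        writer.writerow(["frame"] + CHANNELS)
        for frame in range(frames):
            writer.writerow([frame] + [read_encoder(ch, frame) for ch in CHANNELS])

# Stand-in for the optical shaft encoders on the dolly rig
record_take(lambda ch, f: round(random.uniform(-1, 1), 4), frames=240)
```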
Some of the most startling images in Close Encounters came from Trumbull’s willingness to use natural phenomena in new and unique ways, including the creation of beautiful, alien-revealing clouds that wrapped around the Devil’s Tower landing site. “We created miniature clouds by shooting into a large aquarium tank. We’d half-load it with cold saltwater, then complete the fill with warm fresh water, causing an inversion layer between the two waters of differing specific gravity. When we injected water-soluble paint, it would expand and settle onto the layer, creating flat-bottomed clouds similar to real clouds in the atmosphere.”
Another innovation on Close Encounters was a method of creating the glow one might expect to see from UFOs darting about through atmospheric pollution. Trumbull elected to shoot the self-illuminated miniatures through smoke, adjusting the density of the particles according to the size of the UFOs. “We could maintain an exact level of controlled smoke by using a laser-beam densitometer that was servo-connected to an automated smoke machine and circulating fans. This allowed long hours of continuous photography of the UFOs. In the final weeks we ran short on available time in the smoke room and had to set up an additional stage without automated controls. Dennis Muren and Scott Squires took on the risky work of manually measuring smoke density with a light meter, resulting in great shots—though if you look carefully, you might see a subtle pulsing of the glow around the mothership, due to tiny variations in smoke density.”
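The control idea is a straightforward feedback loop: a laser beam crossing the stage dims as the smoke thickens, and the measured density is compared with a target to decide whether to make more smoke or clear some out. Below is a minimal proportional-control sketch; the specific control law and values are invented for illustration, not taken from Trumbull’s rig:

```python
def smoke_servo_step(read_density, drive_smoke, drive_fans, target, gain=0.5):
    """One update of the loop. `read_density` stands in for the
    laser-beam densitometer; `drive_smoke` and `drive_fans` stand in
    for the smoke machine and circulating fans."""
    error = target - read_density()      # positive means the room is too clear
    if error > 0:
        drive_smoke(gain * error)        # add smoke
        drive_fans(0.0)
    else:
        drive_smoke(0.0)
        drive_fans(gain * -error)        # thin the smoke out

# Toy usage: hold a constant density over a long exposure session
density = 0.30
def read(): return density
def smoke(amount):
    global density; density += 0.1 * amount
def fans(amount):
    global density; density -= 0.1 * amount

for _ in range(100):
    smoke_servo_step(read, smoke, fans, target=0.45)
```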
Despite the unprecedented success of Star Wars and Close Encounters , it would be some time before the studios started launching similar projects, leaving at least two new effects shops struggling to meet payrolls. Lucas decided to relocate ILM—with the DykstraFlex camera but without John Dykstra—to his base in Northern California, to prepare for the inevitable Star Wars sequel. The remnants of the Van Nuys facility were reborn under Dykstra as Apogee Productions. Their first job was the 1978 ABC TV movie Battlestar Galactica . A few years later Douglas Trumbull’s Future General was renamed Entertainment Effects Group and began to move away from special effects and into developing innovative camera, projection, and display technologies.
The first big-budget effects film to be green-lighted in response to the success of Star Wars would end up requiring the services of both Future General and Apogee and almost cause the collapse of another venerated facility. Star Trek—The Motion Picture was announced in March 1978 with the TV show’s original creator, Gene Roddenberry, producing and the legendary Robert Wise signed to direct. With quality visual effects critical to its success and ILM tied up preparing The Empire Strikes Back , Paramount first contacted Douglas Trumbull, who was busy with other projects, before finally turning to Richard Edlund’s old employer, Robert Abel and Associates.
Abel was just catching up with the special-effects renaissance, and Joe Lewis, a motion-control designer, remembers the atmosphere: “Abel’s felt like a giant science project—a vast warren of rooms and hallways where, in a nod to 2001 , the screen prompt from the motion-control computers was ‘Good morning, Dave.’ All the camera operators at Abel’s became Daves.”
The company was in serious trouble, though they didn’t know it. In the first major example of the now-common practice called block booking, Paramount had locked Star Trek to an inflexible release date, December 7, 1979. Unfortunately, Abel spent far too much time and money designing and building a massive, interlinked, and centrally controlled camera and optical-printer combo unit, while trying to re-invent the process as he did so. By the time he was ready to shoot, there was little time left to complete nearly twice as many visual-effects shots as in Star Wars and Close Encounters combined. Edlund tried to help: “I admonished him to keep it as simple as possible, but the end result was so complicated that it couldn’t respond to changes without two days of reprogramming, even though the problem might be something as simple as the camera needing more clearance to avoid the spacedock model.”
The moment of truth came when Wise and some studio reps showed up to see what Paramount’s millions and more than a year of research had bought and found only a single sequence ready to screen. The company was fired immediately, and negotiations began that very day to bring in Trumbull to bail out the show. As for the shiny and innovative machinery filling Abel’s new building, Joe Lewis recalls, “It was shipped over to the new effects group after the plug was pulled at Abel’s. I heard it ended up outside in the parking lot.”
Trumbull subcontracted several major sequences to John Dykstra and Apogee, including an opening Klingon encounter, the Epsilon Nine space-station sequence, and external flyovers of the menacing alien probe V’Ger. “Ultimately, Star Trek was an exercise in just getting it done,” he says. “To my delight, Robert Wise gave me tremendous freedom to redesign sequences so that they could be completed on time, as well as tell the story better.”
Despite the best efforts of Trumbull, Dykstra, and Wise, Star Trek never recovered from its difficult start. As Edlund recalls, “While Abel wasted his time, the editors cut the picture and Jerry Goldsmith recorded and laid in his music, leaving these incredibly long black slugs where the effects shots would eventually fit. When Bob was finally let go and Doug and John were brought in to save the day, they were stuck. Those black slugs went on for like a minute each, all pre-cut and pre-scored. The shots were so languid, and there was no way to know that in advance and adjust the pacing.”
Star Trek—The Motion Picture opened on schedule, but barely so. Robert Wise told of having to personally rush the print, still wet from the lab, to Washington’s National Air and Space Museum to make the premiere. Despite the soaring budget and critical disdain, the film was successful enough to launch a new franchise that’s even now prepping for its eleventh outing. But the lack of proper editing and mixing time, coupled with still-incomplete effects shots, rankled Wise for years. In 2001 Paramount invited him to recut the film; the result was released on DVD to much fan acclaim a few years before he died.
The lessons learned on Star Trek—The Motion Picture served the industry well. Even as the newly relocated Industrial Light & Magic geared up for The Empire Strikes Back , decisions were made to stick to a hands-on approach, despite the complexity of the new equipment and far greater compositional challenges of the storyboards. As Edlund observes, “We continued to program by hand with one eye to the viewfinder, so the human touch was maintained—which is important because humans are flying these vehicles. I’ve always felt these shots are far more interesting because they’re unpredictable. The essence of great art lies in the imperfections.” In the wake of The Empire Strikes Back , the industry became awash with new effects houses and off-the-shelf motion-control equipment, some of which worked more reliably than others. Indeed, the machines built to facilitate science-fiction storytelling occasionally behaved like something out of a movie themselves. Ray McMillan, a visual-effects supervisor and equipment supplier, recalls one of the earliest plug-and-play motion-control systems on the market: “They decided to go with servomotors, which could move a lot faster than the more reliable stepper motors everyone else was using. Unfortunately those motors were very sensitive to outside interference, suddenly and erroneously correcting their positions in quick, jerking motions. It was a scary performance. Those servos were so sensitive that someone would put money in a nearby Coke machine, and the noise from that would cause the motor to suddenly select a new position—and look out, because it would suck Hoover Dam dry to get there if that’s what it took, studio walls be damned. There were lots of horror stories about motion-control cameramen being chased out of the studio by a runaway camera looking for its new end position.”
Despite the many stumbles and missteps, the generation of visual-effects pioneers represented by Trumbull, Dykstra, Edlund, and their colleagues continued to grow and thrive in the final decades of the twentieth century, and they have spawned technological descendants of their own. John Dykstra won an Oscar for his work on Spider-Man 2 (2004) as well as one for Star Wars . Richard Edlund has several Oscars for the Star Wars trilogy, Raiders of the Lost Ark , and more. Douglas Trumbull, an Oscar winner for his Showscan system and a nominee for Close Encounters , Star Trek , and Blade Runner , has branched out as a film director and theme-park ride designer (Universal’s Back to the Future ride). Lately, he says, “I’ve been working with Tom Furness of the Human Interface Technology Laboratory at the University of Washington toward developing a virtual retinal display that could conceivably put images into both eyes in such a manner as to entirely fill the field of view, with daylight brightness, no latency, stereoscopic full color, and image quality that would be indistinguishable from reality. Seems like a tall order, but we feel that this goal is achievable within about five years.”
Nowadays nearly all visual effects are completed (if not fully generated) on a computer and output digitally, either to other digital media or onto fresh film negative. Yet although companies like Lucasfilm are heavily invested in digital projection and storage systems, the ancient art of capturing imagery on photosensitive emulsion has resisted most predictions of its demise. But the writing is on the wall. Once the costs of converting thousands of theaters to digital projection are resolved, film will begin to disappear, at least as a delivery medium; the expense of creating and shipping thousands of release prints makes that inevitable.
Still, the pioneers of this brave new world don’t seem entirely nostalgic. “I don’t miss the photochemical era,” Edlund says. “Most of the time our necks were stuck way out there. We could usually come up with something that would work, but there were times when it felt like a sumo match, one where you regularly got thrown out of the ring.” Trumbull adds, “Personally, I’ll miss the stinky hands-on world of miniatures, smoke, and film. Reducing everything I want to see to a combination of polygons, splines, shaders, and algorithms is like trying to write poetry from a dictionary. But there’s also an inherent creative empowerment offered by digital technology that is inexpensive, easily accessible, and lots of fun.”
Georges Méliès, the first visual-effects cinematographer, started out as a stage magician, and he crafted a fitting legacy that endures to this day, informing decisions made by every new generation of effects wizards. Richard Edlund sums it up with a quote he attributes to Harry Houdini: “‘If you can fool 90 percent of the audience, the 10 percent you didn’t get will never be able to convince the other 90 percent that they didn’t see what they saw.’ That’s half of moviemaking right there.”