
Compositing

Metryq

Fleet Captain
A zombie thread about TOS silverware was revived. Rather than feeding the zombie, I figured I would respond to something Zap said concerning the color of the Enterprise model.

I can't believe Jeffries thought it should be blue when he knew they had to shoot the model in front of a blue screen.

Perhaps it was known that the VFX crew worked only with bluescreen. However, that was far from the only "traveling matte extraction" technique of the era. To make a "composite" shot, such as a ship in space, one must combine more than a single exposure, hence the name. A simple double exposure, though, results in ghosting: any bright areas in the background, like stars or a planet, will expose over the same area of film as the ship.
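
To put that in modern terms, here is a rough numpy sketch (my own illustration, not how an optical printer actually behaves) of why a straight double exposure ghosts: the two exposures are roughly additive, so bright stars print right through the hull.

```python
import numpy as np

h, w = 480, 640

# Ship photographed against black (mid-grey hull for illustration).
ship_plate = np.zeros((h, w), dtype=np.float32)
ship_plate[200:280, 250:400] = 0.5

# Starfield background with a few hundred bright stars.
star_plate = np.zeros((h, w), dtype=np.float32)
rng = np.random.default_rng(0)
star_plate[rng.integers(0, h, 300), rng.integers(0, w, 300)] = 1.0

# Naive double exposure: both plates expose the same frame of "film."
double_exposure = np.clip(ship_plate + star_plate, 0.0, 1.0)
# Any star that lands inside the hull area still reads ~1.0,
# so the ship looks see-through -- the classic ghosting problem.
```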

"Mattes" are like those mask templates used by road crews when spray-painting a "STOP" sign on pavement. By using both "positive" and "negative" mattes of a foreground subject (like a ship), a "double exposure" can be made without the ghostly see-through problem. The ship photo is exposed through the ship-shaped mask onto a new piece of film. That film is then rewound and exposed a second time for the background. Only this time the area of the ship is masked by a ship-shaped silhouette.

The first "traveling mattes" were made by hand, tracing a subject frame-by-frame with rotoscope, a technique invented by Max Fleischer (who made the early Popeye cartoons). Bluescreen was a later "automatic traveling matte" technique, but it was not the only one. Disney used the sodium vapor lamp system to make the composites in Mary Poppins. While those extracted mattes are very clean and do not have the limitation of avoiding a background color in foreground subjects, they are more expensive. (They need the special lamps, and a double camera for the normal film and sodium sensitive film.)

And there are many other techniques. 2001: A Space Odyssey used reflex front projection for the "Dawn of Man" sequence, a variant of which was later used in the Christopher Reeve Superman movies. 2001 also used "model mover" multiple exposures for the Discovery-in-space shots, which is extremely time consuming because of the long exposures involved. Camera/model movement was also limited by the pre-computer, mechanical mover equipment. But 2001 has some of the cleanest composites in film history. (No generation loss from copying a copy of a copy of a piece of film.)

The VFX crew for Spaceballs used an upgraded version of this model-mover approach. Models were moved through a blacked-out studio for the first pass. A second pass, moving camera and models in exactly the same way, was then made under UV light. Since the models had all been coated with a UV-reactive lacquer, they "lit up" and produced bright silhouettes of themselves. One advantage of this technique is that even fine details, like the "rabbit-ear" TV antenna on Lonestar One, will show up in the composite.

Bluescreen became so popular because common film stocks could be used. The hard part was all done in the lab. Blue exposed through a red filter makes "black," and the negative of red exposed through a blue filter also produces "black." And that is how the lithographic, deep black mattes were extracted from common film. But all those exposures produce grainy "generation loss."
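
A loose digital analogue of that lab extraction (my simplification, not the exact photochemical recipe): compare the blue record against the red record and clip hard, the way high-contrast litho stock clips, to get a dense black-and-white matte.

```python
import numpy as np

def bluescreen_matte(rgb: np.ndarray, threshold: float = 0.2) -> np.ndarray:
    """rgb: float image in [0, 1], shape (H, W, 3).
    Returns 1.0 over the blue backing, 0.0 over the foreground subject."""
    red, blue = rgb[..., 0], rgb[..., 2]
    color_difference = blue - red   # large over the backing, near zero on the subject
    return (color_difference > threshold).astype(np.float32)

# The inverse, 1 - bluescreen_matte(plate), is the ship-shaped silhouette
# used to hold out the subject during the background exposure.
```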

Film and television engineer Petro Vlahos invented Ultimatte in the 1970s, the first high-quality electronic compositor for video. As cinema-quality digital cameras arrived, the common blue background shifted to green, largely because digital sensors resolve green most cleanly, although any color can be used. Digital compositing has also solved the motion blur problem and added features like "lightwrap" and other subtleties that make modern composites completely undetectable. It is even possible in real time with a webcam to extract a common background, as anyone who has used Zoom or chat filters knows.
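
For the digital era, here is a toy soft key plus a crude "lightwrap" pass. The function names, thresholds, and blur here are all my own invention; real keyers like Ultimatte are far more sophisticated.

```python
import numpy as np

def soft_key(rgb, lo=0.05, hi=0.30):
    """Soft foreground alpha from a green backing: 1 = subject, 0 = backing."""
    diff = rgb[..., 1] - np.maximum(rgb[..., 0], rgb[..., 2])
    return 1.0 - np.clip((diff - lo) / (hi - lo), 0.0, 1.0)

def box_blur(img, k=7):
    """Crude separable box blur, just enough for illustration."""
    kernel = np.ones(k) / k
    out = np.apply_along_axis(lambda m: np.convolve(m, kernel, mode="same"), 0, img)
    return np.apply_along_axis(lambda m: np.convolve(m, kernel, mode="same"), 1, out)

def composite_with_lightwrap(fg, bg, alpha, wrap_amount=0.3):
    """Standard 'over' composite, plus a little blurred background light
    bleeding onto the foreground edges so the subject sits in the new scene."""
    a = alpha[..., None]
    edge = (box_blur(1.0 - alpha) * alpha)[..., None]  # thin band just inside the matte edge
    wrap = box_blur(bg) * edge * wrap_amount
    return np.clip(fg * a + wrap + bg * (1.0 - a), 0.0, 1.0)
```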

We've come a long way.
 
I think it was on Cosmos (1980) that Carl Sagan had himself composited into a miniature interior of the Library of Alexandria. At the time it was a new technique, something about computerized motion control so that the camera on him matched the move on the model. Of course, I think that was shot on video, SD obviously, which was a different animal from filming the 11-footer.
 
Why do you need multiple elements that are gonna be combined at the end anyway (only hull, only lights, only deflector, only nacelles,...), when you can film the whole thing with all lights on? Is that done so you can make the lights brighter or darker separately later on?
 
Why do you need multiple elements that are gonna be combined at the end anyway (only hull, only lights, only deflector, only nacelles,...), when you can film the whole thing with all lights on? Is that done so you can make the lights brighter or darker separately later on?

Yeah, having the hull, lights, deflector, and nacelles as separate passes would make sense if you wanted control in compositing them at a later time. I would've thought that was more for digital compositing than doing it with film, though.
 
Why do you need multiple elements that are gonna be combined at the end anyway (only hull, only lights, only deflector, only nacelles,...), when you can film the whole thing with all lights on?
With the Enterprise, the ship was photographed all in one pass, as the miniature crew did not have model/camera movers with repeatable movements. I think the 11' model was mounted on a geared camera head designed for some of those bigger cameras, like Vista cameras. Movements were hand-cranked.

2001: A Space Odyssey, on the other hand, had the budget and timetable to do things the long way. The model/camera mover was mechanical, but repeatable. On a first pass they might photograph Discovery with hull lighting. (I don't know if they got meticulous enough to do key and fill passes separately, but I think not.) They were using very low-ASA stock to get the lowest grain, which meant they needed very bright lights and/or long exposures. The mover and film were wound back, the ship was covered in black, and then the flight deck ("bridge") was illuminated, sometimes with a rear-projected interior so that the astronauts could be seen moving around, for an entirely separate exposure. Then a star pass, although I think Kubrick "permitted" hand-rotoscoped mattes for that. He wanted as little "generation loss" as possible. 2001 was extravagantly made; many miniature, VFX, and even live-action shots were made that never appeared in the final cut. From Star Wars onward, VFX were planned to the frame before being shot. There are rare cases where detailed and expensive VFX were made and not used, like the "love scene" in TRON.

Brian Johnson, who was a VFX assistant on 2001, used this "in-camera compositing" technique to great effect on Space: 1999 just a few years later. Lost in Space, which started production prior to Star Trek, did single-take shots with front or rear projection and a small model suspended on wires, or supported from the rear. It all depends on budget, production design, and many other factors. James Cameron used "low-tech," old-fashioned techniques like rear projection in Terminator 2: Judgment Day side by side with the latest CGI-rendered and composited shots because it gave him the "organic," natural look he wanted, quickly and cheaply.

Today, compositing is a specialty all by itself. Even a completely computer-generated scene might be rendered in dozens of discrete passes so that each element can be controlled separately without having to re-render the entire scene. The compositor (or director) might decide to dim or eliminate one light source, or ask for one to be added to highlight something, and only the frames for that one light source need to be rendered.
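
Here is a minimal sketch of why per-light passes are handy (the pass names are made up): light contributions add linearly, so the compositor can re-balance or kill a light without sending the shot back to the renderer.

```python
import numpy as np

def rebalance(passes: dict, gains: dict) -> np.ndarray:
    """passes: per-light beauty passes as linear float images (same shape).
    gains: per-pass multiplier (1.0 = as rendered, 0.0 = light switched off)."""
    out = np.zeros_like(next(iter(passes.values())))
    for name, image in passes.items():
        out += image * gains.get(name, 1.0)
    return out

# e.g. dim the key light and kill a practical without re-rendering anything:
# final = rebalance({"key": key_pass, "fill": fill_pass, "practical": prac_pass},
#                   {"key": 0.7, "practical": 0.0})
```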

During the days of bluescreen on film, VFX artists did not go out of their way to make more work. They "worked smarter, not harder" and got creative. That element-by-element business I described above is a product of digital tools; it actually makes sense to do things that way, and it gives the moviemakers more control in the deal.
 
I'm fairly sure (going off of memory) TMOST says they used an optical printer for creating the matte FX in TOS.
These are relevant
https://davidstipes.net/?p=858

I just finished watching this cool video, and interestingly it was commented that Probert intended the red Bussard collectors of the E-D to be off when the ship is not at warp, which is different from turning off the blue glow on the TMP Enterprise.
 