
How do sensors work?

trekfan_1

Captain
Obviously it's fiction, but is there a basic principle behind how sensors work? I'm talking about ship-mounted sensors. I'm introducing my girlfriend to Star Trek, and she asked me how the aft viewscreen works if there's no rear camera. Take the space baby in TNG's "Galaxy's Child", for example: how can we see the angle we see on the bridge viewscreen when the lifeform is latched onto the Enterprise from behind? It's not like there's someone floating out in space filming the Enterprise from behind.

[Screenshot from "Galaxy's Child": the lifeform attached to the Enterprise from behind]


I speculated to my GF that the sensors just project an image and that a direct line of sight/angle perspective isn't required. She asked how, and I had no answer, even an in-fiction one.

Can anyone explain the basic principle of sensors? Was I correct in my assertion that sensor technology is responsible for viewscreen (external) visuals at seemingly impossible line-of-sight angles?
 
I wouldn't be surprised if there are multiple Electro-Optical Cameras covertly mounted all across the hull, facing all sides, so that anybody on the bridge can press a button and see out of that camera.

A Camera mounted near the front inside of a Bussard Collector could easily see the Baby latched onto the back of the Neck that connects the StarDrive to the Saucer section.

Look carefully at the angle: it's slightly off center, roughly along the direction of the Starboard Warp Nacelle.

As to which camera on the Warp Nacelle, that's TBD and would require a lot of unnecessary math to back trace.

In more recent Star Trek shows, including DISCO, PRODIGY, and SNW, they've shown little robot drones flying around Starships. Those are usually used for maintenance.

But a Camera Bot could easily be made and used or ready to be replicated as needed.

As for sensors, I wouldn't be surprised if they're based on modern RADAR, with a little "Subspace" enhancement applied to make the sensors Superluminal.

Just like "Subspace Radio": it's Radio, but shunted through the Subspace layer so the radio waves travel at FTL speeds.
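For the real-world half of that analogy, radar ranging really is just timing a reflected pulse and halving the round trip. A toy sketch in plain Python (no subspace shortcut assumed, obviously):

```python
# Toy radar ranging: distance from the round-trip time of a reflected pulse.
C = 299_792_458.0  # speed of light in vacuum, m/s

def radar_range(round_trip_seconds: float) -> float:
    """Return the one-way distance to the target in metres."""
    return C * round_trip_seconds / 2.0

# A pulse that comes back after 2 ms corresponds to a target ~300 km away.
print(f"{radar_range(0.002) / 1000:.0f} km")  # -> 300 km
```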
 
It's far too high up to be on the nacelle.

They also see right into each other's faces on the viewscreen without a camera in the center. I always imagined there are cameras all around the screen that generate an interpolated picture that is centered.
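That kind of "virtual centered camera" is a real technique today (view interpolation). Purely as a toy illustration of the idea, and not anything canon, the crudest possible version is a weighted blend of two camera frames toward a virtual centre view; real systems also warp the pixels using depth or disparity information:

```python
import numpy as np

def blend_views(left: np.ndarray, right: np.ndarray, t: float = 0.5) -> np.ndarray:
    """Crude 'virtual camera' between two views: a weighted blend.

    t = 0 returns the left view, t = 1 the right, 0.5 a naive centre view.
    Real view synthesis also warps pixels using depth/disparity; this only
    illustrates the blending step.
    """
    blended = (1.0 - t) * left.astype(np.float32) + t * right.astype(np.float32)
    return blended.astype(left.dtype)

# Example with dummy 4x4 grayscale "frames".
left = np.zeros((4, 4), dtype=np.uint8)
right = np.full((4, 4), 200, dtype=np.uint8)
print(blend_views(left, right)[0, 0])  # -> 100
```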
 
Sensors can bounce off things.
So I wouldn't be surprised if UFP sensors can bounce off the hull of the ship, or even off the interior of the shields (or the subspace field the ship generates) and off other sensor beams, to give you any desired angle on the ship itself.
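The bounce itself would just be ordinary mirror-reflection geometry. For anyone curious, the standard formula is r = d - 2(d·n)n for a beam direction d hitting a surface with unit normal n; a quick, non-canon sketch:

```python
import numpy as np

def reflect(d: np.ndarray, n: np.ndarray) -> np.ndarray:
    """Reflect direction d off a surface with unit normal n: r = d - 2(d.n)n."""
    n = n / np.linalg.norm(n)          # make sure the normal is unit length
    return d - 2.0 * np.dot(d, n) * n

# A beam travelling straight "down" bounces off a flat hull plate facing "up".
print(reflect(np.array([0.0, -1.0, 0.0]), np.array([0.0, 1.0, 0.0])))  # -> [0. 1. 0.]
```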

There's that... and the fact that the viewscreen uses holographic projection to form the image (we've seen evidence of this in VOY's "Year of Hell").
So the ship's sensors provide the data, which is fed to the holo-viewscreen on the bridge, which then extrapolates what's happening from a given angle. Normal face-to-face communications are simpler because there's no approximation needed; the holo camera can feed a live image directly.

Not a problem really.
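If you want the real-world flavor of that "extrapolate from a given angle" step: once the sensors give you 3D positions for everything around the ship, reprojecting them through any virtual camera you like is straightforward. A toy pinhole-projection sketch (every name and number here is made up for illustration, nothing from the show):

```python
import numpy as np

def project_points(points_xyz: np.ndarray, cam_pos: np.ndarray, focal: float = 1.0) -> np.ndarray:
    """Project 3D sensor returns into a virtual pinhole camera at cam_pos.

    The camera is assumed to look along +Z with no rotation, purely to keep
    the toy example short; a real renderer would also apply a rotation matrix.
    Returns (u, v) image-plane coordinates for points in front of the camera.
    """
    rel = points_xyz - cam_pos                 # positions relative to the camera
    rel = rel[rel[:, 2] > 0]                   # keep only points in front of it
    return focal * rel[:, :2] / rel[:, 2:3]    # perspective divide: u = f*x/z, v = f*y/z

# Three "sensor contacts"; virtual camera floating 10 units behind them.
cloud = np.array([[0.0, 0.0, 0.0], [2.0, 1.0, 0.0], [-1.0, 3.0, 5.0]])
print(project_points(cloud, cam_pos=np.array([0.0, 0.0, -10.0])))
```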
 
It's probably a combination of factors:

- Autonomous miniature drones float around the interior and exterior of the ship at all times, recording in multiple spectra (I wrote a semi-facetious article about it long ago). Similar to the Kinos from Stargate Universe. They can give you impossible viewing angles like the one in the OP, or the engine room recordings Kirk accessed to witness Spock's mind meld with Dr. McCoy.

- Virtual extrapolation and recreation based on sensor data. All the available sensors around the ship record everything in their vicinity in multiple spectra, and when you want an external view, like the one of Junior on the neck of the Enterprise, the computer generates a photorealistic real-time image of what you would see if there were a camera in that location.

- Some kind of steerable artificial gravity effect (possibly a modification of the tractor beams) that allows you to bend light so you can see a view you wouldn't normally be able to get.

- Subspace viewing over long distances, or from places where you don't have a fixed camera or sensor present. The Argus Array from TNG's "The Nth Degree" and "Parallels" was able to capture detailed surface images of Utopia Planitia on Mars and other Federation locations from its position near the Cardassian border, dozens of light years or more away, which suggests each Argus Array was using some sort of artificial gravitational lensing (a toy calculation of the real lensing formula is sketched below this list).
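For what it's worth, real gravitational lensing bends light by roughly θ = 4GM/(c²b) for a ray passing a mass M at impact parameter b; grazing the Sun, that works out to about 1.75 arcseconds. A quick sanity check of that number in plain Python (ordinary physics, no subspace assistance):

```python
import math

G = 6.674e-11        # gravitational constant, m^3 kg^-1 s^-2
C = 299_792_458.0    # speed of light, m/s
M_SUN = 1.989e30     # solar mass, kg
R_SUN = 6.957e8      # solar radius, m (impact parameter for a grazing ray)

def deflection_arcsec(mass_kg: float, impact_m: float) -> float:
    """Einstein light-deflection angle, theta = 4GM/(c^2 b), in arcseconds."""
    theta_rad = 4.0 * G * mass_kg / (C**2 * impact_m)
    return math.degrees(theta_rad) * 3600.0

print(f"{deflection_arcsec(M_SUN, R_SUN):.2f} arcsec")  # -> ~1.75
```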
 