• Welcome! The TrekBBS is the number one place to chat about Star Trek with like-minded fans.
    If you are not already a member then please register an account and join in the discussion!

Netflix SW series on the way!!!

The tech has finally matured. This video from this week shows what is possible.

Reflections Real-Time Ray Tracing Demo | Project Spotlight | Unreal Engine
I think this is what Lucasfilm and Disney have been waiting for. The CGI model assets can be shared across the feature films, video games, this TV series, any animated TV series (though when it's possible to have this quality in 4K, why bother with animation?), and VR games.

via
https://www.roadtovr.com/gdc-2018-w...time-ray-tracing-demo-unreal-engine-stunning/
&
https://www.theverge.com/2018/3/22/17150858/unreal-engine-star-wars-ray-tracing-epic-games-gdc-2018
 
Being able to do a real-time ray-traced render of three characters in an elevator is a loooooooooooong way from replacing feature- or even TV-length pre-rendered animation. For starters, none of these characters have faces, much less lips to sync, or indeed any surface that isn't metal or plastic.
Clearly this is an interesting development, but it's still just a tech demo.
 

There’s another video doing the rounds showing a mo-cap performance played live into a CGI character (a borderline photoreal human one at that... almost Rogue One Tarkin), so it is pretty much possible, but the final quality may not be quite what people want.
 
Perhaps this is the video you are talking about.
Siren Behind The Scenes | Project Spotlight | Unreal Engine
End result:
It shows how CG characters can now be driven in real time from motion capture. Again, as I mentioned above: once created, CGI model assets can be shared among Lucasfilm's Star Wars movies, TV series, VR immersive experiences, and video games.
The real-time presentation displayed during Epic’s keynote was previously recorded on Vicon’s capture stage at its headquarters in Oxford, England. To create the video, actress Alexa Lee wore a full-body motion capture suit with a head-mounted camera. Using Vicon’s new Shōgun 1.2 software, her body and finger movements were captured on one screen while the data was streamed into Unreal Engine using Vicon’s new Live Link plug-in. On a second screen, the Siren character (created using the likeness of Chinese actress Bingjie Jiang) moved in sync, driven in-engine at 60 frames per second.

also
http://www.cgw.com/Press-Center/Web...es-Introduces-Lifelike-Real-Time-Charact.aspx
 
^ That still has one foot firmly in the uncanny valley. Although if I hadn't known it was CG from the start, I might not have known exactly why it felt a bit "off".
 
That's Andy Serkis in that last video. He and I have lived on the same street.

I’ve waited a long time to share that.
 