It is usually pretty easy to tell the difference between a computer-generated scene and a real-life one, but some new demos from Epic Games are making that distinction harder and harder to draw.

Epic hosted its State of Unreal presentation at the Game Developers Conference in San Francisco this week, where it showed off a collection of scenes demonstrating its new real-time ray tracing, motion capture, and real-time facial animation mapping.

Also read: Microsoft's 'DirectX Raytracing' technology aims to bring movie-quality lighting to video games

This Star Wars scene showcases Epic Games' implementation of Nvidia's new RTX technology. Ray tracing is a rendering technique that traces the paths of light rays through a scene to produce realistic shading, reflections, and depth of field. Doing it quickly requires incredibly powerful hardware, and real-time implementations are only now becoming feasible thanks to Nvidia's Volta GPUs. Epic will use Microsoft's DirectX Raytracing (DXR) API to make the feature available to Unreal developers later this year.
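To make the idea concrete, here is a minimal, self-contained sketch of the core ray tracing loop: fire a ray through each pixel, intersect it with the scene, shade based on the light direction, and recurse once for a mirror reflection. This toy Python example is purely illustrative (a single hard-coded sphere and light, with invented values) and bears no relation to Epic's or Nvidia's actual GPU implementations, which perform this millions of times per frame in hardware.

```python
# Toy ray tracer: one sphere, one light, Lambertian shading plus a single
# mirror bounce. Every scene value here is invented for illustration.
import math

def normalize(v):
    length = math.sqrt(sum(c * c for c in v))
    return tuple(c / length for c in v)

def dot(a, b):
    return sum(x * y for x, y in zip(a, b))

def hit_sphere(origin, direction, center, radius):
    """Distance along the ray to the sphere, or None on a miss."""
    oc = tuple(o - c for o, c in zip(origin, center))
    b = 2.0 * dot(oc, direction)
    c = dot(oc, oc) - radius * radius
    disc = b * b - 4.0 * c  # quadratic with a == 1 (direction is unit length)
    if disc < 0.0:
        return None
    t = (-b - math.sqrt(disc)) / 2.0
    return t if t > 1e-4 else None

def shade(origin, direction, depth=0):
    center, radius = (0.0, 0.0, -3.0), 1.0
    light = normalize((1.0, 1.0, 0.5))    # direction toward the light
    t = hit_sphere(origin, direction, center, radius)
    if t is None:
        return 0.1                        # background brightness
    point = tuple(o + t * d for o, d in zip(origin, direction))
    normal = normalize(tuple(p - c for p, c in zip(point, center)))
    brightness = 0.8 * max(0.0, dot(normal, light))   # Lambertian diffuse
    if depth < 1:                         # one mirror bounce for reflections
        refl = tuple(d - 2.0 * dot(direction, normal) * n
                     for d, n in zip(direction, normal))
        off = tuple(p + 1e-4 * n for p, n in zip(point, normal))
        brightness += 0.2 * shade(off, normalize(refl), depth + 1)
    return brightness

# One primary ray per character "pixel", printed as ASCII brightness.
width, height = 60, 24
for y in range(height):
    row = ""
    for x in range(width):
        u = (x / width - 0.5) * 2.0
        v = (0.5 - y / height) * 1.6      # squash for tall terminal glyphs
        ray = normalize((u, v, -1.0))
        b = shade((0.0, 0.0, 0.0), ray)
        row += " .:-=+*#%@"[min(9, int(b * 9.0))]
    print(row)
```

Even this trivial version traces thousands of rays for a single low-resolution frame, which hints at why real-time ray tracing has had to wait for dedicated hardware.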

If you still don't believe the scene was computer-generated, here is a behind-the-scenes explanation of what is going on.

Although not as flashy as the Star Wars scene, the next demo is just as incredible. Siren is a digital human developed through a partnership between Epic Games, Tencent, Cubic Motion and 3Lateral. It was created by mapping the appearance of one actor onto the movements of another.
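Conceptually, performance retargeting like this boils down to solving for a set of facial expression weights on the source performer each frame, then replaying those weights on a different face rig. The sketch below is a hypothetical, heavily simplified illustration of that idea using blendshapes; the actual Siren pipeline from Cubic Motion and 3Lateral is far more sophisticated, and all names and numbers here are invented.

```python
# Hypothetical sketch of blendshape-based retargeting: per frame, tracked
# expression weights from the source performer drive a different target face.
# Names and data are invented for illustration; this is not Siren's pipeline.
import numpy as np

# A face is a flat array of vertex positions; each expression is stored as
# a per-vertex offset ("blendshape") from the neutral pose.
target_neutral = np.zeros(9)  # toy 3-vertex face, (x, y, z) per vertex
target_blendshapes = {
    "jaw_open":   np.array([0, -1, 0,  0, -2, 0,  0, -1, 0], float),
    "smile":      np.array([1,  1, 0,  0,  0, 0, -1,  1, 0], float),
    "brow_raise": np.array([0,  0, 0,  0,  1, 0,  0,  0, 0], float),
}

def apply_weights(neutral, blendshapes, weights):
    """Deform the face: neutral pose plus the weighted expression offsets."""
    mesh = neutral.copy()
    for name, w in weights.items():
        mesh += w * blendshapes[name]
    return mesh

# One frame of weights solved from the source performer's footage; the same
# weights replay the expression on the target character's face.
frame_weights = {"jaw_open": 0.6, "smile": 0.3, "brow_raise": 0.1}
print(apply_weights(target_neutral, target_blendshapes, frame_weights))
```

Because only a small vector of weights has to be solved and transferred per frame, rather than full geometry, this kind of mapping can run live on stage.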

Motion capture technology isn't new, but recent advances have made the rendering process easier and have steadily improved its realism. Epic also showed a demo of 3Lateral's Osiris character. Note how realistic the facial animations and mannerisms are, then remember that all of this is being rendered in real time.