Category: Unreal Engine 4

UE4 Rendering Flow

TranslucentRendering.h: Translucent rendering definitions.

TranslucentRendering.cpp: Translucent rendering implementation.

 
 

This part is responsible for rendering translucent meshes.

 
 

 
 

After processing their parameters, the latter two functions both end up calling the DrawMesh() implementation. The framework of DrawMesh looks like this:

 
 

 
 

The key point here is the check that only translucent materials are processed: if (IsTranslucentBlendMode(BlendMode))
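A rough sketch of that shape (paraphrased, not the engine's exact signature; the helper name DrawTranslucentMeshSketch and the trimmed parameter list are mine):

// Simplified skeleton of the DrawMesh() helper in TranslucentRendering.cpp.
static bool DrawTranslucentMeshSketch(const FViewInfo& View, const FMeshBatch& Mesh)
{
    const FMaterial* Material = Mesh.MaterialRenderProxy->GetMaterial(View.GetFeatureLevel());
    const EBlendMode BlendMode = Material->GetBlendMode();

    // The key check: only translucent materials are handled by this pass;
    // opaque and masked materials are drawn elsewhere.
    if (IsTranslucentBlendMode(BlendMode))
    {
        // ... set up the translucency drawing policy and issue the draw ...
        return true;
    }
    return false;
}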

 
 

 
 

Now let's look at where the functions above might be called from. The last few methods implemented in TranslucentRendering.cpp are declared in DeferredShadingRenderer.h (Scene rendering definitions); tracing them back to their source we see the following:

 
 

By the look of it, DeferredShadingRenderer.h (Scene rendering definitions) defines the complete set of scene rendering methods.

UE4 Rendering Module Analysis

UE4's rendering module is an independent module. This article analyzes it from the point of view of its design ideas and overall approach.
A rendering module is usually made up of the following parts:

  • Scene description
  • Scene traversal and culling
  • Render execution

 
 

 
 

Scene Description

 
 

The data structures involved in UE4 scene management are:

  • FScene: the scene class
  • FPrimitiveSceneProxy: a renderable primitive (geometry) in the scene
  • FPrimitiveSceneInfo: a scene node (owns the primitive and its state information)

 
 

Every primitive has material properties; the related data structures are:

  • FMaterial: the material interface class; provides queries of material properties (e.g. blend mode) and shader lookup.
  • FMaterialResource: the concrete FMaterial implemented by UMaterial.
  • FMaterialRenderProxy: the material object used by the render thread; provides access to the FMaterial and to material parameters (e.g. scalar, vector and texture parameters).

 
 

 
 

 
 

Scene Traversal and Culling

 
 

In a simple 3D rendering engine the usual approach is: traverse the scene nodes, test each node against the view frustum to decide whether it is visible, store visible nodes in a render queue and skip the rest, then sort the render queue by material and draw the nodes.

 
 

UE4 handles this differently: primitives are processed in two categories, static primitives and dynamic primitives.

 
 

  • Static Render Path

    The FScene object holds a number of static draw lists. When a PrimitiveSceneProxy is inserted into the scene, FPrimitiveSceneProxy::DrawStaticElements() is called to collect the FStaticMeshElements data. Corresponding drawing policy object instances are then created and put into the draw lists, sorted by material.

 
 

  • Dynamic Render Path

    For the dynamic render path, once InitViews has determined that a PrimitiveSceneProxy is visible, FPrimitiveSceneProxy::DrawDynamicElements() is called with a TDynamicPrimitiveDrawer object to collect FMeshElements for later rendering.

 
 

The two render paths do not conflict: an FPrimitiveSceneProxy can implement both DrawStaticElements() and DrawDynamicElements() to support both at once, i.e. the scene proxy has both static and dynamic FMeshElements. A minimal sketch is given below.
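The sketch assumes the FPrimitiveSceneProxy interface of this engine era (where DrawStaticElements/DrawDynamicElements are the virtual hooks); FMyPrimitiveSceneProxy is a hypothetical class and the mesh-building details are elided.

class FMyPrimitiveSceneProxy : public FPrimitiveSceneProxy
{
public:
    FMyPrimitiveSceneProxy(const UPrimitiveComponent* InComponent)
        : FPrimitiveSceneProxy(InComponent)
    {}

    // Static path: called once when the proxy is added to the scene. The submitted
    // FStaticMeshElements end up in FScene's static draw lists, wrapped in drawing
    // policy objects and sorted by material.
    virtual void DrawStaticElements(FStaticPrimitiveDrawInterface* PDI) override
    {
        // Build an FMeshBatch (vertex factory, material render proxy, index buffer ...)
        // and submit it through PDI->DrawMesh(...).
    }

    // Dynamic path: called per frame for proxies that InitViews judged visible,
    // with a TDynamicPrimitiveDrawer behind the FPrimitiveDrawInterface.
    virtual void DrawDynamicElements(FPrimitiveDrawInterface* PDI, const FSceneView* View) override
    {
        // Build this frame's FMeshBatch (e.g. procedurally updated data)
        // and submit it through PDI->DrawMesh(...).
    }
};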

 
 

 
 

 
 

Render Execution

 
 

A simple rendering engine would just render each visible primitive directly (set the render state, GPU shaders and parameters, and issue the draw call). UE4's rendering is more involved: it draws in multiple passes. The passes are listed in order and described one by one below.

 
 

  1. PASS_0: PrePass / Depth Only Pass

    This pass draws with the FDepthDrawingPolicy and only writes depth into the depth buffer. This helps reduce pixel fill in the later base pass and saves pixel-shader work.

     
     

  2. PASS_1: Base pass

    This pass draws geometry with opaque and masked materials and writes the material attributes into the G-Buffer; at the same time the lightmap and sky lighting contributions are accumulated into the scene color buffer. The relevant functions are listed below.

     
     

  3. PASS_2: Issue Occlusion Queries / BeginOcclusionTests

    Issues occlusion queries; when the next frame is drawn, InitViews uses this information for visibility determination. Occlusion queries work by drawing a primitive's bounding box and z-depth testing it to roughly decide whether the primitive is occluded.

     
     

  4. PASS_3: ShadowMap (shadow computation)

    Renders the shadow map for each light; the lights are also accumulated into the translucency lighting volumes (I am not sure about this part, my understanding may be off).

     
     

  5. PASS_4: Lighting

    Split into the following sub-stages:

  • Pre-lighting composition lighting stage: composited lighting applied up front (e.g. deferred decals, SSAO)
  • Render lights: the actual lighting computation

 
 

  6. PASS_5: Draw atmosphere

    Draws the atmosphere effect over opaque surfaces.

     
     

  7. PASS_6: Draw Fog

    Computes fog per pixel for opaque surfaces.

     
     

  8. PASS_7: Draw translucency

    Draws translucent geometry.

    Translucency is accumulated into an offscreen render target where it has fogging applied per-vertex so it can integrate into the scene. Lit translucency computes final lighting in a single pass to blend correctly.

     
     

  9. PASS_8: Post Processing

    Draws the post-processing effects.

Four Ways to Load Assets in UE4

 
 

http://blog.csdn.net/u012385432/article/details/52154737

 
 

In Unity, we generally load an asset with Resources.Load(path), which returns an Object. If you want a material, a texture, etc., you only need to add a cast keyword such as as Material and you get back a reference to a material...

 
 

In UE4, asset loading is quite different. After an afternoon of experimenting I have found these four ways of loading assets. In UE4 a Blueprint is roughly the equivalent of a prefab in Unity, so we turn our assets into Blueprints and load them that way.

 
 

 
 

 
 

Method 1: if the Blueprint has a C++ class (i.e. the Blueprint was created from a C++ class), load it directly

 
 

ATemp* spawnActor = GetWorld()->SpawnActor<ATemp>(ATemp::StaticClass());

 
 

Every way of loading an asset and spawning it into the scene comes down to this SpawnActor call. If your Blueprint is backed by a C++ class, you can access the class's StaticClass directly.

 
 

The remaining methods simply load the Blueprint itself, with no corresponding C++ class.

 
 

 
 

 
 

Method 2: load via ConstructorHelpers

 
 

static ConstructorHelpers::FClassFinder<AActor> bpClass(TEXT("/Game/BluePrint/TestObj"));
if (bpClass.Class != NULL)
{
    GetWorld()->SpawnActor(bpClass.Class);
}

 
 

FClassFinder is a struct whose Class member is of type TSubclassOf<T>, so all we have to do is SpawnActor(bpClass.Class) to spawn what we need.

 
 

Note, however, that this method can only be used inside a class constructor; embedding it in ordinary game logic is very likely to cause a crash. The concrete steps this code goes through are shown below.

 
 

 
 

 
 

Method 3: load via FStringAssetReference

 
 

FStringAssetReference asset = "Blueprint'/Game/BluePrint/TestObj.TestObj'";
UObject* itemObj = asset.ResolveObject();
UBlueprint* gen = Cast<UBlueprint>(itemObj);
if (gen != NULL)
{
    AActor* spawnActor = GetWorld()->SpawnActor<AActor>(gen->GeneratedClass);
}

 
 

The main job of FStringAssetReference is to find the asset corresponding to a given string, or conversely to get the in-project path string of a given asset (the string mentioned above).

 
 

Here, asset.ResolveObject() looks up the asset for the string and returns a UObject; we cast it to UBlueprint and then take its GeneratedClass.

 
 

 
 

 
 

Method 4: load via StaticLoadObject

 
 

UObject* loadObj = StaticLoadObject(UBlueprint::StaticClass(), NULL, TEXT("Blueprint'/Game/BluePrint/TestObj.TestObj'"));
if (loadObj != nullptr)
{
    UBlueprint* ubp = Cast<UBlueprint>(loadObj);
    AActor* spawnActor = GetWorld()->SpawnActor<AActor>(ubp->GeneratedClass);
    UE_LOG(LogClass, Log, TEXT("Success"));
}

 
 

The principle is almost identical to the third method; only the call is different, so I won't repeat it.

To sum up, methods three and four should be the most generally useful: method one requires the Blueprint to have a corresponding C++ class, and method two must be called from a constructor (any class's constructor will do, but it has to be a constructor)...

 
 

Who knew something as simple as loading an asset could be done in so many ways... I'm floored... good grief...

 
 

 
 

Getting Started with VR in Unreal Engine 4

http://www.tomlooman.com/getting-started-with-vr/

 
 

 
 

This guide is for anyone who is looking to get into developing for Virtual Reality projects in Unreal Engine 4. Covering Blueprint, C++, performance considerations and how to set up your VR kits for UE4.

 
 

I highly recommend using the latest release of Unreal Engine 4 as VR is still being improved greatly with each new release.

 
 

A few good places to reference are the official Oculus forums, the official VR documentation pages and the Unreal Engine VR Subforums.

 
 

If you are looking for VR Templates to get you started right away, go to my GitHub Repository. These templates are currently WIP for both C++ and Blueprint with a variety of camera and motion controller features, and include the performance optimizations we will discuss in this guide. The official VR Template is currently being developed and will be released as soon as possible!

 
 

 
 

 
 

Setup your VR Device

 
 

For this guide I will assume you have successfully installed your head-mounted display of choice (Visit Oculus Rift Setup or HTC Vive Pre Setup in case you did not). In case you are having difficulties getting your Vive to work, I found this Troubleshooting guide to be helpful.

 
 

Unreal Engine 4 supports all the major devices and you don't have to go through any hassle to set up your game project for VR. Just make sure that the correct plugins are loaded for your HMD under Edit > Plugins. There are some performance considerations to take into account; we'll cover these later in the guide.

【UE4 already supports all the mainstream VR devices and can generally be used directly; if it doesn't work, check whether the corresponding VR plugin is enabled.】

 
 


 
 

Before you launch the editor make sure your VR software is running, in the case of the HTC Vive this is the SteamVR app.

【Make sure the VR software is running before launching the editor.】

 
 

 
 

 
 

Launching VR Preview

 
 

Testing out your VR set is very straightforward, simply select “VR Preview” from the Play drop-down button. By default the head tracking will work right away without any changes to your existing project or template. I will go into more detail on how to add additional features such as motion controller setup and origin resetting etc. later on in this guide.

【Just use VR Preview and you can see the result.】

 
 


 
 

 
 

 
 

VR Best Practices

 
 

VR is still a relatively unexplored area, and we are learning new things with every day of development. Both Oculus and Epic Games have set up a Best Practices Guide that I recommend you read through and keep in the back of your head while developing games for VR.

【It is recommended to read each vendor's Best Practices Guide first.】

 
 

 
 

 
 

Using VR in Blueprint

 
 

Using VR in Blueprint is very straightforward and you don’t need a lot of set up to get yourself going.

 
 

You will need a Camera Component and optionally one or two Motion Controllers Components. By default your Camera is already set up for HMD support, if you wish to disable rotation changes from the HMD you can disable “Lock to HMD” in the Component’s properties. For more information on the Motion Controllers you can jump down in this guide or immediately jump to the official documentation page on how to Setup Motion Controllers.

【The camera is driven by the HMD by default, but you can turn that off; motion controllers can also be added directly.】
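For reference, the same "Lock to HMD" toggle exists on UCameraComponent in C++ (a sketch; bLockToHmd is the property behind that checkbox in 4.11+, and CameraComp is an illustrative member):

// In the owning Actor's constructor:
CameraComp = CreateDefaultSubobject<UCameraComponent>(TEXT("Camera"));
// true (the default) follows the HMD; set to false to ignore HMD rotation changes.
CameraComp->bLockToHmd = false;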

 
 

Here is a (non-exhaustive) list of the available nodes in Blueprint:

 
 


 
 

To reset your HMD position and/or orientation (With optional Yaw offset):

【Reset.】

 
 


 
 

To selectively enable features when using VR you can easily check whether your HMD is enabled:

【Checking whether VR is enabled.】
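In C++ the same two Blueprint nodes are available through UHeadMountedDisplayFunctionLibrary (a sketch; the include path can differ slightly between engine versions, and AMyPawn is an illustrative class):

#include "Kismet/HeadMountedDisplayFunctionLibrary.h"

void AMyPawn::RecenterVR()
{
    // Only do VR-specific work when an HMD is actually enabled.
    if (UHeadMountedDisplayFunctionLibrary::IsHeadMountedDisplayEnabled())
    {
        // Re-center position and orientation, with an optional yaw offset in degrees.
        UHeadMountedDisplayFunctionLibrary::ResetOrientationAndPosition(0.f);
    }
}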

 
 


 
 

 
 

 
 

SteamVR Chaperone

 
 

The Chaperone component is specific to SteamVR and has easy access to the soft bounds. The soft bounds are represented as an array of Vectors centered around the calibrated HMD’s Origin (0,0,0). The Z component of the Vectors is always zero. You can add this component like any other ActorComponent to your Blueprint as seen below.
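A small sketch of reading those bounds from C++ (assuming the component exposes a GetBounds() accessor as its Blueprint node suggests; ChaperoneComp is an illustrative USteamVRChaperoneComponent* member on the actor):

TArray<FVector> SoftBounds = ChaperoneComp->GetBounds();
for (const FVector& Corner : SoftBounds)
{
    // Corners are centered around the calibrated HMD origin; Z is always 0.
    UE_LOG(LogTemp, Log, TEXT("Chaperone corner: %s"), *Corner.ToString());
}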

 
 


 
 

 
 

 
 

USteamVRChaperoneComponent

 
 

To use the chaperone in C++ open up your ProjectName.Build.cs and add the “SteamVR” module to the PrivateDependencyModuleNames array. See below for a sample.

【In C++, enable the SteamVR module to use the SteamVR Chaperone.】

 
 

using UnrealBuildTool;

public class VRFirstPerson : ModuleRules
{
    public VRFirstPerson(TargetInfo Target)
    {
        PublicDependencyModuleNames.AddRange(new string[] { "Core", "CoreUObject", "Engine", "InputCore" });

        /* VR Required Modules */
        PrivateDependencyModuleNames.AddRange(new string[] { "HeadMountedDisplay", "SteamVR" });
    }
}

 
 

 
 

 
 

Setup Motion Controllers

 
 

The official documentation has a good introduction on Motion Controller setup and input handling, so if your VR Device supports motion controllers I recommend following along with the documentation. For a practical example check out my VR Templates on GitHub.

【Both the official documentation and my templates contain good demos you can learn from directly.】

 
 

If you’re having trouble aligning your Motion Controllers with the Camera, simply use a SceneComponent as “VROrigin”, this is especially helpful when the root component has an undesirable pivot like the CapsuleComponent in a Character Blueprint.

【When using VR it is recommended to group the related components under a separate SceneComponent, structured as in the image below.】
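A sketch of that hierarchy in C++ (illustrative member names; SetupAttachment assumes a 4.12+ style attachment API, older versions set AttachParent instead):

// In the Character's constructor: a plain SceneComponent as the VR origin, with the
// camera and both motion controllers parented to it instead of the capsule.
VROriginComp = CreateDefaultSubobject<USceneComponent>(TEXT("VROrigin"));
VROriginComp->SetupAttachment(GetRootComponent());

CameraComp = CreateDefaultSubobject<UCameraComponent>(TEXT("Camera"));
CameraComp->SetupAttachment(VROriginComp);

LeftHandComp = CreateDefaultSubobject<UMotionControllerComponent>(TEXT("LeftHand"));
LeftHandComp->Hand = EControllerHand::Left;
LeftHandComp->SetupAttachment(VROriginComp);

RightHandComp = CreateDefaultSubobject<UMotionControllerComponent>(TEXT("RightHand"));
RightHandComp->Hand = EControllerHand::Right;
RightHandComp->SetupAttachment(VROriginComp);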

 
 


 
 

 
 

 
 

 
 

Using VR in C++

 
 

As of 4.11 not all functionality is exposed to Blueprint; if you are looking to do more advanced custom setups you might need to dig into C++ to adjust a few settings. Check out IHeadMountedDisplay.h for a look at the available functions. Certain plugins add additional features like SteamVRChaperoneComponent but are specific to a single device.

【UE4 does not expose all functionality to Blueprint, so advanced features still have to be handled in C++; IHeadMountedDisplay.h shows the available functions.】
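A small sketch of poking at the device directly (assuming the 4.11-era engine API, where the active HMD is reachable through GEngine->HMDDevice as an IHeadMountedDisplay shared pointer; LogHmdState is an illustrative helper):

#include "IHeadMountedDisplay.h"

void LogHmdState()
{
    if (GEngine && GEngine->HMDDevice.IsValid() && GEngine->HMDDevice->IsHMDConnected())
    {
        // IHeadMountedDisplay.h exposes queries and controls that Blueprint does not.
        const bool bEnabled = GEngine->HMDDevice->IsHMDEnabled();
        UE_LOG(LogTemp, Log, TEXT("HMD connected, enabled=%d"), bEnabled ? 1 : 0);
    }
}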

 
 

 
 

 
 

Required Modules & Includes

 
 

If you wish to access the HMD features through C++ you need to include the "HeadMountedDisplay" module in your ProjectName.Build.cs file, which you can find in your Visual Studio solution explorer. Here is an example of the build file from the VRFirstPerson project.

【As before, you first have to add the relevant module; below is the example from the first-person VR project.】

 
 

using UnrealBuildTool;

public class VRFirstPerson : ModuleRules
{
    public VRFirstPerson(TargetInfo Target)
    {
        PublicDependencyModuleNames.AddRange(new string[] { "Core", "CoreUObject", "Engine", "InputCore" });

        /* VR Module */
        PrivateDependencyModuleNames.AddRange(new string[] { "HeadMountedDisplay" });

        // …
    }
}

 

To use HMD features or the motion controller component, make sure you include the following header files.

【And include the relevant header files.】

 
 

/* VR Includes */
#include "HeadMountedDisplay.h"
#include "MotionControllerComponent.h"

 
 

 
 

 
 

 
 

Performance Considerations

 
 

For the whole VR experience to look smooth, your game needs to run at 75 Hz (Oculus DK2) or even 90 Hz (HTC Vive and Oculus CV1) depending on your device. To see your current framerate, type "stat fps" or "stat unit" (for a more detailed breakdown) into your console while running the game.

 
 

 
 

 
 

CPU Profiling

 
 

Your game might be CPU or GPU bound, to find out you need to measure (a quick way is to use “stat unit”). With the complexity of current gen games and engines it’s near impossible to make good guesses on what’s bottlenecking your performance so use the tools at your disposal! Bob Tellez wrote a blog post on CPU Profiling with Unreal Engine 4 and it’s a good place to get started.

【See this for CPU-side profiling of UE4 VR.】

 
 

 
 

 
 

GPU Profiling

 
 

To capture a single frame with GPU timings press Ctrl+Shift+, or type in “profilegpu” in the console. This command dumps accurate timings of the GPU, you will find that certain processes are a heavy burden on the framerate (Ambient Occlusion is one common example) when using VR.

【Ctrl+Shift+, shows the GPU timings of a single frame; use this to analyze performance problems.】

 
 


 
 

The GPU Profiling & Performance and Profiling docs are a good place to learn about profiling your game.

【Related documentation.】

 
 

While profiling you might stumble on other costly features depending on your scene and project. One example is the Translucent Lighting Volume, which you may not need but which adds a static cost to your scene even when unused; check out this AnswerHub post by Daniel Wright for more info on how to disable this feature. All that is left for you to do is measure and test; there is no single configuration that is perfect for all projects.

【Some unneeded rendering features can be turned off; see the text above for how to disable a rendering feature.】

 
 

The developers from FATED came up with a great list of tips in their quest for optimized VR. A few examples they mention are to disable HZB Occlusion Culling (r.HZBOcclusion 0), Motion Blur (r.DefaultFeature.MotionBlur=False) and Lens Flares (r.DefaultFeature.LensFlare=False). The commands do not persist through multiple sessions, so you should add (or search and replace) them in your /Config/DefaultEngine.ini config file although most of these settings are available through Edit > Project Settings… > Rendering.

【See these tips for which VR-irrelevant rendering settings to turn off.】
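A sketch of what those tips look like when persisted in /Config/DefaultEngine.ini (the section name follows this guide's later advice; exact placement can vary by engine version):

[/Script/Engine.RendererSettings]
r.DefaultFeature.MotionBlur=False
r.DefaultFeature.LensFlare=False
r.HZBOcclusion=0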

 
 

Another great optimization to consider is the Instanced Stereo Rendering, we’ll talk about that next.

 
 

 
 

 
 

Instanced Stereo Rendering

 
 

The latest 4.11 release introduces Instanced Stereo Rendering, check the video below for a comparison video of how that works.

【A very important rendering capability supported by the new version.】

 
 

https://www.youtube.com/watch?v=uTUwKC7GXjo

(“Basically, we’re utilizing hardware instancing to draw both eyes simultaneously with a single draw call and pass through the render loop. This cuts down render thread CPU time significantly and also improves GPU performance. Bullet Train was seeing ~15 – 20% CPU improvement on the render thread and ~7 – 10% improvement on the GPU.” – Ryan Vance.)

 
 

To enable this feature in 4.11 and above, go to your Project Settings and look for “Instanced Stereo” under the Rendering category.

【See the image below for how to enable/disable this feature.】
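If you prefer config files over the Project Settings UI, the same switch can be set in /Config/DefaultEngine.ini (a sketch, assuming the 4.11 console variable name):

[/Script/Engine.RendererSettings]
vr.InstancedStereo=True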

 
 


 
 

 
 

 
 

Disable Heavy Post-Processors

 
 

Certain post-processing effects are very costly in VR like Ambient Occlusion. Others may even become an annoyance in VR like Lens Flares as they may break your immersion of being present in the scene and instead looking through a camera. These are easy examples to get started and see how it affects your game and performance.

【Many expensive post-processing effects need to be turned off.】

 
 

To disable post processing features on a project level, go to Edit > Project Settings > Rendering. You can do the same thing in post-processing volumes. Keep in mind that post-processing volumes can override the project-wide settings specified below.

【See the image below for how to do it.】

 
 


 
 

 
 

 
 

Reduce Scene Complexity

 
 

With current gen hardware it’s really difficult to stay on your 90 fps target. You may need to revisit your previous traditional constraints and look at your scene complexity like dynamic shadows, atmospheric smoke effects and polycount of meshes.

【Reducing scene complexity is one of the most effective ways to improve performance.】

 
 

It’s important to minimize overdraw to keep performance at a maximum. Lots of translucent surfaces and/or particle effects can easily cause your framerate to tank. To visualize the current shader complexity / overdraw press Alt+8 in your viewport (Alt+4 to return to default view). Look at the bottom picture from the Elemental Demo to get an idea of how much the atmospheric effects can impact your framerate (green = good, red = bad, white hot = extremely bad at about 2000 shader instructions per pixel)

【Alt+4: default view; Alt+8: visualize the current shader complexity / overdraw. Use this to inspect overdraw cost; green means good.】
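The same view modes can also be toggled from the console (these are the standard viewmode commands, handy if the hotkeys are inconvenient):

viewmode shadercomplexity
viewmode lit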

 
 

Dynamic shadows and lights have a huge impact on performance too. Bake as much lighting as you can to keep the per-frame cost as low as possible.

【Dynamic lighting is also a performance concern.】

 
 


 
 

 
 

 
 

 
 

List of Rendering Commands

 
 

The excellent talk by Nick Whiting and Nick Donaldson contains a list of render commands to use for GPU optimization in VR. You can find the list below. I recommend watching their talk regardless as it contains great info on the basics of Virtual Reality in general.

【The talk includes a list of rendering commands for VR rendering optimization; strongly recommended.】

 
 

To test out these commands hit ~ (Tilde) to open the command console. Once you settled on a command to be included for your project, you can add them to your configuration in /Config/DefaultEngine.ini under [/Script/Engine.RendererSettings]. Tip: Check if the command exists in the list before adding it yourself.

【You can add these settings to your project's configuration as described above.】

 
 

r.SeparateTranslucency=0

r.HZBOcclusion=0

r.FinishCurrentFrame=1

r.MotionBlurQuality=0

r.PostProcessAAQuality=3

r.BloomQuality=1

r.EyeAdaptationQuality=0

r.AmbientOcclusionLevels=0

r.DepthOfFieldQuality=0

r.SceneColorFormat=2

r.TranslucentLightingVolume=0

r.TranslucencyVolumeBlur=0

r.TranslucencyLightingVolumeDim=4

r.MaxAnisotropy=8

r.LensFlareQuality=0

r.SceneColorFringeQuality=0

r.FastBlurThreshold=0

r.SSR.MaxRoughness=0

r.SSR.Quality=0

r.rhicmdbypass=0

r.TiledReflectionEnvironmentMinimumCount=10

 
 

 
 

 
 

Troubleshooting

 
 

  • Vive Specific (Non-PRE editions): Once you have launched the editor, SteamVR may state "Not Ready"; this means something may be overlapping and preventing the Compositor screen from running at more than 60 FPS, causing jittering and motion sickness. More information and a workaround for this issue can be found in this AnswerHub thread! The next iteration of Vive devices (Vive PRE) no longer has this issue as they moved to direct mode for the displays; for this, make sure you have updated your graphics drivers to support direct mode.

References

  • Official Documentation Main Page
  • VR Cheat Sheet
  • Unreal Engine VR Subforums
  • Unreal Engine VR Playlist

Hopefully this guide has helped you get started with your Virtual Reality project!

If you have a question or feel that I missed something important, let me know by leaving a reply below! To stay in touch, follow me on Twitter!