Skyboxes are textures that you use to display distant objects and environments in your scene, so that you don’t have to render the universe to an infinite distance. They’re often pre-rendered or captured, although you can start adding layered and dynamic elements to make them fancier. Ultimately they’re an optimisation: a way to give the impression that the player is in a huge world, even though it really ends a lot closer than it looks.
In this post, I’m hopefully going to fill a gap in the UE4 documentation by properly explaining the pipeline for skyboxes. Because although the official documentation has several chapters on it, it somehow manages to completely avoid explaining some pretty important bits.
Aren’t cube maps those 6-faced DDS things?
A lot of my accumulated knowledge from when I previously did 3D graphics is still valid, but some of it ends up being a bit out of date, such as having a gut feeling that a model using more than 2 materials, or more than 5000 triangles, is too complex 😉 (it’s not, that’s small beer these days).
One of those things was thinking that the way you did environment maps for reflections and skyboxes was via 6-sided cubemaps. Typically these would be in .dds files (a DirectX format originally), encoding all 6 square faces directly, or sometimes you’d use 6 separate images and ask the engine to combine them. I spent a lot of time last week trying to make either of these work in Unreal, and hit all kinds of problems because of this faulty assumption.
Sure, I knew that there were plenty of other projections, particularly for real-time reflection mapping, but I’d assumed that for pre-rendered environment maps that 6-sided cubemaps were still the way to go. In UE4, they’re not.
Enter the Equirectangular projection
Instead of the 6 faces of a cube, UE4 really wants environment maps in Equirectangular projection, which is an image which is twice as wide as it is high, and encodes the 6 faces of the skybox cube like this:
As you can see, all 6 faces are encoded into one 2D texture, with the faces quite distorted to provide a spherical mapping where moving an equal distance in longitude or latitude moves an equal distance in pixels, anywhere on the image. Sometimes this is called “Spherical” or “Lat-Long” mapping.
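That “equal angles map to equal pixels” property is the whole trick. Here’s a minimal Python sketch of the mapping from a view direction to equirectangular UVs — note this assumes a y-up, z-forward axis convention for illustration; UE4’s own axes differ, but the principle is identical:

```python
import math

def direction_to_equirect_uv(x, y, z):
    """Map a normalised view direction to equirectangular UV coordinates.

    Longitude maps linearly to U and latitude linearly to V -- the defining
    property of the projection. Assumes y-up, z-forward (illustrative only).
    """
    lon = math.atan2(x, z)                    # -pi..pi around the horizon
    lat = math.asin(max(-1.0, min(1.0, y)))   # -pi/2..pi/2 up/down
    u = lon / (2 * math.pi) + 0.5             # 0..1 across the full width
    v = 0.5 - lat / math.pi                   # 0 at the top, 1 at the bottom
    return u, v
```

Looking straight ahead lands you in the exact centre of the map, straight up hits the top edge, and so on.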
This format is extremely common in HDRI libraries like HDRIHaven and seems to be the default in visual effects. It has the advantage over cube maps that it’s a single file, and it doesn’t require an esoteric format like DDS, because it’s just a regular 2D image.
Using existing cubemaps
But what if you have old-style cubemap files like this old man is used to? Unfortunately, UE4 is quite restrictive here, which is what caused me confusion. For one, it seems to have no support at all for combining 6 separate images into a cubemap in the editor itself, the default fallback I would have usually reached for.
And, while you can import 6-sided DDS cubemaps into UE4, you can only do so if they’re Low Dynamic Range (LDR) e.g. 24-bit RGB. But let’s face it, your skyboxes and environment maps are probably High Dynamic Range (HDR), because they often feature very bright light sources, such as the sun. You just can’t encode that adequately in 24-bit RGB for modern rendering.
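A quick numpy illustration of why that is — the 500x brightness figure here is made up for the example (real suns are brighter still):

```python
import numpy as np

# Linear radiance values: dim ground, bright cloud, and a sun disc
# that's (say) 500x brighter than the cloud.
radiance = np.array([0.2, 1.0, 500.0], dtype=np.float32)

# LDR: clamp to [0, 1] and quantise to 8 bits per channel.
# The sun and the cloud both become plain white -- the difference is gone.
ldr = np.round(np.clip(radiance, 0.0, 1.0) * 255).astype(np.uint8)

# HDR (float formats like Radiance .hdr) simply keeps the full range.
hdr = radiance
```

Once the sun is clamped to the same value as an ordinary bright pixel, the renderer can’t recover the difference for bloom, exposure or lighting.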
HDR DDS cubemaps do not import successfully into UE4. A perfectly valid floatRGB DDS cubemap is simply rejected by the import process, citing an “unsupported pixel format”. A bit annoying, but it seems to be by design; I found mention of this in the forums.
So, the only route that works for HDR cubemaps is the equirectangular format. The best format for the image seems to be the Radiance HDR (.hdr) format. We’ll need to convert existing HDR DDS cubemaps to this manually before import.
Converting DDS Cubemaps to Equirectangular .hdr
The documentation in UE4 about doing this is kinda vague. They heavily push you towards using the NVidia Texture Tools, but those are a plugin for Photoshop. If, like me, you don’t use Photoshop, that’s pretty useless advice and you’re a bit stuck, because it makes no suggestion about what else you might use, and doesn’t document how you might do it other ways. This is a rare case of the UE4 asset pipeline being a bit weak; I was surprised. 🤔
Luckily all you really need is a tool that can import cubemaps, and that can re-map those to an equirectangular / spherical / lat-long mapping. I really wish the UE4 docs would just say that, rather than being coy and telling you to go use those Nvidia tools. Then I wouldn’t have felt the need to write this blog post. 🙄
There are probably other options, but I found the rather vaguely named open source Image Viewer tool to be ideal for this job. After opening it:
- File > Import and load your DDS file (including HDR)
- Tools > Cubemap to Latlong
- Pick a (horizontal) resolution; it will automatically set the vertical to half that
- File > Export to write to a 2D equirectangular texture
- If you’re using HDR, set “Save as type” to HDR (*.hdr)
You can then import that 2D equirectangular file into UE4; it should automatically recognise that it’s a spherical map, and HDR will be supported.
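If you’d rather script the conversion than click through a GUI, the core resampling is simple enough to sketch in Python with numpy. To be clear, this is my own nearest-neighbour sketch, not what Image Viewer does internally, and cube face orientation conventions differ between tools, so treat the face layout below (OpenGL-style, y-up) as an assumption to adjust for your source files:

```python
import numpy as np

def cubemap_to_equirect(faces, width):
    """Resample 6 cubemap faces into one 2:1 equirectangular image.

    faces: dict mapping '+x','-x','+y','-y','+z','-z' to square (N, N, 3) arrays.
    width: output width in pixels; the height is width // 2.
    Nearest-neighbour only -- a sketch, not production-quality filtering.
    """
    height = width // 2
    first = next(iter(faces.values()))
    n = first.shape[0]
    out = np.zeros((height, width, 3), dtype=first.dtype)
    for row in range(height):
        lat = np.pi / 2 - (row + 0.5) / height * np.pi
        for col in range(width):
            lon = (col + 0.5) / width * 2 * np.pi - np.pi
            # Direction on the unit sphere for this lat/long (y-up, z-forward)
            x = np.cos(lat) * np.sin(lon)
            y = np.sin(lat)
            z = np.cos(lat) * np.cos(lon)
            ax, ay, az = abs(x), abs(y), abs(z)
            # Dominant axis picks the face; the other two axes give the face UV
            if ax >= ay and ax >= az:
                face, fu, fv = ('+x', -z / ax, -y / ax) if x > 0 else ('-x', z / ax, -y / ax)
            elif ay >= az:
                face, fu, fv = ('+y', x / ay, z / ay) if y > 0 else ('-y', x / ay, -z / ay)
            else:
                face, fu, fv = ('+z', x / az, -y / az) if z > 0 else ('-z', -x / az, -y / az)
            i = min(int((fv + 1) / 2 * n), n - 1)   # row within the face
            j = min(int((fu + 1) / 2 * n), n - 1)   # column within the face
            out[row, col] = faces[face][i, j]
    return out
```

You’d still need something else to decode the DDS faces and write the .hdr file, but the projection itself is just this loop.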
Skybox texture settings
Once you have the skybox imported, it’ll look like this in your content folder:
You’ll want to open the details and change a few things. Firstly, you should know that by default, UE4 will reduce your skybox resolution to 512x512 on each face, regardless of the import texture size:
I assume this is just a safe default on the assumption that a lot of cubemaps will just be for reflection maps, but this is far too low resolution for a skybox.
You need to set the “Max Texture Size” in the advanced section of Compression Settings to something larger (it actually starts at “0”, which seems to mean the default, currently 512). A good guide to preserve the original resolution is to set it to half the vertical resolution of the original equirectangular map, so if you imported a 4096x2048 map, set this to 1024.
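The reasoning behind that rule of thumb: each cube face spans 90° of the 360° horizontal range, so the per-face resolution works out to a quarter of the equirectangular width, which for a 2:1 map is the same as half its height. A throwaway helper to make the arithmetic concrete (the function name is mine, not a UE4 API):

```python
def recommended_max_texture_size(width, height):
    """Suggested Max Texture Size for an equirectangular import:
    half the vertical resolution, i.e. the derived per-face resolution
    (90 degrees out of the 360-degree horizontal span)."""
    if width != 2 * height:
        raise ValueError("equirectangular maps should be 2:1")
    return height // 2
```

So a 4096x2048 import wants 1024, and a 2048x1024 import wants 512 (which happens to be the default anyway).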
Finally, for a skybox you’ll want to tweak the Level Of Detail settings, to disable mipmaps, and to set the texture group to “Skybox”.
Note: you might wonder “won’t I still need mipmaps for using this as a reflection map?”. Well, in practice no, because you’ll use reflection captures for this, which will capture other parts of the scene as well, and they will have their own mipmap settings. You only really need this big texture for the actual skybox, and it’ll never need mipmaps.
Creating a skybox material
So once you have this texture, you’ll need a material to render it with. Because a skybox is supposed to be infinitely distant from the camera, it should never perceptibly translate relative to the camera. For this reason, we’ll always be generating the texture coordinates for whatever geometry we apply the skybox to, simply based on the view direction.
This is the solution you’ll generally find online to this:
That reflection vector calculation looks a bit odd though, right? What it’s doing is reflecting the world-space camera vector on a null plane, which has the effect of inverting it because most of the terms become zero. We want this, because the camera vector is the vector from a pixel in the world, back to the camera, and what we want is the opposite to look up the cubemap. So I prefer to just do that directly instead:
I don’t know why there isn’t a “Negate Vector” node in materials like there is in Blueprints, hence the “subtract from zero vector” - if there’s a better way please let me know. I still think it’s better than reflecting with a zero normal. I also added a brightness scaling parameter for convenience.
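For the curious, here’s why the reflect trick works at all. The reflection node computes something like R = 2(N·C)N − C — that’s my reading of the standard reflection formula, not quoted from UE4 source — so feeding it a zero “normal” kills the first term and leaves plain negation. A tiny numpy sketch:

```python
import numpy as np

def reflection_vector(camera, normal):
    """Standard reflection formula: R = 2*(N . C)*N - C.
    (A sketch of what the material node plausibly does internally.)"""
    camera = np.asarray(camera, dtype=float)
    normal = np.asarray(normal, dtype=float)
    return 2.0 * np.dot(normal, camera) * normal - camera

cam = np.array([0.3, -0.5, 0.81])
via_reflect = reflection_vector(cam, np.zeros(3))  # zero "normal" zeroes the first term...
direct = -cam                                      # ...leaving plain negation
```

Both routes land on the same vector, which is why negating the camera vector directly reads more honestly in the graph.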
You’ll also want to make a few tweaks to the top-level material settings:
It’s unlit, and you’ll probably want it to be double-sided so you can use it on the inside of skybox geometry.
This is all fine but you need to render this skybox on something, right? The benefit of generating texture coordinates from the camera is that any geometry we put this material on will appear to be infinitely far away.
One way of doing skybox geometry is to just put a sphere around your world. This is the default in UE4 blank levels, although they use a special object “BP_Sky_Sphere” which is tuned for dynamic sun and clouds. You don’t need that for a static skybox, you can just use a regular sphere, or even a cube if you want.
The important thing to remember is that by default this surrounding geometry is static. You can make it very large, and that’s what BP_Sky_Sphere does, but it’s still possible to “pop” through it if you move the camera far enough.
One way of addressing that is to make the sphere follow the camera so that never happens, and that’s probably the best solution for large outdoor worlds.
However, for indoor or enclosed worlds, those which aren’t sprawling in all directions, you can just surround the areas of the level where you can see “out” with geometry which fills the gaps, and apply the skybox material to that. The shape of the geometry doesn’t matter, so long as it fills the gaps, and never accidentally occludes any non-skybox geometry from any viewpoint. For completely indoor levels, you could just put the skybox material on a plane on the outside of every window, and it would look fine.
One benefit of this localised approach is that you can use different skyboxes for different parts of the level, if they’re far enough away for that to look realistic. Another benefit is that you have less overdraw if you’re only rendering it in places where you can see outside the level.
All the HDRI libraries available are great, but they’re all real-world environments. What if you want a nice space scene, or a mythical landscape? There are almost infinite options, but let’s talk practicalities about a couple.
Space skyboxes with Spacescape
Spacescape has been a great generator for space skyboxes for many years, in fact it was written using Ogre, originally during my tenure. I looked around and I genuinely couldn’t find anything better than it, even today – if you know better, let me know on Twitter.
Here’s how to get a HDR space scene out of that and into UE4:
- First, setup your space scene in the Spacescape editor, or load an example
- In the menu, click Settings > Enable HDR
- File > Export Skybox
- Choose “Single DDS Cubemap” as the type
- Set Export For: “SOURCE” - NOT “Unreal”!
- Give it a name and click Save
That gives us a HDR cubemap in DDS format, but as previously noted we can’t load that into Unreal. So now we convert it with Image Viewer using the sequence described in Converting DDS Cubemaps to Equirectangular above.
Rendering Skyboxes with Blender 2.9
It would be nice to be able to render out general skyboxes from Blender, to give you complete freedom of what you want to put in it. And guess what? That equirectangular diagram image near the start of this blog post was rendered using Blender 🙂 Here’s how you do it…
Firstly, the equirectangular projection required is not available in Eevee, so you need to change your render engine to Cycles:
Next, select your camera, then change its Type to “Panoramic”, then when the Panorama Type appears, select “Equirectangular”:
Finally, in Output Properties:
- In Dimensions, make sure “Resolution X” is 2x the size of “Resolution Y” (e.g. 2048x1024, 4096x2048)
- In Output, set File Format to Radiance HDR if you want HDR output
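If you render skyboxes often, the clicks above can be scripted. This is my untested sketch of the bpy equivalent for Blender 2.9 — the property paths are from memory, so double-check them in Blender’s Python console before relying on it:

```python
import bpy

scene = bpy.context.scene

# Equirectangular rendering is Cycles-only, so switch engine first
scene.render.engine = 'CYCLES'

# Panoramic camera with the Equirectangular panorama type
cam = scene.camera.data
cam.type = 'PANO'
cam.cycles.panorama_type = 'EQUIRECTANGULAR'

# 2:1 output, saved as Radiance HDR to keep the dynamic range
scene.render.resolution_x = 4096
scene.render.resolution_y = 2048
scene.render.image_settings.file_format = 'HDR'
```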
Now you can press F12 to render, and you’ll get an equirectangular map that UE4 will happily use as a cube map.
Blender and UE4 don’t quite agree on one thing: Blender uses a right-handed Z-up system, but UE4 uses a left-handed Z-up system. This means that according to Blender, the face in the middle of the map will be +Y, and not -Y. This is fine 😉 It doesn’t matter because no actual coordinates are being exchanged, the image of the world is entirely the same for both, so long as you don’t worry about the names.
Well, this ended up being longer than I expected. I hope it’s useful to someone out there, and if not then it’ll be useful to at least me in 6 months time 😎