Height maps and displacement maps are the unsung workhorses of PBR pipelines. They control how surfaces catch light, cast micro-shadows, and interact with geometry — turning a flat plane of painted polygons into something that reads as genuinely three-dimensional. Getting them right historically required scanning physical surfaces, hand-painting in Photoshop, or procedural generation in tools like Substance Designer. In 2026, AI height map generators can produce tileable, production-ready maps from a text description in under 15 seconds.
This guide covers how AI height map generation works, when you need a separate height map versus relying on the normal map, and how to integrate AI-generated height and displacement maps into Blender, Unity, and Unreal Engine workflows.
What Is a Height Map and Why Does It Matter?
A height map is a grayscale image where brightness values represent surface elevation. White pixels indicate raised geometry; black pixels indicate recessed geometry; mid-gray is neutral. The GPU uses this information at render time to push vertices outward or inward (tessellation displacement) or to fake that depth in screen space (parallax occlusion mapping).
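The mapping from gray values to displacement can be sketched in a few lines of NumPy. This is an illustrative convention, not any particular engine's implementation — the scale and midlevel constants here are arbitrary example values:

```python
import numpy as np

# Toy 3x3 height map: 0 = fully recessed, 255 = fully raised
height_map = np.array([
    [0, 128, 255],
    [128, 128, 128],
    [255, 128, 0],
], dtype=np.uint8)

SCALE = 0.1      # world-space displacement range, an illustrative value
MIDLEVEL = 0.5   # gray value treated as "no displacement"

# Normalize to [0, 1], recenter around the midlevel, and scale:
# mid-gray -> 0, white -> +SCALE/2, black -> -SCALE/2
displacement = (height_map / 255.0 - MIDLEVEL) * SCALE

print(displacement[0, 2])  # white pixel -> 0.05
print(displacement[0, 0])  # black pixel -> -0.05
```

Engines differ on whether mid-gray or black is the zero point (Blender's Displacement node exposes this as Midlevel), which is why a displacement scale usually needs per-material tuning.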
Height maps are distinct from normal maps, though they're often confused. A normal map stores surface orientation — it fakes lighting on a flat surface but doesn't actually push geometry. A height map stores actual elevation data. For close-up architectural materials, cobblestone, rough stone walls, or any surface where parallax depth matters, a height map is the component that makes the material read as physically convincing rather than painted-on.
In practice, real-time engines use height maps in one of two ways: as displacement (with tessellation enabled, actual geometry is subdivided and displaced), or as a parallax input for parallax occlusion mapping shaders. Both techniques require an accurate grayscale height signal — which is exactly what AI generators now produce alongside the other PBR maps.
How AI Generates Height Maps
Traditional height map generation required either capturing geometry with photogrammetry, deriving it mathematically from a surface description (Substance Designer nodes), or painting it by hand. AI height map generators work differently: they train on large datasets of physically scanned PBR material sets, learning the statistical relationship between surface appearance and the corresponding height data.
When you type "weathered limestone wall" or "brushed aluminum panel," the model synthesizes all five PBR channels simultaneously — including height — because they're physically correlated. Rough stone tends to have pronounced height variation. Smooth metals have near-flat height maps. Woven fabric has a regular warp-and-weft pattern in the height channel. The AI has learned these correlations and produces coherent, physically accurate outputs without any explicit surface measurement.
The result is a height map that tiles seamlessly, matches the other PBR channels in frequency and detail level, and is immediately usable in a standard material workflow — no cleanup required.
Using Grix to Generate Height Maps
Grix generates a complete set of PBR maps from a text prompt, including the height/displacement channel. The workflow takes about 12 seconds:
- Go to grixai.com/try — no account required
- Describe your surface: "mossy granite cobblestones, close-up" or "weathered concrete, heavy aggregate"
- Download the full map pack including basecolor, normal, roughness, metallic, and height
The height map ships in the same pack as the other channels. All maps tile seamlessly and are sized at the same resolution, so they drop directly into a multi-channel material slot setup without any resampling or alignment work.
For surfaces where height detail is the main visual concern — stone floors, rough walls, embossed metal panels — the AI-generated height map captures the macro elevation variation that makes parallax effects convincing. For smooth surfaces, it produces a near-flat height map that won't introduce unwanted depth artifacts.
Height Maps in Blender: Displacement and Parallax Workflows
Blender supports two height map workflows. The simpler approach uses the height map as a bump map input, which is computed in the shader without modifying geometry. The more accurate approach uses true displacement with the Subdivision Surface modifier and Cycles' displacement mode.
For the displacement workflow in Blender:
- Add your textures as Image Texture nodes in the Shader Editor (drag-and-drop works)
- In the Material Properties, set Displacement to "Displacement Only" or "Displacement and Bump"
- Add a Displacement node: plug the height map into its Height input, then connect its output to the Material Output's Displacement socket
- Add a Subdivision Surface modifier; for Adaptive Subdivision, set the Cycles Feature Set to Experimental in Render Properties, then enable it on the modifier
- Adjust the Displacement node's Scale value — AI-generated height maps typically work well at 0.05–0.15 for architectural surfaces
For EEVEE, which doesn't support true tessellation displacement, use the Bump node instead: connect the height map to the Bump node's Height input and its Normal output to the BSDF's Normal input. The result won't push geometry but produces convincing micro-detail shading for most materials at typical viewing distances.
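What a bump pass does internally can be approximated outside Blender: differentiate the height field and tilt the surface normal by the gradient. The NumPy sketch below is a conceptual illustration, not Blender's actual shader-space implementation:

```python
import numpy as np

def normals_from_height(height, strength=1.0):
    """Approximate what a bump/normal-from-height pass does:
    take x/y gradients of the height field and build unit normals.
    Illustrative only -- Blender's Bump node works per-shading-point."""
    dy, dx = np.gradient(height.astype(np.float64))  # central differences
    # Normal = normalize(-dh/dx, -dh/dy, 1): steeper slope -> more tilt
    nx, ny = -dx * strength, -dy * strength
    nz = np.ones_like(height, dtype=np.float64)
    length = np.sqrt(nx**2 + ny**2 + nz**2)
    return np.stack([nx / length, ny / length, nz / length], axis=-1)

# A flat height map should yield straight-up normals (0, 0, 1)
flat = np.full((4, 4), 0.5)
normals = normals_from_height(flat)
print(np.allclose(normals[2, 2], [0, 0, 1]))  # -> True
```

This is also why height and normal maps must stay spatially aligned: the normal map is, in effect, the derivative of the height map.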
Height Maps in Unreal Engine 5
Unreal Engine 5's Nanite system handles tessellation displacement differently depending on whether Nanite is active on your mesh. For non-Nanite static meshes, the standard workflow applies:
- Import all PBR maps as textures — on the height map, disable sRGB (it stores linear grayscale data, not color)
- In the Material Editor, connect the height map to a Parallax Occlusion Mapping node for real-time depth
- The POM node takes: HeightMap texture, HeightMapChannel (defaults to Red), HeightRatio (0.05–0.1 for stone, 0.01–0.03 for wood), and the MipLevel input from your texture coordinate setup
- For Nanite tessellation (UE 5.3+): connect the height map through a Multiply node (set to your desired displacement scale) into the material's Displacement output
Unreal's Parallax Occlusion Mapping is particularly effective with AI-generated height maps because the maps tile cleanly and carry good high-frequency detail in the height channel, which is what drives POM silhouette accuracy at grazing angles.
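The core idea behind parallax mapping can be shown with a simplified, single-step UV offset. Real POM ray-marches through multiple height samples; this NumPy sketch only captures why grazing view angles produce larger shifts (the function and its parameters are illustrative, not UE's API):

```python
import numpy as np

def parallax_offset(uv, height_sample, view_dir, height_ratio=0.05):
    """Single-step parallax approximation: shift the UV along the
    view direction in proportion to the sampled height. Full POM
    ray-marches several samples; this is the one-step core idea."""
    v = np.asarray(view_dir, dtype=np.float64)
    v = v / np.linalg.norm(v)
    # Project the view vector into the tangent plane; dividing by v.z
    # makes the shift blow up at grazing angles, as expected
    offset = (v[:2] / v[2]) * height_sample * height_ratio
    return np.asarray(uv, dtype=np.float64) - offset

# Grazing view shifts UVs more than a head-on view
head_on = parallax_offset([0.5, 0.5], 1.0, [0.0, 0.0, 1.0])
grazing = parallax_offset([0.5, 0.5], 1.0, [0.7, 0.0, 0.3])
print(head_on)  # -> [0.5 0.5] (no shift when looking straight down)
```

The HeightRatio parameter in the UE node plays the role of `height_ratio` here: it scales how far the UVs slide per unit of height.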
Height Maps in Unity (HDRP and URP)
Unity's shader graph exposes height map inputs in both HDRP and URP Lit shaders. In the Lit shader inspector:
- Import the height map texture and set its Texture Type to "Default" (not Normal Map) and Compression to BC4 (grayscale compression)
- In the Material inspector, find the Height Map slot under Surface Inputs
- Assign the height texture and set Height Map Parametrization: "Min/Max" (which exposes the recess and peak extents directly) works for most AI-generated maps, while "Amplitude" exposes a single strength value
- Adjust the height scale — 0.02 to 0.05 is typical for stone and concrete; 0.005 to 0.02 for smoother surfaces
Unity's parallax implementation works well at normal camera distances. For close-up architectural visualization or macro game assets where accurate surface depth matters, HDRP's tessellation pipeline produces cleaner results at a higher GPU cost.
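As a rough illustration of what a Min/Max-style parametrization means, the remap from a [0, 1] height sample to a displacement range is a simple lerp (the function and its values are illustrative, not Unity's shader code):

```python
def height_minmax(gray, min_d=-1.0, max_d=1.0):
    """Min/Max-style parametrization: remap a [0, 1] height sample
    into a [min, max] displacement range. Illustrative -- the engine
    does this per-pixel in the shader."""
    return min_d + gray * (max_d - min_d)

print(height_minmax(0.0))  # -> -1.0 (deepest recess)
print(height_minmax(0.5))  # -> 0.0
print(height_minmax(1.0))  # -> 1.0 (highest peak)
```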
When You Need Height vs. Normal
A practical decision framework:
Use the height map when: The surface has significant macro-relief (cobblestones, rough stone, embossed metal, brick mortar joints), the camera will pass within 2–3 meters of the surface in the final render, and you're targeting a tessellation-capable pipeline (Cycles, HDRP with tessellation, UE5 Nanite).
Rely on the normal map when: The surface is smooth to medium roughness, performance budget is constrained (mobile, VR at 90fps), camera distances are architectural/product viz (4m+), or you're in a rasterization pipeline without tessellation support.
Most PBR workflows use both simultaneously: the normal map handles medium and high-frequency surface detail (tool marks, pitting, fiber weave), while the height map handles macro elevation variation (rock cleft depth, cobblestone height difference). AI generators produce both channels in a single export, physically correlated, so using them together gives the full benefit of the PBR model's surface description.
Height Map Quality: What to Look For
Not all AI-generated height maps are equal. Key quality signals:
Seam-free tiling: The map should tile without visible seams at all four edges. Grix uses PATINA's tileable generation to ensure seamless output.
Physical correlation with the normal map: The elevation features in the height map should spatially align with the bumps and recesses in the normal map. If they're mismatched, the surface reads as contradictory — a flat height map with a highly varied normal produces a "painted bump" look rather than actual depth.
Appropriate frequency range: Stone and concrete should have broad, low-frequency elevation variation. Woven fabric should have regular, medium-frequency patterns. A height map that's entirely high-frequency noise tends to make surfaces look like TV static at moderate distances.
Grix's AI model produces height maps that score well on all three criteria because it synthesizes all PBR channels jointly — the height signal is physically consistent with the normal, roughness, and basecolor rather than generated separately.
Pricing and Usage
Grix is free to try at grixai.com/try with no login required. Free tier includes several generations per day. The Light plan at $8/month and Pro at $18/month provide higher generation limits and access to 2K resolution outputs — suitable for production archviz and game development pipelines. Full pricing is at grixai.com/pricing.
Frequently Asked Questions
Does Grix generate height maps separately or as part of a full PBR set?
Grix generates the height map alongside all other PBR channels in a single generation — basecolor, normal, roughness, metallic, and height ship together as a single export. All maps are physically correlated because they're generated by the same model from the same surface description.
What resolution are AI-generated height maps?
Grix generates at 1K on the free tier and up to 2K on paid plans. For game assets at typical viewing distances, 1K height maps are usually sufficient. Archviz closeup renders and 3D printing applications benefit from 2K output.
Can I use an AI height map for 3D printing?
Indirectly — you can use the height map to drive geometry displacement in Blender or ZBrush (via a Displace modifier), then export the resulting mesh for printing. The AI height map alone isn't printable, but as a geometry driver it produces good results for embossed surface textures.
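Conceptually, that height-to-geometry step is just lifting the vertices of a grid plane by the sampled height — which is what a Displace modifier does before you export the mesh. A minimal NumPy sketch (cell size and strength are arbitrary example values):

```python
import numpy as np

def displace_grid(height_map, cell_size=1.0, strength=2.0, midlevel=0.5):
    """Build an (N*M, 3) vertex array for a grid plane whose z comes
    from the height map -- conceptually what a Displace modifier does
    before the mesh is exported (e.g. as an STL) for printing."""
    h = height_map.astype(np.float64) / 255.0
    rows, cols = h.shape
    ys, xs = np.mgrid[0:rows, 0:cols]          # one vertex per pixel
    z = (h - midlevel) * strength              # lift/sink by height value
    verts = np.stack([xs * cell_size, ys * cell_size, z], axis=-1)
    return verts.reshape(-1, 3)

hm = np.array([[0, 255], [128, 128]], dtype=np.uint8)
verts = displace_grid(hm)
print(verts.shape)  # -> (4, 3)
print(verts[1])     # white pixel at (x=1, y=0): z = +1.0
```

A real exporter would also connect these vertices into faces and add side walls and a base so the result is a watertight solid.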
Is there a difference between a height map and a displacement map in Grix's output?
The exported file is a grayscale height map, which is the standard format for driving both displacement and parallax effects. How it's used is determined by your renderer — you can use it as true tessellation displacement or as parallax input without any modification to the exported file.
Does Grix produce AO maps?
Not currently — Grix focuses on the five primary PBR channels. AO can be baked from the height map in Blender (Bake > Ambient Occlusion with a low-poly plane) or generated procedurally in-engine if needed.
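If you want a quick AO preview before committing to a bake, a height map supports a cheap approximation: pixels sitting below their local neighborhood are likely occluded. The NumPy sketch below is a heuristic for previews only, not a replacement for a raytraced bake:

```python
import numpy as np

def ao_from_height(height, strength=4.0):
    """Cheap AO approximation from a height map: darken pixels that sit
    below their local neighborhood mean. Real AO baking traces rays over
    geometry; this is just a fast preview heuristic with an arbitrary
    strength factor."""
    h = height.astype(np.float64) / 255.0
    # 3x3 box blur via edge-padded shifts = local neighborhood mean
    p = np.pad(h, 1, mode="edge")
    local_mean = sum(
        p[i:i + h.shape[0], j:j + h.shape[1]]
        for i in range(3) for j in range(3)
    ) / 9.0
    # Pixels below the local mean get darkened; others stay white
    return np.clip(1.0 - strength * np.maximum(local_mean - h, 0.0), 0.0, 1.0)

pit = np.full((5, 5), 200, dtype=np.uint8)
pit[2, 2] = 0  # a deep recess surrounded by raised surface
ao = ao_from_height(pit)
print(ao[2, 2] < ao[0, 0])  # -> True (the recess is darker)
```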