RunDiffusion provides cloud-hosted access to Stable Diffusion models and related tools, making it a capable option for general AI image generation without requiring local hardware. For 3D artists specifically — those who need tileable PBR texture sets for Blender, Unreal Engine, Unity, or other renderers — RunDiffusion is not the right category of tool. Understanding the difference matters before spending time trying to extract PBR maps from a general image generator.
What RunDiffusion Does
RunDiffusion is a cloud platform for running Stable Diffusion and other generative models. It provides a web interface to various checkpoint models, ControlNet, LoRA support, and image-to-image tools — the standard Stable Diffusion feature set, accessible without a local GPU. The platform serves general creative use cases: concept art, character design, illustration, environment painting, and general image synthesis.
For 3D artists who want to generate reference images, concept art, or textures for specific UV-unwrapped models, RunDiffusion can be useful as part of a workflow. The generated images can be baked, projected, or manually converted into textures for model-specific use.
What RunDiffusion Does Not Do
RunDiffusion does not generate complete tileable PBR map sets. The output of a RunDiffusion generation is an image — a single diffuse-like output, not a coordinated set of basecolor, normal, roughness, metalness, and height maps. For surface textures that need to tile seamlessly and work under physically based rendering, this is a fundamental limitation.
The specific gaps for 3D surface material work:
No PBR map set. A tileable material for Blender's Principled BSDF, Unreal's Material Editor, or Unity's URP shader needs separate roughness, metalness, and normal map inputs. RunDiffusion generates one image at a time. Generating five separate PBR maps that are coherent with each other — where surface detail in the normal map matches the basecolor, roughness encodes the finish described, metalness is physically calibrated — requires a purpose-built model, not a general image generator.
No guaranteed tileability. General Stable Diffusion models do not guarantee seamless tiling — the edges of the generated image may or may not line up when the texture repeats. Getting consistently seamless output requires post-processing or specific ControlNet workflows. Purpose-built PBR generators bake tileability into the model training — seamless tiling is guaranteed, not hoped for.
Physically incorrect diffuse maps. A PBR basecolor should be physically calibrated: neutral grey for concrete, dark value for asphalt, light value for white plaster, with no baked lighting or shadows. General image generators produce images that look good visually but often contain baked shadows, specular highlights, or gamma-incorrect values that produce wrong results under real-time lighting in game engines.
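The calibration point can be made concrete with a quick sanity check. The sketch below uses plain Python lists as a stand-in for image data, and the thresholds are rough rules of thumb rather than formal standards: common PBR guidance puts non-metal basecolor luminance roughly between charcoal (~0.04) and fresh snow (~0.9) in linear space, and a strong luminance gradient across a flat material hints at baked directional lighting.

```python
# Sanity checks for a PBR basecolor map, represented as a 2D grid of
# linear-space (r, g, b) tuples in [0, 1]. Thresholds are rough rules
# of thumb, not formal standards.

def luminance(rgb):
    r, g, b = rgb
    return 0.2126 * r + 0.7152 * g + 0.0722 * b  # Rec. 709 weights

def check_basecolor(pixels, lo=0.04, hi=0.9, max_gradient=0.15):
    lums = [luminance(px) for row in pixels for px in row]
    mean = sum(lums) / len(lums)

    # Plausible albedo range: darker than charcoal or brighter than
    # fresh snow usually means the map is not physically calibrated.
    in_range = lo <= mean <= hi

    # Crude baked-lighting hint: a large mean-luminance difference
    # between the left and right halves of a flat material suggests
    # a directional light was baked into the image.
    half = len(pixels[0]) // 2
    left = [luminance(px) for row in pixels for px in row[:half]]
    right = [luminance(px) for row in pixels for px in row[half:]]
    gradient = abs(sum(left) / len(left) - sum(right) / len(right))

    return {"mean": mean, "in_range": in_range,
            "baked_lighting_suspected": gradient > max_gradient}

# A neutral-grey "concrete" basecolor passes; the same mid-grey average
# with a baked left-to-right shadow gradient is flagged.
flat = [[(0.5, 0.5, 0.5)] * 8 for _ in range(8)]
lit = [[(0.2, 0.2, 0.2)] * 4 + [(0.8, 0.8, 0.8)] * 4 for _ in range(8)]

print(check_basecolor(flat))
print(check_basecolor(lit))
```

A real version of this check would load the texture with an image library and work in linear color space, but even this toy version catches the most common failure: a visually pleasing generation with a shadow or highlight gradient baked in.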
Purpose-Built PBR Generators: What to Use Instead
For tileable PBR surface material sets, the right tool category is a purpose-built PBR generator — a model trained specifically on PBR material maps to produce coherent, tileable, physically calibrated map sets from a text description.
Grix is the purpose-built option in this category that runs entirely in the browser. Enter a text description of the surface — material type, finish, condition, specific details — and receive a five-map ZIP (basecolor, normal, roughness, metalness, height) in approximately 25 seconds. All maps tile seamlessly, the basecolor is physically calibrated, and the roughness map encodes the finish described rather than baked lighting. Free trial at grixai.com/try, no account required. Paid plans from $8/month.
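A downloaded map set is easy to validate before import. The sketch below builds a stand-in archive in memory; the file names are hypothetical — check the actual naming convention of whatever generator you use.

```python
# Validate that a material ZIP contains a complete PBR map set.
# The map file names here are hypothetical stand-ins.
import io
import zipfile

REQUIRED_MAPS = {"basecolor", "normal", "roughness", "metalness", "height"}

def missing_maps(zip_bytes):
    """Return the set of required map names absent from the archive."""
    with zipfile.ZipFile(io.BytesIO(zip_bytes)) as zf:
        stems = {name.rsplit(".", 1)[0].lower() for name in zf.namelist()}
    return REQUIRED_MAPS - stems

# Build a stand-in ZIP in memory for demonstration, with one map absent.
buf = io.BytesIO()
with zipfile.ZipFile(buf, "w") as zf:
    for name in ("basecolor", "normal", "roughness", "metalness"):
        zf.writestr(f"{name}.png", b"fake image data")

print(missing_maps(buf.getvalue()))  # the height map is missing
```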
GenPBR converts images to PBR maps using algorithmic conversion. It is free and handles the image-to-PBR conversion workflow well. It does not generate from text prompts — it requires an existing diffuse image as input.
ArmorLab is a standalone offline application that generates PBR maps from text or drag-and-drop images. Good for data privacy requirements or offline workflows. Free version available with resolution limitations.
When RunDiffusion Makes Sense in a 3D Workflow
RunDiffusion is useful for 3D artists in specific contexts where its general image generation capabilities are the right fit:
Reference and concept generation. Generating concept images for materials before creating or finding the actual PBR texture. See what "weathered brutalist concrete with orange iron oxide staining" could look like before committing to generating the PBR material.
Model-specific texturing with UV projection. Using img2img or ControlNet to project textures onto specific UV-unwrapped models. This is the model-texturing workflow — different from tileable surface materials. If you have a specific prop with unique UVs and want textures baked to that specific UV layout, RunDiffusion with appropriate ControlNet guidance can produce inputs for projection.
Generating diffuse sources for manual PBR conversion. Generate a diffuse image in RunDiffusion, then use GenPBR or Materialize to convert it to a PBR set. This workflow adds steps but may be appropriate when you need specific artistic direction that text-to-PBR generators do not provide.
Comparing the Two Approaches for Game Development
A game developer building an environment material library needs tileable PBR sets — not individual images. The RunDiffusion workflow for generating each material would involve: generate diffuse in RunDiffusion → make seamless → convert to PBR maps via GenPBR or similar → verify physical calibration → import to engine. Multiple steps, multiple tools, and the output quality for roughness and metalness maps depends on the accuracy of the algorithmic conversion from a single diffuse image.
The Grix workflow: describe material → receive five-map ZIP → import to engine. Two steps. The PBR maps are generated together, so they are coherent — the roughness map comes from a model trained on surface data, not from an algorithmic derivation of the diffuse image.
For volume work — building material libraries with many different surfaces — the purpose-built PBR generator workflow is significantly faster. The RunDiffusion approach makes more sense for one-off creative work where artistic direction takes priority over PBR accuracy.
Technical Quality Comparison
The technical difference between AI-generated PBR maps from a purpose-built model and PBR maps derived from a general diffusion image output comes down to training data and model architecture. Purpose-built PBR models like Grix's underlying model are trained on PBR material data — they have learned the relationship between surface appearance and the physical properties encoded in roughness, metalness, and normal maps. The outputs reflect that training.
Deriving PBR maps from a general diffusion output involves inference: looking at an image and guessing what the roughness and metalness values should be based on visual appearance alone. The conversion can be good for obvious cases (a shiny metallic surface will get high metalness) but loses nuance for complex cases (a scratched matte metal surface with varying roughness across scratch patterns and base metal).
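The kind of inference involved can be illustrated with the simplest possible conversion heuristic — treating brighter pixels as shinier. This is a toy sketch of the general approach, not any specific tool's algorithm, but it exposes the same underlying problem real converters face:

```python
# A toy diffuse-to-roughness heuristic: assume brighter pixels are
# shinier (lower roughness). Real converters are more sophisticated,
# but they face the same ambiguity -- one diffuse value can correspond
# to many physically different surfaces.

def naive_roughness(diffuse_lum):
    """Map linear luminance in [0, 1] to a guessed roughness value."""
    return round(1.0 - diffuse_lum, 3)

# Two very different surfaces with the same mid-grey diffuse value:
polished_dark_granite = 0.5   # actually smooth (low roughness)
matte_grey_plaster = 0.5      # actually rough (high roughness)

# The heuristic cannot tell them apart -- both get the same guess.
print(naive_roughness(polished_dark_granite))
print(naive_roughness(matte_grey_plaster))
```

A model trained on paired basecolor/roughness data can learn that granite speckle patterns and plaster grain imply different finishes; a conversion that only sees the diffuse values cannot.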
For production use where materials need to look physically correct under varied lighting conditions — exterior vs interior, day vs night, different HDRI setups — purpose-built PBR output is consistently more reliable.
Frequently Asked Questions
Can RunDiffusion generate seamless textures?
RunDiffusion can generate seamless textures using specific Stable Diffusion techniques — tiling-enabled inference or post-processing tools. The workflow requires more steps, and the results are less consistent than those of purpose-built PBR generators that guarantee tileability. For reliably seamless PBR output, a dedicated tool like Grix is a better choice.
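Whether a given output actually tiles can be verified by comparing opposite edges. A minimal sketch, using plain greyscale lists as a stand-in for image data (a real check would load the file with an image library and compare per channel):

```python
# Check whether a texture tiles seamlessly by measuring the mismatch
# across the horizontal and vertical seams. Pixels are greyscale
# floats in [0, 1].

def tiles_seamlessly(pixels, tolerance=0.02):
    """True if opposite edges match within tolerance."""
    h = len(pixels)
    w = len(pixels[0])
    # Left edge vs right edge (horizontal seam when the texture repeats).
    horiz = sum(abs(row[0] - row[-1]) for row in pixels) / h
    # Top edge vs bottom edge (vertical seam).
    vert = sum(abs(a - b) for a, b in zip(pixels[0], pixels[-1])) / w
    return max(horiz, vert) <= tolerance

# A constant texture tiles trivially; a left-to-right ramp does not.
flat_texture = [[0.5] * 4 for _ in range(4)]
ramp_texture = [[0.0, 0.3, 0.6, 0.9] for _ in range(4)]

print(tiles_seamlessly(flat_texture))  # True
print(tiles_seamlessly(ramp_texture))  # False: 0.9 jump at the seam
```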
Does RunDiffusion generate normal maps?
RunDiffusion with ControlNet can be configured to generate or estimate normal map outputs, with quality depending on the model and configuration. Purpose-built PBR generators are trained on PBR data and generate the normal map alongside the other channels, which yields better coherence between the normal map's surface detail and the basecolor appearance.
What is the cheapest AI PBR texture generator?
Grix is free to try with no account required. Paid plans start at $8/month for the Light tier. GenPBR is free for image-to-PBR conversion. These are significantly cheaper than platforms like TexturesFast which start at $39/month.
Can I use Stable Diffusion for PBR textures?
Standard Stable Diffusion models are not purpose-built for PBR map sets. You can extract single-channel outputs or use img2img for specific workflows, but the output requires significant post-processing to produce physically correct tileable PBR sets. Purpose-built PBR models like Grix's underlying Patina model produce better results with less effort for tileable surface materials.
What is the best free alternative to RunDiffusion for 3D texture work?
For tileable PBR surface materials: Grix (free trial, no account), GenPBR (free, image-to-PBR), and Poly Haven (CC0 photoscanned materials, no account). For general image generation that RunDiffusion covers, Stable Diffusion can be run locally free with sufficient GPU hardware.