Trying to get comfortable with the shader nodes in Blender, I sometimes create random materials. I mostly post them on my Mastodon account, but I'll test the waters here as well.

The basic idea for the shader is to get two weaves going at 90 degrees to each other. I used a Voronoi texture, scaled down in one direction so each cell is long rather than round, with a low randomness to make each weave. Then I piped it into the normal and displacement maps to get the texture out, and used a colour ramp to add the albedo and some light subsurface scattering. Tinkering with the BSDF was just me playing with sliders; I didn't have any real direction with those.

If this were going to be a photoreal material, this node setup would be an excellent base to start from, but it would take many more layers and details to get there.

What do you think? Do you have any suggestions for materials for me to try?

  • a_world_of_madness@beehaw.org · 1 year ago

    Looks great, but doesn’t the Normal Map node expect the input to be a tangent space vec3? Right now it’s just a single float value.

    • Butterbee (She/Her)@beehaw.org (OP) · 1 year ago

      Isn’t colour data a vector of R, G, and B? But good catch, there’s definitely something wrong there. I’m still getting used to how Blender works with its shader nodes, and I was just using the “normal map”, which is really just a heightmap, to add some extra depth or maybe fakey AO (not from the model geometry, I know there’s an AO node for that, but from the heightmap itself).

      The Bump node will take that heightmap and actually produce the results I was looking for.
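A rough sketch of what that slope-based approach amounts to, in plain NumPy rather than Blender's actual node code: take the finite-difference gradient of the height field and tilt the normal against it. The `height_to_normals` helper and its `strength` parameter are hypothetical names for illustration.

```python
import numpy as np

def height_to_normals(height, strength=1.0):
    """Bump-style sketch: finite-difference slope of a heightmap -> normals."""
    # Gradient of the height field along rows (y) and columns (x)
    dy, dx = np.gradient(height.astype(np.float64))
    # A surface z = h(x, y) has normal proportional to (-dh/dx, -dh/dy, 1)
    nx, ny = -strength * dx, -strength * dy
    nz = np.ones_like(height, dtype=np.float64)
    length = np.sqrt(nx**2 + ny**2 + nz**2)
    return np.stack([nx / length, ny / length, nz / length], axis=-1)

# A flat height field gives normals pointing straight up
flat = np.zeros((4, 4))
print(height_to_normals(flat)[0, 0])  # [0. 0. 1.]
```

A constant slope tilts every normal the same way, which is the behaviour you'd expect from feeding a linear ramp into a bump-style node.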

      • a_world_of_madness@beehaw.org · 1 year ago

        > Isn’t colour data a vector of R, G, and B?

        Yeah, and since it’s basically a grayscale image with identical RGB values, the normal map ends up representing some weird surface with all the normals on a single line between (0,0,0) and (1,1,1). I guess the Bump node finds the slope of the grayscale instead.
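A quick sketch of why that happens, using the common 2c − 1 mapping from colour to tangent-space vector (the `decode_normal` helper is an illustrative stand-in, not Blender's code): with identical R, G, and B, every pixel decodes to a vector on the same diagonal line, regardless of the height value.

```python
import numpy as np

def decode_normal(rgb):
    """Common tangent-space decode: map [0,1] colour to [-1,1], then normalize."""
    v = 2.0 * np.asarray(rgb, dtype=np.float64) - 1.0
    return v / np.linalg.norm(v)

# Grayscale pixels: R = G = B, so every decoded normal is collinear
for g in (0.25, 0.4, 0.9):
    print(decode_normal([g, g, g]))  # all three components equal
```

Each of those prints a vector along ±(1,1,1)/√3, so the "normals" carry no directional detail at all, which matches the weird shading you'd see plugging grayscale straight into a Normal Map node.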

        I think in Eevee the Displacement node does the same thing behind the scenes, so having both displacement and Bump + Normal Map deriving from the same height data will probably double the effect.

        • Butterbee (She/Her)@beehaw.org (OP) · 1 year ago

          Yes. This was in Cycles with adaptive subdivision on the model, though, so the displacement was actually displacing the geometry, and the normal map was for adding a little more pop without displacing the geometry more than I wanted.

          • a_world_of_madness@beehaw.org · 1 year ago

            Nice. I work mainly in Eevee for game assets, so I have to add the geometry manually. I wish they’d add more real-time rendering features from game engines (like parallax mapping); it would make previewing much easier. The new realtime compositor is almost good enough for prototyping post-process shaders in Blender, but it still doesn’t support render passes, unfortunately.