
MeshPhongMaterial: added support for vertex displacement map #7107

Closed
wants to merge 1 commit

Conversation

WestLangley
Collaborator

No description provided.

@WestLangley
Collaborator Author

@mrdoob Oh, man, we were working on this at the same time... :/

In either case, shadows need to be addressed separately.

@WestLangley
Collaborator Author

The displacement map, like a light map, or ao map, should use the 2nd set of UVs -- independent of the offset/repeat settings of the primary UV set. This is how I have implemented it.

The displacement map also should have both a scale and bias setting, so displacements can vary over any range, and allow for indentations. Again, that was how I implemented it.
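For concreteness, the scale/bias mapping can be sketched on the CPU (a hypothetical helper, not three.js API): a texel value in [0, 1] becomes a signed displacement along the vertex normal.

```javascript
// Hypothetical CPU-side sketch of scale/bias displacement (not three.js API).
// A texel value t in [0, 1] displaces the vertex along its normal by
// t * scale + bias.
function displaceVertex(position, normal, texel, scale, bias) {
  const d = texel * scale + bias;
  return position.map((p, i) => p + normal[i] * d);
}

// With scale = 2 and bias = -1, displacements span [-1, +1], so mid-gray
// texels leave the surface untouched and darker texels indent it.
displaceVertex([0, 0, 0], [0, 0, 1], 0.25, 2.0, -1.0); // z becomes -0.5
```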

As far as the normals go, I do not think we have to worry about them. Users can specify a normal map, too, that is created to be consistent with the displacement map, scale, and bias.

@WestLangley
Collaborator Author

Punk Ninja : - )

[screenshot: punk ninja]

@mrdoob
Owner

mrdoob commented Sep 5, 2015

Yep! Seems like we did the same. I did a bit of cleanup around it, though 😇

Regarding displacementBias... How about displacementOffset instead?

Normals would be nice, but would need tangents for that I think... Low quality bump/normal maps look like this otherwise (terrain example using displacementMap and bumpMap):

[screenshot: terrain example using displacementMap and bumpMap]

😅

If we wanted to automatically compute tangents, the geometry would need to be indexed, though.

I wonder if the derivatives bump/normal code can be improved so it is less pixelated/blocky...

@bhouston
Contributor

bhouston commented Sep 5, 2015

@mrdoob wrote:

I wonder if the derivatives bump/normal code can be improved so it is less pixelated/blocky...

My theory is that you cannot without more per-vertex information like tangents/bitangents.

@WestLangley
Collaborator Author

Normals would be nice

One option is to override the attribute normal with the face normal, computed via screen-space derivatives in the fragment shader. The problem with that, though, is the faces will appear faceted.

Regarding displacementBias... How about displacementOffset instead?

The terms 'scale' together with 'bias' are quite common. I believe those are the terms @bhouston uses.

@mrdoob
Owner

mrdoob commented Sep 5, 2015

Ok. Merging the displacementBias and your demo.
I prefer my implementation though, as I'm supporting skinning+morph+displacement 😊

@mrdoob
Owner

mrdoob commented Sep 5, 2015

Sorry for the duplicated efforts! 😅

@mrdoob mrdoob closed this Sep 5, 2015
@WestLangley
Collaborator Author

@mrdoob wrote:

I prefer my implementation though, as I'm supporting skinning+morph+displacement

I chose to assume skinning/morph would not be implemented concurrently with a vertex displacement map.

In any event, if you want to go your route, it is not clear to me if displacement should be the first or last transform in object space. (In the case of morphs without morph-normals, I would think first, since the normals would not be changing with the positions, and we don't want to displace in the wrong direction. In the case of skinning, I would think last, since skinning is based on raw positions -- not displaced ones.) If it were me, I would not try to combine these.

Your code:

THREE.ShaderChunk[ "begin_vertex" ],
THREE.ShaderChunk[ "displacementmap_vertex" ],
THREE.ShaderChunk[ "morphtarget_vertex" ],
THREE.ShaderChunk[ "skinning_vertex" ],
THREE.ShaderChunk[ "project_vertex" ],

Also, as I said previously,

The displacement map, like a light map, or ao map, should use the 2nd set of UVs -- independent of the offset/repeat settings of the primary UV set. This is how I have implemented it.

I recommend that you merge the UV logic in this PR, too.

Note that I was careful to set the #ifdefs so there would be no shader warnings about unused UVs, regardless of which textures were included in the model.

@mrdoob
Owner

mrdoob commented Sep 7, 2015

I recommend that you merge the UV logic in this PR, too.

Do you know if we're uploading uv as uv2 when uv2 is not available?

@mrdoob
Owner

mrdoob commented Sep 7, 2015

I mean, if we don't, I think we should so we don't have to do this:

geometry.faceVertexUvs[ 1 ] = geometry.faceVertexUvs[ 0 ]; // 2nd set of UVs required for aoMap and displacementMap with MeshPhongMaterial

@WestLangley
Collaborator Author

Do you know if we're uploading uv as uv2 when uv2 is not available?

I just checked, and I do not see it. I think your suggestion is a good idea. But I would suggest a console warning, too.

@bhouston
Contributor

bhouston commented Sep 8, 2015

@mrdoob wrote:

Do you know if we're uploading uv as uv2 when uv2 is not available?

Might be easier to just assign uv2 from uv at the beginning of the shader like this to save compute and memory:

vec2 uv2 = uv;

@mrdoob
Owner

mrdoob commented Sep 8, 2015

By the time we're composing the program we don't know anything about the geometry though.

@WestLangley
Collaborator Author

@mrdoob There are two unresolved issues here. The first is the issue of combining displacement, morph, and skinning as you have done. I expressed my view in #7107 (comment), but I expect you want to leave it the way you have it, and I will not fight that battle. : - )

But I feel strongly about the UV issue I mentioned in the same comment. Are you willing to make that change?

@mrdoob
Owner

mrdoob commented Sep 14, 2015

There are two unresolved issues here. The first is the issue of combining displacement, morph, and skinning as you have done. I expressed my view in #7107 (comment), but I expect you want to leave it the way you have it, and I will not fight that battle. : - )

Yes. I would prefer if someone bumps into an issue with the current setup rather than trying to guess what the right setup is.

But I feel strongly about the UV issue I mentioned in the same comment. Are you willing to make that change?

I think I want to fix the uv, uv2 setup first. I think lightMap and aoMap don't work properly in the editor because we're not using uv when uv2 is not available.

@bhouston
Contributor

Can one map the same data buffer to multiple attributes? Like mapping the uv buffer to both of the attributes uv and uv2 when there is no uv2 buffer?

@WestLangley
Collaborator Author

It is currently a requirement that the user provide a second set of uvs for lightMap and aoMap, because even though each texture has its own offset/repeat, only one pair of offset/repeat values are sent to the GPU. The offset/repeat values are applied to the first set of uvs (only).

offset/repeat does not really make sense with lightMap and aoMap, so they use the second uv set.

we're not using uv when uv2 are not available.

Based on the above, I am not sure you should...

@bhouston What is the use case for two uv sets? Can we get away with supporting just one uv set and instead have an offset/repeat per texture or channel?

@mrdoob
Owner

mrdoob commented Sep 15, 2015

Can one map the same data buffer to multiple attributes? Like mapping the uv buffer to both of the attributes uv and uv2 when there is no uv2 buffer?

Wouldn't that be a bit wasteful? I don't think we can refer to an already-uploaded buffer at the moment?

@bhouston
Contributor

@mrdoob:

Wouldn't that be a bit wasteful? I don't think we can refer to an already-uploaded buffer at the moment?

I believe you can in OpenGL, but I've never done it -- some web searches suggested it was possible. I think it is less wasteful than uploading a second buffer? It would incur less memory transfer to the shader. Honestly, the best solution is to allow specifying which UV set should be used with which texture, instead of a fixed uv or uv2 channel.

It is currently a requirement that the user provide a second set of uvs for lightMap and aoMap, because even though each texture has its own offset/repeat, only one pair of offset/repeat values are sent to the GPU. The offset/repeat values are applied to the first set of uvs (only).

Our version of Three.JS supports an offset/repeat per texture map, rather than per UV set. I believe that is the right thing to do for the Clara.io use case. It is useful when, for example, you want the gloss map to have a different repeat than the underlying diffuse map -- the staggered wrapping makes the texture look unique for longer, even though both are repeating, just at different points.

@bhouston What is the use case for two uv sets? Can we get away with supporting just one uv set and instead have an offset/repeat per texture or channel?

I guess the main one is the one you know -- the second UV map is used for unwrapped textures, usually used for aoMap, lightMap, and other baked results that require a unique texel on each surface.

BTW it looks like Exocortex may embark on writing a layered BRDF with graph-based inputs that would allow for flexible arbitrary UV usage (any number), procedural textures, and all the rest. We are exploring a design for this over the next few weeks and will implement it if possible. Basically, we are hitting the limit of what we can achieve with the current glsl snippet + #define based shader composition strategy. So maybe one solution would be to go for the good-enough near-term solutions using the current approach and leave the crazy flexibility for later, when one has a flexible shader graph compiler.

@WestLangley
Collaborator Author

I guess the main one is the one you know -- the second UV map is used for unwrapped textures, usually used for aoMap, lightMap, and other baked results that require a unique texel on each surface.

And displacementMap needs to be added to that list.

@bhouston
Contributor

And displacementMap needs to be added to that list.

I think the displacement map often follows the same UV mapping as the main diffuse texture, sort of like bump maps/normal maps -- really, displacement is just a continuation in that vein -- it is surface micro-structure represented as texture. If you have brick, the displacement/normal/bump repeats with the brick texture. If you have a face, the displacement/normal/bump is usually the same as the diffuse map. In my opinion, it generally isn't unwrapped.

PS. I am not an artist, so my experience is limited to seeing what artists do. So maybe sometimes displacement is unwrapped, but I haven't yet seen that be the case.

@WestLangley
Collaborator Author

Well, in http://threejs.org/examples/webgl_materials_displacementmap.html, the vertex displacements are closely coupled with the geometry. There is no way repeat-wrapping makes sense in this case.

Sure, repeat-wrapping is appropriate for normal/bump maps. If you say there are use cases for repeat-wrapping with vertex displacement, then we need to find a solution that is more flexible than the one we have now.

@bhouston
Contributor

@WestLangley:

Well, in http://threejs.org/examples/webgl_materials_displacementmap.html, the vertex displacements are closely coupled with the geometry. There is no way repeat-wrapping makes sense in this case.

The mesh in this example is misleading. Character heads often do not use any repeats/offsets in their textures. Look at the skin example in ThreeJS as well. With character heads, all textures are unwrapped -- diffuse, bump, normal, displacement, etc. But they generally all share the first UV channel even though they are unwrapped. In this case, e.g. character heads, displacement follows the bump/normal map usage pattern and thus can share uv rather than use the separate uv2 channel.

@WestLangley
Collaborator Author

@bhouston Thank you for your reply.

So in the ninja example, the aoMap uses UV2 (because that is required in three.js) but the displacementMap in three.js should use UV? That seems odd to me.

And what if we want a marble ninja with repeating diffuse map? We can't do that in three.js because the displacementMap is sharing the first UV set.

@bhouston
Contributor

@WestLangley

UV usually is not unwrapped, but sometimes is. It usually drives diffuse/specular/bump/normal/displacement.

UV2 is always unwrapped and drives aoMap/lightMap and other textures that are generally always unwrapped.

Not everything is possible with the above setup, that is true. You are right that you will have to have bump/normal/displacement follow the diffuse texture -- which I think is generally the desired case, but not always.

I do think that any example you can make where displacement not following the diffuse map is undesirable can equally be made for bump/normal, as they are all just varying degrees of sub-face surface detail. So the limitation you mention is already imposed on bump and normal, and if it isn't a huge problem already it isn't likely to become one -- but it is a real limitation.

BTW, did you know that there exists a higher-quality displacement called "vector displacement"? Details: http://docs.pixologic.com/user-guide/3d-modeling/exporting-your-model/vector-displacement-maps/ Thus the sequence from low quality to high quality is "bump" -> "normal" -> "displacement" -> "vector displacement". This progression is a bit clearer when one supports geometry tessellation, because just moving the original low-resolution vertices can make "displacement" seem less useful than normal maps.

This is not a general solution. The high-end solution is to allow an arbitrary number of UV channels which can be arbitrarily used by any texture. That is how production renderers work: V-Ray/Arnold/RenderMan/etc. But implementing this is challenging given the current #define + glsl snippet structure we use.

@bhouston
Contributor

BTW, a simple way to hack in support for specifying which map uses which UV: one could create defines like AOMAP_UV which resolve to "vUv2/uv2" or something like that. Thus one uses a define per texture to specify which UV set it reads from.
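A minimal sketch of that hack, with hypothetical names (the define keys and material flags below are illustrative, not existing three.js API):

```javascript
// Hypothetical sketch of the define-per-texture idea: the material emits a
// #define choosing which varying each map samples, and the GLSL snippet
// samples through the define instead of a hard-coded vUv/vUv2.
function buildUvDefines(material) {
  return [
    '#define AOMAP_UV ' + (material.aoMapUsesUv2 ? 'vUv2' : 'vUv'),
    '#define DISPLACEMENTMAP_UV ' + (material.displacementMapUsesUv2 ? 'vUv2' : 'vUv')
  ].join('\n');
}

const header = buildUvDefines({ aoMapUsesUv2: true, displacementMapUsesUv2: false });
// The fragment snippet can then read: texture2D( aoMap, AOMAP_UV )
```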

@WestLangley
Collaborator Author

@bhouston OK, I will have to defer to you to suggest a new approach if you feel that we should improve how offset/repeat is handled in three.js.

Currently, offset/repeat is set per texture, but not honored per texture.

@bhouston
Contributor

Currently, offset/repeat is set per texture, but not honored per texture.

We modified our branch of Three.JS to honor it on a per texture basis via a function like this:

vec2 applyUVOffsetRepeat( vec2 uv, vec4 offsetRepeat ) {
   return uv * offsetRepeat.zw + offsetRepeat.xy;
}

...
vec2 vOpacityUv = applyUVOffsetRepeat( vUv, opacityOffsetRepeat );
...
vec2 vTranslucencyUv = applyUVOffsetRepeat( vUv, translucencyOffsetRepeat );
...
vec2 vBumpUv = applyUVOffsetRepeat( vUv, bumpOffsetRepeat );
...
vec2 vNormalUv = applyUVOffsetRepeat( vUv, normalOffsetRepeat );
...

@WestLangley
Collaborator Author

OK. If/when we implement uniform packing of some sort, perhaps we can add this feature, too.

@mrdoob
Owner

mrdoob commented Sep 16, 2015

The high-end solution is to allow an arbitrary number of UV channels which can be arbitrarily used by any texture.

Agreed!

BTW, a simple way to hack in support for specifying which map uses which UV: one could create defines like AOMAP_UV which resolve to "vUv2/uv2" or something like that.

This sounds good to me.

We modified our branch of Three.JS to honor it on a per texture basis via a function like this:

vec2 applyUVOffsetRepeat( vec2 uv, vec4 offsetRepeat ) {
   return uv * offsetRepeat.zw + offsetRepeat.xy;
}

Do you hit uniform limits with that setup? Honoring offset/repeat per texture would be great. We may also want to implement the Matrix3 approach while at it.
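The Matrix3 approach generalizes the offsetRepeat vec4: offset and repeat are just the translation and scale parts of an affine UV transform, which would also admit rotation. A rough sketch (column-major storage assumed, as in three.js Matrix3.elements):

```javascript
// Sketch: per-texture offset/repeat expressed as a 3x3 affine UV transform.
// m is a column-major 3x3 matrix, as in three.js Matrix3.elements.
function transformUv(uv, m) {
  const [u, v] = uv;
  return [m[0] * u + m[3] * v + m[6], m[1] * u + m[4] * v + m[7]];
}

// repeat (2, 2) with offset (0.5, 0.25), written as a matrix:
const m = [2, 0, 0, 0, 2, 0, 0.5, 0.25, 1];
transformUv([0.5, 0.5], m); // [1.5, 1.25]
```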

@WestLangley WestLangley deleted the dev-displacement branch October 13, 2015 01:26
@threejsworker

The examples of this pull request are now built and visible in threejsworker. To view them, go to the following link:

http://threejsworker.com/viewpullrequest.html#7107

@andrevenancio
Contributor

I've been using this new feature, and I think displacementScale and displacementBias aren't enough. As per my Stack Overflow question, I think displacementScale should be a range and not a float.

As it is implemented now, we can only have displacements in the range [0, displacementScale]. It would be useful in several ways to have that range be [displacementScaleMin, displacementScaleMax].
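For what it's worth, a [min, max] range is expressible with the existing parameters, since a texel t in [0, 1] displaces by t * displacementScale + displacementBias. A small sketch (the range-style argument names are hypothetical):

```javascript
// Convert a desired [min, max] displacement range into the existing
// scale/bias parameters: texel 0 maps to min, texel 1 maps to max.
function rangeToScaleBias(min, max) {
  return { displacementScale: max - min, displacementBias: min };
}

const params = rangeToScaleBias(-0.2, 0.8);
// params.displacementBias is -0.2 (the min); params.displacementScale is 1.0
```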

@mrdoob
Owner

mrdoob commented Nov 6, 2015

You mean combining bias and scale into a min/max range?

@bhouston
Contributor

bhouston commented Nov 6, 2015

If we move towards @sunag's Node-based material graphs (which are already incredibly powerful), displacement would be an input and you could modify it in any way you want with any equation. It would simplify things tremendously.
