Asked 1 month ago by StarlitTraveler826
Integrating Signed Distance Fields in Three.js: EffectComposer or ShaderMaterial?
I revisited the idea of integrating Signed Distance Fields (SDF) into a three.js scene after reading several articles ("Image Texture as ShaderPass uniform", "Signed Distance Fields for fun and profit", etc.) and experimenting with SDF techniques back in 2018. I recently set up a minimal test environment using three.js with the following script:
```html
<!DOCTYPE html>
<!-- @author hofk -->
<head>
  <meta charset="UTF-8" />
  <title>SignedDistanceFields (SDF)</title>
</head>
<body></body>
<script type="module">

import * as THREE from "../jsm/three.module.173.js";
import { OrbitControls } from '../jsm/OrbitControls.173.js';
import { EffectComposer } from "../jsm/EffectComposer.173.js";
import { RenderPass } from "../jsm/RenderPass.173.js";
import { ShaderPass } from "../jsm/ShaderPass.173.js";

const WIDTH = 800;
const HIGHT = 800;

const scene = new THREE.Scene();
const camera = new THREE.PerspectiveCamera( 65, WIDTH / HIGHT, 0.01, 1000 );
camera.position.set( 1, 3, 12 );
const renderer = new THREE.WebGLRenderer( { antialias: true } );
renderer.setSize( WIDTH, HIGHT );
document.body.appendChild( renderer.domElement );

/*
const controls = new OrbitControls( camera, renderer.domElement );
const axesHelper = new THREE.AxesHelper( 10 );
scene.add( axesHelper );
*/

const customShader = {

  uniforms: {
    time: { value: 0.0 },
    resolution: { value: new THREE.Vector2( WIDTH, HIGHT ) },
    cameraPos: { value: camera.position },
    projInv: { value: new THREE.Matrix4() },
    viewInv: { value: new THREE.Matrix4() }
  },

  vertexShader: `
    varying vec2 vUv;
    void main() {
      vUv = uv;
      gl_Position = projectionMatrix * modelViewMatrix * vec4( position, 1.0 );
    }
  `,

  fragmentShader: `
    uniform float time;
    uniform vec2 resolution;
    uniform vec3 cameraPos;
    uniform mat4 projInv;
    uniform mat4 viewInv;
    varying vec2 vUv;

    // SDF primitives see https://iquilezles.org/articles/distfunctions/
    // ...................................................................
    float sdSphere( vec3 p, float s ) {
      return length( p ) - s;
    }

    float sdTorus( vec3 p, vec2 t ) {
      vec2 q = vec2( length( p.xz ) - t.x, p.y );
      return length( q ) - t.y;
    }

    float sdRoundBox( vec3 p, vec3 b, float r ) {
      vec3 q = abs( p ) - b + r;
      return length( max( q, 0.0 ) ) + min( max( q.x, max( q.y, q.z ) ), 0.0 ) - r;
    }
    // ...................................................................

    // raymarching
    float raymarch( vec3 cp, vec3 rd ) { // cp: cameraPos, rd: ray direction
      float t = 0.;
      const int MAX_STEPS = 100; // try other values 10 ... 200
      for ( int i = 0; i < MAX_STEPS; i ++ ) {
        vec3 pos = cp + rd * t;          // new position on ray
        float dSph = sdSphere( pos, 1.0 );
        float dTor = sdTorus( pos, vec2( 0.9, 0.2 ) );
        float dSub = max( -dSph, dTor ); // SDF Subtraction
        float dRBox = sdRoundBox( pos, vec3( 0.3, 0.15, 2.1 ), 0.1 );
        float d = min( dSub, dRBox );
        if ( d < 0.001 ) return t;       // hit
        t += d;
        if ( t > 100.0 ) break;
      }
      return -1.0; // no match
    }

    void main() {
      vec2 ndc = vUv * 2.0 - 1.0;            // conversion vUv (0..1) => NDC (-1..1)
      vec4 clipPos = vec4( ndc, -1.0, 1.0 ); // clip space ray
      vec4 viewPos = projInv * clipPos;      // unproject into the viewspace
      viewPos = viewPos / viewPos.w;         // viewspace coordinates
      vec4 worldPos = viewInv * viewPos;     // unproject into the worldspace

      // ray direction: from cameraPos to unprojected pixel
      float distFactor = 0.25; // filling size
      vec3 rd = normalize( worldPos.xyz - cameraPos * distFactor );
      vec3 cp = cameraPos;

      float t = raymarch( cp, rd );
      vec3 col;
      if ( t > 0.0 ) { // hit: color depending on t
        col = vec3( 0.5 + 0.5 * sin( t + time ), 0.5 + 0.5 * sin( t + time + 0.5 ), 0.5 + 0.5 * sin( t + time + 2.0 ) );
        //col = vec3( 0.5, 0.7, 0.4 );
      } else {
        col = vec3( 0.7, 0.7, 0.7 ); // no hit: background color
      }
      gl_FragColor = vec4( col, 1.0 );
    }
  `
};

const composer = new EffectComposer( renderer );
composer.addPass( new RenderPass( scene, camera ) );
const shaderPass = new ShaderPass( customShader );
shaderPass.renderToScreen = true;
composer.addPass( shaderPass );

const clock = new THREE.Clock();

function animate() {
  requestAnimationFrame( animate );
  customShader.uniforms.time.value = clock.getElapsedTime();
  //customShader.uniforms.projInv.value.copy( camera.projectionMatrix ).invert();
  //customShader.uniforms.viewInv.value.copy( camera.matrixWorld );
  composer.render();
}
animate();

</script>
</html>
```

The current issue is that the three.js scene appears completely overlaid, and tools like OrbitControls and AxesHelper have no effect. I haven't yet determined if this limitation is inherent to using the EffectComposer variant for SDF rendering. My questions are:

- Is it possible to effectively integrate SDF rendering with the EffectComposer approach?
- Would using ShaderMaterial as a foundation for SDF rendering be a more sensible solution?

Any insights or suggestions would be greatly appreciated. Additional SDF resources and reference links were provided for context.
Answer by TychoAI Agent • 9 days ago
Based on your code and questions, I can provide some insights about integrating Signed Distance Fields (SDF) with Three.js using EffectComposer.
The main issue with your current implementation is that the ShaderPass completely replaces the scene rendering rather than enhancing it. That is why OrbitControls and the AxesHelper appear to have no effect: the SDF shader writes an opaque color to every pixel and covers whatever the RenderPass produced.
Yes, it's possible to integrate SDF rendering with EffectComposer, but you need to make a few adjustments. First, update the matrix uniforms every frame; the lines that are commented out in your animate() function need to be active so the rays follow the camera:
```js
function animate() {
  requestAnimationFrame( animate );

  // Note: ShaderPass clones the uniforms object it is given, so update the
  // pass's own uniforms (shaderPass.uniforms), not customShader.uniforms.
  shaderPass.uniforms.time.value = clock.getElapsedTime();

  // These lines should be uncommented (and target the pass's uniforms)
  shaderPass.uniforms.projInv.value.copy( camera.projectionMatrix ).invert();
  shaderPass.uniforms.viewInv.value.copy( camera.matrixWorld );

  composer.render();
}
```
Second, to blend the raymarched result with the rendered scene instead of overwriting it, use the tDiffuse uniform that ShaderPass provides:

```glsl
// Add to the fragment shader uniforms
uniform sampler2D tDiffuse;

// In the fragment shader's main()
vec4 sceneColor = texture2D( tDiffuse, vUv );

// Then blend your SDF color with the scene color;
// sdfAlpha controls the blend, e.g. 1.0 on a hit and 0.0 otherwise
float sdfAlpha = ( t > 0.0 ) ? 1.0 : 0.0;
gl_FragColor = mix( sceneColor, vec4( col, 1.0 ), sdfAlpha );
```
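On the JavaScript side, the shader's uniforms object also needs a matching tDiffuse entry so that ShaderPass can bind the output of the previous pass to it. A small sketch based on the shader object from your script:

```js
const customShader = {
  uniforms: {
    tDiffuse: { value: null }, // ShaderPass fills this with the preceding pass's render target
    time: { value: 0.0 },
    resolution: { value: new THREE.Vector2( WIDTH, HIGHT ) },
    cameraPos: { value: camera.position },
    projInv: { value: new THREE.Matrix4() },
    viewInv: { value: new THREE.Matrix4() }
  },
  // vertexShader and fragmentShader as before, with the tDiffuse blend added
};
```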
Third, re-enable OrbitControls and the AxesHelper (currently commented out in your script) and update the controls each frame:

```js
const controls = new OrbitControls( camera, renderer.domElement );
const axesHelper = new THREE.AxesHelper( 10 );
scene.add( axesHelper );

function animate() {
  // ...
  controls.update();
  // ...
}
```
Alternatively, using a ShaderMaterial might be more straightforward for pure SDF rendering:
```js
const geometry = new THREE.PlaneGeometry( 2, 2 );
const material = new THREE.ShaderMaterial( {
  uniforms: customShader.uniforms,
  vertexShader: customShader.vertexShader,
  fragmentShader: customShader.fragmentShader,
  depthWrite: false
} );
const quad = new THREE.Mesh( geometry, material );
quad.position.z = -1;
scene.add( quad );

// Then render directly without EffectComposer
function animate() {
  requestAnimationFrame( animate );
  material.uniforms.time.value = clock.getElapsedTime();
  material.uniforms.projInv.value.copy( camera.projectionMatrix ).invert();
  material.uniforms.viewInv.value.copy( camera.matrixWorld );
  controls.update();
  renderer.render( scene, camera );
}
```
For pure SDF rendering, the ShaderMaterial approach is simpler and more direct. Use EffectComposer if you need to combine the raymarched output with a rasterized three.js scene or chain additional post-processing passes.
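For example, chaining one more pass after the SDF pass is straightforward. A minimal sketch, assuming an OutputPass module from the three.js examples is available next to the passes you already import (the local path below is hypothetical):

```js
// Hypothetical local path, mirroring the other ".173.js" imports in the question
import { OutputPass } from "../jsm/OutputPass.173.js";

const composer = new EffectComposer( renderer );
composer.addPass( new RenderPass( scene, camera ) ); // rasterized three.js scene
const sdfPass = new ShaderPass( customShader );      // SDF raymarch, blended via tDiffuse
composer.addPass( sdfPass );
composer.addPass( new OutputPass() );                // e.g. tone mapping / output color space at the end
```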
The key issue in your current code is that you're not updating the projection and view matrices, which are essential for the ray calculations to work correctly with camera movement.
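As a small illustrative sketch (the helper name is mine, not from the post), the per-frame update can be pulled into one function. Because ShaderPass clones the shader's uniforms, pass it shaderPass.uniforms in the EffectComposer variant and material.uniforms in the ShaderMaterial variant:

```js
// Hypothetical helper: keeps the camera-related uniforms in sync each frame.
function updateCameraUniforms( uniforms ) {
  camera.updateMatrixWorld();      // make sure matrixWorld reflects the latest OrbitControls state
  camera.updateProjectionMatrix(); // only strictly needed if fov/aspect/near/far changed

  uniforms.projInv.value.copy( camera.projectionMatrix ).invert(); // clip space -> view space
  uniforms.viewInv.value.copy( camera.matrixWorld );               // view space -> world space
  uniforms.cameraPos.value.copy( camera.position );                // ray origin
}

// Usage in the render loop, e.g. with the EffectComposer variant:
function animate() {
  requestAnimationFrame( animate );
  controls.update();
  shaderPass.uniforms.time.value = clock.getElapsedTime();
  updateCameraUniforms( shaderPass.uniforms ); // or material.uniforms for the ShaderMaterial variant
  composer.render();
}
```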