
A guide to better-looking Processing sketches

A tour of the often overlooked built-in functionality of Processing 2.0.

by @billautomata

billautomata


Open GL Sunburst

wezside

the aperture problem

adamferriss

(via ROME | Tech)

marlonlemes

Cube Rotation in Jitter.

pressureappreciation

Google MapsGL - The shadows cast by the buildings are relative to the current position of the sun.

(via Chaotic)

littlebigdetails

Open GL Particles

wezside

adamferriss

Reconstructing Position from Depth buffer

Why

Storing a position in a buffer takes a lot of space! We need 32 bits per channel (RGBA32F), which is 128 bits per pixel, and that is too much. Low-end and even mid-range GPUs don't handle these formats very well, and it gets horribly slow. That's why we will use the depth buffer instead (which is generated anyway).

Steps

Position is calculated in view space, because we actually don't need to go back to world space.

- Sample the depth buffer
- Unproject the depth buffer value to find position.z
- Unproject the NDC position to find position.xy

Implementation

All code listed is GLSL 150 core (GL 3.2) and C++.

Encoding

We do not need to encode anything because the depth buffer is generated automatically!

Decoding

Here is where the fun starts.

Step 1: Sampling the depth buffer

This should be easy, but for completeness:

C++

```cpp
// Generate a texture to store depth in
glGenTextures(1, &m_DepthTextureID);
glBindTexture(GL_TEXTURE_2D, m_DepthTextureID);
glTexParameterf(GL_TEXTURE_2D, GL_TEXTURE_MAG_FILTER, GL_NEAREST);
glTexParameterf(GL_TEXTURE_2D, GL_TEXTURE_MIN_FILTER, GL_NEAREST);
glTexParameterf(GL_TEXTURE_2D, GL_TEXTURE_WRAP_S, GL_CLAMP_TO_EDGE);
glTexParameterf(GL_TEXTURE_2D, GL_TEXTURE_WRAP_T, GL_CLAMP_TO_EDGE);
glTexImage2D(GL_TEXTURE_2D, 0, GL_DEPTH_COMPONENT32F, width, height, 0,
             GL_DEPTH_COMPONENT, GL_FLOAT, 0);

// Attach to your framebuffer
glBindFramebuffer(GL_FRAMEBUFFER, m_FboID);
glFramebufferTexture2D(GL_FRAMEBUFFER, GL_DEPTH_ATTACHMENT, GL_TEXTURE_2D,
                       m_DepthTextureID, 0);
```

Then pass m_DepthTextureID as a sampler2D to the shader.

GLSL 150 core:

```glsl
uniform sampler2D depthMap;
in vec2 texCoord;

void main()
{
    float depth = texture(depthMap, texCoord).x;
}
```

Step 2: Unprojecting

I will give the code first and then explain what I do

GLSL:

```glsl
vec3 reconstructPosition(in float p_depth, in vec2 p_ndc, in vec4 p_projParams)
{
    float depth     = p_depth * 2.0 - 1.0;
    float viewDepth = p_projParams.w / (depth - p_projParams.z);
    return vec3((p_ndc * viewDepth) / p_projParams.xy, viewDepth);
}
```

C++:

```cpp
p_projParams = vec4(m_pCamera->getProjectionMatrix()(0, 0),
                    m_pCamera->getProjectionMatrix()(1, 1),
                    m_pCamera->getProjectionMatrix()(2, 2),
                    m_pCamera->getProjectionMatrix()(2, 3));
```

p_projParams holds the projection parameters taken out of the projection matrix.

Every vertex in the vertex shader gets multiplied by a World-View-Projection matrix:

1. Transform the vertex into world space.

2. Transform the vertex into view space.

3. Transform the vertex into screen space.

When we get to the fragment shader, the position has been divided by the .w component. Now we have NDC (normalized device coordinates).

The trick: we want to go back to view space, so we do the inverse of the perspective divide and of step 3!

Side note: by default OpenGL stores depth from 0 to 1, so we need to take this into account as well.

We will do this inverting in one go.

* Simplify the projection matrix to its parameters A, B, C, D

* Multiply the matrix with the view-space position (x, y, z, 1)

* Divide by .w

* NDC position = (A*x / z, B*y / z, C + D/z, 1), with (x, y, z) in view space

* Now we calculate z in view space:

* The depth stored in the depth buffer == NDC.z * 0.5 + 0.5

* view z = D / (NDC.z - C), with NDC.z being the sampled depth * 2.0 - 1.0

* We can now calculate view x and y:

* view x = (NDC.x * view z) / A

* view y = (NDC.y * view z) / B

Conclusion

The position is reconstructed with this function in the shader:

```glsl
vec3 reconstructPosition(in float p_depth, in vec2 p_ndc, in vec4 p_projParams)
{
    float depth     = p_depth * 2.0 - 1.0;
    float viewDepth = p_projParams.w / (depth - p_projParams.z);
    return vec3((p_ndc * viewDepth) / p_projParams.xy, viewDepth);
}
```

p_depth = the sampled depth buffer value

p_ndc = the NDC position of the current pixel (texCoord.xy * 2.0 - 1.0)

p_projParams = vec4(A, B, C, D), filled in from C++:

```cpp
p_projParams = vec4(m_pCamera->getProjectionMatrix()(0, 0),
                    m_pCamera->getProjectionMatrix()(1, 1),
                    m_pCamera->getProjectionMatrix()(2, 2),
                    m_pCamera->getProjectionMatrix()(2, 3));
```

That's all! Happy coding.

bassser

zebra-A-(0.00.00.00)_7 by peder.norrby on Flickr.

oxane

I'm addicted to Flappy Bird, and I was bored today… since I was not able to break my record (**63! Suck it, losers!**) I decided to code my own version of it!

It's not over yet, but the basics are there. We have a flying cube and obstacles popping up randomly. I need to finish the collisions before putting real graphics on it (I'll probably draw some textures).

Here is the link to my GitHub if you want to explore or learn… *I commented the code!* Click here: **GITHUB**

It's C++, OpenGL 2 and SDL 2. Very simple.

moumou38


Sliced typography

Inspired by a work of Synoptic Office: http://www.synopticoffice.com/project.php?projectid=1&selectedcol=1

bluegreenjackiechan

memecenterz

Wish I could show you guys this program in the browser but webgl is giving me trouble.

adamferriss

willjardine

oxane

evolving orbitals

billautomata

iOS 3D Engines

- NinevehGL: The website looks pretty nice, but I'm not sure whether it will go public in the future. Good support for 3D model imports.
- iSGL3D: Looks pretty geeky and reliable. Seems very easy to use. Not much support for importing 3D models.

wewearglasses

*Here, take this. I extracted the textures from 3Dmaze.scr*

floopydisc

a glBlendFunc error

adamferriss