Thursday, July 26, 2012

Qt Interface for Modifying Parameters in Real-Time

TL;DR: Made some cool enhancements to the open-source RTPS library; check out the video.

Improved rendering and Qt interface for interactivity from Andrew Young on Vimeo.

I bring some exciting news about my latest developments in the Real-Time Particle System (RTPS) library. I created an interface which allows users to modify the system and graphics parameters in real time. I have also improved the screen-space rendering implementation. All the rendering code and shaders are now OpenGL 3.3 compliant.

I have decided to finish graduate school with an M.S. rather than pursuing a PhD. In order to finish my M.S. thesis, I wanted to add an interface to the RTPS library which allows users to modify its parameters in real time. I chose the Qt library to accomplish the task. Qt works across many platforms, which makes it an excellent choice, and it has a large community and extensive documentation. Also, it has a license which allows for inclusion in open-source projects.

Beginning development in Qt wasn't too difficult. Qt ships several example applications which use OpenGL widgets, and I used these examples as a starting point. However, my venture into Qt was not without its troubles. One problem I soon ran into was attempting to use GLEW with Qt. I found several posts about the problem, and most users suggested using QGLWidget instead of QOpenGL. That was indeed part of the problem. The other part was an issue with Qt 4.8: apparently, after initializing the QGLWidget but before calling glewInit(), you must call makeCurrent(). My understanding of the problem is limited, but I believe it stems from Qt 4.8's introduction of multi-threaded OpenGL. Without calling makeCurrent(), the GL context never becomes active.

Many of the parameters in the RTPS library are floating point numbers, so I wanted a slider which would return floating point values. To my surprise, Qt has no native support for a float slider. All the solutions I found online suggested simply multiplying the integer result by some scaling factor. The approach made sense, but it is rather inconvenient because I can't attach multiple slider signals to a single slot unless the scaling factor is the same for each slider. I decided the best approach was to inherit from the QSlider class to create my own FloatSlider class. Overriding the Qt class was far easier than expected.

//--------- floatslider.h ----------

class FloatSlider : public QSlider
{
    Q_OBJECT
public:
    FloatSlider(Qt::Orientation orientation, QWidget* parent = 0);
    void setScale(float scale);
public slots:
    void setValue(float value);
signals:
    void valueChanged(float value);
protected:
    virtual void sliderChange(SliderChange change);
    float scale;
};

//---------- floatslider.cpp ------------
#include "floatslider.h"

FloatSlider::FloatSlider(Qt::Orientation orientation, QWidget* parent)
    : QSlider(orientation, parent)
{
    scale = 1.0f;
}

void FloatSlider::setValue(float value)
{
    //Convert the float back to the underlying integer position.
    QSlider::setValue((int)(value / scale));
}

void FloatSlider::setScale(float scale)
{
    this->scale = scale;
}

void FloatSlider::sliderChange(SliderChange change)
{
    QSlider::sliderChange(change);
    emit valueChanged(value() * scale);
    //For completeness; one could also emit both integer and float
    //signals if desired.
}

The class above emits a float signal on every sliderChange: the slider's integer value is multiplied by a custom scaling factor before being emitted. For example, the following code produces values between 0.01 and 1.00 in increments of 0.01.

xSPHSlider = new FloatSlider(orientation, this);
xSPHSlider->setRange(1, 100);
xSPHSlider->setScale(0.01f);


Sunday, March 18, 2012

Improved rendering support for rigid-body fluid interaction

Hello everyone,
I have added a lot of new features to the fluid/rigid-body simulator. Here is a bulleted list:

  • Added mesh importing support via the Assimp library.
  • Fixed lots of bugs in mesh voxelization. Now you should be able to voxelize any arbitrary closed mesh.
  • Added parameter files so you can change many system parameters without the need to recompile.
  • Improved rendering of rigid-bodies.
    • I implemented some basic instanced rendering. I plan to reintroduce the flocking simulations implemented by Myrna Merced; then I can take advantage of instanced mesh rendering to render birds, fish, etc.
  • I have added configurable point source gravity for interesting effects.
I have completed many of the desired features for the stand-alone library. I now plan to dedicate much of my time to integrating the library into the Bullet physics engine. Hopefully I can get most of the work integrated within a month. Then I will work on upgrading Blender's Bullet interface to include all these new features. In parallel with those two tasks, I would also like to improve the fluid surface extraction.

Monday, January 16, 2012

Improved interaction between rigid bodies and fluid

I fixed a lot of bugs in my simulation framework. Here is a video showing my progress.


  • I wasn't properly scaling the positions of the rigid-body simulation to match the fluid simulation.
  • Some of my quaternion calculations were incorrect when updating the rotation of the rigid body.
  • The coefficients for the interaction forces weren't calculated correctly. I found a detailed formulation for the damping coefficient in [1].

  • I refactored the rendering code so you can switch rendering techniques on the fly.
  • The test program now has run-time configurable mass and dimensions for the rigid bodies being added.
  • Velocity visualization can be turned on/off during run-time.
My plans now are to push my changes into the Blender version that Ian already created. Once I have done that, I will create some examples and work on writing my Master's thesis. After that, I will return to integrating the library into Bullet. Hopefully, I will get a chance to work on some visualization as well.

[1] Cleary, P. W., & Prakash, M. (2004). Discrete-element modelling and smoothed particle hydrodynamics: potential in the environmental sciences. Philosophical Transactions of the Royal Society A: Mathematical, Physical and Engineering Sciences, 362(1822), 2003-2030. The Royal Society.

Saturday, January 14, 2012

Geometry Shaders for visualizing velocity.

A couple of months ago, I posted a blog entry with my updated rigid-body/fluid interaction and mentioned I would discuss how I draw the velocity vectors efficiently. In the simulation, all the information for each particle lives on the GPU. Therefore, to render lines the standard way, I would need to transfer two arrays from the GPU to the host: the positions and the velocity vectors. Then I would have to loop through them to create a third array holding position plus velocity, and finally draw a line for each pair of vertices.

Here is the video.

Geometry shaders (GS) give us the ability to take in one primitive type (points, lines, triangles, ...) and output a different primitive type. That makes GSs perfect for our simulation: we already have the position and the velocity vector on the GPU, so we just need to calculate the second vertex from that information and then draw a line between the two points. Therefore, we can define the two vertices of the line for each vector as follows:

v0 = position
v1 = position + scale * velocity

I then define the color for the vector as:

color = abs(normalize(velocity))

I chose this color representation primarily for its simplicity. Essentially, it encodes direction into the color of each vector: red represents the x direction, green the y direction, and blue the z direction. The absolute value is necessary because negative colors don't exist; OpenGL simply clamps any negative color component to zero, which would lose that part of the direction.

Now onto the code.

C code: 

        //Use the vector shader program so the "scale" uniform binds to it.
        glUseProgram(m_shaderLibrary.shaders["vectorShader"].getProgram());

        //Tell OpenGL the vector VBO is the color array.
        glBindBuffer(GL_ARRAY_BUFFER, vecVBO);
        glColorPointer(4, GL_FLOAT, 0, 0);
        glEnableClientState(GL_COLOR_ARRAY);

        glBindBuffer(GL_ARRAY_BUFFER, posVBO);
        glVertexPointer(4, GL_FLOAT, 0, 0);
        glEnableClientState(GL_VERTEX_ARRAY);

        glUniform1f(glGetUniformLocation(m_shaderLibrary.shaders["vectorShader"].getProgram(), "scale"), scale);
        glDrawArrays(GL_POINTS, 0, num);

        glDisableClientState(GL_VERTEX_ARRAY);
        glDisableClientState(GL_COLOR_ARRAY);
        glUseProgram(0);


The variable m_shaderLibrary.shaders["vectorShader"].getProgram() is specific to my library. If you want to use this in your own code, pass in your own program (a GLuint), i.e. the shader program compiled from the vector shaders provided below.
Vertex Shader: 

#version 120
//Outputs consumed by the geometry shader. The vector to visualize is
//passed in through the color attribute.
varying out vec4 vector;
varying out vec4 color;

void main() 
{
    gl_Position = gl_ModelViewProjectionMatrix * gl_Vertex;
    //We must project the vector as well.
    vector = gl_ModelViewProjectionMatrix * gl_Color;
    //Normalize the vector and take the absolute value.
    color = vec4(abs(normalize(gl_Color.rgb)), 1.0);
}
Geometry Shader: 

#version 150
//#geometry shader

layout(points) in;
layout(line_strip, max_vertices=2) out;
in vec4 vector[1];
in vec4 color[1];
uniform float scale;

void main() 
{
    //First we emit the position vertex with the color calculated
    //in the vertex shader.
    vec4 p = gl_in[0].gl_Position;
    gl_Position = p; 
    gl_FrontColor = color[0];
    EmitVertex();

    //Second we emit the vertex which is the xyz position plus a
    //scaled version of the vector we want to visualize. Also, the
    //color is the same as calculated in the vertex shader.
    gl_Position = vec4(p.xyz + (scale * vector[0].xyz), p.w);
    gl_FrontColor = color[0];
    EmitVertex();
    EndPrimitive();
}

Fragment Shader: 

//Simple pass-through fragment shader.
void main(void) {
    gl_FragColor = gl_Color;
}