Sunday, March 28, 2010

Rendering & Capturing Pipeline

I wrote the rendering and capturing pipeline today. Motion blur is done with the OpenGL accumulation buffer. Each frame is captured with glReadPixels() and saved as a BMP file; once a sequence is done, I encode the video with Avidemux.
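Saving the captured frames comes down to prepending a BMP header to the raw pixel data. A minimal sketch of such a writer (not the actual code from the pipeline; `write_bmp24` is an illustrative name, and the little-endian field writes assume an x86-style host):

```c
#include <stdio.h>
#include <stdint.h>
#include <string.h>

/* Illustrative helper: writes a bottom-up 24-bit BMP from BGR pixel data.
   Conveniently, glReadPixels(..., GL_BGR, GL_UNSIGNED_BYTE, ...) returns
   rows bottom-up, which matches the BMP convention, so no flip is needed.
   Header fields are memcpy'd as-is, which assumes a little-endian host. */
static int write_bmp24(const char *path, int w, int h, const uint8_t *bgr)
{
    int row_bytes = (3 * w + 3) & ~3;          /* rows padded to 4 bytes */
    uint32_t image_size = (uint32_t)row_bytes * (uint32_t)h;
    uint32_t file_size  = 54 + image_size;     /* 14-byte file + 40-byte info header */
    uint8_t hdr[54] = {0};

    hdr[0] = 'B'; hdr[1] = 'M';
    memcpy(hdr + 2, &file_size, 4);
    uint32_t data_offset = 54, info_size = 40;
    memcpy(hdr + 10, &data_offset, 4);
    memcpy(hdr + 14, &info_size, 4);
    memcpy(hdr + 18, &w, 4);
    memcpy(hdr + 22, &h, 4);
    uint16_t planes = 1, bpp = 24;
    memcpy(hdr + 26, &planes, 2);
    memcpy(hdr + 28, &bpp, 2);
    memcpy(hdr + 34, &image_size, 4);

    FILE *f = fopen(path, "wb");
    if (!f) return -1;
    fwrite(hdr, 1, 54, f);
    uint8_t pad[3] = {0};
    for (int y = 0; y < h; y++) {
        fwrite(bgr + (size_t)y * 3 * w, 1, (size_t)3 * w, f);
        fwrite(pad, 1, (size_t)(row_bytes - 3 * w), f);
    }
    fclose(f);
    return 0;
}
```

Uncompressed BMP keeps the per-frame cost at a single sequential write, which is why disk throughput ends up dominating the render time.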

Unfortunately, the window has to be entirely unobstructed for capturing to work. Once another window sits in the foreground, glReadPixels() returns garbage for that part of the picture, since those pixels fail the pixel ownership test and their contents are undefined.

Here is a test render:

Rendering took about 4 minutes, the bulk of which was spent writing to disk. The resolution is 1920x800 at 24 fps, with 64x motion blur sampling (= 1536 subframes per second).
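The accumulation-buffer pass amounts to box-filtering 64 subframes into each output frame: glAccum(GL_ACCUM, 1.0f / 64) per subframe, then glAccum(GL_RETURN, 1.0f) to read the average back. A CPU sketch of the same averaging (names and fixed-point rounding are illustrative, not the pipeline's code):

```c
#include <stdint.h>
#include <stddef.h>

/* Each output frame is the mean of SAMPLES subframes rendered at
   frame_rate * SAMPLES (24 fps * 64 = 1536 subframes per second).
   This mirrors glAccum(GL_ACCUM, 1.0f / SAMPLES) followed by
   glAccum(GL_RETURN, 1.0f), done here per 8-bit channel. */
enum { SAMPLES = 64 };

static void accumulate_frame(const uint8_t *subframes[SAMPLES],
                             size_t n_values, uint8_t *out)
{
    for (size_t i = 0; i < n_values; i++) {
        uint32_t sum = 0;
        for (int s = 0; s < SAMPLES; s++)
            sum += subframes[s][i];
        out[i] = (uint8_t)(sum / SAMPLES);  /* box-filtered temporal average */
    }
}
```

A moving edge thus smears across the pixels it crossed during the 1/24 s frame interval, weighted by how long it covered each one.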

Sync to music also seems fine (same test, with a different material):
