I started filling in some classes today, and I am finding that the process is not as obvious as I thought. The main problem is that I am building up the Python support incrementally, with no real “vision”. Having spent the past few days reading how Blender’s Python API is coded, I am trying to follow the same conventions, but I am not sure that is the way to go (especially for error handling). The lack of a clear API means I do not have the data structures to work with, and it is hard to decide which parts to keep.
The process is no doubt tedious and repetitive. I feel as though I am maintaining a badly toned-down version of SWIG, condensing by hand the wrapper code it used to produce. Things get even more complicated with class inheritance: I would have to copy the code from the ancestor classes, with slight modifications. I need to figure out with Jean-Luc how best to go about this migration; I hope we will have time to settle on an API and data structures tomorrow.
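On the inheritance point, one alternative to copying ancestor code by hand is the Python/C API's own slot inheritance: set `tp_base` on the subtype and let `PyType_Ready` fill in every slot the child does not override. A minimal sketch, with illustrative type names that are not the actual Freestyle classes:

```c
/* Sketch: slot inheritance in a hand-written extension type.
 * "Interface0D" / "CurvePoint" are placeholder names for a
 * wrapped base class and its subclass. */
#include <Python.h>

typedef struct {
    PyObject_HEAD
    /* a pointer to the wrapped C++ object would go here */
} BPy_Interface0D;

static PyTypeObject Interface0D_Type = {
    PyVarObject_HEAD_INIT(NULL, 0)
    .tp_name = "Interface0D",
    .tp_basicsize = sizeof(BPy_Interface0D),
    /* Py_TPFLAGS_BASETYPE allows this type to be subclassed */
    .tp_flags = Py_TPFLAGS_DEFAULT | Py_TPFLAGS_BASETYPE,
    /* tp_methods, tp_getset, etc. for the base class go here */
};

static PyTypeObject CurvePoint_Type = {
    PyVarObject_HEAD_INIT(NULL, 0)
    .tp_name = "CurvePoint",
    .tp_basicsize = sizeof(BPy_Interface0D),
    .tp_flags = Py_TPFLAGS_DEFAULT | Py_TPFLAGS_BASETYPE,
    /* Instead of duplicating the ancestor's methods, point at it;
     * PyType_Ready() copies every unset slot from tp_base. */
    .tp_base = &Interface0D_Type,
};

/* In the module init function:
 *   if (PyType_Ready(&Interface0D_Type) < 0) return;
 *   if (PyType_Ready(&CurvePoint_Type) < 0) return;
 */
```

Whether this fits depends on how closely the wrappers must mirror the C++ hierarchy, but it would remove most of the copy-with-slight-modifications work.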
Finally, I gave the render buffer another shot. Besides continuing to play around with flags, I rewrote the code from scratch. I noticed that, under Blender, I would get crashes when I created buffers/textures attached to “non-default” attachment points (COLOR_ATTACHMENT2_EXT, for example). Using glCheckFramebufferStatusEXT, I realized that the buffers drawn to and read from must both be set to the one specified by that attachment point. Even so, rendering to that renderbuffer corrupts the back buffer and the resulting image. If anyone has managed to get frame buffer objects working directly within Blender’s rendering pipeline (a great example would be rendering a simple quad to a texture and copying the resulting texture back into a render layer’s float rect), I would be happy to try it on my machine.
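For reference, the draw/read-buffer pairing described above can be sketched with the EXT_framebuffer_object API. This is an illustration of the completeness check, not Blender's actual code, and it assumes a valid GL context with the extension entry points already resolved:

```c
/* Sketch: attaching a texture to a non-default attachment point.
 * With only COLOR_ATTACHMENT2 populated, the FBO reports
 * GL_FRAMEBUFFER_INCOMPLETE_DRAW_BUFFER_EXT unless the draw (and read)
 * buffer is redirected to that same attachment point. */
#include <GL/gl.h>
#include <GL/glext.h>

void setup_fbo(int width, int height)
{
    GLuint fbo, tex;

    glGenFramebuffersEXT(1, &fbo);
    glBindFramebufferEXT(GL_FRAMEBUFFER_EXT, fbo);

    glGenTextures(1, &tex);
    glBindTexture(GL_TEXTURE_2D, tex);
    glTexImage2D(GL_TEXTURE_2D, 0, GL_RGBA8, width, height, 0,
                 GL_RGBA, GL_UNSIGNED_BYTE, NULL);

    /* Attach the texture to attachment point 2, not 0. */
    glFramebufferTexture2DEXT(GL_FRAMEBUFFER_EXT,
                              GL_COLOR_ATTACHMENT2_EXT,
                              GL_TEXTURE_2D, tex, 0);

    /* The crucial step: the buffers drawn to and read from must
     * match the attachment point actually populated. */
    glDrawBuffer(GL_COLOR_ATTACHMENT2_EXT);
    glReadBuffer(GL_COLOR_ATTACHMENT2_EXT);

    if (glCheckFramebufferStatusEXT(GL_FRAMEBUFFER_EXT)
            != GL_FRAMEBUFFER_COMPLETE_EXT) {
        /* handle the incomplete framebuffer */
    }
}
```

This makes the FBO complete, but as noted above it does not explain why rendering into it still corrupts Blender's back buffer.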