I have a problem where the depth of an OpenGL scene is not rendered correctly. I'm doing off-screen rendering into a QOpenGLFramebufferObject. If I run the same code in a QGLWidget, it renders fine. Here is the code:
// SETUP
QSurfaceFormat format;
QWindow window;
window.setSurfaceType(QWindow::OpenGLSurface);
window.setFormat(format);
window.create();
QOpenGLContext context;
context.setFormat(format);
if (!context.create()) {
qFatal("Cannot create the requested OpenGL context!");
}
QOpenGLFramebufferObjectFormat fboFormat;
fboFormat.setAttachment(QOpenGLFramebufferObject::Depth);
QSize drawRectSize(640, 480);
context.makeCurrent(&window);
QOpenGLFramebufferObject fbo(drawRectSize, fboFormat);
fbo.bind();
// OPENGL CODE
glViewport(0, 0, 640, 480);
glEnable(GL_DEPTH_TEST);
glEnable(GL_CULL_FACE);
glShadeModel(GL_FLAT);
glClear(GL_COLOR_BUFFER_BIT | GL_DEPTH_BUFFER_BIT);
glMatrixMode(GL_MODELVIEW);
glLoadIdentity();
glLoadMatrixf(pose->getOpenGLModelView());
glMatrixMode(GL_PROJECTION);
glLoadIdentity();
/* calculate projection parameters */
gluPerspective(fov, aspect, 0.01, 1.0f);
glColor3d(1, 0, 0);
glBegin(GL_TRIANGLES);
/* Draw triangles */
glEnd();
glFlush();
QImage result = fbo.toImage();
Any ideas what I'm doing wrong? I checked GL_DEPTH_BITS and it seems to be 0. Thanks in advance :)
In your code, you never request a depth attachment for your FBO. All you seem to do is
QOpenGLFramebufferObjectFormat fboFormat;
QOpenGLFramebufferObject fbo(drawRectSize, fboFormat);
According to the Qt docs, the QOpenGLFramebufferObjectFormat constructor will just do the following:
By default the format specifies a non-multisample framebuffer object
with no attachments, texture target GL_TEXTURE_2D, and internal format
GL_RGBA8. On OpenGL/ES systems, the default internal format is
GL_RGBA.
Have a look at the Qt Documentation on how to set the correct format.
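For example, a minimal sketch of requesting the attachment before constructing the FBO (reusing the variable names from the question):

QOpenGLFramebufferObjectFormat fboFormat;
fboFormat.setAttachment(QOpenGLFramebufferObject::CombinedDepthStencil); // or ::Depth
QOpenGLFramebufferObject fbo(drawRectSize, fboFormat);
fbo.bind();
// with a depth attachment requested, GL_DEPTH_BITS should now report a non-zero value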
I am currently working on creating OpenGL buffers in Qt for some 3D models (cuboids of various sizes depicting buildings).
I've tried to look into some examples, but I've only found ones for 2D. I tried using the same approach for 3D, but at best I end up with blank images when I convert my buffers to images to check.
The closest I've come (at least an image is being created) was by following https://dangelog.wordpress.com/2013/02/10/using-fbos-instead-of-pbuffers-in-qt-5-2/
My current code is something like:
QOpenGLFramebufferObject* SBuildingEditorUtils::getFboForBuilding(Building bd)
{
glPushMatrix();
QSurfaceFormat format;
format.setMajorVersion(4);
format.setMinorVersion(3);
QWindow window;
window.setSurfaceType(QWindow::OpenGLSurface);
window.setFormat(format);
window.create();
QOpenGLContext context;
context.setFormat(format);
if (!context.create())
qFatal("Cannot create the requested OpenGL context!");
context.makeCurrent(&window);
// TODO: fbo size = building size
QOpenGLFramebufferObjectFormat fboFormat;
fboFormat.setAttachment(QOpenGLFramebufferObject::CombinedDepthStencil);
//fboFormat.setAttachment(QOpenGLFramebufferObject::Depth);
auto fbo = new QOpenGLFramebufferObject(1500, 1500, fboFormat);
auto res = glGetError();
auto bindRet = fbo->bind();
glEnable(GL_DEPTH_TEST);
glMatrixMode(GL_PROJECTION);
glPushMatrix();
glLoadIdentity();
glOrtho(0, 1500, 0, 1500, 0, 1500);
glMatrixMode(GL_MODELVIEW);
glBegin(GL_POLYGON);
glVertex3d(500, 500, 500);
glVertex3d(500, 1000, 1000);
glVertex3d(1000, 1000, 1000);
glVertex3d(1000, 500, 500);
glEnd();
glDisable(GL_DEPTH_TEST);
fbo->toImage().save("uniqueName.png");
fbo->release();
glPopMatrix();
return fbo;
}
Here I've been using the image "uniqueName.png" to test my output. As you can guess, most of the code here is just for testing. Also, this code is part of a larger code base.
Any advice on what I might be missing? While I have some experience with Qt, I lack any formal education / training in OpenGL. Any help would be appreciated.
I wish to know how to get the code to work.
Thanks.
After some trouble I've managed to correctly render to a texture inside a framebuffer object in a Qt 4.8 application: I can open an OpenGL context with a QGLWidget, render to an FBO, and use it as a texture.
Now I need to display the rendered texture in a QPixmap and show it in some other widget in the GUI. But... nothing is shown.
Here are some pieces of the code:
// generate texture, FBO, RBO in the initializeGL
glGenTextures(1, &textureId);
glBindTexture(GL_TEXTURE_2D, textureId);
glGenFramebuffers(1, &fboId);
glBindFramebuffer(GL_FRAMEBUFFER, fboId);
glGenRenderbuffers(1, &rboId);
glBindRenderbuffer(GL_RENDERBUFFER, rboId);
glRenderbufferStorage(GL_RENDERBUFFER, GL_DEPTH_COMPONENT, TEXTURE_WIDTH, TEXTURE_HEIGHT);
glBindRenderbuffer(GL_RENDERBUFFER, 0);
// now in paintGL
glBindFramebuffer(GL_FRAMEBUFFER, fboId);
// .... render into texture code ....
if(showTextureInWidget==false) {
showTextureInWidget = true;
char *pixels;
pixels = new char[TEXTURE_WIDTH * TEXTURE_HEIGHT * 4];
glReadPixels(0, 0, TEXTURE_WIDTH, TEXTURE_HEIGHT, GL_RGB, GL_UNSIGNED_BYTE, pixels);
QPixmap qp = QPixmap(pixels);
QLabel *l = new QLabel();
// /* TEST */ l->setText(QString::fromStdString("dudee"));
l->setPixmap(qp);
QWidget *d = new QWidget;
l->setParent(d);
d->show();
}
glBindFramebuffer(GL_FRAMEBUFFER, 0); // unbind
// now draw the scene with the rendered texture
I see the widget opened but... there is nothing inside it. If I uncomment the test line I see the "dudee" string, so I know there is a QLabel, but there is no image from the QPixmap.
I know that the original data are unsigned char and I'm using char, and I've tried some different color parameters (GL_RGBA, GL_RGB, etc.), but I don't think that's the point; the point is that I don't see anything.
Any advice? If I have to post more code I will do it!
Edit:
I haven't posted all the code, but the point I'd like to make clear is that the texture is correctly rendered as a texture inside a cube. I'm just not able to copy it back from the GPU to the CPU.
Edit 2:
Thanks to peppe's answer I found the problem: I needed a Qt class whose constructor accepts raw pixel data. Here is the complete snippet:
uchar *pixels;
pixels = new uchar[TEXTURE_WIDTH * TEXTURE_HEIGHT * 4];
for(int i=0; i < (TEXTURE_WIDTH * TEXTURE_HEIGHT * 4) ; i++ ) {
pixels[i] = 0;
}
glBindFramebuffer(GL_FRAMEBUFFER, fboId);
glReadPixels( 0,0, TEXTURE_WIDTH, TEXTURE_HEIGHT, GL_RGBA, GL_UNSIGNED_BYTE, pixels);
glBindFramebuffer(GL_FRAMEBUFFER, 0);
qi = QImage(pixels, TEXTURE_WIDTH, TEXTURE_HEIGHT, QImage::Format_ARGB32);
qi = qi.rgbSwapped();
QLabel *l = new QLabel();
l->setPixmap(QPixmap::fromImage(qi));
QWidget *d = new QWidget;
l->setParent(d);
d->show();
Given that that's not all of your code and -- as you say -- the texture is correctly filled, there's a small mistake going on here:
glReadPixels(0, 0, TEXTURE_WIDTH, TEXTURE_HEIGHT, GL_RGB, GL_UNSIGNED_BYTE, pixels);
QPixmap qp = QPixmap(pixels);
The QPixmap(const char *) ctor wants an XPM image, not raw pixels. You need to use one of the QImage ctors to create a valid QImage. (You can also pass ownership to the QImage, solving the fact that you're currently leaking pixels...)
Once you do that, you'll figure out that
the image is flipped vertically, as OpenGL has its origin in the bottom-left corner, growing upwards/rightwards, while Qt assumes the origin in the top-left corner, growing downwards/rightwards;
the channels might be swapped -- i.e. OpenGL is returning data with the wrong endianness. I don't remember in this case whether glPixelStorei(GL_PACK_SWAP_BYTES) or GL_UNSIGNED_INT_8_8_8_8 as the type may help; if not, you may need to resort to a CPU-side loop to fix your pixel data :)
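For example, a sketch of handling both issues on the Qt side, assuming the buffer is read back as GL_RGBA as in the Edit 2 snippet above (same variable names):

glReadPixels(0, 0, TEXTURE_WIDTH, TEXTURE_HEIGHT, GL_RGBA, GL_UNSIGNED_BYTE, pixels);
QImage qi(pixels, TEXTURE_WIDTH, TEXTURE_HEIGHT, QImage::Format_ARGB32);
qi = qi.rgbSwapped();          // fix the channel order
qi = qi.mirrored(false, true); // flip vertically (OpenGL's origin is bottom-left)
l->setPixmap(QPixmap::fromImage(qi));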
I am having trouble with rendering an OpenGL scene.
The background is that I want to display the frames from a video capture device in a preview window. I am using OpenCV and Qt, and to test I am capturing from my MacBook webcam. The preview window is 200x200 and the captured frame is 640x480. I am not worried about maintaining aspect ratios.
Other info from the IplImage struct:
Debug: channels: 3
Debug: depth: 8
Debug: dataOrder: 0
Debug: align: 4. Alignment of image rows 4 or 8
Debug: origin: 0. 0=top left, 1=bottom left
Debug: widthStep: 2560.
Debug: colorModel: RGB
So this image shows the current state of affairs.
Current capture http://clinsoftsolutions.com/fgvc4.png
I started out using glDrawPixels, but this didn't work well: I got output, but no scaling.
Currently I am trying textures, and here is the code I am using for the GL interactions:
void VideoCaptureWidget::initializeGL()
{
qDebug("initializeGL called");
qglClearColor(QColor::fromRgb(0,0,0)); // set clear colour to black
glDisable(GL_DEPTH_TEST);
glMatrixMode(GL_PROJECTION);
glLoadIdentity();
glOrtho(0, this->width(), this->height(), 0.0f, 0.0f, 1.0f);
glMatrixMode(GL_MODELVIEW);
glLoadIdentity();
glEnable(GL_TEXTURE_2D);
glGenTextures(1, &m_texture); // generate one texture name
glBindTexture(GL_TEXTURE_2D, m_texture);
glTexParameteri(GL_TEXTURE_2D,GL_TEXTURE_MAG_FILTER,GL_NEAREST);
glTexParameteri(GL_TEXTURE_2D,GL_TEXTURE_MIN_FILTER,GL_NEAREST);
glBindTexture(GL_TEXTURE_2D,m_texture);
glTexImage2D(GL_TEXTURE_2D,0,GL_RGB,this->width(),this->height(),0,GL_BGR,GL_UNSIGNED_BYTE,NULL);
glDisable(GL_TEXTURE_2D);
}
void VideoCaptureWidget::paintGL()
{
glClear(GL_COLOR_BUFFER_BIT | GL_DEPTH_BUFFER_BIT);
glDisable(GL_DEPTH_TEST);
glMatrixMode(GL_PROJECTION);
glLoadIdentity();
glOrtho(0.0f,this->width(),this->height(),0.0f,0.0f,1.0f);
glMatrixMode(GL_MODELVIEW);
glLoadIdentity();
glEnable(GL_TEXTURE_2D);
glBindTexture(GL_TEXTURE_2D,m_texture);
glTexImage2D(GL_TEXTURE_2D,0,GL_RGB,m_image->width,m_image->height,0,GL_BGR_EXT,GL_UNSIGNED_BYTE,m_image->imageData);
glBegin(GL_QUADS);
glTexCoord2i(0,1); glVertex2i(0,this->height());
glTexCoord2i(0,0); glVertex2i(0,0);
glTexCoord2i(1,0); glVertex2i(this->width(),0);
glTexCoord2i(1,1); glVertex2i(this->width(),this->height());
glEnd();
glFlush();
}
void VideoCaptureWidget::resizeGL(int width,int height)
{
qDebug("reszieGL called");
glViewport(0,0,this->width(),this->height());
glMatrixMode(GL_PROJECTION);
glLoadIdentity();
glOrtho(0.0f,this->width(),this->height(),0.0f,0.0f,1.0f);
glMatrixMode(GL_MODELVIEW);
glLoadIdentity();
}
The image is captured in a timer slot, and m_image is an IplImage pointer:
void VideoCaptureWidget::_captureFrame() {
m_image = cvQueryFrame(m_capture);
if(!m_image) {
qDebug("VideoCaptureWidget::_captureFrame(): Error capturing a frame...");
}
//Draw the scene
glDraw();
}
I am really hoping someone will have seen this sort of distorted image before and know what the problem is.
Another approach is to convert the BGR frames to RGBA before uploading them to the GPU with glTexImage2D(). I uploaded a complete demo in my repository, check cvQTcameraGL.
Here's the relevant code:
// Note: trying to retrieve more frames than the camera can give you
// will make the output video blink a lot.
cv_capture >> cv_frame;
if (cv_frame.empty())
{
std::cout << "GLWidget::paintGL: !!! Failed to retrieve frame" << std::endl;
return;
}
cv::cvtColor(cv_frame, cv_frame, CV_BGR2RGBA);
glEnable(GL_TEXTURE_RECTANGLE_ARB);
// Typical texture generation using data from the bitmap
glBindTexture(GL_TEXTURE_RECTANGLE_ARB, _texture);
// Transfer image data to the GPU
glTexImage2D(GL_TEXTURE_RECTANGLE_ARB, 0,
GL_RGBA, cv_frame.cols, cv_frame.rows, 0,
GL_RGBA, GL_UNSIGNED_BYTE, cv_frame.data);
if (glGetError() != GL_NO_ERROR)
{
std::cout << "GLWidget::paintGL: !!! Failed glTexImage2D" << std::endl;
}
A bit more thinking and looking at the image made me figure it must be a byte / channel alignment problem. A bit more googling then pointed me in the direction of
glPixelStorei with GL_UNPACK_ALIGNMENT and then also GL_UNPACK_ROW_LENGTH
The final code that now works is:
glPixelStorei(GL_UNPACK_ALIGNMENT, 4 );
glPixelStorei(GL_UNPACK_ROW_LENGTH, m_image->widthStep/m_image->nChannels);
This needs to be added above the glTexImage2D call.
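In context, the upload in paintGL() then looks roughly like this (same member variables as in the question):

glPixelStorei(GL_UNPACK_ALIGNMENT, 4);
glPixelStorei(GL_UNPACK_ROW_LENGTH, m_image->widthStep / m_image->nChannels);
glTexImage2D(GL_TEXTURE_2D, 0, GL_RGB, m_image->width, m_image->height,
             0, GL_BGR_EXT, GL_UNSIGNED_BYTE, m_image->imageData);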
It's a weird format for the image data, but that is the output you get from the Apple iSight built-in webcam. Hope this helps someone else.
I have a subclass "GLWidget" of QGLWidget where I render a 3D scene. I can rotate and zoom it.
From a QTimer I call the following function:
void GLWidget::processCurrent()
{
draw();
printStats();
glFlush();
swapBuffers();
}
and the main OpenGL render function "draw()" is:
void GLWidget::draw()
{
if (isDisplayFirst)
{
isDisplayFirst = false;
glViewport(0, 0, w_width, w_height);
glMatrixMode(GL_PROJECTION); // Select The Projection Matrix
glLoadIdentity(); // Reset The Projection Matrix
glMatrixMode(GL_MODELVIEW); // Select The Modelview Matrix
glLoadIdentity(); // Reset The Modelview Matrix
gluPerspective(45.0f, (float)w_width / w_height, g_nearPlane, g_farPlane);
gluLookAt (0.0, 0.0, 3.0, 0.0, 0.0, 0.0, 0.0, 1.0, 0.0);
glScalef(0.03f, 0.03f, 0.03f);
}
rotateScene();
glClearColor(0.0 ,0.0, 0.0, 0.0);
glClear(GL_COLOR_BUFFER_BIT | GL_DEPTH_BUFFER_BIT);
glEnable(GL_POINT_SPRITE);
glTexEnvi(GL_POINT_SPRITE, GL_COORD_REPLACE, GL_TRUE);
glEnable(GL_VERTEX_PROGRAM_POINT_SIZE_NV);
glEnable(GL_BLEND);
glBlendFunc (GL_SRC_ALPHA, GL_ONE);
GLuint vbo_disk = 0;
glBindBuffer(GL_ARRAY_BUFFER, vbo_disk);
glVertexPointer(4, GL_DOUBLE, 4*sizeof(double), Galaxy->pos);
glEnableClientState(GL_VERTEX_ARRAY);
glColor4f(1.0f, 1.0f, 1.0f, 0.2f);
glDrawArrays(GL_POINTS, 0, Galaxy->getNumParticles_disk());
glBindBuffer(GL_ARRAY_BUFFER, 0);
glDisableClientState(GL_VERTEX_ARRAY);
glDisable(GL_BLEND);
glDisable(GL_POINT_SPRITE);
}
I have already rendered 2D text (statistics) on this GLWidget with the "printStats()" function. I would now like to draw a "2D scale line" on top of this 3D scene.
I managed to draw this line by adding the following to printStats():
glDisable(GL_TEXTURE_2D);
glColor3f(1.0f, 1.0f, 1.0f);
glLineWidth(3.0f);
glBegin(GL_LINES);
glVertex2d(5, 40);
glVertex2d(20, 40);
glEnd();
glEnable(GL_TEXTURE_2D);
but my problem is that when I rotate or zoom, this line moves with the 3D scene.
What is the way to keep this 2D line fixed on screen while still being able to rotate and zoom the 3D object?
I want to avoid using "overpainting" by overriding paintEvent().
I also tried using glPushMatrix()/glPopMatrix(), but there are conflicts between the 2D and 3D rendering.
Could you give me some advice on how to achieve this?
You are trying to implement HUD functionality. In principle, what you need to do is draw your 3D scene normally, then back up all the OpenGL state you want to reuse and set up a new projection matrix with gluOrtho2D or glOrtho. Additionally, you want to turn off the depth test.
After the call to the ortho function you will most likely be drawing in screen coordinates, so you will have to calculate the size of your line accordingly. Once you are done with the HUD, you can pop the changes off the matrix stack and continue drawing the 3D scene.
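A minimal sketch of that HUD pass (fixed-function pipeline, assuming w_width and w_height are the widget dimensions used elsewhere in the question's code), placed after the 3D drawing, e.g. inside printStats():

// 3D scene has been drawn; switch to a screen-space projection for the HUD
glMatrixMode(GL_PROJECTION);
glPushMatrix();                       // back up the 3D projection
glLoadIdentity();
gluOrtho2D(0, w_width, w_height, 0);  // pixel coordinates, origin top-left
glMatrixMode(GL_MODELVIEW);
glPushMatrix();                       // back up the 3D modelview
glLoadIdentity();
glDisable(GL_DEPTH_TEST);

// ... draw the 2D scale line / statistics text here ...

glEnable(GL_DEPTH_TEST);              // re-enable only if your 3D pass uses it
glMatrixMode(GL_MODELVIEW);
glPopMatrix();                        // restore the 3D matrices
glMatrixMode(GL_PROJECTION);
glPopMatrix();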
There are a couple of questions on Stack Overflow that deal with drawing an OpenGL HUD:
OpenGl 2D HUD over 3D
OpenGl 2d HUD in 3D Application
and a pretty decent discussion on the gamedev forum
http://www.gamedev.net/topic/388298-opengl-hud/
Small version of my question
In a QGraphicsItem::paint() function I have a QGLFramebufferObject. How do I get it onto the paint device of the painter that is passed as an argument? (Provided that the QGraphicsItem is in a scene that is being rendered by a QGraphicsView that has a QGLWidget as viewport, so the painter is using the OpenGL engine.)
QGraphicsItem::paint(QPainter* painter, ...)
{
QGLFramebufferObject fbo;
QPainter p(&fbo);
... // Some painting code on the fbo
p.end();
// What now? How to get the fbo content drawn on the painter?
}
I have looked at the framebufferobject and pbuffer examples provided with Qt. There the fbo/pbuffer is drawn in a QGLWidget using custom OpenGL code. Is it possible to do the same thing within a paint() method of a QGraphicsItem and take the position of the QGraphicsItem in the scene/view into account?
Big version of my question
Situation sketch
I have a QGraphicsScene. In it is an item that has a QGraphicsEffect (own implementation by overriding draw() from QGraphicsEffect). The scene is rendered by a QGraphicsView that has a QGLWidget as viewport.
In QGraphicsEffect::draw(QPainter*) I have to generate some pixmap which I then want to draw using the painter provided (the painter has the QGLWidget as paint device). Constructing the pixmap is a combination of several draw calls and I want these to be done in hardware.
Simplified example (I don't call sourcePixmap() in my draw() method as it is not needed to demonstrate my problem):
class OwnGraphicsEffect : public QGraphicsEffect
{
    virtual void draw(QPainter* painter);
};
void OwnGraphicsEffect::draw(QPainter* painter)
{
QRect rect(0,0,100,100);
QGLPixelBuffer pbuffer(rect.size(), QGLFormat(QGL::Rgba));
QPainter p(&pbuffer);
p.fillRect(rect, Qt::transparent);
p.end();
painter->drawImage(QPoint(0,0), pbuffer.toImage(), rect);
}
Actual problem
My concerns are with the last line of my code: pbuffer.toImage(). I don't want to use this, because the conversion to a QImage is too expensive. Is there a way to get a pixmap from my QGLPixelBuffer and then use painter->drawPixmap()?
I know I can also copy the pbuffer into a texture using:
GLuint dynamicTexture = pbuffer.generateDynamicTexture();
pbuffer.updateDynamicTexture(dynamicTexture);
but I have no idea how to get this texture onto the "painter".
Extending leemes' answer, here is a solution which can also handle multisample framebuffer objects.
First, if you want to draw on a QGLWidget, you can simply use the OpenGL commands
leemes suggested in his answer. Note that there is a ready-to-use drawTexture()
command available, which simplifies this code to the following:
void Widget::drawFBO(QPainter &painter, QGLFramebufferObject &fbo, QRect target)
{
painter.beginNativePainting();
drawTexture(target, fbo.texture());
painter.endNativePainting();
}
To draw multisample FBOs, you can convert them into non-multisample ones
using QGLFramebufferObject::blitFramebuffer (Note that not every hardware
/ driver combination supports this feature!):
if(fbo.format().samples() > 1)
{
QGLFramebufferObject texture(fbo.size()); // the non-multisampled fbo
QGLFramebufferObject::blitFramebuffer(
&texture, QRect(0, 0, fbo.width(), fbo.height()),
&fbo, QRect(0, 0, fbo.width(), fbo.height()));
drawTexture(targetRect, texture.texture());
}
else
drawTexture(targetRect, fbo.texture());
However, as far as I know, you can't draw using OpenGL commands on a non-OpenGL paint engine. In that case, you first need to convert the framebuffer to a (software) image, for example a QImage via fbo.toImage(), and draw that with your QPainter instead of the fbo directly.
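A sketch of that fallback, choosing the path based on the painter's paint engine (targetRect here stands for whatever destination rectangle you are using):

QPaintEngine::Type engine = painter.paintEngine()->type();
if (engine == QPaintEngine::OpenGL || engine == QPaintEngine::OpenGL2)
    drawFBO(painter, fbo, targetRect);            // GPU path shown above
else
    painter.drawImage(targetRect, fbo.toImage()); // software fallback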
I think I figured it out. I use QPainter::beginNativePainting() to mix OpenGL commands into a paint event:
void Widget::drawFBO(QPainter &painter, QGLFramebufferObject &fbo, QRect target)
{
painter.beginNativePainting();
glEnable(GL_TEXTURE_2D);
glTexEnvi(GL_TEXTURE_ENV, GL_TEXTURE_ENV_MODE, GL_REPLACE);
glBindTexture(GL_TEXTURE_2D, fbo.texture());
glBegin(GL_QUADS);
glTexCoord2d(0.0,1.0); glVertex2d(target.left(), target.top());
glTexCoord2d(1.0,1.0); glVertex2d(target.right() + 1, target.top());
glTexCoord2d(1.0,0.0); glVertex2d(target.right() + 1, target.bottom() + 1);
glTexCoord2d(0.0,0.0); glVertex2d(target.left(), target.bottom() + 1);
glEnd();
painter.endNativePainting();
}
I hope this will also work in the paint() function of a QGraphicsItem (where the QGraphicsView uses a QGLWidget as the viewport), since I only tested it directly in QGLWidget::paintEvent().
There is, however, still the following problem: I don't know how to paint a multisample framebuffer. The documentation of QGLFramebufferObject::texture() says:
If a multisample framebuffer object is used then the value returned from this function will be invalid.