I'm doing an OpenGL project in Qt with the new QOpenGLWidget class.
I'm just trying to get the OpenGL world coordinates of my mouse when I click.
I've found a lot of ways to do it, but some were obsolete and the others don't work. Then I found the QVector3D::unproject() function, but I don't get the right coordinates.
This is part of my code so far:
QMatrix4x4 modelView;
modelView.setToIdentity();
modelView.lookAt(m_camera.getPosition(), m_camera.getTarget(), m_camera.getUp());
QMatrix4x4 projection;
projection.perspective(70.0, width() / height(), 0.1, 120.0);
GLint view[4];
glGetIntegerv(GL_VIEWPORT, &view[0]);
QVector3D worldPosition = QVector3D(event->pos().x(), event->pos().y(), 0).unproject(modelView, projection, QRect(view[0], view[1], view[2], view[3]));
qDebug() << worldPosition;
Do you know why I get (1.96532, -3.93444, 4.93216) where I should have (-0.5, -0.5, 0.0)?
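For reference, two likely contributors: width() / height() is an integer division, so the aspect ratio passed to perspective() is truncated, and unproject() expects GL-style window coordinates, where y grows upwards and z in [0, 1] selects a depth between the near and far planes, so z = 0 yields a point on the near plane rather than on the clicked geometry. A sketch of the usual correction, reusing the matrices above, building a pick ray and intersecting it with the z = 0 plane:
QMatrix4x4 projection; // rebuilt with a floating-point aspect ratio
projection.perspective(70.0f, float(width()) / float(height()), 0.1f, 120.0f);
float mx = event->pos().x();
float my = height() - event->pos().y(); // flip y: Qt's origin is top-left, OpenGL's is bottom-left
QRect viewport(view[0], view[1], view[2], view[3]);
// z = 0 and z = 1 unproject to the near and far planes, giving a pick ray...
QVector3D nearPoint = QVector3D(mx, my, 0.0f).unproject(modelView, projection, viewport);
QVector3D farPoint = QVector3D(mx, my, 1.0f).unproject(modelView, projection, viewport);
// ...which can then be intersected with, e.g., the z = 0 plane.
QVector3D dir = (farPoint - nearPoint).normalized();
float t = -nearPoint.z() / dir.z();
QVector3D worldPosition = nearPoint + t * dir;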
After some trouble I've managed to correctly render to texture inside a framebuffer object in a Qt 4.8 application: I can open an OpenGL context with a QGLWidget, render to an FBO, and use it as a texture.
Now I need to display the rendered texture in a QPixmap and show it in some other widget in the GUI. But... nothing is shown.
Here are some pieces of code:
// generate texture, FBO, RBO in the initializeGL
glGenTextures(1, &textureId);
glBindTexture(GL_TEXTURE_2D, textureId);
glGenFramebuffers(1, &fboId);
glBindFramebuffer(GL_FRAMEBUFFER, fboId);
glGenRenderbuffers(1, &rboId);
glBindRenderbuffer(GL_RENDERBUFFER, rboId);
glRenderbufferStorage(GL_RENDERBUFFER, GL_DEPTH_COMPONENT, TEXTURE_WIDTH, TEXTURE_HEIGHT);
glBindRenderbuffer(GL_RENDERBUFFER, 0);
// now in paintGL
glBindFramebuffer(GL_FRAMEBUFFER, fboId);
// .... render into texture code ....
if(showTextureInWidget==false) {
showTextureInWidget = true;
char *pixels;
pixels = new char[TEXTURE_WIDTH * TEXTURE_HEIGHT * 4];
glReadPixels(0, 0, TEXTURE_WIDTH, TEXTURE_HEIGHT, GL_RGB, GL_UNSIGNED_BYTE, pixels);
QPixmap qp = QPixmap(pixels);
QLabel *l = new QLabel();
// /* TEST */ l->setText(QString::fromStdString("dudee"));
l->setPixmap(qp);
QWidget *d = new QWidget;
l->setParent(d);
d->show();
}
glBindFramebuffer(GL_FRAMEBUFFER, 0); // unbind
// now draw the scene with the rendered texture
I see the widget open but there is nothing inside it. If I uncomment the test line, I see the "dudee" string, so I know there is a QLabel, but there is no image from the QPixmap.
I know that the original data are unsigned char and I'm using char, and I've tried some different color parameters (GL_RGBA, GL_RGB, etc.), but I don't think that's the point: the point is that I don't see anything.
Any advice? If I need to post more code, I will!
Edit:
I haven't posted all the code, but I'd like to make clear that the texture is correctly rendered as a texture inside a cube. I'm just not able to get it back from the GPU to the CPU.
Edit 2:
Thanks to peppe's answer I found the problem: I needed a Qt class that accepts raw pixel data in its constructor. Here is the complete snippet:
uchar *pixels;
pixels = new uchar[TEXTURE_WIDTH * TEXTURE_HEIGHT * 4];
for (int i = 0; i < (TEXTURE_WIDTH * TEXTURE_HEIGHT * 4); i++) {
pixels[i] = 0;
}
glBindFramebuffer(GL_FRAMEBUFFER, fboId);
glReadPixels( 0,0, TEXTURE_WIDTH, TEXTURE_HEIGHT, GL_RGBA, GL_UNSIGNED_BYTE, pixels);
glBindFramebuffer(GL_FRAMEBUFFER, 0);
qi = QImage(pixels, TEXTURE_WIDTH, TEXTURE_HEIGHT, QImage::Format_ARGB32);
qi = qi.rgbSwapped();
QLabel *l = new QLabel();
l->setPixmap(QPixmap::fromImage(qi));
QWidget *d = new QWidget;
l->setParent(d);
d->show();
Given that that's not all of your code and -- as you say -- the texture is correctly filled, there's a little mistake going on here:
glReadPixels(0, 0, TEXTURE_WIDTH, TEXTURE_HEIGHT, GL_RGB, GL_UNSIGNED_BYTE, pixels);
QPixmap qp = QPixmap(pixels);
The QPixmap(const char *) ctor wants an XPM image, not raw pixels. You need to use one of the QImage ctors to create a valid QImage. (You can also pass ownership to the QImage, which fixes the fact that you're currently leaking pixels...)
Once you do that, you'll figure out that
the image is flipped vertically, as OpenGL has its origin in the bottom-left corner, growing upwards/rightwards, while Qt assumes the origin in the top-left corner, growing downwards/rightwards;
the channels might be swapped -- i.e. OpenGL is returning data with the wrong endianness. I don't remember in this case whether glPixelStorei(GL_PACK_SWAP_BYTES) or GL_UNSIGNED_INT_8_8_8_8 as the type may help; eventually you may need to resort to a CPU-side loop to fix your pixel data :)
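A sketch of how those two fixes combine with the readback (hypothetical helper; assumes an RGBA color attachment and a current GL context; note that this QImage ctor does not take ownership of the buffer, hence the copy()):
QImage grabFbo(GLuint fboId, int w, int h)
{
std::vector<uchar> pixels(w * h * 4);
glBindFramebuffer(GL_FRAMEBUFFER, fboId);
glReadPixels(0, 0, w, h, GL_RGBA, GL_UNSIGNED_BYTE, pixels.data());
glBindFramebuffer(GL_FRAMEBUFFER, 0);
QImage img(pixels.data(), w, h, QImage::Format_ARGB32);
// copy() detaches from the temporary buffer, rgbSwapped() fixes the channel order,
// mirrored() flips the image back to Qt's top-left origin.
return img.copy().rgbSwapped().mirrored();
}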
I have a subclass GLWidget of QGLWidget in which I render a 3D scene. I can rotate and zoom it.
I call the following function with a QTimer:
void GLWidget::processCurrent()
{
draw();
printStats();
glFlush();
swapBuffers();
}
and the main OpenGL render function draw() is:
void GLWidget::draw()
{
if (isDisplayFirst)
{
isDisplayFirst = false;
glViewport(0, 0, w_width, w_height);
glMatrixMode(GL_PROJECTION); // Select The Projection Matrix
glLoadIdentity(); // Reset The Projection Matrix
glMatrixMode(GL_MODELVIEW); // Select The Modelview Matrix
glLoadIdentity(); // Reset The Modelview Matrix
gluPerspective(45.0f, (float)w_width / w_height, g_nearPlane, g_farPlane);
gluLookAt (0.0, 0.0, 3.0, 0.0, 0.0, 0.0, 0.0, 1.0, 0.0);
glScalef(0.03f, 0.03f, 0.03f);
}
rotateScene();
glClearColor(0.0 ,0.0, 0.0, 0.0);
glClear(GL_COLOR_BUFFER_BIT | GL_DEPTH_BUFFER_BIT);
glEnable(GL_POINT_SPRITE);
glTexEnvi(GL_POINT_SPRITE, GL_COORD_REPLACE, GL_TRUE);
glEnable(GL_VERTEX_PROGRAM_POINT_SIZE_NV);
glEnable(GL_BLEND);
glBlendFunc (GL_SRC_ALPHA, GL_ONE);
GLuint vbo_disk = 0;
glBindBuffer(GL_ARRAY_BUFFER, vbo_disk);
glVertexPointer(4, GL_DOUBLE, 4*sizeof(double), Galaxy->pos);
glEnableClientState(GL_VERTEX_ARRAY);
glColor4f(1.0f, 1.0f, 1.0f, 0.2f);
glDrawArrays(GL_POINTS, 0, Galaxy->getNumParticles_disk());
glBindBuffer(GL_ARRAY_BUFFER, 0);
glDisableClientState(GL_VERTEX_ARRAY);
glDisable(GL_BLEND);
glDisable(GL_POINT_SPRITE);
}
I have already rendered 2D text (statistics) on this GLWidget with the printStats() function. I would now like to draw a 2D scale line on this 3D scene.
I succeeded in drawing this line by doing the following in printStats():
glDisable(GL_TEXTURE_2D);
glColor3f(1.0f, 1.0f, 1.0f);
glLineWidth(3.0f);
glBegin(GL_LINES);
glVertex2d(5, 40);
glVertex2d(20, 40);
glEnd();
glEnable(GL_TEXTURE_2D);
but my problem is that when I rotate or zoom, this line moves with the 3D scene.
What is the way to keep this 2D line fixed while still being able to manipulate the 3D object?
I want to avoid using "overpainting" by overriding paintEvent.
I also tried to use glPushMatrix/glPopMatrix but there are conflicts between 2D and 3D.
Could you give me some advice on how to achieve this?
You are trying to implement HUD functionality. In principle, you need to draw your 3D scene normally, then back up all the OpenGL states that you want to reuse and set up a new projection matrix with gluOrtho2D or glOrtho. Additionally, you want to turn off the depth test.
After the call to the ortho function you will most likely be drawing in screen coordinates, so you will have to calculate the size of your line accordingly. Once you are done with the HUD, you can pop the changes off the stack so you can draw the 3D scene again.
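A minimal sketch of that approach in the fixed pipeline, reusing w_width/w_height and the scale line from the question (drawn after the 3D scene, e.g. at the end of draw() or inside printStats()):
glMatrixMode(GL_PROJECTION);
glPushMatrix(); // back up the 3D projection
glLoadIdentity();
glOrtho(0, w_width, 0, w_height, -1, 1); // screen-space coordinates
glMatrixMode(GL_MODELVIEW);
glPushMatrix(); // back up the 3D modelview
glLoadIdentity();
glDisable(GL_DEPTH_TEST); // the HUD is always drawn on top
glColor3f(1.0f, 1.0f, 1.0f);
glLineWidth(3.0f);
glBegin(GL_LINES);
glVertex2d(5, 40);
glVertex2d(20, 40);
glEnd();
glEnable(GL_DEPTH_TEST);
glMatrixMode(GL_MODELVIEW);
glPopMatrix(); // restore the 3D modelview
glMatrixMode(GL_PROJECTION);
glPopMatrix(); // restore the 3D projection
glMatrixMode(GL_MODELVIEW);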
There are a couple of questions on Stack Overflow that deal with drawing an OpenGL HUD:
OpenGl 2D HUD over 3D
OpenGl 2d HUD in 3D Application
and a pretty decent discussion on the GameDev forum:
http://www.gamedev.net/topic/388298-opengl-hud/
I am trying to write a modern OpenGL (programmable pipeline) program using the Qt SDK. The Qt OpenGL examples show only the fixed-pipeline implementation, and the documentation on how to initialize a shader program is very poor. This is the best example they have of how to set up a shader program and load shaders: http://doc.trolltech.com/4.6/qglshaderprogram.html#details
This is not very descriptive, as one can see.
I tried to follow this doc and can't get the shader program working. I get a segmentation fault when the program tries to assign attributes to the shaders. I think the problem is that I access the context in the wrong way, but I can't find any reference on how to set up or retrieve the rendering context. My code goes like this:
static GLfloat const triangleVertices[] = {
60.0f, 10.0f, 0.0f,
110.0f, 110.0f, 0.0f,
10.0f, 110.0f, 0.0f
};
QColor color(0, 255, 0, 255);
int vertexLocation =0;
int matrixLocation =0;
int colorLocation =0;
QGLShaderProgram *pprogram=0;
void OpenGLWrapper::initShaderProgram(){
QGLContext context(QGLFormat::defaultFormat());
QGLShaderProgram program(context.currentContext());
pprogram=&program;
program.addShaderFromSourceCode(QGLShader::Vertex,
"attribute highp vec4 vertex;\n"
"attribute mediump mat4 matrix;\n"
"void main(void)\n"
"{\n"
" gl_Position = matrix * vertex;\n"
"}");
program.addShaderFromSourceCode(QGLShader::Fragment,
"uniform mediump vec4 color;\n"
"void main(void)\n"
"{\n"
" gl_FragColor = color;\n"
"}");
program.link();
program.bind();
vertexLocation= pprogram->attributeLocation("vertex");
matrixLocation= pprogram->attributeLocation("matrix");
colorLocation= pprogram->uniformLocation("color");
}
And here is the rendering loop:
void OpenGLWrapper::paintGL()
{
glClear(GL_COLOR_BUFFER_BIT | GL_DEPTH_BUFFER_BIT);
QMatrix4x4 pmvMatrix;
pmvMatrix.ortho(rect());
pprogram->enableAttributeArray(vertexLocation);
pprogram->setAttributeArray(vertexLocation, triangleVertices, 3);
pprogram->setUniformValue(matrixLocation, pmvMatrix);
pprogram->setUniformValue(colorLocation, color);
glDrawArrays(GL_TRIANGLES, 0, 3);
pprogram->disableAttributeArray(vertexLocation);
}
Can anybody help with this setup? Thanks a lot.
You create a local program variable and let your pprogram pointer point to its address. But when initShaderProgram returns, the local program's lifetime ends and pprogram points to garbage, hence the segfault when you try to use it. You should rather create the program dynamically and let Qt handle the memory management:
pprogram = new QGLShaderProgram(context.currentContext(), this);
This assumes OpenGLWrapper derives somehow from QObject; if not, you need to delete the program manually in the destructor (or use some smart pointer, or whatever).
Otherwise your initialization code looks quite reasonable. Your matrix variable should be a uniform and not an attribute, but I'm willing to classify this as a typo. You should also not keep the program bound for its whole lifetime, as binding is equivalent to a call to glUseProgram; rather use bind() (and release(), which does glUseProgram(0)) in your render routine.
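A sketch of the initialization and render routine with those changes applied (assuming OpenGLWrapper is a QGLWidget subclass, so it is a QObject and can parent the program):
void OpenGLWrapper::initShaderProgram()
{
pprogram = new QGLShaderProgram(this); // lives as long as the widget
pprogram->addShaderFromSourceCode(QGLShader::Vertex,
"attribute highp vec4 vertex;\n"
"uniform mediump mat4 matrix;\n" // uniform, not attribute
"void main(void) { gl_Position = matrix * vertex; }");
pprogram->addShaderFromSourceCode(QGLShader::Fragment,
"uniform mediump vec4 color;\n"
"void main(void) { gl_FragColor = color; }");
pprogram->link();
vertexLocation = pprogram->attributeLocation("vertex");
matrixLocation = pprogram->uniformLocation("matrix");
colorLocation = pprogram->uniformLocation("color");
}
void OpenGLWrapper::paintGL()
{
glClear(GL_COLOR_BUFFER_BIT | GL_DEPTH_BUFFER_BIT);
QMatrix4x4 pmvMatrix;
pmvMatrix.ortho(rect());
pprogram->bind(); // glUseProgram
pprogram->enableAttributeArray(vertexLocation);
pprogram->setAttributeArray(vertexLocation, triangleVertices, 3);
pprogram->setUniformValue(matrixLocation, pmvMatrix);
pprogram->setUniformValue(colorLocation, color);
glDrawArrays(GL_TRIANGLES, 0, 3);
pprogram->disableAttributeArray(vertexLocation);
pprogram->release(); // glUseProgram(0)
}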
In my experience the Qt wrappers for OpenGL objects are rather poor and limited, so I just made a thin wrapper for straight OpenGL objects (made cross-platform and easy via GLEW) and made the usual OpenGL calls in a QGLWidget. It worked without problems, after struggling for a while with Qt's equivalents.
Small version of my question
In a QGraphicsItem::paint() function I have a QGLFramebufferObject. How do I get it onto the paint device of the painter that is passed as an argument? (Provided that the QGraphicsItem is in a scene being rendered by a QGraphicsView that has a QGLWidget as viewport, so the painter uses the OpenGL engine.)
QGraphicsItem::paint(QPainter* painter, ...)
{
QGLFramebufferObject fbo(boundingRect().size().toSize());
QPainter p(&fbo);
... // Some painting code on the fbo
p.end();
// What now? How to get the fbo content drawn on the painter?
}
I have looked at the framebufferobject and pbuffer examples provided with Qt. There the FBO/pbuffer is drawn in a QGLWidget using custom OpenGL code. Is it possible to do the same thing within a paint() method of a QGraphicsItem and take the position of the QGraphicsItem in the scene/view into account?
Big version of my question
Situation sketch
I have a QGraphicsScene. In it is an item that has a QGraphicsEffect (my own implementation, overriding draw() from QGraphicsEffect). The scene is rendered by a QGraphicsView that has a QGLWidget as viewport.
In QGraphicsEffect::draw(QPainter*) I have to generate a pixmap which I then want to draw using the painter provided (the painter has the QGLWidget as paint device). Constructing the pixmap is a combination of several draw calls, and I want these to be done in hardware.
Simplified example (I don't call sourcePixmap() in my draw() method as it is not needed to demonstrate my problem):
class OwnGraphicsEffect: public QGraphicsEffect
{
virtual void draw(QPainter* painter);
}
void OwnGraphicsEffect::draw(QPainter* painter)
{
QRect rect(0,0,100,100);
QGLPixelBuffer pbuffer(rect.size(), QGLFormat(QGL::Rgba));
QPainter p(&pbuffer);
p.fillRect(rect, Qt::transparent);
p.end();
painter->drawImage(QPoint(0, 0), pbuffer.toImage(), rect);
}
Actual problem
My concern is with the last line of my code: pbuffer.toImage(). I don't want to use this; I don't want a QImage conversion for performance reasons. Is there a way to get a pixmap from my QGLPixelBuffer and then use painter->drawPixmap()?
I know I can also copy the pbuffer into a texture using:
GLuint dynamicTexture = pbuffer.generateDynamicTexture();
pbuffer.updateDynamicTexture(dynamicTexture);
but I have no idea how to get this texture onto the "painter".
Extending leemes' answer, here is a solution which can also handle multisample framebuffer objects.
First, if you want to draw on a QGLWidget, you can simply use the OpenGL commands
leemes suggested in his answer. Note that there is a ready-to-use drawTexture()
command available, which simplifies this code to the following:
void Widget::drawFBO(QPainter &painter, QGLFramebufferObject &fbo, QRect target)
{
painter.beginNativePainting();
drawTexture(target, fbo.texture());
painter.endNativePainting();
}
To draw multisample FBOs, you can convert them into non-multisample ones
using QGLFramebufferObject::blitFramebuffer (Note that not every hardware
/ driver combination supports this feature!):
if(fbo.format().samples() > 1)
{
QGLFramebufferObject texture(fbo.size()); // the non-multisampled fbo
QGLFramebufferObject::blitFramebuffer(
&texture, QRect(0, 0, fbo.width(), fbo.height()),
&fbo, QRect(0, 0, fbo.width(), fbo.height()));
drawTexture(targetRect, texture.texture());
}
else
drawTexture(targetRect, fbo.texture());
However, as far as I know, you can't draw using OpenGL commands on a non-OpenGL context.
For this, you first need to convert the framebuffer to a (software) image, like
a QImage using fbo.toImage() and draw this using your QPainter instead of the
fbo directly.
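For that software path, the sketch reduces to something like:
// Non-OpenGL paint engine: convert the FBO contents to a QImage and let QPainter draw it.
painter.drawImage(target, fbo.toImage());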
I think I figured it out. I use QPainter::beginNativePainting() to mix OpenGL commands into a paint event:
void Widget::drawFBO(QPainter &painter, QGLFramebufferObject &fbo, QRect target)
{
painter.beginNativePainting();
glEnable(GL_TEXTURE_2D);
glTexEnvi(GL_TEXTURE_ENV, GL_TEXTURE_ENV_MODE, GL_REPLACE);
glBindTexture(GL_TEXTURE_2D, fbo.texture());
glBegin(GL_QUADS);
glTexCoord2d(0.0,1.0); glVertex2d(target.left(), target.top());
glTexCoord2d(1.0,1.0); glVertex2d(target.right() + 1, target.top());
glTexCoord2d(1.0,0.0); glVertex2d(target.right() + 1, target.bottom() + 1);
glTexCoord2d(0.0,0.0); glVertex2d(target.left(), target.bottom() + 1);
glEnd();
painter.endNativePainting();
}
I hope this will also work in the paint() method of a QGraphicsItem (where the QGraphicsView uses a QGLWidget as the viewport), since I have only tested it in QGLWidget::paintEvent directly.
There is, however, still the following problem: I don't know how to paint a multisample framebuffer. The documentation of QGLFramebufferObject::texture() says:
If a multisample framebuffer object is used then the value returned from this function will be invalid.
I am writing a rendering engine using Qt and am running into problems with texturing my models.
I have a very simple shader to test texturing:
vertex shader:
attribute vec4 Vertex;
attribute vec2 texcoords;
uniform mat4 mvp;
varying vec2 outTexture;
void main() {
gl_Position = mvp * Vertex;
outTexture = texcoords;
}
and fragment shader:
uniform sampler2D tex;
varying vec2 outTexture;
void main() {
vec4 color = texture2D(tex, outTexture);
gl_FragColor = color;
}
I am passing my texture coordinates to the shaders correctly.
My problem is with binding a QImage and sending it to its texture uniform.
I am using the following code to bind the texture:
const QString& filename;
GLuint m_texture;
QImage image(filename);
image = image.convertToFormat(QImage::Format_ARGB32);
glGenTextures(1, &m_texture);
glBindTexture(GL_TEXTURE_2D, m_texture);
glTexParameteri(GL_TEXTURE_2D, GL_TEXTURE_WRAP_S, GL_REPEAT);
glTexParameteri(GL_TEXTURE_2D, GL_TEXTURE_WRAP_T, GL_REPEAT);
glTexParameteri(GL_TEXTURE_2D, GL_TEXTURE_MAG_FILTER, GL_LINEAR);
glTexParameteri(GL_TEXTURE_2D, GL_TEXTURE_MIN_FILTER, GL_LINEAR_MIPMAP_LINEAR);
glTexImage2D(GL_TEXTURE_2D, 0, GL_RGBA8, image.width(), image.height(), 0, GL_BGRA, GL_UNSIGNED_BYTE, image.bits());
glGenerateMipmap(GL_TEXTURE_2D);
glEnable(GL_TEXTURE_2D);
The shader works and I can pass a uniform to the matrix and attributes to the vertex and texture coordinates, but when I try to set the texture uniform in the same way:
effect->setUniformValue(effect->uniformLocation("tex", texture->m_texture));
the program crashes with an "access violation reading location" error, with glGetError() returning "invalid enumerant".
Interestingly, when I run the program without attempting to send the texture to the sampler, the texture actually appears on the model. This makes me think that the way I'm binding it has something to do with legacy texture handling: the texture is being bound to a particular texture unit which is being picked up by the shader. This is not the effect I want, because I want the programmer to be able to explicitly state at draw time which texture should be passed to the uniform (just as any other uniform is set).
How can I pass the texture to its sampler? What do I need to change when binding a texture?
Change it to
effect->setUniformValue(effect->uniformLocation("tex"), texture->m_texture);
or
effect->setUniformValue("tex", texture->m_texture);
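Note also that a sampler2D uniform expects a texture unit index rather than the GL texture object id. A sketch of the usual pattern, reusing the names from the question: bind the texture to a unit and pass that unit's index to the sampler.
glActiveTexture(GL_TEXTURE0); // select texture unit 0
glBindTexture(GL_TEXTURE_2D, texture->m_texture); // bind the texture to that unit
effect->setUniformValue("tex", 0); // the sampler reads from unit 0, not from the texture id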
Try converting the QImage using:
image = QGLWidget::convertToGLFormat(image);
Another thought: if you are using ES 2, then GL_RGBA8 is not valid. I think GL_BGRA may be an optional extension, or not available in ES 2. Hope this helps.
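A sketch combining the two suggestions above (convertToGLFormat() already returns RGBA data flipped for OpenGL, so plain GL_RGBA works as internal format on both desktop GL and ES 2; reuses m_texture and image from the question):
QImage glImage = QGLWidget::convertToGLFormat(image);
glBindTexture(GL_TEXTURE_2D, m_texture);
glTexImage2D(GL_TEXTURE_2D, 0, GL_RGBA, glImage.width(), glImage.height(), 0, GL_RGBA, GL_UNSIGNED_BYTE, glImage.bits());
glGenerateMipmap(GL_TEXTURE_2D);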