Pass a Qt QImage to a GLSL texture sampler

I am writing a rendering engine using Qt and am running into problems with texturing my models.
I have a very simple shader to test texturing:
vertex shader:
attribute vec4 Vertex;
attribute vec2 texcoords;
uniform mat4 mvp;
varying vec2 outTexture;

void main() {
    gl_Position = mvp * Vertex;
    outTexture = texcoords;
}
and fragment shader:
uniform sampler2D tex;
varying vec2 outTexture;

void main() {
    vec4 color = texture2D(tex, outTexture);
    gl_FragColor = color;
}
I am passing my texture coordinates to the shaders correctly.
My problem is with binding a QImage and sending it to its texture uniform.
I am using the following code to bind the texture:
const QString& filename;  // path of the image to load
GLuint m_texture;

QImage image(filename);
image = image.convertToFormat(QImage::Format_ARGB32);

glGenTextures(1, &m_texture);
glBindTexture(GL_TEXTURE_2D, m_texture);
glTexParameteri(GL_TEXTURE_2D, GL_TEXTURE_WRAP_S, GL_REPEAT);
glTexParameteri(GL_TEXTURE_2D, GL_TEXTURE_WRAP_T, GL_REPEAT);
glTexParameteri(GL_TEXTURE_2D, GL_TEXTURE_MAG_FILTER, GL_LINEAR);
glTexParameteri(GL_TEXTURE_2D, GL_TEXTURE_MIN_FILTER, GL_LINEAR_MIPMAP_LINEAR);
glTexImage2D(GL_TEXTURE_2D, 0, GL_RGBA8, image.width(), image.height(), 0, GL_BGRA, GL_UNSIGNED_BYTE, image.bits());
glGenerateMipmap(GL_TEXTURE_2D);
glEnable(GL_TEXTURE_2D);
The shader works: I can pass a uniform to the matrix and attributes to the vertex positions and texture coordinates. But when I try to set the texture uniform the same way, like this:
effect->setUniformValue(effect->uniformLocation("tex", texture->m_texture));
the program crashes with an “access violation reading location” error, and glGetError() returns “invalid enumerant”.
Interestingly, when I run the program without attempting to send the texture to the sampler, the texture actually appears on the model. This makes me think the way I’m binding it falls back on legacy fixed-function texture handling: the texture is bound to a particular texture unit which the shader happens to pick up. That is not the effect I want, because I want the programmer to be able to state explicitly at draw time which texture should be passed to the uniform (just as any other uniform is set).
How can I pass the texture to its sampler, and what do I need to change when binding a texture?

Change it to
effect->setUniformValue(effect->uniformLocation("tex"), texture->m_texture);
or
effect->setUniformValue("tex", texture->m_texture);

Try converting the QImage using:
image = QGLWidget::convertToGLFormat(image);
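Presumably (this part is my sketch, not the answerer's code) the upload then uses GL_RGBA for both the internal format and the pixel format, since convertToGLFormat() returns the image vertically flipped and in the byte order GL_RGBA/GL_UNSIGNED_BYTE expects:
image = QGLWidget::convertToGLFormat(image);  // flips and converts to GL's RGBA byte order
glTexImage2D(GL_TEXTURE_2D, 0, GL_RGBA, image.width(), image.height(),
             0, GL_RGBA, GL_UNSIGNED_BYTE, image.bits());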
Another thought: if you are using ES 2, then GL_RGBA8 is not a valid internal format, and I think GL_BGRA may be an optional extension there, or not available in ES 2 at all. Hope this helps.

Related

Is there a way to batch render textures in Qt?

I have been trying to batch render two different pictures. I have two QOpenGLTexture objects that I want to draw in a single draw call with batch rendering, but am struggling. Both texture objects have IDs, but only the last texture object's image is drawn. I believe my problem is with setting up the uniform or the frag shader.
//..............Setting up uniform...............//
const GLuint vals[] = {m_texture1->textureId(), m_texture2->textureId()};
m_program->setUniformValueArray("u_TextureID", vals, 2);

//..............frag shader.....................//
#version 330 core

out vec4 color;

in vec2 v_textCoord; // texture coordinate
in float v_index;    // (0, 1) per vertex: which image to draw;
                     // 0 would draw the image of the first texture object

uniform sampler2D u_Texture[2];

void main()
{
    int index = int(v_index);
    color = texture(u_Texture[index], v_textCoord);
}
I've tried experimenting with the index value in the frag shader, but it only draws the last texture's image or blacks out. I tried implementing it the way you would with plain OpenGL, but have had no luck.

Use glBlendFunc in QOpenGLWidget

I'm trying to use glBlendFunc in a QOpenGLWidget (in paintGL), but objects do not mix (alpha works).
My code:
glClear(GL_COLOR_BUFFER_BIT | GL_DEPTH_BUFFER_BIT);
glEnable(GL_DEPTH_TEST);
glEnable(GL_CULL_FACE);
glEnable(GL_BLEND);
glBlendFunc(blenFunc, GL_ONE);

m_world.setToIdentity();
m_world.rotate(m_xRot / 16.0f, 1, 0, 0);
m_world.rotate(m_yRot / 16.0f, 0, 1, 0);
m_world.rotate(m_zRot / 16.0f, 0, 0, 1);

QOpenGLVertexArrayObject::Binder vaoBinder(&m_vao);
m_program->bind();
m_tex->bind();
fillYoffsetLightning();

const GLfloat scaleFactor = 0.05f;
m_world.scale(scaleFactor, scaleFactor, 0.0f);
m_world.translate(0.f, 0.0f, 0.0f);

const GLfloat fact = 1 / scaleFactor;
const uint8_t X = 0, Y = 1;
for (int i = 0; i < maxElem; ++i) {
    const GLfloat offX = m_ELECT[i][X] * fact;
    const GLfloat offY = m_ELECT[i][Y] * fact;
    m_world.translate(offX, offY);

    m_program->setUniformValue(m_projMatrixLoc, m_proj);
    m_program->setUniformValue(m_mvMatrixLoc, m_camera * m_world);
    QMatrix3x3 normalMatrix = m_world.normalMatrix();
    m_program->setUniformValue(m_normalMatrixLoc, normalMatrix);

    glDrawArrays(GL_TRIANGLE_FAN, 0, m_logo.vertexCount());
    update();

    m_world.translate(-offX, -offY);
}
m_program->release();
shaders are simple:
// vertex
"attribute highp vec4 color;\n"
"varying highp vec4 colorVertex;\n"
//......... main:
"colorVertex = color;\n"
// fragment
"varying highp vec4 colorVertex;\n"
//......... main:
"gl_FragColor = colorVertex;\n"
The color: a pentagon is drawn with a gradient from white at the center (1, 1, 1) to blue at the edges (0, 0, 0.5).
(screenshot)
Why is this happening?
If you want to achieve a blending effect, then you have to disable the depth test:
glDisable(GL_DEPTH_TEST);
Note, the default depth test function is GL_LESS. If a fragment is drawn at the place of a previous fragment, then it is discarded by the depth test, because this condition is not fulfilled.
If the depth test is disabled, then the fragments are "blended" by the blending function (glBlendFunc) and equation (glBlendEquation).
I recommend using the following blending function:
glBlendFunc(GL_SRC_ALPHA, GL_ONE_MINUS_SRC_ALPHA);
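In paintGL the order would then look roughly like this (a sketch combining the two points above, with your own draw calls in place of the comment):
glDisable(GL_DEPTH_TEST);                          // let later fragments survive
glEnable(GL_BLEND);
glBlendFunc(GL_SRC_ALPHA, GL_ONE_MINUS_SRC_ALPHA); // classic alpha blending
// ... draw the translucent geometry, back to front ...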
In my case (Qt 5.15.2) I found that using a color call with no alpha component (e.g. glColor3f(1, 0, 0)) causes blending to be disabled for any subsequent rendering. To my surprise, I could not even recover it by re-issuing these commands:
glEnable(GL_BLEND); // wtf has no effect
glBlendFunc(GL_SRC_ALPHA, GL_ONE_MINUS_SRC_ALPHA);
Blending simply remained disabled until the next paint began. This did not happen with the original QGLWidget class; it only happens with QOpenGLWidget, and only on Windows (Mac and Linux are fine).
The good-enough solution for me was to replace any non-alpha color calls with alpha equivalents, at least where blending is needed later in the render. E.g.
glColor3f(1,0,0); // before
glColor4f(1,0,0,1); // after
Another issue that might come up is using QPainter along with direct rendering, because QPainter will trash your OpenGL state. See the mention of 'beginNativePainting' in the docs:
https://doc.qt.io/qt-5/qopenglwidget.html#painting-techniques
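A sketch of how that typically looks (the widget name, GL calls, and overlay text are illustrative, not from the question):
void Widget::paintGL()  // hypothetical QOpenGLWidget subclass
{
    QPainter painter(this);
    painter.beginNativePainting();   // QPainter saves its GL state here
    glEnable(GL_BLEND);
    glBlendFunc(GL_SRC_ALPHA, GL_ONE_MINUS_SRC_ALPHA);
    // ... raw OpenGL drawing ...
    painter.endNativePainting();     // state restored so QPainter works again
    painter.drawText(10, 20, "overlay");
}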
EDIT: I'll add this here because my comment on Rabbid's answer was deleted for some reason: the depth test does NOT need to be disabled to use blending. Rabbid might be thinking of disabling depth buffer writes, which is sometimes done so that all translucent objects can be drawn without sorting them from furthest to nearest:
Why we disable Z-write in blending
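A minimal sketch of that depth-write technique (drawOpaqueObjects and drawTranslucentObjects are hypothetical helpers):
// opaque pass: depth test and depth writes on
glEnable(GL_DEPTH_TEST);
glDepthMask(GL_TRUE);
drawOpaqueObjects();

// translucent pass: still test against opaque depth, but stop writing it
glEnable(GL_BLEND);
glBlendFunc(GL_SRC_ALPHA, GL_ONE_MINUS_SRC_ALPHA);
glDepthMask(GL_FALSE);
drawTranslucentObjects();
glDepthMask(GL_TRUE);  // restore for the next frame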

How to use Qt's QVector3D::unproject() function?

I'm doing an OpenGL project in Qt with the new QOpenGLWidget class.
I'm just trying to get the OpenGL world coordinates of my mouse when I click.
I've found a lot of ways to do it, but some were obsolete and the others don't work. Then I found the QVector3D::unproject() function, but I don't get the right coordinates.
This is a part of my code so far :
QMatrix4x4 modelView;
modelView.setToIdentity();
modelView.lookAt(m_camera.getPosition(), m_camera.getTarget(), m_camera.getUp());
QMatrix4x4 projection;
projection.perspective(70.0, width() / height(), 0.1, 120.0);
GLint view[4];
glGetIntegerv(GL_VIEWPORT, &view[0]);
QVector3D worldPosition =
    QVector3D(event->pos().x(), event->pos().y(), 0)
        .unproject(modelView, projection, QRect(view[0], view[1], view[2], view[3]));
qDebug() << worldPosition;
Do you know why I get (1.96532, -3.93444, 4.93216) where I should have (-0.5, -0.5, 0.0) ?
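For what it's worth, two things commonly bite with this call (hedged, since they may or may not explain the exact numbers above): width() / height() is integer division, and Qt mouse coordinates grow downwards while unproject() expects OpenGL's bottom-left window origin. Also note that z = 0 unprojects onto the near plane, not onto the geometry. A sketch with those adjusted:
// aspect ratio as floating point, not integer division
projection.perspective(70.0f, float(width()) / float(height()), 0.1f, 120.0f);

// flip the mouse y into OpenGL's bottom-left convention
const float winY = height() - event->pos().y();
QVector3D worldPosition =
    QVector3D(event->pos().x(), winY, 0.0f)  // z = 0 lands on the near plane
        .unproject(modelView, projection, QRect(view[0], view[1], view[2], view[3]));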

Communicating large/changing/complex sets of vertices in OpenGL?

I've got a very basic scene rendering with a vertex and color array (some code below). I see how to bind the vertices and colors to the vertex shader's attributes. Currently this vertex and color information lives in a local array variable in my render function, as you can see below, and then glDrawArrays(GL_TRIANGLES, 0, n) is called to draw them each frame.
I'm trying to picture the architecture of a larger moving scene, where lots of models with lots of vertices need to be loaded and unloaded.
The naïve way I imagine extending this would be to place all the vertex/color data in one big array in main memory and then call glDrawArrays once per frame. This seems inefficient to me: on every frame the vertex and color information changes only in parts, so rearranging and reloading an entire monolithic vertex array for every frame seems wrong.
What do 3D games and so forth do about this? Do they, for each frame, place all the vertices in one big array in main memory and then call glDrawArrays once? If not, what architecture and OpenGL calls do they generally use to communicate all the vertices of the scene to the GPU? Is it possible to load vertices into GPU memory and then reuse them for several frames? Is it possible to draw multiple vertex arrays from multiple places in main memory?
static const char *vertexShaderSource =
R"(
attribute highp vec4 posAttr;
attribute lowp vec4 colAttr;
varying lowp vec4 col;
uniform highp mat4 matrix;
void main()
{
col = colAttr;
gl_Position = matrix * posAttr;
}
)";
static const char *fragmentShaderSource =
R"(
varying lowp vec4 col;
void main()
{
gl_FragColor = col;
}
)";
void Window::render()
{
    glViewport(0, 0, width(), height());
    glClear(GL_COLOR_BUFFER_BIT);
    m_program->bind();

    constexpr float delta = 0.001;
    if (forward)
        eyepos += QVector3D{0, 0, +delta};
    if (backward)
        eyepos += QVector3D{0, 0, -delta};
    if (left)
        eyepos += QVector3D{-delta, 0, 0};
    if (right)
        eyepos += QVector3D{delta, 0, 0};

    QMatrix4x4 matrix;
    matrix.perspective(60, 4.0/3.0, 0.1, 10000.0);
    matrix.lookAt(eyepos, eyepos + direction, {0, 1, 0});
    matrix.rotate(timer.elapsed() / 100.0f, 0, 1, 0);
    m_program->setUniformValue("matrix", matrix);

    QVector3D vertices[] =
    {
        {0.0f, 0.0f, 0.0f},
        {1.0f, 0.0f, 0.0f},
        {1.0f, 1.0f, 0.0f},
    };
    QVector3D colors[] =
    {
        {1.0f, 0.0f, 0.0f},
        {1.0f, 1.0f, 0.0f},
        {1.0f, 0.0f, 1.0f},
    };
    m_program->setAttributeArray("posAttr", vertices);
    m_program->setAttributeArray("colAttr", colors);
    m_program->enableAttributeArray("posAttr");
    m_program->enableAttributeArray("colAttr");
    glDrawArrays(GL_TRIANGLES, 0, 3);
    m_program->disableAttributeArray("posAttr");
    m_program->disableAttributeArray("colAttr");
    m_program->release();

    ++m_frame;
}
Depends on how you want to structure things.
If you have a detailed model that needs to be moved, rotated, and transformed but without changing its shape, then a pretty clear way to do it is to load that model into e.g. a VBO (I'm not sure what your setAttributeArray does). This has to happen only before the first frame; subsequent frames can render that model with any transformation you want by simply setting the model-view matrix uniform, which is a much smaller chunk of data going over the bus.
Vertex shaders can and should be used to let the GPU help with, or entirely offload, the computation and application of these kinds of operations.
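A minimal sketch of that upload-once, draw-many pattern using Qt's QOpenGLBuffer (the names m_vbo and vertexCount are illustrative, not from the question):
// once, e.g. in an init function:
QOpenGLBuffer m_vbo(QOpenGLBuffer::VertexBuffer);
m_vbo.create();
m_vbo.bind();
m_vbo.allocate(vertices, sizeof(vertices));  // copy the model's vertices to GPU memory once
m_vbo.release();

// every frame: no vertex data crosses the bus, only the matrix uniform
m_vbo.bind();
m_program->setAttributeBuffer("posAttr", GL_FLOAT, 0, 3);  // read positions from the bound VBO
m_program->enableAttributeArray("posAttr");
m_program->setUniformValue("matrix", matrix);
glDrawArrays(GL_TRIANGLES, 0, vertexCount);
m_program->disableAttributeArray("posAttr");
m_vbo.release();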

Qt OpenGL Shader Program fails

I am trying to write a modern OpenGL (programmable pipeline) program using the Qt SDK. The Qt OpenGL examples show only the fixed-pipeline implementation, and the documentation on how to initialize a shader program is very poor. This is the best example they have on how to set up a shader program and load shaders: http://doc.trolltech.com/4.6/qglshaderprogram.html#details
This is not very descriptive, as one can see.
I tried to follow this doc and can't get the shader program working. I get a segmentation fault when the program tries to assign attributes to the shaders. I think the problem is that I access the context in the wrong way, but I can't find any reference on how to set up or retrieve the rendering context. My code goes like this:
static GLfloat const triangleVertices[] = {
    60.0f,  10.0f,  0.0f,
    110.0f, 110.0f, 0.0f,
    10.0f,  110.0f, 0.0f
};

QColor color(0, 255, 0, 255);

int vertexLocation = 0;
int matrixLocation = 0;
int colorLocation  = 0;

QGLShaderProgram *pprogram = 0;

void OpenGLWrapper::initShaderProgram()
{
    QGLContext context(QGLFormat::defaultFormat());
    QGLShaderProgram program(context.currentContext());
    pprogram = &program;

    program.addShaderFromSourceCode(QGLShader::Vertex,
        "attribute highp vec4 vertex;\n"
        "attribute mediump mat4 matrix;\n"
        "void main(void)\n"
        "{\n"
        "    gl_Position = matrix * vertex;\n"
        "}");
    program.addShaderFromSourceCode(QGLShader::Fragment,
        "uniform mediump vec4 color;\n"
        "void main(void)\n"
        "{\n"
        "    gl_FragColor = color;\n"
        "}");
    program.link();
    program.bind();

    vertexLocation = pprogram->attributeLocation("vertex");
    matrixLocation = pprogram->attributeLocation("matrix");
    colorLocation  = pprogram->uniformLocation("color");
}
And here is the rendering loop:
void OpenGLWrapper::paintGL()
{
    glClear(GL_COLOR_BUFFER_BIT | GL_DEPTH_BUFFER_BIT);
    QMatrix4x4 pmvMatrix;
    pmvMatrix.ortho(rect());

    pprogram->enableAttributeArray(vertexLocation);
    pprogram->setAttributeArray(vertexLocation, triangleVertices, 3);
    pprogram->setUniformValue(matrixLocation, pmvMatrix);
    pprogram->setUniformValue(colorLocation, color);
    glDrawArrays(GL_TRIANGLES, 0, 3);
    pprogram->disableAttributeArray(vertexLocation);
}
Can anybody help with this setup? Thanks a lot.
You create a local program variable and let your pprogram pointer point to its address. But when initShaderProgram returns, the local program's lifetime ends, pprogram points to garbage, and hence the segfault when you try to use it. You should instead create the program dynamically and let Qt handle the memory management:
pprogram = new QGLShaderProgram(context.currentContext(), this);
This assumes OpenGLWrapper derives somehow from QObject; if not, then you need to delete the program in its destructor manually (or use a smart pointer, or whatever).
Otherwise your initialization code looks quite reasonable. Your matrix variable should be a uniform and not an attribute, but I'm willing to classify this as a typo. You should also not keep the program bound for its whole lifetime, as binding is equivalent to a call to glUseProgram. Rather use bind() (and release(), which does glUseProgram(0)) in your render routine.
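Putting those points together, a minimal sketch of the corrected structure (assuming OpenGLWrapper is a QGLWidget subclass; the shader sources are the same as above, with matrix changed to a uniform):
static const char *vertexSrc =
    "attribute highp vec4 vertex;\n"
    "uniform mediump mat4 matrix;\n"  // uniform, not attribute
    "void main(void) { gl_Position = matrix * vertex; }";

static const char *fragmentSrc =
    "uniform mediump vec4 color;\n"
    "void main(void) { gl_FragColor = color; }";

void OpenGLWrapper::initShaderProgram()
{
    pprogram = new QGLShaderProgram(this);  // heap-allocated, parented to this
    pprogram->addShaderFromSourceCode(QGLShader::Vertex, vertexSrc);
    pprogram->addShaderFromSourceCode(QGLShader::Fragment, fragmentSrc);
    pprogram->link();                       // note: no bind() here
    vertexLocation = pprogram->attributeLocation("vertex");
    matrixLocation = pprogram->uniformLocation("matrix");  // now a uniform location
    colorLocation  = pprogram->uniformLocation("color");
}

void OpenGLWrapper::paintGL()
{
    glClear(GL_COLOR_BUFFER_BIT | GL_DEPTH_BUFFER_BIT);
    QMatrix4x4 pmvMatrix;
    pmvMatrix.ortho(rect());

    pprogram->bind();                       // glUseProgram
    pprogram->enableAttributeArray(vertexLocation);
    pprogram->setAttributeArray(vertexLocation, triangleVertices, 3);
    pprogram->setUniformValue(matrixLocation, pmvMatrix);
    pprogram->setUniformValue(colorLocation, color);
    glDrawArrays(GL_TRIANGLES, 0, 3);
    pprogram->disableAttributeArray(vertexLocation);
    pprogram->release();                    // glUseProgram(0)
}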
In my experience the Qt wrappers for OpenGL objects are rather poor and limited, so I made a thin wrapper for straight OpenGL objects (cross-platform and easy via GLEW) and made the usual OpenGL calls in a QGLWidget. It worked no problem, after struggling for a while with Qt's equivalents.
