When I create a QPainter object in the paintGL() function, the rendered scene is not shown. Here is my code:
void D3Display::paintGL() {
    makeCurrent();
    glLoadIdentity();
    glClear(GL_COLOR_BUFFER_BIT);
    glMatrixMode(GL_MODELVIEW);
    // some adjustment stuff here...
    glPushMatrix();
    // some rendering stuff here...
    glPopMatrix();
    QPainter painter(this);
    // some overpainting stuff here...
}
Now the problem is: when I create the QPainter object, the screen suddenly turns black and does not show anything (neither the rendered scene nor the overpainted content). As you can see, I want to overpaint something on a QGLWidget. I have tried a few things but couldn't work it out. This function is written similarly to the Qt Overlay example project.
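For reference, mixing raw OpenGL with a QPainter on the same widget generally requires keeping the two from clobbering each other's state. Below is a minimal sketch of one commonly recommended pattern (assuming Qt 4.6+; this is not taken from the Overlay example itself and is not guaranteed to be the fix here): construct the painter first and wrap the native GL calls in beginNativePainting()/endNativePainting().

// Sketch only: isolate the raw GL calls so QPainter's state changes
// and the GL rendering do not interfere with each other.
void D3Display::paintGL()
{
    QPainter painter(this);

    painter.beginNativePainting();
    glMatrixMode(GL_MODELVIEW);
    glLoadIdentity();
    glClear(GL_COLOR_BUFFER_BIT);
    glPushMatrix();
    // rendering stuff here...
    glPopMatrix();
    painter.endNativePainting();

    // overpainting stuff with the painter here...
}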
Currently I am writing a small application with Qt and OpenGL, and I chose QOpenGLWidget for rendering graphics.
That's how I declared my widget:
class GLWidget : public QOpenGLWidget, protected QOpenGLFunctions {
    // Methods, slots, etc.
};
And that's the constructor:
GLWidget::GLWidget(QWidget *parent) : QOpenGLWidget(parent)
{
    /* Tried this, it didn't help.
    QSurfaceFormat format;
    format.setDepthBufferSize(24);
    this->setFormat(format);
    */
}
In my init function I enable GL_CULL_FACE and GL_DEPTH_TEST:
void GLWidget::initializeGL()
{
    /* Some initialization stuff regarding the scene */
    glEnable(GL_CULL_FACE);
    glEnable(GL_DEPTH_TEST);
    glClearColor(0.0f, 0.0f, 0.0f, 1.0f);
}
And I really don't know why the widget renders a black screen.
Here are some pictures.
First one: with GL_CULL_FACE disabled and GL_DEPTH_TEST disabled.
Second one: with GL_CULL_FACE enabled and GL_DEPTH_TEST disabled. Maybe it's not a good picture, but I can assure you that you can see some surfaces through the others.
Third one: with GL_CULL_FACE enabled and GL_DEPTH_TEST enabled. Actually, it doesn't matter whether GL_CULL_FACE is enabled; either way it renders a black screen.
And here's the image without any shading, just to show that the model is fine.
I tried to set the format manually, but it didn't help. Still the black screen:
QSurfaceFormat format;
format.setDepthBufferSize(24);
this->setFormat(format);
Oh, and yes, I call glClear(GL_COLOR_BUFFER_BIT | GL_DEPTH_BUFFER_BIT); at the beginning of my paintGL() function.
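For what it's worth, one thing that commonly produces exactly this symptom is a context that ends up without a depth buffer. With QOpenGLWidget the format has to be requested before the widget's context is created; a sketch of one way to do that (assuming Qt 5.4+, and not necessarily the fix for this exact case) is to set the application-wide default format at the top of main():

#include <QApplication>
#include <QSurfaceFormat>

int main(int argc, char *argv[])
{
    // Request a depth buffer for every OpenGL surface in the application.
    // This must run before any OpenGL widget is constructed.
    QSurfaceFormat format;
    format.setDepthBufferSize(24);
    QSurfaceFormat::setDefaultFormat(format);

    QApplication app(argc, argv);
    GLWidget widget;   // the widget from the question
    widget.show();
    return app.exec();
}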
I'm trying to create a chroma key for a QWebView in Qt5. This means I need to make a specific color transparent (other widgets should be visible through the webview's pixels of that color). I've found that it can be done using QPainter::CompositionMode operations, but I can't make it work.
For example, I need to make all black pixels of the webview transparent (the source color should be changeable at runtime).
I've reimplemented QWebView::paintEvent in my class (taking part of the code from the Qt sources), but I don't know what to do next:
void WebView::paintEvent(QPaintEvent *event)
{
    if (!page())
        return;
    QWebFrame *frame = page()->mainFrame();
    QPainter painter(this);
    painter.setRenderHints(renderHints());
    painter.setCompositionMode(QPainter::CompositionMode_SourceOver);
    frame->render(&painter, event->region());
}
I found a way to make any source color white with the following code:
QWebFrame *frame = page()->mainFrame();
QImage source_image(size(), QImage::Format_ARGB32_Premultiplied);
QImage result_image(size(), QImage::Format_ARGB32_Premultiplied);
QPainter imagePainter(&source_image);
imagePainter.setRenderHints(renderHints());
frame->render(&imagePainter, event->region());
imagePainter.end();
QImage mask = source_image.createMaskFromColor(qRgb(0x00,0x00,0x00)); // Source color
QPainter resultPainter(&result_image);
resultPainter.drawImage(source_image.rect(), source_image);
resultPainter.setCompositionMode(QPainter::CompositionMode_Screen);
resultPainter.drawImage(source_image.rect(), mask);
QPainter painter(this);
painter.setCompositionMode(QPainter::CompositionMode_SourceOver);
painter.drawImage(0, 0, result_image);
But I don't know how to convert the white color to transparent.
I found a solution, but it consumes a lot of CPU.
First it's required to set
setStyleSheet("QWebView { background: transparent }");
setAttribute(Qt::WA_OpaquePaintEvent, true);
somewhere in WebView's constructor (I just forgot to mention that in the first message). Then reimplement paintEvent:
void WebView::paintEvent(QPaintEvent *event)
{
    if (!page())
        return;

    QWebFrame *frame = page()->mainFrame();
    QPainter painter(this);
    QColor chroma_color(0, 0, 0); // A color that should be transparent
    float opacity_level = 0.9;    // WebView opacity

    m_render_pixmap.fill(Qt::transparent);
    QPainter pixmapPainter(&m_render_pixmap);
    pixmapPainter.setRenderHints(renderHints());
    frame->render(&pixmapPainter, event->region());
    pixmapPainter.end();

    m_render_pixmap.setMask(m_render_pixmap.createMaskFromColor(
        chroma_color, Qt::MaskInColor));

    painter.setCompositionMode(QPainter::CompositionMode_SourceOver);
    painter.setOpacity(opacity_level);
    painter.drawPixmap(QPoint(event->rect().left(), event->rect().top()), m_render_pixmap, event->rect());
    painter.end();
}
m_render_pixmap is an instance of QPixmap. I don't want to recreate it every time paintEvent is called, so I just recreate it in resizeEvent:
void WebView::resizeEvent(QResizeEvent *event)
{
    QWebView::resizeEvent(event);
    m_render_pixmap = QPixmap(size());
}
The code above works great, but in my case I want to render a video widget below the webview. So WebView::paintEvent is called about 25 times per second, and each call takes about 20-25 ms in windowed mode on my PC. In fullscreen mode it takes about 100% of one of the CPU cores.
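For reference, one possible variation (a sketch, not the original code, and whether it is actually faster would need profiling) is to render into a QImage and clear the alpha of the chroma-colored pixels directly, instead of building a QBitmap mask every frame. m_render_image below is an assumed QImage member (Format_ARGB32) kept in sync with the widget size, analogous to m_render_pixmap above:

// Sketch of an alternative paintEvent: punch out the chroma color by
// zeroing its alpha channel in a QImage, then draw the image.
void WebView::paintEvent(QPaintEvent *event)
{
    if (!page())
        return;

    QWebFrame *frame = page()->mainFrame();
    const QRgb chroma = qRgb(0, 0, 0);   // color that should become transparent

    m_render_image.fill(Qt::transparent);
    QPainter imagePainter(&m_render_image);
    imagePainter.setRenderHints(renderHints());
    frame->render(&imagePainter, event->region());
    imagePainter.end();

    // Make every pixel whose RGB matches the chroma color fully transparent.
    for (int y = 0; y < m_render_image.height(); ++y) {
        QRgb *line = reinterpret_cast<QRgb *>(m_render_image.scanLine(y));
        for (int x = 0; x < m_render_image.width(); ++x) {
            if ((line[x] & 0x00FFFFFF) == (chroma & 0x00FFFFFF))
                line[x] = 0;
        }
    }

    QPainter painter(this);
    painter.drawImage(event->rect().topLeft(), m_render_image, event->rect());
}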
I'm kind of new to Qt and OpenGL. I have a shape in my OpenGL code and I need to place it on the UI widget, but the result of my code is like the picture below: my UI widget is drawn on top of my shape. Can anyone help me with it?
Here is my code:
void Widget::paintGL() {
    glClear(GL_COLOR_BUFFER_BIT);
    glMatrixMode(GL_MODELVIEW);
    glPushMatrix();
    BackGround();
    gambi();
    if (is_dead == 1) {
        DrawSquare(.8, .8, .8, 1, 1, 1, 1, .8);
    }
    glPopMatrix();
}
I want to draw an image pixel by pixel at run time. I use QPainter and paintEvent to draw, but each time paintEvent is called, the previously drawn image is cleared and only the new point is drawn.
How can I avoid clearing the previously drawn parts? I just want to append the new pixel to the previously drawn points.
Lines::Lines(QWidget *parent)
    : QWidget(parent)
{
    m_timer = new QTimer(this);
    connect(m_timer, SIGNAL(timeout()), this, SLOT(updateStatus()));
    m_timer->start();
    m_x = 0;
    m_y = 0;
}

void Lines::paintEvent(QPaintEvent *event)
{
    QPen pen(Qt::black, 2, Qt::SolidLine);
    QPainter painter(this);
    painter.setPen(pen);
    painter.drawPoint(m_x, m_y);
}

void Lines::updateStatus()
{
    m_x++;
    m_y++;
    update();
}
paintEvent is supposed to do a complete redraw of the widget region specified in the event.
So you are responsible for buffering previous results.
It doesn't really make sense to change the desired output in paintEvent, as it may be called at arbitrary times, and when it is called is out of your control.
If you want to avoid that you can use a QGraphicsView.
Buffering could be done using a QPixmap that is a member of the Lines class. You draw the pixel into the pixmap (in updateStatus, not in the paint event), and draw the pixmap in the paint event.
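A minimal sketch of that buffering approach, assuming a QPixmap member m_buffer added to the Lines class (the name and the fixed 400x400 size are hypothetical):

// Points are drawn into the persistent pixmap in updateStatus();
// the paint event only blits it.
Lines::Lines(QWidget *parent)
    : QWidget(parent), m_buffer(400, 400)
{
    m_buffer.fill(Qt::white);
    m_timer = new QTimer(this);
    connect(m_timer, SIGNAL(timeout()), this, SLOT(updateStatus()));
    m_timer->start();
    m_x = 0;
    m_y = 0;
}

void Lines::updateStatus()
{
    // Draw the new point into the pixmap, not into the widget.
    QPainter bufferPainter(&m_buffer);
    bufferPainter.setPen(QPen(Qt::black, 2, Qt::SolidLine));
    bufferPainter.drawPoint(m_x, m_y);
    m_x++;
    m_y++;
    update();
}

void Lines::paintEvent(QPaintEvent *)
{
    // The paint event just copies the buffered pixmap onto the widget.
    QPainter painter(this);
    painter.drawPixmap(0, 0, m_buffer);
}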
setAttribute(Qt::WA_OpaquePaintEvent, true);
prevents the widget from being cleared. However, this is just an optimization for the case where the widget does a complete repaint anyway.
You should follow Dr. Hirsch's advice.
In my QGLWidget, I draw some text using the renderText() method. Then I want to save the contents of the widget as an image, but it turns out that the text drawn by renderText() is not saved.
void MyGLWidget::paintGL()
{
    qglClearColor(Qt::white);
    glViewport(0, 0, width(), height());
    glClear(GL_COLOR_BUFFER_BIT);

    glColor3f(1.0, 0.0, 0.0);
    glBegin(GL_LINES);
    glVertex2f(0, 0);
    glVertex2f(width(), height());
    glEnd();

    renderText(50, 50, "Hello");
    glColor3f(0.0, 1.0, 0.0);
    renderText(50, 150, "World");
}
Here is the code to save the image:
void MyGLWidget::saveImage()
{
    QGLPixelBuffer pbuffer(width(), height());
    pbuffer.makeCurrent();
    paintGL();
    QImage image = pbuffer.toImage();
    image.save("test_image.tif", "tif");
}
Any idea?
After debugging into Qt 4.8.0's source code, I found a few reasons why this will not work:
renderText uses the QGLWidget's width and height rather than the QGLPixelBuffer's (not an issue in your case, as your screenshot and widget are the same size).
It constructs a QPainter on the QGLWidget to render the text.
Unfortunately, it looks like renderText is simply incompatible with QGLPixelBuffer.
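If grabbing the on-screen contents is acceptable, one workaround that is often suggested is QGLWidget::grabFrameBuffer(), which reads back the widget's own framebuffer and should therefore also contain what renderText() drew. A rough sketch (not a guaranteed substitute for the pbuffer approach):

void MyGLWidget::saveImage()
{
    updateGL();                         // make sure paintGL() has just run
    QImage image = grabFrameBuffer();   // read back the widget's framebuffer
    image.save("test_image.tif", "TIF");
}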