Solved!
I cross-compiled for Windows and got my hint: the Windows version crashed even before main -- so it had to be a basic allocation issue. And it was. I had made a large static allocation and recently made it even larger (something the program requires, not optional or temporary); I changed array[size] to array = calloc(etc, etc) and bingo, the Windows version ran, the crash deep in the bowels of OS X/Lion went away, and everything runs fine again.
So, lesson learned: large static allocations are no good -- neither Windows nor OS X is particularly able to accommodate them.
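For anyone hitting the same wall, the change was essentially the following (a minimal sketch; the array name and size are placeholders, not the real code):

#include <cstddef>
#include <cstdlib>

static const std::size_t BIG_N = 64 * 1024 * 1024;   // illustrative size only
static double *bigTable = nullptr;                    // was: static double bigTable[BIG_N]; (crashed before main)

bool initBigTable()
{
    // Allocate the same storage on the heap at startup instead, and check the result,
    // since a heap allocation can fail where a static one simply blows up.
    bigTable = static_cast<double *>(calloc(BIG_N, sizeof *bigTable));
    return bigTable != nullptr;
}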
I get a paint event. I have a standalone QPixmap that I will be drawing onto a QWidget frame. Within the paint event, I create a painter for the QPixmap, which lives in the class definition. I set colors, brushes, and pens. I fill, and I draw lines, rects, gradients, text, and ellipses. It all works fine under Snow Leopard and Leopard. Under Lion, 10.7.anything, any drawText() call on this same QPixmap fails many call levels down inside OS X, five levels deep in com.apple.ColorSync. It doesn't matter what font or size I use. Both drawText() and drawStaticText() fail the same way.
The failure occurs before any attempt to actually draw the QPixmap onto the widget -- it's during the rendering of the drawText() itself that it blows up. All I've done to that point is fill with black (works), fill with a gradient (works), draw some filled rects (works), and draw a grid (works); then I go to draw this text, which doesn't work and instead blows out the main thread (0) (which is doing the drawing during the paint event) with EXC_BAD_ACCESS / SIGSEGV.
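For context, the paint path looks roughly like this (a trimmed, hypothetical sketch; the class and member names are placeholders, not the actual code):

#include <QWidget>
#include <QPainter>
#include <QPixmap>

class GraphWidget : public QWidget
{
public:
    GraphWidget() : m_pixmap(400, 300) { }       // size is a placeholder

protected:
    void paintEvent(QPaintEvent *) override
    {
        QPainter pix(&m_pixmap);                 // painter on the standalone QPixmap member
        pix.fillRect(m_pixmap.rect(), Qt::black);                        // works
        // ... gradient fill, filled rects, grid lines: all work ...
        pix.setPen(Qt::white);
        pix.drawText(QRect(10, 10, 200, 30), Qt::AlignLeft, "label");    // crashes inside ColorSync on Lion
        pix.end();

        QPainter w(this);
        w.drawPixmap(0, 0, m_pixmap);            // never reached once drawText has blown up
    }

private:
    QPixmap m_pixmap;
};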
Qt has no color management as far as I can tell. OSX has no way to turn off colorsync to the display.
For the moment, I've special cased the OS level and simply don't draw text (in the beta) if running under Lion but this is a horrible workaround.
Anyone have any ideas at all why Apple's 10.7 colorsync would get its knickers in a knot over drawText() to a perfectly vanilla QPixmap with valid size, text, rectangle, and within-bounds drawing task?
When I run my application on a set-top box, the font quality degrades. The degradation manifests itself as follows:
1. The letters in a word are not all the same brightness, and the pixels within a single letter also differ in brightness;
2. Vertical alignment shifts down; in the worst case the bottom of one line bumps into the top of the next;
3. It feels as if the font being used is not the one I wanted (although the log indicates that the font is exactly the one I need).
Most interestingly, everything is fine in the desktop version of the same code, with the same fonts.
Below is additional information about the environment used and the experiments I have done.
Fonts are TrueType, loaded into the system using QFontDatabase::addApplicationFont;
For PC: Qt 5.2.1
For STB: Qt 4.7.2
I use QFont::PreferAntialias and setStretch(100) for the fonts (see the sketch at the end of this question);
I replaced the setPixelSize calls with setPointSize - it did not help;
Suspecting the Qt version, I built the application for PC with Qt 4.8 - with that version all is well;
I have played with different font weights and sizes, using the font information pulled via QFontDatabase. It did not help;
I have tested on different plasma displays with different diagonals - same result.
The last thing I checked was DPI. Here are the DPI values for my PC and STB:
        Physical   Logical
STB     72x72      72x72
PC      90x116     96x96
That's about it, I think. Any help would be greatly appreciated.
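For reference, the fonts are loaded and configured roughly like this (a sketch; the resource path, family index, point size, and target widget are placeholders):

#include <QFontDatabase>
#include <QFont>
#include <QLabel>

void applyCustomFont(QLabel *label)              // 'label' stands in for whatever widget shows the text
{
    int id = QFontDatabase::addApplicationFont(":/fonts/MyFont.ttf");     // placeholder path
    QString family = QFontDatabase::applicationFontFamilies(id).value(0);
    QFont font(family);
    font.setStyleStrategy(QFont::PreferAntialias);
    font.setStretch(100);
    font.setPointSize(14);                       // point size instead of pixel size, as tried above
    label->setFont(font);
}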
I can't seem to find (official or unofficial) documentation on Qt colors vs color spaces.
I would like to define QColors for my Qt application. I am coming from an OS X background, where I am accustomed to having [NSColor colorWithDeviceRed:green:blue:alpha:] and also [NSColor colorWithSRGBRed:green:blue:alpha:], among other options.
Most of the time, I would like to use sRGB. How can I achieve that? It would also be good to know what the default QColor(int,int,int,int) constructor gives, but I suspect it will be device colors.
My target platform is mostly Windows, so if you can only come up with a platform-dependent way of creating QColor objects with components defined in the sRGB color space, go ahead!
QColor is just a container for four ints. Everything depends on what you use it for. You'd need to show example code of how you use the QColor instance.
In most cases, though, QColor will end up being used by the raster paint engine back end. In such case, it has the meaning of the device color. Specifically, if you paint using QColor, it is not ever seen directly by OS X drawing functions. OS X is only passed a texture/image that has been already rendered by the raster paint engine.
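To illustrate the point: with the raster engine, the component values you put into a QColor end up in the output pixels essentially untransformed (a minimal sketch, not tied to the asker's code):

#include <QApplication>
#include <QColor>
#include <QImage>
#include <QPainter>

int main(int argc, char *argv[])
{
    QApplication app(argc, argv);                 // needed for painting support
    QColor c(200, 30, 30, 255);                   // plain 8-bit components, no color space attached
    QImage img(100, 100, QImage::Format_ARGB32);
    QPainter p(&img);
    p.fillRect(img.rect(), c);                    // the raster engine writes these values as given
    p.end();
    img.save("out.png");                          // the file carries the numbers unchanged; how they look
    return 0;                                     // depends on whatever interprets them on display
}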
I am currently working on a Qt application to draw maps. I am trying to draw 400,000+ lines and it crashes after using ~2GB but I still have memory left on my machine. I am wondering if I am hitting some limit inside of Qt that is causing the problem. Anyone know if there is a limit to the number of things you can draw or if you can change this limit?
If it is helpful, I am coding in C++ with a class that has a member function to draw the lines. The code is roughly as follows
QPointF fromPoint;
QPointF toPoint;
fromPoint = foo( x );
toPoint = foo( y );
m_Painter.drawLine(fromPoint, toPoint );
//m_Painter is a QPainter
Edit: Turns out the problem was somewhere else in the code. It had to do with the custom caching that was being done. Though I am still interested if there is a limit to how many lines Qt can draw. Does anyone know?
QPainter executes its underlying graphics through QPaintEngine, which has several implementations (like qpaintengine_mac.cpp, qpaintengine_x11.cpp, or qpaintengine_preview.cpp).
Some devices are raster...and are likely drawing each line into an image buffer and throwing away the endpoints after that drawing is done. There should be no limit to the number of lines you can draw in that case.
If the target device is OpenGL, or a printer that is producing some kind of PostScript-like output, then the limitations of that particular paint engine may well be a factor. You'd have to look at the specific one.
For example: if you trace down the X11 implementation of drawLine you'll see it passes through to drawPolygon() down through strokePolygon_dev()...and bottoms out at a call to XDrawLines:
XDrawLines(dpy, hd, gc, pts, numberPoints, CoordModeOrigin);
So there you have another abstraction layer...and so the question becomes whether the XWindows display parameter is guaranteed to be raster. (My guess would be that it is.)
Anyway, so the answer is "unlimited if raster; may depend otherwise -- but the limitations (if any) are probably coming from the underlying device for the paint engine, not Qt."
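If you want to confirm which engine a given paint device actually resolves to, you can ask at runtime, e.g. (a quick sketch, using a QImage, which reports the raster engine):

#include <QApplication>
#include <QImage>
#include <QPainter>
#include <QPaintEngine>
#include <QDebug>

int main(int argc, char *argv[])
{
    QApplication app(argc, argv);
    QImage img(100, 100, QImage::Format_ARGB32);
    QPainter p(&img);
    // A QImage reports QPaintEngine::Raster; widgets, printers, and GL surfaces report other types.
    qDebug() << "paint engine type:" << p.paintEngine()->type();
    return 0;
}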
On page 136 of the user manual of ILNumerics CTP (RCh), there is a mention of an Image Plot, in the "future" section.
Is this the name of a new upcoming component similar to the TwoDMode of a 3D surface in a PlotCube, but optimized for 2D rendering? Could you describe its use case and functionality?
(I would appreciate having the possibility to quickly draw image plots (like Matlab's imagesc), even with the GDI backend. Currently GDI is too slow to render 700x700 ILSurface objects in a PlotCube with TwoDMode=true.)
imagesc - as you noticed - can be realized by a common surface plot in 2D mode. A 'real' imagesc plot would hardly do anything else. If the GDI renderer is too slow on your hardware, I'd suggest you:
switch to an OpenGL driver, or
decrease the size of the rendering output, or
avoid transparent colors (in Wireframe or Fill), or
decrease the number of grid columns / rows in the surface
Note, the GDI renderer is mostly provided as a fallback for OpenGL and for offscreen rendering. It uses a decent scanline / z-buffer renderer, but naturally it cannot deliver the same speed as a hardware-accelerated OpenGL driver. However, a 700x700 output should work even with GDI on recent hardware (at least a couple of frames per second, I would guess).
I'm working on a Qt based application (actually in PyQt but I don't think that's relevant here), part of which involves plotting a potentially continuous stream of data onto a graph in real time.
I've implemented this by creating a class derived from QWidget which buffers incoming data and plots the graph every 30ms (by default). In __init__(), a QPixmap is created, and on every tick of a QTimer, (1) the graph is shifted to the left by the number of pixels that the new data will take up, (2) a rectangle is painted in the vacated space, (3) the new points are plotted, and (4) update() is called on the widget, as follows (cut down):
# Amount of pixels to scroll
scroll = penw * len(points)
# The first point is not plotted now, so don't shift the graph for it
if self.firstPoint():
    scroll -= 1
p = QtGui.QPainter(pm)
# Brush setup would be here...
pm.scroll(0 - scroll, 0, scroll, 0, pm.width() - scroll, pm.height())
p.drawRect(pm.width() - scroll, 0, scroll, pm.height())
# pen setup etc happens here...
offset = scroll
for point in points:
    yValNew = self.graphHeight - self.scalePoint(point)
    # Skip first point
    if not self.firstPoint():
        p.drawLine(pm.width() - offset - penw, self.yVal, pm.width() - offset, yValNew)
    self.yVal = yValNew
    offset -= penw
self.update()
Finally, the paintEvent simply draws the pixmap onto the widget:
p = QtGui.QPainter(self)
p.drawPixmap(0, 0, self.graphPixmap)
As far as I can see, this should work correctly, however, when data is received very fast (i.e. the entire graph is being plotted on each tick), and the widget is larger than a certain size (approx 700px), everything to the left of the 700px area lags considerably. This is perhaps best demonstrated in this video: http://dl.dropbox.com/u/1362366/keep/Graph_bug.swf.html (the video is a bit laggy due to the low frame rate, but the effect is visible)
Any ideas what could be causing this or things I could try?
Thanks.
I'm not 100% sure if this is the problem or not, but I thought I might make at least some contribution.
self.update() is an asynchronous call, which will cause a paint event at some point later, when the main event loop is reached again. So it makes me wonder if your drawing is having problems because of a sync issue between when you modify your pixmap and when it actually gets used in the paintEvent. It almost seems like what you would need for this exact code to work is a lock in your paintEvent, but that sounds pretty naughty.
For a quick test, you might try forcing the event loop to flush right after your call to update:
self.update()
QtGui.QApplication.processEvents()
Not sure that will fix it, but it's worth a try.
This actually might be a proper situation to be using repaint() and causing a direct paint event, since you are doing an "animation" using a controlled framerate: self.repaint()
I noticed a similar question to yours, by someone trying to graph a heart monitor in real time: http://qt-project.org/forums/viewthread/10677
Maybe you could try restructuring your code similarly. Instead of splitting the painting into two stages, he uses a QLabel as the display widget, sets the pixmap into the QLabel, and paints the entire graph immediately instead of relying on calls to the widget's update().