I am trying to make the wireframe transparent in one of the Qt3D examples, but it fails.
I set the alpha to 0.5 in robustwireframe.frag, but it has no effect:
void main()
{
    // Calculate the color from the Phong model, forcing alpha to 0.5
    vec4 color = vec4( adsModel( fs_in.position, normalize( fs_in.normal ) ), 0.5 );
    fragColor = shadeLine( color );
}
How do I make the wireframe transparent in Qt3D?
Adding a BlendEquation to the renderStates will enable alpha blending, so add the following code to the RenderPass in WireframeEffect.qml:
RenderPass {
    renderStates: [
        BlendEquation { blendFunction: BlendEquation.Min }
    ]
    shaderProgram: ShaderProgram {
        vertexShaderCode: loadSource("qrc:/shaders/robustwireframe.vert")
        geometryShaderCode: loadSource("qrc:/shaders/robustwireframe.geom")
        fragmentShaderCode: loadSource("qrc:/shaders/robustwireframe.frag")
    }
}
I have a canvas which is currently drawing a grey scale diagram (JSBin example).
It's effectively a radial progress meter that will be used a lot in the application. However, rather than colouring it with JavaScript, I'd prefer to be able to give it a colour based on a class.
I thought it would be an ideal use case for CSS filters. I'd draw the default progress meter in gray, then use CSS filters to add saturation and do a hue rotation, in order to achieve blue, orange and green too.
canvas {
-webkit-filter: saturate(8);
}
The rule is supported and valid in Chrome, but it doesn't seem to change the saturation at all.
I'm imagining that #aaa is transformed into its HSL counterpart hsl(0, 0%, 67%). Then, when I increase the saturation with a filter, it should become more saturated at the same hue.
I was hoping to end up with something like hsl(0, 50%, 67%), but instead the filter doesn't seem to change the colour at all, no matter what value I use.
Any ideas?
It turns out that saturating a grey leaves it grey (its saturation is 0%, so there is no hue to amplify), but if you draw the meter with some saturation initially, you can use the hue-rotate filter to get the other colours, and then desaturate to get back to grey scale.
http://jsbin.com/qohokivobo/2/edit?html,css,output
Conceptually, this isn't an answer. But in the meantime, it's a solution.
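A sketch of what that looks like (the class names and rotation amounts are hypothetical, and it assumes the meter is drawn in a saturated base colour such as blue):

canvas.orange { -webkit-filter: hue-rotate(180deg); }
canvas.green  { -webkit-filter: hue-rotate(260deg); }
canvas.grey   { -webkit-filter: saturate(0); }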
What about picking the color from the CSS style?
canvas {
color: red;
}
function init() {
    let canvas = document.getElementById('test'),
        context = canvas.getContext('2d');

    // Read the colour assigned to the canvas element via CSS
    let style = window.getComputedStyle(canvas);
    let color = style.color;

    canvas.width = 300;
    canvas.height = 300;

    let x = canvas.width / 2,
        y = canvas.height / 2;

    // Full circle in the CSS-defined colour
    context.beginPath();
    context.arc(x, y, 100, 0, 2 * Math.PI, false);
    context.strokeStyle = color;
    context.lineWidth = 20;
    context.stroke();

    // Grey arc on top marks the remaining portion of the meter
    context.globalAlpha = 0.85;
    context.beginPath();
    context.arc(x, y, 100, 0, Math.PI + 0.3, false);
    context.strokeStyle = '#eee';
    context.stroke();
}
demo
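Since the colour is read with getComputedStyle() at draw time, the class-based colouring asked for in the question falls out naturally; a hypothetical set of classes might look like:

canvas.blue   { color: #3b7bbf; }
canvas.orange { color: #e67e22; }
canvas.green  { color: #27ae60; }

Switching the element's class and calling init() again redraws the meter in the new colour.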
I followed this tutorial to create a drawing application. It works really well at making smooth curves. The problem is that the line is black as I draw, and only once I let go does it take the colour that I want. The tutorial does not go into line colour, but I have altered it; the only problem is that the line stays black until touchesEnded runs.
Here is my code:
This first section sets the colour to black:
- (id)initWithCoder:(NSCoder *)aDecoder
{
    if (self = [super initWithCoder:aDecoder])
    {
        [self setMultipleTouchEnabled:YES];
        path = [UIBezierPath bezierPath];
        [path setLineWidth:10.0];
        red = 0.0/255.0;
        green = 0.0/255.0;
        blue = 0.0/255.0;
        brush = 10.0;
        opacity = 0.8;
        toolSelected = 1;
        bgImage = 1;
    }
    return self;
}
Then on touchesEnded this code is run:
- (void)drawBitmap
{
    UIGraphicsBeginImageContextWithOptions(self.bounds.size, NO, 0.0);
    if (!incrementalImage) // first time; fill the background with clear colour
    {
        UIBezierPath *rectpath = [UIBezierPath bezierPathWithRect:self.bounds];
        [[UIColor clearColor] setFill];
        [rectpath fill];
    }
    [incrementalImage drawAtPoint:CGPointZero];
    UIColor *colour = [UIColor colorWithRed:red green:green blue:blue alpha:opacity];
    [colour setStroke];
    [path stroke];
    incrementalImage = UIGraphicsGetImageFromCurrentImageContext();
    UIGraphicsEndImageContext();
}
I assume that something needs to be run in touchesMoved to colourise the bezier path, but I'm just really struggling with it right now.
Can anyone help?
Thanks
It was achieved in the end by adding these two lines to this section:
- (void)drawRect:(CGRect)rect
{
    [[UIColor redColor] setStroke]; // Added these
    [[UIColor redColor] setFill];   // two lines
    [incrementalImage drawInRect:rect];
    [path stroke];
}
I would like to draw icons (a single colour each) in different colours. To do so, I would like to import a single alpha texture and then combine it with a given colour in the application.
The result should be that nothing is drawn onto the background where the alpha map has an opacity of 0, and the chosen colour is drawn where the opacity is 1.
One solution should be hidden somewhere in QPainter, since you can manually set the composition mode (QPainter::setCompositionMode), but I can't get it to work the way I want.
Does somebody have an idea?
Thanks in advance.
EDIT: Here is a little graphic explaining what I would like to do. I want to use an alpha map as shown in the graphic and then use a colour layer to create my icon. Importantly, the background must stay transparent.
You can do this using QPainter like this:
QColor color;
// set color value
// load gray-scale image (an alpha map)
QPixmap pixmap = QPixmap(":/images/earth.png");
// initialize painter to draw on a pixmap and set composition mode
QPainter painter(&pixmap);
painter.setCompositionMode(QPainter::CompositionMode_SourceIn);
painter.setBrush(color);
painter.setPen(color);
painter.drawRect(pixmap.rect());
// Here is our new colored icon!
QIcon icon = QIcon(pixmap);
Here is the gray-scale image and two colored icons which I got using the code above (saved via QPixmap::save()):
icons
The DestinationIn composition mode will do the trick.
Draw the color layer using the default composition mode of SourceOver.
Draw the alpha layer using the DestinationIn composition mode.
For example:
// https://github.com/KubaO/stackoverflown/tree/master/questions/alpha-mask-24943711
#include <QtWidgets>

QImage icon(int size) {
    QImage image{size, size, QImage::Format_ARGB32_Premultiplied};
    image.fill(Qt::transparent);
    QPainter p(&image);
    p.setRenderHint(QPainter::Antialiasing);
    p.setPen(Qt::NoPen);
    p.translate(image.rect().center());
    p.scale(image.width()/2.2, image.height()/2.2);
    p.setBrush(Qt::white);
    p.drawEllipse(QRectF{-.5, -.5, 1., 1.});
    p.setCompositionMode(QPainter::CompositionMode_DestinationIn);
    p.setBrush(Qt::transparent);
    p.drawEllipse(QRectF{-.3, -.3, .6, .6});
    for (auto angle : {0., 100., 150.}) {
        p.save();
        p.rotate(angle);
        p.drawRect(QRectF{-.1, 0, .2, -1.});
        p.restore();
    }
    return image;
}

QImage checkers(int size) {
    QImage img{size*2, size*2, QImage::Format_ARGB32_Premultiplied};
    QPainter p(&img);
    p.fillRect(0, 0, size, size, Qt::darkGray);
    p.fillRect(size, size, size, 2*size, Qt::darkGray);
    p.fillRect(size, 0, size, size, Qt::lightGray);
    p.fillRect(0, size, size, size, Qt::lightGray);
    return img;
}

void drawColorIcon(QPainter & p, QColor color, const QImage & alpha)
{
    p.save();
    // Color layer, drawn with the default SourceOver mode
    p.setCompositionMode(QPainter::CompositionMode_SourceOver);
    p.fillRect(QRect{0, 0, alpha.width(), alpha.height()}, color);
    // Alpha layer, drawn with the DestinationIn mode
    p.setCompositionMode(QPainter::CompositionMode_DestinationIn);
    p.drawImage(0, 0, alpha);
    p.restore();
}

QImage drawColorIconProof(QColor color, const QImage & alpha) {
    QImage result{alpha.size(), alpha.format()};
    QPainter p(&result);
    drawColorIcon(p, color, alpha);
    // Put checkers behind the icon to prove the background stayed transparent
    p.setCompositionMode(QPainter::CompositionMode_DestinationAtop);
    p.fillRect(alpha.rect(), {checkers(10)});
    return result;
}

int main(int argc, char *argv[])
{
    QApplication app{argc, argv};
    QLabel label;
    label.setPixmap(QPixmap::fromImage(drawColorIconProof("orangered", icon(200))));
    label.show();
    return app.exec();
}
I found a solution. However, instead of using a transparent graphic for the alpha map like the one in the first post, I had to use a black/white graphic where black pixels are transparent and white pixels are opaque (rendered).
// ++++ in constructor ++++
QImage alphaMap = QImage(fileName);
QColor color;
// ++++ in paint Event ++++
QPainter painter(this);
painter.setRenderHints(QPainter::RenderHint::Antialiasing);
painter.setRenderHints(QPainter::RenderHint::HighQualityAntialiasing);
// draw icon
QImage renderedIcon(alphaMap);
// fill with color
renderedIcon.fill(color);
// set alpha-map, black pixels -> opacity of 0, white pixels -> opacity 1
renderedIcon.setAlphaChannel(alphaMap);
painter.drawImage(this->rect(), renderedIcon); // draw image to QWidget
I do not quite understand the problem, but maybe you can use the QGraphicsColorizeEffect class?
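A minimal sketch of that idea, assuming the gray-scale icon is displayed in some widget (the function and its arguments are hypothetical):

#include <QWidget>
#include <QGraphicsColorizeEffect>

// Tint whatever widget displays the gray-scale icon.
void colorizeIconWidget(QWidget *iconWidget, const QColor &color)
{
    auto *effect = new QGraphicsColorizeEffect;
    effect->setColor(color);               // the desired icon colour
    effect->setStrength(1.0);              // 1.0 = fully colorized
    iconWidget->setGraphicsEffect(effect); // the widget takes ownership
}

Note that the effect tints the whole widget, so this only fits when the widget shows nothing but the icon.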
I'm trying to generate a depth map for a particle system, but if I render the particle system using MeshDepthMaterial, every particle is rendered as only a single point per vertex, not covering the entire area over which the texture-mapped particle is displayed.
Do I need to use MeshDepthMaterial to generate a depth map, or are there other options?
Right now there is no way to get the MeshDepthMaterial to respect the size or texture of the ParticleSystem. However, it is not too hard to implement a custom ShaderMaterial that does that. First, you need a vertex shader and fragment shader.
<script type="x-shader/x-vertex" id="vertexShader">
    uniform float size;

    void main() {
        gl_PointSize = size;
        gl_Position = projectionMatrix * modelViewMatrix * vec4( position, 1.0 );
    }
</script>

<script type="x-shader/x-fragment" id="fragmentShader">
    uniform sampler2D map;
    uniform float near;
    uniform float far;

    void main() {
        float depth = gl_FragCoord.z / gl_FragCoord.w;
        float depthColor = 1.0 - smoothstep( near, far, depth );
        vec4 texColor = texture2D( map, vec2( gl_PointCoord.x, 1.0 - gl_PointCoord.y ) );
        gl_FragColor = vec4( vec3( depthColor ), texColor.a );
    }
</script>
The vertex shader is totally standard. The fragment shader takes the texture (sampler2D map), but instead of using it for color values it uses only the alpha level, texColor.a; for the RGB it uses a grayscale value based on the depth, just like MeshDepthMaterial. Now, to use this shader, you just need to grab the HTML and create a THREE.ShaderMaterial like so:
var material = new THREE.ShaderMaterial({
    uniforms: {
        size: { type: 'f', value: 20.0 },
        near: { type: 'f', value: camera.near },
        far:  { type: 'f', value: camera.far },
        map:  { type: 't', value: THREE.ImageUtils.loadTexture( url ) }
    },
    attributes: {},
    vertexShader: vertShader,
    fragmentShader: fragShader,
    transparent: true
});
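Here vertShader and fragShader are just the text of the two script tags; one way to grab them (a small sketch using the ids from the shaders above):

var vertShader = document.getElementById('vertexShader').textContent;
var fragShader = document.getElementById('fragmentShader').textContent;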
Here you have provided the shader with all the info it needs: the camera's near/far range, the size of the particle and the texture it needs to map.
You can see a jsFiddle demo of it here.
I am trying to texture-map an image onto a single polygon. My image is being read correctly, but only the red plane of the image is being textured.
I am doing this within a QGLWidget
I have checked the image after it is read, and its components are being read correctly, i.e., I get valid values for the green and blue planes.
Here is the code:
QImageReader *theReader = new QImageReader();
theReader->setFileName(imageFileName);
QImage theImageRead = theReader->read();
if (theImageRead.isNull())
{
    validTile = NOT_VALID_IMAGE_FILE;
    return;
}
else
{
    // scale the image up to the next power-of-two size
    int newW = 1;
    int newH = 1;
    while (newW < theImageRead.width())
    {
        newW *= 2;
    }
    while (newH < theImageRead.height())
    {
        newH *= 2;
    }
    theImageRead = theImageRead.scaled(newW, newH, Qt::IgnoreAspectRatio, Qt::SmoothTransformation);
    // values checked in theImageRead are OK here
    glGenTextures(1, &textureObject);
    theTextureImage = QGLWidget::convertToGLFormat(theImageRead);
    // values checked in theTextureImage are OK here
    glBindTexture(GL_TEXTURE_2D, textureObject);
    glTexImage2D(GL_TEXTURE_2D, 0, GL_RGB, newW, newH, 0, GL_RGBA, GL_UNSIGNED_BYTE, theTextureImage.bits());
    glTexParameteri(GL_TEXTURE_2D, GL_TEXTURE_MIN_FILTER, GL_LINEAR);
    glFlush();
    validTile = VALID_TEXTURE;
    return;
}
Then I draw like this:
{
    glEnable(GL_TEXTURE_2D);
    glBindTexture(GL_TEXTURE_2D, textureTiles[tN]->getTextureObject());
    glBegin(GL_QUADS);
    glTexCoord2f(0.0, 0.0);
    glVertex2f(textureTiles[tN]->lowerLeft.x(), textureTiles[tN]->lowerLeft.y());
    glTexCoord2f(1.0, 0.0);
    glVertex2f(textureTiles[tN]->lowerRight.x(), textureTiles[tN]->lowerRight.y());
    glTexCoord2f(1.0, 1.0);
    glVertex2f(textureTiles[tN]->upperRight.x(), textureTiles[tN]->upperRight.y());
    glTexCoord2f(0.0, 1.0);
    glVertex2f(textureTiles[tN]->upperLeft.x(), textureTiles[tN]->upperLeft.y());
    glEnd();
    glDisable(GL_TEXTURE_2D);
}
Does anybody see anything that would cause my texture to be interpreted as if its values were (r, 0, 0, 1) in (r, g, b, a)?
Qt 4.7.1, Ubuntu 10.04, OpenGL 2.something-or-other.
Thanks in advance for any help.
I've had a similar problem. I found that I had to "reset" the GL color to white and opaque before drawing a textured quad, or the colors would get messed up. Like this:
...
glEnable(GL_TEXTURE_2D);
glBindTexture(GL_TEXTURE_2D,textureTiles[tN]->getTextureObject() );
glColor4f(1.0, 1.0, 1.0, 1.0); // reset gl color
glBegin(GL_QUADS);
...
This is a very common problem. First, set your MIN/MAG filters to something valid: the default MIN filter uses mipmaps, and since you didn't provide mipmaps, the texture is incomplete.
Sampling an incomplete texture usually gives white. A glColor call with the default texture environment will multiply the vertex color with the texture color. You probably have something like glColor(red), and red * white = red, so that's why you're seeing red.
To fix it, set the MIN/MAG filters to GL_LINEAR or GL_NEAREST.
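In the loading code above that would mean, right after glBindTexture, something like this (a sketch; the MIN filter line already appears in the question's code):

glBindTexture(GL_TEXTURE_2D, textureObject);
glTexParameteri(GL_TEXTURE_2D, GL_TEXTURE_MIN_FILTER, GL_LINEAR);
glTexParameteri(GL_TEXTURE_2D, GL_TEXTURE_MAG_FILTER, GL_LINEAR);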