Load and Save opaque 8 bit PNG Files using ImageSharp

I am trying to load an 8 bit png image, manipulate its byte array directly, and save it back out.
I would like to use ImageSharp to compare its speed against my current library, but in their code example the pixel type has to be specified (they use Rgba32):
using SixLabors.ImageSharp;
using SixLabors.ImageSharp.Processing;

// Image.Load(string path) is a shortcut for our default type.
// Other pixel formats use Image.Load<TPixel>(string path)
using (Image<Rgba32> image = Image.Load("foo.jpg"))
{
    image.Mutate(x => x
        .Resize(image.Width / 2, image.Height / 2)
        .Grayscale());
    image.Save("bar.jpg"); // Automatic encoder selected based on extension.
}
I looked through the pixel types: https://github.com/SixLabors/ImageSharp/tree/master/src/ImageSharp/PixelFormats
But there is no grayscale 8 bit pixel type.

As of 1.0.0-beta0005 there is no Gray8 pixel format, because we could not decide which color model to use when converting from Rgb (we need that conversion internally). ITU-R Recommendation BT.709 seems like the sensible choice, since that is what png supports and what we already use when saving an image as an 8 bit grayscale png, so it is on my TODO list.
https://en.wikipedia.org/wiki/Grayscale#Converting_color_to_grayscale
So... currently you need to use either Rgb24 or Rgba32 when decoding the images.
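For illustration, a load/modify/save round trip with Rgb24 could look like the sketch below (assuming a build that exposes the image[x, y] pixel indexer; when an 8 bit grayscale png is decoded into Rgb24, the luminance is simply repeated in all three channels):
using SixLabors.ImageSharp;
using SixLabors.ImageSharp.PixelFormats;

using (Image<Rgb24> image = Image.Load<Rgb24>("foo.png"))
{
    // Invert every pixel in place; for a decoded grayscale png R == G == B.
    for (int y = 0; y < image.Height; y++)
    {
        for (int x = 0; x < image.Width; x++)
        {
            Rgb24 p = image[x, y];
            p.R = (byte)(255 - p.R);
            p.G = p.R;
            p.B = p.R;
            image[x, y] = p;
        }
    }
    image.Save("bar.png");
}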
Update.
As of 1.0.0-dev002094 this is now possible! We have two new pixel formats, Gray8 and Gray16, that carry only the luminance component of a pixel.
using (Image<Gray8> image = Image.Load<Gray8>("foo.png"))
{
    image.Mutate(x => x
        .Resize(image.Width / 2, image.Height / 2));
    image.Save("bar.png");
}
Note: by default the png encoder saves the image using the input color type and bit depth. If you want to encode the image with a different color type, you will need to new up a PngEncoder instance with the ColorType and BitDepth properties set.
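For example, to load as Rgb24 but still write an 8 bit grayscale file, a hedged sketch (PngColorType/PngBitDepth names as in the current dev builds):
using SixLabors.ImageSharp;
using SixLabors.ImageSharp.Formats.Png;
using SixLabors.ImageSharp.PixelFormats;

using (Image<Rgb24> image = Image.Load<Rgb24>("foo.png"))
{
    // Force the encoder to write grayscale at 8 bits per pixel.
    var encoder = new PngEncoder
    {
        ColorType = PngColorType.Grayscale,
        BitDepth = PngBitDepth.Bit8
    };
    image.Save("bar.png", encoder);
}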

Related

Read a locally stored image as the background image for subtraction

I'm using Emgu.CV and planning to use background subtraction. I want to do something fairly simple: read two images from my local disk, use one of them as the background image, and use the other as the overlay to compare with / the mask.
I haven't got far though, because the method's signature is very different from just accepting a file. I'm guessing I'm missing some conversion from a File.Read to IInputArray:
IBackgroundSubtractor backgroundSubtractor = new BackgroundSubtractorMOG2();
IInputArray inputImage; // how do I create an instance of an IInputArray from a local file?
IOutputArray mask;
backgroundSubtractor.Apply(inputImage, mask);
How do I go from a file in C:\<somepath>\someimage1.png to the IInputArray and IOutputArray types above?
EmguCV offers different methods to load images from file (see V1 and V2 below). For the mask you just need to define a new Mat object; it will be allocated and filled automatically when you call backgroundSubtractor.Apply(input1, mask);
// V1: load image via the Mat constructor
var input1 = new Mat(@"C:\<somepath>\someimage1.png");

// V2: load image via CvInvoke.Imread
Mat input2 = CvInvoke.Imread(@"C:\<somepath>\someimage1.png", ImreadModes.AnyColor);

var mask = new Mat();
IBackgroundSubtractor backgroundSubtractor = new BackgroundSubtractorMOG2();
backgroundSubtractor.Apply(input1, mask);
The Mat class implements IInputArray as well as IOutputArray.
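Putting it together, a minimal end-to-end sketch (the second file name is made up for the example, and exact namespaces vary a little between Emgu.CV versions):
using Emgu.CV;
using Emgu.CV.CvEnum;

// Load the background image and the image to compare against it.
Mat background = CvInvoke.Imread(@"C:\<somepath>\someimage1.png", ImreadModes.AnyColor);
Mat overlay = CvInvoke.Imread(@"C:\<somepath>\someimage2.png", ImreadModes.AnyColor);

var mask = new Mat(); // allocated and filled by Apply()
IBackgroundSubtractor backgroundSubtractor = new BackgroundSubtractorMOG2();

backgroundSubtractor.Apply(background, mask); // learn the background
backgroundSubtractor.Apply(overlay, mask);    // mask now marks the pixels that changed

CvInvoke.Imwrite(@"C:\<somepath>\mask.png", mask);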

Windows Small System Icon Height Incorrect

I'm running on Windows 10, but using Delphi 7 (yes, I know it's quite old).
I want to use the system icons in Windows and have gone about this by defining a TImageList called SystemIcons which I initialize as follows:
var
  fileInfo: TSHFileInfo;
begin
  SystemIcons.Handle := SHGetFileInfo('', 0, fileInfo, SizeOf(fileInfo),
    SHGFI_ICON or SHGFI_SMALLICON or SHGFI_SYSICONINDEX);
  ...
I have the SystemIcons properties set statically as a TImageList component, with Width and Height set to 16.
Elsewhere, I wish to retrieve an icon from this image list given a valid shell object's image index. Because these are "small system icons", I expect them to be 16x16. The result of calling GetSystemMetrics(SM_CYSMICON) yields 16. Oddly, the dimensions depend upon whether I retrieve them as a bitmap or an icon.
...
var
  icon: TIcon;
  bm: TBitmap;
begin
  ...
  icon := TIcon.Create;
  SystemIcons.GetIcon(imgIndex, icon);
  bm := TBitmap.Create;
  SystemIcons.GetBitmap(imgIndex, bm);
The imgIndex is correct and the same in both cases. The image retrieved is the same in each case, as expected. The dimensions of the bitmap (bm.Width and bm.Height) are also as expected: 16x16. However, the dimensions of the icon (icon.Width and icon.Height) are not. They are 32x32.
When I paint the icon on a canvas it appears as 16x16. So it's only its Height and Width values that appear incorrect. Very odd.
Why are these different?
The images are likely actually 32x32 to begin with.
Internally, TImageList.GetIcon() simply retrieves an HICON for the chosen image directly from the underlying Win32 ImageList API, using ImageList_GetIcon(), and assigns that to the TIcon.Handle property.
TImageList.GetBitmap(), on the other hand, is a bit different. It sizes the TBitmap to the dimensions of the TImageList (16x16), and then stretch draws the chosen image onto the TBitmap.Canvas using TImageList.Draw(), which in turn uses ImageList_DrawEx().

Why does my OBJ / MTL model material show up as black?

Why does my OBJ model have no material and display as black?
I have an OBJ:
<a-obj-model id="gorilla" src="#gorilla-obj" mtl="#gorilla-mtl"></a-obj-model>
I can see the geometry, but the material shows up as black.
If you check your MTL, you might notice it is trying to use TGA or some other sort of textures that aren't plain images. In this case, you need to include additional three.js loaders.
You could try including the necessary loaders, e.g. https://github.com/mrdoob/three.js/blob/dev/examples/js/loaders/TGALoader.js, and then registering them:
THREE.Loader.Handlers.add( /\.tga$/i, new THREE.TGALoader() );
However, it might be simplest to batch convert all the TGAs to plain images such as PNGs using a converter, and then replace all instances of 'tga' with 'png'.

NSColorPanel to hex issue?

I have an NSColorPanel that I am inputting RGB values into:
NSColorPanel *sharedPanel = [NSColorPanel sharedColorPanel];
[sharedPanel setTarget:self];
[sharedPanel setAction:@selector(updateColor:)];
[sharedPanel orderFront:self];
The color panel displays and I set this value: r66, g114, b170.
By my calculations, this should be #4272AA. I use the following code to convert to hex:
- (void)updateColor:(NSColorPanel *)panel
{
    NSString *hexString = [panel.color hexadecimalValueOfAnNSColor];
    NSLog(@"%@", hexString);
}
This logs #345d9a (not what I would expect).
I'm using the following method, taken directly from developer.apple.com, to convert the color to hex:
#import <Cocoa/Cocoa.h>

@interface NSColor (NSColorHexadecimalValue)
- (NSString *)hexadecimalValueOfAnNSColor;
@end

@implementation NSColor (NSColorHexadecimalValue)
- (NSString *)hexadecimalValueOfAnNSColor
{
    CGFloat redFloatValue, greenFloatValue, blueFloatValue;
    int redIntValue, greenIntValue, blueIntValue;
    NSString *redHexValue, *greenHexValue, *blueHexValue;

    // Convert the NSColor to the RGB color space before we can access its components
    NSColor *convertedColor = [self colorUsingColorSpaceName:NSCalibratedRGBColorSpace];
    if (convertedColor)
    {
        // Get the red, green, and blue components of the color
        [convertedColor getRed:&redFloatValue green:&greenFloatValue blue:&blueFloatValue alpha:NULL];

        // Convert the components to numbers (unsigned decimal integers) between 0 and 255
        redIntValue = redFloatValue * 255.99999f;
        greenIntValue = greenFloatValue * 255.99999f;
        blueIntValue = blueFloatValue * 255.99999f;

        // Convert the numbers to hex strings
        redHexValue = [NSString stringWithFormat:@"%02x", redIntValue];
        greenHexValue = [NSString stringWithFormat:@"%02x", greenIntValue];
        blueHexValue = [NSString stringWithFormat:@"%02x", blueIntValue];

        // Concatenate the red, green, and blue components' hex strings together with a "#"
        return [NSString stringWithFormat:@"#%@%@%@", redHexValue, greenHexValue, blueHexValue];
    }
    return nil;
}
@end
Any suggestions as to what I'm doing wrong?
You must have entered the coordinates in a different color space (probably the device's; on my Mac, #4272AA in my display's color space converts to almost the same result in the calibrated color space, #345C9A).
To change the color space in NSColorPanel, click the tiny rainbow button. NSCalibratedRGBColorSpace corresponds to the "Generic RGB" selection; since your get-hex method uses the calibrated space, you need to use the same one if you want to get the same numbers back.
A word of warning: this piece of code from developer.apple.com is harmful.
When people exchange hex codes, it is almost universally assumed that they mean the same color that HTML/CSS would produce. This means hex codes must be in the sRGB color space, since web standards dictate sRGB whenever color space information is omitted or missing.
Apple's "Generic RGB" (NSCalibratedRGBColorSpace) is very different from sRGB and from the native color spaces of most modern displays. Your problem demonstrates just how large that difference is.
Most displays in the world are manufactured to match sRGB as closely as possible, and the high quality displays of modern Apple devices are particularly good at that.
This leads to a somewhat surprising conclusion: if you need those hex codes to produce the same colors as in HTML, then using NSDeviceRGBColorSpace, however wrong that is, instead of NSCalibratedRGBColorSpace gives much better results. You can easily verify this by entering the same color coordinates in the Device, Generic, and sRGB color spaces and comparing against what an HTML page in Safari produces.
If you need a correct and guaranteed match with the hex codes used on the Web, you will have to do the conversion to sRGB manually, since NSColor does not support reading components for any color profiles other than Device RGB and Calibrated RGB.

Strange problem with conversion from GDI+ to GDI: Bitmap and HBITMAP

I want to convert a GDI+ Bitmap into a GDI HBITMAP object.
I am using the following method:
Bitmap* img = new Bitmap(XXX);
// lots of code...
HBITMAP temp;
Color color;
img->GetHBITMAP(color, &temp);
The img object is drawn on a dialog. When this part of the method is called, a strange thing happens: the img displayed in the window changes! It becomes a bit clearer or sharper.
My question is: what is happening?
The bitmap pixel format may be the reason. Do you specify it explicitly in the Bitmap constructor?
Gdiplus::Bitmap bmp(WIDTH, HEIGHT, PixelFormat24bppRGB);
Try making sure that all the pixel formats you use are the same.
Another reason may be differences in the Gdiplus::Graphics interpolation modes used in your code. That attribute determines how images are resized, how lines are drawn, etc.
m_pViewPortImage = new Gdiplus::Bitmap(
    observedWidth,
    observedHeight,
    PixelFormat24bppRGB
);
Gdiplus::Graphics gr(m_pViewPortImage);
gr.SetInterpolationMode(Gdiplus::InterpolationModeHighQualityBicubic);
