Using NAudio to apply fade-out and fade-in to a series of 44.1 kHz 16-bit two-channel wave files

I have a series of 44.1 kHz 16-bit two-channel uncompressed wave files (read from resources) and want to apply fade-out and fade-in effects to create a stream from the sequence of all the WAV files.
Reading the resources and getting the 16-bit wave stream works correctly. The target format also looks correct, but I keep getting an AcmNotPossible exception at the wave format conversion step below. What am I doing wrong?
String ResToPlay2 = NameSpaceString + ".Resources." + inWave2 + ".wav";
Stream _audioStream2;
int wavdur2 = 0;
Double fadeDurDbl2 = 0;
int fadeDur2 = 0;

if (!resA.GetManifestResourceStream(ResToPlay2).Equals(Stream.Null))
{
    _audioStream2 = resA.GetManifestResourceStream(ResToPlay2);
    WaveStream wavePCMStream2 = WaveFormatConversionStream.CreatePcmStream(new WaveFileReader(_audioStream2));
    WaveFormat targetFmt2 = new WaveFormat(44100, 32, 2);
    WaveStream waveStream2 = new WaveFormatConversionStream(targetFmt2, wavePCMStream2);
    using (waveStream2)
    {
        wavdur2 = (int)waveStream2.TotalTime.Milliseconds;
        var fader2 = new FadeInOutSampleProvider(new WaveToSampleProvider(waveStream2));
        fadeDurDbl2 = (wavdur2 * OverlapPCT) / 100;
        fadeDur2 = (int)Math.Round(fadeDurDbl2, 0);
        fader2.BeginFadeIn(fadeDur2);
        var stwp2 = new NAudio.Wave.SampleProviders.SampleToWaveProvider(fader2);
        WaveFileWriter.CreateWaveFile(Application.StartupPath + "\\" + "fadedIn_?.wav", stwp2);
    }
}

I'm not sure why you are using WaveFormatConversionStream, since you are already starting in PCM. First get to a sample provider; then you can use FadeInOutSampleProvider:
var reader = new WaveFileReader(_audioStream2);
var sampleProvider = SampleProviderConverters.ConvertWaveProviderIntoSampleProvider(reader);
var fader = new FadeInOutSampleProvider(sampleProvider);
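Putting the rest of the chain together, here is a minimal sketch (reader, sample provider, FadeInOutSampleProvider, SampleToWaveProvider, WaveFileWriter) that reuses _audioStream2 and OverlapPCT from the question. It assumes using directives for System, System.IO, System.Windows.Forms, NAudio.Wave and NAudio.Wave.SampleProviders; the output file name is just a placeholder, and the written file will be 32-bit IEEE float because SampleToWaveProvider produces float samples:
using (var reader = new WaveFileReader(_audioStream2))
{
    // 16-bit PCM -> float samples; no ACM conversion is involved.
    var sampleProvider = SampleProviderConverters.ConvertWaveProviderIntoSampleProvider(reader);
    var fader = new FadeInOutSampleProvider(sampleProvider);

    // Fade over the first OverlapPCT percent of the file's duration.
    int fadeMs = (int)Math.Round(reader.TotalTime.TotalMilliseconds * OverlapPCT / 100);
    fader.BeginFadeIn(fadeMs);
    // fader.BeginFadeOut(fadeMs); would start a fade-out; call it when approaching the end of the file.

    // Wrap the samples back up as an IWaveProvider and write the result.
    var waveProvider = new SampleToWaveProvider(fader);
    WaveFileWriter.CreateWaveFile(Path.Combine(Application.StartupPath, "fadedIn.wav"), waveProvider);
}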

Related

How to display individual images for each date in Google Earth Engine?

I am new to Google Earth Engine and not very familiar with JavaScript. I want to display the cloud-masked images (B4, B3, B2 bands) of Sentinel-2 by date, in layers (each layer representing one date). The code is shown below, but I always get the error 'no Band 4, constant band'. Can anyone help me solve this problem? Thanks!
var lakes = table.geometry();
Map.centerObject(lakes, 15);

function maskS2clouds(image) {
  var qa = image.select('QA60');
  // Bits 10 and 11 are clouds and cirrus, respectively.
  var cloudBitMask = 1 << 10;
  var cirrusBitMask = 1 << 11;
  // Both flags should be set to zero, indicating clear conditions.
  var mask = qa.bitwiseAnd(cloudBitMask).eq(0)
      .and(qa.bitwiseAnd(cirrusBitMask).eq(0));
  return image.updateMask(mask).divide(10000);
}

var start = ee.Date('2015-06-20');
var finish = ee.Date('2018-06-01');
var collection = ee.ImageCollection('COPERNICUS/S2')
    .filterDate(start, finish)
    .filterBounds(lakes)
    .filter(ee.Filter.lt('CLOUDY_PIXEL_PERCENTAGE', 10))
    .map(maskS2clouds);

var rgbVis = {
  min: 0.0,
  max: 0.3,
  bands: ['B4', 'B3', 'B2'],
};

function addImage(imageL) { // display each image in collection
  var id = imageL.id;
  var image = ee.Image(imageL.id);
  Map.addLayer(image.select(['B4', 'B3', 'B2']).clip(lakes), rgbVis, id);
}

collection.evaluate(function(collection) { // use map on client-side
  print(collection.features);
  collection.features.map(addImage);
});

How to convert a raw image stored in a byte array to an RGB image with OpenCV and Java

I am working on the preview of a fingerprint scanner using the id3Fingerprint SDK and OpenCV. If I just show the preview from the id3Fingerprint SDK, all is fine, but if I load it into an OpenCV Mat object in order to draw some rectangles on the image, then either:
1. The fingerprints are displayed correctly, but the rectangles are displayed as lines or pixels at random x,y locations.
2. The rectangles are displayed correctly, but the fingerprints are displayed blurred (see the attached image).
I think my problem is in converting the raw grayscale image (a byte array from the id3Fingerprint SDK) to an RGB or RGBA image.
private void showPreview2(FingerImage image) {
    int height = 750;
    int width = 750;
    int currentWidth = 0;
    int currentHeight = 0;
    try {
        currentWidth = image.getWidth();
        currentHeight = image.getHeight();
    } catch (FingerException ex) {
        Logger.getLogger(CallingID3Example.class.getName()).log(Level.SEVERE, null, ex);
    }
    byte[] pixels = image.getPixels();
    Mat dest = new Mat();
    Mat source = new Mat();
    Mat source2 = null;
    source2 = new Mat(currentWidth, currentHeight, CvType.CV_8UC1);
    source2.put(0, 0, pixels);
    MatOfByte pix = new MatOfByte();
    Imgcodecs.imencode(".bmp", source2, pix);
    source2.put(0, 0, pix.toArray());
    Imgproc.cvtColor(source2, source, Imgproc.COLOR_GRAY2RGBA);
    try {
        int i = 0;
        for (FingerImage finger : image.getSegments()) {
            Scalar color;
            color = new Scalar(0, 250, 0);
            FingerBounds bound = image.getSegmentBounds()[i];
            Imgproc.rectangle(source, new Point(bound.topLeft.x, bound.topLeft.y), new Point(bound.bottomRight.x, bound.bottomRight.y), color, 3);
            double[] pixelTest;
            pixelTest = source.get(bound.topLeft.x, bound.topLeft.y);
            i++;
        }
    } catch (FingerException ex) {
        Logger.getLogger(CallingID3Example.class.getName()).log(Level.SEVERE, null, ex);
    }
    gc = canvas.getGraphicsContext2D();
    WritableImage writableImage = loadImage(source);
    imageView.setImage(writableImage);
}

private WritableImage loadImage(Mat matrix) {
    // Encoding the image
    MatOfByte matOfByte = new MatOfByte();
    Imgcodecs.imencode(".bmp", matrix, matOfByte);
    // Storing the encoded Mat in a byte array
    byte[] byteArray = matOfByte.toArray();
    // Displaying the image
    InputStream in = new ByteArrayInputStream(byteArray);
    BufferedImage bufImage = null;
    try {
        bufImage = ImageIO.read(in);
    } catch (IOException ex) {
    }
    // Creating the Writable Image
    WritableImage writableImage = SwingFXUtils.toFXImage(bufImage, null);
    return writableImage;
}
Thanks for your answer.
You could try something like this:
// You need to know the width/height of the image.
int width = 0;
int height = 0;
byte[] imageSrc = null; // the raw 8-bit greyscale pixels from the scanner go here
// Convert 8bit greyscale byte array to RGBA byte array.
byte[] imageRGBA = new byte[imageSrc.length * 4];
int i;
for (i = 0; i < imageSrc.length; i++) {
    // Invert the source bits.
    imageRGBA[i * 4] = imageRGBA[i * 4 + 1] = imageRGBA[i * 4 + 2] = ((byte) ~imageSrc[i]);
    imageRGBA[i * 4 + 3] = -1; // 0xff, that's the alpha.
}
// Convert RGBA byte array to PNG
int samplesPerPixel = 4;
int[] bandOffsets = {0,1,2,3}; // RGBA order
byte[] bgraPixelData = new byte[width * height * samplesPerPixel];
DataBuffer buffer = new DataBufferByte(bgraPixelData, bgraPixelData.length);
WritableRaster raster = Raster.createInterleavedRaster(buffer, width, height, samplesPerPixel * width, samplesPerPixel, bandOffsets, null);
ColorModel colorModel = new ComponentColorModel(ColorSpace.getInstance(ColorSpace.CS_sRGB), true, false, Transparency.TRANSLUCENT, DataBuffer.TYPE_BYTE);
BufferedImage image = new BufferedImage(colorModel, raster, colorModel.isAlphaPremultiplied(), null);
System.out.println("image: " + image); // Should print: image: BufferedImage#<hash>: type = 0 ...
ImageIO.write(image, "PNG", new File(path));
Update
To draw a rectangle on the image:
BufferedImage image = ...
Graphics2D graph = image.createGraphics();
graph.setColor(Color.BLACK);
graph.fill(new Rectangle(x, y, width, height));
graph.dispose();
ImageIO.write(image, "PNG", new File(path));

KsProperty() returns "The data area passed to a system call is too small" (ERROR_INSUFFICIENT_BUFFER) while setting camera zoom value

HRESULT hr = S_OK;
KSPROPERTY ksprop;
ZeroMemory(&ksprop, sizeof(ksprop));
PVOID pData = NULL;
ULONG valueSize = 0;
ULONG dataLength = 0;
KSPROPERTY_CAMERACONTROL_S cameraControl;
ZeroMemory(&cameraControl, sizeof(cameraControl));
ksprop.Set = PROPSETID_VIDCAP_CAMERACONTROL;
ksprop.Id = KSPROPERTY_CAMERACONTROL_ZOOM;
ksprop.Flags = KSPROPERTY_TYPE_SET;
cameraControl.Property = ksprop;
cameraControl.Flags = KSPROPERTY_CAMERACONTROL_FLAGS_MANUAL;
cameraControl.Capabilities = KSPROPERTY_CAMERACONTROL_FLAGS_MANUAL;
cameraControl.Value = 50;
pData = &cameraControl;
dataLength = sizeof(cameraControl);
hr = m_pKsControl->KsProperty(
&ksprop, sizeof(ksprop),
pData, dataLength, &valueSize);
Here hr is "The data area passed to a system call is too small."
I am compiling with VS 2010 on a Windows 7 machine.
You are likely providing a buffer that is too small in the fourth parameter.
It's easy to check this; see the IKsControl::KsProperty docs:
To determine the buffer size that is required for a specific property
request, you can call this method with PropertyData set to NULL and
DataLength equal to zero. The method returns
HRESULT_FROM_WIN32(ERROR_MORE_DATA), and BytesReturned contains the
size of the required buffer.
In this KsProperty use case, you should pass a KSPROPERTY_CAMERACONTROL_S structure variable to make this work.
For example:
KSPROPERTY_CAMERACONTROL_S kspIn = { 0 };
KSPROPERTY_CAMERACONTROL_S kspOut = { 0 };
ULONG valueSize = 0;
kspIn.Property.Set = PROPSETID_VIDCAP_CAMERACONTROL;
kspIn.Property.Id = KSPROPERTY_CAMERACONTROL_ZOOM;
kspIn.Property.Flags = KSPROPERTY_TYPE_SET;
kspOut.Property.Set = PROPSETID_VIDCAP_CAMERACONTROL;
kspOut.Property.Id = KSPROPERTY_CAMERACONTROL_ZOOM;
kspOut.Property.Flags = KSPROPERTY_TYPE_SET;
kspOut.Flags = KSPROPERTY_CAMERACONTROL_FLAGS_MANUAL;
kspOut.Capabilities = KSPROPERTY_CAMERACONTROL_FLAGS_MANUAL;
kspOut.Value = 50;
hr = m_pKsControl->KsProperty(
(PKSPROPERTY)&kspIn, sizeof(kspIn),
&kspOut, sizeof(kspOut), &valueSize);
This should work.

Conditional Line Graph using Open Flash Charts

I am using Open Flash Charts v2. I have been trying to make a conditional line graph, but I couldn't find any straightforward way, example, or class for producing conditional charts.
Example of Conditional Graph
So I tried to emulate a conditional graph: I made a separate Line object for the values above the limit range, and this line is then used to overlap the plotted line.
This technique works somewhat OK, but there are problems with it:
How do I color or place the conditional colored line exactly above the limit?
How do I remove the tooltip and dot from the limit line?
The tooltips of the conditional line (red) and the plotted line (green) are both shown; I only need the tooltip of the green line.
Conditional Line Graph Problem illustrated
Source code (C#):
var chart = new OpenFlashChart.OpenFlashChart();
var data1 = new List<double?> { 1, 3, 4, 5, 2, 1, 6, 7 };//>4=
var overlap = new List<double?> { null, null, 4, 5, null, null, null, null };
var overlap2 = new List<double?> { null, null, null, null, null, null, 6, 7 };
var limitData = new List<double?> { 4, 4, 4, 4, 4, 4, 4, 4 };
var line1 = new Line();
line1.Values = data1;
//line1.HaloSize = 0;
line1.Width = 2;
line1.DotSize = 5;
line1.DotStyleType.Tip = "#x_label#<br>#val#";
line1.Colour = "#37c855";
line1.Tooltip = "#val#";
var overLine = new Line();
overLine.Values = overlap;
//overLine.HaloSize = 0;
overLine.Width = 2;
overLine.DotSize = 5;
overLine.DotStyleType.Tip = "#x_label#<br>#val#";
overLine.Colour = "#d81417";
overLine.Tooltip = "#val#";
var overLine2 = new Line();
overLine2.Values = overlap2;
//overLine2.HaloSize = 0;
overLine2.Width = 2;
overLine2.DotSize = 5;
//overLine2.DotStyleType.Tip = "#x_label#<br>#val#";
//overLine2.DotStyleType.Type = DotType.DOT;
overLine2.Colour = "#d81417";
overLine2.Tooltip = "#val#";
var limit = new Line();
limit.Values = limitData;
limit.Width = 2;
limit.Colour = "#ff0000";
limit.HaloSize = -1;
limit.DotSize = -1;
// limit.DotStyleType.Tip = "";
limit.DotStyleType.Type = null;
//limit.Tooltip = "";
chart.AddElement(line1);
chart.AddElement(overLine);
chart.AddElement(overLine2);
chart.AddElement(limit);
chart.Y_Legend = new Legend("Experiment");
chart.Title = new Title("Conditional Line Graph");
chart.Y_Axis.SetRange(0, 10);
chart.X_Axis.Labels.Color = "#e43456";
chart.X_Axis.Steps = 4;
chart.Tooltip = new ToolTip("#val#");
chart.Tooltip.Shadow = true;
chart.Tooltip.Colour = "#e43456";
chart.Tooltip.MouseStyle = ToolTipStyle.CLOSEST;
Response.Clear();
Response.CacheControl = "no-cache";
Response.Write(chart.ToPrettyString());
Response.End();
Note:
I have already downloaded the OFC (Open Flash Charts) source. If I modify the OFC Line.as source, how would I be able to generate the JSON for the changed graph? I'm currently using the .NET library for JSON generation for OFC charts, so please let me know this as well.
Update:
I have modified the source code on the advice of David Mears. I'm using FlashDevelop for ActionScript.
P.S.: I'm open to ideas if another library can do this job.
If you don't mind a little rebuilding, you can get the source of OFC here and modify the Line.solid_line() method in open-flash-chart/charts/Line.as to do this fairly easily.
In order to set the extra chart details through JSON using the .NET library, you'll also have to modify OpenFlashChart/LineBase.cs to add alternative colour and boundary properties. I'm not hugely familiar with .NET, but based on the existing properties you might add something like this:
private double boundary;
private string altcolour;

[JsonProperty("boundary")]
public virtual double Boundary
{
    set { this.boundary = value; }
    get { return this.boundary; }
}

[JsonProperty("alt-colour")]
public virtual string AltColour
{
    set { this.altcolour = value; }
    get { return this.altcolour; }
}
Then I believe the following should work in Line.as:
public function solid_line(): void {
    var first:Boolean = true;
    var i:Number;
    var tmp:Sprite;
    var x:Number;
    var y:Number;
    var last_e:Element;
    var ratio:Number;

    for ( i = 0; i < this.numChildren; i++ ) {
        // Step through every child object.
        tmp = this.getChildAt(i) as Sprite;
        // Only include data Elements, ignoring extra children such as line masks.
        if ( tmp is Element )
        {
            var e:Element = tmp as Element;
            if ( first )
            {
                if (this.props.get('alt-colour') != Number.NEGATIVE_INFINITY) {
                    if (e._y >= this.props.get_colour('boundary'))
                    {
                        // Line starts below boundary, set alt line colour.
                        this.graphics.lineStyle( this.props.get_colour('width'), this.props.get_colour('alt-colour') );
                    }
                    else
                    {
                        // Line starts above boundary, set normal line colour.
                        this.graphics.lineStyle( this.props.get_colour('width'), this.props.get_colour('colour') );
                    }
                }
                // Move to the first point.
                this.graphics.moveTo(e.x, e.y);
                x = e.x;
                y = e.y;
                first = false;
            }
            else
            {
                if (this.props.get('alt-colour') != Number.NEGATIVE_INFINITY) {
                    if (last_e._y < this.props.get_colour('boundary') && e._y >= this.props.get_colour('boundary'))
                    {
                        // Line passes below boundary. Draw first section and switch to alt colour.
                        ratio = (this.props.get_colour('boundary') - last_e._y) / (e._y - last_e._y);
                        this.graphics.lineTo(last_e.x + (e.x - last_e.x) * ratio, last_e.y + (e.y - last_e.y) * ratio);
                        this.graphics.lineStyle( this.props.get_colour('width'), this.props.get_colour('alt-colour') );
                    }
                    else if (last_e._y >= this.props.get_colour('boundary') && e._y < this.props.get_colour('boundary'))
                    {
                        // Line passes above boundary. Draw first section and switch to normal colour.
                        ratio = (this.props.get_colour('boundary') - last_e._y) / (e._y - last_e._y);
                        this.graphics.lineTo(last_e.x + (e.x - last_e.x) * ratio, last_e.y + (e.y - last_e.y) * ratio);
                        this.graphics.lineStyle( this.props.get_colour('width'), this.props.get_colour('colour') );
                    }
                }
                // Draw a line to the next point.
                this.graphics.lineTo(e.x, e.y);
            }
            last_e = e;
        }
    }

    if ( this.props.get('loop') ) {
        // close the line loop (radar charts)
        this.graphics.lineTo(x, y);
    }
}
With the new open-flash-chart.swf, you should be able to just set your new properties on line1:
line1.Boundary = 4;
line1.AltColour = "#d81417";
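With those two [JsonProperty] attributes, the .NET library should emit the extra keys alongside the line's existing ones, so the generated line element would look roughly like this (a sketch of the expected JSON, not verified output from the library):
{
  "type": "line",
  "colour": "#37c855",
  "width": 2,
  "boundary": 4,
  "alt-colour": "#d81417",
  "values": [1, 3, 4, 5, 2, 1, 6, 7]
}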

Mapping a rect in a small image to a larger image (in order to do a copyPixels operation)

This is (I think) a relatively simple math question, but I've spent a day banging my head against it and have only the dents and no solution...
I'm coding in ActionScript 3. The functionality is:
1. A large image is loaded at runtime. The bitmapData is stored and a smaller version is created to display in the available screen area (I may end up just scaling the large image since it is in memory anyway).
2. The user can create a rectangular hotspot on the smaller image (the functionality will be more complex: multiple rects with transparency, for example a donut shape with a hole, etc.).
3. When the user clicks on the hotspot, the rect of the hotspot is mapped to the larger image and a new bitmap "callout" is created, using the larger bitmap data. The reason for this is so the "callout" will be better quality than just scaling up the area of the hotspot.
The image below shows where I am so far: the blue rect is the clicked hotspot. In the upper left is the "callout", copied from the larger image. I have the aspect ratio right, but I am not mapping to the larger image correctly.
Ugly code below... Sorry this post is so long; I just figured I ought to provide as much info as possible. Thanks for any tips!
-- trace of my data values:
source BitmapDada 1152 864
scaled to rect 800 600
scaled BitmapData 800 600
selection BitmapData 58 56
scaled selection 83 80
ratio 1.44
before (x=544, y=237, w=58, h=56)
(x=544, y=237, w=225.04, h=217.28)
Image here: http://i795.photobucket.com/albums/yy237/skinnyTOD/exampleST.jpg
public function onExpandCallout(event:MouseEvent):void {
    if (maskBitmapData.getPixel32(event.localX, event.localY) != 0) {
        var maskClone:BitmapData = maskBitmapData.clone();
        // amount to scale callout - this will vary/can be changed by user
        var scale:Number = 150; // scale percentage
        var normalizedScale:Number = scale /= 100;
        var w:Number = maskBitmapData.width * normalizedScale;
        var h:Number = maskBitmapData.height * normalizedScale;
        var ratio:Number = (sourceBD.width / targetRect.width);
        // create bmpd of the scaled size to copy source into
        var scaledBitmapData:BitmapData = new BitmapData(maskBitmapData.width * ratio, maskBitmapData.height * ratio, true, 0xFFFFFFFF);
        trace("source BitmapDada " + sourceBD.width, sourceBD.height);
        trace("scaled to rect " + targetRect.width, targetRect.height);
        trace("scaled BitmapData", bkgnImageSprite.width, bkgnImageSprite.height);
        trace("selection BitmapData", maskBitmapData.width, maskBitmapData.height);
        trace("scaled selection", scaledBitmapData.width, scaledBitmapData.height);
        trace("ratio", ratio);
        var scaledBitmap:Bitmap = new Bitmap(scaledBitmapData);
        var scaleW:Number = sourceBD.width / scaledBitmapData.width;
        var scaleH:Number = sourceBD.height / scaledBitmapData.height;
        var scaleMatrix:Matrix = new Matrix();
        scaleMatrix.scale(ratio, ratio);
        var sRect:Rectangle = maskSprite.getBounds(bkgnImageSprite);
        var sR:Rectangle = sRect.clone();
        var ss:Sprite = new Sprite();
        ss.graphics.lineStyle(8, 0x0000FF);
        //ss.graphics.beginFill(0x000000, 1);
        ss.graphics.drawRect(sRect.x, sRect.y, sRect.width, sRect.height);
        //ss.graphics.endFill();
        this.addChild(ss);
        trace("before " + sRect);
        w = uint(sRect.width * scaleW);
        h = uint(sRect.height * scaleH);
        sRect.inflate(maskBitmapData.width * ratio, maskBitmapData.height * ratio);
        sRect.offset(maskBitmapData.width * ratio, maskBitmapData.height * ratio);
        trace(sRect);
        scaledBitmapData.copyPixels(sourceBD, sRect, new Point());
        addChild(scaledBitmap);
        scaledBitmap.x = offsetPt.x;
        scaledBitmap.y = offsetPt.y;
    }
}
Thanks!
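For reference, the mapping in the second version below is just a uniform scale of every component of the selection rect by ratio = sourceBMD.width / targetRect.width. With the traced values (1152 / 800 = 1.44) that gives x = 544 * 1.44 ≈ 783.4, y = 237 * 1.44 ≈ 341.3, w = 58 * 1.44 ≈ 83.5 and h = 56 * 1.44 ≈ 80.6 (matching the "scaled selection 83 80" trace), rather than the inflated/offset rect traced above.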
public function onExpandCallout(event:MouseEvent):void {
    // TODO: build this on startup or only on click? Speed vs memory
    if (calloutState == true) return;
    if (maskBitmapData.getPixel32(event.localX, event.localY) != 0) {
        calloutState = true;
        // create bitmap from source using scaled selection rect
        var ratio:Number = (sourceBMD.width / targetRect.width);
        var sRect:Rectangle = hotSpotSprite.getBounds(bkgnImageSprite);
        var destRect:Rectangle = new Rectangle(sRect.x * ratio, sRect.y * ratio, sRect.width * ratio, sRect.height * ratio);
        calloutBitmapData = new BitmapData(destRect.width, destRect.height, true, 0xFFFFFFFF);
        calloutBitmap = new Bitmap(calloutBitmapData);
        //-- scale alpha mask
        var scaledMaskBitmapData:BitmapData = new BitmapData(destRect.width, destRect.height, true, 0x00000000);
        var maskScale:Number = scaledMaskBitmapData.width / maskBitmapData.width;
        var mMatrix:Matrix = new Matrix(maskScale, 0, 0, maskScale);
        scaledMaskBitmapData.draw(maskBitmapData, mMatrix, null, null, null, false);
        // copy source with scaled alpha
        calloutBitmapData.copyPixels(sourceBMD, destRect, new Point(), scaledMaskBitmapData, new Point());
        scaledMaskBitmapData = null;
        // apply filter to bitmap
        var myDropShadowFilter:DropShadowFilter = new DropShadowFilter();
        myDropShadowFilter.distance = 12;
        myDropShadowFilter.alpha = .3;
        myDropShadowFilter.strength = 1;
        myDropShadowFilter.blurX = 8;
        myDropShadowFilter.blurY = 8;
        calloutBitmap.filters = [myDropShadowFilter];
        // place on screen
        calloutSprite = new Sprite();
        calloutSprite.addChild(calloutBitmap);
        calloutSprite.x = offsetPt.x;
        calloutSprite.y = offsetPt.y;
        // ADD TO PARENT DisplayContainer
        calloutLayer.addChild(calloutSprite);
        // calloutSprite.scaleX = 2;
        // calloutSprite.scaleY = 2;
        calloutSprite.doubleClickEnabled = true;
        calloutSprite.addEventListener(MouseEvent.DOUBLE_CLICK, onCollapseCallout);
        calloutSprite.addEventListener(MouseEvent.MOUSE_DOWN, onStartDrag);
        calloutSprite.addEventListener(MouseEvent.MOUSE_UP, onStopDrag);
    }
}
