I have an image uploader and cropper which creates thumbnails and I occasionally get an Out Of Memory exception on the following line:
Dim bm As Bitmap = System.Drawing.Image.FromFile(imageFile)
The occurrence of the error is very rare, but I always like to know what might be causing it. The imageFile variable is just a Server.MapPath to the path of the image.
I was curious whether anyone had experienced this issue before and had any ideas about what might be causing it. Is it the size of the image, perhaps?
I can post the code if necessary and any supporting information I have, but would love to hear people's opinions on this one.
It's worth knowing that OutOfMemoryException doesn't always really mean it's out of memory - particularly not when dealing with files. I believe it can also happen if you run out of handles for some reason.
Are you disposing of all your bitmaps after you're done with them? Does this happen repeatably for a single image?
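For what it's worth, here is a minimal sketch of that disposal pattern in C# (my own illustration, not the poster's code; imageFile and thumbFile are hypothetical paths and the 120×120 size is just an example):
using System.Drawing;
using System.Drawing.Imaging;

static void CreateThumbnail(string imageFile, string thumbFile)
{
    // Both objects are released deterministically, even if thumbnail creation throws.
    using (Image original = Image.FromFile(imageFile))
    using (Bitmap thumb = new Bitmap(original, new Size(120, 120)))
    {
        thumb.Save(thumbFile, ImageFormat.Jpeg);
    }
}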
If this wasn't a bad image file but was in fact the normal issue with Image.FromFile, wherein it leaves file handles open, then the solution is to use Image.FromStream instead.
using (FileStream fs = new FileStream(filePath, FileMode.Open, FileAccess.Read))
{
    using (Image original = Image.FromStream(fs))
    {
        // ... work with the image here ...
    }
}
Using an explicit Dispose(), a using() statement or setting the value to null on the bitmap doesn't solve the issue with Image.FromFile.
So if your app runs for a long time and opens a lot of files, consider using Image.FromStream() instead.
I hit the same issue today while creating thumbnail images for a folder full of images. It turned out that the "Out Of Memory" error occurred at exactly the same point each time. When I looked at the folder with the images to be converted, I found that the file causing the problem was thumbs.db. I added some code to make sure that only image files were being converted, and the issue was resolved.
My code is basically:
For Each imageFile As FileInfo In fileList
    If imageFile.Extension = ".jpg" Or imageFile.Extension = ".gif" Then
        ' ...proceed with the conversion
    End If
Next
Hope this helps.
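In C#, the same kind of filter might look like the sketch below; the extension list and the ConvertToThumbnail helper are hypothetical, and the comparison is case-insensitive so files like "Thumbs.db" (or uploads named ".JPG") are handled as intended:
using System;
using System.IO;
using System.Linq;

static void ConvertFolder(string sourceFolder)
{
    // Allowed extensions are illustrative; adjust to whatever your uploader accepts.
    string[] allowedExtensions = { ".jpg", ".jpeg", ".gif", ".png" };
    foreach (FileInfo imageFile in new DirectoryInfo(sourceFolder).GetFiles())
    {
        // Case-insensitive check, so thumbs.db and other non-images are skipped.
        if (allowedExtensions.Contains(imageFile.Extension, StringComparer.OrdinalIgnoreCase))
        {
            ConvertToThumbnail(imageFile.FullName); // hypothetical conversion helper
        }
    }
}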
Also check if you haven't opened the same file somewhere else. Apparently, when you open a file twice (even with File.Open()) OutOfMemoryException is thrown too...
You can also open it in read mode (if you want to use it in two places at the same time):
public Image OpenImage(string previewFile)
{
    // FileShare.ReadWrite lets other code open the same file while this image is in use.
    // Note: the stream must remain open for the lifetime of the returned Image.
    FileStream fs = new FileStream(previewFile, FileMode.Open, FileAccess.Read, FileShare.ReadWrite);
    return Image.FromStream(fs);
}
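For illustration (my own sketch, reusing the OpenImage helper above; previewFile is whatever path you are working with), FileShare.ReadWrite lets the same file be opened twice at once without triggering the spurious OutOfMemoryException:
using (Image preview = OpenImage(previewFile))
using (Image sameFileAgain = OpenImage(previewFile))
{
    // Both handles are open concurrently; work with the images here.
    // Keep in mind that disposing an Image does not dispose the stream passed to
    // Image.FromStream, so the FileStreams created inside OpenImage are only
    // released when they are garbage collected.
}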
This happens when the image file is corrupted. It is a bad error message, because memory has nothing to do with it. I haven't worked out the coding, but a try/catch/finally will stop the program from crashing.
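A rough sketch of that idea (my own addition, not the answerer's code): treat an OutOfMemoryException from Image.FromFile as "not a readable image" and skip the file instead of letting the whole request fail.
static System.Drawing.Image TryLoadImage(string imageFile)
{
    try
    {
        return System.Drawing.Image.FromFile(imageFile);
    }
    catch (OutOfMemoryException)
    {
        // GDI+ raises this for corrupt or unsupported files, not only for low memory.
        return null; // caller should check for null and skip the file
    }
}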
I had a similar problem today when I was trying to resize an image and then crop it. What happened is that I used this code to resize the image:
private static Image resizeImage(Image imgToResize, Size size)
{
    int sourceWidth = imgToResize.Width;
    int sourceHeight = imgToResize.Height;
    float nPercent = 0;
    float nPercentW = 0;
    float nPercentH = 0;
    nPercentW = ((float)size.Width / (float)sourceWidth);
    nPercentH = ((float)size.Height / (float)sourceHeight);
    if (nPercentH < nPercentW)
        nPercent = nPercentH;
    else
        nPercent = nPercentW;
    // The (int) casts truncate, so the result can come out a pixel short (e.g. 399 instead of 400).
    int destWidth = (int)(sourceWidth * nPercent);
    int destHeight = (int)(sourceHeight * nPercent);
    Bitmap b = new Bitmap(destWidth, destHeight);
    Graphics g = Graphics.FromImage((Image)b);
    g.InterpolationMode = InterpolationMode.HighQualityBicubic;
    g.DrawImage(imgToResize, 0, 0, destWidth, destHeight);
    g.Dispose();
    return (Image)b;
}
And then this code for the crop...
private static Image cropImage(Image img, Rectangle cropArea)
{
    Bitmap bmpImage = new Bitmap(img);
    Bitmap bmpCrop = bmpImage.Clone(cropArea, bmpImage.PixelFormat);
    return (Image)bmpCrop;
}
Then this is how I called the above code...
Image img = Image.FromFile(@"C:\Users\****\Pictures\image.jpg");
img = ImageHandler.ResizeImage(img, new Size(400, 300));
img = ImageHandler.CropImage(img, new Rectangle(0, 25, 400, 250));
long quality = 90;
I kept getting errors on the crop part; the resizer worked fine!
It turns out that what was happening inside the resizer was causing the errors in the crop function. The resize calculations were making the actual dimensions of the image come out to something like 399 rather than the 400 I passed in.
So when I passed 400 as the argument for the crop, it was trying to crop a 399px-wide image with a 400px-wide rectangle, and it threw the out of memory error!
Most of the above code was found on http://www.switchonthecode.com/tutorials/csharp-tutorial-image-editing-saving-cropping-and-resizing
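A sketch of a guard that would have avoided the error above (my own addition, assuming the same System.Drawing types as the helpers): clamp the crop rectangle to the actual bitmap bounds before calling Clone().
private static Image cropImageSafe(Image img, Rectangle cropArea)
{
    using (Bitmap bmpImage = new Bitmap(img))
    {
        // Clone() throws OutOfMemoryException if the rectangle falls outside the bitmap,
        // which can happen when rounding during the resize leaves the image a pixel short.
        Rectangle bounded = Rectangle.Intersect(cropArea, new Rectangle(0, 0, bmpImage.Width, bmpImage.Height));
        return bmpImage.Clone(bounded, bmpImage.PixelFormat);
    }
}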
If the image is an icon, then different loading handling is required, as in the following function:
public static Image loadImage(string imagePath)
{
    Image loadedImage = null;
    if (!File.Exists(imagePath)) return loadedImage;
    try
    {
        FileInfo fileInfo = new FileInfo(imagePath);
        if (fileInfo.Extension.Equals(".jpg") || fileInfo.Extension.Equals(".jpeg") ||
            fileInfo.Extension.Equals(".bmp") || fileInfo.Extension.Equals(".png") ||
            fileInfo.Extension.Equals(".gif"))
        {
            loadedImage = Image.FromFile(imagePath);
        }
        else if (fileInfo.Extension.Equals(".ico"))
        {
            Bitmap aBitmap = Bitmap.FromHicon(new Icon(imagePath, new Size(200, 200)).Handle);
            loadedImage = ImageFuncs.ResizeImage(aBitmap, new Size(30, 30));
        }
    }
    catch (Exception eLocal)
    {
        MessageBox.Show(imagePath + " loading error: " + eLocal.Message);
    }
    return loadedImage;
}
I had the same problem with a utility I wrote to convert TIFFs to PDFs. Often I would get the "out of memory" error on the same line as you:
System.Drawing.Image.FromFile(imageFile)
Then I discovered the error only happened when the file extension was ".tiff"; it worked fine after I renamed the file with a ".tif" extension.
I had the same issue. Before looking elsewhere in the code, I wanted to make sure I could open the image in an image viewer, and I found that the image was corrupted/damaged even though it was a .PNG file of 1 KB. After adding a new image in the same location, it worked fine.
I was having the same problem batch processing TIFF files. Most of the files weren't throwing an exception, but a few files were throwing an "Out of Memory" exception in ASP.NET 4.0. I examined the binary data to work out why it happened only for a few files from within the same folder. It can't be a permission issue for the ASPNET or NETWORK SERVICE account, because the other files work fine.
I opened the iTextSharp.text.Image class and found that there are many overloads of GetInstance(). I resolved my problem using the following code (note: the catch block will run for the problematic files):
iTextSharp.text.Image image = null;
try
{
    var imgStream = GetImageStream(path);
    image = iTextSharp.text.Image.GetInstance(imgStream);
}
catch
{
    iTextSharp.text.pdf.RandomAccessFileOrArray ra = null;
    ra = new iTextSharp.text.pdf.RandomAccessFileOrArray(path);
    image = iTextSharp.text.pdf.codec.TiffImage.GetTiffImage(ra, 1);
    if (ra != null)
        ra.Close();
}
If you're serving from IIS, try recycling the Application Pool. This solved a similar image upload "Out of Memory" error for me.
I created a minimal form example that still gives me errors.
private void button1_Click(object sender, EventArgs e)
{
    string SourceFolder = ImageFolderTextBox.Text;
    string FileName = "";
    DirectoryInfo Mydir = new DirectoryInfo(SourceFolder);
    FileInfo[] JPEGS = Mydir.GetFiles("*.jpg");
    for (int counter = 0; counter < JPEGS.Count(); counter++)
    {
        FileName = Mydir + "\\" + JPEGS[counter].Name;
        //using (Image MyImage = System.Drawing.Image.FromFile(FileName))
        using (FileStream fs = new FileStream(FileName, FileMode.Open, FileAccess.Read, FileShare.ReadWrite))
        {
            StatusBtn.BackColor = Color.Green;
        }
    }
}
I tried both the commented-out line using Image.FromFile() and the line using FileStream(). Both produced file errors.
The Image.FromFile() error was:
System.OutOfMemoryException: 'Out of Memory'
The FileStream() error was:
System.UnauthorizedAccessException: 'Access to the path 'E:\DCIM\100Canon\dsc_7218.jpg' is denied.'
I placed a breakpoint just prior to the lines producing the error, and at that point I am able to open the image file in the Windows image viewer. I then closed the viewer, advanced to the next line, and got the error; after that I can no longer view the image with the Windows viewer. Instead, I get a message that I do not have permission to access the file. I am able to delete the file.
This error is repeatable. I've done it over 10 times. Each time, after I get the error, I delete the file used for FileName.
All files were verified to be non-corrupt.
My original code that used Image.FromFile() worked fine when I compiled it two years ago; in fact, that old .exe still runs just fine. I made a minor change somewhere else in the code and was surprised to find that the freshly compiled code would not run without hitting this error. I tried the FileStream() method based on the information on this page.
Related
I am trying to convert PDFs into images using Magick.NET and Ghostscript. It works well locally and on Windows VM servers, but when I push to Azure it gives me an error saying FailedToExecuteCommand ghostscript gswin64c.exe. Details of the error are in the attachments.
I tried changing the path for the Ghostscript DLL but had no luck. The path is correct and the DLL file is there; one thing I am sure of is that I have put the DLL inside wwwroot/GhostScriptDll. On error it shows "D:/home/site/wwwroot/wwwroot/GhostScriptDll/gswin64c.exe", which is also correct (there are two wwwroot segments).
Can someone help me use Ghostscript without resorting to WebJobs?
The C# code I tried is like this:
string rawpath = _hostingEnvironment.WebRootPath + "\\GhostScriptDll";
MagickNET.SetGhostscriptDirectory(rawpath);
MagickReadSettings settings = new MagickReadSettings();
settings.Density = new Density(300, 300);
List<string> intList = new List<string>() { };
using (MagickImageCollection images = new MagickImageCollection())
{
    images.Read($"{rawFileLocation}{model.OriginalDocName}", settings);
    int page = 1;
    string tempName = "";
    foreach (MagickImage image in images)
    {
        tempName = System.IO.Path.GetRandomFileName().Replace(".", string.Empty) + "_processed_" + page + ".png";
        image.Write($"{processedFileLocation}{tempName}");
        intList.Add(tempName);
        page++;
    }
}
I suspect this part of the error: D:/local/Temp/magick-MB0cs71xMgZUfIHIwbzlzehBIlLL-NED"' (The system cannot find the file specified.) # error/delegate.c/ExternalDelegateCommand/475 ... though I can't find the local/Temp folder on the server.
As the title suggests, WLP won't run the process: it doesn't return anything to the process input stream nor to the error stream.
If anyone knows about a configuration that needs to take place, I would love to know.
(Note: the process can run by executing the command manually; in addition, the whole thing runs smoothly on Tomcat 8.)
EDIT 1:
The problem was not the command execution under WLP, as you guys stated, so I accepted the answer.
The problem is different: I sent a media file to a multipart servlet and stored it in a file on disk using the following code:
InputStream is = request.getInputStream();
String currentTime = new Long(System.currentTimeMillis()).toString();
String fileName = PATH + currentTime + "." + fileType;
File file = new File(fileName);
// write the image to a temporary location
FileOutputStream os = new FileOutputStream(file);
byte[] buffer = new byte[BUFFER_SIZE];
while (true) {
    int numRead = is.read(buffer);
    if (numRead == -1) {
        break;
    }
    os.write(buffer, 0, numRead);
    os.flush();
}
is.close();
os.close();
and the file gets saved along with the following prefix:
This does not happen on Tomcat 8 (using the same client), so something non-trivial is going on in the received input stream. (Note it's a multipart servlet set up via @MultipartConfig only.)
Hope this post helps others.
Thanks for your help!
This will work in Liberty. I was able to test out the following code in a servlet and it printed the path of my current directory just fine:
String line;
Process p = Runtime.getRuntime().exec("cmd /c cd");
BufferedReader input = new BufferedReader(new InputStreamReader(p.getInputStream()));
while ((line = input.readLine()) != null) {
    System.out.println(line);
}
input.close();
Start with a simple command like this, and when you move up to more complex commands or scripts, make sure you are not burying exceptions that may come back. Always at least print the stack trace!
I am trying to understand and implement a piece of code for Tiff compression.
I have already used two separate techniques: the third-party DLL LibTiff.Net (the first method is bulky) and the Image save method, http://msdn.microsoft.com/en-us/library/ytz20d80%28v=vs.110%29.aspx (the second method works only on a Windows 7 machine, but not on Windows Server 2003 or 2008).
Now I am looking to explore this 3rd method.
using System.Windows.Forms;
using System.Windows.Media.Imaging;
using System.Drawing.Imaging;
int width = 800;
int height = 1000;
int stride = width/8;
byte[] pixels = new byte[height*stride];
// Try creating a new image with a custom palette.
List<System.Windows.Media.Color> colors = new List<System.Windows.Media.Color>();
colors.Add(System.Windows.Media.Colors.Red);
colors.Add(System.Windows.Media.Colors.Blue);
colors.Add(System.Windows.Media.Colors.Green);
BitmapPalette myPalette = new BitmapPalette(colors);
// Creates a new empty image with the pre-defined palette
BitmapSource image = BitmapSource.Create(
width,
height,
96,
96,
System.Windows.Media.PixelFormats.BlackWhite,
myPalette,
pixels,
stride);
FileStream stream = new FileStream(Original_File, FileMode.Create);
TiffBitmapEncoder encoder = new TiffBitmapEncoder();
encoder.Compression = TiffCompressOption.Ccitt4;
encoder.Frames.Add(BitmapFrame.Create(image));
encoder.Save(stream);
But I don't have a full understanding of what is happening here.
There is obviously some kind of memory stream that the compression technique is being applied to, but I am a bit confused about how to apply this to my specific case. I have an original TIFF file; I want to use this method to set its compression to CCITT and save it back. Can anyone help?
I copied the above code and the code runs. But my end output file is a solid black background image. Although on the positive side it is of the correct compression type.
http://msdn.microsoft.com/en-us/library/ms616002%28v=vs.110%29.aspx
http://msdn.microsoft.com/en-us/library/system.windows.media.imaging.tiffcompressoption%28v=vs.100%29.aspx
http://social.msdn.microsoft.com/Forums/vstudio/en-US/1585c562-f7a9-4cfd-9674-6855ffaa8653/parameter-is-not-valid-for-compressionccitt4-on-windows-server-2003-and-2008?forum=netfxbcl
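(The solid black output is expected with the sample as written: the pixels buffer is never filled, and all-zero bits are black in the BlackWhite format.) As a sketch of how the same encoder could be pointed at an existing TIFF instead of a blank pixel buffer, untested and with illustrative names, something like this should work:
using System.IO;
using System.Windows.Media.Imaging;

public static void RecompressTiff(string inputPath, string outputPath)
{
    using (FileStream input = new FileStream(inputPath, FileMode.Open, FileAccess.Read))
    {
        // Decode the original file instead of building a pixel buffer by hand,
        // so the existing image content is preserved.
        TiffBitmapDecoder decoder = new TiffBitmapDecoder(
            input, BitmapCreateOptions.PreservePixelFormat, BitmapCacheOption.OnLoad);

        TiffBitmapEncoder encoder = new TiffBitmapEncoder();
        encoder.Compression = TiffCompressOption.Ccitt4;

        // CCITT4 needs black-and-white data, so convert each frame before re-adding it.
        foreach (BitmapFrame frame in decoder.Frames)
        {
            FormatConvertedBitmap bw = new FormatConvertedBitmap(
                frame, System.Windows.Media.PixelFormats.BlackWhite, null, 0);
            encoder.Frames.Add(BitmapFrame.Create(bw));
        }

        using (FileStream output = new FileStream(outputPath, FileMode.Create))
        {
            encoder.Save(output);
        }
    }
}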
LibTiff.net is a little bulky because it's based off LibTiff, which has its own set of problems.
My company (Atalasoft) has the ability to do that fairly easily, and the free version of the SDK will do the task you want with a few restrictions. The code for re-encoding a file would look like this:
public bool ReencodeFile(string path)
{
    AtalaImage image = new AtalaImage(path);
    if (image.PixelFormat == PixelFormat.Pixel1bppIndexed)
    {
        TiffEncoder encoder = new TiffEncoder();
        encoder.Compression = TiffCompression.Group4FaxEncoding;
        image.Save(path, encoder, null); // destroys the original - use carefully
        return true;
    }
    return false;
}
Things you should be aware of:
this code will only work properly on 1bpp images
this code will NOT work properly on multi-page TIFFs
this code does NOT preserve metadata within the original file
and I would want the code to at least check for those. If you are inclined toward a solution that better preserves the content of the original file, you would want to do this:
public bool ReencodeFile(string origPath, string outputPath)
{
    if (origPath == outputPath) throw new ArgumentException("outputPath needs to be different from input path.");
    TiffDocument doc = new TiffDocument(origPath);
    bool needsReencoding = false;
    for (int i = 0; i < doc.Pages.Count; i++)
    {
        if (doc.Pages[i].PixelFormat == PixelFormat.Pixel1bppIndexed)
        {
            doc.Pages[i] = new TiffPage(new AtalaImage(origPath, i, null), TiffCompression.Group4FaxEncoding);
            needsReencoding = true;
        }
    }
    if (needsReencoding)
        doc.Save(outputPath);
    return needsReencoding;
}
This solution will respect all pages within the document as well as document metadata.
I have used this example when exporting data to PowerPoint:
I have modified the GenerateSlidesFromDB() method:
public void GenerateSlidesFromDB()
{
    string slideName = @"C:\Users\x\Desktop\output.pptx";
    File.Copy(@"C:\Users\x\Desktop\Test.pptx", slideName, true);
    using (PresentationDocument presentationDocument = PresentationDocument.Open(slideName, true))
    {
        PresentationPart presentationPart = presentationDocument.PresentationPart;
        SlidePart slideTemplate = (SlidePart)presentationPart.GetPartById("rId2");
        string firstName = "Test User";
        SlidePart newSlide = CloneSlidePart(presentationPart, slideTemplate);
        InsertContent(newSlide, firstName);
        newSlide.Slide.Save();
        DeleteTemplateSlide(presentationPart, slideTemplate);
        presentationPart.Presentation.Save();
    }
}
As you can see I overwrite the placeholder with "Test User", and it works like a charm.
I need to add an image (as a placeholder) to this pptx-file.
When I do that (and run the code again), I get a corrupted pptx file.
Error message:
PowerPoint removed unreadable content in output.pptx. You should review this presentation to determine whether any content was unexpectedly changed or removed.
Edit: If I try the original code (which is slightly modified since I don't have AdventureWorks), I get a different error message:
This file may have become corrupt or damaged for the following reasons:
Third-party XML editors sometimes create files that are not compatible with Microsoft Office XML specifications.
The file has been purposely corrupted with the intent to harm your computer or your data.
Be cautious when opening a file from an unknown source.
PowerPoint can attempt to recover data from the file, but some presentation data, such as shapes, text, and formatting, may be lost.
Do one of the following:
If you want to recover data from the file, click Yes.
If you do not want to recover data from the file, click No.
Ok, sorry for this useless post. My bad.
Solution:
string imgId = "rIdImg" + i;
ImagePart imagePart = newSlide.AddImagePart(ImagePartType.Jpeg, imgId);
using (FileStream file = File.Open(@"C:\Users\x\Desktop\Test.jpg", FileMode.Open))
{
    byte[] buffer = new byte[file.Length];
    file.Read(buffer, 0, (int)file.Length);
    imagePart.FeedData(new MemoryStream(buffer));
}
SwapPhoto(newSlide, imgId);
I am doing a lot of image processing in GDI+ in .NET in an ASP.NET application.
I frequently find that Image.FromFile() is keeping a file handle open.
Why is this? What is the best way to open an image without the file handle being retained.
NB: I'm not doing anything stupid like keeping the Image object lying around, and even if I were, I wouldn't expect the file handle to be kept active.
I went through the same journey as a few other posters on this thread. Things I noted:
Image.FromFile does seem unpredictable about when it releases the file handle. Calling Image.Dispose() did not release the file handle in all cases.
Using a FileStream and the Image.FromStream method works and releases the handle on the file if you call Dispose() on the FileStream or wrap the whole thing in a using {} statement, as recommended by Kris. However, if you then attempt to save the Image object to a stream, the Image.Save method throws the exception "A generic error occurred in GDI+". Presumably something in the Save method wants to know about the originating file.
Steven's approach worked for me. I was able to delete the originating file with the Image object in memory. I was also able to save the Image to both a stream and a file (I needed to do both of these things). I was also able to save to a file with the same name as the originating file, something that is documented as not possible if you use the Image.FromFile method (I find this weird since surely this is the most likely use case, but hey.)
So to summarise, open your Image like this:
Image img = Image.FromStream(new MemoryStream(File.ReadAllBytes(path)));
You are then free to manipulate it (and the originating file) as you see fit.
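For illustration (my own sketch, assuming System.Drawing and System.IO usings; the path is hypothetical and the file is assumed to be a JPEG), this is what that freedom looks like in practice: because no lock is held on the source file, you can even overwrite it in place.
string path = @"C:\images\photo.jpg"; // hypothetical path
byte[] bytes = File.ReadAllBytes(path);
using (Image img = Image.FromStream(new MemoryStream(bytes)))
{
    // ... resize/crop here ...
    img.Save(path, System.Drawing.Imaging.ImageFormat.Jpeg); // saving over the original file works: no lock is held
}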
I have had the same problem and resorted to reading the file using
return Image.FromStream(new MemoryStream(File.ReadAllBytes(fileName)));
Image.FromFile keeps the file handle open until the image is disposed. From the MSDN:
"The file remains locked until the Image is disposed."
Use Image.FromStream, and you won't have the problem.
using (var fs = new FileStream(filename, FileMode.Open, FileAccess.Read))
{
    return Image.FromStream(fs);
}
Edit: (a year and a bit later)
The above code is dangerous because it is unpredictable: at some point in time (after closing the FileStream) you may get the dreaded "A generic error occurred in GDI+". I would amend it to:
Image tmpImage;
Bitmap returnImage;
using (var fs = new FileStream(filename, FileMode.Open, FileAccess.Read))
{
    tmpImage = Image.FromStream(fs);
    returnImage = new Bitmap(tmpImage);
    tmpImage.Dispose();
}
return returnImage;
Make sure you are Disposing properly.
using (Image.FromFile("path")) {}
The using statement is shorthand for:
IDisposable obj = Image.FromFile("path");
try
{
    // use obj here
}
finally
{
    obj.Dispose();
}
@Rex: in the case of Image.Dispose, it calls the GdipDisposeImage extern/native Win32 call in its Dispose().
IDisposable is used as a mechanism to free unmanaged resources (which file handles are).
I also tried all your tips (ReadAllBytes, FileStream => FromStream => new Bitmap() to make a copy, etc.) and they all worked. However, I wondered whether something shorter could be found, and
using (Image temp = Image.FromFile(path))
{
    return new Bitmap(temp);
}
appears to work too, as it disposes the file handle as well as the original Image object and creates a new Bitmap object that is independent of the original file and can therefore be saved to a stream or file without errors.
I would have to point my finger at the Garbage Collector. Leaving it around is not really the issue if you are at the mercy of Garbage Collection.
This guy had a similar complaint... and he found a workaround of using a FileStream object rather than loading directly from the file.
public static Image LoadImageFromFile(string fileName)
{
    Image theImage = null;
    FileStream fileStream = new FileStream(fileName, FileMode.Open, FileAccess.Read);
    byte[] img = new byte[fileStream.Length];
    fileStream.Read(img, 0, img.Length);
    fileStream.Close();
    theImage = Image.FromStream(new MemoryStream(img));
    img = null;
    // ...
    return theImage;
}
It seems like a complete hack...
As mentioned above, the Microsoft workaround causes a GDI+ error after several images have been loaded. The VB solution for me, as mentioned above by Steven, is:
picTemp.Image = Image.FromStream(New System.IO.MemoryStream(My.Computer.FileSystem.ReadAllBytes(strFl)))
I just encountered the same problem while trying to merge multiple single-page TIFF files into one multi-part TIFF image. I needed to use Image.Save() and Image.SaveAdd(): https://msdn.microsoft.com/en-us/library/windows/desktop/ms533839%28v=vs.85%29.aspx
The solution in my case was to call ".Dispose()" for each of the images, as soon as I was done with them:
' Iterate through each single-page source .tiff file
' (encoderInfo and encoderParams are assumed to be declared and initialized before this loop)
Dim initialTiff As System.Drawing.Image = Nothing
For Each filePath As String In srcFilePaths
    Using fs As System.IO.FileStream = File.Open(filePath, FileMode.Open, FileAccess.Read)
        If initialTiff Is Nothing Then
            ' ... Save 1st page of multi-part .TIFF
            initialTiff = Image.FromStream(fs)
            encoderParams.Param(0) = New EncoderParameter(Encoder.Compression, EncoderValue.CompressionCCITT4)
            encoderParams.Param(1) = New EncoderParameter(Encoder.SaveFlag, EncoderValue.MultiFrame)
            initialTiff.Save(outputFilePath, encoderInfo, encoderParams)
        Else
            ' ... Save subsequent pages
            Dim newTiff As System.Drawing.Image = Image.FromStream(fs)
            encoderParams = New EncoderParameters(2)
            encoderParams.Param(0) = New EncoderParameter(Encoder.Compression, EncoderValue.CompressionCCITT4)
            encoderParams.Param(1) = New EncoderParameter(Encoder.SaveFlag, EncoderValue.FrameDimensionPage)
            initialTiff.SaveAdd(newTiff, encoderParams)
            newTiff.Dispose()
        End If
    End Using
Next
' Make sure to close the file
initialTiff.Dispose()
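For completeness, the MSDN sample linked above finishes the multi-frame file with a final SaveAdd using EncoderValue.Flush. Here is a C# sketch of the full pattern under that assumption; the paths and the codec lookup are illustrative, not the answerer's exact code:
using System.Drawing;
using System.Drawing.Imaging;
using System.Linq;

static void MergeTiffs(string[] sourceFiles, string outputFile)
{
    ImageCodecInfo tiffCodec = ImageCodecInfo.GetImageEncoders()
        .First(c => c.MimeType == "image/tiff");

    EncoderParameters firstParams = new EncoderParameters(2);
    firstParams.Param[0] = new EncoderParameter(Encoder.Compression, (long)EncoderValue.CompressionCCITT4);
    firstParams.Param[1] = new EncoderParameter(Encoder.SaveFlag, (long)EncoderValue.MultiFrame);

    using (Image first = Image.FromFile(sourceFiles[0]))
    {
        // First page starts the multi-frame file.
        first.Save(outputFile, tiffCodec, firstParams);

        EncoderParameters pageParams = new EncoderParameters(2);
        pageParams.Param[0] = new EncoderParameter(Encoder.Compression, (long)EncoderValue.CompressionCCITT4);
        pageParams.Param[1] = new EncoderParameter(Encoder.SaveFlag, (long)EncoderValue.FrameDimensionPage);

        // Subsequent pages are appended, disposing each one as soon as it is added.
        foreach (string path in sourceFiles.Skip(1))
        {
            using (Image page = Image.FromFile(path))
            {
                first.SaveAdd(page, pageParams);
            }
        }

        // Finalize the multi-frame file.
        EncoderParameters flushParams = new EncoderParameters(1);
        flushParams.Param[0] = new EncoderParameter(Encoder.SaveFlag, (long)EncoderValue.Flush);
        first.SaveAdd(flushParams);
    }
}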