I created an image from a byte array:
System.Drawing.Image newImage;
using (MemoryStream ms = new MemoryStream(imageBytes, 0, imageBytes.Length))
{
    newImage = System.Drawing.Image.FromStream(ms, true);
}
Now I need to use this image as the source for an asp:Image (System.Web.UI.WebControls.Image). Is this possible? As far as I know, a direct conversion isn't.
Use the following code in the Page_Load of a separate page (pic.aspx here) whose only job is to serve the image:
System.IO.MemoryStream ms = new System.IO.MemoryStream();
image.Save(ms, System.Drawing.Imaging.ImageFormat.Gif);
Response.ClearContent();
Response.ContentType = "image/gif";
Response.BinaryWrite(ms.ToArray());
Then point the image control at that page:
<asp:Image ID="Image1" runat="server" ImageUrl="~/pic.aspx"/>
If you're starting with a byte array, just send that. No need to parse and re-encode it unless you need to change the image format.
If you do need to perform image manipulations, you might consider an existing library that doesn't have memory leaks, and is optimized for the server. GDI calls are dangerous on the server - you have to know exactly what you're doing.
For the best performance, use an HttpModule to answer the request. An .aspx page, as suggested by MUG4N, adds significant overhead to the request. An .ashx handler is better than an .aspx page, but isn't MVC-compatible and doesn't permit the disk caching that can be accomplished via an HttpModule and a RewritePath call.
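As a rough sketch of the simplest variant (a plain generic handler rather than the HttpModule approach described above), serving an existing byte array only requires setting the content type and writing the bytes. GetImageBytes and the "image/jpeg" content type below are placeholders for however you actually store and load the data:

<%@ WebHandler Language="C#" Class="ImageHandler" %>

// ImageHandler.ashx - minimal sketch; GetImageBytes is a hypothetical lookup.
public class ImageHandler : System.Web.IHttpHandler
{
    public void ProcessRequest(System.Web.HttpContext context)
    {
        byte[] imageBytes = GetImageBytes(context.Request["id"]);

        context.Response.ContentType = "image/jpeg"; // match the stored format
        context.Response.BinaryWrite(imageBytes);
    }

    public bool IsReusable { get { return true; } }

    private static byte[] GetImageBytes(string id)
    {
        // Hypothetical: load the bytes from a database, cache, etc.
        throw new System.NotImplementedException();
    }
}

The image control then points at the handler: <asp:Image ID="Image1" runat="server" ImageUrl="~/ImageHandler.ashx?id=123" />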
In an existing ASP.NET application, we are using Response.BinaryWrite to render an image on an .aspx page. This is the required functionality, and below is the C# code -
1. byte[] img = getImage();
2. Response.BinaryWrite(img);
The getImage function reads the image from a folder on the server and returns a byte array. The Fortify scan flags a cross-site scripting vulnerability on the 2nd line.
I did the following validations, but Fortify still reports it as a cross-site scripting issue -
Validated the byte array to check that the file is of the correct format (JPEG or BMP), using this link - Determine file type of an image
Response.BinaryWrite(ValidateFileType(img));
Validated the domain in the file path to check that the file originates from the correct domain.
Is there any specific way to pass the Fortify cross-site scripting check with a byte array, or can I consider this a false positive?
I had to use a workaround to resolve this; below is the old and new code -
Old Code -
1. byte[] byteImage = getImage();
2. Response.BinaryWrite(byteImage);
New Code (replaced the 2nd line of the old code with the block below) -
byte[] byteImage = getImage();

// Round-trip the bytes through a System.Drawing.Image so the response is
// built from a re-encoded image rather than the raw input bytes.
using (var msIn = new MemoryStream(byteImage))
using (System.Drawing.Image img = System.Drawing.Image.FromStream(msIn))
using (var msOut = new MemoryStream())
{
    img.Save(msOut, img.RawFormat);
    Response.BinaryWrite(msOut.ToArray());
}
Response.Flush();
So, basically, converting the byte array to an Image object and then writing the re-encoded image back out via Response.BinaryWrite resolved this, and it passed the Fortify scan.
If anyone is looking for a solution, this might help.
I need to take an uploaded image, resize it, and save it to the database. Simple enough, except I don't have access to save any temp files to the server. I'm taking the image, resizing it as a Bitmap, and need to save it to a database field as the original image type (JPG for example). How can I get the FileBytes() like this, so I can save it to the database?
Before, I was using ImageUpload.FileBytes(), but now that I'm resizing I'm dealing with Images and Bitmaps instead of FileUploads and can't seem to find anything that will give me the bytes.
Thanks!
It's actually not so simple... there are 28 non-obvious pitfalls you should watch out for when doing image resizing. It's best to use my free, open-source library to handle all the encoding issues and avoid the GDI bugs.
Here's how to get an encoded byte[] array for each uploaded file, after resizing, cropping, and converting to Jpeg format.
using ImageResizer;
using ImageResizer.Encoding;
using System.IO;
using System.Web;

// Loop through each uploaded file
foreach (string fileKey in HttpContext.Current.Request.Files.Keys)
{
    HttpPostedFile file = HttpContext.Current.Request.Files[fileKey];

    // You can specify any of 30 commands.. See http://imageresizing.net
    ResizeSettings resizeCropSettings =
        new ResizeSettings("width=200&height=200&format=jpg&crop=auto");

    using (MemoryStream ms = new MemoryStream())
    {
        // Resize the image
        ImageBuilder.Current.Build(file, ms, resizeCropSettings);

        // Upload the byte array to SQL: ms.ToArray();
    }
}
It's also a bad idea to use MS SQL for storing images. See my podcast with Scott Hanselman for more info.
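If you do store the bytes in SQL Server despite the caveat above, the upload inside the using block is just a parameterized insert. This is a sketch only; the connection string and the Photos/FileName/ImageData table and column names are assumptions to adjust for your schema:

// Requires: using System.Data; using System.Data.SqlClient;
byte[] jpegBytes = ms.ToArray();
using (var conn = new SqlConnection(connectionString))
using (var cmd = new SqlCommand(
    "INSERT INTO Photos (FileName, ImageData) VALUES (@name, @data)", conn))
{
    cmd.Parameters.AddWithValue("@name", file.FileName);
    cmd.Parameters.Add("@data", SqlDbType.VarBinary, -1).Value = jpegBytes;
    conn.Open();
    cmd.ExecuteNonQuery();
}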
See Resizing an Image without losing any quality. You can then write your image to a MemoryStream (Bitmap.Save) and call ToArray to get the bytes.
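Roughly along the lines of the linked approach, a quality-preserving resize followed by encoding to a byte array might look like this (a sketch; targetWidth and targetHeight are whatever dimensions you need, and JPEG is assumed as the output format):

private static byte[] ResizeToJpegBytes(System.Drawing.Image source, int targetWidth, int targetHeight)
{
    using (var resized = new System.Drawing.Bitmap(targetWidth, targetHeight))
    {
        using (var g = System.Drawing.Graphics.FromImage(resized))
        {
            // High-quality interpolation avoids the blockiness of the default resize.
            g.InterpolationMode = System.Drawing.Drawing2D.InterpolationMode.HighQualityBicubic;
            g.SmoothingMode = System.Drawing.Drawing2D.SmoothingMode.HighQuality;
            g.DrawImage(source, 0, 0, targetWidth, targetHeight);
        }

        using (var ms = new System.IO.MemoryStream())
        {
            resized.Save(ms, System.Drawing.Imaging.ImageFormat.Jpeg);
            return ms.ToArray();
        }
    }
}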
This is what I've done to resize images.
private byte[] toBytes(Image image)
{
    // Resize to the target dimensions, then encode the result as JPEG.
    using (Bitmap resized = new Bitmap(image, yourWidth, yourHeight))
    using (var ms = new System.IO.MemoryStream())
    {
        resized.Save(ms, System.Drawing.Imaging.ImageFormat.Jpeg);
        return ms.ToArray();
    }
}
I have some AVI files on disk but they are encrypted. I'm wondering if there is a way I can decrypt them and stream them to the browser (using MemoryStream or something similar) without having to write any files?
I know there is Windows Media Services but I'm using a Vista machine and Windows Media Services will only install in Windows Server 2003 and 2008.
Is there a way to accomplish this without too much trouble or is Media Services/Windows Server the only way to go? And if there is, would I use something like a custom IHttpHandler (.ashx file)?
Edit:
I have decided to use a custom IHttpHandler. What basic code would I need to have the video play?
I wouldn't want to use a MemoryStream for video. Assuming you can create a CryptoStream (found in the System.Security.Cryptography namespace) over the encrypted AVI file, you should be able to just pump a Read from that to a Write on the Response.OutputStream in an IHttpHandler. Something like:
byte[] buffer = new byte[65536]; // adjust the buffer size as you prefer
CryptoStream inStream = YourFunctionToDecryptAVI(aviFilePath);
int bytesRead = inStream.Read(buffer, 0, buffer.Length);
while (bytesRead != 0)
{
    context.Response.OutputStream.Write(buffer, 0, bytesRead);
    bytesRead = inStream.Read(buffer, 0, buffer.Length);
    if (!context.Response.IsClientConnected) break;
}
context.Response.Close(); // see edit note
Make sure you turn off response buffering and specify a content type.
Edit:
Ordinarily I hate calling Close; it seems so draconian. However, whilst chunked encoding shouldn't require it, in the case of streamed video the client may not like it. Also, with large data transfers, closing the connection is not really a big deal.
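For reference, a sketch of what YourFunctionToDecryptAVI might look like, assuming the AVI was encrypted with AES and the key and IV are available to the application; GetKeyFromConfig and GetIvFromConfig are hypothetical helpers:

// Sketch only - assumes AES. Requires System.IO and System.Security.Cryptography.
private static CryptoStream YourFunctionToDecryptAVI(string aviFilePath)
{
    Aes aes = Aes.Create();
    aes.Key = GetKeyFromConfig(); // hypothetical: however you store the key securely
    aes.IV = GetIvFromConfig();   // hypothetical: likewise for the IV

    // Open the encrypted file and expose a decrypting read stream over it,
    // so the handler can pump bytes to the response without any temp files.
    FileStream encrypted = File.OpenRead(aviFilePath);
    return new CryptoStream(encrypted, aes.CreateDecryptor(), CryptoStreamMode.Read);
}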
What's the best way to stream files using ASP.NET?
There appear to be various methods for this, and I'm currently using the Response.TransmitFile() method inside an HTTP handler, which sends the file to the browser directly. This is used for various things, including sending FLVs from outside the webroot to an embedded Flash video player.
However, this doesn't seem like a reliable method. In particular, there's a strange problem with Internet Explorer (7), where the browser just hangs after a video or two are viewed. Clicking on any links, etc. has no effect, and the only way to get things working again on the site is to close down the browser and re-open it.
This also occurs in other browsers, but much less frequently. Based on some basic testing, I suspect this is something to do with the way files are being streamed... perhaps the connection isn't being closed properly, or something along those lines.
After trying a few different things, I've found that the following method works for me:
Response.WriteFile(path);
Response.Flush();
Response.Close();
Response.End();
This gets around the problem mentioned above, and viewing videos no longer causes Internet Explorer to hang.
However, my understanding is that Response.WriteFile() loads the file into memory first, and given that some files being streamed could potentially be quite large, this doesn't seem like an ideal solution.
I'm interested in hearing how other developers are streaming large files in ASP.NET, and in particular, streaming FLV video files.
I would take things outside of the "aspx" pipeline. In particular, I would write a raw handler (ashx, or mapped via config) that does the minimum work and simply writes to the response in chunks. The handler would accept input from the query-string/form as normal, locate the object to stream, and stream the data (using a moderately sized local buffer in a loop). A simple (incomplete) example is shown below:
public void ProcessRequest(HttpContext context)
{
    // read input etc.
    context.Response.Buffer = false;
    context.Response.ContentType = "text/plain";

    string path = @"c:\somefile.txt";
    FileInfo file = new FileInfo(path);
    int len = (int)file.Length, bytes;
    context.Response.AppendHeader("content-length", len.ToString());

    byte[] buffer = new byte[1024];
    Stream outStream = context.Response.OutputStream;
    using (Stream stream = File.OpenRead(path))
    {
        // Copy the file to the response in buffer-sized chunks.
        while (len > 0 && (bytes = stream.Read(buffer, 0, buffer.Length)) > 0)
        {
            outStream.Write(buffer, 0, bytes);
            len -= bytes;
        }
    }
}
Take a look at the following article, Tracking and Resuming Large File Downloads in ASP.NET, which goes into more depth than just opening a stream and chucking out all the bits.
The HTTP protocol supports byte-range requests and resumable downloads, and many streaming clients (like video players or Adobe's PDF reader) can and will use them, saving bandwidth and giving your users a better experience.
Not trivial, but it's time well spent.
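To give a flavor of what the article covers, a handler that honors byte-range requests looks roughly like this. This is a simplified sketch: the file path and content type are placeholders, only a single "bytes=start-end" range is handled, and validation/error handling is omitted:

public void ProcessRequest(HttpContext context)
{
    string path = context.Server.MapPath("~/media/sample.flv"); // placeholder path
    FileInfo file = new FileInfo(path);
    long start = 0, end = file.Length - 1;

    // Very simplified Range parsing: "bytes=start-end".
    string range = context.Request.Headers["Range"];
    if (!string.IsNullOrEmpty(range) && range.StartsWith("bytes="))
    {
        string[] parts = range.Substring(6).Split('-');
        if (parts[0].Length > 0) start = long.Parse(parts[0]);
        if (parts.Length > 1 && parts[1].Length > 0) end = long.Parse(parts[1]);

        context.Response.StatusCode = 206; // Partial Content
        context.Response.AppendHeader("Content-Range",
            string.Format("bytes {0}-{1}/{2}", start, end, file.Length));
    }

    context.Response.ContentType = "video/x-flv"; // placeholder content type
    context.Response.AppendHeader("Accept-Ranges", "bytes");
    context.Response.AppendHeader("Content-Length", (end - start + 1).ToString());

    using (FileStream fs = File.OpenRead(path))
    {
        fs.Seek(start, SeekOrigin.Begin);
        byte[] buffer = new byte[65536];
        long remaining = end - start + 1;
        int read;
        while (remaining > 0 &&
               (read = fs.Read(buffer, 0, (int)Math.Min(buffer.Length, remaining))) > 0)
        {
            context.Response.OutputStream.Write(buffer, 0, read);
            remaining -= read;
        }
    }
}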
Try opening the file as a stream, then using Response.OutputStream.Write(). For example:
Edit: My bad, I forgot that Write takes a byte buffer. Fixed
byte[] buffer = new byte[1 << 16]; // 64 KB
int bytesRead = 0;
using (var file = File.OpenRead(path))
{
    while ((bytesRead = file.Read(buffer, 0, buffer.Length)) != 0)
    {
        Response.OutputStream.Write(buffer, 0, bytesRead);
    }
}
Response.Flush();
Response.Close();
Response.End();
Edit 2: Did you try this? It should work.
After trying lots of different combinations, including the code posted in the various answers, it seems that setting Response.Buffer = true before calling TransmitFile did the trick, and the web application is now a lot more responsive in Internet Explorer.
In this particular case, the SWF extension is also mapped to ASP.NET, and we're using a custom handler in our web application to read the files from disk and then send them to the browser using Response.TransmitFile(). We've got a Flash-based video player to play video files which are also SWFs, and I think having all of this activity go through the handler without buffering is what may have been causing strange things to happen in IE.
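For reference, the change in the handler amounts to something like this (a sketch; the path and content type are placeholders):

public void ProcessRequest(HttpContext context)
{
    // Buffering the response before TransmitFile is what resolved the IE hang described above.
    context.Response.Buffer = true;
    context.Response.ContentType = "application/x-shockwave-flash"; // or video/x-flv for FLVs
    context.Response.TransmitFile(context.Server.MapPath("~/media/player.swf")); // placeholder path
}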
I have a SQL Server database that returns bytes for the image. If I use the TableAdapter wizard, point it at my stored procedure, and preview the data, it pulls back an image. It automatically turns it into an image in the preview data; I don't see it as a string of ints or anything.
How can I display it on my ASP.NET web page with a GridView and ObjectDataSource?
I have searched and found that the ImageField can point to a URL on another page that does the byte transformation, but I'm not sure that's the best approach. I found another way that creates a temp file.
Just trying to see the best way to do it.
Edit - I am trying not to use a temp file. If I cannot use a GridView, a regular Image control is OK.
ASP.NET 2.0, C#.
Thank you for any help.
Edit
Ended up with:
protected void Page_Load(object sender, EventArgs e)
{
    string id = Request["id"];
    string connstr = "DSN=myserver";

    using (OdbcConnection conn = new OdbcConnection(connstr))
    using (OdbcCommand cmd = new OdbcCommand("{call mySP (?)}", conn))
    {
        cmd.CommandType = CommandType.StoredProcedure;

        // Add the input parameter and set its properties.
        OdbcParameter parameter = new OdbcParameter();
        parameter.ParameterName = "@MyParam";
        parameter.OdbcType = OdbcType.VarChar;
        parameter.Direction = ParameterDirection.Input;
        parameter.Value = id;

        // Add the parameter to the Parameters collection.
        cmd.Parameters.Add(parameter);

        conn.Open();
        using (OdbcDataReader dr = cmd.ExecuteReader())
        {
            while (dr.Read())
            {
                // Write the raw image bytes straight into the response.
                byte[] buffer = (byte[])dr[0];
                Response.ContentType = "image/jpeg";
                Response.BinaryWrite(buffer);
                Response.Flush();
            }
        }
    }
}
and this on the calling page:
<asp:Image ID="Image1" ImageAlign="Middle" ImageUrl="show.aspx?id=123" Runat="server" />
Two options:
Create a temp file - The problem with this approach is that you have to create the file, which means your web application must have write access to a directory, which is not a great thing. You also need a way to clean up the images.
Serve it from another URL - This is my preferred method, as no disk access is required. A simple HTTP handler (.ashx) is a great way to serve up the image.
Edit
If you need session state in the ashx, check out: Asp.net System.Web.HttpContext.Current.Session null in global.asax.
Edit
A couple more thoughts. There are some cases where using a temp file might be better, for example if your images are requested frequently by a lot of users. Then storing the images on disk would make sense, since you could write the file once. This does increase the maintenance complexity, but depending on traffic it might be worth it, since it would let you avoid calling back into the .NET stack and leverage IIS caching of static content.
I wrote the SqlReader plugin for the open-source ImageResizing.Net library to let you serve and display images from a SQL database in the most performance-optimal way.
Even if you don't need to do any image processing whatsoever, it's still (a) the easiest and (b) the most efficient way to do it. You can combine it with disk caching (which provides automatic cleanup) to get the best performance possible.
Installation is easy: two NuGet commands, or copy and paste into Web.config, your pick.
If you need help, support is free and fast.
The sample code you added is good, but you should move it to an .ashx file, which is meant for such things.
Here is some example code on how to do this.
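A minimal sketch of the Page_Load above moved into a generic handler might look like the following; the DSN, stored procedure, and parameter are the same hypothetical values as in the code above, and the content type should be adjusted to match your stored images.

<%@ WebHandler Language="C#" Class="ShowImage" %>

using System.Data;
using System.Data.Odbc;
using System.Web;

public class ShowImage : IHttpHandler
{
    public void ProcessRequest(HttpContext context)
    {
        string id = context.Request["id"];

        using (OdbcConnection conn = new OdbcConnection("DSN=myserver"))
        using (OdbcCommand cmd = new OdbcCommand("{call mySP (?)}", conn))
        {
            cmd.CommandType = CommandType.StoredProcedure;

            OdbcParameter parameter = new OdbcParameter("@MyParam", OdbcType.VarChar);
            parameter.Value = id;
            cmd.Parameters.Add(parameter);

            conn.Open();
            using (OdbcDataReader dr = cmd.ExecuteReader())
            {
                if (dr.Read())
                {
                    // Write the image bytes straight into the response.
                    byte[] buffer = (byte[])dr[0];
                    context.Response.ContentType = "image/jpeg";
                    context.Response.BinaryWrite(buffer);
                }
            }
        }
    }

    public bool IsReusable { get { return false; } }
}

The markup on the calling page then becomes:

<asp:Image ID="Image1" ImageAlign="Middle" ImageUrl="ShowImage.ashx?id=123" Runat="server" />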