In Postman it returns "Cannot access a closed file."
I want to return the user's profile picture: the image itself is saved in the wwwroot folder on the server, and its path is stored in the database when the picture is added.
It always returns "Cannot access a closed file" and I don't know why. If needed, I will add the code that saves the profile picture.
public async Task<IActionResult> GetProfilePicture([FromQuery] string userId)
{
FileStreamResult file;
if (userId == null)
return BadRequest("Something went wrong");
// searching for user to get picture path that are saved into database
var userPP = await _context.Users.FindAsync(userId);
if (userPP == null)
return BadRequest("Something went wrong! Try again");
// getting extension from file path
var ext = Path.GetExtension(userPP.ProfilePicture);
string contentType;
// sending file with right contentType depending on extension
switch (ext)
{
case ".jpg":
case ".jpeg":
contentType = "image/jpeg";
break;
case ".png":
contentType = "image/png";
break;
case ".gif":
contentType = "image/gif";
break;
default:
contentType = "application/octet-stream";
break;
}
//opening file stream to get image
using (var fs = new FileStream(userPP.ProfilePicture, FileMode.Open, FileAccess.Read))
{
file = File(fs, contentType);
return Ok(file);
}
}
I just removed the using statement, because it was disposing the stream and closing the file before it was actually sent to the frontend.
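For reference, a minimal sketch of the corrected ending (my wording, assuming the same userPP lookup and contentType switch shown above): the stream is not wrapped in a using block, because the FileStreamResult disposes it once the response has been written, and the result is returned directly instead of being wrapped in Ok(...).
// Sketch of the fixed ending of GetProfilePicture (same lookup and contentType switch as above).
// No using block: the FileStreamResult created by File(...) disposes the stream
// after the response body has been written.
var stream = new FileStream(userPP.ProfilePicture, FileMode.Open, FileAccess.Read);
return File(stream, contentType);
// Alternatively, if ProfilePicture holds an absolute path, let the framework open the file:
// return PhysicalFile(userPP.ProfilePicture, contentType);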
I am creating a website using Azure Blobs to store content. The website provides Search and Indexing.
When this link is relative,
<a download="" href="./media5/yyy.png">Download</a>
the browser kicks off a "download".
When the files are stored in Blobs, the users get a link like:
<a download="" href="https://xxx.blob.core.windows.net/media5/yyy.png">Download</a>
However, this navigates to the image.
I need the browser "download" to work.
I have tried setting rules on the Storage Account's CORS blade, but this did not do anything.
CORS is not going to help in this case. If you want to force download the blob, please change the blob's content-type property to application/octet-stream (or application/binary).
However, please note that when you change the blob's content type to application/octet-stream, it will always be downloaded. You will not be able to display the blob in the browser.
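For an existing blob, the content type can be changed with the v12 SDK roughly like this (a sketch, assuming Azure.Storage.Blobs and placeholder names; note that SetHttpHeadersAsync replaces all of the blob's HTTP header properties, so any header you want to keep must be set in the same call):
// Sketch (Azure.Storage.Blobs v12, placeholder names): force downloads by
// overwriting the blob's stored Content-Type.
var blobClient = new BlobClient(connectionString, "media5", "yyy.png");
await blobClient.SetHttpHeadersAsync(new BlobHttpHeaders
{
    ContentType = "application/octet-stream"
});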
Manrti is correct: CORS is not in play. I deleted the Storage Account's CORS rules.
The fix is 3 parts:
1: Set the ContentDisposition header to "attachment"
var blockBlobClient = new BlockBlobClient(connectionString, container, fileInfo.Name, clientOptions);
var uploadOptions = new BlobUploadOptions();
uploadOptions.HttpHeaders = new BlobHttpHeaders();
switch (fileInfo.Extension)
{
case ".wav":
uploadOptions.HttpHeaders.ContentType = "audio/wav";
break;
case ".mp3":
uploadOptions.HttpHeaders.ContentType = "audio/mpeg";
break;
case ".mp4":
uploadOptions.HttpHeaders.ContentType = "video/mp4";
break;
case ".jpg":
uploadOptions.HttpHeaders.ContentType = "image/jpeg";
break;
case ".png":
uploadOptions.HttpHeaders.ContentType = "image/png";
break;
case ".zip":
uploadOptions.HttpHeaders.ContentType = "application/x-zip-compressed";
break;
case ".html":
uploadOptions.HttpHeaders.ContentType = "text/plain";
break;
case ".pdf":
uploadOptions.HttpHeaders.ContentType = "application/pdf";
break;
default:
break;
}
uploadOptions.HttpHeaders.ContentDisposition = "attachment";
uploadOptions.ProgressHandler = progressHandler;
var contentMD5 = await GetContentMD5(blockBlobClient);
var contentType = await GetContentType(blockBlobClient);
using (var fs = fileInfo.Open(FileMode.Open, FileAccess.Read, FileShare.Read))
{
if ((sourceMD5 != contentMD5) || (contentType != uploadOptions.HttpHeaders.ContentType))
{
Console.WriteLine("\t\tUploading Blob...");
await blockBlobClient.UploadAsync(fs, uploadOptions);
contentMD5 = await GetContentMD5(blockBlobClient);
if (sourceMD5 != contentMD5)
{
throw new Exception($"Uploaded blob ContentMD5[{contentMD5}] does not match SourceMD5Hash[{sourceMD5}] for {blockBlobClient.Name}");
}
}
return blockBlobClient;
}
2: You must use a SAS Url to have the Content-Disposition header be returned:
var uri = blobClient.GenerateSasUri(BlobSasPermissions.Read, new DateTimeOffset(DateTime.UtcNow.AddYears(2)));
3: HTML Link:
<a download="" href="https://xxx.blob.core.windows.net/media5/yyy.png?sv=2020-08-04&se=2024-04-08T14%3A43%3A28Z&sr=b&sp=r&sig=ZWdbpd8y2hr02MpzgxDC%2Fu2eqi5HukIYhXnLiYK4Rrk%3D">Download</a>
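A note on part 2 (an addition, not from the original answer): with the v12 SDK the Content-Disposition can also be supplied per request through the SAS itself instead of being stored on the blob, roughly like this:
// Sketch (Azure.Storage.Sas, placeholder names): override Content-Disposition only for
// requests made with this SAS, leaving the blob's stored properties untouched.
var sasBuilder = new BlobSasBuilder
{
    BlobContainerName = blobClient.BlobContainerName,
    BlobName = blobClient.Name,
    Resource = "b",
    ExpiresOn = DateTimeOffset.UtcNow.AddYears(2),
    ContentDisposition = "attachment; filename=yyy.png"
};
sasBuilder.SetPermissions(BlobSasPermissions.Read);
var uri = blobClient.GenerateSasUri(sasBuilder);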
I can't really find a way to download a 100 MB zip file from the server to the client and also show the progress while downloading. How would this look for a normal API controller that I can add to my server-side project, if, let's say, I have 3 files I want to download at 50 MB each?
I have tried using JSInterop like this, but it does not show the progress of the file download, and how would I do it if I want to download 3 separate files at the same time?
try
{
//converting file into bytes array
var dataBytes = System.IO.File.ReadAllBytes(file);
await JSRuntime.InvokeVoidAsync(
"downloadFromByteArray",
new
{
ByteArray = dataBytes,
FileName = "download.zip",
ContentType = "application/force-download"
});
}
catch (Exception)
{
//throw;
}
JS:
function downloadFromByteArray(options: {
byteArray: string,
fileName: string,
contentType: string
}): void {
// Convert base64 string to numbers array.
const numArray = atob(options.byteArray).split('').map(c => c.charCodeAt(0));
// Convert numbers array to Uint8Array object.
const uint8Array = new Uint8Array(numArray);
// Wrap it by Blob object.
const blob = new Blob([uint8Array], { type: options.contentType });
// Create "object URL" that is linked to the Blob object.
const url = URL.createObjectURL(blob);
// Invoke download helper function that implemented in
// the earlier section of this article.
downloadFromUrl({ url: url, fileName: options.fileName });
// At last, release unused resources.
URL.revokeObjectURL(url);
}
UPDATE:
If I use this code, the browser shows me the progress of the file. But how can I trigger it from my code? The call below does not do it, but typing the URL into the browser does.
await Http.GetAsync($"Download/Model/{JobId}");
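One likely reason (an assumption, not stated in the post): Http.GetAsync downloads the response into the Blazor app itself, so the browser's download manager, and its progress UI, never get involved. Handing the URL to the browser is what produces the visible download. A minimal sketch, assuming a NavigationManager injected as Navigation:
// Sketch (assumption: NavigationManager injected as Navigation in the component).
// forceLoad: true makes the browser issue the request itself; because the controller
// returns the file with a download file name, the browser downloads it and shows progress.
Navigation.NavigateTo($"Download/Model/{JobId}", forceLoad: true);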
Controller
[HttpGet("download/model/{JobId}")]
public IActionResult DownloadFile([FromRoute] string JobId)
{
if (JobId == null)
{
return BadRequest();
}
var FolderPath = $"xxxx";
var FileName = $"Model_{JobId}.zip";
var filePath = Path.Combine(environment.WebRootPath, FolderPath, FileName);
byte[] fileBytes = System.IO.File.ReadAllBytes(filePath);
return File(fileBytes, "application/force-download", FileName);
}
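As a side note (not part of the original post), the same action can stream the file from disk instead of loading the whole zip into memory; the browser still receives a Content-Length and can show progress. A minimal sketch under that assumption:
// Sketch: stream the file instead of buffering it with ReadAllBytes.
// PhysicalFile sets Content-Length from the file size and can honor range requests.
[HttpGet("download/model/{JobId}")]
public IActionResult DownloadFile([FromRoute] string JobId)
{
    if (string.IsNullOrEmpty(JobId))
        return BadRequest();

    var filePath = Path.Combine(environment.WebRootPath, "xxxx", $"Model_{JobId}.zip");
    if (!System.IO.File.Exists(filePath))
        return NotFound();

    return PhysicalFile(filePath, "application/zip", $"Model_{JobId}.zip", enableRangeProcessing: true);
}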
UPDATE 2!
I have got it downloading with progress, triggered by a click, using JSInterop.
public async void DownloadFiles()
{
//download all selectedFiles
foreach (var file in selectedFiles)
{
//download these files
await JSRuntime.InvokeAsync<object>("open", $"Download/Model/{JobId}/{file.Name}", "_blank");
}
}
Now the only problem left is that it only downloads the first file out of the 3.
I'd like to upload images from a URI posted to an ASP.NET MVC 5 controller to Azure Blob Storage. I already got it working with HttpPostedFileBase, like this. Can I somehow get a memory stream from an image URI?
HttpPostedFileBase hpf = Request.Files[file] as HttpPostedFileBase;
var imgFile = System.Drawing.Image.FromStream(hpf.InputStream, true, true);
CloudBlockBlob blob = coversContainer.GetBlockBlobReference("img.jpg");
MemoryStream stream = new MemoryStream();
imgFile.Save(stream, ImageFormat.Jpeg);
stream.Position = 0;
blob.UploadFromStream(stream);
So this is how I managed to get it done:
public static Image DownloadRemoteImage(string url)
{
HttpWebRequest request = (HttpWebRequest)WebRequest.Create(url);
HttpWebResponse response;
try
{
response = (HttpWebResponse)request.GetResponse();
}
catch (Exception)
{
return null;
}
// Check that the remote file was found. The ContentType
// check is performed since a request for a non-existent
// image file might be redirected to a 404-page, which would
// yield the StatusCode "OK", even though the image was not
// found.
if ((response.StatusCode == HttpStatusCode.OK ||
response.StatusCode == HttpStatusCode.Moved ||
response.StatusCode == HttpStatusCode.Redirect) &&
response.ContentType.StartsWith("image", StringComparison.OrdinalIgnoreCase))
{
// if the remote file was found, download it
Stream inputStream = response.GetResponseStream();
Image img = Image.FromStream(inputStream);
return img;
}
else
{
return null;
}
}
This code snippet was taken and modified from this question's answer:
Download image from the site in .NET/C#
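As a more current alternative (an assumption on my part, not part of the original answer), HttpClient can copy the remote image into a MemoryStream and upload it without going through System.Drawing, roughly like this (reusing the coversContainer from the question):
// Sketch (assumptions: an HttpClient instance is available and coversContainer is the
// same CloudBlobContainer as in the question).
public static async Task UploadImageFromUriAsync(HttpClient httpClient, CloudBlobContainer coversContainer, string url)
{
    using (var response = await httpClient.GetAsync(url))
    {
        response.EnsureSuccessStatusCode();
        using (var stream = new MemoryStream())
        {
            await response.Content.CopyToAsync(stream);
            stream.Position = 0;
            CloudBlockBlob blob = coversContainer.GetBlockBlobReference("img.jpg");
            await blob.UploadFromStreamAsync(stream);
        }
    }
}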
I read a large JSON file, change certain things, and then write it back to disk:
using (var reader = new StreamReader(filePath))
{
var log = (JObject)JToken.ReadFrom(new JsonTextReader(reader));
//log = UpdateOneLog(log);
using (var writer = new StreamWriter(updateFilePath))
{
log.WriteTo(new JsonTextWriter(writer));
writer.Close();
}
reader.Close();
}
or even
JObject o1 = JObject.Parse(File.ReadAllText(inputFile));
File.WriteAllText(outputFile, o1.ToString());
Weird things happen for certain files, and I believe it has something to do with file size. The date-time should be "startedDateTime":"2013-01-17T11:00:40.000-06:00", but it gets written to the file as "startedDateTime":"2013-01-17T11:00:40-06:00" (note that the fractional seconds "000" are missing). I even commented out my update logic, as shown above. All I am doing is reading the file and writing it back, but the date gets garbled.
Am I doing something wrong?
-Stan
For reasons still not clear to me (this is probably a bug), Json.Net sometimes handles the millisecond portion of a date/time incorrectly. So, for example, in the string "2013-01-17T11:00:40.230-06:00" the millisecond portion "230" gets dropped and the string becomes "2013-01-17T11:00:40-06:00", which no longer matches the source. The workaround that I found is to loop through all tokens when saving and replace the milliseconds with some fixed value, as shown below.
[TestMethod]
public void LoadAndSave()
{
var directory = @"..\..\Files";
var inputFile = Path.Combine(directory, "LargeFile.har");
var outputFile = Path.Combine(directory, "LargeFileResult.har");
if (File.Exists(outputFile))
File.Delete(outputFile);
StreamWriter sw = null;
JsonTextWriter jTextWriter = null;
StreamReader sr = null;
JsonTextReader jTextReader = null;
try
{
sw = new StreamWriter(outputFile);
jTextWriter = new JsonTextWriter(sw);
sr = new StreamReader(inputFile);
jTextReader = new JsonTextReader(sr);
while (jTextReader.Read())
{
var tokenType = jTextReader.TokenType;
var tokenValue = jTextReader.Value;
var tokenString = jTextReader.Value as string;
switch (tokenType)
{
case JsonToken.Boolean:
case JsonToken.Bytes:
case JsonToken.Float:
case JsonToken.Integer:
case JsonToken.String:
jTextWriter.WriteValue(tokenValue);
break;
case JsonToken.Comment:
jTextWriter.WriteComment(tokenString);
break;
case JsonToken.Date:
DateTime date = (DateTime)tokenValue;
DateTime dateWrite = new DateTime(date.Year, date.Month, date.Day, date.Hour, date.Minute, date.Second, 100, date.Kind);
jTextWriter.WriteValue(dateWrite);
break;
case JsonToken.EndArray:
jTextWriter.WriteEndArray();
break;
case JsonToken.EndConstructor:
jTextWriter.WriteEndConstructor();
break;
case JsonToken.EndObject:
jTextWriter.WriteEndObject();
break;
case JsonToken.None:
break;
case JsonToken.Null:
jTextWriter.WriteNull();
break;
case JsonToken.PropertyName:
jTextWriter.WritePropertyName(tokenString);
break;
case JsonToken.Raw:
jTextWriter.WriteRaw(tokenString);
break;
case JsonToken.StartArray:
jTextWriter.WriteStartArray();
break;
case JsonToken.StartConstructor:
jTextWriter.WriteStartConstructor(tokenString);
break;
case JsonToken.StartObject:
jTextWriter.WriteStartObject();
break;
case JsonToken.Undefined:
jTextWriter.WriteUndefined();
break;
default:
break;
}
}
}
finally
{
jTextReader.Close();
sr.Close();
jTextWriter.Close();
sw.Close();
}
}
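A possibly simpler workaround (my assumption, not part of the original answer): tell Json.NET not to parse date strings at all, so they round-trip exactly as they appear in the source file.
// Sketch: with DateParseHandling.None the reader keeps dates as plain strings,
// so "2013-01-17T11:00:40.000-06:00" is written back unchanged.
using (var reader = new StreamReader(inputFile))
using (var jsonReader = new JsonTextReader(reader) { DateParseHandling = DateParseHandling.None })
{
    var log = (JObject)JToken.ReadFrom(jsonReader);
    // ... apply updates to log here ...
    using (var writer = new StreamWriter(outputFile))
    using (var jsonWriter = new JsonTextWriter(writer))
    {
        log.WriteTo(jsonWriter);
    }
}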
I am using the code below to upload an image file to a SharePoint document library. The code works fine locally, but once deployed to the server I get a "file not found" exception.
String fileToUpload = FlUpldImage.PostedFile.FileName; //#"C:\Users\admin.RSS\Desktop\Photos\me_skype.jpg";
String documentLibraryName = "SiteAssets";
if (!System.IO.File.Exists(fileToUpload))
throw new FileNotFoundException("File not found.", fileToUpload);
SPFolder myLibrary = web.Folders[documentLibraryName];
// Prepare to upload
Boolean replaceExistingFiles = true;
String fileName = CheckStringNull(txtFirstName.Text) + CheckStringNull(txtLastName.Text) + CheckDateNull(txtDOB) + System.IO.Path.GetFileName(fileToUpload);
if (fileName.Contains('/'))
{
fileName = fileName.Replace("/", "");
}
if (fileName.Contains(':'))
{
fileName = fileName.Replace(":", "");
}
FileStream fileStream = File.OpenRead(fileToUpload);
//Upload document
SPFile spfile = myLibrary.Files.Add(fileName, fileStream, replaceExistingFiles);
string url = site.ToString() + "/" + spfile.ToString();
if (url.Contains("="))
{
url = url.Split('=')[1];
}
//Commit
myLibrary.Update();
The string fileToUpload contains the path C:\Users\admin.RSS\Desktop\Photos\me.jpg. This path is on the client machine, and the server-side code throws a file-not-found exception. How do I handle this issue?
UPDATE:
I removed the lines of code that check whether the file exists, and now I get the exception on FileStream fileStream = File.OpenRead(fileToUpload); saying that c:\windows\system32\inetsrv\20120605_133145.jpg could not be found.
Kindly help. Thank You
if (this.fuAvatarUpload.HasFile && this.fuAvatarUpload.PostedFile.FileName.Length > 0)
{
string extension = Path.GetExtension(file.FileName).ToLower();
string mimetype;
switch (extension)
{
case ".png":
case ".jpg":
case ".gif":
mimetype = file.ContentType;
break;
default:
_model.ShowMessage("We only accept .png, .jpg, and .gif!");
return;
}
if (file.ContentLength / 1000 < 1000)
{
Image image = Image.FromStream(file.InputStream);
Bitmap resized = new Bitmap(image, 150, 150);
byte[] byteArr = new byte[file.InputStream.Length];
using (MemoryStream stream = new MemoryStream())
{
resized.Save(stream, System.Drawing.Imaging.ImageFormat.Png);
byteArr = stream.ToArray();
}
profile.ImageUrl = byteArr;
profile.UseGravatar = false;
profileService.UpdateProfile(profile);
this._model.ShowApprovePanel();
}
else
{
_model.ShowMessage("The file you uploaded is larger than the 1mb limit. Please reduce the size of your file and try again.");
}
}
Saving the file physically onto the server and then working with that copy helped me resolve my issue.
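For reference, a minimal sketch of that approach (hypothetical upload folder; FlUpldImage, myLibrary, fileName and replaceExistingFiles as in the question): save the posted file under the web application first, then open that server-side copy for the SharePoint upload.
// Sketch (assumption, not the poster's exact code): work with a server-side copy,
// never with the client's original path.
string uploadFolder = Server.MapPath("~/App_Data/Uploads");   // hypothetical folder
Directory.CreateDirectory(uploadFolder);
string serverPath = Path.Combine(uploadFolder, Path.GetFileName(FlUpldImage.PostedFile.FileName));
FlUpldImage.PostedFile.SaveAs(serverPath);

using (FileStream fileStream = File.OpenRead(serverPath))
{
    SPFile spfile = myLibrary.Files.Add(fileName, fileStream, replaceExistingFiles);
    myLibrary.Update();
}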