ASP.NET Web API Temp File Operation Speed-Up

I'm hosting a Web API app on Azure and I'm noticing some latency in a file operation.
What I'm doing is:
Post the file
Write the file to disk
Execute operations on the file
The problem is that it takes a LOT of time simply to get hold of the file. This is my code:
[HttpPost]
[Route("api/Test/Debug")]
public IHttpActionResult Debug()
{
    var sw = Stopwatch.StartNew();

    // Grab the posted file from the classic ASP.NET request
    var httpRequest = HttpContext.Current.Request;
    var postedFile = httpRequest.Files[0];

    return Ok(sw.ElapsedMilliseconds);
}
As you can see it's really simple, but it can still take around 3 seconds just to get the postedFile element.
Is there any way to optimize this? Any other way to increase performance? Is the file already stored in some temp directory, so that I can access it without having to write it to disk myself and then delete it?
Thanks in advance.
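One approach that might be worth trying (a sketch, assuming Web API 2 on the .NET Framework; the route and target folder are illustrative): read the multipart body with a MultipartFormDataStreamProvider, which streams each part straight to a file on disk, rather than going through HttpContext.Current.Request.Files, which only becomes available once the whole request has been buffered:

[HttpPost]
[Route("api/Test/DebugStream")]
public async Task<IHttpActionResult> DebugStream()
{
    var sw = Stopwatch.StartNew();

    // Stream the multipart body directly into App_Data (folder is illustrative)
    var root = HttpContext.Current.Server.MapPath("~/App_Data");
    var provider = new MultipartFormDataStreamProvider(root);
    await Request.Content.ReadAsMultipartAsync(provider);

    foreach (var file in provider.FileData)
    {
        // file.LocalFileName is the temp file on disk, ready for processing
    }

    return Ok(sw.ElapsedMilliseconds);
}

Each entry in provider.FileData points at a file that has already been written once, so there is no second write-then-delete step.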

Related

Can InputStreamResource be used instead of temporary file for async file upload to sftp server

I have a Spring Integration flow which uploads files to an SFTP server asynchronously; the files to be uploaded come from an HTTP endpoint. Initially I faced the same problem as discussed here; glad it got solved.
In the same SO thread I found this comment.
In enterprise environments, you often have files of sizes you cannot afford to buffer into memory like that. Sadly enough, InputStreamResource won't work either. Your best bet, as far as I could tell so far, is to copy contents to an own temp file (e.g. File#createTempFile) which you can clean up at the end of the processing thread.
Currently I'm wrapping the file's InputStream in an InputStreamResource to get around the problem, and it's working flawlessly. Why does the commenter say InputStreamResource won't work either? AFAIK an InputStream doesn't hold the whole payload in memory.
Does the InputStreamResource's InputStream get closed automatically after the file upload?
When we say "large file", what file size are we talking about here? Currently, in my case, files of 2-5 MB are uploaded to the SFTP server.
Do I really need to change my file upload mechanism to something like storing the file in a temp folder?
Code Sample:
@PostMapping("/upload")
public void sampleEndpoint(@NotEmpty @RequestParam MultipartFile file)
        throws IOException {
    Resource resource = new InputStreamResource(file.getInputStream());
    sftpFileService.upload(resource);
}
SftpFileService Async upload method:
@Async
public void upload(Resource resource) {
    try {
        messagingGateway.upload(resource);
    } catch (Exception e) {
        e.printStackTrace();
    }
}
2-5 MB is probably not a size to worry about. The problem would appear with files in the 1-2 GB range, although you may hit out-of-memory errors when several concurrent uploads happen against your service.
The InputStreamResource is just a decorator around an InputStream, exposing the Resource API for access to the underlying delegate stream. It is not clear how it can work in an async environment, since the MultipartFile is deleted at the end of the HTTP upload request.
Plus, you don't show any code that would help in understanding the situation better...

Using ffmpeg in ASP.NET

I needed an audio conversion library. After pulling my hair out, I have given up on finding one: every library out there has some problem or other.
The only option left is ffmpeg, which is the best, but unfortunately you cannot use it in ASP.NET (not directly, I mean). Every user on the website that converts a file will launch an exe; I think I will hit the server's memory limit soon.
Bottom line: I will try using ffmpeg.exe and see how many users it can support simultaneously.
I went to the ffmpeg website, and in the Windows download section I found 3 different versions: static, shared and dev.
Does anyone know which would be best with respect to using it from ASP.NET? Everything packed into one exe (static), or a small exe with separate DLLs (shared)?
PS: if anyone has a good library out there, it would be great if you could share.
Static builds provide one self-contained .exe file for each program (ffmpeg, ffprobe, ffplay).
Shared builds provide each library as a separate .dll file (avcodec, avdevice, avfilter, etc.), and .exe files for each program that depend on those libraries.
Dev packages provide the headers and .lib/.dll.a files required to use the .dll files in other programs.
ffmpeg is the best library out there from what I have used, but I wouldn't recommend trying to call it directly from ASP.NET.
What I have done is accept the upload and store it on the server (or S3 in my case), then have a worker role (if using something like Azure) running a process that continuously monitors for new files to convert.
If you need a realtime-like solution, you can update flags in your database and have an AJAX solution poll the database to keep providing progress updates, then show a link to download once the conversion is complete.
Personally, my approach would be:
Azure Web Roles
Azure Worker Role
ServiceBus
The WorkerRole starts up and monitors the ServiceBus queue for messages.
The ASP.NET site uploads and stores the file in S3 or Azure.
The ASP.NET site then records information in your DB if needed and sends a message to the ServiceBus queue.
The WorkerRole picks this up and converts.
AJAX will be needed on the ASP.NET site if you want a realtime monitoring solution. Otherwise you could send an email when complete, if needed.
Using a queuing process also helps you with load: when you are under heavy load, people just wait a little longer and nothing grinds to a halt. You can also scale out your worker roles as needed to balance the load, should it ever become too much for one server.
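A minimal sketch of that queue hand-off using the classic Azure Service Bus SDK (Microsoft.ServiceBus.Messaging); the connection string variable, queue name and blob path are placeholders:

// Both roles share a queue client (connection string and queue name are placeholders)
QueueClient client = QueueClient.CreateFromConnectionString(connectionString, "convert-queue");

// Web role: after storing the upload, enqueue a pointer to it
client.Send(new BrokeredMessage("uploads/video123.avi"));

// Worker role: receive messages and run the conversion
client.OnMessage(message =>
{
    string blobPath = message.GetBody<string>();
    // Download the file, run ffmpeg against it, store the result, update the DB
});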
Here is how I run ffmpeg from C# (you will need to change the parameters for your requirements). Note that params is a reserved word in C#, so the variable needs another name:

string ffmpegArgs = string.Format("-i {0} -s 640x360 {1}", input.Path, "C:\\FilePath\\file.mp4");
RunProcess(ffmpegArgs);
private string RunProcess(string parameters)
{
    // Create the process info: no shell, no window, capture both output streams
    ProcessStartInfo oInfo = new ProcessStartInfo(this._ffExe, parameters);
    oInfo.UseShellExecute = false;
    oInfo.CreateNoWindow = true;
    oInfo.RedirectStandardOutput = true;
    oInfo.RedirectStandardError = true;

    // Collect everything the process writes to stdout and stderr
    StringBuilder output = new StringBuilder();

    try
    {
        // Run the process and read its output asynchronously
        using (Process proc = Process.Start(oInfo))
        {
            proc.OutputDataReceived += (s, e) => { if (e.Data != null) output.AppendLine(e.Data); };
            proc.ErrorDataReceived += (s, e) => { if (e.Data != null) output.AppendLine(e.Data); };
            proc.BeginOutputReadLine();
            proc.BeginErrorReadLine();
            proc.WaitForExit();
        }
    }
    catch (Exception)
    {
        // Capture/log the error here
    }
    return output.ToString();
}

Adobe AIR HTTP Connection Limit

I'm working on an Adobe AIR application which can upload files to a web server running Apache and PHP. Several files can be uploaded at the same time, and the application also calls the web server for various API requests.
The problem I'm having is that if I start two file uploads, any other HTTP requests made while they are in progress will time out, which causes problems both for the application and from the user's point of view.
Are Adobe AIR applications limited to 2 HTTP connections, or is something else likely the issue?
From searching about this issue I've not found much, but one article did indicate that AIR isn't limited to just two connections.
The file uploads are performed by calling the File class's upload method, and the API calls are done using the HTTPService class. The development web server I am using is a WAMP server; however, when the application is released it will be talking to a LAMP server.
Thanks,
Grant
Here is the code I'm using to upload the file:
protected function btnAddFile_clickHandler(event:MouseEvent):void
{
    // Create a new File object and display the browse file dialog
    var uploadFile:File = new File();
    uploadFile.browseForOpen("Select File to Upload");
    uploadFile.addEventListener(Event.SELECT, uploadFile_SelectedHandler);
}

private function uploadFile_SelectedHandler(event:Event):void
{
    // Get the File object which was used to select the file
    var uploadFile:File = event.target as File;
    uploadFile.addEventListener(ProgressEvent.PROGRESS, file_progressHandler);
    uploadFile.addEventListener(IOErrorEvent.IO_ERROR, file_ioErrorHandler);
    uploadFile.addEventListener(Event.COMPLETE, file_completeHandler);

    // Create the request URL based on the download URL
    var requestURL:URLRequest = new URLRequest(AppEnvironment.instance.serverHostname + "upload.php");
    requestURL.method = URLRequestMethod.POST;

    // Set the post parameters
    var params:URLVariables = new URLVariables();
    params.name = "filename.ext";
    requestURL.data = params;

    // Start uploading the file to the server
    uploadFile.upload(requestURL, "file");
}
Here is the code for the API calls:
private function sendHTTPPost(apiFile:String, postParams:Object, resultCallback:Function, initialCallerResultCallback:Function):void
{
    var httpService:mx.rpc.http.HTTPService = new mx.rpc.http.HTTPService();
    httpService.url = AppEnvironment.instance.serverHostname + apiFile;
    httpService.method = "POST";
    httpService.requestTimeout = 10;
    httpService.resultFormat = HTTPService.RESULT_FORMAT_TEXT;
    httpService.addEventListener("result", resultCallback);
    httpService.addEventListener("fault", httpFault);
    var token:AsyncToken = httpService.send(postParams);

    // Add the initial caller's result callback function to the token
    token.initialCallerResultCallback = initialCallerResultCallback;
}
If you are on a Windows system, Adobe AIR uses Microsoft's WinInet library to access the web. By default, this library limits the number of concurrent connections to a single server to 2:
WinInet limits the number of simultaneous connections that it makes to a single HTTP server. If you exceed this limit, the requests block until one of the current connections has completed. This is by design and is in agreement with the HTTP specification and industry standards.
... Connections to a single HTTP 1.1 server are limited to two simultaneous connections
There is an API to change this limit, but I don't know whether it is accessible from AIR.
Since this limit also affects page loading speed for web sites, some sites use multiple DNS names for artifacts such as images, JavaScript files and stylesheets, allowing a browser to open more parallel connections.
So if you control the server side, a workaround could be to create DNS aliases, e.g. www.example.com for uploads and api.example.com for API requests.
So as I was looking into this, I came across this info about using File.upload() in the documentation:
Starts the upload of the file to a remote server. Although Flash Player has no restriction on the size of files you can upload or download, the player officially supports uploads or downloads of up to 100 MB. You must call the FileReference.browse() or FileReferenceList.browse() method before you call this method.
Listeners receive events to indicate the progress, success, or failure of the upload. Although you can use the FileReferenceList object to let users select multiple files for upload, you must upload the files one by one; to do so, iterate through the FileReferenceList.fileList array of FileReference objects.
The FileReference.upload() and FileReference.download() functions are
nonblocking. These functions return after they are called, before the
file transmission is complete. In addition, if the FileReference
object goes out of scope, any upload or download that is not yet
completed on that object is canceled upon leaving the scope. Be sure
that your FileReference object remains in scope for as long as the
upload or download is expected to continue.
I wonder if something there could be giving you issues with uploading multiple files. I see that you are using browseForOpen() instead of browse(). It seems like they probably do the same thing... but maybe not.
I also saw this in the File class documentation
Note that because of new functionality added to the Flash Player, when publishing to Flash Player 10, you can have only one of the following operations active at one time: FileReference.browse(), FileReference.upload(), FileReference.download(), FileReference.load(), FileReference.save(). Otherwise, Flash Player throws a runtime error (code 2174). Use FileReference.cancel() to stop an operation in progress. This restriction applies only to Flash Player 10. Previous versions of Flash Player are unaffected by this restriction on simultaneous multiple operations.
When you say that you let users upload multiple files, do you mean subsequent calls to browse() and upload(), or one call that includes multiple files? It seems that making multiple separate calls may be the issue.
Anyway, I don't know if this is much help. It definitely seems that what you are trying to do should be possible. I can only guess that something in the implementation is going wrong. Good luck :)
Reference: http://help.adobe.com/en_US/FlashPlatform/reference/actionscript/3/flash/net/FileReference.html#upload()
http://help.adobe.com/en_US/FlashPlatform/reference/actionscript/3/flash/net/FileReference.html#browse()
Because I was looking into a very similar question due to an error in one of my own apps, I decided to write down the answer I found.
I instantiated 11 HTTP connections and was wondering why my Flex 4 application stopped working and threw an HTTP error, although it had been working well before with just 5 simultaneous HTTP connections to the same server.
I tested this myself because I did not find anything about it in the Flex docs or on the internet.
I found that using more than 5 HTTP connections was the reason for the Flex application to throw the runtime error.
As a temporary workaround I decided to instantiate the connections one after another: load the next one after the previous one has received its data, and so on.
That's of course only temporary, since one of the next steps will be to alter the server code so that a single response contains the results of queries against more than one table. Of course, the client application logic needs to be altered too.

How do I use a SQL CLR Method to call a Webpage?

I have an ASP.NET MVC application that takes a while to load on my production server. I would like to write a script to call my pages every 10 minutes to prevent the pages from being scrapped on the server, which would then cause the server to reload them.
I was thinking of using a SQL Server stored procedure to call my pages every 10 minutes to keep them alive.
I've read that I can do this using the CLR, but I am not sure how. Does anyone have an example of how to call a webpage from a SQL stored procedure using the CLR?
I have no idea why you would want to use a stored procedure for this.
Just write a simple console application to "call" the page. Then use a scheduled task to run the console application.
The code is untested, but something like this should work.
You'll probably also need TRUSTWORTHY set:
ALTER DATABASE Foo SET TRUSTWORTHY ON;
Code:
public partial class WebProc
{
    [SqlFunction()]
    public static string WebQuery()
    {
        // Fetch the page and return its contents as a string
        WebRequest request = WebRequest.Create("http://www.google.com");
        using (WebResponse response = request.GetResponse())
        using (Stream dataStream = response.GetResponseStream())
        using (StreamReader reader = new StreamReader(dataStream))
        {
            return reader.ReadToEnd();
        }
    }
}
I have no idea why you would want to use a stored procedure for this.
Just write a simple console application to "call" the page. Then use a scheduled task to run the console application.
You could do that (of course) but that's not what the question was asking ;)
Jafin's answer is correct for the question.
The best answer (IMHO) would be to fix your production server (i.e. reconfigure the settings) so that it doesn't "scrap" your pages every 10 minutes.
Why use a jackhammer (a console app, .NET inside the database, or whatever) when a regular hammer will do?
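For completeness, the console-app route mentioned above could be as small as this (a sketch; the URL is a placeholder), scheduled via Windows Task Scheduler to run every 10 minutes:

using System;
using System.Net;

class KeepAlive
{
    static void Main()
    {
        using (WebClient client = new WebClient())
        {
            // Hitting the page keeps it warm on the server (URL is a placeholder)
            string html = client.DownloadString("http://example.com/slow-page");
            Console.WriteLine("Fetched {0} characters", html.Length);
        }
    }
}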

File Access Strategy in a Multi-Threaded Environment (Web App)

I have a file which is an XML representation of some data taken from a web service and cached locally within a web application. The idea is that this data is very static, but just might change. So I have set it up to be cached to a file, and put a monitor against it to check whether it has been deleted. Once deleted, the file will be refreshed from its source and rebuilt.
I am now running into problems though, because obviously in a multi-threaded environment it falls over when something tries to access the data while the file is still being read/written.
This confuses me, because I added an object to lock against, and it is always locked during reads/writes. It was my understanding that attempted access from other threads would be told to "wait" until the lock was released?
Just to let you know, I am really new to multi-threaded development, so I am totally willing to accept this is a screw-up on my part :)
Am I missing something?
What is the best file access strategy in a multi-threaded environment?
Edit
Sorry - I should have said this is using ASP.NET 2.0 :)
Here is the code I use to make sure a file is not locked by another process. It's not 100% foolproof, but it gets the job done most of the time:
/// <summary>
/// Blocks until the file is not locked any more.
/// </summary>
/// <param name="fullPath"></param>
bool WaitForFile(string fullPath)
{
    int numTries = 0;
    while (true)
    {
        ++numTries;
        try
        {
            // Attempt to open the file exclusively.
            using (FileStream fs = new FileStream(fullPath,
                FileMode.Open, FileAccess.ReadWrite,
                FileShare.None, 100))
            {
                fs.ReadByte();
                // If we got this far the file is ready
                break;
            }
        }
        catch (Exception ex)
        {
            Log.LogWarning(
                "WaitForFile {0} failed to get an exclusive lock: {1}",
                fullPath, ex.ToString());
            if (numTries > 10)
            {
                Log.LogWarning(
                    "WaitForFile {0} giving up after 10 tries",
                    fullPath);
                return false;
            }
            // Wait for the lock to be released
            System.Threading.Thread.Sleep(500);
        }
    }
    Log.LogTrace("WaitForFile {0} returning true after {1} tries",
        fullPath, numTries);
    return true;
}
Obviously you can tweak the timeouts and retries to suit your application. I use this to process huge FTP files that take a while to be written.
If you're locking on an object stored as a static, then the lock should work for all threads in the same application domain, but perhaps you need to upload a code sample so we can have a look at the offending lines.
That said, one thought would be to check whether IIS is configured to run in web garden mode (i.e. more than one process executing your application), which would break your locking logic. While you could fix such a situation with a mutex (see the sketch below), it would be easier to reconfigure your application to execute in a single process, although you'd be wise to check the performance before and after messing with the web garden settings, as it can potentially affect performance.
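A minimal sketch of the mutex route, in case the web garden has to stay (the mutex name is a placeholder); unlike lock, a named Mutex is visible across processes:

using (Mutex mutex = new Mutex(false, @"Global\MyAppCacheFileLock"))
{
    // Serialize access across every worker process in the garden
    mutex.WaitOne();
    try
    {
        // Read or rebuild the cached XML file here
    }
    finally
    {
        mutex.ReleaseMutex();
    }
}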
You could maybe create the file with a temporary name ("data.xml_TMP"), and when it's ready rename it to what it is supposed to be. That way, no other process will access it before it is ready.
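A rough sketch of that idea (the paths and the xml variable are placeholders):

string finalPath = @"C:\Cache\data.xml";
string tempPath = finalPath + "_TMP";

// Build the new version under the temporary name first
File.WriteAllText(tempPath, xml);

// Swap it in only once it is complete; File.Move won't overwrite, so remove the old copy
if (File.Exists(finalPath))
    File.Delete(finalPath);
File.Move(tempPath, finalPath);

Note that the delete/move pair still leaves a brief window with no file present; File.Replace can narrow that window when the destination already exists.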
OK, I have been working on this and ended up creating a stress-test module to basically hammer the crap out of my code from several threads (see the related question).
It was much easier from this point on to find the holes in my code. It turns out that my code wasn't actually far off, but there was a certain logic path it could enter which basically caused read/write operations to stack up, meaning that if they didn't get cleared in time, it would go boom!
Once I took that out and ran my stress test again, everything worked fine!
So, I didn't really do anything special in my file access code, just made sure I used lock statements where appropriate (i.e. when reading or writing).
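In other words, something along these lines (a minimal sketch; the names are illustrative):

private static readonly object _fileLock = new object();

public string ReadCache(string path)
{
    // All readers and writers funnel through the same static lock object
    lock (_fileLock)
    {
        return File.ReadAllText(path);
    }
}

public void WriteCache(string path, string contents)
{
    lock (_fileLock)
    {
        File.WriteAllText(path, contents);
    }
}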
How about using an AutoResetEvent to communicate between threads? I created a console app which creates a roughly 8 GB file in the createFile method and then copies that file in the Main method:
static AutoResetEvent waitHandle = new AutoResetEvent(false);
static string filePath = @"C:\Temp\test.txt";
static string fileCopyPath = @"C:\Temp\test-copy.txt";

static void Main(string[] args)
{
    Console.WriteLine("in main method");
    Console.WriteLine();
    Thread thread = new Thread(createFile);
    thread.Start();
    Console.WriteLine("waiting for file to be processed ");
    Console.WriteLine();
    waitHandle.WaitOne();
    Console.WriteLine();
    File.Copy(filePath, fileCopyPath);
    Console.WriteLine("file copied ");
}

static void createFile()
{
    FileStream fs = File.Create(filePath);
    Console.WriteLine("start processing a file " + DateTime.Now);
    Console.WriteLine();
    using (StreamWriter sw = new StreamWriter(fs))
    {
        for (long i = 0; i < 300000000; i++)
        {
            sw.WriteLine("The value of i is " + i);
        }
    }
    Console.WriteLine("file processed " + DateTime.Now);
    Console.WriteLine();
    waitHandle.Set();
}
