Asynchronous multipart uploads to Amazon S3 with ASP.NET

I am able to initiate asynchronous uploads to S3, but the parts never end up as a file inside my S3 bucket, and I see the error 'WithPartETags cannot be empty'. Here is the complete code:
InitiateMultipartUploadRequest initRequest =
    new InitiateMultipartUploadRequest()
    .WithBucketName(existingBucketName)
    .WithKey(Path.Combine(S3Path + "/", finfo.Name));
InitiateMultipartUploadResponse initResponse =
    s3Client.InitiateMultipartUpload(initRequest);

// 2. Upload Parts.
long contentLength = finfo.Length;
long partSize = 15728640; // 15 MB (5242880 = 5 MB, 52428800 = 50 MB, 104857600 = 100 MB)
try
{
    long filePosition = 0;
    for (int i = 1; filePosition < contentLength; i++)
    {
        // Create request to upload a part.
        UploadPartRequest uploadRequest = new UploadPartRequest()
            .WithBucketName(existingBucketName)
            .WithKey(Path.Combine(S3Path + "/", finfo.Name))
            .WithUploadId(initResponse.UploadId)
            .WithPartNumber(i)
            .WithPartSize(partSize)
            .WithFilePosition(filePosition)
            .WithFilePath(finfo.FullName);

        // Upload part and add response to our list.
        //uploadResponses.Add(s3Client.UploadPart(uploadRequest));
        IAsyncResult ar = s3Client.BeginUploadPart(uploadRequest, null, null);
        ListObj.Add(new ThreadList() { _iasyncResult = ar });
        filePosition += partSize;
        Console.WriteLine("Length Written - " + filePosition + " .Content Length - " + contentLength);
    }

    bool uploadsComplete = false;
    while (!uploadsComplete)
    {
        bool individualuploadscomplete = true;
        foreach (var obj in ListObj)
        {
            if (!obj._iasyncResult.IsCompleted)
            {
                individualuploadscomplete = false;
                break;
            }
        }
        if (individualuploadscomplete)
        {
            uploadsComplete = true;
        }
    }

    foreach (var obj in ListObj)
    {
        s3Client.EndUploadPart(obj._iasyncResult);
    }

    // Step 3: complete.
    CompleteMultipartUploadRequest compRequest =
        new CompleteMultipartUploadRequest()
        .WithBucketName(existingBucketName)
        .WithKey(Path.Combine(S3Path + "/", finfo.Name))
        .WithUploadId(initResponse.UploadId);
        //.WithPartETags(uploadResponses);
    CompleteMultipartUploadResponse completeUploadResponse =
        s3Client.CompleteMultipartUpload(compRequest);

Not sure why you have the setting of the PartETags commented out for the complete multipart upload call, but you need to add that code back in. Also, when you call the EndUploadPart method, you need to capture the UploadPartResponse it returns; those responses carry the part ETags that the complete call requires.
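A minimal sketch of both fixes, assuming the same SDK version and the same variables (s3Client, ListObj, initResponse) as your snippet:

// Collect each part's response; EndUploadPart returns an UploadPartResponse
// whose ETag the completion step needs.
List<UploadPartResponse> uploadResponses = new List<UploadPartResponse>();
foreach (var obj in ListObj)
{
    uploadResponses.Add(s3Client.EndUploadPart(obj._iasyncResult));
}

// Pass the collected part ETags when completing the upload.
CompleteMultipartUploadRequest compRequest =
    new CompleteMultipartUploadRequest()
    .WithBucketName(existingBucketName)
    .WithKey(Path.Combine(S3Path + "/", finfo.Name))
    .WithUploadId(initResponse.UploadId)
    .WithPartETags(uploadResponses);
CompleteMultipartUploadResponse completeUploadResponse =
    s3Client.CompleteMultipartUpload(compRequest);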
You also might want to look into the TransferUtility found in the Amazon.S3.Transfer namespace. Its upload methods are designed to handle what you are attempting to accomplish for large objects, see Using the High-Level .NET API for Multipart Upload for details and example snippets.
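For comparison, a minimal TransferUtility sketch using the simple Upload(filePath, bucketName, key) overload, assuming the same s3Client and file variables as above:

// The high-level API handles part splitting, parallel part uploads, and the
// completion call (including the part ETags) internally.
TransferUtility transferUtility = new TransferUtility(s3Client);
transferUtility.Upload(finfo.FullName, existingBucketName,
    Path.Combine(S3Path + "/", finfo.Name));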

Related

How can we store an HTML page into SQLite in BlackBerry, on the memory card / phone memory?

The code below shows how we can make an HTTP connection in BlackBerry; how do I store the HTML page as a string?
I am able to send the HTTP request, but when I get the response (HTTP_OK) the content is not correct, so I cannot save the HTML text as a string and then store it into SQLite.
LabelField title = new LabelField("SQLite Create Database Sample",
        LabelField.ELLIPSIS | LabelField.USE_ALL_WIDTH);
setTitle(title);
add(new RichTextField("Creating a database."));
argURL = "https://www.google.com:80";
try {
    connDesc = connFact.getConnection(argURL);
    if (connDesc != null) {
        httpConn = (HttpConnection) connDesc.getConnection();
        // //Send Data on this connection
        // httpConn.setRequestMethod(HttpConnection.GET);
        // //Server Response
        StringBuffer strBuffer = new StringBuffer();
        inStream = httpConn.openInputStream();
        int chr;
        int retResponseCode = httpConn.getResponseCode();
        if (retResponseCode == HttpConnection.HTTP_OK) {
            if (inStream != null) {
                while ((chr = inStream.read()) != -1) {
                    strBuffer.append((char) chr);
                }
                serverResponceStr = strBuffer.toString();
                // appLe.alertForms.get_userWaitAlertForm().append("\n"+serverResponceStr);
                //returnCode = gprsConstants.retCodeSuccess;
            }
        } else {
            //returnCode = gprsConstants.retCodeNOK;
        }
    }
} catch (Exception excp) {
    //returnCode = gprsConstants.retCodeDisconn;
    excp.printStackTrace();
}
The code does not perform any database functionality, however I tested and it does successfully perform an HttpRequest to an external URL. The data that comes back is based on the response of the server you are making the request to.
The code I used can be found here:
http://snipt.org/vrl7
The only modification was to keep a running summary of various events; the response is displayed in the RichTextField. Basically, this looks to be working as intended, and the resulting String should be able to be saved however you see fit, though you may need to be cautious of encoding when saving to a database so that special characters are not lost or misinterpreted.

Raven DB DocumentStore - throws out of memory exception

I have code like this:
public bool Set(IEnumerable<WhiteForest.Common.Entities.Projections.RequestProjection> requests)
{
    using (var documentSession = _documentStore.OpenSession())
    {
        try
        {
            foreach (var request in requests)
            {
                documentSession.Store(request);
            }
            //requests.AsParallel().ForAll(x => documentSession.Store(x));
            documentSession.SaveChanges();
            return true;
        }
        catch (Exception e)
        {
            _log.LogDebug("Exception in RavenRequstRepository - Set. Exception is [{0}]", e.ToString());
            return false;
        }
    }
}
This code gets called many times. After around 50,000 documents have passed through it, I get an OutOfMemoryException.
Any idea why? Perhaps after a while I need to declare a new DocumentStore?
Thank you.
UPDATE:
I ended up using the Batch/Patch API to perform the update I needed.
You can see the discussion here: https://groups.google.com/d/topic/ravendb/3wRT9c8Y-YE/discussion
Basically, since I only needed to update one property on my objects, and after considering Ayende's comments about re-serializing all the objects back to JSON, I did something like this:
internal void Patch()
{
    List<string> docIds = new List<string>() { "596548a7-61ef-4465-95bc-b651079f4888", "cbbca8d5-be45-4e0d-91cf-f4129e13e65e" };
    using (var session = _documentStore.OpenSession())
    {
        session.Advanced.DatabaseCommands.Batch(GenerateCommands(docIds));
    }
}

private List<ICommandData> GenerateCommands(List<string> docIds)
{
    List<ICommandData> retList = new List<ICommandData>();
    foreach (var item in docIds)
    {
        retList.Add(new PatchCommandData()
        {
            Key = item,
            Patches = new[]
            {
                new Raven.Abstractions.Data.PatchRequest()
                {
                    Name = "Processed",
                    Type = Raven.Abstractions.Data.PatchCommandType.Set,
                    Value = new RavenJValue(true)
                }
            }
        });
    }
    return retList;
}
Hope this helps. Thanks a lot.
I just did this for my current project. I chunked the data into pieces and saved each chunk in a new session. This may work for you, too.
Note that this example chunks 1,024 documents at a time, and only bothers chunking once there are at least 2,000 documents. So far, my inserts get the best performance with a chunk size of 4,096; I think that's because my documents are relatively small.
internal static void WriteObjectList<T>(List<T> objectList)
{
    int numberOfObjectsThatWarrantChunking = 2000; // Don't bother chunking unless we have at least this many objects.
    if (objectList.Count < numberOfObjectsThatWarrantChunking)
    {
        // Just write them all at once.
        using (IDocumentSession ravenSession = GetRavenSession())
        {
            objectList.ForEach(x => ravenSession.Store(x));
            ravenSession.SaveChanges();
        }
        return;
    }

    int numberOfDocumentsPerSession = 1024; // Chunk size

    List<List<T>> objectListInChunks = new List<List<T>>();
    for (int i = 0; i < objectList.Count; i += numberOfDocumentsPerSession)
    {
        objectListInChunks.Add(objectList.Skip(i).Take(numberOfDocumentsPerSession).ToList());
    }

    Parallel.ForEach(objectListInChunks, listOfObjects =>
    {
        using (IDocumentSession ravenSession = GetRavenSession())
        {
            listOfObjects.ForEach(x => ravenSession.Store(x));
            ravenSession.SaveChanges();
        }
    });
}

private static IDocumentSession GetRavenSession()
{
    return _ravenDatabase.OpenSession();
}
Are you trying to save it all in one call?
The DocumentSession needs to turn all of the objects that you pass it into a single request to the server. That means it may allocate a lot of memory for the write to the server.
Usually we recommend batches of about 1,024 items if you are doing bulk saves.
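A minimal sketch of that batching advice against the code from the question (the batch size is illustrative; requests and _documentStore are as in the question):

// Save in batches of ~1,024 items, opening a fresh session per batch so no
// single session has to hold and serialize the whole set at once.
const int batchSize = 1024;
foreach (var batch in requests
    .Select((item, index) => new { item, index })
    .GroupBy(x => x.index / batchSize, x => x.item))
{
    using (var session = _documentStore.OpenSession())
    {
        foreach (var request in batch)
        {
            session.Store(request);
        }
        session.SaveChanges();
    }
}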
DocumentStore is a disposable class, so I worked around this problem by disposing the instance after each chunk. I highly doubt this is the most efficient way to run operations, but it will prevent significant memory overhead from happening.
I was running a sort of "delete all" operation like so. You can see the using blocks disposing both the DocumentStore and the IDocumentSession objects after each chunk.
static DocumentStore GetDataStore()
{
    DocumentStore ds = new DocumentStore
    {
        DefaultDatabase = "test",
        Url = "http://localhost:8080"
    };
    ds.Initialize();
    return ds;
}

static IDocumentSession GetDbInstance(DocumentStore ds)
{
    return ds.OpenSession();
}

static void Main(string[] args)
{
    int deleteCount = 0;
    int deleteSum = 0;
    do
    {
        using (var ds = GetDataStore())
        using (var db = GetDbInstance(ds))
        {
            // The `Take` operation will cap out at 1,024 by default, per Raven documentation
            var list = db.Query<MyClass>().Skip(deleteSum).Take(5000).ToList();
            deleteCount = list.Count;
            deleteSum += deleteCount;
            foreach (var item in list)
            {
                db.Delete(item);
            }
            db.SaveChanges();
            list.Clear();
        }
    } while (deleteCount > 0);
}

ASP.NET Backgroundworkers for spreadsheet creation: multiple ones interfering with each other?

I am writing an ASP.NET application in which I need to create multiple Excel reports. The report creation is pretty time-consuming (up to ten seconds for each), so I am using BackgroundWorkers to create them simultaneously.
My code looks a bit like this:
if (condition1)
{
    excel_file_name = "TRANSFER";
    BackgroundWorker worker_t = new BackgroundWorker();
    worker_t.DoWork += new DoWorkEventHandler(DoWork);
    worker_t.WorkerReportsProgress = false;
    worker_t.WorkerSupportsCancellation = true;
    worker_t.RunWorkerCompleted += new RunWorkerCompletedEventHandler(WorkerCompleted);
    worker_t.RunWorkerAsync(excel_file_name);
}
if (Condition2)
{
    excel_file_name = "NEFT";
    BackgroundWorker worker_n = new BackgroundWorker();
    worker_n.DoWork += new DoWorkEventHandler(DoWork);
    worker_n.WorkerReportsProgress = false;
    worker_n.WorkerSupportsCancellation = true;
    worker_n.RunWorkerCompleted += new RunWorkerCompletedEventHandler(WorkerCompleted);
    worker_n.RunWorkerAsync(excel_file_name);
}
There are more conditions, but I haven't written them since they are all similar; the only difference is the excel_file_name. The DoWork event then calls a class to create the Excel files with the given name.
When condition1 and condition2 are both true, here is the issue:
1. If I run this slowly using breakpoints during debugging, both files (TRANSFER and NEFT) are created.
2. If, however, I run it without breakpoints like a normal application, only the last file (NEFT in this example) is created.
What can be the issue?
Thanks
PS: For further information, here is the important code from the class that creates the Excel file:
private static string placeDataInTemplate(string destFilePath, DataRow dr, bool isCoverLetter)
{
    int loop = 0;
    ExcelNamespace.Application excelApplication = new ExcelNamespace.Application();
    ExcelNamespace.Workbook workbook = excelApplication.Workbooks.Open(destFilePath, 0, false, 5,
        "", "", true, ExcelNamespace.XlPlatform.xlWindows, "\t", false, false, 0, true, true, false);
    ExcelNamespace.Worksheet workSheet = (ExcelNamespace.Worksheet)workbook.Sheets[sheet_no];
    try
    {
        string value;
        string replicate;
        string replicate_end;
        // get data for Place Holders
        sDataTable dtPlaceHolderData = getPlaceHolderData(dr);
        // make Display Alerts False
        excelApplication.DisplayAlerts = false;
        if (dtPlaceHolderData != null && dtPlaceHolderData.Rows.Count > 0)
        {
            int rowCntDt = 0; // Which row will be used for data?
            int i = 1;
            Excel.Range Find = (ExcelNamespace.Range)workSheet.Cells.Find("#",
                (ExcelNamespace.Range)workSheet.Cells[1, 1],
                Excel.XlFindLookIn.xlValues,
                Excel.XlLookAt.xlPart,
                Excel.XlSearchOrder.xlByRows,
                Excel.XlSearchDirection.xlNext,
                false,
                false,
                Missing.Value);
            while (Find != null && loop <= 200)
            {
                loop++;
                value = Find.Value2.ToString();
                if (condition)
                    // VERY long if...else if
            }
            string approveDirPath = destFilePath.Replace(Path.GetFileName(destFilePath), string.Empty);
            workbook.Close(true, destFilePath, Type.Missing);
            excelApplication.Quit();
            string filepath = destFilePath.Split('-')[0];
            string approval_id = dr[0].ToString();
            return destFilePath;
        }
        return string.Empty;
    }
    catch (Exception ex)
    {
        // do something
    }
    finally
    {
        // release resources
    }
}
NOTE: I have removed a lot of needless code. I can paste it if needed. Thank you
The most likely cause is shared state between the two threads; shared state here may include the Excel application and workbooks, and, judging from the code shown, the excel_file_name and sheet_no fields. You need to inspect your code for that.
On a side note, instead of using Excel Automation to generate Excel files, you may consider using an in-process library, which would likely be more scalable and devoid of such issues. Have a look at one such free basic library on CodeProject.
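If DoWork currently reads the shared excel_file_name field rather than its argument, the last assignment wins once the workers truly run in parallel, which would match the symptom you describe. A minimal sketch of the safer pattern (CreateExcelReport is a hypothetical stand-in for your Excel-creating class), keeping all state local to each worker:

// Each worker gets its own file name via RunWorkerAsync(argument); no shared fields.
private void DoWork(object sender, DoWorkEventArgs e)
{
    // e.Argument is the string passed to RunWorkerAsync, e.g. "TRANSFER" or "NEFT".
    string excelFileName = (string)e.Argument;

    // Keep the Excel Application/Workbook objects local to this call so that
    // two workers never share COM objects or static fields.
    e.Result = CreateExcelReport(excelFileName); // hypothetical per-call creator
}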

DOS based printing through ASP.NET

Well my situation is like this:
I am generating a report as a text file at the server which needs to be printed in DOS mode on a dot matrix printer. I want to avoid Windows printing because it would be too slow. Is there a way in ASP.NET through which I can carry out DOS-based printing, as it is best suited for dot matrix printers? I have scoured the net but could not come across any solution or pointers. Does anybody have pointers or solutions they might have implemented or stumbled across?
This application is a Web-based application.
Thanks.
If I understand you right, one option is to execute from ASP.NET a batch file that would do the actual printing. From here (obviously, you can omit some of the code writing the output to the page):
// Get the full file path
string strFilePath = "c:\\temp\\test.bat";

// Create the ProcessInfo object
System.Diagnostics.ProcessStartInfo psi = new System.Diagnostics.ProcessStartInfo("cmd.exe");
psi.UseShellExecute = false;
psi.RedirectStandardOutput = true;
psi.RedirectStandardInput = true;
psi.RedirectStandardError = true;
psi.WorkingDirectory = "c:\\temp\\";

// Start the process
System.Diagnostics.Process proc = System.Diagnostics.Process.Start(psi);

// Open the batch file for reading
System.IO.StreamReader strm = System.IO.File.OpenText(strFilePath);

// Attach the output for reading
System.IO.StreamReader sOut = proc.StandardOutput;

// Attach the in for writing
System.IO.StreamWriter sIn = proc.StandardInput;

// Write each line of the batch file to standard input
while (strm.Peek() != -1)
{
    sIn.WriteLine(strm.ReadLine());
}
strm.Close();

// Exit CMD.EXE
string stEchoFmt = "# {0} run successfully. Exiting";
sIn.WriteLine(String.Format(stEchoFmt, strFilePath));
sIn.WriteLine("EXIT");

// Close the process
proc.Close();

// Read the sOut to a string.
string results = sOut.ReadToEnd().Trim();

// Close the io Streams;
sIn.Close();
sOut.Close();

// Write out the results.
string fmtStdOut = "<font face=courier size=0>{0}</font>";
this.Response.Write(String.Format(fmtStdOut, results.Replace(System.Environment.NewLine, "<br>")));
The answer from BobbyShaftoe is correct. Here's a pedantic version of it:
public static void CreateProcess(string strFilePath)
{
    // Create the ProcessInfo object
    var psi = new ProcessStartInfo("cmd.exe")
    {
        UseShellExecute = false,
        RedirectStandardOutput = true,
        RedirectStandardInput = true,
        RedirectStandardError = true,
        WorkingDirectory = "c:\\temp\\"
    };

    // Start the process
    using (var proc = Process.Start(psi))
    {
        // Attach the in for writing
        var sIn = proc.StandardInput;
        using (var strm = File.OpenText(strFilePath))
        {
            // Write each line of the batch file to standard input
            while (strm.Peek() != -1)
            {
                sIn.WriteLine(strm.ReadLine());
            }
        }

        // Exit CMD.EXE
        sIn.WriteLine(String.Format("# {0} run successfully. Exiting", strFilePath));
        sIn.WriteLine("EXIT");

        // Read the sOut to a string.
        var results = proc.StandardOutput.ReadToEnd().Trim();

        // Write out the results. (The method is static, so go through
        // HttpContext.Current rather than this.Response.)
        string fmtStdOut = "<font face=courier size=0>{0}</font>";
        HttpContext.Current.Response.Write(String.Format(fmtStdOut, results.Replace(System.Environment.NewLine, "<br>")));
    }
}
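As for what the batch file itself might do: for a dot matrix printer, DOS-style printing usually means copying the raw text straight to the printer port or share, bypassing the Windows print driver. A minimal sketch of doing that directly from C# (the lpt1 port name and file path are assumptions for illustration):

// Copy the raw report straight to the printer port (or a mapped share
// such as \\server\printer) so no Windows rendering is involved.
var psi = new System.Diagnostics.ProcessStartInfo("cmd.exe",
    "/c copy /b \"c:\\temp\\report.txt\" lpt1:")
{
    UseShellExecute = false,
    CreateNoWindow = true
};
using (var proc = System.Diagnostics.Process.Start(psi))
{
    proc.WaitForExit(); // wait so the file is not removed mid-print
}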

JSON WebService in ASP.NET

How do I create an ASP.NET web service that returns JSON formatted data?
The most important thing to understand is how to represent data in JSON format.
Please refer to http://www.json.org/ to learn more about it.
Once you understand this, the rest is pretty straightforward.
Please check the following URLs for examples:
http://www.ajaxprojects.com/ajax/tutorialdetails.php?itemid=264
http://code.msdn.microsoft.com/JSONSampleDotNet
http://www.phdcc.com/xml2json.htm
I recommend the jQuery library for this. It's a lightweight, rich library which supports calling web services, handling JSON-formatted output, etc.
Refer to www.jquery.com for more info.
.NET 3.5 has support built in. For .NET 2.0, extra libraries are needed; I used the Jayrock library.
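As an illustration of the built-in .NET 3.5 support, System.Runtime.Serialization.Json can serialize a type to JSON without any extra library. A minimal sketch (the Person type is invented for the example):

using System.IO;
using System.Runtime.Serialization;
using System.Runtime.Serialization.Json;

[DataContract]
public class Person
{
    [DataMember] public string Name { get; set; }
    [DataMember] public int Age { get; set; }
}

public static class JsonDemo
{
    public static string ToJson(Person p)
    {
        // DataContractJsonSerializer ships with .NET 3.5 (System.ServiceModel.Web).
        var serializer = new DataContractJsonSerializer(typeof(Person));
        using (var stream = new MemoryStream())
        {
            serializer.WriteObject(stream, p); // writes UTF-8 JSON
            return System.Text.Encoding.UTF8.GetString(stream.ToArray());
        }
    }
}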
I recently delivered an application that uses pure JavaScript in the browser (i.e., AJAX techniques, but not Microsoft AJAX or Scriptaculous, etc.) which marries up to Microsoft web services at the back end. When I started writing this I was new to the world of .NET and felt overwhelmed by all the frameworks out there, so I had an urge to use a collection of small libraries rather than very large frameworks.
In the JavaScript application, I call a web service like this: it directly reads the output of the web service, cuts away the non-JSON sections, then uses https://github.com/douglascrockford/JSON-js/blob/master/json2.js to parse the JSON object.
This is not a standard approach, but it is quite simple to understand and may be of value to you, either to use or just to learn about web services and JSON.
// enclosing html page has loaded this:
<script type="text/javascript" src="res/js/json2.js"></script>

// Invoke like this:
// var validObj = callAnyWebservice("WebServiceName", "");
// if (!validObj || validObj.returnCode != 0) {
//     alert("Document number " + DocId + " is not in the vPage database. Cannot continue.");
//     DocId = null;
// }
function callAnyWebservice(webserviceName, params) {
    var base = document.location.href;
    if (base.indexOf(globals.testingIPaddr) < 0) return;

    gDocPagesObject = null;
    var http = new XMLHttpRequest();
    var url = "http://mywebserver/appdir/WebServices.asmx/" + webserviceName;
    //alert(url + " " + params);
    http.open("POST", url, false);
    http.setRequestHeader("Host", globals.testingIPaddr);
    http.setRequestHeader('Content-Type', 'application/x-www-form-urlencoded');
    http.setRequestHeader("Content-Length", params.length);
    // http.setRequestHeader("Connection", "close");

    // Call a function when the state changes.
    http.onreadystatechange = function() {
        if (http.readyState == 4) {
            if (http.status == 200) {
                var JSON_text = http.responseText;
                var firstCurlyQuote = JSON_text.indexOf('{');
                JSON_text = JSON_text.substr(firstCurlyQuote);
                var lastCurlyQuote = JSON_text.lastIndexOf('}') + 1;
                JSON_text = JSON_text.substr(0, lastCurlyQuote);
                if (JSON_text != "") {
                    //if (DEBUG)
                    //    alert(url + " " + JSON_text);
                    gDocPagesObject = eval("(" + JSON_text + ")");
                }
            } else {
                alert(http.readyState + " " + http.status + " " + http.responseText);
            }
        }
    }
    http.send(params);
    if (gDocPagesObject != null) {
        //alert(gDocPagesObject.returnCode + " " + gDocPagesObject.returnString);
        return gDocPagesObject;
    }
    else
        return "web service unavailable: data not ready";
}
In our project the requirements were as follows: ASP.NET 2.0 on the server, and pure JavaScript in the browser (no jQuery libs or .NET AJAX).
In that case, on the server side, just mark the web method to use JSON (the service class itself also needs the [ScriptService] attribute for ASMX JSON endpoints). Note that both input and output params are JSON formatted:
[WebMethod]
[ScriptMethod(ResponseFormat = ResponseFormat.Json)]
public String Foo(String p1, String p2)
{
return "Result: p1= " + p1 + " p2= " + p2;
}
On the JavaScript side, use the regular XmlHttpRequest object, make sure you format your input params as JSON, and do an 'eval' on the output params.
var httpobj = getXmlHttpRequestObject();

// Gets the browser-specific XmlHttpRequest object
function getXmlHttpRequestObject()
{
    if (window.XMLHttpRequest)
        return new XMLHttpRequest();
    else if (window.ActiveXObject)
        return new ActiveXObject("Microsoft.XMLHTTP");
}

function CallService()
{
    // Set the JSON formatted input params
    var param = "{'p1' : 'value1', 'p2' : 'value2'}";
    // Send it to the webservice
    if (httpobj.readyState == 4 || httpobj.readyState == 0)
    {
        httpobj.open("POST", 'service.asmx/' + 'Foo', true);
        // Mark the request as JSON and UTF-8
        httpobj.setRequestHeader('Content-Type', 'application/json; charset=utf-8');
        httpobj.onreadystatechange = OnSuccess;
        httpobj.send(param);
    }
}

function OnSuccess()
{
    if (httpobj.readyState == 4)
    {
        // Retrieve the JSON return param
        var response = eval("(" + httpobj.responseText + ")");
    }
}
