I'm trying to use the WriteLine activity to write to an output file. I've written the expression for the TextWriter like this:
new System.IO.StreamWriter(@"C:\Users\Owner\Documents\Visual Studio 2012\test.txt")
And the Text like this:
"Hello"
Now, the actual file is created when the activity runs, but there's no text in the file; it's just a 0 KB text file. Let me know if you need additional information.
Your test is too simple: you are not closing the TextWriter, so the data in the buffers isn't flushed to disk. Try something like this (with the WriteLine activity's TextWriter set to tw, etc.):
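A minimal host-side sketch of the idea (my example code, not taken from the original answer): create the StreamWriter in a using block so it is flushed and closed once the workflow finishes.

using System;
using System.Activities;
using System.Activities.Statements;
using System.IO;

class Program
{
    static void Main()
    {
        // Dispose() at the end of the using block flushes the buffers to disk.
        using (TextWriter tw = new StreamWriter(@"C:\Users\Owner\Documents\Visual Studio 2012\test.txt"))
        {
            var workflow = new WriteLine
            {
                Text = "Hello",
                TextWriter = new InArgument<TextWriter>(ctx => tw)
            };
            WorkflowInvoker.Invoke(workflow);
        }
    }
}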
I have read the documentation for the File Writer filter:
" The File Writer filter can be used to write files to disc regardless of format. The filter simply writes to disc whatever it receives on its input pin, so it must be connected upstream to a multiplexer that can format the file correctly. You can create a new output file with the File Writer or specify an existing file; if the file already exists, it will be completely overwritten with the new data. "
So my question is:
I am using the File Writer filter to write my audio stream to disk. Before the file is finished being written, I want to access it. Is that possible, or should I make my own custom filter?
The File Writer filter does not provide options to change the file sharing mode while the file is being written to. Additionally, in most cases accessing the file before it is finalized makes no sense: files are rarely written incrementally, finalization changes data in the middle of the file, and accessing the data before the file is closed might get you a bad or incomplete stream.
Roman R is right. Writers are for writing. If you need to transform data, write your own Transform filter.
I'm a little stuck trying to upload files into our SQL DB using FILESTREAM. I've followed this example http://www.codeproject.com/Articles/128657/How-Do-I-Use-SQL-File-Stream but the difference is that we upload the file in 10 MB chunks.
On the first chunk a record is created in the DB with empty content (so that a file is created) and then OnUploadChunk is called for each chunk.
The file uploads okay, but when I check, a new file has been created for each chunk: for a 20 MB file, for example, I have one file which is 0 KB, another which is 10 MB, and the final one which is 20 MB. I'm expecting one file of 20 MB.
I'm guessing this is to do with getting the transaction context, or with incorrectly using TransactionScope, which I don't quite grasp yet. I presume the context may be different for each chunk as it goes back and forth between client and server.
Here is the method which is called every time a chunk is sent from the client (using Plupload, if that's of any relevance).
protected override bool OnUploadChunk(Stream chunkStream, string DocID)
{
    BinaryReader b = new BinaryReader(chunkStream);
    byte[] binData = b.ReadBytes((int)chunkStream.Length);

    using (TransactionScope transactionScope = new TransactionScope())
    {
        // Folder path the file is sitting in
        string FilePath = GetFilePath(DocID);

        // Gets size of file that has been uploaded so far
        long currentFileSize = GetCurrentFileSize(DocID);

        // Essentially this is just SELECT GET_FILESTREAM_TRANSACTION_CONTEXT()
        byte[] transactionContext = GetTransactionContext();

        SqlFileStream filestream = new SqlFileStream(FilePath, transactionContext, FileAccess.ReadWrite);
        filestream.Seek(currentFileSize, SeekOrigin.Begin);
        filestream.Write(binData, 0, binData.Length);
        filestream.Close();

        transactionScope.Complete();
    }

    return true;
}
UPDATE:
I've done a little research and I believe the issue is around this:
FILESTREAM does not currently support in-place updates. Therefore an update to a column with the FILESTREAM attribute is implemented by creating a new zero-byte file, which then has the entire new data value written to it. When the update is committed, the file pointer is then changed to point to the new file, leaving the old file to be deleted at garbage collection time. This happens at a checkpoint for simple recovery, and at a backup or log backup.
So have I just got to wait for the garbage collector to remove the chunked files? Or should I perhaps be uploading the file somewhere on the file system first and then copying it across?
Yes, you will have to wait for SQL Server to clean up the old files for you.
Unless you have other system constraints, you should be able to stream the entire file all at once. This will give you a single file on the SQL Server side.
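A sketch of what that could look like, reusing the question's own helpers GetFilePath and GetTransactionContext (the OnUploadComplete method name is hypothetical): the whole upload is written inside one transaction, so FILESTREAM creates a single file.

// Assumes: using System.Data.SqlTypes; using System.IO; using System.Transactions;
protected override bool OnUploadComplete(Stream fileStream, string DocID)
{
    using (TransactionScope transactionScope = new TransactionScope())
    {
        string filePath = GetFilePath(DocID);                // same helper as above
        byte[] transactionContext = GetTransactionContext(); // GET_FILESTREAM_TRANSACTION_CONTEXT()

        using (var sqlFileStream = new SqlFileStream(filePath, transactionContext, FileAccess.Write))
        {
            // Stream straight through without buffering the whole file in memory.
            fileStream.CopyTo(sqlFileStream);
        }

        transactionScope.Complete();
    }

    return true;
}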
I've written an ASHX generic handler to output XML. However, for some reason, ASP.NET is appending numerous whitespace characters to the end of the output, which breaks the XML.
My code looks like this:
context.Response.ContentType = "text/xml";
XmlSerializer oSerializer = new XmlSerializer(typeof(ModelXml[]), new XmlRootAttribute("rows"));
System.IO.MemoryStream ms2 = new System.IO.MemoryStream();
System.Xml.XmlTextWriter tw = new System.Xml.XmlTextWriter(ms2, new System.Text.UTF8Encoding());
oSerializer.Serialize(tw, models);
string s = System.Text.Encoding.UTF8.GetString(ms2.GetBuffer());
tw.Close();
ms2.Close();
context.Response.Write(s.Trim());
context.Response.End();
When I run this code through the debugger, I see that the string s does indeed contain the XML data with NO whitespace. However, when I point Internet Explorer at this file, I get the following error:
The XML page cannot be displayed
Cannot view XML input using XSL style sheet. Please correct the error and then click the Refresh button, or try again later.
--------------------------------------------------------------------------------
Invalid at the top level of the document. Error processing resource 'http://localhost:5791/XXXXX.ashx'.
When I view the page source in Notepad, I see that the file begins with the correct XML, but there are numerous spaces appended to the end. If I remove these spaces, the XML file works fine with my browser and applications.
Why is ASP.NET appending these spaces to my output, and what can I do about it?
Switch from ms2.GetBuffer() to ms2.ToArray(). GetBuffer() returns the MemoryStream's internal buffer, which is preallocated for efficiency and usually larger than the data actually written; ToArray() gives you just the used bytes, not the whole buffer.
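In the question's code that is a one-line change:

string s = System.Text.Encoding.UTF8.GetString(ms2.ToArray()); // ToArray() copies only the bytes written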
Instead of serializing to a MemoryStream, you should serialize directly to Response.Output.
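For example, a minimal sketch keeping the question's type and variable names:

// Response.Output is a TextWriter, so XmlSerializer can write to it directly.
context.Response.ContentType = "text/xml";
XmlSerializer oSerializer = new XmlSerializer(typeof(ModelXml[]), new XmlRootAttribute("rows"));
oSerializer.Serialize(context.Response.Output, models);
context.Response.End();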
This should solve the issue.
I have to create a tab-delimited .txt file from a query.
I want to call an HttpHandler that returns my .txt file as a stream; I don't want to create the file physically.
1st question:
what is the best practice to create the tab-delimited .txt file from a query result?
Do I have to fetch all rows and build the file content manually?
2nd question:
How to set a timeout for the HttpHandler that creates the file?
Thanks for your time.
I would create a plain old HTTP output stream and change the content type to 'text/plain', which means you don't need to physically create the file on the web server. If you add a Content-Disposition header to the output and specify an attachment called something like 'report.txt', the user will be prompted to Open or Save the content rather than just viewing it in the browser like a normal web page.
You can set the script timeout with Server.ScriptTimeout = x (a value in seconds) by gaining access to the current HttpContext object.
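A rough sketch of such a handler (my illustration; the query, the connection string placeholder, and the file name are made up):

using System.Data.SqlClient;
using System.Web;

public class ReportHandler : IHttpHandler
{
    public void ProcessRequest(HttpContext context)
    {
        context.Server.ScriptTimeout = 300; // seconds
        context.Response.ContentType = "text/plain";
        context.Response.AddHeader("Content-Disposition", "attachment; filename=report.txt");

        using (var conn = new SqlConnection("...your connection string..."))
        using (var cmd = new SqlCommand("SELECT Col1, Col2 FROM SomeTable", conn))
        {
            conn.Open();
            using (var reader = cmd.ExecuteReader())
            {
                while (reader.Read())
                {
                    // One tab-delimited line per row, streamed straight to the client.
                    context.Response.Output.Write(reader[0]);
                    context.Response.Output.Write('\t');
                    context.Response.Output.WriteLine(reader[1]);
                }
            }
        }
    }

    public bool IsReusable
    {
        get { return false; }
    }
}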
Hope this helps
I am creating an XML file. I need to check first whether the file exists or not. If the file does not exist, create it and add the data coming from a .cs file.
If the file exists, don't create the file, just add the data coming from a .cs file.
My code looks like this:
string filename = "c:\\employee.xml";
XmlTextWriter tw = new XmlTextWriter(filename, null); // null = default encoding type (UTF-8)
tw.Formatting = Formatting.Indented;  // for XML tags to be indented
tw.WriteStartDocument();              // indicates the start of the document (required)
tw.WriteStartElement("Employees");
tw.WriteStartElement("Employee", "Genius");
tw.WriteStartElement("EmpID", "1");
tw.WriteAttributeString("Name", "krishnan");
tw.WriteElementString("Designation", "Software Developer");
tw.WriteElementString("FullName", "krishnan Lakshmipuram Narayanan");
tw.WriteEndElement();
tw.WriteEndElement();
tw.WriteEndDocument();
tw.Flush();
tw.Close();
So the next time we add data, we need to check whether the file exists and append the data to the XML file.
And as we have made EmpID a primary key, if the user tries to make a duplicate entry we need to prevent it.
Is this possible to do?
if (!File.Exists(filename))
{
    // create your file
}

or

if (File.Exists(filename))
{
    File.Delete(filename);
}
// then create your file

The File class is in the System.IO namespace (add using System.IO; to your file).
You can't append records to an XML file; you have to read the file and then rewrite it.
So just check whether the file exists, and if it does, read the records from it. Then write the file out again including all previous records plus the new record.
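One way to do that read-then-rewrite cycle, sketched with LINQ to XML and a simplified record shape (the element and attribute names here are illustrative, not your exact schema), including the duplicate-EmpID check you mention:

using System;
using System.IO;
using System.Linq;
using System.Xml.Linq;

static void AddEmployee(string filename, string empId, string name)
{
    // Load the existing document, or start a fresh one if the file doesn't exist.
    XDocument doc = File.Exists(filename)
        ? XDocument.Load(filename)
        : new XDocument(new XElement("Employees"));

    // EmpID acts as the primary key: refuse duplicates.
    bool duplicate = doc.Root.Elements("Employee")
        .Any(e => (string)e.Attribute("EmpID") == empId);
    if (duplicate)
        throw new InvalidOperationException("EmpID already exists: " + empId);

    doc.Root.Add(new XElement("Employee",
        new XAttribute("EmpID", empId),
        new XAttribute("Name", name)));

    doc.Save(filename); // rewrites the entire file, old records included
}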
Have a look at the documentation for the File.Exists method.
Testing for the existence of a file before attempting to create it is inherently subject to a "things change after the check" race condition. Who can guarantee that your application isn't preempted and put to sleep for a moment after you checked, someone else creates or deletes that file, and your app gets to run again and does exactly the opposite of what you intended?
Windows (as well as all UN*X variants) supports file open/create modes that allow you to perform that create-if-nonexistent/open-if-existent operation as a single call.
As far as .NET goes, this means that for your task (creating an XML file) you'd first create a System.IO.FileStream with the appropriate mode, see http://msdn.microsoft.com/en-us/library/system.io.filemode.aspx and then pass that stream to the XmlWriter. That's safer than simply performing an "exists" check and hoping for the best.
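A minimal sketch of that approach (reusing the employee.xml path from the question; FileMode.OpenOrCreate makes the open-or-create decision one atomic call):

using System.IO;
using System.Xml;

// OpenOrCreate opens the file if it exists and creates it otherwise, in a
// single call, so there is no window for the race condition described above.
using (var fs = new FileStream("c:\\employee.xml", FileMode.OpenOrCreate,
                               FileAccess.ReadWrite, FileShare.None))
{
    bool isNew = fs.Length == 0; // a freshly created file is empty
    // ... if !isNew, read the existing records here before rewriting ...

    using (var writer = XmlWriter.Create(fs, new XmlWriterSettings { Indent = true }))
    {
        writer.WriteStartDocument();
        writer.WriteStartElement("Employees");
        // ... write the previous records plus the new one ...
        writer.WriteEndElement();
        writer.WriteEndDocument();
    }
}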