Backend: SQL Server 2008 database with FileStream enabled
Data Access: Linq to Entities
I have thousands of PDFs that currently reside on a file server. I would like to move these PDFs off the file server and into a SQL Server 2008 database so that I can manage them more easily.
As a proof of concept (i.e., to ensure that the new FILESTREAM capability in SQL Server 2008 is what I'm looking for), I wrote a small app that reads and writes these PDFs to the FILESTREAM-enabled database via the Entity Framework.
The app is very simple; here's the code:
datReport report = new datReport();
report.ReportName = "ANL-7411-Rev-Supp-1.pdf";
report.RowGuid = Guid.NewGuid();

// The following line blows up on really big PDFs (350+ MB)
report.ReportData = File.ReadAllBytes(@"C:\TestSavePDF\ANL-7411-Rev-Supp-1.pdf");

using (NewNNAFTAEntities ctx = new NewNNAFTAEntities())
{
    ctx.AddTodatReport(report);
    ctx.SaveChanges();
}
I have commented the line of code where the error occurs. The exact error is System.OutOfMemoryException, which leaves me with little doubt that the file size is what is causing the problem. The above code does work on smaller PDFs. I don't know where the exact size limit is, but my biggest PDFs are over 350 megabytes and they all get the error.
Any help would be greatly appreciated. Thanks!
You're not actually using the streaming capability of FILESTREAM very much in your example....
Check out the MSDN docs on FILESTREAM in ADO.NET and this article, which both show how to use SqlFileStream as a stream from C# - that should work a lot better (I would believe) than sucking the whole PDF into memory....
Marc
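To illustrate the suggestion above, here is a minimal sketch of streaming an upload with SqlFileStream instead of File.ReadAllBytes. It assumes the table and column names from the question (datReport, ReportName, RowGuid, ReportData); the connection string is a placeholder, and FILESTREAM access must happen inside a transaction:

```csharp
// Sketch only: streams the PDF into the FILESTREAM column in 64 KB chunks
// instead of loading the whole file into memory.
using System;
using System.Data.SqlClient;
using System.Data.SqlTypes;
using System.IO;

static class ReportUploader
{
    public static void Upload(string connectionString, string pdfPath, string reportName)
    {
        using (var conn = new SqlConnection(connectionString))
        {
            conn.Open();
            using (var tx = conn.BeginTransaction())
            {
                // Insert the row with a zero-length blob, then ask SQL Server
                // for the FILESTREAM path and the transaction context.
                var cmd = new SqlCommand(
                    @"INSERT INTO datReport (ReportName, RowGuid, ReportData)
                      VALUES (@name, NEWID(), 0x);
                      SELECT ReportData.PathName(),
                             GET_FILESTREAM_TRANSACTION_CONTEXT()
                      FROM datReport WHERE ReportName = @name;",
                    conn, tx);
                cmd.Parameters.AddWithValue("@name", reportName);

                string serverPath;
                byte[] txContext;
                using (var reader = cmd.ExecuteReader())
                {
                    reader.Read();
                    serverPath = reader.GetString(0);
                    txContext = (byte[])reader[1];
                }

                // Copy in fixed-size chunks; only one buffer is ever in memory.
                using (var source = File.OpenRead(pdfPath))
                using (var dest = new SqlFileStream(serverPath, txContext, FileAccess.Write))
                {
                    var buffer = new byte[64 * 1024];
                    int read;
                    while ((read = source.Read(buffer, 0, buffer.Length)) > 0)
                        dest.Write(buffer, 0, read);
                }

                tx.Commit();
            }
        }
    }
}
```

With this approach only one 64 KB buffer is in memory at a time, so a 350 MB PDF is no more demanding than a 1 MB one.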
Related
For various reasons related to network bandwidth and performance, we have an application that converts large Unicode strings to byte arrays and compresses them with the .NET GZipStream functionality (.NET Core 3.1).
We want to store these in a SQL Server 2017 varbinary(max) column for later retrieval and decompression - this works fine when we use the same GZipStream library to decompress.
We would also like to take advantage of the T-SQL DECOMPRESS function so we can query the data on the database server (without having to bring it all back into .NET and decompress it there).
While DECOMPRESS does seem to work (as in, it doesn't throw an error and it produces binary output that can be cast to nvarchar(max)), the resulting nvarchar is totally different from the original source - it actually crashes SSMS when displayed!
This is not a problem if we pass the uncompressed string into SQL Server and compress it there using the COMPRESS function, but we do not want to do that because it requires an additional decompression step and extra bandwidth.
I have ensured we're on CU20 of SQL Server 2017, so I don't think it's a patching problem.
I have tried the different compression-level options in the .NET library, but they all produce the same problem.
It would appear that, despite both using GZip compression, the T-SQL and .NET implementations are not compatible - but if anyone has succeeded in combining the two, I would appreciate hearing how.
Arrgh... I was being an idiot! My source strings (they were XML) were in UTF-8 so this worked:
using (var compressStream = new MemoryStream())
using (var compressor = new GZipStream(compressStream, CompressionMode.Compress))
{
    compressor.Write(Encoding.UTF8.GetBytes(largeString));
    compressor.Close(); // flush the gzip footer before reading the buffer
    var bytesToWriteToSQLVarbinaryMax = compressStream.ToArray();
}
And then I could do:
SELECT Cast(Decompress(bytes) AS varchar(max)) FROM compressedTable
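The encoding mismatch can be reproduced entirely in .NET, without SQL Server involved - a sketch (the sample string is arbitrary):

```csharp
using System;
using System.IO;
using System.IO.Compression;
using System.Text;

static class GzipRoundTrip
{
    // Gzip-compress a byte array in memory.
    public static byte[] Compress(byte[] input)
    {
        using (var output = new MemoryStream())
        {
            using (var gzip = new GZipStream(output, CompressionMode.Compress))
                gzip.Write(input, 0, input.Length);
            return output.ToArray(); // safe after the gzip stream is disposed
        }
    }

    // Gzip-decompress a byte array in memory.
    public static byte[] Decompress(byte[] input)
    {
        using (var output = new MemoryStream())
        using (var gzip = new GZipStream(new MemoryStream(input), CompressionMode.Decompress))
        {
            gzip.CopyTo(output);
            return output.ToArray();
        }
    }
}
```

Decompress(Compress(Encoding.UTF8.GetBytes(s))) gives back UTF-8 bytes: decoding them with Encoding.UTF8 restores s, while reinterpreting them as UTF-16 (which is what CAST ... AS nvarchar(max) does to a varbinary) produces the garbage described above. Hence the cast to varchar(max).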
I have created a C# windows application with vs2010 and I'm using a SQL Server CE database. I'm looking for a way to backup my database programmatically.
Is there a way to export/import my entire database (as a .sdf file), or can I just copy it to another location and later copy it back to replace the current one? Could anyone provide me with the code to do this?
I'm relatively new to this, but I'm guessing it's not as difficult as it sounds. I couldn't find a clear answer anywhere, so any help would be appreciated!
For this task we use SQL Server Management Objects (SMO), which expose the backup/restore functions to .NET code.
Based on the article How to: Back Up Databases and Transaction Logs, here is the part of the sample that performs the backup:
// Create the Backup SMO object to manage the execution
Backup backup = new Backup();
// Add the file to backup to
backup.Devices.Add(new BackupDeviceItem(backupPath, DeviceType.File));
// Set the name of the database to backup
backup.Database = databaseName;
// Tell SMO that we are backing up a database
backup.Action = BackupActionType.Database;
backup.Incremental = false;
// Specify that the log must be truncated after the backup is complete.
backup.LogTruncation = BackupTruncateLogType.Truncate;
// Begin execution of the backup
backup.SqlBackup(server);
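Note that SMO targets full SQL Server editions rather than SQL Server CE. Since a SQL Server CE database lives entirely in its single .sdf file, a backup can be as simple as a file copy taken while no connections are open - a minimal sketch (paths and names are placeholders):

```csharp
using System.IO;

static class SdfBackup
{
    // SQL Server CE keeps the entire database in one .sdf file, so a backup
    // can be a plain file copy made while no connections are open.
    public static void Backup(string sdfPath, string backupPath)
    {
        File.Copy(sdfPath, backupPath, true); // overwrite any older backup
    }

    // Restoring is the same copy in reverse (again with the app's
    // connections closed).
    public static void Restore(string backupPath, string sdfPath)
    {
        File.Copy(backupPath, sdfPath, true);
    }
}
```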
I need to pick an underlying method of saving data collected in the field (offline and remote locations). I want to use the HTML5 Web SQL Database with SQLite, but can I pick the location? So far I haven't been able to accomplish that. Here is some sample code I was using:
var dbName = "";
var Dir = blackberry.io.dir;
var path = Dir.appDirs.shared.documents.path;
dbName = path + "/" + "databasetest.db";
var db = openDatabase(dbName, '1.0', 'Test', 50 * 1024);
I used an "alert()" to see that the file was "supposedly" created, but when I opened the folder in Explorer I could not find it. I'm not really sure why - hence my question.
My application is for data entry; without getting into specifics, a user may end up collecting a lot of data or a little. Either way, I want some way of downloading the SQLite database.
Is this the intended use of the SQLite database, or will I have to use another solution?
Thanks!
Chris
The Web SQL Database specification was designed for browsers where it would not have been appropriate to allow web pages to access arbitrary file paths.
The intended way to download data is to upload it to a web server in the cloud.
If you want to know the file name of your database, try executing the PRAGMA database_list. (Whether your app can access that path is a different question.)
I am converting a Word document to varbinary and saving it in a SQL Server database. Saving the data works. I want to display the uploaded Word document back to the user just as a résumé looks in Word, as if the actual Word document were embedded in the web page itself.
The code below downloads the saved Word document as a file. Please tell me which control is best for displaying the Word document inside the browser.
byte[] fileContent = new byte[fuResume.PostedFile.ContentLength];
// Note: Stream.Read is not guaranteed to fill the buffer in a single call;
// for large uploads, read in a loop or copy to a MemoryStream instead.
fuResume.PostedFile.InputStream.Read(fileContent, 0, fuResume.PostedFile.ContentLength);

Response.ContentType = "application/ms-word";
Response.AddHeader("Content-Disposition", "inline;filename=yourfilename.doc");
Response.BinaryWrite(fileContent);
Response.End();
Provided that you have Word on the web server or on the application server, I would save the stream coming from the database to a temp file, open it with Word, save it as HTML, and render that HTML in the web browser.
Edit:
As mentioned in the link provided in the comment (http://support.microsoft.com/default.aspx?scid=kb;EN-US;257757), you should avoid using Word/Excel Automation in server-side code. Here is an extract from that Microsoft article:
Microsoft does not currently recommend, and does not support,
Automation of Microsoft Office applications from any unattended,
non-interactive client application or component (including ASP,
ASP.NET, DCOM, and NT Services), because Office may exhibit unstable
behavior and/or deadlock when Office is run in this environment.
Suggested solutions/alternatives are:
Word Automation Services Overview
Excel Services Overview
I have an ASP.NET MVC application that takes a while to load on my production server. I would like to write a script that calls my pages every 10 minutes to keep them from being unloaded on the server, which would otherwise force the server to reload them on the next request.
I was thinking of using a SQL Server stored procedure to call my pages every 10 minutes to keep them alive.
I've read that I can do this using the CLR, but I am not sure how. Does anyone have an example of how to call web pages from a SQL stored procedure using the CLR?
I have no idea why you would want to use a stored procedure for this.
Just write a simple console application to "call" the page. Then use a scheduled task to run the console application.
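A minimal sketch of such a console application (the URL is a placeholder; point a Windows Task Scheduler task at the executable with a 10-minute trigger):

```csharp
using System;
using System.Net;

// Requests the page so the server keeps it compiled and in memory.
class KeepAlive
{
    static int Main(string[] args)
    {
        // Pass the real page URL on the command line.
        string url = args.Length > 0 ? args[0] : "http://localhost/myapp/";
        try
        {
            WebRequest request = WebRequest.Create(url);
            using (WebResponse response = request.GetResponse())
            {
                Console.WriteLine("Pinged {0}: OK", url);
                return 0;
            }
        }
        catch (WebException ex)
        {
            Console.WriteLine("Ping failed: {0}", ex.Message);
            return 1;
        }
    }
}
```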
The code is untested, but something like this should work. Because the function makes an outbound web request, the assembly needs to be deployed with EXTERNAL_ACCESS permission, and you'll probably also need TRUSTWORTHY set:
ALTER DATABASE Foo SET TRUSTWORTHY ON;
Code:
using System.IO;
using System.Net;
using Microsoft.SqlServer.Server;

public partial class WebProc
{
    [SqlFunction()]
    public static string WebQuery()
    {
        WebRequest request = WebRequest.Create("http://www.google.com");
        using (WebResponse response = request.GetResponse())
        using (Stream dataStream = response.GetResponseStream())
        using (StreamReader reader = new StreamReader(dataStream))
        {
            return reader.ReadToEnd();
        }
    }
}
I have no idea why you would want to use a stored procedure for this.
Just write a simple console application to "call" the page. Then use a scheduled task to run the console application.
You could do that (of course) but that's not what the question was asking ;)
Jafin's answer is correct for the question.
The best answer (IMHO) would be to fix your production server (i.e. reconfigure the application pool's idle timeout and recycling settings) so that it doesn't "scrap" your pages every 10 minutes.
Why use a jackhammer (a console app, .NET inside the database, or whatever) when a regular hammer will do?