Java Servlet Download multiple CSV files - servlets

I have a report which displays some information, and there is a link in the report to export it to CSV. To download the CSV file, this is what we are doing:
public class ReportServlet extends XYXServlet {
    public void service(HttpServletRequest req, HttpServletResponse res) throws Exception {
        ...................
        ...................
        res.setContentType("text/csv");
        res.setHeader("Content-Disposition", "attachment; filename=\"" + reportName + "\"");
        OutputStream out = res.getOutputStream();
        // Render the report
        ReportRender.renderReport(report, results, out, rtParam);
        out.close();
    }
}
This report is for one patient. Now I have a requirement to download the report for all the patients in the system. We have more than 5000 patients. It is a one-time download, so basically I should get one CSV file per patient, e.g. the filename would be xyzreport-patientId. We are using Velocity templates: ReportRender takes the report results and merges them with the template, like:
VelocityContext c = new VelocityContext(params);
Writer w = new OutputStreamWriter(out);
template.merge(c,w);
w.flush();
So now my problem is: how do I download the reports for all patients at one time? Can I use one request/response to download reports for all patients?

You can use ZIP file creation; see:
Best Practices to Create and Download a huge ZIP (from several BLOBs) in a WebApp
In the example above they download BLOBs; in your case you need to write CSV files onto the zipped stream. If you generate all the reports first and only then send them, you will run into memory issues. Instead, do it in a loop, writing to the stream as soon as each report is rendered. This improves output efficiency as well as avoiding memory problems.
The question above also includes an answer with a working implementation, submitted by the person who asked it. It is tried and tested. :)
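A minimal sketch of that loop, building on the servlet above. ZipOutputStream comes from java.util.zip; patientIds and fetchResults(patientId) are placeholders for however you look up the patients and their report data:

import java.io.OutputStream;
import java.util.zip.ZipEntry;
import java.util.zip.ZipOutputStream;

public void service(HttpServletRequest req, HttpServletResponse res) throws Exception {
    res.setContentType("application/zip");
    res.setHeader("Content-Disposition", "attachment; filename=\"xyzreport-all-patients.zip\"");

    ZipOutputStream zip = new ZipOutputStream(res.getOutputStream());
    for (String patientId : patientIds) {     // placeholder: iterate all patient ids
        zip.putNextEntry(new ZipEntry("xyzreport-" + patientId + ".csv"));
        // render one patient's report straight onto the zip stream;
        // renderReport must flush, but not close, the stream between entries
        ReportRender.renderReport(report, fetchResults(patientId), zip, rtParam);
        zip.closeEntry();                     // the entry is written out immediately
    }
    zip.finish();
    zip.close();
}

Because each entry is written and flushed inside the loop, only one patient's report is in flight at a time, no matter how many of the 5000+ patients you export.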

Related

How to import data from mql4 (metatrader) to r in order to automate?

To work with financial market data and time series in real time, most of the brokers that offer the MetaTrader platform allow downloading historical data for pairs and indexes. This process is done manually to create a CSV file. I need to automate this process to download the historical data of 96 markets every 10 days, and I can find no documentation or information about it.
If the question is how to organize the communication between MT4 and R, there are three general ways:
1. Use files (or a pipe channel as an alternative).
2. REST; you need a web server for that.
3. A DLL (standard WinAPI, your own DLL file, a websocket, or contact your broker). The latter might be the easiest way; try ZeroMQ.
If you need to download some data from MT4, you should write a small script that will collect the data. Something like:
bool getData(string symbol, int timeframe, int startFrom, string fileName)
{
    // open (or create) the CSV file and append to its end
    int handle = FileOpen(fileName, FILE_READ|FILE_WRITE|FILE_CSV);
    if(handle == INVALID_HANDLE)
        return(false);
    FileSeek(handle, 0, SEEK_END);
    for(int i = startFrom; i >= 0; i--)
    {
        // one line per bar: time;open;high;low;close
        string line = StringFormat("%s;%.5f;%.5f;%.5f;%.5f",
                                   TimeToString(iTime(symbol, timeframe, i)),
                                   iOpen(symbol, timeframe, i),
                                   iHigh(symbol, timeframe, i),
                                   iLow(symbol, timeframe, i),
                                   iClose(symbol, timeframe, i));
        FileWrite(handle, line);   // FileWrite appends the line terminator
    }
    FileClose(handle);
    return(true);
}
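A hypothetical driver for it, e.g. the OnStart of a one-off script (the symbol list, timeframe, and bar count below are illustrative):

void OnStart()
{
    // illustrative subset of the 96 markets
    string symbols[] = {"EURUSD", "GBPUSD", "USDJPY"};
    for(int s = 0; s < ArraySize(symbols); s++)
        getData(symbols[s], PERIOD_H1, 1000, symbols[s] + ".csv");
}

R can then read the resulting files, e.g. with read.table(file, sep=";"), from the terminal's MQL4\Files directory, where FileOpen writes by default.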

Mail Merge Feature for a CRM web-app made in asp.NET

We're working on a web-based CRM for my company in ASP.NET. I frequently have to send newsletters to all of my customers, and it becomes tedious to manually copy all of their addresses. What I would like is a feature to send one mail to all of my customers, taking their addresses from our contacts database, similar to a mail merge.
My developer said that he can do this for Emails, but not for physical mail. His reasoning behind this is that he can write a script that sends the mails to all customers one by one, but he can only give one single print command, which would only be able to print the current contents of the page. Therefore, he would not be able to print the individual letters for all of the customers.
Does anyone have ideas on how this would be possible? E.g. printing the page in such a way that each letter would be printed on a separate page, or another way to automatically print all of the letters (with the mail-merged fields)?
Any help will be appreciated. If you require more details, please tell me.
A webpage is not the right solution for physically printing letters. What you need is a report that generates a PDF document with a different customer address on each page. Try Microsoft Reporting Services, which is included in SQL Server; Crystal Reports is another popular reporting solution.
Also, you will have a hard time printing the stylized contents of your nice looking e-mail in the reporting solutions mentioned above. Consider using the report only as the cover letter of your mail piece.
One possible solution is to use a 3rd-party library to create the individual letters for your customers. Docentric Toolkit is a .NET tool that solves exactly this problem. We are using it for creating individual letters for customers, and they are all merged into one file so that printing is done with a single command. Users can even create or change the template documents.
Next you would have to create a template document in MS Word where you would include fixed content and placeholders for variable content which would be filled in at runtime with customer information.
After processing the data in .NET application you merge the data with the template document (see code snippet below). Your final document will be one file with letters for your customers, each on its own page. This file can then be sent to the printer with one print command.
I am attaching a code snippet of the Main method of a sample console application. The project references Entity Framework and Docentric's DLLs and uses the entity model of the Northwind database.
As you can see, it is really easy to prepare the data and merge it with the template document. The solution is suitable for ASP.NET and MVC applications because you don't need Microsoft Office installed on the server.
using System;
using System.Collections.Generic;
using System.Linq;
using System.Text;
using System.Diagnostics;
using Docentric.Word;

namespace DisplayCustomers
{
    class Program
    {
        static void Main(string[] args)
        {
            // first we read customers - in the example we select only customers
            // from USA and Canada and order them by country and customer name
            List<Customers> customerList = new List<Customers>();
            using (var db = new NORTHWNDEntities())
            {
                customerList = db.Customers
                    .Where(w => w.Country == "USA" || w.Country == "Canada")
                    .OrderBy(o => o.Country)
                    .ThenBy(o => o.CompanyName)
                    .ToList();
            }

            // next we merge the customer data with the template and generate the final document
            string templateDoc = @"C:\Test\Templates\CustomerLetter1_templ.docx";
            string outputDoc = @"C:\Test\FinishedLetters\CustomerLetters1.docx";
            DocumentGenerator dg = new DocumentGenerator(customerList);
            DocumentGenerationResult result = dg.GenerateDocument(templateDoc, outputDoc);
        }
    }
}

Understanding the JIT; slow website

First off, this question has been covered a few times (I've done my research), and, for example, on the right side of the SO webpage is a list of related items... I have been through them all (or as many as I could find).
When I publish my pre-compiled .NET web application, it is very slow to load the first time.
I've read up on this, it's the JIT which I understand (sort of).
The problem is, after the home page loads (up to 20 seconds), many other pages load very fast.
It would appear that the only reason they load quickly is that the resources have already been loaded (or that they share the same compiled DLLs); however, some pages still take a long time.
This suggests that the JIT needs to compile different pages separately? If so, then taking a contact form as an example (where the Thank You page needs to be JIT-compiled and is slow the first time), the user may hit the send button multiple times while waiting for the page to appear.
After I load all the pages which use different models or different shared HTML content, the site loads quickly as expected. I assume this is a common problem?
Please note, I'm using .NET 4.0, but there is no database, XML files, etc. The only IO is if an email doesn't send and the error is written to a log.
So, assuming my understanding is correct, what is the approach to not have to manually go through the website and load every page?
If the above is a little too broad, then can this be resolved in the settings/configuration in Visual Studio (2012) or the web.config file (excluding adding compilation debug=false)?
In this case, there were two problems.
As per rene's comments, review http://msdn.microsoft.com/en-us/library/ms972959.aspx. The helpful part was to add the following code to the Global.asax file:
// StringBuilder lives in System.Text and EventLog in System.Diagnostics;
// import both in Global.asax
const string sourceName = ".NET Runtime";
const string serverName = ".";
const string logName = "Application";
const string uriFormat = "\r\n\r\nURI: {0}\r\n\r\n";
const string exceptionFormat = "{0}: \"{1}\"\r\n{2}\r\n\r\n";

void Application_Error(Object sender, EventArgs ea) {
    StringBuilder message = new StringBuilder();
    if (Request != null) {
        message.AppendFormat(uriFormat, Request.Path);
    }
    if (Server != null) {
        Exception e;
        for (e = Server.GetLastError(); e != null; e = e.InnerException) {
            message.AppendFormat(exceptionFormat,
                e.GetType().Name,
                e.Message,
                e.StackTrace);
        }
    }
    if (!EventLog.SourceExists(sourceName)) {
        EventLog.CreateEventSource(sourceName, logName);
    }
    EventLog log = new EventLog(logName, serverName, sourceName);
    log.WriteEntry(message.ToString(), EventLogEntryType.Error);
    //Server.ClearError(); // uncomment this to cancel the error
}
The server was maxing out during the sending of the email! My code was fine, but Task Manager showed the server hitting 100% memory...
The solution was to monitor the errors surfaced by point 1 and fix them, and then to find out why the server was being throttled when sending an email!
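As for the last part of the question (not wanting to browse every page by hand after each deploy), one workaround is a small warm-up client that requests each page once so the JIT cost is paid before real users arrive. A minimal sketch, with an illustrative URL list:

// Minimal warm-up sketch: request each page once after deployment so the
// first-hit/JIT cost is not paid by a real user. URLs are illustrative.
using System;
using System.Net;

class WarmUp
{
    static void Main()
    {
        string[] urls =
        {
            "http://example.com/",
            "http://example.com/Contact",
            "http://example.com/Contact/ThankYou"
        };
        using (var client = new WebClient())
        {
            foreach (string url in urls)
            {
                try
                {
                    client.DownloadString(url);   // response body is discarded
                    Console.WriteLine("warmed " + url);
                }
                catch (WebException ex)
                {
                    Console.WriteLine("failed " + url + ": " + ex.Message);
                }
            }
        }
    }
}

Run it as a post-deploy step; on newer IIS versions the Application Initialization module can perform this kind of warm-up for you.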

Printing silently from Spring MVC/Jasper Application

This is a very abstract question.
I'm working on a Spring MVC web application which has to deal with a lot of invoice printing, continuously. Currently, when an invoice is saved, the Spring controller delegates the invoice id to the Jasper PDF generation service, which prepares the PDF. After the PDF is downloaded, the user prints it manually.
I need a way to print the invoice silently when the user saves the invoice.
Any ideas?
Since you are exporting to PDF it is possible. You need to add a JRPdfExporterParameter.PDF_JAVASCRIPT parameter to your JRPdfExporter instance with the value "this.print({bUI: true,bSilent: false,bShrinkToFit: true});". For Example:
protected static byte[] exportReportToPdf(JasperPrint jasperPrint) throws JRException {
    JRPdfExporter exporter = new JRPdfExporter();
    ByteArrayOutputStream baos = new ByteArrayOutputStream();
    exporter.setParameter(JRExporterParameter.JASPER_PRINT, jasperPrint);
    exporter.setParameter(JRExporterParameter.OUTPUT_STREAM, baos);
    exporter.setParameter(JRPdfExporterParameter.PDF_JAVASCRIPT,
        "this.print({bUI: true,bSilent: false,bShrinkToFit: true});");
    exporter.exportReport();
    return baos.toByteArray();
}
This actually embeds the JavaScript in the PDF. When the PDF is opened, the script runs and sends the document to the print queue. It should be noted that this should be used sparingly, as it is not generally considered polite to do this to your users automatically. When I used it in an app, we had two buttons for each report: an Export button that does not add the JavaScript, and a Print button that does. That way users who just wanted to print could do so, and those who wanted a digital copy had that too.

Please suggest a way to store a temp file in Windows Azure

Here I have a simple feature in an ASP.NET MVC3 app hosted on Azure.
1st step: the user uploads a picture
2nd step: the user crops the uploaded picture
3rd step: the system saves the cropped picture and deletes the temp file, which is the uploaded original picture
Here is the problem I am facing now: where do I store the temp file?
I tried somewhere on the Windows file system and in LocalResources; the problem is that these resources are per instance, so there is no guarantee that the instance showing the picture to crop will be the same instance that saved the temp file.
Do you have any ideas on this temp file issue?
Normally the file exists just for a while before it is deleted.
The temp file needs to be instance-independent.
Ideally the file would have an expiry setting (for example, 1 hour) so it deletes itself in case the code crashes somewhere.
OK. So what you're after is basically something that is shared storage but expires. Amazon have just announced a rather nice setting called object expiration (https://forums.aws.amazon.com/ann.jspa?annID=1303). There is nothing like this for Windows Azure storage yet, unfortunately, but that doesn't mean we can't come up with some other approach; indeed, we may even come up with a better (more cost-effective) approach.
You say that it needs to be instance-independent, which means using a local temp drive is out of the picture. As others have said, my initial leaning would be towards Blob storage, but you will have some cleanup effort there. If you are working with large images (>1MB) or low throughput (<100rps), then I think Blob storage is the only option. If you are working with smaller images AND high throughput, then the transaction costs for Blob storage will start to really add up (I have a white paper coming out soon which shows some modelling of this, but some quick thoughts are below).
For a scenario with small images and high throughput, a better option might be to use the Windows Azure Cache as your temporary storage area. At first glance it will be eye-wateringly expensive on a per-GB basis ($110/GB/month for Cache, 12c/GB/month for Storage). But with Storage your transactions are paid for, whereas with Cache they are 'free'. (Quotas are here: http://msdn.microsoft.com/en-us/library/hh697522.aspx#C_BKMK_FAQ8.) This can really add up; e.g. using 100kb temp files held for 20 minutes with a system throughput of 1500rps, using Cache is about $1000 per month vs $15000 per month for Storage transactions.
The Azure Cache approach is well worth considering, but to be sure it is the 'best' approach I'd really want to know:
The size of the images
The throughput per hour
A bit more detail on the actual client interaction with the server during the crop process. Is it an interactive process where the user pulls the image into their browser and crops it visually? Or is it just a simple crop?
Here is what I see as a possible approach:
1. the user uploads the picture
2. your code saves it to a blob and keeps a record in a data backend linking the user session to the uploaded image (marked as a temp image)
3. display the image in the cropping user interface
4. when the user is done cropping on the client (a sketch of this step follows the list):
4.1. retrieve the original from the blob
4.2. crop it according to the data sent from the user
4.3. delete the original from the blob, and the record in the data backend used in step 2
4.4. save the final image to another blob (the final blob).
And have one background process checking for "expired" temp images in the data backend (used in step 2), deleting both the images and the records.
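Since step 4 packs several blob operations together, here is a hedged sketch using the classic Microsoft.WindowsAzure.Storage SDK and System.Drawing (CropAndFinalize, the container references, and the PNG output format are illustrative assumptions, not part of the original answer):

using System.Drawing;
using System.Drawing.Imaging;
using System.IO;
using Microsoft.WindowsAzure.Storage.Blob;

public static void CropAndFinalize(CloudBlobContainer tempContainer,
                                   CloudBlobContainer finalContainer,
                                   string blobName, Rectangle cropArea)
{
    // 4.1 retrieve the original from the temp blob
    CloudBlockBlob tempBlob = tempContainer.GetBlockBlobReference(blobName);
    using (var original = new MemoryStream())
    {
        tempBlob.DownloadToStream(original);
        original.Position = 0;

        // 4.2 crop it according to the data sent from the user
        using (var source = new Bitmap(original))
        using (var cropped = source.Clone(cropArea, source.PixelFormat))
        using (var output = new MemoryStream())
        {
            cropped.Save(output, ImageFormat.Png);
            output.Position = 0;

            // 4.4 save the final image to another blob
            finalContainer.GetBlockBlobReference(blobName).UploadFromStream(output);
        }
    }

    // 4.3 delete the original (removing the temp record in the data backend goes here too)
    tempBlob.DeleteIfExists();
}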
Please note that even in a WebRole you still have the RoleEntryPoint descendant, and you can still override the Run method. By implementing an infinite loop in Run() (that method must never exit!), you can check whether there is anything to delete every N seconds (depending on your Thread.Sleep() in Run()).
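A minimal sketch of that Run() override, assuming the classic Azure SDK's Microsoft.WindowsAzure.ServiceRuntime; CleanupExpiredTempImages() is a hypothetical placeholder for your own delete-expired-temp-images pass:

using System;
using System.Diagnostics;
using System.Threading;
using Microsoft.WindowsAzure.ServiceRuntime;

public class WebRole : RoleEntryPoint
{
    public override void Run()
    {
        // this method must never return, or the role instance gets recycled
        while (true)
        {
            try
            {
                CleanupExpiredTempImages();   // hypothetical: delete expired temp blobs + records
            }
            catch (Exception ex)
            {
                Trace.TraceError(ex.ToString());   // log and keep looping
            }
            Thread.Sleep(TimeSpan.FromSeconds(60));   // "every N seconds"
        }
    }

    private void CleanupExpiredTempImages()
    {
        // query the data backend for temp images older than the expiry window,
        // then delete the blobs and their records
    }
}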
You can use Azure Blob storage. Have a look at this tutorial.
The sample below may help you.
https://code.msdn.microsoft.com/How-to-store-temp-files-in-d33bbb10
You have two ways to handle temp files in Azure:
1. You can use the Path.GetTempPath() and Path.GetTempFileName() functions for the temp file name.
2. You can use an Azure blob to simulate it:
// uses Microsoft.WindowsAzure.Storage.Blob, System.IO, System.Linq, System.Threading.Tasks;
// "container" is a CloudBlobContainer field initialized elsewhere
// (e.g. from CloudStorageAccount.CreateCloudBlobClient().GetContainerReference(...))
private CloudBlobContainer container;
private long TotalLimitSizeOfTempFiles = 100 * 1024 * 1024;

private async Task SaveTempFile(string fileName, long contentLength, Stream inputStream)
{
    try
    {
        // first, check whether the container exists, and create it if not
        await container.CreateIfNotExistsAsync();
        // init a blob reference
        CloudBlockBlob tempFileBlob = container.GetBlockBlobReference(fileName);
        // if the blob already exists, delete the old blob
        tempFileBlob.DeleteIfExists();
        // check whether the blobs exceed the limit, and if so, clear them
        await CleanStorageIfReachLimit(contentLength);
        // and upload the new file
        tempFileBlob.UploadFromStream(inputStream);
    }
    catch (Exception ex)
    {
        if (ex.InnerException != null)
        {
            throw ex.InnerException;
        }
        else
        {
            throw;
        }
    }
}

// check whether the total size of blobs exceeds the limit, and if so, clear the oldest ones
private async Task CleanStorageIfReachLimit(long newFileLength)
{
    List<CloudBlob> blobs = container.ListBlobs()
        .OfType<CloudBlob>()
        .OrderBy(m => m.Properties.LastModified)
        .ToList();
    // get the total size of all blobs
    long totalSize = blobs.Sum(m => m.Properties.Length);
    // calculate the space that must be free before the upload
    long realLimitSize = TotalLimitSizeOfTempFiles - newFileLength;
    // delete oldest blobs first; stop as soon as enough space is free
    foreach (CloudBlob item in blobs)
    {
        if (totalSize <= realLimitSize)
        {
            break;
        }
        await item.DeleteIfExistsAsync();
        totalSize -= item.Properties.Length;
    }
}
