Download CSV File by posting JSON data in ASP.NET MVC - asp.net

In ASP.NET MVC, I am trying to download a file via a POST that sends JSON data. The JSON data holds the filters for the data being displayed on the page via knockout.js. The criteria object is always null. How can I download a file while sending POST data via JavaScript or a form post? I've accomplished an AJAX download by using a GET, but now I have extra data, such as arrays, that I need to post.
Form
<form method="POST" action="@Model.ExportUrl">
    <input type="hidden" name="criteria" data-bind="value: ko.toJSON(data())" />
    <button class="btn"><i class="icon-download-alt"></i> Export</button>
</form>
Request
Request URL:http://localhost:2222/members/eventteams/export?eventId=8998
Request Method:POST
Status Code:500 Internal Server Error
Request Headers
Accept:text/html,application/xhtml+xml,application/xml;q=0.9,*/*;q=0.8
Accept-Encoding:gzip,deflate,sdch
Accept-Language:en-US,en;q=0.8
Cache-Control:max-age=0
Connection:keep-alive
Content-Length:128
Content-Type:application/x-www-form-urlencoded
Host:localhost:2222
Origin:http://localhost:2222
Referer:http://localhost:2222/members
User-Agent:Mozilla/5.0 (Windows NT 6.2; WOW64) AppleWebKit/537.36 (KHTML, like Gecko) Chrome/28.0.1500.95 Safari/537.36
Query String Parameters
eventId:8998
Form Data
criteria:{"page":1,"pageSize":"100","sortOrder":"Team.Name","sortDirection":"ASC"}
Controller
[HttpPost]
public virtual ActionResult Export(int eventId, DivisionTeamsTableCriteria criteria)
{

You can try posting the form to a hidden iframe, as described in this question:
How do you post to an iframe?
Then, on the ASP.NET page loaded inside the iframe, write the file to the response, as shown here:
Write to CSV file and export it?
The iframe can be 1x1 pixels, so it stays invisible to the user.
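For reference, here is a minimal sketch of that approach. The function names and the idea of JSON-encoding complex values are illustrative assumptions, not code from the question: the criteria object is flattened into hidden inputs, and the form is submitted with its target set to a hidden iframe so the download does not navigate the page.

```javascript
// Flatten a criteria object into name/value pairs for hidden inputs.
// Pure helper, so it can be exercised outside the browser.
function toFormFields(criteria) {
  return Object.keys(criteria).map(function (key) {
    var value = criteria[key];
    return {
      name: key,
      // Arrays/objects are JSON-encoded so the server can deserialize them
      value: typeof value === 'object' ? JSON.stringify(value) : String(value)
    };
  });
}

// Browser-only part: build the form, target it at a hidden iframe, submit.
function postToHiddenIframe(url, criteria) {
  var iframe = document.createElement('iframe');
  iframe.name = 'download-frame';
  iframe.style.display = 'none'; // effectively invisible, like a 1x1 iframe
  document.body.appendChild(iframe);

  var form = document.createElement('form');
  form.method = 'POST';
  form.action = url;
  form.target = iframe.name; // the response (the file) lands in the iframe

  toFormFields(criteria).forEach(function (field) {
    var input = document.createElement('input');
    input.type = 'hidden';
    input.name = field.name;
    input.value = field.value;
    form.appendChild(input);
  });

  document.body.appendChild(form);
  form.submit();
  document.body.removeChild(form);
}
```

On the server side, the response would still need a Content-Disposition: attachment header so the browser saves the file instead of rendering it in the iframe.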

Using knockout.js I created this custom binding that works quite well.
ko.bindingHandlers.download = {
    init: function (element, valueAccessor) {
        var value = ko.utils.unwrapObservable(valueAccessor()),
            id = 'download-iframe-container',
            iframe;
        $(element).unbind('click').bind('click', function () {
            iframe = document.getElementById(id);
            if (!iframe) {
                iframe = document.createElement("iframe");
                iframe.id = id;
                iframe.style.display = "none";
            }
            if (value.data) {
                iframe.src = value.url + (value.url.indexOf('?') > 0 ? '&' : '?') + $.param(ko.mapping.toJS(value.data));
            } else {
                iframe.src = value.url;
            }
            document.body.appendChild(iframe);
            return false;
        });
    }
};
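A hypothetical usage would be an element bound with download: { url: '/members/eventteams/export', data: criteria } (names illustrative). The heart of the binding is the query-string construction; isolated as a plain function, with a simplified stand-in for jQuery's $.param, it looks roughly like this:

```javascript
// Simplified stand-in for jQuery's $.param: encode flat key/value pairs.
function serializeParams(data) {
  return Object.keys(data)
    .map(function (key) {
      return encodeURIComponent(key) + '=' + encodeURIComponent(data[key]);
    })
    .join('&');
}

// What the binding effectively does: append the unwrapped data as a query
// string, reusing '&' when the url already carries parameters.
function buildDownloadUrl(url, data) {
  if (!data) { return url; }
  var separator = url.indexOf('?') >= 0 ? '&' : '?';
  return url + separator + serializeParams(data);
}
```

Note that this turns the request into a GET via the iframe's src attribute, so very large filter arrays can still hit URL length limits; that is the trade-off against posting a form to the iframe.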

Related

Scrapy splash download file from js click event

I'm using Scrapy with the Splash plugin. I have a button which triggers a download event via AJAX, and I need to get the downloaded file, but I don't know how.
My Lua script is something like this:
function main(splash)
    splash:init_cookies(splash.args.cookies)
    assert(splash:go{
        splash.args.url,
        headers=splash.args.headers,
        http_method=splash.args.http_method,
        body=splash.args.body,
    })
    assert(splash:wait(0.5))
    local get_dimensions = splash:jsfunc([[
        function () {
            var rect = document.querySelector('a[aria-label="Download XML"]').getClientRects()[0];
            return {"x": rect.left, "y": rect.top}
        }
    ]])
    splash:set_viewport_full()
    splash:wait(0.1)
    local dimensions = get_dimensions()
    -- FIXME: button must be inside a viewport
    splash:mouse_click(dimensions.x, dimensions.y)
    splash:wait(0.1)
    return splash:html()
end
My request object from my spider:
yield SplashFormRequest(self.urls['url'],
                        formdata=FormBuilder.build_form(response, some_object[0]),
                        callback=self.parse_cuenta,
                        cache_args=['lua_source'],
                        endpoint='execute',
                        args={'lua_source': self.script_click_xml})
Thanks in advance
I just tried this with SplashFormRequest and it looks like Splash won't work for you. Instead, you can send the same AJAX request using the Python requests library.
Here is an example:
import requests

data = {'__EVENTTARGET': 'main_0$body_0$lnkDownloadBio',
        '__EVENTARGUMENT': '',
        '__VIEWSTATE': viewstate,
        '__VIEWSTATEGENERATOR': viewstategen,
        '__EVENTVALIDATION': eventvalid,
        'search': '',
        'filters': '',
        'score': ''}
HEADERS = {
    'Content-Type': 'application/x-www-form-urlencoded',
    'User-Agent': 'Mozilla/5.0 (X11; Linux x86_64) AppleWebKit/537.36 (KHTML, like Gecko) Chrome/60.0.3112.101 Safari/537.36',
    'Accept': 'text/html,application/xhtml+xml,application/xml;q=0.9,image/webp,image/apng,*/*;q=0.8'
}
# requests form-encodes the dict itself; no need to urlencode it first
r = requests.post(submit_url, data=data, allow_redirects=False, headers=HEADERS)
filename = 'name-%s.pdf' % item['first_name']
with open(filename, 'wb') as f:
    f.write(r.content)
Please make sure the data and headers you are sending are correct.

Why is my urlFetchApp function failing to successfully login

I'm trying to use Google Apps Script to log in to an ASP.NET website and scrape some data that I typically have to retrieve manually. I've used Chrome Developer Tools to get the correct payload names (TEXT_Username, TEXT_Password, __VIEWSTATE, __VIEWSTATEGENERATOR), and I also got an ASP.NET session ID to send along with my POST request.
When I run my function(s), the response code is 200 if followRedirects is set to false and 302 if it is set to true. Unfortunately, in neither case do the functions successfully authenticate with the website; the HTML returned is that of the login page.
I've tried different header variants and parameters, but I can't seem to log in successfully.
A couple of other points: when I do the login in Chrome using the Developer Tools, the response code appears to be 302 Found.
Does anyone have any suggestions on how I can successfully log in to this site? Do you see any errors in my functions that could be the cause of my problems? I'm open to any and all suggestions.
My GAS functions follow:
function login(cookie, viewState, viewStateGenerator) {
  var payload = {
    "__VIEWSTATE": viewState,
    "__VIEWSTATEGENERATOR": viewStateGenerator,
    "TEXT_Username": "myUserName",
    "TEXT_Password": "myPassword"
  };
  var header = { 'Cookie': cookie };
  Logger.log(header);
  var options = {
    "method": "post",
    "payload": payload,
    "followRedirects": false,
    "headers": header
  };
  var browser = UrlFetchApp.fetch("http://tnetwork.trakus.com/tnet/Login.aspx?", options);
  Utilities.sleep(1000);
  var html = browser.getContentText();
  var response = browser.getResponseCode();
  var cookie2 = browser.getAllHeaders()['Set-Cookie'];
  Logger.log(response);
  Logger.log(html);
}
function loginPage() {
  var options = {
    "method": "get",
    "followRedirects": false
  };
  var browser = UrlFetchApp.fetch("http://tnetwork.trakus.com/tnet/Login.aspx?", options);
  var html = browser.getContentText();
  // Utilities.sleep(500);
  var response = browser.getResponseCode();
  var cookie = browser.getAllHeaders()['Set-Cookie'];
  var regExpGen = new RegExp("<input type=\"hidden\" name=\"__VIEWSTATEGENERATOR\" id=\"__VIEWSTATEGENERATOR\" value=\"(.*)\" \/>");
  var viewStateGenerator = regExpGen.exec(html)[1];
  var regExpView = new RegExp("<input type=\"hidden\" name=\"__VIEWSTATE\" id=\"__VIEWSTATE\" value=\"(.*)\" \/>");
  var viewState = regExpView.exec(html)[1];
  var response = login(cookie, viewState, viewStateGenerator);
  return response;
}
I call the script by running the loginPage() function. This function obtains the cookie (session id) and then calls the login function and passes along the session id (cookie).
Here is what I see in the Google Developer tools Network section when I login using Google's Chrome browser:
Remote Address: 66.92.89.141:80
Request URL: http://tnetwork.trakus.com/tnet/Login.aspx
Request Method: POST
Status Code:302 Found
Request Headers
Accept:text/html,application/xhtml+xml,application/xml;q=0.9,image/webp,*/*;q=0.8
Accept-Encoding:gzip, deflate
Accept-Language: en-US,en;q=0.8
Cache-Control:max-age=0
Connection:keep-alive
Content-Length: 252
Content-Type:application/x-www-form-urlencoded
Cookie: ASP.NET_SessionId=jayaejut5hopr43xkp0vhzu4; userCredentials=username=myUsername; .ASPXAUTH=A54B65A54A850901437E07D8C6856B7799CAF84C1880EEC530074509ADCF40456FE04EC9A4E47D1D359C1645006B29C8A0A7D2198AA1E225C636E7DC24C9DA46072DE003EFC24B9FF2941755F2F290DC1037BB2B289241A0E30AF5CB736E6E1A7AF52630D8B31318A36A4017893452B29216DCF2; __utma=260442568.1595796669.1421539534.1425211879.1425214489.16; __utmc=260442568; __utmz=260442568.1421539534.1.1.utmcsr=(direct)|utmccn=(direct)|utmcmd=(none); __utma=190106350.1735963725.1421539540.1425152706.1425212185.18; __utmc=190106350; __utmz=190106350.1421539540.1.1.utmcsr=(direct)|utmccn=(direct)|utmcmd=(none)
Host:tnetwork.trakus.com
Origin:http://tnetwork.trakus.com
Referer:http://tnetwork.trakus.com/tnet/Login.aspx?
User-Agent:Mozilla/5.0 (Macintosh; Intel Mac OS X 10_9_5) AppleWebKit/537.36 (KHTML, like Gecko) Chrome/40.0.2214.115 Safari/537.36
Form Data
__VIEWSTATE: O7YCnq5e471jHLqfPre/YW+dxYxyhoQ/VetOBeA1hqMubTAAUfn+j9HDyVeEgfAdHMl+2DG/9Gw2vAGWYvU97gml5OXiR9E/9ReDaw9EaQg836nBvMMIjE4lVfU=
__VIEWSTATEGENERATOR:F4425990
TEXT_Username:myUsername
TEXT_Password:myPassword
BUTTON_Submit: Log In
Update: It appears that the website is using an HttpOnly cookie. As a result, I don't think I am capturing the whole cookie, and therefore my header is not correct. I believe I need to set followRedirects to false and handle the redirect and cookie manually. I'm currently researching this process, but welcome input from anyone who has been down this road.
I was finally able to log in to the page successfully. The issue seems to be that UrlFetchApp was unable to follow the redirect. I credit this Stack Overflow post: how to fetch a wordpress admin page using google apps script
This post described the following process that led to my successful login:
Set followRedirects to false
Submit the POST and capture the cookies
Use the captured cookies to issue a GET with the appropriate URL
Here is the relevant code:
var url = "http://myUrl.com/";
var options = {
  "method": "post",
  "payload": {
    "TEXT_Username": "myUserName",
    "TEXT_Password": "myPassword",
    "BUTTON_Submit": "Log In"
  },
  "testcookie": 1,
  "followRedirects": false
};
var response = UrlFetchApp.fetch(url, options);
if (response.getResponseCode() == 200) {
  // Incorrect user/pass combo
} else if (response.getResponseCode() == 302) {
  // Logged-in
  var headers = response.getAllHeaders();
  if (typeof headers['Set-Cookie'] !== 'undefined') {
    // Make sure that we are working with an array of cookies
    var cookies = typeof headers['Set-Cookie'] == 'string' ? [headers['Set-Cookie']] : headers['Set-Cookie'];
    for (var i = 0; i < cookies.length; i++) {
      // We only need the cookie's value - it might have path, expiry time, etc. here
      cookies[i] = cookies[i].split(';')[0];
    }
    url = "http://myUrl/Calendar.aspx";
    options = {
      "method": "get",
      // Set the cookies so that we appear logged-in
      "headers": {
        "Cookie": cookies.join(';')
      }
    };
    ...
I notice that the provided Chrome payload includes BUTTON_Submit: Log In but your POST payload does not. I have found that for POSTs in GAS things go much more smoothly if I explicitly set a submit variable in my payload objects. In any case, if you're trying to emulate what Chrome is doing, this is a good first step.
So in your case, it's a one line change:
var payload = {
  "__VIEWSTATE": viewState,
  "__VIEWSTATEGENERATOR": viewStateGenerator,
  "TEXT_Username": "myUserName",
  "TEXT_Password": "myPassword",
  "BUTTON_Submit": "Log In"
};

Ext Js Filefield issue in Internet Explorer8 and 9

I have a file field in my form.
When I submit the form through the controller, IE shows the security bar saying that I'm trying to download a file to the computer, which is exactly the reverse of what I'm doing (I'm uploading a file).
Everything works great when I submit the form to the server using Firefox, but the problem occurs with IE8 and IE9.
Controller action:
var myForm = Ext.getCmp('uploaddraftpcpPanel').getForm(); // get the basic form
if (myForm.isValid()) { // make sure the form contains valid data before submitting
    myForm.submit({
        headers: {
            enctype: 'multipart/form-data; charset=UTF-8'
        },
        url: 'upload/uploaddraftpcp.action',
        success: function (myForm, action) {
            Ext.Msg.alert('success', "success");
        },
        failure: function (myForm, action) {
            Ext.Msg.alert('Failed', "failed");
        }
    });
} else { // display error alert if the data is invalid
    Ext.Msg.alert('Invalid Data', 'Please correct form errors.');
}
Server side code looks like this:
@RequestMapping(value = "/upload/uploaddraftpcp.action", method = RequestMethod.POST)
public @ResponseBody Map<String, ? extends Object> uploadDraftpcp(HttpServletRequest request) {
    Map<String, Object> modelMap = new HashMap<String, Object>(2);
    try {
        System.out.println("title pcp" + request.getParameter("title"));
        System.out.println("description pcp" + request.getParameter("description"));
        // response.setContentType("text/html");
        modelMap.put("message", "Successfully submitted Form");
        modelMap.put("success", true);
        return modelMap;
    } catch (Exception e) {
        System.out.println(e.toString());
        e.printStackTrace();
        return ExtJSReturn.mapError("Error retrieving data.");
    }
}
In Ext JS, file uploads are not performed using normal Ajax techniques; that is, they are not performed using XMLHttpRequest. Instead, a hidden <form> element containing all the fields is created temporarily and submitted, with its target set to a dynamically generated, hidden <iframe> which is inserted into the document but removed after the return data has been gathered.
The server response is parsed by the browser to create the document for the IFRAME. If the server is using JSON to send the return object, then the Content-Type header must be set to text/html in order to tell the browser to insert the text unchanged into the document body.
Chrome, Firefox, and IE10 and up do just that when they see Content-Type: application/json in the response header. However, in IE8 and IE9, when the server responds to the file upload you see a message along the lines of: "Do you want to open or save this document?"
So, to fix that in IE8 and IE9, just set Content-Type to text/html and you're fine. The modern browsers respond just the same, so you're not breaking anything with that solution.

ASP.NET MVC ignoring Content-Length?

I've been having some problems with missing POST data in ASP.NET MVC, which has led me to investigate how ASP.NET MVC deals with invalid content lengths. I had presumed that a POST with an invalid Content-Length would be ignored by ASP.NET MVC, but this doesn't seem to be the case.
As an example, try creating a new ASP.NET MVC 2 web application and add this action to the HomeController:
public ActionResult Test(int userID, string text)
{
    return Content("UserID = " + userID + " Text = " + text);
}
Try creating a simple form that posts to the above action, run Fiddler, and (using "Request Builder") modify the raw data so that some of the form data is missing (e.g. remove the text parameter). Before executing the request, remember to un-tick the "Fix Content-Length header" checkbox under the Request Builder options, then set a breakpoint on the code above and execute the custom HTTP request.
I find that the request takes a lot longer than normal (30 seconds or so) but, to my amazement, is still processed by the controller's action. Does anyone know if this is expected behavior and, if so, what would you recommend to safeguard against invalid content lengths?
ASP.NET does not ignore the Content-Length request header. Consider the following controller action as an example which simply echoes back the foo parameter:
[HttpPost]
public ActionResult Index(string foo)
{
    return Content(foo, "text/plain");
}
Now let's make a valid POST request to it:
using (var client = new TcpClient("127.0.0.1", 2555))
using (var stream = client.GetStream())
using (var writer = new StreamWriter(stream))
using (var reader = new StreamReader(stream))
{
    writer.Write(
@"POST /home/index HTTP/1.1
Content-Type: application/x-www-form-urlencoded
Host: localhost:2555
Content-Length: 10
Connection: close

foo=foobar");
    writer.Flush();
    Console.WriteLine(reader.ReadToEnd());
}
As expected this prints the response HTTP headers (which are not important) and in the body we have foobar. Now try reducing the Content-Length header of the request:
POST /home/index HTTP/1.1
Content-Type: application/x-www-form-urlencoded
Host: localhost:2555
Content-Length: 5
Connection: close

foo=foobar
This returns a single f in the response body. So, as you can see, an invalid HTTP request can lead to incorrect parsing of the parameters.
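The truncation is easy to reproduce outside ASP.NET. A server that trusts the Content-Length header reads exactly that many bytes of the body before parsing it; the following is a simplified sketch of that behavior, not ASP.NET's actual parser:

```javascript
// Read only contentLength bytes of the raw body, then parse it as
// application/x-www-form-urlencoded, mimicking a server that trusts
// the Content-Length header.
function parseFormBody(rawBody, contentLength) {
  var body = rawBody.slice(0, contentLength); // truncate to declared length
  var params = {};
  body.split('&').forEach(function (pair) {
    if (!pair) { return; }
    var parts = pair.split('=');
    params[decodeURIComponent(parts[0])] = decodeURIComponent(parts[1] || '');
  });
  return params;
}
```

parseFormBody('foo=foobar', 10) yields { foo: 'foobar' }, while parseFormBody('foo=foobar', 5) yields { foo: 'f' }: the same truncation the question observes.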

jQuery Validate with WebForms & WebService

Right now I have a simple form with a single input. I'm trying to use the remote option of jQuery Validate to call my web service, which right now just returns false. The problem I'm having is that when it calls my web service it sends the name of the input, which is some garbage generated by .NET. Is there a way to override this and use the id of the input instead of the name? Here is my jQuery:
$(function () {
    $('form').validate();
    $("#tbSiteName").rules("add", {
        required: true,
        remote: "webservices/webservice.asmx/HelloWorld"
    });
});
Here is my HTML:
<label for="tbSiteName">Name:</label>
<input name="ctl00$MainContent$tbSiteName" type="text" id="tbSiteName" class="required" />
Here is the header info from Chrome: (notice the Query string params)
Accept:application/json, text/javascript, */*
Accept-Charset:ISO-8859-1,utf-8;q=0.7,*;q=0.3
Accept-Encoding:gzip,deflate,sdch
Accept-Language:en-US,en;q=0.8
Connection:keep-alive
Content-Type:application/x-www-form-urlencoded
Cookie:ASP.NET_SessionId=qvupxcrg0yukekkni323dapj
Host:localhost:56803
Referer:http://localhost:56803/Default.aspx
User-Agent:Mozilla/5.0 (Windows; U; Windows NT 6.1; en-US) AppleWebKit/534.10 (KHTML, like Gecko) Chrome/8.0.552.224 Safari/534.10
X-Requested-With:XMLHttpRequest
Query String Parameters
ctl00%24MainContent%24tbSiteName:ts
On the server side I'm getting a 500 Internal Server Error because my method signature doesn't match my POST.
Webservice Code:
[WebMethod]
[ScriptMethod]
public bool HelloWorld(string tbSiteName) {
    return tbSiteName.Length > 5;
}
Thanks for the help.
Unfortunately, to my knowledge, there's no way to get around this when using ASP.NET WebForms server controls such as <asp:textbox>. Although in .NET 4 you have the ClientIDMode="Static" attribute (see here) to disable the auto-generated client IDs, that does not affect the name attribute.
Rick Strahl has suggested in response to comments on his blog post that if you really need predictable names, you should just use an html <input> control:
ClientIDMode only affects the ID not the NAME attribute on the control, so for post back form elements the name will still be a long name as held in UniqueID. This is reasonable though IMHO. If you really need simple names use plain INPUT elements rather than ASP.NET controls especially if you don't rely on POSTBACK assignment of controls anyway to retrieve the values by using Request.Form[].
Have you considered just using a client-side <input> instead of an <asp:textbox runat="server">?
Additionally, have you considered dropping ASP.NET WebForms and using MVC? ;-)
This is the answer I've come up with. I had to change my asp:TextBox to an HTML input in order to get this to work. Also, I had to change the web.config to allow HttpGet for my web service. This is painful at best. I've also lost ViewState on the control by using a standard input. I'd love to switch this over to MVC, but it is just not an option.
Thanks for the help.
$("#tbSiteName").rules("add", {
    required: true,
    remote: function () {
        var r = {
            url: "/webservices/ipmws.asmx/SiteValid",
            type: "POST",
            data: "{'tbSiteName': '" + $('#tbSiteName').val() + "'}",
            dataType: "json",
            contentType: "application/json; charset=utf-8",
            dataFilter: function (data) { return (JSON.parse(data)).d; }
        };
        return r;
    }
});