Passing objects into the URL - URL params / query strings - query-string

I have two websites. One website is going to capture form data and put it into a URL...
let url = `https://xxxxxxxxxx.herokuapp.com/band/${band._id}?toggle=true&eventDate={"eventDate": ${mainDate[0]}, "eventCharge": ${mainDate[1]}}&quoteAdjuster=${sliderValue}`
Some of the information that I collect in the form is stored in objects and arrays.
Is there a way to send objects/arrays in this URL to my website? Currently the whole mainDate object above still comes through as a string.
Thanks!

You could change your objects and arrays into strings on purpose by using JSON.stringify(myObject).
Then the server would just need to use JSON.parse(sentData) to reconstruct the arrays and objects (some types of data don't survive this round trip, so be careful: Date objects, for example, become strings and have to be rebuilt manually).
Also, remember that GET requests have a fairly small practical payload limit, since everything has to fit in the URL (commonly around 8 KB, sometimes less). If those parameters aren't meaningful as part of the URL the user is browsing, switch to a POST request instead.
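A minimal sketch, reusing the names from your question (band, mainDate, sliderValue) and assuming an Express-style server where req.query.eventDate arrives already URL-decoded:
// Client side: serialize the object, then URL-encode it before putting it in the query string.
const eventDate = { eventDate: mainDate[0], eventCharge: mainDate[1] };
const url = `https://xxxxxxxxxx.herokuapp.com/band/${band._id}`
  + `?toggle=true`
  + `&eventDate=${encodeURIComponent(JSON.stringify(eventDate))}`
  + `&quoteAdjuster=${sliderValue}`;

// Server side (Express-style sketch): parse the string back into an object.
const sent = JSON.parse(req.query.eventDate);
// sent.eventDate and sent.eventCharge are usable values again (Dates arrive as strings, though).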

Related

Purpose of tilde delimited values in URL fragment instead of GET params

I came across an unusual URL structure on a site. It looked like this:
https://www.agilealliance.org/glossary/xp/#q=~(infinite~false~filters~(postType~(~'post~'aa_book~'aa_event_session~'aa_experience_report)~tags~(~'xp))~searchTerm~'~sort~false~sortDirection~'asc~page~1)
It seems a widget on the page writes its category, pagination and sort options into these values and reads them back. Does this format for storing data in the URL have a name, or is it an esoteric format someone made up?
What's the purpose of doing this over using regular GET params, or at least using a more conventional format after the fragment?
If you inspect the URL carefully, you'll see that the parameters you describe are placed in the fragment (after the #), meaning they're not sent to the server but are used by the client instead.
In this case, the client (JavaScript) builds them into something like an Elasticsearch query that is then POSTed to the server in order to update the listing you see on your screen.
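A rough sketch of that mechanism (decodeFragment, renderResults and the endpoint are hypothetical stand-ins; the site's actual tilde-decoding code is its own):
// The fragment never reaches the server; only client-side script can read it.
const fragment = window.location.hash.slice(1); // everything after the '#'

// Hypothetical: the site's own routine for decoding its tilde-delimited format.
const state = decodeFragment(fragment);

// The script then turns that state into a search query and POSTs it to the server.
fetch('/search-endpoint', {               // hypothetical endpoint
  method: 'POST',
  headers: { 'Content-Type': 'application/json' },
  body: JSON.stringify(state),
})
  .then(response => response.json())
  .then(renderResults);                    // hypothetical rendering function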

How to list all parameters available to query via API?

As an end user of an API, how can I list all the parameters available to pass in a query? In my case (stats about Age of Empires 2 matches), the website describing the API lists some of them, but it seems there are more available.
To provide more context, I'm extracting the following information:
GET("https://aoe2.net/api/matches?game=aoe2de&count=1000&since=1632744000&map_type=12")
but for some reason the last condition, map_type=12, does nothing (the output is the same as without it). I'm after the list of available parameters, so I can extract what I want.
PS: this post is closely related but does not focus on APIs. Perhaps this makes a difference, as the second answer there seems to suggest.
It is not possible to find out all available (undocumented) query parameters for a query, unless the API explicitly provides such a method or you can find out how the API server processes the query.
For instance, if the API server's code is open source, you could work out from the code how the query is processed, provided you can actually find that code.
The answers in the post you linked are similarly valid for an API site as well as for one that provides content for a web browser (a web server can be both).
Under the hood, there is not necessarily any difference between an API server or a server that provides web content (html) in terms of how queries are handled.
As for the parameter that seemingly has no effect, it appears the API in question does not validate query parameters: you can put arbitrary parameters in the query and the server will simply ignore any it is not specifically programmed to use.
The documentation on their website is all any of us have to go by: https://aoe2.net/#api
You can't just add your own parameters to the URL and expect them to have an effect; the API has to have been coded to handle them.
Your best bet is to just extract as much data as you can by increasing the count parameter, then loop through the JSON response and extract the map_type from there.
JavaScript example:
<script>
// Sample of the JSON array the API returns (trimmed to a few fields).
// The filter below checks game_type; the same approach works for map_type on the real response.
var json = [
  {"match_id":"1953364","lobby_id":null,"game_type":0},
  {"match_id":"1961217","lobby_id":null,"game_type":0},
  {"match_id":"1962068","lobby_id":null,"game_type":1},
  {"match_id":"1962821","lobby_id":null,"game_type":0},
  {"match_id":"1963814","lobby_id":null,"game_type":0},
  {"match_id":"1963807","lobby_id":null,"game_type":0},
  {"match_id":"1963908","lobby_id":null,"game_type":0},
  {"match_id":"1963716","lobby_id":null,"game_type":0},
  {"match_id":"1964491","lobby_id":null,"game_type":0},
  {"match_id":"1964535","lobby_id":null,"game_type":12}
];

for (var i = 0; i < json.length; i++) {
  var obj = json[i];
  if (obj.game_type == 12) {
    // Do something with each matching object.
    console.log(obj);
  }
}
</script>
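If you want to do the same thing against the live endpoint, a sketch with fetch might look like the following; this assumes the response really is a JSON array whose match objects carry a map_type field, which their docs suggest but which isn't verified here:
// Pull a large batch, then filter client-side instead of relying on an undocumented parameter.
fetch('https://aoe2.net/api/matches?game=aoe2de&count=1000&since=1632744000')
  .then(response => response.json())
  .then(matches => {
    const onMap12 = matches.filter(match => match.map_type === 12);
    console.log(onMap12.length + ' matches with map_type 12');
    // Do whatever you need with onMap12 here.
  })
  .catch(error => console.error('Request failed:', error));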

Scraping ASP pages with Excel/VBA

I'm trying to scrape an ASP.NET page with Excel. Unfortunately, the page only returns 50 records at a time, across several pages. Excel's native Web Query module only picks up the first page, and I want all the pages.
Like most (all?) ASP.NET pages, there are a few hidden variables sent back to the server when requesting a new page. The important ones are __VIEWSTATE and __EVENTVALIDATION.
I've written a VBA function that gets the HTML source of the page and scrapes these variables from it.
I've also written an .iqy file, which allows a POST request to be embedded in it. It looks something like this:
WEB
1
http://www.myaspwebsite/search/search_List.aspx
__EVENTTARGET=&__EVENTARGUMENT=&__VIEWSTATE=%2FwEPDwULLTEy[....truncated ..50k characters..]Mhudyk5U6u8%2BBpvxDPN8R4%3D&__EVENTVALIDATION=%2FwEWFQL%2FkN%2FBCgL6g%2B5vAvfY06EOAoic4qIIAome%2Bf4PAuOrjYgIAuKrjYgIAuGrjYgIAuCrjYgIAuerjYgIAt7e34UPAvuL7m8CtuLToQ4CiaTioggCyKX5%2Fg8C4tv1sAgC49v1sAgC4Nv1sAgC4dv1sAgC5tv1sAgC%2Fd7fhQ%2BU8QRtxd7MM4Bpa%2F%2FZC7I64eUh3Q%3D%3D&ctl00_RadMenu1_ClientState=&ctl00%24ContentPlaceHolder1%24NavBar1%24PageNoDropDownList=2&ctl00%24ContentPlaceHolder1%24NavBar1%24btnGo=Go&ctl00%24ContentPlaceHolder1%24NavBar2%24PageNoDropDownList=1
Selection=AllTables
Formatting=None
PreFormattedTextToColumns=True
ConsecutiveDelimitersAsOne=True
SingleBlockTextImport=False
DisableDateRecognition=False
DisableRedirections=False
This .iqy file successfully returns the desired results if the POST query is placed in the file.
I can also use this .iqy page programmatically in VBA and assign the POST query dynamically using QueryTables. However, I get told that my query returned nothing.
I suspect this is because of the length of my argument. The VIEWSTATE alone is about 50k characters. I've tried printing the argument string to a file and it truncates it. However, I can read the same string from a file and use it dynamically successfully.
My questions are: Am I going about this the best way? What limitations should I be aware of when doing this? Also, is there a limit to string size in Excel?
According to Microsoft's documentation on Visual Basic strings (same value applies to VBA strings):
A string can contain from 0 to approximately two billion (2 ^ 31) Unicode characters.
That is more than enough to handle a 50k-character string. A simple way to bypass IDE line-length limits and the Immediate window's printing limit is to write the string into an Excel cell and then read it back into a variable when you need it.

ASP.NET MVC 2 EditModel include Id? Securing Id is not tampered with

I am looking for some best practices when it comes to creating EditModels and updating data in an ASP.NET MVC app. Let's say I have a URL like so: /Post/Edit?Id=25
I am ensuring the user has permission to edit the specific post by Id on the GET request, and doing the same in my POST action in the controller. I am using ValidateAntiForgeryToken.
Questions: Should I include the Id property in my EditModel? If so, should I encrypt it?
The problem is that I can use Firebug to edit the Id hidden input and edit a different post, as long as I have permission to do so. This is not horrible, but it seems wrong.
Any help would be great!
There are several ways to prevent this.
The first is not to send sensitive data to the client at all. Keep the post id in a session variable so the user can never edit it. This may or may not be an option depending on your architecture.
The next approach is to convert the direct reference into an indirect one. For example, instead of sending postIds = {23452, 57232, 91031} to the client to render a drop-down list, send an opaque list {1, 2, 3}. The server alone knows that 1 means 23452, 2 means 57232, and so on. This way, the user can't modify any parameter you don't want him to.
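A minimal sketch of that mapping, assuming an Express-style session object is available (postIdMap and selectedPost are illustrative names, not part of any framework):
// When rendering the drop-down: store the real ids server-side, send only their positions.
const realPostIds = [23452, 57232, 91031];
req.session.postIdMap = realPostIds;          // the opaque values 0, 1, 2 go into the <select>

// When the form is posted back: translate the opaque value, reject anything unexpected.
const index = parseInt(req.body.selectedPost, 10);
const postId = req.session.postIdMap ? req.session.postIdMap[index] : undefined;
if (postId === undefined) {
  res.status(400).send('Invalid selection');  // the value was tampered with or the session expired
}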
The last approach is to include some kind of hash value that acts as an integrity check. For example, suppose you have 3 hidden fields in an HTML page: {userId=13223, postId=923, role=author}. You first sort the field names and then concatenate the name=value pairs to get a string like postId=923&role=author&userId=13223. Then append a server secret to this string and hash (SHA-1 or MD5) the entire thing, e.g. SHA-1('postId=923&role=author&userId=13223&MySuperSecretKey'). Finally, add this hashed value as a hidden parameter. You may also want to add another hidden field called ProtectedParameters=userId,postId,role.
When the next request is made, redo the entire process. If the hash differs, reject the request.
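The question is about ASP.NET, but as a language-neutral sketch of that integrity check, here it is with Node's crypto module (signFields and the field names are illustrative, not an existing API):
const crypto = require('crypto');               // Node's built-in crypto module

const SERVER_SECRET = 'MySuperSecretKey';       // never sent to the client

function signFields(fields) {
  // Sort the field names, concatenate name=value pairs, append the secret, and hash.
  const canonical = Object.keys(fields)
    .sort()
    .map(name => name + '=' + fields[name])
    .join('&');
  return crypto.createHash('sha1')
    .update(canonical + '&' + SERVER_SECRET)
    .digest('hex');
}

// When rendering the form, emit the hash as an extra hidden field alongside the others.
const hidden = { userId: 13223, postId: 923, role: 'author' };
const signature = signFields(hidden);           // goes into a hidden input, e.g. name="signature"

// When the form comes back, recompute the hash over the posted values and compare.
const incoming = { userId: req.body.userId, postId: req.body.postId, role: req.body.role };
if (signFields(incoming) !== req.body.signature) {
  // One of the protected fields was tampered with; reject the request.
}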
Security-wise, I have listed the options in decreasing order. At the same time, it's probably the increasing order of convenience. You have to pick the right mix for your application.
I don't think you should worry about that. If the user does what you said, I suppose you'll know who edited what, so if he edits the wrong post you can always revoke his editing rights...
If you can't trust your users, don't let them edit anything...

ASP.Net getting a list of querystrings from a link

I have a web application that uses URL rewriting. Now I want to set it up so that if the user enters the page with a URL in the rewritten format, all the links use the same format; otherwise they remain as they are (with normal query strings).
Is there a way that I can get a list of query strings that are in the links without parsing the string?
Try HttpUtility.ParseQueryString.
