Paw - A way to make the chosen request/response tab sticky when moving between requests

In Paw, each tab choice I make, e.g. the "URL params" request tab, is saved per request.
So if I switch to another request, the tab jumps to "Description" or whatever was chosen last for that request.
Is there a way to make the UI consistent between requests? It would make things much easier when I'm comparing multiple requests, for example.


How to know if a user clicked a link using its network traffic

I have large traffic capture files that I'm trying to analyze in order to extract statistical features of user behavior.
One of the features I would like to extract is link clicks on specific sites (for example, clicks on pop-ups).
My first idea was to look through the packets' content for hrefs and links, save them all in some data structure with their timestamps, and then iterate over the packets again to search for requests made close in time to when those links appeared.
Something like the following pseudocode (here the packets are sorted by flow, where a flow is IP1 <=> IP2):
for each packet in each flow:
    search the payload for "href", "http://" or "https://"
    save the links found with their timestamps
for each packet in each flow:
    if it is an HTTP request, its URL matches a saved link, and its
    timestamp is close enough to the link's, record it as a click
The problem with this approach is that some links are generated dynamically while the page is loading (using JavaScript, for example) and cannot be found this way.
I have also tried checking the Referer field in the HTTP header and looking for packets that were referred by the relevant sites. This method generates a lot of false positives because of iframes and embedded objects.
It is important to mention that this is not my server; my intention is to build a tool for statistical analysis of user behavior, so I can't add any kind of click tracker to the site.
Does anyone have an idea what I can do to determine, from their network traffic, whether users clicked on links?
Any help will be appreciated!
Thank you
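The two-pass idea described in the question can be sketched in Python. Everything here is illustrative: the packet records are synthetic dictionaries, the field names (`ts`, `payload`, `is_http_request`, `url`) and the `CLICK_WINDOW` threshold are assumptions, and in practice you would extract these fields from a pcap with a library such as scapy or dpkt.

```python
import re

CLICK_WINDOW = 5.0  # seconds; assumed threshold, tune for your data

HREF_RE = re.compile(r'href="(https?://[^"]+)"')

def extract_links(packets):
    """Pass 1: collect every href seen in packet payloads with its timestamp."""
    links = []
    for pkt in packets:
        for url in HREF_RE.findall(pkt.get("payload", "")):
            links.append((pkt["ts"], url))
    return links

def match_clicks(packets, links):
    """Pass 2: an HTTP request whose URL appeared shortly before is a likely click."""
    clicks = []
    for pkt in packets:
        if not pkt.get("is_http_request"):
            continue
        for ts, url in links:
            if pkt["url"] == url and 0 <= pkt["ts"] - ts <= CLICK_WINDOW:
                clicks.append((pkt["ts"], url))
                break
    return clicks

# Synthetic flow: a page containing a link, then a request for that link 2 s later.
packets = [
    {"ts": 10.0, "payload": '<a href="http://example.com/offer">deal</a>'},
    {"ts": 12.0, "is_http_request": True, "url": "http://example.com/offer", "payload": ""},
]
links = extract_links(packets)
print(match_clicks(packets, links))  # → [(12.0, 'http://example.com/offer')]
```

As the question notes, this misses links generated dynamically by JavaScript; it only catches anchors visible in the raw payload.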

Parsing Web page with R

This is my first time posting here. I do not have much experience (less than a week) with HTML parsing/web scraping, and I am having difficulty parsing this webpage:
https://www.jobsbank.gov.sg/
What I want to do is parse the content of every available job listing on the site.
My approach:
1. Click search on an empty search bar, which returns all records. The resulting web page is: https://www.jobsbank.gov.sg/ICMSPortal/portlets/JobBankHandler/SearchResult.do
2. Provide the search-result web address to R and identify all the job listing links.
3. Supply the job listing links to R and ask R to go to each listing and extract its content.
4. Look for the next page and repeat steps 2 and 3.
However, the problem is that the web address I got in step 1 does not take me to the search result page. Instead, it redirects me back to the home page.
Is there any way to overcome this problem?
Supposing I manage to get the web address of the search results, I intend to use the following code:
library(RCurl)  # provides getURLContent
base_url <- "https://www.jobsbank.gov.sg/ICMSPortal/portlets/JobBankHandler/SearchResult.do"
base_html <- getURLContent(base_url, cainfo = "cacert.pem")[[1]]
links <- strsplit(base_html, "a href=")[[1]]  # crude split on anchor tags
1. Learn to use the web developer tools in your web browser (hint: use Chrome or Firefox).
2. Learn about HTTP GET and HTTP POST requests.
3. Notice that the search box sends a POST request.
4. See what the form data parameters are (they seem to be {actionForm.checkValidRequest}:YES and {actionForm.keyWord}:my search string).
5. Construct a POST request with that form data using one of the R HTTP packages.
6. Hope the server doesn't care about cookies; if it does, get the cookies and feed them back to it.
Hence you end up using postForm from the RCurl package:
p <- postForm(url, .params = list(checkValidRequest = "YES", keyword = "finance"))
And then just extract the table from p. Getting the next page involves constructing another form request with a bunch of different form parameters.
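If you are not tied to R, the same POST can be assembled with Python's standard library. This is only a sketch: the form field names are the ones visible in the browser's developer tools, and the request is built but not sent, so verify both against the live site yourself.

```python
from urllib.parse import urlencode
from urllib.request import Request

url = "https://www.jobsbank.gov.sg/ICMSPortal/portlets/JobBankHandler/SearchResult.do"

# Form fields as observed in the developer tools (assumed; verify in your browser).
form = {
    "{actionForm.checkValidRequest}": "YES",
    "{actionForm.keyWord}": "finance",
}

body = urlencode(form).encode()
req = Request(
    url,
    data=body,
    headers={"Content-Type": "application/x-www-form-urlencoded"},
)

print(req.get_method())  # → POST: a Request with a data body defaults to POST
# urllib.request.urlopen(req) would actually send it; for the cookie case
# mentioned above, wrap the call in an opener built with
# http.cookiejar.CookieJar and urllib.request.HTTPCookieProcessor.
```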
Basically, a web request is more than just a URL, there's all this other conversation going on between the browser and the server involving form parameters, cookies, sometimes there's AJAX requests going on internally to the web page updating parts.
There are a lot of "I can't scrape this site" questions on SO, and although we could spoon-feed you the precise answer to this exact problem, I feel the world would be better served if we just told you to go learn about the HTTP protocol, forms, and cookies; then you'll understand how to use the tools better.
Note that I've never seen a job site or a financial site that likes you scraping its content. Although I can't see a warning about it on this site, that doesn't mean there isn't one, and I would be careful about breaking the Terms and Conditions of Use. Otherwise you might find all your requests failing.

Provide 2 http responses, first will display loading, second will provide computed data

I have an application that takes a long time to compute data and produce results. Ideally, I would want to display a loading symbol or something similar immediately when the page is requested, and then display the results once the computation is complete. My question is how to go about this: traditionally, 95% of the time is spent getting the response ready, so a loading icon embedded in that response will not be seen until the response has been served completely to the user, by which point it is moot. Rendering the response is not the problem; the time-consuming part is "getting the data".
BTW, I am using java servlets + freemarker.
TL;DR Provide intermediate response until the real response is ready to be served.
AJAX. That's one of its best use cases. You show the page, make an asynchronous request for the data, show the loading screen, and when the request comes back you show the data.

Why is the GET method faster than POST in HTTP?

I am new to web programming and curious about the GET and POST methods of sending data from one page to another.
It is said that the GET method is faster than POST, but I don't know why.
One reason I could find is that GET can take only 255 characters?
Is there any other reason? Could someone please explain?
It's not so much about speed. There are plenty of cases where POST is more applicable. For example, search engines will index GET URLs, and browsers can bookmark them and make them show up in history. As a result, taking actions like modifying a database based on a GET request can be harmful, since bots may also traverse the URL.
The other case is security. If you send credentials using GET, they will end up in browser history and server log files.
There are several misconceptions about GET and POST in HTTP. There is one primary difference: GET must be safe and idempotent, while POST does not have to be. Safe means that GETs cause no side effects, i.e. I can send a GET to a web application as many times as I want (think hitting Ctrl+R or F5 repeatedly) and the requests will do no harm; idempotent means repeating the same request leaves the server in the same state.
I cannot assume that with POST; a POST may change data on the server. For example, if I order an item on the web, the item should be added with a POST because state changes on the server: the number of items I've ordered has increased by 1. If I do this with a POST and hit refresh, the browser warns me; if I do it with a GET, the browser simply resends the request.
On the server, GET vs POST is pure convention, i.e. it's up to me as a developer to ensure that I code the POST handler so that repeating the call does no harm. There are various ways of doing this, but that's another question.
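One common convention for making a POST handler safe to repeat is an idempotency token: the client sends a unique token with each submission, and the server ignores repeats of the same token. A minimal sketch, with hypothetical names and an in-memory store standing in for a real database:

```python
processed_tokens = set()  # in production this would live in a shared store

def handle_order_post(token, item):
    """Process the order only once per client-supplied idempotency token."""
    if token in processed_tokens:
        return "duplicate ignored"
    processed_tokens.add(token)
    return f"ordered {item}"

print(handle_order_post("abc123", "book"))   # → ordered book
print(handle_order_post("abc123", "book"))   # → duplicate ignored (refresh/resubmit)
```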
To actually answer the question: whether I use GET or POST to perform the same task, there is no performance difference.
You can read the RFC (http://www.w3.org/Protocols/rfc2616/rfc2616.html) for more details.
Looking at the HTTP protocol, POST and GET should be equally easy and fast to parse, so I would argue there is no performance difference.
Take a look at the raw HTTP headers
http GET
GET /index.html?userid=joe&password=guessme HTTP/1.1
Host: www.mysite.com
User-Agent: Mozilla/4.0
http POST
POST /login.jsp HTTP/1.1
Host: www.mysite.com
User-Agent: Mozilla/4.0
Content-Length: 27
Content-Type: application/x-www-form-urlencoded
userid=joe&password=guessme
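The structural difference shown in the two raw requests above, parameters in the URL for GET versus in the body for POST, can be checked with a few lines of Python (the paths and field names are just the ones from the example):

```python
from urllib.parse import urlencode, urlsplit, parse_qs

params = {"userid": "joe", "password": "guessme"}
encoded = urlencode(params)

# GET: the parameters ride in the URL itself.
get_line = f"GET /index.html?{encoded} HTTP/1.1"

# POST: the same bytes move into the request body.
post_body = encoded
post_headers = {
    "Content-Type": "application/x-www-form-urlencoded",
    "Content-Length": str(len(post_body)),
}

# Either way the server recovers identical parameters, so parsing cost is the same.
query = urlsplit(get_line.split()[1]).query
assert parse_qs(query) == parse_qs(post_body)
print(post_headers["Content-Length"])  # → 27, matching the Content-Length above
```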
From my point of view, performance should not be considered when comparing GET and POST.
You should think of GET as "a place to go", and POST as "doing something". For example, a search form should be submitted using GET because the search result page is a "place" and the user will want to bookmark it or retrieve it from their history at a later date. If you submit the form using POST the user can only recreate the page by submitting the form again. On the other hand, if you were to perform an action such as clicking a delete button, you would not want to submit this with GET, as the action would be repeated whenever the user returned to the URL.
Just my few cents from 2016.
I am creating a simple message system. At first I used POST to receive new alerts. In jQuery I had:
$.post('/a/alerts', 'stamp=' + STAMP, function(result)
{
});
And in PHP I used $_POST['stamp']. Even from localhost I got 90-100 ms for every request like this.
I simply changed:
$.get('/a/alerts?stamp=' + STAMP, function(result)
{
});
and in PHP switched to $_GET['stamp']. So a little less than 1 minute of changes. Now every request takes 30-40 ms.
So GET can be twice as fast as POST. Of course not always, but for small amounts of data I get the same result every time.
GET can appear slightly faster because the values are sent in the URL's query string, unlike POST, where the values are sent in the request body in the format the content type specifies.
Usually the content type is application/x-www-form-urlencoded, so the request body uses the same format as the query string:
parameter=value&also=another
When you use a file upload in the form, you use the multipart/form-data encoding instead, which has a different format. It's more complicated.
I agree with the other answers, but it was not mentioned that GET requests can be cached while POST requests are not. I think this is the main reason some GET requests perform faster.
(Of course, this means that sometimes no request is actually sent. So it's not really the GET request that is faster, but your browser's cache.)
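A toy illustration of that point, with a stand-in for the network round trip (all names here are hypothetical): repeated identical GETs can be answered from a local cache without touching the network, while POST semantics force every call over the wire.

```python
cache = {}
network_calls = 0

def fake_network_fetch(url):
    """Stand-in for a real HTTP round trip."""
    global network_calls
    network_calls += 1
    return f"body of {url}"

def get(url):
    # GET responses may be cached, so repeats can be served locally.
    if url not in cache:
        cache[url] = fake_network_fetch(url)
    return cache[url]

def post(url, body):
    # POST is never cached: every call goes to the network.
    return fake_network_fetch(url)

get("http://example.com/a")
get("http://example.com/a")          # served from cache, no network call
post("http://example.com/a", "x=1")
post("http://example.com/a", "x=1")
print(network_calls)  # → 3: one GET over the wire, two POSTs
```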
HTTP Methods: GET vs. POST: http://www.w3schools.com/tags/ref_httpmethods.asp
POST makes the request somewhat larger (extra headers plus the body), but the difference ought to be negligible, so I don't see why this should be a concern.
Just bear in mind that the proper way to speak HTTP is to use GET only for retrieving data and POST for actions that change data. You don't have to, but you also don't want a case where Google bots can, for example, insert, delete or manipulate data that was only meant for a human to handle, simply because they follow the links they find.

Possibility of Ajax Response Mix-up

Suppose I have an ASP.NET page where a customer can select a product from a drop-down list; on this change event, the corresponding price, quantity, etc. fields are set to the appropriate values. These values are obtained from a server-side ASP.NET page using jQuery's "$.post(....)" method. On the same page there is another section, which shows live market statistics for the products. This section obtains the live market values by making a request to the server-side page at 20-second intervals, controlled by a timer.
Now suppose this timer goes off and a request is being processed at the server. At the same time, the customer selects a different product from the drop-down list, which fires another ajax request.
Is there any chance that the responses from these two different requests can get mixed up? I mean, could the response intended for the live update section be treated as the response for the product catalog section? If so, before making an ajax request, how can I tell whether another request is being processed at the server, and if necessary abort it?
I don't want to use ASP.NET AJAX for this, because it generates a lot of unnecessary script/data, which increases the page size, and the customers of this site have a bandwidth of 2-4 kbps... :|
It depends on how the code is written. As long as you aren't using globals in your JS, you should be fine.
The first example at http://www.jibbering.com/2002/4/httprequest.html uses a global (xmlhttp); don't do that. Pass variables around instead, and make use of the this keyword inside your onreadystatechange callback function.
