I have a really buggy web application at work. In order to avoid using its interface, I want something that will save the HTTP requests I send with it, and enable me to resend them whenever I want. Do you know of anything that does that? Maybe there is an add-on for Firefox (I searched, but didn't find one)?
I need to be able to do this on Linux.
You can use Fiddler to intercept HTTP requests and responses between the browser and the server.
Fiddler also supports handcrafting and sending HTTP requests with its Request Builder feature.
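If you'd rather script the resending yourself, here is a rough sketch using Python's requests library. The JSON file layout (method/url/headers/body) and the file name are illustrative assumptions, not a standard capture format produced by any particular tool:

    import json
    import requests

    # Replay an HTTP request previously saved to disk. The JSON layout
    # (method/url/headers/body) is an illustrative assumption, not a
    # standard format.
    def replay(path):
        with open(path) as f:
            saved = json.load(f)
        resp = requests.request(
            method=saved["method"],
            url=saved["url"],
            headers=saved.get("headers", {}),
            data=saved.get("body"),
        )
        print(resp.status_code, resp.reason)
        return resp

    replay("saved_request.json")  # hypothetical file name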
Try iMacros for Firefox.
According to the plugin description:
"Automate Firefox. Record and replay repetitious work. If you love the Firefox web browser, but are tired of repetitive tasks like visiting the same sites every days, filling out forms, and remembering passwords, then iMacros for Firefox is the solution you’ve been dreaming of!"
https://addons.mozilla.org/en-US/firefox/addon/3863/
When typing a web link into the Safari URL field, the browser attempts to prefetch every link it has previously seen, both GET and POST.
This causes every link the server supports that is listed in the dropdown as a possible completion to be activated, which is problematic. For example, if a web site's authentication includes an /auth/logout link for logging out, that link can be activated simply by appearing in the dropdown, logging the user out unintentionally.
Many browsers send a specific header (e.g. 'Purpose: Prefetch' in Chrome) that allows the server side to filter prefetch/preload requests (e.g. return a 503 - sketched below), but Safari doesn't seem to send any distinguishing header field. It also seems to try to prefetch POST requests, which seems very broken to me. GET requests are notionally at least idempotent, but POST requests are supposed to be understood as data-changing.
Has anyone got a solution to this? Please don't suggest that the browser preload feature can be turned off by the end user - that ISN'T a solution from a service delivery perspective.
Has anyone got an explanation as to why browsers would do this and NOT signal the purpose in a header field? (I get why prefetching is a useful UX capability, but not why it's useful while typing URLs, especially for URLs already downloaded before and thus capable of returning prefetching metadata that would allow a server to selectively disable the capability where appropriate.) From what I can tell, this kind of functionality started to appear with header fields included, but some browsers have since removed this signal. Why? It seems dreadfully broken to me.
Thanks.
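For reference, the header-based filtering described in the question looks roughly like this on the server side. This is a stdlib-only Python sketch; the header names are the ones various browsers are reported to send, and, as the question notes, Safari's address-bar preload may send none of them, so this cannot catch it:

    from http.server import BaseHTTPRequestHandler, HTTPServer

    # Headers various browsers are reported to use to flag
    # prefetch/preload traffic (assumed list, not exhaustive).
    PREFETCH_HEADERS = ("Purpose", "X-Purpose", "Sec-Purpose", "X-Moz")

    class Handler(BaseHTTPRequestHandler):
        def do_GET(self):
            if any(self.headers.get(h) for h in PREFETCH_HEADERS):
                # Reject the prefetch without triggering side effects.
                self.send_response(503)
                self.end_headers()
                return
            self.send_response(200)
            self.send_header("Content-Type", "text/plain")
            self.end_headers()
            self.wfile.write(b"hello\n")

    HTTPServer(("", 8000), Handler).serve_forever()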
I need to know which requests a webpage sends. Basically, the site I visit calls another service/API/URL, receives the data (probably via JavaScript), and shows it to me. Can I see all the calls it makes?
Edit: concrete example:
From this site (http://www.flickriver.com/lenses/nikon/) you can choose a lens; at that moment, the page sends a request to Flickr and gets all the data. But in Chrome Developer Tools I could not see this request.
Here is a screenshot of the GET requests. I have looked through them but could not see any request to Flickr.
The first is the request to the page itself, and the sixth is already the picture request, where the picture is requested by its ID. So one of the four requests in between should be the request to the external source that returns the picture ID - or am I missing something?
And what if the backend makes this request? Would I still see it in developer tools?
No, of course you cannot see the calls made by one server to another server. Why would you expect to be able to? Those calls have nothing to do with the browser; the browser knows only about requests that it itself initiated, and devtools can only report on requests made by the browser. If there were in fact some way to spy on the requests one server makes to another, it would be a gaping security hole.
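That said, if you control the backend, you can see those calls in the backend's own logs. A minimal sketch, assuming a Python backend that uses the requests library (the URL is purely illustrative):

    import logging
    import requests

    # urllib3 (used internally by requests) logs every outgoing
    # connection and request line at DEBUG level.
    logging.basicConfig(level=logging.DEBUG)

    # Purely illustrative outbound call; in a real backend this would
    # be wherever your server-side code talks to the external API.
    requests.get("https://api.flickr.com/services/rest/")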
I often have to go into other people's work to make modifications for short-term contracts, and a lot of the time I have to deal with security warnings thrown by IE because something is including HTTP content (maybe an image, or CSS, or whatever) on an HTTPS-secured page.
I was just curious whether there is a well-known program or service that will scan a URL and report exactly which resources on a page come from HTTP instead of HTTPS.
I use Fiddler, but for reasons having to do with my own inadequacies, I find the program difficult at times and am unable to zero in on the offending content in a timely manner.
Any advice from the true pros?
Using Fiddler:
In main menu > Tools > Fiddler options, tab HTTPS, uncheck Capture HTTPS CONNECTs. Then, in main menu > Rules, check Hide HTTPS CONNECTs.
This way, the only thing you'll see in your Fiddler capture will be the HTTP requests and responses (without the HTTPS requests or CONNECTs getting in the way).
I would load up the page in Firefox and use Firebug's Net panel to examine all the resources that the page loads.
There's no 'set in stone' way as far as I know, but the easiest way I know of is to use a tool such as Opera Dragonfly or Chrome's Web Inspector, open the 'Network' tab, and see where the resources are being loaded from. Depending on the tool, you can sometimes sort the list alphabetically, which makes it easy to tell the http:// entries from the https:// ones.
Also, as already mentioned, you could just search the source for http:// (a scripted version of that search is sketched below).
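If you want to script that search, here is a minimal sketch in Python using only the standard library. It only sees the initial HTML, so it will miss resources injected by JavaScript:

    import sys
    import urllib.request
    from html.parser import HTMLParser

    # Minimal mixed-content scanner: fetch an HTTPS page and list every
    # resource referenced over plain http://.
    class MixedContentParser(HTMLParser):
        # Attributes that commonly reference sub-resources.
        URL_ATTRS = {"src", "href", "data", "poster", "action"}

        def __init__(self):
            super().__init__()
            self.insecure = []

        def handle_starttag(self, tag, attrs):
            for name, value in attrs:
                if name in self.URL_ATTRS and value and value.startswith("http://"):
                    self.insecure.append((tag, value))

    url = sys.argv[1]  # e.g. https://example.com/
    html = urllib.request.urlopen(url).read().decode("utf-8", errors="replace")
    parser = MixedContentParser()
    parser.feed(html)
    for tag, value in parser.insecure:
        print(f"<{tag}> loads insecure resource: {value}")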
I'm wondering if there are any utilities out there that will display the request/response headers sent/received by my web browser during a browsing session. Does anyone know of anything useful?
I'm familiar with the Modify Headers add-on for Firefox 4 and the HTTP Client utility for Mac OS X, but neither of these does quite what I'm looking for.
I suspect Fiddler might help here - it captures all of the traffic, including headers, content, etc. It captures IE and Chrome traffic as soon as it starts; Firefox needs to be configured to use it as a web proxy.
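If you just need the headers for a single URL rather than a whole browsing session, a quick Python sketch (assuming the requests package is installed) can stand in:

    import requests

    # Fetch one URL and dump the headers actually sent and received.
    resp = requests.get("https://example.com/", allow_redirects=False)

    print("--- request headers ---")
    for name, value in resp.request.headers.items():
        print(f"{name}: {value}")

    print("--- response headers ---")
    for name, value in resp.headers.items():
        print(f"{name}: {value}")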
What is the easiest way to obtain a complete log of all HTTP/HTTPS requests issued by Firefox during a browser session?
The question is programming-related insofar as a log of all HTTP/HTTPS requests issued is a great troubleshooting tool when developing webapps.
Wireshark is the most complete tool for logging all HTTP activity.
The Fiddler tool might be easier to get started with, and it comes with built-in HTTPS decryption.
The TamperData add-on for Firefox is very good for changing requests ad hoc.
Firefox works with Fiddler.
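For something scriptable, mitmproxy (not mentioned above) can log every request Firefox makes once Firefox is configured to use it as a proxy. A minimal logging-addon sketch; save it as log_requests.py and run mitmdump -s log_requests.py:

    from mitmproxy import http

    # Runs inside mitmdump; point Firefox at 127.0.0.1:8080 as its
    # HTTP/HTTPS proxy and install mitmproxy's CA certificate so
    # HTTPS can be decrypted.
    def request(flow: http.HTTPFlow) -> None:
        # Append one line per request to a plain-text log.
        with open("requests.log", "a") as log:
            log.write(f"{flow.request.method} {flow.request.pretty_url}\n")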
LiveHttpHeaders is a great add-on for Firefox. It traces all the requests along with header information and POST data. You can save the log to a file if you want to.
Safari has a built-in Activity monitor (Window > Activity) that lists all HTTP requests, I believe.