This is actually a follow-up to my previous question (link).
I've created the HttpHandler and it works fine for now; I'll add flexibility by using the query string and session to point the post I'm making in the right direction.
The next question is as follows.
Now that I have the old page iframed as it should be, there's still the trouble of handling the postbacks (or actions) these pages trigger.
Every button action (ASP form post) refers to a page that is not there (it's on the other server from which I am importing functionality).
I've tried using a URL mapping to the other server, but I get an error telling me the external link is not a valid virtual directory, so I discarded this option.
Is there any way to keep the functionality working inside the iframe?
Please ask for clarification if you need it.
I got a solution from a colleague.
Before passing the response string from the handler to the iframe, I use a string.Replace to adjust the URLs in the old site's markup. This way they point back to the old server and everything works again :)
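For reference, a minimal sketch of that approach; the old server's address, the query string parameter, and the handler name are made up for illustration:

```csharp
using System.Net;
using System.Web;

// Hypothetical handler: fetches a page from the old server, rewrites its
// root-relative URLs to absolute ones, and writes the result into the iframe.
public class LegacyPageHandler : IHttpHandler
{
    // Assumption: the old site lives at this address.
    private const string OldSiteRoot = "http://oldserver.example.com/";

    public void ProcessRequest(HttpContext context)
    {
        string page = context.Request.QueryString["page"] ?? "Default.aspx";

        using (WebClient client = new WebClient())
        {
            string html = client.DownloadString(OldSiteRoot + page);

            // Point form actions, links, and resources back at the old
            // server so postbacks keep working inside the iframe.
            html = html.Replace("action=\"/", "action=\"" + OldSiteRoot)
                       .Replace("href=\"/", "href=\"" + OldSiteRoot)
                       .Replace("src=\"/", "src=\"" + OldSiteRoot);

            context.Response.ContentType = "text/html";
            context.Response.Write(html);
        }
    }

    public bool IsReusable { get { return false; } }
}
```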
The overall goal is to perform a search on the following web page, http://www.cma-cgm.com/eBusiness/Tracking/Default.aspx, with a container value of CMAU1173561. I have tried two approaches: the PHP extension cURL and Python's mechanize. The PHP approach involves performing a POST submit using the input fields found on the page (note: these are really ugly on the ASP.NET page). The returned page does not contain any of the search results. The second approach involves using Python's mechanize module. In this approach I load the page, select the form, then change the text field ctl00$ContentPlaceBody$TextSearch to the container value. When I load the response again, no search results.
I am at a dead end. Any help would be appreciated, because as it stands my next step is to become an ASP.NET expert, which I'd prefer not to.
The source of that page is pretty scary (giant viewstate, tables all over the place, inline CSS, styles that look like they were copied from Word).
Regardless... an ASP.NET form still passes the same raw data to the server as any other form (though it is abstracted away from the developer).
It's very possible that you are missing the cookies that go along with the request. If the search page (or any part of the site) uses session state, the ASP.NET session cookie must be included in the request. You can tell it by its name (it contains "asp.net" and "session").
I assume that you have used a tool like Firebug or Chrome to view the complete outgoing request when the page is submitted. From my quick test, it looks like the request may be performed with a GET, not a POST. I submitted a form, looked at the request, and pasted the URL into a new browser window.
Example: http://www.cma-cgm.com/eBusiness/Tracking/Default.aspx?ContNum=CMAU1173561&T=57201202648
This may be all you need to do.
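If you do need to script it, here's a rough C# sketch of the same GET, carrying cookies across requests in case the session matters. The ContNum and T parameter names are copied from the URL above; the T value may be generated per visit, so you might need to scrape it from the page first:

```csharp
using System;
using System.IO;
using System.Net;

class TrackingCheck
{
    static void Main()
    {
        // One cookie container for both requests, so any ASP.NET session
        // cookie set by the site gets sent along with the search.
        CookieContainer cookies = new CookieContainer();

        // First load the page normally so the session cookie gets set.
        HttpWebRequest first = (HttpWebRequest)WebRequest.Create(
            "http://www.cma-cgm.com/eBusiness/Tracking/Default.aspx");
        first.CookieContainer = cookies;
        first.GetResponse().Close();

        // Then issue the search as a GET, as observed in the browser.
        HttpWebRequest search = (HttpWebRequest)WebRequest.Create(
            "http://www.cma-cgm.com/eBusiness/Tracking/Default.aspx" +
            "?ContNum=CMAU1173561&T=57201202648");
        search.CookieContainer = cookies;

        using (WebResponse response = search.GetResponse())
        using (StreamReader reader = new StreamReader(response.GetResponseStream()))
        {
            Console.WriteLine(reader.ReadToEnd()); // should include the results
        }
    }
}
```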
In my ASP.NET 2005 app, I would like to conceal the app structure from the user. Currently, the end user can learn intimate details of my web app as they navigate and watch the URL change. I don't want the end user to know about my application's structure. I would like the browser URL to not change, if possible. Please advise.
Thanks,
E.A.
URL rewriting is the only option that can provide any kind of real concealment.
Just moving the requests to AJAX or to frames means anyone (well, more advanced users) can still see those requests being fired, just not in the address bar.
The simplest solution is to use frames: a single frame that holds your application at 100% by 100%. The URL will not change, though the underlying URL can still be seen via "View Frame Info"; however, only advanced users will even figure that out.
In your pages, make sure that they are contained inside the holding frame.
A couple of possibilities:
1) Use AJAX to power everything. This will mean that the user never leaves the home page.
2) Use postbacks to power everything. In this approach, you'd have all those pages be user controls which you programmatically hide or show.
3) URL rewriting (especially if this is ASP.NET 3.0 or later); see the sketch below.
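For option 3, a minimal sketch of URL rewriting in Global.asax; the /app/view/... URL scheme and the page name are invented for illustration:

```csharp
// Global.asax.cs: map an opaque public URL onto the real page so the
// application's structure never appears in the address bar.
protected void Application_BeginRequest(object sender, EventArgs e)
{
    string path = Request.Path; // e.g. "/app/view/123"

    if (path.StartsWith("/app/view/"))
    {
        string id = path.Substring("/app/view/".Length);
        // Serve the real page internally; the browser URL is unchanged.
        Context.RewritePath("~/Pages/Detail.aspx?page_id=" + id);
    }
}
```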
My site uses URL parameters to dynamically load .ascx files into a single main .aspx. So if I get page_id=123 on the query string, I load the corresponding .ascx. The URL changes, but only the query string; the domain part remains the same.
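A rough sketch of that pattern; the control names and paths are illustrative, and real code should whitelist the ids rather than trust the query string:

```csharp
// Code-behind of the single main .aspx: load the user control named
// by page_id into a placeholder.
protected void Page_Load(object sender, EventArgs e)
{
    string pageId = Request.QueryString["page_id"];
    if (string.IsNullOrEmpty(pageId))
        pageId = "home";

    Control content = LoadControl("~/Controls/" + pageId + ".ascx");
    contentHolder.Controls.Add(content); // contentHolder is an <asp:PlaceHolder>
}
```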
If you want the URL to remain precisely the same at all times, then frames (per Oded) or AJAX (per Stephen) are probably the only ways to do it.
Short answer: use URL encryption
A simple, straightforward article: http://devcity.net/PrintArticle.aspx?ArticleID=47
And another article: https://web.archive.org/web/20210610035204/http://aspnet.4guysfromrolla.com/articles/083105-1.aspx
HTH
Which one is better when we want to redirect to a new page in ASP.NET: using a LinkButton and then Response.Redirect, or using an HTML link?
Depends on your needs:
Use the <a> tag if there's no need for you to do anything with the form/page you're redirecting from.
Use Response.Redirect if you need information from the form before moving on to the next page, for example to update session state or store intermediate results.
Well, an anchor tag eliminates a round trip to the server.
If you are just linking to another page with a static URL, use an HTML anchor link. It will have better performance and only requires code in one place: the page.
If you need to perform operations on the server before the redirect, including manipulating the URL (e.g. dynamically creating query string parameters), then use a server control (Button, HyperLink, LinkButton).
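For the server-control case, a small sketch with invented control names:

```csharp
// Code-behind for a LinkButton: do server-side work first (e.g. store
// intermediate state), build the URL dynamically, then redirect.
protected void SearchLink_Click(object sender, EventArgs e)
{
    Session["LastQuery"] = queryBox.Text;

    string url = "Results.aspx?q=" + Server.UrlEncode(queryBox.Text);
    Response.Redirect(url);
}
```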
I would use the anchor tag. It seems that with the redirect you are doing extra work with the round trip for not much gain beyond using the LinkButton control (at least this is what I'm surmising from your question; there are valid reasons to use it). Also (not that this will likely make a difference in your case), it causes an additional status code to be sent to the browser (302, telling the client application something is potentially amiss). If you are working on a highly secure site and/or have non-browser applications accessing the page to pull information, this could be a problem.
Actually, you have a third option which might suit your needs. Depending on the way your pages are set up and what you need to do, you can also use Server.Transfer.
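A minimal sketch, assuming a button click handler and a hypothetical Page2.aspx:

```csharp
// Server.Transfer hands execution to another page on the server with
// no extra round trip; the browser's address bar keeps the original URL.
protected void NextButton_Click(object sender, EventArgs e)
{
    // 'true' preserves the current form and query string for the target.
    Server.Transfer("Page2.aspx", true);
}
```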
If you want to use a button instead of a link just for the way it looks, you can use JavaScript to change the page's location in the button's onclick, also saving you a trip to the server.
A site has hundreds of pages, following a certain sitemap. A user can navigate to page2.aspx from page1.aspx. But if the user goes to page2.aspx directly, say through a bookmarked URL, the user should be redirected to page1.aspx.
Edit: I don't want to go in and add code to every page that needs to fulfill this requirement.
Note: This is not a cross-page postback scenario.
You might consider something based on Workflow, such as this: http://blogs.msdn.com/mwinkle/archive/2007/06/07/introducing-the-pageflow-sample.aspx
The WCSF team also included a pageflow application block that you can use as a standalone add-on to your application.
I guess you could check the referrer, and if there isn't one, or it isn't page1.aspx, then redirect back to page1.aspx.
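A minimal sketch of that check in page2.aspx's code-behind; note that UrlReferrer can be null and is easily spoofed, so treat it as a convenience rather than security:

```csharp
// page2.aspx code-behind: bounce the user to page1.aspx unless they
// arrived from it.
protected void Page_Load(object sender, EventArgs e)
{
    Uri referrer = Request.UrlReferrer;
    if (referrer == null ||
        !referrer.AbsolutePath.EndsWith("page1.aspx", StringComparison.OrdinalIgnoreCase))
    {
        Response.Redirect("page1.aspx");
    }
}
```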
As another answerer mentioned, you could use the Referer header, but that can be faked by the client.
Since you don't want to modify each page, you could do something with an IHttpModule. Assuming you have some way of describing the valid page navigations, you could do something like this in the BeginRequest handler:
Check the session for a list of valid pages (using a default list for first visit if none are in the session).
If this request is for an invalid page, redirect to the place the user should be.
Based on this request, set up the list of valid pages and the redirect target in the session so they're ready for the next request.
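A sketch of such a module; how the valid navigations are described is up to you, and the hard-coded map here is purely an assumption. One caveat: session state isn't available yet in BeginRequest, so the sketch hooks PostAcquireRequestState instead:

```csharp
using System;
using System.Collections.Generic;
using System.Web;

// Centralised navigation enforcement: no per-page code required.
// Register in web.config under <httpModules> (or <modules> in IIS7+).
public class PageFlowModule : IHttpModule
{
    public void Init(HttpApplication app)
    {
        // Session state isn't ready in BeginRequest, so hook a later event.
        app.PostAcquireRequestState += OnPostAcquireRequestState;
    }

    private void OnPostAcquireRequestState(object sender, EventArgs e)
    {
        HttpContext ctx = ((HttpApplication)sender).Context;
        if (ctx.Session == null || !ctx.Request.Path.EndsWith(".aspx"))
            return;

        // Valid pages for this user, defaulting to the entry page.
        List<string> valid = ctx.Session["ValidPages"] as List<string>;
        if (valid == null)
            valid = new List<string> { "/page1.aspx" };

        string current = ctx.Request.Path.ToLowerInvariant();
        if (!valid.Contains(current))
        {
            ctx.Response.Redirect("~/page1.aspx"); // where they should be
            return;
        }

        // Record what's reachable from here for the next request.
        ctx.Session["ValidPages"] = GetNextPages(current);
    }

    private List<string> GetNextPages(string page)
    {
        // Illustrative only: real code would read the sitemap/workflow.
        if (page == "/page1.aspx")
            return new List<string> { "/page1.aspx", "/page2.aspx" };
        return new List<string> { "/page1.aspx" };
    }

    public void Dispose() { }
}
```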
I recently worked with real code that checked whether the referrer was blank and used that as a step in authorization. The idea was that users wouldn't be able to fake a referrer, but you don't need a custom browser to fake one. Users can bookmark your page on Delicious, and then delicious.com is the referrer (and not blank).
I've had real arguments about how sophisticated a user needs to be to do certain hacks, i.e. if users don't know how to set the referrer, then you can trust it. While it's true that your users are unlikely to write a custom browser, there are already Firefox add-ons to set headers, referrers, etc., and they're easy to use.
Josh has the best answer: on page2 you should check the page hit log and see if the user has recently visited page1.
I like a lot of the answers above (specifically the workflow one).
Another option is creating each page as a user control and having page1.aspx control which user control gets loaded. This has the advantage of storing your workflow in a single place instead of on each page.
However, I don't think there's a magic bullet out there. It sounds like this security problem is an afterthought, or possibly reported as a bug, and you have been tasked with fixing it quickly and efficiently.
I would start by weighing the answers here against their associated cost in hours. I suspect the quickest solution will be to check referrer addresses on each page. Although hackable, it is obscure, and if that risk is acceptable to you, it may be the appropriate solution.
I want to make a site where the user can basically navigate the web from within an iframe. The catch is that I'd like to have more control over what is rendered within the iframe. Specifically:
I'd like to be able to filter out images or text, disable forms, etc.
I'd also like to be able to gather feedback such as which links the users clicked on.
Question 1:
Is this even possible using a standard back-end scripting language (like PHP), with HTML and JavaScript on the front end?
Question 2:
Would I first need to grab the source of the site before it is rendered, then do whatever manipulation is necessary, and finally re-render it somehow?
Question 3:
Could somebody please explain the programming flow that would occur here (assuming it's possible)?
I think you would probably want to grab the source of the site (with server-side code) before rendering it. You might run into cross-site scripting issues if you try to use JavaScript. Your iframe would load a page like render.php and pass the address of the page to render as a query string parameter. Then use regular expressions to find elements in the HTML that render.php downloads from the address. Rewrite the HTML as necessary and then write it all out to the iframe.
Rewrite links so that the user is taken to a page you control and redirected on to the target site if you want to track where people are going. Example: a link in the page needs to go to google.com. You would send them to tracker.php?target=http://google.com. You control tracker.php and can log each load of this page and then redirect the user to the target site.
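The description above is in PHP terms (render.php / tracker.php); purely as an illustration of the same two-piece flow, here is what it could look like as ASP.NET handlers. All names, the query string parameters, and the simplistic regex are assumptions, and real code would need to validate the target URLs:

```csharp
using System.Net;
using System.Text.RegularExpressions;
using System.Web;

// Equivalent of the render.php idea: fetch the target page server-side,
// route its links through a tracker, and write the result into the iframe.
public class RenderHandler : IHttpHandler
{
    public void ProcessRequest(HttpContext context)
    {
        string target = context.Request.QueryString["url"];

        string html;
        using (WebClient client = new WebClient())
        {
            html = client.DownloadString(target);
        }

        // Send every absolute link through the tracker so clicks get logged.
        html = Regex.Replace(html, "href=\"(http[^\"]+)\"",
            m => "href=\"tracker.ashx?target=" +
                 HttpUtility.UrlEncode(m.Groups[1].Value) + "\"");

        context.Response.ContentType = "text/html";
        context.Response.Write(html);
    }

    public bool IsReusable { get { return false; } }
}

// Equivalent of tracker.php: log the click, then send the user onward.
public class TrackerHandler : IHttpHandler
{
    public void ProcessRequest(HttpContext context)
    {
        string target = context.Request.QueryString["target"];
        // ... log 'target' somewhere (database, file) ...
        context.Response.Redirect(target);
    }

    public bool IsReusable { get { return false; } }
}
```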
Update:
Another possible solution is to use Apache or other server to proxy the target website. There are modules like mod_proxy for this. There may also be modules that let you parse the HTML or you could roll your own.
I should point out that even the best solutions offered to your question will be somewhat brittle if you do not have full control over the target site. You will want to have lots of error handling or alerting.
You can have a look at this. It uses iframes really well, and you could maybe even use the library it provides.