I have created some headers to log in to a server. After logging in, I move from one page to another using the http::geturl operation with the headers below, but the problem is that I get logged out of the server and cannot go any further. I think the cookie information is missing.
set headers(Accept) "text/html;q=0.9,text/plain;q=0.8,image/png,*/*"
set headers(Accept-Language) "en-us,en;q=0.5"
set headers(Accept-Charset) "ISO-8859-1,utf-8;q=0.7,*;q=0.7"
set headers(Proxy-Authorization) "Basic [base64::encode $username:$password]"
I don't know how to set the cookie information in the headers; could someone explain?
Thanks
Malli
Cookie support in Tcl is currently exceptionally primitive; I've got 95–99% of the fix in our fossil repository, but that's not much help to you. But for straightforward handling of a session cookie for login purposes, you can “guerrilla hack” it.
Sending the cookie
To send a cookie to the server, you need to send a header Cookie: thecookiestring. That's done by passing the -headers option to http::geturl, which takes a dictionary describing what to send. We can get that from the array simply enough:
set headers(Cookie) $thecookiestring
set token [http::geturl $theurl -headers [array get headers]]
# ...
Receiving the cookie
Sending is definitely the easy bit. The rather harder part is that you also need to check for a Set-Cookie header in the response when you do a login action. You get that with http::meta and then iterate through the list with foreach:
set thecookiestring ""
set token [http::geturl $theloginurl ...]
if {[http::ncode $token] >= 400} {error ...}
foreach {name value} [http::meta $token] {
    if {$name ne "Set-Cookie"} continue
    # Strip the stuff you probably don't care about
    if {$thecookiestring ne ""} {append thecookiestring "; "}
    append thecookiestring [regsub {;.*} $value ""]
}
Formally, there can be many cookies and they have all sorts of complicated features. Handling them is what I was working on in that fossil branch…
I'm assuming that you don't need to be able to forget cookies, manage persistent storage, or other such complexities. (After all, they're things you probably won't need for normal login sessions.)
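Putting the two halves together, a minimal end-to-end sketch might look like this; the URLs here are placeholders, and it assumes the server hands back a single session cookie on login:

package require http

# Log in and capture the session cookie (placeholder URL).
set thecookiestring ""
set token [http::geturl http://example.com/login -headers [array get headers]]
foreach {name value} [http::meta $token] {
    if {$name ne "Set-Cookie"} continue
    if {$thecookiestring ne ""} {append thecookiestring "; "}
    append thecookiestring [regsub {;.*} $value ""]
}
http::cleanup $token

# Present the captured cookie on every subsequent request.
set headers(Cookie) $thecookiestring
set token [http::geturl http://example.com/next-page -headers [array get headers]]
puts [http::data $token]
http::cleanup $token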
I solved it using the tool Fiddler.
Thanks, all.
I was able to successfully use external authentication with Datazen via HttpWebRequest from code-behind in VB.NET, but I am unclear how to use this with an iframe or even a div. I'm thinking maybe the authorization cookie/token isn't following the iframe around? The Datazen content starts to load correctly, but then it redirects back to the login page as if it's not being authenticated. I'm not sure how to do that part; this stuff is pretty new to me, and any help would be greatly appreciated!
Web page errors include:
OPTIONS url send    (b.extend.ajax # jquery.min.js:19; Viewer.Controls.List.ajax / .load # Scripts?page=list:35)
VM11664 about:srcdoc:1: XMLHttpRequest cannot load http://datazenserver.com/viewer/jsondata. Response for preflight has invalid HTTP status code 405
load(): Failed to load JSON data. Viewer.Controls.List {version: "2.0", description: "KPI & dashboard list loader & controller", url: "/viewer/jsondata", index: "/viewer/", json: null, ...}
GET http://datazenserver.com/viewer/login 403 (Forbidden)
' Create a new HttpWebRequest to the Datazen server.
Dim myHttpWebRequest As HttpWebRequest = CType(WebRequest.Create("http://datazenserver.com/"), HttpWebRequest)
myHttpWebRequest.CookieContainer = New System.Net.CookieContainer()
Dim authInfo As String = Session("Email")
myHttpWebRequest.AllowAutoRedirect = False
myHttpWebRequest.Headers.Add("headerkey", authInfo)
myHttpWebRequest.Headers.Add("Access-Control-Allow-Origin", "*")
myHttpWebRequest.Headers.Add("Access-Control-Allow-Headers", "Accept, Content-Type, Origin")
myHttpWebRequest.Headers.Add("Access-Control-Allow-Methods", "GET, POST, PUT, DELETE, OPTIONS")

' Assign the response object of the HttpWebRequest to an HttpWebResponse variable.
Dim myHttpWebResponse As HttpWebResponse = CType(myHttpWebRequest.GetResponse(), HttpWebResponse)
Response.AppendHeader("Access-Control-Allow-Origin", "*")

' Read the response body and inject it into the iframe via srcdoc.
Dim streamResponse As Stream = myHttpWebResponse.GetResponseStream()
Dim streamRead As New StreamReader(streamResponse)
frame1.Page.Response.AppendHeader("Access-Control-Allow-Origin", "*")
frame1.Page.Response.AppendHeader("headerkey", authInfo)
frame1.Attributes("srcdoc") = "<head><base href='http://datazenserver.com/viewer/' target='_blank'/></head>" & streamRead.ReadToEnd()
You might have to do more of this client-side, and I don't know whether you'll be able to because of security concerns.
External authentication in Datazen looks something like this:
User-Agent          |  Proxy                |  Server
--------------------|-----------------------|------------------------------------
1. /viewer/home     --> 2. Append header    --> 3. Check cookie (not present)
                    <-- 5. Forward          <-- 4. Redirect to /viewer/login
6. /viewer/login    --> 7. Append header    --> 8. Append cookie
                    <-- 10. Forward         <-- 9. Redirect to /viewer/home
11. /viewer/home    --> 12. Append header   --> 13. Check cookie (valid)
                    <-- 15. Forward         <-- 14. Give content
16. .................. Whatever the user wanted ..........................
So even though you're working off a proxy with a header, you're still getting a cookie back that it uses.
Now, that's just context.
My guess, from your description of the symptoms, is that myHttpWebResponse should have a cookie set (DATAZEN_AUTH_TOKEN, I believe), but it's essentially getting thrown out; you aren't using it anywhere.
You would need to tell your browser client to append that cookie to any subsequent (iframe-based) requests to the domain of your Datazen server, but I don't believe that's possible due to security restrictions. I don't know a whole lot about CORS, though, so there might be a way to permit it.
I don't know whether there's any good way to do what you're looking to do here. At best, I can maybe think of a start to a hack that would work, but I can't even find a good way to make that work, and you really wouldn't want to go there.
Essentially, if you're looking to embed Datazen in an iframe, I would shy away from external authentication. I'd shy away from it regardless, but especially there.
But, if you're absolutely sure you need it over something like ADFS, you'll need some way to get that cookie into your iframe requests.
The only way I can think to make this work would be to put everything on the same domain:
www.example.com
datazen.example.com (which is probably your proxy)
You could then set a cookie from your response that stores some encrypted (and likely expiring) form of Session("Email"), and pass it back down in your HTML.
That makes your iframe relatively simple, because you can just tell it to load the viewer home. Something to the effect of:
<iframe src="//datazen.example.com/viewer/home"></iframe>
In your proxy, you'll detect the cookie set by your web server, decrypt the email token, ensure it isn't expired, then set a header on the subsequent request onto the Datazen server.
This could be simplified at a couple places, but this should hold as true as possible to your original implementation, as long as you can mess with DNS settings.
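For the www.example.com side, a minimal sketch of setting that cookie might look like the following. The cookie name, the payload format, and the Protect() helper are all hypothetical placeholders for whatever encryption scheme you pick:

' Hypothetical sketch: wrap the session e-mail and an expiry timestamp,
' encrypt them (Protect is a placeholder), and share the cookie across
' subdomains so datazen.example.com (the proxy) can see it.
Dim payload As String = Session("Email") & "|" & DateTime.UtcNow.AddMinutes(30).ToString("O")
Dim authCookie As New HttpCookie("AUTH_EMAIL", Protect(payload))
authCookie.Domain = ".example.com"
authCookie.HttpOnly = True
Response.Cookies.Add(authCookie)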
I suppose another version of this could involve passing a parameter to your proxy, and sharing some common encryption key. That would get you past having to be on the same domain.
So if you had something like:
var emailEncrypted = encrypt(Session("Email") + ":somesalt:" + DateTime.UtcNow.ToString("O"));
Then used whatever templating language you want to set your iframe up with:
<iframe src="//{{ customDomain }}/viewer/home?emailkey={{ emailEncrypted }}"></iframe>
Then, if your proxy detected that emailkey parameter, decrypted it, and checked it for expiration, that could work.
Now you'd have a choice to make on how to handle this, because Datazen will give you a 302 to /viewer/login to get a cookie, and you need to make sure to pass the correct emailkey on through that.
What I would do: accept that emailkey parameter in your proxy, set a completely new cookie yourself, then watch for that cookie on subsequent requests.
Although at that point, it would probably be reasonable to switch your external authentication mode to just use cookies. That's probably a better version of this anyway, assuming this is the only place you use Datazen, and you'd be safe to change something so fundamental. That would substantially reduce your business logic.
But, you wouldn't have to. If you didn't want to change that, you could just check for the cookie, and turn it into a header.
You should do the first of those, but just for good measure, one thing I'm not sure of is whether you can pass users directly to /viewer/login to get a cookie from Datazen. Normally you wouldn't, but it seems like you should be able to.
Assuming it works as expected, you could just swap the iframe URL out for that one. As far as I know (although I'd have to double-check this), the header is actually only necessary once, to set up the cookie. So if you did that, you should get the cookie and then not need the URL parameter anymore, so the forced navigation would be no concern.
You'll, of course, want to make sure you've got a good form of encryption there, and the expiration pattern is important. But you should be able to secure that if you do it right.
I ended up just grabbing the username and password fields and entering them with JavaScript. But this piece helped me a ton. You have to make sure you set
document.domain = 'basedomain.com';
in JavaScript on both sites in order to access the iframe contents, or you'll run into cross-domain issues.
How to capture HTML tables using the Tcl http package, with an example?
I have tried with an example, but it simply logs me out of the session.
package require http
package require base64
set auth "Basic [base64::encode XXXX:XXXXX]"
set headerl [list Authorization $auth]
set query "http://100.59.262.156/"
set tok [http::geturl $query -headers $headerl]
set res [http::data $tok]
http::status $tok
# After logging in to the session, I move to another web page.
set goquery [http::formatQuery "http://100.59.262.156/web/sysstatus.html"]
# After this I am logged out of the session. I am unable to find the reason.
set tok [http::geturl $query -query http://100.59.262.156/web/sysstatus.html]
set res [http::data $tok]
# After this I get a table as output; I need to capture that table.
# How do I capture tables?
http::status $tok
Thanks
Malli
There are some misconceptions I see here:
You should clean up the tokens that http::geturl returns. You do that with http::cleanup $token. Otherwise you get memory leaks.
The basic auth header has to be sent with every request. It is usually enough to request the desired page directly with the right headers.
http::formatQuery is for POST or GET parameters, not for the URL (you can use it for the query part of a URL, but not for the entire URL). Drop that.
The http package does not parse HTML for you. You have to do that yourself. I suggest using tdom.
Because I don't know what your site returns, I can't tell you how to parse it, but here a script to get started:
package require http
package require base64
package require tdom
set auth "Basic [base64::encode XXXX:XXXXX]"
set headerl [list Authorization $auth]
set url "http://100.59.262.156/web/sysstatus.html"
set tok [http::geturl $url -headers $headerl]
# TODO: Check the status here. If you get a 403, the login information was not correct.
set data [http::data $tok]
# Important: cleanup
http::cleanup $tok
# Now parse it
dom parse -html $data doc
# Search for the important stuff; walk over the DOM to get to the information you need.
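As a starting point, assuming the page contains an ordinary <table> (I don't know the real structure of sysstatus.html, so the XPath here is hypothetical), extracting the cell text could look like this:

# Hypothetical: print each row of the first table as a list of cell texts.
set root [$doc documentElement]
foreach row [$root selectNodes {//table[1]//tr}] {
    set cells {}
    foreach cell [$row selectNodes {td|th}] {
        lappend cells [string trim [$cell asText]]
    }
    puts $cells
}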
I'm using vert.x to write an application. It doesn't have built-in cookie support yet, and we have to use the putHeader() method to set cookies manually.
Now I want to set several cookies, so I write:
req.response.putHeader("Set-Cookie", "aaa=111; path=/")
req.response.putHeader("Set-Cookie", "bbb=222; path=/")
req.response.putHeader("Set-Cookie", "ccc=333; path=/")
But I found that vert.x sends only one Set-Cookie header:
Set-Cookie ccc=333; path=/
I'm not sure if I've misunderstood something. Can a server send multiple Set-Cookie headers in one response? Is it correct to send multiple cookies this way?
Use Netty's io.netty.handler.codec.http.ServerCookieEncoder functionality:
req.response.putHeader("Set-Cookie",
    ServerCookieEncoder.encode(new DefaultCookie("aaa", "111")))
There are several useful method signatures:
ServerCookieEncoder.encode(Cookie cookie)
ServerCookieEncoder.encode(Cookie... cookies)
ServerCookieEncoder.encode(Collection<Cookie> cookies)
ServerCookieEncoder.encode(Iterable<Cookie> cookies)
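For several cookies at once, the varargs overload returns one properly formatted value per cookie. A minimal sketch, assuming a Netty 4.x version where this class still lives at that path (it was later moved to the io.netty.handler.codec.http.cookie package); each element still has to go out as its own Set-Cookie header:

import io.netty.handler.codec.http.DefaultCookie;
import io.netty.handler.codec.http.ServerCookieEncoder;
import java.util.List;

// Each list element is a complete "name=value; ..." Set-Cookie value.
List<String> cookieHeaders = ServerCookieEncoder.encode(
        new DefaultCookie("aaa", "111"),
        new DefaultCookie("bbb", "222"),
        new DefaultCookie("ccc", "333"));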
I think not; it's impossible out of the box, because the headers are stored in a HashMap:
https://github.com/purplefox/vert.x/blob/master/src/main/java/org/vertx/java/core/http/impl/DefaultHttpServerResponse.java#L81
You can:
Open a new issue.
Comment on the existing issue: https://github.com/purplefox/vert.x/issues/89
Check out the source and use a map implementation that allows duplicate keys (you need to handle duplicates manually; for instance, a Location header should appear only once). Extend DefaultHttpServerResponse and see how you can integrate it.
Merge the cookies and handle them manually, for instance:
req.response.putHeader("Set-Cookie", "aaa=111&bbb=222&ccc=333; path=/")
There is one workaround.
req.response()
    .putHeader("Set-Cookie", "some=cookie;max-age=1000;path=/;HttpOnly"
        + "\nSet-Cookie: next=cookie"
        + "\nSet-Cookie: nnext=cookie;HttpOnly");
For some reason, IE6/7 is caching the Ajax call that returns a JSON result set.
My page makes the call and returns a JSON result, which I then inject into the page.
How can I force IE6/7 to make this call instead of using a cached return value?
You might want to add
Cache-Control: no-cache
to your HTTP response headers when you're serving the JSON, to tell the browser not to cache the response.
In ASP.NET (or ASP.NET MVC) you can do it like this:
Response.Headers.Add("Cache-Control", "no-cache");
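If plain no-cache isn't enough for older IE versions, a more belt-and-braces variant of the same idea is below; note that the Pragma and Expires headers are my addition, not part of the answer above:

// no-store forbids caching outright; Pragma covers HTTP/1.0 agents,
// and Expires: -1 marks the response as already expired for old IE.
Response.Headers.Add("Cache-Control", "no-cache, no-store");
Response.Headers.Add("Pragma", "no-cache");
Response.Headers.Add("Expires", "-1");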
You can change your settings in IE, but the problem most likely lies on your server; you can't go out and change all your users' browser settings. If you want to at least check it in your own browser, go to Internet Options -> General (tab) -> Browsing History (section) -> Settings (button) -> "Every time I visit the webpage".
Make sure you set it back at some point, though.
To fix it on the server, have a look at http://www.mnot.net/cache_docs/
Using curl (with Cygwin) for debugging is a great way to figure out what's actually being sent across the wire.
If Cache-Control doesn't work for you (see DrJokepu's answer): according to the spec, the content from any URL with a query string should be non-cacheable, so you might append a pointless query parameter to your request URL. The value doesn't matter, but if you really want to be thorough you can append the epoch value, e.g.:
var url = "myrealurl?x=" + (new Date()).getTime();
But this is a hack; really this should be solved with proper caching headers at the server end.
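For what it's worth, if the call happens to go through jQuery (an assumption; the question doesn't say), the same cache-busting hack is built in:

// cache: false makes jQuery append a "_=<timestamp>" parameter to GET requests.
$.ajax({
  url: "myrealurl",
  dataType: "json",
  cache: false,
  success: function (data) {
    // inject the JSON result into the page here
  }
});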
In the controller action that returns a JsonResult, you need to set a header to avoid caching:
ControllerContext.HttpContext.Response.AddHeader("Cache-Control", "no-cache");
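An alternative in ASP.NET MVC (my suggestion, not part of the answer above) is to put the OutputCache attribute on the action instead of setting the header by hand:

[OutputCache(NoStore = true, Duration = 0, VaryByParam = "*")]
public JsonResult GetData()
{
    // NoStore/Duration=0 emit the no-cache headers for you.
    return Json(new { ok = true }, JsonRequestBehavior.AllowGet);
}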
I am doing an e-commerce solution in ASP.NET which uses PayPal's Website Payments Standard service. Together with that I use a service they offer (Payment Data Transfer) that sends you back order information after a user has completed a payment. The final thing I need to do is to parse the POST request from them and persist the info in it. The HTTP request's content is in this form :
SUCCESS
first_name=Jane+Doe
last_name=Smith
payment_status=Completed
payer_email=janedoesmith%40hotmail.com
payment_gross=3.99
mc_currency=USD
custom=For+the+purchase+of+the+rare+book+Green+Eggs+%26+Ham
Basically, I want to parse this information and do something meaningful with it, like sending it by e-mail or saving it in a database. My question is what the right approach to parsing raw HTTP data in ASP.NET is, not how the parsing itself is done.
Something like this, placed in your page's Load event:
if (Request.RequestType == "POST")
{
    using (StreamReader sr = new StreamReader(Request.InputStream))
    {
        if (sr.ReadLine() == "SUCCESS")
        {
            /* Do your parsing here */
        }
    }
}
Mind you that they might want some special sort of response, too (i.e., not your full web page), so you might do something like this after you're done parsing:
Response.Clear();
Response.ContentType = "text/plain";
Response.Write("Thanks!");
Response.End();
Update: this should be done in a Generic Handler (.ashx) file in order to avoid a great deal of overhead from the page model. Check out this article for more information about .ashx files
Use an IHttpHandler and avoid the Page model overhead (which you don't need), but use Request.Form to get the values so you don't have to parse name value pairs yourself. Just pretend you're in PHP or Classic ASP (or ASP.NET MVC, for that matter). ;)
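A minimal sketch of that handler (the class name and the choice of fields are mine; adjust to the actual PDT fields you need):

using System.Web;

// Hypothetical .ashx handler: Request.Form parses the url-encoded
// name/value pairs, so there is no manual parsing to do.
public class PaypalPdtHandler : IHttpHandler
{
    public void ProcessRequest(HttpContext context)
    {
        string payerEmail = context.Request.Form["payer_email"];
        string paymentStatus = context.Request.Form["payment_status"];
        // ... persist the values or send the e-mail here ...

        context.Response.ContentType = "text/plain";
        context.Response.Write("Thanks!");
    }

    public bool IsReusable
    {
        get { return true; }
    }
}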
I'd strongly recommend saving each request to a file.
That way, you can always go back to the actual contents later. You can thank me when you find that hostile-endian, KOI-8 encoded, [...], whatever it was that stumped your parser...
Well, if the incoming data is in the standard form-encoded POST format, then the Request.Form collection will give you all the data in an easy-to-handle manner.
If not, then I can't see any way other than using Request.InputStream.
If I'm reading your question right, I think you're looking for the InputStream property on the Request object. Keep in mind that this is a firehose stream, so you can't reset it.