I have a Silverlight 4 RIA Services application with custom Forms Authentication. The custom authentication service works like a charm.
The problem is that I want to serialize the user object in a cookie which is then sent with each subsequent request.
I create the cookie and add it to the response cookie collection, but on the next request the only cookies in the cookie collection are .ASPXAUTH and ASP.NET_SessionId; there is no trace of the custom cookie.
This is the cookie management class:
using System;
using System.IO;
using System.Runtime.Serialization;
using System.Runtime.Serialization.Formatters.Binary;
using System.Web;

public class CookieManager : ISessionManager
{
    public object this[string key]
    {
        get
        {
            var context = getCurrentContext();
            var cookie = context.Request.Cookies[key];
            if (cookie == null) return null;
            return deserialize(cookie.Value);
        }
        set
        {
            var context = getCurrentContext();
            string cookieValue = serialize(value);
            HttpCookie cookie = new HttpCookie(key, cookieValue);
            cookie.Expires = DateTime.Now.AddDays(10000);
            cookie.HttpOnly = false;
            context.Response.Cookies.Remove(key);
            context.Response.Cookies.Add(cookie);
        }
    }

    public void Abandon()
    {
        var context = getCurrentContext();
        context.Response.Cookies.Clear();
    }

    public void Clear()
    {
        Abandon();
    }

    private HttpContext getCurrentContext()
    {
        return HttpContext.Current;
    }

    private string serialize(object value)
    {
        // Base64-encode the serialized bytes: reading raw binary output
        // through a StreamReader (as the original code did) corrupts it.
        using (var stream = new MemoryStream())
        {
            var formatter = new BinaryFormatter();
            formatter.Context = new StreamingContext(StreamingContextStates.Clone);
            formatter.Serialize(stream, value);
            return HttpUtility.UrlEncode(Convert.ToBase64String(stream.ToArray()));
        }
    }

    public object deserialize(string value)
    {
        byte[] bytes = Convert.FromBase64String(HttpUtility.UrlDecode(value));
        using (var stream = new MemoryStream(bytes))
        {
            return new BinaryFormatter().Deserialize(stream);
        }
    }
}
It reads and saves cookies.
Now my problem is this:
What do I need to enable in Silverlight or in the ASP.NET (WCF) application so that extra cookies are sent with each request alongside the authentication cookie?
EDIT:
I've inspected the HTTP request/response stack and those extra cookies are sent from the server with the WCF RIA Services response but not returned by the next service call from the client.
If I understand your edit above correctly, you've already inspected the HTTP requests and found the desired cookie present in the HTTP Set-Cookie header of the response, but missing in the Cookie header of the next request. Is this correct? If not, please clarify.
If so, the problem sounds like one of three things:
the client is not successfully saving the cookie, due to many possible reasons including:
cookie not properly formatted (unlikely)
cookie is too long
there's a client- or server-side policy (e.g. P3P) preventing saving persistent cookies.
The client is saving the cookie OK, but is not sending it back, even without Silverlight. This could be caused by, for example, a security issue where the hostname of the first request is different from the second.
The client is saving the cookie and can send it back over regular HTML pages, but not via HTTP requests sent by Silverlight.
To see if #1 is the problem, look (using your browser's ability to view cookies) at the cookies saved by your browser for that site. Is the expected cookie saved? If it is, then you can eliminate #1 as the problem. If it's not saved, start looking into the possible causes listed above (formatting, size, cookie policy).
To see if #2 is the problem, try creating a server-side page with no Silverlight on it, just a simple HTML page. When you visit that page with your browser, is the cookie sent as expected? If yes, then #2 is not your problem.
If #1 and #2 are not the problem, that leaves #3. Silverlight's HTTP handling is complicated, not least because you have to choose between having HTTP client requests handled by the browser or by Silverlight. Read the Silverlight cookies documentation carefully and see if any of the info therein will help you figure out the problem. Consider trying to use the "Client HTTP" setting, or if you're already using this, consider switching back to the "browser HTTP" setting and see if your problem goes away. Note that the Client HTTP setting apparently has a problem with losing new cookies after an HTTP redirect. See this thread for more info. There's a workaround discussed in that thread: using CookieContainer.
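As a rough sketch of those last two suggestions combined (opting into the client HTTP stack and sharing a CookieContainer across calls): the class and method names below are illustrative, and RegisterPrefix must run once at startup, before any request is created.

using System;
using System.Net;
using System.Net.Browser;

public static class HttpStackSetup
{
    // One container shared by all requests so cookies persist across calls.
    private static readonly CookieContainer SharedCookies = new CookieContainer();

    public static void UseClientHttp()
    {
        // Route http:// and https:// traffic through Silverlight's client
        // HTTP stack instead of the browser stack.
        WebRequest.RegisterPrefix("http://", WebRequestCreator.ClientHttp);
        WebRequest.RegisterPrefix("https://", WebRequestCreator.ClientHttp);
    }

    public static HttpWebRequest CreateRequest(Uri uri)
    {
        var request = (HttpWebRequest)WebRequest.Create(uri);
        request.CookieContainer = SharedCookies; // reuse cookies on every call
        return request;
    }
}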
BTW, could you edit your question to include all the HTTP headers of the request and the subsequent request? This may help diagnosis.
Related
Fundamentally, all I need to do is grab a user's profile photo after a successful login (ASP.NET 4.8), since it doesn't seem that I can request the photo to come over with the login claims.
This is the callback handler
SecurityTokenValidatedNotification<Microsoft.IdentityModel.Protocols.OpenIdConnect.OpenIdConnectMessage, OpenIdConnectAuthenticationOptions> notification
This is how I get the Identity from that callback and it's all there looking good
var identity = notification.AuthenticationTicket.Identity;
So I'm trying to call back with RestSharp:
var client = new RestSharp.RestClient("https://graph.microsoft.com");
var request = new RestSharp.RestRequest($"/v1.0/users/{email}/photo/$value", RestSharp.Method.GET);
var callbackResult = client.Execute(request);
Debugger.Break();
if (callbackResult.StatusCode == HttpStatusCode.OK)
{
    Debugger.Break();
}
But it keeps coming back as unauthorized (I suppose obviously). Is there some token or something I can use, now that the user has authenticated, to add a header or query string that will get me this extra data easily?
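A hedged sketch of one possibility: Graph rejects anonymous calls, so the RestSharp request needs an OAuth bearer token in the Authorization header. The accessToken variable below is an assumption; it would have to be acquired elsewhere, e.g. by redeeming the OpenID Connect authorization code for a Graph-scoped token during the middleware notifications.

// "accessToken" is assumed to be obtained during the auth callbacks.
var client = new RestSharp.RestClient("https://graph.microsoft.com");
var request = new RestSharp.RestRequest($"/v1.0/users/{email}/photo/$value", RestSharp.Method.GET);
request.AddHeader("Authorization", $"Bearer {accessToken}");
var callbackResult = client.Execute(request);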
I have some asp.net pages that read and write cookie values. During the life cycle of a page it may update the cookie value and then need to read it again further in the code. What I've found is that it's not getting the latest value of the cookie until a page refresh. Is there a way around this? Here's the code I'm using to set and get the values.
public static string GetValue(SessionKey sessionKey)
{
    HttpCookie cookie = HttpContext.Current.Request.Cookies[cookiePrefix];
    if (cookie == null)
        return string.Empty;
    return cookie[sessionKey.SessionKeyName] ?? string.Empty;
}

public static void SetValue(SessionKey sessionKey, string sessionValue)
{
    HttpCookie cookie = HttpContext.Current.Request.Cookies[cookiePrefix];
    if (cookie == null)
        cookie = new HttpCookie(cookiePrefix);

    cookie.Values[sessionKey.SessionKeyName] = sessionValue;
    cookie.Expires = DateTime.Now.AddHours(1);
    HttpContext.Current.Response.Cookies.Set(cookie);
}
What you're missing is that when you update the cookie with SetValue, you're writing to the Response.Cookies collection, but when you call GetValue, you're reading from the Request.Cookies collection.
You need to store the transient information in a way that always gives you the current value, not just the cookie that arrived with the request.
One potential way to do this would be to write a wrapper class; in rough pseudocode it would be similar to
public class CookieContainer
{
    private readonly HttpContext _context;
    private string _bobValue; // "bob" is the example cookie name

    public CookieContainer(HttpContext context)
    {
        _context = context;
        HttpCookie cookie = context.Request.Cookies["bob"];
        _bobValue = cookie != null ? cookie.Value : null;
    }

    public string Value
    {
        get { return _bobValue; }
        set
        {
            _bobValue = value; // cache so later reads in this request see it
            _context.Response.Cookies.Set(
                new HttpCookie("bob", value) { Expires = DateTime.Now.AddHours(1) }); // placeholder expiry
        }
    }
}
I ran into the need for similar code just this week. The cookie handling model is very strange.
Start using Sessions to store your information, even if it's only temporary.
Cookies rely on a header being sent to the browser before the page has rendered. If you've already sent content to the client and then proceed to set a cookie, you're going to see the "page refresh delay" you've described.
If it's necessary to have this value, use a session variable between the time you set the cookie and when you refresh the page (a sketch follows). But even then, I would recommend avoiding setting cookies so late in the processing step; try to set them as early as possible.
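A minimal sketch of that session-variable approach, reusing the question's cookiePrefix and SessionKey names (assumed available): the value is written through to Session so same-request reads see it immediately, while the cookie still goes out for future requests.

public static void SetValue(SessionKey sessionKey, string sessionValue)
{
    // Mirror the value in Session so reads later in this request see it.
    HttpContext.Current.Session[sessionKey.SessionKeyName] = sessionValue;

    HttpCookie cookie = HttpContext.Current.Request.Cookies[cookiePrefix]
                        ?? new HttpCookie(cookiePrefix);
    cookie.Values[sessionKey.SessionKeyName] = sessionValue;
    cookie.Expires = DateTime.Now.AddHours(1);
    HttpContext.Current.Response.Cookies.Set(cookie);
}

public static string GetValue(SessionKey sessionKey)
{
    // Prefer the in-flight session value; fall back to the request cookie.
    object sessionValue = HttpContext.Current.Session[sessionKey.SessionKeyName];
    if (sessionValue != null)
        return (string)sessionValue;

    HttpCookie cookie = HttpContext.Current.Request.Cookies[cookiePrefix];
    if (cookie == null)
        return string.Empty;
    return cookie[sessionKey.SessionKeyName] ?? string.Empty;
}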
I have been experimenting with code that will clear all of the cookies in an HttpContext.Response.
Initially, I used this:
DateTime cookieExpires = DateTime.Now.AddDays(-1);

for (int i = 0; i < HttpContext.Request.Cookies.Count; i++)
{
    HttpContext.Response.Cookies.Add(
        new HttpCookie(HttpContext.Request.Cookies[i].Name, null) { Expires = cookieExpires });
}
However, this fails with an OutOfMemoryException because the for loop never exits: each time you add a cookie to the Response, it also gets added to the Request.
The following approach works:
DateTime cookieExpires = DateTime.Now.AddDays(-1);
List<string> cookieNames = new List<string>();

for (int i = 0; i < HttpContext.Request.Cookies.Count; i++)
{
    cookieNames.Add(HttpContext.Request.Cookies[i].Name);
}

foreach (string cookieName in cookieNames)
{
    HttpContext.Response.Cookies.Add(
        new HttpCookie(cookieName, null) { Expires = cookieExpires });
}
So, what exactly is the relationship between HttpContext.Request.Cookies and HttpContext.Response.Cookies?
Request.Cookies contains the complete set of cookies: both those that the browser sent to the server and those that you just created on the server.
Response.Cookies contains the cookies that the server will send back.
This collection starts out empty and should be changed to modify the browser's cookies.
The documentation states:

"ASP.NET includes two intrinsic cookie collections. The collection accessed through the Cookies collection of HttpRequest contains cookies transmitted by the client to the server in the Cookie header. The collection accessed through the Cookies collection of HttpResponse contains new cookies created on the server and transmitted to the client in the Set-Cookie header.

After you add a cookie by using the HttpResponse.Cookies collection, the cookie is immediately available in the HttpRequest.Cookies collection, even if the response has not been sent to the client."
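A two-line illustration of that last point, from inside any page or handler:

// Adding to Response.Cookies makes the cookie readable from
// Request.Cookies in the same request, before anything is sent.
Response.Cookies.Add(new HttpCookie("demo", "123"));
string value = Request.Cookies["demo"].Value; // "123"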
Your first code sample should work if you make the for loop run backwards: the new cookies are added after the original end of the collection, so a backwards loop never reaches them.
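A sketch of that backwards variant, under the same assumptions as the question's code:

DateTime cookieExpires = DateTime.Now.AddDays(-1);

// Count down from the original end; cookies appended to Request.Cookies
// as a side effect of Response.Cookies.Add all land above the loop's
// starting index, so they are never revisited.
for (int i = HttpContext.Request.Cookies.Count - 1; i >= 0; i--)
{
    HttpContext.Response.Cookies.Add(
        new HttpCookie(HttpContext.Request.Cookies[i].Name, null) { Expires = cookieExpires });
}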
Our Situation:
Our team needs to retrieve log information from a 3rd party website. Specifically, this log information is call logs: our client rents an 866 number, and when calls come in, they assist people and need to make notes accordingly in our application that correspond with the current call. Our client has a web account with the 3rd party that allows them to view the current call logs (date/time, phone number, amount of time on each call, etc.).
I contacted the developer of their website and inquired about an API or any other means of syncing our database with their constantly updating database. They currently DO NOT support an API. I informed them of my situation and they are perfectly fine with any way we can retrieve the information (bot/crawler). The 3rd party said that they are working on an API but could not give us a general timeline as to when it will be up, and as with every client, they need to start production ASAP.
I completely understand that if the 3rd party were to change their HTML layout, it may cause a slight headache for us (sorting the data from the webpage). That being said, this is a temporary solution to a long-term issue. Once they implement their API, we will switch over to it.
So my question is this:
What is the best way to log into the 3rd party website (see image: http://i903.photobucket.com/albums/ac239/jreedinc/customtf.jpg) and retrieve certain HTML pages? We have reviewed the source code of web crawlers, but none of them have the capability of storing cookies and posting information back to the website (with login information). We would prefer to do this in ASP.NET.
Is there another way to accomplish logging on to the website and then retrieving said information?
The classes you'll need to use are in the System.Net namespace. Below is some quick-and-dirty proof-of-concept code that logs in to a site using a form login plus cookies for security, and then scrapes the HTML output of a page.
In order to parse the HTML results you'll need to use an additional tool.
Possible HTML parsing tools:
SgmlReader can convert HTML to XML; you then use .NET's XML features to extract data from the XML.
http://code.msdn.microsoft.com/SgmlReader
HTML Agility Pack allows XPath queries against HTML documents (a short sketch follows the links).
http://htmlagilitypack.codeplex.com/
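A minimal Agility Pack sketch; the table id and XPath are illustrative assumptions, not the 3rd party's real markup:

using HtmlAgilityPack;

// html = page content returned by the scraper below
var doc = new HtmlDocument();
doc.LoadHtml(html);

// SelectNodes returns null when nothing matches, so guard before iterating.
var rows = doc.DocumentNode.SelectNodes("//table[@id='callLog']//tr");
if (rows != null)
{
    foreach (HtmlNode row in rows)
    {
        System.Diagnostics.Debug.WriteLine(row.InnerText.Trim());
    }
}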
using System;
using System.Collections.Generic;
using System.IO;
using System.Net;

class WebWorker {
    /// <summary>
    /// Cookies captured at login, replayed on later requests.
    /// </summary>
    private List<Cookie> cookies = new List<Cookie>();

    public string GetWebPageContent(string url) {
        HttpWebRequest request = (HttpWebRequest)WebRequest.Create(url);
        CookieContainer cookieContainer = new CookieContainer();
        request.CookieContainer = cookieContainer;
        request.Method = "GET";

        // add cookies to maintain session state
        foreach (Cookie c in this.cookies) {
            cookieContainer.Add(c);
        }

        using (HttpWebResponse response = (HttpWebResponse)request.GetResponse())
        using (StreamReader sReader = new StreamReader(response.GetResponseStream())) {
            // read the body once; a second ReadToEnd would return an empty string
            string content = sReader.ReadToEnd();
            System.Diagnostics.Debug.WriteLine("Content:\n" + content);
            return content;
        }
    }

    public string Login(string url, string userIdFormFieldName, string userIdValue,
                        string passwordFormFieldName, string passwordValue) {
        HttpWebRequest request = (HttpWebRequest)WebRequest.Create(url);
        CookieContainer cookieContainer = new CookieContainer();
        request.CookieContainer = cookieContainer;
        request.Method = "POST";
        request.ContentType = "application/x-www-form-urlencoded";
        request.AllowAutoRedirect = false; // allowing redirects seems to lose cookies

        string postData = System.Web.HttpUtility.UrlEncode(userIdFormFieldName) + "=" +
                          System.Web.HttpUtility.UrlEncode(userIdValue) +
                          "&" + System.Web.HttpUtility.UrlEncode(passwordFormFieldName) + "=" +
                          System.Web.HttpUtility.UrlEncode(passwordValue);

        byte[] postDataBytes = System.Text.Encoding.UTF8.GetBytes(postData);
        request.ContentLength = postDataBytes.Length; // byte count, not string length

        using (Stream requestStream = request.GetRequestStream()) {
            requestStream.Write(postDataBytes, 0, postDataBytes.Length);
        }

        using (HttpWebResponse response = (HttpWebResponse)request.GetResponse())
        using (StreamReader sReader = new StreamReader(response.GetResponseStream())) {
            System.Diagnostics.Debug.WriteLine("Content:\n" + sReader.ReadToEnd());

            // capture the session cookies issued by the login response
            this.cookies.Clear();
            for (int i = 0; i < response.Cookies.Count; i++) {
                this.cookies.Add(response.Cookies[i]);
            }
        }

        return "OK";
    }
} // end class
// sample usage of the class
WebWorker worker = new WebWorker();
worker.Login("http://localhost/test/default.aspx", "uid", "bob", "pwd", "secret");
worker.GetWebPageContent("http://localhost/test/default.aspx");
I recently used a tool called WebQL; it's a web-scraping tool that lets the developer use SQL-like syntax to scrape information from web pages.
WebQL on Wikipedia
This is actually a relatively simple operation. What you need to do is find the page that the login form in your screenshot posts back to (something like login.php, etc.) and then construct a WebRequest to that page with the login data you have. You will most likely get back a CookieContainer holding your login cookie to use on all subsequent requests.
You can look at this MSDN article for the basics of how to do it, but their write-up is kind of confusing. Look at the community comments at the end for an example of how to post back page variables (like the username and password). You will need to make sure you pass the CookieContainer around on subsequent requests.
Unfortunately, .NET does not natively have something like WWW::Mechanize, but WebClient does have an UploadValues method, which might make it easier (see the sketch below). You will still have to manually parse the page to figure out what fields you need to pass.
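A minimal sketch of that WebClient route; the URL and form field names ("uid", "pwd") are assumptions, and plain WebClient will not carry the resulting cookie forward on its own:

using System.Collections.Specialized;
using System.Net;
using System.Text;

// Hedged sketch: POST login fields with WebClient.UploadValues.
var fields = new NameValueCollection
{
    { "uid", "bob" },      // assumed form field names
    { "pwd", "secret" }
};

using (var client = new WebClient())
{
    byte[] response = client.UploadValues("http://localhost/test/login.aspx", fields);
    string html = Encoding.UTF8.GetString(response);
    // For subsequent authenticated requests you still need
    // HttpWebRequest + CookieContainer, as in the WebWorker class above.
}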
This question is a follow-up to my previous question about getting the HTML from an ASPX page. I decided to try using the WebClient object, but the problem is that I get the login page's HTML because login is required. I tried "logging in" using the WebClient object:
WebClient ww = new WebClient();
ww.DownloadString("Login.aspx?UserName=&Password=");
string html = ww.DownloadString("Internal.aspx");
But I still get the login page all the time. I know that the username info is not stored in a cookie. I must be doing something wrong or leaving out an important part. Does anyone know what it could be?
Try setting the credentials property of the WebClient object
WebClient ww = new WebClient();
ww.Credentials = CredentialCache.DefaultCredentials;
ww.DownloadString("Login.aspx?UserName=&Password=");
string html = ww.DownloadString("Internal.aspx");
Just pass valid login parameters to a given URI with a helper like the one below; it should help you out.
If you don't have login information, you shouldn't be trying to circumvent it.
public static string HttpPost(string URI, string Parameters)
{
    System.Net.WebRequest req = System.Net.WebRequest.Create(URI);
    req.ContentType = "application/x-www-form-urlencoded";
    req.Method = "POST";

    byte[] bytes = System.Text.Encoding.ASCII.GetBytes(Parameters);
    req.ContentLength = bytes.Length;

    System.IO.Stream os = req.GetRequestStream();
    os.Write(bytes, 0, bytes.Length);
    os.Close();

    System.Net.WebResponse resp = req.GetResponse();
    if (resp == null) return null;

    System.IO.StreamReader sr = new System.IO.StreamReader(resp.GetResponseStream());
    return sr.ReadToEnd().Trim();
}
Well, does opening the page in a browser with "Login.aspx?UserName=&Password=" normally work?
Some pages may not allow login using data provided in the URL; the credentials may have to be entered in the login form on the page and then submitted.
The only other reason I can think of is that the web page is intentionally blocking the login. If you have access to the code, take a look at the login system used to see if there's anything designed to block such logins.
Use Fiddler to see the HTTP requests and responses that happen when you do it manually through the browser.
@Fire Lancer: I asked myself that same question during my tests, so I checked, and it does work from a browser.
As the ASPX page I was trying to get was in my own project, I could use the Server.Execute method. More details are in my answer to my original question.
Use Firefox with the LiveHttpHeaders plugin.
This will allow you to log in via an actual browser and see EXACTLY what is being sent to the server. My first step would be to verify that the site isn't expecting a POST from the form; the example URL you are loading sends the info via a query-string GET.