Android WebView does not use manually installed certificates - xamarin.forms

I have a WebView in a Xamarin Forms application. It shows generated HTML which references resources on the intranet. These resources are TLS-protected, and a self-signed certificate backs the secure communication.
The root certificate is installed on the Android device. Accessing the resources with Google Chrome works. However, accessing them from the WebView fails with an SSL error (3, untrusted).
For testing purposes, I changed the WebViewClient and overrode OnReceivedSslError. There I check for my self-signed certificate and allow it manually (handler.Proceed). This works.
However, this creates new issues, such as the Navigating event on the Xamarin.Forms.WebView no longer firing. I could probably work around this problem as well, but everything seems a bit wrong.
To me it looks as if the Android WebView does not use the same certificate store as Google Chrome. Is this correct, and is there a fix for this behavior, hopefully without having to create a custom renderer for the WebView control and introducing a whole bunch of potential new issues (such as the missing Navigating event)?
Update according to Jessie Zhang's request
Broken Navigating Event
Attach a Navigating event in XAML. Assign a custom renderer for the WebView and, in OnElementChanged, assign a custom WebViewClient via SetWebViewClient. This step breaks the Navigating event, regardless of whether OnReceivedSslError is overridden or not (see code below).
Certificate Error
Use a WebView and assign it custom-built HTML via HtmlWebViewSource, embedding some images that reference an https:// resource on a server with a self-signed root certificate (OpenSSL). Add the root certificate to the Android certificate store (via the User certificates applet). Although Chrome and other clients now accept the https resources, the WebView does not. It seems that it does not consider the manually installed root certificate.
As additional information: when overriding OnReceivedSslError in the WebViewClient, the certificate is available in the SslError parameter, but it is not accepted.
protected override void OnElementChanged(ElementChangedEventArgs<Xamarin.Forms.WebView> e)
{
    base.OnElementChanged(e);

    if (e.NewElement != null)
    {
        Control.SetWebViewClient(new BreakingWebViewClient());
    }
}

class BreakingWebViewClient : WebViewClient
{
    // ... some code, but for breaking the Navigating-event, no code is necessary
}
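For illustration, here is a minimal sketch of the SSL-error override described above (assuming using Android.Webkit and Android.Net.Http). The host name is a placeholder, and a URL check stands in for the actual certificate check; only the override signature and handler.Proceed() are the essential parts.
class AcceptingWebViewClient : WebViewClient
{
    public override void OnReceivedSslError(Android.Webkit.WebView view, SslErrorHandler handler, SslError error)
    {
        // Hypothetical check: only trust the known intranet host.
        // The real check against the self-signed certificate may look different.
        if (error.Url.StartsWith("https://intranet.example/"))
        {
            handler.Proceed(); // accept the self-signed certificate manually
        }
        else
        {
            base.OnReceivedSslError(view, handler, error); // default behaviour: cancel the request
        }
    }
}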

Related

Tabs opened via Hyperlinks in Excel/Word not recognizing session cookies

I have an ASP.Net application with authentication using Cookie session variables. Once the user logs in, they can open new browser tabs for the same application and these are logged in automatically as the session cookie is present.
Clicking on a hyperlink on another web page pointing to a specific page within the application also works fine - there is no login required as the user is already logged in.
However, when a hyperlink to the application is in a Word/Excel document, the link does not open the page directly and gets bounced to the Login page instead. If I copy the URL from Word/Excel and paste it into the URL bar of the browser, it works fine.
Any explanation for this behaviour? Does the browser open an isolated session when a link is clicked in Word/Excel?
Edit: It also seems Word/Excel perform their own check before opening a browser tab. If I use a non-existent link, it doesn't open the tab.
We ran into this at my place of work a while back, and found that like you mentioned, MS Office applications indeed do some mysterious stuff behind the scenes. Details on what it actually does are in this article: https://learn.microsoft.com/en-us/office/troubleshoot/office-suite-issues/click-hyperlink-to-sso-website
Toward the bottom of that article, they suggest a workaround involving a meta refresh, which is what worked for us. In our case, we added a method to our request pipeline that checks for a Microsoft product in the User-Agent header. If found, it sends a meta refresh that triggers the browser to use an existing session rather than trying to start a new session (which is why you're being redirected to a logon page). Here's more or less the code:
private static string MSUserAgentsRegex = @"[^\w](Word|Excel|PowerPoint|ms-office)([^\w]|\z)";

protected void Application_OnPostAuthenticateRequest(object sender, EventArgs e)
{
    if (!Request.IsAuthenticated)
    {
        if (System.Text.RegularExpressions.Regex.IsMatch(Request.UserAgent, MSUserAgentsRegex))
        {
            // Office's background request gets an empty page with a meta refresh;
            // the real browser request that follows reuses the existing session.
            Response.Write("<html><head><meta http-equiv='refresh' content='0'/></head><body></body></html>");
            Response.End();
        }
    }
}

In a Xamarin app, how do I secure files in folders on an ASP.NET RESTful server from other users and the general public

I have an app using a RESTful server. I want to store PDFs, images, etc. in folders on my server. How can I make the folders private on the server, yet allow the app to access only certain folders depending on its app access?
I have different users in the app and security/tokens established, etc. But if they upload an image for their avatar (and now PDFs), these get stored in folders on the server, and I just display them with image source=https://blahblah.com/org1/images/user232.jpg.
How can I make that inaccessible from the outside (like just entering the URL in a browser), yet accessible to the app if the user has the correct login privileges for that organization/user? And then further extend that logic to more sensitive PDFs and other docs uploaded through the app. I didn't want to store them in SQL, since it is then harder to use simple image display tools, and I already have upload and media managers using folder structures.
I can see how to secure the files when logging onto the server through a browser (credentials), but I can't see how to connect the app with that security level and maintain it for the session.
For future readers. Most of the work was done on the RESTful (ASP.NET) side. I first tried using authorization/authentication in web.config with allow and deny rules. This redirected a user to a login page; however, it did not kick in when someone requested an image URL directly.
I then found HTTP handlers (added in web.config) where I could write code that would be executed once the user requested a specific image address such as xyz/abc/image.png. This felt a bit like a hack.
So lastly I modified my route registration:
routes.MapRoute(
    name: "staticFileRoute",
    url: "publicstor/{*file}",
    defaults: new { controller = "Home", action = "HandleStatic" }
);
and added an action like this to the Home controller:
[System.Web.Http.HttpGet]
public ActionResult HandleStatic(string file)
{
    if (Session["OrgId"] == null) // TODO: add full security check.
    {
        return View("Login");
    }
    else // Either coming from the app or coming from the web interface
    {
        string mimeType = MimeInfo.GetMimeType(Path.GetExtension(file));
        return File(file, mimeType);
    }
}
The final bit is on the Xamarin side: passing credentials when getting an image. Since a simple Xamarin.Forms.Image doesn't have a way to pass login info or tokens/authentication, I used
https://forums.xamarin.com/discussion/145575/image-from-url-needing-auth
and established an app-wide web client that logs in once, forcing my RESTful service to go through its security validation, and then accessed the images/documents throughout my app from that client. So far so good; hopefully there are no holes.
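As a rough sketch of that idea (the URL, the token value and the method name are placeholders, and a bearer token stands in here for whatever login the linked approach establishes):
// Sketch only: download a protected resource with an authenticated HttpClient
// and hand it to a Xamarin.Forms Image. All names and values are placeholders.
static readonly System.Net.Http.HttpClient AuthenticatedClient = new System.Net.Http.HttpClient();

async System.Threading.Tasks.Task LoadProtectedImageAsync(Xamarin.Forms.Image target)
{
    // Assume a token (or session cookie) obtained once at login.
    AuthenticatedClient.DefaultRequestHeaders.Authorization =
        new System.Net.Http.Headers.AuthenticationHeaderValue("Bearer", "token-from-login");

    var bytes = await AuthenticatedClient.GetByteArrayAsync(
        "https://blahblah.com/publicstor/org1/images/user232.jpg");

    target.Source = Xamarin.Forms.ImageSource.FromStream(
        () => new System.IO.MemoryStream(bytes));
}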
This gives the gist to a future reader.

How do I configure JxBrowser to automatically persist cookies and use them to automatically log in to a site?

I am using JxBrowser v7.10 on macOS and have issues persisting cookies and using them to automatically log in to a site on application restart. It seems that cookies fail to persist somehow.
I searched and read the documentation in
https://jxbrowser-support.teamdev.com/docs/guides/cookies.html#working-with-cookies
https://jxbrowser-support.teamdev.com/javadoc/7.2/com/teamdev/jxbrowser/cookie/CookieStore.html
and yet could not find how to use the cookie.
The doc mentions
"JxBrowser delegates the work with cookies to the Chromium engine. Chromium decides how to download cookies from a web server, extract them from the HTTP headers and store them in a local file system (persistent cookies) or in the memory (session cookies)."
So from that understanding, the cookies should be persisted automatically and usable on application restart, but instead I need to log in again every time the application restarts.
The following is the test code. I should be able to log in to Gmail, have JxBrowser persist the cookies automatically (no coding required) and be logged in to Gmail automatically on restart; however, the following code fails to do that.
Is there something I need to do to implement that?
import com.teamdev.jxbrowser.browser.Browser;
import com.teamdev.jxbrowser.engine.Engine;
import com.teamdev.jxbrowser.engine.EngineOptions;
import com.teamdev.jxbrowser.view.swing.BrowserView;

import javax.swing.*;
import java.awt.*;
import java.awt.event.WindowAdapter;
import java.awt.event.WindowEvent;

import static com.teamdev.jxbrowser.engine.RenderingMode.HARDWARE_ACCELERATED;

public final class CookieBrowser {
    public static void main(String[] args) {
        // Creating and running Chromium engine
        final Engine engine = Engine.newInstance(
                EngineOptions.newBuilder(HARDWARE_ACCELERATED).build());
        Browser browser = engine.newBrowser();

        // Loading the required web page
        browser.navigation().loadUrl("www.gmail.com");

        // No cookie printed out, why?
        engine.cookieStore().cookies().forEach(System.out::println);

        SwingUtilities.invokeLater(() -> {
            // Creating Swing component for rendering web content
            // loaded in the given Browser instance
            BrowserView view = BrowserView.newInstance(browser);

            // Creating and displaying Swing app frame
            JFrame frame = new JFrame("JxBrowser AWT/Swing");

            // Closing the engine when app frame is about to close
            frame.addWindowListener(new WindowAdapter() {
                @Override
                public void windowClosing(WindowEvent e) {
                    System.out.println("Cookie persist"); // but unable to use them automatically on restart
                    engine.cookieStore().persist();
                    engine.close();
                }
            });
            frame.setDefaultCloseOperation(WindowConstants.DISPOSE_ON_CLOSE);
            frame.add(view, BorderLayout.CENTER);
            frame.setSize(800, 600);
            frame.setVisible(true);
        });
    }
}
You get this behavior because you don't configure the Engine instance with the user data directory. So, every time you create a new Engine instance, a temp directory will be created and the cookie store will be initialized there. As a result, all the previously created cookies will not be available anymore.
Please check out the guide at https://jxbrowser-support.teamdev.com/docs/guides/engine.html#user-data-directory
From that guide:
User Data Directory
Represents an absolute path to the directory where the data such as cache, cookies, history, GPU cache, local storage, visited links, web data, spell checking dictionary files, etc. is stored. For example:
Engine engine = Engine.newInstance(EngineOptions.newBuilder(...)
.userDataDir(Paths.get("/Users/Me/.jxbrowser"))
.build());
If you do not provide the user data directory path, JxBrowser will create and use a temp directory in the user’s temp folder.
FYI: just in case, I will mention this info in the Cookies guide as well.

Using the Facebook C# SDK in ASP.NET without a canvas application

I'm trying to use the SDK without a canvas application, so I have followed steps 1-7 in the quickstart guide up to adding the facebookSettings property in the Web.config.
I have added an image to my page and an onclick event that contains the code below, but when I click the button, it just takes me to the home page (CancelUrlPath).
Changing the Authorizer to a CanvasAuthorizer results in FB loading the login screen, but I get a 404 Not Found error on the call (even after inserting the handlers into the config).
fbApp = new FacebookApp();
authorizer = new Authorizer(fbApp) { Perms = requiredAppPermissions };
authorizer.ReturnUrlPath = "http://localhost/User/UserRegister.aspx";
authorizer.CancelUrlPath = "http://localhost/";

if (authorizer.Authorize(this.Context))
{
    Response.Write("hello"); // never gets here
}
Can anyone help please?
Note: I've set the canvas and site url to http://localhost/ on the FB app settings.
If you are just building a simple Connect website, you really don't want to use the server-side authentication tools to authenticate your user. Just use the JavaScript SDK to authenticate the user. If you need to do anything on the server side, the FacebookApp class will automatically pick up the user's session from the values stored in the cookies.
See the Facebook documentation for more details: http://developers.facebook.com/docs/guides/web/#registration

.NET Recaptcha https

We've started using the ASP.NET reCAPTCHA control and it works fine, but one of the requirements we have is that all outbound traffic goes over HTTPS.
I know that reCAPTCHA supports HTTPS, but it's not clear how to configure this (or even whether it is configurable) when using the ASP.NET plugin option.
Has anyone got any experience of this?
I'll expand a little on what I've found so far....
The Recaptcha package contains three public classes: RecaptchaControl, RecaptchaValidator and RecaptchaResponse.
RecaptchaControl is an ASP.NET control; the recaptcha-specific members on it seem to concern themes/look and feel.
An instance of the Validator has a RemoteIP field (which I presume would represent the verification server), but I can't see a way of binding that to the control.
RecaptchaResponse seems to more or less represent an enum with possible responses (valid/invalid/failed to connect).
It looks like the Recaptcha control intelligently selects HTTPS if the request was made over HTTPS.
I'm presuming it does the same for the validation, but it's not clear from the source code:
http://code.google.com/p/recaptcha/source/browse/trunk/recaptcha-plugins/dotnet/library/
private const string VerifyUrl = "http://www.google.com/recaptcha/api/verify";
private const string RECAPTCHA_SECURE_HOST = "https://api-secure.recaptcha.net";
private const string RECAPTCHA_HOST = "http://api.recaptcha.net";

--------------------------------SNIP------------------------------------

/// <summary>
/// This function generates challenge URL.
/// </summary>
private string GenerateChallengeUrl(bool noScript)
{
    StringBuilder urlBuilder = new StringBuilder();
    urlBuilder.Append(Context.Request.IsSecureConnection || this.overrideSecureMode ? RECAPTCHA_SECURE_HOST : RECAPTCHA_HOST);
    urlBuilder.Append(noScript ? "/noscript?" : "/challenge?");
    urlBuilder.AppendFormat("k={0}", this.PublicKey);
    if (this.recaptchaResponse != null && this.recaptchaResponse.ErrorCode != string.Empty)
    {
        urlBuilder.AppendFormat("&error={0}", this.recaptchaResponse.ErrorCode);
    }
    return urlBuilder.ToString();
}
If you check out http://recaptcha.net/apidocs/captcha/client.html it says:
"In order to avoid getting browser warnings, if you use reCAPTCHA on an SSL site, you should replace http://api.recaptcha.net with https://api-secure.recaptcha.net."
So clearly reCAPTCHA supports HTTPS submissions. Does the ASP.NET control have any properties with which you can configure the outbound URL? At worst you might need to use Reflector to examine the code and see how it's built.
The .NET library does not require any configuration to work in an HTTPS environment. It derives from the current HttpContext whether the request was made over the HTTPS protocol.
But there is the RecaptchaControl.OverrideSecureMode property that you can use in case it doesn't work as expected. Set it to True to force HTTPS mode.
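As a hedged sketch (assuming the control is declared on the page with ID "recaptcha"), the same switch can also be flipped from code-behind:
// Hypothetical code-behind equivalent: force the secure reCAPTCHA host
// regardless of whether the current request came in over HTTPS.
protected void Page_Load(object sender, EventArgs e)
{
    recaptcha.OverrideSecureMode = true;
}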
Update:
I seem to have misunderstood the question. I am afraid there is no HTTPS endpoint for reCAPTCHA verification (between your server and theirs).
We are using the reCAPTCHA plugin for .NET, and we needed to do two things to get it working over SSL in our environment. Our dev environment does not use SSL, and our test and production environments do.
Set the RecaptchaControl.OverrideSecureMode property to true, as Adrian Godong mentioned in his original answer to this question. This allowed the control to work locally and in dev not using SSL, and in test and prod using SSL.
<recaptcha:RecaptchaControl
    OverrideSecureMode="True"
    ID="recaptcha"
    runat="server"
    Theme="blackglass" />
When we generated the public and private keys, we specified global keys. This allowed us to use recaptcha in all of our different environments (local, dev.mydomain.com, test.mydomain.com and mydomain.com) and fixed the "input error: invalid referrer" error.
