ASP.NET 4.5 REST APIs work in Unity Android/iOS builds but fail with "Unknown error" in Unity WebGL build - asp.net

I have scoured every possible forum for this and somehow have not gotten my WebGL build to consume my ASP.NET 4.5 REST APIs.
From what I can tell it is possibly related to WebGL requiring CORS, but even after enabling this I cannot get the game to communicate with my APIs.
So either there is something wrong with the way I have implemented the global CORS settings in ASP.NET, or something else is breaking.
To be clear, these APIs run perfectly well in Android/iOS/Windows builds and even in the editor.
What I have done so far:
Installed the Microsoft CORS package as recommended by Microsoft's documentation, then added the following code to the WebApiConfig class in Visual Studio:
public static void Register(HttpConfiguration config)
{
    config.SuppressDefaultHostAuthentication();
    config.Filters.Add(new HostAuthenticationFilter(OAuthDefaults.AuthenticationType));

    // Web API routes
    config.MapHttpAttributeRoutes();

    // New code
    config.EnableCors(new EnableCorsAttribute("*", "*", "*"));

    config.Routes.MapHttpRoute(
        name: "DefaultApi",
        routeTemplate: "api/{controller}/{id}",
        defaults: new { id = RouteParameter.Optional }
    );
}
This is also in my web.config:
<httpProtocol>
  <customHeaders>
    <add name="Access-Control-Allow-Origin" value="*" />
    <add name="Access-Control-Allow-Methods" value="GET, POST, PUT, DELETE, OPTIONS" />
    <add name="Access-Control-Allow-Headers" value="Origin, X-Requested-With, Content-Type, Accept" />
  </customHeaders>
</httpProtocol>
I need these settings to be global, so I used "*" as indicated by the documentation to include all domains, method types, and headers, because I use ASP.NET token authentication for my API.
Here is the code snippet that gets the token in the Unity project (just to be clear, this works on the other platforms and only throws an error in a WebGL build):
public IEnumerator login()
{
    string url = API.ROUTEPATH + API.TOKEN;

    WWWForm form = new WWWForm();
    form.AddField("grant_type", "password");
    form.AddField("username", API.APIUSERNAME);
    form.AddField("password", API.APIPASSWORD);

    UnityWebRequest uwr = UnityWebRequest.Post(url, form);
    uwr.SetRequestHeader("Content-Type", "application/json");

    yield return uwr.SendWebRequest();

    try
    {
        if (uwr.isNetworkError)
        {
            Debug.Log(uwr.error);
        }
        else
        {
            APIAuthToken returnauth = JsonUtility.FromJson<APIAuthToken>(uwr.downloadHandler.text);
            if (!string.IsNullOrEmpty(returnauth.access_token))
            {
                API.hasAuth = true;
                API.token = returnauth.access_token;
                Debug.Log(returnauth.access_token);
            }
        }
    }
    catch
    {
    }
}
uwr.error produces the following, very helpful error: Unknown Error. So I'm not even sure whether it is CORS related; that is just my best guess based on the research I have done, but even with multiple different implementations of it I still get the same error. So if the problem is not with the APIs but with my Unity code, please just ignore the ASP.NET code snippet.
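One way to narrow this down is to log everything UnityWebRequest exposes besides uwr.error; a status code of 0 with no response headers usually means the browser blocked or withheld the response (typical of CORS failures) rather than the API returning an error. A minimal debugging sketch, assuming the same uwr object from the coroutine above:

// Sketch: dump whatever detail the browser hands back besides "Unknown Error".
// Assumes the same UnityWebRequest instance (uwr) from the login() coroutine above.
Debug.Log("HTTP status: " + uwr.responseCode);   // 0 usually means no response ever arrived (often CORS)
Debug.Log("Body: " + uwr.downloadHandler.text);
var headers = uwr.GetResponseHeaders();          // null when the browser withheld the response
if (headers != null)
{
    foreach (var header in headers)
    {
        Debug.Log(header.Key + ": " + header.Value);
    }
}

In a WebGL build the detailed CORS message only shows up in the browser's JavaScript console, never in uwr.error, so checking the console (F12) alongside these logs is the quickest way to confirm whether CORS is the culprit.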

cURL - A simple curl -I <endpoint> or curl -X OPTIONS -v <endpoint> can reveal a ton of information about what is happening related to CORS. It can allow you to set different origins, check preflight responses, and more.
"Let's say you have a backend API that uses cookies for session management. Your game works great when testing on your own domain, but breaks horribly once you host the files on Kongregate due to the fact that your API requests are now cross-domain and subject to strict CORS rules."
Is this your problem?
Probably both sides will refuse to send cookies if things are not set up properly, but that is a good thing: it means you have control over which domains your session cookies will be sent to.
So you probably need to first configure the server to allow multiple origins, but make sure to validate the value against a whitelist so that you aren't simply allowing your session cookies to be sent to any origin domain.
Example of a Node Express server with the CORS middleware (game ID 12345) and an origin whitelist below:
var express = require('express');
var cors = require('cors');
var app = express();

var whitelist = ['https://game12345.konggames.com'];
var corsOptions = {
  credentials: true,
  origin: function (origin, callback) {
    if (whitelist.indexOf(origin) !== -1) {
      callback(null, true);
    } else {
      callback(new Error('Not allowed by CORS'));
    }
  }
};

app.use(cors(corsOptions));
app.options('*', cors(corsOptions)); // Enable OPTIONS for preflight requests

app.get('/', (req, res) => res.send('Hello World!'));
app.listen(8080, () => console.log('Example app listening on port 8080!'));
cURL command to check the headers for an OPTIONS preflight request from an origin in the whitelist array:
curl -X OPTIONS -H "Origin: https://game12345.konggames.com" -v http://localhost:8080/
* Trying 127.0.0.1...
* TCP_NODELAY set
* Connected to localhost (127.0.0.1) port 8080 (#0)
> OPTIONS / HTTP/1.1
> Host: localhost:8080
> User-Agent: curl/7.58.0
> Accept: */*
> Origin: https://game12345.konggames.com
>
< HTTP/1.1 204 No Content
< X-Powered-By: Express
< Access-Control-Allow-Origin: https://game12345.konggames.com
< Vary: Origin, Access-Control-Request-Headers
< Access-Control-Allow-Credentials: true
< Access-Control-Allow-Methods: GET,HEAD,PUT,PATCH,POST,DELETE
< Content-Length: 0
< Date: Tue, 24 Sep 2019 22:04:08 GMT
< Connection: keep-alive
<
* Connection #0 to host localhost left intact
You also need to instruct the client to include cookies when it makes a cross-domain request. If the preflight response did not include Access-Control-Allow-Credentials: true, or if your Access-Control-Allow-Origin is set to a wildcard (*), then the cookies will not be sent and you are likely to see errors in your browser's JavaScript console:
Access to XMLHttpRequest at 'https://api.mygamebackend.com' from origin 'https://game54321.konggames.com' has been blocked by CORS policy: Response to preflight request doesn't pass access control check: The value of the 'Access-Control-Allow-Origin' header in the response must not be the wildcard '*' when the request's credentials mode is 'include'. The credentials mode of requests initiated by the XMLHttpRequest is controlled by the withCredentials attribute.
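On an ASP.NET Web API backend like the one in the original question, that translates into returning a specific origin and opting into credentials instead of the wildcard. A minimal sketch, assuming the Microsoft.AspNet.WebApi.Cors package and using the Kongregate origin above purely as an example:

// Sketch: Web API CORS registration that works with credentialed (cookie) requests.
// The origin below is illustrative; list your real game origin(s) instead of "*".
public static void Register(HttpConfiguration config)
{
    var cors = new EnableCorsAttribute("https://game12345.konggames.com", "*", "*")
    {
        SupportsCredentials = true   // emits Access-Control-Allow-Credentials: true
    };
    config.EnableCors(cors);

    config.MapHttpAttributeRoutes();
}

Note that a web.config customHeaders block with a wildcard origin works against you here, because browsers reject "*" for credentialed requests.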
Unity's UnityWebRequest and the older WWW classes use XMLHttpRequest under the hood to fetch data from remote servers. Since there is no option to set the withCredentials flag to true, we have to perform a pretty dirty hack when initializing our application in order to turn that on for the appropriate requests.
In your WebGL template or generated index.html:
<script>
  XMLHttpRequest.prototype.originalOpen = XMLHttpRequest.prototype.open;
  var newOpen = function(_, url) {
    var original = this.originalOpen.apply(this, arguments);
    if (url.indexOf('https://api.mygamebackend.com') === 0) {
      this.withCredentials = true;
    }
    return original;
  };
  XMLHttpRequest.prototype.open = newOpen;
</script>
This snippet of code overrides the open method of XMLHttpRequest so that we can conditionally set withCredentials equal to true when desired. Once this is in place, cross-origin cookies should begin working between the Kongregate-hosted iframe domain and the game's backend servers!
Info taken from here; this also looks useful for the same topic.

Related

.Net Core Cross Site Cookie Not Being Set by Chrome or Firefox

I am trying to use a cookie sent from an ASP.NET Core Web API site in a cross-site configuration. I can see the cookie arrive in the response, but from what I can tell, it's not being set by either Firefox or Chrome. Either way, it's not being sent back on subsequent requests to the API. When I use Postman, everything works great.
I've tried using .Net Core middleware for authentication cookies with server and app configuration in Startup.cs. But I get the same result if I use the direct approach of appending the cookie to the HTTP response in my controller (shown in the sample code below).
My web site is running out of VS Code from a minimal create-react-app, npm start, localhost port 3000.
My API is running out of Visual Studio 2019, .NET Core 3.1, Web API site, port 44302. I've also tried deploying to an Azure app service so that my localhost web site could call a non-localhost API. The cookie is still not set or sent.
Question is, how do I get the browser to set and then send the cookie back to the API when developing in localhost (or deployed anywhere, for that matter!)? I've spent hours combing Stack Overflow and other docs for the answer. Nothing has worked. Thanks much for any help!
From Startup.cs, defining the CORS policy. Note the AllowCredentials call that pairs with the web site's XHR withCredentials:
public void ConfigureServices(IServiceCollection services)
{
    ...
    services.AddCors(options =>
    {
        options.AddDefaultPolicy(
            builder =>
            {
                builder
                    .SetIsOriginAllowed(host => true)
                    .AllowCredentials()
                    .AllowAnyMethod()
                    .AllowAnyHeader();
            });
    });
    ...
}
From my controller endpoint simulating login:
[HttpPost]
public IActionResult FauxLogin(string Email, string Pwd)
{
    Response.Cookies.Append("LoginCookie", "123456", new CookieOptions
    {
        //Domain = ".app.localhost", // some suggest specifying, some suggest leaving empty for the default
        Path = "/",
        Secure = true,
        HttpOnly = false,
        SameSite = SameSiteMode.None
    });
    return Ok(new { success = true });
}
Javascript function calling back to the API:
function callApi() {
    var xhr = new XMLHttpRequest();
    xhr.open('GET', 'https://localhost:44302/api/account/echo', true);
    xhr.withCredentials = true;
    xhr.send(null);
}
Response headers from dev tools for the faux login call. Set-Cookie is present:
content-type: application/json; charset=utf-8
server: Microsoft-IIS/10.0
set-cookie: LoginCookie=123456; path=/; secure; samesite=none
access-control-allow-origin: http://localhost:3000
access-control-allow-credentials: true
x-powered-by: ASP.NET
date: Sun, 31 Oct 2021 23:27:22 GMT
X-Firefox-Spdy: h2
Request headers when calling back to the API. No cookie:
GET /api/account/echo HTTP/2
Host: localhost:44302
User-Agent: Mozilla/5.0 (Windows NT 10.0; Win64; x64; rv:93.0) Gecko/20100101 Firefox/93.0
Accept: */*
Accept-Language: en-US,en;q=0.5
Accept-Encoding: gzip, deflate, br
Origin: http://localhost:3000
Connection: keep-alive
Referer: http://localhost:3000/
Sec-Fetch-Dest: empty
Sec-Fetch-Mode: cors
Sec-Fetch-Site: cross-site

Access-Control-Allow-Origin header completely ignored by FireFox

I set up my website (running IIS 8.5) to send the CORS response header to a subdomain off my main domain, and the response header is getting to Firefox just fine. All plug-ins, ad-blockers, etc. are disabled and I can see the header in the DOM inspector.
I've tried:
Access-Control-Allow-Origin: *
Access-Control-Allow-Origin: https://services.mywebsite.com
Access-Control-Allow-Origin: http://services.mywebsite.com
Access-Control-Allow-Origin: null
Access-Control-Allow-Origin: "null"
I've verified the SSL Certificate is working just fine (it's a wildcard cert for *.mywebsite.com from Sectigo and I've verified that the entire certification path is working properly)
There are no other response headers except for X-Frame-Options: SAMEORIGIN; however, I removed it with the same result.
The site predates CORS by years (ASP.NET Webforms) and there are no other settings I can find that would prevent Firefox from acknowledging this response header.
I've read dozens of posts here (usually someone had a self-signed cert or forgot something) but am at a loss as to what is wrong.
Cross-Origin Request Blocked: The Same Origin Policy disallows reading the remote resource at https://services.mywebsite.com/api/geodata/. (Reason: CORS header ‘Access-Control-Allow-Origin’ missing).
It's absolutely NOT MISSING! WTF Firefox?
Pulling hair out here. Anyone?
HTTP/1.1 200 OK
Cache-Control: no-cache
Pragma: no-cache
Transfer-Encoding: chunked
Content-Type: application/json; charset=UTF-8
Expires: -1
Server: Microsoft-IIS/8.5
X-Content-Type-Options: nosniff
Access-Control-Allow-Origin: https://services.mywebsite.com
X-Frame-Options: SAMEORIGIN
Date: Wed, 27 May 2020 08:28:05 GMT
Someone else suggested adding a CORS module to IIS. I did, then added the following to my web.config file (in the system.webServer section):
<cors enabled="true">
  <add origin="*" allowed="true">
    <allowHeaders allowAllRequestedHeaders="true" />
  </add>
</cors>
No Joy! Same message from Firefox (and Chrome) - both browsers completely ignore this directive. Could this be a bug in Mozilla?
-------------------- more info ---------------------------------
I think the problem is with the following jQuery script for my chat (which is making the call to the API). It has worked for 12 years (and still works on old versions), so I'm looking to see what has been deprecated. I suspect that SignalR may be the issue and is confusing the browser(s), since SignalR is making the request (not sure, though - just guessing now). Sorry for not mentioning this sooner.
$.connection.hub.start()
    .done(function () {
        var existingChatId = getExistingChatId(chatKey);
        $.get("https://services.mywebsite.com/api/geodata/", function (response) {
            myHub.server.logVisit(document.location.href, document.referrer, response.city_name, response.region_name, response.country_name, existingChatId);
        }, "json");
    })
    .fail(function () { chatRefreshState(false); });
------------------- after using wildcards for CORS headers --------------
HTTP/1.1 200 OK
Cache-Control: no-cache
Pragma: no-cache
Transfer-Encoding: chunked
Content-Type: application/json; charset=UTF-8
Expires: -1
Server: Microsoft-IIS/10.0
X-Content-Type-Options: nosniff
X-SourceFiles: =?UTF-8?B?RDpcU2l0ZXNcaWNhcnBldGlsZXMyXFdlYlxzaWduYWxyXHN0YXJ0?=
X-Powered-By: ASP.NET
X-Frame-Options: SAMEORIGIN
Access-Control-Allow-Origin: *
Access-Control-Allow-Methods: *
Access-Control-Allow-Headers: *
Date: Sun, 07 Jun 2020 10:31:35 GMT
Still no joy - Headers are there. Must be a bug in ASP.NET webforms, IIS, SignalR (please note this is NOT MVC). Time to upgrade this site for this client. No one supports webforms anymore, anyway - it's dead.
It's not possible to do cross-domain requests with SignalR and be CORS compatible. There is no way around this problem.
Just move your service to www.yourwebsite.com and save your hair!
You can install the CORS dependency:
"Microsoft.AspNet.Cors": "6.0.0-rc1-final"
Add the CORS services in Startup.cs:
public void ConfigureServices(IServiceCollection services)
{
    services.AddCors();
}
And enable it for a specific domain:
public void Configure(IApplicationBuilder app)
{
    app.UseCors(builder =>
        builder.WithOrigins("http://example.com"));
}
Another option is to enable CORS for a specific method:
public class HomeController : Controller
{
    [EnableCors("AllowSpecificOrigin")]
    public IActionResult Index()
    {
        return View();
    }
}
Or enable it for a specific controller:
[EnableCors("AllowSpecificOrigin")]
public class HomeController : Controller
{
}
If you're using MVC 3 and you have a Global.asax file, you can use this method:
protected void Application_BeginRequest(object sender, EventArgs e)
{
    HttpContext.Current.Response.AddHeader("Access-Control-Allow-Origin", allowedOrigin);
    HttpContext.Current.Response.AddHeader("Access-Control-Allow-Methods", "GET,POST");
}
If you're using WebApi, you might use:
Install-Package Microsoft.AspNet.WebApi.Cors
And register CORS using:
public static void Register(HttpConfiguration config)
{
    // New code
    config.EnableCors();
}
And:
[EnableCors(origins: "http://example.com", headers: "*", methods: "*")]
public class TestController : ApiController
{
    // My methods...
}
Or enable it for the whole project:
public static void Register(HttpConfiguration config)
{
    var corsAttr = new EnableCorsAttribute("http://example.com", "*", "*");
    config.EnableCors(corsAttr);
}
For ASP.NET Web Forms:
Response.AppendHeader("Access-Control-Allow-Origin", "*");
Also try:
Response.AppendHeader("Access-Control-Allow-Methods","*");
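A sketch of where those calls might live in a Web Forms code-behind (Page_Load is just one reasonable spot; adjust the header values to your needs):

// Sketch: emitting CORS headers from a Web Forms page's code-behind.
protected void Page_Load(object sender, EventArgs e)
{
    Response.AppendHeader("Access-Control-Allow-Origin", "*");
    Response.AppendHeader("Access-Control-Allow-Methods", "GET, POST, OPTIONS");
}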
Or try adding it directly in the web.config:
<system.webServer>
  <httpProtocol>
    <customHeaders>
      <add name="Access-Control-Allow-Methods" value="*" />
      <add name="Access-Control-Allow-Headers" value="Content-Type" />
    </customHeaders>
  </httpProtocol>
</system.webServer>
Do you have app.UseCors() in your middleware pipeline, before app.MapSignalR()?
You can start with app.UseCors(CorsOptions.AllowAll) to check if it'll work and then add your own domain.
app.UseCors(CorsOptions.AllowAll);
app.MapSignalR();
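Put together, a minimal OWIN Startup along those lines might look like this (the namespace is a placeholder; assumes the Microsoft.Owin.Cors and Microsoft.AspNet.SignalR packages are installed):

// Sketch: wiring CORS into the OWIN pipeline before SignalR is mapped.
using Microsoft.Owin;
using Microsoft.Owin.Cors;
using Owin;

[assembly: OwinStartup(typeof(MyApp.Startup))]   // MyApp is a placeholder namespace

namespace MyApp
{
    public class Startup
    {
        public void Configuration(IAppBuilder app)
        {
            // Start permissive to confirm CORS is the issue, then restrict to your own domain.
            app.UseCors(CorsOptions.AllowAll);
            app.MapSignalR();
        }
    }
}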

Apache2.49 cookies not working via my ProxyPass VirtualHost

In the Apache VirtualHost I have these directives:
ProxyPass "/s" "http://127.0.0.1:3001"
ProxyPassReverse "/s" "http://127.0.0.1:3001"
RewriteRule ^/s/(.*) http://127.0.0.1:3001/$1 [P,L]
ProxyPassReverseCookiePath "/" "/"
The backend server is Node.js. The proxy itself works fine. The problem is that Node is sending a Set-Cookie HTTP header (session ID) but the browser seems to ignore it. I tested with Chromium and Firefox but neither creates the cookie. I tried changing the VirtualHost configuration but nothing appears to solve the problem. The Set-Cookie header is:
set-cookie: sid=s%3AhgHWDO3D...BBUZbbOA; Path=/; HttpOnly; Secure;HttpOnly;Secure
I need your help to solve this problem. Thank you.
UPDATE
If the URL contains a direct request to the Node backend:
https://example.com/s/backend
it works: it creates the session ID cookie. But if this URL is called from an AJAX request in the JS, it does not create the cookie.
https://example.com loads an HTML page that in turn loads a JS file. That JS file makes the AJAX call to the backend using the path https://example.com/s/something, and in this case the cookie is never created.
Any suggestions?
UPDATE
I discovered that the problem occurs when I use the Fetch API to retrieve a JSON file. Running this code does not create the session ID cookie:
fetch("https://localbestbatteriesonline.com/s/p.json?0103")
.then(function(response) {
return response.json();
})
.then(function(myJson) {
console.log(myJson);
});
But if I use this code, it creates the cookie:
var xhttp = new XMLHttpRequest();
xhttp.onreadystatechange = function() {
    if (this.readyState == 4 && this.status == 200) {
        console.log(this.responseText);
    }
};
xhttp.open("GET", "https://localbestbatteriesonline.com/s/p.json?0103", true);
xhttp.send();
Analysing the requests, both are exactly the same, and both receive the cookie to set.
Any ideas why it does not work with fetch?
Problem solved. The Fetch API does not include cookies in the exchange the way XMLHttpRequest does, so it does not create the session ID cookie. To enable this, the fetch call must include the option credentials: "same-origin":
fetch("https://localbestbatteriesonline.com/s/p.json?0103",{credentials:"same-origin"})
.then(function(response) {
return response.json();
})
.then(function(myJson) {
console.log(myJson);
});
Now it works.

asp.net Web Api 2 POST [duplicate]

I'm calling this function from my ASP.NET form and getting the following error in the Firebug console when making the AJAX call.
Cross-Origin Request Blocked: The Same Origin Policy disallows reading the remote resource at http://anotherdomain/test.json. (Reason: CORS header 'Access-Control-Allow-Origin' missing).
var url = 'http://anotherdomain/test.json';
$.ajax({
    url: url,
    crossOrigin: true,
    type: 'GET',
    xhrFields: { withCredentials: true },
    accept: 'application/json'
}).done(function (data) {
    alert(data);
}).fail(function (xhr, textStatus, error) {
    var title, message;
    switch (xhr.status) {
        case 403:
            title = xhr.responseJSON.errorSummary;
            message = 'Please login to your server before running the test.';
            break;
        default:
            title = 'Invalid URL or Cross-Origin Request Blocked';
            message = 'You must explicitly add this site (' + window.location.origin + ') to the list of allowed websites in your server.';
            break;
    }
});
I've tried alternative approaches but am still unable to find the solution.
Note: I have no server rights to make server-side (API/URL) changes.
This generally happens when you try to access another domain's resources.
It is a security feature that prevents everyone from freely accessing any resources of that domain (which could be used, for example, to make an exact copy of your website on a pirate domain).
The response headers, even if the status is 200 OK, do not allow other origins (domains, ports) to access the resources.
You can fix this problem if you are the owner of both domains:
Solution 1: via .htaccess
To change that, you can write this in the .htaccess of the requested domain file:
<IfModule mod_headers.c>
Header set Access-Control-Allow-Origin "*"
</IfModule>
If you only want to give access to one domain, the .htaccess should look like this:
<IfModule mod_headers.c>
Header set Access-Control-Allow-Origin 'https://my-domain.example'
</IfModule>
Solution 2: set headers the correct way
If you set this into the response header of the requested file, you will allow everyone to access the resources:
Access-Control-Allow-Origin: *
OR
Access-Control-Allow-Origin: http://www.my-domain.example
On the server side, put this at the top of the .php file:
header('Access-Control-Allow-Origin: *');
You can restrict access to a specific domain:
header('Access-Control-Allow-Origin: https://www.example.com');
In your AJAX request, adding:
dataType: "jsonp",
after the line:
type: 'GET',
should solve this problem. Hope this helps you.
If you are using Express.js in the backend, you can install the cors package and then use it in your server like this:
const cors = require("cors");
app.use(cors());
This fixed my issue
This worked for me:
Create a PHP file that will download the content of the other domain's page without using JS:
<?php
// file name: your_php_page.php
echo file_get_contents('http://anotherdomain/test.json');
?>
Then request it via AJAX (jQuery). Example:
$.ajax({
    url: "your_php_page.php",
    // optional data might be useful
    //type: 'GET',
    //dataType: "jsonp",
    //dataType: 'xml',
    context: document.body
}).done(function(data) {
    alert(data);
});
You have to modify your server-side code (a Java JAX-RS response filter), as given below:
import java.io.IOException;
import javax.ws.rs.container.ContainerRequestContext;
import javax.ws.rs.container.ContainerResponseContext;
import javax.ws.rs.container.ContainerResponseFilter;

public class CorsResponseFilter implements ContainerResponseFilter {
    @Override
    public void filter(ContainerRequestContext requestContext, ContainerResponseContext responseContext)
            throws IOException {
        responseContext.getHeaders().add("Access-Control-Allow-Origin", "*");
        responseContext.getHeaders().add("Access-Control-Allow-Methods", "GET, POST, DELETE, PUT");
    }
}
You must have gotten the idea of why you are getting this problem after going through the above answers.
self.send_header('Access-Control-Allow-Origin', '*')
You just have to add the above line on your server side.
In a pinch, you can use this Chrome Extension to disable CORS on your local browser.
Allow CORS: Access-Control-Allow-Origin Chrome Extension

Redirecting and preventing cache in Symfony2

I am doing this:
domain.com/route-name/?do-something=1
...which sets a cookie and then redirects to this using a 302 redirect:
domain.com/route-name/
It allows an action to take place regardless of the page being viewed (the cookie stores a setting for the user).
I am using the default Symfony2 reverse-proxy cache and all is well, but I need to prevent both the above requests from caching.
I am using this to perform the redirect:
// $event being a listener, usually a request listener
$response = new RedirectResponse($url, 302);
$this->event->setResponse($response);
I've tried things like this but nothing seems to work:
$response->setCache(array('max_age' => 0));
header("Cache-Control: no-cache");
So how do I stop it caching those pages?
You have to make sure you are sending the following headers with the RedirectResponse (if the GET parameter is set) AND with your regular Response for the route:
Cache-Control: private, max-age=0, must-revalidate, no-store;
Achieve what you want like this:
$response->setPrivate();
$response->setMaxAge(0);
$response->setSharedMaxAge(0);
$response->headers->addCacheControlDirective('must-revalidate', true);
$response->headers->addCacheControlDirective('no-store', true);
private is important and missing in coma's answer.
The difference is that with Cache-Control: private you are not allowing proxies to cache the data that travels through them.
Try this on your response:
$response->headers->addCacheControlDirective('no-cache', true);
$response->headers->addCacheControlDirective('max-age', 0);
$response->headers->addCacheControlDirective('must-revalidate', true);
$response->headers->addCacheControlDirective('no-store', true);
You can use annotations too:
http://symfony.com/doc/current/bundles/SensioFrameworkExtraBundle/annotations/cache.html
And take a look at:
Why both no-cache and no-store should be used in HTTP response?
