CORS authenticated requests to Google Sites feed/API blocked - asp.net

I'm currently building an ASP.NET web application to simplify the provisioning of Google Sites: the sites themselves, their pages, gadgets and ACLs.
I have run into the issue many a developer has come across before: cross-origin resources. According to the Google documentation on CORS requests to Google APIs, you simply make an XMLHttpRequest (or AJAX) request and provide your access token in a header. More information can be found here:
https://developers.google.com/api-client-library/javascript/features/cors
I've been perfectly able to accomplish this when accessing the Google Sites API from within my domain on Google Sites, injecting AJAX requests while my browser window's location is within the domain. An example of a successful request to create a new site from within my domain:
$.ajax(
{
    ////// REQUEST \\\\\\
    type: "POST",
    url: "https://sites.google.com/feeds/site/[domainName]",
    contentType: "application/atom+xml",
    headers: {
        "Authorization": "Bearer " + [accessToken],
        "GData-Version": "1.4"
    },
    data: ["<entry xmlns='http://www.w3.org/2005/Atom' xmlns:sites='http://schemas.google.com/sites/2008'>",
           "<title>What a site</title>",
           "<summary>Best description ever.</summary>",
           "<sites:theme>ski</sites:theme>",
           "</entry>"].join(""),
    ////// LOGGING \\\\\\
    beforeSend: function () {
        console.log('-------Making-the-request-------');
    },
    success: function (result) {
        console.log(result);
    },
    error: function (xhr, ajaxOptions, thrownError) {
        console.log(thrownError);
        console.log(xhr.status);
    }
});
At this point everything was going great, and I thought I was ready to drop the code into my ASP.NET site. Alas, things don't always go to plan. When I executed the exact same AJAX call from within my application (right now still hosted on https://localhost:44301), I got the following error:
XMLHttpRequest cannot load https://sites.google.com/feeds/site/[censored]. Response to preflight request doesn't pass access control check: No 'Access-Control-Allow-Origin' header is present on the requested resource. Origin 'https://localhost:44301' is therefore not allowed access. The response had HTTP status code 405.
The usual CORS error. I was surprised, though, as this is exactly the advised way of making requests to Google APIs. I've also found an article about using CORS with the Google Cloud Storage API:
https://cloud.google.com/storage/docs/cross-origin
In the article it states:
Most clients (such as browsers) use the XMLHttpRequest object to make a cross-domain request. XMLHttpRequest takes care of all the work of inserting the right headers and handling the CORS interaction with the server. This means you don't add any new code to take advantage of CORS support, it will simply work as expected for Google Cloud Storage buckets configured for CORS.
Of course, this isn't the Google Sites API, but I find it hard to believe that Google hasn't implemented the same functionality in all of their APIs.
Does anyone know whether it's possible to achieve successful requests such as this from within a standalone ASP.NET application? And if so, how?
Many thanks for spending time to read about my hardships.
UPDATE:
I've contacted Google Apps Support regarding my issue, and received the following response:

In addition to the information you provided, I also reviewed your post at CORS authenticated requests to Google Sites feed/API blocked. The note at https://developers.google.com/google-apps/sites/docs/1.0/developers_guide_java#SiteFeedPOST only reinforces the statement under 'Can I create a new Google Site?' and 'How do I copy a site?' at https://developers.google.com/google-apps/sites/faq#Getting__Started, which states 'Google Apps users can use the site feed to ...' However, I don't see why this is relevant to your issue, if you've authorised against your domain administrator account, as the note is only indicating that gmail.com users won't be able to use the listed methods to create, or copy a site.

I haven't used CORS, so can't comment on its operation, but have been able to successfully list, and create sites using HTTP GET and POST requests via a raw HTTP client, so the API is operating as it should with regard to cross domain requests. I used the sample XML document at https://developers.google.com/google-apps/sites/docs/1.0/developers_guide_protocol#SitesFeedPOST to create my site, configuring the client with credentials for my Developer console project. The fact that the request only fails in your ASP.NET site implies that there is something in that environment which isn't configured correctly. Unfortunately that's outside my scope of support, so I'm unable to provide any specific advice, other than to check the relevant documentation, or post a request in the ASP.NET section of Stack Overflow at https://stackoverflow.com/questions/tagged/asp.net.

Could you try again after removing "GData-Version": "1.4"?
If you really want to send this value, send it by adding a query parameter such as v=X.0 instead. Resource from here.
UPDATE:
"Note: This feature is only available to Google Apps domains." (from the guides)

SOLVED
It seems that a lot of browsers will still block an AJAX request across different domains, even when it's allowed by the API you're trying to reach. Instead of using AJAX, I'm now making the request with the C# WebRequest class in a DLL.
Example:
// Create the request
WebRequest request = WebRequest.Create("https://sites.google.com/feeds/site/[domainName]");
request.Method = "POST";
request.ContentType = "application/atom+xml";
request.Headers.Set(HttpRequestHeader.Authorization, "Bearer " + [accessToken]);
request.Headers["GData-Version"] = "1.4";
// Fill in the data and encode it for the request stream
string data = "<entry xmlns='http://www.w3.org/2005/Atom' xmlns:sites='http://schemas.google.com/sites/2008'>" +
              "<title>What a site</title>" +
              "<summary>Best description ever.</summary>" +
              "<sites:theme>ski</sites:theme>" +
              "</entry>";
byte[] byteArray = Encoding.UTF8.GetBytes(data);
// Add the data to the request
Stream dataStream = request.GetRequestStream();
dataStream.Write(byteArray, 0, byteArray.Length);
dataStream.Close();
// Make the request and get the response
WebResponse response = request.GetResponse();
More info can be found on MSDN:
https://msdn.microsoft.com/en-us/library/debx8sh9(v=vs.110).aspx
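For completeness, here's a minimal sketch of how the response could then be read back; it assumes the response variable from the block above plus the usual using directives for System.IO and System.Text:
// Read back the Atom XML that the Sites API returns for the newly created site
using (Stream responseStream = response.GetResponseStream())
using (StreamReader reader = new StreamReader(responseStream, Encoding.UTF8))
{
    string responseXml = reader.ReadToEnd();
    Console.WriteLine(responseXml);
}
response.Close();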

Related

CORS / Cross Origin Isolation / Google APIs

I'm trying to integrate the Zoom Web Video SDK into an existing web application. SharedArrayBuffer has become a requirement for performance reasons, and to enable it the site has to implement Cross-Origin Isolation. I've gone ahead and added the requisite configuration to NGINX, namely:
add_header 'Cross-Origin-Embedder-Policy' 'require-corp';
add_header 'Cross-Origin-Opener-Policy' 'same-origin';
... but of course this has knock-on effects for the rest of the previously existing and working site: the Google APIs no longer seem to load successfully.
I changed my index.html to add the crossorigin attribute to the Google API script tag as follows:
<script src="https://apis.google.com/js/platform.js" async defer crossorigin></script>
... and in my JavaScript source code I have (paraphrasing and reducing complexity to get to the point):
gapi.load('client:auth2', function() {
    gapi.client
        .init({
            client_id: 'MY-CLIENT-ID',
            cookiepolicy: 'single_host_origin',
            discoveryDocs: ['https://classroom.googleapis.com/$discovery/rest?version=v1'],
            scope: 'profile email'
        })
        .then(() => console.log('init finished'))
        .catch(e => console.error('init failed', e));
});
In that code, gapi and gapi.client are well-defined, but the init call never completes (no console logs from the then or from the catch). Looking at the network tab in devtools shows a failed GET request to:
https://content-classroom.googleapis.com/static/proxy.html?usegapi=1& ... plus a bunch of other parameters I'm omitting in case they're sensitive.
When you dive into the response, it shows:
To use this resource from a different origin, the server needs to specify a cross-origin resource policy in the response headers:
Cross-Origin-Resource-Policy: same-site (choose this option if the resource and the document are served from the same site)
Cross-Origin-Resource-Policy: cross-origin (only choose this option if an arbitrary website including this resource does not impose a security risk)
Obviously, I can't control what Google's servers do, but can anyone instruct me as to how I can get this to work correctly? This only goes awry in the presence of my NGINX configuration change at the start of this post.
Update
I partially worked around this by fetching the discovery document for the API separately and passing the result into the gapi.client.init method instead of the URL. I no longer get the aforementioned outcome in the network tab of devtools, but I instead get weird, inconsistent results, with responses like "popup_closed_by_user" and "popup_closed_by_browser" coming back from my GoogleAuth.signIn call. If I remove the headers from NGINX it starts behaving as expected again. I don't understand what's going on with this.

No-Access-Control-Allow-Origin for authorization token but the URLs are set

The LinkedIn API is tough to reach, to say the least... :) Or maybe I'm just dumb, who knows!
I have already retrieved the code and it is passed along.
I have this request set up as per the docs (sorry for the font size, trying to fit everything)
However, when I attempt this request through a browser, I am greeted with this CORS error:
Here are the current URLs set up in my LinkedIn App Auth section:
I put both in there just to cover all my bases. Still getting the error. Conceptually, I get it: you should not be able to call the API if your domain is not recognized by LinkedIn. However, the redirect URLs, as I understand it, bridge that gap and are 'allowed domains'.
Any idea how to fix this? Thank you everyone!
Generally it's a server-side issue, but you can work around it from the client side by routing the request through a CORS proxy URL:
const proxyURL = "https://cors-anywhere.herokuapp.com/";
const requestURL = YOUR_URL;
$.getJSON(proxyURL + requestURL, function(data) {
    console.log(data);
});

Is there a way to specify trusted origins for POST requests in a Google web app?

Let's say I created a Google Sheet to capture users' email addresses.
On my website there is a small form, and once the submit button is clicked, an AJAX request is fired to a Google web app that writes the data to the sheet:
// Let's select and cache all the fields
var $inputs = $form.find("input, select, button, textarea");
// Serialize the data in the form
var serializedData = $form.serialize();
// Fire off the request
request = $.ajax({
    url: "https://script.google.com/macros/s/longURLcode/exec",
    type: "post",
    data: serializedData
});
In the Google Apps Script you then use doPost(e) or doGet(e) to handle any incoming HTTP request. To allow this to work as a sign-up mechanism, the permissions for the web app have to be set to (I think!?):
Execute the app as: Me (myemail@gmail.com)
Who has access to the app: Anyone, even anonymous
Given that everything on the Google Apps Script side is set up properly, this works like a charm. So what's wrong?
Problem:
Anyone can either look into the source code of my web page or use the dev tools to extract the URL of the Google web app after clicking submit. This URL can then, in theory, be used to flood the sheet with countless (undefined) entries.
Questions:
1) Is there a way to limit accepted HTTP requests to certain origins? I tried to do this by accessing the HTTP headers within doPost(), but there seems to be no way to do so.
2) Is "Who has access to the app: Anyone, even anonymous" the wrong approach? I thought this was necessary, since you can only choose Google users here, and some URL (mywebsite.com) seemed to fall within the anonymous category.
3) I don't think this is possible, but maybe I missed an option: is there a way to NOT expose the Google web app URL to anyone? I guess not, because you can monitor any request with dev tools.
4) Is using Sheets for capturing that kind of data just a terrible idea in general (partly for the above reasons), and should I find another solution ASAP?

Google Tag Manager 403's every request even if CORS mapping is defined

When I moved to AMP, Google Tag Manager stopped working.
The problem occurs every time I open my AMP page; I can see some errors in the browser console, e.g.
First error:
https://www.googletagmanager.com/amp.json?id=MY_GTM_TAG&gtm.url=MY_HTTP_URL
(403)
Second error:
No 'Access-Control-Allow-Origin' header is present on the requested resource. Origin '' is therefore not allowed access. The response had HTTP status code 403. If an opaque response serves your needs, set the request's mode to 'no-cors' to fetch the resource with CORS disabled.
In my class that extends WebMvcConfigurerAdapter I have overridden the addCorsMappings method like this:
@Override
public void addCorsMappings(CorsRegistry registry) {
    registry.addMapping("/**")
            .allowedOrigins("*")
            .allowedHeaders("*")
            .allowCredentials(true);
}
But it still doesn't work (this method is executed on startup, I checked it). Do you have any ideas / tips why?
EDIT 1 (22.12.2016):
Q: How are you loading Tag Manager? Are you using the AMP version of the script? (@Jim Jeffries)
A: Yes, in <head> I included the following piece of code:
<script async custom-element="amp-analytics" src="https://cdn.ampproject.org/v0/amp-analytics-0.1.js"></script>
and in <body> there is:
<amp-analytics config="https://www.googletagmanager.com/amp.json?id=${googleTagId}&gtm.url=SOURCE_URL" data-credentials="include"></amp-analytics>
I was having the same issue, and it turns out you can't use your old GTM "Web" container for this, so you'll have to create a specific AMP container.
As per Google's instructions found here:
Create an AMP container
Tag Manager features an AMP container type. Create a new AMP container for your project:
On the Accounts screen, click More Actions (More) for the account you'd like to use. Select Create Container.
Name the container. Use a descriptive name, e.g. "example.com - news - AMP".
Under "Where to Use Container", select AMP.
Click "Create".
Based on this thread, maybe you are doing an XMLHttpRequest to a different domain than the one your page is on, so the browser is blocking it, as it usually only allows requests within the same origin for security reasons. You need to do something different when you want to make a cross-domain request. A tutorial on how to achieve that is Using CORS.
When you are using Postman, requests are not restricted by this policy. Quoted from Cross-Origin XMLHttpRequest:
Regular web pages can use the XMLHttpRequest object to send and receive data from remote servers, but they're limited by the same origin policy. Extensions aren't so limited. An extension can talk to remote servers outside of its origin, as long as it first requests cross-origin permissions.
Also, based on this forum, the app must authenticate as a full admin and POST the desired CORS configuration to /rest/system/config.

How to call a JSON service in a secure manner in ASP.NET

Hello, I have certain APIs which I am getting from service providers. The keys contain a secured ID and password that we need to send with every API request through JSON.
Presently I am using:
$.ajax({
    url: "http://api",
    dataType: 'jsonp',
    data: { 'UserName': 'abce', 'Password': 'Password' },
    success: function(results) {
        console.log(results);
    }
});
So, is there any way to avoid exposing those values in the JSON request? I am creating the application in ASP.NET. Can you suggest what we can do to encrypt them?
No, there is no way if you make the call from JavaScript. One possibility is to have a server-side script on your own domain which acts as a bridge. You could then send the AJAX request to your script, which in turn will delegate the call to the remote service. You don't need JSONP in this case.
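As a rough sketch of that bridge idea in ASP.NET (the handler name Bridge.ashx, the endpoint URL and the query parameters below are placeholders, not the real provider API), a generic handler could forward the call server-side so the credentials never appear in the page source:
// Hypothetical ASP.NET generic handler (Bridge.ashx) acting as a server-side proxy.
using System.IO;
using System.Net;
using System.Text;
using System.Web;

public class Bridge : IHttpHandler
{
    public void ProcessRequest(HttpContext context)
    {
        // The secret credentials are added here, on the server, never in the browser.
        WebRequest request = WebRequest.Create(
            "https://api.example.com/endpoint?UserName=abce&Password=Password");
        request.Method = "GET";

        // Relay the remote response back to the browser as plain JSON.
        using (WebResponse response = request.GetResponse())
        using (StreamReader reader = new StreamReader(response.GetResponseStream(), Encoding.UTF8))
        {
            context.Response.ContentType = "application/json";
            context.Response.Write(reader.ReadToEnd());
        }
    }

    public bool IsReusable
    {
        get { return false; }
    }
}
The page's JavaScript would then call /Bridge.ashx on its own origin with a plain AJAX request (no JSONP needed), and the credentials stay on the server.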
No, there's no way to do that.
You can hide the information in transit if you go over HTTPS (which provides encryption for the tunnel); this prevents eavesdropping, but not a man-in-the-middle if the SSL is provided by him.
I suggest that you use sessions + HttpOnly cookies, which makes sense in this case. Even if the session is captured, the identity can't be hijacked. Put the communication over HTTPS and you've done what you can.
[If the API is provided by a 3rd party, then you have no chance at all.]
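As a small sketch of that suggestion, assuming classic ASP.NET (System.Web) where Session and Response are available in a page's code-behind, and with placeholder names and values:
protected void Page_Load(object sender, EventArgs e)
{
    // Keep the secrets in server-side session state; they never reach the browser.
    Session["ApiCredentials"] = "secured-id-and-password";

    // Hand the browser only an opaque token that scripts can't read and that only travels over HTTPS.
    var cookie = new HttpCookie("authToken", Guid.NewGuid().ToString())
    {
        HttpOnly = true,  // not accessible to client-side JavaScript
        Secure = true     // only sent over HTTPS
    };
    Response.Cookies.Add(cookie);
}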
