Azure Content Moderator Portal - Unable to load Azure Media Services Video

We are creating video reviews in the review tool using the code here, and everything used to work a few months back.
Now the only problem we are facing is that the video does not load in the review tool.
In the Chrome console, it says CORB blocked the response:
Cross-Origin Read Blocking (CORB) blocked cross-origin response https://REDACTED.streaming.media.azure.net/REDACTED/ignite_c_c.ism/manifest with MIME type application/vnd.ms-sstr+xml. See https://www.chromestatus.com/feature/5629709824032768 for more details.
I can also see 0 B responses in the network tab, and Firefox shows a similar failure.
However, if I paste the same video manifest URL into the Azure Media Test Tool, it plays fine there.
Any help to fix the video loading issue would be greatly appreciated.

If the same setup worked without any changes a few months ago, the cause may be a browser update (unless you have updated the endpoints or the Cross site access policies headers). Refer to Configure CDN profile.
However, "CORB" referred above seems similar to CORS (Cross Origin Resource Sharing).
It is an HTTP feature that enables a web application running under one
domain to access resources in another domain. In order to reduce the
possibility of cross-site scripting attacks, all modern web browsers
implement a security restriction known as same-origin policy. This
prevents a web page from calling APIs in a different domain. CORS
provides a secure way to allow one origin (the origin domain) to call
APIs in another origin.
CORS on Azure CDN works automatically with no additional configuration. When you create a new account, Azure CDN integration is enabled by default on the default streaming endpoint. If you later want to disable or enable the CDN, your streaming endpoint must be in the stopped state. It can take up to two hours for the Azure CDN integration to be enabled and for the changes to be active across all the CDN POPs.
You might want to start by setting the Access-Control-Allow-Origin header to a wildcard (*), which effectively disables CORS restrictions and allows any URL to access the CDN endpoint.
Refer: Using Azure CDN with CORS
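One quick way to see whether CORS is the culprit is to request the manifest directly and inspect the response headers. A minimal sketch, assuming Node 18+ (built-in fetch); the manifest URL is the redacted one from the question, and the Origin value is a placeholder for whatever origin the review tool page runs on:
check-cors.ts
// Requests the streaming manifest and prints the headers the browser relies on for CORS.
// The URL and the Origin value below are placeholders; substitute your own.
const manifestUrl =
  "https://REDACTED.streaming.media.azure.net/REDACTED/ignite_c_c.ism/manifest";

async function checkCorsHeaders(): Promise<void> {
  const response = await fetch(manifestUrl, {
    headers: { Origin: "https://your-review-tool-origin.example" }, // placeholder origin
  });

  console.log("Status:", response.status);
  console.log("Content-Type:", response.headers.get("content-type"));
  // If this header is missing, or matches neither "*" nor the requesting origin,
  // the browser will not hand the response to the page.
  console.log("Access-Control-Allow-Origin:", response.headers.get("access-control-allow-origin"));
}

checkCorsHeaders().catch(console.error);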
Caution: The Content Moderator Review tool is now deprecated and will be retired on 12/31/2021.
Video moderation enables detection of potential adult content in videos. The review tool internally calls the automated moderation APIs and presents the items for review right within your web browser.
There are multiple indications:
SameSite cookie flag error
No decoders for requested formats
CORB error
You can give this a try though:
Set the SameSite by default cookies flag value to Disabled in Chrome 80 and later versions.
In your Chrome browser session, navigate to chrome://flags/ and search for the flag SameSite by default cookies.
Select Disabled.

Related

Firebase Analytics not logging events

I have a React web app. I set up analytics as described in the documentation here: https://firebase.google.com/docs/analytics/get-started
With the help of the Analytics extension in Chrome, and in my dev environment, I can see the logs in the DebugView section. That means I set up analytics correctly in the app (I believe).
However, if I deploy my app to my https://myapp.web.app domain, nothing logs. I checked the Hosting section, and my app is correctly deployed and it is selected.
I updated my Firebase SDK recently (8.7.0), and I added measurementId in the settings, although the docs say measurementId is optional.
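For reference, my initialization is essentially what the docs show; a rough sketch with placeholder config values (v8 namespaced SDK):
firebase-init.ts
// Rough shape of my analytics setup (Firebase v8 namespaced SDK, placeholder config values).
import firebase from "firebase/app";
import "firebase/analytics";

const firebaseConfig = {
  apiKey: "...",
  authDomain: "myapp.firebaseapp.com",
  projectId: "myapp",
  storageBucket: "myapp.appspot.com",
  messagingSenderId: "...",
  appId: "...",
  measurementId: "G-XXXXXXXXXX", // added even though the docs say it is optional
};

firebase.initializeApp(firebaseConfig);
const analytics = firebase.analytics();

// Example event: visible in DebugView locally, but nothing shows up once deployed.
analytics.logEvent("page_view");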
Am I missing something? Is there any way to see if I'm missing something?
Enabling Google Analytics involves API requests to the Firebase Installations Service, to google-analytics.com, and to googletagmanager.com.
I use Firefox, because Chromium sends my computer's CPU and RAM consumption to the moon, even with a single tab open. And in Firefox, unlike Chromium and Brave, among other browsers, nothing was logged in the console.
Chromium and Brave logged API request errors.
So, you need to add the Firebase Installations Service API key in the cloud console.
I feel this could be mentioned in the documentation, because it's not very obvious.
Anyway, someone explained it very clearly here: Firebase: 403 PERMISSION_DENIED (FirebaseError: Installations): Requests are blocked, after updating SDKs (FirebaseInstallationsService)
Now Firebase Analytics shows logs when using Chromium.
However, these requests are blocked using Firefox and Brave (and therefore no logs are shown in Firebase Analytics). My understanding is it has to do with default settings in the browser.
With Brave, it's the GET requests to googletagmanager.com/ that are blocked.
With Firefox, it's the POST requests to google-analytics.com/ that are blocked. The above-mentioned GET request is NOT blocked by Firefox.
https://rankfuse.com/blog/firefox-browser-blocking-google-analytics/
Does anyone know of a workaround? I understand some internet users can be annoyed by tracking systems such as GA, but internet services need such tracking systems to improve their overall user experience, and if browsers block analytics services, we are kind of stuck.
EDIT: OK, so after a bit of research into the above issue of browsers blocking analytics requests, I came across various paid services marketed as workarounds, as well as various tricks to bypass analytics blocking.
One straightforward way is to proxy requests from your users' browsers to google-analytics.com yourself. A good article that explains how to proceed can be found here: https://iainbean.com/posts/2020/the-shady-world-of-google-analytics-proxying/
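To give a rough idea of what that article describes, here is a minimal sketch of such a proxy as an Express handler; the /ga/collect path and the port are illustrative choices, not something prescribed by Google Analytics:
ga-proxy.ts
// Forwards analytics hits from your own domain to google-analytics.com,
// so blockers that match on the google-analytics.com hostname never see the request.
import express from "express";

const app = express();
app.use(express.text({ type: "*/*" })); // GA hits arrive as small URL-encoded text bodies

app.post("/ga/collect", async (req, res) => {
  // Pass the hit through unchanged to the real collection endpoint (Node 18+ global fetch).
  const upstream = await fetch("https://www.google-analytics.com/collect", {
    method: "POST",
    body: req.body,
    headers: { "Content-Type": req.get("content-type") ?? "text/plain" },
  });
  res.status(upstream.status).end();
});

app.listen(8080, () => console.log("GA proxy listening on :8080"));
On the client you would then point the analytics transport at /ga/collect on your own domain instead of google-analytics.com; the linked article covers the details and the caveats.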

Static website I am hosting cannot be reached and the server IP cannot be found

I recently used Google Domains to register a domain and have connected it to Google Cloud Console to manage a static website. I followed the Google Codelabs guide to set it up and faced no issues. However, when refreshing my website, it still doesn't load and my browser (Chrome) gives me the following error message:
This site can’t be reached
carbonfootprint.dev’s server IP address could not be found.
As well, going to www.carbonfootprint.dev gives me another error message:
Your connection is not private
Attackers might be trying to steal your information from www.carbonfootprint.dev (for example, passwords, messages, or credit cards).
NET::ERR_CERT_COMMON_NAME_INVALID
...which is confusing, because I was under the impression that a .dev domain suffix gives SSL certification by default.
However, in my Google Domains settings, the website content appears as it should in the minimized preview that exists in both the Domain Overview panel and Website panel. It has been over 48 hours, so it should have updated by now if it were just a delay issue.
For reference, this is what my Custom resource records look like, this is what my synthetic records look like, and these are my bucket details in Google Cloud Console. As well, here is a preview of the website, as shown in the Google Domains console.
Any help is much appreciated!
Ended up finding the answer thanks to #IshRaj on ServerFault.
For future reference to anyone else viewing: Google Cloud Storage only supports HTTP connections when hosting a static website through CNAME resource records. To serve content through a custom domain over SSL, you will need to do one of the following:
Set up an external HTTPS load balancer (instructions here), potentially with Google Cloud CDN (set-up documentation here)
Connect a third-party Content Delivery Network to your Google Cloud Storage (guide here)
Host your static website on Google App Engine with Python (guide here)
Serve static website content through Google Firebase rather than Google Cloud Platform (tutorial here / additional support)
Personally, I went with Google Firebase (the last option), which automatically upgrades websites to https. It was simple and quick to set up and content is now directly deployable from my files. As well, with Firestore's automatic scalability and powerful queries, Firebase becomes a viable alternative, especially with its other features (user authentication, realtime data synchronization, machine-learning, extensions).

Firebase Hosting returning 500 internal error for Googlebot user-agent when using Google Chrome's "network conditions" tab?

I've got the following set up on my Firebase web app (it's a Single Page App built with React):
I'm doing SSR for robot user agents, so they get fully rendered HTML and no JavaScript
Regular users get the empty HTML and the JavaScript to run the app.
firebase.json
"rewrites": [{
"source": "/**",
"function": "ssrApp"
}]
Basically every request should go into my ssrApp function, which will detect robot crawler user agents and decide whether to respond with the SSR version for the robots or the JS version for the regular users.
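Simplified, the function looks something like this (a sketch; the bot list is abbreviated and the two render helpers are placeholders for my actual implementation):
functions/src/index.ts
import * as functions from "firebase-functions";

// Abbreviated list of crawler user agents I care about.
const BOT_UA = /googlebot|bingbot|whatsapp|facebookexternalhit|twitterbot|slackbot/i;

// Placeholder helpers; the real SSR renderer and HTML shell are app-specific.
async function renderToHtml(path: string): Promise<string> {
  return `<html><body>SSR content for ${path}</body></html>`;
}
function emptyShellHtml(): string {
  return `<html><body><div id="root"></div><script src="/bundle.js"></script></body></html>`;
}

export const ssrApp = functions.https.onRequest(async (req, res) => {
  const userAgent = req.get("user-agent") ?? "";
  console.log("User-Agent:", userAgent); // this is the log I see in the Firebase Console

  if (BOT_UA.test(userAgent)) {
    res.status(200).send(await renderToHtml(req.path)); // robots: fully rendered HTML
  } else {
    res.status(200).send(emptyShellHtml()); // regular users: empty shell + JS bundle
  }
});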
It is working as intended. Google is indexing my pages, and I always log some info about the user agents from my ssrApp function. For example, when I share a URL on WhatsApp, I can see the WhatsApp crawler in my logs from the Firebase Console.
But the weird thing is that I'm not able to mimic Googlebot using Chrome's Network Conditions tab.
When I try to access my site using Googlebot's user agent, I get a 500 - Internal error.
And my ssrApp function isn't even triggered, since NOTHING is logged from it.
Is this a Firebase Hosting built-in protection to avoid fake Googlebots? What could be happening?
NOTE: I'm trying to mimic Googlebot's user agent because I want to inspect the SSR version of my app in production. I know that there are other ways to do that (including some Google Search Console tools), but I thought that this would work.
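For reference, one other way to send the same kind of request outside the browser is a short Node 18+ script (built-in fetch; the URL is a placeholder for my Hosting URL):
probe-googlebot.ts
// Requests the page with a Googlebot User-Agent and prints the status code.
const url = "https://your-site.web.app/"; // placeholder

const googlebotUA =
  "Mozilla/5.0 (compatible; Googlebot/2.1; +http://www.google.com/bot.html)";

async function main(): Promise<void> {
  const res = await fetch(url, { headers: { "User-Agent": googlebotUA } });
  console.log("Status:", res.status); // a 500 here would mean the problem is not Chrome-specific
  console.log((await res.text()).slice(0, 200));
}

main().catch(console.error);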
Could you check that your pages are still in the Google index? I have the exact same experience and 80% of my pages are now gone...
When I look up a page in Google Search Console https://search.google.com/search-console it indicates there was an issue during the last crawl. When I "Test it Live" it spins and reports the error 500 as well and asks to "try again later"...
I had a related issue, so I hope this could help anyone who encounters what I did:
I got the 500 - Internal error response for paths that were routed to a Cloud Run container when serving a Googlebot user agent. As it happens, if the Firebase Hosting CDN had a cached response for the path, it would successfully serve the cached response, but if it didn't have a cached response, the request would never reach the Cloud Run container; it would fail at the Firebase Hosting CDN with 500 - Internal error.
It turns out that the Firebase Hosting CDN secretly probes and respects any robots.txt that is served by the Cloud Run container itself (not the Hosting site). My Cloud Run container served a robots.txt which disallowed all access to bots.
Seemingly, when the Hosting CDN attempts to serve a bot's request for a path that is routed to a Cloud Run container, it will first probe any /robots.txt that is accessible at the root of that container itself, and then refuse to send the request to the container if it is disallowed by the rules therein.
Removing/adjusting the robots.txt file on the Cloud Run container itself immediately resolved the issue for me.
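For illustration, if the container is an Express app, the fix amounts to serving (or adjusting) the robots.txt from the container itself; a minimal sketch, assuming an Express-based Cloud Run service with placeholder handlers:
server.ts
// Express app running in the Cloud Run container behind Firebase Hosting.
// The Hosting CDN probes this container's own /robots.txt before forwarding bot traffic,
// so the rules served here must allow the crawlers you want to reach the container.
import express from "express";

const app = express();

app.get("/robots.txt", (_req, res) => {
  // Previously this disallowed all bots, which is what caused the 500s.
  res.type("text/plain").send("User-agent: *\nAllow: /\n");
});

app.get("*", (_req, res) => {
  res.send("<html><body>SSR content goes here</body></html>"); // placeholder page handler
});

const port = Number(process.env.PORT) || 8080;
app.listen(port, () => console.log(`Listening on ${port}`));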

Skype Web Control support

I am trying to find a place on the internet where I can get support for the Skype Web Control (a dedicated support web site, forum, chat, or documentation); a place where I can report problems and find help.
Here are my issues, maybe someone has a solution:
I use the Skype Web Control with a Microsoft Chatbot (Azure, LUIS) and it works pretty well.
But the smileys are not displayed in the conversation when the user sends one. Space is reserved for the smiley, but no smiley appears in it. If the user is connected, the conversation in the Skype application displays the smileys correctly.
And when the bot answers with a smiley, it is displayed as the text :) and not replaced by an image. Is there a way to fix this?
I also have the following error:
Cross-Origin Request Blocked:
The Same Origin Policy disallows reading the remote resource at
https://browser.pipe.aria.microsoft.com/Collector/3.0/?qsp=true&content-type=application%2Fbond-compact-binary&client-id=NO_AUTH&sdk-version=ACT-Web-JS-2.9.0&x-apikey=xxx.
(Reason: CORS request did not succeed).
Does anyone know how to fix it?
There are multiple questions here. To enable emojis, you can customize the webchat control. We did this and enabled emojis and other features too. Below is the link to the source of the webchat:
https://github.com/Microsoft/BotFramework-WebChat
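For what it's worth, in the current Web Chat (v4) bundle this reportedly no longer requires forking the source; emoticon-to-emoji conversion can be switched on through a style option. A minimal sketch, assuming the CDN webchat.js bundle is loaded on the page and using a placeholder token endpoint (emojiSet needs a reasonably recent Web Chat build):
webchat-emoji.ts
// Renders Bot Framework Web Chat with emoticon-to-emoji conversion enabled.
declare const WebChat: any; // global provided by the webchat.js CDN bundle

async function renderChat(): Promise<void> {
  const res = await fetch("https://example.com/api/directline/token"); // placeholder token endpoint
  const { token } = await res.json();

  WebChat.renderWebChat(
    {
      directLine: WebChat.createDirectLine({ token }),
      styleOptions: {
        emojiSet: true, // converts typed emoticons such as :) into emoji in the transcript
      },
    },
    document.getElementById("webchat")
  );
}

renderChat().catch(console.error);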
For the CORS issue (a cross-domain security issue), have you tried placing the code on an MS Azure app? We faced the issue with our own server, but not on an Azure app. CORS can be configured using web.config too.

Wordpress listed as a web browser in google analytics

We have users who manage to bypass JavaScript form validation, and we see in Google Analytics that they all share a common web browser... "WordPress". What does it mean?
'WordPress' most likely means it's an automated request. WordPress provides an API for making HTTP calls, WP_Http, and its requests are shown as 'WordPress' in analytics (but this can be overridden).
'Automated' doesn’t mean it’s malicious. WordPress sites can have valid reasons to access your site (for example, to check for dead links). The fact that ‘WordPress’ accesses your site doesn’t mean anything bad per se.
