I have custom CSP headers in next.config.js that are loaded into the head of the page in ./layouts.
const defaultCSP = {
"script-src": [
"'self'",
"'unsafe-eval'",
"'unsafe-inline'",
`'nonce-${nonce}'`,
"tagmanager.google.com/",
"googletagmanager.com",
],
"script-src-elem": [
"'self'",
`'nonce-${nonce}'`,
"tagmanager.google.com/",
"googletagmanager.com",
],
};
I'm trying to use react-gtm-module-nonce like so in _app.tsx:
useEffect(() => {
TagManager.initialize({
gtmId: GTM_CONTAINER_ID,
auth: GTM_AUTH,
preview: GTM_PREVIEW,
nonce: NONCE,
});
}, []);
but when the app is loaded I see the following error:
Refused to execute inline script because it violates the following Content Security Policy directive: "script-src-elem 'self' 'nonce-SOME_NONCE_VALUE' tagmanager.google.com/ googletagmanager.com". Either the 'unsafe-inline' keyword, a hash ('sha256-r7NoIbKRzEIuATQ9EL7eN52m5xWoVwuBBTdGzzqnMbY='), or a nonce ('nonce-...') is required to enable inline execution.
It seems like I have the necessary items to run GTM but can't get past CSP. Any clues as to what's happening here? I've tried adding 'unsafe-inline' to script-src-elem but then it shows that it will be ignored if there's a nonce.
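For context, a minimal sketch of the constraint involved (not a verified fix, and assuming the package exposes the same default TagManager export used above): the nonce value emitted in the CSP header and the one passed to TagManager.initialize have to be the literal same per-request string, so both should come from a single source. NEXT_PUBLIC_GTM_CONTAINER_ID is a placeholder.
// Sketch only: `nonce` is assumed to be whatever value the CSP header was
// rendered with (e.g. read from a meta tag or an SSR-injected prop), not a
// separately generated constant.
import TagManager from "react-gtm-module-nonce";

export function initGtm(nonce) {
  TagManager.initialize({
    gtmId: process.env.NEXT_PUBLIC_GTM_CONTAINER_ID, // placeholder env var
    nonce, // must match the 'nonce-...' token in script-src / script-src-elem
  });
}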
I have various image URLs that change over time (the images are fetched from the web by URL, not stored locally or in private storage). In order to render the <Image /> tag, the domains have to be passed to the Next.js config.
It isn't possible to pass in hundreds of URLs over time.
How can I allow all domains?
/** @type {import('next').NextConfig} */
const nextConfig = {
reactStrictMode: true,
images: {
domains: [
"img.etimg.com",
"assets.vogue.com",
"m.media-amazon.com",
"upload.wikimedia.org",
],
},
};
module.exports = nextConfig;
I tried this, but it didn't work:
"*.com"
This works for me, according to the Next.js documentation:
const nextConfig = {
images: {
remotePatterns: [
{
protocol: "https",
hostname: "**",
},
],
},
};
next.js remote patterns
The domain is required to be explicit, per their documentation:
To protect your application from malicious users, you must define a list of image provider domains that you want to be served from the Next.js Image Optimization API.
You can also see in the source code how URLs are evaluated; a wildcard is not accepted.
https://github.com/vercel/next.js/blob/canary/packages/next/client/image.tsx#L864
Solution
You should look at a proxy like Cloudinary or imgix, allow those domains in next.config.js, and use their fetch features to load external images.
For example, with Cloudinary as the allowed domain:
module.exports = {
images: {
domains: ['res.cloudinary.com'],
},
};
and then in your code
<Image
src="https://res.cloudinary.com/demo/image/fetch/https://a.travel-assets.com/mad-service/header/takeover/expedia_marquee_refresh.jpg"
width={500}
height={500}
/>
Important
The following StackBlitz utilizes their demo account to fetch an external image from Expedia.com; you should get your own account for production.
https://stackblitz.com/edit/nextjs-pxrg99?file=pages%2Findex.js
I am using Vue 3.1.5 and Vue CLI 4.5.0 for a special kind of application (a Chrome extension) and get the following error:
Refused to evaluate a string as JavaScript because 'unsafe-eval' is not an allowed source of script in the following Content Security Policy directive: "script-src chrome://resources 'self'".
in runtime-core.esm-bundler.js
function compileToFunction(source, options = {}) {
  // ...
  // compile
  const { code } = baseCompile(source, options);
  // evaluate function
  const msg = new Function(`return ${code}`)();
  // ...
}
Is there any way to build a Vue 3 application that is compatible with CSP?
I have tried these options:
config.resolve.alias.set('vue$', 'vue/dist/vue.esm.js');
configureWebpack: { devtool: 'inline-source-map' }
configureWebpack: { devtool: false }
Are there any other options or I have missed something?
Regards,
Pavel
Yes, Vue uses new Function for runtime compilation of templates.
If you do not wish to allow 'unsafe-eval' in the CSP, you have to pre-compile templates into render functions.
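For illustration, a minimal sketch of what pre-compiling amounts to in practice, assuming a standard bundler setup (vue-loader compiles .vue templates at build time, or you write render functions by hand), so nothing is compiled in the browser and 'unsafe-eval' is never needed:
// main.js — runtime-only usage: no template strings are compiled at runtime
import { createApp, h } from "vue";

const App = {
  // a hand-written render function instead of a `template` option
  render() {
    return h("div", "Hello from a CSP-friendly, pre-compiled build");
  },
};

createApp(App).mount("#app");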
I have a Gatsby site sourcing from Shopify, and I'm having a hard time getting the Acquisition Conversions working.
My guess is that when they go to the Checkout page, since it's on Shopify's domain, it sees that as Direct traffic.
My current configuration is:
{
resolve: `gatsby-plugin-google-analytics`,
options: {
trackingId: "...",
head: true,
anonymize: false,
respectDNT: false,
allowLinker: true,
},
},
I just added that allowLinker in to test today. Is there anything else I'm missing? I'm not too familiar with analytics; I'm just a humble JavaScript farmer.
Thank you
Google Analytics recently changed their API to version 4 and, among other things, changed the way the tracking identifier is set on the page. I suppose the plugins will migrate to these changes soon, but in the meantime this is the only plugin available that works with the new API version: gatsby-plugin-google-gtag. So:
// In your gatsby-config.js
module.exports = {
plugins: [
{
resolve: `gatsby-plugin-google-gtag`,
options: {
// You can add multiple tracking ids and a pageview event will be fired for all of them.
trackingIds: [
"GA-TRACKING_ID", // Google Analytics / GA
"AW-CONVERSION_ID", // Google Ads / Adwords / AW
"DC-FLOODIGHT_ID", // Marketing Platform advertising products (Display & Video 360, Search Ads 360, and Campaign Manager)
],
// This object gets passed directly to the gtag config command
// This config will be shared across all trackingIds
gtagConfig: {
optimize_id: "OPT_CONTAINER_ID",
anonymize_ip: true,
cookie_expires: 0,
},
// This object is used for configuration specific to this plugin
pluginConfig: {
// Puts tracking script in the head instead of the body
head: false,
// Setting this parameter is also optional
respectDNT: true,
// Avoids sending pageview hits from custom paths
exclude: ["/preview/**", "/do-not-track/me/too/"],
},
},
},
],
}
All the above parameters are optional; just omit them and replace GA-TRACKING_ID with your real identifier.
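Since the original problem is conversions showing up as Direct traffic on the Shopify checkout domain, it may also be worth a hedged sketch (not something I have verified with this plugin): gtagConfig is passed directly to gtag('config', ...), so gtag's cross-domain linker option should be usable from there. your-store.myshopify.com below is a placeholder for the real checkout domain.
// gatsby-config.js — sketch only
module.exports = {
  plugins: [
    {
      resolve: `gatsby-plugin-google-gtag`,
      options: {
        trackingIds: ["GA-TRACKING_ID"],
        gtagConfig: {
          // gtag's cross-domain linker; the domain is a placeholder
          linker: {
            domains: ["your-store.myshopify.com"],
          },
        },
      },
    },
  ],
}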
I'm not sure if you ever solved it @cYberSport91, but in the year 2021 AD I am trying to do the same. This is what I found:
Place this snippet in your gatsby-ssr.js or gatsby-browser.js:
exports.onRenderBody = ({ setPostBodyComponents }) => {
const attachCode = `
if (ga) {
ga('require', 'linker');
ga('linker:autoLink', ['destination1.com', 'destination2.com']);
}`
setPostBodyComponents([<script dangerouslySetInnerHTML={{
__html: attachCode
}}/>])
}
Source: Gatsby Git Issues
(Still waiting to confirm with my client whether this solution works)
I am trying to create localized routes with an optional first param, like /lang?/../../, but without a custom server.
Since 9.5, Next.js has supported optional catch-all dynamic parameters: you name a folder or a file
[[...param]]. I did that.
The problem is that I have other routes and I want all of them to have that lang prefix, but optionally, with a default language when the lang is not provided.
I have a folder [[...lang]] with a file index.js containing a simple function component, just for testing. The optional parameter now works for the home page / and /en, but I have other files which I also want to be reachable with the optional lang. For example, I have about.js and I want to access it via /en/about and /about.
I can't put about.js inside [[...lang]], because I get an error:
Failed to reload dynamic routes: Error: Catch-all must be the last part of the URL.
I know what it says and why, but I have a fixed collection of languages ['en', 'fr'] and I can check whether a lang is present.
Is there a way, without a custom server, to optionally use a dynamic first part of the path, like
/en/about and /about?
I think you are talking about this feature. Have a look at this: https://nextjs.org/blog/next-9-5#support-for-rewrites-redirects-and-headers
To extend the answer from @Vibhav, in next.config.js:
const nextConfig = {
async rewrites(){
return [
// URLs without a base route lang param like /my-page
{
source: '/',
destination: '/'
},
// URLs with a base route lang param like /en/my-page
{
source: '/:lang*/:page*',
destination: '/:page*'
},
// URLs `/en/post/post_id`
{
source: '/:lang/:path/:page',
destination: '/:path/:page'
},
]
}
};
// withBundleAnalyzer comes from the existing config setup (e.g. @next/bundle-analyzer);
// a plain `module.exports = nextConfig;` works as well
module.exports = withBundleAnalyzer(nextConfig);
All pages stay in the pages folder. This is not the best solution for now, because it only works up to a depth like /pages/another-folder/file.
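As a sketch of one way around the depth limitation, assuming the fixed language list ['en', 'fr'] from the question: Next.js rewrites accept path-to-regexp parameter patterns, so matching only known language codes lets a single rule strip the prefix at any depth.
// next.config.js — sketch only, untested
const nextConfig = {
  async rewrites() {
    return [
      {
        // only strip a real language prefix; everything else is left alone
        source: "/:lang(en|fr)/:path*",
        destination: "/:path*",
      },
    ];
  },
};

module.exports = nextConfig;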
You can even get the lang param in your pages or _app.js:
// ...
const router = useRouter();
if (router.query.lang) {
pageProps.lang = router.query.lang;
}
console.log(pageProps.lang);
return (
<Layout>
<Component {...pageProps} />
</Layout>
)
For the URL /en/my-page, router.query.lang will be equal to en.
For the URL /my-page, router.query.lang will be undefined, but you can set a default lang.
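A minimal sketch of that default in _app.js, where DEFAULT_LANG is a made-up constant (not part of the answer above) and lang is treated as a string, as in the answer:
import { useRouter } from "next/router";

const DEFAULT_LANG = "en"; // assumed default

export default function MyApp({ Component, pageProps }) {
  const router = useRouter();
  // fall back to the default language when the optional prefix is absent
  const lang = router.query.lang ?? DEFAULT_LANG;
  return <Component {...pageProps} lang={lang} />;
}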
I want to dynamically inject and load an iframe inside the background page, but every time the request is canceled.
http://i.imgur.com/Puto33c.png
That used to work a week ago; I don't know what I'm doing wrong.
To reproduce the issue, I created a small extension.
manifest.json:
{
"name": "iframe background",
"version": "1.0.0",
"manifest_version": 2,
"browser_action": {
"default_title": "iframe"
},
"background": {
"persistent": false,
"scripts": ["background.js"]
}
}
background.js:
chrome.browserAction.onClicked.addListener(function() {
var iframe = document.createElement('iframe');
iframe.src = 'http://localhost:3000/';
iframe.onload = function() {
console.log(iframe.contentDocument); // return null
};
document.body.appendChild(iframe);
});
The page to load is not blocked by X-Frame-Options SAMEORIGIN.
I tried to put the iframe directly within an HTML background page, with no luck.
I also tried to add a content_security_policy:
"content_security_policy": "script-src 'self'; object-src 'self'; frame-src 'self' http://localhost:3000/"
But the iframe still doesn't load.
Does someone have a workaround or a solution to this problem?
Thanks!
Chrome 58.0.3014.0 enables Site Isolation for extensions by default, which makes the iframe load in a different renderer process handled by a separate chrome.exe OS process.
The 'canceled' message means that the extension's chrome.exe process canceled the request and it was handled by a different hidden chrome.exe process.
The correct approach is to declare a content script that will automatically run on the iframe URL and communicate with the background page. Note: only JSON-serializable data may be passed; in other words, you can pass innerHTML but not DOM elements. This is easy to handle, though, via DOMParser.
manifest.json additions:
"content_scripts": [{
"matches": ["http://localhost:3000/*"],
"js": ["iframe.js"],
"run_at": "document_end",
"all_frames": true
}],
iframe.js:
var port = chrome.runtime.connect();
// send something immediately
port.postMessage({html: document.documentElement.innerHTML});
// process any further messages from the background page
port.onMessage.addListener(msg => {
// ... handle the incoming message here
// reply
port.postMessage(anyJSONfiableObject); // not DOM elements!
});
background.js:
var iframePort;
chrome.browserAction.onClicked.addListener(() => {
document.body.insertAdjacentHTML('beforeend',
'<iframe src="http://localhost:3000/"></iframe>');
});
chrome.runtime.onConnect.addListener(port => {
// save in a global variable to access it later from other functions
iframePort = port;
port.onMessage.addListener(msg => {
if (msg.html) {
const doc = new DOMParser().parseFromString(msg.html, 'text/html');
console.log(doc);
alert('Received HTML from the iframe, see the console');
}
});
});
See also a similar QA: content.js in iframe from chrome-extension popup