I have a Next.js app that receives data from another app on the backend and sends it to its own client as cookies. These cookies work fine on localhost, but they are not being set properly when I deploy the app to AWS Amplify.
I'm using the cookie module and what my backend does is:
// creates an object with the cookie options
const cookiesOptions = {
  httpOnly: true,
  maxAge: 60 * 60 * 24,
  sameSite: "none",
  secure: true,
  path: "/",
};
// creates a cookies array that will store the needed user credentials,
// initializing it with a public cookie (not HttpOnly) that stores the user's nickname
const cookies = [
  cookie.serialize(`${AUTH_COOKIE.PREFIX}-${AUTH_COOKIE.LOGGED_USER}`, userName, {
    maxAge: cookiesOptions.maxAge,
    sameSite: cookiesOptions.sameSite,
    secure: cookiesOptions.secure,
    path: cookiesOptions.path,
  }),
];
// generates a name for the cookie of each user credential and adds it to the cookies array
Object.keys(userData).forEach(prop => {
  const userProp = userData[prop].trim();
  if (userProp.length) {
    const cookieName = AUTH_COOKIE.PREFIX +
      '-' +
      prop.slice(0, 2) +
      prop.charAt(prop.length - 1).toLowerCase();
    cookies.push(cookie.serialize(cookieName, userProp, cookiesOptions));
  }
});
// sets the cookies to the response header
res.setHeader("Set-Cookie", cookies);
// redirects the user to the home page
res.status(302).redirect("/");
The thing is that only the first cookie (the nickname) is being set. The ones I generate in the Object.keys(userData).forEach loop are not. I tried setting each cookie manually, the same way as the nickname one, but it still doesn't work. I've checked and I'm sure they are being generated; they just don't go with the response to the client.
Also, this is the exact same code that I used in previously deployed apps and it works just fine there: all the cookies are set properly. But with this one, it just doesn't work.
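For reference, the "set each cookie manually" attempt described above looked roughly like this; the credential field names (accessToken, idToken) and cookie suffixes are only illustrative, not the real keys of userData:
// Illustrative sketch of the manual attempt: serialize each credential cookie
// explicitly instead of looping, then send them all in one Set-Cookie header.
// `accessToken` and `idToken` are hypothetical userData fields.
const manualCookies = [
  // public nickname cookie, same as before (not HttpOnly)
  cookie.serialize(`${AUTH_COOKIE.PREFIX}-${AUTH_COOKIE.LOGGED_USER}`, userName, { ...cookiesOptions, httpOnly: false }),
  // each credential cookie written out explicitly instead of in the loop
  cookie.serialize(`${AUTH_COOKIE.PREFIX}-acn`, userData.accessToken.trim(), cookiesOptions),
  cookie.serialize(`${AUTH_COOKIE.PREFIX}-idn`, userData.idToken.trim(), cookiesOptions),
];
res.setHeader("Set-Cookie", manualCookies);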
I have a super weird error that only comes up when deploying the code to Vercel. It doesn't happen locally, which makes it quite annoying to begin with.
I have a staging and a production instance of my code. I want to protect staging with a password, which is not difficult since I implemented the authentication via Firebase. The only tricky part is that I don't use Firebase to keep track of the user but my own server (basically by setting a cookie). I should mention that I am using SvelteKit to put it all together.
In SvelteKit you can use hooks, which can be seen as middlewares, to redirect a user to the sign-in page if the env variable for the environment is set to dev.
Another hook redirects a logged-in user, so if you are already logged in and try to go to auth/sign-in or auth/sign-up you'll get redirected to the home page.
Now the weird part happens: I go to the deployed version of the site and I get immediately redirected to the sign-in page, which is correct. I try to navigate to all the pages of the website, and the redirect still works fine. I log in and, upon success, I should be redirected to the home page, which I am, BUT the home page redirects me to the sign-in page as if I weren't logged in, and the sign-in page in turn redirects me to the home page as if I were, thus creating a loop.
I honestly don't know why this happens, since it works perfectly locally, so my thoughts go to Vercel. I would exclude Firebase, since I remembered to add the custom domain as an allowed domain in the settings.
To give a bit more context, I structured the hooks responsible for the redirects this way:
export const authSessionHandler: Handle = async ({ event, resolve }) => {
  const cookie = event.locals.cookie;
  const idToken = await getIdTokenFromSessionCookie(getCookieValue(cookie, 'session'));
  const user = idToken
    ? {
        uid: idToken?.sub,
        email: idToken?.email
      }
    : null;
  event.locals.idToken = idToken;
  event.locals.user = user;
  return resolve(event);
};
export const redirectLoggedInUserHandler: Handle = async ({ event, resolve }) => {
  const { user } = event.locals;
  const next = event.url.searchParams.get('next') || '/';
  if (
    user &&
    (event.url.pathname.startsWith('/auth/sign-in') ||
      event.url.pathname.startsWith('/auth/sign-up'))
  ) {
    return new Response('Redirect', {
      status: http_302.status,
      headers: {
        location: `${next}`
      }
    });
  }
  return resolve(event);
};
export const redirectToSignInForDevEnvironmentHandler: Handle = async ({ event, resolve }) => {
  const { user } = event.locals;
  const allowedEndpoints = ['/auth/sign-in', '/auth/session'];
  if (!user && env === 'dev' && !allowedEndpoints.includes(event.url.pathname)) {
    return new Response('Redirect', {
      status: http_302.status,
      headers: {
        location: '/auth/sign-in'
      }
    });
  }
  return resolve(event);
};
The handlers run in that order, so the first one populates the user and the later ones can check it.
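(For reference, chaining handlers in a fixed order is typically done with SvelteKit's sequence helper; the sketch below assumes the three handlers above are exported from the project's hooks file.)
// hooks file (e.g. src/hooks.server.ts in current SvelteKit); sketch of composing
// the handlers so they run in the stated order
import { sequence } from '@sveltejs/kit/hooks';

export const handle: Handle = sequence(
  authSessionHandler,
  redirectLoggedInUserHandler,
  redirectToSignInForDevEnvironmentHandler
);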
In the code I am getting the user from event.locals, which essentially drives the entire logic (as it should). What I find interesting and telling is that the sign-in page redirects me to the home page, which means the user is defined, but the home page redirects back as if the user were not defined. This made me think the problem is not in the code but probably in one of the providers, Vercel or Firebase.
It would be very helpful to know your thoughts about it.
I am trying to implement magic link login in my app. I enabled the email link sign-in option in the Firebase console and localhost is already under the authorized domains. The code snippet and the request body are below.
I can see that a request is made and returns a 200 success code, but I receive no email. The code does not throw any error and I have no idea what is wrong at this point. Can someone help?
export const sendMagicLink = (email: string, redirectUrl: string) => {
  const auth = getAuth(getClientApp());
  const actionCodeSettings = {
    url: redirectUrl,
    handleCodeInApp: true
  };
  return sendSignInLinkToEmail(auth, email, actionCodeSettings);
};
const handleSubmit: svelte.JSX.EventHandler<SubmitEvent, HTMLFormElement> = async ({
  currentTarget
}) => {
  email = new FormData(currentTarget).get('email') as string;
  const redirectUrl = `${window.location.origin}/auth/confirm`;
  state = 'submitting';
  try {
    await sendMagicLink(email, redirectUrl);
    setMagicEmail(email);
    state = 'success';
  } catch (error) {
    if (error instanceof Error) {
      state = error;
    } else {
      console.log(error);
      state = new Error('something went wrong sending the magic link 😞');
    }
  }
};
Request body:
canHandleCodeInApp: true
continueUrl: "http://localhost:3000/auth/confirm"
email: "someemail@gmail.com"
requestType: "EMAIL_SIGNIN"
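For context, the /auth/confirm page that continueUrl points to would typically complete the flow with Firebase's isSignInWithEmailLink / signInWithEmailLink; here is a sketch, where getMagicEmail is a hypothetical counterpart to the setMagicEmail helper above:
import { getAuth, isSignInWithEmailLink, signInWithEmailLink } from 'firebase/auth';

// Sketch of the confirm step; getMagicEmail() is assumed to read back the
// address stored earlier by setMagicEmail()
export const completeMagicLink = async () => {
  const auth = getAuth(getClientApp());
  // Only proceed if the current URL really is a Firebase sign-in link
  if (isSignInWithEmailLink(auth, window.location.href)) {
    const email = getMagicEmail();
    return signInWithEmailLink(auth, email, window.location.href);
  }
};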
Intuitively a developer assumes that emails sent out by Firebase's internal email service will not be classified as spam, but this happens very often.
To solve this, one would need to:
1. Set up a custom domain for Authentication in the Firebase Console:
   - Go to Firebase Authentication
   - Go to Templates
   - Go to Email Address Verification
   - Click Edit
   - Click Customize domain and go through the whole process
2. Set up a proper SMTP server in the Firebase Console:
   - Go to Authentication
   - Go to Templates
   - Go to SMTP Settings and enter the SMTP settings. Use the same sender domain as used for Email Address Verification above.
3. Set the Action URL:
   - Set your custom domain in the Hosting section first, e.g. example.com.
   - Then, in the Authentication Templates section, click Edit and adjust the Custom Action URL at the bottom of the page. Set it to the same domain used for Hosting, e.g.:
     https://example.com/__/auth/action
This helps to decrease the spam ranking of the emails, as the outgoing email from domain A will now contain a link to domain A.
In contrast, an email from domain A carrying a link to domain B is more suspicious.
Hello! I am OCD about security for my app and was wondering how to properly login/signup a user with the Google auth provider.
I have a client ID and secret for that client ID - from Google Credentials - for my app. I know not to put the secret in the client.
The code below works perfectly but I'm unsure if it's safe to generate an id_token for a user directly without any server code because of this doc from Expo Go:
Notice it says "be sure that you don't directly request the access token for the user". I don't know what this means exactly.
const [request, response, promptAsync] = Google.useIdTokenAuthRequest({
  clientId: "my-client-id-goes-here.google.apps.com",
});
React.useEffect(() => {
  if (response?.type === "success") {
    const { id_token } = response.params;
    const credential = new GoogleFirebase.GoogleAuthProvider.credential(
      id_token
    );
  }
}, [response]);
Any ideas on how to do this securely and make sure a user can't tamper with the redirect URL parameters from Google or anything like that? I'm just not 100% sure about what I'm doing here.
Use a .env file to store your sensitive information. For example, in your .env file put:
CLIENTID="my-client-id-goes-here.google.apps.com"
Then call it like you did above:
const [request, response, promptAsync] = Google.useIdTokenAuthRequest({
  clientId: process.env.CLIENTID,
});
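Note that in a managed Expo app, arbitrary process.env values are not automatically available at runtime; one common pattern (a sketch, assuming an app.config.js and the expo-constants package) is to pass the value through the config's extra field:
// app.config.js: read at build/config time, where process.env is available
export default {
  extra: {
    googleClientId: process.env.CLIENTID,
  },
};

// In the component, read the value back via expo-constants
import Constants from 'expo-constants';

const clientId = Constants.expoConfig?.extra?.googleClientId;
const [request, response, promptAsync] = Google.useIdTokenAuthRequest({ clientId });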
We're using Firebase in a Next.js app at work. I'm new to both, but did my best to read up on both. My problem is more with Firebase, not so much with Next.js. Here's the context:
In the client app, I make some calls to our API, passing a JWT (the ID token) in an Authorization header. The API calls admin.auth().verifyIdToken to check that the ID token is fresh enough. This works fine, since I am more or less guaranteed that the ID token gets refreshed regularly through the use of onIdTokenChanged (doc link).
Now I want to be able to Server-Side Render my app pages. In order to do that, I store the ID token in a cookie readable by the server. But from here on, I have no guarantee that the ID token will be fresh enough next time the user loads the app through a full page load.
I cannot find a server-side equivalent of onIDTokenChanged.
This blog post mentions a google API endpoint to refresh a token. I could hit it from the server and give it a refresh token, but it feels like I'm stepping out of the Firebase realm completely and I'm worried maintaining an ad-hoc system will be a burden.
So my question is, how do people usually reconcile Firebase auth with SSR? Am I missing something?
Thank you!
I've had that same problem recently, and I solved it by handling it myself. I created a very simple page responsible for forcing a Firebase token refresh and then redirecting the user back to the requested page. It's something like this:
On the server side, check the token's exp value after extracting it from the cookies (if you're using firebase-admin on that server, it will probably report this as an error after verifying it):
// Could be a handler like this
const handleTokenCookie = async (context) => {
  try {
    const token = parseTokenFromCookie(context.req.headers.cookie)
    await verifyToken(token)
    return token
  } catch (err) {
    if (err.name === 'TokenExpired') {
      // If expired, user will be redirected to the /refresh page, which will force
      // a client-side token refresh and then redirect the user back to the desired page
      const encodedPath = encodeURIComponent(context.req.url)
      context.res.writeHead(302, {
        // Note that encoding avoids URI problems, and `req.url` will also
        // keep any query params intact
        Location: `/refresh?redirect=${encodedPath}`
      })
      context.res.end()
    } else {
      // Other authorization errors...
    }
  }
}
This handler can be used on the /pages, like this
// /pages/any-page.js
export async function getServerSideProps (context) {
  const token = await handleTokenCookie(context)
  if (!token) {
    // Token is invalid! User is being redirected to the /refresh page
    return { props: {} }
  }
  // Your code...
}
Now you need to create a simple /refresh page, responsible for forcing the Firebase token refresh on the client side; after both the token and the cookie are updated, it should redirect the user back to the desired page.
// /pages/refresh.js
const Refresh = () => {
  // This hook is something like https://github.com/vercel/next.js/blob/canary/examples/with-firebase-authentication/utils/auth/useUser.js
  const { user: currentUser } = useUser()
  React.useEffect(function forceTokenRefresh () {
    // You should also handle the case where currentUser is still being loaded
    currentUser
      .getIdToken(true) // true will force token refresh
      .then(() => {
        // Updates user cookie
        setUserCookie(currentUser)
        // Redirect back to where it was
        const decodedPath = window.decodeURIComponent(Router.query.redirect)
        Router.replace(decodedPath)
      })
      .catch(() => {
        // If any error happens on refresh, redirect to home
        Router.replace('/')
      })
  }, [currentUser])

  return (
    // Show a simple loading while refreshing token?
    <LoadingComponent />
  )
}

export default Refresh
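For reference, setUserCookie is not shown above; a minimal sketch of what it might do, assuming the js-cookie package and a cookie name the server-side handler knows to read:
import cookies from 'js-cookie';

// Minimal sketch: store the fresh ID token in a cookie the server can read.
// The cookie name ('token') and the 1-day expiry are assumptions, not taken
// from the answer above.
const setUserCookie = async (user) => {
  const token = await user.getIdToken();
  cookies.set('token', token, { expires: 1 });
};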
Of course this will delay the user's first request if the token is expired, but it ensures a valid token without forcing the user to log in again.
I'm using the Google Cloud Node.js storage library to upload some images to Cloud Storage. This all works fine. I'm then trying to generate a signed URL immediately after uploading the file, using the same storage object that uploaded the file in the first place, but I receive the following error:
Request had insufficient authentication scopes
I'm not sure why this would be happening if it's all linked to the same service account that did the upload in the first place. (For what it's worth, it's a Firebase app.)
The code is below:
const { Storage } = require('@google-cloud/storage');
const storage = new Storage();

storage.bucket(bucketName).upload(event.file.pathName, {
  // Support for HTTP requests made with `Accept-Encoding: gzip`
  gzip: true,
  destination: gcsname,
  metadata: {
    // Enable long-lived HTTP caching headers
    // Use only if the contents of the file will never change
    // (If the contents will change, use cacheControl: 'no-cache')
    cacheControl: 'public, max-age=31536000'
  },
}).then(result => {
  let url = `https://storage.googleapis.com/${bucketName}/${gcsname}`;
  const options = {
    action: 'read',
    expires: Date.now() + 1000 * 60 * 60, // one hour
  };
  // Get a signed URL for the file
  storage.bucket(bucketName).file(gcsname).getSignedUrl(options).then(result => {
    console.log("generated signed url", result);
  }).catch(err => {
    console.log("err occurred", err)
  })
})
Neither the bucket itself nor the objects are public, but it's my understanding that I should still be able to generate a signed URL. The app itself is running on GCP Compute Engine, hence not passing any options to new Storage(); passing options in fact also makes the upload fail.
Can anyone advise on what I'm doing wrong?
With the limited amount of information available, here are a few things that could be missing, based on the error you are receiving:
- The Identity and Access Management (IAM) API must be enabled for the project.
- The Compute Engine service account needs the iam.serviceAccounts.signBlob permission, which is available through the "Service Account Token Creator" role (see the example command below).
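For example, that role can be granted to the Compute Engine default service account with a command along these lines (the project ID and project number are placeholders):
gcloud projects add-iam-policy-binding PROJECT_ID \
  --member="serviceAccount:PROJECT_NUMBER-compute@developer.gserviceaccount.com" \
  --role="roles/iam.serviceAccountTokenCreator"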
Additionally, you can find more documentation regarding the topic here:
https://cloud.google.com/storage/docs/access-control/signed-urls
https://cloud.google.com/storage/docs/access-control/signing-urls-manually