I did my best to figure this one out by myself, but I'm totally missing something.
I'm using NextJS 12 and Google Cloud Translate's API to convert a word on a button. Locally it works fine, but once I try to deploy to Vercel, the permissions/keys get messed up somewhere.
Locally I have my key.json, which I got from my service account. It's just in my project's root. I have my .env.local file that references that key file. It looks like this:
GOOGLE_APPLICATION_CREDENTIALS=./<projectid&key>.json
But when I try to translate, I get hit with an error.
'Request failed with status code 500'
My translate endpoint looks like this, which I pretty much copied from Google's small tutorial.
import { NextApiRequest, NextApiResponse } from "next";
import { TranslationServiceClient } from "@google-cloud/translate";

export default async (req: NextApiRequest, res: NextApiResponse) => {
  const translationClient = new TranslationServiceClient();

  const projectId = <myprojectID>;
  const location = "global";

  async function translateText() {
    const request = {
      parent: `projects/${projectId}/locations/${location}`,
      contents: [req.body.text],
      mimeType: "text/plain",
      sourceLanguageCode: "en",
      targetLanguageCode: "es",
    };

    const [response] = await translationClient.translateText(request);
    res.json(response.translations[0].translatedText);
  }

  await translateText();
};
Things I've tried
1. Putting the JSON as one single environment variable on Vercel. So it was basically GOOGLE_APPLICATION_CREDENTIALS as the key and the key.json file's contents as the value.
2. Tried putting it all on one line.
3. Tried taking the key apart and putting it into a format like this:
GOOGLE_ACCOUNT_TYPE=service_account
GOOGLE_PROJECT_ID=project11111
GOOGLE_PRIVATE_KEY_ID=11111111111111
etc.
However, I wasn't able to get this method working locally either.
4. Kept the .env.local's path to key.json and just uploaded the key.json itself.
None of these worked and I'm pretty lost.
Resources I've looked at
https://github.com/vercel/vercel/issues/749#issuecomment-715009494
Escaping issue with firebase privateKey as a Heroku config variable
https://daveteu.medium.com/call-google-cloud-function-from-vercel-serverless-hosting-1b1688bb462c
I've tried to apply these to my situation, but I couldn't figure it out. I'd really appreciate any help! Thank you so much.
After looking around for a few days, I found a solution that worked.
I turned my key.json into a base64 string using a command similar to the one here:
https://gist.github.com/tomi/0675e58919af4554b198cee3f84405e5
Then I used the method found here:
https://github.com/orgs/vercel/discussions/219
I put that base64 string on one line in my .env file to check it locally, then did the same for Vercel's environment variables, and it worked.
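For anyone doing the same, here is a rough sketch of how the decoded string can be fed to the client in the API route. The variable name GCP_CREDENTIALS_BASE64 is just a placeholder for whatever you named the Vercel environment variable, and passing credentials to the constructor is one way to avoid relying on a key file path:

import { TranslationServiceClient } from "@google-cloud/translate";

// Decode the base64 environment variable back into the service-account JSON.
const serviceAccount = JSON.parse(
  Buffer.from(process.env.GCP_CREDENTIALS_BASE64 as string, "base64").toString("utf8")
);

// Hand the decoded key to the client instead of pointing
// GOOGLE_APPLICATION_CREDENTIALS at a file on disk.
const translationClient = new TranslationServiceClient({
  projectId: serviceAccount.project_id,
  credentials: {
    client_email: serviceAccount.client_email,
    private_key: serviceAccount.private_key,
  },
});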
Related
I built a REST service in Deno (Oak) and also serve static files. However, when I run deno compile I would like to have those static files included in the single binary file that is emitted. Is this possible?
From what I can tell, neither Deno nor Oak has intentional support for this.
A downside of even doing this is that your binary file may become large. This isn't only an issue with distribution but may also slow loading and executing the binary.
Nevertheless, one way you can make "static" files available in a compiled binary is to encode the files as JavaScript modules (similar to using WebAssembly in Deno).
e.g. The following module encodes a static file, named example.txt, storing its file type, txt, and its contents, hello world\n. The contents are base64 encoded (thank you jsejcksn for the suggestion). You can encode and decode the contents in other ways as well, or even use different encodings depending on the file type if you like.
example.txt.ts:
export default {
  type: "txt",
  data: "aGVsbG8gd29ybGQK",
};
You can programmatically create modules like this from static files.
e.g. encode-as-module.ts:
import { extname } from "https://deno.land/std@0.155.0/path/mod.ts";
import { encode } from "https://deno.land/std@0.155.0/encoding/base64.ts";

const [inputPath, outputPath = `${inputPath}.ts`] = Deno.args;
const type = extname(inputPath).slice(1);
const bytes = await Deno.readFile(inputPath);

const script = /* JavaScript */ `export default {
  type: "${type}",
  data: "${encode(bytes)}",
};
`;

await Deno.writeTextFile(outputPath, script);
Usage:
deno run --allow-read --allow-write encode-as-module.ts example.txt
Once you have your static files encoded as modules, you can then change your Oak app from serving them with send() to serving them via context.response (passing the type and body). More work will need to be done here to encode a list of all the static files, etc., but I think what's already provided illustrates the idea sufficiently.
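As a rough sketch of that last step (the route path, Oak version, and media type are my assumptions), serving the encoded module through context.response could look something like this:

import { Application, Router } from "https://deno.land/x/oak@v11.1.0/mod.ts";
import { decode } from "https://deno.land/std@0.155.0/encoding/base64.ts";
import example from "./example.txt.ts";

const router = new Router();
router.get("/example.txt", (ctx) => {
  // decode() turns the base64 string back into the original file bytes
  ctx.response.type = "text/plain"; // could also be derived from example.type
  ctx.response.body = decode(example.data);
});

const app = new Application();
app.use(router.routes());
app.use(router.allowedMethods());
await app.listen({ port: 8000 });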
Barrel/index files seem to create issues when used with Next.js. It doesn't seem established whether it's purely a webpack issue or an issue with both webpack and Next.js.
According to this issue, tree shaking stops working if we use barrel files. I also created a small repo where I have an issue with an index file; I'm not sure if it's a tree-shaking issue.
Steps to reproduce the issue:
npm install
npm run dev
in the browser, visit http://localhost:3000/about-pro; expect to see a blank page with errors or warnings in the browser's console
go to the server's console (where you ran npm run dev)
see an error of the sort "Module not found: Can't resolve 'fs'" (1) (2) (3)
1 - this comes from the await serialize call in the getAboutPageData file, which itself is only called within getStaticProps
2 - googling for this issue, you'll find solutions such as modifying the next.config.js file. It still doesn't work. Feel free to uncomment the next.config.js file and see for yourself
3 - to "solve" the issue, go to about-pro.tsx and, in the imports, import AboutPage from its own file instead of from the barrel/index file
If I only import getAboutPageData from the barrel/index file, then it works fine. But as soon as I import e.g. AboutPage from it, it starts throwing unrelated issues.
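For reference, the barrel file in the repo looks roughly like this (the exact file names are my assumption):

// modules/about/index.ts
export { default as AboutPage } from './AboutPage';
export { default as getAboutPageData } from './getAboutPageData';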
Can I continue using barrel/index files with Next.js, and if so, is there a simple and intuitive way to do that?
The issue is not in the barrel files themselves but in the library that you're using combined with barrel files.
If you take a look at the readme file https://github.com/hashicorp/next-mdx-remote#examples you can find a warning:
IMPORTANT: Be very careful about putting any mdx-remote code into a separate "utilities" file. Doing so will likely cause issues with nextjs' code splitting abilities - it must be able to cleanly determine what is used only on the server side and what should be left in the client bundle. If you put mdx-remote code into an external utilities file and something is broken, remove it and start from the simple example above before filing an issue.
So in order to make your code work you need to remove the export of getAboutPageData from your barrel file, like this:
export { default as AboutPage } from './AboutPage';
// export { default as getAboutPageData } from './getAboutPageData';
and move the code that uses the library inside the about-pro.tsx file.
import { AboutPage } from '../modules/about';
import { serialize } from 'next-mdx-remote/serialize';

const AboutPro = (props) => {
  return <AboutPage {...props} />;
};

export const getStaticProps = async () => {
  const serializedContent = await serialize(`# Header1`);
  // pass the serialized content to the page under whatever prop name AboutPage expects
  return { props: { data: serializedContent } };
};

export default AboutPro;
I think the issue is that the modules imported in barrel files are executed on both the client and the server. Removing side effects from the barrel file could probably solve the issue, but I don't know enough about Next.js to do it correctly.
In Deno you can load related modules or other code by just referencing the relative path to those ES6 modules; Deno will handle loading them appropriately. What's the way to do this for non-ES6 modules? For example, say I wanted to include some custom CSS with my Deno project. Deno doesn't allow doing import mycss from "./relative.css";.
Deno file operations do work for local files, but they're evaluated relative to the cwd, not the current file, and they don't work for arbitrary URLs. fetch, on the other hand, should be perfect, but currently doesn't support file: schemes, and supporting them doesn't seem to be actively considered. Combining these yields the only solution I can come up with, but I really don't like it:
async function loadLocal(relative: string): Promise<string> {
  const url = new URL(relative, import.meta.url);
  if (url.protocol === 'file:') {
    return await Deno.readTextFile(url.pathname);
  } else {
    const resp = await fetch(url.href);
    return await resp.text();
  }
}
This seems like it should mostly work, but it seems like a terrible way to hack in something that I expected would be supported by design in Deno. It also must be redeclared in each file, or have the caller's URL passed in, although there might be a way to avoid that. It doesn't work on Windows without modifying the path delimiter.
Update
Deno.emit seems close to what I would want; however, for some reason, it has different behavior from standard importing:
If the rootSpecifier is a relative path, then the current working directory of the Deno process will be used to resolve the specifier. (Not relative to the current module!)
It also still requires that the paths be to valid modules, instead of arbitrary text.
As @Zwiers pointed out, Deno 1.6 now supports fetch with the file: protocol, so this is now irrelevant.
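For completeness, a minimal sketch of what that looks like on Deno 1.6+ (local file: URLs still require --allow-read):

// fetch now accepts file: URLs, so a local asset can be loaded
// relative to the current module without the workaround above.
const url = new URL("./relative.css", import.meta.url);
const css = await (await fetch(url)).text();
console.log(css);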
I am writing tests for a React-based web tool, so I want to clear all local storage (such as login information) before each test. I have mostly worked in Cypress, where this was just a simple command.
cy.clearLocalStorage();
I am now using WebdriverIO and this is the approach that I was trying out (in the test file).
afterEach(() => {
  browser.executeScript('window.localStorage().clear()');
});
However, this doesn't seem to be working. Moreover, I would prefer a global solution, something that I don't have to write in each test. Thanks in advance for the help.
From the WebdriverIO Docs:
// remove the storage item for the given key
client.localStorage('DELETE', 'someKey');
// clear the storage
client.localStorage('DELETE');
You can clear localStorage by calling this built-in command.
You were almost right in your assumption. I'd suggest using the official docs to eliminate minor errors.
Instead of executeScript, use browser.execute: https://webdriver.io/docs/api/browser/execute.html
Also, localStorage is not a function; see https://developer.mozilla.org/en-US/docs/Web/API/Window/localStorage
So it should be
afterEach(() => {
  browser.execute('window.localStorage.clear()');
});
P.S.
Assuming you are using WebdriverIO 5 or above.
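If you also want the global behaviour asked about in the question, one option (a sketch, assuming the WDIO testrunner) is to put the same call into a hook in wdio.conf.js instead of repeating it in every spec file:

// wdio.conf.js (relevant part only)
exports.config = {
  // ...your existing specs, capabilities, framework options...
  beforeTest: async function () {
    // Runs before every test, so individual spec files don't need their own cleanup.
    await browser.execute('window.localStorage.clear()');
  },
};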
So, after a lot of time, we realized the problem.
cy.clearLocalStorage() clears only the local storage for the domain in the baseUrl definition.
If you would like to open multiple pages, you have to explicitly define the clearing in the cy.visit() call, like this:
cy.visit(`subdomain.another.domain`, {
  onBeforeLoad(win) {
    win.localStorage.clear();
  },
});
In this case, the local storage for that exact domain will be deleted.
I hope this workaround helps.
I have an app that displays nightclub descriptions and images. Each club has about 4 related images.
In Firebase Storage I have created a directory for each club and then stored their images inside.
So what I want to do is get all the images from a club's directory so I can display them all in my app.
I think a way of achieving this would be to get the download URL of each image.
I've tried this:
final StorageReference firebaseStorageRef = FirebaseStorage.instance.ref()
.child('profilePics/$clubID/SomeImage.jpg').getDownloadURL();
But since I don't know in advance the names of the stored images, I can't use this.
So is there any way of doing this?
Future<void> listExample() async {
  firebase_storage.ListResult result =
      await firebase_storage.FirebaseStorage.instance.ref().listAll();

  result.items.forEach((firebase_storage.Reference ref) {
    print('Found file: $ref');
  });

  result.prefixes.forEach((firebase_storage.Reference ref) {
    print('Found directory: $ref');
  });
}
This code worked for me; I got it from the FlutterFire website.
Here is the link to the docs:
https://firebase.flutter.dev/docs/storage/usage
If you don't know the full path of an object in Cloud Storage, then you can't do anything with it using the mobile client SDKs. Typically, one gets the download URL at the time it was uploaded to the bucket, then writes that URL to a database so it can be queried for later.
A solution I came up with was storing a .txt file which contained the name of each file. First I read the text file, breaking it apart line by line, then downloaded my images from the same folder using the names I got from the text file. This can work if you can store the names of all your files in a text file and then upload it!
Flow of the solution:
Store the names of all the files in the Firebase Storage folder in a .txt file
Write a method to get the text file and then break it apart line by line
Write a method to get a download URL, and call it each time you get a file name from the .txt file
Use the image URL as you wish!
Any suggestions or corrections are welcomed!
https://github.com/FirebaseExtended/flutterfire/pull/232
You need to use the package in pubspec.yaml like below:
firebase_storage:
  git:
    url: git://github.com/danysz/flutterfire.git
    ref: master
    path: packages/firebase_storage
Dart file
void getFirebaseImageFolder() {
  final StorageReference storageRef =
      FirebaseStorage.instance.ref().child('Gallery').child('Images');

  storageRef.listAll().then((result) {
    print("result is $result");
  });
}
Original post by Mahesh Peri