I'm trying to deploy one of my functions from firebase CLI (version 8.12.1) and it keeps failing.
The function hasn't changed in weeks, so I am a bit confused as to why it's failing now.
Error from the CLI
functions[http-api-(europe-west1)]: Deployment error.
Build failed: Build error details not available. Please check the logs at https://console.cloud.google.com/logs/viewer?project=&advancedFilter=resource.type%3Dbuild%0Aresource.labels.build_id%3Dfeb2697d-29b4-4ab7-9b84-90d9f847be42%0AlogName%3Dprojects%2Fvestico-dev%2Flogs%2Fcloudbuild
Logs from the cloud console
Step #3 - "restorer": Restoring data for "google.nodejs.functions-framework:functions-framework" from cache
Step #3 - "restorer": \u001b[31;1mERROR: \u001b[0mfailed to restore: restoring data: GET https://storage.googleapis.com/eu.artifacts..appspot.com/containers/images/sha256:484d08dfc6a8f356c34a86fa4440fedf86f4fc398967eea66e4aab4e9ee81e3d?access_token=REDACTED: unsupported status code 404; body: NoSuchKeyThe specified key does not exist.No such object: eu.artifacts..appspot.com/containers/images/sha256:484d08dfc6a8f356c34a86fa4440fedf86f4fc398967eea66e4aab4e9ee81e3d
Finished Step #3 - "restorer"
ERROR: build step 3 "eu.gcr.io/fn-img/buildpacks/nodejs10/builder:nodejs10_20201005_20_RC00" failed: step exited with non-zero status: 46
The interesting piece is probably the error from above:
<Error>
<Code>NoSuchKey</Code>
<Message>The specified key does not exist.</Message>
<Details>No such object: eu.artifacts.<project-id>.appspot.com/containers/images/sha256:484d08dfc6a8f356c34a86fa4440fedf86f4fc398967eea66e4aab4e9ee81e3d</Details>
</Error>
What key is the builder referring to? <project-id>@appspot.gserviceaccount.com has access to the cloud function with roles Cloud Functions Admin and Editor.
EDIT
Deploying through firebase deploy --only functions:<my-api>
That function uses @google-cloud/storage to get a public URL for a storage resource.
I'm loading the service account configs like this:
const devServiceAccount = require("../../service-accounts/dev.json");
const prodServiceAccount = require("../../service-accounts/prod.json");

export const getAdminConfig = (): (AppOptions | undefined) => {
  const baseConfigEnv = process.env.FIREBASE_CONFIG;
  if (!baseConfigEnv) {
    console.error("no firebase config environment");
    return undefined;
  }
  const app = functions.config().app;
  if (app === undefined) {
    console.error("no firebase app config");
    return undefined;
  }
  const serviceAccount = app.environment === 'dev' ? devServiceAccount : prodServiceAccount;
  const adminConfig = JSON.parse(baseConfigEnv) as AppOptions;
  adminConfig.credential = credential.cert(serviceAccount);
  return adminConfig;
};
Cloud Storage is used here:
const options = {
  action: 'read',
  expires: Date.now() + 1000 * 60 * 60 // 1 hour
} as GetSignedUrlConfig;
const file = bucket.file(path);
filePathPromises.push(file.getSignedUrl(options));
}); // end of the enclosing callback iterating over file paths
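For reference, the expiry arithmetic in the options above can be captured in a tiny helper (a hypothetical name of mine, not part of the question's code) so the one-hour window is explicit:

```javascript
// Hypothetical helper: build the GetSignedUrlConfig-style options object
// used above, with the expiry expressed as "hours from now" in milliseconds.
function signedUrlOptions(hours = 1) {
  return {
    action: 'read',
    expires: Date.now() + hours * 60 * 60 * 1000, // default: 1 hour ahead
  };
}
```

file.getSignedUrl(signedUrlOptions()) would then behave the same as the inline object.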
My folder structure is as follows.
+ functions
+ lib
+ function.js
+ service-accounts
+ dev.json
+ prod.json
+ src
+ function.ts
I ruled out the service account files as the issue, given that they are loaded in getAdminConfig() for all functions in the project.
Update 10/13/20
I've verified the files uploaded to the GCF storage container. The JSON keys are there and in the right location. The paths match, so they should be found when the GCF is running.
Adding a hint for the next soul running into this problem: it seems to be caused by a missing/inaccessible file in the restore/rollback process.
I successfully removed the problem by simply:
Deleting my functions using the Firebase web console.
Deploying normally again with firebase deploy
It seems there was an intermittent issue in Firebase Cloud Functions or GCF. I just ran firebase deploy --only functions again and it deployed successfully.
Step 1: Automatically create a new Next.js project using the new beta app directory:
npx create-next-app@latest --experimental-app
pages/api/hello.ts
// Next.js API route support: https://nextjs.org/docs/api-routes/introduction
import type { NextApiRequest, NextApiResponse } from 'next'
type Data = {
  name: string
}

export default function handler(
  req: NextApiRequest,
  res: NextApiResponse<Data>
) {
  res.status(200).json({ name: 'John Doe' })
}
This file is identical to the one created automatically by npx - there are no modifications.
I am trying to build a simple home page, which fetches data from the API, which in turn gets data from my database. Either way, an await/async will be required. I am following the instructions from here.
In this example I will demonstrate that even awaiting the supplied hello api can't seem to run in production, and I can't work out why.
app/page.tsx
async function getHelloAsync() {
  const res = await fetch('http://localhost:3000/api/hello', { cache: 'no-store' });
  // The return value is *not* serialized
  // You can return Date, Map, Set, etc.
  // Recommendation: handle errors
  if (!res.ok) {
    // This will activate the closest `error.js` Error Boundary
    throw new Error('Failed to fetch data');
  }
  return res.json();
}

export default async function Page() {
  const hello = await getHelloAsync();
  return (
    <main>
      <h1>Hello: {hello.name}</h1>
    </main>
  )
}
To confirm the hello API works, I run pn run dev and then curl http://localhost:3000/api/hello, which returns the following successful response:
{"name":"John Doe"}
Next up we exit the dev server and run:
pn run build
The first headache is that the build will fail outright unless one adds { cache: 'no-store' } to the fetch command:
const res = await fetch('http://localhost:3000/api/hello', { cache: 'no-store' });
or adds this to the top of app/page.tsx:
export const fetchCache = 'force-no-store';
I am actually not sure how one would build this at all if you wanted to cache the response or use revalidate and provide an initial optimistic response, because without cache: 'no-store' it refuses to build outright. Ideally it should just cache the result from /api/hello instead of failing. Running the dev server at the same time as doing the build does allow the build to work, but as soon as you exit the dev server and run pn run start, all the API calls fail anyway. So that is not a good approach.
This leads us to the next problem - why are the api calls not working in production (i.e. when calling pn run start).
Step 2:
pn run build
pn run start
Confirm that the following still works and yes it does:
curl http://localhost:3000/api/hello
Result:
{"name":"John Doe"}
Now we visit http://localhost:3000 in a browser but, surprise! We get the following error:
> next start
ready - started server on 0.0.0.0:3000, url: http://localhost:3000
warn - You have enabled experimental feature (appDir) in next.config.js.
warn - Experimental features are not covered by semver, and may cause unexpected or broken application behavior. Use at your own risk.
info - Thank you for testing `appDir` please leave your feedback at https://nextjs.link/app-feedback
(node:787) ExperimentalWarning: The Fetch API is an experimental feature. This feature could change at any time
(Use `node --trace-warnings ...` to show where the warning was created)
TypeError: fetch failed
at Object.fetch (node:internal/deps/undici/undici:11118:11)
at process.processTicksAndRejections (node:internal/process/task_queues:95:5)
at async getHelloAsync (/Users/username/nextjstest/.next/server/app/page.js:229:17)
at async Page (/Users/username/nextjstest/.next/server/app/page.js:242:19) {
cause: Error: connect ECONNREFUSED ::1:3000
at TCPConnectWrap.afterConnect [as oncomplete] (node:net:1300:16)
at TCPConnectWrap.callbackTrampoline (node:internal/async_hooks:130:17) {
errno: -61,
code: 'ECONNREFUSED',
syscall: 'connect',
address: '::1',
port: 3000
}
}
[Error: An error occurred in the Server Components render. The specific message is omitted in production builds to avoid leaking sensitive details. A digest property is included on this error instance which may provide additional details about the nature of the error.] {
digest: '3567993178'
}
Why is it saying that the connection is refused when we know the API is available? I can't get this to run at all. I know this is beta, but surely the code should actually run, right? How do I make this code work?
Also, if anyone knows where the logs are that I'm supposed to be accessing to see digest '3567993178', please let me know.
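One way to sidestep the build-time fetch entirely (a sketch, not an official Next.js recommendation): server components already run on the server, so the data logic behind the API route can be extracted into a plain function and called directly, skipping the HTTP round-trip to localhost:

```javascript
// Sketch under the assumption that the API route's logic is self-contained:
// extract it into a shared helper that both the route handler and the page
// can call. getHello is a hypothetical name, not part of the Next.js API.
async function getHello() {
  // Same payload that pages/api/hello.ts responds with.
  return { name: 'John Doe' };
}
```

In app/page.tsx the server component would then do const hello = await getHello(); with no fetch, no cache option, and no dependency on the server being up at build time.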
I am new to Firebase Functions and backend dev in general. I have written a function (called convertFile) that uses the node package "node-7z" to unzip a file called "example.7z" (located in the root of my functions directory). This is my functions code:
const functions = require("firebase-functions");
const admin = require("firebase-admin");
const sevenZipBin = require("7zip-bin");
const { extractFull } = require("node-7z");

admin.initializeApp();
const pathTo7za = sevenZipBin.path7za;

exports.convertFile = functions.https.onCall((data, context) => {
  const seven = extractFull('./example.7z', './example', {
    $bin: pathTo7za
  });
  seven.on('error', function (err) {
    console.log(err.message);
  });
  seven.on('end', function () {
    console.log('END');
  });
});
When I use Firebase's local emulator to run the function, the function runs successfully. A new directory called "example" containing the contents of "example.7z" is created within my "functions" directory (this is the desired behavior). When I deploy the function to Firebase and run the function in production, I get the following error in the function log (image of full log shown at the bottom):
spawn /workspace/node_modules/7zip-bin/linux/x64/7za EACCES
Note that "/workspace/node_modules/7zip-bin/linux/x64/7za" is the filepath stored in the variable "sevenZipBin.path7za".
I can understand how there might be a permissions issue when the function is run on Firebase as opposed to my local system, but I don't know how to solve the issue. I tried using a child process (within the function) to change the file permissions using the "chmod" command, but I get this error in the functions log:
chmod: changing permissions of '/workspace/node_modules/7zip-bin/linux/x64/7za': Read-only file system
I assume this is not a valid solution. I tried searching Google, Stack Overflow, GitHub, etc. for solutions, but I still don't know how to proceed from here. If anyone has suggestions, I would greatly appreciate it.
Additional info:
node-7z docs - https://github.com/quentinrossetti/node-7z
7zip-bin docs - https://github.com/develar/7zip-bin
Screenshot of functions log
I have a (very slightly) modified version of the generateThumbnail Firebase Cloud Function found in the Firebase Github repo. The function was working correctly at one point, and now it will time out every time it is called. I haven't made any changes to the function or to the rules for my storage bucket. The rules are the default ones that check for authentication. After adding some logging I can see that it never makes it past this line:
await file.download({destination: tempLocalFile});
The image file I am testing with is a 15.21KB PNG. The timeout of the function happens after ~60000 ms (default). There is no error in the logs, only the timeout.
Any suggestions as to why it started timing out all of a sudden? Or how to debug this single call further?
Node: 14
Firebase Admin: 9.8.0
Firebase Functions: 3.14.1
EDITS
I have deployed a minimal reproducible function and am seeing the same results.
exports.newGenerateThumbnail = functions.storage.object().onFinalize(async (object) => {
  const filePath = object.name;
  const tempLocalFile = path.join(os.tmpdir(), filePath);
  const tempLocalDir = path.dirname(tempLocalFile);

  // Cloud Storage files.
  const bucket = admin.storage().bucket(object.bucket);
  const file = bucket.file(filePath);

  functions.logger.log('Creating Temp Directory');
  await mkdirp(tempLocalDir);
  functions.logger.log('Temp Directory Created');

  functions.logger.log('Downloading File');
  await file.download({ destination: tempLocalFile });
  functions.logger.log('File Downloaded');

  functions.logger.log('Removing File');
  fs.unlinkSync(tempLocalFile);
  functions.logger.log('File Deleted');

  return true;
});
The logs show this
I tried to reproduce the issue using your code, but it is working fine for me without any timeout, as shown in the screenshot.
I cross-checked my versions with yours; I am using Node 16.6, Firebase 10.2.1, Firebase Functions 3.16.
It may be a version issue, as has been seen in the past in the linked Stack Overflow thread. So I suggest you upgrade your versions and try again; it might help you resolve the issue.
I have a node.js application that creates Cloud HTTP tasks with authentication. I'd like to handle these tasks via a Firebase HTTP function (also in JS). I understand that I need to use oidcToken when creating a task, but I don't understand how to validate such a token on the Firebase HTTP function end. Docs are not very helpful. I was expecting to find some utility in @google-cloud/tasks or in googleapis/google-auth-library-nodejs, but nothing jumped out at me.
All you have to do is associate your Cloud Function with a service account. This is called Function Identity. Please note that whenever you deploy a Cloud Function either in GCP or Firebase, the same function appears in both platforms.
You can create a new service account for this purpose with:
gcloud iam service-accounts create [YOUR_NEW_SERVICE_ACCOUNT_NAME] \
  --display-name "Service Account Test"
And assign the required IAM role with:
gcloud projects add-iam-policy-binding ${PROJECT_ID} \
  --member serviceAccount:[YOUR_NEW_SERVICE_ACCOUNT_NAME]@${PROJECT_ID}.iam.gserviceaccount.com \
  --role roles/cloudfunctions.invoker
Your function deployment should look something like this:
gcloud functions deploy hello_world \
  --trigger-http \
  --region us-central1 \
  --runtime nodejs14 \
  --service-account [YOUR_NEW_SERVICE_ACCOUNT_NAME]@${PROJECT_ID}.iam.gserviceaccount.com \
  --no-allow-unauthenticated
Once you have all that, your Cloud HTTP task should use this service account in its OIDC token, but I believe you already know how to do so.
You can find more information in this guide (although it uses the Cloud Scheduler instead of Cloud Tasks, the idea is pretty much the same).
You don't need to validate the token in the function; when you create the function so that only authenticated users can call it, this verification is done automatically on Google's side.
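For completeness, if you still want to look at the token inside the function, the raw OIDC token arrives as a Bearer credential in the Authorization header. A purely illustrative helper (the function name is my own, and this does not perform cryptographic verification, which Google handles for you):

```javascript
// Illustrative helper: extract the raw OIDC bearer token from an incoming
// request's Authorization header. No signature verification happens here;
// with --no-allow-unauthenticated, Google verifies the token before your
// code runs.
function extractBearerToken(authHeader) {
  if (typeof authHeader !== 'string' || !authHeader.startsWith('Bearer ')) {
    return null;
  }
  return authHeader.slice('Bearer '.length);
}
```

Inside an HTTP function you would call it as extractBearerToken(req.headers.authorization), e.g. to log which service account invoked the task.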
So, to make what you want you need to:
1. Create the Task Queue
Detailed steps in this how-to.
2. Create the service account
You will need to create a SA with these IAM permissions:
Cloud Functions Invoker
Cloud Tasks Enqueuer
Service Account User
"Cloud Functions Invoker" is necessary to be able to call the function and "Cloud Tasks Enqueuer" to add tasks to the queue. If you want to use a different SA for each of those steps you can separate these permissions.
3. Create the Firebase Functions function
When creating the function, make sure that it requires authentication.
4. Create the task using the SA
There's a section for this in the documentation, which you can find here. The relevant code is copied below:
// Imports the Google Cloud Tasks library.
const {CloudTasksClient} = require('@google-cloud/tasks');

// Instantiates a client.
const client = new CloudTasksClient();

async function createHttpTaskWithToken() {
  const project = '[PROJECT_NAME]';
  const queue = '[QUEUE_NAME]';
  const location = '[LOCATION]';
  const url = '[FUNCTION_TRIGGER_URL]';
  const serviceAccountEmail = '[SA]';
  const payload = 'Hello, World!';

  // Construct the fully qualified queue name.
  const parent = client.queuePath(project, location, queue);

  const task = {
    httpRequest: {
      httpMethod: 'POST',
      url,
      oidcToken: {
        serviceAccountEmail,
      },
    },
  };

  if (payload) {
    task.httpRequest.body = Buffer.from(payload).toString('base64');
  }

  console.log('Sending task:');
  console.log(task);

  // Send create task request.
  const request = {parent: parent, task: task};
  const [response] = await client.createTask(request);
  const name = response.name;
  console.log(`Created task ${name}`);
}

createHttpTaskWithToken();
According to Cloudinary's documentation, one should be able to upload an image to Cloudinary using Google Cloud Storage.
However, when I attempt to do so, I get the following error in my Cloud Functions logs.
ENOENT: no such file or directory, open 'gs://my-bucket.appspot.com/01.jpg'
This is my cloud function:
import * as functions from 'firebase-functions';
import * as cloudinary from 'cloudinary';

cloudinary.config({
  cloud_name: functions.config().cloudinary.cloudname,
  api_key: functions.config().cloudinary.apikey,
  api_secret: functions.config().cloudinary.apisecret,
});

export const uploadImageToCloudinary = functions.storage
  .object()
  .onFinalize(object => {
    cloudinary.v2.uploader.upload(
      `gs://${object.bucket}/${object.name}`,
      function(error, result) {
        if (error) {
          console.log(error);
          return;
        }
        console.log(result);
      }
    );
  });
I have added /.wellknown/cloudinary/<cloudinary_cloudname> to my bucket, as well as added a permission in Cloud Platform to allow Cloudinary object-viewer access.
Is there an extra step I'm missing? I can't seem to get this working.
Cloudinary does support Google Cloud Storage upload, but it's a relatively new feature and the current version of the node SDK doesn't handle gs:// URLs.
In your example, it's trying to resolve the gs:// URL on the local server and send the image to Cloudinary, rather than sending the URL to Cloudinary so the fetch happens from Cloudinary's side.
Until this is added to the SDK, you could get this working by triggering the fetch using the URL-based upload method, or by making a small change to the SDK code.
Specifically, it's a small change in lib/uploader.js - you need to add the gs: prefix there, after which it should work OK.
Diff:
diff --git a/lib/uploader.js b/lib/uploader.js
index 2f71eaa..af08e14 100644
--- a/lib/uploader.js
+++ b/lib/uploader.js
@@ -65,7 +65,7 @@
   return call_api("upload", callback, options, function() {
     var params;
     params = build_upload_params(options);
-    if ((file != null) && file.match(/^ftp:|^https?:|^s3:|^data:[^;]*;base64,([a-zA-Z0-9\/+\n=]+)$/)) {
+    if ((file != null) && file.match(/^ftp:|^https?:|^gs:|^s3:|^data:[^;]*;base64,([a-zA-Z0-9\/+\n=]+)$/)) {
       return [
         params, {
           file: file
After applying that diff, I did successfully fetch an image from Google Cloud Storage.
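To see what the one-line patch changes, the scheme check can be reproduced standalone (the regex is copied from the diff above; isRemoteUrl is my name for it, not the SDK's):

```javascript
// The SDK treats a file argument as a remote URL only if it matches this
// pattern; the patched version adds ^gs: so Cloud Storage URLs are passed
// through to Cloudinary instead of being opened as local files.
const REMOTE_URL = /^ftp:|^https?:|^gs:|^s3:|^data:[^;]*;base64,([a-zA-Z0-9\/+\n=]+)$/;

function isRemoteUrl(file) {
  return file != null && REMOTE_URL.test(file);
}
```

With the unpatched regex, a gs:// path falls through to the local-file branch, which is exactly why the function log shows ENOENT from open('gs://...').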