cannot get this 'convert' cloud functions command to run - firebase

This Firebase function should take a PDF in /test/testfile.pdf, convert it to grayscale, and save it somewhere. I want to use this function in a more complicated process, but the exec('convert') call is really not helping me.
The issue is that the 'exec' command keeps failing. In the shell, the exact command line you see here works:
convert -colorspace GRAY -density 300 test/testfile.pdf /tmp/out.pdf
The error in the logs is this:
{ ChildProcessError: Command failed: convert -colorspace GRAY -density 300 test/testfile.pdf /tmp/out.pdf
convert: no images defined `/tmp/out.pdf' @ error/convert.c/ConvertImageCommand/3210.
 `convert -colorspace GRAY -density 300 test/testfile.pdf /tmp/out.pdf` (exited with error code 1)
    at callback (/user_code/node_modules/child-process-promise/lib/index.js:33:27)
    at ChildProcess.exithandler (child_process.js:205:5)
    at emitTwo (events.js:106:13)
    at ChildProcess.emit (events.js:191:7)
    at maybeClose (internal/child_process.js:920:16)
    at Process.ChildProcess._handle.onexit (internal/child_process.js:230:5)
  name: 'ChildProcessError',
  code: 1,
  childProcess:
   { ChildProcess: { [Function: ChildProcess] super_: [Object] },
     fork: [Function],
     _forkChild: [Function],
     exec: [Function],
     execFile: [Function],
     spawn: [Function],
     spawnSync: [Function: spawnSync],
     execFileSync: [Function: execFileSync],
     execSync: [Function: execSync] },
  stdout: '',
  stderr: 'convert: no images defined `/tmp/out.pdf\' @ error/convert.c/ConvertImageCommand/3210.\n' }
This is the function:
const functions = require('firebase-functions');
const rp = require('request-promise');
const request = require('request');
const baseURL = "https://www.google.com/cloudprint/"
const exec = require('child-process-promise').exec;
const mkdirp = require('mkdirp-promise');
const path = require('path');
const os = require('os');
const fs = require('fs');
exports.convertPDF = functions.https.onRequest((req, res) => {
  const tempLocalThumbFile = path.join(os.tmpdir(), "out.pdf");
  try {
    let tempLocalFile = "test/testfile.pdf";
    exec('convert -colorspace GRAY -density 300 test/testfile.pdf ' + tempLocalThumbFile).then((a) => {
      console.log('Conversion created at', tempLocalThumbFile);
    }, function (err) {
      console.log(err);
    });
  } catch (err) {
    console.log(err);
  }
});
I am pretty stuck. How do I get this convert command to work in Firebase Functions?

Actually the problem is what has been stated by @Nivco: Cloud Functions is missing the ghostscript package. There is already a feature request asking for the ghostscript package to be made available. You can go to the link and click the star icon to get email notifications when there is news.
There is another Stack Overflow thread that mentions a workaround consisting of bundling the gs binaries on your own.
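For illustration, here is a minimal sketch of that workaround, assuming you bundle a statically linked gs binary in a bin/ directory inside your functions folder and deploy it alongside the code (the binary location and the input PDF path are placeholders, not part of the original question):
const functions = require('firebase-functions');
const path = require('path');
const os = require('os');
const exec = require('child-process-promise').exec;

exports.convertPDF = functions.https.onRequest((req, res) => {
  const srcFile = path.join(__dirname, 'test/testfile.pdf'); // placeholder: a PDF deployed with the function
  const outFile = path.join(os.tmpdir(), 'out.pdf');
  // Put the bundled bin/ directory first on PATH so ImageMagick's PDF delegate can find gs.
  const env = Object.assign({}, process.env, {
    PATH: path.join(__dirname, 'bin') + ':' + process.env.PATH,
  });
  // Options are forwarded to child_process.exec, so the custom env applies to convert.
  return exec(`convert -colorspace GRAY -density 300 ${srcFile} ${outFile}`, { env })
    .then(() => res.status(200).send('Conversion created at ' + outFile))
    .catch((err) => {
      console.error(err);
      return res.status(500).send(String(err));
    });
});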

Related

Vue 3 Vite Pinia Unit Testing - TypeError: [Function] is not a spy or a call to a spy! ONLY when running Coverage

In my Vue 3 app (using Vite and Vitest) I am using Pinia, and I've written some basic unit tests which run fine when I run
npm run test:unit
but when I run
npm run coverage
I get test errors:
TypeError: [Function] is not a spy or a call to a spy!
Anyone know why one would work but not the other?
This is my script setup in package.json:
"test:unit": "vitest --environment jsdom",
"coverage": "vitest run --coverage",
Here's an example of a test. It runs fine with the first command, but with the second command it gives the above error on the expect(store.clearCheckedData) line:
describe("ContactBook", () => {
let mockProps = {};
let wrapper:any;
beforeEach(() => {
wrapper = render(ContactBook, {
props: mockProps,
global: {
components: {ProgressSpinner,Button,InputText,BaseButton},
plugins: [PrimeVue,
createTestingPinia({
initialState: {contact:{
mockRegistrationData: mockRegistrationData,
loading: false,
}},
stubActions: false,
createSpy: vi.fn,
fakeApp:true
}),
],
},
});
setActivePinia(createPinia());
});
afterEach(() => {
cleanup();
});
it("when Year / Reg Group tab is clicked, registrations component is rendered", async() => {
const button = screen.getByText('Year / Reg Group')
await userEvent.click(button);
const store = useContactBookStore();
expect(store.clearCheckedData).toHaveBeenCalledTimes(1) // ERROR ON THIS LINE
expect(store.fetchRegistrationData).toHaveBeenCalledTimes(2)
wrapper.getByTestId("registrations-component")
});

Error trying to deploy a function on firebase

Hello, I am getting an error when trying to deploy a function to Firebase. It is bothering me because it worked in the past, and now that I want to deploy the same code it gives me the error below.
Can someone have a look? I checked the documentation thinking that something might have changed (attribute names or similar), but the function seems 100% sound based on the documentation.
Kind regards and kudos to everyone.
Much respect if someone manages to give me a hint. I will add the log files also.
Code:
const functions = require("firebase-functions");
const axios = require("axios");
const admin = require("firebase-admin");
admin.initializeApp();
const database = admin.firestore();

const page = 1;
const fiat = "RON";
const tradeType = "BUY";
const asset = "USDT";
const payTypes = ["ING"];
let finalData = [];
let tempDataBeforeProccessing = [];

const baseObj = {
  page,
  rows: 20,
  publisherType: null,
  asset,
  tradeType,
  fiat,
  payTypes,
};
const stringData = JSON.stringify(baseObj);

const getTheData = async function() {
  tempDataBeforeProccessing = [];
  await axios.post("https://p2p.binance.com/bapi/c2c/v2/friendly/c2c/adv/search", baseObj, {
    hostname: "p2p.binance.com",
    port: 443,
    path: "/bapi/c2c/v2/friendly/c2c/adv/search",
    method: "POST",
    headers: {
      "Content-Type": "application/json",
      "Content-Length": stringData.length,
    },
  }).then((res) => {
    tempDataBeforeProccessing = res.data.data;
  });
};

const processData = function() {
  finalData = [];
  let obj = [];
  for (let i = 0; i < tempDataBeforeProccessing.length; i++) {
    let payTypesz = "";
    for (let y = 0; y < tempDataBeforeProccessing[i]["adv"]["tradeMethods"].length; y++) {
      payTypesz += tempDataBeforeProccessing[i]["adv"]["tradeMethods"][y]["identifier"];
      if (y < tempDataBeforeProccessing[i]["adv"]["tradeMethods"].length - 1) {
        payTypesz += ", ";
      }
    }
    obj = {
      tradeType: tempDataBeforeProccessing[i]["adv"]["tradeType"],
      asset: tempDataBeforeProccessing[i]["adv"]["asset"],
      fiatUnit: tempDataBeforeProccessing[i]["adv"]["fiatUnit"],
      price: tempDataBeforeProccessing[i]["adv"]["price"],
      surplusAmount: tempDataBeforeProccessing[i]["adv"]["surplusAmount"],
      maxSingleTransAmount: tempDataBeforeProccessing[i]["adv"]["maxSingleTransAmount"],
      minSingleTransAmount: tempDataBeforeProccessing[i]["adv"]["minSingleTransAmount"],
      nickName: tempDataBeforeProccessing[i]["advertiser"]["nickName"],
      monthOrderCount: tempDataBeforeProccessing[i]["advertiser"]["monthOrderCount"],
      monthFinishRate: tempDataBeforeProccessing[i]["advertiser"]["monthFinishRate"],
      payTypes: payTypesz,
    };
    finalData.push(obj);
  }
  console.log(finalData);
};

const entireCall = async function() {
  await getTheData();
  processData();
};

exports.scheduledFunction = functions.pubsub
    .schedule("every 1 minutes")
    .onRun(async (context) => {
      await database.collection("SebiBinanceSale").doc("BCR Bank").delete();
      await entireCall();
      for (let i = 0; i < finalData.length; i++) {
        await database.collection("SebiBinanceSale").doc("BCR Bank")
            .collection("1").doc(i.toString())
            .set({
              "tradeType": finalData[i]["tradeType"],
              "asset": finalData[i]["asset"],
              "fiatUnit": finalData[i]["fiatUnit"],
              "price": finalData[i]["price"],
              "surplusAmount": finalData[i]["surplusAmount"],
              "maxSingleTransAmount": finalData[i]["maxSingleTransAmount"],
              "minSingleTransAmount": finalData[i]["minSingleTransAmount"],
              "nickName": finalData[i]["nickName"],
              "monthOrderCount": finalData[i]["monthOrderCount"],
              "monthFinishRate": finalData[i]["monthFinishRate"],
              "payTypes": finalData[i]["payTypes"],
            });
      }
      return console.log("Succes Upload of the data ");
    });
error:
Function failed on loading user code. This is likely due to a bug in the user code. Error message: Error: please examine your function logs to see the error cause: https://cloud.google.com/functions/docs/monitoring/logging#viewing_logs. Additional troubleshooting documentation can be found at https://cloud.google.com/functions/docs/troubleshooting#logging. Please visit https://cloud.google.com/functions/docs/troubleshooting for in-depth troubleshooting documentation.
Functions deploy had errors with the following functions:
scheduledFunction(us-central1)
i functions: cleaning up build files...
Error: There was an error deploying functions
ivanoiualexandrupaul@Ivanoius-MacBook-Pro functions %
log file:
[debug] [2022-10-29T17:40:16.776Z] Error: Failed to update function scheduledFunction in region us-central1
at /usr/local/lib/node_modules/firebase-tools/lib/deploy/functions/release/fabricator.js:41:11
at processTicksAndRejections (internal/process/task_queues.js:95:5)
at async Fabricator.updateV1Function (/usr/local/lib/node_modules/firebase-tools/lib/deploy/functions/release/fabricator.js:305:32)
at async Fabricator.updateEndpoint (/usr/local/lib/node_modules/firebase-tools/lib/deploy/functions/release/fabricator.js:140:13)
at async handle (/usr/local/lib/node_modules/firebase-tools/lib/deploy/functions/release/fabricator.js:78:17)
[error]
[error] Error: There was an error deploying functions
When you use scheduled functions in Firebase, an App Engine instance is created, which Cloud Scheduler needs in order to work. You can read about it here. It uses the default location that has been set for GCP resources. I think you are getting this error because there is a mismatch between the default GCP resource location you specified and the region of your scheduled Cloud Function. Check your scheduled function's details in Cloud Scheduler and see which region it has been deployed to. By default, functions run in the us-central1 region. Check this link to see how to change the region of the function.
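For reference, a minimal sketch of pinning the scheduled function to a specific region with the v1 firebase-functions API (europe-west1 is only an example; use whichever region matches your default GCP resource location):
const functions = require("firebase-functions");

exports.scheduledFunction = functions
    .region("europe-west1") // example region; match your default GCP resource location
    .pubsub.schedule("every 1 minutes")
    .onRun(async (context) => {
      // ... same body as in the question
      return null;
    });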
You can also try reinstalling the Firebase CLI using the command
npm install -g firebase-tools
Also check whether any lock files have been generated, delete them, and run firebase deploy --only functions again to see if that works.

Stream stdout from subprocess

I'm running a Docker build as a subprocess in Deno and would like its stdout streamed to the parent stdout (Deno.stdout) so it's output straight away.
How can I achieve this?
Currently I have the following but it doesn't output anything until the subprocess has finished.
const p = Deno.run({
  cmd: ['docker', 'build', '.'],
  stdout: 'piped'
});
const stdout = await p.output();
await Deno.stdout.write(stdout);
You're not far off; you just need to start piping the output before you await the process. There are some other optimizations you can make, like using Deno.copy to pipe the subprocess's output to the main process's stdout without copying everything into memory.
import { copy } from "https://deno.land/std@0.104.0/io/util.ts";

const cat = Deno.run({
  cmd: ["docker", "build", "--no-cache", ".", "-t", "foobar:latest"],
  cwd: "/path/to/your/project",
  stdout: "piped",
  stderr: "piped",
});

copy(cat.stdout, Deno.stdout);
copy(cat.stderr, Deno.stderr);

await cat.status();
console.log("Done!");
If you want to prefix each line with the name of the process it came from (useful when you have multiple subprocesses running), you can write a simple function that uses the standard library's readLines function and a text encoder to do that:
import { readLines } from "https://deno.land/std@0.104.0/io/mod.ts";
import { writeAll } from "https://deno.land/std@0.104.0/io/util.ts";

async function pipeThrough(
  prefix: string,
  reader: Deno.Reader,
  writer: Deno.Writer,
) {
  const encoder = new TextEncoder();
  for await (const line of readLines(reader)) {
    await writeAll(writer, encoder.encode(`[${prefix}] ${line}\n`));
  }
}

const cat = Deno.run({
  cmd: ["docker", "build", "--no-cache", ".", "-t", "foobar:latest"],
  cwd: "/path/to/your/project",
  stdout: "piped",
  stderr: "piped",
});

pipeThrough("docker", cat.stdout, Deno.stdout);
pipeThrough("docker", cat.stderr, Deno.stderr);

await cat.status();
console.log("Done!");
The p stands for process; Deno.run returns a process handle, and awaiting p.status() gives you the process state on exit (not the stdout):
console.log(await p.status());
// { success: true, code: 0 }
Since you are awaiting the status, the stdout will not stream the output until the process has finished.
Try using the process like this:
const p = Deno.run({ cmd, stderr: 'piped', stdout: 'piped' });

const [status, stdout, stderr] = await Promise.all([
  p.status(),
  p.output(),
  p.stderrOutput(),
]);

console.log(new TextDecoder().decode(stdout)); // since it's returned as a Uint8Array
p.close();
But it'll still wait until the subprocess is done.

Firebase functions: ApiError: Not Found at Object.parseHttpRespBody when removing from firebase storage

I'm trying to remove an item from my Firebase Storage via Firebase Cloud Functions, but it's giving me this error:
Error { ApiError: Not Found
    at Object.parseHttpRespBody (/user_code/node_modules/firebase-admin/node_modules/@google-cloud/common/src/util.js:193:30)
    at Object.handleResp (/user_code/node_modules/firebase-admin/node_modules/@google-cloud/common/src/util.js:131:18)
    at /user_code/node_modules/firebase-admin/node_modules/@google-cloud/common/src/util.js:496:12
    at Request.onResponse [as _callback] (/user_code/node_modules/firebase-admin/node_modules/@google-cloud/common/node_modules/retry-request/index.js:198:7)
    at Request.self.callback (/user_code/node_modules/firebase-admin/node_modules/request/request.js:185:22)
    at emitTwo (events.js:106:13)
    at Request.emit (events.js:191:7)
    at Request.<anonymous> (/user_code/node_modules/firebase-admin/node_modules/request/request.js:1161:10)
    at emitOne (events.js:96:13)
    at Request.emit (events.js:188:7)
  code: 404,
  errors: [ { domain: 'global', reason: 'notFound', message: 'Not Found' } ],
  response: undefined,
  message: 'Not Found' }
And this is my code:
const functions = require('firebase-functions');
const admin = require('firebase-admin');
admin.initializeApp();

var db = admin.firestore();
var storage = admin.storage().bucket('visa_cop');

exports.deletingVisaCop = functions.firestore.document('users/{user_Id}/info/visa_cop').onUpdate((change, context) => {
  var data = change.after.data().checked;
  if (data === true) {
    return storage.delete().then(function(data) {
      return console.log("DataIs", data);
    }).catch(function(error) {
      return console.log("Error", error);
    });
  } else {
  }
});
And I added the Storage Admin role for the Google APIs Service Agent and the App Engine default service account from the IAM & Admin page.
Thank You.
The problem is here:
functions.firestore.document('users/{user_Id}/info/visa_cop').onUpdate((change,context)
At the moment, the function listens to a single document called "visa_cop" under "info". You need to add a wildcard token at the end to tell the function to listen for updates to any document under it (or you can specify one document if needed).
Just append e.g. /{visaId} after visa_cop, like so:
functions.firestore.document('users/{user_Id}/info/visa_cop/{visaId}').onUpdate((change,context)
Ps. "visaId" can be anything, however it must match the Document Path that you define at function deploy.
in your example, the function listens to any doc in "visa_cop" folder, so if you use:
Console:
Trigger is "Cloud Firestore"
Event Type is "update"
Document Path is "students/{studentId}/visa_cop/{visaId}"
CLI:
gcloud functions deploy [FUNCTION_NAME] \
  --runtime [RUNTIME] \
  --trigger-event providers/cloud.firestore/eventTypes/document.update \
  --trigger-resource "projects/[PROJECT_ID]/databases/(default)/documents/users/{userId}/info/visa_cop/{visaId}"

google Cloud Vision API: node.js and an image URI, how to invoke vision.detectText()?

I'm trying to detect text in a remote image with the google Cloud Vision API, but can't seem to get the vision.detectText() syntax right.
How do I use vision.detectText() when there is no cloud storage bucket?
I'm thinking I can/should ignore the reference to storage.bucket() indicated on https://cloud.google.com/vision/docs/detecting-text
I have:
vision.detectText('https://drive.google.com/file/d/0Bw4DMtLCtPMkWVlIVXE5a2ZpQlU/view?usp=drivesdk')
  .then((results) => {
    const detections = results[0];
    console.log('Text:');
    detections.forEach((text) => console.log(text));
  })
  .catch((err) => {
    console.error('ERROR:', err);
  });
the console reports:
ERROR: { PartialFailureError: A failure occurred during this request.
    at /Users/node_modules/@google-cloud/vision/src/index.js:434:15
    at /Users/node_modules/@google-cloud/vision/src/index.js:126:5
    at _combinedTickCallback (internal/process/next_tick.js:80:11)
    at process._tickCallback (internal/process/next_tick.js:104:9)
  errors:
   [ { image: 'https://drive.google.com/file/d/0Bw4DMtLCtPMkNFFselFhU0RMV2c/view?usp=drivesdk',
       errors: [Object] } ],
  response: { responses: [ [Object] ] },
  message: 'A failure occurred during this request.' }
I have tried using:
vision.detectText(storage.bucket().file('https://......
but the error is:
Error: A bucket name is needed to use Cloud Storage.
It looks like you're not setting your GOOGLE_APPLICATION_CREDENTIALS environment variable. The following code works as tested:
const Vision = require('@google-cloud/vision');
const vision = Vision();
const fileName = 'http://example.com/eg.jpg';

vision.detectText(fileName)
  .then((results) => {
    const detections = results[0];
    console.log('Text:');
    detections.forEach((text) => console.log(text));
  })
  .catch((err) => {
    console.error('ERROR:', err);
  });
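Alternatively, here is a sketch of passing credentials to the client explicitly instead of relying on the environment variable, assuming you have a service account key file (the project ID and key path below are placeholders, and this follows the older client constructor used in the snippet above):
const Vision = require('@google-cloud/vision');

// Explicit credentials instead of GOOGLE_APPLICATION_CREDENTIALS.
const vision = Vision({
  projectId: 'your-project-id',                 // placeholder
  keyFilename: '/path/to/service-account.json', // placeholder path to the key file
});

vision.detectText('http://example.com/eg.jpg')
  .then((results) => console.log(results[0]))
  .catch((err) => console.error('ERROR:', err));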
To try it out with our sample, pass a URI (e.g. an HTTP URI) to the detect.js sample as:
node detect.js fulltext http://www.identifont.com/samples/houseindustries/NeutraText.gif
