I'm trying to figure out why I keep getting the following error with this code:
[uncaught application error]: Error - checksum error
import { Untar } from "https://deno.land/std@0.128.0/archive/tar.ts";
import { readerFromStreamReader } from "https://deno.land/std@0.128.0/streams/conversion.ts";
const res = await fetch("https://registry.npmjs.org/react/-/react-17.0.2.tgz", { keepalive: true });
if (res.status === 200) {
const streamReader = res.body!.getReader();
const reader = readerFromStreamReader(streamReader);
const untar = new Untar(reader);
for await (const block of untar) {
// errors with [uncaught application error]: Error - checksum error
}
}
Can you Untar from a stream like this?
The response you are streaming is gzip-compressed (the registry serves a .tgz), so Untar reads gzip bytes where it expects a tar header and the checksum validation fails. You need to pipe the stream data through a decompression transform stream first:
./so-71365204.ts
import {
  assertExists,
  assertStrictEquals,
} from "https://deno.land/std@0.128.0/testing/asserts.ts";
import { readerFromStreamReader } from "https://deno.land/std@0.128.0/streams/conversion.ts";
import { Untar } from "https://deno.land/std@0.128.0/archive/tar.ts";
const res = await fetch("https://registry.npmjs.org/react/-/react-17.0.2.tgz");
assertStrictEquals(res.status, 200);
assertExists(res.body);
const streamReader = res.body
.pipeThrough(new DecompressionStream("gzip"))
.getReader();
const denoReader = readerFromStreamReader(streamReader);
const untar = new Untar(denoReader);
for await (const entry of untar) {
const { fileName, type } = entry;
console.log(type, fileName);
}
$ deno run --allow-net=registry.npmjs.org ./so-71365204.ts
file package/LICENSE
file package/index.js
file package/jsx-dev-runtime.js
# etc...
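If you want to write the extracted entries to disk instead of just listing them, the logging loop above can be replaced with one that materializes each entry. A sketch following the extraction pattern shown in the tar answer further below, assuming std@0.128.0's ensureDir/ensureFile and copy exports:
import { copy } from "https://deno.land/std@0.128.0/streams/conversion.ts";
import { ensureDir } from "https://deno.land/std@0.128.0/fs/ensure_dir.ts";
import { ensureFile } from "https://deno.land/std@0.128.0/fs/ensure_file.ts";
for await (const entry of untar) {
  if (entry.type === "directory") {
    // Recreate directories before the files inside them
    await ensureDir(entry.fileName);
    continue;
  }
  // Each entry is itself a Deno.Reader positioned at that entry's content
  await ensureFile(entry.fileName);
  const file = await Deno.open(entry.fileName, { write: true });
  await copy(entry, file);
  file.close();
}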
I'm currently working on a project that requires me to send a PNG image from Unreal Engine to a Next.js server, which uses multer to pass the file on to another server.
When sending my file as binary, the intermediate Next.js server does not receive a file from Unreal.
I've tried the following two methods:
TArray<uint8> rawFileData;
FFileHelper::LoadFileToArray(rawFileData, *media);
Request->SetURL(API_HP_URL + "nude_upload");
Request->SetHeader(TEXT("Content-Type"), TEXT("multipart/form-data; boundary=----WebKitFormBoundarywpp9S2IUDici8hpI"));
Request->SetHeader(TEXT("Connection"), TEXT("keep-alive"));
Request->SetHeader(TEXT("accept"), TEXT("application/json, text/plain, */*"));
Request->SetContent(rawFileData);
Request->SetVerb("POST");
Request->OnProcessRequestComplete().BindUObject(this, &AHttpCommunicator::OnPostNudeSSResponse);
Request->ProcessRequest();
and
FString JsonString;
TArray<uint8> rawFileData;
TSharedRef<TJsonWriter<TCHAR>> JsonWriter = JsonWriterFactory<TCHAR>::Create(&JsonString);
JsonWriter->WriteObjectStart();
JsonWriter->WriteValue("fileName", pPathToFile);
JsonWriter->WriteValue("file", FBase64::Encode(rawFileData));
JsonWriter->WriteObjectEnd();
JsonWriter->Close();
Request->SetURL(API_HP_URL + "nude_upload");
Request->SetHeader(TEXT("Content-Type"), TEXT("multipart/form-data; boundary=----WebKitFormBoundarywpp9S2IUDici8hpI"));
Request->SetHeader(TEXT("Connection"), TEXT("keep-alive"));
Request->SetHeader(TEXT("accept"), TEXT("application/json, text/plain, */*"));
Request->SetContentAsString(JsonString);
Request->SetVerb("POST");
Request->OnProcessRequestComplete().BindUObject(this, &AHttpCommunicator::OnPostNudeSSResponse);
Request->ProcessRequest();
Both of these methods have the server return an undefined file object.
// Next.js API route support: https://nextjs.org/docs/api-routes/introduction
import path from 'path';
import MulterGoogleCloudStorage from "multer-google-storage";
import nextConnect from 'next-connect';
const Multer = require('multer');
const { Storage } = require('@google-cloud/storage');
const CLOUD_BUCKET = 'nude_locks';
const PROJECT_ID = 'hp-production-338902';
const KEY_FILE = path.resolve('./hp-production-key.json')
const storage = new Storage({
projectId: PROJECT_ID,
keyFilename: KEY_FILE
});
const bucket = storage.bucket(CLOUD_BUCKET);
const upload = Multer({
storage: Multer.memoryStorage(),
limits: {
fileSize: 5 * 1024 * 1024,
}
}).single('file');
const apiRoute = nextConnect({
onNoMatch(req, res) {
res.status(405).json({ error: `Method '${req.method}' Not Allowed` });
},
});
apiRoute.use(upload);
apiRoute.post((req, res, next) => {
console.log(req.file);
if (!req.file) {
res.status(400).send("No file uploaded.");
return;
}
const blob = bucket.file(req.file.originalname);
// Make sure to set the contentType metadata for the browser to be able
// to render the image instead of downloading the file (default behavior)
const blobStream = blob.createWriteStream({
metadata: {
contentType: req.file.mimetype
}
});
blobStream.on("error", err => {
next(err);
return;
});
blobStream.on("finish", () => {
console.log('finish');
console.log(blob);
// The public URL can be used to directly access the file via HTTP.
const publicUrl = `https://storage.googleapis.com/${bucket.name}/${blob.name}`;
// Make the image public to the web (since we'll be displaying it in browser)
blob.makePublic().then(() => {
res.status(200).send(`Success!\n Image uploaded to ${publicUrl}`);
});
});
blobStream.end(req.file.buffer);
});
export default apiRoute;
export const config = {
api: {
bodyParser: false,
},
}
const fileSelectedHandler = e => {
const file = new File("D:/_Spectre/VHS/P210107_VHS_Configurator/P04_Unreal/Human_Configurator/Saved/Screenshots/Windows/-1-nude-2022-2-13.png");
console.log(file);
const formData = new FormData();
formData.append('file', file);
axios.post('/api/nude_upload', formData, {
headers: {
'Content-Type': 'multipart/form-data',
}
})
.then(res => {
console.log(res);
});
}
Is there a way to create a file object from UE4?
Alternatively, is there a way to retrieve Google Cloud Storage access tokens from UE4?
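One thing worth noting about the two UE4 attempts above: the Content-Type header declares a multipart boundary, but neither body actually contains multipart framing (the first sends raw bytes, the second JSON), so multer never finds a part named 'file'. For reference, a hedged TypeScript sketch of the byte layout a multipart body needs (buildMultipartBody is a hypothetical helper; the same framing would have to be assembled into the TArray<uint8> sent from UE4):
const boundary = "----WebKitFormBoundarywpp9S2IUDici8hpI";
function buildMultipartBody(fileName: string, bytes: Uint8Array): Uint8Array {
  // One part named "file", matching upload.single('file') on the server
  const head =
    `--${boundary}\r\n` +
    `Content-Disposition: form-data; name="file"; filename="${fileName}"\r\n` +
    `Content-Type: image/png\r\n\r\n`;
  const tail = `\r\n--${boundary}--\r\n`;
  const enc = new TextEncoder();
  const headBytes = enc.encode(head);
  const tailBytes = enc.encode(tail);
  // Concatenate part header + file bytes + closing boundary
  const body = new Uint8Array(headBytes.length + bytes.length + tailBytes.length);
  body.set(headBytes, 0);
  body.set(bytes, headBytes.length);
  body.set(tailBytes, headBytes.length + bytes.length);
  return body;
}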
I'm using firebase/storage to set up audio file downloading/uploading. I already have the audio file in Firebase Storage.
With the following code, I am able to get the download URL of the specific file:
import firebase from 'firebase/app';
import 'firebase/firestore';
import 'firebase/storage';
static async downloadMedia(mediaRef: string) {
try {
var storage = firebase.storage();
var pathReference = storage.ref(mediaRef);
const downloadUrl = await pathReference.getDownloadURL();
var xhr = new XMLHttpRequest();
xhr.responseType = 'blob';
xhr.onload = (event) => {
var blob = xhr.response;
};
xhr.open('GET', downloadUrl);
return downloadUrl;
} catch (e) {
switch (e.code) {
case 'storage/object-not-found':
console.warn('File does not exist.');
break;
case 'storage/unauthorized':
console.warn('Unauthorized.');
break;
case 'storage/canceled':
console.warn('Upload cancelled.');
break;
case 'storage/unknown':
console.warn('Unknown error.');
break;
}
}
}
However, I do not understand how to use the firebase library to download the file itself with the URL that it provides me.
Thanks.
I found a solution that doesn't involve downloading the media at all: playing it directly from the download URL, using the package 'expo-av'.
Hope this helps someone in my shoes!
import * as React from 'react';
import { Audio } from 'expo-av';

// The download URL obtained from Firebase Storage (may not be ready yet)
type AudioPlayProps = { mediaDownloadUrl: string | null };

export default function AudioPlay({ mediaDownloadUrl }: AudioPlayProps) {
const [sound, setSound] = React.useState<Audio.Sound | null>(null);
async function playSound() {
if (typeof mediaDownloadUrl !== 'string') return null;
try {
const { sound } = await Audio.Sound.createAsync(
{ uri: mediaDownloadUrl }
);
setSound(sound);
await sound.playAsync();
} catch (e) {
console.warn(e);
}
}
React.useEffect(() => {
return sound
? () => {
console.log('Unloading Sound');
sound.unloadAsync(); }
: undefined;
}, [sound]);
// ....
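For completeness, a hypothetical parent component could tie this player to the downloadMedia helper from the question above (MediaScreen and MediaService are assumptions, not names from the original code):
function MediaScreen({ mediaRef }: { mediaRef: string }) {
  const [url, setUrl] = React.useState<string | null>(null);
  React.useEffect(() => {
    // MediaService.downloadMedia is the static helper from the question above
    MediaService.downloadMedia(mediaRef).then((u) => setUrl(u ?? null));
  }, [mediaRef]);
  return url ? <AudioPlay mediaDownloadUrl={url} /> : null;
}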
How do I convert a file to a readable stream?
I am trying to use Deno's fetch API to do this, which requires a readable stream as the body to PUT something on a server.
I am not able to figure out how to convert a file to a ReadableStream.
There isn't a built-in way yet to convert a Reader to a ReadableStream, but you can do it with the following code:
const file = await Deno.open("./some-file.txt", { read: true });
const stream = new ReadableStream({
async pull(controller) {
try {
const b = new Uint8Array(1024 * 32);
const result = await file.read(b);
if (result === null) {
controller.close();
return file.close();
}
controller.enqueue(b.subarray(0, result));
} catch (e) {
controller.error(e);
controller.close();
file.close();
}
},
cancel() {
// When reader.cancel() is called
file.close();
},
});
// ReadableStream implements asyncIterator
for await (const chunk of stream) {
console.log(chunk);
}
Keep in mind that Deno's fetch (as of 1.0.5) does not yet support a ReadableStream as a request body, so currently, to post a file, you'll need to buffer the contents:
const body = await Deno.readAll(file);
await fetch('http://example.com', { method: 'POST', body });
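Newer std releases ship this conversion as readableStreamFromReader in streams/conversion.ts (the same module used in the first answer above), so the hand-rolled ReadableStream can become a one-liner:
import { readableStreamFromReader } from "https://deno.land/std@0.128.0/streams/conversion.ts";
const file = await Deno.open("./some-file.txt", { read: true });
// By default, std closes the file once the stream is fully read or cancelled
const stream = readableStreamFromReader(file);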
This is a continuation of the linked question.
It seems to me that the current implementation of the std/archive/tar.ts module only allows reads and writes per file, not for whole directories.
So far, my reference sources are the test files, which only showcase single-file processing. But what if, for example, a directory ./my-dir/ with multiple files and a tar archive ./test.tar are given?
How can I then utilize append, extract & Co. to efficiently write ./my-dir/ to the ./test.tar archive and read all file contents back from it?
You can archive a directory by using std/fs/walk:
import { walk } from "https://deno.land/std/fs/walk.ts";
import { Tar } from "https://deno.land/std/archive/tar.ts";
// Async
const tar = new Tar();
for await (const entry of walk("./dir-to-archive")) {
if (!entry.isFile) {
continue;
}
await tar.append(entry.path, {
filePath: entry.path,
});
}
const writer = await Deno.open("./out.tar", { write: true, create: true });
await Deno.copy(tar.getReader(), writer);
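The same Tar.append API also accepts in-memory content as a reader plus contentSize instead of a filePath. A sketch along the lines of the std example (Buffer comes from std/io/buffer.ts):
import { Buffer } from "https://deno.land/std/io/buffer.ts";
const content = new TextEncoder().encode("Hello, tar!");
await tar.append("greeting.txt", {
  // Wrap the bytes in a Deno.Reader; contentSize must match the byte length
  reader: new Buffer(content.buffer),
  contentSize: content.byteLength,
});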
The Untar implementation for folders/multiple files was broken; it was fixed by this PR and is currently available in master using https://deno.land/std/archive/tar.ts:
import { Untar } from "https://deno.land/std/archive/tar.ts";
import { ensureFile } from "https://deno.land/std/fs/ensure_file.ts";
import { ensureDir } from "https://deno.land/std/fs/ensure_dir.ts";
const reader = await Deno.open("./out.tar", { read: true });
const untar = new Untar(reader);
for await (const entry of untar) {
console.log(entry); // metadata
/*
fileName: "archive/deno.txt",
fileMode: 33204,
mtime: 1591657305,
uid: 0,
gid: 0,
size: 24400,
type: 'file'
*/
if (entry.type === "directory") {
await ensureDir(entry.fileName);
continue;
}
await ensureFile(entry.fileName);
const file = await Deno.open(entry.fileName, { write: true });
// <entry> is a reader
await Deno.copy(entry, file);
}
reader.close();
Update
Created a lib to allow transformations, including gzip/gunzip to create & read .tar.gz
gzip
import * as Transform from "https://deno.land/x/transform/mod.ts";
const { GzEncoder } = Transform.Transformers;
/** ... **/
const writer = await Deno.open("./out.tar.gz", { write: true, create: true });
await Transform.pipeline(tar.getReader(), new GzEncoder())
.to(writer);
writer.close();
gunzip
import * as Transform from "https://deno.land/x/transform/mod.ts";
const { GzDecoder } = Transform.Transformers;
/** ... **/
const reader = await Deno.open("./out.tar.gz", { read: true });
const untar = new Untar(
  Transform.newReader(reader, new GzDecoder())
);
for await (const entry of untar) {
console.log(entry);
}
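Putting the two halves together, reading files back out of a .tar.gz combines the GzDecoder reader with the extraction loop shown earlier. A sketch:
import { Untar } from "https://deno.land/std/archive/tar.ts";
import { ensureFile } from "https://deno.land/std/fs/ensure_file.ts";
import { ensureDir } from "https://deno.land/std/fs/ensure_dir.ts";
import * as Transform from "https://deno.land/x/transform/mod.ts";
const { GzDecoder } = Transform.Transformers;
const reader = await Deno.open("./out.tar.gz", { read: true });
const untar = new Untar(Transform.newReader(reader, new GzDecoder()));
for await (const entry of untar) {
  if (entry.type === "directory") {
    await ensureDir(entry.fileName);
    continue;
  }
  await ensureFile(entry.fileName);
  const file = await Deno.open(entry.fileName, { write: true });
  await Deno.copy(entry, file);
  file.close();
}
reader.close();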
I have to send a file to an API, and therefore I have to use fs.readFileSync(). After uploading the picture to storage, I call my function to execute the API call, but I cannot get the file from storage. Below is a section of the code, which always gets null as the result. I also tried .getFiles() without a parameter; that returned all files, but I don't want to filter them by iteration.
exports.stripe_uploadIDs = functions.https //.region("europe-west1")
.onCall((data, context) => {
const authID = context.auth.uid;
console.log("request is authentificated? :" + authID);
if (!authID) {
throw new functions.https.HttpsError("not authorized", "not authorized");
}
let accountID;
let result_fileUpload;
let tempFile = path.join(os.tmpdir(), "id_front.jpg");
const options_id_front_jpeg = {
prefix: "/user/" + authID + "/id_front.jpg"
};
const storageRef = admin
.storage()
.bucket()
    .getFiles(options_id_front_jpeg)
.then(results => {
console.log("JPG" + JSON.stringify(results));
// need to write this file to tempFile
return results;
});
const paymentRef = storageRef.then(() => {
return admin
.database()
.ref("Payment/" + authID)
.child("accountID")
.once("value");
});
const setAccountID = paymentRef.then(snap => {
accountID = snap.val();
return accountID;
});
const fileUpload = setAccountID.then(() => {
return Stripe.fileUploads.create(
{
purpose: "identity_document",
file: {
data: tempFile, // Documentation says I should use fs.readFileSync("filepath")
name: "id_front.jpg",
type: "application/octet-stream"
}
},
{ stripe_account: accountID }
);
});
const fileResult = fileUpload.then(result => {
result_fileUpload = result;
console.log(JSON.stringify(result_fileUpload));
return result_fileUpload;
});
return fileResult;
});
Result is:
JPG[[]]
You need to download your file from the bucket to your function's local context first.
Once your Firebase function starts executing, you can call something like the function below.
More or less the below should work; just tweak it to your needs and call it within your .onCall context. You get the idea:
import admin from 'firebase-admin';
import * as path from 'path';
import * as os from 'os';
import * as fs from 'fs';
admin.initializeApp();
const { log } = console;
async function tempFile(fileBucket: string, filePath: string) {
const bucket = admin.storage().bucket(fileBucket);
const fileName = 'MyFile.ext';
const tempFilePath = path.join(os.tmpdir(), fileName);
const metadata = {
    contentType: 'DONT_FORGET_CONTENT_TYPE'
};
  // Download the file to a local temp file,
  // then do whatever you need with it
await bucket.file(filePath).download({ destination: tempFilePath });
log('File downloaded to', tempFilePath);
  // After you're done, and if you need to upload the modified file back
  // to your bucket, upload it afterwards.
  // This is optional.
await bucket.upload(tempFilePath, {
destination: filePath,
metadata: metadata
});
  // Free up disk space by releasing the temp file; otherwise you might be
  // charged extra, since the temp directory counts against the function's memory.
return fs.unlinkSync(tempFilePath);
}
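Wiring that into the asker's flow could look roughly like the sketch below (names and paths taken from the question; one hedged observation: the question's prefix starts with a leading slash, "/user/...", which typically matches nothing in a Firebase Storage bucket, since object names don't begin with "/"):
// Hedged sketch: download the object first, then hand Stripe the actual bytes.
export const stripe_uploadIDs = functions.https.onCall(async (data, context) => {
  const authID = context.auth?.uid;
  if (!authID) {
    throw new functions.https.HttpsError('unauthenticated', 'not authorized');
  }
  const filePath = 'user/' + authID + '/id_front.jpg'; // note: no leading slash
  const tempFilePath = path.join(os.tmpdir(), 'id_front.jpg');
  await admin.storage().bucket().file(filePath).download({ destination: tempFilePath });
  const snap = await admin.database().ref('Payment/' + authID).child('accountID').once('value');
  const accountID = snap.val();
  const result = await Stripe.fileUploads.create(
    {
      purpose: 'identity_document',
      file: {
        data: fs.readFileSync(tempFilePath), // real file bytes, not a path string
        name: 'id_front.jpg',
        type: 'application/octet-stream',
      },
    },
    { stripe_account: accountID },
  );
  fs.unlinkSync(tempFilePath); // release the memory-backed temp space
  return result;
});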