Deno seems to target text files, but I also need to serve image files for the website.
You can use send().
The send() function is designed to serve static content as part of a
middleware function. In the most straightforward usage, a root is
provided, and requests passed to the function are fulfilled with
files from the local file system, resolved relative to the root using the
requested path.
import { Application, send } from 'https://deno.land/x/oak@v5.3.1/mod.ts';

const app = new Application();

app.use(async (context) => {
  await send(context, context.request.url.pathname, {
    root: `${Deno.cwd()}/static`,
  });
});

await app.listen({ port: 8000 });
With the following directory structure:
static/
  image.jpg
server.js
You can access the image by going to http://localhost:8000/image.jpg
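To sanity-check the setup, a quick fetch from another Deno script should report the expected status and content type; this sketch assumes the server above is running and that static/image.jpg exists (run it with --allow-net):
// expects something like: 200 image/jpeg
const res = await fetch('http://localhost:8000/image.jpg');
console.log(res.status, res.headers.get('content-type'));
await res.body?.cancel(); // discard the body so the connection is released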
Basically, you just need to set the correct Content-Type header for your image type and supply the image data as a Uint8Array:
In your middleware:
app.use(async (ctx, next) => {
  // ...
  // pngFilePath points to the PNG file on disk
  const imageBuf = await Deno.readFile(pngFilePath);
  ctx.response.body = imageBuf;
  ctx.response.headers.set('Content-Type', 'image/png');
});
Here's a complete working example, which will download a sample image (the digitized version of the hand-drawn deno logo) and serve it at http://localhost:8000/image, and display "Hello world" at all other addresses. The run options are in the comment on the first line:
server.ts
// deno run --allow-net=localhost:8000,deno.land --allow-read=deno_logo.png --allow-write=deno_logo.png server.ts
import { Application } from 'https://deno.land/x/oak@v5.3.1/mod.ts';
import { exists } from 'https://deno.land/std@0.59.0/fs/exists.ts';

// server listen options
const listenOptions = {
  hostname: 'localhost',
  port: 8000,
};

// sample image
const imageFilePath = './deno_logo.png';
const imageSource = 'https://deno.land/images/deno_logo.png';

const ensureLocalFile = async (localPath: string, url: string): Promise<void> => {
  const fileExists = await exists(localPath);
  if (fileExists) return;
  console.log(`Downloading ${url} to ${localPath}`);
  const response = await fetch(url);
  if (!response.ok) throw new Error('Response not OK');
  const buf = new Uint8Array(await response.arrayBuffer());
  await Deno.writeFile(localPath, buf);
  console.log('File saved');
};
await ensureLocalFile(imageFilePath, imageSource);
const app = new Application();
app.use(async (ctx, next) => {
  // only match /image
  if (ctx.request.url.pathname !== '/image') {
    await next(); // pass control to next middleware
    return;
  }
  const imageBuf = await Deno.readFile(imageFilePath);
  ctx.response.body = imageBuf;
  ctx.response.headers.set('Content-Type', 'image/png');
});

// default middleware
app.use((ctx) => {
  ctx.response.body = "Hello world";
});

// log info about server
app.addEventListener('listen', ev => {
  const defaultPortHttp = 80;
  const defaultPortHttps = 443;
  let portString = `:${ev.port}`;
  if (
    (ev.secure && ev.port === defaultPortHttps)
    || (!ev.secure && ev.port === defaultPortHttp)
  ) portString = '';
  console.log(`Listening at http${ev.secure ? 's' : ''}://${ev.hostname ?? '0.0.0.0'}${portString}`);
  console.log('Use ctrl+c to stop\n');
});
await app.listen(listenOptions);
Register middleware like this:
// serve static files
app.use(async (context, next) => {
  try {
    await context.send({
      root: `${Deno.cwd()}/wwwroot/static`,
      index: "index.html",
    });
  } catch {
    await next();
  }
});
I have an app made with React, Node.js, and Socket.IO.
I deployed the Node backend to Heroku and the frontend to Netlify.
I know the CORS error is related to the server, but no matter what I add, I just can't get past the error shown in the picture below.
I also added a proxy script to React's package.json as "proxy": "https://googledocs-clone-sbayrak.herokuapp.com/"
And here is my server.js file:
const mongoose = require('mongoose');
const Document = require('./Document');
const dotenv = require('dotenv');
const path = require('path');
const express = require('express');
const http = require('http');
const socketio = require('socket.io');
dotenv.config();
const app = express();
app.use(cors());
const server = http.createServer(app);
const io = socketio(server, {
  cors: {
    origin: 'https://googledocs-clone-sbayrak.netlify.app/',
    methods: ['GET', 'POST'],
  },
});

app.get('/', (req, res) => {
  res.status(200).send('hello!!');
});

const connectDB = async () => {
  try {
    const connect = await mongoose.connect(process.env.MONGODB_URI, {
      useUnifiedTopology: true,
      useNewUrlParser: true,
    });
    console.log('MongoDB Connected...');
  } catch (error) {
    console.error(`Error : ${error.message}`);
    process.exit(1);
  }
};

connectDB();

let defaultValue = '';

const findOrCreateDocument = async (id) => {
  if (id === null) return;
  const document = await Document.findById({ _id: id });
  if (document) return document;
  const result = await Document.create({ _id: id, data: defaultValue });
  return result;
};

io.on('connection', (socket) => {
  socket.on('get-document', async (documentId) => {
    const document = await findOrCreateDocument(documentId);
    socket.join(documentId);
    socket.emit('load-document', document.data);
    socket.on('send-changes', (delta) => {
      socket.broadcast.to(documentId).emit('receive-changes', delta);
    });
    socket.on('save-document', async (data) => {
      await Document.findByIdAndUpdate(documentId, { data });
    });
  });
  console.log('connected');
});

server.listen(process.env.PORT || 5000, () =>
  console.log(`Server has started.`)
);
And this is where I make the request from the frontend:
import { useState, useEffect } from 'react';
import Quill from 'quill';
import 'quill/dist/quill.snow.css';
import { useParams } from 'react-router-dom';
import { io } from 'socket.io-client';

const SAVE_INTERVAL_MS = 2000;

const TextEditor = () => {
  const [socket, setSocket] = useState();
  const [quill, setQuill] = useState();
  const { id: documentId } = useParams();

  useEffect(() => {
    const s = io('https://googledocs-clone-sbayrak.herokuapp.com/');
    setSocket(s);
    return () => {
      s.disconnect();
    };
  }, []);

  /* other functions below */
}
TL;DR
https://googledocs-clone-sbayrak.netlify.app/ is not an origin. Drop that trailing slash.
More details about the problem
No trailing slash allowed in the value of the Origin header
According to the CORS protocol (specified in the Fetch standard), browsers never set the Origin request header to a value with a trailing slash. Therefore, if a page at https://googledocs-clone-sbayrak.netlify.app/whatever issues a cross-origin request, that request's Origin header will contain
https://googledocs-clone-sbayrak.netlify.app
without any trailing slash.
Byte-by-byte comparison on the server side
You're using Socket.IO, which relies on the Node.js cors package. That package won't set any Access-Control-Allow-Origin header in the response if the request's origin doesn't exactly match your CORS configuration's origin value (https://googledocs-clone-sbayrak.netlify.app/).
Putting it all together
Obviously,
'https://googledocs-clone-sbayrak.netlify.app' ===
'https://googledocs-clone-sbayrak.netlify.app/'
evaluates to false, which causes the cors package not to set any Access-Control-Allow-Origin header in the response, which causes the CORS check to fail in your browser, hence the CORS error you observed.
Example from the Fetch Standard
Section 3.2.5 of the Fetch Standard even provides an enlightening example of this mistake,
Access-Control-Allow-Origin: https://rabbit.invalid/
and explains why it causes the CORS check to fail:
A serialized origin has no trailing slash.
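For reference, here is a minimal sketch of the fix under the same setup as the question; the only change from the code above is dropping the trailing slash from the origin value:
const io = socketio(server, {
  cors: {
    // a serialized origin never has a trailing slash
    origin: 'https://googledocs-clone-sbayrak.netlify.app',
    methods: ['GET', 'POST'],
  },
});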
Looks like you haven't imported the cors package. Is it imported anywhere else?
var cors = require('cors') // is missing
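For completeness, a minimal sketch of how the import and the middleware registration fit together, assuming the rest of server.js stays as in the question:
const cors = require('cors'); // the missing import
const express = require('express');

const app = express();
app.use(cors()); // cors is now defined when this line runs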
I'm currently working on a project that requires me to send a PNG image from Unreal Engine to a Next.js server, which uses multer to pass the file on to another server.
When I send the file as binary, the JS server (the intermediate server) does not receive a file from Unreal.
I've tried the two following methods:
TArray<uint8> rawFileData;
FFileHelper::LoadFileToArray(rawFileData, *media);
Request->SetURL(API_HP_URL + "nude_upload");
Request->SetHeader(TEXT("Content-Type"), TEXT("multipart/form-data; boundary=----WebKitFormBoundarywpp9S2IUDici8hpI"));
Request->SetHeader(TEXT("Connection"), TEXT("keep-alive"));
Request->SetHeader(TEXT("accept"), TEXT("application/json, text/plain, */*"));
Request->SetContent(rawFileData);
Request->SetVerb("POST");
Request->OnProcessRequestComplete().BindUObject(this, &AHttpCommunicator::OnPostNudeSSResponse);
Request->ProcessRequest();
and
FString JsonString;
TArray<uint8> rawFileData;
TSharedRef<TJsonWriter<TCHAR>> JsonWriter = JsonWriterFactory<TCHAR>::Create(&JsonString);
JsonWriter->WriteObjectStart();
JsonWriter->WriteValue("fileName", pPathToFile);
JsonWriter->WriteValue("file", FBase64::Encode(rawFileData));
JsonWriter->WriteObjectEnd();
JsonWriter->Close();
Request->SetURL(API_HP_URL + "nude_upload");
Request->SetHeader(TEXT("Content-Type"), TEXT("multipart/form-data; boundary=----WebKitFormBoundarywpp9S2IUDici8hpI"));
Request->SetHeader(TEXT("Connection"), TEXT("keep-alive"));
Request->SetHeader(TEXT("accept"), TEXT("application/json, text/plain, */*"));
Request->SetContentAsString(JsonString);
Request->SetVerb("POST");
Request->OnProcessRequestComplete().BindUObject(this, &AHttpCommunicator::OnPostNudeSSResponse);
Request->ProcessRequest();
Both of these methods leave the server with an undefined file object.
// Next.js API route support: https://nextjs.org/docs/api-routes/introduction
import path from 'path';
import MulterGoogleCloudStorage from "multer-google-storage";
import nextConnect from 'next-connect';

const Multer = require('multer');
const { Storage } = require('@google-cloud/storage');

const CLOUD_BUCKET = 'nude_locks';
const PROJECT_ID = 'hp-production-338902';
const KEY_FILE = path.resolve('./hp-production-key.json')

const storage = new Storage({
  projectId: PROJECT_ID,
  keyFilename: KEY_FILE
});
const bucket = storage.bucket(CLOUD_BUCKET);

const upload = Multer({
  storage: Multer.memoryStorage(),
  limits: {
    fileSize: 5 * 1024 * 1024,
  }
}).single('file');

const apiRoute = nextConnect({
  onNoMatch(req, res) {
    res.status(405).json({ error: `Method '${req.method}' Not Allowed` });
  },
});

apiRoute.use(upload);

apiRoute.post((req, res) => {
  console.log(req.file);
  if (!req.file) {
    res.status(400).send("No file uploaded.");
    return;
  }

  const blob = bucket.file(req.file.originalname);
  // Make sure to set the contentType metadata for the browser to be able
  // to render the image instead of downloading the file (default behavior)
  const blobStream = blob.createWriteStream({
    metadata: {
      contentType: req.file.mimetype
    }
  });

  blobStream.on("error", err => {
    next(err);
    return;
  });

  blobStream.on("finish", () => {
    console.log('finish');
    console.log(blob);
    // The public URL can be used to directly access the file via HTTP.
    const publicUrl = `https://storage.googleapis.com/${bucket.name}/${blob.name}`;
    // Make the image public to the web (since we'll be displaying it in browser)
    blob.makePublic().then(() => {
      res.status(200).send(`Success!\n Image uploaded to ${publicUrl}`);
    });
  });

  blobStream.end(req.file.buffer);
});

export default apiRoute;

export const config = {
  api: {
    bodyParser: false,
  },
}
const fileSelectedHandler = e => {
  const file = new File("D:/_Spectre/VHS/P210107_VHS_Configurator/P04_Unreal/Human_Configurator/Saved/Screenshots/Windows/-1-nude-2022-2-13.png");
  console.log(file);

  const formData = new FormData();
  formData.append('file', file);

  axios.post('/api/nude_upload', formData, {
    headers: {
      'Content-Type': 'multipart/form-data',
    }
  })
    .then(res => {
      console.log(res);
    });
}
Is there a way to create a file object from UE4?
Alternatively, is there a way to retrieve Google Cloud Storage access tokens from UE4?
I have a GraphQL server connected to my Firebase Realtime Database and deployed on Heroku.
When I run my server on Heroku, the request to the reservations resolver takes forever, and eventually the Playground yells Unexpected token < in JSON at position 0.
I suspect this is a timeout from Firebase, but how would I go about debugging this? (Heroku logs nothing about the error.)
You can try the server for yourself: https://filex-database.herokuapp.com
The specific query that's causing me trouble is:
query {
  reservations {
    code
    name
  }
}
const db = require("../datasources/db");
const masterlist = require("../datasources/masterlist.js");
const getById = (key: string, id: string) =>
  db[key].filter((item) => item.id === id)[0];

const firebaseQuery = (context: { firebaseClient }, endpoint: string) => {
  const finalEndpoint =
    endpoint.charAt(0) === "/" ? endpoint : "/".concat(endpoint);
  const baseUrl = "/workshops";
  return context.firebaseClient
    .database()
    .ref(`${baseUrl}${finalEndpoint}`)
    .once("value")
    .then((snapshot) => snapshot.val());
};

const Query = {
  workshops: () => db.workshops,
  workshop: (_, args) => getById("workshops", args.id),
  options: () => db.options,
  option: (_, args) => getById("options", args.id),
  // this resolver is causing me trouble
  reservations: async (_, __, context) => {
    const data = await firebaseQuery(context, "/applicants");
    return Object.values(data);
  },
  reservation: async (_, args, context) => {
    const data = await firebaseQuery(context, `/applicants/${args.id}`);
    return data;
  },
};

module.exports = { Query };
EDIT: I made another simple server using the same technology, delivering only that resolver, and it also times out (everything works fine locally, though):
http://apollo-testing-gonzo.herokuapp.com
Situation
I have a firebase function that updates the user image.
Problem
When I run my function locally using firebase serve, I successfully upload the image to Firestore using Postman. However, when I run firebase deploy and try to upload the image using Postman, I get a 500 Internal Server Error. The other functions (which don't deal with FormData, just JSON) work perfectly when I deploy them.
I don't understand why it works locally but not when deployed, since I am doing the exact same thing. Not sure if there is something in the config I am missing, or if I am doing something wrong. Any help would be appreciated!
Code
users.js
const { admin, db, firebase } = require('../util/admin');
const config = require('../util/config');
exports.postUserImage = (req, res) => {
  const BusBoy = require('busboy');
  const path = require('path');
  const os = require('os');
  const fs = require('fs');

  let imgFileName;
  let imgToBeUploaded = {};

  const busboy = new BusBoy({ headers: req.headers });

  busboy.on('file', (fieldname, file, filename, encoding, mimetype) => {
    // Invalid file type
    if (mimetype !== 'image/jpeg' && mimetype !== 'image/png') {
      return res.status(400).json({ error: 'Invalid file type' });
    }

    // Extract img extension
    const imgDotLength = filename.split('.').length;
    const imgExtension = filename.split('.')[imgDotLength - 1];
    // Create img file name
    imgFileName = `${Math.round(Math.random() * 1000000)}.${imgExtension}`;
    // Create img path
    const filepath = path.join(os.tmpdir(), imgFileName);
    // Create img object to be uploaded
    imgToBeUploaded = { filepath, mimetype };
    // Use file system to create the file
    file.pipe(fs.createWriteStream(filepath));
  });

  busboy.on('finish', () => {
    admin
      .storage()
      .bucket()
      .upload(imgToBeUploaded.filepath, {
        resumable: false,
        metadata: {
          metadata: {
            contentType: imgToBeUploaded.mimetype
          }
        }
      })
      .then(() => {
        // Create img url to add to our user
        const imgUrl = `https://firebasestorage.googleapis.com/v0/b/${config.storageBucket}/o/${imgFileName}?alt=media`;
        // Add img url to user document
        return db.doc(`/users/${req.user.handle}`).update({ imgUrl });
      })
      .then(() => {
        return res.json({ message: 'Image uploaded successfully' });
      })
      .catch((err) => {
        console.error(err);
        return res.status(500).json({ error: err });
      });
  });

  busboy.end(req.rawBody);
};
index.js
const { app, functions } = require('./util/admin');
const FirebaseAuth = require('./util/firebaseAuth');
const {
  postUserImage,
} = require('./handlers/users');
app.post('/user/image', FirebaseAuth, postUserImage);
I would like to call an asynchronous function outside the Lambda handler with the following code:
var client;

(async () => {
  var result = await initSecrets("MyWebApi");
  var secret = JSON.parse(result.Payload);
  client = new MyWebApiClient(secret.API_KEY, secret.API_SECRET);
});

async function initSecrets(secretName) {
  var input = {
    "secretName": secretName
  };
  var result = await lambda.invoke({
    FunctionName: 'getSecrets',
    InvocationType: "RequestResponse",
    Payload: JSON.stringify(input)
  }).promise();
  return result;
}

exports.handler = async function (event, context) {
  var myReq = await client('Request');
  console.log(myReq);
};
The 'client' does not get initialized. The same code works perfectly if executed within the handler.
initSecrets contains a Lambda invocation of getSecrets(), which calls AWS Secrets Manager.
Does anyone have an idea how asynchronous functions can be properly called for initialization purposes outside the handler?
Thank you very much for your support.
I ran into a similar issue trying to get Next.js to work with aws-serverless-express.
I fixed it by doing the below (I'm using TypeScript, so just ignore the :any type bits).
const appModule = require('./App');

let server: any = undefined;
appModule.then((expressApp: any) => {
  server = createServer(expressApp, null, binaryMimeTypes);
});

function waitForServer(event: any, context: any) {
  setImmediate(() => {
    if (!server) {
      waitForServer(event, context);
    } else {
      proxy(server, event, context);
    }
  });
}

exports.handler = (event: any, context: any) => {
  if (server) {
    proxy(server, event, context);
  } else {
    waitForServer(event, context);
  }
}
So for your code maybe something like
var client = undefined;

initSecrets("MyWebApi").then(result => {
  var secret = JSON.parse(result.Payload);
  client = new MyWebApiClient(secret.API_KEY, secret.API_SECRET);
});

function waitForClient() {
  setImmediate(() => {
    if (!client) {
      waitForClient();
    } else {
      client('Request');
    }
  });
}

exports.handler = async function (event, context) {
  if (client) {
    client('Request');
  } else {
    waitForClient();
  }
};
client is being called before it has initialised; the client var is being "exported" (and called) before the async function would have completed. When you call await client(), client is still undefined.
Edit: try something like this:
var client = async which => {
  var result = await initSecrets("MyWebApi");
  var secret = JSON.parse(result.Payload);
  let api = new MyWebApiClient(secret.API_KEY, secret.API_SECRET);
  return api(which); // assuming api class is returning a promise
};

async function initSecrets(secretName) {
  var input = {
    "secretName": secretName
  };
  var result = await lambda.invoke({
    FunctionName: 'getSecrets',
    InvocationType: "RequestResponse",
    Payload: JSON.stringify(input)
  }).promise();
  return result;
}

exports.handler = async function (event, context) {
  var myReq = await client('Request');
  console.log(myReq);
};
This can also be solved with async/await, given Node v8+.
You can load your configuration in a module like so...
const fetch = require('node-fetch');

module.exports = async () => {
  const config = await fetch('https://cdn.jsdelivr.net/gh/GEOLYTIX/public/z2.json');
  return await config.json();
}
Then declare a _config outside the handler by requiring/executing the config module. Your handler must be an async function. _config will be a promise at first, which you must await to resolve into the configuration object.
const _config = require('./config')();

module.exports = async (req, res) => {
  const config = await _config;
  res.send(config);
}
Ideally, you want your initialization code to run during the initialization phase rather than the invocation phase of the Lambda, to minimize cold start times. Synchronous code at module level runs at initialization time, and AWS recently added top-level await support in Node 14 and newer Lambdas: https://aws.amazon.com/blogs/compute/using-node-js-es-modules-and-top-level-await-in-aws-lambda/. Using this, you can make the init phase wait for your async initialization code via top-level await, like so:
const sleep = ms => new Promise(resolve => setTimeout(resolve, ms));

console.log("start init");
await sleep(1000);
console.log("end init");

export const handler = async (event) => {
  return {
    statusCode: 200,
    body: JSON.stringify('Hello from Lambda!'),
  };
};
This works great if you are using ES modules. If for some reason you are stuck with CommonJS (e.g. because your tooling, like jest or ts-node, doesn't yet fully support ES modules), then you can make your CommonJS module look like an ES module by having it export a Promise that waits on your initialization, rather than exporting an object. Like so:
const sleep = ms => new Promise(resolve => setTimeout(resolve, ms));

const main = async () => {
  console.log("start init");
  await sleep(1000);
  console.log("end init");

  const handler = async (event) => {
    return {
      statusCode: 200,
      body: JSON.stringify('Hello from Lambda!'),
    };
  };

  return { handler };
};

// note we aren't exporting main here, but rather the result
// of calling main() which is a promise resolving to {handler}:
module.exports = main();