Retrieving an external image and storing it in CollectionFS - Meteor

Alright, so I have an image at a URL held in the variable path, and I want to download it and store it in CollectionFS. How can I get/create the image buffer from the response?
var request = Npm.require('request').defaults({encoding: null});
request.get(path, function (err, res, body) {
  return CollectionFS.storeBuffer(res.request.uri.path, body, {});
});
It appears to store successfully, with the correct size. But when I try to view the image it appears to be corrupted.

Answering my own question. I had two issues with the above code. Firstly, Meteor doesn't like it when you touch the database inside an asynchronous callback (the code is no longer running in a fiber), so I added a future to wait until the callback was done.
Secondly, the default encoding on storeBuffer is utf-8, which is a problem for binary data such as images. Changing it to binary solved my problem.
var Future = Npm.require("fibers/future");
var pre = new Future();
var preOnComplete = pre.resolver();
var request = Npm.require('request').defaults({encoding: null});

request.get(path, function (err, res, body) {
  return preOnComplete(err, res, body);
});

// Block this fiber until the request callback has fired.
Future.wait(pre);

return EventImages.storeBuffer(pre.value.request.href, pre.value.body, {
  contentType: 'image/jpeg',
  metadata: { eventId: eventId },
  encoding: 'binary'
});
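For what it's worth, the same pattern can be written more compactly with Meteor.wrapAsync, which handles the future plumbing internally. A minimal sketch of that alternative, assuming a Meteor release recent enough to have wrapAsync:
var request = Npm.require('request').defaults({encoding: null});

// Wrap request.get so it can be called synchronously inside a fiber;
// wrapAsync hands back the second callback argument (the response).
var getSync = Meteor.wrapAsync(request.get, request);

var res = getSync(path); // res.body is a Buffer because encoding is null

return EventImages.storeBuffer(res.request.href, res.body, {
  contentType: 'image/jpeg',
  metadata: { eventId: eventId },
  encoding: 'binary'
});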

Related

Issue with google bucket image upload in a Cypress test

I'm trying to create a Cypress test that involves uploading an image directly to a Google storage bucket. I'm following this Stack Overflow suggestion:
Upload File with Cypress.io via Request
and I have it to the point where the image file gets uploaded to the bucket without any errors during the upload. However, when subsequently downloading the file from Google, it's apparent that the image contents were corrupted.
The Stack Overflow example uses Cypress.Blob.base64StringToBlob() to create a blob to be uploaded. That method comes from this third-party library, which is bundled with Cypress:
https://github.com/nolanlawson/blob-util
Here's the code that I currently am working with:
Cypress.Commands.add('fileUpload', (url, fileName) => {
  const method = 'PUT';
  const fileType = 'image/png';
  cy.fixture(fileName).as('bin');
  // Retrieve the fixture via its alias ('@bin'), not a CSS id selector.
  cy.get('@bin').then((bin) => {
    // File in binary format gets converted to blob
    // so it can be sent as Form data
    const blob = Cypress.Blob.base64StringToBlob(bin, fileType);
    // Build up the form
    const formData = new FormData();
    formData.set('file', blob, fileName);
    // Perform the request
    cy.rawFormRequest(method, url, formData, (response) => {
      cy.log(JSON.stringify(response));
    });
  });
});
Cypress.Commands.add('rawFormRequest', (method, url, formData, done) => {
  const xhr = new XMLHttpRequest();
  xhr.open(method, url);
  xhr.setRequestHeader('Content-Type', 'image/png');
  xhr.setRequestHeader('Access-Control-Allow-Origin', '*');
  xhr.setRequestHeader('Access-Control-Allow-Headers',
    'Origin, X-Requested-With, Content-Type, Accept');
  xhr.setRequestHeader('Access-Control-Allow-Methods',
    'GET, POST, PUT, DELETE, PATCH, OPTIONS');
  xhr.onload = function () {
    cy.log('success');
    done(xhr);
  };
  xhr.onerror = function (error) {
    cy.log('error');
    cy.log(JSON.stringify(error));
    done(xhr);
  };
  xhr.send(formData);
});
(Screenshot of the original image omitted.) But when I then go into the Google Storage GUI, it's apparent the image hasn't been uploaded cleanly.
I tried two of the other methods in the blob-util library, Cypress.Blob.arrayBufferToBlob() and Cypress.Blob.imgSrcToBlob(), but again, although the file uploads without errors, downloading it or viewing it in the Google Cloud GUI makes it apparent that the file contents weren't uploaded cleanly.
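One thing worth double-checking (not confirmed in this thread) is the fixture encoding: base64StringToBlob() expects a base64 string, so reading the fixture explicitly as base64 removes any ambiguity about what cy.fixture() returns for a binary file. A sketch of that variation:
Cypress.Commands.add('fileUpload', (url, fileName) => {
  cy.fixture(fileName, 'base64').then((b64) => {
    // b64 is guaranteed to be a base64-encoded string of the file bytes.
    const blob = Cypress.Blob.base64StringToBlob(b64, 'image/png');
    const formData = new FormData();
    formData.set('file', blob, fileName);
    cy.rawFormRequest('PUT', url, formData, (response) => {
      cy.log(JSON.stringify(response));
    });
  });
});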

Loading image from http request

I'm developing a profile card that has to fetch several values. I'm getting all the values, but I also want to load a network image. I'm using a FileMaker server, and I noticed that I needed cookies to load this. When I copy and paste the image URL into my browser, it just loads. But whenever I load it in my application, I get a 401 status code with my image.
I have tried a valid network image that works, and I have read something about cookies, but I'm not sure whether I need to use them or how. I also find it weird that the image loads in my browser but not in my application.
Future makeRequest() async {
  var url4 =
      "https://fms.xxxxxx.nl/fmi/data/v1/databases/Roscom Management Systeem/layouts/medewerker pa api/_find";
  var body = json.encode({
    "query": [
      {"Emailadres(1)": "xxxx#xxx.nl"}
    ],
  });
  Map<String, String> headers = {
    'Content-type': 'application/json',
    'Accept': 'application/json',
    "Authorization":
        'Bearer xxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxx'
  };
  var response = await http.post(url4, body: body, headers: headers);
  setState(() {
    var responseJson = json.decode(response.body);
    data = responseJson['response']['data'][0];
    profielfoto = data['fieldData']['iMedewerker Foto'];
    print(profielfoto);
  });
}
(Screenshot of the value printed in the terminal omitted.)
I expect that I can load the image into a NetworkImage with just the profielfoto variable. I don't know what to do with the cookies, or maybe there's a much easier way to do it. I hope someone can help me; please let me know if I need to provide more information about the server or anything else. ;)
A few things. First, please do not put any kind of heavy processing inside setState:
https://docs.flutter.io/flutter/widgets/State/setState.html
Generally it is recommended that the setState method only be used to
wrap the actual changes to the state, not any computation that might
be associated with the change. For example, here a value used by the
build function is incremented, and then the change is written to disk,
but only the increment is wrapped in the setState:
setState tells the framework that the widget needs to be redrawn.
https://flutter.dev/docs/cookbook/images/network-image
String _profilePhoto = "";

// Change await into an async operation
http.post(url4, body: body, headers: headers).then((response) {
  var responseJson = json.decode(response.body);
  var data = responseJson['response']['data'][0];
  print(data['fieldData']['iMedewerker Foto']);
  setState(() {
    // Wrap only the actual state changes in setState.
    _data = data;
    _profilePhoto = data['fieldData']['iMedewerker Foto'];
  });
});
Widget build(BuildContext context) {
  // Check whether the photo URL has arrived yet
  if (_profilePhoto == "") {
    return CircularProgressIndicator();
  } else {
    return Image.network(_profilePhoto);
  }
}
You have two choices for grabbing images from the network; I presented one of them above:
https://flutter.dev/docs/cookbook/images/network-image
https://pub.dartlang.org/packages/cached_network_image
The authorization token is likely the issue. A secure server returns this token when a user logs in, and every subsequent request must carry it in the Authorization header. If the token is missing or incorrect, the server returns a 401.
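As an aside, Flutter's Image.network accepts an optional headers map, so the token can travel with the image request itself. For comparison, here is the same authenticated fetch in Node style, matching the other snippets on this page; imageUrl and token are hypothetical placeholders:
var request = require('request').defaults({encoding: null});

// imageUrl and token are placeholders: the URL returned by the API and
// the bearer token obtained at login.
request.get({
  url: imageUrl,
  headers: { Authorization: 'Bearer ' + token }
}, function (err, res, body) {
  if (err) throw err;
  if (res.statusCode === 401) {
    console.error('Token missing or expired; the server rejected the request.');
  } else {
    console.log('Fetched image, ' + body.length + ' bytes');
  }
});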

Webhook for Mailgun POST?

I am trying to store email messages as JSON (as parsed by Mailgun) in a Mongo.Collection through a Mailgun webhook. I set up an iron-router server-side route to handle the request, but this.request.body is empty. I am using Mailgun's "Send A Sample POST" to send the request, and the POST looks fine using e.g. http://requestb.in/. I was hoping that request.body would have the data, as mentioned in How do I access HTTP POST data from meteor?. What am I doing wrong?
Router.map(function () {
  this.route('insertMessage', {
    where: 'server',
    path: '/api/insert/message',
    action: function () {
      var req = this.request;
      var res = this.response;
      console.log(req.body);
      ...
I'm not sure that is the right syntax. Have you tried using Router.route?
Router.route('insertMessage',
  function () {
    // NodeJS request object
    var request = this.request;
    // NodeJS response object
    var response = this.response;
    console.log("========= request: =============");
    console.log(request);
    // EDIT: also check out this.params object
    console.log("========= this.params: =============");
    console.log(this.params);
    // EDIT 2: close the response. oops.
    return response.end();
  },
  {
    where: 'server',
    path: '/api/insert/message'
  }
);
I think the issue is that Mailgun sends a multipart POST request, i.e. it sends "fields" as well as "files" (attachments), and iron-router does not set up a body parser for multipart requests. This issue is discussed here and here on iron-router's GitHub Issues. I found this comment particularly helpful, and now I can parse Mailgun's sample POST properly.
To get this working, in a new Meteor project, I did
$ meteor add iron:router
$ meteor add meteorhacks:npm
In a root-level packages.json I have
{
"busboy": "0.2.9"
}
which, using the meteorhacks:npm package, makes the "busboy" npm package available for use on the server via Meteor.npmRequire.
Finally, in a server/rest-api.js I have
Router.route('/restful', { where: 'server' })
  .post(function () {
    var msg = this.request.body;
    console.log(msg);
    console.log(_.keys(msg));
    this.response.end('post request\n');
  });
var Busboy = Meteor.npmRequire("busboy");

Router.onBeforeAction(function (req, res, next) {
  if (req.method === "POST") {
    var body = {}; // Store body fields and then pass them to the request.
    var busboy = new Busboy({ headers: req.headers });
    busboy.on("field", function (fieldname, value) {
      body[fieldname] = value;
    });
    busboy.on("finish", function () {
      // Done parsing the form
      req.body = body;
      next();
    });
    req.pipe(busboy);
  } else {
    next(); // Let non-POST requests pass through untouched.
  }
});
In this way I can ignore files (i.e., I don't have a busboy.on("file", ...) handler) and have a this.request.body available in my routes containing all the POST fields as JSON.
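To exercise the route without waiting for Mailgun, the multipart POST can be simulated from a short Node script. This is only a sketch: the field names merely resemble Mailgun's parsed-message fields, and the port assumes a default Meteor dev server.
var request = require('request');

// Simulate Mailgun's multipart POST against the local route.
request.post({
  url: 'http://localhost:3000/restful',
  formData: {
    sender: 'alice@example.com',
    subject: 'Test message',
    'body-plain': 'Hello from the fake webhook'
  }
}, function (err, res, body) {
  if (err) return console.error(err);
  console.log(res.statusCode, body); // expect 200 and 'post request'
});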

Express / NodeJS Can't send headers after they are sent caused by http requests

First time working with NodeJS (yes, it's awesome), and also using Express. I've got the web app/service working great, but I run into problems when trying to make more than one HTTP request. Here's a video of how the app causes two HTTP requests - http://screencast.com/t/yFKdIajs0XD - as you can see, I click on 'articles' and it loads an RSS feed, then click 'videos' and it loads a YouTube feed. Both work just fine, but after the second call is made it throws an exception. I get the following when I attempt two separate HTTP requests using node's http module:
http.js:527
throw new Error("Can't set headers after they are sent.");
^
Error: Can't set headers after they are sent.
at ServerResponse.<anonymous> (http.js:527:11)
at ServerResponse.setHeader (/Users/rickblalock/node/auti_node/node_modules/express/node_modules/connect/lib/patch.js:47:22)
at /Users/rickblalock/node/auti_node/node_modules/express/node_modules/connect/lib/middleware/errorHandler.js:72:19
at [object Object].<anonymous> (fs.js:107:5)
at [object Object].emit (events.js:61:17)
at afterRead (fs.js:878:12)
at wrapper (fs.js:245:17)
Sample code provided here:
Using my controller module to render the request - http://pastie.org/2317698
One of the tabs (article tab) - the video code is identical minus referencing a video feed: http://pastie.org/2317731
Try using the "end" event, not "data", like this:
// Module-level cache: the feed is fetched once, then shared across
// all requests for the lifetime of the process.
var data = "";

app.get('/', function (req, res) {
  var options = {
    host: 'www.engadget.com', // bare hostname, no scheme
    path: '/rss.xml',
    method: 'GET'
  };
  if (data === "") {
    var myReq = http.request(options, function (myRes) {
      myRes.setEncoding('utf8');
      myRes.on('data', function (chunk) {
        console.log("request on data ");
        data += chunk;
      });
      myRes.on('end', function () {
        console.log("request on END");
        // Render exactly once, after the proxied response has ended.
        res.render('index', {
          title: 'Express',
          data: data
        });
      });
    });
    myReq.write('data\n');
    myReq.end();
  } else {
    res.render('index', {
      title: 'Express',
      data: data
    });
  }
});
Old answer:
I also think that this is the culprit:
var req = http.request(options, function (res) {
  res.setEncoding('utf8');
  res.on('data', function (chunk) {
    parseArticle(chunk);
  });
});
req.write('data\n');
req.end();
The first line is async, so everything inside the callback runs after you have already called req.write() and req.end().
Put those two lines into the callback.

Get the whole response body when the response is chunked?

I'm making an HTTP request and listening for "data":
response.on("data", function (data) { ... })
The problem is that the response is chunked so the "data" is just a piece of the body sent back.
How do I get the whole body sent back?
request.on('response', function (response) {
  var body = '';
  response.on('data', function (chunk) {
    body += chunk;
  });
  response.on('end', function () {
    console.log('BODY: ' + body);
  });
});
request.end();
Over at https://groups.google.com/forum/?fromgroups=#!topic/nodejs/75gfvfg6xuc, Tane Piper provides a good solution very similar to scriptfromscratch's, but for the case of a JSON response:
request.on('response', function (response) {
  var data = [];
  response.on('data', function (chunk) {
    data.push(chunk);
  });
  response.on('end', function () {
    var result = JSON.parse(data.join(''));
    // Note: a return from inside an event handler has no receiver;
    // hand the result to a callback instead.
    return result;
  });
});
This addresses the issue that OP brought up in the comments section of scriptfromscratch's answer.
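Since that return inside the 'end' handler goes nowhere, a common way to hand the parsed result back to the caller is to wrap the exchange in a Promise. A minimal sketch, assuming a standard http.request options object:
var http = require('http');

function getJson(options) {
  return new Promise(function (resolve, reject) {
    var req = http.request(options, function (res) {
      var chunks = [];
      res.on('data', function (chunk) { chunks.push(chunk); });
      res.on('end', function () {
        try {
          resolve(JSON.parse(Buffer.concat(chunks).toString('utf8')));
        } catch (e) {
          reject(e); // the body was not valid JSON
        }
      });
    });
    req.on('error', reject);
    req.end();
  });
}

// Usage: getJson({ host: 'example.com', path: '/api' }).then(console.log);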
I've never worked with the HTTP client library, but since it works just like the server API, try something like this:
var data = '';
response.on('data', function (chunk) {
  // append the chunk to your data
  data += chunk;
});
response.on('end', function () {
  // work with your data var
});
See the Node.js docs for reference.
In order to support the full spectrum of possible HTTP applications, Node.js's HTTP API is very low-level, so data is received chunk by chunk, not as a whole.
There are two approaches you can take to this problem:
1) Collect data across multiple "data" events and append the results
together prior to printing the output. Use the "end" event to determine
when the stream is finished and you can write the output.
var http = require('http');

http.get('some/url', function (resp) {
  var respContent = '';
  resp.on('data', function (data) {
    respContent += data.toString(); // data is a Buffer instance
  });
  resp.on('end', function () {
    console.log(respContent);
  });
}).on('error', console.error);
2) Use a third-party package to abstract the difficulties involved in
collecting an entire stream of data. Two different packages provide a
useful API for solving this problem (there are likely more!): bl (Buffer
List) and concat-stream; take your pick!
var http = require('http');
var bl = require('bl');

http.get('some/url', function (response) {
  response.pipe(bl(function (err, data) {
    if (err) {
      return console.error(err);
    }
    data = data.toString();
    console.log(data);
  }));
}).on('error', console.error);
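The answer names concat-stream as the second option but only shows bl; for completeness, the concat-stream version looks roughly like this (its callback receives only the data, so stream errors need their own handler):
var http = require('http');
var concat = require('concat-stream');

http.get('some/url', function (response) {
  response.on('error', console.error);
  response.pipe(concat(function (data) {
    // data is a single Buffer holding the whole body
    console.log(data.toString());
  }));
}).on('error', console.error);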
The reason it's messed up is that you need to call JSON.parse(data.toString()); data is a Buffer, so you can't parse it directly.
If you don't mind using the request library
var request = require('request');

request('http://www.google.com', function (error, response, body) {
  if (!error && response.statusCode === 200) {
    console.log(body); // Print the Google web page.
  }
});
If you are dealing with non-ASCII content (especially Chinese/Japanese/Korean characters, whatever their encoding), you'd better not treat the chunks passed to the response.on('data') event as strings directly.
Concatenate them as byte buffers and decode them only in response.on('end') to get the correct result.
// Snippet in TypeScript syntax:
//
// Assuming that the server-side will accept the "test_string" you post, and
// respond with a string that concatenates the content of "test_string" many
// times, so that it triggers the on("data") event multiple times.
//
const data2Post = '{"test_string": "swamps/沼泽/沼澤/沼地/늪"}';
const postOptions = {
  hostname: "localhost",
  port: 5000,
  path: "/testService",
  method: 'POST',
  headers: {
    'Content-Type': 'application/json',
    // Do not use data2Post.length on a CJK string: it counts characters,
    // not bytes, and yields an improper 'Content-Length'.
    'Content-Length': Buffer.byteLength(data2Post)
  },
  timeout: 5000
};

let body: string = '';
let body_chunks: Array<Buffer> = [];
let body_chunks_bytelength: number = 0; // Used to terminate the connection on a too-large POST response if you need.

let postReq = http.request(postOptions, (res) => {
  console.log(`statusCode: ${res.statusCode}`);
  res.on('data', (chunk: Buffer) => {
    body_chunks.push(chunk);
    body_chunks_bytelength += chunk.byteLength;
    // Debug print. A chunk may end mid-character, so decoding here can be
    // incorrect; it only demonstrates the difference from the final result
    // assembled in the res.on("end") event.
    console.log("Partial body: " + chunk.toString("utf8"));
    // Terminate the connection in case the POST response is too large. (10*1024*1024 = 10MB)
    if (body_chunks_bytelength > 10 * 1024 * 1024) {
      postReq.connection.destroy();
      console.error("Too large POST response. Connection terminated.");
    }
  });
  res.on('end', () => {
    // Decode the correctly concatenated response data.
    let mergedBodyChunkBuffer: Buffer = Buffer.concat(body_chunks);
    body = mergedBodyChunkBuffer.toString("utf8");
    console.log("Body using chunk: " + body);
    console.log(`body_chunks_bytelength=${body_chunks_bytelength}`);
  });
});

// Send the request body and finish the request.
postReq.write(data2Post);
postReq.end();
What about HTTPS chunked responses? I've been trying to read a response from an API that responds over HTTPS with a Transfer-Encoding: chunked header. Each chunk is a Buffer, but when I concat them all together and try converting to a string with UTF-8, I get weird characters.
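The Buffer.concat approach above applies unchanged over HTTPS; only the module differs. A minimal sketch (the URL is a placeholder):
var https = require('https');

https.get('https://example.com/api', function (res) {
  var chunks = [];
  res.on('data', function (chunk) { chunks.push(chunk); });
  res.on('end', function () {
    // Decode once, after all chunks have arrived, so multi-byte characters
    // split across chunk boundaries decode correctly.
    console.log(Buffer.concat(chunks).toString('utf8'));
  });
}).on('error', console.error);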
