Downloading Blob from Docusign Envelopes API - meteor

Using Meteor HTTP I'm able to get a response from docusign and convert to a base64 buffer.
try {
  const response = HTTP.get(`${baseUrl}/envelopes/${envelopeId}/documents/1`, {
    headers: {
      "Authorization": `bearer ${token}`,
      "Content-Type": "application/json",
    },
  });
  const buffer = new Buffer(response.content).toString('base64');
  return buffer;
} catch (e) {
  console.log(e);
  throw new Meteor.Error(e.reason);
}
Then on the client I'm using FileSaver.js to saveAs a blob created from an ArrayBuffer via this function
function _base64ToArrayBuffer(base64) {
  const binary_string = window.atob(base64);
  const len = binary_string.length;
  const bytes = new Uint8Array(len);
  for (let i = 0; i < len; i++) {
    bytes[i] = binary_string.charCodeAt(i);
  }
  return bytes.buffer;
}
// template event handler
'click [data-action="download"]'(e, tmpl) {
  const doc = this;
  return Meteor.call('downloadPDF', doc, (err, pdf) => {
    if (err) {
      return notify({
        message: err,
        timeout: 3000,
      });
    }
    const pdfBuffer = pdf && _base64ToArrayBuffer(pdf);
    console.log(pdfBuffer);
    return saveAs(new Blob([pdfBuffer], { type: 'application/pdf' }), `docusign_pdf.pdf`);
  });
},
The PDF is downloading with the correct size and page length, but all the pages are blank. Should I be encoding the buffer differently? Is there something else I'm missing?

When uploading documents to DocuSign you can either send the raw document bytes as part of a multipart/form-data request, or send the document as a base64-encoded file in the document node of your request body.
However, once the envelope is complete, the DocuSign platform converts the document into a PDF (if it wasn't one already) and serves that raw file. As such, you shouldn't have to base64-encode and then decode the document at all, so I would try removing that part of your code.
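If you want to keep the Meteor HTTP call, here is a sketch of a binary-safe version of the method. It assumes the same baseUrl and token variables from the question, passes the envelope ID directly rather than the doc object, and relies on the http package's npmRequestOptions passthrough (server only), which hands options to the underlying request library; with encoding: null the body should come back as a raw Buffer instead of a string:
import { Meteor } from 'meteor/meteor';
import { HTTP } from 'meteor/http';

Meteor.methods({
  downloadPDF(envelopeId) {
    const response = HTTP.get(`${baseUrl}/envelopes/${envelopeId}/documents/1`, {
      headers: { Authorization: `bearer ${token}` },
      // Server only: forwarded to the request npm module; encoding: null
      // keeps the body as raw bytes instead of decoding it into a string.
      npmRequestOptions: { encoding: null },
    });
    // EJSON serializes binary data as a Uint8Array, so the client callback
    // can feed the result straight into new Blob([pdf], { type: 'application/pdf' })
    // with no base64 round trip.
    return new Uint8Array(response.content);
  },
});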

Related

Display base64 image with Deno's Oak package

I've got a base64-encoded image stored in my database, so it starts like this: data:image/png;base64,iVB.. and I return it from the database at my 'image' endpoint.
I could of course just fetch the content using JavaScript and include the base64 string as the img source, and it would render just fine, but I also want to be able to view the image when I visit the endpoint in my browser. I've tried a combination of different headers but couldn't get it to work.
How do I render a base64 image using Oak?
EDIT:
This is the code I ended up using, for anyone interested!
const getImage = async (
  { params, response }: { params: { id: string }; response: Response },
) => {
  // Get the image using the ID from the URL
  const image = await imageDatabase.findOne({ _id: ObjectId(params.id) });
  // If there is no image found
  if (!image) {
    response.status = 404;
    return;
  }
  // The stored value looks like "data:image/png;base64,iVB...", so slice out
  // the MIME type (between ':' and ';') and the payload (after ',')
  const semicolonIndex = image.base64!.indexOf(';');
  const colonIndex = image.base64!.indexOf(':');
  const commaIndex = image.base64!.indexOf(',');
  const imageType = image.base64!.slice(colonIndex + 1, semicolonIndex);
  const imageData = image.base64!.slice(commaIndex + 1);
  // Return the decoded image bytes with the matching content type
  response.headers.set('Content-Type', imageType);
  response.body = base64.toUint8Array(imageData);
};
A browser won't render base64-encoded image content served as-is over HTTP. You should decode it and return the raw bytes with the matching content type.
const images: any = {
  beam: 'data:image/png;base64,<base64-string>',
  google: 'data:image/png;base64,<base64-string>',
}
router.get('/image/:id', ctx => {
  if (ctx.params && ctx.params.id && ctx.params.id in images) {
    const image: string = images[ctx.params.id]
    const colonIdx = image.indexOf(':')
    const semicolonIdx = image.indexOf(';')
    const commaIdx = image.indexOf(',')
    // extract the image content type (between ':' and ';')
    ctx.response.headers.set('Content-Type', image.slice(colonIdx + 1, semicolonIdx))
    // decode the base64 payload (after ',') into raw bytes
    ctx.response.body = base64.toUint8Array(image.slice(commaIdx + 1))
  }
})
For the full code, see the sample on GitHub, or run it directly:
deno run -A https://raw.githubusercontent.com/XDean/StackOverflow/master/src/main/deno/Q66697683.ts
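Note that base64.toUint8Array is not an Oak or Deno built-in; in the sample it comes from a base64 helper library (js-base64, for example, exposes a toUint8Array with exactly this shape). Deno's standard library can do the same decoding step if you'd rather avoid the dependency; a minimal sketch, where the std version pin is an assumption:
// decode() returns a Uint8Array, equivalent to base64.toUint8Array above
import { decode } from "https://deno.land/std@0.97.0/encoding/base64.ts";

const bytes = decode(imageData); // imageData = everything after the comma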

Firebase short URLs not redirecting

I created a Google Sheet that uses a Google Script to generate short URLs via Firebase API.
This is the code in the Google Script
function URLShortener(longURL) {
  var body = {
    "dynamicLinkInfo": {
      "domainUriPrefix": "https://go.example.com",
      "link": longURL
    },
    "suffix": {
      "option": "SHORT"
    }
  };
  var key = 'xxxxxxx';
  var url = "https://firebasedynamiclinks.googleapis.com/v1/shortLinks?key=" + key;
  var options = {
    'method': 'POST',
    'contentType': 'application/json',
    'payload': JSON.stringify(body),
  };
  var response = UrlFetchApp.fetch(url, options);
  var json = response.getContentText();
  var data = JSON.parse(json);
  var obj = data["shortLink"];
  Logger.log(obj); // log before returning (code after a return never runs)
  return obj;
}
The script works and it generates URLs similar to https://go.example.com/Xdka, but these links redirect to https://example.com/Xdka instead of the actual URL that was sent, e.g. https://example.com/final_url.
If I try to generate these short links from the Firebase dashboard the same happens.
Did I misunderstand how these short URLs work or am I missing something?

fetching mp3 file from MeteorJS and trying to convert it into a Blob so that I can play it

I am playing around with downloading and serving MP3 files in Meteor.
I am trying to download an MP3 file (https://www.sample-videos.com/audio/mp3/crowd-cheering.mp3) on my MeteorJS server side (to circumvent CORS issues) and then pass it back to the client to play it in an AUDIO tag.
In Meteor you use the Meteor.call function to call a server method. There is not much to configure, it's just a method call and a callback.
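For reference, the shape of such a method/call pair is minimal; a sketch, where the method name mirrors the client snippet further down:
// server
Meteor.methods({
  'episode.download'(url) {
    // fetch the file here and return the result
  }
});

// client
Meteor.call('episode.download', someUrl, (err, result) => {
  // handle err or use result
});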
When I run the method I receive this:
{
  content: "ID3���#K `�)�<H� e0�)������1������J}��e����2L����������fȹ\�CO��ȹ'�����}$A�Lݓ����3D/����fijw��+�LF�$?��`R�l�YA:A��#�0��pq����4�.W"�P���2.Iƭ5��_I�d7d����L��p0��0A��cA�xc��ٲR�BL8䝠4���T��..etc..",
  data: null,
  headers: {
    accept-ranges: "bytes",
    connection: "close",
    content-length: "443926",
    content-type: "audio/mpeg",
    date: "Mon, 20 Aug 2018 13:36:11 GMT",
    last-modified: "Fri, 17 Jun 2016 18:16:53 GMT",
    server: "Apache"
  },
  statusCode: 200
}
which is the working Mp3 file (the content-length is exactly the same as the file I write to disk on the MeteorJS Server side, and it is playable).
However, the following code doesn't let me convert the response into a Blob:
MeteorObservable.call('episode.download', episode.url.url).subscribe((result: any) => {
  console.log('response', result);
  let URL = window.URL;
  let blob = new Blob([result.content], { type: 'audio/mpeg' });
  console.log('blob', blob);
  let audioUrl = URL.createObjectURL(blob);
  let audioElement: any = document.getElementsByTagName('audio')[0];
  audioElement.setAttribute("src", audioUrl);
  audioElement.play();
})
When I run the code, the Blob has the wrong size and is not playable
Blob(769806) {size: 769806, type: "audio/mpeg"}
size:769806
type:"audio/mpeg"
__proto__:Blob
Uncaught (in promise) DOMException: Failed to load because no supported source was found.
On the backend I just return HTTP.get(url) from the method, using import { HTTP } from 'meteor/http'.
I have been trying to use btoa and atob, but that doesn't work, and as far as I know it is already a base64-encoded file, right?
I am not sure why the Blob constructor creates a larger file than the source returned from the backend, and I am not sure why it is not playing.
Can anyone point me to the right direction?
Finally found a solution that uses request instead of Meteor's HTTP:
First you need to install request and request-promise-native in order to make it easy to return your result to clients.
$ meteor npm install --save request request-promise-native
Now you just return the promise of the request in a Meteor method:
server/request.js
import { Meteor } from 'meteor/meteor'
import request from 'request-promise-native'

Meteor.methods({
  getAudio (url) {
    return request.get({ url, encoding: null })
  }
})
Notice the encoding: null flag, which causes the result to be binary. I found this in a comment on an answer about downloading binary data via Node. It makes the request library skip string decoding and hand back the raw bytes as a Node Buffer.
Now it gets interesting: on the client you won't receive a complex result object anymore, but either an Error or a Uint8Array. This makes sense, because Meteor uses EJSON to send data over the wire via DDP, and EJSON's representation of binary data is a Uint8Array, as described in the documentation.
Because you can pass a Uint8Array straight into a Blob, you can now easily create the blob like so:
const blob = new Blob([uint8Array], { type: 'audio/mpeg' })
Summarizing all this into a small template, it could look like this:
client/fetch.html
<template name="fetch">
  <button id="fetchbutton">Fetch Mp3</button>
  {{#if source}}
    <audio id="player" src={{source}} preload="none" content="audio/mpeg" controls></audio>
  {{/if}}
</template>
client/fetch.js
import { Meteor } from 'meteor/meteor'
import { Template } from 'meteor/templating'
import { ReactiveVar } from 'meteor/reactive-var'
import './fetch.html'

Template.fetch.onCreated(function helloOnCreated () {
  // holds the object URL of the fetched audio
  this.source = new ReactiveVar(null)
})

Template.fetch.helpers({
  source () {
    return Template.instance().source.get()
  },
})

Template.fetch.events({
  'click #fetchbutton' (event, instance) {
    Meteor.call('getAudio', 'https://www.sample-videos.com/audio/mp3/crowd-cheering.mp3', (err, uint8Array) => {
      if (err) return console.error(err)
      const blob = new Blob([uint8Array], { type: 'audio/mpeg' })
      instance.source.set(window.URL.createObjectURL(blob))
    })
  },
})
An alternative solution is adding a REST endpoint (using Express) to your Meteor backend.
Instead of HTTP we use request and request-progress to send the data chunked in case of large files.
On the frontend I catch the chunks using https://angular.io/guide/http#listening-to-progress-events to show a loader and deal with the response.
I could listen to the download via
this.http.get('the URL to a mp3', { responseType: 'arraybuffer' }).subscribe((res: any) => {
  var blob = new Blob([res], { type: 'audio/mpeg' });
  var url = window.URL.createObjectURL(blob);
  window.open(url);
});
The above example doesn't show progress by the way, you need to implement the progress-events as explained in the angular article. Happy to update the example to my final code when finished.
The Express setup on the Meteor Server:
/*
Source: http://www.mhurwi.com/meteor-with-express/
## api.class.ts
*/
import { WebApp } from 'meteor/webapp';
const express = require('express');
const trackRoute = express.Router();
const request = require('request');
const progress = require('request-progress');

export function api() {
  const app = express();
  app.use(function(req, res, next) {
    res.header("Access-Control-Allow-Origin", "*");
    res.header("Access-Control-Allow-Headers", "Origin, X-Requested-With, Content-Type, Accept");
    next();
  });
  app.use('/episodes', trackRoute);

  trackRoute.get('/:url', (req, res) => {
    res.set('content-type', 'audio/mp3');
    res.set('accept-ranges', 'bytes');
    // The options argument is optional so you can omit it
    progress(request(req.params.url), {
      // throttle: 2000,                   // Throttle the progress event to 2000ms, defaults to 1000ms
      // delay: 1000,                      // Only start to emit after 1000ms delay, defaults to 0ms
      // lengthHeader: 'x-transfer-length' // Length header to use, defaults to content-length
    })
      .on('progress', function (state) {
        // The state is an object that looks like this:
        // {
        //   percent: 0.5,           // Overall percent (between 0 to 1)
        //   speed: 554732,          // The download speed in bytes/sec
        //   size: {
        //     total: 90044871,      // The total payload size in bytes
        //     transferred: 27610959 // The transferred payload size in bytes
        //   },
        //   time: {
        //     elapsed: 36.235,      // The total elapsed seconds since the start (3 decimals)
        //     remaining: 81.403     // The remaining seconds to finish (3 decimals)
        //   }
        // }
        console.log('progress', state);
      })
      .on('error', function (err) {
        // Do something with err
      })
      .on('end', function () {
        console.log('DONE');
        // Do something after request finishes
      })
      .pipe(res);
  });

  WebApp.connectHandlers.use(app);
}
and then add this to your meteor startup:
import { Meteor } from 'meteor/meteor';
import { api } from './imports/lib/api.class';

Meteor.startup(() => {
  api();
});
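One detail worth noting: because the whole target URL travels as a single :url route parameter, the client has to URL-encode it, otherwise Express treats its slashes as path separators. A sketch of the call, where host and port are assumptions:
const target = encodeURIComponent('https://www.sample-videos.com/audio/mp3/crowd-cheering.mp3');
this.http.get(`http://localhost:3000/episodes/${target}`, { responseType: 'arraybuffer' })
  .subscribe((res: any) => {
    // build the Blob exactly as in the earlier client snippet
  });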

problems with sending jpg over http - node.js

I'm trying to write a simple HTTP web server that (among other features) can send the client a requested file.
Sending a regular text/HTML file works like a charm. The problem is with sending image files.
Here is part of my code (after parsing the MIME type, and requiring Node's fs module):
if (MIMEtype == "image") {
  console.log('IMAGE');
  fs.readFile(path, "binary", function(err, data) {
    console.log("Sending to user: ");
    console.log('read the file!');
    response.body = data;
    response.end();
  });
} else {
  fs.readFile(path, "utf8", function(err, data) {
    response.body = data;
    response.end();
  });
}
Why am I getting a blank page when I open http://localhost:<serverPort>/test.jpg?
Here's a complete example of how to send an image with Node.js in the simplest possible way (my example is a GIF file, but it works with other file/image types):
var http = require('http'),
    fs = require('fs'),
    util = require('util'),
    file_path = __dirname + '/web.gif';
    // the file is in the same folder as our app

// create server on port 4000
http.createServer(function(request, response) {
  fs.stat(file_path, function(error, stat) {
    var rs;
    // We specify the content-type and the content-length headers
    // important!
    response.writeHead(200, {
      'Content-Type': 'image/gif',
      'Content-Length': stat.size
    });
    rs = fs.createReadStream(file_path);
    // pump the file to the response
    util.pump(rs, response, function(err) {
      if (err) {
        throw err;
      }
    });
  });
}).listen(4000);

console.log('Listening on port 4000.');
UPDATE:
util.pump has been deprecated for a while now and you can just use streams to accomplish this (note that you pipe into the response, not the request):
fs.createReadStream(file_path).pipe(response);
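For completeness, the full example rewritten around pipe might look like this; a sketch using the same file and port as above:
var http = require('http'),
    fs = require('fs'),
    file_path = __dirname + '/web.gif';

http.createServer(function (request, response) {
  fs.stat(file_path, function (error, stat) {
    if (error) {
      // file missing or unreadable
      response.writeHead(404);
      return response.end();
    }
    response.writeHead(200, {
      'Content-Type': 'image/gif',
      'Content-Length': stat.size
    });
    // pipe handles backpressure and ends the response when the file is done
    fs.createReadStream(file_path).pipe(response);
  });
}).listen(4000);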

Get the whole response body when the response is chunked?

I'm making an HTTP request and listening for "data":
response.on("data", function (data) { ... })
The problem is that the response is chunked, so "data" is just a piece of the body sent back.
How do I get the whole body sent back?
request.on('response', function (response) {
  var body = '';
  response.on('data', function (chunk) {
    body += chunk;
  });
  response.on('end', function () {
    console.log('BODY: ' + body);
  });
});
request.end();
Over at https://groups.google.com/forum/?fromgroups=#!topic/nodejs/75gfvfg6xuc, Tane Piper provides a good solution very similar to scriptfromscratch's, but for the case of a JSON response:
request.on('response', function (response) {
  var data = [];
  response.on('data', function (chunk) {
    data.push(chunk);
  });
  response.on('end', function () {
    var result = JSON.parse(data.join(''));
    return result;
  });
});
This addresses the issue that OP brought up in the comments section of scriptfromscratch's answer.
I've never worked with the HTTP client library, but since it works just like the server API, try something like this:
var data = '';
response.on('data', function(chunk) {
  // append chunk to your data
  data += chunk;
});
response.on('end', function() {
  // work with your data var
});
See node.js docs for reference.
In order to support the full spectrum of possible HTTP applications, Node.js's HTTP API is very low-level, so data is received chunk by chunk, not as a whole.
There are two approaches you can take to this problem:
1) Collect data across multiple "data" events and append the results together prior to printing the output. Use the "end" event to determine when the stream is finished and you can write the output.
var http = require('http');

http.get('some/url', function (resp) {
  var respContent = '';
  resp.on('data', function (data) {
    respContent += data.toString(); // data is a Buffer instance
  });
  resp.on('end', function () {
    console.log(respContent);
  });
}).on('error', console.error);
2) Use a third-party package to abstract the difficulties involved in collecting an entire stream of data. Two different packages provide a useful API for solving this problem (there are likely more!): bl (Buffer List) and concat-stream; take your pick!
var http = require('http');
var bl = require('bl');

http.get('some/url', function (response) {
  response.pipe(bl(function (err, data) {
    if (err) {
      return console.error(err);
    }
    data = data.toString();
    console.log(data);
  }));
}).on('error', console.error);
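For completeness, the concat-stream variant of the same idea; a sketch, with 'some/url' a placeholder as above:
var http = require('http');
var concat = require('concat-stream');

http.get('some/url', function (response) {
  response.pipe(concat(function (data) {
    // data is the complete body as a single Buffer
    console.log(data.toString());
  }));
}).on('error', console.error);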
The reason it's messed up is that you need to call JSON.parse(data.toString()); data is a Buffer, so you can't just parse it directly.
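In other words, collect the raw Buffers and parse once at the end; a sketch, assuming the endpoint returns JSON and with a placeholder URL:
var http = require('http');

http.get('http://localhost:3000/api', function (res) {
  var chunks = [];
  res.on('data', function (chunk) {
    chunks.push(chunk); // each chunk is a Buffer
  });
  res.on('end', function () {
    // concatenate once, then decode and parse
    var parsed = JSON.parse(Buffer.concat(chunks).toString('utf8'));
    console.log(parsed);
  });
}).on('error', console.error);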
If you don't mind using the request library
var request = require('request');

request('http://www.google.com', function (error, response, body) {
  if (!error && response.statusCode == 200) {
    console.log(body); // Print the google web page.
  }
});
If you are dealing with non-ASCII content (especially Chinese/Japanese/Korean characters, whatever their encoding), you'd better not treat the chunk data passed to the response.on('data') event as a string directly.
Concatenate the chunks as byte buffers and decode them only in response.on('end') to get the correct result.
// Snippet in TypeScript syntax:
//
// Assuming that the server-side will accept the "test_string" you post, and
// respond with a string that concatenates the content of "test_string" many
// times, so that it triggers the on("data") event multiple times.
//
import * as http from "http";

const data2Post = '{"test_string": "swamps/沼泽/沼澤/沼地/늪"}';
const postOptions = {
  hostname: "localhost",
  port: 5000,
  path: "/testService",
  method: 'POST',
  headers: {
    'Content-Type': 'application/json',
    // Do not use data2Post.length on a CJK string: it counts characters,
    // not bytes, and yields an improper 'Content-Length'
    'Content-Length': Buffer.byteLength(data2Post)
  },
  timeout: 5000
};

let body: string = '';
let body_chunks: Array<Buffer> = [];
let body_chunks_bytelength: number = 0; // Used to terminate the connection if the POST response grows too large.

let postReq = http.request(postOptions, (res) => {
  console.log(`statusCode: ${res.statusCode}`);
  res.on('data', (chunk: Buffer) => {
    body_chunks.push(chunk);
    body_chunks_bytelength += chunk.byteLength;
    // Debug print. Note that a chunk may end in the middle of a multi-byte
    // character, so this partial decode can be wrong; it is only here to
    // contrast with the final result in the res.on("end") event.
    console.log("Partial body: " + chunk.toString("utf8"));
    // Terminate the connection in case the POST response is too large. (10*1024*1024 = 10MB)
    if (body_chunks_bytelength > 10 * 1024 * 1024) {
      postReq.connection.destroy();
      console.error("Too large POST response. Connection terminated.");
    }
  });
  res.on('end', () => {
    // Decode the correctly concatenated response data
    let mergedBodyChunkBuffer: Buffer = Buffer.concat(body_chunks);
    body = mergedBodyChunkBuffer.toString("utf8");
    console.log("Body using chunk: " + body);
    console.log(`body_chunks_bytelength=${body_chunks_bytelength}`);
  });
});

// Actually send the request body
postReq.write(data2Post);
postReq.end();
What about an HTTPS chunked response? I've been trying to read a response from an API that responds over HTTPS with a Transfer-Encoding: chunked header. Each chunk is a Buffer, but when I concat them all together and try converting to a string with UTF-8 I get weird characters.
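The https module exposes the same response API as http, so the Buffer.concat approach from the answer above carries over unchanged; a sketch, with a placeholder URL. If the result is still garbled after concatenating, check the Content-Encoding response header: a gzip-compressed body has to be decompressed (e.g. with zlib) before it is decoded as UTF-8.
var https = require('https');

https.get('https://example.com/chunked-endpoint', function (res) {
  var chunks = [];
  res.on('data', function (chunk) {
    chunks.push(chunk); // keep raw Buffers; don't decode per chunk
  });
  res.on('end', function () {
    // decode once, after all bytes have arrived
    console.log(Buffer.concat(chunks).toString('utf8'));
  });
}).on('error', console.error);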
