Process chunked response with AngularJS $http

I recently discovered chunked responses.
I agree that most of the time we want to work with a full response, but what if I want to work on a chunked response as it arrives?
How would I do this with the $http service?

You can define a function that wraps the XMLHttpRequest in an AngularJS $q promise and uses the promise's notify callback to report each progress update:
var chunkedRequestWithPromise = function () {
    var deferred = $q.defer();
    var xhr = new XMLHttpRequest();
    xhr.open("GET", 'https://yoururl.com/chunked', true);
    xhr.onprogress = function () {
        deferred.notify(xhr.responseText);
    };
    xhr.onreadystatechange = function (oEvent) {
        if (xhr.readyState === 4) {
            if (xhr.status === 200) {
                deferred.resolve('success');
            } else {
                deferred.reject(xhr.statusText);
            }
        }
    };
    xhr.send();
    return deferred.promise;
};
and use it:
chunkedRequestWithPromise().then(successFn,errorFn,notifyFn);
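Note that xhr.responseText is cumulative: every onprogress call sees all of the text received so far, so the notify callback above repeatedly receives the whole buffer. If you only want the newly arrived part, here is a minimal sketch (the seen offset is my addition, not part of the snippet above):
var seen = 0;
xhr.onprogress = function () {
    var newData = xhr.responseText.substring(seen);
    seen = xhr.responseText.length;
    deferred.notify(newData); // only the newly received part
};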

I've found a way to do something that looks correct to me:
http://www.igvita.com/2011/08/26/server-sent-event-notifications-with-html5/
Of course there is also the WebSocket API, but it seems heavy for my use case.
Another possibility would be to write another HTTP service that gives control over the HTTP response.
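For the server-sent-events route linked above, the browser's EventSource API keeps things simple. A minimal sketch (the /events endpoint is illustrative, assuming it serves text/event-stream):
var source = new EventSource('/events'); // illustrative endpoint
source.onmessage = function (event) {
    console.log('received:', event.data); // fires once per server-sent event
};
source.onerror = function () {
    source.close(); // stop reconnecting on persistent errors
};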

Related

Impacts on refreshing access tokens frequently

On my .NET Web API 2 server, I am using OWIN for authentication. I have followed Taiseer's tutorial and successfully implemented an access token refresh mechanism.
I would like to know if there are any impacts on anything if clients refresh their access tokens frequently, e.g. refresh once every 5 minutes on average.
I am asking because I have a button on my page; when a user clicks it, the data on that page is sent to several different endpoints, all marked with the [Authorize] attribute.
Previously, when I sent a request to a single protected endpoint, I could check whether the response was 401 (Unauthorized); if so, I refreshed the user's access token first, then resent the rejected request with the new token. However, I don't know how the same thing can be done this time, as there are so many requests being sent at once. The aforementioned method is implemented in my AngularJS interceptor; it can handle a single rejected unauthorized request, but not multiple ones.
FYI, here is the code for my interceptor, which I found on GitHub and modified.
app.factory('authInterceptor', function($q, $injector, $location, localStorageService) {
    var authInterceptor = {};
    var $http;

    var request = function(config) {
        config.headers = config.headers || {};
        var jsonData = localStorageService.get('AuthorizationData');
        if (jsonData) {
            config.headers.Authorization = 'Bearer ' + jsonData.token;
        }
        return config;
    };

    var responseError = function(rejection) {
        var deferred = $q.defer();
        if (rejection.status === 401) {
            var authService = $injector.get('authService');
            authService.refreshToken().then(function(response) {
                _retryHttpRequest(rejection.config, deferred);
            }, function() {
                authService.logout();
                $location.path('/login');
                deferred.reject(rejection);
            });
        } else {
            deferred.reject(rejection);
        }
        return deferred.promise;
    };

    var _retryHttpRequest = function(config, deferred) {
        $http = $http || $injector.get('$http');
        $http(config).then(function(response) {
            deferred.resolve(response);
        }, function(response) {
            deferred.reject(response);
        });
    };

    authInterceptor.request = request;
    authInterceptor.responseError = responseError;
    return authInterceptor;
});
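One way to cope with many 401s arriving at once is to share a single in-flight refresh promise, so the token is refreshed only once and every rejected request waits on the same refresh. A sketch, not a drop-in fix (it assumes authService.refreshToken() returns a promise, as in the interceptor above):
var refreshPromise = null;
var responseError = function (rejection) {
    if (rejection.status !== 401) {
        return $q.reject(rejection);
    }
    var authService = $injector.get('authService');
    // Start a refresh only if one isn't already running
    refreshPromise = refreshPromise || authService.refreshToken().finally(function () {
        refreshPromise = null; // allow future refreshes
    });
    return refreshPromise.then(function () {
        // Replay the rejected request; the request interceptor re-attaches the new token
        $http = $http || $injector.get('$http');
        return $http(rejection.config);
    }, function () {
        authService.logout();
        $location.path('/login');
        return $q.reject(rejection);
    });
};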

XDomainRequest in IE is giving Access is Denied error

I am using IE9 and am unable to see the request being sent in the Network tab. I do have the Access-Control header set in the JSP:
<% response.setHeader("Access-Control-Allow-Origin", "*");%>
Code to get the AJAX HTML Content from the JSP:
if ($.browser.msie && window.XDomainRequest) {
    var xdr = new window.XDomainRequest();
    xdr.open("GET", "http://dev01.org:11110/crs/qw/qw.jsp?&_=" + Math.random());
    xdr.contentType = "text/plain";
    xdr.timeout = 5000;
    xdr.onerror = function () {
        console.log('we have an error!');
    };
    xdr.onprogress = function () {
        console.log('this sucks!');
    };
    xdr.ontimeout = function () {
        console.log('it timed out!');
    };
    xdr.onopen = function () {
        console.log('we open the xdomainrequest');
    };
    xdr.onload = function () {
        alert(xdr.responseText);
    };
    xdr.send(null);
} else { ...... }
I am getting an "Access is Denied" error. Any help would be much appreciated!
Requests must be targeted to the same scheme as the hosting page
In your example you are making a request to:
http://dev01 ...
so you must make that request from a page served over the HTTP protocol.
For example, if the site where your JS script is located is http://dev.org, you can do this:
xhr = new XDomainRequest();
xhr.open("GET", "http://dev01.org?p=1");
but this throws "Access denied":
xhr = new XDomainRequest();
xhr.open("GET", "https://dev01.org?p=1");
My experience with XDomainRequest is that it doesn't respect Access-Control-Allow-Origin: *. Instead, you must specify the domain. This can be obtained from the HTTP_REFERER header if you need to dynamically generate it, or if you are only expecting requests from one domain you can set it manually. This article might help.
<% response.setHeader("Access-Control-Allow-Origin", "http://dev01.org");%>

How to $http Synchronous call with AngularJS

Is there any way to make a synchronous call with AngularJS?
The AngularJS documentation is not very explicit or extensive for figuring out some basic stuff.
ON A SERVICE:
myService.getByID = function (id) {
    var retval = null;
    $http({
        url: "/CO/api/products/" + id,
        method: "GET"
    }).success(function (data, status, headers, config) {
        retval = data.Data;
    });
    return retval;
}
Not currently. If you look at the source code (from this point in time Oct 2012), you'll see that the call to XHR open is actually hard-coded to be asynchronous (the third parameter is true):
xhr.open(method, url, true);
You'd need to write your own service that did synchronous calls. Generally that's not something you'll want to do, because of the nature of JavaScript execution: you'll end up blocking everything else.
...but, if blocking everything else is actually desired, maybe you should look into promises and the $q service. It allows you to wait until a set of asynchronous actions are done, and then execute something once they're all complete. I don't know what your use case is, but that might be worth a look.
Outside of that, if you're going to roll your own, more information about how to make synchronous and asynchronous ajax calls can be found here.
I hope that is helpful.
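For example, a minimal sketch of waiting on several requests with $q.all (the product URL comes from the question; the reviews endpoint is hypothetical):
myService.getByIDWithReviews = function (id) {
    return $q.all([
        $http.get('/CO/api/products/' + id),
        $http.get('/CO/api/products/' + id + '/reviews') // hypothetical endpoint
    ]).then(function (results) {
        // runs only once both responses have arrived
        return { product: results[0].data, reviews: results[1].data };
    });
};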
I have built a factory that integrates Google Maps autocomplete with promises; I hope it's useful:
http://jsfiddle.net/the_pianist2/vL9nkfe3/1/
You only need to replace the AutocompleteService call with this $http request, declared inside the factory:
app.factory('Autocomplete', function($q, $http) {
and make the $http request with:
    var deferred = $q.defer();
    $http.get('urlExample')
        .success(function(data, status, headers, config) {
            deferred.resolve(data);
        })
        .error(function(data, status, headers, config) {
            deferred.reject(status);
        });
    return deferred.promise;
<div ng-app="myApp">
    <div ng-controller="myController">
        <input type="text" ng-model="search">
        <div class="bs-example">
            <table class="table">
                <thead>
                    <tr>
                        <th>#</th>
                        <th>Description</th>
                    </tr>
                </thead>
                <tbody>
                    <tr ng-repeat="direction in directions">
                        <td>{{$index}}</td>
                        <td>{{direction.description}}</td>
                    </tr>
                </tbody>
            </table>
        </div>
    </div>
</div>
'use strict';
var app = angular.module('myApp', []);
app.factory('Autocomplete', function($q) {
    var get = function(search) {
        var deferred = $q.defer();
        var autocompleteService = new google.maps.places.AutocompleteService();
        autocompleteService.getPlacePredictions({
            input: search,
            types: ['geocode'],
            componentRestrictions: {
                country: 'ES'
            }
        }, function(predictions, status) {
            if (status == google.maps.places.PlacesServiceStatus.OK) {
                deferred.resolve(predictions);
            } else {
                deferred.reject(status);
            }
        });
        return deferred.promise;
    };
    return {
        get: get
    };
});
app.controller('myController', function($scope, Autocomplete) {
    $scope.$watch('search', function(newValue, oldValue) {
        var promesa = Autocomplete.get(newValue);
        promesa.then(function(value) {
            $scope.directions = value;
        }, function(reason) {
            $scope.error = reason;
        });
    });
});
The essence of the pattern: call
deferred.resolve(varResult);
when the request has completed successfully,
deferred.reject(error);
when there is an error, and have the function itself:
return deferred.promise;
var EmployeeController = ["$scope", "EmployeeService",
    function ($scope, EmployeeService) {
        $scope.Employee = {};
        $scope.Save = function (Employee) {
            if ($scope.EmployeeForm.$valid) {
                EmployeeService
                    .Save(Employee)
                    .then(function (response) {
                        if (response.HasError) {
                            $scope.HasError = response.HasError;
                            $scope.ErrorMessage = response.ResponseMessage;
                        } else {
                            // handle success
                        }
                    })
                    .catch(function (response) {
                        // handle transport errors
                    });
            }
        };
    }];

var EmployeeService = ["$http", "$q",
    function ($http, $q) {
        var self = this;
        self.Save = function (employee) {
            var deferred = $q.defer();
            $http
                .post("/api/EmployeeApi/Create", angular.toJson(employee))
                .success(function (response, status, headers, config) {
                    deferred.resolve(response, status, headers, config);
                })
                .error(function (response, status, headers, config) {
                    deferred.reject(response, status, headers, config);
                });
            return deferred.promise;
        };
    }];
I recently ran into a situation where I wanted to make two $http calls triggered by a page reload. The solution I went with:
Encapsulate the two calls in functions
Pass the second function as a callback into the first
Invoke that callback upon the first call's .success, as sketched below
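Here is a sketch of that approach (URLs and names are illustrative):
function firstCall(next) {
    $http.get('/api/first').success(function (data) {
        next(data); // fire the second call only after the first completes
    });
}
function secondCall(dataFromFirst) {
    $http.get('/api/second/' + dataFromFirst.id);
}
// on page reload:
firstCall(secondCall);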
Here's a way you can do it asynchronously while managing things as you normally would.
Everything is still shared: you hold a reference to the object you want updated, and whenever your service updates it, the change shows up globally without having to watch it or return a promise.
This is really nice because you can update the underlying object from within the service without ever having to rebind, using Angular the way it's meant to be used.
I think it's probably a bad idea to make $http.get/post synchronous; you'd get a noticeable delay in the script.
app.factory('AssessmentSettingsService', ['$http', function($http) {
    //assessment is what I want to keep updating
    var settings = { assessment: null };
    return {
        getSettings: function () {
            //return settings so I can keep updating assessment and the
            //reference to settings will stay intact
            return settings;
        },
        updateAssessment: function () {
            $http.get('/assessment/api/get/' + scan.assessmentId).success(function(response) {
                //I don't have to return a thing. I just set the object.
                settings.assessment = response;
            });
        }
    };
}]);
...
controller: ['$scope', 'AssessmentSettingsService', function ($scope, as) {
    $scope.settings = as.getSettings();
    //Look. I can even update after I've already grabbed the object
    as.updateAssessment();
And somewhere in a view:
<h1>{{settings.assessment.title}}</h1>
Since sync XHR is being deprecated, it's best not to rely on that. If you need to do a sync POST request, you can use the following helpers inside of a service to simulate a form post.
It works by creating a form with hidden inputs which is posted to the specified URL.
//Helper to create a hidden input
function createInput(name, value) {
    return angular
        .element('<input/>')
        .attr('type', 'hidden')
        .attr('name', name)
        .val(value);
}

//Post data
function post(url, data, params) {
    //Ensure data and params are an object
    data = data || {};
    params = params || {};
    //Serialize params
    const serialized = $httpParamSerializer(params);
    const query = serialized ? `?${serialized}` : '';
    //Create form
    const $form = angular
        .element('<form/>')
        .attr('action', `${url}${query}`)
        .attr('enctype', 'application/x-www-form-urlencoded')
        .attr('method', 'post');
    //Create hidden input data
    for (const key in data) {
        if (data.hasOwnProperty(key)) {
            const value = data[key];
            if (Array.isArray(value)) {
                for (const val of value) {
                    const $input = createInput(`${key}[]`, val);
                    $form.append($input);
                }
            }
            else {
                const $input = createInput(key, value);
                $form.append($input);
            }
        }
    }
    //Append form to body and submit
    angular.element(document).find('body').append($form);
    $form[0].submit();
    $form.remove();
}
Modify as required for your needs.
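For example, a call like the following (illustrative URL and payload) submits a form to /reports/export?token=abc with hidden inputs ids[]=1, ids[]=2 and format=pdf, triggering a full-page form submission:
post('/reports/export', { ids: [1, 2], format: 'pdf' }, { token: 'abc' });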
What about wrapping your call in a Promise.all() method, i.e.:
Promise.all([$http.get(url).then(function(result){....}, function(error){....})])
According to MDN
Promise.all waits for all fulfillments (or the first rejection)

How to handle http.get() when received no data in nodejs?

Let's say that I have a service that returns customer information by a given id. If it's not found I return null.
My app calls this services like this:
http.get('http://myserver/myservice/customer/123', function(res) {
    res.on('data', function(d) {
        callback(null, d);
    });
}).on('error', function(e) {
    callback(e);
});
Currently my service responds with 200 but no data.
How do I handle the case where no data is returned?
Should I change the service to return a different HTTP status code? If so, how do I handle that? I tried many different approaches without success.
First off, it's important to remember that res is a stream of data; as it stands, your code is likely to call callback multiple times with chunks of data. (The data event is fired each time new data comes in.) If your method has been working up to this point, it's only because the responses have been small enough that the response payload wasn't broken into multiple chunks.
To make your life easier, use a library that handles buffering HTTP response bodies and allows you to get the complete response. A quick search on npm reveals the request package.
var request = require('request');
request('http://server/customer/123', function (error, response, body) {
    if (!error && response.statusCode == 200) {
        callback(null, body);
    } else {
        callback(error || response.statusCode);
    }
});
As far as your service goes, if a customer ID is not found, return a 404 – it's semantically invalid to return a 200 OK when in fact there is no such customer.
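On the service side, a sketch of what that could look like (assuming an Express-style app; findCustomer is a hypothetical lookup helper):
app.get('/myservice/customer/:id', function (req, res) {
    findCustomer(req.params.id, function (err, customer) {
        if (err) return res.status(500).send('Internal error');
        if (!customer) return res.status(404).send('Customer not found'); // no such customer: 404, not 200
        res.json(customer);
    });
});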
Use the Client Response 'end' event:
//...
res.on('data', function(d) {
    callback(null, d);
}).on('end', function() {
    console.log('RESPONSE COMPLETE!');
});
I finally found a way to solve my problem:
http.get('http://myserver/myservice/customer/123', function(res) {
    var data = '';
    res.on('data', function(d) {
        data += d;
    });
    res.on('end', function() {
        if (data != '') {
            callback(null, data);
        } else {
            callback('No customer was found');
        }
    });
}).on('error', function(e) {
    callback(e);
});
This solves the problem, but I'm also going to adopt josh3736's suggestion and return a 404 on my service when the customer is not found.

Get the whole response body when the response is chunked?

I'm making an HTTP request and listening for "data":
response.on("data", function (data) { ... })
The problem is that the response is chunked so the "data" is just a piece of the body sent back.
How do I get the whole body sent back?
request.on('response', function (response) {
    var body = '';
    response.on('data', function (chunk) {
        body += chunk;
    });
    response.on('end', function () {
        console.log('BODY: ' + body);
    });
});
request.end();
Over at https://groups.google.com/forum/?fromgroups=#!topic/nodejs/75gfvfg6xuc, Tane Piper provides a good solution very similar to scriptfromscratch's, but for the case of a JSON response:
request.on('response', function(response) {
    var data = [];
    response.on('data', function(chunk) {
        data.push(chunk);
    });
    response.on('end', function() {
        var result = JSON.parse(data.join(''));
        return result;
    });
});
This addresses the issue that OP brought up in the comments section of scriptfromscratch's answer.
I've never worked with the HTTP client library, but since it works just like the server API, try something like this:
var data = '';
response.on('data', function(chunk) {
    // append chunk to your data
    data += chunk;
});
response.on('end', function() {
    // work with your data var
});
See node.js docs for reference.
In order to support the full spectrum of possible HTTP applications, Node.js's HTTP API is very low-level, so data is received chunk by chunk, not as a whole.
There are two approaches you can take to this problem:
1) Collect data across multiple "data" events and append the results together prior to printing the output. Use the "end" event to determine when the stream is finished and you can write the output.
var http = require('http');
http.get('some/url', function (resp) {
    var respContent = '';
    resp.on('data', function (data) {
        respContent += data.toString(); // data is a Buffer instance
    });
    resp.on('end', function () {
        console.log(respContent);
    });
}).on('error', console.error);
2) Use a third-party package to abstract the difficulties involved in collecting an entire stream of data. Two different packages provide a useful API for solving this problem (there are likely more!): bl (Buffer List) and concat-stream; take your pick!
var http = require('http');
var bl = require('bl');
http.get('some/url', function (response) {
    response.pipe(bl(function (err, data) {
        if (err) {
            return console.error(err);
        }
        data = data.toString();
        console.log(data);
    }));
}).on('error', console.error);
The reason it comes out messed up is that you need to call JSON.parse(data.toString()): data is a Buffer, so you can't just parse it directly.
If you don't mind using the request library:
var request = require('request');
request('http://www.google.com', function (error, response, body) {
    if (!error && response.statusCode == 200) {
        console.log(body); // Print the Google web page.
    }
});
If you are dealing with non-ASCII content (especially Chinese, Japanese, or Korean characters, no matter what encoding they use), don't treat the chunks passed to the response.on('data') event as strings directly.
Concatenate them as byte buffers and decode them only in response.on('end') to get the correct result.
// Snippet in TypeScript syntax:
//
// Assuming that the server side will accept the "test_string" you post, and
// respond with a string that concatenates the content of "test_string" many
// times, so that it triggers the on("data") event multiple times.
//
import * as http from "http";

const data2Post = '{"test_string": "swamps/沼泽/沼澤/沼地/늪"}';
const postOptions = {
    hostname: "localhost",
    port: 5000,
    path: "/testService",
    method: 'POST',
    headers: {
        'Content-Type': 'application/json',
        // Do not use data2Post.length on a CJK string; it returns an improper value for 'Content-Length'
        'Content-Length': Buffer.byteLength(data2Post)
    },
    timeout: 5000
};

let body: string = '';
let body_chunks: Array<Buffer> = [];
let body_chunks_bytelength: number = 0; // Used to terminate the connection if the POST response is too large.

let postReq = http.request(postOptions, (res) => {
    console.log(`statusCode: ${res.statusCode}`);
    res.on('data', (chunk: Buffer) => {
        body_chunks.push(chunk);
        body_chunks_bytelength += chunk.byteLength;
        // Debug print. Note that a chunk may contain incomplete characters, so decoding here may be
        // incorrect; it only demonstrates the difference from the final result in res.on("end").
        console.log("Partial body: " + chunk.toString("utf8"));
        // Terminate the connection in case the POST response is too large. (10*1024*1024 = 10MB)
        if (body_chunks_bytelength > 10 * 1024 * 1024) {
            postReq.connection.destroy();
            console.error("Too large POST response. Connection terminated.");
        }
    });
    res.on('end', () => {
        // Decode the correctly concatenated response data
        let mergedBodyChunkBuffer: Buffer = Buffer.concat(body_chunks);
        body = mergedBodyChunkBuffer.toString("utf8");
        console.log("Body using chunk: " + body);
        console.log(`body_chunks_bytelength=${body_chunks_bytelength}`);
    });
});

// Send the POST body and finish the request
postReq.write(data2Post);
postReq.end();
How about an HTTPS chunked response? I've been trying to read a response from an API that responds over HTTPS with the header Transfer-Encoding: chunked. Each chunk is a Buffer, but when I concat them all together and try converting to a string with UTF-8, I get weird characters.
