I am using http-proxy to forward all requests in Next.js, and I want to handle the response myself with the selfHandleResponse option. Specifically, I want to do something like setting some data in a cookie, etc., in Next.js API Routes, but if I set selfHandleResponse to true, none of the webOutgoing passes are called, so my request never gets a response. What should I do? Please help me, I have researched but found nothing :(
My code below:
const handleProxyInit = (proxy) => {
  proxy.on('proxyRes', (proxyRes, req, res) => {
    let body = [];
    proxyRes.on('data', function (chunk) {
      body.push(chunk);
    });
    proxyRes.on('end', function () {
      body = Buffer.concat(body).toString('utf8');
      // what should I do here?
    });
  });
};

httpProxyMiddleware(req, res, {
  ...,
  selfHandleResponse: true,
  changeOrigin: true,
  onProxyInit: handleProxyInit,
});
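With selfHandleResponse enabled, http-proxy skips its normal webOutgoing passes, so the proxyRes handler has to write the status, headers, and body back to res itself. A minimal sketch of what could go in the 'end' handler, assuming you want to set a cookie based on the upstream body (the cookie name/value and the choice to pass the upstream headers through unchanged are illustrative assumptions, not part of the original code):

proxyRes.on('end', () => {
  body = Buffer.concat(body).toString('utf8');
  // Hypothetical example: set a cookie derived from the upstream body.
  res.setHeader('Set-Cookie', 'example=value; Path=/');
  // Forward the upstream status and headers, then finish the response ourselves.
  // (If the body is modified, the content-length from proxyRes.headers may no longer match.)
  res.writeHead(proxyRes.statusCode, proxyRes.headers);
  res.end(body);
});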
Related
I want to add a fetch call to the initial load of index.html in a local app. This data is otherwise loaded server-side, but for this particular case I need to make an HTTP call in dev.
I am having difficulty getting webpack-dev-server to proxy the fetch call that I added to the DOM.
The proxy works correctly after my app is instantiated and I use Axios to make HTTP calls to /api, but on init the proxy is still serving from localhost instead of the target endpoint.
This is a bit puzzling to me - why might the proxy work in JS post-load but not on init?
devServer: {
  contentBase: '/some/path',
  port: 9000,
  https: true,
  open: true,
  compress: true,
  hot: true,
  proxy: { '/api/*': [Object] }
},
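For reference, here is a hypothetical expanded shape of that proxy entry (it is truncated to [Object] in the dump above); the target URL and the changeOrigin/secure flags are assumptions, not taken from the original config:

proxy: {
  '/api/*': {
    target: 'https://staging.example.com', // assumed target, not from the original config
    changeOrigin: true,                    // rewrite the Host header to match the target
    secure: false,                         // accept a self-signed certificate on the target, if any
  },
},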
Script in index
<script>
  (async function () {
    const data = await getData();
    addDataset(data);

    async function getData() {
      return fetch('/api/my/endpoint').then(
        (response) => response.data.result,
      );
    }

    function addDataset(data) {
      var el = document.getElementById('root');
      const parsed = JSON.parse(data);
      Object.entries(parsed).forEach((entry) => {
        const [k, val] = entry;
        el.dataset[k] = JSON.stringify(val);
      });
    }
  })();
</script>
Error
400 Bad request
Request URL - https://localhost:9000/api/my/endpoint
I looked over the previous postings, but they did not seem to match my issue. I am only using the next.js package and the integrated API routes. Mongoose is what I am using to make my schemas. I keep getting a warning that the API resolved without sending a response, but only for my POST calls. Any ideas what is going wrong here?
import dbConnect from "../../../utils/dbConnect";
import Employee from "../../../models/Employee";

dbConnect();

export default async (req, res) => {
  const { method } = req;

  switch (method) {
    case "POST":
      await Employee.create(req.body, function (err, data) {
        if (err) {
          res.status(500).json({
            success: false,
            data: err,
          });
        } else {
          res.status(201).json({
            success: true,
            data: data,
          });
        }
      });
      break;
    default:
      res.status(405).json({
        success: false,
      });
  }
};
This is a false warning, because in the provided code you always return a response; it's just that Next.js doesn't know it.
If you are sure that you return a response in every single case, you can disable the warning for unresolved requests.
/pages/api/your_endpoint.js
export const config = {
  api: {
    externalResolver: true,
  },
}
Custom Config For API Routes
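For context, the config export sits alongside the route's default export in the same file; a minimal sketch under that assumption (the handler body here is only a placeholder, not the code from the question):

// /pages/api/your_endpoint.js
export const config = {
  api: {
    externalResolver: true,
  },
};

export default async function handler(req, res) {
  // ... do the work, making sure every branch eventually responds via res
  res.status(200).json({ success: true });
}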
How can I read a response header (Content-Disposition)? Please share a resolution.
When I check in either Postman or the Google Chrome Network tab, I can see 'Content-Disposition' in the response headers section for the HTTP call, but I am NOT able to read that header in my Angular code.
// Node - Server JS
app.get('/download', function (req, res) {
  var file = __dirname + '/db.json';
  res.set({
    'Content-Type': 'text/plain',
    'Content-Disposition': 'attachment; filename=' + req.body.filename
  })
  res.download(file); // Set disposition and send it.
});
// Angular5 Code
saveFile() {
  const headers = new Headers();
  headers.append('Accept', 'text/plain');
  this.http.get('http://localhost:8090/download', { headers: headers })
    .subscribe(
      (response => this.saveToFileSystem(response))
    );
}

private saveToFileSystem(response) {
  const contentDispositionHeader: string = response.headers.get('Content-Disposition'); // <== Getting error here, not able to read response headers
  const parts: string[] = contentDispositionHeader.split(';');
  const filename = parts[1].split('=')[1];
  const blob = new Blob([response._body], { type: 'text/plain' });
  saveAs(blob, filename);
}
I have found the solution to this issue. As per Access-Control-Expose-Headers, only the default (CORS-safelisted) headers are exposed.
In order to expose 'Content-Disposition', we need to set the 'Access-Control-Expose-Headers' response header to either '*' (allow all) or 'Content-Disposition'.
// Node - Server JS
app.get('/download', function (req, res) {
  var file = __dirname + '/db.json';
  res.set({
    'Content-Type': 'text/plain',
    'Content-Disposition': 'attachment; filename=' + req.body.filename,
    'Access-Control-Expose-Headers': 'Content-Disposition' // <== ** Solution **
  })
  res.download(file); // Set disposition and send it.
});
It is not a problem with Angular; it is a problem with CORS.
If the server does not explicitly allow your code to read the headers, the browser won't let you read them.
On the server you must add Access-Control-Expose-Headers to the response.
In the response it will look like Access-Control-Expose-Headers: <header_name>.
In ASP.NET Core it can be added while setting up CORS in the ConfigureServices method in Startup.cs.
This solution helped me get the Content-Disposition from the response headers.

(data) => {
  // 'data' is the response of the file data, requested with responseType: ResponseContentType.Blob
  let contentDisposition = data.headers.get('content-disposition');
}
Firstly you need to allow your server to expose these headers. Note that the header will show up in your browser's Network tab regardless of this setting; the setting is what makes it readable ('available') to your code.
With C# it would look something like this:
services.AddCors(options => {
  options.AddPolicy(AllowSpecificOrigins,
    builder => {
      builder
        .WithOrigins("http://localhost:4200")
        .AllowAnyHeader()
        .AllowAnyMethod()
        .WithExposedHeaders("Content-Disposition", "downloadFileName");
    });
});
When you send your API request to the server, ensure that you include the "observe" option in the request. See below:
getFile(path: string): Observable<any> {
  // Create headers
  let headers = new HttpHeaders();

  // Create and return request
  return this.http.get<Blob>(
    `${environment.api_url}${path}`,
    { headers, observe: 'response', responseType: 'blob' as 'json' }
  ).pipe();
}
Then, in the response handling of your Angular subscribe, you can access the filename like this (the subscribe call is not complete; it attaches to a pipe):
.....
.subscribe((response: HttpResponse<Blob>) => {
  const fileName = response.headers.get('content-disposition')
    .split(';')[1]
    .split('filename')[1]
    .split('=')[1]
    .trim();
});
I am trying to post to my server from Twilio, but I am getting a 403 error: basically my parse-heroku server is rejecting any request from Twilio. I am working with a TwiML App and masked numbers, and I am having trouble posting to a function in my index file when a text goes through. In my TwiML App the message URL is https://parseserver.herokuapp.com/parse/index/sms. Any help is appreciated. These are the errors in Twilio:
var app = express();

app.use(require('body-parser').urlencoded());

app.use(function (req, res, next) {
  // Website you wish to allow to connect
  res.setHeader('Access-Control-Allow-Origin', 'https://www.twilio.com');

  // Request methods you wish to allow
  res.setHeader('Access-Control-Allow-Methods', 'GET, POST, OPTIONS, PUT, PATCH, DELETE');

  // Request headers you wish to allow
  res.setHeader('Access-Control-Allow-Headers', 'X-Requested-With,content-type');

  // Set to true if you need the website to include cookies in the requests sent
  // to the API (e.g. in case you use sessions)
  res.setHeader('Access-Control-Allow-Credentials', true);

  res.setHeader("X-Parse-Master-Key", "xxxxxxx");
  res.setHeader("X-Parse-Application-Id", "xxxxxx");

  // Pass to next layer of middleware
  next();
});

app.post('/sms', twilio.webhook({ validate: false }), function (req, res) {
  console.log("use-sms");
  from = req.body.From;
  to = req.body.To;
  body = req.body.Body;
  gatherOutgoingNumber(from, to)
    .then(function (outgoingPhoneNumber) {
      var twiml = new twilio.TwimlResponse();
      twiml.message(body, { to: outgoingPhoneNumber });
      res.type('text/xml');
      res.send(twiml.toString());
    });
});
I am using node-http-proxy. However, in addition to relaying HTTP requests, I also need to listen to the incoming and outgoing data.
Intercepting the response data is where I'm struggling. Node's ServerResponse object (and, more generically, the WritableStream interface) doesn't broadcast a 'data' event. http-proxy seems to create its own internal request, which produces a ClientResponse object (which does broadcast the 'data' event), but this object is not exposed publicly outside the proxy.
Any ideas how to solve this without monkey-patching node-http-proxy or creating a wrapper around the response object?
A related issue in node-http-proxy's GitHub issues seems to imply this is not possible. For future attempts by others, here is how I hacked around the issue:
you'll quickly find out that the proxy only calls the writeHead(), write() and end() methods of the res object
since res is already an EventEmitter, you can start emitting new custom events
listen for these new events to assemble the response data and then use it
var eventifyResponse = function (res) {
  var methods = ['writeHead', 'write', 'end'];
  methods.forEach(function (method) {
    var oldMethod = res[method]; // remember original method
    res[method] = function () { // replace with a wrapper
      oldMethod.apply(this, arguments); // call original method
      var args = Array.prototype.slice.call(arguments, 0);
      args.unshift('method_' + method);
      this.emit.apply(this, args); // broadcast the event
    };
  });
  return res; // return the patched response so it can be reassigned below
};

var outputData = '';
res = eventifyResponse(res);

res.on('method_writeHead', function (statusCode, headers) { saveHeaders(); });
res.on('method_write', function (data) { outputData += data; });
res.on('method_end', function (data) { use_data(outputData + (data || '')); });

proxy.proxyRequest(req, res, options)
This is a simple proxy server sniffing the traffic and writing it to console:
var http = require('http'),
    httpProxy = require('http-proxy');

//
// Create a proxy server with custom application logic
//
var proxy = httpProxy.createProxyServer({});

// assign events
proxy.on('proxyRes', function (proxyRes, req, res) {
  // collect response data
  var proxyResData = '';
  proxyRes.on('data', function (chunk) {
    proxyResData += chunk;
  });
  proxyRes.on('end', function () {
    var snifferData = {
      request: {
        data: req.body,
        headers: req.headers,
        url: req.url,
        method: req.method
      },
      response: {
        data: proxyResData,
        headers: proxyRes.headers,
        statusCode: proxyRes.statusCode
      }
    };
    console.log(snifferData);
  });
  // console.log('RAW Response from the target', JSON.stringify(proxyRes.headers, true, 2));
});

proxy.on('proxyReq', function (proxyReq, req, res, options) {
  // collect request data
  req.body = '';
  req.on('data', function (chunk) {
    req.body += chunk;
  });
  req.on('end', function () {
  });
});

proxy.on('error', function (err) {
  console.error(err);
});

// run the proxy server
var server = http.createServer(function (req, res) {
  // every time a request comes proxy it:
  proxy.web(req, res, {
    target: 'http://localhost:4444'
  });
});

console.log("listening on port 5556");
server.listen(5556);
I tried your hack but it didn't work for me. My use case is simple: I want to log the incoming and outgoing traffic from an Android app to our staging server, which is secured by basic auth.
https://github.com/greim/hoxy/
was the solution for me. My node-http-proxy always returned 500 (while a direct request to staging did not); maybe the authorization headers were not forwarded correctly, or whatever.
Hoxy worked fine right from the start.
npm install hoxy [-g]
hoxy --port=<local-port> --stage=<your stage host>:<port>
As rules for logging I specified:
request: $aurl.log()
request: #log-headers()
request: $method.log()
request: $request-body.log()
response: $url.log()
response: $status-code.log()
response: $response-body.log()
Beware, this prints any binary content.