I have a gRPC service with a self-signed certificate on the server, and I got it working with a Dart server and Dart client. But I could not figure out how to bypass or allow the self-signed certificate for a Node client.
I've tried this:
const fs = require('fs');
const grpc = require('@grpc/grpc-js'); // or require('grpc')

const sslCreds = await grpc.credentials.createSsl(
  fs.readFileSync('./ssl/client.crt'),
  null, // privateKey
  null, // certChain
  {
    checkServerIdentity: function (host, info) {
      console.log('verify?', host, info);
      if (
        host.startsWith('127.0.0.1') ||
        host.startsWith('logs.example.com')
      ) {
        return true;
      }
      console.log('verify other?', host);
      return true;
    },
  },
);
// sslCreds.options.checkServerIdentity = checkCert;
const gLogClient = new synagieLogGrpc.LoggerClient(
  'host:port',
  sslCreds,
);
But when I make a call, my checkServerIdentity validation is never invoked.
Does anyone have a clue?
After checking out multiple GitHub issues and testing for two days, the code below works.
The critical point is that the actual host:port is the destination, which could be localhost, but we need to override the SSL target name with the domain the certificate was actually generated for.
For a sample of TLS certificate generation, see:
https://github.com/grpc/grpc-node/issues/1451
const host = 'localhost';
const port = 8088;
const hostPort = `${host}:${port}`;
const gLogClient = new synagieLogGrpc.LoggerClient(hostPort, sslCreds, {
  'grpc.ssl_target_name_override': 'actual_tlsdomain.example.com',
});
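For completeness, here's a minimal end-to-end sketch of the working setup. The CA file path ./ssl/ca.crt is an assumption (use whatever CA signed your server certificate, e.g. from the linked sample), and synagieLogGrpc is the generated client stub from the snippets above:

const fs = require('fs');
const grpc = require('@grpc/grpc-js'); // or require('grpc')

// Trust the self-signed CA instead of trying to bypass verification entirely.
const sslCreds = grpc.credentials.createSsl(fs.readFileSync('./ssl/ca.crt'));

const gLogClient = new synagieLogGrpc.LoggerClient('localhost:8088', sslCreds, {
  // The channel connects to localhost:8088, but TLS hostname verification
  // runs against this override, which must match the certificate's domain.
  'grpc.ssl_target_name_override': 'actual_tlsdomain.example.com',
});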
I come from a land of ASP.NET Core. Having fun learning a completely new stack.
I'm used to being able to:
1. name a route "orders"
2. give it a path like /customer-orders/{id}
3. register it
4. use the routing system to build a URL for my named route
An example of (4) might be to pass a routeName and then routeValues, which is an object like { id = 193, x = "y" }, and the routing system can figure out the URL /customer-orders/193?x=y - notice how it just appends extraneous key-vals as query params.
Can I do something like this in oak on Deno? Thanks.
Update: I am looking into some functions on the underlying regexp tool that the routing system uses. It doesn't seem right that this often-used feature should be so hard to discover and access.
https://github.com/pillarjs/path-to-regexp#compile-reverse-path-to-regexp
I'm not exactly sure what you mean by "building" a URL, but the URL associated with the incoming request is defined by the requesting client, and is available in each middleware callback function's context parameter at context.request.url as an instance of the URL class.
The documentation provides some examples of using a router and the middleware callback functions that are associated with routes in Oak.
Here's an example module which demonstrates accessing the URL-related data in a request:
so-74635313.ts:
import { Application, Router } from "https://deno.land/x/oak@v11.1.0/mod.ts";

const router = new Router({ prefix: "/customer-orders" });

router.get("/:id", async (ctx, next) => {
  // An instance of the URL class:
  const { url } = ctx.request;
  // An instance of the URLSearchParams class:
  const { searchParams } = url;
  // A string:
  const { id } = ctx.params;

  const serializableObject = {
    id,
    // Iterate all the [key, value] entries and collect into an array:
    searchParams: [...searchParams.entries()],
    // A string representation of the full request URL:
    url: url.href,
  };

  // Respond with the object as JSON data:
  ctx.response.body = serializableObject;
  ctx.response.type = "application/json";

  // Log the object to the console:
  console.log(serializableObject);

  await next();
});

const app = new Application();
app.use(router.routes());
app.use(router.allowedMethods());

function printStartupMessage({ hostname, port, secure }: {
  hostname: string;
  port: number;
  secure?: boolean;
}): void {
  if (!hostname || hostname === "0.0.0.0") hostname = "localhost";
  const address =
    new URL(`http${secure ? "s" : ""}://${hostname}:${port}/`).href;
  console.log(`Listening at ${address}`);
  console.log("Use ctrl+c to stop");
}

app.addEventListener("listen", printStartupMessage);

await app.listen({ port: 8000 });
In a terminal shell (I'll call it shell A), the program is started:
% deno run --allow-net so-74635313.ts
Listening at http://localhost:8000/
Use ctrl+c to stop
Then, in another shell (I'll call it shell B), a network request is sent to the server at the route described in your question — and the response body (JSON text) is printed below the command:
% curl 'http://localhost:8000/customer-orders/193?x=y'
{"id":"193","searchParams":[["x","y"]],"url":"http://localhost:8000/customer-orders/193?x=y"}
Back in shell A, the output of the console.log statement can be seen:
{
id: "193",
searchParams: [ [ "x", "y" ] ],
url: "http://localhost:8000/customer-orders/193?x=y"
}
ctrl + c is used to send an interrupt signal (SIGINT) to the deno process and stop the server.
I am fortunately working with a React developer today!
Between us, we've found the .url(routeName, ...) method on the Router instance and that does exactly what I need!
Here's the help for it:
/** Generate a URL pathname for a named route, interpolating the optional
* params provided. Also accepts an optional set of options. */
Here it is in use, in context:
import { Context, Router } from "https://deno.land/x/oak@v11.1.0/mod.ts";

export const routes = new Router()
  .get(
    "get-test",
    "/test",
    handleGetTest,
  );

function handleGetTest(context: Context) {
  console.log(`The URL for the test route is: ${routes.url("get-test")}`);
}
// The URL for the test route is: /test
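And to build the URL from the question, a small sketch on top of that (the route name and handler are hypothetical; the params interpolation comes straight from the help text above, and the extra key-vals are appended with the standard URLSearchParams):

const orderRoutes = new Router()
  .get(
    "orders",
    "/customer-orders/:id",
    (ctx) => { ctx.response.body = ctx.params.id; }, // hypothetical handler
  );

// Interpolate the :id param for the named route:
const path = orderRoutes.url("orders", { id: "193" }); // "/customer-orders/193"

// Append the extraneous key-vals as query params:
const search = new URLSearchParams({ x: "y" }).toString();
console.log(`${path}?${search}`); // "/customer-orders/193?x=y"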
The current code looks like it implements a cache-first strategy. How can I modify it to use network-first, and then fall back to the cache if the network fails?
async function onFetch(event) {
  let cachedResponse = null;
  if (event.request.method === 'GET') {
    // For all navigation requests, try to serve index.html from cache.
    // If you need some URLs to be server-rendered, edit the following check to exclude those URLs.
    //const shouldServeIndexHtml = event.request.mode === 'navigate';
    console.log("onFetch : " + event.request.url.toLowerCase());
    const shouldServeIndexHtml = event.request.mode === 'navigate';
    const request = shouldServeIndexHtml ? 'index.html' : event.request;
    const cache = await caches.open(cacheName);
    cachedResponse = await cache.match(request);
  }
  return cachedResponse || fetch(event.request);
}
You can add something like this after cachedResponse = await cache.match(request);:

if (event.request.url.indexOf('/api') != -1) {
  try {
    // Network first: try the network before the cache
    var response = await fetch(event.request);
    // Update or add the cache entry
    await cache.put(event.request, response.clone());
    // Prefer the fresh response over the cached one
    cachedResponse = response;
  } catch (e) {
    // Network failed; fall through to the cached value
  }
}

This always loads API requests from the network first, since /api responses are not part of the cache initially. Each successful response renews the cache entry for that request; if the network request fails, the cached value is used instead.
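Putting the pieces together, a sketch of the complete network-first onFetch (assuming cacheName is defined elsewhere in the worker, as in the original snippet):

async function onFetch(event) {
  let cachedResponse = null;
  if (event.request.method === 'GET') {
    const shouldServeIndexHtml = event.request.mode === 'navigate';
    const request = shouldServeIndexHtml ? 'index.html' : event.request;
    const cache = await caches.open(cacheName);
    cachedResponse = await cache.match(request);
    if (event.request.url.indexOf('/api') != -1) {
      try {
        // Network first: fetch, refresh the cache entry, prefer the fresh response
        const response = await fetch(event.request);
        await cache.put(event.request, response.clone());
        cachedResponse = response;
      } catch (e) {
        // Network failed; cachedResponse from above serves as the fallback
      }
    }
  }
  return cachedResponse || fetch(event.request);
}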
Has anyone set the ajs_anonymous_id cookie in SSR (Next.js) when it doesn't already exist on the client?
I need to read the ajs_anonymous_id (Segment analytics) cookie during server-side rendering in Next.js, but of course there are instances when that cookie does not exist yet, i.e. the person hasn't visited my site before and thus never got one. Since I need it on the SSR side, I was hoping there is a way to set it on the server so that I can use it there and then have it on the client too. So:
1. Client visits a page.
2. If it has the ajs_anonymous_id cookie: cool, use it and do some display things.
3. If it does not have ajs_anonymous_id: I seed the ajs_anonymous_id (drop a cookie) and then do some display things.
4. Page loads. My analytics file (which loads on the front end through a container) sees there is already an ajs_anonymous_id cookie: cool.
Anyone have an example of this or how to achieve it?
Yeah - there seems to be a package and method just for this:
import Analytics from 'analytics-node'
import { v4 as uuidv4 } from 'uuid'

// If no anonymousId exists, send a randomly generated one;
// otherwise grab the existing one to include in the call to Segment.
let anonymousId
if (cookies.ajs_anonymous_id) {
  anonymousId = cookies.ajs_anonymous_id
} else {
  anonymousId = uuidv4()
  res.cookie('ajs_anonymous_id', anonymousId)
}
For the full example, I'd refer to their docs:
https://segment.com/docs/guides/how-to-guides/collect-pageviews-serverside/
For Next.js, I expect you have to move away from their default server implementation and build a wrapper that allows you to add this type of middleware.
const express = require('express');
const bodyParser = require('body-parser');
const next = require('next');

const host = process.env.LOCAL_ADDRESS || '0.0.0.0';
const port = parseInt(process.env.NODE_PORT, 10) || 3000;
const dev = process.env.NODE_ENV !== 'production';
const app = next({ dev, dir: 'app' });
const handle = app.getRequestHandler();

app.prepare().then(() => {
  const server = express();
  server.use(bodyParser.urlencoded({ extended: false }));
  server.set('trust proxy', true);
  // Your cookie-seeding middleware goes here (see the sketch below):
  server.use(customMiddleware);
  server.all('*', (req, ...args) => {
    // Log the request only if we need to (we don't log all static asset requests)
    if (req.log) {
      req.log.info('incoming request');
    }
    return handle(req, ...args);
  });
  server.listen(port, host, (err) => {
    if (err) throw err;
    console.log(`> Ready on http://${host}:${port}`);
  });
});
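customMiddleware is left undefined above; here's a hedged sketch of what it could look like for seeding the Segment cookie. It assumes cookie-parser and uuid are installed, and that server.use(cookieParser()) is registered before it so req.cookies exists:

const cookieParser = require('cookie-parser');
const { v4: uuidv4 } = require('uuid');

// Hypothetical middleware: seed ajs_anonymous_id before Next.js renders,
// so both the SSR pass and the client-side analytics see the same id.
function customMiddleware(req, res, next) {
  if (!req.cookies.ajs_anonymous_id) {
    const anonymousId = uuidv4();
    req.cookies.ajs_anonymous_id = anonymousId;  // available to this render
    res.cookie('ajs_anonymous_id', anonymousId); // persisted for the client
  }
  next();
}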
Then start the app with node app/server.
Hope that helps to get you started!
The context of my challenge
I'm building a headless WordPress / WooCommerce Store.
If you're not familiar with the concept of a headless CMS: I pull the store's content (products, and their images and text) over the WordPress / WooCommerce REST API. This way, I have the benefit of a CMS dashboard for my client, whilst I get to develop in a modern language / library - in my case, React!
If possible I'd like to keep the checkout in WordPress/WooCommerce/PHP. Depending on the project I apply this code / boilerplate to I suspect that I'll have to chop and change payment gateways, and making this secure and PCI compliant will be much easier in PHP/WordPress - there's a whole host of plugins for this.
This means the entire store / front-end will live in React, with the exception of the checkout: the user will be redirected to the CMS front-end (WordPress, PHP) when they wish to complete their order.
The Challenge
This makes managing cookies for the session rather unintuitive and unorthodox. When the user is redirected from the store (React site) to the checkout (WooCommerce/PHP site) the cart session has to persist between the two sites.
Additionally, requests to WooCommerce are routed through the Node/Express server which my React client sits on. I do this because I want to keep the WordPress address obscured, and so I can apply GraphQL to clean up my requests & responses. The issue is that in this process the cookies are lost, because my client and my CMS are communicating through a middleman (my Node server) - I require extra logic to manually manage my cookies.
The Code
When I attempt to add something to a cart, from an action creator (I'm using Redux for state management) I hit the corresponding API endpoint on my Node/Express server:
export const addToCart = (productId, quantity) => async (dispatch) => {
  dispatch({ type: ADD_TO_CART });
  try {
    // Manually append cookies somewhere here
    const payload = await axios.get(`${ROOT_API}/addtocart?productId=${productId}&quantity=${quantity}`, {
      withCredentials: true
    });
    dispatch(addToSuccess(payload));
  } catch (error) {
    dispatch(addToCartFailure(error));
  }
};
Then on the Node/Express server I make my request to WooCommerce:
app.get('/api/addtocart', async (req, res) => {
  try {
    // Manually retrieve & append cookies somewhere here
    const productId = parseInt(req.query.productId);
    const quantity = parseInt(req.query.quantity);
    const response = await axios.post(`${WP_API}/wc/v2/cart/add`, {
      product_id: productId,
      quantity
    });
    return res.json(response.data);
  } catch (error) {
    // Handle error
    return res.json(error);
  }
});
With the clues given by @TarunLalwani in his comments (thanks a million!), I've managed to formulate a solution.
Cookie Domain Setting
Since I was working with two separate sites, in order for this to work I had to ensure they were both on the same domain, and that the domain was set in all cookies. This ensured cookies were included in my requests between the Node / Express server (sitting on e.g. somedomain.com) and the WooCommerce CMS (sitting on e.g. wp.somedomain.com), rather than being exclusive to the wp.somedomain.com subdomain. This was achieved by setting define( 'COOKIE_DOMAIN', 'somedomain.com' ); in my wp-config.php on the CMS.
Manually Getting and Setting Cookies
My code needed significant additional logic in order for cookies to be included whilst requests were routed from the client through my Node / Express server.
In React I had to check if the cookie existed, and if it did I had to send it through in the header of my GET request to the Node / Express server.
import Cookies from 'js-cookie';

export const getSessionData = () => {
  // WooCommerce session cookies are appended with a random hash.
  // Here I am tracking down the key of the session cookie.
  const cookies = Cookies.get();
  if (cookies) {
    const cookieKeys = Object.keys(cookies);
    for (const key of cookieKeys) {
      if (key.includes('wp_woocommerce_session_')) {
        return `${key}=${Cookies.get(key)};`;
      }
    }
  }
  return false;
};
export const addToCart = (productId, quantity) => async (dispatch) => {
  dispatch({ type: ADD_TO_CART });
  const sessionData = getSessionData();
  const config = {};
  if (sessionData) config['session-data'] = sessionData;
  console.log('config', config);
  try {
    const payload = await axios.get(`${ROOT_API}/addtocart?productId=${productId}&quantity=${quantity}`, {
      withCredentials: true,
      headers: config
    });
    dispatch(addToSuccess(payload));
  } catch (error) {
    dispatch(addToCartFailure(error));
  }
};
On the Node / Express Server I had to check if I had included a cookie (saved in req.headers with the key session-data - it was illegal to use Cookie as a key here) from the client, and if I did, append that to the header of my request going to my CMS.
If I didn't find an appended cookie, it meant this was the first request in the session, so I had to manually grab the cookie from the response I got back from the CMS and save it to the client (setCookieFunc).
app.get('/api/addtocart', async (req, res) => {
  try {
    const productId = parseInt(req.query.productId);
    const quantity = parseInt(req.query.quantity);
    const sessionData = req.headers['session-data'];
    const headers = {};
    if (sessionData) headers.Cookie = sessionData;
    const response = await axios.post(`${WP_API}/wc/v2/cart/add`, {
      product_id: productId,
      quantity
    }, { headers });
    if (!sessionData) {
      const cookies = response.headers['set-cookie'];
      const setCookieFunc = (cookie) => {
        const [cookieKeyValue, ...cookieOptionsArr] = cookie.split('; ');
        const cookieKey = cookieKeyValue.split('=')[0];
        const cookieValue = decodeURIComponent(cookieKeyValue.split('=')[1]);
        const cookieOptions = {};
        cookieOptionsArr.forEach(option => (cookieOptions[option.split('=')[0]] = option.split('=')[1]));
        if (cookieOptions.expires) {
          const expires = new Date(cookieOptions.expires);
          cookieOptions.expires = expires;
        }
        res.cookie(cookieKey, cookieValue, cookieOptions);
      };
      cookies.map(cookie => setCookieFunc(cookie));
    }
    return res.json(response.data);
  } catch (error) {
    // Handle error
    return res.json(error);
  }
});
I'm not sure if this is the most elegant solution to the problem, but it worked for me.
Notes
I used the js-cookie library for interacting with cookies on my React client.
Gotchas
If you're trying to make this work in your development environment (using localhost), there's some extra work to be done. See Cookies on localhost with explicit domain.
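For example, a minimal sketch of the usual workaround from that question (an assumption to verify for your browser: cookies with an explicit Domain of localhost tend to be rejected, so only set the domain attribute in production):

// Hypothetical helper: only attach the cookie domain outside local development.
function sessionCookieOptions() {
  return process.env.NODE_ENV === 'production'
    ? { domain: 'somedomain.com' }
    : {}; // no Domain attribute on localhost
}

// e.g. inside the /api/addtocart handler above:
// res.cookie(cookieKey, cookieValue, { ...cookieOptions, ...sessionCookieOptions() });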
I'm trying to write a REST API server with Node.js like the one used by Joyent, and everything is OK except that I can't verify a normal user's authentication. If I jump to a terminal and do curl -u username:password localhost:8000 -X GET, I can't get the username:password values on the Node.js HTTP server. If my Node.js HTTP server is something like
var http = require('http');

http.createServer(function (req, res) {
  res.writeHead(200, {'Content-Type': 'text/plain'});
  res.end('Hello World\n');
}).listen(1337, "127.0.0.1");
, shouldn't I get the values username:password somewhere in the req object that comes from the callback ?
How can I get those values without having to use Connect's basic http auth ?
The username:password is contained in the Authorization header as a base64-encoded string.
Try this:
const http = require('http');

http.createServer(function (req, res) {
  var header = req.headers.authorization || '';       // get the auth header
  var token = header.split(/\s+/).pop() || '';        // and the encoded auth token
  var auth = Buffer.from(token, 'base64').toString(); // convert from base64
  var parts = auth.split(/:/);                        // split on colon
  var username = parts.shift();                       // username is first
  var password = parts.join(':');                     // everything else is the password
  res.writeHead(200, { 'Content-Type': 'text/plain' });
  res.end('username is "' + username + '" and password is "' + password + '"');
}).listen(1337, '127.0.0.1');
From HTTP Authentication: Basic and Digest Access Authentication - Part 2 Basic Authentication Scheme (pages 4-5):
Basic Authentication in Backus-Naur Form
basic-credentials = base64-user-pass
base64-user-pass  = <base64 [4] encoding of user-pass,
                     except not limited to 76 char/line>
user-pass         = userid ":" password
userid            = *<TEXT excluding ":">
password          = *TEXT
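As a concrete instance of that grammar, this is what curl -u username:password actually sends (a quick sketch you can run in Node):

// Base64-encode "username:password" exactly as a Basic auth client does.
const token = Buffer.from('username:password').toString('base64');
console.log('Authorization: Basic ' + token);
// Authorization: Basic dXNlcm5hbWU6cGFzc3dvcmQ=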
If you're using Express 3, you can use the basicAuth middleware it bundles from Connect (note that it was removed in Express 4):

//Load express
var express = require('express');
var app = express();

//User validation
var auth = express.basicAuth(function(user, pass) {
  return (user == "super" && pass == "secret");
}, 'Super duper secret area');

//Password protected area
app.get('/admin', auth, routes.admin);
You can use node-http-digest for basic auth, or everyauth if adding authorization from external services is on your roadmap.
I use this code for my own starter sites with auth.
It does several things:
basic auth
returns index.html for the / route
serves content without crashing, silently handling errors
allows a port parameter when running
minimal amount of logging
Before using the code, run npm install express (version 3 - express.basicAuth was removed in Express 4).
var express = require("express");
var app = express();

//User validation
var auth = express.basicAuth(function(user, pass) {
  return (user == "username" && pass == "password");
}, 'dev area');

/* serves main page */
app.get("/", auth, function(req, res) {
  try {
    res.sendfile('index.html');
  } catch (e) {}
});

/* add your other paths here */

/* serves all the static files */
app.get(/^(.+)$/, auth, function(req, res) {
  try {
    console.log('static file request : ' + req.params);
    res.sendfile(__dirname + req.params[0]);
  } catch (e) {}
});

var port = process.env.PORT || 8080;
app.listen(port, function() {
  console.log("Listening on " + port);
});
It can be implemented easily in pure Node.js with no dependencies. This is my version, based on this answer for Express.js but simplified so you can see the basic idea easily:
const http = require('http');

http.createServer(function (req, res) {
  // Pull the base64 token out of "Authorization: Basic <token>" and decode it.
  const userpass = Buffer.from(
    (req.headers.authorization || '').split(' ')[1] || '',
    'base64'
  ).toString();
  if (userpass !== 'username:password') {
    res.writeHead(401, { 'WWW-Authenticate': 'Basic realm="nope"' });
    res.end('HTTP Error 401 Unauthorized: Access is denied');
    return;
  }
  res.end('You are in! Yay!!');
}).listen(1337, '127.0.0.1');
The restify framework (http://mcavage.github.com/node-restify/) includes an authorization header parser for "basic" and "signature" authentication schemes.
You can use the http-auth module:

// Authentication module.
var http = require('http');
var auth = require('http-auth');

var basic = auth.basic({
  realm: "Simon Area.",
  file: __dirname + "/../data/users.htpasswd" // gevorg:gpass, Sarah:testpass ...
});

// Creating new HTTP server.
http.createServer(basic, function(req, res) {
  res.end("Welcome to private area - " + req.user + "!");
}).listen(1337);