Where and how to store API endpoints in Vue.js?

I am using vue-cli for the front-end and Lumen for the back-end, and I am curious about the best practice for storing the API root URL and endpoints in Vue.
Right now I have a constants.js file in the src directory where the API root URL and endpoints look like this:
const BASE_URL = "http://localhost:8000"
export const AddLanguage = BASE_URL + "/api/languages"
and when I need, for example, to implement the add-language functionality in a component, I import the required API endpoint from constants.js like this:
import { AddLanguage } from '@/constants'
and then use axios to make the request:
this.$http.post(AddLanguage, params).then(response => {
  if (response.status == 200) {
    this.addLanguage(response.data.data)
  } else {
    this.setHttpResponseDialog(response)
  }
}).catch(er => {
  this.setHttpResponseDialog("Error")
})
I have searched this question, but there is no clear answer. Some say it's fine as it is.
Others say it's bad and that you should store that kind of data in dev.env.js and prod.env.js, and the most important point is that I don't understand why they say so. Why is it important to keep that data in .env files? Or is there some other, better way?
Can you provide the right answer with a good explanation, or, if there is no single right answer and it depends on the situation, explain how I can decide which approach suits my case?

.env files are recommended because you may have different endpoints depending on the environment, that is to say, whether you are running the dev server with "npm run serve" or building for production with "npm run build". With .env config files the values become environment variables, so you don't need to hard-code them into your app; it's simply the most practical thing to do. With Vue CLI 3 you would have:
//.env.development
VUE_APP_BASEURL = "http://localhost:8000"
And in your app you can access it with:
process.env.VUE_APP_BASEURL
What I usually do is keep the base URL in a variable and concatenate the rest:
const BASE_URL = process.env.VUE_APP_BASEURL
this.$http.post(BASE_URL + '/api/languages/', params)
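If you would rather not concatenate the base URL in every call, another common option is a small shared axios instance with baseURL read from the environment variable. A minimal sketch (the src/api.js filename is just an example, not something from the question):
// src/api.js
import axios from 'axios'

// One pre-configured client for the whole app; the base URL comes from
// .env.development / .env.production via Vue CLI (VUE_APP_BASEURL).
const api = axios.create({
  baseURL: process.env.VUE_APP_BASEURL
})

export default api
A component can then call api.post('/api/languages', params) without ever touching the base URL, and switching environments only changes the .env file.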

Related

Hiding Storyblok API-Key

I'm using Next.js with Storyblok and recently made use of the react-next-boilerplate.
I noticed that they put the preview token in _app.js, so they essentially publish it:
storyblokInit({
  accessToken: "your-preview-token",
  use: [apiPlugin],
  components,
});
If I use an environment variable instead, which isn't available on the client, I get the error
You need to provide an access token to interact with Storyblok API
in the client. That's because (I think) my components use StoryblokComponent, which makes use of the global Storyblok state. So I wonder:
Should I ignore this error, as I don't plan to interact with the Storyblok API other than using it for component rendering (all the data comes from the server, as far as I understand the concept of static site generation), and component rendering seems to be still working?
Should I just publish the preview token?
Should I create two tokens, one for the server and one for the client?
Setting the token to process.env.STORYBLOK_API_KEY || "NULL" (where "NULL" can be anything except the empty string) also works (no more errors) but seems like a weird solution.
I don't really understand why they combine these two things, component rendering and data fetching, in the same function.
I would use a .env.local file and populate it with:
STORYBLOK_API_KEY=your-preview-token
To use the environment variable inside _app.js you have to pass it to next.config.js like this:
module.exports = {
  env: {
    STORYBLOK_API_KEY: process.env.STORYBLOK_API_KEY,
  }
}
Source: https://nextjs.org/docs/basic-features/environment-variables
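With that in place, the storyblokInit call from the question can read the token from the environment instead of hard-coding it. A minimal sketch, assuming the same @storyblok/react setup as in the boilerplate:
// _app.js
import { storyblokInit, apiPlugin } from "@storyblok/react";
import { components } from "../components"; // wherever the boilerplate keeps its component map (assumption)

storyblokInit({
  accessToken: process.env.STORYBLOK_API_KEY, // injected at build time via next.config.js
  use: [apiPlugin],
  components,
});
Keep in mind that anything exposed through the env key of next.config.js still ends up in the client bundle, so this makes the token available to the app rather than hiding it from the browser.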

How to transition away from the inline editor on Actions on Google

In a previous Stack Overflow question, I shied away from using an external webhook on Actions on Google
so I needed to go back to the inline editor. I got that worked out, but now I'm feeling brave again.
I've outgrown the inline editor and want the ability to develop my code on my laptop, test it in Firebase, and publish it to a site for my webhook, presumably wherever the inline code editor publishes to. In fact, I have already written the required functions and deployed them from Firebase. So the full functionality is ready to go; I just need to hook it up properly to Actions on Google.
What I have now in Actions on Google, in the inline editor, is more of a stub. I want to merge that stub into the more full-blown logic that I have in Firebase. Here is what is in the inline editor:
const { conversation } = require('@assistant/conversation');
const functions = require('firebase-functions');
const app = conversation();

app.handle('intent_a_handler', conv => {
  // Implement your code here
  conv.add("Here I am in intent A");
});

app.handle('intent_b_handler', conv => {
  // Implement your code here
  conv.add("Here I am in intent B");
});

exports.ActionsOnGoogleFulfillment = functions.https.onRequest(app);
When I search on the Internet, I see discussion from the point of view of Dialogflow, but as I say, I'm in "Actions on Google". I want to transition away from the inline editor, taking what I already have as a basis. Can someone explain how I set that up? I'm happy to do this within the context of the Google ecosystem.
To test your own webhook locally on your own system, I would recommend incorporating a web app framework such as Express. With Express you can host code on your local machine and make it respond to requests from Actions on Google. In your case this would replace all the code related to the Firebase Functions package. Here is an example of what a simple webhook for Actions on Google looks like:
const express = require('express');
const bodyParser = require('body-parser');
const { conversation } = require('@assistant/conversation');

const exprs = express();
exprs.use(bodyParser.json()); // allows Express to work with JSON requests

const app = conversation();

app.handle('example intent', () => {
  // Do something
});

// More app.handle() setups

exprs.post('/', app);
exprs.listen(3000);
With this setup you should be able to run your own application locally. The only thing you need to do is install the required dependencies and add your own intent handlers for your action. At this point you have a webhook running on your own machine, but that isn't enough to use it as a webhook in Actions on Google because it runs locally and isn't publicly available via the internet.
For this reason we will be using a tool called ngrok. With ngrok you can create a public HTTPS address that forwards all traffic to your local machine, so you can use the ngrok address as your webhook URL. From then on you can make as many code changes as you want, and Actions on Google will automatically pick up the latest changes while you develop. No need to upload and wait for Firebase to deploy each time.
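For reference, once ngrok is installed, exposing the Express app above (listening on port 3000) is typically a single command, and the HTTPS forwarding URL it prints is what you enter as the webhook URL in the Actions console:
ngrok http 3000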
Just to be clear: ngrok should only be used for development. When you are done developing your action, you should upload all your code to a cloud service or host it on your own server if you have one. A (free plan) ngrok URL usually expires every 6 hours, so it's not a suitable solution for anything other than development.

Security of secrets added to next.config.js

We are working on adding Auth0 to our Next.js website and are referencing this example.
What I am wondering about is the settings in next.config.js in the example. It puts the Auth0 and other secrets in the client (via Webpack). Doesn't this put these secrets at risk? Since they are somewhere in the client code, there is a chance that a request can be made to access the secrets.
Examples in this Auth0 article also puts the secrets in the client.
I haven't had much luck finding out how Webpack handles the variables and am looking to the community to shed some light on this. We are trying to ensure our pattern is safe before putting it in to place.
From the example, secrets being added to the client-side next.config.js:
const dotenv = require('dotenv')
dotenv.config()

module.exports = {
  env: {
    AUTH0_DOMAIN: process.env.AUTH0_DOMAIN,
    AUTH0_CLIENT_ID: process.env.AUTH0_CLIENT_ID,
    AUTH0_CLIENT_SECRET: process.env.AUTH0_CLIENT_SECRET,
    AUTH0_SCOPE: 'openid profile',
    REDIRECT_URI:
      process.env.REDIRECT_URI || 'http://localhost:3000/api/callback',
    POST_LOGOUT_REDIRECT_URI:
      process.env.POST_LOGOUT_REDIRECT_URI || 'http://localhost:3000/',
    SESSION_COOKIE_SECRET: process.env.SESSION_COOKIE_SECRET,
    SESSION_COOKIE_LIFETIME: 7200, // 2 hours
  },
}
Update - Next v9.4:
Since Next.js v9.4, only environment variables prefixed with NEXT_PUBLIC_ are exposed to the browser.
For more info, read this
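To illustrate the convention (the variable names below are placeholders, not from the example): in an .env.local file, a NEXT_PUBLIC_-prefixed variable is inlined into the client bundle, while an unprefixed one stays on the server:
# .env.local

# Exposed to the browser (inlined into the client bundle):
NEXT_PUBLIC_AUTH0_DOMAIN=example.eu.auth0.com

# Server-only; never shipped to the client:
AUTH0_CLIENT_SECRET=keep-this-on-the-server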
Original answer:
DON'T put any secret env variables in a place that is accessible to the client.
I'm not sure of everything Next.js does with this env property, but it essentially configures webpack's DefinePlugin, which replaces every usage of process.env.VAR with its value at build time.
This means that your secrets will end up inside bundles that are public.
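As a rough illustration of what that substitution means (the auth.init call and the values here are made up for the example):
// Source code as you wrote it:
auth.init({ clientSecret: process.env.AUTH0_CLIENT_SECRET })

// What ends up in the public client bundle after the build:
auth.init({ clientSecret: "my-real-secret-value" })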
To confirm that the value is exposed in the client: open dev-tools, open the console drawer by pressing Esc, click on the Search tab, and enter your secret key. It will turn up in one of the bundles.

Automatically generate dynamic Sitemap.xml for Nuxt VueJs Firestore App

I have a site similar to Stack Overflow where users can create a post (or question) which gets its own URL and should be SEO-optimized. Therefore I need to include these dynamic pages in my sitemap.xml. I would like to find an automatic way to insert each dynamic URL into my sitemap when the page is initially created.
Hoping not to reinvent the wheel, I found the sitemap module for Nuxt; however, the example they use for dynamic pages is statically written, so I am not sure what good that does.
I am having a hard time even conceptualizing how to set this up and what is possible with the current infrastructure. Can Firestore functions update source code and redeploy, or are there any Firebase hosting features to help? Could/should I set up a cron job to run every night that first runs a script to query Firestore and update the sitemap file on a local computer, then automatically deploys it to Firebase from the command line? Any script examples?
Tech used: Vue.js, Node.js, Nuxt (SSR), Firestore (DB and hosting), and Express
This is how I did it. Hope this helps. Please share if you managed to get a different solution.
I used npm install @nuxtjs/sitemap
Website here - @nuxtjs/sitemap
In nuxt.config.js:
var routes = []
var allUsers = [{ 'username': 'username' }] // Getting users as an Array

for (var i = 0; i < allUsers.length; i++) {
  var routeObject = {
    'url': '/profile/' + allUsers[i].username
  }
  routes.push(routeObject);
}

module.exports = {
  sitemap: {
    path: '/sitemap.xml',
    hostname: 'Your hostname here',
    cacheTime: 1000 * 60 * 15,
    gzip: true,
    generate: false,
    routes: routes
  }
}
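If the goal is to pull the routes from Firestore at build/generate time rather than hard-coding them, the sitemap module also accepts a function for routes. A rough sketch, assuming a posts collection with a slug field and firebase-admin credentials available at build time (both are assumptions; check your own schema and the module docs for your version):
// nuxt.config.js
const admin = require('firebase-admin')

if (!admin.apps.length) {
  // Relies on GOOGLE_APPLICATION_CREDENTIALS (or another admin credential setup)
  admin.initializeApp()
}

module.exports = {
  modules: ['@nuxtjs/sitemap'],
  sitemap: {
    hostname: 'https://example.com',
    routes: async () => {
      const snapshot = await admin.firestore().collection('posts').get()
      return snapshot.docs.map(doc => '/post/' + doc.data().slug)
    }
  }
}
Because the routes are resolved when the sitemap is generated, rebuilding or regenerating the site (for example from a scheduled job) is what picks up newly created posts.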

How do I access Request Parameters in Meteor?

I am planning to use Meteor for a real-time logging application for various applications.
My requirement is pretty simple: I will pass a log message as a request parameter (POST or GET) from various applications, and Meteor simply needs to update a collection.
I need to access the request parameters in Meteor server code and update a Mongo collection with the incoming logMessage. I cannot update the Mongo collection directly from the existing applications, so please no replies suggesting that. I want to know how I can do it from the Meteor framework, without adding more packages.
EDIT: Updated to use Iron Router, the successor to Meteor Router.
Install Iron Router and define a server-side route:
Router.map(function () {
  this.route('foo', {
    where: 'server',
    action: function () {
      doSomethingWithParams(this.request.query);
    }
  });
});
So for a request like http://yoursite.com/foo?q=somequery&src=somesource, the variable this.request.query in the function above would be { q: 'somequery', src: 'somesource' } and therefore you can request individual parameters via this.request.query.q and this.request.query.src and the like. I've only tested GET requests, but POST and other request types should work identically; this works as of Meteor 0.7.0.1. Make sure you put this code inside a Meteor.isServer block or in a file in the /server folder in your project.
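Applied to the logging use case in the question, the same pattern might look like the sketch below (the Logs collection and the message parameter name are assumptions, not something from the question):
// server/log_route.js
Logs = new Meteor.Collection('logs'); // collection that receives the incoming log messages

Router.map(function () {
  this.route('log', {
    where: 'server',
    action: function () {
      // e.g. http://yoursite.com/log?message=something+broke
      Logs.insert({
        message: this.request.query.message,
        createdAt: new Date()
      });
      this.response.writeHead(200);
      this.response.end('ok');
    }
  });
});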
Original Post:
Use Meteorite to install Meteor Router and define a server-side route:
Meteor.Router.add('/foo', function() {
  doSomethingWithParams(this.request.query);
});
So for a request like http://yoursite.com/foo?q=somequery&src=somesource, the variable this.request.query in the function above would be { q: 'somequery', src: 'somesource' } and therefore you can request individual parameters via this.request.query.q and this.request.query.src and the like. I've only tested GET requests, but POST and other request types should work identically; this works as of Meteor 0.6.2.1. Make sure you put this code inside a Meteor.isServer block or in a file in the /server folder in your project.
I know the questioner doesn't want to add packages, but I think that using Meteorite to install Meteor Router seems to me a more future-proof way to implement this as compared to accessing internal undocumented Meteor objects like __meteor_bootstrap__. When the Package API is finalized in a future version of Meteor, the process of installing Meteor Router will become easier (no need for Meteorite) but nothing else is likely to change and your code would probably continue to work without requiring modification.
I found a workaround to add a router to the Meteor application to handle custom requests.
It uses the connect router middleware, which ships with Meteor. No extra dependencies!
Put this before/outside Meteor.startup on the server. (CoffeeScript)
SomeCollection = new Collection("...")

Fiber = __meteor_bootstrap__.require("fibers")
connect = __meteor_bootstrap__.require('connect')
app = __meteor_bootstrap__.app

router = connect.middleware.router (route) ->
  route.get '/foo', (req, res) ->
    Fiber () ->
      SomeCollection.insert(...)
    .run()
    res.writeHead(200)
    res.end()

app.use(router)
Use IronRouter; it's so easy:
var path = IronLocation.path();
As things stand, there isn't support for server side routing or specific actions on the server side when URLs are hit. So it's not easy to do what you want. Here are some suggestions.
You can probably achieve what you want by borrowing techniques that are used by the oauth2 package on the auth branch: https://github.com/meteor/meteor/blob/auth/packages/accounts-oauth2-helper/oauth2_server.js#L100-109
However this isn't really supported so I'm not certain it's a good idea.
Your other applications could actually update the collections using DDP. This is probably easier than it sounds.
You could use an intermediate application which accepts POST/GET requests and talks to your meteor server using DDP. This is probably the technically easiest thing to do.
Maybe this one will help you?
http://docs.meteor.com/#meteor_http_post

Resources