Importing Deno third-party dependencies in production - deno

Deno does not use a package manager like npm; it imports third-party dependencies directly via URL. Let's look at an example:
import { Application } from "https://deno.land/x/abc@v1.0.0-rc8/mod.ts";
Does the deployed code in production contain the content of https://deno.land/x/abc@v1.0.0-rc8/mod.ts, or does the server in production have to send a request to that URL to fetch the third-party code?

For production, Deno recommends committing your dependencies to git. If you follow that recommendation, your server won't need to download anything, since everything will already be cached.
To do that, you have to set the DENO_DIR environment variable to specify where you want the dependencies to be downloaded:
DENO_DIR=$PWD/vendor deno cache server.ts
# DENO_DIR=$PWD/vendor deno run server.ts
With the above command, all dependencies of server.ts will be downloaded into your project, inside the vendor/ directory, which you can commit to git.
Then, on the production server, you'll have to set DENO_DIR to read from vendor/ instead of the default path, which can be obtained by issuing:
deno info
If you don't store the dependencies in your version control system, Deno will download them once and store them in the DENO_DIR directory.
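If you want the production server to fail fast rather than fall back to the network when something is missing from vendor/, Deno's --cached-only flag can be added to the run command (an addition on top of the answer above):
DENO_DIR=$PWD/vendor deno run --cached-only server.ts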
Taken from the Deno manual:
But what if the host of the URL goes down? The source won't be available.
This, like the above, is a problem faced by any remote dependency system. Relying on external servers is convenient for development but brittle in production. Production software should always vendor its dependencies. In Node this is done by checking node_modules into source control. In Deno this is done by pointing $DENO_DIR to some project-local directory at runtime, and similarly checking that into source control:
# Download the dependencies.
DENO_DIR=./deno_dir deno cache src/deps.ts
# Make sure the variable is set for any command which invokes the cache.
DENO_DIR=./deno_dir deno test src
# Check the directory into source control.
git add -u deno_dir
git commit
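The src/deps.ts referenced in the manual's commands is just an ordinary module that re-exports your remote dependencies, so caching that one file warms the whole dependency graph. A minimal sketch, reusing the abc import from the question (which names you re-export depends on your app):
// deps.ts - single place that lists all third-party imports (a common Deno convention)
// `deno cache deps.ts` downloads everything listed here into DENO_DIR.
export { Application } from "https://deno.land/x/abc@v1.0.0-rc8/mod.ts";
Application code then imports from ./deps.ts instead of from the remote URLs directly, so every version is pinned in one file.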

Related

Nette - importing naja library

I have a question. I need to make some modifications to our Nette application (my first time working with the framework). I need to deploy the Naja library via FTP, after downloading the code from GitHub. I do not have access to the server or a console, so installing via Composer, npm, etc. is not possible.
This is what I did:
Downloaded all the files from https://github.com/jiripudil/Naja/releases (not the ZIPs) and put them in the vendor directory
Loaded it in app/presenters/templates/@layout.latte with:
<script src="{$basePath}/js/vendor/Naja.js" type="module"></script>
Tried to initialize it in one of my JavaScript files with:
document.addEventListener('DOMContentLoaded', naja.initialize() );
But then an error pops up, saying that naja is not defined. What am I missing?
Naja is written in modern JavaScript; it can be initialized only via an ES6+ import, for instance:
import naja from 'naja';
and then the event listener that initializes Naja can be added after that code. It can be compiled via webpack like this: https://github.com/MinecordNetwork/Website/blob/master/webpack.config.js
You can do it all on your local machine with npm and yarn installed, and then upload the built bundle.js to the server. The mentioned repository uses this approach as well, so you can check out how it is done and which libraries are needed. To compile the code from /public/js/main.js, run yarn encore production, or yarn build for development.
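Putting those pieces together, the entry point you bundle could look roughly like this (a minimal sketch; it assumes naja was installed locally via npm/yarn so the bare 'naja' import resolves):
// main.js - compiled by webpack into bundle.js, which is what gets uploaded
import naja from 'naja';
// Pass a callback here; writing naja.initialize() with parentheses inside
// addEventListener would call it immediately, before the DOM is ready.
document.addEventListener('DOMContentLoaded', () => naja.initialize());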

ignore dev dependencies in php composer

I have developed a Composer-based Laravel project that I need to install on a remote production server. The problem is that I have limited permissions/access, so my option is to "archive" the package (using composer archive) and unpack it on production.
Which folders do I need to archive, and how can I exclude the dev dependencies of the package as well as the dev dependencies of the vendor packages?
composer archive is unlikely to help you, because this command creates an archive of a defined version of a single package.
You probably want to upload the whole working application, not just one package. You should create a little script that builds the archive file for you; it should (see the sketch after this list):
check out the application from the repository into a new directory
run composer install --no-dev to install all required dependencies without dev-dependencies
optionally delete files that are not necessary on the server, like documentation, the .git folder, and other stuff
create the archive file from all these files
optionally upload that archive to the target server and unarchive there
optionally check basic functions and switch to the new uploaded version on the server
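A minimal sketch of such a script (the repository URL, the deleted paths, and the scp target are placeholders):
#!/bin/sh
set -e
# 1. Check out the application into a fresh directory.
git clone https://example.com/your/app.git build
cd build
# 2. Install required dependencies without dev-dependencies.
composer install --no-dev
# 3. Optionally delete files not needed on the server.
rm -rf .git docs tests
# 4. Create the archive from what is left.
cd ..
tar -czf app.tar.gz build
# 5. Optionally upload it; unpack on the server afterwards.
scp app.tar.gz user@production:/path/to/releases/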

OpenShift Custom Cartridge and NPM

I am working with a community-developed OpenShift cartridge for nginx. The cartridge's build script (without any modifications) works well; it starts the nginx server with the configuration file that I provide it. However, I am trying to modify the build script so that it first changes directory into my OpenShift repository, runs npm install and then grunt build to build an Angular application that I have created.
When I do this, I continuously get the error EACCES, mkdir '/var/lib/openshift/xxxxxxxxxx/.npm' when the script gets to npm install. Some OpenShift forum posts have attempted to solve the issue, but it appears as though a different solution is required (at least in my case).
Thus, I am interested in whether or not it is possible to use npm in this way, or if I need to create a cartridge that does all of this myself.
Since we do not typically have the access required to create ~/.npm, we have to find a way of moving the npm cache (normally ~/.npm) and the npm user configuration (normally ~/.npmrc) to accessible folders to get things going. The following information comes partially from a bug report that I submitted to Red Hat on this matter.
We must begin by creating an environment variable to control the location of .npmrc. I created a file (with shell access to my application) called .env in $OPENSHIFT_DATA_DIR. Within this file, I have placed:
export NPM_CONFIG_USERCONFIG=$OPENSHIFT_HOMEDIR/app-root/build-dependencies/.npmrc
This moves .npmrc to a place where we have the privileges to read/write. Naturally, I also have to create the containing directory, $OPENSHIFT_HOMEDIR/app-root/build-dependencies/. Then, in my pre-start webhook/early in my build script, I have placed:
touch $OPENSHIFT_DATA_DIR/.env
This ensures that the environment variable that configures the location of .npmrc will be available each time we deploy/build. Now we can move the location of the npm cache. Start by running touch on the .env file manually, and create the .npm directory in $OPENSHIFT_HOMEDIR/app-root/build-dependencies/. Run the following to complete the reconfiguration:
npm config set cache $OPENSHIFT_HOMEDIR/app-root/build-dependencies/.npm
npm should now be usable on each deploy, even if we are not using the Node.js cartridge. The directory choices above may be changed as desired.
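For reference, the relevant part of a modified cartridge build script could then look like this (a sketch that just combines the steps above; $OPENSHIFT_REPO_DIR is the standard OpenShift variable for the checked-out repository):
#!/bin/bash
# Point npm at the relocated user config for this build.
export NPM_CONFIG_USERCONFIG=$OPENSHIFT_HOMEDIR/app-root/build-dependencies/.npmrc
cd $OPENSHIFT_REPO_DIR
# With the cache reconfigured above, npm no longer tries to create ~/.npm.
npm install
grunt build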
You do not have write access to the ~/.npm directory in your gear. You might try reviewing how the native Node.js cartridge is set up (https://github.com/openshift/origin-server/tree/master/cartridges/openshift-origin-cartridge-nodejs) and see if you can apply it to your custom cartridge.

Deploying binaries from Bamboo to Nexus repository

Firstly, I am new to Nexus, so please bear with me if this is too much of a noob question. Let me first explain how our current build/deployment process works.
HOW WE DO IT AT PRESENT:
We have a Maven-based project: a parent pom.xml and two module pom.xmls, where each child module pom.xml produces a JAR file when built. Currently I do the builds/deployments manually. I check out the code from SVN to my local machine and run mvn clean install. I have created a bash script that bundles the 2 JAR files + a few other resources (present only in the SVN repo, downloaded locally) into a tar.gz file. I then SCP this to the app server and run install scripts that deploy the tar.gz file.
HOW WE WANT TO DO IT:
We plan to automate the build in Bamboo (which I have already done). The built artifact then needs to be uploaded to a Nexus repository (due to security restrictions, the SCP task in Bamboo does not work, since SSH connectivity cannot be established from the Bamboo server to the app server).
MY FIRST HURDLE:
I have created a Bash script task in Bamboo which does the bundling (the 2 JARs from the child module POMs + resources) into a tar.gz. This tar.gz is present at a path a/b/c/d on my Bamboo machine.
How do I upload this tar.gz to the Nexus repository?
MY CONFUSION:
I have read about uploading artifacts to Nexus, but I only understand it for the case where a single jar/ear/war file is created by the build. We want the bundle. If I change settings.xml & pom.xml to configure the upload to Nexus, each JAR file will be uploaded to a separate path in Nexus, and then I have to configure the upload of the resource files (which are not part of the build) separately. Is my understanding correct? Please let me know how to proceed.
Thanks in advance!!!
Use the Maven Assembly Plugin to create an assembly that contains your artifacts and resources; your regular mvn deploy will then deploy it to Nexus.
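A minimal sketch of that setup, assuming the plugin is configured in the parent pom.xml (a dedicated packaging module is the more robust variant) and the descriptor is saved as src/assembly/bundle.xml; directory names are placeholders. The tar.gz is built during the package phase and deployed alongside the JARs by mvn clean deploy:
<!-- in the parent pom.xml -->
<plugin>
  <groupId>org.apache.maven.plugins</groupId>
  <artifactId>maven-assembly-plugin</artifactId>
  <inherited>false</inherited> <!-- run only in the parent, not in each child -->
  <configuration>
    <descriptors>
      <descriptor>src/assembly/bundle.xml</descriptor>
    </descriptors>
  </configuration>
  <executions>
    <execution>
      <phase>package</phase>
      <goals><goal>single</goal></goals>
    </execution>
  </executions>
</plugin>

<!-- src/assembly/bundle.xml -->
<assembly>
  <id>bundle</id>
  <formats>
    <format>tar.gz</format>
  </formats>
  <moduleSets>
    <moduleSet>
      <!-- pull in the JARs built by the two child modules -->
      <useAllReactorProjects>true</useAllReactorProjects>
      <binaries><unpack>false</unpack></binaries>
    </moduleSet>
  </moduleSets>
  <fileSets>
    <fileSet>
      <!-- placeholder: the extra resources kept in SVN -->
      <directory>extra-resources</directory>
      <outputDirectory>resources</outputDirectory>
    </fileSet>
  </fileSets>
</assembly>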

How to specify local npm modules in meteor packages

I have an application which uses Meteor with an npm module, so I have a packages/mymodule/package.js file which contains:
Npm.depends({my_npm_module:"my_npm_module_version"});
Upon launching the Meteor app, my_npm_module will be installed into packages/mymodule/.npm from the npm registry. Now let's say I want to develop my_npm_module on my local machine. How can I force Meteor to use my local directory for my_npm_module, and what should I do when the my_npm_module source is modified?
If you have a local npm module, it may not be necessary to do this.
Simply use var module = Npm.require('<absolute path to npm module>'); anywhere in your server-side code.
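For example, somewhere in the package's server code (the path and the exported function are hypothetical):
// server.js - loads the module straight from the local working copy,
// bypassing the copy Meteor installs into packages/mymodule/.npm
var myNpmModule = Npm.require('/home/me/dev/my_npm_module');
myNpmModule.doSomething();
After editing the module source, restart the Meteor app so the server code picks up the changes.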
