Speed performance: Deno vs Node

I am new to Deno and was looking into how it differs from Node.js. I found that Deno always fetches modules online at run time from https://deno.land/...
Node, by contrast, only uses the internet when modules are installed.
So if the internet is unavailable, or the connection is slow, how can we overcome this issue in Deno?

I found that Deno doesn't need an internet connection once the modules are loaded.
They are cached in the folder you're working in, and it's the same module you'll be using until you pass the --reload flag.
So it's practically the same as Node and how package.json files work.
I think Deno is here to replace Node.js, with its security features being its greatest asset, and that's going to be invaluable given the security threats we constantly face.

The best way, IMO, is to create a deps.js file as part of your project and declare all your imports in it. You can import from there into other files, which gives you a single point to maintain them AND more control over whether internet access will be needed or not.
The first time an import is needed, Deno will fetch it and then keep it local.
If there is no internet, the local copy will be used.
If you want to avoid the internet, do NOT use imports without a version, because Deno will try to get the latest version each time. It cannot know whether the version changed since the last run, and hence will check:
import {isWindows} from "https://deno.land/std/_util/os.ts";
export {isWindows};
Instead, use a version. If Deno sees that that version is present locally, it will use it. There is no ambiguity here: that version is that version.
import {isWindows} from "https://deno.land/std#0.88.0/_util/os.ts";
export {isWindows};
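Any other file in the project then imports from the deps file rather than from the URL directly. A minimal sketch of a consuming module (main.ts is a hypothetical file name; deps.js is the file shown above):

// main.ts - pulls the pinned module through the deps file
import {isWindows} from "./deps.js";

console.log(`running on Windows: ${isWindows}`);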
You can monitor the process in your console window.
Furthermore, if by internet you really mean the public internet, as opposed to just the network (e.g. a LAN), then you can set up a local resource that the imports can be fetched from. That would make you independent of the internet and only dependent on the availability of your local LAN resource. I figure that if that one goes down, your Deno app, if for instance it is a server, cannot be reached anyway.
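As a sketch of that idea, the deps file would then point at the LAN resource instead of deno.land (the host name deno-mirror.lan is hypothetical; any HTTP server that serves the module files at the same paths will do):

// deps.js - re-export the pinned module from a hypothetical LAN mirror
export {isWindows} from "http://deno-mirror.lan/std@0.88.0/_util/os.ts";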

Related

Index files in host and remote machine using PhpStorm

I work on a Symfony project using Vagrant. The host machine runs Windows. Due to the fact that the request time is very high, I decided to install the vendor files inside the VM, while the entire "rest" of the project remains inside the synced folder (project root => /vagrant).
Everything is working fine, and the request time is under 100ms now. But there is one issue left: I have to install the vendor files on my Windows machine first and then again in the VM, otherwise PhpStorm is not able to index the files correctly (I know, this is a logical consequence).
So my question is whether it is possible to host a project on the Windows machine, with the files under, for example, "C:\Users\SampleUser\Project\ProjectX" and the vendor installed under "/home/vagrant/vendor", and have PhpStorm index the files of both directories?
Otherwise I will have to live with it and code completion won't work, or I will have to install the libraries on both machines to improve the request time and have a more or less "good" workflow.
I hope I could explain well enough what my actual problem is.
Thank you very much for your time.
Thank you very much for your time.
Had the same exact problem. Indeed a bummer.
One possible solution is to leave the vendor folder on the VM and manually copy it to your host machine.
Pros:
PHPStorm is able to index files
Cons:
If you add a dependency, you have to copy some parts of the vendor folder manually to the host machine
To those facing the same problem, I might advise SFTP (Tools -> Deployment -> Configuration in PHPStorm) - files can be transferred without leaving the IDE window. The only thing to do is get the VM box password, which is located at
%USERNAME%/.vagrant.d/boxes/your box/box version/virtualbox/Vagrantfile
Second solution: if you are using VirtualBox, you can use vm.synced_folder with type: "virtualbox" (the sync works both ways, host<->guest), and leave the vendor folder in your project (so it syncs all the time).
Pros:
vendor folder always up to date, no manual work
Cons:
Horrible performance (tested myself)
If you want to use non-virtualbox rsync (type: "rsync"), you will not get the ability to sync back from the guest (someone, please correct me if I'm wrong!), so you are left with the 1st solution.
It would be great if we could include the vendor folder directly from the VM (using some kind of rsync/symlink magic) to the "Languages & Frameworks -> PHP -> include path" list, at least when using VirtualBox, but oh well...

Why do we need to deploy a meteor app instead of just starting it?

As we all know, we can run a meteor app by just typing meteor in a terminal.
By default it will start a server and use port 3000.
So why do I need to deploy it using MUP etc.?
I can configure it to use port 80 or use nginx to route to port 80 for the app. So the port is not the point.
Edit:
Assume meteor is running on a VPS or cloud server with public IP address, not a personal computer.
MUP does a few extra things you can do yourself:
it 'bundles' the code into a single file, using meteor build
the JavaScript is one file and the CSS another; they are minified and obfuscated, so they are smaller, faster to load, and less easy to decipher on the client.
some packages are also meant to be removed when running in production. For example Meteor Toys, the utility toolset to look up collections and much more, is not bundled into the production bundle, as per the instructions in its package. This ensures you don't deploy code with security vulnerabilities (Meteor Toys basically opens up client-side deletes / updates etc... if you're not careful)
So, in short, it installs a minimal version of your site, making sure that what's meant for development only doesn't get pushed to a production environment.
EDIT: One other reason to do this is that you don't need all the Meteor build tools on your production server; that can add up to a lot of stuff, especially if you keep caches going for a while...
I believe it also takes care of hooking up to a remote MongoDB instance (at least that used to be the case on the free meteor site), which is more scalable and fault tolerant than running on the same instance as the web server, as well as provisioning storage etc... if needed.
Basically, to deploy a Meteor app yourself manually, you need to:
on your dev box:
run meteor build to bundle your app into a tar file (using the --architecture flag corresponding to the OS of the server)
on the server:
install node v0.10 (or whatever is the current version of node required by Meteor)
you might have to install fibers@1.0.5 (but I believe this is now part of the Meteor install already)
untar the bundle, get into bundle/programs/server/ and run npm install
run the server with node main.js in the bundle folder.
The purpose of deploying an application is that you are situating your project on hardware outside of your local machine. For example, if you deploy an application on Heroku, you create a repository on Heroku's systems, and that code base is used to serve your application off of their servers.
If you just start an application on your personal system, you will suffer from limited network and resource availability, as well as underused compute at off-peak hours, since your system must remain attentive for additional users without having alternative tasks. Hosting providers provide resources as needed, and their diverse client base allows their systems to work around the clock on a global scale.

Proper way to deploy Meteor app without data to custom server

I want to deploy a Meteor app to some custom Linux server. Of course, the currently installed project packages must be preserved on the destination server.
So I need to pack my local project structure, upload it to the server and unpack it there (or something else).
I think I need (or can) at least remove the .meteor/local/* folder content.
What about the .meteor/.id file content? Anything else?
I can't find any documentation explaining how to do this, but given Meteor's philosophy of simple usage, there must be some simple command to pack an application distro.
If you are deploying, you should use the meteor build command (docs), which creates a tarball containing your application.
But to make the setup even simpler, you could use Meteor-up, which takes care of the whole deployment process, even including preparation of the target server.
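For illustration, Meteor-up drives the whole deployment from a single config file. The sketch below shows the general shape of a mup.js; every value is made up, and the exact fields vary between Meteor-up versions, so check its docs before relying on it:

// mup.js - illustrative values only
module.exports = {
  servers: {
    one: { host: '203.0.113.10', username: 'root', pem: '~/.ssh/id_rsa' }
  },
  app: {
    name: 'my-app',
    path: '../my-app',        // path to the Meteor project
    servers: { one: {} },     // deploy to the server defined above
    env: {
      ROOT_URL: 'http://my-app.example.com',
      MONGO_URL: 'mongodb://localhost/meteor'
    }
  },
  mongo: { version: '3.4.1', servers: { one: {} } }  // provision MongoDB on the same box
};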

Should I be worried about 3rd-party packages accessing my settings.json?

I just saw a package that, in order to run properly, asks you to put something in the public section of settings.json. That made me wonder if the rest of the information there (sometimes sensitive, like AWS keys) is accessible as well.
So, should I be worried about this, or does Meteor hide this information from packages?
Any package you install from any package manager including NPM, Ruby Gems, and the Meteor package server can run arbitrary code on your computer as your user, including using the fs module to read and write files, accessing the network to send and receive data, etc.
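To make that concrete, here is a minimal sketch of what any dependency's install script or runtime code could do (plain Node.js fs calls; it assumes it runs from a directory containing settings.json):

// sketch: arbitrary package code reading a Meteor settings file
const { readFileSync } = require('fs');

const settings = JSON.parse(readFileSync('settings.json', 'utf8'));
// nothing here stops the code from sending settings (AWS keys etc.)
// over the network to a third party
console.log(Object.keys(settings));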
In fact, you place the same trust in the developer whenever you install an application from the internet - almost any application on your computer could read your settings.json file, for example Dropbox, Chrome, etc.
Therefore, there is no way to completely secure the settings.json file from package code. The only way to be sure that packages are safe is to use only community-approved packages or read the source code of the packages you are using.

QSqlDatabase (using SQLite) takes a long time to open a database

I have developed an application in Qt which uses an SQLite database. A copy of the database is located on each site.
On one site, let's say site 'BOB1', it works perfectly without any problem. But when we try to use it on another site, let's say 'BOB2', it takes a long time to open a database connection (approx. 2000 milliseconds).
I thought that perhaps there was a network problem, so they tried to use the server of site 'BOB1' as their server, which worked fine. But when I tried to use the server of site 'BOB2' from site 'BOB1', I had the same problem. So I thought it may not be a network issue.
Another thing that came to mind was that perhaps there is a DNS resolution problem. But when I pinged the server using the IP and the hostname, the response time was the same.
Any idea or pointer as to what the problem could be?
PS: The server + database file path is specified in the setDatabasePath() function using environment variables.
Consider copying the database to the local machine (e.g. a temp folder if transient, or another suitable location if permanent). You can safely use either a plain file copy, or consider using the SQLite backup API to ensure that the transfer happens successfully (plus you get the option of progress feedback):
https://sqlite.org/backup.html
You could even "back up" the file from the remote server to an in-memory database if the file is small, since you say you're only reading?
You can see some sample code here on how to import an SQLite DB into a Qt QSqlDatabase. Note that when you do this, you want to make sure the version of the native SQLite API you're using is the same as the one compiled into Qt, or you may get error messages from SQLite or Qt.
