Can an MSIX package use an external file for user settings?

We are evaluating migrating our current client/server application to .NET Core. The 3.0 release added the WinForms support we need for our client, but ClickOnce will not be supported.
Our solution is installed on-premises, and we need to include settings (among others) such as the address of the application server. Today we dynamically generate ClickOnce packages that install and update the clients and include those settings, and it works like a charm: users install the client from the ClickOnce package, and every time we update the software we regenerate the packages at the customer's site, so the clients automatically get the new version with the right settings.
We are looking at MSIX as an alternative, but we have a question:
- Is it possible to add some external settings files to the MSIX package that will be used (deployed) when installing?
The package for the software itself could be statically generated, but how could we distribute the settings to the clients on first install / update?

MSIX has support for modification packages. This is close to what you want: the customization is done with a separate package, installed after the main MSIX package of your app.
A modification package cannot be installed until your main app is installed. When you try to install it, the OS checks whether the main app is present on the machine and rejects the installation if it is not found.
The modification package is a standalone package, installed in a separate location. Check the link I included; it has a screenshot of a PowerShell window showing that the install paths of the main package and the modification package are different.
At runtime (when the user launches the app) the OS knows these two packages are connected and merges their virtual file and registry systems, so the app "believes" all the resources are in one package.
This means you can update the main app and the modification package separately, and deploy them as you wish.
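For illustration, a minimal PowerShell sketch of that install sequence (the package file names are hypothetical):

# Install the main app first; the OS rejects the modification package
# if the main package is not already present on the machine.
Add-AppxPackage -Path .\Contoso.Client.msix
Add-AppxPackage -Path .\Contoso.Client.Settings.msix

# Both packages are listed separately, each with its own InstallLocation,
# even though the app sees their contents merged at runtime.
Get-AppxPackage -Name "Contoso.Client*" | Select-Object Name, InstallLocation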
And if we update the modification package itself (without touching the main package), will it be re-installed on all the clients that use it?
How do you deploy the updates? Do you want to use an auto-updater tool over the internet? Or are these users managed inside an internal company network, getting all their app updates from tools like SCCM?
Modification packages were designed mostly for IT departments to use, and this is what I understood you would need too.
A modification package is deployed via SCCM or other tools just like the main package; there are no differences.
For ISVs I believe optional packages are a better solution.

Related

Can you recommend how to manage package dependencies for apps hosted on Shiny Server?

I am hosting my Shiny apps via Shiny Server on AWS. The apps I am hosting depend on a package I am actively developing.
I want app1 to use a stable version of the package I am developing (i.e., to be fixed to a certain GitHub release). But I want app2 to follow the latest updates (i.e., a version I reinstall on the server every time I want to test something).
Usually, when I install the latest version of my package, this updates things globally, so app1 would be affected. But how could I have a fixed system environment for app1 and a different environment for app2 in this context? What is a good workflow to achieve this?
I was able to achieve this using adisarid's answer here:
Using renv: log into the production server and go to the app's directory. Then run R, and use renv::restore() followed by renv::isolate().
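As a minimal sketch of that workflow, assuming the app's directory is an renv project with a committed lockfile:

# on the production server, from the app's directory, inside R:
renv::restore()   # recreate the project library from renv.lock, pinning the recorded versions
renv::isolate()   # copy packages from the shared renv cache into the project library,
                  # so later global installs cannot affect this app anymore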

Drupal 9 in an airgap or without composer

Does anyone have any experience with Drupal 9 either without Composer or in an air gap? Basically we're trying to run it on an air-gapped server. Composer obviously wants to access the internet to check for and download packages.
You'll need to run composer to install your packages and create your autoload files to make it all work.
You could create your own local package repository and store the packages you need there; however, this would be a large undertaking given all the dependencies Drupal core and contrib modules use. You'd need to manage them all yourself and keep your local versions synced with the public versions, especially for security updates.
If you need to do that anyway, you're better off just using the public repos.
Documentation on composer repos is here:
https://getcomposer.org/doc/05-repositories.md
Near the bottom it shows how to disable the default packagist repo:
https://getcomposer.org/doc/05-repositories.md#disabling-packagist-org
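For illustration, the equivalent commands, assuming a hypothetical internal repository URL:

# register a local/internal composer repository for the project:
composer config repositories.internal composer https://composer.repo.internal

# disable the default packagist.org repository:
composer config repo.packagist false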
A much simpler alternative is to do development in a non-air-gapped environment, so you have access to the packages you need and can run Composer commands to install everything. Then, once your code is in the state you need, copy it to your air-gapped server to run. After composer install has run, nothing else is required; just make sure you include the vendor directory with all your dependencies, as well as Drupal core and the contrib modules.
The server you run your Drupal instance on does not even require Composer to be installed.
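A rough sketch of that workflow (the host and paths are hypothetical):

# in the connected development environment:
composer install --no-dev --optimize-autoloader

# copy the whole codebase, including vendor/, core, and contrib,
# to the air-gapped server; Composer is not needed there:
rsync -av --exclude='.git' ./ user@airgapped-host:/var/www/drupal/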

Local NPM/Atmosphere package repositories for Meteor applications without internet access

I'm currently working on a fork of the Meteor application Rocket Chat. I have a requirement to stand up the application for testing and development on an isolated network, so no internet access whatsoever.
I can't just get it running on a connected system and then copy it wholesale into the disconnected lab. Rather, I need to be able to check out a copy of the source code (from a local SCM) and then run Meteor, letting it perform all necessary compilation and dependency resolution on the fly.
Even though it is a huge kludge, I was hoping that I could just copy the .meteor folder from a working system directly onto the target system so that it would already have a cache of all required packages and therefore not need to reach out to any repositories. However, from what I have found, that only works for Meteor dependencies downloaded from Atmosphere.
Within Rocket Chat, there are several private packages (such as rocketchat-ldap) that have dependencies on NPM packages (in this case, ldapjs). When the application is run and these packages are built, the .npm folder in the user's home directory gets populated with those NPM packages. So, I tried to package that folder up along with the .meteor folder to accomplish the same task.
Unfortunately, when I tested it on the offline system, despite the populated .npm folder, Meteor spat out the following error:
While building package rocketchat:ldap:
error: Can't install npm dependencies. Are you connected to the internet?
Obviously, I'm not connected - by design.
So, I am currently looking into Sinopia to stand up an NPM repository mirror on our local network that can host these dependencies. However, I have no idea how I'm supposed to point Meteor to the alternate server. The Meteor documentation includes information about the Npm.depends and Npm.requires directives, which the application uses, but I can't find anything about specifying a URL from which to find said packages.
Further, is it possible to do something similar with the Atmosphere packages? Or is copying the .meteor folder the only way? As in, is there some application out there that I can use to host some of the Meteor packages? Or am I going about this in the wrong way?
The solution I went with, which isn't as elegant as I'd hoped, was the following:
First, I copied the .meteor folder from the user account of a "working" system (this contains the Meteor executable and all of the Meteor packages downloaded from Atmosphere) to the user account of the disconnected target system. This allowed the target system to run Meteor.
Second, the NPM packages in question were being downloaded directly into the private packages in the source, but the .gitignore file on the source was set to ignore the node_modules folders. So I altered that and then checked those node_modules folders into the source with the rest of the application.
So, for example, the application source included a /packages/rocketchat-ldap/.npm/package folder. Then, when the application was run using meteor, the associated NPM packages (such as ldapjs) would get downloaded directly into a node_modules folder in that folder structure, at which point the private packages could be built.
Now, the source code in Git already contains those downloaded packages, so when a copy is checked out onto the disconnected target system, there is no need to download them.
Fortunately, this did not increase the size of the source very much (just a few hundred kilobytes).
The result is that when running meteor to run the application on the target system, all dependencies are already in place, and no internet connection is required.
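Roughly, the steps looked like this (the user names, host, and paths are hypothetical):

# copy the Meteor tool and its Atmosphere package cache from the
# "working" system to the disconnected one (e.g. over the internal network):
rsync -av ~/.meteor/ user@offline-box:~/.meteor/

# in the application source, after removing the node_modules entry
# from .gitignore, commit the downloaded npm packages:
git add packages/rocketchat-ldap/.npm/package/node_modules
git commit -m "Vendor npm dependencies for offline builds"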

Should I be worried about 3rd-party packages accessing my settings.json?

I just saw a package that, in order to run properly, asks you to put something in the public section of settings.json. That made me wonder whether the rest of the information there (sometimes sensitive, like AWS keys) is accessible as well.
So, should I be worried about this, or does Meteor hide this information from packages?
Any package you install from any package manager, including NPM, Ruby Gems, and the Meteor package server, can run arbitrary code on your computer as your user, including using the fs module to read and write files, accessing the network to send and receive data, etc.
In fact, you place the same trust in the developer whenever you install an application from the internet - almost any application on your computer could read your settings.json file, for example Dropbox, Chrome, etc.
Therefore, there is no way to completely secure the settings.json file from package code. The only way to be sure that packages are safe is to use only community-approved packages or read the source code of the packages you are using.
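For example, nothing stops a package's server-side code from doing something like this (the file path is hypothetical):

// any package code runs with your privileges and can read arbitrary files,
// including the settings file you deploy with:
var fs = Npm.require('fs');
var raw = fs.readFileSync('/home/you/app/settings.json', 'utf8');
// or simply read the settings Meteor has already loaded:
var settings = JSON.parse(process.env.METEOR_SETTINGS || '{}');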

Package management running R on a shared server

Some background: I'm a fairly new sysadmin maintaining the server for our department. The server houses several VMs, mostly Ubuntu SE 12.04, usually with a separate VM per project.
One of the tools we use is R with RStudio, also the server edition. I've set this up so everyone can access it through their browser, but I'm still wondering what the best way would be to handle package management. Ideally, I would have one folder/library with our "common" packages, which are used across many projects and use cases. I would administer this library, since I'm the only sudo user. My colleagues should be able to add packages on a case-by-case basis in their "personal" R folders, which get checked as a fallback in case a certain package is not available in our main library.
My question has a few parts:
- Is this actually a viable way to set this up?
- How would I configure this?
- Is there a way to easily automate this library for use in other VMs?
I have a similar question pertaining to Python, but maybe I should make a new question for that.
R supports multiple libraries for packages by default. Libraries are basically just folders in which installed packages are placed.
You can use
.libPaths()
in R to view which paths are used as libraries on your system. On my Ubuntu 13.10 system, there are:
- a personal library at "~/R/x86_64-pc-linux-gnu-library/3.0", where packages installed by the user are placed,
- "/usr/lib/R/library", where packages installed via apt-get are placed, and
- "/usr/lib/R/site-library", a system-wide library for packages that are shared by all users.
You can add additional libraries to R, but from how I understand your question, installing packages to /usr/lib/R/site-library might be what you are looking for. This can be achieved relatively easily by running R as root and calling install.packages() and update.packages() from there as usual. However, running R as root is a security risk and not a good idea, so it may be better to create a separate user with write access to /usr/lib/R/site-library and use that one instead of root.
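For example, one way to set that up (the group and user names are hypothetical):

# create a group that owns the site-wide library and add your user to it:
sudo groupadd r-site-admins
sudo usermod -aG r-site-admins alice
sudo chown -R root:r-site-admins /usr/lib/R/site-library
sudo chmod -R g+w /usr/lib/R/site-library

# then install shared packages from a normal, non-root R session:
# install.packages("somepackage", lib = "/usr/lib/R/site-library")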
If you mount /usr/lib/R/site-library on multiple VMs, they should also share the packages installed there. Does this answer your question?
Having common library and personal library locations is completely feasible.
Each user should have two environment variables set. R_LIBS should point to the common library, and R_LIBS_USER should point to their personal location. See ?.Library for more information.
You can check a user's library paths using .libPaths(). You probably want users to install packages to their personal library, so some fiddling may be required to make sure that the personal library is the first element of .libPaths().
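A minimal sketch of that setup (the paths are hypothetical):

# in each user's ~/.Renviron:
#   R_LIBS=/opt/R/common-library
#   R_LIBS_USER=~/R/personal-library

# R puts R_LIBS entries before R_LIBS_USER on the search path, so in
# ~/.Rprofile you can reorder it to put the personal library first:
.libPaths(c(path.expand(Sys.getenv("R_LIBS_USER")), .libPaths()))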
