Upgrade Airflow image version on Google Cloud Composer - airflow

When I first created an instance on GC Composer, it was set to Airflow 1.9.0. Since then, images 1.10.0 and 1.10.1 have become available for new Composer instances, but I haven't found a way to upgrade an existing Composer instance!
What is the best way to upgrade?

Image version is a non-editable property of a Cloud Composer environment.
The only way is to back up your files (the DAGs bucket and anything in your Cloud Shell environment), then delete and rebuild the environment: select the version you need and copy the files back.
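A minimal sketch of that backup-and-recreate flow with the gcloud and gsutil CLIs; the environment name, location, bucket names, and image version string are placeholders to replace with your own, and you should check that your gcloud version supports the --image-version flag:
# Back up the DAGs bucket of the existing environment (bucket name is hypothetical).
gsutil -m cp -r gs://us-central1-my-env-1234-bucket/dags ./dags-backup
# Delete the old environment and recreate it with the newer image.
gcloud composer environments delete my-env --location us-central1
gcloud composer environments create my-env \
    --location us-central1 \
    --image-version composer-1.5.0-airflow-1.10.1
# Copy the DAGs back into the new environment's bucket.
gsutil -m cp -r ./dags-backup/* gs://us-central1-my-env-5678-bucket/dags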

Related

Migrating Data from Nexus2 to Nexus3 via Upgrade Agent

Based on the documentation, to upgrade Nexus2 to Nexus3 we can use the Upgrade Agent, but now I am wondering whether it can also be used for data migration. My use case: I already have a Nexus3 instance with data inside; for another project we are using Nexus2, and we now want to move its data into that Nexus3. Would migrating this way cause configuration issues or override blobs in Nexus3?
Has anyone tried it for migrating data from one instance to an existing instance that already has data inside?
In the end I had to come up with this solution, as I am using Nexus OSS.
First, download the target repository from Nexus 2:
wget --user user --password pass --recursive --no-parent http://NEXUS2-URL/nexus/content/repositories/maven-releases/
Then I used this library to import them:
https://github.com/AlexLiue/nexus-repository-import-scripts (it supports NuGet, Maven, and NPM)
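If the import scripts don't fit your setup, a rough alternative is to push each downloaded artifact to Nexus 3 over HTTP, since a hosted Maven repository accepts plain PUTs to the artifact path. A hedged sketch, assuming a hosted repository named maven-releases and placeholder credentials:
# Walk the tree fetched by wget and PUT each file to the same
# path in the Nexus 3 hosted repository (skipping wget's index files).
cd NEXUS2-URL/nexus/content/repositories/maven-releases/
find . -type f ! -name 'index.html*' | while read -r f; do
  curl -u admin:admin123 --upload-file "$f" \
    "http://NEXUS3-URL/repository/maven-releases/${f#./}"
done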
I would check repository import; maybe that solves your problem.

How to automate wp instance setup + theme/plugin customization with docker?

I want to create a Docker image for WordPress that produces a ready-to-use WP instance: installation already completed, theme and plugins activated, and other configuration applied.
I was exploring WP-CLI to do that as part of the Dockerfile build, but it turned out to be a stateful process (it requires an existing MySQL database to already be in place), which is impossible during docker build.
I'm thinking about creating a bash script which brings up a Docker WP-CLI container and a plain ubuntu/php container. The WP-CLI container would then install and set up the WP instance on the ubuntu/php container, which would become the de facto WP container; finally, the script would shut the WP-CLI container down.
I'm not sure if this is a good solution. Do you have better advice for this use case?
It's a broad and opinion-based question. I think you basically need to figure out three things:
1. Find and choose a Composer template to manage your dependencies, for example https://roots.io/bedrock/
2. WP-CLI commands. I see you are already at it.
3. Find a way to export and import your WordPress config, for example https://wordpress.org/plugins/wp-cfm/
A basic setup routine could then look something like this:
# Download WordPress core and plugins.
composer install -n --no-dev
# Default DB name and user.
wp config create --dbname="test" --dbuser="root" --dbpass="" --dbhost="127.0.0.1"
# Install a basic site. The title doesn't matter.
wp core install --url="blog.localhost" --title="Lorem ipsum dolor sit amet" --admin_user="admin" --admin_password="admin" --admin_email="webmaster@example.com" --skip-email
# Language.
wp language core install de_DE
# Activate WP-CFM and import default conf.
wp plugin activate wp-cfm
wp config pull default
You don't necessarily need Composer; you can download WordPress and plugins with just WP-CLI as well. The advantage of Composer is the versioning, and that you only need a composer.json and .lock file to keep your repo clean: a single composer install then downloads everything for you and your colleagues. Another big advantage is that vendor/ can be cached during builds, keyed for example on the composer.lock checksum, which can shorten build times a lot.
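For illustration, a hedged sketch of pulling plugins and themes in as Composer dependencies via the wpackagist.org mirror (the plugin and theme names here are just examples; Bedrock gives you this structure out of the box):
# Register WordPress Packagist as an additional Composer repository.
composer config repositories.wpackagist composer https://wpackagist.org
# Require plugins and themes as versioned dependencies.
composer require wpackagist-plugin/wp-cfm
composer require wpackagist-theme/twentytwentyone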
Alternatively to the sample routine above, you could also try to import an existing database backup and then just update the base URL and import the updated config. That would skip the WordPress install step and give you dummy content to, say, target automated tests at.
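That import variant could look roughly like this (the dump file name and URLs are placeholders):
# Import an existing dump, rewrite the base URL, re-import the config.
wp db import backup.sql
wp search-replace "https://old.example.com" "http://blog.localhost" --skip-columns=guid
wp config pull default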

Drupal 9 in an airgap or without composer

Does anyone have any experience with Drupal 9, either without Composer or in an airgap? Basically we're trying to run it on an airgapped server, and Composer obviously wants to access the internet to check for and download packages.
You'll need to run composer to install your packages and create your autoload files to make it all work.
You could create your own local package repository and store the packages you need there, however this would be a large undertaking given all the dependencies Drupal Core and contrib modules use. You'd need to manage them all yourself, and keep your local versions synced with the public versions, especially for security updates.
If you need to do that anyway, you're better off just using the public repos.
Documentation on composer repos is here:
https://getcomposer.org/doc/05-repositories.md
Near the bottom it shows how to disable the default packagist repo:
https://getcomposer.org/doc/05-repositories.md#disabling-packagist-org
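In Composer CLI terms that amounts to roughly this (the internal repository URL is hypothetical):
# Point Composer at an internal mirror of the packages you need...
composer config repositories.internal composer https://packages.internal.example
# ...and disable the default packagist.org repository.
composer config repo.packagist false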
A much simpler alternative is to do development in a non-airgapped environment, so you have access to the packages you need and can run Composer commands to install everything. Then, once your code is in the state you need, copy it to your airgapped server to run; after composer install has run, nothing else is required. Just make sure you include the vendor directory with all your dependencies, as well as Drupal core and contrib modules.
The server you run your Drupal instance on does not even require Composer to be installed.
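The copy step can be as simple as building locally and shipping the whole tree, vendor/ included (host and path here are placeholders):
# Build with production dependencies only, then sync to the airgapped box.
composer install --no-dev --optimize-autoloader
rsync -az --exclude='.git' ./ deploy@airgapped-host:/var/www/drupal/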

How do you push updates to a deployed meteor app that has a filesystem?

I have an app running on my own DigitalOcean VM that I'm playing around with to figure out how to run a Meteor production server. I deployed it with meteor build, but now I'm a bit unsure about how to push updates. If I build a new tarball on my own machine, I will lose the file references my users have made to files in bundle/uploads, because the remote filesystem isn't incorporated into my local project. I can imagine some hacky ways to work around this, but besides hosting the files on S3 or another third-party server, is there any way to "hot code push" into the deployed app without needing to move files around on my server?
Am I crazy for wondering what the meteor equivalent of git push/pull is in production, or just ignorant?
You can use dokku (https://github.com/progrium/dokku). DigitalOcean allows you to create an instance pre-installed with dokku too.
Once you've set up your SSH keys and set the environment variables ROOT_URL, PORT and MONGO_URL, you can add that server as a git remote and simply git push to it.
Dokku will automatically build up the Meteor app and have it running, and keep it up to date whenever you git push.
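Roughly, assuming a Dokku app named myapp on a droplet reachable at example.com (all names here are placeholders):
# On the droplet: set the environment variables the Meteor bundle needs.
dokku config:set myapp ROOT_URL=https://myapp.example.com \
    MONGO_URL=mongodb://localhost:27017/myapp PORT=5000
# Locally: add the Dokku remote and deploy with a plain git push.
git remote add dokku dokku@example.com:myapp
git push dokku master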
I find Dokku very convenient. There are also flynn and deis, which can do the same in a multi-tenant environment with many more options.
Just one thing to keep in mind: push the people who own the repo to keep the Node version in the buildpack up to date. Meteor is a bit overzealous when it comes to requiring the latest version of Node and refusing older ones.
Meteor does lack a bit in this department. I can't remember where I heard this, but I believe they intend to add this very popular Meteor deployment package to their library. Short of switching to a more compatible host, I'm not aware of any better solutions.

How to update (some files of) a Symfony2 project on a remote host

I've been developing with Symfony 1.4 till now and had no problem deploying a project or updating it on a remote host. I just used sfFtpPlugin and everything was perfect: http://www.symfony-project.org/plugins/sfFtpPlugin
But now I'm starting with Symfony2 (2.2.0), and the first question I have is: how do I update it when I make changes?
For the initial deploy I know there are some options: upload the full project by FTP, or use Maestro (e.g. offered with ServerGrove.com hosting). With those tools I can upload everything, but when I just need to update, say, 50 files, I can't do that manually by FTP, of course.
Thanks everyone for helping!
P.S.: Additional info: I have some SVN knowledge and started learning Git a few days ago.
The documentation on this is fantastic. The Cookbook provides workflows for both Git and SVN.
http://symfony.com/doc/current/cookbook/workflow/index.html
If you have no shell available, you can use Composer on your local machine to update your project and then FTP the entire project over.
This covers how to store settings for different environments:
http://symfony.com/doc/current/cookbook/configuration/environments.html
Personally I use a private Satis repo for deploying all my code.
That way I never have to use FTP, just composer create-project/install/update.
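For the curious, a rough sketch of standing up such a Satis repo (package sources and URLs are placeholders; see https://github.com/composer/satis for the real options):
# Install Satis, describe the packages to mirror, and build a static repo.
composer create-project composer/satis --stability=dev
cat > satis.json <<'EOF'
{
    "name": "my/private-repo",
    "homepage": "https://packages.example.com",
    "repositories": [
        { "type": "vcs", "url": "git@github.com:acme/my-project.git" }
    ],
    "require-all": true
}
EOF
php satis/bin/satis build satis.json public/
# Serve public/ over HTTP and point each project's composer.json at it.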
