Transfer a file from host filesystem into Dokku app

This should be a simple question, but I can't find the solution.
How can I transfer a file from the local filesystem into my Dokku container?

You can use the official copy-files-to-image plugin to do this.
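For example, a rough sketch of that flow; the plugin repository URL and the directory the plugin watches are assumptions here, so verify both against the plugin's README:

    # On the Dokku host. Plugin URL and file-drop directory are assumptions;
    # check the plugin's README for the canonical values.
    sudo dokku plugin:install https://github.com/<plugin-author>/dokku-copy-files-to-image.git

    # Stage the files you want inside the container:
    mkdir -p /home/dokku/<app>/copy-files-to-image
    cp ./local-config.json /home/dokku/<app>/copy-files-to-image/

    # Rebuild so the files are baked into the app's image:
    dokku ps:rebuild <app>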

Related

Do Git operations on web server or on local filesystem?

I'm new to web development. I have a repo at GitHub where I manage my own theme for WordPress. What is the best way to handle this theme? Is it okay to do the Git operations (clone, pull, push) directly on the web server, or should I sync the theme folder with a local repository and do all the Git operations on my local file system?
It's up to you; there is no single better way. It depends on what you are more comfortable using. Doing the Git operations on the web server executes the same commands you would run in your local folder. Another option is to install a Git client such as TortoiseGit, which is easy to use.
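If you go the local route, the workflow is roughly this (repo names, paths, and the branch name are placeholders):

    # Local machine: clone the theme repo and work there.
    git clone https://github.com/<you>/<your-theme>.git
    cd <your-theme>
    # ...edit files, then commit and push...
    git add -A
    git commit -m "Tweak theme styles"
    git push origin master

    # Web server: pull the update into the live theme folder.
    cd /var/www/html/wp-content/themes/<your-theme>
    git pull origin master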

Do I need to attach storage to Bitnami Wordpress Azure?

I'm unable to find the details in the Bitnami WordPress documentation, but if I use it in Azure, do I also need to set up a managed disk? If yes, how can I configure Bitnami to use the managed disk instead of the OS disk?
Bitnami Engineer here. The WordPress solutions we provide in Azure don't use a managed disk by default. All the components and files are included in the same instance, so if you want to use a different disk, you would need to create that new disk, move the data to it, and attach it to the instance.
Please remember that you will need to continue using /opt/bitnami as the installation directory, so if you mount a new disk, you will need to create a symlink so that /opt/bitnami points to the files on the new disk.
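A rough sketch of that move; the device name, mount point, and the assumption that the disk is already partitioned are all placeholders to adapt:

    # After attaching the managed disk in the Azure portal, on the instance
    # (assumes the new disk shows up as an already-partitioned /dev/sdc1):
    sudo mkfs.ext4 /dev/sdc1            # format the new disk (erases it)
    sudo mkdir /datadisk
    sudo mount /dev/sdc1 /datadisk

    # Stop the stack, copy it to the new disk, and symlink it back:
    sudo /opt/bitnami/ctlscript.sh stop
    sudo cp -a /opt/bitnami /datadisk/bitnami
    sudo mv /opt/bitnami /opt/bitnami.bak
    sudo ln -s /datadisk/bitnami /opt/bitnami
    sudo /opt/bitnami/ctlscript.sh start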
I hope this information helps.

How to migrate containers from a local docker-compose to another host

Good day!
I'm trying to migrate my local wordpress/mariadb containers, created with docker-compose, to another host, most likely a production server.
Here's what I did:
I created a docker-compose file for the wordpress and mariadb containers locally. I then started to populate WordPress content in them.
Use Case:
I want to export and import the containers created through docker-compose, along with their data, to another server.
Please guide me on my problem.
Many thanks.. :-)
Ideally you wouldn't be storing data in the containers; you want to be able to destroy and recreate them at will. If that's what you have, I'd recommend figuring out how to copy the data out of the containers, then deploying them remotely from images. When you redeploy them, mount the data directories on an external volume which will never be destroyed, and repopulate the data there.
If you really want to deploy the containers with the data, then I'd say you want to look at docker commit, which you can use to create images from your existing containers that you can then deploy.
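A minimal sketch of that docker commit route; container and image names are placeholders, and note that docker commit captures the container filesystem but not data stored in volumes:

    # On the source host: snapshot the container and ship the image.
    docker commit my_wordpress my-wordpress-snapshot:latest
    docker save my-wordpress-snapshot:latest | gzip > wp-snapshot.tar.gz
    scp wp-snapshot.tar.gz user@production-host:~

    # On the production host: load and run it.
    gunzip -c wp-snapshot.tar.gz | docker load
    docker run -d --name wordpress my-wordpress-snapshot:latest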
This is solved! :-)
I defined volumes for the mariadb and wordpress services in my Compose file, which created the data directories that I need. I will then tar the docker-compose directory and recreate the setup on my remote server. Thanks for the awesome answer; hats off to you, #lecstor.
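For anyone following along, the migration described above looks roughly like this (project name, paths, and the bind-mount layout in the compose file are assumptions):

    # The compose file uses bind mounts so data lives in the project dir, e.g.:
    #   services:
    #     db:
    #       image: mariadb
    #       volumes:
    #         - ./data/mysql:/var/lib/mysql
    #     wordpress:
    #       image: wordpress
    #       volumes:
    #         - ./data/wp-content:/var/www/html/wp-content

    docker-compose down                    # stop cleanly so DB files are consistent
    tar czf site.tar.gz -C .. myproject    # archive compose file plus data dirs
    scp site.tar.gz user@production-host:~

    # On the remote server:
    tar xzf site.tar.gz
    cd myproject
    docker-compose up -d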

Swisscom CloudFoundry with SSH keys

I'm trying to install WordPress on the Swisscom Cloud Foundry application cloud. To install it I need SSH with private and public key pairs (not cf ssh).
I'm following the steps here:
https://github.com/cloudfoundry-samples/cf-ex-wordpress
Is this possible? What are the correct values for:
SSH_HOST: user@my-ssh-server.name
SSH_PATH: /home/sshfs/remote
It depends on your CF provider. This method of running Wordpress requires a FUSE filesystem (SSHFS) to mount a remote file system over the wp-content directory of your Wordpress install. In recent versions of CF (I can't remember exactly where this changed) you are no longer allowed to use FUSE-based file systems.
Before you spend a lot of time on this, you might want to validate that your provider still allows FUSE. You can validate with the simple test below.
Push any test app to your provider.
cf ssh into the application container.
Check that the sshfs binary is available.
Try using sshfs to mount a remote filesystem (man page | examples).
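Put together, the test looks something like this (app name, remote host, and paths are placeholders):

    cf push fuse-test                 # 1. push any test app
    cf ssh fuse-test                  # 2. shell into the application container

    # Inside the container:
    which sshfs                       # 3. is the sshfs binary available?
    mkdir -p /tmp/remote
    sshfs someuser@host.example.com:/home/sshfs/remote /tmp/remote
    ls /tmp/remote                    # 4. if this lists remote files, FUSE works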
If you can successfully mount a remote filesystem via SSH using the steps above then you should still be able to use the method described in that example application.
If you cannot, the next best option is to use a plugin that allows storing your media on a remote system. Most of these are for S3. Search Google or the WP plugin repo; they're easy enough to find.
There is a better solution on the horizon called Volume Services. You can read more about this here. I have not seen any public CF providers offering volume services though.
What are the correct values for:
SSH_HOST: user@my-ssh-server.name
This should be the user name and host name of your SSH server. This is a server that exists outside of CF. Examples: my-user@192.0.2.10 or some-user@host.example.com. You should be able to ssh <this-value> and connect without entering a password. This is so that the volume can automatically be mounted without user interaction when your app starts.
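If you don't have that yet, key-based auth is the usual way to get it (host and user below are examples):

    ssh-keygen -t rsa -N "" -f ~/.ssh/id_rsa   # key pair with no passphrase
    ssh-copy-id my-user@host.example.com       # install the public key remotely
    ssh my-user@host.example.com               # should log in without a password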
SSH_PATH: /home/sshfs/remote
This is the full path on the remote server where you'd like to store the Wordpress files. In other words, this directory will be mounted as the wp-content directory of your app.
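Once you have both values, you'd set them on the app something like this (the app name is a placeholder):

    cf set-env my-wordpress SSH_HOST my-user@host.example.com
    cf set-env my-wordpress SSH_PATH /home/sshfs/remote
    cf restage my-wordpress    # restage so the new settings take effect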

How to set up OpenShift WordPress on localhost

I am amazed that there is no information on how to set up OpenShift WordPress in a local environment.
They also don't have a full WordPress structure to set it up on localhost.
The information on their site is also not clear on this.
Are there any steps to follow for this?
Have you looked at their GitHub account? They have an OpenShift quickstart for WordPress here:
https://github.com/openshift-quickstart/openshift-wordpress-developer-quickstart
Install the client tools here:
https://developers.openshift.com/en/managing-client-tools.html
UPDATES:
If you are looking to host your own OpenShift, then you should look at OpenShift Origin:
https://docs.openshift.org/latest/getting_started/administrators.html
You can run OpenShift in a virtual machine; see this.
But there is no reason why you shouldn't use a free OpenShift account and build it from your local machine (with Git). It is a hassle at first, but if you master it you won't have to reinvent the wheel when going live.
Edit: focus on getting it to work on OpenShift; there should be a lot of documentation on it. The local environment is just your Git repo running on your local machine.
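That workflow is roughly as follows; the openshift remote URL is a placeholder, and your own account gives you the real one:

    git clone https://github.com/openshift-quickstart/openshift-wordpress-developer-quickstart.git
    cd openshift-wordpress-developer-quickstart
    # ...customize the theme/config locally, then commit...
    git add -A
    git commit -m "Customize theme"

    # Deploy by pushing to the app's Git remote:
    git remote add openshift ssh://<app-id>@<app>-<namespace>.rhcloud.com/~/git/<app>.git/
    git push openshift master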
