How do people usually manage work on a remote server alongside local development? Say I have an EC2 instance running Ubuntu, my dev machine is running OS X, and I have a Symfony2 project on the instance. Do people usually work on files directly on the remote EC2 server? If so, how do they use a text editor such as Sublime Text on an EC2 box?
I cannot say how "people" work, but to me the best practice is the following:
Remote:
Install MySQL & PHP, including all the extensions your project needs (see symfony-project/web/config.php).
You could use this tutorial: http://www.howtoforge.com/perfect-server-ubuntu-10.04-lucid-lynx-ispconfig-3
Create the database for your project on the server
Clone your git repository on the server and define the webhook URLs
Create a file on your server that pulls on each webhook call, e.g. <?php exec('git pull'); ?>
You'll never work on the remote machine directly from now on (except installing vendors via SSH).
Local:
Install a local webserver. I recommend http://php-osx.liip.ch/, but you can also use MAMP (I don't like it).
Clone the git repository into your web folder
Point your Symfony project at the remote database
See your changes locally
Commit & push, and the remote will pull automatically (via the webhook)
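The day-to-day flow then looks something like this (just a sketch; pull.php is a hypothetical name for the webhook file created above, and the webhook registration itself depends on your Git host):
git add -A
git commit -m "Describe your change"
git push origin master
# your Git host now calls http://your-server/pull.php, which runs git pull on the server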
After rebooting a WordPress dev machine (Ubuntu Desktop), it does not boot at all.
I can ping the machine, but cannot access it using SSH. I reckon the SSH daemon doesn't start properly.
Is there a way of recovering the Wordpress site from a dead machine? I have access to the filesystem by attaching it to another machine.
It depends on your Linux distribution, but you can certainly copy the files of your WordPress installation. Usually the files are in /var/www/....
Additionally you need to back up your database. If you have no means to access the machine through SSH, you cannot use the SQL dump capability. You can try recovering the raw database files, which are usually under /var/lib/*sql (for a PostgreSQL database this is /var/lib/pgsql, or /var/lib/mysql for a MySQL database).
Once your database is backed up, you need to use the exact same SQL engine and copy the files into the new database folder (be extra cautious with file permissions).
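Since you can attach the filesystem to another machine, the recovery could look like this (a minimal sketch, assuming MySQL, a standard /var/www layout, and the dead disk mounted at the hypothetical mount point /mnt/dead):
# copy the site files from the attached disk
cp -a /mnt/dead/var/www/. /var/www/
# copy the raw MySQL data files while the local MySQL server is stopped
service mysql stop
cp -a /mnt/dead/var/lib/mysql/. /var/lib/mysql/
chown -R mysql:mysql /var/lib/mysql
service mysql start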
I'm trying to create a local development environment to test website changes in. The web developers I'm working with have a GitLab repo of the website with Auto DevOps, where changes are pushed to the production/stage server when committed.
I downloaded Laragon and was able to serve pages locally, then I cloned the GitLab repo into my web root directory.
How do I get a copy of the production website's database and connect it to my local development environment with Laragon? I tried to duplicate the production database with the All-in-One WP Migration plugin, but there are read/write restrictions.
I was advised the following: "I would recommend copying the database onto the local MariaDB/MySQL server that Laragon provides. You'd need to get a dump of the database from the server using SSH command line tools, and then you can use the phpMyAdmin interface provided by Laragon to upload the dump."
I wasn't able to SSH from the Windows command line, but I could SSH into the server using PuTTY. However, how can I transfer the file to my local Windows machine? If I'm logged into my server, wouldn't the dump just be placed somewhere in the filesystem of the remote machine, so I wouldn't be able to use it with Laragon?
If you are using Laragon Full, you have the ssh and scp commands in Laragon's Terminal, so you can fetch your SQL dump from the remote server easily.
Here's how:
Open Laragon's Terminal:
Menu > Laragon > Terminal
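Next, create the dump on the remote server (a sketch; user, dbuser, your-database, and the paths are placeholders for your own values):
ssh -t user@your-remote-host "mysqldump -u dbuser -p your-database > /path-to-your-sql-dump"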
Run this command:
scp user@your-remote-host:/path-to-your-sql-dump C:/laragon/tmp/your-sql-dump
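If the database doesn't exist locally yet, create it before importing (the name should match what your local config expects):
mysql -u root -p -e 'CREATE DATABASE `your-database`'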
Import your sql-dump to your local database:
mysql -u root -p your-database < C:\laragon\tmp\your-sql-dump
Reference: https://laragon.org/download/migrate-from-xampp.html
That's all.
I now have a working website that I have deployed to IIS on my local machine using Visual Studio 2013. I can access the website successfully from other machines in my office and in neighboring offices. I have even had others check access from geographically remote locations, still successfully. The access method is to put my machine's IP address plus the name of the application, ###.###.###.###/name, into the address bar of a browser.
The next step is to promote it to production. We are not using Azure, so the tutorials for promotion to production I have found aren't useful, nor are we using 3rd party providers.
The method I have conceived is as follows:
Have an admin log in directly to a company server
install the database software
install Visual Studio
install IIS
copy the local machine's db to the server db
copy the locally deployed files to the server
have the admin log in to Visual Studio
deploy the site on the server in the same way it was deployed on the local machine
use the server IP or update the host file on all networked computers to map the IP to an appropriate name (or the local network has a local DNS we can update)
This would give me a DEV environment (the VS IDE), a TEST env (the locally deployed version), and a PROD env (the version deployed on the server).
I don't see any reason why this wouldn't work; it may be a bit tedious, but it seems workable.
Is this method ok? Am I missing anything critical?
No, this is an altogether inappropriate way to push a build to a production environment.
Your source code should be stored in a source code repository. You should have an automated, continuous build server pull from the repository and complete the build in a dedicated environment that is itself under change control. The build should include the generation of installation files, e.g. a ClickOnce deployment package. From there you can have an admin run the deployment package, or, ideally, you'd push it automatically with a tool like Octopus Deploy.
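On the build server, a minimal build step might look like this (a sketch only; the repository URL, solution name, and publish profile are hypothetical, and it assumes MSBuild with a Web Deploy publish profile):
git clone https://your-repo-host/YourApp.git
cd YourApp
msbuild YourApp.sln /p:Configuration=Release /p:DeployOnBuild=true /p:PublishProfile=Production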
The above, honestly, is the bare minimum for a commercial web site. There is much, much more you can do to make things even more robust, e.g. blue-green deployment.
Note that none of this involves installing development tools like Visual Studio on your server. The server should stay as clean as possible, running the fewest applications that you need, to minimize any sort of attack surface and to keep the machine running efficiently.
Don't publish your machine's IP address: someone who extracts it (e.g. by decompiling your application) could brute-force remote access to your machine.
I recently created a droplet on Digital Ocean, and then just used Meteor Up to deploy my site to it.
As awesome as it was to not have to mess with all of the details, I'm feeling a little worried and out of the loop about what's happening with my server.
For example, I was using the console management that Digital Ocean provides, and I tried to use the meteor mongo command to investigate what was happening with my database. It just errored, with command not found: meteor.
I know my database works, since records are persistent across accesses, but it seems like Meteor Up accomplished this without retaining any of the testing and development interfaces I grew used to on my own machine.
What does it actually do? And how can I get a closer look at what's happening behind the scenes?
Meteor Up installs your application to the remote server, but does not install the global meteor command-line utilities.
For those, simply run curl https://install.meteor.com | /bin/sh.
MUP does a few things. Note that MUP is currently under active development, and some of this process will likely change soon. The new version will manage deployment via Docker, add support for meteor build options, and other cool stuff. Notes on the development version (mupx) can be found here: https://github.com/arunoda/meteor-up/tree/mupx.
mup setup installs (depending on your mup.json file) Node, PhantomJS, MongoDB, and stud (for SSL support). It also installs the shell script that sets up your environment variables, as well as your upstart configuration file.
mup deploy runs meteor build on your local machine to package your Meteor app as a bundled and zipped Node app for deployment. It then copies the packaged app to the remote server, unbundles it, installs npm modules, and runs it as a Node app.
Note that meteor build packages your app in production mode rather than the debug mode that runs by default on localhost when you call meteor or meteor run. The next version of MUP will have a buildOptions property in mup.json that you can use to set the debug and mobileSettings options when you deploy.
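In practice, the whole workflow from your project directory is just these two commands (assuming mup.json is already configured):
mup setup    # provisions the server: Node, MongoDB, stud, environment variables, upstart config
mup deploy   # builds locally, uploads the bundle, installs npm modules, restarts the app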
Also, since your app is running directly via Node (rather than Meteor), meteor mongo won't work. Instead, you need to ssh into the remote server and call mongo appName.
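For example (your-droplet-ip is a placeholder for your server's address, and appName is whatever you set in mup.json):
ssh root@your-droplet-ip
mongo appName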
From there, @SLaks is right about how it sets things up on the server (from https://github.com/arunoda/meteor-up#server-setup-details):
This is how Meteor Up will configure the server for you based on the given appName or using "meteor" as default appName. This information will help you customize the server for your needs.
your app lives at /opt/<appName>/app
mup uses upstart with a config file at /etc/init/<appName>.conf
you can start and stop the app with upstart: start <appName> and stop <appName> (see the example after this list)
logs are located at: /var/log/upstart/<appName>.log
MongoDB is installed and bound to the local interface (it cannot be accessed from the outside)
the database is named <appName>
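For example, with appName set to the hypothetical name myapp, day-to-day management on the server looks like:
stop myapp
start myapp
tail -f /var/log/upstart/myapp.log   # follow the app's logs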
What is the proper way to deploy webapps on Heroku? I'm installing Moodle, but the same procedure should apply to e.g. Drupal or WordPress. What I have done is to unzip Moodle locally, then upload it to Heroku using git. When I then visit my site I get the option to install it and select the database, which works fine. The problem is that the install procedure saves information to the filesystem on the server, which gets overwritten the next time I deploy my app. So what is the proper way of doing this?
You have to pre-configure your app with all of the database settings before you deploy to Heroku. So either do a fake "install" in your local environment, or manually edit your PHP config files.
As you've discovered, Heroku's filesystem is not persistent: https://devcenter.heroku.com/articles/dynos#ephemeral-filesystem.
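A sketch of that approach with a hosted database (the add-on name is just an example, the config var name depends on the add-on, and config.php is Moodle's config file):
heroku addons:create cleardb:ignite        # provision a hosted MySQL database
heroku config:get CLEARDB_DATABASE_URL     # copy these credentials into your PHP config
git add config.php
git commit -m "Pre-configure database settings"
git push heroku master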