Sublime Text SFTP plugin: How to set automatic selection of a remote directory? - sftp

How can I set up automatic selection of a remote directory based on the location of a local file?
Settings: "remote_path": "/site.ua/www/"
For example, the file abs.php is located on the local server at /www/php/abs.php, but the plugin uploads every file to the root directory on the remote server, i.e. /site.ua/www/abs.php.
I need the files to be uploaded into the corresponding subdirectories.
UPD: Meanwhile, JS and CSS files are automatically placed into the correct remote directories according to their location on the local server.
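For reference, the plugin reads its settings from an sftp-config.json file, and remote_path maps to the folder that contains that file: files keep their paths relative to it. So placing sftp-config.json at the local root /www (rather than in a subfolder) should make /www/php/abs.php upload to /site.ua/www/php/abs.php. A minimal sketch, using the paths from the question (host and user are illustrative):

```json
{
    "type": "sftp",
    "host": "site.ua",
    "user": "username",
    "remote_path": "/site.ua/www/",
    "upload_on_save": true
}
```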

Related

Publish profile copy file to remote directory outside of destination root VS2019

I'm trying to accomplish what seems like a simple task, but I'm having issues with the publish profile (pubxml) syntax to make it work.
I'm using a web publish profile to push changes to our remote staging server, and I'm trying to include a global config file that's shared across our apps into a directory on that staging server that is outside of the publish destination root folder.
Here's a basic sample folder layout of the source and destination:
Source file:
C:\dev\webapp1\Global.config
This is just a basic web config file in the project root.
The destination folders for our web apps would be:
C:\websites\Config
C:\websites\App1
C:\websites\App2
C:\websites\App3
App1 is the project with the publish profile I'm working with, so when I publish it needs to place the Global.config file into the websites\Config directory on the remote server.
So far I have:
<ItemGroup>
  <ResolvedFileToPublish Include="Global.config">
    <RelativePath>../Config/Global.config</RelativePath>
  </ResolvedFileToPublish>
</ItemGroup>
I was hoping that ../Config would back up one directory from the destination root and place the file in the remote server's Config folder, but it's being placed in the destination root folder (App1) instead.
Anyone know if this is possible?
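One possible workaround, sketched here but not verified against VS2019: since ResolvedFileToPublish items are kept under the publish root, copy the file in a custom target that runs after the publish step instead. The target name is made up, and the exact property holding the destination path ($(PublishUrl) below) may differ depending on the publish method, so treat these as assumptions:

```xml
<Target Name="CopyGlobalConfig" AfterTargets="WebPublish">
  <!-- $(PublishUrl) is assumed to hold the destination root, e.g. C:\websites\App1 -->
  <Copy SourceFiles="$(MSBuildProjectDirectory)\Global.config"
        DestinationFolder="$(PublishUrl)\..\Config" />
</Target>
```

Note this only works for file-system-style publishes where the destination is reachable as a path; for Web Deploy the file would need to be handled on the server side.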

How to restore Apache configuration file in Bitnami

When running the bncert-tool on my LightSail server, I have accidentally modified some of Apache's configuration files which I now need to revert from the backup directories.
See previous question and answer here for more info: Modified Bncert command has taken site offline
I have looked in both the /opt/bitnami/apache2/conf and /opt/bitnami/apache2/conf/bitnami directories and can see a series of files including httpd.conf.back.202101220056 (/conf) and bitnami.conf.back.202101220056 (/bitnami).
My question is: which backup files do I need to copy to which location?
I assume it is performed via a 'mv' command.
Many thanks for your help.
On a Bitnami stack, the main Apache server configuration file is httpd.conf.
SSL for the server is usually configured in the file bitnami.conf.
I would start by replacing the current bitnami.conf with the original bitnami.conf, then restart your server so that the changes take effect. bitnami.conf is located in the directory /opt/bitnami/apache2/conf/bitnami.
If that does not fix it, replace the current httpd.conf with the original httpd.conf and restart your server again. httpd.conf is located in the directory /opt/bitnami/apache2/conf.
Note that you will find it a lot easier to work with server files if you connect to your server with FileZilla; you can delete files and drag-and-drop to copy them.
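Assuming the default Bitnami layout and the backup timestamps from the question, the restore steps above could look like this on the command line (using cp rather than mv so the backups are preserved; ctlscript.sh is the standard Bitnami service control script):

```shell
# Restore bitnami.conf first, keeping a copy of the modified file
cd /opt/bitnami/apache2/conf/bitnami
sudo cp bitnami.conf bitnami.conf.broken
sudo cp bitnami.conf.back.202101220056 bitnami.conf
sudo /opt/bitnami/ctlscript.sh restart apache

# If the site is still down, restore httpd.conf the same way
cd /opt/bitnami/apache2/conf
sudo cp httpd.conf httpd.conf.broken
sudo cp httpd.conf.back.202101220056 httpd.conf
sudo /opt/bitnami/ctlscript.sh restart apache
```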

Configuring Web Server in Google Cloud Compute Engine

I have a Dash application in a Compute Engine instance that I'm looking to view in my browser. HTTP and HTTPS traffic is enabled and the instance has a static IP address. The Apache server works: when I first set it up, the default index page located at /var/www/html showed up at the browser address http://EXTERNAL_IP_OF_VM_INSTANCE
From what I've seen elsewhere, web application files tend to be stored in the /var/www directory and the index.html file is present as the default page. However I have a single app.py file that I want to run which is located in the /home/user directory, so not in any particular web directory.
I run the app by entering python3 app.py and it does run:
Running on http://127.0.0.1:8050/ (Press CTRL+C to quit)
However, going to the instance's external IP address (34.89.0.xx) in my browser doesn't show the app; instead it shows text from an old 'hello world' application I made previously, which I thought I had deleted but is still showing up.
Part of the server configuration file apache2.conf is below:
The sites-available folder contains two files, 000-default.conf and default-ssl.conf, both with /var/www/html as the DocumentRoot. 000-default.conf is also in the sites-enabled folder, and is the only file there.
I tried changing the DocumentRoot in these files to /home/user, where the app.py file is, which didn't work; then I tried moving the file to the web directory /var/www, which didn't work either.
Does anyone have any suggestions on how to fix this so that I can see my application in the browser?
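Note that the app is listening on 127.0.0.1:8050, which is only reachable from inside the VM; changing DocumentRoot won't help, because Apache serves static files from there and cannot "run" app.py. A common approach is to leave DocumentRoot alone and have Apache reverse-proxy requests to the Dash app instead. A sketch for 000-default.conf, assuming mod_proxy and mod_proxy_http are enabled (sudo a2enmod proxy proxy_http):

```apache
<VirtualHost *:80>
    # Forward all incoming requests to the Dash app on port 8050
    ProxyPreserveHost On
    ProxyPass / http://127.0.0.1:8050/
    ProxyPassReverse / http://127.0.0.1:8050/
</VirtualHost>
```

After reloading Apache, the app (not the old index page) should answer at the external IP, as long as app.py is actually running.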

How can I access files outside of my webroot when using Vagrant?

I have a directory that contains the following structure:
phpinfo.php
adminer.php
drupal/
Vagrantfile
bootstrap.sh (config file)
index.html (on-boarding information for site-builder, etc.)
My synced folder is drupal (mapped to /var/www/html), but I also want to access phpinfo.php and adminer.php.
A hostname, webapp.dev, is also set up and mapped to this new Vagrant guest.
I could make the overall directory the synced folder, but I don't want to create clutter or have to access the site at webapp.dev/drupal.
How can I access both the drupal site as web root but still run the various tools? Is it possible to create an additional virtual host and synced directory that maps to the containing folder structure?
You can configure another synced folder. I'm doing this for certs that should be kept above webroot. Here's an excerpt from my Vagrantfile.
config.vm.synced_folder "./public_html", "/vagrant",
  id: "vagrant-root",
  owner: "vagrant",
  group: "www-data",
  mount_options: ["dmode=775,fmode=664"]
config.vm.synced_folder "./certs", "/certs",
  id: "certs"
Note you have to use a separate id for each folder.
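Applied to the layout in the question, a sketch could sync the containing folder (with phpinfo.php and adminer.php) to a path outside the webroot; the guest path and id here are assumptions:

```ruby
# Sync the parent directory next to the webroot, outside /var/www/html
config.vm.synced_folder ".", "/var/www/tools",
  id: "tools"
```

An Apache directive such as `Alias /tools /var/www/tools`, added via bootstrap.sh, would then make the scripts reachable at webapp.dev/tools without cluttering the Drupal docroot.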

NGINX + PHP-FPM tmp dir

I need to know where uploaded files are stored when a user uploads a file such as an image through PHP. Are the files written directly to the destination directory by the script, or are they uploaded to a tmp directory first? In that case, it would be nice if the tmp directory were mounted with the noexec and nosuid flags. With PHP-FPM and NGINX, is this necessary? When I list the contents of the /tmp directory while uploading a file, the directory shows as empty.
PS: The script is running as the user that owns the directory. I changed the tmp_upload_dir variable and when a file is uploaded, the directory is still empty.
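The empty /tmp is expected: PHP stages each upload in a temporary file that exists only for the duration of the request and is deleted automatically unless the script moves it, so listing /tmp between requests shows nothing. A quick sketch to check where uploads are staged (the ini option is named upload_tmp_dir; when it is empty, PHP falls back to the system temp dir):

```php
<?php
// Where PHP stages uploads; "" means the system default (usually /tmp)
var_dump(ini_get('upload_tmp_dir'));
var_dump(sys_get_temp_dir());

// During a POST request the staged file is at $_FILES['file']['tmp_name'];
// it vanishes when the request ends unless the script moves it, e.g.:
// move_uploaded_file($_FILES['file']['tmp_name'], '/path/to/uploads/image.png');
```

Since the staged file is never executed by the web server, mounting the temp directory with noexec and nosuid is a reasonable hardening step but not strictly required by NGINX or PHP-FPM.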
