Problems with sudo mv file /non_existing_folder - file disappeared - Unix

I was trying to move a RubyOnRails.txt file into a /RUBY directory, but for some reason I typed:
mv RubyOnRails.txt /Ruby
And I got this error: mv: cannot move ‘RubyOnRails.txt’ to ‘/Ruby’: Permission denied
Obviously, I typed: sudo mv RubyOnRails.txt /Ruby
And then I realized my mistake: the folder wasn't /Ruby, it was /RUBY.
Now the file is gone, and I can't find it anywhere.
Is there some way to find it or recover it?

The file isn't gone; it was simply renamed to Ruby, and it now sits in the root directory of your system, /.
You can still move it wherever you want, like this: sudo mv /Ruby your_destination
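For instance, to put it where it was originally intended (using the /RUBY directory and file name from the question):
sudo mv /Ruby /RUBY/RubyOnRails.txt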

Related

I'm unable to delete this folder

Okay, I recently installed snapd on my system, but I later decided to uninstall it. Even after uninstalling it, the folder /snap/ remains. I initially tried to delete the folder using the
sudo rm -r snap
command, but I keep getting an error message saying the folder is read-only. I ran the
ls -ld snap
command to see the permissions for that folder and the output I got was
drwxr-xr-x 1 root root 370 Jan 24 19:02 snap
Would anyone be so kind as to tell me how I would go about deleting that folder? Thanks.
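One thing worth checking first (an educated guess, since snap packages are mounted as read-only squashfs images, which can make paths under /snap appear read-only) is whether anything is still mounted there:
# List any snap-related mounts still active
mount | grep /snap
# Unmount whatever shows up (the path below is hypothetical), then retry
sudo umount /snap/core/1234
sudo rm -r /snap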

Why does changing LD_LIBRARY_PATH have no effect in Ubuntu?

I was trying to deploy my application on Ubuntu 16.04, so I made a package with the following hierarchy -
Package
|
----bin
    |
    -----application
    -----application.sh
    -----Qt
         |
         -----necessary qt libraries
         -----platforms
Here is the application.sh file -
#!/bin/sh
export LD_LIBRARY_PATH=`pwd`/Qt
./application
When I execute the application.sh file, it tells me it can't find the libQt5MultimediaWidgets.so.5 file, even though it's in the Qt folder. Also, when I run ldd application from the application.sh file after exporting LD_LIBRARY_PATH, it gives me the following output (screenshot omitted) -
Please check the marked parts. Can anyone please explain why the libraries from the Qt folder are not found even after exporting LD_LIBRARY_PATH?
Edit:
As suggested by @Zang, I have checked the debug log, and here it is (screenshot omitted) -
Please check the marked parts.
It seems like it's actually trying the actual libQt5MultimediaWidgets.so and then reporting that it's unable to find it. Can anyone please help me understand what's happening here?
Edit-2: As per the suggestion from @Tarun, I ran ls -al on my Qt folder. Here is the output (screenshot omitted) -
All the files in your Qt directory are actually symlinks to non-existent files in the same directory, which is why they cannot be found.
If you look at the output of your ls -al, you can see these are soft links. Your softlink libQt5MultimediaWidgets.so.5 points to libQt5MultimediaWidgets.so.5.9.2 in the same directory, and that file is not there at all. So you need to either set the correct softlink target or put the file in the same directory.
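For example, a quick way to inspect and repair such a link from inside the Qt folder (the /path/to/real/... below is a placeholder for wherever the actual library lives):
# Show what the symlink currently points to
ls -l libQt5MultimediaWidgets.so.5
# Either copy the real file next to the link...
cp /path/to/real/libQt5MultimediaWidgets.so.5.9.2 .
# ...or recreate the link so it points at an existing file
ln -sf /path/to/real/libQt5MultimediaWidgets.so.5.9.2 libQt5MultimediaWidgets.so.5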
First
Could it be that the pwd is not where you assume it is?
You could try adding
# Figure out where the application.sh script is located
scriptpath="$( cd "$(dirname "$0")" ; pwd -P )"
# Make sure our pwd is that location
cd "$scriptpath"
at the top of your script (assumes bash shell, from here)
By doing this, all relative paths to the Qt folder will be valid.
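Put together, a revised application.sh might look like this (a sketch combining the cd fix above with the original script):
#!/bin/bash
# Resolve the directory this script lives in, and cd there so
# relative paths are interpreted from the script's own location
scriptpath="$( cd "$(dirname "$0")" ; pwd -P )"
cd "$scriptpath"
# Point the dynamic linker at the bundled Qt libraries
LD_LIBRARY_PATH="$scriptpath/Qt"
export LD_LIBRARY_PATH
./application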
Second
Maybe you should consider exporting your new LD_LIBRARY_PATH, like so (from here):
LD_LIBRARY_PATH=whatever
export LD_LIBRARY_PATH
Third
It may be useful to run the ldconfig command so that ld is updated after changing the variable (from here):
sudo ldconfig
The file libQt5MultimediaWidgets.so is not present in /Desktop/package/bin/Qt according to the screenshots shown.

Drupal 7 Install Error - The directory sites/default/files does not exist

I am attempting to install Drupal 7.34 on RHEL and I continue to run into issues with permissions on sites/default/files. I've searched all over for a solution, but nothing has helped.
Here are the steps I am taking (with root access):
In /var/www/html I execute: drush dl drupal to download Drupal.
I then follow Drupal's install instructions (from /var/www/html):
mv drupal-7.34/* ./
mv drupal-7.34/.htaccess ./
mv drupal-7.34/.gitignore ./
cp sites/default/default.settings.php sites/default/settings.php
chmod a+w sites/default/settings.php
chmod a+w sites/default
cd ..
chown -R apache:apache html
In the browser, I navigate to http://myhost/install.php. In the "Verify requirements" step of the install process I receive the following error:
The directory sites/default/files does not exist.
So, I then take the following steps:
mkdir html/sites/default/files
chmod a+w html/sites/default/files
chown apache:apache html/sites/default/files
When I attempt the install process I now get the following error:
The directory sites/default/files is not writable.
What am I missing here? The sites/default/files directory exists and is writable. Any guidance is much appreciated.
The solution I applied was more of a work-around: I ended up using Drush to handle the entire installation, rather than using it only to download Drupal and configuring manually from there.
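For reference, a fully Drush-driven install looks something like this (the database URL and site name are placeholders):
drush dl drupal-7.34
drush site-install standard --db-url=mysql://dbuser:dbpass@localhost/drupal --site-name="My Site" -y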
I still don't know the answer to this "simple" error, but I do know that before adding users or granting full permissions to a group, one must know the name of the user running (owning) httpd. This is not always www-data.
Also - my sites/default/files -is- in fact writable, as is the case for just about everyone who posts this question. There is something seriously wrong with Drupal's install if this issue is so prevalent and yet not addressed adequately by the code maintainers. I searched about twenty responses to this "very simple" problem and still none of the suggestions worked. I opened up permissions entirely and chowned the Drupal installation files to the httpd daemon user (apache) and group (www in my case).
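One quick way to see which user the web server actually runs as (the process name varies between httpd and apache2 depending on the distribution):
ps -ef | egrep '(httpd|apache2)' | grep -v grep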
These fixed it for me:
chmod 777 sites/default
chmod 777 sites/default/settings.php
It turns out the 'chmod a+w...' commands in the docs were not enough - the 777 also sets 'x', which a directory needs before it can be entered, in addition to making the items writable.
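Note that 777 is very permissive, especially on settings.php. Drupal's documentation recommends tightening permissions again once the install completes; something along these lines (exact modes depend on your site policy):
chmod 755 sites/default
chmod 644 sites/default/settings.php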

When extracting a tar archive, I get the error "No such file or directory found"

When I attempt to extract a huge tar archive, I get the following error:
"filename: No such file or directory found"
Any suggestions on what could be going wrong?
This may happen if the disk is full. If you extract using:
tar -xvf <filename.tar>
you may see the following message before any of the No such file or directory found errors:
mkdir failed: Disk quota exceeded
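To confirm, check the free space on the target filesystem and, on systems with quotas enabled, your quota:
df -h .
quota -s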
Why don't you test your tar file first?
file yourfile.tar
(it should say it's a tar file if it's not broken)
Then...
tar -tvf yourfile.tar
It should give a listing of the contents of your tar file without actually writing anything to disk - just to check its integrity.
Also, if your file is larger than 2 GB, it is possible that your tar binary won't work; try gtar instead!
With that info, you can go further...
Regards,
Daniel.

Fatal error: cannot mkdir R_TempDir

When attempting to run R, I get this error:
Fatal error: cannot mkdir R_TempDir
I found two possible fixes for this problem by googling around. The first was to ensure my tmp directory didn't contain a load of subdirectories - it doesn't, and it's virtually empty. The second fix was to ensure that TMP, TMPDIR, and R_USER in my environment weren't set to non-existent paths - I didn't even have these set. Therefore, I created a tmp directory in my home directory and added its path to TMP in my environment. I was able to run R once, and then I got the fatal error again. Nothing was in the TMP directory that I set in my environment. Does anyone know what else I can try? Thanks.
Dirk is right, but misses a point: If /tmp is full, you can't create subdirectories there. Try
df /tmp
I just hit this on a shared server, where /tmp is mounted on its own partition and is shared by many users. In this particular case, you can't really see whose fault it is, because permissions restrict you from seeing who is filling up the tmp partition. You basically have to ask the sysadmins to figure it out.
Your default temporary directory appears to have the wrong permissions. Here I have
$ ls -ld /tmp
drwxrwxrwt 22 root root 4096 2011-06-10 09:17 /tmp
The key part is that 'everybody' can read or write. You need that too. It certainly can contain subdirectories.
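If your /tmp does not look like that, the standard permissions (world-writable with the sticky bit, shown as the trailing 't') can be restored with:
sudo chmod 1777 /tmp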
Are you running something like AppArmor or SE Linux?
Edit 2011-07-21: As someone just deemed it necessary to downvote this answer -- help(tempfile) is very clear on what values tmpdir (the default directory for temporary files or directories) tries:
By default, 'tmpdir' will be the directory given by 'tempdir()'. This
will be a subdirectory of the temporary directory found by the
following rule. The environment variables 'TMPDIR', 'TMP' and 'TEMP'
are checked in turn and the first found which points to a writable
directory is used: if none succeeds '/tmp' is used.
So my money is on checking those three environment variables. But AppArmor and SELinux have also been shown to be an issue on some distributions.
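A quick way to check all three variables from the shell you launch R from:
env | egrep '^(TMPDIR|TMP|TEMP)='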
Go to your user directory, create a file called .Renviron, add the following line, save it, and reopen RStudio, Rgui, or Rterm:
TMP = '<path to folder where Everyone has full control>'
This worked for me on Windows 7.
If you are running one of the rocker docker images (e.g., rocker/verse), you need to map a local directory to the /tmp directory in the container. For example,
docker run --rm -v ${PWD}/tmp:/tmp -p 8787:8787 -e PASSWORD=password rocker/verse:4.0.4
where ${PWD} for me is ~/devProjs/r, and I created a tmp directory inside it, so that the container's /tmp is mapped to my ~/devProjs/r/tmp directory.
Just had this issue and finally solved it. It was simply a Windows permission issue. Go to the environment variables and find the location of the temp folders. Then right-click the folder > Properties > Security > Advanced > change Everyone to Full control > tick "Replace all child object permission entries with inheritable permission entries from this object" > OK > OK.
This will also happen when your computer is completely, utterly out of space. Currently, my Mac has 0 KB free, and it's causing this error. Freeing up some space solved the problem.
Check which user account you are launching RStudio with, then check the TMP system environment variable for its location. If the user launching RStudio has write access to those directories, you will not face this issue. Since you are facing it, all you have to do is change the permissions so that the user has write access to those directories.
Running R on a CentOS system, I had the same issue. I had to remove all the R folders from the tmp directory. Usually these folders are named in the form /tmp/Rtmp*****, so I deleted them from /tmp by running the following:
cd /tmp
rm -rf Rtmp*
The R shell worked for me afterwards.
I had this issue, and my solution was slightly different. I run R on a Linux server, and it turned out that R had made a whole load of tempdirs when running cron jobs that had hung and not been cleaned up, clogging up the root /tmp directory with ~300 RtmpXXXXXX folders.
Using terminal access, I navigated to the /tmp folder did a recursive find/rm - deleting all of them using this command:
find . -type d -name 'Rtmp*' -exec rm -r -v {} \;
After this, RStudio took a while to load up, but it was once again happy, and my scripts began to run again.
You will need the appropriate admin rights for this solution. And always be careful when running rm -r, especially with a find command, as it's easy to remove things unexpectedly.
When it comes to deleting tmp files, first establish whether the files are on the server or local.
If they are on the remote server, first run df /tmp there to see what is using the most storage.
Then use rm <file_name> to remove the files causing the blockage; for files under /tmp, that is rm /tmp/<file_name>.
Moreover, you can also refer to https://support.rstudio.com/hc/en-us/articles/218730228-Resetting-a-user-s-state-on-RStudio-Server
