I am trying to delete the following files from a directory on my Unix machine:
$ ls -la
total 160
... other files ...
-rw-r--r--@ 1 username staff 171 Oct 24 2017 ~$checklist.xlsx
-rw-r--r--@ 1 username staff 171 Oct 16 2017 ~$papers.xlsx
-rw-r--r--@ 1 username staff 162 Sep 4 2017 ~$rec.docx
-rw-r--r--@ 1 username staff 162 Nov 25 21:00 ~$file1.docx
-rw-r--r--@ 1 username staff 162 Nov 25 21:01 ~$file2.docx
However, when I attempt to delete them, every variation I try fails. For example:
$ rm ~$checklist.xlsx
rm: ~.xlsx: No such file or directory
$ rm $checklist.xlsx
rm: .xlsx: No such file or directory
$ rm checklist.xlsx
rm: checklist.xlsx: No such file or directory
Why won't my computer let me delete these files? How can I go about deleting them? Thanks!
You need to escape those characters: ~ triggers the shell's tilde expansion, and $checklist is expanded as an (empty, undefined) variable before rm ever sees the name, which is why the errors refer to names like ~.xlsx.
A simple way to create such a file:
echo "fileteste" > \~\$file
A simple way to delete one:
rm \~\$file
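Single quotes work just as well, since they suppress both the tilde and the variable expansion; for example, run from the directory containing the files in the question:
rm '~$checklist.xlsx'
# or remove them all at once: quote the special prefix, leave the glob unquoted
rm '~$'*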
I'm trying to automate backups of some MySQL databases in MariaDB Server 10.2.15 on CentOS 7.5:
mariabackup --backup --target-dir=/srv/db_backup --databases="wordpress" --xbstream | \
openssl enc -aes-256-cbc -k mysecretpassword > \
$(date +"%Y%m%d%H").backup.xb.enc
What I expect is a file in /srv/db_backup called $(date +"%Y%m%d%H").backup.xb.enc
What I'm finding is a file called $(date +"%Y%m%d%H").backup.xb.enc in my home directory with file size 0, and the /srv/db_backup dir looks like:
[root@wordpressdb1 ~]# ls -la /srv/db_backup/
total 77868
-rw------- 1 root root 16384 Jul 31 14:30 aria_log.00000001
-rw------- 1 root root 52 Jul 31 14:30 aria_log_control
-rw------- 1 root root 298 Jul 31 14:30 backup-my.cnf
-rw------- 1 root root 938 Jul 31 14:30 ib_buffer_pool
-rw------- 1 root root 79691776 Jul 31 14:30 ibdata1
-rw------- 1 root root 2560 Jul 31 14:30 ib_logfile0
drwx------ 2 root root 19 Jul 31 14:30 wordpress
-rw------- 1 root root 103 Jul 31 14:30 xtrabackup_checkpoints
-rw------- 1 root root 458 Jul 31 14:30 xtrabackup_info
All further attempts to run the mariabackup command fail on:
mariabackup: Can't create/write to file '/srv/db_backup/ib_logfile0' \
(Errcode: 17 "File exists")
mariabackup: error: failed to open the target stream for 'ib_logfile0'.
What have I done wrong?
EDIT
The first error was a missing leading dash on the openssl cipher option, -aes-256-cbc.
Now I'm seeing this:
180731 15:18:37 Executing FLUSH NO_WRITE_TO_BINLOG TABLES...
Error: failed to execute query FLUSH NO_WRITE_TO_BINLOG TABLES: Access \
denied; you need (at least one of) the RELOAD privilege(s) for this operation
I've granted both SUPER and RELOAD to the root user, but I still get this error.
Partial answer:
"What I expect is a file in /srv/db_backup called $(date +"%Y%m%d%H").backup.xb.enc" -- Then you need to specify a directory other than the current directory:
mariadbdump ... > \
/srv/db_backup/$(date +"%Y%m%d%H").backup.xb.enc
As for "unable to write", what do you get from
ls -ld /srv/db_backup
You need to use --stream=xbstream, not --xbstream. As it stands, you backed up into the directory, not into a stream.
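Putting the two fixes together, a corrected pipeline might look like this; a sketch only, in which the scratch --target-dir path is hypothetical (with --stream the backup itself goes to stdout), and /srv/db_backup should first be cleared of the files the failed run left behind:
# stream the backup to stdout, encrypt it, and write it into the target directory
mariabackup --backup --target-dir=/tmp/mariabackup-scratch \
  --databases="wordpress" --stream=xbstream | \
  openssl enc -aes-256-cbc -k mysecretpassword > \
  /srv/db_backup/$(date +"%Y%m%d%H").backup.xb.enc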
I have to check the owner and group of the file that was modified in the last 24 hours, but I am not able to work out how to find which files were modified in the last 24 hours.
ls -lrt /dirpath | grep 'Util'
The output of this command is:
-rw-r--r-- 1 user user 186 Apr 11 08:05 Util-04-11.log.gz
-rw-r--r-- 1 user user 185 Apr 12 08:05 Util-04-12.log.gz
-rw-r--r-- 1 user user 186 Apr 13 08:05 Util-04-13.log.gz
-rw-r--r-- 1 user user 186 Apr 14 08:05 Util-04-14.log.gz
-rw-r--r-- 1 user user 278 Apr 20 08:05 Util-04-20.log
Now I want to check the owner and group of the files modified in the last 24 hours. How can I do this in Unix?
You can use find to filter files by modification date.
From the man page:
find $HOME -mtime 0
Search for files in your home directory which have been modified in
the last twenty-four hours. This command works this way because the
time since each file was last modified is divided by 24 hours and
any remainder is discarded. That means that to match -mtime 0, a
file will have to have a modification in the past which is less
than 24 hours ago.
With that in mind, you can use the -exec option to build the following command:
find /dirpath -mtime 0 -exec stat -c "%n %U %G" '{}' \;
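If you only want the Util logs from the question, the name filter can be combined with the time test; a sketch assuming GNU find and coreutils stat:
find /dirpath -name 'Util*' -mtime 0 -type f -exec stat -c "%n %U %G" '{}' \;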
You could use the find command with -ctime 0 (strictly speaking, -ctime tests the inode change time, not the creation time; use -mtime for the modification time), then use xargs to perform your ls:
find /dirpath -name "*Util*" -ctime 0 -type f | xargs ls -lrt
Hope it helps.
Is there a way to copy the current directory (and not just its contents) to a remote directory -- without specifying the current directory by name?
For example, I'm in the directory /bar and I want to copy /bar and its contents to the remote directory /foo with the resulting directory being /foo/bar.
Of course I could specify the current directory by name, but I'd just like to be able to copy, in the manner described, whatever directory I'm in.
This seems to do what you want in my tests, unless I am misunderstanding you:
rsync -r `pwd` user@host:./bar
Assuming you have:
Michaels-MacBook-Pro-2 in ~/foo
○ → ls -l
total 0
-rw-r--r-- 1 errr staff 0 May 9 13:07 bah
-rw-r--r-- 1 errr staff 0 May 9 13:07 bar
-rw-r--r-- 1 errr staff 0 May 9 13:07 baz
Michaels-MacBook-Pro-2 in ~/foo
○ → rsync -r `pwd` errr@192.168.88.217:./bar
errr@192.168.88.217's password:
You end up with:
errr@ansible-master:~/bar$ ls -lR
.:
total 4
drwxr-xr-x 2 errr errr 4096 May 9 18:10 foo
./foo:
total 0
-rw-r--r-- 1 errr errr 0 May 9 18:10 bah
-rw-r--r-- 1 errr errr 0 May 9 18:10 bar
-rw-r--r-- 1 errr errr 0 May 9 18:10 baz
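To get the exact layout from the question (/bar ending up as /foo/bar on the remote), point the same trick at /foo; a sketch, assuming you can write to /foo on the remote side:
cd /bar
rsync -a "$PWD" user@host:/foo/
Because the source path has no trailing slash, rsync creates bar inside /foo instead of copying only its contents; -a also preserves permissions and timestamps, which -r alone does not.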
I am working on a Solaris 10 machine, and I am not able to untar my file. Logs are given below. Can anyone suggest what the issue may be? I can create a tar file but am unable to untar it. :(
bash-3.2# ls -lrth ConfigCheck-120614-KL.out*
-rw-r--r-- 1 root root 144K Jun 12 17:15 ConfigCheck-120614-KL.out
-rwxrwxrwx 1 root root 146K Jun 16 16:49 ConfigCheck-120614-KL.out.tar
bash-3.2# tar xvf ConfigCheck-120614-KL.out.tar
tar: extract not authorized
bash-3.2# tar tvf ConfigCheck-120614-KL.out.tar
-rw-r--r-- 0/0 147377 Jun 12 17:15 2014 ConfigCheck-120614-KL.out
Solaris 11 tar will fail with that error message if you are running as uid 0 but do not have the Media Restore profile set up in the RBAC configuration.
Unless you're trying to restore from backup, you should normally be untarring files as a normal user, not root, to avoid accidentally overwriting critical system files.
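If you do have to extract as root, the usual RBAC steps on Solaris 11 look roughly like this (a sketch; the exact usermod syntax for appending a profile varies by release, so check usermod(1M) on your system):
# list the rights profiles the current account already has
profiles
# grant the Media Restore profile to the account (run as root)
usermod -P +'Media Restore' username
# then run the extraction through a profile-aware mechanism
pfexec tar xvf ConfigCheck-120614-KL.out.tar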
I ran the following command at uni on my user account:
chmod -R 700 *
Then, I ran
chmod -R 755 public_html
My homepage still shows "Forbidden" when I browse to it.
The permissions of my user account:
4 drwx------ 5 Newbie staff 4096 2008-12-19 12:39 Desktop
4 drwx------ 10 Newbie staff 4096 2009-04-16 02:28 Documents
4 drwx------ 4 Newbie staff 4096 2008-11-28 20:48 irclogs
4 -rwx------ 1 Newbie staff 1686 2008-09-10 16:00 kieli
4 drwxr-xr-x 3 Newbie www 4096 2009-04-16 02:14 public_html
4 drwx------ 2 Newbie staff 4096 2008-09-01 08:43 Templates
4 drwx------ 4 Newbie staff 4096 2008-12-21 03:15 tmp
4 drwx------ 7 Newbie staff 4096 2008-09-03 21:39 Windows
4 drwx------ 4 Newbie staff 4096 2008-10-03 16:29 workspace
The permissions of the files in public_html:
4 -rwxr-xr-x 1 newbie staff 3414 2009-04-15 02:23 index.html
4 -rwxr-xr-x 1 newbie staff 2219 2008-09-16 10:46 index.html~
144 -rwxr-xr-x 1 newbie staff 140120 2009-04-14 22:16 jquery.js
4 -rwxr-xr-x 1 newbie staff 699 2009-04-15 01:05 template.css
Well, your second chmod doesn't seem to be working very well, since your public_html directory is mode 744, not 755.
You'll also need to make it so that your home directory can be "looked through" by the web server; the www user needs execute permission on the directory. chmod o+x . (run in your home directory) is probably your best bet.
Do you have an index file in the directory?
You would also need to give execute permission to the files in public_html. Although you ran chmod -R 755 public_html, the permissions shown are still only 744, so set the execute bits as well.
I found the remaining problem myself. My home directory had permissions 700; I changed it to 701, and I can now see my website. The web server only needs execute (search) permission on the home directory to reach public_html, which is why 701 is required.
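Putting the whole thread together, the minimal permissions for this kind of userdir hosting usually come down to:
chmod 701 ~                  # the web server only needs search (x) on the home directory
chmod 755 ~/public_html      # read plus search on the web root
chmod 644 ~/public_html/*    # static files only need to be world-readable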