I have many nginx access log files and I want to parse them between two dates. For example, I want to extract the log entries from 15/Sep/2020 until 15/Oct/2020, and I use this command:
zcat /var/log/nginx/mywebsite.access.log.*.gz | awk '$4 >= "[15/Sep/2020" && $4 < "[15/Oct/2020"'
or this command:
zcat /var/log/nginx/mywebsite.access.log.*.gz | sed -n -e '/15\/Sep\/2020/,/15\/Oct\/2020/p'
but the result is not limited to those two dates.
Please help me with this. Thanks.
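One likely reason the result is off is that awk compares $4 as a plain string, so "[15/Oct/2020" sorts before "[15/Sep/2020" alphabetically (O comes before S). A minimal sketch of one workaround, assuming the default combined log format where field 4 is the bracketed timestamp such as [15/Sep/2020:10:23:45, is to rebuild a sortable YYYYMMDD key before comparing (both endpoint days inclusive):
zcat /var/log/nginx/mywebsite.access.log.*.gz | awk '
BEGIN {
    # map month abbreviations to two-digit numbers
    split("Jan Feb Mar Apr May Jun Jul Aug Sep Oct Nov Dec", m, " ")
    for (i = 1; i <= 12; i++) mon[m[i]] = sprintf("%02d", i)
}
{
    # $4 looks like [15/Sep/2020:10:23:45 -- drop the "[" and split on "/"
    split(substr($4, 2), t, "/")
    key = substr(t[3], 1, 4) mon[t[2]] sprintf("%02d", t[1])
    if (key >= "20200915" && key <= "20201015") print
}'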
I would like to know how I can get the path to sources.list on a modified UNIX system which has apt and other base packages on it, like gpg and sudo. Can apt itself identify the path to sources.list?
apt is using the file, so it should be able to locate it, right?
I don't know if this is the best way, but apt-config dump will show all of apt's configuration variables. On my system, the Dir::Etc variable gives the directory where the file is located, and Dir::Etc::sourcelist gives its name.
You can also read about the shell option in the apt-config man page, which may be more useful for processing this data in a program.
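For illustration, a short sketch of both approaches (the variable names are arbitrary, and assembling the full path this way assumes the top-level Dir is /):
# dump the whole configuration and pick out the sources.list entries
apt-config dump | grep -i sourcelist
# shell mode emits VAR='value' pairs that a script can eval
eval "$(apt-config shell ETCDIR Dir::Etc SRCLIST Dir::Etc::sourcelist)"
echo "/$ETCDIR/$SRCLIST"    # typically /etc/apt/sources.list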
I'm running rsync under Supervisor. I normally start the rsync daemon like this:
rsync --daemon --config=/home/zs6ftad/deployments/cmot_rsync_daemon/rsyncd.conf --no-detach
I'd like to make it so that any log messages get echoed to standard output instead of being stored in the log file. Is there an option which will make an rsync server behave this way?
You can get rsyncd to log to stdout by setting the --log-file argument to /dev/stdout
rsync --daemon --no-detach --log-file=/dev/stdout
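For completeness, a sketch of how this could look as a supervisord program section (the config path comes from the question above; the remaining values are illustrative):
[program:rsyncd]
command=rsync --daemon --no-detach --config=/home/zs6ftad/deployments/cmot_rsync_daemon/rsyncd.conf --log-file=/dev/stdout
autorestart=true
redirect_stderr=true
With --no-detach and --log-file=/dev/stdout the log lines go to the process's standard output, which Supervisor captures.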
How do I kill/stop the Kibana process?
Answer: netstat -pln | grep 5601
Then take the process ID from the output and kill it, e.g. kill -9 13304.
If you have installed Kibana as a service, the following command will work:
service kibana stop
kill -9 `ps aux | grep kibana | grep -v grep | awk '{print $2}'`
is very helpful when you find a lot of Kibana processes. But be careful that it can kill other processes that contain "kibana" in the process name.
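A slightly shorter variant with the same caveat, as a sketch (it assumes pgrep/pkill are available and that "kibana" matches only the processes you intend to kill):
pkill -9 -f kibana        # or: kill -9 $(pgrep -f kibana)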
I want to move from one server to another and I don't want to lose special log files (like the mail logs), so I want to rsync the files with the --files-from option. But I can't use a wildcard like * or {0..9} in the file list.
rsync -avR --files-from=/backup/filelists/filelist1.txt / $DESTSRV:"$DESTPATH"
For example, I want to rsync all mail server log files:
/var/log/mail.log
/var/log/mail.log.1
/var/log/mail.log.2.gz
/var/log/mail.log.3.gz
/var/log/mail.log.4.gz
But in /backup/filelists/filelist1.txt I can't use
/var/log/mail*
or
/var/log/mail.log.{2..10}.gz
I get the following error:
rsync: link_stat "/var/log/mail*" failed: No such file or directory (2)
Does anybody know a solution to my problem?
After searching and trying, I found a solution that works for me:
cat /backup/filelists/filelist1.txt | { while read line; do rsync -avzR $line "$DESTSRV":"$DESTPATH"/; done; }
This reads the input file line by line and syncs each entry with rsync; because each line is left unquoted, the shell expands it, so I can use any wildcard :).
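An alternative sketch (not part of the original answer) that expands the globs once and then runs a single rsync over the result, assuming the patterns in filelist1.txt are meant as shell globs and /tmp/filelist.expanded is just an illustrative temporary file:
# expand every pattern from the list into concrete paths
while read -r pattern; do
    ls -d $pattern 2>/dev/null
done < /backup/filelists/filelist1.txt > /tmp/filelist.expanded
# a single rsync call over the expanded list (paths are taken relative to /)
rsync -avzR --files-from=/tmp/filelist.expanded / "$DESTSRV":"$DESTPATH"/
This avoids starting one rsync connection per line in the list.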
My system admin gave me a file with iptables rules.
What command do I type in to load this?
I heard people can do this in one line.
Something like... iptables > thefile.dat?
You can save your current iptables rules with iptables-save, as in
iptables-save > thefile.dat
and later load it with
cat thefile.dat | iptables-restore
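Assuming the file the admin provided is in iptables-save format, a quick sketch of loading it and checking the result (the filename is illustrative):
iptables-restore < thefile.dat    # equivalent to the cat pipeline above
iptables -L -n -v                 # list the rules that are now active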