I have an issue with a bot that hacked my website and injected a Google Ads script.
I have tried to find where the script is located in the files, but I had no luck.
I've been told to try blocking the script using the .htaccess file. I read up online and tried to write my own code to block the script, but no luck again.
This is what I have tried so far:
<Files adsbygoogle.js?client=ca-pub-4262160458050412>
order deny,allow
deny from all
</Files>
<Limit GET>
Order allow,deny
Allow from all
deny from https://pagead2.googlesyndication.com/pagead/js/adsbygoogle.js?client=ca-pub-4262160458050412
deny from https://pagead2.googlesyndication.com/pagead/js/adsbygoogle.js
</Limit>
This is what the injected script looks like:
<script async="" src="https://pagead2.googlesyndication.com/pagead/js/adsbygoogle.js?client=ca-pub-4262160458050412" crossorigin="anonymous" data-checked-head="true"></script>
Can anyone tell me why the script is still loading on my website?
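Worth noting: <Files> sections match file names on your own server's filesystem, and "Deny from" takes hosts or IP addresses, not URLs, so neither can block a third-party script that your own HTML tells the browser to load. Until you find and remove the injected markup itself, one possible stopgap (a sketch, assuming mod_headers is enabled) is a Content-Security-Policy header that tells browsers to refuse scripts from other domains:
<IfModule mod_headers.c>
# Only allow scripts served from this site itself; the injected
# googlesyndication.com script will then be refused by the browser.
# Caution: this also blocks any legitimate third-party scripts.
Header set Content-Security-Policy "script-src 'self'"
</IfModule>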
In WordPress, my network team restricts the wp-admin folder to a single IP, so my admin-ajax.php AJAX calls return 403 Forbidden for end users. Is there a solution to allow everyone to access it?
Step 1: restrict the wp-admin folder file-wise and allow the admin-ajax.php file.
Step 2: Is there another method to make AJAX calls without the admin-ajax.php file?
Are either of these possible?
If you want to allow access to a folder by IP, add the code below to your server configuration. (Note that <Directory> sections are not allowed inside .htaccess files; in a per-folder .htaccess you would use the inner directives without the wrapper.)
<Directory /path/to/the/folder>
Options +Indexes
IndexOptions +FancyIndexing
Order deny,allow
Deny from all
Allow from X.X.X.X
</Directory>
For a specific file, add the code below:
<Files file-name.php>
Order deny,allow
Deny from all
Allow from X.X.X.X
</Files>
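Applied to the original question, here is a sketch of a wp-admin/.htaccess that restricts everything to a single IP but leaves admin-ajax.php reachable (same Apache 2.2-style directives as above; X.X.X.X is a placeholder):
# Restrict the whole wp-admin folder to one IP...
Order deny,allow
Deny from all
Allow from X.X.X.X
# ...but let every visitor reach admin-ajax.php
<Files admin-ajax.php>
Order allow,deny
Allow from all
</Files>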
I have a domain I use for development purposes. In this domain I have several subdirectories with different WordPress installations.
To hide the whole area, I set up simple htpasswd protection in the root.
Now one of these WordPress installations uses the timthumb library to resize images, and because of the htpasswd protection I get "NetworkError: 400 Bad Request" instead of the image.
This is an example of the request that gets the error:
http://subdomain.domain.com/WP/wp-content/plugins/plugin-directory/timthumb.php?src=http%3A%2F%2Fsubdomain.domain.com%2FWP%2Fwp-content%2Fuploads%2F2015%2F01%2F012015_valentines_hp_budvase.jpg&w=300&h=620&zc=1
Is there a way to bypass the protection only for that file?
More details on my paths, to better read my .htaccess snippets:
I'm in a subdomain pointed to a subdirectory called 'subdomain_folder'
The .htaccess I'm working on is located in 'subdomain_folder'
WP is in a subdirectory called 'WP' inside 'subdomain_folder'
Complete path to WP: '/home/some-folder/public_html/subdomain_folder/WP'
Complete path to uploads: '/home/some-folder/public_html/subdomain_folder/WP/wp-content/uploads'
I tried this:
SetEnvIf Request_URI "^/WP/wp-content/plugins/plugin-dir/timthumb\.php$" allow
AuthType Basic
AuthName "Restricted Area"
AuthUserFile "/home/some-folder/.htpasswds/public_html/subdomain_folder/passwd"
Require valid-user
Order allow,deny
Allow from env=allow
Satisfy any
UPDATE
Someone advised me that allowing access to the timthumb.php file is pointless; instead I should allow it to make HTTP requests, or allow full access to the uploads folder. So I tried the following, allowing requests from the localhost IP:
AuthType Basic
AuthName "Reserverd Area"
AuthUserFile "/home/some-folder/.htpasswds/public_html/subdomain_folder/passwd"
Require valid-user
Order Deny,Allow
Deny from all
Allow from 127.0.0.1
Satisfy Any
I tried both localhost and 127.0.0.1.
I even tried adding another .htaccess in the WP uploads folder (where timthumb looks for images) with a rule to allow from any:
Satisfy Any
Order Allow,Deny
Allow from all
Still I can't get the images to show, and I keep getting "NetworkError: 400 Bad Request" instead of the image.
One last detail: the .htaccess in the WP directory is a standard WP .htaccess --> pastebin.com/8PRqEYQ2
I found the solution.
The right way is indeed to allow requests from the server itself, but the localhost IP (127.0.0.1) was not the right address to allow.
I did a reverse IP lookup on the domain I'm on and used that IP.
This is the .htaccess that works:
RewriteEngine On
<IfModule mod_authn_file.c>
AuthName "Restricted Area"
AuthUserFile "/home/path-to-passfile/passwd"
AuthType Basic
Require valid-user
Order Deny,Allow
Deny from all
# Use your server ip:
Allow from 111.111.111.11
Satisfy Any
</IfModule>
With these rules I can develop apps using timthumb.php in an htpasswd-protected directory.
Criticisms and improvements are welcome :)
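One possible improvement: on Apache 2.4, where Order/Deny/Satisfy are deprecated, the same logic could be written with RequireAny (a sketch; the IP is a placeholder, as above):
<IfModule mod_authz_core.c>
AuthName "Restricted Area"
AuthType Basic
AuthUserFile "/home/path-to-passfile/passwd"
# Grant access if EITHER the password is valid OR the request
# comes from the server's own IP:
<RequireAny>
Require valid-user
Require ip 111.111.111.11
</RequireAny>
</IfModule>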
We've tried a few things we found via Google for this, but can't seem to get anything to work.
The Problem
We have a server with around 500 WordPress websites on it. We're trying to lock down all the wp-login.php pages for every instance to the IP address of our office using a global .htaccess, but the individual WordPress .htaccess files are overriding this.
The Environment
We're hosted on an AWS Linux server running Plesk to manage each website / WordPress instance.
The Question
Is there a way we can set one .htaccess file on the server to lock down all of the WordPress login pages without the individual .htaccess files overriding it?
Any help or suggestions for a good way to do this would be appreciated.
Thanks in advance.
I assume that you have read up on the RewriteOptions directive. As I explain in Tips for debugging .htaccess rewrite rules, and as you have found with WP (which generates its own .htaccess files), by default the current path is scanned for .htaccess files and the rewrite rules in the lowest one are applied, unless a higher one specifies RewriteOptions Inherit, in which case its rules are executed after the rules in the child scope. And this is the catch-22: the WP-generated access file puts an [L] flag on all of its execution paths, preventing the parent rules from firing.
So the answer is to do this with an Apache mechanism other than rewrite, and you can use the SetEnvIf directive:
SetEnvIf Remote_Addr "!^192\.168\." forbidden
<Files *>
Order allow,deny
Allow from all
Deny from env=forbidden
</Files>
or
SetEnvIf Remote_Addr "!^192\.168\." forbidden
<Directory /var/www/wproot>
Order allow,deny
Allow from all
Deny from env=forbidden
</Directory>
Clearly you'll need to change the regexp to suit your local needs, but this should do the trick. The Apache docs give other variants of this, and you should be able to find one that works in your case. Just put this in a per-virtual-server context -- within a Directory(Match) directive if necessary -- or in a common parent directory's .htaccess file.
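For what it's worth, on Apache 2.4 the Order/Allow/Deny directives are deprecated, and the same restriction could be expressed directly with Require (a sketch; substitute your own office network):
<Directory /var/www/wproot>
# Only the office network may connect; everyone else gets a 403
Require ip 192.168.0.0/16
</Directory>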
I ended up getting this to work with your first suggestion, but actually without the SetEnvIf line being required, so thanks very much! This was my .htaccess in the /var/www/vhosts folder, for anyone else needing this:
<Files wp-login.php>
Order deny,allow
Deny from all
Allow from xxx.xxx.xxx.xxx
</Files>
Nice and simple and completely different from the previous routes I was trying to take for this.
I have an SQLite database located at:
http://example.com/db/test.db
When visited in a browser, the database is downloaded.
How can I prevent this, as I do not want others to be able to get hold of it?
Something like a .htaccess file, along the lines of:
<Files ~ "\.(htaccess|db)$">
Order allow,deny
Deny from all
</Files>
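That uses the old Apache 2.2 access directives; on Apache 2.4 the equivalent would be the following sketch (extended, as an assumption, to cover a couple of common SQLite extensions):
<FilesMatch "\.(htaccess|db|sqlite|sqlite3)$">
# Apache 2.4 authorization syntax: refuse every request for these files
Require all denied
</FilesMatch>
A more robust option is to keep the database file outside the document root entirely, so there is nothing to download in the first place.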
Is there a module I can use to disable some Drupal system pages? For example, I'd like to disable node, taxonomy/term/*, filter/tips.
I'm not sure if there is a module that does that, but it's not too hard to write your own custom module for this. You only need to implement hook_menu_alter (and clear the cache after changing your code). You can choose to return an 'access denied' page or a '404 not found':
<?php
function MODULENAME_menu_alter(&$items) {
// This will deny access to taxonomy/term/* for all users.
$items['taxonomy/term/%']['access callback'] = FALSE;
// This will completely remove filter/tips, resulting in a 404.
unset($items['filter/tips']);
}
?>
If you want to know more about writing Drupal modules, see http://drupal.org/developing/modules.
This seems to be more of a one-time configuration to me, so I wonder whether the admin interface you requested in one of your comments is really necessary.
If you're using Apache, you can include the following directives in your site's virtual host configuration:
<LocationMatch ^/taxonomy/term>
SetHandler server-status
Order Deny,Allow
Deny from all
</LocationMatch>
<LocationMatch ^/filter/tips>
SetHandler server-status
Order Deny,Allow
Deny from all
</LocationMatch>
This will deny access to those URLs. But you need to make sure that you don't have URLs aliased to the taxonomy/term/ etc. paths; otherwise users can still access the content through the aliases.
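The two LocationMatch blocks could also be collapsed into one (a sketch, with the same caveat about aliases):
<LocationMatch "^/(taxonomy/term|filter/tips)">
Order Deny,Allow
Deny from all
</LocationMatch>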
See http://httpd.apache.org/docs/2.0/mod/core.html#locationmatch and http://httpd.apache.org/docs/2.0/mod/core.html#location for documentation.