Google Analytics recently started showing PHP scripts as referrers to my website, for example:
localhost/index.php
EDIT: This is a recent surge in activity coming from India. It is not coming from our own services, such as our web host, or a backup service. It is also coinciding with spam users on my websites from India, so I know this is intentionally malicious behavior.
Any suggestions on how to investigate further and prevent it? We are running on Django, hosted on AWS, if that helps.
If the server is on a shared subnet, or is running on your own machine, requests coming from that subnet can show up as referrers like this.
In the case of Django, if somebody on your team is running a development version of your application with the Google Analytics tracking code enabled, things like this can show up. Not only will localhost appear in your referrers, but aggregate metrics like bounce rate, time on site, and conversion will be wrong, because a developer's unusual behavior gets mixed in with that of normal users and skews the results. There are two fixes (plus a way to stop the data at the source, sketched after the list below):
Add a Google Analytics exclusion filter
1) Open Google Analytics and choose your property view.
2) Navigate to Admin.
3) Click on Filters under the View column.
4) Click on New Filter.
5) Create a new "Predefined filter" which excludes traffic to the "localhost" hostname.
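A complementary way to keep developer traffic out of the reports entirely is to never render the tracking snippet outside production. A minimal sketch in Django settings, assuming a hypothetical GOOGLE_ANALYTICS_ID setting (not a built-in Django name) that your base template checks:

import os

# settings.py -- minimal sketch; GOOGLE_ANALYTICS_ID is a hypothetical
# setting name, not something Django provides.
DEBUG = os.environ.get("DJANGO_DEBUG", "0") == "1"

# Hand a tracking ID to templates only outside local development,
# so developer sessions never reach Google Analytics at all.
GOOGLE_ANALYTICS_ID = None if DEBUG else "UA-XXXXXXX-1"

In the base template, wrap the Google Analytics snippet in an {% if GOOGLE_ANALYTICS_ID %} ... {% endif %} block (exposing the setting via a context processor), so development builds simply render no tracker.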
Edit: Configure ALLOWED_HOSTS in Django settings
This is a security measure: it prevents an attacker from poisoning caches and password-reset emails with links to malicious hosts by submitting requests with a fake HTTP Host header, which is possible even under many seemingly safe web server configurations. Django 1.5 introduced the ALLOWED_HOSTS setting for exactly this reason; a settings file created with Django 1.5 includes this section, which you need to configure:
ALLOWED_HOSTS = [
    '.example.com',   # Allow domain and subdomains
    '.example.com.',  # Also allow FQDN and subdomains
]
Add your own host here, e.g. ['www.antodominic.com'], or ['*'] for a quick test, but never use ['*'] in production. Once this is set, requests carrying a forged Host header are rejected with a 400 Bad Request instead of being processed.
Hope this helps! Cheers. :)
If you have a website that is externally accessible, then yes- someone is trying to hack your website... and every other website in existence. It's a fact of life.
Your localhost referrer is not necessarily indicative of malicious behavior, however. It's more likely that your own dev instance, or someone else's dev instance of a site that links to yours, is creating the entries in your analytics.
However, if the referer contains a link to another site in the query string, then you're falling victim to referer spam. If you want to prevent it, you can block it via .htaccess if you're running on Apache, or via web.config if you're running on IIS. Just replace the pertinent bits of the regular expressions, or better yet, add to them.
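Since the question is about a Django app, the same idea can live in a small middleware instead of .htaccess. A hedged sketch (the two spam domains are just well-known examples; the list is hand-maintained and hypothetical):

import re

from django.http import HttpResponseForbidden

# Hand-maintained pattern of known referer-spam domains; the two
# entries here are illustrative examples only -- extend as needed.
SPAM_REFERERS = re.compile(
    r"(semalt\.com|buttons-for-website\.com)", re.IGNORECASE
)

class BlockRefererSpamMiddleware:
    def __init__(self, get_response):
        self.get_response = get_response

    def __call__(self, request):
        # Reject the request outright if its Referer matches a known
        # spam pattern; otherwise pass it down the middleware chain.
        referer = request.META.get("HTTP_REFERER", "")
        if SPAM_REFERERS.search(referer):
            return HttpResponseForbidden("Referer blocked")
        return self.get_response(request)

Register it by adding its dotted path to MIDDLEWARE in settings.py.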
I am trying to learn how to implement a DNS server, and am looking first at Heroku.
Why do they have # point to hidden-sierra-7936.herokudns.com.? Why not just foo.herokudns.com., or better yet herokudns.com. (no subdomain), or even heroku.com. (the main website)? What are the reasons for this? Is it security, performance, architectural needs, something else, or all of the above? More specifically, what are the details of those reasons? Does it depend on the number of requests coming through, and that's why there is a <dynamic-name>.herokudns... per app, so there are a lot of them? Or perhaps so that if there is an error in one, they can quickly switch it?
Finally, can these reasons be avoided, countered, or argued against, so you could make the domain a little nicer and just use heroku.com.? Why can't you do it on heroku.com.? (If you were building Heroku, that is; obviously Heroku doesn't support this.)
I am also looking at this. It looks like Heroku used to do it like proxy.heroku.com, but for some reason they switched it. Why?
As far as using subdomains on .heroku.com, this is at least partly a security mitigation.
Consider the arguments made in this blog post from GitHub, published when they moved Pages sites from .github.com to .github.io:
There are two broad categories of potential security vulnerabilities that led to this change.
Session fixation and CSRF vulnerabilities resulting from a browser security issue sometimes referred to as “Related Domain Cookies”. Because Pages sites may include custom JavaScript and were hosted on github.com subdomains, it was possible to write (but not read) github.com domain cookies in a way that could allow an attacker to deny access to github.com and/or fixate a user’s CSRF token.
Phishing attacks relying on the presence of the “github.com” domain to create a false sense of trust in malicious websites. For instance, an attacker could set up a Pages site at “account-security.github.com” and ask that users input password, billing, or other sensitive information.
If I set up a simple web server online (e.g. nginx), generate a very large random string (such that it is unguessable), and host that endpoint on my domain, e.g.
example.com/<very-large-random-string>
would I be safe in say, hosting a webapp at that endpoint with no authentication to store my personal information (like a scratch-pad or notes kind of thing)?
I know Google Docs does this. Is there anything special one has to do (again, e.g. for nginx) to prevent someone from getting a list of all available pages?
I guess I'm asking: is there any way for a malicious actor to find out about the existence of such a page, ideally irrespective of which web server I use?
I'd be pretty alarmed if my online bank started using this system, but it should give you a basic level of security. Bear in mind that this is security through obscurity, which is rather frowned upon and will immediately turn into no security whatsoever the moment someone discovers the hidden URL.
To prevent this from happening, you will need to take a few precautions:
Install an SSL certificate on your server, and always access the URL via HTTPS, never via HTTP (otherwise the URL path is sent in the clear and visible to everyone along the way).
Make sure your secure document contains no outgoing links. This includes not only hyperlinks (<a href="...">) but also embedded images, stylesheets, scripts, media files and so on. Otherwise the URL will be leaked to other domains via the Referer request headers.*1
(A bit of a no-brainer, but) make sure there are also no inbound links to this page. Although they aren't so common now, web hosts used to generate automatic "web stats" pages showing the traffic to each web domain. Some content management systems generate a site map automatically. This would be just as bad.
Disable directory browsing on your server. In other words, make sure that someone who visits the directory level above your hidden directory isn't presented with a list of subdirectories.
Bear in mind that the URL will always be visible in your address bar and browser history, and possibly in other places like your browser's cookie jar. Your browser will probably provide the rest of the URL by auto-complete when someone types the domain into your address bar.
*1: Actually, your browser will only send a Referer header when you access other https pages, but still...
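As for generating the "very large random string" in the first place, a minimal sketch in Python (assuming Python 3.6+ for the secrets module):

import secrets

# token_urlsafe(32) draws 32 random bytes (256 bits of entropy) and
# encodes them as ~43 URL-safe characters -- far beyond brute-force
# guessing as long as the URL itself never leaks.
path_segment = secrets.token_urlsafe(32)
print("https://example.com/" + path_segment)

The security rests entirely on the entropy of that segment and on keeping it secret, which is exactly why the precautions above matter.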
I have a running website (based on ASP.NET MVC) on some domain, let's say mydomain.com
Yesterday I was looking at my site's access logs and noticed something very weird: in them, I saw a different domain!
Something like anotherdomain.com/somePage
And I saw exception text in my log saying 404 - anotherdomain.com/somePage can't be found. It looks like my code is somehow running on some other domain (Request.URL shows a different domain).
How is this possible? Does it mean someone somehow got access to my host (I'm running on Azure), stole my binaries, and deployed them on another host? Or maybe my website is being opened in an iframe?
I need to understand in order to determine whether I have a breach.
If I had to guess, I would bet that someone accidentally set their domain's DNS records to point at your server. You can check where the A record for the domain points with nslookup or whois from the command line. If it is in fact mis-configured, you should contact the site's administrator to let them know. This kind of mis-configuration, while uncommon, happens more often with cloud services due to the inherently transient nature of the servers and routes involved.
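If you'd rather script the check than use nslookup, a quick sketch in Python (anotherdomain.com stands in for the domain from your logs):

import socket

# Resolve the suspicious domain and compare the result with your
# own server's public IP address.
suspect = "anotherdomain.com"  # the domain showing up in your logs
print(socket.gethostbyname(suspect))

If the printed address matches your server's IP, the other domain really is pointed at you.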
It's also possible that someone is making GET requests for other domains via your domain, to check whether you're running a badly configured open proxy. Since you're not, the server simply returns a 404 Not Found, because you aren't actually hosting those pages.
Scans like these happen all the time; they are an unfortunate side effect of being connected to the internet, but they do not mean that you are under attack or that someone has access to your host.
I have a Comodo SSL certificate on my hosting plan; however, when accessing my site from Google, it automatically sends me to
http://example.com, where the green lock doesn't appear.
If I manually add "https", like: https://example.com it does show up!
Is there a way to always access my website with the green lock showing, instead of having to type the https:// manually every time?
You can easily redirect to the https version of any page using rewrite rules/rewrite module of your web server (the exact way to do this depends on the webserver used). Ask your provider, this is a common case so there may even be a UI option in your console to do this.
Regarding google see this: https://webmasters.stackexchange.com/questions/67212/how-to-convince-google-to-list-https-version-of-website
It may also be good form to verify the protocol used to access the site in your authentication module and refuse authentication if the wrong protocol is used. Assuming web server rules are used to redirect traffic, this would help prevent leaking information due to a misconfiguration or bug.
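The redirect logic itself is tiny. Here is a minimal sketch as a Python WSGI app, purely to illustrate what the rewrite rule does (in practice your web server performs this before any application code runs):

def redirect_to_https(environ, start_response):
    # Rebuild the requested URL with the https:// scheme and answer
    # with a permanent redirect.
    host = environ.get("HTTP_HOST", "example.com")
    path = environ.get("PATH_INFO", "/")
    start_response("301 Moved Permanently",
                   [("Location", "https://" + host + path)])
    return [b""]

A 301 (permanent) redirect also tells search engines to prefer the HTTPS URLs going forward.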
Recently I added SSL to my WordPress site, but it started causing some problems (conflicts with the WooCommerce and WP Super Cache plugins). The problem I was having with SSL was that the WooCommerce cart sometimes showed empty even after adding a product, and sometimes the cart would not proceed to the checkout page. Do you think it had something to do with WP Super Cache, SSL, or both? Anyway, I couldn't get it solved and removed the SSL after 2 days. But meanwhile Google had indexed the HTTPS URLs of my site and was showing them in search results, where they returned an SSL connection error. Now my question is: how can I redirect all those HTTPS URLs to the HTTP ones? I asked my web host for help, but he said the redirection is not possible through .htaccess or any other method. Was he right? And how long will Google take to 'forget' these HTTPS links and show the HTTP links again in search results?
There are two standard ways to redirect:
At the DNS level
At the HTTP level
The DNS level can't help you because it only changes the hostname. You want to keep the same hostname but change the scheme, which means you need an HTTP server to do the redirect.
In order to redirect from HTTPS to HTTP, you need to have an HTTPS service running on the computer with the IP address that the hostname resolves to.
Without that, there is nothing to receive the request over SSL and respond with "Oh, this has moved to plain HTTP."
(.htaccess is just a (suboptimal) means to configure an HTTP server; it does no good if you don't have a server listening on SSL.)
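To make that concrete, here is a hedged sketch in Python of the smallest possible HTTPS service whose only job is to answer every request with a redirect to plain HTTP. It assumes you still have a valid cert.pem/key.pem for the domain:

import http.server
import ssl

class RedirectToHttp(http.server.BaseHTTPRequestHandler):
    def do_GET(self):
        # Send every request to the http:// version of the same URL.
        host = self.headers.get("Host", "example.com")
        self.send_response(301)
        self.send_header("Location", "http://" + host + self.path)
        self.end_headers()

httpd = http.server.HTTPServer(("", 443), RedirectToHttp)
ctx = ssl.SSLContext(ssl.PROTOCOL_TLS_SERVER)
ctx.load_cert_chain("cert.pem", "key.pem")
httpd.socket = ctx.wrap_socket(httpd.socket, server_side=True)
httpd.serve_forever()

This is exactly the role a real web server's rewrite rules would play; the point is that some process must terminate the TLS connection before any redirect can be sent.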
Personally, I'd fix the HTTPS issues. The world is moving to HTTPS more every day, so going from HTTPS back to HTTP is a backwards step. If you elaborate on the issues you had, someone might be able to help.
However if you really want to do this then you need to run both http and https and redirect all traffic from https to http. How you do this depends on your set up (in Apache you'd do it using htaccess config).
How long it takes Google to forget the HTTPS URLs depends on many factors, including the size and popularity of your site, which govern how often Google crawls it. Give it a month at least for a small site. You can give it a kick by submitting your site to Google Search Console (the new name for Google Webmaster Tools).
Btw, Stack Overflow is primarily for programming questions, so questions like this might be better asked on the http://webmasters.stackexchange.com sister site.