I have a wordpress site:
http://www.fairlady-sleepingtiger.co.uk/
but the home page is not displaying all the content.
When I inspect the Console I see this message:
www.fairlady-sleepingtiger.co.uk/:12 A Parser-blocking, cross-origin script, http://ajax.cloudflare.com/cdn-cgi/nexp/dok3v=088620b277/cloudflare.min.js, is invoked via document.write. This may be blocked by the browser if the device has poor network connectivity.
I have deleted/disabled all Cloudflare functionality from this site, but this still comes up.
Can anyone help please?
You are still pointing to Cloudflare nameservers, so your domain is still being proxied.
There is a period of time during which we will continue to handle DNS even after you remove your zone; this protects against the situation where the nameservers didn't get changed, since otherwise the site would just be unreachable.
If you want to completely remove Cloudflare, you need to update the nameservers at your registrar; depending on the registrar, it should update within 24 hours or so.
I have a static Gatsby site that uses WordPress for its back end. I also have the WordFence plugin installed to prevent hackers from causing havoc. I started out without WordFence installed, the site got hacked, and we had to scrap the whole back end and start over with a new database.
When trying to deploy my Gatsby website using Netlify I receive this error message:
If you are using a security plugin like WordFence or a server firewall you may need to whitelist your IP address or adjust your firewall settings for your GraphQL endpoint.
I have the NetlifyPress WordPress plugin installed. It doesn't help prevent the issue.
I was able to fix this error locally by whitelisting my IP address in the Wordfence firewall settings.
It's not as simple as whitelisting the Netlify automated build processes on the website, though.
Does anyone know how to stop Wordfence from blocking Netlify?
I figured it out!
Go to WordFence dashboard.
Click on "Tools" to view a live graph of intercepted suspicious activity
Switch to Netlify and run a deploy of your site (it should fail)
Go back to the WordFence graph and take a look at the top row, the "page visited" column should have "/graphql" in it
Click on that row
Click on the "Add Param to Firewall Allowlist" button
Run your Netlify build again and it should work!
Some caveats:
Be super careful that you don't accidentally whitelist a hacker!
WordFence is constantly blocking attacks. Most attacks aren't trying to access "/graphql", though, so that is a pretty good indicator of which rows are services you want to whitelist and which rows are hackers that need to stay blocked.
The above method seems to give access to anyone that is trying to access the "/graphql" endpoint. That might be considered a security issue for you. On the plus side, it meant that it also fixed my BitBucket pipeline issue.
An alternative method is to copy the IP Address in the "IP Address" column then add it to your IP Address whitelist.
Go to the Wordfence dashboard
Click "Firewall" in the left nav bar
Click "All firewall options"
Enter the IP address in the "Allow listed IP addresses that bypass all rules" field
That could turn into quite a long list though, as Netlify has a lot of servers, all with different IP addresses, and you will have to keep whitelisting new ones. It is admittedly a much more secure method than the first option, since it ensures that only Netlify has access to the "/graphql" endpoint.
Recently I added SSL to my WordPress site but it started causing some problems (conflicts with the WooCommerce and WP Super Cache plugins). The problem I was having because of SSL was that the WooCommerce cart was sometimes showing empty even after adding a product, and sometimes the cart was not proceeding to the checkout page. Do you think it had something to do with WP Super Cache or SSL or both? Anyway, I couldn't get it solved and removed the SSL after 2 days.
But meanwhile Google had indexed the HTTPS URLs of my site and was showing them in the search results, and they were returning an SSL connection error. Now my question is: how can I redirect all those HTTPS URLs to the HTTP ones? I asked my web host for help but he said the redirection is not possible through .htaccess or any other method. Was he right? How long will Google take to 'forget' these HTTPS links and show the HTTP links again in search results?
There are two standard ways to redirect:
At the DNS level
At the HTTP level
The DNS level can't help you because it only changes the hostname. You want to keep the same hostname but change the scheme, which means you need an HTTP server to do the redirect.
In order to redirect from https to http, you need to have an HTTPS service running on the computer with the IP address that the hostname resolves to.
Without that, there is nothing to receive the HTTP request over SSL and respond with "Oh, this has moved to plain HTTP". If the SSL service isn't running, then there is nothing that can do that.
(.htaccess is just a (suboptimal) means to configure an HTTP server; it does no good if you don't have the HTTP server listening on SSL.)
Personally I'd fix the https issues. The world is going more https every day, so it's a backwards step to go from https to http. If you elaborate on what issues you had, someone might be able to help.
However, if you really want to do this, then you need to run both http and https and redirect all traffic from https to http. How you do this depends on your setup (in Apache you'd do it using .htaccess config), for example as sketched below.
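For instance, on Apache with mod_rewrite enabled, something along these lines in the site root's .htaccess would send HTTPS requests back to HTTP (a minimal sketch; it assumes the HTTPS virtual host is still up and serving the same document root, as described above):
    RewriteEngine On
    # Only rewrite requests that actually arrived over SSL
    RewriteCond %{HTTPS} on
    # 301-redirect to the same host and path over plain HTTP
    RewriteRule ^(.*)$ http://%{HTTP_HOST}/$1 [R=301,L]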
How long it takes Google to forget your HTTPS URLs depends on many factors, including the size and popularity of your site, which governs how often Google crawls your website. Give it a month at least for a small site. You can give it a kick by submitting your site to Google Search Console (the new name for Google Webmaster Tools).
Btw, Stack Overflow is primarily for programming questions, so questions like this might be better asked on the http://webmasters.stackexchange.com sister site.
Google Analytics recently started showing PHP scripts as referrers to my website, for example:
localhost/index.php
EDIT: This is a recent surge in activity coming from India. It is not coming from our own services, such as our web host, or a backup service. It is also coinciding with spam users on my websites from India, so I know this is intentionally malicious behavior.
Any suggestions on how to investigate further and prevent it? We are running on Django, hosted on AWS, if that helps.
If the server is on a subnet you control, or is running on your own machine, requests coming from that subnet can produce that kind of referrer.
In the case of Django, if somebody from your team is running a development version of your application with the Google Analytics tracking code, then things like this can show up. Not only will localhost show up in your referrers, but your aggregate metrics like bounce rate, time on site, conversion, and others will be incorrect, because a developer's unusual behaviour will be mixed in with that of normal users and skew your results. There are basically two things to do to fix it:
Add a Google Analytics exclusion filter
1) Open Google Analytics and choose your property view.
2) Navigate to Admin.
3) Click on Filters under the View column.
4) Click on New Filter.
5) Create a new "Predefined filter" which excludes traffic to the "localhost" hostname.
Edit: Configure ALLOWED_HOSTS in Django settings
This is a security measure to prevent an attacker from poisoning caches and password reset emails with links to malicious hosts by submitting requests with a fake HTTP Host header, which is possible even under many seemingly-safe web server configurations. Django 1.5 introduced the ALLOWED_HOSTS setting, which is required for security reasons. A settings file created with Django 1.5 has this new section, which you need to add:
ALLOWED_HOSTS = [
    '.example.com',   # Allow domain and subdomains
    '.example.com.',  # Also allow FQDN and subdomains
]
Add your host here like ['www.antodominic.com'] or ['*'] for a quick test, but don't use ['*'] for production.
Hope this helps ...!!
Cheers.. :)
If you have a website that is externally accessible, then yes- someone is trying to hack your website... and every other website in existence. It's a fact of life.
Your localhost referrer is not necessarily indicative of malicious behavior, however. It's more likely that your dev instance, or someone else's dev instance of their site with links to your site, is creating the entries in your analytics.
However, if it's a referrer with a link to another site in the query string, then what you're falling victim to is referrer spam attempts. If you want to prevent them, you can block them via .htaccess if you're running on Apache, or via web.config if you're running on IIS, by matching the offending referrers with regular expressions; just adjust the pertinent bits for the referrers you are actually seeing, or better yet, add to them.
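For instance, on Apache with mod_rewrite enabled, an .htaccess rule along these lines returns a 403 to requests whose Referer matches known spam domains (a minimal sketch; semalt.com and buttons-for-website.com are just common examples, substitute the referrers from your own reports):
    RewriteEngine On
    # Match known referrer-spam domains (case-insensitive)
    RewriteCond %{HTTP_REFERER} semalt\.com [NC,OR]
    RewriteCond %{HTTP_REFERER} buttons-for-website\.com [NC]
    # Refuse the request outright
    RewriteRule .* - [F,L]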
I have a client that has a domain registered through GoDaddy (e.g., http://www.godaddysite.com). He has the domain set to forward with masking to a page on our servers (e.g., https://www.someuniversity.edu/someproject/loginpage.aspx).
When on our network (a university network) I can navigate to his domain, the forwarding/masking works and I can log in without issue. However, anyone off the university network, when visiting the client's site, cannot log into the site. It forwards/masks as it should, accepts the user name and password but stays on the login page after the credentials are accepted. If they navigate directly to my site they have no issues.
I checked his GoDaddy settings and everything appears right. GoDaddy says it is our configuration that is causing the problem (not allowing a different domain to mask the site). Is this true? Is there something I need to change in IIS to allow people to log in when they visit through the GoDaddy site?
Update:
I was finally able to test this offsite. This scenario ONLY happens in IE, so now it is a browser setting issue.
The most common cause of this sort of problem is described here: http://blogs.msdn.com/b/ieinternals/archive/2013/09/17/simple-introduction-to-p3p-cookie-blocking-frame.aspx
If you change the IE privacy settings (Tools > Internet Options > Privacy) to Accept All Cookies, does the problem go away? If so, then you need to set a P3P response header.
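Since forwarding with masking puts your login page inside a frame, IE blocks the framed site's cookies unless it sends a P3P compact policy. On ASP.NET/IIS one way to send that header is through web.config (a minimal sketch; the CP tokens below are placeholders, see the linked article for what your policy should actually declare):
    <system.webServer>
      <httpProtocol>
        <customHeaders>
          <!-- Send a P3P compact policy with every response -->
          <add name="P3P" value='CP="CAO PSA OUR"' />
        </customHeaders>
      </httpProtocol>
    </system.webServer>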
I have a directory with my media files and I need to prevent them from being displayed on other sites.
The server doesn't support .htaccess because it uses nginx.
How can I enable hotlink protection for my files?
Thank you.
The easiest way would be to check the Referer header in the HTTP request. Basically, if that header does not contain a URL from your site, then the request could be hotlinking.
This has the following problems:
The Referer header can be forged -> hotlinking still works
Not all user agents send the Referer header -> a legitimate user might not get the content.
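If you want to go the Referer route anyway, nginx's referer module can do the check for you; a minimal sketch (it assumes your media files live under /media/ and that example.com is your own hostname, so adjust both to your setup):
    location /media/ {
        # Allow empty/blocked referers plus your own domains; everything else is flagged
        valid_referers none blocked example.com *.example.com;
        if ($invalid_referer) {
            return 403;
        }
    }
Including "none" keeps user agents that send no Referer header working, which addresses the second problem above at the cost of letting some hotlinking through.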
You could also set a cookie when the user is browsing your site, and check for the existence of that cookie when the user accesses the streaming content.
The details may be dated, but Igor gives an example of referrer mapping for image hotlink protection that might be useful here, if you decide to go the referrer route: http://nginx.org/pipermail/nginx/2007-June/001082.html
If you are using memcached, you could also store client IP addresses for a time and only serve up your streaming media if an unexpired client IP is found in the cache. The client IP gets cached during normal browsing, ensuring that the person viewing your streaming content has also recently been visiting your site.
On my HostGator site, they used nginx as a proxy in front of Apache (nginx + Apache); maybe that will help you. Also, if you have access to the logs and you see a lot of traffic coming that way from one IP, I would investigate, and if it points to a site, then block that web server's IP. PHP's file_get_contents doesn't get stopped by .htaccess or anything else I know of besides blocking the IP.
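If you do end up blocking a specific server, nginx can do it with a deny rule; a minimal sketch (203.0.113.10 is a placeholder for the offending IP, and /media/ for your files' location):
    location /media/ {
        # Refuse requests from the offending server, allow everyone else
        deny 203.0.113.10;
        allow all;
    }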