Site only refreshes when adding www. to URL

First time posting so please bear with me.
I'm the unofficial web guy at the company I work for and I helped create our basic static HTML site.
Any work that I do to the site offline and then FTP up shows up instantly on my machine. I rarely, if ever, need to clear the cache for changes to show up. However, within the company I work for, nearly half of the users never see the updates. Some do, some don't.
On the machines that don't, I've cleared the cache in the browser and through the Internet control panel settings. Nothing. They still show the stale content. The only thing that works - and I've seen this in both Chrome and IE - is that when I add www in front of the URL, it then shows the refreshed site. No big deal, right? Well, users who type in mysite.com without the www in front will not see the updates. People who have favorited it like that will not see the updates.
Now, on to what I've tried to fix it. After much research, many people have steered me away from the meta refresh tag, so I haven't tried that. However, with the help of our IT guy, we have, as far as we can tell, set the site's HTTP headers to always refresh. This did nothing for us.
I've tried changing image names in the HTML page when updating a photo and that didn't work either.
I haven't been able to find a .htaccess file, so can I create one? If we (the IT guy and I) changed the HTTP header setting to always refresh but there is no .htaccess file, will there be no change?
Any help or suggestions would be greatly appreciated.
I have searched on here for the answer, and the two most suggested changes are HTTP headers and a meta refresh. The HTTP headers didn't help, and it seems the meta tag route is bad form.

This is a DNS issue. You need to ask the provider of your web services to add an A or a CNAME record for the domain's root.
If you don't understand the above, just call the provider of your web presence (the company that hosts your web server) and tell them you want yourdomain.com and www.yourdomain.com to go to the same place.
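For illustration, here is roughly what those records look like in a BIND-style zone file (hypothetical domain and IP address; most hosts expose the same fields through a DNS control panel instead):
; Point the bare domain at the server, and alias www to it
example.com.      IN  A      203.0.113.10
www.example.com.  IN  CNAME  example.com.
Once both names resolve to the same server, visitors will see the same (current) site whether or not they type the www.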

Related

A completely different site loads under my domain while my main site is working fine

I'm a real newbie in this world.
I just recovered from a serious attack, and I'm trying to do things right this time.
Recently I made a quick Google search for my site and I found this page:
https://www.neocsatblog.info/CNC-Metalworking-&-Manufacturing-%C3%98-mm-for-Marble-Granite-Ceramic-Tile-312652-Woodworking-Supplies/
The problem is, my main site is a simple blog, so I don't sell anything, and obviously this link loads a completely different site from mine.
And this is not the only suspicious site linked to my own domain, while my main site and pages load as usual.
Cloudflare's firewall shows many different links; the requests come from the Russian Federation. The strange thing is that the other links they are trying to reach also work (meanwhile, I have blocked all requests from Russia and Singapore).
I don't understand this. I don't have these sites on my FTP server, and I don't have them in my database.
Also, I asked my hosting provider about this incident, and they said my domain is registered and completely fine.
I'm using WordPress.
What's the next step?
How do I remove this site from my domain?
I really would like to close all the backdoors.
Based on my inspections, I found the malware; its PHP code is here:
https://app.codingrooms.com/w/YgaXOdAllXsp
It's around 3,000 lines, so I'd rather not paste it here, but you can view it at the link.
Based on the code, do I need to search for more files on my FTP server?

Logged-in Users need to Refresh page to see content

Hi, I'm having an issue with a site where visitors need to be members to access certain pages, but once logged in they go to these pages, still see the 'not logged in' page, and need to refresh to view the actual content.
This obviously leads to a lot of bounces, and I'd like to fix it so that they see the content right away.
The root issue comes from the host's cache settings - unfortunately we can't change hosts for the time being (it's not a regular hosting company with a website, but a design company reseller). This issue does not occur in our offline environment of the same site.
I've already had to add a ?randomnumber to the stylesheet so it loads new versions properly. I was wondering if something like this would work for the pages too - but dynamically, as pages are being added all the time by different admins.
Or any other solutions also appreciated!
Thanks
Like you said, tweaking the caching settings would be ideal. But since that's not an option, I'd suggest adding a random, meaningless query string to the URLs of the member pages so that each one is seen as a 'new page' and (likely) won't be cached.
So instead of /member-page
Direct them to /member-page?cache-buster=randomlyGeneratedStringHere
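To make this dynamic as new pages are added, a small client-side script could rewrite member links on the fly. A minimal sketch, assuming member pages share a /member- path prefix (adjust the selector to match your site's URLs):
<script>
// Append a unique cache-busting parameter to member links at click time.
document.addEventListener('click', function (e) {
  if (!e.target || !e.target.closest) return;
  var link = e.target.closest('a[href*="/member-"]');
  if (!link) return;
  var url = new URL(link.href);
  url.searchParams.set('cache-buster', Date.now().toString(36));
  link.href = url.toString();
});
</script>
Because the parameter changes on every click, each request looks like a brand-new URL to the host's cache.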

Making your site shareable on LinkedIn

I'm having a few issues with making our site shareable on LinkedIn and I'm at a loss. The og: meta tags all look fine and the Facebook scraper picks them up fine, but the LinkedIn scraper does not... and the images etc. are not in a protected folder or anything like that.
Inspecting in the developer tools, the GET request to the url-preview?url= link shows that the image etc. aren't there.
The image is less than 1 MB and all og: meta tags are obeyed. The only thing that may not be 100% is the image ratio, which is not 1/4 or 4/1 (it's 2/1)... but that is only a recommendation and not a hard and fast rule.
Does LinkedIn provide something similar to FB (https://developers.facebook.com/tools/debug/) where you can test the scraper and re-run it? Or is there another way to debug this? Any help appreciated.
https://www.hipla.co.uk (this is the page I'm trying to share).
cheers
It transpires LinkedIn doesn't offer a facility similar to FB's or Twitter's to test the og: meta tags and re-scrape the page. They cache a page for 7 days and then scrape it again. However, you can refresh the LinkedIn crawler's cache simply by appending GET params to the URL, e.g. https://www.hipla.co.uk?123.
I eventually figured out what our issue was. We were using a wildcard cert (for multiple subdomains, so we could have a single SSL cert covering all of them), which meant we had to set the server name in the Apache default-ssl.conf file - and we had a typo in it for the www instance. That gave an SSL error (for the LinkedIn crawler) which isn't debuggable (if that's a word) using LinkedIn, but it was spotted because we got the same SSL error when testing the Twitter metadata tags with the Twitter card validator. Hope this helps anyone else who has a typo in their SSL settings. Note that the SSL error was not visible in a browser; everything looked fine there.
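For reference, the relevant block of default-ssl.conf looks roughly like this (hypothetical domain and cert paths; the bug was a misspelling in one of the name directives):
<VirtualHost *:443>
    # Both names must be spelled correctly; a typo here can make Apache
    # serve the wrong certificate for that hostname, which strict clients
    # like crawlers reject even when browsers appear fine.
    ServerName example.co.uk
    ServerAlias www.example.co.uk

    SSLEngine on
    SSLCertificateFile    /etc/ssl/certs/wildcard.example.co.uk.crt
    SSLCertificateKeyFile /etc/ssl/private/wildcard.example.co.uk.key
</VirtualHost>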

Site Hijacking RSS Feed and Entire Site in iFrame

The following site appears to be hijacking a client's content.
http://mothernova2.rssing.com/chan-24556607/latest.php
This is my client's site.
http://www.mothernova.com/
How would I go about blocking that domain from accessing the site? It also appears they are pulling the site into an iframe, allowing full browsing.
FYI, the site is using WordPress, WordFence and iThemes Security (if there are any settings I should add for blocking).
You need to use a framekilling script, which uses JavaScript to check whether your page is the top window. Here's a simple version:
<script type="text/javascript">
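// If this page is not the top window, break out of the enclosing frame.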
if (top != self) top.location.replace(location);
</script>
One drawback to this approach: if a legitimate site iframes your pages, you'll need to check the referrer and start adding exceptions.
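A sketch of that referrer-exception variant (partner.example.com is a hypothetical allowed framer; substitute your own list):
<script type="text/javascript">
// Allow known-good parents to frame the page; break out of all others.
var allowedOrigins = ['https://partner.example.com'];
if (top != self) {
  var parentOrigin = document.referrer ? new URL(document.referrer).origin : '';
  if (allowedOrigins.indexOf(parentOrigin) === -1) {
    top.location.replace(location.href);
  }
}
</script>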
And a question to answer before you do it: you're getting a pageview and an ad impression from the annoying framing site; is there any reason to go to the bother, when they're sending a few viewers to your client's content?
The owners of rssing.com are well-known scrapers, and they are grabbing your content via RSS - hence the name rssing.com.
You can use their contact form to ask that they take your content down. Tell them they are clearly violating your TOS and your copyright on the content.
(I had to do this in the past for my own content scraped from my site; they did remove my site at my request.)
Maybe I wasn't implementing the above suggestions correctly (I was adding them at the page level), but they weren't working for me. I did find this post, and it seems to work as outlined.
http://forum.ait-pro.com/forums/topic/rssing-com-good-or-bad/
I updated my .htaccess file with the suggested code.
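For anyone who can't reach the link: the general shape of the fix is a referrer-based block in .htaccess, something like this sketch (the linked post has the exact rules):
# Refuse any request referred from rssing.com
RewriteEngine On
RewriteCond %{HTTP_REFERER} rssing\.com [NC]
RewriteRule .* - [F,L]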
Brett

How to suspend a website for users but leave it active for the developer?

I would like to put up a "Coming soon" page on my website, and open the website up when the update is finished.
My website is on a host with a Plesk panel, so I suspended my site and edited the "service temporarily unavailable" error page. But now, as the developer, I cannot check my website (I have to check it on the host and not in the IDE).
What should I do?
Using URL Rewrite you could do a few different things; here are a few of the easier ones:
Redirect by IP. Send everyone that doesn't match your IP address to a maintenance page (make sure you use a 302). If you're on a network where everyone has the same external IP and users on that network are accessing the site, this could be an issue.
Redirect by (lack of) a querystring parameter. This will work if you just need to view and refresh a single page, but if you need to click around through the site it's not going to work.
Redirect by (lack of) a cookie. Have URL Rewrite look for a cookie and, if it's not present, redirect to the maintenance page. This is probably the best solution of the three, as it avoids the pitfalls of the other two approaches: the cookie will persist for however long you tell it to, so you won't get redirected as you click through (as you would with the querystring approach), and the redirects will work for everyone that doesn't have the cookie set - so everybody but you. (A sketch of this rule follows below.)
There are plenty of examples of all of these approaches on this site, on Webmasters.StackExchange, and all over the web, which can be found with a quick search.
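For the cookie approach, a minimal web.config sketch for the IIS URL Rewrite module might look like this ("maint-bypass" and /maintenance.html are hypothetical names; set the cookie in your own browser to get through):
<configuration>
  <system.webServer>
    <rewrite>
      <rules>
        <rule name="Maintenance redirect" stopProcessing="true">
          <match url=".*" />
          <conditions>
            <!-- Let requests through when the bypass cookie is present -->
            <add input="{HTTP_COOKIE}" pattern="maint-bypass=1" negate="true" />
            <!-- Don't redirect the maintenance page itself -->
            <add input="{REQUEST_URI}" pattern="^/maintenance\.html$" negate="true" />
          </conditions>
          <!-- 302 (Found) so the redirect isn't cached as permanent -->
          <action type="Redirect" url="/maintenance.html" redirectType="Found" />
        </rule>
      </rules>
    </rewrite>
  </system.webServer>
</configuration>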
The easiest answer is to leave your site up, with the home page being one of the static pages of your ASP.NET website, and to call the page you are working on something like home2.aspx. That way, people who go to your site without a specific page will get the under-construction page, and you just need to go to the URL of your test page directly when you deploy.
