Prevent Varnish caching for a specific widget / plugin? - WordPress

I have a weather widget on our homepage that uses the user's IP to display current local weather. The issue is that the first person to land on the homepage sees the correct weather, but then all other users see the first user's weather.
Obviously the homepage gets a lot of traffic, so turning the cache off for the page is not an option.
What steps do I need to take to avoid caching just that widget/plugin on the homepage? Since it is a widget that might some day appear on other pages, it would be great if the whole thing could be exempt, but I don't even have a clue how to start.
As an additional note, the widget makes an API request to a 3rd-party service with the IP address as one of the parameters.
Thanks in advance.

If the IP address of the user is included in the homepage as it is returned to the user, you will not be able to cache the page without the side effect you are seeing.
My suggestion would be to get that IP address info to the widget in a separate request. You would load the homepage first, without the user's IP included, and then make a second request from your JavaScript (Ajax, WebSockets, etc.) that gets the IP address from the server, updates the widget's HTML, and makes it display the weather.
It's more work, and the exact implementation will depend on how the widget works.
Hopefully this sends you in the right direction :)
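As a rough sketch of that approach: the cached page ships only a placeholder element, and a small script fills it in per visitor. The `my_weather` action name and the JSON field names below are hypothetical — your plugin would register its own Ajax handler — but `admin-ajax.php?action=...` is the standard WordPress Ajax entry point.

```javascript
// Build the widget markup from the per-visitor response.
// Field names (tempC, city) are illustrative, not a real API contract.
function renderWeather(data) {
  return `<span class="temp">${data.tempC}°C</span> in <span class="city">${data.city}</span>`;
}

async function loadWeather() {
  // This request is made by the visitor's browser, so it bypasses the
  // Varnish-cached page entirely. The server-side handler behind the
  // (hypothetical) "my_weather" action reads the IP and calls the weather API.
  const res = await fetch('/wp-admin/admin-ajax.php?action=my_weather');
  const data = await res.json();
  document.getElementById('weather-widget').innerHTML = renderWeather(data);
}

// Run once the cached page has loaded (guarded so the helpers above
// can also be exercised outside a browser).
if (typeof document !== 'undefined') {
  document.addEventListener('DOMContentLoaded', loadWeather);
}
```

The key point is that `/wp-admin/admin-ajax.php` responses are not cached by default Varnish configurations, so each visitor gets their own weather while the homepage itself stays fully cached.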


ASP.NET/VB.NET page "permissions" -- How to make a page accessible only with a valid PIN

I have an ASP.NET web page that should only be accessible after the user enters a valid PIN on the welcome page. They shouldn't be able to access the page simply by typing the URL into their browser, for example.
Flow: User visits the web page. User enters their PIN on the welcome page.
If the PIN is valid, the user is redirected to the page in question.
If the PIN is invalid, show an error message stating that the PIN is invalid.
Okay, right now I have it set up so that they are redirected to the page if the PIN they enter is correct. The problem is, this doesn't prevent them from entering the URL in their browser or otherwise accessing the page directly. Of course, I could make the URL long and obfuscated, but that doesn't seem secure to me.
I have a lot of ideas for how to make the page secure. The best example is creating an obfuscated unique URL based on their PIN that redirects to the page in question, but I'm not sure how to accomplish this in the ASP.NET realm. It seems like there should be methods, properties, or general coding techniques built into .NET that handle this, since it is so common. I just don't know about them, personally.
Also, I would like to make the connection to this page secure, perhaps using HTTPS or some other security method. Maybe that is best left to another question.
I don't want to sound harsh here, but what you want to accomplish should be based on proper security. I would start with these tutorials and make my way from there: http://www.asp.net/web-forms/tutorials/security
In short, you could restrict the user by requiring a login to the site, backed by database security, etc. This would give you further control and allow you to add future updates or additional security methods if required.
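The tutorials cover the ASP.NET specifics; the core idea, though, is stack-agnostic and worth spelling out: validate the PIN server-side, mark the server-held session, and gate the protected page on that flag rather than on URL secrecy. A minimal sketch (the PIN value and function names are illustrative):

```javascript
// Illustrative server-side gate. The session object lives on the server
// (e.g. ASP.NET Session state), so users cannot forge it by typing a URL.
const VALID_PINS = new Set(['4821']); // in reality: hashed and stored in a DB

function validatePin(session, pin) {
  if (VALID_PINS.has(pin)) {
    session.pinValidated = true; // set only on the server, never from user input
    return true;
  }
  return false;
}

// The protected page calls this first and redirects to the
// welcome page whenever it returns false.
function canViewProtectedPage(session) {
  return session.pinValidated === true;
}
```

With this shape, an obfuscated URL becomes unnecessary: even a user who knows the exact URL is bounced back until their session carries the validated flag.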
Hope this helps

Search bot detection

Is it possible to prevent a site from being scraped by any scrapers, but at the same time allow search engines to parse your content?
Just checking the User-Agent is not the best option, because it's very easy to spoof.
JavaScript checks could be an option (Google executes JS), but a good scraper can do that too.
Any ideas?
Use DNS checking, Luke! :)
1. Check the User-Agent to see if it identifies itself as a search engine bot.
2. If so, get the IP address requesting the page.
3. Reverse DNS lookup the IP address to get a hostname.
4. Forward DNS lookup the hostname to get an IP address.
5. Verify that the forward lookup returns the original requesting IP address.
The same idea is described in Google's help article "Verifying Googlebot".
Checking link access times might also work: if the front page is hit and then the links on the front page are all hit "quickly", it's probably a bot.
Even easier: drop some hidden links in the page; bots will follow them, people almost never will.
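The hidden-link trap boils down to: serve a link humans never see (e.g. `<a href="/trap" style="display:none">`), and flag any IP that requests it. A minimal sketch (the class and route names are hypothetical, and in practice the flagged set would live in a shared store rather than in memory):

```javascript
// Honeypot tracker: humans never see or click the hidden trap link,
// but naive scrapers following every <a> on the page will hit it.
class Honeypot {
  constructor() {
    this.flagged = new Set(); // IPs that requested the trap URL
  }

  // Call this from the request handler for the trap URL.
  recordHit(ip) {
    this.flagged.add(ip);
  }

  // Consult this before serving real content; block or throttle if true.
  isScraper(ip) {
    return this.flagged.has(ip);
  }
}
```

Remember to exclude the trap URL in robots.txt so well-behaved search engine bots don't get flagged along with the scrapers.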

Using cookie in asp.net

I have like/dislike buttons, and I put them in an UpdatePanel to be able to update the counter without refreshing the page. The problem is that the user can click the like/dislike button several times and the counter changes each time.
I want to allow the user to click the button only once. I think I could use a cookie, but I haven't used them before, so I would be thankful if anyone can help me with that.
Also, if there is any other solution that may be better, please let me know.
Thanks in advance.
If you want to use cookies, you can look at this page (older version) or this page (newer version).
You haven't described what kind of website you are creating, but if you have a user registration/login mechanism, you could just save information that a specific user clicked the like button in your database.
If logging in is not acceptable, you can try to identify your users by their IP addresses, as Adam suggested. You can do this by using:
String remoteAddress = HttpContext.Current.Request.UserHostAddress;
or
String remoteAddress = HttpContext.Current.Request.ServerVariables["REMOTE_ADDR"];
Either way, I think it would be best to use cookies combined with another method, because you can then check the cookie first. If it exists on the user's computer, you know he/she has already voted. If the cookie is not there, you can query the database for the saved information about the user (identified by IP or login mechanism). This way you make fewer queries to the database, which should be good for your application's performance :).
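That cookie-first flow can be sketched in a few lines. The cookie name `voted_item_<id>` is arbitrary, and `dbHasVote` stands in for your real database query:

```javascript
// Turn a Cookie header string like "a=1; b=2" into { a: '1', b: '2' }.
function parseCookies(cookieHeader) {
  const jar = {};
  for (const part of cookieHeader.split(';')) {
    const [name, ...rest] = part.trim().split('=');
    if (name) jar[name] = rest.join('=');
  }
  return jar;
}

// Cheap cookie check first; hit the database only when the cookie is absent.
function hasVoted(cookieHeader, itemId, dbHasVote) {
  const jar = parseCookies(cookieHeader);
  if (jar[`voted_item_${itemId}`] === '1') return true;
  return dbHasVote(itemId); // fallback: cookie may have been cleared
}
```

The database fallback is what makes cleared cookies a nuisance rather than a loophole: the vote record survives server-side even when the client-side marker is gone.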
Instead of using cookies, you can track votes by IP address.
I know IP addresses can change over time, so you could combine this with cookies, but cookies can also be cleared, so nothing will be 100% reliable.
When a user clicks like or dislike, store their IP address with the record of the vote.
Add a check to stop another like or dislike from counting if they have already voted.
Then, on your update, remove the like/dislike button and just show the count.
This is what I use for my application. I also have a Facebook app, in which I use the Facebook user ID, which is much harder to fake.
Either way, I think the IP address is the best way to detect and stop someone from voting twice.
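The deduplication described above comes down to a membership check before incrementing. A sketch (in a real app the voter list would be a database table keyed by IP or Facebook user ID, not an in-memory set):

```javascript
// One-vote-per-identifier counter.
class VoteCounter {
  constructor() {
    this.likes = 0;
    this.voters = new Set(); // IPs (or user IDs) that have already voted
  }

  // Returns true if the vote was counted, false if it was a duplicate.
  like(id) {
    if (this.voters.has(id)) return false; // already voted -> ignore
    this.voters.add(id);
    this.likes += 1;
    return true;
  }
}
```

The boolean return value is also what tells the UI to remove the button and show only the count.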

ASP.NET Saving Customer's Shipping/Billing Addresses

I'm looking for the simplest solution to this situation:
I have a pre-existing web store with a shopping cart using .NET (VBScript).
I customize which products my customers see based on the subdomain they use to reach my site (customer.mysite.com).
What my customers are requesting is that, instead of typing in their billing/shipping addresses each time, they have a selection of previously used addresses to choose from.
How can I accomplish this, keeping in mind that they don't log in? They simply use the subdomain to come to my site and place orders without a username/password.
The simpler (easier to implement) the solution, the better.
Why not just show all the addresses for that subdomain? Due to privacy concerns, though, I would wait until they type in a street address and then show them the matching addresses.
Otherwise, everyone on that subdomain will see the addresses of everyone else on that subdomain.
If they don't care, then just show all the addresses for that subdomain.
Or, give them the option to log in and order; when they do, you can show them all the addresses they have shipped to while they are logged in.
The last one is the preferred option, IMO.
If they don't log in, then I assume you don't have them create an account either. Thus the server won't be able to identify them. In this case I think you are left with using client cookies. Just make sure you don't store sensitive data in them (like credit card numbers).
I would place a cookie on the user's computer with the address information in it, tied to the subdomain. The downside is that you should not put sensitive information inside cookies, but depending on the nature of your business this may not be a problem for you.
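A sketch of the cookie approach: keep a small list of previously used addresses as JSON in a cookie value. The field names and size limit below are illustrative, and nothing sensitive like card numbers should ever go in:

```javascript
const MAX_ADDRESSES = 5; // cookies are small; keep the list short

// `stored` is the raw JSON string read from the cookie ('' on first visit).
// Returns the updated JSON string to write back.
function addAddress(stored, address) {
  const list = stored ? JSON.parse(stored) : [];
  const key = `${address.street}|${address.zip}`;
  // Drop any earlier copy of the same address, then put it first.
  const deduped = list.filter(a => `${a.street}|${a.zip}` !== key);
  deduped.unshift(address);
  return JSON.stringify(deduped.slice(0, MAX_ADDRESSES));
}

// In the browser you would persist the result with something like:
//   document.cookie = 'addresses=' + encodeURIComponent(json) +
//                     '; max-age=31536000; path=/';
```

On the checkout page, parse the stored JSON and render the list as the selection the customers asked for; re-run `addAddress` whenever an order is placed.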

Protect WordPress login page

I have a WordPress site. Like with many WordPress sites I see people (probably robots) trying their luck at the login page every once in a while. However, for the past 2 weeks it’s been non-stop at a rate of 400-500 tries a day…
So I went ahead and took the following security measures:
Changed the login URL to something different than the regular /wp-admin.
Limited the number of login attempts per URL and also automatically blocked any IP trying to log in with an invalid username such as “test” or “admin”.
Set up two-factor authentication to make sure they would not manage to get in, even if they guessed the username and password.
However that didn’t seem to do much and I’m still seeing a huge number of login attempts, so next thing I did was:
Password-protected the login URL itself.
And still I’m seeing the same number of login attempts… now my questions are basically 2:
How are they managing to still try their luck at the login form even though that page is password protected?
Is there anything else I can do about it?
Cloudflare offers a free entry-level plan that may help reduce some of this traffic before it gets to your site. Also, their $20/month plan (as of Aug 2017) can be paired with their WordPress plugin to use their built-in WordPress rulesets. Cloudflare also has a few more settings that allow you to put additional filters and roadblocks in front of specific types of traffic.
If you do choose to use Cloudflare with WordPress, be sure you understand exactly how/if you are choosing to push content into the Cloudflare CDN (content delivery network) and how that relates to the content cache on your site.
Standard disclaimer: I have no relationship with Cloudflare except as a customer.