Implications of having a site in full HTTPS - ASP.NET

I am currently developing an MVC4 web application for eCommerce. The site will contain a login, and users can visit the site, enter their details and submit orders, etc. It is a traditional eCommerce site.
To boost the security of the site, I am looking to serve the entire site over HTTPS. As the user will be supplying their login credentials, and personal information will be stored in cookies, I would like the site to be fully secured.
I have some concerns, though: if I set up the site in HTTPS, will it hurt performance? Will it impact negatively on search engine optimization? Are there any other implications of having an entire site in HTTPS?
I use output caching to cache the content of my views - with HTTPS, will these still get cached?
I have been reviewing security guidelines and documentation, such as this guidance from OWASP, which recommends it. I also see that sites such as Twitter are fully HTTPS.
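For reference, the standard way to enforce site-wide HTTPS in MVC4 is to register the built-in RequireHttpsAttribute as a global filter; a minimal sketch, assuming the default FilterConfig class from the project template:

    using System.Web.Mvc;

    // RequireHttpsAttribute redirects GET requests to the HTTPS scheme and
    // rejects other insecure requests, so every controller action is covered.
    public class FilterConfig
    {
        public static void RegisterGlobalFilters(GlobalFilterCollection filters)
        {
            filters.Add(new HandleErrorAttribute());
            filters.Add(new RequireHttpsAttribute());
        }
    }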

Generally speaking, no - whole-site encryption is not a problem for performance.
(Just make sure you disable SSL 2.0 on your server, as the protocol itself is broken. Note that the BEAST attack actually targets the CBC ciphers in SSL 3.0 and TLS 1.0, so prefer TLS 1.1 or 1.2 where your clients support them; TLS 1.0 has been supported by pretty much every browser since around 2000.)
The performance issues were a problem years ago, but not anymore. Modern servers have the capacity to deal with the encryption of hundreds of requests and responses every second.
You haven't mentioned deploying a load-balancer or failover system, which implies your site won't be subject to thousands of pageviews every second. That's when you need to start using SSL offloaders - but you're okay for now.
Output caching is not affected by encryption - just make sure you're not serving one person's output to another (e.g. keep a shopping cart or banking details in Session, or include the session ID in the cache key).
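For example, a minimal sketch of that approach in MVC4: cache the shared pages for everyone, but vary any per-user output by session via VaryByCustom (controller and action names here are made up):

    using System;
    using System.Web;
    using System.Web.Mvc;

    // Hypothetical controller: the catalogue page is identical for every
    // visitor, so it can be cached and shared; the cart is per-user, so its
    // cache entry is varied by session.
    public class ShopController : Controller
    {
        [OutputCache(Duration = 300, VaryByParam = "none")]
        public ActionResult Catalogue()
        {
            return View();
        }

        [OutputCache(Duration = 60, VaryByParam = "none", VaryByCustom = "session")]
        public ActionResult Cart()
        {
            return View();
        }
    }

    // In Global.asax.cs: map the "session" custom string to the session cookie.
    // (HttpContext.Session isn't available yet at this point in the pipeline,
    // so the default session cookie is read directly.)
    public class MvcApplication : HttpApplication
    {
        public override string GetVaryByCustomString(HttpContext context, string custom)
        {
            if (string.Equals(custom, "session", StringComparison.OrdinalIgnoreCase))
            {
                HttpCookie cookie = context.Request.Cookies["ASP.NET_SessionId"];
                return cookie != null ? cookie.Value : string.Empty;
            }
            return base.GetVaryByCustomString(context, custom);
        }
    }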

Related

Why would a consistent number of visitors be missing User-Agent headers?

We recently deployed a mobile version of our site, and part of that deployment included a User-Agent check to determine which version to deliver to the end user.
Every minute or so since we released, we've had an Elmah error from an exception that was thrown when a User-Agent was blank.
We've already fixed the issue in production, but I'm curious as to why a consistent (but very small) percentage of our traffic might not have User-Agent defined.
This is just a guess, but the blank User-Agents could come from bots.
There are an astonishing number of bots (search engines, botnets and others) that constantly scan websites and servers for vulnerabilities, passwords and such. Sometimes they send a known User-Agent, sometimes none at all.
You could use a CDN service like CloudFlare to get an idea of how many of those requests come from robots (no, I don't work for that company - but using their services made me realize how much the web is polluted by bots; the stats are scary).
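Whatever the source, the practical fix is to treat the header as optional. A minimal sketch of a null-safe check, since the question doesn't show the actual fix (the helper name is hypothetical):

    using System;
    using System.Web;

    public static class DeviceDetection
    {
        public static bool IsMobile(HttpRequestBase request)
        {
            // Request.UserAgent can legitimately be null or empty (bots, privacy
            // proxies, hand-rolled HTTP clients), so never assume it is present.
            string userAgent = request.UserAgent ?? string.Empty;
            return userAgent.IndexOf("Mobile", StringComparison.OrdinalIgnoreCase) >= 0;
        }
    }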

ASP.NET shared hosting security and performance for eCommerce applications

Is there any disadvantage to using shared hosting in general (DiscountASP.NET in particular) for an eCommerce website? Are there security or performance concerns? The site is new and we don't expect many visitors right now; we have at least 30 products.
I am using my own shopping cart, user accounts (Membership provider), a credit card processor (PayPal) and my own CMS, in C# ASP.NET 4.0 Web Forms with SQL Server 2008.
I don't save credit card information in the database, my system only creates an account for users who buy something during the checkout process, and we only need processing power for a few PayPal API calls during checkout (very low CPU usage, I guess).
My website is optimized client-side and server-side: I have ASP.NET's built-in XSS protection (request validation) enabled and Microsoft's AntiXSS library applied to all inputs/outputs (forms, cookies, HTTP headers, query strings and even web services), stored procedures and parameterized queries to avoid SQL injection, SSL connections, anti-spam, compiled and obfuscated DLLs, an encrypted web.config, etc.
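For illustration, the data access follows this kind of parameterized pattern (table and column names here are made up):

    using System.Data;
    using System.Data.SqlClient;

    public static class ProductRepository
    {
        public static DataTable GetProduct(string connectionString, int productId)
        {
            using (var connection = new SqlConnection(connectionString))
            using (var command = new SqlCommand(
                "SELECT Id, Name, Price FROM Products WHERE Id = @Id", connection))
            {
                // The value travels as a typed parameter, never as concatenated SQL.
                command.Parameters.Add("@Id", SqlDbType.Int).Value = productId;

                var table = new DataTable();
                new SqlDataAdapter(command).Fill(table);
                return table;
            }
        }
    }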
Am I missing something? Thanks, and sorry for my bad English.
Just to give you a quick answer:
Yes, there are problems. Plenty of them.
Performance is typically worse on shared hosting than on dedicated. Someone else might be using all the I/O and you get bottlenecked.
Security: as you can't manage the server, you have no way of knowing whether it's patched, hardened, etc. If one of the thousand other people on the same shared server manages to exploit it, you're done. However, one could argue that if you don't know how to secure a dedicated server, it might be better to rely on the shared hosting provider's experience.
Also, you have to trust the shared hosting provider not to steal your data etc.
One other thing is that if something crashes, you have to wait for the provider to fix it rather than just doing it yourself. Again, if you don't know how to fix it, it might be better to wait for the provider anyway.
All in all, for your site I would start with shared hosting and move up to a VPS as soon as the site starts generating some money.

Why do we need CDNs when HTTP proxies already cache content?

CDNs seem to be a popular way of improving an app's performance.
But why are they needed when HTTP proxies on the web can already cache the content?
CDNs are a kind of web cache, just one operated under your auspices, rather than the web user's. You get full control of the freshness of your content, whereas you don't have any control of the proxy servers "out there".
The user's proximity to your web server has an impact on response times. Deploying your content across multiple, geographically dispersed servers will make your pages load faster from the user's perspective. But where should you start?
Read the full article at https://developer.yahoo.com/performance/rules.html
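For comparison, the freshness that a CDN edge (or any downstream cache) will honour is whatever cache headers your origin emits; a minimal MVC sketch with made-up names:

    using System;
    using System.Web;
    using System.Web.Mvc;

    public class AssetsController : Controller
    {
        public ActionResult Logo()
        {
            // Public + max-age tells CDN edges, proxies and browsers they may
            // keep this response for up to seven days.
            Response.Cache.SetCacheability(HttpCacheability.Public);
            Response.Cache.SetMaxAge(TimeSpan.FromDays(7));
            return File(Server.MapPath("~/Content/logo.png"), "image/png");
        }
    }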

Check if anyone is currently using an ASP.NET app (site)

I build ASP.NET websites (hosted under IIS 6 usually, often with SQL Server backends and forms authentication).
Clients sometimes ask if I can check whether there are people currently browsing (and/or whether there are users currently logged in to) their website at a given moment, usually so they can safely do a deployment (they want a hotfix, for example).
I know the web is basically stateless so I can't be sure whether someone has closed the browser window, but I imagine there'd be some count of not-yet-timed-out sessions or something, and surely logged-in-users...
Is there a standard and/or easy way to check this?
Jakob's answer is correct but does rely on installing and configuring the Membership features.
A crude but simple way of tracking users online would be to store a counter in the Application object. This counter could be incremented/decremented upon their sessions starting and ending. There's an example of this on the MSDN website:
Session-State Events (MSDN Library)
Because the default session timeout is 20 minutes, the accuracy of this method isn't guaranteed (but then that applies to any web application, due to the stateless and disconnected nature of HTTP).
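A minimal sketch along the lines of that MSDN example (note it only works with in-process session state, and Session_End fires only after the timeout expires):

    using System;
    using System.Web;

    public class Global : HttpApplication
    {
        protected void Application_Start(object sender, EventArgs e)
        {
            Application["UsersOnline"] = 0;
        }

        protected void Session_Start(object sender, EventArgs e)
        {
            Application.Lock();
            Application["UsersOnline"] = (int)Application["UsersOnline"] + 1;
            Application.UnLock();
        }

        protected void Session_End(object sender, EventArgs e)
        {
            Application.Lock();
            Application["UsersOnline"] = (int)Application["UsersOnline"] - 1;
            Application.UnLock();
        }
    }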
I know this is a pretty old question, but I figured I'd chime in. Why not use Google Analytics and view their real time dashboard? It will require minor code modifications (i.e. a single script import) and will do everything you're looking for...
You may be looking for the Membership.GetNumberOfUsersOnline method, although I'm not sure how reliable it is.
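For reference, it counts members whose LastActivityDate falls within Membership.UserIsOnlineTimeWindow (15 minutes by default), so it only sees authenticated, Membership-backed users; a trivial sketch:

    using System.Web.Security;

    public static class OnlineUsers
    {
        // Requires the Membership provider to be configured in web.config.
        public static int Count()
        {
            return Membership.GetNumberOfUsersOnline();
        }
    }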
Sessions, suggested by other users, are a basic way of doing things, but are not too reliable. They can also work well in some circumstances, but not in others.
For example, if users are downloading large files, watching videos or listening to podcasts, they may stay on the same page for hours (unless the requests for the binary data are also tracked by ASP.NET), but they are still using your website.
Thus, my suggestion is to use the server logs to detect if the website is currently used by many people. It gives you the ability to:
See what sort of requests are done. It's quite easy to detect humans and crawlers, and with some experience, it's also possible to see if the human is currently doing something critical (such as writing a comment on a website, editing a document, or typing her credit card number and ordering something) or not (such as browsing).
See who is doing those requests. For example, if Google is crawling your website, it is a very bad idea to go offline, unless the search rating doesn't matter for you. On the other hand, if a bot is trying for two hours to crack your website by doing requests to different pages, you can go offline for sure.
Note: if a website has some critical areas (for example, writing this long answer, I would be angry if Stack Overflow went offline a few seconds before I submitted it), you can also send regular AJAX requests to the server while the user stays on the page. Of course, you must be careful when implementing such a feature, and take into account that it will increase the bandwidth used and will not work if the user has JavaScript disabled.
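A hedged sketch of such a heartbeat endpoint (names are hypothetical; the page would POST to Heartbeat every minute or two, and a deployment script could read ActiveCount):

    using System;
    using System.Collections.Concurrent;
    using System.Linq;
    using System.Web.Mvc;

    public class PresenceController : Controller
    {
        // In-memory only: resets on app-pool recycle, which is fine for a
        // quick "is anyone on the site right now?" check.
        private static readonly ConcurrentDictionary<string, DateTime> LastSeen =
            new ConcurrentDictionary<string, DateTime>();

        [HttpPost]
        public ActionResult Heartbeat()
        {
            LastSeen[Session.SessionID] = DateTime.UtcNow;
            return new HttpStatusCodeResult(204);
        }

        public ActionResult ActiveCount()
        {
            DateTime cutoff = DateTime.UtcNow.AddMinutes(-5);
            return Content(LastSeen.Count(entry => entry.Value >= cutoff).ToString());
        }
    }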
You can run the netstat command and see how many active connections exist to your website's ports.
The default port for HTTP is *:80.
The default port for HTTPS is *:443.

Pros & Cons of URL Expiring Concept

Is there any disadvantage of using URL expiring concept to protect online videos?
You're stopping your users from usefully bookmarking the URLs to get back to them later (thus making life harder for them), and stopping them from mailing, tweeting or otherwise sending the URLs of your best videos to their friends, which basically kills any chance that the videos may "go viral" and attract large numbers of viewers. In other words, you're working totally against the growing "social" component of the web, as well as making your users' experience less pleasant and useful.
If you want to "protect" a URL, e.g. against visits from users who aren't registered with your service, why not use the HTTP authentication mechanisms and/or cookies handed out and processed at application level in correspondence with registration and login? It seems to me that such approaches, if you do need protection, have fewer issues than "expiring URLs".
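For context, an "expiring URL" is usually just a link carrying an expiry timestamp plus an HMAC signature so it can't be tampered with; a minimal sketch (key management and the matching server-side verification are omitted):

    using System;
    using System.Security.Cryptography;
    using System.Text;

    // Illustrative only: builds a signed link such as
    // /videos/intro.mp4?expires=1700000000&sig=...
    public static class ExpiringUrl
    {
        private static readonly DateTime Epoch =
            new DateTime(1970, 1, 1, 0, 0, 0, DateTimeKind.Utc);

        public static string Sign(string path, TimeSpan lifetime, byte[] secretKey)
        {
            long expires = (long)(DateTime.UtcNow.Add(lifetime) - Epoch).TotalSeconds;
            string payload = path + "|" + expires;

            using (var hmac = new HMACSHA256(secretKey))
            {
                string signature = Convert.ToBase64String(
                    hmac.ComputeHash(Encoding.UTF8.GetBytes(payload)));
                return path + "?expires=" + expires +
                       "&sig=" + Uri.EscapeDataString(signature);
            }
        }
    }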
