What might my user have installed that's going to break my web app? - asp.net

There are probably thousands of applications out there like 'Google Web Accelerator' and all kinds of popup blockers. Then there are header-blocking personal firewalls, full-site blockers, and paranoid cookie monsters.
Fortunately Web Accelerator is now defunct (I suggest you read the above article - it's actually quite funny what issues it caused), but there are so many other plugins and third-party apps out there that it's impossible to test them all with your app until it's out in the wild.
What I'm looking for is advice on the most important things to remember when writing a web app (whatever the technology) to ensure the user's environment isn't going to break it. Kind of like a checklist.
What's the craziest thing you've experienced?
PS. I may have linked to net-nanny above, but I'm not trying to make a porn site

The best advice I can give is to program defensively. For example, don't assume that all of your scripts will be loaded. I've seen cases where Adblock Plus will block one out of ten scripts included in a page just because it has the word "ad" in the name or path. While you can work around this by renaming the file, it's still good to check that a particular object exists before using it.
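For example, a minimal sketch of that kind of defensive check might look like the following; the adTracker object and its trackPageView method are made-up names standing in for whatever the blocked script would normally define:

    // Only use the object if the script that defines it actually loaded.
    // "adTracker" and "trackPageView" are hypothetical names standing in
    // for whatever your potentially blocked script provides.
    if (typeof window.adTracker !== 'undefined' &&
        typeof window.adTracker.trackPageView === 'function') {
        window.adTracker.trackPageView();
    } else {
        // The script was blocked (e.g. by an ad blocker) or failed to load;
        // degrade gracefully instead of throwing a ReferenceError.
    }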
The weirdest thing I've seen wasn't so much a browser plugin as a firewall/proxy configuration at a user's workplace. They were using a Squid proxy that tried to remove ads by replacing any image HTTP request it thought was an ad with a single-pixel GIF. Unfortunately it did this for non-GIF images too, so when our iPhone application was expecting a PNG image and got a GIF, it would crash.

Internet Explorer 6. :)
No, but seriously: Firefox plugins like NoScript and Greasemonkey, for one, though those users are likely to be a very small minority.
Sometimes the user's environment means a screen reader (or even a braille interface like this). If your layout is in any way critical to the content being delivered as intended, you've got a problem right there.
Web pages break; it's a fact of life. The closer you code and design to the standards, the less of it is your fault.

Something I have checked in the past is loading some of the more popular toolbars that people tend to install (Google, Yahoo, MSN, etc.) and seeing how that affects the user's experience.
To a certain extent it is difficult to preempt which of the products you mentioned will be used by your users since there are so many. I would say your best bet is to test for the most frequent products that your user base may employ and roll with the punches for the rest. If you have the time to test other possible scenarios, by all means do.
Making it easy for your users to report possible issues also helps lessen the time it takes to get a fix in place, should it be something you can work around.

Related

Prevent automated tools from accessing the website

The data on our website can easily be scraped. How can we detect whether a human is viewing the site or a tool?
One way would be to measure how long a user stays on a page, but I do not know how to implement that. Can anyone help with detecting and preventing automated tools from scraping data from my website?
I use a security image in the login section, but even then a human may log in and then hand off to an automated tool. Even if a reCAPTCHA image appears after a period of time, the user could solve it and then let the automated tool continue scraping data.
I developed a tool to scrape another site. So I only want to prevent this from happening to my site!
DON'T do it.
It's the web; you will not be able to stop someone from scraping your data if they really want it. I've done it many, many times before and got around every restriction they put in place. In fact, having a restriction in place motivates me further to try and get the data.
The more you restrict your system, the worse you'll make the experience for legitimate users. It's just a bad idea.
It's the web. You need to assume that anything you put out there can be read by human or machine. Even if you can prevent it today, someone will figure out how to bypass it tomorrow. Captchas have been broken for some time now, and sooner or later, so will the alternatives.
However, here are some ideas for the time being.
And here are a few more.
And my favorite: one clever site I've run across has a good one. It asks a question like "On our 'about us' page, what is the street name of our support office?" It takes a human to find the "About Us" page (the link doesn't say "about us"; it says something similar that a person would figure out), and then to find the support office address (different from the main corporate office and several others listed on the page) you have to look through several matches. Current computer technology can't figure that out any more than it can handle true speech recognition or cognition.
A Google search for "Captcha alternatives" turns up quite a bit.
This can't be done without risking false positives (and annoying users).
How can we detect whether a human is viewing the site or a tool?
You can't. How would you handle tools parsing the page for a human, like screen readers and accessibility tools?
For example, one way is by measuring the time a user stays on a page, from which we can detect whether human intervention is involved. I do not know how to implement that, but I am thinking about this method. Can anyone help with how to detect and prevent automated tools from scraping data from my website?
You won't detect automatic tools, only unusual behavior. And before you can define unusual behavior, you need to find what's usual. People view pages in different orders, browser tabs allow them to do parallel tasks, etc.
I should make a note that if there's a will, then there is a way.
That being said, I thought about what you've asked previously and here are some simple things I came up with:
Simple, naive checks might include user-agent filtering and checking. You can find a list of common crawler user agents here: http://www.useragentstring.com/pages/Crawlerlist/
You can always display your data in Flash, though I do not recommend it.
Use a CAPTCHA.
Other than that, I'm not really sure if there's anything else you can do but I would be interested in seeing the answers as well.
EDIT:
Google does something interesting: if you're searching for SSNs, after the 50th page or so they will show a CAPTCHA. That raises the question of whether you can intelligently time how long a user spends on your pages or, if you introduce pagination into the equation, how long a user spends on a single page.
Using the assumption above, it is possible to put a time limit in place before another HTTP request is accepted. At that point, it might be beneficial to "randomly" generate a CAPTCHA. What I mean is that maybe one HTTP request will go through fine, but the next one will require a CAPTCHA. You can switch those up as you please.
Scrapers steal the data from your website by parsing URLs and reading the source code of your pages. The following steps can be taken to make scraping at least a bit more difficult, if not impossible:
Ajax requests make it harder to parse the data and require extra effort to discover the URLs to be parsed.
Use cookies even for normal pages which do not require any authentication: create the cookie once the user visits the home page, and then require it for all the inner pages. This makes scraping a bit more difficult.
Display the content encoded and then decode it at load time using JavaScript; I have seen this on a couple of websites (a small sketch follows below).
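As a rough illustration of that last point (not a real security measure, since any JavaScript-capable scraper can still read the decoded result), the markup could carry the text Base64-encoded and a small script could decode it at load time. The element id and data attribute here are made up for the example:

    // Decode Base64-encoded text placed in the markup, at load time.
    // The "contact-info" element and its "data-encoded" attribute are
    // hypothetical; atob() is the browser's built-in Base64 decoder.
    window.onload = function () {
        var el = document.getElementById('contact-info');
        if (el) {
            el.innerHTML = atob(el.getAttribute('data-encoded'));
        }
    };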
I guess the only good solution is to limit the rate at which data can be accessed. It may not completely prevent scraping, but at least you can limit the speed at which automated scraping tools will work, hopefully below a level that will discourage scraping the data.
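A very rough sketch of that kind of rate limiting, written here as a framework-neutral JavaScript function (the window size and request cap are arbitrary example numbers; in a real ASP.NET or ColdFusion app you would do the equivalent check server-side, keyed by IP address or session):

    // Naive in-memory rate limiter: allow at most MAX_REQUESTS per client
    // within a sliding window of WINDOW_MS milliseconds. The limits are
    // arbitrary examples.
    var WINDOW_MS = 60 * 1000;
    var MAX_REQUESTS = 30;
    var hits = {}; // clientId -> array of recent request timestamps

    function isAllowed(clientId) {
        var now = Date.now();
        var recent = (hits[clientId] || []).filter(function (t) {
            return now - t < WINDOW_MS;
        });
        recent.push(now);
        hits[clientId] = recent;
        return recent.length <= MAX_REQUESTS;
    }

    // Usage: if (!isAllowed(clientIp)) { serve a CAPTCHA or an HTTP 429 }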

Is it OK if everything looks fine but the X/HTML and CSS are not valid, for a CMS's admin/control panel?

Is it OK if everything looks fine but the HTML and CSS are not valid, for a CMS admin/control panel?
Should we only consider web standards for the site itself, and not necessarily for site-management tools?
For example:
http://example.com/wp-admin
http://example.com/admin/
Well, the point of standards compliance is to make everything work correctly for every user. Even though admin areas are only accessible to a few select users per site, if you are building a CMS you have to consider that many, many people might use your script, which adds additional users who will need to access those admin panels. It's best to make everything standards compliant; that's why the standards exist. If an admin can't get the admin panel working properly, they'll ditch the script.
I agree it may not be worthwhile to make everything valid. As long as you've done your testing and it's working, it's probably not worth the time.
Some validation errors matter, some don't. The spec is a bit ridiculous in its requirements in places. What is important is that all open tags are closed and nested correctly; if this is not done, there can be many subtle errors both now and in the future. As for non-encoded entities, or using rels and targets when you shouldn't, it doesn't matter so much.
Since you can (usually) control access to the admin area (rather than having every single device and platform access it), it certainly matters less. The time would probably be better spent adding more features and fixing real bugs rather than aiming for 100% compliance. Don't tell any standards advocates I said that, though. :)

Is it worth the time to care about users who have Javascript disabled in their browsers? [duplicate]

Possible Duplicate:
Is it worth it to code different functionality for users with javascript disabled?
I've just learned in this question that an ASP.NET WebForms application will have issues running properly in a browser with JavaScript disabled (unless I avoid certain controls and features).
So this is kind of a follow-up question. For my current web application, which will only have a small number of users, I can simply require that JavaScript be enabled.
But how should this question be dealt with in general? As far as I can see (with my rather limited knowledge of web development so far), JavaScript is all around, especially the more a web site is "dynamic" or RIA-like.
Is it worth it at all, during web app development, to cater to the few users who disable JavaScript in their browsers?
Are they really "few"? (I actually have no clue; I just don't know anyone who has JavaScript disabled.)
I'm a bit inclined to tell them: "If you want a nice interactive page, don't disable JavaScript" or "You cannot enter this website without JavaScript. Go away!", because I don't want to double the coding and development time for some mavericks (if they are mavericks). Does it indeed add a lot more time to get a website working both with and without JavaScript?
For what reason does someone disable JavaScript at all in his or her browser? Well, I've heard: "Security!" How insecure is JavaScript actually? And does that mean, in the end, that millions of pages are insecure and almost every website I use is a risk for me? Are there other reasons besides security?
Are there any industries or environments where it is usual, or even mandatory, to disable JavaScript for security (or other) reasons? (For instance, I have security services, defense ministry offices, or banks in mind.)
Are there any development trends toward adding more JavaScript to web sites, making it more and more necessary to leave JavaScript enabled in the browser? Or is there rather a countertrend?
Thank you in advance for any feedback!
Whether or not to care about users who turn off JavaScript should be decided on a case-by-case basis. If you believe that it is OK to turn away users who do not have it enabled, then that is a decision you can make, and one made by many apps.
Keep in mind that it is not necessarily a conscious decision to have JavaScript disabled or available only at a limited capacity. Screen readers, for example, have a very stripped-down version of JavaScript support, and a site that relies on it throughout will often be inaccessible. Depending on the website, this may actually be illegal.
If a website is properly constructed with progressive enhancement from the beginning, then creating versions that work without JavaScript should not be too much additional work. Therein lies one of the major issues with WebForms: it is difficult to gain control over the markup, and the JavaScript tends to be very tightly coupled.
My personal view is that the number of people who completely disable JavaScript, to the extent that .NET web sites stop functioning, is very small. For some government sites I have been responsible for, I don't recall getting any complaints from "non-JavaScript" users.
My concern would be more about making sure your site is XHTML compliant, with valid markup (which earlier versions of Visual Studio did not generate), valid CSS, and intelligent use of JavaScript.
Having a disclaimer somewhere on your site, accessible to those few with JavaScript disabled, telling people that JavaScript is required for the site to function correctly would be a good thing.
Depends on your audience. For one thing, if the site is completely nonfunctional without JavaScript, accessibility (e.g. to those who must use a screen reader) may be an issue, so if you expect any blind users, you might need to consider that.
In most situations, I'd say, you're probably fine just using <noscript> tags to drop in a quick disclaimer along the lines of "JavaScript is required to use this web site."
While I have no solid numbers -- and I presume I'm in the distinct minority -- I, and many of the more savvy users I know, disable JavaScript by default (a la NoScript). I enable it on websites on a case by case basis. Most novice users (I'm ignoring the 25% in the upper/middle of experience) don't even know what "JavaScript" means.
As a developer, I see the cost/benefit of supporting JS-less users as boiling down to one question:
Do your users need the site more, or do you need your users more?
One of my current projects makes heavy use of JavaScript and Flash and does not function at all without it. But as it's installed at the employer's site and the visitors are the employees using it for their job, that requirement is completely reasonable.
However, if I were working on a revenue-generating site where losing users meant losing money, I'd seriously think about crafting the site to work w/o JS -- albeit less quickly with more page reloads. (Perhaps by asking advice here.)
As a random thought: you could probably devise a special method to determine how many of your current users do and don't have JS enabled (a small sketch of one approach follows).
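One simple way to do that, sketched below, is to fire a tiny request from JavaScript and compare the hit count on that URL against your total page views; the /js-check.gif endpoint is a made-up path you would have to add and log on the server:

    // Request a beacon URL that only JavaScript-enabled browsers will hit.
    // Comparing hits on this URL with total page views gives a rough
    // estimate of how many visitors have JS on. "/js-check.gif" is a
    // hypothetical endpoint whose hits you would count server-side.
    (function () {
        var beacon = new Image();
        beacon.src = '/js-check.gif?nocache=' + new Date().getTime();
    })();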

How to make a website run faster? [closed]

I'm in my first 3 months of web development and I have been dabbling with some server-side scripting in the form of ColdFusion, along with some JavaScript, jQuery, and CSS.
I have read about CSS optimization and would like to know what other pertinent factors contribute to the better performance of a website. What factors can a developer profile and optimize?
How big a part does picking (or rather, recommending) a particular browser play in this performance hunt?
cheers
Install the YSlow and Page Speed plugins for Firefox. Then start looking at all of the ways your site is unoptimized. This will be much like trying to take a sip of water from a fire hydrant.
Using minified (and possibly aggregated) JavaScript and CSS along with a good, healthy far-future Expires header is a really good way to start.
Don't forget to gzip.
And use ETags.
And put your CSS at the top of the document.
And put the JavaScript at the end (see the sketch after this list).
And use separate domains for static resources.
And avoid URL redirects.
And remove duplicate JavaScript and CSS.
And... see what I mean about the fire hydrant!
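To make the "JavaScript at the end" point concrete, here is a minimal sketch of loading a non-critical script only after the page has rendered, so it never blocks the initial display; the /js/analytics.js path is just a placeholder:

    // Inject a non-critical script after the page has finished loading so
    // it does not block the initial render. "/js/analytics.js" is a
    // placeholder for whatever non-essential script you want to defer.
    function loadScriptDeferred(src) {
        var s = document.createElement('script');
        s.src = src;
        s.async = true;
        document.body.appendChild(s);
    }

    window.onload = function () {
        loadScriptDeferred('/js/analytics.js');
    };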
Just to sum up the points from above:
The speed of a website depends on a few things:
Server
Connection
Client
And you can make improvements in each of these areas.
Server: if you rely on a database, check whether your queries are cached, and more importantly check whether your data is cached. For example, if on every page you get a menu from the database, you can cache that result (see the sketch at the end of this answer). In addition, you can check your code and see if there is room for optimization.
The hardware itself also plays a role. If you are on a shared hosting plan, maybe the server is full of other, non-optimized apps that take a toll on it.
Connection: here YSlow and Page Speed come in handy, as well as Fiddler. You can do some caching of static content (CSS and JS): set their expiry dates far in the future, use GZIP to make their contents smaller, and combine the static files; all of this helps to a certain extent.
In addition, maybe the server has low bandwidth.
Client: if you have wacky JavaScript or slow CSS selectors, this might hurt performance on the client. But this also depends on the speed of the client's computer.
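To make the menu-caching idea from the Server section concrete, here is a minimal sketch of a time-limited cache, written in plain JavaScript for brevity; in a real ColdFusion or ASP.NET app you would use the platform's own query or output caching instead. The loadMenuFromDatabase function and the 10-minute lifetime are assumptions for the example:

    // Minimal time-limited cache for an expensive lookup such as a menu
    // query. loadMenuFromDatabase() is a hypothetical function that hits
    // the database; the 10-minute lifetime is an arbitrary example.
    var menuCache = { value: null, fetchedAt: 0 };
    var MENU_TTL_MS = 10 * 60 * 1000;

    function getMenu() {
        var now = Date.now();
        if (menuCache.value === null || now - menuCache.fetchedAt > MENU_TTL_MS) {
            menuCache.value = loadMenuFromDatabase(); // expensive call, now rare
            menuCache.fetchedAt = now;
        }
        return menuCache.value;
    }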
I'd recommend reading Best Practices for Speeding Up Your Web Site and all of the content on Yahoo!'s Exceptional Performance page.
If you like books, you may be interested in High Performance Websites (note that a lot of the content in this is in the Best Practices for Speeding Up Your Web Site article) and Even Faster Websites.
Here are a few of my favourite rules from Best Practices for Speeding Up Your Web Site:
Minimize HTTP Requests
Add an Expires or a Cache-Control Header
Gzip Components
Make JavaScript and CSS External
Minify JavaScript and CSS
Also, smush.it is good for compressing images (which have a big impact on how fast a webpage loads).
As far as browsers go, Safari 4 claims it is "the world's fastest browser", and I can say that the Mac version is certainly nice and fast (not to mention elegant!). However, the above suggestions will make much more of a difference than which browser you're using.
Steve
With ColdFusion you will want to make sure your queries are being cached. Use Query Analyzer (if using MS SQL Server) to make sure a slow-loading page isn't the result of a bad query. On the database end you'll also want to ensure proper indexing.
A big factor in performance is how many HTTP requests are sent for images, files, etc. YSlow will show you this info. It's only available for Firefox.
I'd recommend this book.
Google is currently collecting all sorts of performance tips on their new 'Let's make the web faster' page here: http://code.google.com/intl/de-CH/speed/articles/
FYI: not all the information on these pages is valid; the PHP tips in particular are way off.
There is a really great plugin for Firefox called Dust-Me Selectors. It scans your CSS files and lets you find selectors that aren't used or have become redundant in your markup.
https://addons.mozilla.org/en-US/firefox/addon/5392
You should also be delivering your static content off a CDN. Parallel downloads of static files will speed up your page renders. A better explanation here: http://www.sitecheck.be/tips/speed-up-your-site-using-a-cdn/
You shouldn't recommend any particular browser, but design your web page to current standards, with some fixes for older browsers if necessary. From my perspective everything can have a speed impact, but CSS is the least important one, and in real-world examples the user will not notice it. In most cases a clear separation of HTML and style declarations will do the job. What really has an impact? First of all, you can simply throw money at the problem by getting a better hosting contract (maybe a dedicated server). Other ways to improve the time a website takes to load are reducing the quality of your images and using CSS sprites. Very often on dynamic web pages the database is the bottleneck, and therefore caching and a good database abstraction layer can improve things (in PHP: PDO instead of simply using mysql()). GZIP your output to the user. There are many more things, but a lot of them are very language-dependent.
I recommend the use of FireBug and loadimpact.com for testing.
Fewer files are better: CSS sprites may be something to consider. In my experience, you have to balance your CSS file between speed and maintainability; one rule more or less won't make the difference between night and day...
For IE, see http://www.fiddler2.com/fiddler/Perf/
The new neXpert plugin for Fiddler (http://www.fiddler2.com/fiddler2/addons/nexpert.asp) offers features similar to those found in YSlow and PageSpeed.
The biggest problem I have is creating fast-running, beautifully designed pages with rich content. This is one thing that is extremely hard to do with today's technology.
If you have lots of JavaScript, you might want to use JavaScript compression. Dojo provides one such tool, ShrinkSafe, to compress your JavaScript. Find the link below:
http://www.dojotoolkit.org/docs/shrinksafe
There is another tool, open-sourced by Google, called Page Speed, which can help you optimize website performance. It was used internally before being open-sourced to everyone recently.
http://google-code-updates.blogspot.com/2009/06/introducing-page-speed.html
http://code.google.com/speed/page-speed/
Hope it helps.
A couple of very basic rules of performance testing:
Performance means nothing if the program/web page/whatever is wrong.
Do not try to improve performance without having a reliable form of measurement.
You must profile your site/program/whatever to find out what is making things slow.
Corollary: do not just change things at random to see if things get better.
Cache everything (web server and browser caching).
Statically publish as much as possible (to reduce the number of database calls, for example).
Also, add multiple waiting icons to your website. Show the icons in such a way that the user sees a different waiting icon each time; this helps keep the user engaged while your website loads.
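A tiny sketch of that idea; the element id and the icon file names are made up for the example:

    // Pick a random waiting icon each time the loading indicator is shown,
    // so repeat visitors see some variety. The "loading" element id and
    // the icon paths are made-up example names.
    var spinners = ['/img/wait1.gif', '/img/wait2.gif', '/img/wait3.gif'];

    function showRandomSpinner() {
        var img = document.getElementById('loading');
        if (img) {
            img.src = spinners[Math.floor(Math.random() * spinners.length)];
            img.style.display = 'inline';
        }
    }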

Possible reasons for stylesheets not to be applied?

We're developing this ASP.NET web app with multiple users on different browsers. One of the users complains that the pages look funny, and after looking at the screenshot he sent, it looks like the CSS is not being applied. According to the bug report, the user is on IE7 (but I suspect it could be IE8).
I develop in Firefox, and using the Web Developer toolbar, when I choose CSS --> Disable Styles --> Linked Style Sheets, I manage to recreate a page that looks a lot like the reported one.
I sent a message in return telling him to check the options under Internet Options --> General --> Accessibility and make sure all the check-boxes were unchecked. The reply was that they were, but that the page still looked funny.
So, the strange thing is that this app is used by hundreds of users, but I have only gotten complaints from this one user. Does anyone have any idea of what could cause the CSS to not be applied? Could it be my code, or is it some local setting in the client browser or system?
Sometimes there's no substitute for actually walking over to the user and looking at his/her setup. Difficult if they're in a different location, but not impossible; use a remote access tool like Copilot.
If I had to guess in order of likelihood:
They've disabled/overridden application-supplied stylesheets (enough ugly backgrounds and anyone might do this)
They have a caching issue
Could be a very mundane file-related issue - e.g. the version you're looking at references a file that's visible to you, but not to the user. They may even be blocking that file via browser settings, purposefully or not. Have you asked them to request the stylesheet file directly and report back what they 'see'?
Other than that, could be issues relating to the markup used (e.g. bad 'type' attribute) - is your markup validating?
Also, it's obvious, but have you validated your stylesheet? Is this person the only IE7 user?
There are a few things to check.
Firstly, check their anti-virus / security / firewall / etc. That can randomly block all sorts of things.
Secondly, there's a registry setting somewhere to disable CSS, but that'd do it for all websites.
Also, check to see if they've got a load of IE add-ons and toolbars installed. Some of them add junk to the user-agent string, and if that gets too long I seem to remember IE will truncate it. This can confuse the adaptive rendering "feature" on .NET 1.1 sites.
Oh, and make sure you've validated your CSS :)

Resources