Screen Readers For Testing Website Accessibility - asp.net

My website is designed to meet the accessibility guidelines.
I'm HOPING that this means screen readers should work well with it... But I have two questions:
Is this a fair assumption to make?
Are there any free/cheap screen reader clients I can use for testing, or any online emulators?

Just because something meets the guidelines doesn't mean it's guaranteed to be accessible; all screen readers have their own quirks. I'm a totally blind individual, so my comments on screen readers are below.
Note this is a rather long post, so I've summarized it at the top. In short: if you want to make sure your site is mostly accessible, use NVDA; if you want to make sure that blind individuals working in government will be able to use your site, test with JAWS; and if you want to be extra safe, test with Window-Eyes and Orca as well.
NVDA is an open source screen reader that is rather new. It isn't quite as good as some of the commercial screen readers out there, but it gets the job done. I'd say if a site works with NVDA it's likely to work with most other screen readers. One issue with NVDA is that its web support is only really good in Firefox, so you'll have to use that browser to test.
JAWS is the most widely used screen reader out there. You can download a demo that will run for 40 minutes at a time and then require you to reboot if you want to run it again. If you're trying to ensure Section 508 compliance this is probably the way to go, since JAWS is the screen reader used by the US government.
Window-Eyes is the second most used screen reader. I don't have any experience with it, but I've been told it's quite good as far as internet accessibility goes. Orca is a screen reader built into GNOME that works with Firefox on Linux; it ships with Ubuntu. I tried it about a year and a half ago and it was absolutely horrible, but I've been told it's gotten better.

NVDA is a free option:
NonVisual Desktop Access (NVDA) is a free and open source screen reader for the Microsoft Windows operating system. Providing feedback via synthetic speech and Braille, it enables blind or vision impaired people to access computers running Windows for no more cost than a sighted person. Major features include support for over 20 languages and the ability to run entirely from a USB drive with no installation.

Is this a fair assumption to make?
No, even if you think you know the ins and outs of accessibility well, only testing can really tell you this.
Are there any free/cheap screen reader clients I can use for testing, or any online emulators?
The already mentioned NVDA looks like a viable option, or you can download a trial of JAWS which I believe is the most widely used screen reader on Windows. If you're really serious about accessibility and you have a requirement for ongoing testing, you might want to think about just buying a copy.
On a final note, it sounds like you already know this, but no amount of automated accessibility testing can really tell you whether your site is accessible; only real-world testing can do that. Which is what you're doing, so well done.

Related

make non-native application accessible to screen readers for the visually impaired

I create applications that are divorced from any native framework. All rendering happens in OpenGL, with a context provided by GLFW, all in C, with no framework to rely on for compatibility. As such, standard screen readers like NVDA have no chance of picking up information (excluding OCR) and my applications are an accessibility black hole.
How can I provide an interface for screen readers to cling onto? I presume this is a per-OS thing... How would that be possible on Windows, Linux, BSD or even Android? In the *NIX world, I presume this would be desktop-environment dependent...
I'm finding a lot of information on this when a framework is the starting point, but I have a hard time finding resources on how to do it from scratch.
I'm fully aware this is far beyond the capability of a sole developer, and I know that writing programs that ignore native interfaces is a common accessibility hole which you are advised to avoid.
However, I have a tough time finding resources and jump-in points to explore this topic. Can someone point me in the right direction?
TL;DR: How to provide screen reader compatibility from scratch. Not in detail, but conceptually.
As you have already well identified, your app is an accessibility black hole because you are using a rendering engine.
It's basically the same for OpenGL, SDL, or <canvas> on the web, or any library rendering something without specific accessibility support.
We can talk about several possibilities:
Become an accessibility server. Under Windows, this means doing what is necessary so that your app provides accessible components on demand through the UIA / IAccessible2 interfaces.
Use a well-known GUI toolkit that has accessibility support, and its provided accessibility API, to build your app.
Talk directly to screen readers via their respective APIs in order to make them say something and/or show something on a connected braille display.
Do screen reader specific scripting.
However, it doesn't stop there. Supporting screen readers isn't sufficient to make your app really accessible. You must also think about many other things.
1. Accessibility server, UIA, IAccessible2
This option is of course the best, because users of assistive technologies in general (not only screen readers) will feel right at home with a perfectly accessible app if you do your job correctly.
However, it's also by far the hardest, since you have to reinvent everything. You must decompose your interface into components, say which category each of them belongs to (more commonly called roles), provide callbacks to fetch values and descriptions, etc.
If you do web development, compare it with having to use ARIA everywhere because there are no defaults: no titles, no paragraphs, no input fields, no buttons, etc.
That's a huge job! But if you do it really well, your app will be truly accessible.
You may get code and ideas on how to do it by looking at open source GUI toolkits or browsers, which all do it.
Of course, the APIs to use are different for each OS. UIA and IAccessible2 are for Windows, but macOS and several Linux desktops also have OS-specific accessibility APIs that are based on the same root principles.
Note about terminology: the accessibility server or provider is your app or the GUI toolkit you are using, while the accessibility client or consumer is the screen reader (or other assistive tool).
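To make option 1 more concrete, here is a minimal, hypothetical C++ sketch of the Windows side only: one custom-drawn button in an OpenGL/GLFW window exposed through UIA. The class name and the button label are invented for the example; a real app would create one provider per component and connect them into a tree.

```cpp
// Hypothetical sketch (not from any real project): exposing one custom-drawn
// "Start game" button to UI Automation on Windows.
// Link against UIAutomationCore.lib.
#include <windows.h>
#include <uiautomation.h>

class ButtonProvider : public IRawElementProviderSimple
{
    LONG refCount = 1;
    HWND hwnd;          // the native window hosting the GL context
public:
    explicit ButtonProvider(HWND h) : hwnd(h) {}

    // IUnknown
    ULONG STDMETHODCALLTYPE AddRef() override { return InterlockedIncrement(&refCount); }
    ULONG STDMETHODCALLTYPE Release() override {
        ULONG c = InterlockedDecrement(&refCount);
        if (c == 0) delete this;
        return c;
    }
    HRESULT STDMETHODCALLTYPE QueryInterface(REFIID riid, void** ppv) override {
        if (riid == __uuidof(IUnknown) || riid == __uuidof(IRawElementProviderSimple))
            *ppv = static_cast<IRawElementProviderSimple*>(this);
        else { *ppv = nullptr; return E_NOINTERFACE; }
        AddRef();
        return S_OK;
    }

    // IRawElementProviderSimple
    HRESULT STDMETHODCALLTYPE get_ProviderOptions(ProviderOptions* retVal) override {
        *retVal = ProviderOptions_ServerSideProvider;
        return S_OK;
    }
    HRESULT STDMETHODCALLTYPE GetPatternProvider(PATTERNID, IUnknown** retVal) override {
        *retVal = nullptr;   // a real button would also expose the Invoke pattern
        return S_OK;
    }
    HRESULT STDMETHODCALLTYPE GetPropertyValue(PROPERTYID propId, VARIANT* retVal) override {
        VariantInit(retVal);
        if (propId == UIA_NamePropertyId) {               // what the screen reader speaks
            retVal->vt = VT_BSTR;
            retVal->bstrVal = SysAllocString(L"Start game");
        } else if (propId == UIA_ControlTypePropertyId) { // the "role"
            retVal->vt = VT_I4;
            retVal->lVal = UIA_ButtonControlTypeId;
        }
        return S_OK;
    }
    HRESULT STDMETHODCALLTYPE get_HostRawElementProvider(IRawElementProviderSimple** retVal) override {
        return UiaHostProviderFromHwnd(hwnd, retVal);     // anchor the provider to the HWND
    }
};

// Called from the window procedure of the HWND behind the GLFW window:
LRESULT onGetObject(HWND hwnd, WPARAM wParam, LPARAM lParam, ButtonProvider* provider)
{
    if (static_cast<long>(lParam) == static_cast<long>(UiaRootObjectId))
        return UiaReturnRawElementProvider(hwnd, wParam, lParam, provider);
    return 0;   // let DefWindowProc handle other WM_GETOBJECT requests
}
```

GetPatternProvider is where behaviour (invoking, toggling, text editing, etc.) gets wired in; exposing the right patterns for each component is most of the work in practice.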
2. Use a GUI toolkit with good accessibility support
Fortunately, you aren't obliged to reinvent the wheel, of course!
Many people have already done the job of point 1 above, and it resulted in libraries commonly called GUI toolkits.
Some of them are known to generally produce quite accessible apps, while others are known to produce totally inaccessible apps.
Qt, wxWidgets and Java's SWT are three of them with quite good accessibility support.
So you can simplify the job considerably by using one of them and its associated accessibility API. You are spared from talking more or less directly to the OS through UIA/IAccessible2 and similar APIs on other platforms.
Be careful though, it isn't as easy as it seems: not all components provided by GUI toolkits are accessible on all platforms.
Some components may be accessible out of the box, some others need configuration and/or a little specific code on your side, and some are inaccessible no matter what.
Some are accessible under Windows but not under macOS, or vice versa.
For example, GTK is the first choice on Linux under GNOME for making accessible apps, but GTK under Windows gives quite poor results. Another example: wxWidgets' DataView control is known to be good under macOS, but it is emulated under Windows and therefore much less accessible.
In case of doubt, the best approach is to test yourself under all combinations of OS and screen reader you intend to support.
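As a small illustration of option 2, here is what this looks like with Qt (one of the toolkits named above), assuming a plain widget-based UI. A standard QPushButton is already exposed to UIA on Windows and AT-SPI on Linux by Qt's accessibility bridge; the strings here are made up for the example.

```cpp
// Minimal Qt sketch: standard widgets are exposed to assistive technologies
// by Qt's accessibility bridge; these calls only refine what gets announced.
#include <QApplication>
#include <QPushButton>

int main(int argc, char *argv[])
{
    QApplication app(argc, argv);

    QPushButton saveButton("Save");
    // Qt reports the role (button) and the visible text automatically.
    saveButton.setAccessibleName("Save document");
    saveButton.setAccessibleDescription("Saves the current document to disk");
    saveButton.show();

    return app.exec();
}
```

The same two calls exist on QWidget generally, so custom widgets can be given sensible names too; whether that is enough depends on the platform, as noted above.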
Sadly, for a game, using a GUI toolkit is perhaps not a viable option, even if OpenGL components capable of displaying a 3D scene do exist.
Here comes the third possibility.
3. Talk directly to screen readers
Several screen readers provide an API to make them speak, adjust some settings and/or show something on braille display. If you can't, or don't want to use a GUI toolkit, this might be a solution.
JAWS comes with an API called FSAPI, NVDA with the NVDA controller client. Apple also allows you to control several aspects of VoiceOver programmatically.
There are still several disadvantages, though:
You are specifically targeting certain screen readers. People using another one, or an assistive tool other than a screen reader (a screen magnifier, for example), are out of luck. Alternatively, you multiply support across a big forest of different APIs for different products on different platforms.
Each of these screen reader specific APIs supports different things that may not be supported by the others. There are no standards at all here.
Thinking about WCAG and how it would be transposed to desktop apps, you are in fact bypassing most best practices, which all recommend, above anything else, using well-known standard components and customizing only when really necessary.
So this third possibility should ideally be used if, and only if, using a good GUI toolkit isn't possible, or if the accessibility of the toolkit in use isn't sufficient.
I'm the author of UniversalSpeech, a small library trying to unify direct talking with several screen readers.
You may have a look at it if you are interested.
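To give an idea of option 3, here is a minimal sketch of talking to one specific screen reader, NVDA, through its controller client on Windows. The exported function names follow the published nvdaControllerClient API, but the DLL file name and the spoken text are assumptions for this example.

```cpp
// Hypothetical sketch: asking NVDA to speak via its controller client DLL.
#include <windows.h>
#include <cstdio>

// Documented convention: these functions return 0 on success.
typedef unsigned long (__stdcall *NvdaTestIfRunning)();
typedef unsigned long (__stdcall *NvdaSpeakText)(const wchar_t *text);

int main()
{
    // The controller client DLL is distributed with NVDA's developer downloads;
    // the exact file name/path here is an assumption.
    HMODULE nvda = LoadLibraryW(L"nvdaControllerClient64.dll");
    if (!nvda) { std::puts("NVDA controller client DLL not found"); return 1; }

    auto testIfRunning = reinterpret_cast<NvdaTestIfRunning>(
        GetProcAddress(nvda, "nvdaController_testIfRunning"));
    auto speakText = reinterpret_cast<NvdaSpeakText>(
        GetProcAddress(nvda, "nvdaController_speakText"));

    if (testIfRunning && speakText && testIfRunning() == 0) {
        speakText(L"Settings menu opened, 4 items");  // spoken only if NVDA is running
    } else {
        std::puts("NVDA is not running; fall back to some other output");
    }

    FreeLibrary(nvda);
    return 0;
}
```

Note that this reaches NVDA only; JAWS, VoiceOver and the others each need their own equivalent calls, which is exactly the portability problem described above and why a thin wrapper such as UniversalSpeech, or a GUI toolkit, is usually preferable.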
4. Screen reader scripting
If your app isn't accessible alone, you may distribute screen reader specific scripts to users.
These scripts can be instructed to fetch information to give to the user, add additional keyboard shortcuts and several other things.
JAWS has its own scripting language, while NVDA scripts are developed in Python. As far as I know, there are also scripting capabilities for VoiceOver under macOS.
I gave you this fourth point for your information, but since you are starting from a completely inaccessible app, I wouldn't advise you to go that way.
In order for scripts to be able to do useful things, you must have a working accessible base. A script can help fix small accessibility issues, but it's nearly impossible to turn a completely inaccessible app into an accessible one with a script alone.
Additionally, you must distribute these scripts separately from your app, and users have to install them. That may be a hurdle for some people, depending on your target audience.
Beyond screen reader support
Screen reader support isn't everything.
This is beyond your question, so I won't go into details, but you shouldn't forget about the following points if you really want to make an app that isn't only accessible but also comfortable to use for a screen reader user.
This isn't at all an exhaustive list of additional things to watch out for.
Keyboard navigation: most blind users and many visually impaired users aren't comfortable with a mouse and/or a touch screen. You must provide a full and consistent way of using your app with only a keyboard or, on mobile, with only the standard touch gestures supported by the screen reader. Navigation should be as simple as possible and should conform as much as possible to user preferences and general OS conventions (i.e. the functions of Tab, Space, Enter, etc.). This in turn implies having a good structure of components.
Gamepad, motion sensors and other inputs: unless it's absolutely mandatory because it's your core concept, don't force their use, and always allow a keyboard fallback.
Visual appearance: as much as you can, you should use the settings/preferences defined at the OS level for layout, colors, contrast, fonts, text size, dark mode, high contrast mode, etc. rather than using your own.
Audio: don't output anything if the user can't reasonably expect it, make sure the volume can be changed very easily at any time, and, if possible and if it isn't against your core concept, always allow it to be paused, resumed, stopped and muted. The same reasoning applies to other outputs, like vibration, which the user should always be able to disable.

How to find out if a user is using a screen reader on my website

Is there a way (a library) with which a web page can detect that a screen reader is being used on it? This could be just for reporting/analytics purposes.
PS: A Drupal 8 website.
No. It is not possible. Screen readers operate as applications on the computer; trying to detect one would be similar to trying to find out whether someone viewing your website also has their calculator open. It is a privacy restriction. Also, many of these users may be using VoiceOver or TalkBack on their mobile devices, and there is no way to detect that either.
Your website should instead strive to follow web standards and work equally for all users.
I'm also curious as to what your specific goal is in detecting this, as screen readers are only one part of the accessibility tools that many people use - and focusing on just the screen reader user will not make your site accessible.
No. You can't. Definitely not.
One thing you can do is detect whether a user uses their mouse. That does not prove they use a screen reader, or that they don't, but it is a reasonable indicator (although such users should be categorized as "keyboard-only users", not "screen reader users"). And that is, in my opinion, a more interesting question than whether a user uses a screen reader.
There are plenty of discussions about why detecting screen readers is a bad idea:
On Screen Reader Detection
Detecting screen readers in analytics, pros and cons

Which screen reader would be best to test site accessibility and how to configure that? [closed]

Which screen reader would be best for testing site accessibility, how should that screen reader be configured for testing a website (or would the default screen reader settings be okay), and which browser should be used when testing accessibility with screen readers?
Free or commercial, it doesn't matter. Which gives the best testing, so that the site will be accessible to as much of the world as possible, with all other screen readers?
My purpose is to make the site as accessible as possible.
I will preface this answer by stating I'm a totally blind individual who uses JAWS as their only screen reader. I've played around with NVDA as well but have limited experience with it. JAWS is the most widely used screen reader, at least in the US. If you can only use one screen reader I would pick it, with the default settings. Both Internet Explorer and Firefox work with JAWS and both are widely used. Another screen reader you could use to test accessibility is NVDA; this is an open source screen reader that works well with Firefox but not Internet Explorer. I would say if cost is an issue, use NVDA with the latest version of Firefox, and if your site is accessible using that setup it will most likely work with JAWS. For a complete list of screen reading software see this.
Installing and starting a screen reader isn't enough to do good accessibility testing. You won't know how accessible your site is until you turn off your monitor and unplug your mouse. Getting good enough at using any screen reader to do that will take time. The only sighted people I know who are efficient screen reader users either work for the screen reader companies or do assistive technology training as their job. So while you can use a screen reader to test your site's accessibility, the learning curve for a realistic test is quite high.
To answer your question directly, I would use JAWS with default settings in your target browser. If you can only afford one license, then use NVDA or Chromevox for your developers and give your Accessibility expert the copy of JAWS.
Keep in mind that while making sure your site works perfectly with a screenreader is very important, this only helps the blind. There are many other types of disabilities (e.g., hearing, motor, and cognitive disabilities) and to truly be accessible, your site needs to support those users too.
WCAG 2.0 is the best standard for making your site accessible to as many people as possible. There is A LOT of WCAG 2.0 documentation though, so I would start at webaim.org, http://webaim.org/standards/wcag/checklist if you are new to it, but do use the real thing http://www.w3.org/TR/WCAG20/ when you are ready.
Also, keep in mind that even if it "works" with a screen reader, it may be annoying (blind users rarely read top to bottom, so make sure you put in solid structure with headings and ARIA landmarks), or it may not be giving a blind user the same amount of information that a sighted user would get. For example, helper text next to text inputs will be missed by a user tabbing through a form (fixes: hide a copy in the label with CSS, make the helper text the actual label, or use aria-describedby). A good way to make sure it more than just "works" is to have your JAWS tester not be familiar with the site.

Do you develop with Accessibility in mind?

I've never really learned much about accessibility but it seems like an important topic.
When you build a website or piece of software, or when you're talking to a client about a website, where does accessibility come in? Or from your experience, if you don't have accessibility in something you've built for a client, do you get a lot of requests to include it, or does it limit you in some financial way?
What are the numbers, I guess. What's the return in your business, how many people have you talked to that need it? Do you yourself need accessibility features?
I do mainly Flex/Flash and it seems like I'll have to do a bit of work to have full accessibility.
Thanks for the help.
As a person with a disability myself, I am conscious of adding accessibility features when I write software.
Accessibility is an area of software design concerned with making software user interfaces accessible to people with physical or mental disabilities or impairments. Different people have different specific needs, and you can't be expected to cater specifically to each, but there are some broad groupings.
Visual Impairments:
This includes blindness and color blindness. To assist in this area, consider providing "good" alt text (clarified below) and hints so that screen readers can present a view of your content that makes sense aurally. Providing easy access to links that raise the text size and/or to some high contrast stylesheet options is also a good idea.
Non-Mouse Users
There are a huge number of conditions that can prevent someone from being able to mouse successfully; it took a few years for me and my brain, which is somewhat unreliable when it comes to spatial relationships, to pick up the skill. For these people keyboard access is really helpful. I don't work in the web space, so I'm not sure if there are standard keys to use, but these are communicated by screen readers and tooltips, so having any is better than none.
Hanselminutes episode #125 is quite educational. He talks with a blind user about accessibility on the web and in general.
Accessibility is omitted from a lot of design processes, either because businesses don't have an immediate need for it and therefore don't consider it at all, or because they consider it a low-priority feature. Legislation in various countries has helped a bit in this regard, but the real problem is that accessibility in general is usually an afterthought in the design process.
1. "Good" alt text is judicious use of alt text that accentuates the content or purpose of a page. Navigation elements should have alt text describing where interacting with them will take the user; similarly, things that aren't content, like spacers, should have no alt text at all, because there is nothing worse than hearing "Foo's widgets spacer spacer spacer spacer spacer nav_Products spacer nav_support".
I think accessibility is usually completely forgotten about (either implicitly or explicitly dismissed beforehand because of issues like cost) in most software development projects, unless companies (or, more likely, individual developers) already have experience either with people with disabilities or with writing software with users' disabilities in mind.
As a developer I at least try to do keyboard shortcuts correctly in software I work on (because that's something I can easily dog-food myself, since I try to keep hands-on-keyboard as much as possible). Apart from that it depends on whether there are requirements about accessibility.
I do think this kind of thing is part of "programming taxes", i.e. things that you as a developer should always be doing, but...
I am only aware of this - at least more than the average developer, I think - because I once wrote software for a software magazine on floppy disk, or "Flagazine". This was in PowerBasic 3.2; it grew out of BASIC sources in a magazine, which were made available by BBS and disk, and eventually a menu grew around the little applications to start them easily, etc.
One of our primary users (and later a member of the editorial staff) was blind and was appalled when we switched from text mode to an EGA mouse-driven menu, as his TSR screen reader software couldn't do anything with graphics. It turned out that his speech synthesizer simply accepted text from a COM port. It had a small (8K, I think?) buffer that would be instantly cleared on reception of (I think) an ASCII 1 character. And that was it.
So we made the graphical menu (and most other programs on the Flagazine) completely keyboard accessible at all times, and in the graphical programs we used a small library I wrote to send ASCII text to a configured COM port. It had small utility methods like ClearBuffer(). This, together with the convention of speaking the possible menu actions when the space bar was pressed, made all of this software accessible to our blind users.
I even adapted a terminal application for my HP48 calculator (adding a clear buffer/screen on ASCII 1) so I could use it to emulate a speech synthesizer. I would then test all of our software in each Flagazine by attaching my HP48 with the emulator running, turning off my computer monitor, and seeing whether I could use all the software without seeing anything.
Those were the days, about 12 years ago... ;-)
I am a blind individual, so I have to develop with accessibility in mind if I want to use my own programs. I find myself focusing on accessibility based on the type of application I'm writing. When doing command line or mainframe applications I don't think about accessibility, since those environments are inherently accessible. With web-based applications I have to give some thought to accessibility, but not a lot. This is mainly because I write simple web applications for limited use, so I don't have to worry about making the interface appealing, just usable. The area where I spend the most time focused on accessibility is desktop applications. For example, using .NET I need to make sure accessible properties are set properly and that labels are in the proper position relative to a text box, so my screen reader can find them and associate them with the proper control.

What might my user have installed that's going to break my web app?

There are probably thousands of applications out there like 'Google Web Accelerator', plus all kinds of popup blockers. Then there are header-blocking personal firewalls, full site blockers, and paranoid cookie monsters.
Fortunately Web Accelerator is now defunct (I suggest you read the above article - it's actually quite funny what issues it caused), but there are so many other plugins and third-party apps out there that it's impossible to test them all with your app until it's out in the wild.
What I'm looking for is advice on the most important things to remember when writing a web app (whatever the technology) with respect to ensuring the user's environment isn't going to break it. Kind of like a checklist.
What's the craziest thing you've experienced?
PS: I may have linked to net-nanny above, but I'm not trying to make a porn site.
The best advice I can give is to program defensively. For example, don't assume that all of your scripts will be loaded. I've seen cases where Adblock Plus would block one out of ten scripts included in a page just because it had the word "ad" in its name or path. While you can work around this by renaming the file, it's still good to check that a particular object exists before using it.
The weirdest thing I've seen wasn't so much a browser plugin but a firewall/proxy configuration at a user's workplace. They were using a squid proxy that was trying to remove ads by replacing any image HTTP request that it thought was an ad with a single pixel GIF image. Unfortunately it did this for non-GIF images too so when our iPhone application was expecting a PNG image and got a GIF, it would crash.
Internet Explorer 6. :)
No, but seriously. Firefox plugins like NoScript and Greasemonkey, for one, though those are likely to be used by a very small minority.
Sometimes the user's environment means a screen reader (or even a braille interface like this). If your layout is in any way critical to the content being delivered as intended, you've got a problem right there.
Web pages break, that's a fact of life; the closer you code and design to standards, the less it is your fault.
Something I have checked in the past is loading some of the more popular toolbars that people tend to install (Google, Yahoo, MSN, etc.) and seeing how they affect the user's experience.
To a certain extent it is difficult to preempt which of the products you mentioned will be used by your users since there are so many. I would say your best bet is to test for the most frequent products that your user base may employ and roll with the punches for the rest. If you have the time to test other possible scenarios, by all means do.
Also, making it easy for your users to report possible issues helps lessen the time it takes to get a fix in place, should it be something you can work around.
