Accessify Forum - Discuss Website Accessibility

Legal advice from automated testing tools?

JackP wrote:
And yet it could be argued that providing information IS providing a service.

True. But it's the intent of the Parliament that passed the Act that matters. That's what a judge will consider when interpreting the statute and making a judgment, if a case ever makes it to court.

I personally don't believe the intention was to impose duties on private citizens. The DDA, the Code of Practice and the DRC's Web site make a lot of references to organisations, businesses, companies, staff, employees, customers etc, but don't even hint that it applies to private citizens.

JackP wrote:
There is information about my family tree on my site. If that information was not available to users with disabilities, I would say that it's right to argue I am discriminating against them.

Maybe; it depends why you haven't made that information available to them. And whether it is actionable in law is a different matter.

JackP wrote:
[...]I still think a court would find in my favour if I was sued because a) I specifically say "if you have problems accessing my site, contact me and I'll do my best to fix your problem" and it would certainly be reasonable for that line of approach to be tried first

I can't imagine a solicitor taking on a case if the person alleging discrimination hasn't even contacted the site owner to raise the access issue(s).

JackP wrote:
and b) they wouldn't want to set a precedent where hobbyists/personal sites could be sued under the DDA.

Even if we assume your site was not accessible, that you refused to make it accessible or available by other means during initial communication and the solicitors' negotiation stage, and that their solicitor and barrister believed it was a case they could win: going to court is always a gamble. There's no guarantee of winning. If they lose, they pay their legal costs and yours. It's the site contents they want access to, and in your case, what would they win – access to your family tree? Who's going to gamble maybe tens of thousands of pounds just to see a site's contents? I just can't see it happening.

Oliver from Silktide here (yes, we read forums too!)

A lot of you have raised some excellent points and we've encountered many of these before. In particular we're aware that the statement "this website is probably illegal" has provoked a great deal of controversy.

Our rationale was simple enough originally:

* DDA means that websites which offer a public service must be usable by those with disabilities
* There is currently no defined "test" for this, but general expert consensus suggests that A-AA level compliance is appropriate
* Ergo we approximate DDA compliance as W3C compliance

Clearly this is not exact; as pointed out, there are varying degrees of infringement, and single character entities (esp. when copied & pasted from some horror like Word) are hardly grounds for legal action. Conversely, omitting ALT tags, or providing non-HTML text navigation, almost assuredly is.

Bear in mind this tool was created almost casually over a year ago as an experiment; at the time we were expecting a few hundred users a day (we've hit over 50,000 an hour at times). Since then we've been rewriting everything into Sitescore 2 / Sitescore Enterprise, which does things very differently.

Our current thinking (although both tools are still being tested):

* Variable levels of testing. Critical issues (e.g. no ALT text, especially on images which link somewhere) versus minor compliance issues (e.g. encoding of entities). To do this we've had to develop our own W3C parser.

* Variable score (1-10) for accessibility, where anything below 5 would almost certainly be vulnerable to legal action, 7 would be an acceptable 'safe' level, 9 would be excellent but with some minor flaws, etc. (see the rough sketch after this list)

* Manual tests. The only way to test some areas of a site is to subject them to manual testing. We're adding a framework to "plug in" test results from manual tests, which can be scheduled and allocated to batches of users.
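
To give a flavour of the severity-weighted scoring idea in the second point above, here's a minimal sketch in Python (purely illustrative: the issue weights, scaling and verdict thresholds are invented for the example, not our production rules):

Code:
# Minimal sketch of a severity-weighted accessibility score (illustrative only;
# the weights, scaling and thresholds here are invented for the example).

CRITICAL = 3.0  # e.g. missing ALT text on an image that is also a link
MAJOR = 1.0     # e.g. missing ALT text on an ordinary content image
MINOR = 0.1     # e.g. an unencoded character entity

def accessibility_score(issue_weights, pages_checked):
    """Map the weighted issues found across a crawl to a 1-10 score."""
    if pages_checked == 0:
        return None
    penalty = sum(issue_weights) / pages_checked  # average weighted issues per page
    return round(max(1.0, 10.0 - 2.0 * penalty), 1)

def verdict(score):
    if score < 5:
        return "serious accessibility barriers likely"
    if score < 7:
        return "significant issues remain"
    if score < 9:
        return "acceptable 'safe' level"
    return "excellent, only minor flaws"

# Example: two critical and five minor issues found across four pages.
score = accessibility_score([CRITICAL, CRITICAL] + [MINOR] * 5, 4)
print(score, "-", verdict(score))  # 6.8 - significant issues remain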

We're also considering some tools to simulate a given website as it would be experienced by users of a screen reader or similar, possibly as a complement to the manual tests.

We'd be interested in all of your own feedback.
silktide wrote:

Clearly this is not exact; as pointed out, there are varying degrees of infringement, and single character entities (esp. when copied & pasted from some horror like Word) are hardly grounds for legal action. Conversely, omitting ALT tags, or providing non-HTML text navigation, almost assuredly is.

the trouble is that at present there has been no legal action in the UK, so we don't know exactly what is required, and any statement like this is inaccurate. you really need to emphasise the "may" part if you say anything

silktide wrote:

Bear in mind this tool was created almost casually over a year ago as an experiment; at the time we were expecting a few hundred users a day (we've hit over 50,000 an hour at times). Since then we've been rewriting everything into Sitescore 2 / Sitescore Enterprise, which does things very differently.


you are right, it does not quite have the feel of a finished product

as long as you are trying to improve and to get the testing as accurate as such tools can be, we will be glad to help

Good to hear from you Oliver.

silktide wrote:
* DDA means that websites which offer a public service must be usable by those with disabilities


This is not the case though. A company could offer a service through their website that was completely inaccessible; however, provided they offer an accessible alternative, they could be deemed to have taken reasonable steps to make the actual service accessible.

The rest of your logic was flawless though ;)

One thing about the test, though: I tried it with a site of mine and it marks the site down because the same titles are used across the site. Since each page has a different title this confused me; the only thing I can guess is that it reviews each page twice, once normal and once text-only?
Richard Conyard wrote:
This is not the case though. A company could offer a service through their website that was completely inaccessible; however, provided they offer an accessible alternative, they could be deemed to have taken reasonable steps to make the actual service accessible.


that's a bit of a grey area. some quarters are arguing that each "method" of providing a service can be regarded as a service in its own right. thus, for example, having an inaccessible web shop and saying "but we offer a telephone ordering service" may not necessarily hold up. as i said...it's not necessarily cut and dried.

Patrick H. Lauke / splintered

and by the way: good to see silktide's response. nice one guys.

Patrick H. Lauke / splintered
silktide wrote:
Oliver from Silktide here (yes, we read forums too!)

A lot of you have raised some excellent points and we've encountered many of these before. In particular we're aware that the statement "this website is probably illegal" has provoked a great deal of controversy.

[...]

Bear in mind this tool was created almost casually over a year ago as an experiment; at the time we were expecting a few hundred users a day (we've hit over 50,000 an hour at times). Since then we've been rewriting everything into Sitescore 2 / Sitescore Enterprise, which does things very differently.

Our current thinking (although both tools are still being tested):

[...]

Hi Oliver, this has helped to make the problems with the tool a lot more understandable. :)

I would suggest making it very clear on your site that the tool is more of a "public beta" than a final and authoritative audit. For example, Google have several services which are in "beta" form and they always make sure this is clear to their users.

The requirements set out by the DDA require a much deeper range of tests than simply parsing the code of the website. For example, a company which simply cannot afford to make adjustments would not be expected to, although it would be encouraged to do so. Rather than stating that a site is "probably" breaking the law when you cannot test overriding factors such as the financial viability of their site being upgraded, I would suggest saying it "possibly" breaks the law. You could add a bit on the end saying that they could use the services of your company to find out for sure. ;)

Will you make the testing and scoring mechanism public? By showing technical staff exactly what your tool does and does not test for, you would give them a much better idea of where your tool would be useful to them. It would make your tool more transparent, and that would give it a competitive edge over some other tools, which are distinctly opaque.

I hope you won't be tempted to start creating league tables only using automated results!
Cerbera wrote:
I hope you won't be tempted to start creating league tables only using automated results!


There is already the top 750 sites list - good to see a couple of ours in there :)

I had a little play yesterday and decided that I quite like SilkTide's system, at least as far as these things go.

I'm going to qualify that by saying that I took issue with about 2/3 of SilkTide's results, suggestions and recommendations. But then, I do have the advantage of being a fairly knowledgeable boffin with a brain the size of an egg (I'm also a pedant). There's not much there that will make me want to use the service again.

But it seems that SilkTide have put a big effort into making their site critiques readable and understandable to normal people. In comparison to Sitemorse, whose reports simply aren't written in a human-digestible format (and are therefore utterly useless), SilkTide have been thinking about the people they provide a service to.

My big criticism of SiteMorse is that the only reason they seem to be in it is to line their own pockets. As far as I can tell they're not much interested in accessibility, promoting good design practices or improving their victims' websites. They just seem to want to prise free money from the public sector for the least effort. It's no surprise, then, that SiteMorse's reports are terrible. In my humble opinion SilkTide's reports are presented excellently for an automated tool, even if the actual content leaves more than a little to be desired in places; so I'm definitely looking forward to version 2.
Quote:
The requirements set out by the DDA require a much deeper range of tests than simply parsing the code of the website. For example, a company which simply cannot afford to make adjustments would not be expected to, although it would be encouraged to do so. Rather than stating that a site is "probably" breaking the law when you cannot test overriding factors such as the financial viability of their site being upgraded, I would suggest saying it "possibly" breaks the law.


You're right - one of the hardest factors to consider is "what is the website actually doing?" The DDA is basically irrelevant for personal blogs, for instance, although we'd suggest it's just nicer to comply anyway! We may end up having to ask the user ("this website is a personal site / small company / large company / public body", etc.) but we'd prefer to auto-detect where possible and streamline the free test.

Quote:
Will you make the testing and scoring mechanism public? By showing technical staff exactly what your tool does and does not test for, you would give them a much better idea of where your tool would be useful to them. It would make your tool more transparent, and that would give it a competitive edge over some other tools, which are distinctly opaque.


Yes - we're going to start publishing details with the new versions (although not the actual code; we have families to feed!). We're very open to the benefits of ongoing peer review.

We are specifically targeting SiteMorse's market with what I like to think of as a 'Google approach'. We're aiming to be the nice guys that people want to do business with - this means being open and playing fair. We won't be blocking our own website from any test results.
silktide wrote:
Clearly this is not exact; as pointed out, there are varying degrees of infringement, and single character entities (esp. when copied & pasted from some horror like Word) are hardly grounds for legal action. Conversely, omitting ALT tags, or providing non-HTML text navigation, almost assuredly is.

Omitting ALT attributes is almost assuredly grounds for legal action? Bloody hell! Whatever happened to the presumption of innocence? :D

A site's images may be purely decorative, so a lack of ALT text would detract nothing from the understanding or use of the page for anyone using a screen reader or refreshable Braille display. Omitting ALT attributes may mean the document won't validate, and will therefore fail checkpoint 3.2, but there doesn't seem to be a consensus on whether invalid code is even a barrier to access.
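
To illustrate with a toy sketch (in Python, and not how any particular checker actually works): all an automated test can really see is whether the alt attribute is present. It can accept alt="" because that is the convention for purely decorative images, but it has no way of judging whether a missing ALT is a genuine barrier to anyone:

Code:
# Toy sketch only - not any real tool's logic. It flags <img> elements with no alt
# attribute at all, but accepts alt="" (the convention for purely decorative images).
# Whether a flagged image is actually a barrier remains a human judgement.
from html.parser import HTMLParser

class ImgAltCheck(HTMLParser):
    def __init__(self):
        super().__init__()
        self.flagged = []

    def handle_starttag(self, tag, attrs):
        if tag == "img":
            attrs = dict(attrs)
            if "alt" not in attrs:  # no alt attribute at all: invalid markup
                self.flagged.append(attrs.get("src", "?"))
            # alt="" passes: the image is declared decorative

checker = ImgAltCheck()
checker.feed('<img src="spacer.gif" alt=""> <img src="chart.png">')
print("Images with no alt attribute:", checker.flagged)  # ['chart.png']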

And even if a site fails to satisfy one or more checkpoints, it doesn't mean the site owner is discriminating. It doesn't take into account whether they have made reasonable adjustments to make the site accessible, or whether they have provided a reasonable alternative method of making their services available, as Ben and Richard have already pointed out.

Or they may not even be aware their site is inaccessible. They may be in the process of making it accessible, or they may have hired one of the many "run it through Bobby and slap a badge on it" cowboy Web design firms to make it accessible. Etc.

If a disabled person is having a problem accessing a site, it is "almost assuredly grounds for" them to contact the site owner and raise the issue. If they don't get a satisfactory response, and feel they are being discriminated against because of their disability, then their next step should be to seek legal advice if they want to take it further.

No matter how well your tool has been written, it's in no position to draw conclusions about discrimination, because it doesn't have all the facts. The most it can ascertain is that a site may be inaccessible to some people.

An interesting tool; the report format is nice. I'd be interested to see the full version after user feedback. If Silktide gets the tone right and the tests are machine-measurable, I'm sure the credibility gained from developer approval will compensate for all the maybes on legal standing.

Now link to my site, people, you are driving my average down! My site scores 5.8 for marketing, 9.8 for design, 10 for accessibility and 9.4 for experience...

I like the presentation of results.
I'm trying to work out what the main difference is between this service and Sitemorse (which I detest). There's a scoring system here, like SiteMorse, and at first glance it seems to be fair. The October 1st warning is total rubbish, but that's already been said, and you've made some comments about missing alt texts that aren't strictly true.
If you can incorporate user testing and make the testing method transparent I reckon you could be onto a winner.

I'd be interested to find out how an automated tool can accurately tell the difference between a layout table and a data table. If it's some algorithm (guessing system) that only gets it right in 90% of cases, then it's already a system open to abuse, and that's probably where SiteMorse falls over. I wonder whether, if I built a table-layout site with a touch of CSS and blank summaries on all my tables, it would pass your CSS test?

You'll need to build in a manual check in each instance where it can be demonstrated that the automated test can't fully test a checkpoint.

Stay ethical and you should be on the right track to providing an accurate and useful tool. Just don't start publishing accessibility league tables based on automated results ;)

Grant Broome
Blog
CDSM
Shaw Trust

Once the scoring system for the tool is publicly available, that will solve all the problems, imho. People will then be able to make an informed decision about which scores should be taken seriously, which ones should be ignored and whether the service is a worthwhile investment. As long as the limitations of automated testing are made clear to all clients and interested entities, there should be no problems with the results being taken for something they are not.

The conservative way to detect layout tables is to identify any table which has no <th> elements. A table without headers cannot be a data table, since the data is not being defined. Also, a nested table will always be for layout, because a data table would use colspan and rowspan instead.

However, some layout tables use <th> elements without nesting, so the results would have to indicate that: "There appear to be some layout tables, but I cannot be certain."

By combining many complex tests into the algorithm (such as looking for very large amounts of text in cells with large amounts of markup, or looking at the DOM tree to see if the table contains many, most or all elements of the <body> section, etc.) you could get a score and set a series of thresholds. A different response could be reported by the tool at each threshold, becoming more confident the higher the score is but never stating the detection as absolute. It could even give details about exactly which results from which tests made it think there are layout tables present, although this might not be understood by clients. :)
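
To make that concrete, here's a rough sketch in Python of that kind of cumulative heuristic (the signals and weights are plucked out of the air; it's the shape of the approach, not the numbers, that matters):

Code:
# Rough sketch of a cumulative layout-table heuristic. The signals, weights and
# thresholds are invented for illustration; each test only adds evidence, and the
# tool reports confidence rather than certainty.

def layout_table_score(table):
    """table: a dict standing in for one parsed <table> element."""
    score = 0
    if not table.get("has_th"):             # no <th>: cannot be a well-formed data table
        score += 2
    if table.get("is_nested"):              # table nested inside a table: almost always layout
        score += 3
    if table.get("wraps_most_of_body"):     # contains most of the <body> content
        score += 2
    if table.get("heavy_markup_in_cells"):  # cells stuffed with block-level markup
        score += 1
    return score

def report(score):
    if score >= 5:
        return "This looks very much like a layout table."
    if score >= 3:
        return "There appear to be some layout tables, but I cannot be certain."
    return "No strong evidence of layout tables."

example = {"has_th": False, "is_nested": True,
           "wraps_most_of_body": False, "heavy_markup_in_cells": True}
print(report(layout_table_score(example)))  # This looks very much like a layout table.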


I think it is impossible for an algorithm to be 100% sure about finding all layout tables, because their implementation doesn't follow any strict patterns. The only thing you can be sure about is that a table without headers is not a correctly formed data table - but even that still doesn't mean it's a layout table.

I think Cerbera has been reading our source code for the new Sitescore... :)
