Accessify Forum - Discuss Website Accessibility
SiteMorse Dropped from Public Sector Forums

SiteMorse league tables will no longer be published on the Public Sector Forums.

Quote:
After over two years of bringing you the loved and loathed tables, we’ve decided to call time, basically because of the company’s insistence on including the accessibility element, which we now – after much deliberation – accept makes the whole exercise utterly meaningless. We therefore feel these are no longer of value to anyone other than SiteMorse itself, and certainly not to our users.


The move by PSF is likely to have a huge impact on SiteMorse, which relies heavily on public sector business.

The issue of the validity of SiteMorse's league tables has recently been discussed here:
http://www.accessifyforum.com/viewtopic.php?t=3299&start=0
Thanks for the heads up, Grant. :D

Mike Abbott
Accessible to everyone
My pleasure, Mike.

I've asked PSF if they mind me publishing the article here.
It's good news for the public sector guys. They'll feel a lot less pressure to conform to the results of an automated tool.
Grant, thanks for the update. Fingers crossed the PSF guys allow a public reproduction of their post - from what I've seen of it, it looks like an excellent informative article.

I've blogged it.
Isofarro wrote:
Grant, thanks for the update. Fingers crossed the PSF guys allow a public reproduction of their post - from what I've seen of it, it looks like an excellent informative article.


Here's the article:

Quote:
Why We Will No Longer Publish SiteMorse League Tables

After much deliberation it has been decided to discontinue running the monthly local authority website league tables from SiteMorse on Public Sector Forums. These are our reasons why....

Subsequent to discussions both with the company itself and a number of accepted experts in the field, we have concluded that the accessibility element of the tables is extremely misleading, rendering the positions and rankings that make up the league as a whole meaningless.

Moreover, in the final analysis – far from ‘pushing up the bar’, as was the original intention – inclusion of this aspect could actually damage the cause of promoting better websites among local authorities. Because accessibility simply cannot be accurately tested by automatic means alone, basing a league on such results is likely to give an entirely false picture of the true state of affairs.

The reasons are several, but they are summarised neatly in the paragraph below, taken from revisions to the forthcoming Government Guidelines on Accessibility due to be published shortly by the eGovernment Unit. As such, it represents the official view on automated testing of sites:

"…automated tools are like spell checkers – they look for obvious problems within a web page, then generate a list of possible problems. They cannot give a straightforward statement of whether your website meets certain accessibility standards. The list of possible problems needs to be interpreted by an experienced person and matched against what your site is actually doing. There is a substantial list of accessibility issues (at least 50%) that cannot be assessed by current automatic tools.."

Using this analogy, ranking sites using an auto tool is the equivalent of ranking Word documents for errors based on the number of items highlighted by the ‘spelling & grammar’ checker – an obviously absurd proposition.

Concurring with such a view on the value of automated tools was the RNIB's Julie Howell, who remarked:

“Clearly, automated testing tools can be useful, albeit to a limited extent as discussed above. However, such tools should never be the sole means of measuring the accessibility of a website.

Research carried out by the Disability Rights Commission (DRC) in 2004 indicated that while conformance to the WAI WCAG is essential, organisations can only be sure of the accessibility and usability of their websites when disabled people are involved in testing their designs and providing feedback.

Later this year, the British Standards Institution (BSI) will publish guidance commissioned by the Disability Rights Commission (DRC): Publicly Available Specification (PAS) number 78 'Guide to Good Practice in Commissioning Accessible Websites'.

PAS 78 will include guidance on the benefits and pitfalls of using automated testing tools for evaluation and repair of accessible web code. The message is clear: don't rely on automated testing tools alone; involve disabled people to test your website. PAS 78 will include guidance on how to undertake usability testing with disabled participants."

Although there was dialogue recently about the possibility of SiteMorse reviewing the way it compiles its accessibility test results and creating a table based on the percentage of ‘negative’ rather than ‘positive’ ones, further consideration leads us to feel this actually gives no clearer a picture. As one contributor to the Accessify Forum website – to which PSF linked to follow the recent debate on this topic – put it:

“I still can't see how it makes a jot of difference. 0% fail = 100% pass, 100% fail = 0% pass. Still hokum….(because of the number of checkpoints the tool is unable conclusively to test).”

With the above in mind, we have therefore concluded that compiling leagues which incorporate the results of automated accessibility testing – as SiteMorse does – is a worthless and meaningless exercise.

We do nonetheless recognise that – having hosted the tables for over two years – a more detailed explanation for our decision will probably be helpful. This position is shared by accessibility expert Grant Broome of CDSM Interactive Solutions – a partner and consultant organisation to the pan-disability group the Shaw Trust – who supplied the clarification below. Such information might also be useful when – as some people have experienced – individuals within an organisation question members of the web team about their authority’s position in the league.

A website that is almost entirely inaccessible can score 100% as long as it passes the checkpoints that are being tested.
In all cases where automated testing is concerned, some checkpoints are too subjective to be tested in any meaningful way, which means that only a proportion of checkpoints can be tested by the machine. SiteMorse's tests account for 30-40% of priority 1 and 2 checkpoints. That leaves the majority (60-70%) of priority 1 and 2 checkpoints untested, and none of the priority 3 checkpoints are tested at all.

If we take a scenario where a website passes 60-70% of checkpoints overall but fails the SiteMorse test set, it could score 0% in that test, even though it may be more accessible than a site that passes all the SiteMorse tests and scores 100%.
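To illustrate the arithmetic, here is a rough sketch in Python; the checkpoint counts are invented purely for illustration, but the principle is the one described above:

Code:
# Hypothetical figures: suppose there are 65 priority 1 and 2
# checkpoints, of which an automated tool can test about a third.
TOTAL_CHECKPOINTS = 65
TOOL_TESTED = 22  # roughly 34% of the full set

def tool_score(passed_of_tested):
    """The score as the tool reports it: only the tested subset counts."""
    return 100 * passed_of_tested / TOOL_TESTED

def true_score(passed_of_all):
    """The score over every checkpoint, machine-testable or not."""
    return 100 * passed_of_all / TOTAL_CHECKPOINTS

# Site A fails all 22 machine-testable checkpoints but passes the
# other 43: the tool reports 0%, yet it meets 66% of the full set.
print(tool_score(0), round(true_score(43)))   # 0.0 66

# Site B passes exactly the 22 machine-testable checkpoints and
# fails the other 43: the tool reports 100%, yet it meets only 34%.
print(tool_score(22), round(true_score(22)))  # 100.0 34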

Automated tools can be tricked
Give every image on a site the same alt text – say, alt=”my standard alt text” – and it will probably be passed by every automated tool available as acceptable alt text, even though the images vary a great deal and a proper understanding of their context is essential to the user. Given this simplistic method of error checking, it’s easy to see how a conscientious web manager with a single missing alt text could have his/her entire site ranked as a priority 1 failure (even though the image may be of no importance – a spacer image, for example), while the webmaster who duplicates alt text simply to gain a high ranking with minimum effort can do so and score 100% for this checkpoint.
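A minimal sketch of the kind of naive alt-text check an automated tool performs makes the loophole obvious (the checker below is hypothetical, but representative):

Code:
import re

def images_missing_alt(html):
    """Flag <img> tags lacking a non-empty alt attribute - the only
    question such a tool asks. It cannot ask whether the alt text
    actually describes the image."""
    imgs = re.findall(r"<img\b[^>]*>", html, re.IGNORECASE)
    return [tag for tag in imgs
            if not re.search(r'alt="[^"]+"', tag, re.IGNORECASE)]

# The conscientious author with one unimportant omission is flagged
# as a priority 1 failure...
honest = '<img src="chart.png" alt="Sales by region"><img src="spacer.gif">'
print(images_missing_alt(honest))  # ['<img src="spacer.gif">']

# ...while duplicated, meaningless alt text sails straight through.
gamed = ('<img src="chart.png" alt="my standard alt text">'
         '<img src="map.png" alt="my standard alt text">')
print(images_missing_alt(gamed))   # []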

The errors that can be checked for need manual verification
Of the errors that are being checked, practically all of them need to be verified manually. Automated tools make mistakes because their error detection is based on certain assumptions, so it’s important to check some of the reported errors by hand to ensure that the tool itself isn’t making errors. If, during a manual check, a reported error is found to be false, then the author is simply more aware of the tool’s shortcomings and can watch out for the same type of scenario in future. But a league table often puts pressure on the author to drastically alter the web page even when there are no true errors, simply to improve a ranking that is meaningless. It is especially in regard to this practice that we feel the tables may now do more harm than good.
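As a sketch of how such false positives arise, consider a hypothetical rule built on the assumption that every table is a data table:

Code:
import re

def tables_without_headers(html):
    """Hypothetical rule: 'a table without <th> header cells is an
    error'. The underlying assumption - that every table holds
    data - is wrong for layout tables."""
    tables = re.findall(r"<table\b.*?</table>", html,
                        re.IGNORECASE | re.DOTALL)
    return [t for t in tables if not re.search(r"<th\b", t, re.IGNORECASE)]

# A purely presentational layout table trips the rule even though
# header cells would be meaningless here - a false positive that
# only a human reviewer can recognise as such.
layout = "<table><tr><td>menu</td><td>content</td></tr></table>"
print(tables_without_headers(layout))  # flagged, wrongly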

The algorithm that the tool uses to assess the website is entirely dependent on the developer's own interpretation of guidelines, which are often entirely subjective
Every automated tool has a programmer behind it. Whether or not a web page passes a test is down to the way that the automated tool has been developed.

Web accessibility can be very complicated. There is much discussion about the interpretation of guidelines and very often it’s only the end user that can decide whether a site is accessible or not.

When an automated tool flags up an error, it is the developer’s interpretation of the checkpoint that is being tested, and not necessarily the checkpoint itself. It’s important when using automated testing tools to have an informed opinion on all web accessibility issues in order to be able to verify results.

Some accessibility issues are not covered by WCAG guidelines
It is possible to create a website that complies with WCAG guidelines and still presents accessibility barriers. A site whose text is not fixed in size but scales between “1pt” and “4pt” technically meets the Web Content Accessibility Guidelines, and it will incidentally pass through most automated testing tools. Yet such text would make the site inaccessible not only to disabled people but to most people. Ironically, the only people likely to be able to use the site without altering their browser settings would be screen reader users, who would not be affected by text size. So while measuring accessibility against the WCAG guidelines is undeniably the best starting point, there is more to accessibility than a list of checkboxes.
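The point is easy to reproduce with a checker that tests the letter of a guideline rather than its spirit (again a hypothetical sketch):

Code:
import re

def uses_fixed_font_sizes(css):
    """Hypothetical check for the 'use relative units' guidance:
    it flags absolute pixel font sizes and nothing else."""
    return bool(re.search(r"font-size\s*:\s*\d+px", css))

# Scalable, guideline-conforming - and unreadably small for almost
# every sighted visitor. The check passes it without complaint.
tiny_but_scalable = "body { font-size: 0.1em; }"
print(uses_fixed_font_sizes(tiny_but_scalable))  # False: 'no error found'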

In this context we feel the tables no longer serve any purpose other than as a marketing instrument for SiteMorse, since – as demonstrated above – it is impossible to seriously or accurately ‘rank’ one website above or below another on the basis of such tests.

Given the seriousness with which we are aware some of our users take these leagues, we encourage feedback on this article and on our decision to discontinue the league tables. You can post to our Noticeboard or mail info@publicsectorforums.co.uk


I've spoken to Ian Dunmore of PSF, who – aside from a single disgruntled league table supporter – has had unanimous support for the decision.

For what it's worth, I don't think it was an easy decision to make. I have the utmost respect for PSF and their ethics.
You know, I understand and agree (who couldn't?) with most of those points, but it does seem like throwing the baby out with the bathwater.

Automated tools aren't the be-all and end-all by a long chalk, and they do have their limitations. But I feel there is a place for automated testing; it just requires a clear definition of where that place is, one that is understandable by all. Also, since there are certain checks, like DTD compliance, that can be verified 100% automatically, it's a pity that some compromise couldn't be reached.
Richard Conyard wrote:
You know, I understand and agree (who couldn't?) with most of those points, but it does seem like throwing the baby out with the bathwater.


Not at all. The problem is with the league tables themselves - not the concept of automated testing. Publishing the league tables gives them some sense of meaning, and that meaning has now been shown to be misleading.
Indeed, thus the compromise I mentioned. The league tables could be misleading, but they were at least one way of promoting competition between local authorities in a way that non-technical managers (and budget allocators) can understand. I do understand and agree that they are open to abuse and can give conflicting answers.
Isofarro wrote:
Richard Conyard wrote:
You know, I understand and agree (who couldn't?) with most of those points, but it does seem like throwing the baby out with the bathwater.


Not at all. The problem is with the league tables themselves - not the concept of automated testing. Publishing the league tables gives them some sense of meaning, and that meaning has now been shown to be misleading.


I would think that the real problem was not with the tables but a misunderstanding of what they represent; but if what they represent is not made clear, then they should not be published.

Reports of any statistical nature are only valid if you list the baseline and state what factors are being tested, and you should also make the raw data available if possible so it can be independently verified.

I don't think that many companies, if any, would make this obvious, as in most cases it undermines the marketing element of such surveys.

When I view anything like a survey, it should be "pinch of salt, pinch of salt, pinch of salt".
monkeygod wrote:

When I view anything like a survey, it should be "pinch of salt, pinch of salt, pinch of salt".


Agreed, although I like the phrase 'lies, damned lies and statistics'.
Richard Conyard wrote:
monkeygod wrote:

When I view anything like a survey, it should be "pinch of salt, pinch of salt, pinch of salt".


Agreed, although I like the phrase 'lies, damned lies and statistics'.


Statistics never lie; they are just misinterpreted (sometimes deliberately).
Anyone know if SiteMorse would care to comment?

Mike Abbott
Accessible to everyone
Richard Conyard wrote:
The league tables could be misleading, but they were at least one way of promoting competition between local authorities in a way that non-technical managers (and budget allocators) can understand.

Noooo.

If that were the case, no-one would be arguing with them. All it promoted was getting a pass with the SiteMorse test, which had nothing to do with accessibility whatsoever. As has already been pointed out, passing the SiteMorse test doesn't exactly mean you have a good site.
Hmmm,
Agreed that passing the test doesn't mean you definitely have a good or accessible site. My point was that they've gone from mentioning a non-perfect test that was encouraging competition (I've never met an ICT manager at a local authority who doesn't mention SiteMorse less than five minutes after mentioning their website) to no mention at all, which seems a somewhat backward step, since I am sure a halfway house could have been reached.

The level of resources available to LA website managers for promoting WAI standards is by all accounts very slim; without the league table I feel they would be slimmer still, and the whole accessible-website strategy for local government would be swept under the carpet. It's almost like the eGMS standard: I talked briefly to a website manager at an LA who had more than 20,000 documents to classify, no budget and no staff – just him by himself – and it would eventually happen when hell froze over.
So is there a way of offering a viable alternative? Can the Office of the e-Envoy help with this? I thought it was their job to promote accessibility and inclusion for government websites. Or am I out of date? :?

Mike Abbott
Accessible to everyone
