Accessify Forum - Discuss Website Accessibility


SiteMorse Dropped from Public Sector Forums

Richard Conyard wrote:
I don't think I would go as far as useless. Just the same as you can't rely completely on automated testing, I don't think it is practical to throw automated testing away because it doesn't solve everything.

Also, looking at SiteMorse specifically, they are checking more than just accessibility. The marketing, which is the league table, comprises other checks like eGMS, code validity and speed/size.


Agreed on testing - SiteMorse provides a baseline. If it's wrong - and that's a big if - then it's wrong in the same way for everyone.

How is the league table 'marketing', beyond getting SiteMorse coverage in the national press?
Isofarro wrote:
eGMS and the highly flawed speed and response tests have nothing to do with accessibility.


What?

So if your site runs like a three-legged donkey with arthritis it's going to be accessible. Bullplop!

And, by the way, eGMS is mandatory for all public sector websites. SiteMorse makes it very clear that it is not just testing accessibility but also, for the public sector, conformance with national requirements.
Isofarro wrote:
Richard Conyard wrote:
Also, looking at SiteMorse specifically, they are checking more than just accessibility. The marketing, which is the league table, comprises other checks like eGMS, code validity and speed/size.


Interesting that SiteMorse use the very same tool as the basis for criticising accessibility experts under the guise of "accessibility". eGMS and the highly flawed speed and response tests have nothing to do with accessibility.

You'd think SiteMorse knew their tools well enough to make that distinction - especially in a press release. Whatever next - will they test the number of coffees drunk in a day by the janitor as another "accessibility test"?


Iso,
I'm not on here to stand up for SiteMorse; they can do that themselves if they want to. Personally I don't like their marketing tactic of having a go at other companies in the field. There are enough companies that don't bother with accessibility, or worse, claim they do and still don't, which would make far better targets.

That said, I'm also not about to have a go at them just because they are SiteMorse and I don't like some of the stuff they do. And, well, some of the stuff on here just sounds like personal gripes.

Coming back to your point, I don't think they tested for eGMS when looking at accessibility companies; looking through the report, it's not in their table. As for government sites, why not test for eGMS? After all, it is a requirement.
nhodgetts wrote:
Isofarro wrote:
eGMS and the highly flawed speed and response tests have nothing to do with accessibility.


What?

So if your site runs like a three-legged donkey with arthritis it's going to be accessible. Bullplop!


You are mistaken on two counts:

1.) SiteMorse actually runs the test from one server location somewhere in the UK. If you happen to have your site on the same network ring, you are going to score really well on the speed tests. If you happen to be located on the other side of the UK, or somewhere else on the planet, the number of hops between the boxes grows from one to twenty or thirty. SiteMorse don't actually do anything to even out this imbalance, so in effect their speed test provides no useful indication at all. A proper speed test would involve testing from various locations around the planet, to negate the effects of local problems around the testing robot. SiteMorse don't appear to have done this; for all we know they could be doing it with carrier pigeons from an oil rig in the North Atlantic.

2.) If a site is slow, it will be slow for everyone, not just people with disabilities.
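The single-vantage objection can be sketched in code. A minimal Python illustration - the probe figures are invented and the metric is my own toy, not anything SiteMorse publish: subtract the measured network round-trip from each timing and take the median across several vantage points, so hop count stops dominating the score.

```python
from statistics import median

def latency_adjusted_score(measurements):
    """measurements: list of (total_response_secs, round_trip_latency_secs).

    Subtracting the round trip approximates the time the server itself
    spent building the page; the median across vantage points damps out
    any one badly-placed testing robot."""
    return median(total - latency for total, latency in measurements)

# Hypothetical probes of one site from a nearby box (1 hop)
# and two distant ones (20-30 hops):
probes = [(0.35, 0.01), (0.60, 0.25), (0.55, 0.22)]
print(round(latency_adjusted_score(probes), 2))  # prints 0.34
```

The three raw response times differ by nearly a factor of two, yet the latency-corrected figures barely move - which is the point of the complaint above.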
nhodgetts wrote:
So if your site runs like a three-legged donkey with arthritis it's going to be accessible. Bullplop!


Congratulations on using disability as a derogatory term.

A slow-loading Web site is no more a barrier to access than a slow lift is a barrier to accessing the top floor of a building.
nhodgetts wrote:
Isofarro wrote:
That is complete rubbish. There are a number of organisations that have done the same, and still offer the same service.

Name One!


I'll name four:
* DRC - they measured over 1000 UK websites in June 2004, including a proper accessibility test of 100 of those sites.
* Nomensa - they regularly survey sectors of the UK measuring accessibility - doing a proper accessibility test.
* AbilityNet - they regularly test sections of UK websites measuring accessibility - doing a proper accessibility test.
* RNIB - they've regularly done accessibility testing of ecommerce websites - proper accessibility testing.

You'll find the material coming out of their reports is far more practical and usable.

Don't even think that some half-arsed script written by a developer with little understanding of web accessibility, run at 11 o'clock on a Friday morning before pissing off to the pub for a few hours, counts as a proper accessibility test.

A proper accessibility test involves checking the results, and manually checking all the checkpoints - as it's clear there's probably only one checkpoint that could be fully tested automatically - checkpoint 3.2 - and SiteMorse even get that one wrong!
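As a rough illustration of that automated/manual split (a toy checker of my own, not SiteMorse's code): a script can flag an img tag with no alt attribute at all, but it takes a human to judge whether the alt text that is present actually means anything.

```python
from html.parser import HTMLParser

class AltPresenceChecker(HTMLParser):
    """Counts <img> tags with no alt attribute - the bit a machine CAN do.

    Whether alt="spacer.gif" is *useful* text is the bit it can't."""
    def __init__(self):
        super().__init__()
        self.missing = 0

    def handle_starttag(self, tag, attrs):
        if tag == "img" and "alt" not in dict(attrs):
            self.missing += 1

    # <img ... /> style tags arrive here instead
    handle_startendtag = handle_starttag

checker = AltPresenceChecker()
checker.feed('<img src="a.gif"><img src="b.gif" alt="spacer.gif">')
print(checker.missing)  # prints 1 - yet the "passing" alt text is junk
```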
Isofarro wrote:
If you happen to have your site on the same network ring, you are going to score really well on the speed tests. If you happen to be located on the other side of the UK, or somewhere else on the planet, the number of hops between the boxes grows from one to twenty or thirty. SiteMorse don't actually do anything to even out this imbalance, so in effect their speed test provides no useful indication at all.

This is nonsense and you know it. For the sort of speed tests we are talking about, network geography (certainly within the country) is pretty much irrelevant. If either your response time or your download speed is affected by network geography to the extent that your SiteMorse score goes down, then you have a problem which needs fixing.
Jon R wrote:
If either your response time or your download speed is affected by network geography to the extent that your SiteMorse score goes down, then you have a problem which needs fixing.


Or SiteMorse has a problem that needs fixing.
Jon R wrote:
This is nonsense and you know it. For the sort of speed tests we are talking about, network geography (certainly within the country) is pretty much irrelevant. If either your response time or your download speed is affected by network geography to the extent that your SiteMorse score goes down, then you have a problem which needs fixing.


I don't intend to test this out myself - I've got better things to do - but that just doesn't hold water.

Pinging (traceroute or the like) a site from one server, where site A is on the same server, will always return a faster result than those hosted elsewhere. If I were to test from here it would go through 3 or 4 hops before I even left my ISP!!

A speed test can only be useful if all sites are tested from a 'remote' point. It would then give a better reflection of the time taken to reach these sites.
elfin wrote:

Pinging (traceroute or the like) a site from one server, where site A is on the same server, will always return a faster result than those hosted elsewhere. If I were to test from here it would go through 3 or 4 hops before I even left my ISP!!

A speed test can only be useful if all sites are tested from a 'remote' point. It would then give a better reflection of the time taken to reach these sites.


A good idea with speed tests might also be to record the number of hops taken, which could then be used to gauge the relative location of the servers involved.
Something like:

(response time - latency) * size of files required to be downloaded

?
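Taken literally (the function and variable names here are mine, for illustration only), that metric might look like this - lower would mean better, since page weight multiplies the latency-corrected response time. Dividing size by the corrected time instead would give an effective-throughput figure.

```python
def speed_score(response_secs, latency_secs, page_bytes):
    """Literal reading of '(response time - latency) * size of files':
    server-side time weighted by page weight.  Lower is better."""
    return (response_secs - latency_secs) * page_bytes

# A 100 KB page answered in 0.5 s over a 0.1 s round trip:
score = speed_score(0.5, 0.1, 100_000)
```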
Jon R wrote:
This is nonsense and you know it. For the sort of speed tests we are talking about, network geography (certainly within the country) is pretty much irrelevant. If either your response time or your download speed is affected by network geography to the extent that your SiteMorse score goes down, then you have a problem which needs fixing.

Rubbish - we monitor the response time of our site from three different locations around the UK, and there are significant fluctuations between the three according to network traffic and other conditions affecting some areas and not others.

The Watchmaker Project - my personal blog
29digital Design Studio - freelance web design/development
Richard Conyard wrote:
I don't know if you're reading this Deri, but is there any plan for SciVisum to look at producing league tables? I don't know whether a comparison here might be a benefit or a drawback.


Been on holiday, so just catching up.

We did give it some thought a while back, but struggled to come up with a list of things that can be genuinely 100% auto tested, whilst not being so trivial as to have very little value.

I talked to PSF back in mid-2003; initially they'd suggested a meeting so we could discuss things, but they came back and said that because the SiteMorse league tables were getting them more PR value than anything else they had done, they didn't want to think about changing the model.

Instead we've continued in the local authority space to contribute to the Socitm 'Better Connected' web audits each year, which are a pretty good and authoritative exercise - if anyone is keen for a league table for LA sites then there's one.

The testing we contribute is not automated, although elements of it are.

We've also just finished testing 69 councils who wanted an audit of the use of technology on their sites - and found loads of areas where sites can be greatly improved in performance, accessibility and other factors, at the cost of little time and effort.

Our approach is to give the extra engineering analysis that helps folks improve their site, not just to apply a scorecard to it.

Interestingly, one of the councils was using its robots.txt file to keep some parts of its site from being spidered by the SiteMorse robots! We weren't sure whether they were selecting the poorer parts of their site to be hidden so they got a better score... or whether they were doing it because they had had trouble in the past when the SiteMorse robot behaved badly in terms of grabbing too many pages at once, etc.

League tables and 'naming and shaming' are tricky - for example, after our survey of 100 UK sites from a Firefox user perspective that the BBC picked up, we had one of the companies contact us rather miffed with what we'd written. (I had personally contacted them at the time the BBC report went out, to keep them in the picture.) We used the phrase that parts of their site were 'hidden' to Firefox browsers, which they didn't agree with because the content could be found - but only if the site visitor clicked on a little icon to the left of the heading they'd just clicked. None of the folks here who had double-checked our survey findings before we went public had spotted this work-around, so we felt that 'hidden' was the right word.

I guess if we or anyone came up with a good league table model, we'd consider running it - we do have a wonderfully flexible test engine here as a result of the conscious effort we've put into it, so we can test for pretty much anything... but of course analysis and interpretation are the hard parts.

Deri
www.scivisum.co.uk
Thank you Deri. I for one have no knowledge of SciVisum other than seeing it mentioned time and time again in articles about SiteMorse and its actions. This has been helpful.

--
Kyle J. Lamson
Analyst/Programmer III, State of Alaska
Been on my hols so missed this breaking. Unsurprisingly, I welcome PSF's decision. It was the right thing to do, but that doesn't mean it was the easy thing to do - I suspect a lot of traffic at PSF was generated by the league tables.

Sadly there will most likely always be other sites willing to step into the breach - I notice that Government Forum now carries the reports - http://www.governmentforum365.co.uk. They might be interested to hear from anyone who wishes to point out the deficiencies in the league tables...

To pick up on a couple of specifics from this discussion:

nhodgetts wrote:
And, by the way, eGMS is mandatory for all public sector websites. SiteMorse makes it very clear that it is not just testing accessibility but also, for the public sector, conformance with national requirements.


Richard Conyard wrote:
As for government sites why not test for eGMS, after all it is a requirement?


Sure, SiteMorse tests for eGMS compliance, but all it does is check to see whether there are values in certain metadata elements. Sites get a 100% score even if they use the same LGCL term on every page - is that fair on sites like ours, where LGCL terms are selected on a per-page basis? Of course not, but since when have the SiteMorse tables been fair?
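A sketch of why a presence-only metadata check is so easy to game - the element name and pattern here are my own assumptions for illustration, not SiteMorse's actual test:

```python
import re

def naive_egms_check(html):
    """Pass if an eGMS subject element has *any* value - this cannot tell
    a carefully chosen per-page LGCL term from the same term pasted onto
    every page of the site."""
    return bool(re.search(
        r'<meta\s+name="eGMS\.subject\.category"\s+content="[^"]+"',
        html, re.IGNORECASE))

# The same single term on every page still scores full marks:
boilerplate = '<meta name="eGMS.subject.category" content="Leisure">'
print(naive_egms_check(boilerplate))  # prints True
```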

SiteMorse 'performance' tests hammer our server in an unrealistic manner. They are discussed elsewhere here and extensively on Usenet, and until SiteMorse publish details of the methodology used, the results are worthless IMHO.

We were one of the 69 councils who took up SciVisum's offer of an audit, and it was very useful - thanks Deri. In particular it gave me a kick up the backside to get gzip compression up and running!
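For anyone wondering what gzip buys, a quick Python illustration (a toy repetitive page, so the figure flatters; real savings vary, but markup-heavy HTML routinely shrinks by well over half):

```python
import gzip

# A deliberately repetitive chunk of markup, standing in for a real page:
page = b"<p>Council tax, bin collections, planning applications.</p>" * 200
compressed = gzip.compress(page)
saving = 100 - 100 * len(compressed) // len(page)
print(f"{saving}% smaller over the wire")
```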

Page 3 of 5
