When you've worked as a webmaster with accessibility as an important part of the job, it's difficult to accept that JavaScript is now becoming a legitimate part of a page's rendering process.
Early on, any important content needed to be visible as text with JavaScript disabled. If it couldn't be seen in a text-only browser without JS, then it wasn't going to get past me and onto a local government website.
Justifications included Googlebot's blindness to such content and the hypothetical user with assistive technology. (Not really hypothetical; I met some.) But even then I couldn't help feeling that their software ought to be capable of a bit more, rather than the web being limited by the most basic or outdated user agents.
I guess the tipping point comes when Google is able to index content that relies on JavaScript to render it. That shoots the fox of old stalwarts like me, and that point has arrived.
This is why Scrutiny is now able to execute JavaScript before scanning a page (it's early days, and if it doesn't work as expected for you, we need to know - please let support know).
This article from Google's Webmaster Central Blog gives their view of the matter. As well as useful tips, like making sure the necessary JavaScript and CSS files are accessible to the search engine bots, I'm happy to say that they still recommend that your pages 'degrade gracefully', i.e. that users without the latest whizzy browser can still get at your important content.
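To illustrate that first tip, here's a minimal robots.txt sketch. The directory names are hypothetical placeholders, not anything from Google's post or from Scrutiny; the point is simply not to block the script and style files that the page needs in order to render.

```
# Hypothetical robots.txt excerpt - /js/, /css/ and /private/ are
# placeholder paths for illustration only.
User-agent: *
# Avoid blocking asset directories; lines like these would stop
# Googlebot fetching the files it needs to render the page:
#   Disallow: /js/
#   Disallow: /css/
Disallow: /private/
```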