Wednesday 31 May 2017

Scrutiny and Wix sites

Wix make it easy to create a website, but their reliance on JavaScript and plugins (including Flash, apparently) makes it a challenge for web crawlers, including Googlebot, to index the content.
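If you're curious what a crawler actually sees, here's a rough sketch in Python (the Wix address is made up): fetch a page without running any script and count the links present in the raw HTML. On a js-built site the answer is typically zero or close to it.

import re
import urllib.request

# Hypothetical Wix address - substitute a real one to try this.
url = "https://example.wixsite.com/mysite"
req = urllib.request.Request(url, headers={"User-Agent": "Mozilla/5.0"})
html = urllib.request.urlopen(req).read().decode("utf-8", errors="replace")

# Count anchor tags present before any JavaScript has run.
links = re.findall(r'<a\s[^>]*href=["\']([^"\']*)["\']', html, re.I)
print(len(links), "links found in the raw, un-rendered HTML")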

It seems that Google have had to go out of their way to index these sites. There used to be a hack to get at the plain HTML content, which I used to give as a workaround to users wanting to scan Wix sites, but it no longer appears to work.
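For what it's worth, the best-known trick of that era was Google's AJAX-crawling scheme (deprecated in 2015), where appending ?_escaped_fragment_= to a URL asked the server for a pre-rendered HTML snapshot. I can't promise that's the same hack, but this little sketch tests whether a site still honours it:

import urllib.request

# The old AJAX-crawling convention: this query string used to return a
# static HTML snapshot of a script-built page. Address is hypothetical.
url = "https://example.wixsite.com/mysite?_escaped_fragment_="
html = urllib.request.urlopen(url).read().decode("utf-8", errors="replace")

# A real snapshot contains the page content; a js shell mostly doesn't.
print(len(html), "bytes returned;", html.count("<a "), "anchor tags")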

As someone with a background in website accessibility, this all seems appalling to me*.

But they're gaining ground; I seem to be seeing more support questions about scanning Wix sites. (And their *!^%$ advert that I have to skip before almost every YouTube video I watch. When is YouTube Red going to be available in the UK??)

Unless anyone knows otherwise (please, please tell me if you know), Wix relies on a/ providing its users with an XML sitemap for submission to Google, and b/ Google being able to render js before indexing a page (again, as an old-fashioned gal, something I heartily disapprove of*).
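The sitemap half of that is easy enough to verify for yourself. Here's a sketch, assuming the sitemap lives at the conventional /sitemap.xml (the address is hypothetical again), which fetches it and prints the URLs it declares - the list Google is handed regardless of what the pages themselves look like:

import urllib.request
import xml.etree.ElementTree as ET

# Conventional sitemap location.
url = "https://example.wixsite.com/mysite/sitemap.xml"
root = ET.fromstring(urllib.request.urlopen(url).read())

# <loc> elements hold the URLs, in both sitemaps and sitemap indexes.
ns = "{http://www.sitemaps.org/schemas/sitemap/0.9}"
for loc in root.iter(ns + "loc"):
    print(loc.text)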

Scrutiny has long been able to render JavaScript before parsing page content. There have always been a small number of sites whose menus and links are written by js when the page loads, and I've recently done some work to ensure the proper scanning of such a site.
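The rendered view is a different story. For comparison with the raw-HTML check above, here's a sketch of the same link count taken after the scripts have run - using Selenium and a local Chrome install as stand-ins for Scrutiny's own rendering engine:

from selenium import webdriver
from selenium.webdriver.chrome.options import Options
from selenium.webdriver.common.by import By

opts = Options()
opts.add_argument("--headless")  # no visible browser window
driver = webdriver.Chrome(options=opts)

driver.get("https://example.wixsite.com/mysite")  # hypothetical address
anchors = driver.find_elements(By.CSS_SELECTOR, "a[href]")
print(len(anchors), "links once the JavaScript has run")
driver.quit()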

So... here's my best advice for scanning a Wix site using Scrutiny (Integrity and Integrity Plus don't have the necessary js option).

1. Make sure you have Scrutiny 7.4.1 or higher, which contains some necessary improvements to the js functionality. The most up-to-date version of Scrutiny is always here.

2. Switch on 'Render Javascript' which is in the Advanced settings for your site.
3. Turn down your number of threads. You can see I'm using 3 here (first notch). Using the js option is very heavy in terms of client-side processing, and you don't want pages timing out before they've loaded because Scrutiny is busy rendering a large number of pages at once. If your website scans fully, then maybe edge this up a bit, but it's best to err on the side of caution.
4. Go and make a cup of tea. This is going to take some time.
5. After the scan, switch on the Target Size column of the 'by link' table and check whether any of the internal links have a target size of zero (switch the Filter button to Internal to see these more easily). If an internal page shows zero KB as its target size, it may have timed out before it finished loading / rendering, which means you may have an incomplete scan. A quick way to double-check any suspects follows below.
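Here's a rough sketch for that double-check, assuming you've copied the zero-size URLs out of Scrutiny: re-request each one with a generous timeout and see whether a body comes back this time.

import urllib.request

# Paste in any internal URLs that showed a target size of zero.
suspects = [
    "https://example.wixsite.com/mysite/about",  # hypothetical
]

for url in suspects:
    try:
        with urllib.request.urlopen(url, timeout=60) as resp:
            print(url, "-", len(resp.read()), "bytes")
    except Exception as err:
        print(url, "- failed:", err)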

If all looks well, then you're good to go with your data, which should be complete.



* It's not the JavaScript I disapprove of, but the fact that the page is 'invisible' without the js being rendered. Accessibility guidelines state that we must "Ensure that pages are usable when scripts, applets, or other programmatic objects are turned off or not supported. If this is not possible, provide equivalent information on an alternative accessible page." (Web Content Accessibility Guidelines, checkpoint 6.3 - a Priority 1 checkpoint)
