djlebarron
19 May 2011, 02:52 AM
I'm beginning to think that I'll eventually convert the interaction on my complex, almost-completely-PHP sites to jQuery/Ajax. It's getting hard to resist the big wave any longer. I've got a lot of workarounds in these sites, such as cache manipulation and the use of $_SESSION, that make them a lot more controlled and consequently smoother as far as refresh and back/forward navigation go. But completely seamless seems to be the way everything is going, and it certainly looks like the benefits are well worth it.
I just visited a Yahoo page stating that among their 300 million worldwide visitors each month, only about 1% have JavaScript disabled. They say the U.S. is about 2% JS-disabled.
http://developer.yahoo.com/blogs/ydn/posts/2010/10/how-many-users-have-javascript-disabled/
W3Schools last reported a JS-disabled percentage of about 5% in 2008, so it appears that users may be turning JS back on in response to constantly encountering inaccessible pages, and that non-JS browsers are being upgraded to JS-enabled versions.
The issues I'm wondering about are:
Should I work my butt off creating what are virtually new sites with a ton of <noscript> fallbacks that will require LOTS of extra code, or should I simply replace much of the PHP refresh with jQuery and redirect the JS-disabled visitors to the old sites via a conditional statement placed in the index pages' head sections?
How badly will the extra code in the "combination" <noscript> sites slow things down?
If I choose to save myself a bag of headaches and redirect the JS-disabled visitors, how would that affect SEO, seeing as many of the search-engine bots don't read and/or follow JS?
Would it be better to do it the other way around for the bots, by redirecting the JS-ENABLED visitors to the "new and improved" jQuery versions of the sites, based on a conditional statement?
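To be clearer about what I mean by that last option, here's a rough sketch of the redirect idea. The "/enhanced" prefix and the enhancedUrl helper are just placeholder names I made up for illustration, not anything from my actual sites:

```javascript
// Sketch only: map an old-site path to its hypothetical jQuery-enhanced twin,
// e.g. "/products.php" -> "/enhanced/products.php".
function enhancedUrl(path) {
  if (path.indexOf('/enhanced/') === 0) return path; // already on the new site
  return '/enhanced' + (path.charAt(0) === '/' ? path : '/' + path);
}

// In a <script> in the old pages' head, this would only ever run when JS is
// enabled, so JS-disabled visitors (and bots that don't execute JS) simply
// stay on the old PHP pages:
//
//   if (window.location.pathname.indexOf('/enhanced/') !== 0) {
//     window.location.replace(enhancedUrl(window.location.pathname));
//   }
```

The appeal is that no <noscript> blocks are needed at all: the absence of JS is itself the "conditional," and the old site remains the default that everything (including the bots) sees.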
Thoughts?