Flashy Websites Can Be Too Hot for Bots

By Michael Cottam

Don’t let fancy moves ruin your SEO

Website designers often employ technologies like Flash, JavaScript, Ajax, and Silverlight to make their sites attractive, fast, and easy to use. While there are great reasons to use these technologies, they can create problems when it comes to search engine optimization (SEO). Web page content that’s wrapped in a fancy package can be difficult – or impossible – for a search engine to “see.” That means the search engine crawlers may have a hard time understanding what your page is about, and so may not index all your important pages.

The search engines may also find it difficult to follow any links – internal or external – you’ve placed in web page content rendered in Flash, Silverlight or other technologies. That matters because search engines use your internal links to discover other pages on your site, to understand how pages on your site relate to each other, and to determine which pages on your website are more important than others.

Designers sometimes incorporate the search function for a website into their designs. That can be helpful for people, but it may pose problems for search engines trying to crawl your site. If pages on your site are accessible only from a search box, the search engine won’t be able to see those pages, because search engines don’t type keywords into search boxes to find relevant web pages.

Below I’ll describe some of the popular technologies for creating attractive, people-friendly web pages, explain the problems they can cause for search engines, and tell you how to avoid them.

JavaScript Menus

Web designers often use JavaScript to make navigation menus with special mouse-over effects, animated drop-downs and other interactive features. While these design innovations can be truly useful for human beings, they can also be a real problem for search engine crawlers.

Today, Google’s crawler – fondly known as Googlebot – can actually follow many links created in JavaScript. But it can’t follow all of them. And while Google is the dominant search engine, with about 70 percent of people using it, 30 percent of your potential customers are using a search engine other than Google. Those people are even less likely to see your JavaScript links. If your business depends on people coming to your site from search engines, saying that the bots can probably follow your JavaScript links is a bit like your boss saying your paycheck probably won’t bounce.

A CSS menu can do pretty much everything a JavaScript menu can do, and without any of the issues that cause problems for search engine crawlers. Don’t forget that mobile phones, tablets and other small computers that are increasingly popular for surfing the Web can also have trouble with JavaScript, but do fine with CSS.

There’s a nice CSS menu generator at CSSMenuMaker.com (http://cssmenumaker.com/).
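
If you’d like to see what such a menu looks like under the hood, here’s a minimal sketch of a pure-CSS drop-down (the class names and URLs are placeholders, not anything from a real site). The key point is that every destination is an ordinary href that crawlers can follow:

  <ul class="nav">
    <li><a href="/products/">Products</a>
      <ul class="submenu">
        <li><a href="/products/widgets/">Widgets</a></li>
        <li><a href="/products/gadgets/">Gadgets</a></li>
      </ul>
    </li>
    <li><a href="/about/">About Us</a></li>
  </ul>

  <style>
    /* The drop-down opens on hover; no JavaScript is involved */
    .nav li                { position: relative; display: inline-block; list-style: none; }
    .nav .submenu          { display: none; position: absolute; left: 0; }
    .nav li:hover .submenu { display: block; }
  </style>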

JavaScript Click-Tracking Links

People who are serious about tracking the business performance of their website use some form of analytics. Seeing how visitors get to your website, and where they go after they land on it, helps you understand how to turn more visitors into customers. One common way to track clicks is to wrap each link in a JavaScript function that records the click before sending the visitor on to the destination page; because the real destination isn’t in a plain HTML link, search engine crawlers may not be able to follow it.

Sometimes web developers use a single page with pre-set parameters to track clicks. The page captures the information about which links were clicked, and then redirects the web browser to the final page that will be shown to the person who’s surfing.

This is very similar to the JavaScript click-tracking function, and sadly, it has similar effects when it comes to search engines. Even if the website developer uses a 301 redirect, some of the goodness of that link is lost. I don’t recommend this approach. Solution? Use a free click-tracking service like Google Analytics instead of click-tracking JavaScript. Yes, Google Analytics uses JavaScript, but NOT in the links themselves.
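
To make the contrast concrete, here’s a rough sketch (the URLs and the trackAndGo() function are made up for illustration). The first two links hide or disguise the real destination, so a crawler may never reach the target page; the last one is a plain HTML link that people and bots alike can follow, with the tracking left to your analytics package:

  <!-- Crawler-unfriendly: the real destination lives only inside JavaScript.
       trackAndGo() is a hypothetical click-recording function. -->
  <a href="javascript:void(0)" onclick="trackAndGo('/pricing/')">Pricing</a>

  <!-- Also crawler-unfriendly: the link points at a tracking/redirect page, not the real page -->
  <a href="/track?dest=/pricing/">Pricing</a>

  <!-- Crawler-friendly: a plain link; let your analytics package record the click separately -->
  <a href="/pricing/">Pricing</a>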

Flash

Flash is an incredible technology that enables a richer user experience. Flash is often used for video, slideshows and interactive features on a website. However, search engines can’t “see” any content that’s rendered in Flash.

Many websites have everything in Flash. It can look great to human visitors, but to search engines, it looks like the website consists of a single web page – and one with very little content, at that. If the search engines think your entire site consists of a single page, they’ll think your site doesn’t have much useful content, and won’t rank your site high in search results.

Google has improved its crawler’s ability to “see” what’s in a Flash object, especially if the web designer has followed some straightforward rules.

Still, it’s not certain that all text rendered in Flash will be accessible to Googlebot. At the risk of repeating myself, let me remind you that 30 percent of searchers don’t use Google. Do you really want to fence out a third of your potential customers?

Bottom line: Use Flash for decorative elements. Render your links and navigation menus in HTML, so search engine bots can see them.
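
As a rough sketch of what that separation can look like (file names and link targets are placeholders), the decorative piece lives in a Flash object while the navigation stays in ordinary, crawlable HTML:

  <!-- Decorative Flash banner; the fallback content is what non-Flash visitors and bots see -->
  <object data="banner.swf" type="application/x-shockwave-flash" width="600" height="120">
    <p>Seasonal banner text goes here</p>
  </object>

  <!-- Navigation stays in plain HTML so every page gets a crawlable link -->
  <ul class="nav">
    <li><a href="/products/">Products</a></li>
    <li><a href="/contact/">Contact</a></li>
  </ul>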

Silverlight

This technology, created by Microsoft Corp., enables rich media experiences similar to what you can do with Flash. Googlebot has problems seeing the text and links in Silverlight.

Just as with Flash, you’re best advised to use Silverlight for decorative purposes, and use HTML to render links and navigation menus.

Band-Aid Solutions

Some web designers apply a Band-Aid solution to the problems caused by rendering navigation menus in JavaScript, Flash, Silverlight or Ajax. They’ll create an HTML sitemap with links to all the pages, and sometimes submit an XML sitemap to the search engines.
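
For reference, an XML sitemap is just a plain list of URLs in the standard sitemaps.org format; a minimal one (with a placeholder domain) looks like this:

  <?xml version="1.0" encoding="UTF-8"?>
  <urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">
    <url>
      <loc>http://www.example.com/</loc>
    </url>
    <url>
      <loc>http://www.example.com/products/widgets/</loc>
    </url>
  </urlset>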

These sitemaps will, in fact, allow search engines to see all the pages on your site. However, the search engines still won’t be able to see how many pages on your site link to any given page. That’s important information – the number of internal links to a page tells search engines how important that page is.

If your main navigation menu is in HTML or CSS, and all your major pages have the same navigation menu, then all your important pages will be linked from many pages on your site. Minor pages on your site will have just one or two links from specific pages. The variation in the number of links to each page tells search engines very clearly which are the most important pages on your site.

If, on the other hand, your navigation menu is entirely in Flash or JavaScript, and you’ve got a sitemap as a Band-Aid solution, the only internal link to each major page that search engines can see will be from the sitemap. That gives each page on your site just one link, making it appear to a search engine bot that each page is as important as every other. That’s not accurate, and means that your most important pages won’t show up as high in search results as they should.

Google Webmaster Tools can tell you how many pages on your site link to any other page. Log in to Google Webmaster Tools, click on Your Site On The Web, then click Internal Links.

Pages Accessible Only by Forms

Some sites have pages that can be reached only by filling out a form. For instance, one of the largest automobile insurance companies in the world used to have a simple form on its home page that asked for your postal code. You’d fill that out, click on Submit, and be directed to the portion of the insurer’s site that dealt with your region.

It sounds logical, but search engine crawlers don’t type in postal codes, and they don’t click on Submit. To the search engines, this insurer’s site looked like just a single page – and a pretty boring one, at that.

Search forms pose a similar problem. While this is a tremendously useful way for a human to find information on your site, it’s not a navigation method the crawlers can use. Crawlers don’t type words into a search box, and they don’t click on a Search button. The solution? Keep the search form – it’s great for your human visitors. Add a sitemap, and submit to search engines an XML sitemap that links to every page you want indexed.
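
In the same spirit, it also helps to put a plain set of HTML links near the form, pointing into the sections the form would lead to. Here’s a rough sketch with made-up URLs; it’s just one example of pairing a form with crawlable links, not the only way to do it:

  <!-- The search form stays: it’s great for people -->
  <form action="/search" method="get">
    <input type="text" name="q" placeholder="Search our site">
    <input type="submit" value="Search">
  </form>

  <!-- Plain HTML links give crawlers a path into the same content -->
  <ul>
    <li><a href="/help/billing/">Billing</a></li>
    <li><a href="/help/shipping/">Shipping</a></li>
    <li><a href="/help/returns/">Returns</a></li>
  </ul>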

How to Check If You Have a Crawl Issue

Luckily, there are a number of tools that can tell you which web pages and links pose problems for search engine crawlers. A few of my favorites:

Xenu Link Sleuth - Download this one for free. It can also create an XML sitemap for you.

SEOmoz Crawl Test - Part of a tremendous suite of site analysis features available to SEOmoz PRO subscribers.

AboutUs Website Visibility Report - You can quickly check how many pages of your site the major search engines have indexed. If the number is lower than you think it should be, you can investigate further.

