
10 Key Search Engine Optimization Items – Feeding the Spiders

March 10th, 2008

Most of today’s internet traffic originates with a search, and it’s no secret that most of that search traffic comes from Google. Hence, when you start analyzing and tweaking your site to attract more traffic, you target Google’s guidelines for improved rankings.

Google looks at the basic HTML page from a standards point of view. In other words, Google’s crawler/spider tries to identify key information that is labeled and described correctly. Search engine spiders appreciate the extra information and consider well-labeled, well-described information extra ‘tasty’. The following are key items to consider when building any web site in today’s search-engine-centric world.


1. Search Engine Friendly URLs (SEFs)

Try to make use of a good folder and file naming structure. The folder should describe the category or section of the site, and the file name should summarize the subject of the page in a few words (typically 2-5). Today, many websites are database driven, whether eCommerce platforms or Content Management Systems (CMS); however, any respectable system has a way to make sure that URLs are self-describing and conform to the site’s content logic. Make use of Apache’s mod_rewrite module or .NET’s URL aliasing mechanism, as sketched below. Additionally, many content management systems like Ektron CMS400, Joomla, SiteCore, and SiteFinity allow the content editor to define the page URL or set up an automatic rule to generate it when the page is published.
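As a minimal sketch, assuming Apache with mod_rewrite enabled, a rule like the following could map a self-describing URL onto a database-driven script (the script name and the slug parameter are invented for illustration):

# Hypothetical .htaccess sketch: a friendly URL like
# /coffee/wholesale-coffee-beans is served by a dynamic script
# without exposing the query string to visitors or spiders.
RewriteEngine On
RewriteRule ^coffee/([a-z0-9-]+)$ /product.php?slug=$1 [L,QSA]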

2. Page Titles – the <title> HTML tag

The second place that the crawler/spider will look for additional information about the page will be in the page title. Similar to the SEF, we recommend using a reverse hierarchy of the page with respect to its section and category. Something like the following would work great:
{Page Subject} - {Section} - {Site Name} - {Site Slogan}
If the Page Title matches the SEF – even better!
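For example, a hypothetical wholesale coffee site might render that pattern as follows (all names invented for illustration):

<title>Wholesale Coffee Beans - Coffee - Acme Roasters - Fresh Roasted Daily</title>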

3. Meta Tags – the description and the keywords

Yes, it is true that search engines no longer rely solely on these hidden pieces of information. However, it is noticeable that sites with well-written meta tags have an advantage over sites that disregard these tags completely. The key is a short list of keywords for the page (not for the site!). Typically, you should have 2-5 main keywords per web page, and ideally each web page has its own dedicated keywords. The description is a short summary of the web page; 2-3 sentences will usually suffice.
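A sketch of what the two tags might look like for a single page (keywords and description invented for illustration):

<meta name="keywords" content="wholesale coffee, coffee beans, restaurant coffee" />
<meta name="description" content="Acme Roasters supplies freshly roasted wholesale coffee beans to restaurants and grocery stores, with volume pricing and fast shipping." />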

4. Breadcrumbs – another reinforcement of the subject of the web page

Breadcrumbs are the little list of words describing the path to the item. They are usually located under the header, at the top of the page. The repetition of this information is what really assists with search engine optimization. We typically recommend reversing the order of the title, something like:
{Home Page - optional} - {Section} - {Category} - {Page}
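In markup, that order might look like the following sketch (the links are invented for illustration):

<div class="breadcrumbs">
  <a href="/">Home</a> &gt;
  <a href="/coffee/">Coffee</a> &gt;
  <a href="/coffee/wholesale/">Wholesale</a> &gt;
  Wholesale Coffee Beans
</div>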

5. Performance Optimization – allowing the spiders to crawl faster

One of the reasons why many of the top ranking sites are simple informational pages or basic HTML pages is that they are served extremely quickly. Even in today’s world of large bandwidth pipes, many internet users still (believe it or not) connect over dial-up modems or DSL with very limited bandwidth. While it is true that bandwidth has improved in recent years, the pace of improvement has not come close to the pace at which pages grow in size due to graphics and special effects. Hence, performance is critical. Make sure to use tools like YSlow (a Firefox extension) and a performance analyzer to measure and improve your site’s performance.
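One common server-side win, assuming Apache with mod_deflate enabled, is to compress text responses before they leave the server; a minimal sketch:

# Hypothetical .htaccess sketch: gzip-compress the page markup and its
# external CSS/JavaScript files on the way out, so spiders and visitors
# download far fewer bytes per request.
AddOutputFilterByType DEFLATE text/html text/css application/javascript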

6. Valid, Semantic XHTML and CSS

Crawlers, unlike browsers, depend on the validity of the HTML that builds the page. While most browsers tolerate badly structured web pages, spiders are known to penalize such defects. In addition to validity, it is important to make use of semantic HTML. Semantic HTML labels, tags, and, as a result, styles web pages in a way that describes the content, not the placement or styling characteristics of the content. In other words, if a specific side box contains information about manufacturers, label the div tag with class="manufacturers" or id="manufacturers" instead of "second_right" or "brown_box", etc.
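A short sketch of the difference (the manufacturer list is invented for illustration):

<!-- Semantic: the name describes what the box contains -->
<div id="manufacturers">
  <h3>Manufacturers</h3>
  <ul>
    <li><a href="/brands/acme/">Acme Coffee Roasters</a></li>
  </ul>
</div>

<!-- Presentational: the name only describes position and styling -->
<div id="second_right" class="brown_box">
  <!-- ...the same content, but the markup tells the spider nothing... -->
</div>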

7. Avoid Nested Tables, or Tables for Layout as a Rule of Thumb

Table layouts and nested tables create significant overhead of unnecessary HTML tags and clutter. The clutter makes it hard for spiders to differentiate between important and unimportant information. Additionally, with well-formed XHTML and CSS, it is fairly easy to bring important information to the top of the page, thereby giving it higher weight when indexed.
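As a sketch of that source-order point: with CSS, the main copy can come first in the HTML even though a sidebar renders beside it (the selectors and widths below are invented for illustration):

<style type="text/css">
  /* The sidebar is positioned visually on the left, but #content
     appears first in the source, so spiders index the main copy first. */
  #content { margin-left: 220px; }
  #sidebar { position: absolute; top: 0; left: 0; width: 200px; }
</style>
<div id="content">Main article text, indexed first.</div>
<div id="sidebar">Secondary links.</div>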

8. CSS and Ordered-List Menus Instead of JavaScript or Flash-Based Menus

Yes, it is cool to have a flashy animated menu or a smoothly transitioning JavaScript-based menu, but the bottom line will suffer: spiders cannot read these menus. A good main menu shows on all pages in the same place; a Flash or JavaScript menu not only fails to guide the spiders but also forces them to skip important pages. From the spider’s point of view, if there is no link to a page, the page does not exist.
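A minimal sketch of a crawlable menu: a plain HTML list that CSS turns into a horizontal bar (the links and id are invented for illustration):

<ul id="main-menu">
  <li><a href="/">Home</a></li>
  <li><a href="/coffee/">Coffee</a></li>
  <li><a href="/wholesale/">Wholesale</a></li>
</ul>
<style type="text/css">
  /* Every link stays plain HTML, so spiders can follow all of them. */
  #main-menu li { display: inline; list-style: none; margin-right: 1em; }
</style>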

9. Bottom Links – the Best Place to Remind Spiders of Additional and Important Pages

In addition to the main menu links, the bottom of the page is a great location to place links that were left out. Bottom links are a good example of using page real estate that is unimportant from a UI perspective but key for SEO. Simply add or repeat links to the important pages at the bottom of the home page, and even at the bottom of every page on the site. This is also the place where you can try different variations of the links:

<a href="mylink">Wholesale</a>
...is very different from:
<a href="mylink">Wholesale Coffee for Restaurants and Grocery Stores</a>

10. Reduce the Size of JavaScript and CSS Files

There is really no need to have inline JavaScript or CSS cluttering web pages anymore. The free, open source, and commercially available tools allow any professional web developer to collect, consolidate, minify, and compress both JavaScript and CSS code. An additional benefit of extracting this code into separate files is that once the browser has read the files, they are cached for a while (each browser has its own algorithms and default cache lengths), which speeds up the site and uses less bandwidth. One improvement means double the fun!
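The end result is a page head that references a couple of consolidated, minified files instead of inline blocks (the file names are invented for illustration):

<!-- Hypothetical sketch: one external stylesheet and one external script,
     both cacheable by the browser across every page of the site. -->
<link rel="stylesheet" type="text/css" href="/css/site.min.css" />
<script type="text/javascript" src="/js/site.min.js"></script>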

So now, go back to your drawing board, and spice it up for the spiders!

