Archive

Archive for March, 2008

Free Open Source ICO File Format for Photoshop

March 25th, 2008

Once in a blue moon, a web developer needs to create or rework the favicon for the website they are working on. What are the alternatives? Download shareware from download.com? Any of those applications requires an installation, and most of them start nagging you within a month or even a week if you haven’t purchased a license. I always try to avoid such ad-hoc installations, because you never know what else you might be installing on your machine. In other words, it’s not safe or stable.
A nice, free, and open source alternative is the open source ICO file format plugin for Photoshop. This little plug-in works on Mac and PC, even on Vista, and it supports Photoshop versions up to CS2. If you have Photoshop installed on your machine, all you have to do is download the zip and copy the plugin (a single file) into the right folder. On my laptop the path to the folder was:
C:\Program Files\Adobe\Adobe Photoshop CS2\Plug-Ins\File Formats

Reopen Photoshop and you are done! Drag and drop an .ico file to open it, or save your image via “Save As…” and choose the ICO file format.

If only editing icons were a bit easier than playing with pixels. On a side note, what happened to transparency in icons? That’s a question for another day…

Web Design, Web Development, Web-based User Interfaces

3 Pitfalls to Avoid for a Faster Ektron CMS400 Website

March 17th, 2008

Server performance is one of the most important aspects of a website today. Users expect an immediate response when clicking around your site; even a 3.5-second delay may send them somewhere else. Search engine crawlers (like Google’s) will also rank you lower as a result of high latency. Hence, it is not only good practice to deliver a faster website, it’s a necessity. Recently, we assisted our clients with server performance on websites running Ektron CMS400 v7.0.4. Here are the three main performance pitfalls we noted after testing and tracking down the causes of the delays using the Trace technique in .NET:

1. Avoid XML/XSLT Transformations for Control Output

After researching the cause of a latency of more than 2 seconds on every page refresh, we discovered that about 50% of it occurred during Page_Load. A more thorough investigation revealed that this delay was happening during the XSLT transformations of all the controls on the page.
By caching these controls (a partial solution that we do not recommend on its own) and changing the way controls are rendered onto the page, we were able to cut the latency to less than half. We therefore recommend building your Ektron site with the basic Ektron controls; if you need a special way to present the information, gather the data through the Ektron API, process it programmatically, and generate the display from the code-behind. In other words, avoid XSLT altogether.
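
For illustration, here is a minimal code-behind sketch of this approach. The GetPressReleases() helper and PressItem class are hypothetical stand-ins for whatever calls your Ektron API version exposes; the Repeater’s ItemTemplate in the markup takes care of the display.

using System;
using System.Collections.Generic;
using System.Web.UI;
using System.Web.UI.WebControls;

public partial class PressRoomPage : Page
{
    protected Repeater PressList;   // declared as an <asp:Repeater> in the .aspx markup

    protected void Page_Load(object sender, EventArgs e)
    {
        // Gather the data once in code instead of handing raw XML to an XSLT.
        List<PressItem> items = GetPressReleases();
        PressList.DataSource = items;
        PressList.DataBind();       // the Repeater's ItemTemplate handles the display
    }

    // Hypothetical helper: on a real site this would call the Ektron content API.
    private List<PressItem> GetPressReleases()
    {
        List<PressItem> items = new List<PressItem>();
        items.Add(new PressItem("Sample press release", "/press/sample-release.aspx"));
        return items;
    }
}

public class PressItem
{
    public PressItem(string title, string url) { Title = title; Url = url; }
    public string Title { get; set; }
    public string Url { get; set; }
}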

2. Make Use of the Flex Menu Ektron Control

Most of the Ektron sites that we’ve had the chance to work on were structured similarly: the main menu was a set of multi-level menus, all rendered by a style-specific XSLT. In some cases, before running through the XSLT, a script would pass through the menu items to find the one that needed to be ‘selected’.
Why should we reinvent the wheel?
If you read Ektron’s documentation, you will find a few menu controls that can be very handy: DHTMLMenu, Menu, SmartMenu, and FlexMenu. Each one has its advantages and disadvantages. In short:

  • DHTMLMenu: My least favorite. It uses too much JavaScript and doesn’t render nicely for SEO
  • Menu: The simplest one to use for basic menu systems
  • SmartMenu: I like this menu because it’s a styled, nested unordered list. It can also support Section 508 and highlights the selected menu item with a client-side script, which is a lot more performance friendly
  • FlexMenu: Our tests indicate that this menu control is the fastest if you have a sophisticated XSLT. It seems like Ektron simply provided a flexible menu control specifically for XML transformations.

We recommend using the SmartMenu, and if you insist on using XSLT to display a menu, use the FlexMenu instead.
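
To give a rough idea of the kind of markup a SmartMenu-style control produces (a styled, nested unordered list with the current page highlighted), here is an illustrative sketch. The MenuNode class and BuildMenu helper are our own placeholders, not part of the Ektron API, and the selected class is applied server-side here purely for brevity.

using System.Collections.Generic;
using System.Text;

public class MenuNode
{
    public string Text;
    public string Url;
    public List<MenuNode> Children = new List<MenuNode>();
}

public static class MenuRenderer
{
    // Recursively emits nested <ul>/<li> markup; the item matching currentUrl
    // gets a "selected" class so plain CSS can style and highlight it.
    public static string BuildMenu(IList<MenuNode> nodes, string currentUrl)
    {
        StringBuilder sb = new StringBuilder("<ul>");
        foreach (MenuNode node in nodes)
        {
            string css = node.Url == currentUrl ? " class=\"selected\"" : "";
            sb.AppendFormat("<li{0}><a href=\"{1}\">{2}</a>", css, node.Url, node.Text);
            if (node.Children.Count > 0)
                sb.Append(BuildMenu(node.Children, currentUrl));
            sb.Append("</li>");
        }
        sb.Append("</ul>");
        return sb.ToString();
    }
}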

3. Make Use of the .NET Caching Mechanism

A simple thing for developers to set up, isn’t it? Well, you can’t imagine how many sites we’ve seen without any caching beyond the default settings. There is so much more to cache; it is almost a crime not to make use of it in our technology-driven age.
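
As a reminder of how little code this takes, here is a minimal data-caching sketch in a code-behind. GetManufacturers() stands in for whatever expensive CMS or database call your page makes, and the cache key and expiration are arbitrary examples.

using System;
using System.Collections.Generic;
using System.Web.UI;

public partial class ProductsPage : Page
{
    protected void Page_Load(object sender, EventArgs e)
    {
        // Check the ASP.NET cache first; only run the expensive call on a miss.
        List<string> manufacturers = Cache["manufacturers"] as List<string>;
        if (manufacturers == null)
        {
            manufacturers = GetManufacturers();   // e.g. a CMS API or database call
            Cache.Insert("manufacturers", manufacturers, null,
                         DateTime.Now.AddMinutes(10),                      // absolute expiration
                         System.Web.Caching.Cache.NoSlidingExpiration);
        }
        // ...bind manufacturers to a control here...
    }

    // Placeholder for the expensive lookup this page would normally perform.
    private List<string> GetManufacturers()
    {
        return new List<string>(new string[] { "Acme Roasters", "Contoso Coffee" });
    }
}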

Ultimately, the above list covers just a few of the main performance issues we have found on many Ektron sites. These items alone can improve a site’s performance by up to 50%, but the list is far from complete. Hardware, paging, deadlocks, the server environment, and even bandwidth all need to be reviewed in order to improve performance.

Fast Surfing!

.NET Framework, Ektron, Performance Optimization

10 Key Search Engine Optimization Items – Feeding the Spiders

March 10th, 2008

Most of today’s internet traffic originates with a search, and it’s no secret that most of that search traffic comes from Google. Hence, when you start analyzing and tweaking your site to gain more traffic, you target Google’s policies on improved rankings.

Google looks at the basic HTML page from a standards point of view. In other words, Google’s crawler/spider tries to identify key information that is labeled and described correctly. Search engine spiders appreciate the extra information and consider well-labeled, well-described content extra ‘tasty’. The following are key items to consider when building any web site in today’s search-engine-centric world.


1. Search Engine Friendly URLs (SEFs)

Try to use a good folder and file naming structure: the folder should describe the category or section of the site, and the file name should summarize the subject of the page in a few words (typically 2-5). Many websites today are database driven, whether eCommerce systems or Content Management Systems (CMS); however, any respectable system has a way to make sure that the URLs are self-describing and conform to the site’s content logic. Make use of Apache’s mod_rewrite module or .NET’s URL aliasing mechanism. Additionally, many content management systems like Ektron CMS400, Joomla CMS, SiteCore, and SiteFinity allow the content editor to define the page URL or set up an automatic rule to generate it when the page is published.
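
As a rough sketch of what URL aliasing looks like on the .NET side, the module below rewrites a friendly URL to its physical page. The mapping table and URLs are made up for illustration, the module still has to be registered under <httpModules> in web.config, and a real site would normally lean on the CMS’s built-in aliasing or mod_rewrite instead.

using System;
using System.Collections.Generic;
using System.Web;

public class SefUrlModule : IHttpModule
{
    // Friendly URL -> physical page; in a real site this map would come from the CMS.
    private static readonly Dictionary<string, string> Map = CreateMap();

    private static Dictionary<string, string> CreateMap()
    {
        Dictionary<string, string> map = new Dictionary<string, string>(StringComparer.OrdinalIgnoreCase);
        map.Add("/coffee/wholesale-beans.aspx", "/product.aspx?id=42");
        return map;
    }

    public void Init(HttpApplication app)
    {
        app.BeginRequest += delegate
        {
            string target;
            if (Map.TryGetValue(app.Context.Request.Path, out target))
                app.Context.RewritePath(target);   // visitors and spiders keep seeing the friendly URL
        };
    }

    public void Dispose() { }
}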

2. Page Titles – the <title> HTML tag

The second place the crawler/spider looks for additional information about the page is the page title. As with the SEF, we recommend a reverse hierarchy of the page with respect to its section and category. Something like the following works great:
{Page Subject} - {Section} - {Site Name} - {Site Slogan}
If the Page Title matches the SEF – even better!
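
In ASP.NET this pattern is easy to apply from the code-behind (the page needs a <head runat="server"> for Page.Title to work); the subject, section, and site strings below are placeholders.

protected void Page_Load(object sender, EventArgs e)
{
    // Placeholder values; on a CMS-driven page these come from the content item.
    string pageSubject = "Wholesale Coffee Beans";
    string section = "Products";
    Page.Title = pageSubject + " - " + section + " - Acme Coffee - Fresh Roasted Daily";
}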

3. Meta Tags – the description and the keywords

Yes, it is true that search engines no longer rely solely on these hidden pieces of information. However, sites with well-written meta tags have a noticeable advantage over sites that disregard these tags completely. Keep a short list of keywords for the page (not for the site!), typically 2-5 main keywords per web page, and ideally give each web page its own dedicated keywords. The description is a short summary of the web page; 2-3 sentences will usually suffice.
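
Here is one way to emit per-page meta tags from ASP.NET code-behind (again assuming <head runat="server">); the keyword and description strings are placeholders and would normally come from the CMS content item.

using System;
using System.Web.UI;
using System.Web.UI.HtmlControls;

public partial class ProductPage : Page
{
    protected void Page_Load(object sender, EventArgs e)
    {
        HtmlMeta keywords = new HtmlMeta();
        keywords.Name = "keywords";
        keywords.Content = "wholesale coffee, coffee beans, restaurant coffee";

        HtmlMeta description = new HtmlMeta();
        description.Name = "description";
        description.Content = "Wholesale coffee beans roasted daily for restaurants and grocery stores.";

        // Add both tags to the page <head> so each page carries its own meta data.
        Header.Controls.Add(keywords);
        Header.Controls.Add(description);
    }
}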

4. Breadcrumbs – another reinforcement of the subject of the web page

Breadcrumbs are the little list of words describing the path to the current page, usually located under the header at the top of the page. This repetition of the information is what really assists with search engine optimization. We typically recommend the reverse order of the title, something like:
{Home Page - optional} - {Section} - {Category} - {Page}
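
A small helper along these lines can turn that path into breadcrumb markup; the crumb data and the BreadcrumbBuilder class are hypothetical and would normally be driven by the site’s folder structure in the CMS.

using System.Collections.Generic;
using System.Text;

public static class BreadcrumbBuilder
{
    // Each pair is (text, url); the last entry is the current page and is not linked.
    public static string Build(IList<KeyValuePair<string, string>> crumbs)
    {
        StringBuilder sb = new StringBuilder();
        for (int i = 0; i < crumbs.Count; i++)
        {
            if (i > 0) sb.Append(" &gt; ");
            if (i == crumbs.Count - 1)
                sb.Append(crumbs[i].Key);                                         // current page, plain text
            else
                sb.AppendFormat("<a href=\"{0}\">{1}</a>", crumbs[i].Value, crumbs[i].Key);
        }
        return sb.ToString();
    }
}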

5. Performance Optimization – allowing the spiders to crawl faster

One of the reasons why many of the top-ranking sites are simple informational or basic HTML pages is that they are served extremely quickly. Even in today’s world of large bandwidth pipes, many internet users still (believe it or not) use dial-up modems or DSL with very limited bandwidth. While it is true that bandwidth has improved in recent years, the pace of improvement has not come close to the pace at which pages grow in size due to graphics and special effects. Hence, performance is critical. Make sure to use tools like YSlow (a Firefox extension) and a performance analyzer to measure and improve your site’s performance.

6. Valid, Semantic XHTML and CSS

Crawlers, unlike browsers, depend on the validity of the HTML that builds the page. While most browsers tolerate badly structured web pages, spiders are known to penalize such defects. In addition to the validity of the web page, it is important to make use of semantic HTML. Semantic HTML labels, tags, and as a result styles web pages in a way that describes the content rather than its placement or styling. In other words, if a specific side box contains information about manufacturers, label the div tag with class="manufacturers" or id="manufacturers" instead of "second_right" or "brown_box", etc.

7. Avoid Nested Tables, or Tables for Layout as a Rule of Thumb

Table layouts and nested tables create significant overhead of unnecessary HTML tags and clutter, which makes it hard for spiders to differentiate between important and unimportant information. Additionally, with well-formatted XHTML and CSS it is fairly easy to bring important information to the top of the page markup, thereby giving it higher weight when indexed.

8. CSS and Ordered List Menus Instead of JavaScript or Flash Based Menus

Yes, it is cool to have a flashy animated menu or a smooth-transitioning JavaScript-based menu, but the bottom line will be affected: spiders cannot read these menus. A good main menu shows on all pages in the same place; a Flash or JavaScript menu not only fails to guide the spiders, it also forces them to skip important pages. From the spider’s point of view, if there is no link to a page, the page doesn’t exist.

9. Bottom Links – the Best Place to Remind Spiders of Additional and Important Pages

In addition to the main menu links, the bottom of the page is a great location for links that were left out. Bottom links take advantage of page length that is unimportant from a UI perspective but key for SEO. Simply add or repeat links to the important pages at the bottom of the home page, and even at the bottom of every page on the site. This is also the place where you can try different variations of the link text:

<a href="mylink">Wholesale</a>
...is very different from:
<a href="mylink">Wholesale Coffee for Restaurants and Grocery Stores</a>

10. Reduce the Size of JavaScript and CSS Files

There is really no need to have inline JavaScript or CSS cluttering web pages anymore. The free, open source, and commercially available tools allow any professional web developer to collect, consolidate, minify, and compress JavaScript and CSS. An additional benefit of extracting this code into separate files is that, once the browser has read them, they are cached for a while (each browser has its own caching algorithm and default cache lifetime), which speeds up the site and uses less bandwidth. One improvement means double the fun!
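
One way to serve consolidated CSS from ASP.NET is a small HTTP handler along these lines. The file names and the idea of mapping it to something like styles.ashx are assumptions, and actual minification is left to an external tool.

using System;
using System.IO;
using System.Web;

public class CombinedCssHandler : IHttpHandler
{
    // Placeholder stylesheet list; combine into one response to cut request count.
    private static readonly string[] Files = { "~/css/layout.css", "~/css/menu.css", "~/css/forms.css" };

    public void ProcessRequest(HttpContext context)
    {
        context.Response.ContentType = "text/css";
        foreach (string file in Files)
        {
            context.Response.Write(File.ReadAllText(context.Server.MapPath(file)));
            context.Response.Write("\n");
        }
        // Let browsers cache the combined file so repeat views skip the download.
        context.Response.Cache.SetCacheability(HttpCacheability.Public);
        context.Response.Cache.SetExpires(DateTime.Now.AddDays(7));
    }

    public bool IsReusable { get { return true; } }
}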

So now, go back to your drawing board, and spice it up for the spiders!

Search Engine Optimization (SEO)