Sacramento Website Design and Computer Support


Call4GEEKS! Integrated Approach to
Search Engine Optimization

Step 1 - Improve and Maintain Site Quality

Maintaining site quality will maximize SEO visibility and usability.

Our Integrated Approach considers site quality a crucial element to effective promotion campaigns and traffic analysis. Website content is the foundation upon which to build an online business. The higher the quality of this material, the more successful the business will be.

Our SEO campaign starts with a complete site quality assessment, especially in the following situations:

  • When we provide SEO / SEM services and are hired to promote a Web site already created by somebody else. In this case, we work in conjunction with the site development and administration team to properly prepare the Web site for the SEO campaign. The site quality reports we obtain with the help of software will be the starting point in organizing this teamwork.
  • When we have just developed a Web site of a considerable size but cannot afford the time or resources to double-test every page and function.
  • When a Web site we're working with uses a dynamic technology and/or is constantly changed and updated with new materials, products, news etc.
  • When the Web site we're working with is developed, maintained and managed by a group of people on an ongoing basis. Even when the team is well coordinated, this usually means site quality will still have some gaps.

Website audiences consist of both human visitors and search engine spiders. We first ensure the site is spider-compatible: no broken links, no missing META and TITLE tags, and no other gaps in the elements important to search engines.

Once we've repaired all errors and breaches that could damage site visibility, we look for usability flaws such as missing or incorrectly displayed images, broken links (which frustrate human visitors as well as spiders), and missing JavaScript includes or CSS styles. Usability problems of this kind mean the site will not display or function correctly.

Step 2 - Detect and Fix Site Visibility Problems

In terms of our Integrated Approach, the term Visibility refers to how well a page's source code complies with the requirements and needs of search engine spiders. This is based on the fact that spiders are computer programs instructed to parse HTML pages, analyze certain page areas, and take certain actions, such as requesting other pages by following the hyperlinks found on the page.

We know specific page areas bear a certain importance for spiders and thus play a role in overall search engine visibility. Another conclusion is that consistent site structure is an essential factor for spiders to be able to effectively crawl a site.

Following are the main visibility problems that are usually encountered on a website (especially large sites):

  • Broken links
  • Broken redirects
  • Missing ALT attributes
  • Missing TITLE and META (keywords and description) tags
  • Old pages which are not regularly updated
  • Deep pages (far from the home page or from the page where the spider enters your site)
  • Excessive length of TITLE and META tags

Most of these problems (except broken links and redirects) will not frustrate your human visitors; however, these flaws can damage your positions in the SERPs (search engine results pages) and the indexing of your whole site by search engine spiders.

Broken links

Broken links are a webmaster’s common scourge, especially with large and dynamic sites. A search engine robot will not be able to access a page that is hidden behind a broken link. Most robots, however, will not stop spidering your site when they meet a broken internal link, provided they have other links to follow. But if the broken link is intended to point to a strategically important content page, the page won't be indexed, and the problem becomes more serious.
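As a rough illustration (a minimal sketch, not our actual auditing software), a broken-link check boils down to extracting every link from a page and classifying the HTTP status code each target returns. The helper names here are our own invention:

```python
import urllib.request
import urllib.error
from html.parser import HTMLParser
from urllib.parse import urljoin

class LinkExtractor(HTMLParser):
    """Collects the targets of every <a href> on a page."""
    def __init__(self, base_url):
        super().__init__()
        self.base_url = base_url
        self.links = []

    def handle_starttag(self, tag, attrs):
        if tag == "a":
            href = dict(attrs).get("href")
            if href:
                # Resolve relative links against the page's own URL.
                self.links.append(urljoin(self.base_url, href))

def extract_links(html, base_url):
    parser = LinkExtractor(base_url)
    parser.feed(html)
    return parser.links

def is_broken(status):
    # 4xx and 5xx responses mean the spider cannot reach the page.
    return status >= 400

def link_status(url):
    """Returns the HTTP status code for a link target."""
    try:
        with urllib.request.urlopen(url) as resp:
            return resp.status
    except urllib.error.HTTPError as e:
        return e.code
```

A full audit would feed every extracted link back through `link_status` and report any for which `is_broken` is true.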

Redirects and broken redirects

Broken redirects lead to the same results as broken links: spiders will not be able to index the pages hidden behind the broken redirect unless they find those pages through other links or redirects on your web site.

Our Site Quality Auditing software analyzes your pages over the HTTP protocol and records the three-digit response code returned by the server. If the response code begins with 3, the page redirects visitors (including spiders) to another page. The most common redirect codes are 301 (Moved Permanently), 302 (Found), 304 (Not Modified), 305 (Use Proxy), and 307 (Temporary Redirect).

A webmaster may implement a redirect in a number of different ways: by sending the corresponding HTTP headers from server-side code (PHP, ASP, Perl, etc.), by putting instructions into the ".htaccess" file in a directory on the server, or by putting a special META REFRESH tag directly into the HTML code of the page.
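By default most HTTP clients silently follow redirects, so to record the raw 3xx code a redirect checker has to disable that behavior. A minimal sketch using Python's standard library (the helper names are our own; only `HTTPRedirectHandler` and `build_opener` come from `urllib`):

```python
import urllib.request
import urllib.error

class NoRedirect(urllib.request.HTTPRedirectHandler):
    # Returning None tells urllib not to follow the redirect, so the
    # original 3xx response surfaces as an HTTPError we can inspect.
    def redirect_request(self, req, fp, code, msg, headers, newurl):
        return None

REDIRECT_MEANINGS = {
    301: "Moved Permanently",
    302: "Found",
    304: "Not Modified",
    305: "Use Proxy",
    307: "Temporary Redirect",
}

def is_redirect(code):
    # Any response code beginning with 3.
    return 300 <= code < 400

def describe_redirect(code):
    return REDIRECT_MEANINGS.get(code, "Not a redirect")

def raw_status(url):
    """Fetches a URL and returns its status code without following redirects."""
    opener = urllib.request.build_opener(NoRedirect)
    try:
        with opener.open(url) as resp:
            return resp.status
    except urllib.error.HTTPError as e:
        return e.code
```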

Missing ALT attributes

ALT attributes are used within "<IMG>" HTML tags to describe images textually when the browser cannot display the image. The ALT attribute can also provide additional information when a user hovers the mouse cursor over a displayed image.

ALT attributes have also become important nutrition for search engine spiders, because they treat the first several ALT attributes on an HTML page as a separate valuable page area with its own keyword prominence, weight, proximity and frequency.

Therefore, if your Web site has images without ALT attributes, your rankings will potentially suffer.
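Detecting images without ALT text is easy to automate. Here is a minimal sketch using only Python's standard-library HTML parser (the class and function names are our own, for illustration):

```python
from html.parser import HTMLParser

class AltAuditor(HTMLParser):
    """Records the src of every <img> with a missing or empty alt attribute."""
    def __init__(self):
        super().__init__()
        self.missing_alt = []

    def handle_starttag(self, tag, attrs):
        if tag == "img":
            a = dict(attrs)
            if not a.get("alt"):
                self.missing_alt.append(a.get("src", "(no src)"))

def images_missing_alt(html):
    auditor = AltAuditor()
    auditor.feed(html)
    return auditor.missing_alt
```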

Missing TITLE and META tags

This is probably the most serious gap in the success of any SEO campaign. The TITLE tag and the META keywords and description tags are the primary regions where search engines look for relevant keywords for your site. If some of your pages are missing these tags, we add them before starting any further optimization analysis.
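A page can be screened for these head tags with a short standard-library scan. This is a sketch of the idea, not our production tooling; the names are hypothetical:

```python
from html.parser import HTMLParser

class HeadAuditor(HTMLParser):
    """Notes whether a page has a TITLE tag and keywords/description METAs."""
    def __init__(self):
        super().__init__()
        self.has_title = False
        self.metas = set()

    def handle_starttag(self, tag, attrs):
        if tag == "title":
            self.has_title = True
        elif tag == "meta":
            name = dict(attrs).get("name", "").lower()
            if name in ("keywords", "description"):
                self.metas.add(name)

def missing_head_tags(html):
    auditor = HeadAuditor()
    auditor.feed(html)
    missing = []
    if not auditor.has_title:
        missing.append("TITLE")
    for name in ("keywords", "description"):
        if name not in auditor.metas:
            missing.append("META " + name)
    return missing
```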

Old pages

One of the algorithms Google now applies when ranking Web pages is the "Sandbox" algorithm: pages that haven't been updated for a long time gradually lose their positions in the rankings, even if they initially ranked highly. Google constantly crawls the Web and adds thousands of new pages to its index. These new pages have unique content, but it is their novelty or "newness" that may cause Google to rank them higher than the actual content would otherwise dictate: Google assumes that newer pages probably contain more up-to-date information than older ones.

To retain your high positions, we not only need to constantly build your link popularity, but also update your pages often and add new content pages from time to time. The "Old Pages" report in our software points out those pages of your site that could fall victim to the "Sandbox" algorithm.

Deep Pages

An average search engine spider will not crawl your site deeper than 2 levels away from the home page, and even the most advanced spiders are instructed not to go further than 4 levels deep. By "depth" we mean the number of jumps that must be made from page to page, following links, to reach the destination page from the source page (most often, your home page).

So, if a spider enters your site on your home page, it will read all pages linked to it, then all pages those pages link to, and then most spiders will stop. If your site has any pages that sit deeper than this, your site structure needs optimization.

It's important to understand that the link structure and the file structure are two different things: you may have 500 pages in the root directory, yet some of them may be 5 or more levels deep in terms of link jumps from your home page.
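Link depth defined this way is simply the shortest path in the site's link graph, which a breadth-first search computes directly. A minimal sketch (the `links` mapping and function name are illustrative):

```python
from collections import deque

def link_depths(links, home):
    """Computes each page's depth, i.e. the minimum number of link
    jumps needed to reach it from the home page.

    links: dict mapping each page to the list of pages it links to.
    """
    depths = {home: 0}
    queue = deque([home])
    while queue:
        page = queue.popleft()
        for target in links.get(page, []):
            if target not in depths:
                # First time we reach this page = shortest jump count.
                depths[target] = depths[page] + 1
                queue.append(target)
    return depths
```

Any page whose depth exceeds the crawl limit (2 to 4 jumps, per the discussion above) is a candidate for re-linking closer to the home page.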

TITLE and META tags length

The information placed in the TITLE tag and the "keywords" and "description" META tags is indexed by search engine spiders and used on the search results page, where display space is limited. If these tags are too long, they will be cut off in the search results display. Moreover, search engines will not read more than a certain number of characters in each tag, and some spiders may even treat an excessive number of characters as spamming. Therefore, it's useful to check whether the length of these vital page areas is within sensible limits.
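A length audit reduces to comparing each tag's text against a character budget. The limits below are assumed rules of thumb for illustration, not official search-engine specifications:

```python
# Assumed character budgets, not official search-engine limits.
LIMITS = {"title": 60, "description": 160, "keywords": 255}

def length_warnings(tags):
    """Returns the names of tags whose text exceeds its budget.

    tags: dict mapping tag name ("title", "description", "keywords")
    to that tag's text content.
    """
    return [name for name, text in tags.items()
            if len(text) > LIMITS.get(name, 255)]
```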

Step 3 - Detecting and Fixing Site Usability Issues

The term Usability mainly refers to a site's ease of use and user convenience. Your site may have been designed with rich graphics or loads of content (or both), but if visitors have trouble navigating your site or can't find what they're looking for, you have a usability problem.

Many usability problems aren't detectable with automated website maintenance software. For example, a serious usability problem like a navigation menu that lies "below the fold" on a web page may go unnoticed by most such programs. If we depended on software tools alone and let such a serious flaw remain undetected and uncorrected, we could lose as many as half of your site's visitors before they go any deeper into your site.

However, there are a number of usability problems that are detectable by intelligent software tools and must be corrected. These are as follows:

  • Broken links and redirects
  • Broken inclusions (broken links to scripts and styles for page display and operation)
  • Missing ALT image attributes
  • Slow pages
  • Deep pages
  • Missing images
  • Images that display incorrectly because of missing "WIDTH" and "HEIGHT" attributes

Broken links and redirects

We discussed broken links in the previous step as a visibility issue, but broken links are obviously a serious usability problem as well. Once your visitors hit a broken link, it is very possible that they will leave your site, possibly never to return. They will go back to the search engine's results page and move from there to your competitor's site.

Broken includes

Today it is common to use sophisticated browser-side technologies to make Web pages more interactive, adjust their visual appearance, and carry out data operations. Standard HTML long ago exhausted its ability to meet these requirements, so today's developers commonly link external files with scripting (JavaScript) or presentation (CSS) code to their HTML pages.

Usually, when Web designers or developers build a page, they check it in their browsers as they go to make sure that these browser-side includes are working properly.

Unfortunately, as a result of moving, adding, changing or deleting pages, or reorganizing the folder structure, such includes can become "broken", and pages will then not display or function the way they are meant to. The first impression a visitor receives when coming to such a Web site is that the design is unprofessional. Serious customers will not waste time wondering why; most probably they will simply leave the site.
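An include audit starts by listing every external script and stylesheet a page references; each URL can then be status-checked the same way as an ordinary link. A minimal extraction sketch (names are our own, for illustration):

```python
from html.parser import HTMLParser

class IncludeExtractor(HTMLParser):
    """Collects external script and stylesheet URLs referenced by a page."""
    def __init__(self):
        super().__init__()
        self.includes = []

    def handle_starttag(self, tag, attrs):
        a = dict(attrs)
        if tag == "script" and a.get("src"):
            self.includes.append(a["src"])
        elif tag == "link" and a.get("rel") == "stylesheet" and a.get("href"):
            self.includes.append(a["href"])

def page_includes(html):
    parser = IncludeExtractor()
    parser.feed(html)
    return parser.includes
```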

Missing ALT attributes

In terms of usability, omitting ALT attributes is not as big a problem as it is for site visibility. Still, textual descriptions of images are helpful for visitors. In any case, remember that the primary audience for your ALT attributes is made up of search engine spiders.

Slow pages

As the term implies, slow pages are those that take a long time to load into a browser. The reasons may be that the HTML code is overly massive, or that the page includes too many images, external scripts, or style files. Either way, all the page elements added together make for a huge file size, and the amount of information (measured in bytes) becomes so large that the page takes a long time to load.

World-famous usability expert Jakob Nielsen, in his book "Designing Web Usability", writes that 10 seconds is the critical time limit for holding a visitor's attention on your site. If your page takes longer to load, it becomes irritating, especially if the visitor is consistently faced with long waits at a particular site as multiple pages each struggle to load over many seconds or even minutes.
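A back-of-the-envelope check for this threshold is to sum the byte sizes of the HTML and every referenced asset and divide by a connection speed. The 64 kB/s default below is an assumed figure for illustration, not a measurement:

```python
def estimated_load_seconds(asset_sizes_bytes, bytes_per_second=64_000):
    """Estimates load time as total page weight over connection speed.

    asset_sizes_bytes: byte sizes of the HTML plus each image, script,
    and stylesheet the page references.
    """
    return sum(asset_sizes_bytes) / bytes_per_second

def exceeds_nielsen_limit(asset_sizes_bytes, limit_seconds=10):
    # Flags pages slower than the 10-second threshold cited above.
    return estimated_load_seconds(asset_sizes_bytes) > limit_seconds
```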

Deep pages

Deep pages are those that require many navigation steps to reach; the typical definition is a page four or more link steps away from an entry or home page. Unless visitors are highly motivated to find that deep content, they will leave as soon as they see that finding it requires a long and complicated navigation process. As a rule of thumb, we try to make all of your content available within two, or at most three, link steps.

We analyze your site for any deep pages and re-structure the site to link from either your home page or a page no more than one link away from your home page.

Missing images

Missing images are one of the more irritating problems a Web site can have. While insignificant to the search engine spider, this aesthetic flaw creates the impression of an unprofessional, poorly designed site for human visitors. It detracts from the user's browsing experience and diminishes the site owner's conversion rate. It is a lose/lose proposition.

Missing images appear as a result of careless design or reorganization of the file / folder structure. If we are in charge of the maintenance of a site that we did not design, one of our first tasks is a consistency check of the entire site. Once done, we will know which images are not displaying correctly and be able to immediately correct the problem.

We analyze your site for any missing images and make sure that each image has the correct height and width included in its "<IMG>" element.
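The WIDTH/HEIGHT check is another scan that automates well. A minimal standard-library sketch (the class and function names are illustrative):

```python
from html.parser import HTMLParser

class DimensionAuditor(HTMLParser):
    """Records the src of every <img> lacking a width or height attribute."""
    def __init__(self):
        super().__init__()
        self.missing_dims = []

    def handle_starttag(self, tag, attrs):
        if tag == "img":
            a = dict(attrs)
            if "width" not in a or "height" not in a:
                self.missing_dims.append(a.get("src", "(no src)"))

def images_missing_dimensions(html):
    auditor = DimensionAuditor()
    auditor.feed(html)
    return auditor.missing_dims
```

Declaring both attributes lets the browser reserve the image's space while the page loads, so the layout does not shift even if the image itself is slow or missing.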

Contact us for a QUOTE or call us today at 1-800-543-9897.

  Commercial | Residential | Web Design | Contact Us                                               © Copyright 2011  Call4GEEKS! Sacramento Website Design & IT Services