Sunday, February 6, 2011

Optimising your web site


Search engine optimization (SEO) is the process of improving the visibility of a website or a web page in search engines via the "natural" or unpaid ("organic" or "algorithmic") search results. Other forms of search engine marketing (SEM) target paid listings. In general, the earlier (or higher on the page) and the more frequently a site appears in the search results list, the more visitors it will receive from the search engine. SEO may target different kinds of search, including image search, local search, video search and industry-specific vertical search engines. All of this gives a website a stronger web presence.

Optimizing a website may involve editing its content, HTML and associated coding, both to increase its relevance to specific keywords and to remove barriers to the indexing activities of search engines. Promoting a site to increase the number of backlinks, or inbound links, is another SEO tactic.

Because effective SEO may require changes to the HTML source code of a site and site content, SEO tactics may be incorporated into website development and design. The term "search engine friendly" may be used to describe website designs, menus, content management systems, images, videos, shopping carts, and other elements that have been optimized for the purpose of search engine exposure.

Another class of techniques, known as black hat SEO or spamdexing, uses methods such as link farms, keyword stuffing and article spinning that degrade both the relevance of search results and the user experience. Search engines look for sites that employ these techniques in order to remove them from their indices.

Saturday, February 5, 2011

Some history


Webmasters and content providers began optimizing sites for search engines in the mid-1990s, as the first search engines were cataloging the early Web. Initially, all webmasters needed to do was submit the address of a page, or URL, to the various engines which would send a "spider" to "crawl" that page, extract links to other pages from it, and return information found on the page to be indexed.[1]

Site owners started to recognize the value of having their sites highly ranked and visible in search engine results, creating an opportunity for both white hat and black hat SEO practitioners.

Early versions of search algorithms relied on webmaster-provided information such as the keyword meta tag, or index files in engines like ALIWEB. Meta tags provide a guide to each page's content. Using meta data to index pages was found to be less than reliable, however, because the webmaster's choice of keywords in the meta tag could potentially be an inaccurate representation of the site's actual content. Inaccurate, incomplete, and inconsistent data in meta tags could and did cause pages to rank for irrelevant searches.[5] Web content providers also manipulated a number of attributes within the HTML source of a page in an attempt to rank well in search engines.[6]

By relying so heavily on factors such as keyword density, which were exclusively within a webmaster's control, early search engines suffered from abuse and ranking manipulation.

Search engines responded by developing more complex ranking algorithms, taking into account additional factors that were more difficult for webmasters to manipulate.

By 2004, search engines had incorporated a wide range of undisclosed factors in their ranking algorithms to reduce the impact of link manipulation. Google says it ranks sites using more than 200 different signals.[10] The leading search engines, Google and Yahoo, do not disclose the algorithms they use to rank pages.

In 2005 Google began personalizing search results for each user. Depending on their history of previous searches, Google crafted results for logged-in users.[14] In 2008, Bruce Clay said that "ranking is dead" because of personalized search: it would become meaningless to discuss how a website ranked, because its rank would potentially be different for each user and each search.[15]

Thursday, February 3, 2011

Increasing your prominence


A variety of methods can increase the prominence of a webpage within the search results. Cross linking between pages of the same website to provide more links to the most important pages may improve its visibility.[37] Writing content that includes frequently searched keyword phrases, so as to be relevant to a wide variety of search queries, will tend to increase traffic.[37] Updating content so as to keep search engines crawling back frequently can give additional weight to a site. Adding relevant keywords to a web page's meta data, including the title tag and meta description, will tend to improve the relevancy of a site's search listings, thus increasing traffic. URL normalization of web pages accessible via multiple URLs, using the "canonical" tag[38] or 301 redirects, can help make sure that links to different versions of the URL all count towards the page's link popularity score.
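As a brief illustration (the domain and file names below are invented for this example), the preferred version of a page can be declared with a canonical link element in its head, or a duplicate URL can be 301-redirected to it, for instance with an Apache mod_alias rule:

    <!-- In the <head> of every duplicate version of the page: -->
    <link rel="canonical" href="http://www.example.com/rarest-baseball-cards/" />

    # Or, in an Apache .htaccess file, permanently redirect the old URL:
    Redirect 301 /page1.htm http://www.example.com/rarest-baseball-cards/

Either way, search engines are told which single URL should accumulate the link popularity.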


SEO techniques are classified by some into two broad categories: techniques that search engines recommend as part of good design, and those techniques that search engines do not approve of and attempt to minimize the effect of, referred to as spamdexing. Some industry commentators classify these methods, and the practitioners who employ them, as either white hat SEO, or black hat SEO.[39] White hats tend to produce results that last a long time, whereas black hats anticipate that their sites will eventually be banned once the search engines discover what they are doing.[40]
An SEO tactic, technique or method is considered white hat if it conforms to the search engines' guidelines and involves no deception. As the search engine guidelines[26][27][28][41] are not written as a series of rules or commandments, this is an important distinction to note. White hat SEO is not just about following guidelines; it is about ensuring that the content a search engine indexes and subsequently ranks is the same content a user will see.

Some design guidelines from Google


  • Make a site with a clear hierarchy and text links. Every page should be reachable from at least one static text link.

  • Offer a site map to your users with links that point to the important parts of your site. If the site map has an extremely large number of links, you may want to break the site map into multiple pages.

  • Keep the links on a given page to a reasonable number.

  • Create a useful, information-rich site, and write pages that clearly and accurately describe your content.

  • Think about the words users would type to find your pages, and make sure that your site actually includes those words within it.

  • Try to use text instead of images to display important names, content, or links. The Google crawler doesn't recognize text contained in images. If you must use images for textual content, consider using the "ALT" attribute to include a few words of descriptive text.

  • Make sure that your <title> elements and ALT attributes are descriptive and accurate (a short sketch of several of these points follows this list).

  • Check for broken links and correct HTML.

  • If you decide to use dynamic pages (i.e., the URL contains a "?" character), be aware that not every search engine spider crawls dynamic pages as well as static pages. It helps to keep the parameters short and the number of them few.
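As a sketch of several of these guidelines together (the page content here is invented purely for illustration):

    <html>
    <head>
      <title>The Rarest Baseball Cards - Smith's Card Shop</title>
    </head>
    <body>
      <h1>The Rarest Baseball Cards</h1>
      <!-- A static text link is crawlable; a link buried in an image or script may not be -->
      <p><a href="honus-wagner.htm">The 1909 Honus Wagner card</a> remains the most valuable.</p>
      <!-- If an image must carry meaning, describe it in the ALT attribute -->
      <img src="honus-wagner-card.jpg" alt="1909 T206 Honus Wagner baseball card" />
    </body>
    </html>

Descriptive title, text links and ALT attributes give the crawler the same information a sighted visitor gets.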
 
What next?
Because search engines base their judgement of what is unique on the content of the page, the order in which duplicates were discovered, and the volume and quality of links and citations, Greenlight’s Bunn advises that the best way to insulate against any future steps search engines may take against duplicate content is for sites to ensure their pages:
  • have a sufficient amount of original text content, supported by images, videos and other multimedia as appropriate.
  • are rapidly indexed by the search engines.  To achieve this, the site should be regularly linked to, necessitating some kind of link acquisition strategy, and new pages should be submitted to the engines via XML sitemaps (a minimal example follows this section) and featured on the homepage or another highly authoritative hub page in the respective site (such as a category homepage) until they have been indexed.  If the site has a blog, make sure it pings the search engines when a new post is published (most do), and then use the blog to publish or link to new content on the site.
  • are linked to and/or cited directly by third-party sites.  Since it is rarely practical or economical to actively build links for every page in a site, consideration should be given regularly to why a third party would naturally link to the site’s pages or share them on Twitter, for example.  If a firm cannot think of a good reason, it may need to go back to the drawing board.
“This problem (low value and duplicate content) is one of such perpetual nature that the search engines are sure to revisit it again in the future.”
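To illustrate the sitemap submission mentioned above, here is a minimal sitemap.xml following the sitemaps.org protocol; the URL, date and values are placeholders to be replaced with your own:

    <?xml version="1.0" encoding="UTF-8"?>
    <urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">
      <url>
        <loc>http://www.example.com/rarest-baseball-cards/</loc>
        <lastmod>2011-02-05</lastmod>
        <changefreq>weekly</changefreq>
        <priority>0.8</priority>
      </url>
    </urlset>

One <url> entry is listed per page; the file is then submitted through each engine's webmaster tools or referenced from robots.txt.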

Wednesday, February 2, 2011

Simple elements you can optimise yourself


Optimising your web site for search engines is often a matter of making small modifications to your site.
Below is an overview of some of the types of optimisation that your site should include.

Optimisation TIPS

Unique, accurate page titles.  A good title will include the name of your site, reflect what you do, and contain some keywords that accurately describe the focus of that particular page.  You also want the title to read well: not just your name and a list of tags, but a short descriptive sentence.

Description meta tag. Search engines use the description meta tag to gain a summary of what the page is about.  Some search engines only use snippets of the description in the search results, whilst others will use the entire description.
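For example, combining these two tips in a page head (the shop name and wording are invented):

    <head>
      <title>Rare Baseball Cards for Sale - Smith's Card Shop, Boston</title>
      <meta name="description" content="Smith's Card Shop buys and sells rare baseball cards, specialising in pre-war tobacco cards." />
    </head>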

File structure of your site.  Use good descriptive names for the folders and files inside your site to allow search engines to rank your pages well.  Don’t name pages with names like page1.htm, but rather use a name such as rarest_baseball_cards.htm

Keep navigation simple.  Search engines use spider technology to trawl through your web site, and the ease of this navigation (not too many clicks from the home page to reach important content) will be noted, as search engines like to have a sense of what role each page plays in the bigger picture of the site.

Provide quality content.  Users of your site are more likely to direct others to it via their own blog, Twitter and social networking sites if you provide clear, easy-to-read, informative content.  While you are designing the content for your users, not for search engines, the more links that point back to your site because of your quality content, the higher your ranking climbs.

Use clear anchor names.  Anchors provide links for users to jump directly to specific positions on a page.  Clear anchor names are noted and ranked by search engines.

Use heading tags appropriately.  Headings inside your page are created with heading tags.  Using these correctly allows search engines to treat the text as important content, and improves ranking if you provide appropriate text in your headings.

Use clear image names and tags.  Image names should reflect what is in the image, as search engines read the file names.  The alt attribute attached to images is useful for your visitors, giving them a description of the image (very important if they are not viewing images) and allowing search engines to see that your images reflect your content.
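A combined sketch of the heading, anchor and image tips above (all names invented for illustration):

    <h1>Caring for Vintage Cards</h1>
    <!-- A clear anchor name: users and engines can reach care.htm#storage directly -->
    <h2 id="storage">Storage</h2>
    <p>Keep cards out of direct sunlight and in acid-free sleeves.</p>
    <!-- Descriptive file name plus an alt description of the image -->
    <img src="card-storage-sleeves.jpg" alt="Vintage baseball cards in acid-free sleeves" />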

Promote your site well.  Good practices include using a blog to promote new content and services, putting your site address on all business material the public may encounter, using social networking sites to promote your site, linking to other peer sites in your business community, and adding your business to Google’s Local Business Centre.

Track site usage.  Use a tool such as Google Analytics to regularly review how users reach and behave on your site.
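To set up Google Analytics, the asynchronous tracking snippet that Google currently supplies is pasted just before the closing </head> tag of each page; the UA-XXXXXX-X account ID below is a placeholder for your own:

    <script type="text/javascript">
      var _gaq = _gaq || [];
      _gaq.push(['_setAccount', 'UA-XXXXXX-X']);  // replace with your account ID
      _gaq.push(['_trackPageview']);
      (function() {
        // Load ga.js asynchronously so tracking does not slow the page down
        var ga = document.createElement('script');
        ga.type = 'text/javascript'; ga.async = true;
        ga.src = ('https:' == document.location.protocol ? 'https://ssl' : 'http://www') + '.google-analytics.com/ga.js';
        var s = document.getElementsByTagName('script')[0];
        s.parentNode.insertBefore(ga, s);
      })();
    </script>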

Which Hat do you wear?


White hat advice is generally summed up as creating content for users, not for search engines, and then making that content easily accessible to the spiders, rather than attempting to game the algorithm.

White hat SEO is merely effective marketing: making efforts to deliver quality content to an audience that has requested it. Traditional marketing has long allowed this through transparency and exposure. Search engine algorithms take this into account; Google's PageRank is one example.