SEO Tricks

My personal experience in content website building, and how I make money online with it. Search Engine Optimization tricks, articles and tutorials about AdSense and the webmaster community.

Saturday, April 15, 2006

A little bit more on SEO

What follows below is just a set of systematized guidelines that worked fine for me and many others on a great load of high-competition keywords across quite a bunch of sites. What I’m perfectly sure of is that they might, and most likely will, eventually make YOUR site a success, provided you don’t treat these lines as tablets of stone and realize that a lot depends on your specifics: the level of competition in your niche market, your potential audience and, generally, this little thingy called brain (I guess I should have placed that at the top of the list:).

Build QUALITY content.

When you ask somebody what really matters for search engines in the first place and he starts speaking of the so-called "key factors" such as meta tags, page title, anchor text, keyword density, header elements, alt tags, etc. – tell him he’s talking rubbish; those things are never decisive on their own.

Although undoubtedly important, they are SPAM in nature WITHOUT QUALITY CONTENT. Long before your site is done, start piling up notes or REAL content of at least 200-500 words for each page, and make sure the content is something a general audience will like. Make your articles up-to-date and topical; don’t use any sort of nerd computer lingo or flamboyant, ostentatious lexemes. You can inundate your site with “widgety widgets” on every line of your code, cramming them into every anchor text on every FFA (free-for-all) link site you might find – but if you DON’T HAVE QUALITY CONTENT you are offering SPAM to visitors, and your site will be butchered one way or another, today or tomorrow. It’s much better not to have a site at all than to enter the web with spam. There’s enough of this bollocks on the web already – don’t add to the dump!

Cleanly Interlinked Internal Pages With Proper Anchor Text.

Rule of thumb – each page on your site should be no further than 3 clicks away from any other page. Bots like easily crawlable sites, and you get more traffic because all internal pages get indexed and PageRank calculation is facilitated. Plus, humans – who use the Internet now and then as well – don’t like to waste their precious time hunting for what they want or scraping through your clandestine navigation. At first glance this looks more like a matter of website usability, but as long as Google PageRank remains a next-to-unbeatable factor in getting more visitors from Google (read: Yahoo and AOL too) and Google holds 90-92% market share, I would suspect it is cleanly interlinked with SEO as well. Plus, proper anchor text, although slightly devalued lately, might help you get your internal pages right. By proper I mean SPAM-PROOF in the first place. Don’t make anchor text 10 words long, topping it up with keywords – search engines have got wise to that old trick.
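The 3-click rule above is easy to check mechanically: model your site as a graph of internal links and run a breadth-first search from the home page. A minimal sketch (the site structure here is a made-up example, not any real site):

```python
from collections import deque

# Hypothetical site structure: page -> pages it links to (illustrative only).
site_links = {
    "home":      ["about", "articles", "contact"],
    "about":     ["home"],
    "articles":  ["home", "article-1", "article-2"],
    "article-1": ["home", "articles"],
    "article-2": ["home", "articles", "deep-page"],
    "deep-page": ["home"],
    "contact":   ["home"],
}

def clicks_from(start, links):
    """Breadth-first search: minimum number of clicks from `start` to every reachable page."""
    depth = {start: 0}
    queue = deque([start])
    while queue:
        page = queue.popleft()
        for target in links.get(page, []):
            if target not in depth:
                depth[target] = depth[page] + 1
                queue.append(target)
    return depth

depths = clicks_from("home", site_links)
worst = max(depths.values())
print(depths)
print("max clicks from home:", worst)  # anything over 3 violates the rule of thumb
```

Running the same check from every page (not just home) tells you whether the "any page to any page in 3 clicks" version of the rule holds.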

Keep Your URLs Clean - Stay Away From Session IDs.

Each time a bot (GoogleBot, Inktomi and Scooter in particular) spiders a page of your site, it gives it some sort of associated ID number and stores it in some sort of repository for future use (for instance, for later PageRank calculation, as is true of Google). That unique ID is associated with the URL the bot saw when it spidered the page. BUT: when it eats into the same (to the human eye!) page next time, the URL IS DIFFERENT!

?sessid=999&xyz=123 IS NOW REPLACED BY ?sessid=999&xyz=124. The troubles that you’ll face are immense:

1) This leads the bot into thinking: “Aha, this page might be spam, as it duplicates the content of
?sessid=999&xyz=123 completely!” There’s no harder penalty nowadays than the PR0 Google massacre, and the main factor that triggers it is content duplication.

2) Session IDs are a surefire way to nip PageRank transfer to internal pages in the bud. Google PageRank calculation is multi-iterative, and at each iteration (the exact number of which is kept secret, but IMHO hardly exceeds 50-60) a share of PageRank is assigned to a DIFFERENT page, i.e. a page with a DIFFERENT SESSION ID! Thus PageRank is not accumulated and internal pages remain bare.

Whatever visitor tracking functionality you need – don’t go for session IDs in URLs, god forbid! Either use
.htaccess for these purposes or more complex server-side solutions using IP address, user agent, visitor OS info, cookie info etc. for visitor identification.

Frameset Sites Are Both Feet In The Grave.

If somebody claiming to be a bigwig designer, developer or usability pro suggests a frameset structure for your site – just say goodbye and never refer to the chap again. If, however, this suggestion was later approved and implemented – you are in a big jam. Bots loathe frames and are 99.999% sure NOT to follow "frame src" URLs, which they view as duplicate content or spam in some other way. Say thanks to the malicious blackhat "optimizers" who used this technique for years to duplicate their content on all possible affiliated and non-affiliated sites. Also, you just won’t believe how many 12-year-olds stick Yahoo, eBay, Microsoft or Amazon frames into their FrontPage juvenilia! Do you think the bots will bother spidering those? At a certain point, due to the abundance of spam from frames, it just became easier for bots to ignore framesets altogether. And yes, someone would probably bother with the noframes tag to include text and links there, but in my view it’s just not worth it. Frameset site owners should forget it and rebuild their site structure from scratch. Framesets belong in the dustbin.

Title, Keyword Density, Metas, Headers, Alt Tags.

Aha, back to old webmaster tricks now. Everybody knows about meta tags nowadays – don’t think you’re the only one. What most people don’t know is that nowadays most of them are COMPLETELY USELESS, or at least carry MUCH LESS WEIGHT than before. Too many clever buggers deluging too many pages with too much spam in the metas – how would you protect the web from spammy sites of that kind by any means other than just ignoring them? Basically, you must always keep in mind that all onsite SEO factors should be kept modest rather than ostentatious. "Underdensity" of keywords in the body and underuse of keywords in metas, alt tags and the title will do you much less harm than overuse and "overdensity". This is the general rule of thumb. But how do you find the optimal balance? Density-wise, I would believe it ranges somewhere between 5-12%, but I personally never exceeded 7-8% with the sites I did well with, even for some really competitive keywords. Once in the title, once or twice in the description, once in headers, once or twice high on the page; you might try a few different variations in the keywords meta. This is the approach that worked for me on some really tough keywords in the IT industry, where every other webmaster is somewhat clever about search engines.
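Keyword density is just occurrences of the keyword divided by total word count. A quick sketch of one way to measure it – the sample text and the exact-match-only tokenization are simplifying assumptions (plurals and phrases would need extra handling):

```python
import re

def keyword_density(text, keyword):
    """Occurrences of `keyword` (exact match, case-insensitive) as a percent of all words."""
    words = re.findall(r"[a-z0-9']+", text.lower())
    if not words:
        return 0.0
    hits = sum(1 for w in words if w == keyword.lower())
    return 100.0 * hits / len(words)

# Made-up sample copy for illustration.
sample = ("Widgets for every home. Our widgets are hand-built, and each widget "
          "ships with a lifetime warranty. Read the widget care guide before use.")
print(round(keyword_density(sample, "widget"), 1))  # prints 8.3
```

Run it over each page body before publishing; if the number creeps past the upper bound you are comfortable with, trim keywords rather than add copy stuffed with more of them.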

JavaScript Navigation Does More Harm Than Good.

Whatever they are saying – Google doesn’t parse JavaScript navigation. It would take too many calculation resources, plus so many sites keep their JS files external – Google has another 3-4 billion pages to see to instead of wasting time and developer effort going deep into each individual JS navigation case. However, when JS really seems to be the only navigation option (which sounds like a lame excuse to me) – be kind enough to arrange a clean link to a sitemap, preferably high on the page, with plain href links to all the pages you want indexed.
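You can approximate what a non-JS crawler sees by collecting only the plain `href` targets from your markup. A rough sketch with the standard-library HTML parser – the sample page, with its JS-driven menu and one plain sitemap link, is a hypothetical illustration:

```python
from html.parser import HTMLParser

class HrefCollector(HTMLParser):
    """Collects plain <a href> targets -- roughly what a non-JS crawler can follow."""
    def __init__(self):
        super().__init__()
        self.hrefs = []

    def handle_starttag(self, tag, attrs):
        if tag == "a":
            href = dict(attrs).get("href") or ""
            if href and not href.lower().startswith("javascript:"):
                self.hrefs.append(href)

# Hypothetical page: JS-driven menu plus one plain-href sitemap link high on the page.
page = """
<a href="/sitemap.html">Site map</a>
<a href="javascript:openMenu('products')">Products</a>
<div onclick="location='/deals.html'">Deals</div>
<a href="/articles/">Articles</a>
"""

collector = HrefCollector()
collector.feed(page)
print(collector.hrefs)  # only /sitemap.html and /articles/ survive
```

Everything that drops out of that list – the `javascript:` link and the `onclick` div – is navigation a non-JS bot would never follow, which is exactly why the plain sitemap link matters.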

What you need to be conscious of BEFORE taking the plunge is that just following these guidelines won’t make you and your online business cash-flow positive. Now, not later, is the time to contemplate a proper "reaping the fruit" approach. Converting traffic into customers and, eventually, staggering bank account numbers is a much more challenging task than just making your site visible on search engines – but that’s a topic for a completely different story.

