The first goal of any search engine optimization strategy is to get your web pages indexed. But even before that can happen, you need to get the search engine crawlers to visit your website.
Depending on the search engine or directory and the overall circumstances (how you invite and solicit crawlers), that first visit could take days, weeks, or even months.
And while it’s true that the initial crawler visits can be somewhat unpredictable (or take a long time in coming), once the ice is broken, future visits can be controlled to some degree…
Basically, the more frequently you update your pages, the more frequently the crawlers will show up on your website doorstep.
Of course, that’s only half the battle. The other half is getting the search engines and directories to actually index your pages.
In order to do that, you need to start at the beginning. And the beginning in this particular instance is developing and enhancing pages in such a way that the search engine crawlers will be impressed.
The overall search process is simple…
All the text content that search engine crawlers gather is stored and indexed. People conduct searches based on certain phrases (keywords). Whatever content possesses the most relevancy with regard to any given keyword will be placed in the top positions of the search results.
And since the page title and the text content generally carry the most weight with search engine crawlers, it stands to reason that improvements in page rank and search results position most often come from properly incorporating specific keywords into those two prime areas.
Of course, if keywords were the only basis for which page rank and position in search results were determined, optimizing web pages would be pretty much cut and dried…
pick a keyword > use it in your title and throughout your content >
achieve high page rank and top position in search engine results
The problem is, there are so many variables that not only come into play but change on a regular basis, it can seem as though achieving solid and effective search engine optimization might never be possible.
Fortunately, it’s not only possible, it can be relatively painless as well. All you have to do is satisfy the top three requirements of pretty much all major search engines…
- provide quality content
- update content on a regular basis
- get numerous top-ranking websites to link back to your site
And the search engines and directories you should be trying to impress the most are the top four contenders…
DMOZ (Open Directory)
Beyond that, there are countless other search engines and directories like AltaVista, Ask Jeeves, and AllTheWeb.
Should you optimize for those as well, or set your sights solely on the major players and bypass all the search engines and directories below them? Neither extreme. You still want your pages listed in as many locations as possible. You just shouldn’t try to satisfy every one of them with regard to optimization.
Satisfy the top four contenders. Then, if you have the time and ambition to broaden the scope of your SEO efforts, do it. If not, don’t worry about the hundreds (or even thousands) of other search engines and directories that exist.
You’re only human. And just meeting the optimization criteria of the top four is going to be challenging and demanding enough.
Of course, unless you plan to make search engine optimization your life’s work, it’s not likely you’ll invest most of your energy in that one single area (even when restricted to the top four players). But you do need to invest a fair amount of quality effort.
And that basically equates to these two missions…
1. Get your pages indexed by major search engines.
2. Improve your page rank and position in search results.
In order to accomplish both of those, you need to carefully balance the line between good optimization techniques and the urge to take things a bit too far.
In other words, you need to make certain you carry out your two missions without stepping over the line into what’s commonly referred to as “black hat” search engine tactics.
That dark and evil territory would include things like…
Keyword Stuffing – repeating keywords over and over again for no logical or practical reason
Hidden Content – including keywords or text that’s the same color as the background for the purpose of manipulating search engine crawlers
Doorway Pages – not intended for viewers to see but rather to trick search engines into placing the website into a higher index position
Although these types of practices were once considered intelligent and effective methods of optimization, they can now result in having your website banned from search engines entirely.
In general, it’s better to concentrate on the most popular and most reasonable optimization techniques. By doing that, you’ll not only achieve the results you’re looking for, your efforts will have long lasting results.
And when you consider how much work is involved in getting any website to the top of search engine rank and position, it’s worth whatever effort it takes to get it right the first time.
Search Engine Strategy Basics
For the most part, there are three basic things you’ll need to do in order to accomplish proper and effective search engine optimization.
- compile keyword lists
- publish keyword-rich content
- establish a beneficial link strategy
Naturally, having software that can help you accomplish those three things quickly and efficiently would be a great asset. So, in addition to exploring each of these areas, we’ll include the best software programs for making each task easier to perform.
The core of any SEO strategy is built almost entirely around the group of keywords you choose to target.
The first order of business is to decide which groups of keywords you’ll be utilizing. In most instances, those groups will be either directly or indirectly related to the topic or niche that your website is (or will be) associated with.
Once you’ve established the individual groups of keywords you want to target, you can begin to compile a comprehensive list of top-level phrases that have each of the following characteristics:
- are searched for by thousands of viewers each and every month
- have little or no competition associated with them
The more people who search for the term combined with the least amount of competition associated with it, the more valuable the keyword will be with regard to gaining automatic search engine traffic.
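One way to picture this search-volume-versus-competition trade-off is as a simple ratio. The sketch below is purely illustrative: the `opportunity_score` function and every search and competition figure in it are invented, and real numbers would come from a research tool such as Wordtracker.

```python
# Illustrative sketch: ranking keyword candidates by "opportunity".
# All figures below are made-up placeholders, not real search data.

def opportunity_score(monthly_searches, competing_pages):
    """Toy ratio: more searches and fewer competing pages
    both raise the score. Not a real industry metric."""
    return monthly_searches / (competing_pages + 1)

# (phrase, hypothetical monthly searches, hypothetical competing pages)
candidates = [
    ("golf swing", 30000, 900000),
    ("golf swing tips", 8000, 40000),
    ("correct golf swing plane", 900, 2000),
]

# Sort so the best opportunity (high demand, low competition) comes first
ranked = sorted(
    candidates,
    key=lambda kw: opportunity_score(kw[1], kw[2]),
    reverse=True,
)

for phrase, searches, competition in ranked:
    print(phrase, round(opportunity_score(searches, competition), 3))
```

Notice that the broad, heavily contested phrase lands at the bottom even though it has the most searches, which is exactly the point of weighing competition alongside volume.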
Beyond that, you’ll want to compile lists of secondary keywords. These would still be valuable, but not to the extent that the first top-level list would be.
The main advantage of lower level keywords is the fact that you don’t have to work quite as hard to get definitive search engine recognition. And since you’ll automatically get fairly decent results position, you’ll also receive additional targeted viewer traffic.
To make up for the lower quality of each individual keyword (in most cases, that simply means fewer searches are conducted for it every month), you need to work with a much larger quantity of lower-level keywords.
Basically, the results will be just as good as what you experience through top-level keywords. It will just take more keywords to achieve those same results.
Of course, the good news is that there are software programs which can significantly cut down the amount of time it takes to gain content – no matter how many keywords you decide to target (see the next segment on Quality Content).
There are several ways in which you can compile keyword lists. One of the quickest and easiest methods is to use the free online suggestion tool that’s provided by Overture at http://inventory.overture.com/d/searchinventory/suggestion/.
Although it will give you a clear indication of how many searches have been performed on any given topic during the previous month, it’s somewhat bare-bones. Plus, there’s no way to easily transfer results from their web page to your independently compiled keyword list.
When you copy and paste the Overture results, you also get the number of times each keyword has been searched. While that might be good for research purposes, you’ll have to manually remove that part of the data in order to wind up with a file that only lists keywords.
Wordtracker at http://www.wordtracker.com, on the other hand, does in fact allow you to save your results with nothing but the keywords listed.
You’ll have to pay to use their online service, but it’s well worth the price. It’s highly effective and offers the most in-depth and accurate capability with regard to real searches that people perform.
Although there are numerous ways you can conduct research using Wordtracker, they will all revolve around the ability to compile keyword lists which are based on the groups of keywords you originally established.
Once you know exactly which keywords you’ll be targeting, you can begin to implement content that will be associated with each of those phrases.
There are numerous reasons why “Content Is King”.
From a viewer’s perspective, content not only invites them to visit your website but encourages them to return on a regular basis.
It’s a relatively simple equation…
They’re looking for valuable information. Give it to them.
From a search engine perspective, content is one of the primary factors in determining just how much weight or importance should be given to any web page.
Unfortunately, this one isn’t quite as simple an equation…
Search engine crawlers gather and index content. Figure out how to make them place your content higher on the results ladder than some other website.
Of course, in order to become King, content needs to be of considerable quality. In order to remain King, content needs to be updated on a fairly regular basis.
Not to mention the fact that you also need to add content (new pages) on a regular basis. If not, whatever ground you initially gain will simply fade away. And so will whatever search position or rank you’ve achieved.
Naturally, you can manually add content by writing everything yourself. But that alone would be far too time-consuming. Especially when you consider all the other webmaster tasks that need your attention.
So let’s talk about automating the task instead…
One of the best methods for gaining quality content is to include keyword-rich articles on your website. And rather than take the time to write them yourself, you can simply search for and accumulate articles that others have written.
Of course, doing that can also eat up a great deal of time. To minimize the task – as well as enhance the results – you can simply use the following software program.
Article Equalizer allows you to accumulate up to one thousand articles with just the click of a button. And you can do it based on a specific topic or keyword.
Use the articles to add quality content to your existing websites or use them to build entirely new niche sites. Either way, this is one of the fastest and most efficient methods of gathering quality content.
RSS feeds are yet another superior method. Not just for gaining content but keeping it fresh and updated as well.
Depending on what feed or feeds you happen to choose, the content can be as simple as a list of topic-related links or as complete as a full-scale, full-page article. And of course, there’s everything in between.
The most common choice for RSS feeds is one that displays a list of topic-related URLs with a brief description beneath each one. This type of feed is most popular because the brief descriptions offer more opportunities to target specific keywords.
For example, if the topic of your website is golf and you want one of your pages to be optimized for the keyword “golf swing”, you would want any and all content to include that particular search phrase.
It’s no different than optimizing any other content on your website. You have a specific keyword and you need that phrase to be included in such a way that it will carry significant weight with the search engine crawlers.
If you can’t accomplish that, you’re merely shooting in the dark, hoping to gain targeted viewer traffic without actually targeting it.
The goal is to add content that is geared toward specific keywords. And RSS feeds are no different than any other content. If it doesn’t include the keywords, you’ll merely get search engine credit for having generic topic-related content.
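As a rough sketch of how keyword-targeted feed content could be selected, the snippet below parses a tiny hand-written RSS fragment and keeps only the items that mention a given phrase. The sample feed and the `keyword_items` helper are hypothetical illustrations; a program like RSS Equalizer performs this kind of keyword-based placement for you.

```python
import xml.etree.ElementTree as ET

# Hypothetical sample feed; a real one would be fetched from a feed URL.
SAMPLE_RSS = """<?xml version="1.0"?>
<rss version="2.0"><channel>
  <item><title>Fixing Your Golf Swing</title>
    <description>Drills to groove a repeatable golf swing.</description></item>
  <item><title>Course Review: Pebble Beach</title>
    <description>A walk through all eighteen holes.</description></item>
</channel></rss>"""

def keyword_items(rss_text, keyword):
    """Return (title, description) pairs whose text contains the keyword."""
    root = ET.fromstring(rss_text)
    matches = []
    for item in root.iter("item"):
        title = item.findtext("title", "")
        desc = item.findtext("description", "")
        # Case-insensitive match against the combined title and description
        if keyword.lower() in (title + " " + desc).lower():
            matches.append((title, desc))
    return matches

print(keyword_items(SAMPLE_RSS, "golf swing"))
```

Only the first item survives the filter, because only its text actually contains the target phrase; the generic golf-course item, while on-topic, earns no keyword credit.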
Of course, what you really want – and need – is to gain rank and listing benefit from whatever content is added. That’s the whole purpose… to gain enough search engine recognition which in turn gains you targeted viewer traffic.
That being the case, the ultimate software program would be one that could automatically place RSS feeds on your pages while at the same time do it based on specific keywords.
Fortunately, there is such a program. And it’s the best software available…
What you achieve by using RSS Equalizer is instant theme-based content, the kind that search engines like Google are looking for. And because the content changes each and every day, you can count on receiving more frequent visits from search engine crawlers.
So the ultimate result is just what you’re hoping to gain… faster indexing, better search position, and higher page rank.
Choosing the right keywords and publishing quality keyword-rich content puts you approximately two-thirds of the way toward optimum search engine recognition. The other third is pretty much solely based on popularity.
If we were talking about popularity in the real world, it would probably include simple things like who was voted King and Queen of the high school prom, or who had the most date options on a Saturday night, or which sibling got the most attention from Mom or Dad.
In the world of search engines, popularity takes on a whole different meaning. And in most instances, it comes down to this… the website with the most quality links pointing to it wins the contest.
That’s the game. And the ultimate goal is to get countless “important” websites (those that have a theme or topic that’s similar to yours) to provide links back to you. Of course, when we’re talking about importance, we’re referring to how major search engines view them.
Most often, that equates to high page rank and top position in search results. The higher up the food chain a website happens to be, the more powerful any link they provide back to you is perceived.
In order to get the most bang out of the link popularity process, it’s best if you actually seek out valuable websites. Aside from those you might already have in mind, conduct searches based on the keywords you’re most interested in gaining search engine recognition for.
Naturally, someone who’s in direct competition with you wouldn’t even consider giving you a link back. So what you’re really looking for are popular websites whose content or products are either complementary to yours or indirectly related to them.
For example, let’s say your topic and keyword is based on ways of perfecting your golf swing. Good link back choices would be websites with the following themes or products:
- information about golf courses or golf tournaments
- golf equipment or apparel
- golf instructors or seminars
If the topic is related to yours and the website that’s providing the link back carries a good deal of weight with major search engines, the value of your own website will automatically be elevated.
When it comes to the actual link that these valuable and important websites place on their pages…
Always encourage the use of text links rather than just a URL. For example, instead of simply displaying http://www.adwordanalyzer.com as the link back to your website, you want something more substantial and keyword rich. And, of course, search engine friendly.
If one of your keywords is “targeted traffic”, for example, the link might read as follows:
Drive targeted traffic to your website with Adword Analyzer
That not only gives you credit for the keyword, it encourages the search engine crawler to perceive your website as having more value.
If you have a separate page on your website where you solicit link backs, it’s always a good idea to list one or more link text possibilities. That way, you’ll receive credit for the keywords you yourself have chosen to target.
You should also provide the HTML code for placing your link on other websites. Basically, make it as easy as possible for someone else to add you to their pages.
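For example, the ready-to-copy snippet you offer might look like this (the URL and anchor text are the illustrative values used earlier in this section):

```html
<!-- Example link-back code to offer on your links page.
     The URL and anchor text are illustrative, not a required format. -->
<a href="http://www.adwordanalyzer.com">Drive targeted traffic
to your website with Adword Analyzer</a>
```

Handing people this exact block means the keyword-rich text you chose is what actually appears on their pages, rather than a bare URL.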
You should also specify where you require a link back to be placed. Ideally, you would want your link located on either a home page or one click away from the home page. At the very least, your link should be located where it will be perceived as valuable by the search engine crawlers.
Buried four or five levels deep on some obscure page that might not even be indexed is absolutely worthless. The whole point of getting link backs is to gain more importance with the search engines.
So the bottom line is…
The more control you have over the links that others place on their websites, the more search engine value you’ll experience.
It takes a good deal of time and effort to encourage high-ranking websites to link back to you. Make certain you invest whatever additional effort is necessary in order to gain the best possible link as well.
And the criteria for the best possible link is this:
1. It includes keyword rich text.
2. It originates from a valuable and high-ranking website.
3. It’s placed in what would be considered an important location.
Anything less than that and you’re compromising the whole link back process.
Always keep in mind that in this particular instance, quality will always win out over quantity. Yes, you want a vast number of links pointing back to your website. But given a choice, you’re much better off with fewer links from important websites than countless links from sites that don’t carry much weight with search engines.
What To Do…
Following is a brief overview of what each of the major search engines and directories is looking for with regard to optimization and value.
Doesn’t use meta description and keyword tags. High score for the overall weight and proximity of keywords, <h> tags, and bold text. Rewards quality content, anywhere between 50 and 600 words. Content should include keywords in text and links. Likes to see keywords in the page title (90 characters or less) and carried consistently throughout the website. Especially values link popularity, themes, and keywords in URLs and link text. The use of excessive keywords, cloaking, and link farms is viewed as SE spamming.
The meta description and keyword tags carry no major importance, but filling them in does play a role. Will not index anything associated with SE spam. Slow-loading pages run the risk of being excluded. The page title has some significance and should be concise. Likes site popularity and wants to see a theme throughout the website.
Supports meta description and keyword tags. Doesn’t index anything associated with SE spam. Framed sites must use the <noframes> tag to get indexed. Considers the page title important and wants it to contain keywords. Wants to see proper keyword frequency. Link popularity carries a good deal of weight. Likes to see a theme carried throughout the entire website.
Likes to see concise and accurate descriptions and keywords. Slow loading pages can be penalized. The page title has some significance and should be filled in. Keyword frequency is not factored in. Link popularity is not important. Especially likes to see accurate and appropriate category choices.
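Pulling the on-page factors from the overview above into one place, a skeletal page targeting the hypothetical keyword “golf swing” might look like the following. Every name, file, and phrase here is invented, and as the overview notes, not every engine reads every element:

```html
<html>
<head>
  <!-- Concise, keyword-led title, well under 90 characters -->
  <title>Golf Swing Basics: Building a Consistent Golf Swing</title>
  <!-- Some engines read these tags; others ignore them entirely -->
  <meta name="description" content="Simple drills for a consistent golf swing.">
  <meta name="keywords" content="golf swing, golf swing drills">
</head>
<body>
  <!-- Keywords in <h> tags and bold text carry extra weight with some engines -->
  <h1>Improving Your Golf Swing</h1>
  <p>A repeatable <b>golf swing</b> starts with grip, posture, and tempo.</p>
  <!-- Keyword-rich link text rather than a bare URL -->
  <a href="drills.html">golf swing drills for beginners</a>
</body>
</html>
```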
What Not To Do…
After all your hard work getting your web pages optimized, the last thing you want is to do something that would prevent your site from getting indexed. Or worse, have it blacklisted by search engines altogether.
At the top of the “don’t do” list is the use of invisible text (text that’s the same color as the background). Nearly every search engine is wise to this practice and will ban any website found to be using it.
Here is a quick rundown of everything else you should never do…
Don’t repeat keywords excessively.
Don’t place irrelevant keywords in the title and meta tags.
Don’t make use of link farms.
Don’t submit to inappropriate categories in search directories.
Don’t submit too many web pages in one day.
Don’t publish identical pages.
Don’t use meta refresh tags.
No matter how good your website is – no matter how valuable the content it contains or how legitimately optimized it might be – if you use any of the things spelled out above, you run the risk of being blacklisted, branded as a search engine spammer.
Although it varies from one search engine to another, spamming can include one or more of the following:
irrelevant web page titles and meta description and keywords tags; repetition of keywords; hidden or extremely small text; submitting web pages more than once in a twenty-four-hour span; mirror sites that point to different URL addresses; using meta refresh tags
When it comes to directories such as DMOZ (which have human editors), spamming generally equates to one of these three practices:
deliberate choice of an inappropriate category within the directory; hype-filled marketing language; gratuitous capitalization
It’s not difficult to stay out of black hat territory. But it’s certainly difficult to recover from having used those types of techniques. That is, assuming you can recover at all.
Just pay attention to the rules established by search engines and directories. And since Google is the player you’ll most want to satisfy, it’s important that you read and re-read their webmaster guidelines which are published at http://www.google.com/webmasters/ on a regular basis.
Break the rules and you’ll always be struggling to gain benefit from all the major search engines. Follow the rules and you’ll establish web pages that will not only be around a long time, they’ll always be in contention for top search results position.
- Start by establishing groups of keywords that are related to your chosen topics or areas of interest.
- The best keywords are ones that are searched for by thousands of viewers each and every month but have little competition associated with them.
- Because secondary keywords are associated with fewer searches and less competition, you’ll need to implement more of them in order to achieve maximum benefit.
- Keywords should be included in the title, in <h> tags, and throughout the overall content.
- Wordtracker is a comprehensive and in-depth online service for compiling accurate and effective keyword lists.
- Don’t repeat keywords excessively.
- Don’t use inappropriate keywords in the page title and description.
- From a viewer’s perspective, content not only invites them to visit your website but encourages them to return on a regular basis.
- From a search engine perspective, content is one of the primary factors in determining just how much weight or importance should be given to any web page.
- You need to add new content on a regular basis.
- You need content that is updated frequently.
- Use Article Equalizer ( http://www.articleequalizer.com ) to easily and quickly accumulate and publish keyword-rich content.
- Use RSS Equalizer ( http://www.rssequalizer.com ) to place keyword-related RSS feeds on specific and individual pages.
- The goal is to get countless “important” websites to provide links back to you.
- The higher up the food chain a website happens to be, the more powerful any link they provide back to you is perceived.
- Actively seek out important websites that have similar or related themes, products, or information.
- Encourage link backs to include valuable and keyword-rich text rather than simply a URL address.
- The best links originate from high-ranking websites, are placed in important page locations, and include keyword-rich text.
- Pay attention to the rules set forth by search engines and directories, especially the webmaster guidelines published by Google.
- Follow the rules and guidelines set forth by search engines and directories.