
Welcome to part four in this search engine positioning series. Last week we discussed the importance of the structure of your website and the best practices for creating an easily spidered and easily read site. In part four we will discuss content optimization.

This is perhaps the single most important aspect of ranking your website highly on the search engines. While all of the factors covered in this series will help get your website into the top positions, it is your content that will sell your product or service and it is your content that the search engines will be reading when they take their “snapshot” of your site and determine where it should be placed in relation to the other billions of pages on the Internet.

Over this series we will cover the ten key aspects to a solid search engine positioning campaign.

The Ten Steps We Will Go Through Are:

  • Keyword Selection (http://www.beanstalk-inc.com/articles/search-engine-positioning/keywords.htm)

  • Content Creation (http://www.beanstalk-inc.com/articles/search-engine-positioning/content.htm)

  • Site Structure (http://www.beanstalk-inc.com/articles/search-engine-positioning/structure.htm)

  • Optimization

  • Internal Linking

  • Human Testing

  • Submissions

  • Link Building

  • Monitoring

  • The Extras

Step Four – Content Optimization

There are aspects of the optimization process that gain and lose importance over time, and content optimization is no exception. Through the many algorithm changes that take place each year, the weight given to the content on your pages rises and falls. Currently, incoming links appear to provide a greater advantage than well-written and optimized content. So why are we devoting an entire article in this series to content optimization?

The goal for anyone following this series is to build and optimize a website that will rank well on the major search engines and, more difficult and far more important, hold those rankings through changes in the search engine algorithms. While having a bunch of incoming links from high-PageRank sites will currently do well for you on Google, you must consider what will happen to your rankings when the weight given to incoming links drops, or how your website fares on search engines other than Google that don't place the same emphasis on incoming links.

While there are many characteristics of your content that factor into the algorithmic calculations, a few consistently hold relatively high priority and thus will be the focus of this article. These are:

  1. Heading Tags

  2. Special Text (bold, colored, etc.)

  3. Inline Text Links

  4. Keyword Density

Heading Tags

The heading tag (for those who don't already know) is code used to specify to the visitor and to the search engines what the topic of your page and/or its subsections is. You have six predefined heading tags to work with, ranging from <h1> to <h6>.

By default these tags appear larger than standard text in a browser and are bold. These aspects can be adjusted using the font tags or by using Cascading Style Sheets (CSS).

Due to their abuse by unethical webmasters and SEOs, the weight given to heading tags is not what it could be; however, the content between these tags is still given increased weight over standard text. There are rules to follow with the use of heading tags that must be adhered to. If you use heading tags irresponsibly you run the risk of having your website penalized for spam, even though the abuse may be unintentional.

When using your heading tags try to follow these rules:

  • Never use the same tag twice on a single page

  • Try to be concise with your wording

  • Use heading tags only when appropriate. If bold text will do then go that route

  • Don’t use CSS to mask heading tags

Never use the same tag twice on a single page. While the <h1> tag holds the greatest weight of all the heading tags, its purpose is to act as the primary heading of the page. If you use it twice you are obviously not using it to define the main topic of the page. If you need to use another heading tag, use the <h2> tag. After that, the <h3> tag, and so on. Generally I try never to use more than two heading tags on a page.
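To make this concrete, here is a minimal sketch of the heading structure described above (the page wording is hypothetical, not taken from any actual site):

  <h1>Search Engine Positioning Services</h1>
  <p>Introductory copy that uses the targeted phrase naturally...</p>
  <h2>Content Optimization</h2>
  <p>Supporting copy for this subsection...</p>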

Try to be concise with your wording. If you have a two-word keyword phrase that you are trying to target and you make a heading that is ten words long, then your keyword phrase only makes up about 20% of the total verbiage. If you have a four-word heading, on the other hand, you would have a 50% density and increased priority given to the keyword phrase you are targeting.

Use heading tags only when appropriate. If bold text will do then go that route. I have seen sites with heading tags all over the place. If overused, the weight of the tags themselves is reduced, and "priority" ends up being given to different phrases at various points in the content. If you have so much great content that you feel you need to use many heading tags, you should consider dividing the content up into multiple pages, each with its own <h1> tag and keyword target possibilities. For the most part, rather than using additional heading tags, bolding the content will suffice. The sizing will be kept the same as your usual text and it will stand out to the reader as part of the text but with added importance.

Don't use CSS to mask heading tags. This one just drives me nuts and is unnecessary. Cascading Style Sheets (CSS) serve many great functions. They can be used to define how a site functions, looks and feels; however, they can also be used to mislead search engines and visitors alike. Each heading tag has a default look and feel. It is fine to use CSS to adjust this somewhat to fit how you want your site to look. What is not all right is to adjust the look and feel to mislead search engines. It is a simple enough task to define in CSS that your headings should appear as regular text. Some unethical SEOs will then also place their style sheet in a folder that is hidden from the search engine spiders. This is secure enough until your competitors look at the cached copy of your page (and they undoubtedly will at some point), see that you have hidden heading tags, and report you to the search engines for spamming. It's an unnecessary risk that you don't need to take. Use your headings properly and you'll do just fine.

Special Text

Special text (as it is used here) is any content on your page that is set to stand out from the rest. This includes bold, underlined, colored, highlighted, resized and italicized text. This text is given more weight than standard content, and rightfully so. Bold text, for example, is generally used to define sub-headings (see above), or to pull content out on a page to ensure the visitor reads it. The same can be said for the other "special text" definitions.

Search engines have thus been programmed to read this as more important than the rest of the content and will give it increased weight. For example, on our homepage we begin the content with “Beanstalk Search Engine Positioning …” and have chosen to bold this text. This serves two purposes. The first is to draw the eye to these words and further reinforce the “brand”. The second purpose (and it should always be the second) is to add weight to the “Search Engine Positioning” portion of the name. It effectively does both.
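In the page markup this amounts to nothing more than wrapping the phrase in a bold tag; a minimal sketch (the rest of the sentence is not reproduced here):

  <p><b>Beanstalk Search Engine Positioning</b> …</p>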

Reread your content and, if appropriate for BOTH visitors and search engines, use special text when it will help draw the eye to important information and also add weight to your keywords. This does not mean that you should bold every instance of your targeted keywords nor does it mean that you should avoid using special text when it does not involve your keywords. Common sense and a reasonable grasp of sales and marketing techniques should be your guide in establishing what should and should not be drawn out with “special text”.

Inline Text Links

Inline text links are links added right into the verbiage of your content. For example, in this article series I may make reference to past articles in the series. Were I to refer to the article on keyword selection, rather than simply making a passing reference to it as I just have, it might be better to write it as, “Were I to refer to the article on keyword selection rather …” (this instance of “keyword selection” is meant to be an inline link to http://www.beanstalk-inc.com/articles/search-engine-positioning/keywords.htm; however, limitations in the article submission process make this impossible).
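Where inline links are possible, the markup is simply an anchor tag wrapped around the phrase in question; a sketch using the keyword selection article mentioned above:

  Were I to refer to the article on <a href="http://www.beanstalk-inc.com/articles/search-engine-positioning/keywords.htm">keyword selection</a> rather …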

Like special text, this serves two purposes. The first is to give the reader a quick and easy way to find the information you are referring to. The second purpose of this technique is to give added weight to this phrase for the page on which the link is located and also to give weight to the target page.

While this point is debatable, there is a relatively commonly held belief that inline text links are given more weight than a text link which stands alone. If we were to think like a search engine this makes sense. If the link occurs within the content area then chances are it is highly relevant to the content itself, and the link should be counted with more strength than a link placed in a footer simply to get a spider through the site.

Like “special text”, this should only be employed if it helps the visitor navigate your site. An additional benefit of inline text links is that you can help direct your visitors to the pages you want them on. Rather than simply relying on visitors to use your navigation bar as you are hoping they will, with inline text links you can link to the internal pages you are hoping they will get to, such as your services page or product details.

Keyword Density

For those of you who have never heard the term “keyword density” before, it is the percentage of your total content that is made up of your targeted keywords. There is much debate in forums, SEO chat rooms and the like as to what the “optimal” keyword density might be. Estimates seem to range from 3% to 10%.
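As an illustration of the arithmetic (the numbers here are made up, not a recommended target): if a two-word phrase appears eight times on a 400-word page, it accounts for 16 of the 400 words, a keyword density of 4%.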

I would be the first to admit that logic dictates there is indeed an optimal keyword density. Knowing that search engines operate on mathematical formulas implies that this aspect of your website must have some magic number associated with it that will give your content the greatest chance of success.

With this in mind there are three points that you should consider:

  1. You do not work for Google or Yahoo! or any of the other major search engines (and if you do you’re not the target audience of this article). You will never know 100% what this “magic number” is.

  2. Even if you did know what the optimal keyword density was today, would you still know it after the next update? Like other aspects of the search engine algorithm, optimal keyword densities change. You will be chasing smoke if you try to constantly have the optimal density and chances are you will hinder your efforts more than help by constantly changing the densities of your site.

  3. The optimal keyword density for one search engine is not the same as it is for another. Chasing the density of one may very well ruin your efforts on another.

So what can you do? Your best bet is to simply place your targeted keyword phrase in your content as often as possible while keeping the content easily readable by a live visitor. Your goal here is not to sell to search engines, it is to sell to people. I have seen sites that have gone so overboard in increasing their keyword density that the content itself reads horribly. If you are simply aware of the phrase that you are targeting while you write your content, then chances are you will attain a keyword density somewhere between 3% and 5%. Stay in this range and, provided that the other aspects of the optimization process are in place, you will rank well across many of the search engines.

Also remember when you’re looking over your page that, because you’re reading it, the targeted phrase may seem to stand out as it’s used more than any other phrase on the page, and may even seem like it’s a bit too much. Unless you’ve obviously overdone it (approached the 10% rather than the 5% end of the spectrum) it’s all right for this phrase to stand out. This is the phrase that the searcher was searching for. When they see it on the page it will be a reminder of what they are looking for, and seeing it a few times will reinforce that you can help them find the information they need to make the right decision.

Final Notes

In an effort to increase keyword densities, unethical webmasters will often use tactics such as hidden text, extremely small font sizes, and other tricks that basically hide text from a live visitor while providing it to the search engines. Take this advice: write quality content, word it well and pay close attention to your phrasing, and you will do well. Use unethical tactics and your website may rank well in the short term, but once one of your competitors realizes what you’re doing you will be reported and your website may very well get penalized. Additionally, if a visitor realizes that you’re simply “tricking” the search engines, they may very well decide that you are not the type of company they want to deal with; one that isn’t concerned with integrity but rather one that will use any trick to try to get at their money. Is this the message you want to send?

Next Week

Next week in part five of our “Ten Steps To an Optimized Website” series we will be covering internal linking strategies and best practices. This will cover everything from image links and scripts to inline and basic text links.

About The Author

Dave Davies is the owner of Beanstalk Search Engine Positioning (http://www.beanstalk-inc.com/). He has been optimizing and ranking websites for over three years and has a solid history of success. Dave is available to answer any questions that you may have about your website and how to get it into the top positions on the major search engines.

info@beanstalk-inc.com


Every webmaster would like to see his or her website be the number 1 search result returned in the search engines. A number 1 spot in Google pretty much guarantees loads of traffic to a website, which can then materialize into high revenue for the website owner.

To reach that number 1 spot, search engine optimization (SEO) is the tool webmasters have to use in almost every case. Several books have been written covering search engine optimization. Hundreds of websites cover the topic and give loads of advice. There is so much information about this topic that it’s almost impossible to digest. Webmasters have all they need available at their fingertips at any time and also share the knowledge. Google (as an example) changes the rules all the time, and missing out on these changes can mean that a website drops to the bottom of the search results delivered on any given search. The hunt for the best search engine optimization results is on 24 hours a day, 365 days a year.

As with anything, there will always be people who go a step too far. Search engine optimization is no exception. You’ve got the Black Hats who use every legal or illegal trick to increase their website’s search engine ranking, and you have the so-called White Hats who play by the rules and only use legitimate SEO tools and tricks. And then you have people who just overdo it. They build their websites completely optimized for the search engines but seem to forget about the user in the end. These websites are stuffed with keywords and phrases all over. Navigation and presentation of content are optimized for the search engine, but they completely forget about the human factor. Yes, driving traffic to the website from search engines is great. But what if the site is difficult to navigate for the visitor because it is optimized for a search engine and not for usability? A website not meeting the needs of humans is set up to fail.

Having the number 1 spot in a search engine will not materialize into higher profits and revenue if the site does not meet the basic requirements for humans to a) navigate the site properly and b) discover what they are looking for in an easy way. Articles stuffed with the same keywords over and over again are hard to read, and the information the user is looking for is difficult to extract. Links to sub-pages buried under keywords over and over again make it difficult to even get to the information the user is looking for. The user experience will be disappointing and will lead to the user moving on to other sites that are able to deliver information in an appropriate way. Yet there are plenty of webmasters who are able to achieve high search engine rankings and still offer a satisfying experience for users on their websites.

So, if you are a webmaster – will you optimize your websites for search engines or for the user?

About the Author

Christoph Puetz is a successful entrepreneur and international book author. Examples of his search engine optimization work can be found at Web Hosting Tutorials, Highlands Ranch and at Credit Repair.

The article can be published by anyone as long as the resource box (About the Author) is posted on the website including the links. These links must be clickable.


In one of my articles, I discussed how to market your web site link twice. It detailed how to promote not only www.yoursite.com but also how you should promote your site without the www., like this: http://yoursite.com.

This article is about promoting ALL of your pages within your marketing campaign. See, most of us typically only promote the main page of our sites, e.g. www.yoursite.com. The truth is, your site is much more than just the first page, right? Well, let’s condition ourselves to promote everything available within your site…

Search Engine Marketing is crucial for all companies who want to succeed online. I’m sure at one point or another you will hear how “optimizing your site for search engines is crucial”. Of course, they aren’t fooling you; it is a crucial marketing tactic. But what is also crucial is learning how to use different tools to boost your search engine placements once you’ve optimized your site for the web.

So let’s talk about the marketing tactics available to you and implement strategies on how to promote all your pages within them.

LINK EXCHANGES:

Link popularity has become a norm for most small companies to implement in their daily promoting activities. Here’s the problem: most companies that perform link exchanges daily fail to utilize them to their advantage. For example, let’s say you perform approximately 10 link exchanges daily. For each one, you submit your link “www.yoursite.com”. What you’ll want to start doing is submitting 10 different links within your site.

Ex Link Exchanges:

  • Link #1: http://yoursite.com

  • Link #2: http://yoursite.com/resources

  • Link #3: http://www.yoursite.com/services

  • And so on…

A good strategy would be to open up Notepad and create a list of all the links you want to promote:

  1. Add each link

  2. Assign an appropriate title to each link

  3. Create an appropriate description for each link

Now all you have to do is copy and paste each link when performing your daily link exchanges.
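A sketch of what one entry in that Notepad file might look like (the title and description below are placeholders you would replace with your own):

  Link: http://www.yoursite.com/services
  Title: Affordable Web Design Services
  Description: Custom web design and site marketing services for small businesses.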

WRITING ARTICLES:

Do you write articles to promote your site? If you do, then you have probably created a “resource box” at the end of each article, right? Good, let’s change the resource box a little.

My site hosts hundreds of marketing articles, and the one thing I notice time and time again is that each author (no disrespect to any of the wonderful authors) typically only adds the main page of their web site within their “resource box”.

Ok, let’s say you write articles about “Search Engine Marketing”. At the end of the article, add a link to a page on your site that talks about “Search Engine Marketing” or something similar.

Ex. http://www.yoursite.com/search_engine_marketing.html

So with that in mind, try revising all your articles to point to specific pages on your web site.
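The deep link itself might be marked up along these lines (the anchor wording here is illustrative, not prescribed):

  <a href="http://www.yoursite.com/search_engine_marketing.html">Search Engine Marketing Tips</a>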

DIRECTORY SUBMISSIONS:

Submitting your site to directories will give your site long-lasting traffic. Not every directory will accept links to pages other than your main page. This is OK though; there are literally hundreds of other directories that will allow you to submit any page you want.

Experiment with this and try to change up all your links when you’re submitting to directories.

I WOULD NOT RECOMMEND: using anything other than your main page for directories like the ODP (Open Directory Project) or Yahoo.

So now you have 3 proven marketing strategies on how to improve your search engine placements for all of your pages instead of only your main page. Be creative with this and look for more strategies you can implement this tactic with.

About The Author

Martin Lemieux

Smartads – President

http://smartads.info/top-10/download

Instantly download any Marketing Article created by Martin Lemieux for use within your web site and/or newsletter: http://smartads.info

Affordable Web Design & Web Site Marketing

support@smartads.info


A while back, I read an article that explained how to get a good Google rating without ever submitting your site to their submission forms. Like you, I was kind of shocked by this statement, so I decided to give it a try.

In the beginning, I used to submit my site all the time to Google but soon realized the magnitude of my failure. Of course, it’s a known fact that Google relies solely on your link popularity and content.

Link Popularity?

What that means is the number of links to your site (yoursite.com) listed on other sites that are related to yours! The more sites that link to you, the greater your popularity!

So again, Google depends on your link popularity! If you don’t have a Google rating (in other words, is your link found on Google?), some sites WILL NOT link to yours. There are many sites that have a great Google rating and have specific rules about whom they will accept within their resource sections. They will specify that your site must be listed within Google, and that if they type your site into the Google search bar, your site should be listed within the top 5 results.

Here’s The Theory:

Of course you want all sites to link to you, especially the ones that already have a great Google rating, because that means your site will be picked up by Google. Some people only try to get links from those sites, but just remember, we all had to start somewhere.

My suggestion is to do as many link exchanges as possible and especially to make sure that each site is specifically categorized by its content. You will get penalized for having one page full of links to sites that do not relate to each other.

Another little “Timbit”: try to keep your pages to 1-20 links each. Once you have 20 links, add a button to a (PAGE 2) and continue your resources that way.


Here’s an example of our “Resource Section”

http://www.smartads.info/resources

You’ll notice how each category has specific sites that get listed within them.

OK, so I went a little off topic with respect to this article, but you’ll see how everything is connected. Once Google starts ranking your site, you don’t want to leave any stone unturned.

So back to getting a Google rank without submitting your site. Once you start performing link exchanges with other sites, just keep on going and going, and I promise that eventually Google will start picking up some of your links on other sites. Once this starts to happen, your Google rank will rise.

So just keep doing what you’re doing and let Google do its own thing.

When you’re ready and your link popularity grows, then you can submit your site to Google and reap the rewards.

So many companies do this process backwards and wonder why they don’t get listed in Google. Or, like what happened to me, Google lists your site right away but then in a couple of weeks, presto, your site is gone from their listings and you’re left confused as to why!

In this article I talked about link exchanges and link popularity; feel free to learn more about this in my two-part series:

Link Exchanges, what they can do for your business, part 1

http://www.smartads.info/articles/le/10.html

Link Exchanges, what they can do for your business, part 2

http://www.smartads.info/articles/le/11.html

About The Author

Martin Lemieux

President

Smartads Information Centre

Advertising, Marketing Resources & Web Design

http://www.smartads.info

Want To Use This Article???

GO AHEAD! Just keep it in its entirety!

. . : : SmartAds Information Centre : : . .

“Helping you expose your business to the world!”

support@smartads.info


Anchor text (also called phrase linking) can significantly improve your web page’s relevance in the search engines. Optimized or keyword-rich anchor text can help your web site gain positioning in the search engines as well as help drive better-targeted search traffic.

What is Anchor Text?

Anchor text is the visible hyperlinked text that you see on the page; here is an example:

To read more about search engine optimization techniques, check out my Search Engine Optimization blog.

Here the words “Search Engine Optimization blog” are hyperlinked to the underlying URL, http://seogirl.blogspot.com/. These visible words are the anchor text.
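In HTML, the example above would look something like the sketch below; the sentence and URL are taken from the example, and the markup is simply an anchor tag:

  To read more about search engine optimization techniques, check out my <a href="http://seogirl.blogspot.com/">Search Engine Optimization blog</a>.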

Anchor text should be used to indicate the subject matter of the page that it links to. If you use the example above, “Search Engine Optimization Blog” indicates to visitors that they can expect to find information pertaining to search engine optimization if they click on that link.

Why is Anchor Text Important?

Anchor text is one of the more important elements in influencing a Web site’s position in search engine result pages (SERPs). Your anchor text should include important keywords.

If the anchor text technique is used properly it will enhance the relevance of the targeted page. The page containing the anchor text will also be enhanced to some degree, because you will be using relevant keywords.

Optimizing Anchor Text of Inbound/External Links

Keywords within the anchor text are equally useful in links pointing to your website from other websites (inbound links). If you are working on a link building campaign, it is suggested that you have several title and description options ready for the link pointing to your website.

If you supply a webmaster with something to copy and paste they are able to set your link up within a few minutes and you get exactly what you want as far as anchor text.

2005 NMS

Nicole St. Martin is a professional search engine optimization consultant currently working in the legal industry.

View My Blog:http://seogirl.blogspot.com


Google Sitemaps enables Webmasters to Directly Alert Google to Changes and Additions on a Website and that’s just one of 7 Benefits.

Telling search engines about new pages or new websites used to be what the submission process was all about. But major search engines stopped using that process a long time ago.

Google has for a long time depended on external links from pages they already know about in order to find new websites.

For webmasters and website owners, Google Sitemaps is the most important development to hit the Internet since RSS or Blog and Ping.

Using RSS and Blog and Ping enabled webmasters to alert the search engines to new additions to their web pages even though that was not the primary purpose of these systems.

If you’ve ever waited weeks or months to get your web pages found and indexed you’ll know how excited we webmasters get when someone discovers a new way to get your web pages found quicker.

Well that new way has just arrived in Google Sitemaps and it’s a whole lot simpler than setting up an RSS feed or Blog and Ping. If you haven’t heard of Blog and Ping it’s a means by which it’s possible to alert the search engines to crawl your new website content within a matter of hours.

If you’re a webmaster or website owner, Google Sitemaps is something you can’t afford to ignore, even if you’re also using RSS and/or Blog and Ping.

The reason you should start using Google Sitemaps is that it’s designed solely to alert and direct Google’s search engine crawlers to your web pages. RSS and Blog and Ping are indirect methods of alerting search engines, but that’s not their primary purpose.

It works now, but like most things it’s becoming abused. Search Engines will find ways to combat the abuse as they’ve done with every other form of abuse that’s gone before.

Abusing the search engines is a short term not a long term strategy and in some cases certain forms of abuse will get you banned from a search engines index.

You may also be thinking: don’t we already have web page meta tags that tell a search engine when to revisit a page? That’s true, but the search engine spider still has to find the new page first, before it can read the meta tag. Besides that, meta tags are out of favour with many search engines, especially Google, because of abuse.

If talk of search engine spiders leaves you confused, they’re nothing more than software programs that electronically scour the Internet visiting web sites looking for changes and new pages.

How often the search engine spider, alias robot, visits your website depends on how often your site content is updated, or on you alerting them to a change. Otherwise, a search engine like Google may only visit a website once a month.

As the internet gets bigger every second of every day, the problem for search engines and webmasters is becoming ever greater. For the search engines, it’s taking their spiders longer to crawl the web for new sites or updates to existing ones.

For the webmaster, it’s taking longer and becoming more difficult to get web pages found and indexed by the search engines.

If you can’t get web pages found and indexed by search engines, your pages will never be found in a search and you’ll get no visitors from search engines to those pages.

The answer to this problem, at least for Google, is Google Sitemaps.

While still only in a beta phase as Google refines the process, it’s fully expected that this system, or one very similar, is here to stay.

Google Sitemaps is clearly a win-win situation.

Google wins because it reduces the huge waste of their resources to crawl web sites that have not changed. Webmasters win because they alert Google through Google Sitemaps what changes or new content has been added to a website and direct Google’s crawlers to the exact pages.

Google Sitemaps has the potential to speed up the process of discovery and addition of pages to Google’s index for any webmaster that uses Google Sitemaps.

Conventional sitemaps have been used by webmasters for quite some time to allow the easier crawling of their websites by the search engine spiders. This type of sitemap is a directory of all pages on the website that the webmaster wants the search engines or visitors to find.

Without a sitemap, a webmaster runs the risk of web pages being difficult for the search engine crawlers to find, or never being found at all.

Do I need Google Sitemaps if I already have sitemaps on my websites?

Google Sitemaps are different to conventional sitemaps because they’re only seen by the Search Engine Spiders and not human visitors. Google Sitemaps also contain information that’s only of value to the search engine in a format they understand.

Creating Google Sitemaps in 5 steps

1. Create Google Sitemaps in a supported format (see end of article)

2. Upload Google Sitemaps to your Web Hosting space

3. Register for a free Google Account if you don’t already have one

4. Login to your Google Sitemaps Account and submit the location of your sitemaps

5. Update your Sitemaps when your site changes and Resubmit it to Google

From your Google Sitemaps account you can also see when your sitemap was last updated and when Google downloaded it for processing. It will also tell you if there were any problems found with your sitemaps.

Google Sitemaps can be used with commercial or non-commercial websites, those with a single webpage, through to sites with millions of constantly updated pages. However a single Google Sitemaps file is limited to 50,000 web pages. For websites with more pages, another Google Sitemaps file must be created for each block of 50,000 pages.

If you want Google to crawl more of your pages and alert them when content on your site changes, you should be using Google Sitemaps. The other added benefit is it’s free.

If you’re expecting this special alert process with Google Sitemaps to improve your Page Rank, change the way Google ranks your web pages, or in any way guarantee inclusion of your web pages, Google has made it clear it will make no difference.

Google Sitemaps web pages are still subject to the same rules as non Google Sitemaps pages.

If your site has dynamic content or pages that aren’t easily discovered by following links, Google Sitemaps will allow spiders to know what URLs are available and how often page content changes.

Google has said that Google Sitemaps is not a replacement for the normal crawling of web pages and websites as that will continue in the conventional way. Google Sitemaps does however allow the search engine to do a better job of crawling your site.

The Google Sitemap Protocol is an XML file containing a list of the URLs on a site. It also tells the search engine when each page was last updated, how often each page changes and how important each page is in relation to other web pages in the site.
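As a rough illustration of that format, a minimal Sitemap file might look like the sketch below. The URL, date and values are placeholders, and the schema reference shown is the one Google documented for the beta; check Google’s own documentation for the current details.

  <?xml version="1.0" encoding="UTF-8"?>
  <urlset xmlns="http://www.google.com/schemas/sitemap/0.84">
    <url>
      <loc>http://www.example.com/</loc>
      <lastmod>2005-06-01</lastmod>
      <changefreq>weekly</changefreq>
      <priority>0.8</priority>
    </url>
  </urlset>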

Google Sitemaps 7 Benefits You Can’t Ignore

1. Alert Google to Changes and Additions to your Website Anytime You Want

2. Your Website is crawled more Efficiently and Effectively

3. Web Pages are Categorized and Prioritized exactly How You Want

4. Speed up the process of New Website and New Web Page Discovery

5. No Waiting and Guessing to see when Spiders crawl your web pages

6. Google Sitemaps is likely to set the standard for Webpage Submission and Update Notification which will extend the benefits to other Search Engines

7. The Google Sitemaps service is Free

Exactly how to create a Google Sitemaps file to upload to your website is in the continuing part of this article in Google Sitemaps.

Tony Simpson is a Web Designer and Search Engine Optimizer who brings a touch of reality to building a Web Business. It’s a No-Hype, No B.S approach from his own 5 year experience. He provides advice, product reviews and products at Web Page Add Ons to Make Automation of Your Web Site Work for You.

The continuing part of this article about creating Google Sitemaps is at Google Sitemaps


Quite often I get asked what the magic solution is for getting better Google ranking. Sometimes the questions sound fairly naive, something like this:

“I have recently established a site called wazooski.com and would like to rank much higher in Google, Yahoo and MSN. Can you tell me how to do this without spending a lot of money? Can I get into the top 10 within 1 or 2 months?”

This is one of those “rookie” questions. Experienced marketers know that predicting search engine rankings is always a hit or miss affair. SEO practitioners who “guarantee” high search engine rankings are making misleading claims, intended only to sell their services.

Imagine how many sites within any competitive area are going after those “top 10” rankings. Many of your competitors have been around for a few years, so they have an established site with lots of valuable content, steady traffic, and thousands of sites linking into them. How can you expect to just throw up a site and within a month or two walk away with a “top 10” position?

This is only possible within a narrowly defined, highly specialized niche. Say for instance you are going to hold a Wazooski family reunion next year, and want to use the internet to promote it. Chances are a few well-placed announcements scattered around 20 or 30 article sites, directories and blogs will generate enough search engine activity to get you good positioning in the search engines. Within a month or two you should get the number one spot for “Wazooski family reunion”, be within the top 10 for “Wazooski”, and possibly even get an honourable mention further down the list for “family reunion”. Using a blog or two will often speed this process up considerably.

The reason is pretty obvious: there is not a lot of competition for “Wazooski family reunion”. In fact you may be the only one competing for that term. All you really need to do is get your site or your announcements spidered and the chances are pretty good that you will get a high ranking almost immediately.

But try this with a more competitive term and you are talking a completely different game. Considering that most competitive terms have thousands of sites chasing after that elusive “top 10” ranking, you will be lucky to even get on the radar screen.

And trying to do it within a month or two is almost completely unrealistic.

Rick Hendershot publishes the Linknet Network, a group of websites and blogs offering web owners advertising and link promotion opportunities.


One thing is for sure: you don’t want to spend hours, perhaps days, months, or years on a website only to have some stupid little mistake get your site dropped from, or never even listed in, the search engines. There are a lot of rules that search engines have created to block out what they call spammers, so don’t kid yourself into assuming they don’t apply to you just because you’re not an evil spammer. As the courts might say, ignorance of the law is no excuse. So what kind of horrific mistake could sneak up on you and possibly ruin all your hard work?

When I first started making web pages, I created basic templates that I used for an entire site. Of course it’s great to have a uniform look for your site, but what if you had a screw-up on a template you used over and over again? What kind of screw-up? How about hidden text or a hidden link? You see, the old WYSIWYG editors, like the older versions of FrontPage, sometimes leave behind links within the HTML code even after you’ve deleted the link. As for hidden text, that can happen by not paying attention to what you are doing. If you make hundreds of pages, eventually you might accidentally color some of your text the same as your background. You say it’s not likely. I wouldn’t think so either, but it’s happened to me several times. If you use the mistake-ridden template over and over again, you might have a problem. It’s generally understood that search engines frown on hidden text and links. How many they will overlook is anybody’s guess. So if you haven’t checked your old web pages, it might be profitable to check out your HTML code. Look for URLs with no link text in the code. You can usually find hidden text by simply highlighting your webpage in your browser.
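The kind of leftover markup to look for is a link element with a URL but no visible link text; an illustrative example (the URL is a placeholder):

  <a href="http://www.yoursite.com/old-page.htm"></a>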

Solutions

If you find out you have the problem over hundreds or thousands of pages, it might be worth investing in Microsoft FrontPage 2003. It has a split-screen view that helps in finding HTML errors, and best of all you can do a site-wide search and replace. The software will find the code you search for, and all you have to do is leave the replacement code box blank, thus removing the offending hidden link.

The good news is that there are some other ways to avoid this problem altogether. You can learn CSS for template designs, for instance. A trick I like to use is Server Side Includes (SSI) for my links menu. To make it work you need two things: an SSI include directive with your links menu page named inside it (see the sketch below), and a server that is set up to process it. Most servers are automatically set to use SSI includes in shtml pages, but most web hosts also allow you to pick .html or .htm pages to parse. The only thing to keep in mind is that it adds an extra task for the server to perform on each and every page with the extension you choose to parse. For example, one of my web hosts has an Apache handler section in its control panel. I simply go there, put server-parsed in the Handler box and .htm in the extension box, and click add. That’s it. Now if I need to add a link to my menu I change one page, the menu.htm page, and I’m done.
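A minimal sketch of such a directive, assuming the links menu lives in a file called menu.htm alongside the page (standard Apache SSI syntax):

  <!--#include virtual="menu.htm" -->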

Webmaster of WebmasterTips.US – Free scripts, articles, & web hosting reviews


By now you have likely heard that keywords and keyword phrases are extremely important in having search engines display your website. So how do you choose them? Guess? Ask a friend? Check successful competitors’ sites? There is a better way!

First, let’s digress. There are a lot of things that affect your ranking in search engines and ultimately how many sales you make from your website. The quality (and amount) of content on your site, how many links point to your site, what keywords are used, how they are used, even the age of your site can play a factor. Large companies spend thousands on getting their sites to the top of the rankings and keeping them there. If you’re a small company trying to compete with these large budgets, you are likely to come up short.

So, how can a small business with a far more moderate budget compete? Niche marketing! People searching for information on products don’t always use the keywords you would expect them to use. And just because a competitor has a large enough budget to reach the top with a certain keyword or keyword phrase doesn’t mean you will be able to. Niche phrases are the sets of keywords that people are actually using to search in the search engines. If you are able to rank highly on these niche phrases you will see an increase in traffic. To find out what words people are using you need to use one of the word-tracking sites. These sites find out from the search engines exactly what phrases your potential customers are using.

For example, using Google, “used cars” turned up 36,900,000 results; if you don’t show up in the first 10-30 results it is very unlikely you will ever have someone click on your site. In this case every major car dealer and thousands of smaller dealers are competing for those same 10-30 spots. You have a huge amount of competition. Likely, though, your potential customers also see this huge number, and if they can’t find what they are looking for they will try to limit the search. If they are looking for a local dealer they may try adding a city. My office is near Redding, California. By entering “used cars Redding”, I still receive 205,000 results. That’s a lot closer, and maybe I could optimize my pages to reach near the top, BUT will my efforts really produce the results I want?

So far I have only guessed that my potential customers are really using this search phrase. If I check my competitor’s site I will only be able to see their guesses. That is where word tracking comes into play. There are a number of sites that offer word-tracking services. Some are free, others charge for their service.

The advantage I’ve found in using a paid word-tracking site over the free ones is that the free sites tell you how many times a word has been searched for, but provide little or no other information. The extra features are what help you to make the best decision on your keywords. These features include keyword suggestions based on your starting keyword and, most importantly, the ratio of how many people use a phrase versus how many competitors are targeting it. It takes a little time to learn how to make the best use of word-tracking software, but it is well worth the effort. If you are having someone else optimize your site, insist that they find out what phrases are being used.

Using word tracking you will be able to tell how much competition you have for each phrase and how often the phrase gets used, so that you can optimize a website page for phrases that will get clicks to your site. To illustrate the point, would you rather have 0% of the clicks resulting from 1000 searches or 20% of the clicks resulting from 100 searches? If you choose keyword phrases that will put you at the top of the search engine lists, you are far more likely to see the results you want.

That’s the benefit of choosing your keywords wisely!

Dwayne Goerges is the owner of ADAC Programming and Top Website Tips of Shasta Lake, California, specializing in data-driven websites for inventory display and client management. Auto Mall Website


In part one, we learned that keywords and keyword phrases are an important part of the success of our website. We also had a homework assignment.

Pull out your list of keywords and keyword phrases, and we will learn what to do with them.

Since your keywords have to be spread throughout your website to maximize their effectiveness, let us start with the top of the web page and work our way to the bottom.

Title Bar

One of the most important places to insert our keywords is the title bar. This is the text you see in the blue bar at the top of your browser window.

If you look at different sites on the web, you will see many websites that have only their company name and “home” in the title bar. It may be something like this.

Arabians de Argentina | HOME

Changing the title to include specific keyword phrases will increase the search engine’s ability to find your site. Try it this way.

Training and History of Arabian Horses in Argentina | Arabians de Argentina

Which do you think is more focused?
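In the page’s HTML, this text lives in the title tag inside the head section; a sketch using the improved example above:

  <title>Training and History of Arabian Horses in Argentina | Arabians de Argentina</title>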

Header Tags

While spreading keywords and keyword phrases throughout your website, a good place to insert them is in your header tags.

Header tags are the bold print section introductions you use to break up your text into smaller more digestible pieces.

  • <H1>Arabian Horses</H1>
  • <H2>History </H2>
  • <H3>Training</H3>

Graphic and Alternative Text Tags

Header tags are not the only place you can embed keywords. You can use them in alternative text or graphic labels. For graphic labels, open the graphic and then save it with a new file name. Therefore, JPG00001.jpg becomes Arabian.jpg or, better, Argentinean_Arabian_Horse.jpg. Then insert it into your website.
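The alternative text goes in the image tag’s alt attribute; a brief sketch using the file name above (the alt wording is illustrative):

  <img src="Argentinean_Arabian_Horse.jpg" alt="Argentinean Arabian horse in training">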

Be careful though; make sure you do not over do it and get in trouble with the search engine police.

Meta Tags

Meta tags are on your website for the purpose of drawing traffic to your site. Think of them as invisible text boxes full of keywords and keyword phrases.
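They sit in the head section of the page; a sketch of the two most common ones, with illustrative content based on the running example:

  <meta name="keywords" content="Arabian horses, Arabian horse training, Arabian horse history, Argentina">
  <meta name="description" content="Training and history of Arabian horses in Argentina.">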

Other Tags

You can put your keywords and keyword phrases into your navigation page or other text links.

What all this does is increase your website’s keyword density. As a search engine looks for information, your entry looks appealing because you have more keywords and keyword phrases. This causes it to present your page before other similar pages.

Of all the above, a descriptive title bar may be the most important. Experiment with the others and see how it affects your page rankings.

In our next installment, we will look at another great way to make your website attract the attention of the big search engines. Do not lose that keyword list!

Parrott Writing Services is a San Antonio, Texas company specializing in web content, ghostwriting, website optimization, online/offline ad copy and technical writing for small businesses.

http://www.rickparrott.com

Send an email to EBOOK@sasecure.net for a free copy of the eBook Computer Security for SOHO Networks
