How To: Get Better Search Engine Rankings



As we’ve previously discussed, having a website (and vanity email accounts) plays a key role in your business’s professional appearance, and ultimately in the number of leads you get and the sales you make.  But having a website is no good unless people can find it.  This is where search engines, Google being the largest and most notable, come into play.  Search engines, as you probably already know, index the internet and allow end users to search those indexes to find the content they are looking for.
The easier it is for consumers to find your business on Google, the more likely you are to see a greater return on investment and more sales delivered through your website.  This how-to walks through a few simple steps that will help you get better rankings on search engines such as Google, and in the long run give you a shot at boosting your online presence and, ultimately, your sales.

Pick a Name and Stick With It

Picking a good domain name also helps to ensure that you get good rankings within search engines.  Typically, you will want a fairly simple name (not one that’s too long) that contains either your business name or the type of business you’re in.  For example, if you ran a business called “Ray’s Sheet Metal” in Northern California, you’d want a domain such as “rayssheetmetal.com” or “norcalsheetmetal.com”.  It also doesn’t hurt to have multiple domain names directing to your website.  Your web host can typically provide you with pricing on additional domains (often referred to as “alias” domains), or you can purchase them yourself for about ten dollars per year through a registrar such as GoDaddy.  Having a relevant domain name not only tends to give you better rankings (mainly because it’s along the same lines as what a user would search for), but also makes it easier for consumers to remember your web address later.  Many search engines also tend to give better rankings to sites whose domains have been paid for multiple years in advance.  For example, a company with a domain name registered for five years is obviously expecting to be around for a while.

Meta Tags

Meta Tags are portions of your website’s HTML code that hold user-supplied information about the site in question.  These tags, located in your HTML head section (between <head> and </head>), store content such as a description of the website, as well as keywords that match the website’s purpose.  While Google does not use Meta Tags to index data, some other search engines such as Yahoo do.  Meta Tags are usually implemented by website management software such as Adobe Dreamweaver and Microsoft Expression Web; however, it is always wise to double-check your source code to ensure that this is indeed the case.  Below are two of the more common Meta Tags (description and keywords) which can easily be implemented into your site.  Note that name="" denotes the name of the Meta Tag (and thus what kind of information it stores), and content="" stores the actual information.
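For example, a site for the hypothetical “Ray’s Sheet Metal” business mentioned earlier might include tags like these (the wording in the content attributes is placeholder text — describe your own site):

```html
<head>
  <!-- A short, human-readable summary that some search engines display in results -->
  <meta name="description" content="Ray's Sheet Metal provides custom sheet metal fabrication in Northern California.">
  <!-- Comma-separated keywords describing the site's purpose -->
  <meta name="keywords" content="sheet metal, fabrication, Northern California">
</head>
```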
To read more about Meta Tags, the different options, and information regarding implementation, see the W3Schools page regarding Meta Tags.

robots.txt

The “robots.txt” file is a file that most search engine bots consult to determine what you do and do not want them to index.  This file should be placed in your root web directory (/).  The basic structure includes the definition of a User-agent and the directories that you want it to ignore.  The User-agent is more or less the bot’s “ID”, that is, how it identifies itself.  For the purposes of this tutorial, we are going to use the wildcard (*) as our User-agent, and we are only going to disallow the directories “/private” and “/staff”.  After all, this tutorial is about getting indexed, not getting ignored.
User-agent: *
Disallow: /private
Disallow: /staff
You should note that each “Disallow” entry (a directory that will be ignored by the web crawler) goes on its own line.  If you did not want any directory to be ignored, your robots.txt file would look like this:
User-agent: *
Disallow:
For more information about the robots.txt file, feel free to visit the Web Robots Page, which goes into more detail about the robots.txt file, its uses, and how to fine-tune it for individual search engines.
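If you’d like to sanity-check your rules before uploading the file, Python’s standard library ships a robots.txt parser.  The snippet below (a quick sketch, not part of any particular web host’s toolchain) feeds it the example rules from above and confirms which paths a well-behaved crawler would skip:

```python
from urllib.robotparser import RobotFileParser

# The example robots.txt rules from this tutorial, one string per line
rules = [
    "User-agent: *",
    "Disallow: /private",
    "Disallow: /staff",
]

parser = RobotFileParser()
parser.parse(rules)

# The home page is fetchable, but the disallowed directories are not
print(parser.can_fetch("*", "/"))                    # True
print(parser.can_fetch("*", "/private"))             # False
print(parser.can_fetch("*", "/staff/list.html"))     # False
```

Any path that starts with a disallowed prefix is blocked, which is why a page inside /staff is skipped along with the directory itself.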

Minimize Downtime

Many search engines tend to discredit sites that have frequent downtime or are often inaccessible.  Minimizing server downtime ensures that your website is online as much as possible, and that you stay indexed on the major search engines.  To achieve this, you may want to look into a stable web host, which will help ensure the greatest possible uptime for your site.  BestTechie highly recommends Webair.

Submit, but Don’t Over-Submit

Most search engines, Google included, have pages where you can enter your domain name and have it indexed during the engine’s next round of crawling.  Giving a search engine your domain name is helpful, as it allows them to find you more easily and ultimately index you sooner; however, submitting your domain multiple times can discredit your site with the search engine, and it may choose not to index you at all.  There are many services on the internet, both free and paid, that claim to submit your domain to search engines faster.  Truth be told, these services tend to be seen as spam by the search engines, and can ultimately hurt you in the end.  It is always best to submit your domain directly to the search engine via its own forms.  For example, Google’s “Add your URL” page can be found here.
Following these tips should help you get a good search ranking.  The most important thing to keep in mind, however, is patience: no website is going to jump to the top of Google’s rankings overnight.  By keeping a reputable, organized, and easy-to-find website, you dramatically increase your chances of earning, and maintaining, a good position.  For an additional boost, you may also consider paid advertising such as Google’s “AdWords” program.