
Friday, November 12, 2010

Tips For Google Places (Local Business Listings)

Google has recently renamed Google Maps Local Business Listings to Google Places. Some site owners saw their business listing rankings change, while others lost several important reviews for businesses listed in Google Places.

Here are my tips for Google Places:

  • Never let another person control your Google Places listing. If you hire a Google Places expert, make sure he or she has no access to your Google Places account and only recommends changes to your listing(s). The website owner should be the only person with full access to the Google Places account.
  • Report duplicate listings to Google. You don't want a duplicate listing splitting your user reviews, because user reviews are a ranking factor for your business listings.
  • Optimize the CATEGORY types of your business listings using your target keywords. Use one target keyword or keyword variation per category, and fill all the available CATEGORY slots using the "Add another category" link. A maximum of five categories is allowed as of October 2010.

Wednesday, May 6, 2009

Six Simple Tips to Improve your Search Engine Ranking

There are some very simple SEO techniques you can use to improve your search engine ranking. In the course of my normal analysis of competitors' websites, I find an amazing number that do not employ all of these techniques, yet every one of them can help you get closer to that coveted #1 position.

Using these will not by itself allow you to reach the top position for your keyword, but they definitely help, and there is no single factor that will take you to the top. A #1 position on Google is attained by a combination of many factors, such as internal and external linking strategies, the relevance of your content to the keyword, the overall look and feel of your site, and the points detailed below.

So let’s get started on these: these are the nuts and bolts of SEO, and if they are not right then you are starting off on the back foot. These are the essential SEO techniques that you must have as a minimum if you want to improve your search engine ranking, and although intended mainly for beginners, many established web pages do not contain every one of these.

I write ‘pages’ deliberately, because Google and the other main search engines list web pages, and not entire domains. That means that every single page in your website should be optimized in the same way: each should contain every single one of the SEO techniques listed below.

TITLE TAG

The title tag is contained within the ‘HEAD’ tags of your HTML, before the ‘BODY’ tags. It states the title of the page and must contain the major keywords of the page. The contents of your title tag do not appear in the text of the page: its purpose is to tell the search engine spiders what the topic of your page is and which words are important (i.e. your main page keyword). For example, the TITLE tag of a page based on this article would be “SEO Techniques - Improve your Search Engine Ranking”.
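
As a minimal sketch, the head of such a page might look like this (the wording is simply the example title above):

  <head>
    <title>SEO Techniques - Improve your Search Engine Ranking</title>
  </head>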

DESCRIPTION TAG

The description Meta tag is used by Google, and other search engines, in the search engine listings. I have tested this with them all: Google uses it as is, while Yahoo uses part of it. You should provide a description of what the web page is about, and a simple check on Google of the descriptions shown for other sites ranking for your keyword will show you how many words you can use before the description is cut off. About 20 words is fine.
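
A minimal sketch of the tag (the description text is just a placeholder you would replace with your own):

  <meta name="description" content="Six simple SEO techniques you can apply to every page to improve your search engine ranking.">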

KEYWORD TAG

Search engines rarely use the keyword Meta tag, and Google ignores it completely. However, it doesn't hurt, and it can help in a small way. Include your brand name and your own name; that way some engines might show your pages if somebody is looking for your name. The other Meta tags have no SEO value and do not help to improve your search engine ranking at all.
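
A minimal sketch, with placeholder values:

  <meta name="keywords" content="seo techniques, search engine ranking, Your Brand Name, Your Own Name">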

HEADING H TAGS

Heading tags (H1, H2, . . .) are used by Google to determine the importance of the text contained in your headings. Use an H1 tag for the main title of your page (you also use that title in the TITLE tag, but the TITLE tag isn't seen by readers, only by the spiders). Put subtitles in H2 tags. You can change the font size of the text within these tags, so headings don't have to look oversized.
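
For example, a sketch of how a page based on this article might use heading tags:

  <h1>Six Simple Tips to Improve your Search Engine Ranking</h1>
  ...
  <h2>Title Tag</h2>
  ...
  <h2>Description Tag</h2>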

TEXT FORMATTING

Text in bold, italics or underline is seen by the search engines as having greater weight, and so will be used in determining the relevance of your page. Always bold your titles, and it also helps to underline them if that doesn't make them look out of place.
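
A minimal sketch of the kind of markup this refers to:

  <p>These simple <strong>SEO techniques</strong> can help <em>improve your search engine ranking</em>.</p>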

WRITING STYLE and CONTENT

Do not write for algorithms (spiders): write for your readers. Always write for humans and you won't go wrong. If your page content reads well, and uses good vocabulary relating to the topic, then it will have a better chance of a high listing than if you stuff it full of keywords. I rarely use more than 1.5%: the keyword densities of the terms ‘SEO’, ‘SEO techniques’ and ‘search engine ranking’ (the main keywords of this article) are 1.5%, 0.87% and 0.87% respectively. Too many keywords is bad SEO, and could result in a poor listing for your page - if it is listed at all.
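
For reference, keyword density is just the number of times a phrase appears divided by the total number of words on the page, times 100. The figures below are made-up numbers purely to illustrate the arithmetic:

  density = (keyword occurrences / total words) x 100
  e.g. 9 occurrences in a 600-word page = (9 / 600) x 100 = 1.5%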

So there you are: six simple SEO techniques to improve your search engine ranking. It is surprising how many experienced webmasters fail to apply all of these: there is no excuse, and they are failing to get the nuts and bolts properly fitted and tightened on their web pages.

Apply these to every page and not only will you improve your SEO, but also your chances of a good search engine ranking. It is amazing how many web pages lack these basic SEO techniques.

Monday, May 4, 2009

How to Increase PR?

PageRank

PageRank is a numeric value that represents how important a page is on the web. Google figures that when one page links to another page, it is effectively casting a vote for the other page. The more votes that are cast for a page, the more important the page must be. Also, the importance of the page that is casting the vote determines how important the vote itself is. Google calculates a page's importance from the votes cast for it, and the importance of each vote is taken into account when the page's PageRank is calculated.
PageRank is Google's way of deciding a page's importance. It matters because it is one of the factors that determine a page's ranking in the search results. It isn't the only factor that Google uses to rank pages, but it is an important one. 
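
For reference, the original PageRank paper expresses this idea roughly as follows, where d is a damping factor (usually set around 0.85), T1...Tn are the pages linking to page A, and C(T) is the number of outbound links on page T:

  PR(A) = (1 - d) + d * ( PR(T1)/C(T1) + ... + PR(Tn)/C(Tn) )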

Increase your PR

Follow these steps and you will be on your way to building a great PageRank.

  • Look for quality sites that complement your products and services.
  • Find sites that have a PR of 4 or above.
  • Submitting your site to top directories is a great way to increase PageRank.
  • Look for "themed" directories (related to your industry) to submit your website to. For example, if you are a manufacturing company, look for manufacturing directories, B2B search engines, etc. 
  • Do you belong to any professional organizations related to your industry? If so, have your website listed within their membership directories. This is very simple and a great way to incorporate link popularity building strategies.
  • Study your competitor's websites and see who links to them.
  • Have website links from your clients, partners, vendors, subcontractors, etc.
  • DO NOT participate in FFA Pages. Free-For-All link pages are sites that allow you to submit your link to them for free. If you hate spam in your inbox, stay away from this so-called link popularity building strategy.
  • DO NOT participate in link popularity building farms. A link building farm consists of sites that link to other sites for the sole purpose of increasing their link popularity score. Unlike perfectly valid links to sites with related information, sites that participate in link popularity building farming contain links to totally unrelated sites.


If you follow these basic guidelines in building your PageRank, you will be off to a great start.

Black Hat Techniques

In an attempt to improve visibility, some webmasters use Black Hat techniques. Black Hat SEO is a different practice of search engine optimization. Some Black Hat techniques used to be acceptable, but people took advantage of a good thing; those techniques are now red flags and break the search engines' guidelines.

Some of These Black Hat Techniques Are:

Invisible Text

Keywords and keyword sentences are set in the same color as the background, so the viewer doesn't see the text but the search engine does.
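
As a minimal illustration of what this looks like in markup (something to avoid, not to copy; the keywords are placeholders), the text below would be invisible on a white page:

  <div style="background-color: #ffffff; color: #ffffff;">
    cheap widgets best widgets buy widgets online
  </div>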

Keyword Stuffing

As it's easy to get carried away, keyword stuffing a web page is one mistake people make. The rule of thumb is that pushing your chosen keywords above roughly 7% of the text on a page is pushing your luck.

Doorway Pages

Doorway pages are stuffed with keywords and key sentences. Webmasters create doorway pages strictly for search engines. When you arrive on one of these doorway pages you are immediately redirected to another website, very often one not related to your search.

Cloaking

Another Black Hat technique shows viewers one page while search engines read completely different text. In other words, when a search engine crawls the website it reads HTML text written specifically for search engines, while you, the viewer, see a different page that could be pictures or animation. The web page has, therefore, been altered to deceive the search engines.

Link Farms

Buying links to achieve a higher ranking, or owning a number of sites that all link to your main site, might make you look important, but it is again a Black Hat technique that deceives search engines and viewers alike. Links need to be relevant to your website and its services; the more relevant they are, the better for your website.

Note

Webmasters employ Black Hat SEO techniques to try to increase search engine visibility and traffic, hoping to deceive the search engines without being caught. These Black Hat practices may work for a while, but eventually the search engines will catch up and the webmaster will face unhappy consequences.

When you see a website that employs Black Hat SEO practices, you should report it to the search engine.

Site Submission

Submission means filling out a form on a search engine's site to invite them to add your site to their index. What many people don't realize is that this is unnecessary. Engines find what's on the web by following links. As long as there's a link to your site from any site that's already in the search engines, the engines will find your site. If you don't have any incoming links you're not going to rank well anyway.

Once your site is listed in an engine you're in for good (unless you get kicked out for trying to fool them, as covered below under Black Hat SEO). There's never any reason to resubmit your site once it's already in. Resubmission is a waste of time.

The overwhelming majority of search traffic comes from the top five or so search engines. Some companies will offer to submit your site to "thousands" of search engines. This is a waste of money. If your site is linked to from anywhere, you'll get in all the search engines that matter, automatically, for free.

Search engines use automated robots to follow the links around the web and grab the content from the web pages they find. The robots are called spiders, and when they follow links they're crawling the web (also called spidering). Google's spider is called Googlebot, and you'll see it listed as the user agent in your server logs. Once a search engine has gathered a site's data and analyzed it, the site is said to be indexed. To see whether your site is in the Google index, search Google for site:yourdomain.com.

New sites don't always get listed right away. In some cases it can take several months for a new site to show up in the SERPs. Even when a site gets in the index, many believe that Google puts new sites "in the sandbox" and won't let them rank well for the initial few months. Jennifer Laycock has a better explanation: new sites can rank fine if there's not much competition for that topic, but Google will assume that a new site in an established, competitive market isn't any better than the tons of sites already there, unless that site proves itself to be superior. The sandbox issue has been discussed on WebmasterWorld ad nauseam. (Searching WebmasterWorld for all pages mentioning the sandbox returns nearly 1000 hits at present.)

Black Hat SEO Penalties

Google makes it clear that it disapproves of certain SEO tactics, such as hiding keywords with invisible text, or showing one page to Google's spider and another to actual human visitors. (See what Google recommends and what they don't.) Methods that conform to what the search engines like are called White Hat SEO, and disapproved methods are called Black Hat SEO. There is a lot of controversy about whether Black Hat SEO is really "bad" in the ethical sense. White Hatters say that Black Hatters are unfairly trying to manipulate the SERPs. Black Hatters counter, "What constitutes 'fair'? Isn't any change you make to your page for SEO purposes an attempt to influence rankings? Why is one method less pure than another when we're all just trying to get our pages to rank higher? Further, if an engine is ranking a bunch of irrelevant sites above mine, what's so wrong about using any method at my disposal to get my relevant site ranked above them? Doing so doesn't benefit just me, it benefits the searchers because it gives them what they're searching for. And it also benefits the engine, because searchers will think better of the engine for giving them more relevant results than it would have otherwise."

Adding to the controversy about Black Hat SEO is the fact that Google does allow a select few sites to operate contrary to its own stated policies. Danny Sullivan complains that Google's cloaking policy is inconsistent (more on cloaking below), and that the policy wording should be updated. He further notes that the reason Google allows some cloaking is that it improves searching rather than hindering it, and that it's therefore inappropriate to treat cloaking as synonymous with "bad".

Personally I think that what's good or bad is not the methods you use, but whether you're trying to get a ranking you deserve. If your site is really one of the best about, say, the history of baseball, then it doesn't really matter to me how it gets to the top of the SERPs for a search on that phrase. What's annoying to me, and to millions of people around the world, is when a crappy, useless site tricks its way to the top of the SERPs, usually in an attempt to get more visitors there so they'll click on the ads and make money for the webmaster. So to me it's not how you get to the top, but whether you should be at the top in the first place.

Whether or not you think Black Hat SEO is bad, you should avoid it anyway, because it can get you banned from the search engines, or at least reduce your ranking. Google has been known to remove sites it felt weren't playing fair. Granted, this isn't likely, but why take that risk? Also, much Black Hat SEO involves some fairly technical work. If this article is your introduction to SEO, you likely don't have the skills to be a successful Black Hatter anyway -- at least not one who doesn't get caught.

If you want to stay on Google's good side, here are some things to avoid:

  • Invisible text. Don't put white text on a white background. In fact, don't put even very light yellow on a white background. The engines aren't stupid; just because the colors aren't exactly the same doesn't mean they can't figure out there's no contrast. Yes, there are clever ways to try to fool Google about what the background color actually is, but Google is probably aware of most of them, and I won't cover them here anyway.
  • Cloaking. Google knows what's on your site because periodically its automated robot, Googlebot, visits all the pages in its index and grabs all the page content so it can analyze it later. Cloaking means showing one page to Googlebot and a completely different page to real human visitors. Google strongly disapproves of this.
  • Keyword Stuffing. The engines want your pages to be natural. Finding every place to cram your keywords onto your pages -- or worse, including a "paragraph" of nothing but keywords, especially if they're repeated ad nauseam -- is a big no-no. Do you consider pages with lists of keywords to be high quality? Neither does Google.
  • Doorway pages. A doorway page is a page built specifically for the purpose of ranking well in the search engines and without any real content of its own, and which then links to the "real" destination page, or automatically redirects there. Doorway pages are a popular choice of some SEO firms, although Google has cracked down on this and many webmasters saw their pages disappear from the index. Some SEO firms call their doorway pages something else, in an effort to fool potential customers who know enough to know that they should avoid doorway pages. But a doorway page is still a doorway page even if you call it something else. Some engines may decide that an orphaned page is a doorway page, and if so then the page or the site might suffer a penalty.
  • Spam. Spam has a special meaning with regards to SEO: worthless pages with no content, created specifically for the purpose of ranking well in the engines. You think they have what you're looking for, but when you get there it's just a bunch of ads or listings of other sites. The webmaster is either getting paid by the advertisers, or the page is a doorway page, with the webmaster hoping that you'll click over to the page s/he really wants you to go to.

Penalties

It's important to distinguish between the two punishments from search engines since they're entirely different. Being banned means your site is removed from the index completely. This is pretty rare; most people who think they've been banned are actually still in the index. It's easy to tell whether you've been banned by Google. Assuming your site was in the index to begin with, search Google for site:yourdomain.com. If you get any results, your site hasn't been banned.

Being penalized means having your rank reduced. Unfortunately I know of no way to test for this. I do think that most of the time a webmaster who thinks they've been penalized is wrong. Rankings change, sites drop -- it's all part of the way the search engines work. But many people take it personally and feel they're being victimized.

Webmaster World has a good checklist for dealing with a potentially dropped site.

Thursday, April 9, 2009

Basic SEO Tips & Tricks

  • Manoj's first piece of advice would be: do an exact keyword check to see where your website currently stands in the SERPs.
  • Make sure your main index page is content rich with text, links and images.
    Never use a 'splash page' or an 'Enter Website' page.
  • Don't use keyword-stuffing techniques, i.e. doorway pages, invisible text, etc., and always use relevant keywords.
  • If you have to use a database-driven site, i.e. PHP or ASP, always make sure that you keep it to a minimum, and only use it if you have no other option. Keep as many pages as possible as static pages.
  • Don't submit your website until it is ready. Make sure that all pages are full of text and complete, without any 'under construction' stuff.
  • Don't ever use the free submission programs available on the internet. Whilst there are some that may be good, it's very hard to tell which ones they are.
    Submitting to hundreds or even thousands of search engines may well get your website removed or black-listed; at best it will simply get your email address onto thousands of spammers' mailing lists.
  • Also remember that some directories, like Yahoo, only accept one submission, and if you exceed this within a limited time frame the human editor will block all your further submissions. Some search engines, like AltaVista, accept 5 URL submissions per day; others only accept 2 per month, or 1 per day. The key is to know which ones allow how many, and how often. Then, just when you get the hang of it, they change the rules again.
  • 'Monthly Submission - still necessary?' - Absolutely yes. Google and Fast are the only search engines in the 'we're the biggest and best' war. Everyone else, like Lycos, Inktomi, DMOZ etc., may remove you if you don't keep your listings updated or request their crawler to visit your site on a regular basis. The number of pages that have to be searched through slows down the search engine process, so if your site is not getting a sufficient number of click-throughs to activate the crawler on its own, you need to request it.
    The only problem here is that some search engines have maximum tolerances, and if you over-submit they will remove your website!
  • 'Watch those inbounds!' - When you get another webmaster to place a link to your site, make sure they put keywords in the linking text.
    This is called 'anchor text' and it increases your ranking for those keywords. Always remember that the reverse also applies in search engine optimisation: having no keywords, or irrelevant keywords, in the linking text can lower your ranking. So if anyone is linking to you with the words 'click here', 'this is a great site', 'visit our sponsor' and so on, get them to change it or remove it before it causes problems (see the short example after this list).
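
As a minimal sketch of the difference (the URL and keywords here are only placeholders):

  <!-- weak: no keywords in the anchor text -->
  <a href="http://www.example.com/">click here</a>

  <!-- better: keyword-rich anchor text -->
  <a href="http://www.example.com/">seo techniques</a>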

Wednesday, January 21, 2009

Formatting images for SEO

One of the most commonly known Search Engine Optimization (SEO) tips going around has to do with the way you name and tag the images you use on your site. It is fairly commonly accepted by most SEO experts that Google not only looks at the text on your blog to measure its worth, but that Google's spidering bots also take a look at the code behind your image files.

Over the years SEO techniques have been developed to abuse this fact, and webmasters have ‘stuffed’ their ‘alt tags’ with all kinds of keywords. Google has found ways to combat this and now treats such strategies as spamming its bots. However, it is still legitimate to put keywords in your image tags, and I would recommend that you do (within reason).

If I'm writing on one of my technical blogs about a product and want to post a picture, I always make sure that the file name of the picture includes the name of the product (with-hyphens-between-words). The system I use to publish my blog (ecto) automatically sets the file name as the 'alt tag' (the words that come up as your picture loads) and uploads the picture to its own URL with the file name in the actual URL. This ensures that when Google's bot spiders through your site it sees your keywords an additional few times per picture.
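
As a minimal sketch (the product name, file name and path are placeholders):

  <img src="/images/acme-widget-pro-review.jpg" alt="acme widget pro review" />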

Tuesday, January 6, 2009

SEO Tips

There are no tricks here, just a bit of work and some time. So let's get started by reading the following list of SEO Techniques!

  1. Domain & File Names:
    Choose a domain name for your site that contains words from your primary keyword phrase. Your domain name should also be easy to spell and easy to remember. Your keyword phrase should, in many cases, also go in your file name. Read the thread Keywords in the URL from SEO Chat Forum.
    For example I use the file name seo-techniques.html for this page.
  2. Keyword Phrases:
    1. Use keywords that are being searched for. You can check your keyword phrases with either the Search Term Suggestion Tool or the Overture Keyword Popularity Tool to find out how often they are being searched. You can also look at Google AdWords Keyword Suggestions for suggestions for different keyword phrases.
    2. Add keyword synonyms to your content.
    3. Put the keyword phrases in the title tag: <title>keyword phrase</title>.
    4. Insert the keyword phrases in an <h1>keyword phrase</h1> tag at the beginning of your page. Keyword synonyms should be put in your h2 & h3 tags. The h1, h2, h3 tags are used for titles and subtitles in articles.
    5. Make sure you use the keyword phrases of the page you are linking to in your anchor text on the site map, e.g. SEO Techniques.
  3. Keyword Density:
    Keyword density is a very important part of search engine optimization. Keyword density is the percentage of your web page text that your keyword or keyword phrase makes up. You may want to look at your competition to see what keyword density they are using. Too high a keyword density will be considered search engine spam and can get you blacklisted.
    Your keywords should be toward the top of your page, and your keyword phrase should appear in either every paragraph or every second paragraph, depending on your paragraph length.
  4. Bad Techniques:
    Bad search engine optimization techniques can get you blacklisted from a search engine. Some techniques that are considered spam are cloaking, invisible text, tiny text, identical pages, doorway pages, refresh tags, link farms, filling comment tags with keyword phrases only, keyword phrases in the author tag, keyword density that is too high, mirror pages and mirror sites.
    While these techniques might work to give you a higher ranking for a short time, in the long run they will hurt you.
    Google has a good article, Google Information for Webmasters, that is very informative if you are considering getting an SEO company to do work on your website.
  5. Title & Meta Description Tag:
    Construction of your title tag is one of the most important things you need to do. Each page should have a different title with 2 or 3 of your keyword phrases at the beginning. When search engine results are displayed the title is the first thing people see.
    Below the title is a description, which will be taken either from your meta description tag (<meta name="description" content="Description phrase">) or from the first sentence of the page. Your description should also have 2 or 3 of your keyword phrases at the beginning, as should your first sentence. You should have a different title, description and first sentence on each page. You may also want to try shorter titles with only one keyword or keyword phrase, as this will raise your keyword relevance. You can also consider putting your domain name at the very end of the title.
  6. Meta Keywords Tag:
    The meta keywords tag is not as relevant as it used to be, and some say Google doesn't even look at it anymore, but put it in anyway and fill it with all your keywords and keyword phrases. This tag should be different for each page. (A sketch of a complete page head, including this tag, appears after this list.)
  7. Author & Robots Tags:
    The Author Tag should contain the name of the company that owns the site. This tag will help you get a #1 position for your company's name.

    Use a generic robots tag on all pages that you want indexed. This instructs the robots to crawl the page. The generic robots tag is <meta name="robots" content="index, follow">.
  8. Quality Content:
    Quality content will bring people back, and as people always want to tell others about a good thing, it will get you links from other sites. Your content should be written with your keyword phrases in mind.
  9. Quantity Content:
    The more the better. Just remember your content will need to be both quantity and quality.
  10. Changing Content:
    You can do this by hand or with a script. For example, you can have a php script that draws five paragraphs from a pool of twenty paragraphs, so that the content is different each time the php page is accessed.
    www.carsinlondon.com/used-cars-london-ontario.php shows a sample of php script that will do this.
  11. Avoid Dynamic URLs:
    Are your pages served via php, asp, or cf? Some search engines may have a problem indexing them. Create static pages whenever possible, and avoid symbols in your URLs like the "?" that you will often find in php, asp or cf pages.
    Static pages are best, but if you have a database-driven site, make sure the menu and site map links go to inventory.cfm, not inventory.cfm?vn=0.
  12. Frames:
    Many search engines can't follow frame links. Make sure you provide an alternative method for the search engines to enter and index your site. For more information read Search Engines and Frames.
  13. Site Map:
    A good menu system is really a site map. A well-constructed menu system that is on each page and contains a link to every page on the website is all you need.
  14. Site Themes:
    All of the top 3 search engines look for site themes, or a common topic, when they crawl a website. If your site is about one specific topic you will rank better than if you have more than one theme or topic on your site. By using similar keyword phrases on each page, the search engines will detect a theme, and this will be to your advantage.
  15. Site Design:
    You may think: what does site design have to do with search engine optimization? Well, if your website has a bad color scheme that is hard to read, is not organized, or is a cheesy-looking site, then all your site optimization has been a waste of time. Make your site attractive to the viewer, make things easy to find, and have your graphic header and menu bar in the same place on each page.
    These things will keep your visitors on the site and bring them back. A well-optimized site with a high search engine results position that is ugly and hard to find information on will not keep the visitors your optimization has brought to the site.
    Use the W3C Link Checker to make sure all your page links are good. If you have broken links on your site this can affect the ranking you are given.
    Put a proper doctype on each page. If you don't have a proper doctype on each page, Internet Explorer will go into quirks mode and display it differently.
    Use the W3C Markup Validation Service to verify that your pages are valid HTML or XHTML code. The W3C validation will verify that your HTML or XHTML is not broken, and it shows you any broken code that could prevent your web pages from displaying properly in all the different browsers and browser versions.
  16. Separate Content & Presentation:
    Put all your presentation code into Cascading Style Sheets (CSS). This separates the presentation from the content and makes your html files up to 50% smaller. It is reported that the search engine bots prefer this, and the more content you have compared to presentation in your file, the better you get rated. Read why tables for markup are stupid for an overview.
  17. Robots.txt File:
    While this file is not really required it should be included so that the search engine bots don't get 404 errors when they look for it. Just include the following 2 lines and drop it in the root.
    User-agent: *
    Disallow:
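
To pull items 5, 6 and 7 together, here is a minimal sketch of a page head using the tags discussed above (the title, description, keywords and company name are all placeholders):

  <head>
    <title>Keyword Phrase One - Keyword Phrase Two | yourdomain.com</title>
    <meta name="description" content="Keyword phrase one and keyword phrase two: a short description of what this page is about.">
    <meta name="keywords" content="keyword phrase one, keyword phrase two, keyword synonym">
    <meta name="author" content="Your Company Name">
    <meta name="robots" content="index, follow">
  </head>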