Search Engine Optimization

What is SEO?

Search engine optimization is a technique for getting your website to rank higher in search engines such as Google, Yahoo or Bing. A search engine optimization campaign pairs on-site optimization with off-site work: you make changes to your site itself while building a collection of natural-looking back links to increase your organic rankings. When Internet users search for your products or services, your website needs to be the first one they find. SEO helps the search engines recognize your relevance to the particular keywords that people search for online. The search engine optimization process includes researching keywords, creating content, building links and making sure your website is visible in the search engines.

SEO is not an appropriate approach for every website, and other Internet marketing strategies can be more effective, depending on the site operator's goals. A successful Internet marketing campaign may also depend upon building high-quality web pages to engage and persuade visitors, setting up analytics programs to enable site owners to measure results, and improving a site's conversion rate.

SEO may generate a sufficient return on investment. However, search engines are not paid for organic search traffic, their algorithms change, and there are no guarantees of continued referrals. Due to this lack of guarantees and certainty, a business that relies heavily on search engine traffic can suffer major losses if the search engines stop sending visitors. Search engines can change their algorithms, impacting a website's placement, possibly resulting in a serious loss of traffic. According to Google's CEO, Eric Schmidt, in 2010 Google made over 500 algorithm changes - almost 1.5 per day. It is considered wise business practice for website operators to liberate themselves from dependence on search engine traffic. SEOmoz.org has suggested that "search marketers, in a twist of irony, receive a very small share of their traffic from search engines"; instead, their main sources of traffic are links from other websites.

With nearly 14 billion online searches every month and social media sites that rival populations of large countries, there’s unlimited opportunity. But to take advantage of that opportunity, it is vital to rank high in the search engines, to leverage social media and pay per click advertising, and to make sure your website turns visitors into leads and sales. If you don’t understand how to successfully apply these strategies, then your customers will go to your competitors instead.

Here’s how we do it: We form a strategic Internet marketing plan around search engine optimization (SEO), pay per click advertising, social media marketing, conversion optimization and search-optimized Web design. Then, we tenaciously go to work so you can quickly dominate your online competition.

So, start dominating today. Request a proposal from a top SEO company and let's set you up for high search engine rankings, more customers and a big return on your investment.

"Search marketers, in a twist of irony, receive a very small share of their traffic from search engines." Instead, their main sources of traffic are links from other websites




SEO TERMS
On the Internet, abbreviations and jargon are bound to be everywhere, and SEO is no different. Before you start on this topic, I will brief you on the common SEO terms so that you will not irritate others on webmaster forums by asking silly questions.
1. SEO - Search Engine Optimization: techniques used to increase organic traffic by optimizing webpages to give search engines the impression that the webpages are highly relevant and important.
2. Organic Traffic: website traffic that comes from search engines.
3. SERP - Search Engine Results Page: the results page a search engine returns for a search term.
4. SEF - Search Engine Friendly: a website that is properly designed for search engines to crawl, read, cache and index.
5. Crawlers/Spiders: web bots of search engines that 'crawl' your webpages, creating a cache of them for their database. They will also index your website (include it in their database) if they haven't done so.
6. Onpage Optimization: optimizing by controlling ranking factors on your own webpages.
7. Offpage Optimization: optimizing by improving ranking factors outside of your website.
8. Back links: a back link is a link on another webpage that points to yours. There are two types: one way and reciprocal. One way means they link to you, but you do not link to them. Reciprocal means both parties link to each other.
9. Internal Links: links that point to another webpage of the same website.
10. Keywords: the words or phrases that your webpage is about. You optimize your webpage for these keywords so that when users search for them, they can find your website in the search engines.
11. PR - Google Page Rank: an indicator of how 'important' your webpage is, based on the number and quality of backlinks. 0 is lowest, 10 is highest.
12. Alexa Rank: shows the traffic of a website compared to others. The highest rank is 1, which means that your site has the highest traffic among all websites; 1,000,000 means the one-millionth highest traffic website.
13. Compete Rank: ranks a website based on the number of visitors per month. Not so important right now; it is a newer ranking system that receives little attention.
There are of course more terms to go, but these are the more general ones. As you read on, I will introduce more in-depth terms and explanations.


Keywords Analysis
What are Keywords?
Keywords are the words and phrases through which you want your website to be found in search engines. These are the phrases that people type into the search engines to find websites. Notice that I used 'phrases' to describe keywords. It is simply too difficult to rank for a one-word keyword.
Let’s say your website is related to the fruit mango, and you optimized your webpage for the keyword 'mango'. Bad mistake. A search on Google shows a huge 25,000,000 results for the search term 'mango'. That means competition is very high and it is almost impossible to rank well for this keyword.
You will want to use a phrase as your keyword instead. There are two reasons for this. The first is obvious: the more specific a keyword is, the lower the competition. The second reason is that one word cannot describe a webpage well enough. Mango can mean anything; it can be mango plantations, mango recipes, books on mangoes, etc. Simply put, the keyword is not targeted enough.


Selecting the Right Keywords

The ideal keyword is one that has high search volume and little competition. It has to be closely related to your website too. You do not want someone searching for a mango farm to end up on your website advertising mango ice cream.
You want targeted visitors who will stay on your website instead of leaving within a few seconds. With that in mind, ask yourself this: what search terms do you expect people to use to find your website? If you are selling a product, put yourself in the shoes of the buyer. What would you search for?
In general, a longer keyword has better targeting and less competition. A good keyword phrase is generally 2 to 5 words long. Most people think that they need to rank high for generic terms and spend way too much time on them. Generic terms do not convert as well as specific search terms because the traffic is not well targeted.


Here is a comparison of generic and specific search terms:
Generic keyword:
•             Running shoes
Specific keywords:
•             New Balance running shoes
•             Buy New Balance running shoes
•             New Balance running shoes store
•             Buy New Balance running shoes online
Keyword Density
Keyword density refers to how often your keywords appear in your content. It is measured as a percentage: the number of times the keyword appears divided by the total number of words.
There is no correct density to follow, and I advise you not to get overly fixated on this number. I am not saying that placing keywords in your article is unimportant. Just write your article naturally and work keywords in only where they make sense. If you really want a number, 2-4% density is good enough. Never go too high or you might get punished by Google for keyword stuffing.
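As a quick worked example (the numbers are illustrative): if your keyword phrase appears 9 times in a 300-word article, the density is 9 / 300 = 3%, comfortably inside the 2-4% range above.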
Keyword Analysis/Research
Keyword analysis or research is the process used to find suitable keywords - that is, keywords that have low competition and reasonable search volumes. The best free keyword research tool used to be Overture by Yahoo, but it is down at this moment of writing. You can use the two free tools Keyword Discovery and MIVA instead. They do not provide as many statistics as Overture did, though.
If you have deep pockets, I recommend you go for WordTracker; they have the most comprehensive statistics that the free tools cannot provide. For a start, you will want to target keywords that return fewer than 500,000 results in Google. The search volume per day should be at least 50, if there are such keywords in your niche. Sure, the traffic from these keywords may not be high, but you can rank #1 easily for them. If you had 10 webpages targeting such keywords (each webpage with its own keyword) and managed to rank high for them, you could get 500 visitors per day.
Once your website gains traffic, popularity and link authority (more backlinks), you can start to compete for harder keywords.
SEO On Page Optimization
On page optimization is one of the two ways you can optimize your website. It does not boost your position in the SERPs as much as off page optimization, but it is nonetheless extremely vital for getting a good position.
On page Factors
You need to have some knowledge of HTML to understand the following on page optimization tips.


URL, File Name
The URL of your webpage is looked at by search engines in trying to determine the keywords of your webpage. A webpage named internet-marketing.html is definitely more relevant to internet marketing than a webpage named article24.html.
Place keywords into your file name and you can get a slight advantage over most competitors who don't know SEO or on page optimization.


Title
The title of your webpage is the most important on page factor. Your title should not have more than 80 characters and should not contain repeated words. Search engines place high importance on the title of a webpage, and if you use the right keywords, your webpage can get a large boost in its SERP position.
Try not to use unnecessary words such as 'Welcome to my website'. The words used in your title are precious, and the more words it has, the more diluted the importance each word holds.
Only use keywords in your title; don't waste it! Let's look at the following example:


Welcome to Mike's blog
Now, how many people will search for 'welcome' and how many will search for the name 'mike'? This is an example of wasting the precious title tag. The title is telling search engines that the website is a blog from Mike and nothing else. Take a look at the next example:


Blog on Fitness and Health
This is much better. Search engines are able to tell that the blog is about fitness and health. Further improvements can still be made: words like 'to', 'and', 'a' and such are not necessary and should be removed from the title.
Fitness | Health Blog
What happens here is that words like 'and' are replaced with characters like '|'. '|' and '-' are not considered words and will not have a diluting effect.
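As a minimal sketch, here is how such a title sits in a page's HTML (the surrounding markup is illustrative):
<head>
<title>Fitness | Health Blog</title>
</head>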


Meta Tag – Description
A short description of the webpage for search engine bots to read, not exceeding 280 characters. This description will appear in the SERP if provided. If the Meta tag does not exist, the crawlers will just extract a description from the webpage itself.
Having a good Meta tag description can encourage human users to click on your website when they use the search engines. Crawlers are not intelligent enough and might include gibberish information. Take a look at what Google shows for this website:
Copyright Make Money Online 2007-2008. Powered by Larvos.com. Navigation. Home How to Make Money Online Contact us Site Map More Links...
Not very descriptive, huh? This Meta tag will not have any effect on your position in the SERP, but you might want to put a little work into it to improve the human experience.
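A minimal sketch of the tag itself (the description text is illustrative):
<meta name="description" content="A blog on fitness and health, with workout plans and healthy recipes.">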
Meta Tag - Keywords
This is basically a list of keywords for your webpage. It is not important anymore as the major search engines don't look at them. They look at your content to associate keywords with your WebPages. Placing this Meta tag will have no effect.
Heading Tags - H1, H2, H3
Heading tags are very good formatting tags to use. The text enclosed by such tags is highly regarded by search engines. For example, if you had <h1>Cars</h1> in your webpage, it would be associated with cars.
By placing relevant keywords in headings, you can rank higher for those keywords. The common practice is to place your primary keyword in h1 and secondary keywords in h2. Make sure you use only one h1 tag per webpage.
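A minimal sketch of that practice (the keywords are illustrative):
<h1>Running Shoes</h1>
<h2>New Balance Running Shoes</h2>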
Formatting Tags - bold, italic, underline
Other formatting tags that are not as highly regarded as headings are the bold, italic and underline tags. Formatted text tells search engines that these words are to be taken note of.
By placing keywords within them, you are highlighting the keywords to search engines, making your webpage more relevant to those keywords. A favorite formatting tag among webmasters seems to be the bold tag. When formatting your keywords, don't be excessive: applying the bold tag to 1-3 occurrences of a keyword is sufficient.
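For example (the sentence is illustrative):
We review the latest <b>running shoes</b> so you can train in comfort.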


List, ordered or unordered
Content in list form is regarded as important by Google (and maybe Yahoo and MSN), so place your keywords in the list and bold them for maximum effect. Just make sure not to overload the keywords and make it look like spam.
Anchor Text:-
Anchor text is very powerful in establishing the keywords of a webpage. Let's say you place a link that says 'SEO Tutorial', and it points to a webpage of yours called seo-tutorial.php. The keyword 'SEO Tutorial' will be associated with that page, making it more relevant and helping it rank higher for the keyword 'SEO Tutorial'.
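In HTML, that link would look like this:
<a href="seo-tutorial.php">SEO Tutorial</a>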

Combination:-
You can have very powerful internal anchor links if you combine them with the on page factors I mentioned above. You can bold your anchor links, place them in headings or even put them in a list. This makes the anchor link highly valued by search engines.
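A minimal sketch of such a combination (the page name is illustrative):
<ul>
<li><a href="seo-tutorial.php"><b>SEO Tutorial</b></a></li>
</ul>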

Alt Attribute:-
The 'alt' attribute is used with the image tag. When placing an image on your webpage, use the 'alt' attribute to describe the image. Place keywords in it, but do not overdo it. Keyword-stuffed alt text is a common spamming technique and can be detected by Google easily. Also, do not use the same text for every 'alt' attribute on a webpage. Vary the text, and make sure it still describes the image.
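For example (the file name and description are illustrative):
<img src="mango-plantation.jpg" alt="Rows of mango trees on a plantation">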



Keyword Density:-
Keyword density refers to the percentage of your text that is keywords. A simple calculation is the total number of keywords present divided by the total number of words. There is no agreed correct percentage for keyword density. Some claim as high as 8%, while most stay in the region of 2-6%.
I believe you will be safe at 2-3%. That means, for every 100 words, 2 to 3 of them should be your keyword. Of course you are not going to count them yourself; you can use an online keyword density tool to do the job. Never exceed 8% or you risk being penalized by Google for keyword stuffing.
Google Page Rank


What is Page Rank?
Page Rank (PR) is one of the many factors Google uses to rank your webpage. Yes, webpage: each of your webpages can have a different PR. PR is determined by calculating the number of links to the webpage. Each link is considered a vote for that webpage, so the more votes it gets, the higher its PR will be.
You can view the Google Page Rank of a webpage by installing the Google Toolbar or by using online tools such as PR Checker. Be aware that the PR you see is only updated every 3 months or so by Google, so you will not see any immediate increase or decrease until the next update.
There are two ways to increase your PR. The easiest way is to have a good internal linking structure so that PR is spread around your own webpages. If you link to an external website, you lose some of your potential PR to that website.
Confused? Let's take a look at the formula for calculating PR. Note that this is not the actual formula (only Google knows it) but a rather close estimate.
PR(A) = (1-d) + d(PR(T1)/C(T1) + ... + PR(Tn)/C(Tn))
Where...
A is your webpage
T1 to Tn are the webpages that link to webpage A
PR(x) is the Page Rank of webpage x
d is a damping factor between 0 and 1 (commonly set to 0.85)
C(x) is the number of outbound links on webpage x
As you can see, by having more links (represented by C) on a webpage, the PR of that webpage is spread more thinly across the webpages it links to. So if a webpage has a Page Rank of 5 and has 5 links to 5 different webpages, each of them will get a spill-over PR of 1 (for example). Have 10 links and each one will get a PR of 0.5.
Generally, a link you get from a PR7 webpage is worth about the same as 6 links from PR6 webpages, assuming the number of outbound links on the source webpages is the same. So a high PR link will benefit you more than a low PR link. Note that even a PR0 link is still valuable, as its real value may not be exactly 0, but 0.4 for example. Google only shows you a rounded-off Page Rank and there is no way to get the real number.
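Here is a worked pass through the formula above (the numbers are illustrative, and d is taken to be 0.85): suppose webpage A has a single backlink from a webpage T1 with PR(T1) = 4 and C(T1) = 8 outbound links. Then PR(A) = (1 - 0.85) + 0.85 × (4/8) = 0.15 + 0.425 = 0.575, before any further iterations of the calculation.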

Page Rank Distribution
Now, back to the issue of linking to other websites. If you have a link exchange with another website, you lose part of your PR to that website. Instead of distributing this PR to your own webpages, you allow others to get a slice of it.
Let's say partners.html has a Page Rank of 4 and 10 outbound links to other websites. Each of these websites will get PR 0.4. By increasing the number of internal links on partners.html, you can reduce the PR you lose. Place an additional 10 internal links (to your own webpages) on partners.html. Now the PR is split 50/50: you lose half of the PR to other websites, yet retain the other half.
An alternative is to use the nofollow attribute in your HTML code.
<a href="http://www.example.com" rel="nofollow">Outbound Link</a>
This practice is frowned upon when you are exchanging links with other websites. Though unspoken, it is accepted that you do not use nofollow when you are doing reciprocal linking. On the other hand, if you are placing an outbound link of your own free will and under no obligation, go ahead and use it.
Having a good navigation system can increase the maximum Page Rank of your website (the total PR of all its webpages). You see, Page Rank can be distributed within your own website. So if you had a PR 5 webpage with no links at all to your other webpages, you would lose that PR 5 for good. However, if you had some internal links to other webpages, that PR 5 would be spread to them. These webpages would get a higher PR, and the webpages they link to would get a higher PR... The cycle goes on and on!
Remember that it is not advisable to have too many links (internal and external) on a single webpage, or it might be flagged by search engines as a link farm. As a rule of thumb, have fewer than 80 links per webpage.
More on the maximum PR I just mentioned: by having more webpages, you actually increase the maximum Page Rank of your website. Most PR 8-10 websites have hundreds, if not thousands, of webpages. So if there are 50 webpages linking to one webpage, the PR distributed can add up to a significant number! Do bear in mind that creating many poor-content webpages will make matters worse. Only create original, content-rich webpages.
Much to the confusion of many, PR actually differs between Google's datacenters, resulting in a different PR for your website in different regions. This is normal, so do not worry about it.
Still want to know more about Page Rank? Visit Page Rank Explained for an in-depth look at it.

SEO - Off Page Optimization
Off page Factors
Off page optimization refers to improving ranking factors that lie outside of your own website. These factors usually do not come under your direct control unless you do black hat SEO.
Off page optimization depends on how other websites link to your website. It is therefore harder to accomplish than on page optimization.
Unfortunately, the major search engines place high emphasis on the off page factors of a website. These factors include the back links pointing to your website, what type of back links they are, the nature of their sources and more. A back link is considered truly useful only if it comes from a website related to your niche or keywords.

Types of Back links
If you have read the Google Page Rank chapter, you will know that having more back links increases your Page Rank. Besides the quantity of back links, there is also quality. Despite being simple anchor links, back links have many characteristics.

Anchor Text
The anchor text used in the back links is the most important factor in the quality of the back link. You will want people to link to you using keywords you are trying to rank for.
So how do you make sure the anchor text is what you want? Usually, when you build back links by doing reciprocal or directory submissions, you can choose the anchor text. In reciprocal linking, just provide the HTML code with the desired anchor text to the webmaster. For directory listings, place your keywords in your website name when filling in the submission form. This will be used as your anchor text.


IP of Back links
Search engines place more value on back links that come from many different IPs. If you have a bunch of back links coming from the same IP, the search engines will not value them as much as back links that come from unique IPs.
This guards against a well-known method of creating back links for yourself: you can easily make a new spam website and place several links on it pointing to your main website. Since all these back links come from the same website (and IP), they will not be valued much. This is a protection against manipulating search engines into believing a website has many back links when it does not.


One Way or Reciprocal
One way back links are links pointing to your website without your website linking back. They are the direct opposite of reciprocal links. As search engines like Google already know of link exchange schemes, they decided to place more value on one way back links than on reciprocal back links.
You see, Google prefers back links to build up naturally instead of being engineered. In turn, webmasters have thought of a way to bypass this: complicated 3-way linking. An example of 3-way linking: Website A links to Website B, which in turn links to Website C. C then links back to A. In this manner, no reciprocal link is involved although the intention is there. It is believed that Google can now detect 3-way linking too.

Deep Links
Deep links are back links pointing to your inner webpages (for example, computers.html) instead of your homepage (index.html). The deep link ratio (DLR) is the total number of deep links to your website divided by the total number of back links (deep links + links pointing to the homepage).
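A quick worked example (the numbers are illustrative): if a website has 100 back links in total and 30 of them point to inner webpages, its DLR is 30 / 100 = 30%.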
Some SEO experts have talked of a perfect or correct deep link ratio, but there is no proof that such a figure exists. Always keep in mind that your website needs BOTH types of back links to rank high. You should always try to gain more back links and not worry about the DLR.

Source of Back links
A back link that comes from a website related to your niche is more valuable than one that comes from an unrelated website. This is because back links tell the search engines what a website is about. If your website is about golf, you will want golf-related websites to link to you.
Also, different domains have different value to the search engines. Some domains are so highly valued that they can easily rank well for any keywords they target. An example? Wikipedia. These websites are what we call authority websites. Getting links from authority websites is almost impossible, but such links are highly valuable.
Other sources of back links that are highly valuable are those that come from .gov and .org domains.


Why Back links are Valued
Back links do not just increase PR; they serve more purposes than that. In fact, back links are the most important factor in ranking high. Search engines determine what a website is about through its back links. If a website has 50 back links with the anchor text 'free games', the website will be associated with the niche 'free games'.
That is why you should be looking for back links from websites in your own niche. Make sure the website is search engine friendly. Having a back link that cannot be read by search engines will not help in SEO. You can read more about building back links.


Black hat SEO
Black hat SEO refers to optimization techniques that are not approved of by search engines, Google in particular. Why not? Because these techniques often produce websites that offer a poor user experience. Such websites contain spam, ads and duplicated content that has been rewritten into gibberish (for example, by Markov chain generators). Nothing that interests you, I assume.
This is against what Google or any other search engine is trying to achieve. They want to give relevant results for searches, and black hat websites are simply not what a user wants. Hence, using black hat techniques can get your website banned fast if you are not good at it.
Black hat SEO involves the use of one or more methods to 'cheat' the search engines into thinking that a website is relevant to certain keywords, such as spamming it with keywords that are not related to the topic.
I shall not go too deep into black hat SEO, as it is a very involved subject and you might get your website banned within days. Instead, I will talk about the common techniques that you might use without even knowing it.


Keyword Stuffing
This is when you use as many keywords as possible in your article, spamming it to increase the keyword density, resulting in illogical reading for users. Write your content naturally, do not attempt to squeeze in unnecessary keywords and you will be safe.
A more common usage of this is to stuff keywords into the webpage title. Never ever repeat the same word in the title tag, and limit it to less than 80 characters.

Hidden text
Some webmasters hide their text from visitors. Why would anyone do that? Simple: because the text doesn't make sense and is meant for search engines. Usually, this is text that is stuffed with keywords so that the website gets associated with keywords it shouldn't be. Common methods of hiding text are placing it on a same-colored background (e.g. white text on a white background) or using CSS to hide it (display: none).
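For instance, CSS-hidden keyword text might look like this (the keywords are illustrative):
<p style="display: none;">cheap flights cheap flights cheap flights</p>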

Link Farms or Blog Farms
This is highly dangerous, as inexperienced webmasters might link to link farms and get their websites banned. Link or blog farms are websites whose webpages carry a high number of outbound links (more than 30, maybe around 100+). Their purpose is to add more back links to other websites in the hope of increasing Page Rank.
Link farms and blog farms are usually banned by Google in little time. Any website found linking to a banned website will be associated with it, resulting in a ban too.
However, the websites that a banned website links to will not be banned. This is to prevent people from sabotaging their competitors by creating a banned website that links to them.
How this affects you is that you may unknowingly link to a link farm, or even create one yourself. Some websites just dump all their reciprocal links onto a single webpage, resulting in an unintentional link farm. Spread your outbound links across a few webpages. Let's say you exchanged links with 50 websites: place 25 on one webpage and the other 25 on a second webpage.

URL Cloaking
This is used to show one version of a webpage to visitors, and another to the search engines. The version for search engines is highly optimized (keyword stuffed, for example) but doesn't make sense to a human visitor, hence the need to hide it from them.
Cloaking can also be used to change dynamic URLs to static URLs, so it is not necessarily black hat. It can be used for hiding affiliate links too. Use it for the right purpose and you should be safe.



Automated Link Networks
There are websites that participate in link programs where they gain back links from other members' websites automatically. In return, they have to link to other members' websites. This type of mindless linking looks attractive because you build back links easily and quickly.
Although these networks are not banned by Google yet, there is a high possibility that this might change in the future. They are sophisticated link networks that require a code file to be uploaded to your website, so that back links can be added through this code without you adding them manually.
I would advise you not to join such networks, as this code file is a footprint left behind by this black hat method of building back links. Search engines can easily track down the footprint on your website and ban it if they want to.
Here is what Google have to say on this:
Don't participate in link schemes designed to increase your site's ranking or Page Rank. In particular, avoid links to web spammers or "bad neighborhoods" on the web as your own ranking may be affected adversely by those links.

Over-Optimization
This is not really black hat, but your website can be penalized by Google if it is over-optimized. Yes, doing too good a job is not so good after all.
So how do you prevent over-optimizing? Never increase your back links at a fast rate. For example, gaining 100 back links in a week is way too much and will raise a red flag at Google. You should always build back links slowly.
Another sign of over-optimizing is when a new website has a lot of back links before it even gets indexed by the search engines. Before your website is indexed, avoid buying back links or submitting your articles to every single article directory you can find. A large number of back links for a brand-new website can also set off a red flag at Google.

Quick Look at Google Guidelines
•             Avoid hidden text or hidden links.
•             Don't use cloaking or sneaky redirects.
•             Don't send automated queries to Google.
•             Don't load pages with irrelevant keywords.
•             Don't create multiple pages, subdomains, or domains with substantially duplicate content.
•             Don't create pages that install viruses, Trojans, or other badware.
•             Avoid "doorway" pages created just for search engines, or other "cookie cutter" approaches such as affiliate programs with little or no original content.
•             If your site participates in an affiliate program, make sure that your site adds value. Provide unique and relevant content that gives users a reason to visit your site first.

Banned by Google
So your website has been banned by Google? This usually happens when you practice one or more black hat techniques, or when the domain name you bought had already been banned by Google before you owned it.
The best way to get your site back into the main index is to contact Google. Make sure your website is clean, with no black hat stuff, before you email them at help@google.com.
A search on the Internet for this particular topic points me to an interesting case. A website, phillyfuture.org, was banned due to misuse by a previous owner. The new owner bought the domain without realizing it had been banned. When he did, he emailed Google to explain the situation. After some weeks, Google finally unbanned the website and it was listed in the Google index again.
Here is a quote from Matt Cutts (an engineer at Google and a well-known blogger) in reply to phillyfuture.org's predicament:
Karl did the right thing with a reinclusion request on 3/18/2005 to us, because that goes to someone who can verify that the site is now good. Because of the large volume of correspondence we get to user support, it really helps us to send the request with the right terms if you suspect a domain has been spamming in the past ("reinclusion request" in the subject is enough).
After getting that reinclusion request it was approved in less than 72 hours, but I understand that we could have done better in this case, because Karl first wrote us more than 2-3 weeks ago.
Karl, I understand your frustration because you just bought a domain and wanted it to rank where it should. I'm sorry that the previous life of this domain as spam affected you.
*Karl is the owner of phillyfuture.org
In short, email Google with the subject "reinclusion request" and explain the problem to them. Be polite; tell them you have read their webmaster guidelines and have since cleaned up your act (removed any black hat SEO).
Alternatively, use the Google Webmaster Tools to request re-inclusion. Read Google - Request Reconsideration for more information. Your website should be indexed again in a few weeks' time if all goes well.


Building Back links
In Off Page Optimization, I talked about back links and the types of back links you can receive. Now I shall share with you one of the hottest topics in SEO: building back links.

Text Link Ads
How do you build back links? There are paid and free methods. Of course, by paying, you can easily get more valuable back links from high PR websites. Start by going to Text Link Ads, a network of advertisers and publishers, to buy links from publishers. Through TLA, you pay the amount they ask and in return, you get a static HTML link that is readable by search engines.
Using TLA is much faster than emailing the webmasters and asking if they want to sell links on their websites. The negotiation part is cut out and you can find websites selling links straight away.

Web Directories
Another source of bought links is web directories. Most web directories include a paid listing, and you have to pay to get listed. If you are working with a budget, you can use the free listings instead, though the review time is longer and your website is not guaranteed to be accepted.
There are so many web directories out there that I cannot list them all. You can search in Google for 'web directory' or 'directory of directories'. Submitting your website to web directories is simple and requires an email address. Usually, you need to fill in a form asking for your website URL, name, email address, description, meta keywords, URL of reciprocal link (if required), etc.
The task is simple yet time consuming if you are looking to submit to 100+ directories. You can either pay someone to submit for you (around $5-10 usually, depending on the number of submissions) or you can use a free piece of software called Directory Submitter to submit to 350 directories.

Deep Link Directories
These directories do NOT want your homepage. Unlike most directories that only list homepages, deep link directories list your inner webpages instead. Why submit to them? Because building back links to your homepage only is not going to work out in the long run. You need deep links to your inner webpages. There are only a few such specialized directories that are free; most web directories offer this additional service for a price.
Here are some that I use:
•             Deep Link
•             Deeply Linked
•             Free Deep Links

Buying Links
I personally do not like paying to get listed. Why? Because if your motive is to increase your ranking in Google, you are cheating Google by paying others to link to you. But it can also be seen as a way to get traffic from the websites linking to you. For the time being, buying links is safe, and seems likely to remain so for quite a long time.
Does that mean link exchanges and reciprocal linking are safe? Nope. All these practices are grey hat (between white and black hat) and might be penalized by search engines one day. Unless you do this on a large scale and blatantly sell or buy links on your website, you should be safe.

Reciprocal Linking
Contact webmasters of websites related to your niche and tell them you want to exchange links. Some tips to help improve your chance of being accepted:
•             Use an email subject that sounds less spammy, and mention their website name so they know it's not automated
•             Tell them a thing or two about what you like on their website, to show you are really interested
•             Use proper English and be polite
•             Place a link to their website and let them know which webpage it is on before emailing
•             Tell them how to link to your website: include the anchor text (usually your website name), the URL to use, and a short description should they want to include it
•             Try not to act smart and teach them things like the benefits of back links, Page Rank, etc. They might know better than you!


Forums Signature
If you join a forum, remember to put your website link in your signature. This is by far the most easily obtained free back link. Do not spam the forum when doing this though. Remember to use keywords in your anchor text even when it's your signature. An even better method is to post in forums related to your niche. Not only can you build back links, members of the forum might visit your site. Free traffic!
More on Building Back links

Article Submission
If your website has original articles, you can submit them to article directories. When accepted by the editors, your article will appear in their directory. By placing links to your website in it, you get free back links from them. There is also an author resource box at the bottom of your article: place back links to your homepage and to the webpage where the original article can be found.
By doing so, you can get free back links and deep links! Furthermore, when others use your article in their own website, they will have to include this resource box, meaning more back links.
Before you go and submit to every single article directory you can find, I strongly advise you to submit any article only ONCE. Yes, only once, to a single directory. Submitting the same article to multiple directories is not going to give you any advantage.
Sure, you might get more back links from the article directories, but you can get penalized for duplicated content, since search engines cannot determine which copy is the original source. A way to prevent this is to have two versions of the article: one for your own website, the other for article directories.
Here are some article directories:
•             Ezine Articles (recommended)
•             Buzzle
•             Article City
•             Article Biz
•             Find Articles

Press Release
A press release is somewhat like a news article. They are for announcing something, such as an upcoming event, the launch of a service (such as your website's service), etc. Submit the press release you write to a distribution service, and when people use your press release, they must link back to the original article (that is, to you). Some press release distribution services:
•             Click Press
•             Web Wire
•             E Media Wire
•             Business Wire
•             PR Web Direct
•             Open Press


Social Bookmarking
Social bookmarking is a type of service provided by various websites for community members to share their bookmarks with others. It is a way to tell other community members that a website is good and worth looking at. The more bookmarks a website receives, the higher it appears in the listing.
Some Social Bookmarking Websites:
•             del.icio.us
•             Digg
•             Blink List
•             Furl


For more of these, look at http://dofollowbacklinksdirectory.blogspot.in/


Blogs
Blogs have a comment area where visitors can place a link along with their comments. The anchor text used is usually the name you write the comment under. Most blogs have the nofollow attribute enabled to prevent spamming, so if you can find one that doesn't, post some good comments and, in return, get your free back link.
Link Networks
There are some link programs out there that help you build back links automatically. This method requires you to link to others in the network in order to receive links automatically, one way or reciprocal. This is risky, as search engines might ban websites doing this in the future, especially since some of these programs require you to upload a code file (PHP) to your website, leaving a footprint behind.
Do a search in Google for 'link network', 'link exchange' or 'automated link exchange' for these link programs if you are interested, since I shall not mention them here. If you come across a service called Pyramid Linking, do not use it. The system is dead and no one uses it now.

Link Exchange Service
This is slightly different from Link Networks. It does not require a coding file, and you have to search for link partners using their Search function. Find a suitable one, and using the service, send them a request to exchange links. You must link to them before requesting. They will then either accept or reject. If accepted, they must then place a link to your website.
It is safe to use these services, as it is nearly impossible to detect such a practice. Be careful who you link to though: if the website is banned, so will you be! Always check for possible link farms on these link exchange websites.


Other Methods
StumbleUpon
Download their toolbar and click the 'Thumbs Up' button to stumble your website. The more stumbles it receives, the more likely visitors are to stumble onto your site when they click the 'Stumble' button on the toolbar.
Squidoo
Create a lens on your niche, post a few articles there, and link your Lens (a webpage on Squidoo that you created) to your actual website for quality back links.


Free Web Templates
Create a free web template for people to use and submit it to websites like OSWD. When someone uses it, they have to keep a site-wide link in the footer that points back to your website. Be sure to include this link in the template, or else most people won't bother to give you credit.

Google Supplemental Index
What is Supplemental Index?
The Supplemental Index only applies to Google. When Google indexes a webpage, it ends up either in the main index or in the supplemental index. Webpages in the main index are returned for normal search queries. On the other hand, webpages in the supplemental index are deemed to be of less importance and hence will rarely be shown.
There used to be a way to know if your webpages were in the supplemental index, by simply searching "site:yourdomain.com" in Google and looking for the 'Supplemental Result' tag. When Google introduced the supplemental index, many webmasters were confused initially. After much debate, Google decided to improve the system.
Webpages in the supplemental index are almost never returned for normal search queries. Google has since announced that they have overcome the limitations of this system. They have also removed the 'Supplemental Result' tag from their SERP, so there is now no way of telling if a webpage is in the supplemental index.
Here is a quote from the official Google Blog on this matter:
The changes we make must focus on improving the search experience for our users. Since 2006, we've completely overhauled the system that crawls and indexes supplemental results. The current system provides deeper and more continuous indexing.
Additionally, we are indexing URLs with more parameters and are continuing to place fewer restrictions on the sites we crawl. As a result, Supplemental Results are fresher and more comprehensive than ever. We're also working towards showing more Supplemental Results by ensuring that every query is able to search the supplemental index, and expect to roll this out over the course of the summer.

How to Escape from Supplemental Index
The removal of the 'Supplemental Result' tag has caused webmasters problems, as they cannot tell if their webpages are in the supplemental index and need improvement. There are many reasons why a webpage can end up supplemental (though you will never find out which applies). Below are some things you can do to prevent this.


1. Get more Back links
Google has mentioned that low Page Rank (usually 0) is the main reason why most webpages end up in the supplemental index. And the only way to increase Page Rank is to get back links. Get back links from trusted domains, domains that have a Page Rank of at least 1.


2. Unique Title
Give each webpage a unique title. I have seen websites where the title is the same throughout; this is particularly evident in online stores. A unique title not only boosts your SERP position, it also shows the search engines that your webpages are all different from each other. If Google thinks that your webpages are similar to each other, it will place the less important ones into the supplemental index.


3. Unique Meta Description
The Meta Description is used to describe a website. Although it does not affect the ranking algorithms of the major search engines, Google still looks at it. Not for ranking, but for displaying this information in its SERP.
Google also looks at the Meta Description when judging whether webpages are similar, so do give a unique description to each of your webpages. A short, two-sentence description is good enough. Remember, this is for human users to read, not for robots, so write in complete sentences and do not spam keywords here.


4. Original Content
If you copied content from others, such as from article directories, your webpage becomes supplemental to the web since it is duplicated. Combined with low Page Rank and no back links, this can give Google the impression that your webpage is of no importance.
If you have quality back links pointing to duplicated webpages, you can get out of the supplemental index. However, it helps to have some original content. Writing around 100 words of original content for each webpage can do no harm.


5. Build Deep Links
Your website should not only have back links pointing to the homepage. This gives search engines the impression that your website is a hollow shell: only the homepage is valued while the inner webpages have poor content.
The end result would be that only the homepage is in the main index whereas the rest are in the supplemental index. Work on your deep links. You can submit articles to directories, use social bookmarking or even allow others to use your articles (with a link back) on their websites.
That is about all for escaping from the supplemental index. Remember, it takes time to get out of there, as you need to wait for the next time Google spiders your webpages. You shouldn't have to worry about the supplemental index in the long run if you use SEO on your website.


Google Sandbox
Google Sandbox - Does It Exist?
The Google Sandbox is merely a theory, or rather an observation, made by webmasters. It is purely conjecture, pieced together from the personal experiences of the many people facing it. The Google Sandbox theory arose around the time Google made an algorithm update in 2004. After this update, many websites worldwide lost their high positions in the SERPs.
The websites affected had one thing in common: their domain names were registered around March 2004. Domain names registered thereafter are said to suffer from the Sandbox effect. It was believed that these new websites were penalized in some way by Google.
It is not confirmed that the Google Sandbox actually exists, as Google has never acknowledged it. In the Allegra update by Google in 2005, websites that had ranked highly lost their positions, whereas those that had ranked low gained top positions.
The conclusion drawn is that the websites affected by the Sandbox in 2004 were being released by Google. Hence, these low-ranking websites regained their high positions in the SERPs (since they were well-optimized in the first place but penalized by the Sandbox).
It is conjectured that there is a time-delay factor in the Sandbox: websites in the Sandbox are released after a period of time expires.

Sandbox Effect
The Sandbox effect is described as a form of penalty for new domains. The Sandbox acts as a holding area for new websites, which are deemed to be of lower authority than older, more established websites. Websites in the Sandbox will have lower positions in the SERPs than their optimization would otherwise earn them.
Usually, a new website will be hit by the Sandbox effect a few weeks after it gets indexed. This gives rise to a scenario where a new website loses its high position rapidly. For example, a new website ranking in the top 10 for a keyword can drop out of the top 100 results overnight.
The Sandbox affects webpages targeting competitive keywords more than those targeting less competitive keywords. That means webpages with competitive keywords will drop more places in the rankings.

Google Sandbox Theory
There are many theories about the Google Sandbox. Since it is not officially documented by Google, many can only guess what the Sandbox actually does to your website.
It is known from the Allegra update that there is a time-based factor the Sandbox uses to release websites. Most have taken it to be the age of the domain. Therefore, webmasters often choose to buy expired domain names, since these domains have already aged.
Others register a new domain and leave it alone to age. Once it has aged, they start working on it. This allows them to bypass the Sandbox.
However, another variant of the time-based factor is the age of the inbound links (back links). To be released from the Sandbox, you need inbound links that have been there for some time.
The time a website stays in the Sandbox before it is released is estimated to be around 6 months. This is when most of the websites in the Sandbox are released. Some websites have been stuck in there for more than a year, while some leave in less than 3 months.

How to tell if a website is sandboxed?
While there is no definite way to determine if your website is sandboxed, nor are there any online tools to help you, there are characteristics of a sandboxed website to look for.
If your website does not appear in the SERPs for your keywords, or it ranks very low (like outside the top 500) even though you have a lot of quality back links, original content and well-optimized webpages, chances are that it is in the Google Sandbox.

Escaping from Google Sandbox
Escaping from the Sandbox is simple but time consuming. Just follow the optimization techniques I mentioned in On Page Optimization and Off Page Optimization. Even when results are not improving, do not give up. Continue doing white hat SEO and keep building quality content and inbound links.
Once your website is released from the Sandbox, its position will rise tremendously if you have done a good job optimizing it. It is tempting to use black hat SEO during this period of sandboxing, but for the long-term benefit of your website, always use white hat SEO.


Robots.txt for SEO
What is Robots.txt?
Robots.txt is a text file that tells a search engine crawler what to do with your website. When a crawler visits your website, it looks for robots.txt, which tells it which webpages should not be indexed.
The file must be uploaded to the root directory (www.example.com/robots.txt) with the .txt extension. This is not an HTML or PHP file, just a plain text file with instructions for the crawlers.
So what has robots.txt got to do with SEO? By using robots.txt, you can direct the search engines to crawl your website effectively. A well-written robots.txt filters out unimportant webpages and directories so that the crawlers can index the important webpages straight away. By keeping webpages with poor quality content out of the index, you let the search engines index more of your keyword-rich webpages.
Why disallow webpages from being crawled? Crawlers do not crawl all of a website at once. They visit a few webpages, update the index, and then leave. So obviously, you do not want them to crawl unimportant webpages during their visit. It has been shown that by limiting the webpages to be crawled, your position in the SERPs can rise significantly.
You should not overdo this though. Only filter out webpages that are not targeting any keywords, yet have to be there for visitors. These can be 'privacy policy', 'disclaimer' or 'contact us' webpages, as in the sketch below.
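A minimal sketch of such a filter (the file names are illustrative):
User-agent: *
Disallow: /privacy-policy.html
Disallow: /disclaimer.html
Disallow: /contact-us.html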

Creating robots.txt
Creating robots.txt is simple. Just open up Notepad or any text editor and save the file as 'robots.txt'. The contents of the file are called records. A record has two fields: the User-agent and at least one Disallow. Here is an example:
User-agent: Googlebot # name of crawler
Disallow: /scripts # disallow folder 'scripts'
Disallow: /contact.html # disallow webpage 'contact.html'
What this record does is disallow the crawler called 'Googlebot' from crawling any webpage in the folder 'scripts' and the webpage 'contact.html'. Webpages other than these are free to be crawled. However, there is a common mistake in the above example: the robots.txt also disallows the crawler from crawling a file such as '/scripts.html', if one exists. To prevent such accidental filtering of webpages, you should always add a slash after the directory name.
User-agent: Googlebot
Disallow: /scripts/ # add a slash at the back to limit it to the folder only
User-agent is the name of a crawler. There are many crawlers on the Internet; for a list of them, look at..
The # sign is the symbol used for comments. Any text that follows the # (on the same line) will be ignored by the crawlers. However, it is recommended that you do not use comments in robots.txt, as some crawlers might get confused. Yes, that's right: not all crawlers behave the same way. Some crawlers even ignore the robots.txt file altogether. Normally, these are bad crawlers that search for email addresses to spam.
Defining User-agents one at a time is tedious. Therefore, the * symbol is used to denote all User-agents, and / is used to denote all directories (which basically means your whole website). For example:
User-agent: *
Disallow: /

User-agent: Googlebot
Disallow:
This tells all User-agents (crawlers) not to crawl any part of your website. Take note of the next record: it tells Googlebot to crawl all parts of the website. By leaving the Disallow blank, it tells the crawler that it may crawl anything.
There is a conflict between the first and second records. The first says not to crawl the website, for all User-agents (which includes Googlebot), while the second tells Googlebot to crawl the whole website. This is how you allow only one (or a few) User-agents to crawl the website while filtering out the rest.
This little trick is very useful. It is impractical to list every User-agent one at a time in robots.txt, so we disallow all of them in a single record, and then add records for the small number of User-agents that we do want.
Why do this? Crawlers take up bandwidth, and some people do not want certain folders to be crawled so often. For example, you might want Googlebot-Image (the Google Image crawler) to crawl your images folders while filtering out other, irrelevant crawlers:
User-agent: *
Disallow: /images/
Disallow: /shop/images/
Disallow: /blog/images/

User-agent: Googlebot-Image
Disallow:

Mistakes in robots.txt
While it is fairly straightforward to use robots.txt, some people make mistakes in it. There are online tools to help you check for errors. The best one, in my opinion, is the Google Webmaster Tools. Submit and validate your website in the Webmaster Dashboard. Once validated, you can check for errors on your website (404s, broken links, etc.), and one of the checks is a robots.txt checker.
With that said, here are some common mistakes made:

White Space at the Beginning
  User-agent: *
  Disallow: /scripts/
There should not be any white space or empty lines at the beginning of the file.
Change of Order

Disallow: /scripts/
User-agent: *
The Disallow field must come after the User-agent field, not before it.

Multiple User Agents
User-agent: Googlebot
User-agent: Slurp
User-agent: Googlebot-Image
Disallow: /scripts/
While using multiple Disallow lines for one User-agent is fine, combining multiple User-agents in the same record like this can confuse some crawlers; give each User-agent its own record instead.


Mod_rewrite with .htaccess
Duplicated Websites
You might not have noticed, but your website is duplicated without you knowing it. No one is copying your website; it just so happens that a website has different versions. Confused? Take a look at the following URLs:
http://www.larvos.com/
http://larvos.com/
The URLs are different, yet they point to the same webpage. This usually doesn't mean anything to a human user. However, it makes a whole lot of difference to the search engines. To them, there are two versions of the website: one with the www and one without.
Why so? Consider the fact that each subdomain is regarded as a separate website by the search engines, and www is a universal subdomain that exists on most websites. So there you have it: two identical websites waiting to be ranked by search engines. This is bad in two ways. Your website is penalized for duplicated content, and your back links are split between the two versions.
You end up having to do SEO on two identical websites, which is not a very smart thing to do. Here comes mod_rewrite, the solution to all this mess. Mod_rewrite basically rewrites your URLs, and you can use it to remove the www, force in the www and more. Be sure that your web host allows mod_rewrite, or it won't work.

Remove WWW
# removes www from your URL
RewriteEngine On
RewriteCond %{HTTP_HOST} ^www\.(.+)$
RewriteRule ^(.*)$ http://%1/$1 [R=301,L]
Or try this...
Options +FollowSymLinks
RewriteEngine On
RewriteCond %{HTTP_HOST} .
RewriteCond %{HTTP_HOST} !^example\.com
RewriteRule (.*) http://example.com/$1 [R=301,L]

Insert www
# inserts www
RewriteEngine On
RewriteCond %{HTTP_HOST} !^www\.
RewriteRule ^(.*)$ http://www.%{HTTP_HOST}/$1 [R=301,L]
Or try this...
Options +FollowSymLinks
RewriteEngine On
RewriteCond %{HTTP_HOST} ^example.com [NC]
RewriteRule ^(.*)$ http://www.example.com/$1 [L,R=301]

Redirect Subdomain
The same goes for subdomains: they, too, come in two versions.
http://make-money.larvos.com/
http://larvos.com/make-money/
The first is the subdomain URL, whereas the second is the directory URL. The following rules redirect the directory URL to the subdomain URL.
# redirect directory to subdomain
RewriteEngine On
RewriteCond %{HTTP_HOST} !^subdomain\.example\.com
RewriteRule ^subdomain/(.*)$ http://subdomain.example.com/$1 [L,R=301]
The following does the opposite and maps a subdomain onto a directory.
# redirects subdomain to directory
RewriteEngine On
RewriteCond %{HTTP_HOST} ^subdomain\.example\.com
RewriteRule ^(.*)$ /subdomain/$1 [L]

301 Permanent Redirect
Redirects can be done in many ways, using PHP, HTML, JavaScript and more. But what if you rename a webpage that ranks well? You will want to preserve the back links, PageRank and everything else that you painstakingly optimized. That is where the 301 permanent redirect comes in.
Using a 301 redirect tells the search engines that the webpage has moved permanently, so the new webpage inherits the statistics of the old one (such as back links). This is how you do a 301 redirect in your .htaccess file:
# 301 redirect from the old page to the new one
Redirect 301 /oldwebpage.html http://www.example.com/newwebpage.html

What is a DoFollow Link
Understanding DoFollow
"DoFollow" is simply an internet slang term given to web pages or sites that are not utilizing "NoFollow." NoFollow is a hyperlink value that tells search engines not to pass on any credibility or influence to an outbound link. Originally created to help the blogging community reduce the number of inserted links into a "comment" area of a blog page, the attribute is typically standard in blog comments. It helps overwhelmed webmasters disallow spammers from gaining any kind of advantage by inserting an unwanted link on a popular page, and has become an integral part of Google-specific SEO.

How DoFollow & NoFollow Have Affected Link Building
As a result of the implementation of NoFollow, the process of building links took a steep turn. Many sites, including wikis, social bookmarking sites, corporate and private blogs, commenting plug-ins and many other venues and applets across the internet, began implementing NoFollow. This made effective link building difficult for honest people and spammers alike. It also made DoFollow links the "Holy Grail" of SEOs everywhere, who seek them out as prized additions to their off-site optimization repertoire.

NoFollow Isn't Bad
There's nothing wrong with getting NoFollow links. In fact, you'll want a fair share of them as well. While they don't pass on link juice, they do help associate your site with anchor text (the clickable keyword phrase in a link pointing to your site). They also increase the exposure of your site overall, which may eventually lead to more mentions via DoFollow links!

8 Powerful Tips to Get Free Dofollow Backlinks
1. Profile Sites
A backlink from a profile site is one of the easiest free dofollow backlinks to get. You can place anchor text containing links that point to your website in many forum profiles and other website profiles. You can also use a free backlink submitter like 247backlinks to get 800+ free dofollow backlinks from forum profiles; you just set up a campaign and forget it.
2. Article Directory
Article directories are among the best places to get free dofollow backlinks, because they usually have high PR and use dofollow links. This way, your website gets quality, permanent backlinks. My favorite article directories are EzineArticles and GoArticles.
3. Blog Comment
It is no secret that leaving comments on other people's blogs can increase your backlinks, especially when the blogs share your topic, have high PR and use dofollow links.
Commenting on other people's blogs is still the fastest and easiest way to get free dofollow backlinks; you just need to go blogwalking. It is still proving to be a powerful way to get traffic.
4. Forum
Participating in relevant forums can also generate free dofollow backlinks to your website; you just need to put the links in your forum signature.
Not all forums allow signatures, so read the rules of each forum before you create one.
5. Social Bookmark
Submitting articles to social bookmarking sites is also a useful way to get free backlinks, and it can bring in a lot of traffic quickly, depending on your article's topic. Submit each article you create on your blog or in an article directory to the social bookmarks. There are many high-PR social bookmarking sites you can use to share your website, such as Digg, Mixx, etc.
6. Press Release
A press release works much like an article directory. The difference is that the article takes the form of an announcement: about products, events, websites or anything else that has just been launched.
7. Web 2.0 Sites

Free dofollow backlinks from Web 2.0 sites are usually good quality, because the sites tend to have high PR; examples include WordPress, Blogger, Squidoo, HubPages, etc.
8. Blog Networks
In a blog network, you write articles on many other blogs. This is also a good, quick way to get free dofollow backlinks, because your articles appear on many blogs that are relevant to yours. Blog networks usually also use content spinners to avoid duplicate content. One free example is Free Traffic System.
The main problem with hunting for free dofollow backlinks is that it is tedious and time-consuming. You can instead use a backlink builder service that is already trusted and proven, so you don't have to spend your own time finding free dofollow backlinks and can use it to grow your business.

What Is a Nofollow Link
Nofollow links were introduced back in 2005, originally as a way to combat comment spam on blogs. As anyone with a blog knows, comment spam is alive and well. Since that introduction, however, the purpose of the nofollow attribute has expanded: it now also identifies paid and untrustworthy links, and it ensures that a site gains no SEO benefit from nofollow backlinks.
In short, a nofollow link tells search engine bots not to follow the link. The link passes on no SEO value; it essentially exists only for people, not for search engines.
Common places you'll see nofollow links are paid links of any kind (if a paid link doesn't carry the rel="nofollow" attribute, Google could punish your site) and comments on blogs and forums, where the site doesn't necessarily trust the links that random visitors post (the idea being to discourage link spam).
A nofollow link attribute appears in your HTML code as: <a href="www.site.com" rel="nofollow">Link</a>. Note that most blog and forum software assigns the nofollow attribute to user comment links by default.

How Google Really Treats a Nofollow Backlink

Despite the fact that the original idea was for search engines to effectively ignore nofollow links, Google's bot does, in fact, often follow those links and use them to find other pages on the internet.
However, Google has been firm and adamant that those links do not pass on PageRank, do not count as backlinks with any weight, and do not help your SEO in any way. This has been called into question from time to time, especially because nofollow links sometimes show up in Google's Webmaster Tools as backlinks. Google's Matt Cutts was pretty clear on this behavior:
Do not assume just because you see a backlink that it’s carrying weight. I’m going to say that again: Do not assume just because you see a backlink that it’s carrying weight. Sometime in the next year, someone will say “But I saw an insert-link-fad-here backlink show up in Google’s backlink tool, so it must count. Right?” And then I’ll point them back here, where I say do not assume just because you see a backlink that it’s carrying weight.
There has been no truly persuasive scientific evidence that Google is wrong in stating that nofollow backlinks carry no SEO weight. There is some anecdotal evidence that nofollow links might carry a minor amount of weight, or that they might in some circumstances, but no one has demonstrated any reproducible evidence that nofollow links are at all valuable to SEO.

Nofollow Links and Link Sculpting

At one point after Google first introduced the nofollow attribute, SEO professionals started using it internally on their sites to control how much PageRank was passed to various links, a practice known as link sculpting. Google wrote this benefit out of its algorithm years ago; you can no longer use the nofollow attribute to control how PageRank is passed through links.

What is Internet Marketing
Internet marketing (also known as online marketing) is exactly what it sounds like: a way to market your products or services on the internet. Many make the mistake of believing that all it takes to be considered an internet marketer is to have a live website. That is far from the truth. Millions of websites are added online every day; what makes a website able to compete in internet marketing goes far beyond registering a domain name and uploading content. It takes skillful strategy and an evolving knowledge of the internet marketing industry.

For your business to really thrive, there must be an internet marketing campaign in place. The internet is extremely competitive, but it is worth finding your place among the competition. The possibilities for your business, when it is successfully marketed on the internet, are limitless. With this platform, your company is given the opportunity to reach a clientele far beyond the bounds of its physical location: the internet allows your business to be accessed around the world.
Internet marketing uses two primary avenues, search engine optimization (SEO) and search engine marketing (SEM), with other avenues being developed every day.
  • With SEO, organic methods are used to improve the visibility of your website or web pages in search engine results. SEO considers how search engines work, what people search for and the search terms used when conducting a search. To adequately optimize a site, some content editing may need to take place: tailoring content can increase its relevance to the keywords being searched. Also, adding backlinks, which are incoming links to your website or web page from outside sources, is an essential factor in successfully marketing your site on the internet.
  • SEM is the paid side of internet marketing. With SEM, paid measures such as pay-per-click, contextual advertising and paid inclusion are used to promote site visibility in search engine results, so that advertisements appear in relevant searches and rankings.
  • Affiliate marketing is another branch of internet marketing: the practice in which a business rewards affiliates for each visitor brought to its site by the affiliate's marketing efforts. Affiliate marketing is often overlooked, but it is a useful strategy that, implemented correctly, can produce a rewarding return.
  • Social media marketing is an internet marketing strategy that has recently gained popularity: the process of marketing through social media outlets such as Facebook, Twitter and YouTube. Since more and more people spend countless hours on social media, this facet of internet marketing is likely to keep gaining popularity.
All these avenues are taken into consideration when determining a viable internet marketing strategy. Many rely on professionals to apply the available methods, as these professionals know exactly how the methods operate and how to tailor the strategies to suit your company's needs.

What Is Google AdWords

Google AdWords is a fast and simple way to advertise on Google and its ad partners, regardless of your budget. AdWords ads are displayed alongside search results on Google, as well as on search and content sites in the growing Google Network, which includes websites like AOL, EarthLink, HowStuffWorks and Blogger. With the volume of searches on Google and page views on the Google Network each day, your AdWords ads reach a vast audience.
When you create an AdWords ad to run on Google and its search partners, you choose keywords for which your ad will show and specify the maximum amount you're willing to pay for each click. You pay only when someone clicks on your ad.
When you create an AdWords ad to run on the Display Network, you can choose the exact content placements where you'd like your ad to show, or you can let contextual targeting match your keywords to content. You can pay for each click (CPC bidding) or for each thousand times someone sees your ad (CPM bidding).
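As a rough worked example of the two bidding methods (all numbers invented for illustration):
CPC bidding: 500 clicks x $0.40 per click = $200
CPM bidding: 100,000 impressions x $2.00 per 1,000 impressions = $200
At a 0.5% click-through rate, those 100,000 impressions would also produce 500 clicks, so the two bids cost the same per click; at a higher click-through rate, the CPM bid becomes the better deal.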
To save you even more money, the AdWords Discounter automatically trims the actual cost-per-click (CPC) you pay down to the lowest cost needed to maintain your ad's position. The AdWords Discounter keeps working no matter which display or bidding method you choose.
There's no minimum monthly charge with AdWords, just a nominal activation fee. You can choose from a variety of ad formats, including text, image and video ads, and easily track your ad performance using the reports in your online account's Control Center.

What Is PPC Marketing 
Pay per click (PPC), also called cost per click, is an Internet advertising model used to direct traffic to websites, in which advertisers pay the publisher (typically a website owner) when the ad is clicked. With search engines, advertisers typically bid on keyword phrases relevant to their target market. Content sites commonly charge a fixed price per click rather than use a bidding system. PPC "display" advertisements are shown on websites or search engine results with related content that have agreed to show ads. This approach differs from the "pay per impression" methods used in television and newspaper advertising.
In contrast to the generalized portal, which seeks to drive a high volume of traffic to one site, PPC implements the so-called affiliate model that provides purchase opportunities wherever people may be surfing. It does this by offering financial incentives (in the form of a percentage of revenue) to affiliated partner sites. The affiliates provide purchase-point click-through to the merchant. It is a pay-for-performance model: If an affiliate does not generate sales, it represents no cost to the merchant. Variations include banner exchange, pay-per-click, and revenue sharing programs.
Websites that utilize PPC ads will display an advertisement when a keyword query matches an advertiser's keyword list, or when a content site displays relevant content. Such advertisements are called sponsored links or sponsored ads, and appear adjacent to or above organic results on search engine results pages, or anywhere a web developer chooses on a content site. Among PPC providers, Google AdWords, Yahoo! Search Marketing and Microsoft adCenter are the three largest network operators, and all three operate under a bid-based model.
The PPC advertising model is open to abuse through click fraud, although Google and others have implemented automated systems to guard against abusive clicks by competitors or corrupt web developers.

Determining cost per click
There are two primary models for determining cost per click: flat-rate and bid-based. In both cases, the advertiser must consider the potential value of a click from a given source. This value is based on the type of individual the advertiser expects to receive as a visitor, and on what the advertiser can gain from that visit, usually revenue, in both the short and the long term. As with other forms of advertising, targeting is key, and factors that often play into PPC campaigns include the target's interest (often defined by a search term they entered into a search engine, or the content of the page they are browsing), intent (e.g., to purchase or not), location (for geo-targeting), and the day and time that they are browsing.
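For example, here is a back-of-the-envelope way to value a click (all numbers invented for illustration):
profit per sale: $50
conversion rate: 2% (1 sale per 50 visitors)
value of one click: $50 x 0.02 = $1.00
A bid much above $1.00 per click would, on average, lose money on direct sales, so the advertiser would need longer-term value (repeat customers, brand exposure) to justify it.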

What is Google Webmaster Tools?
Google Webmaster Tools (GWT) is a free and easy way for webmasters to view their own website the way that Google sees it. It can be used for websites of all sizes and surfaces information such as:
  • Which of your pages are included in Google’s Index
  • Any errors encountered while crawling your site
  • Search queries that list your site as a result
  • Which sites link to yours
  • And more
This guide covers how to set up GWT in seven easy steps, from creating an account to adding and deleting users and associating your Google Analytics account.
Step 1 – Google Account
To set up Google Webmaster Tools you will need to register for a Google account. If you have access to any of their other products, such as Analytics, Gmail or Google Places, then you already have one.
If you don't have an account, you can associate your work email address with a Google account to take advantage of their products.
Step 2 – Register for Google Webmaster Tools
Visit Google Webmaster Tools, register for a Google Webmaster Tools account, and sign in.
Step 3 – Add Website URL
Once signed in, you'll be able to add your website address by clicking the "Add a Site" button, typing the address and clicking "Continue". When entering your domain, you can either add the top-level domain (www.example.com) or add folders if you would like to target a specific area of the site. This is particularly useful if your site is split into regions or country codes to target different international markets (www.example.com/folder1/folder2/).
[Screenshot: Add Website URL to Google Webmaster Tools]

Once you have clicked “Continue”, you will automatically be taken to the verification screen to choose your verification method.
Step 4 – Verify Website
GWT provides you with four different methods of validating your website: uploading an HTML file to the root of your server, adding a meta tag to the homepage, linking to Google Analytics, or adding a DNS record. Choose the one that is most suitable to your needs.
Upload HTML File to Server
GWT can provide you with a blank HTML file carrying a specific code that associates it with your account. You can easily download it by clicking the link "this HTML verification file". Once you have downloaded the file, upload it directly to the root folder of your server, then click "Verify" to confirm the placement has been made. Once verification has been made, data will start to populate and you can move on to Step 6.
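For illustration (the token here is made up; GWT generates a unique one for your account), verification works roughly like this: you upload a file named something like google1a2b3c4d5e6f7a8b.html to your site root, so that it resolves at http://www.example.com/google1a2b3c4d5e6f7a8b.html. Google then fetches that URL, and if the file is found, ownership is verified.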
[Screenshot: HTML File Upload in Google Webmaster Tools]

If you need to verify at a later date then you can move on to Step 5 when you are ready to verify the account.
Add a Meta Tag
GWT provides you with a meta tag that needs to be added inside the <head> tag of your homepage, as an alternative to uploading the HTML file. The meta tag just needs to be copied from GWT (shown below) and added to the homepage of the chosen website. Once this has been implemented, click the "Verify" button to start collecting data (move to Step 6). If you are not in a position to verify the account, click "Not Now" and come back to Step 5 when you are ready.
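The tag itself looks something like this (a sketch; the content value is a placeholder for the unique string GWT generates for your account):
<!-- Google site verification meta tag; the token below is a placeholder -->
<meta name="google-site-verification" content="your-unique-token-here" />
It must sit between the <head> and </head> tags of your homepage.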
[Screenshot: Meta Tag Verification in Google Webmaster Tools]

Link to Google Analytics Account
GWT provides you with an easier way of verifying your site if you use Google Analytics through the same Google Account that GWT is set up on. If you are using two different accounts for GWT and Google Analytics, this option will not work. SEOptimise recommends using the same account for all Google products and, if it's a business account, creating a central account for the entire business. To verify via the Google Analytics account, click "Verify". Once verified, move on to Step 6.
[Screenshot: Google Analytics Verification in Google Webmaster Tools]

Add DNS Records to your Domain
This option is for those who can sign in to their domain registrar or hosting provider and add a new DNS record. When you choose your provider from the drop-down menu (as indicated below), GWT provides instructions. After following the instructions, verify that everything has been completed correctly by clicking the "Verify" button. If this works, move on to Step 6; otherwise repeat the instructions or choose another method of verification.
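For reference, the record GWT asks for is typically a TXT entry on your domain that looks roughly like this (the token is a placeholder for the value GWT gives you):
example.com.  IN  TXT  "google-site-verification=1a2b3c4d5e6f"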
Step 5 – How to Verify Once a Method has been Implemented
If you were unable to verify at the time of setting up GWT, you can verify at a later date. When you log in to GWT, you will be presented with the websites you have set up and a "Verify this site" link underneath the Manage heading. Click the "Verify this site" link.
[Screenshot: Verify Google Webmaster Tools]
You will be presented with a screen similar to the one discussed in Step 4, where you need to choose your validation method. The method you choose on this occasion needs to be the same one you originally chose when setting up GWT.
Example: if you had originally chosen to add an HTML file to the root, then you need to reselect the HTML file verification.
Step 6 – Manage Users
GWT allows the administrator of the account to give multiple users access by adding them to the Verification Details page, via the "Manage" link shown when you log in to the tool.
[Screenshot: Manage Users in Google Webmaster Tools]

Once you have clicked the "Manage" link, you will be taken to the Verification Details page, where you can add, edit or delete the users who have access to the data via their own Google accounts.
To add a new user, click the "Add an owner" button and enter their email address. This will only work for users who have a registered Google Account, so if they do not currently have one, please refer them to Step 1.
[Screenshot: Manage Users in Google Webmaster Tools]

If you would like to remove any previously added users, just click the "Unverify" link.
Step 7 – Associate Google Webmaster Tools with Google Analytics
If you use Google Analytics to track your website, GWT allows you to import some data into the Google Webmaster Tools interface to add extra value. This option will only be available if your Google Analytics account is associated with the same account that GWT is set up on. To add Google Analytics to your profile, click "Manage" and select "Google Analytics Profile".
[Screenshot: Associate Google Analytics with Google Webmaster Tools]

After you select "Google Analytics Profile", you will be redirected to the Google Analytics page within GWT, where you can select the Google Analytics account/profile that is relevant to the website. Using the tick box, select the relevant profile and click the save button; data will then begin to flow in directly from your analytics account. If you don't have Google Analytics but would like to create an account, click "Create Google Analytics Account".
[Screenshot: Associate Google Analytics with Google Webmaster Tools]

So, if you followed all the steps correctly, you should now have a fully installed Google Webmaster Tools account, with data about your website populating the dashboards soon.
I hope you found this beginner's guide to installing Google Webmaster Tools useful. If you have any comments on the guide, or on Google Webmaster Tools in general, I would love to hear from you.