Monday, July 25, 2011

How Search Engines Use Links

The search engines use links primarily to discover web pages, and to count the links as votes for those web pages. But how do they use this information once they acquire it? Let’s take a look:

Index inclusion

Search engines need to decide which pages to include in their index. Crawling the Web (following links) is one way they discover pages; the other is through the use of XML Sitemap files. In addition, the search engines do not include pages they deem to be of low value, because cluttering their index with those pages will not lead to a good experience for their users. The cumulative link value, or link juice, of a page is a factor in making that decision.

Crawl rate/frequency

Search engine spiders go out and crawl a portion of the Web every day. This is no small task, and it starts with deciding where to begin and where to go. Google has publicly indicated that it starts its crawl in PageRank order. In other words, it crawls PageRank 10 sites first, PageRank 9 sites next, and so on. Higher PageRank sites also get crawled more deeply than other sites. It is likely that other search engines start their crawl with the most important sites first as well. This would make sense, because changes on the most important sites are the ones the search engines want to discover first. In addition, if a very important site links to a new resource for the first time, the search engines tend to place a lot of trust in that link and want to factor the new link (vote) into their algorithms quickly.

Ranking
 
Links play a critical role in ranking. For example, consider two sites where the on-page content is equally relevant to a given topic. Perhaps they are the shopping sites Amazon.com and (the less popular) JoesShoppingSite.com. The search engine needs a way to decide who comes out on top: Amazon or Joe. This is where links come in. Links cast the deciding vote: if more sites, and more important sites, link to Amazon than to Joe, Amazon must be more important, so Amazon wins.

Thursday, July 21, 2011

Keyword Targeting

The search engines face a tough task: based on a few words in a query, sometimes only one, they must return a list of relevant results, order them by measures of importance, and hope that the searcher finds what they are seeking. As website creators and web content publishers, you can make this process massively simpler for the search engines, and in turn benefit from the enormous traffic they send, by employing the same terms users search for in prominent positions on your pages.

Keyword targeting has long been a critical part of search engine optimization, and although other metrics (such as links) have a great deal of value in the search rankings, keyword usage is still at the core of targeting search traffic.

The first step in the keyword targeting process is uncovering popular terms and phrases that searchers regularly use to find the content, products, or services your site offers. There’s an art and science to this process, but it consistently begins with a list of keywords to target. Once you have that list, you’ll need to include these keywords in your pages. In the early days of SEO, the process involved stuffing keywords repetitively into every HTML tag possible. Now, keyword relevance is much more aligned with the usability of a page from a human perspective.

Since links and other factors make up a significant portion of the search engines’ algorithms, they no longer rank pages with 61 instances of “free credit report” above pages that contain only 60. In fact, keyword stuffing, as it is known in the SEO world, can actually get your pages devalued via search engine penalties. The engines don’t like to be manipulated, and they recognize keyword stuffing as a disingenuous tactic.

Keyword usage includes creating titles, headlines, and content designed to appeal to searchers in the results (and entice clicks), as well as building relevance for search engines to improve your rankings. Building a search-friendly site requires that the keywords searchers use to find content are prominently employed.
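To make this concrete, here is a minimal sketch of how a page targeting the keyphrase “free credit report” might employ it in prominent positions (the title, description, and copy here are invented purely for illustration):

<html>
<head>
<title>Free Credit Report - How to Request Yours</title>
<meta name="description" content="A step-by-step guide to requesting your free credit report and understanding the results.">
</head>
<body>
<h1>How to Request a Free Credit Report</h1>
<p>Your free credit report is available once per year...</p>
</body>
</html>

The keyphrase appears in the title, the meta description, and the main headline, while the body copy still reads naturally.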

XML Sitemap Guide

Google, Yahoo!, and Microsoft all support a protocol known as XML Sitemaps. Google first announced it in 2005, and then Yahoo! and Microsoft agreed to support the protocol in 2006. Using the Sitemaps protocol you can supply the search engines with a list of all the URLs you would like them to crawl and index.

Adding a URL to a Sitemap file does not guarantee that a URL will be crawled or indexed. However, it can result in pages that are not otherwise discovered or indexed by the search engine getting crawled and indexed. In addition, Sitemaps appear to help pages that have been relegated to Google’s supplemental index make their way into the main index.

This program is a complement to, not a replacement for, the search engines’ normal, link-based
crawl. The benefits of Sitemaps include the following:
  • For the pages the search engines already know about through their regular spidering, they use the metadata you supply, such as the last date the content was modified (lastmod date) and the frequency at which the page is changed (changefreq), to improve how they crawl your site.
  • For the pages they don’t know about, they use the additional URLs you supply to increase their crawl coverage.
  • For URLs that may have duplicates, the engines can use the XML Sitemaps data to help choose a canonical version.
  • Verification/registration of XML Sitemaps may indicate positive trust/authority signals.
  • The crawling/inclusion benefits of Sitemaps may have second-order positive effects, such as improved rankings or greater internal link popularity.

The Google engineer who in online forums goes by GoogleGuy (a.k.a. Matt Cutts, the head of Google’s webspam team) has explained Google Sitemaps in the following way: Imagine if you have pages A, B, and C on your site. We find pages A and B through our normal web crawl of your links. Then you build a Sitemap and list the pages B and C. Now there’s a chance (but not a promise) that we’ll crawl page C. We won’t drop page A just because you didn’t list it in your Sitemap. And just because you listed a page that we didn’t know about doesn’t guarantee that we’ll crawl it. But if for some reason we didn’t see any links to C, or
maybe we knew about page C but the URL was rejected for having too many parameters or some other reason, now there’s a chance that we’ll crawl that page C.

Sitemaps use a simple XML format that you can learn about at http://www.sitemaps.org. XML Sitemaps are a useful and in some cases essential tool for your website. In particular, if you have reason to believe that the site is not fully indexed, an XML Sitemap can help you increase the number of indexed pages. As sites grow in size, the value of XML Sitemap files tends to increase dramatically, as additional traffic flows to the newly included URLs.
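To give you an idea of the format, here is a minimal Sitemap file with a single URL, including the lastmod and changefreq metadata mentioned earlier, plus the optional priority field (the URL and values are hypothetical):

<?xml version="1.0" encoding="UTF-8"?>
<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">
  <url>
    <loc>http://www.example.com/some-page.html</loc>
    <lastmod>2011-07-20</lastmod>
    <changefreq>weekly</changefreq>
    <priority>0.8</priority>
  </url>
</urlset>

You simply repeat the <url> block for each page you want the engines to know about.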

Wednesday, July 13, 2011

Now What The Hell Is Keyword Research?!

OK, now you know a little about SEO, you've set up your domain/blog, and you've started adding content. But now you're wondering: how can I choose the best keywords for my website/blog that are relevant to my content and will move me up in Google search? Well, this is not as hard as you might think. There are a lot of keyword research tools available on the internet, and the great news is that most of them are free. Let's take a brief overview of what keyword research is and how you're going to choose your keywords out of an almost unlimited choice of words.

Keyword research is the core of any SEO campaign, since the keywords you pick during the research will be included in your website copy, your PPC campaigns, and any other website promotion campaigns. In a sense, keyword research is similar to customer research, because you are studying what words your potential clients use when searching for your service or product.

When starting your keyword research, you'll need to pick the main keywords to base your research on. These will be the keywords that you, your customers, and your competitors chiefly use when speaking of your service or product. Valid variations would be synonyms, each different in one aspect or another; view them as the directions your website's SEO can go. Naturally, you'll need to use quite a variety of sources for your keywords. The most common tools used for keyword research are:

  • Google Keyword Tool
  • Wordtracker
  • Sitepoint
  • Keyword Discovery
And many others, so you can choose the one that suits you best. Remember that a tool like this will give you an idea of how much a keyword is being searched for and what the alternatives are. You can take the basic ideas from there and build keywords that best describe your blog/website. It is also good practice to use synonyms.

Tuesday, July 5, 2011

Seo Basics Overview

Google is a name that almost everyone in the world knows, and they should, as it is the world's most widely used search engine. Many people throw up a website for different reasons and believe that is the end of it. But the internet contains so much information that it would take a person several lifetimes to read even a fraction of it, so it needs the best organization possible. This task has been taken on by search engines, the largest of which are Google, Yahoo, and Bing. Getting your site in front of viewers all comes down to how well it is optimized. SEO, or search engine optimization, is ideal for people who are trying to get visitors to their site and don't want to pay outrageous prices to get them. It's the sad truth, but when it comes to SEO advice you need to be very careful who you listen to, because there is so much damaging information out there. This article will discuss some of the common mistakes that occur when you're trying to optimize your site.

SEO takes time to work, but once it does it is almost self-sustaining. Patience is a must for SEO. Of course, once you've done the initial work, the payoff is lots of traffic. Everybody wants to be on the first page for a commercial keyword that is in demand, but if you aren't ready to build links, create unique content, and put in the time to get everything in place, then you can simply forget about getting ranked for any competitive keyword. The speed at which your site loads is also a factor, not only in your popularity with visitors, but also in how the search engines view your site. If your pages are overloaded with messy, convoluted code, they will load more slowly in a visitor's browser, which is something Google doesn't like. Where possible, try to avoid internal styling on your pages and use an external CSS file instead. This small change can speed up your loading times and make it much easier to update your site later. Remember that search engines aim to offer their own users the best possible results to suit their needs, so your aim is to find ways to appeal to what the search engines want. In essence, this means your site needs to match what your visitors are looking for and also needs to fit within what the search engines are looking for.
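For example, instead of embedding style rules in every page, a single line in the page's <head> can pull them in from an external file (the filename here is just an example):

<link rel="stylesheet" type="text/css" href="style.css" />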

Getting incoming links too quickly is also a mistake, as it will send a red flag to the search engines and they'll penalize you for link spamming. A major myth going around right now is that your new site will catapult to the number one spot on the first page if your link is mass-distributed across the web. But it doesn't work that way, as search engines like natural link building. Are you willing to get your brand new site sandboxed and de-indexed because you didn't have the patience to build your links naturally over time? The big secret is to build links slow and steady.

See, SEO isn't as complicated as it seems, just stick to the basics and you will be doing better than most. The more you practice optimizing your site, the better results you will get. So, now you have it; quality backlinks, internally linking related pages, on-site optimization, and patience.

Monday, July 4, 2011

Social Bookmarking

Social bookmarking web sites offer users convenient remote storage of their bookmarks for access from any location. Examples of these sites include del.icio.us, Digg, Reddit, and so on. These sites usually allow bookmarks to be private, but many users choose to leave them public. When a particular web page is publicly bookmarked by many users, that is a major positive factor in the ranking algorithm of the search function on a social bookmarking site, and ranking well in these searches presents another great source of organic traffic. Furthermore, if a web page is bookmarked by a large number of people, it may earn a front page placement on such a site, which usually results in a landslide of traffic.

Many blogs present links to streamline the process of bookmarking a page. As is typical when you facilitate any action you want from a web site user, this may increase the number of bookmarks a page on your web site receives. It is a very common SEO practice, and a very effective one.

These little icons make it easy for people browsing a web site to do some free marketing for you, in case they like the content at that particular URL and want to bookmark it. There are many free services on the web, like AddThis (a personal favorite), which let you add the icons of almost all the social bookmarking websites on the internet.
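As a simple illustration, a hand-rolled bookmark link just passes the current page's URL and title to the bookmarking service. The submission URL below is hypothetical; every service has its own format, so check each site's documentation:

<a href="http://bookmarking-site.example/submit?url=http://www.yoursite.com/post.html&amp;title=Your+Post+Title">Bookmark this page</a>

Services like AddThis simply generate this kind of link for dozens of sites at once, so you don't have to maintain them by hand.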

Using the Robots Meta Tag

Using the robots meta tag you can exclude any HTML-based content from a web site on a page-by-page basis. It is frequently the easier method to use when eliminating duplicate content from a preexisting site for which the source code is available, or when a site contains many complex dynamic URLs.

To exclude a page with meta-exclusion, simply place the following code in the <head> section of the HTML document you want to exclude:
<meta name="robots" content="noindex, nofollow" />
This indicates that the page should not be indexed (noindex) and none of the links on the page should
be followed (nofollow). It is relatively easy to apply some simple programming logic to decide whether
or not to include such a meta tag on the pages of your site. It will always be applicable, so long as you
have access to the source code of the application, whereas robots.txt exclusion may be difficult or
even impossible to apply in certain cases.
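For comparison, robots.txt exclusion works at the site level by URL path rather than page by page. A minimal robots.txt that blocks all spiders from crawling one directory looks like this (the directory name is hypothetical):

User-agent: *
Disallow: /duplicate-content/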

To exclude a specific spider, change “robots” to the name of the spider — for example googlebot,
msnbot, or slurp. To exclude multiple spiders, you can use multiple meta tags. For example, to
exclude googlebot and msnbot:

<meta name="googlebot" content="noindex, nofollow" />
<meta name="msnbot" content="noindex, nofollow" />

The only downside is that the page must be fetched in order to determine that it should not be indexed in the first place. This is likely to slow down indexing.

Thursday, June 30, 2011

Google Page Rank

PageRank is an algorithm, patented by Stanford University and used by Google, that measures a particular page’s importance relative to other pages included in the search engine’s index. It was invented in the late 1990s by Larry Page and Sergey Brin. PageRank implements the concept of link equity as a ranking factor: it treats a link to a page as a vote, indicating importance.

PageRank approximates the likelihood that a user, randomly clicking links throughout the Internet, will arrive at that particular page. A page that is arrived at more often is likely more important, and so has a higher PageRank. Each page linking to another page increases the PageRank of the page it links to, and links from pages that themselves have a higher PageRank increase it more. You can read more details about the PageRank algorithm at http://en.wikipedia.org/wiki/PageRank.
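For the curious, the simplified formula from Page and Brin's original paper captures the idea (Google's production algorithm has long since evolved beyond it):

PR(A) = (1 - d) + d * ( PR(T1)/C(T1) + ... + PR(Tn)/C(Tn) )

Here T1 through Tn are the pages linking to page A, C(T) is the number of outbound links on page T, and d is a damping factor (originally 0.85) modeling the chance that the random surfer keeps clicking links rather than jumping to a random new page.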
To view a site’s PageRank, install the Google toolbar (http://toolbar.google.com/) and enable the PageRank feature, or install the SearchStatus plugin for Firefox (http://www.quirk.biz/searchstatus/). One thing to note, however, is that the PageRank indicated by Google is a cached value, and is usually out of date.

PageRank values are published only a few times per year, and even then they often reflect outdated information, so the visible PageRank is not a terribly accurate metric. Google itself is likely using a more current value internally for rankings.

PageRank is just one factor in the collective algorithm Google uses when building search results pages
(SERPs). It is still possible that a page with a lower PageRank ranks above one with a higher PageRank
for a particular query. PageRank is also relevance-agnostic, in that it measures overall popularity using links, not the subject matter surrounding them. Google also evaluates the relevance of links when calculating search rankings, so PageRank should not be the sole focus of a search engine marketer.
Building relevant links will naturally contribute to a higher PageRank. Furthermore, building too many
irrelevant links solely for the purpose of increasing PageRank may actually hurt the ranking of a site,
because Google attempts to detect and devalue irrelevant links that are presumably used to manipulate it.
PageRank is also widely regarded by users as a trust-building factor, because users tend to perceive sites with a high PageRank as more reputable or authoritative. Indeed, this is what PageRank is designed to
indicate. This perception is encouraged by the fact that Google penalizes spam or irrelevant sites (or
individual pages) by reducing or zeroing their PageRank.

Seo Definition

The search engine industry frequently innovates as do consumer behaviors for discovery and sharing. Those changes require search marketers to take a fresh look at what search engine optimization (SEO) is and why companies should or should not engage in its practice.

Defining search engine optimization is often focused on the mechanics:
“SEO considers how search engines work and what people search for. Optimizing a website primarily involves editing its content and HTML and associated coding to both increase its relevance to specific keywords and to remove barriers to the indexing activities of search engines.” (Wikipedia).

Even Google offers a definition of what an SEO is along with guidelines:
“Many SEOs and other agencies and consultants provide useful services for website owners, including: review of your site content or structure; technical advice on website development (for example, hosting, redirects, error pages, use of JavaScript); content development; management of online business development campaigns; keyword research; SEO training; and expertise in specific markets and geographies.”

Since the key components of how a search engine works are crawling, indexing, and sorting, those are the functional focus areas of most SEO efforts. Most experienced internet marketing professionals will admit that this is a limited view of the value SEO brings.

What about link building and promotion of content? What about search for content that isn’t product or service oriented? What about search within closed networks? What about real-time search? What about niche search: vertical, local, mobile, multi-lingual? What about social search?

Code, site architecture, and server issues that affect how search engine bots interact with and index a web site’s content are certainly important, as are keyword research and the subsequent use of those keywords in tags, on-page copy, markup, and anchor text links between pages. These areas all fall under the realm of “on-page SEO”. The yang to that yin is “off-page SEO”, which is basically link building. For more of this kind of practical SEO advice, read “Basics of Search Engine Optimization”.

Defining SEO can be as simple as, “Optimizing digital content for better performance in search.”  That’s a broad definition and the implications and value from improved search performance can range from increased sales to lowered customer service costs. It really depends on what customers are searching for, whether available company content is optimized and if analytics are in place to benchmark and measure performance.

Consumers are prompted to use search in a variety of scenarios ranging from research to finding products for purchase.  In most cases, SEO consultants (like TopRank :) ) are hired by corporate marketing departments to improve the search visibility of products and services being marketed to customers. Improved search engine placement typically results in an increase in traffic (qualified by the search terms used) and an increase in sales.

Marketing departments fund most Search Engine Optimization efforts whether they are executed in-house, by outside consultants or as is increasingly common, a combination of the two.  Companies that take a holistic view of search and approach the notion of “If it can be searched, it can be optimized” strategically, find themselves with an opportunity to not only improve marketing performance and efficiency via SEO, but do the same for other content areas as well.

Most companies only engage in, implement, and measure SEO efforts to increase revenue as part of marketing. Those same companies often publish many other types of content ON and OFF the site, including content that has an ROI but isn’t part of marketing. Customers are searching for this content, and if it’s not well optimized, they cannot find it. That spells opportunity.

Companies that implement keyword research and SEO efforts for their web sites holistically, typically realize a very desirable combination of benefits that include an increase in sales as well as a reduction of costs in certain areas. If not a reduction in costs, at least an improvement in efficiency and performance.

For example: most on-site search functions for company web sites rate a C to D minus. Google, on the other hand, can do quite well with such content. After the sale, customers often search for company contact info, product support, and customer-service-related info: “How to fix this or that” or “Where to re-order this or that part.” Increasing numbers of companies have invested in the aggregation and publishing of this kind of information, but the usability inherent to those systems is often flawed.

Making FAQ and Knowledgebase information available to external crawlers like Googlebot, SLURP and MSNBot as well as the systematic keyword optimization of such content makes it more easily available to customers that are looking. Companies invest in aggregating product and customer service / support information to reduce overall customer service costs (call centers are expensive) and hopefully provide better service to customers in a way that is more convenient for them (i.e. 24/7 online).  Making support content perform better in search can facilitate those performance and cost efficiency goals.

There are other examples I could share involving SEO for job listings, news content, and others, but I think you get the idea. Assess all content being published online (from text to digital assets) to determine the audience and whether improved search performance can help reach business goals. Those goals might be increased sales, branding/reputation, recruiting better employees while lowering recruiter costs, or increasing media coverage while lowering PR agency costs. The thread that binds this kind of SEO effort is that customers are searching for content being published on and off company web sites (inside social networks, for example), but it’s often difficult to find. Making content easier for customers to find can help multiple departments reach business goals.

In the end, whether search optimization efforts are for marketing, public relations, talent acquisition, customer service or consumer research, my preferred definition of SEO is a broad one and commercially focused: “Optimizing digital content for search engines and consumers to improve performance and reach business goals.”

Tuesday, June 28, 2011

On-Site Optimization for Search Engines

On-site search engine optimization (SEO) refers to the actions you take on your website to make your site appeal to the search engines. These include things like the words you choose for your navigation links, your meta tags, and the words within your content.

Meta tags are snippets of code you can include on your website to give the search engines more information on what your site is about. Visitors to your webpage won’t see this information, but the search engine programs that scour your site will appreciate the tip-off.

At the same time, you don’t want to frustrate the search engine robots by bogging down your site with loads of pictures and flash animations. Search engines rely on the words and phrases on your site to get a feel for what it’s about, so that they know when to pull your site up in response to user search terms. If the search engine robots encounter mostly images and movie files, they won’t be able to figure out what your site is about, and as a result it will likely rank lower in the search results.

It’s important to get your on-site optimization right because, unlike off-site factors, you control it. The following are a couple of quick but important on-site SEO principles to implement.

First, if you’re aiming for high search engine rankings, you’ll want to focus each page on a particular keyword or keyphrase. As mentioned earlier, an ideal keyphrase is not too competitive, but still searched for frequently in the search engines. Once you’ve identified your target keyphrase, you’ll want to write content that includes it about 2-5% of the time; for example, a keyphrase used 10 times in a 400-word article gives a density of 2.5%. Don’t go overboard; your content should still sound natural. In the short run, you can trick the search engines by stuffing a page full of your target keyphrase, repeated over and over again, but this is a poor long-term strategy.

Second, the domain name you choose is important as well. The domain name should reflect the main keyphrase for your site and, whenever possible, be a .com domain. Although experts disagree about whether a continuous domain is better than a dashed domain – “www.makemoneyonline.com” versus “www.make-money-online.com” – it is becoming increasingly clear that .com domain names are given more weight than some of the other extensions, like .info or .us.

What Is Keyword Research

As you explore the world of internet marketing, you’ll probably come across the phrases “keyword” and “keyword research” more than once. But what are keywords, and why should you care about them?

Think about what happens when you go to a search engine like Google or Yahoo. If you’re looking for information on cruise deals for your next vacation, you might enter the word “cruises” into the search bar to find websites related to vacation deals. That word you entered is what’s referred to as a keyword. If you entered a string of words, say “best cruise deals”, you’ve entered a keyphrase into the search engine.

The keyword or keyphrase tells the search engine what kind of results to bring back for you. Search engines scour the internet, recording the information they find on various sites around the web. When you enter the keyphrase “best cruise deals”, the search engines comb their databanks to find sites related to “best cruise deals”.

When you’re building a website, choosing the keywords and phrases to focus on is extremely important. You want to target the keywords people are actively searching for. But as you can imagine, some keywords are more competitive than others. An example of a general keyphrase is “lose weight”. It’s much more competitive than the keyphrase “lose weight with the lemon juice diet”.

If you build a new site around the phrase “lose weight”, you’ll be competing with well established websites and likely find your site buried on page 89 of Google’s search results for the term.

Where your site falls on the search results pages is measured as its ranking. For the general keyphrase “lose weight”, you’ll likely be low in the rankings, so your site won’t receive much traffic or exposure.

On the other hand, if you focus on less competitive keywords, you can expect to rank much higher in the search engine results – you may even be able to land the much-coveted first page ranking.

This is why good keyword research is vital to the success of your online business. Focus on less competitive keywords and you can expect to rank much higher in the search engine results. Focus on lots of less competitive keywords and you’ll get lots of free traffic.

Ideally, you want to find keyphrases that have high search volumes and low competition. The two go hand-in-hand – the phrase “make your own digital camera using a potato” likely has low competition, but it won’t matter if no one ever searches for it. The experts argue back and forth about exactly what type of numbers you should be looking for – “search engine results of less than 500,000” or “at least 1,000 daily searches” – but it will be up to you to determine which parameters work best for your internet business.

The other thing to keep in mind is that you optimize the individual pages of your site, not just the home page. So you don’t just pick one keyword or keyphrase. You need to select a number of them and set up an actual strategy for ranking high for all of them.

To conduct your keyword research, you can take advantage of a number of free or paid keyword research tools. One of the best free research tools is available through the Google Adwords program, although you can find other good options at SEOBook.com or by searching online. If you’re looking for programs that offer added functionality, look into WordTracker’s subscription service. Any one of these programs will help you identify keywords and keyphrases with high traffic and low competition to build your web pages around.

Tuesday, June 14, 2011

Basics of Meta Tags



Alright, let’s get started with meta tags, the most basic SEO technique. Meta tags are HTML tags used to describe the content of a page. Every page is related to something; for example, a website about mobile phones has content related to mobiles. That could be mobile phone pictures, articles, prices, or maybe some applications for cell phones. So to let the search engine know what your page is actually about, we use meta tags. The keywords meta tag looks like this:


<meta name="keywords" content="(Keywords here)">


Most webpage templates include this element by default. We just need to fill in the content attribute with the required keywords. What this does is tell the search engine spider about the key phrases that have been used on a specific page. When the spider reads that tag, it indexes the page against those keywords.


The keywords meta tag is not used by Google anymore, because many webmasters exploited this feature and stuffed it with irrelevant keywords. But it is still used by other major search engines like Yahoo!, Bing, Ask, etc., so it still has its importance and deserves attention. There is one more meta tag used by search engines:


<meta name="description" content="(description here)">


As is clear from the name, this meta tag includes a brief summary of the content of the page. Whenever you search for something on a search engine, it displays a brief text about each website just below its link. The meta description should not be more than 160 characters, and it is good practice to use a few keywords in your description as well. But make sure not to stuff it with keywords: it should be written for humans, not just for the search engine spider. Whenever someone searches for something, they take a look at the description text to see what the page is about, so the more relevant the description is, the more chances there are that your link will get clicked. So the first thing you want to do as the SEO of your website is to optimize its meta tags.
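Putting the two tags together, here is a minimal sketch of a page’s <head> for our hypothetical mobile phone site (the values are just illustrative):

<head>
<title>Mobile Phone Prices, Pictures and Reviews</title>
<meta name="keywords" content="mobile phones, cell phone prices, mobile applications">
<meta name="description" content="Compare mobile phone prices, browse pictures, read articles and download free applications for your cell phone.">
</head>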
 


Monday, June 13, 2011

SEO Genius

Hello everyone!! This is my first post on this blog. I was looking for a place where I can share my knowledge as a search engine optimizer, and I got here; I think Blogger is a good way to start. You must have seen many blogs or websites related to SEO and link building, but here we'll make things simpler. This blog is basically for newbies who have heard about SEO and are wondering what it is all about. You can learn the basics of search engine optimization here. As a webmaster, website traffic is something everyone is looking for, and no matter what your site is about, without visitors it's useless. So here I will try to give you simple tips to improve your search engine ranking/PR/backlinks etc. And remember that nowadays, SMO (social media optimization) is also a very, very important factor in improving your traffic. I will try to give you basic insights and information to get you started. Do let me know how I'm writing, or about any specific topics that you want to know about. I am really willing to help. Hope to see you again... !! :) Happy Blogging