
Thursday, July 21, 2011

XML Sitemap Guide

Google, Yahoo!, and Microsoft all support a protocol known as XML Sitemaps. Google first announced it in 2005, and Yahoo! and Microsoft agreed to support the protocol in 2006. Using the Sitemaps protocol, you can supply the search engines with a list of all the URLs you would like them to crawl and index.

Adding a URL to a Sitemap file does not guarantee that it will be crawled or indexed. However, it can get pages crawled and indexed that the search engines would not otherwise discover. In addition, Sitemaps appear to help pages that have been relegated to Google's supplemental index make their way into the main index.

This program is a complement to, not a replacement for, the search engines' normal, link-based crawl. The benefits of Sitemaps include the following:
  • For the pages the search engines already know about through their regular spidering, they use the metadata you supply, such as the last date the content was modified (lastmod date) and the frequency at which the page is changed (changefreq), to improve how they crawl your site.
  • For the pages they don’t know about, they use the additional URLs you supply to increase their crawl coverage.
  • For URLs that may have duplicates, the engines can use the XML Sitemaps data to help choose a canonical version.
  • Verification/registration of XML Sitemaps may indicate positive trust/authority signals.
  • The crawling/inclusion benefits of Sitemaps may have second-order positive effects, such as improved rankings or greater internal link popularity.

The Google engineer who posts in online forums as GoogleGuy (a.k.a. Matt Cutts, the head of Google's webspam team) has explained Google Sitemaps in the following way: imagine you have pages A, B, and C on your site. Google finds pages A and B through its normal web crawl of your links. Then you build a Sitemap and list pages B and C. Now there's a chance (but not a promise) that Google will crawl page C. It won't drop page A just because you didn't list it in your Sitemap, and listing a page it didn't know about doesn't guarantee that it will be crawled. But if for some reason Google didn't see any links to C, or knew about page C but rejected the URL for having too many parameters or some other reason, there's now a chance that page C will be crawled.

Sitemaps use a simple XML format that you can learn about at http://www.sitemaps.org. XML Sitemaps are a useful and in some cases essential tool for your website. In particular, if you have reason to believe that the site is not fully indexed, an XML Sitemap can help you increase the number of indexed pages. As sites grow in size, the value of XML Sitemap files tends to increase dramatically, as additional traffic flows to the newly included URLs.
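The format is simple enough to generate with a few lines of code. Below is a minimal sketch using Python's standard library; the example.com URLs, dates, and changefreq values are placeholders for illustration, not real entries:

```python
# Minimal sketch: generate a sitemap.xml using only Python's standard library.
# All URLs, dates, and frequencies below are made-up placeholders.
import xml.etree.ElementTree as ET

SITEMAP_NS = "http://www.sitemaps.org/schemas/sitemap/0.9"

def build_sitemap(entries):
    """entries: list of (loc, lastmod, changefreq) tuples."""
    urlset = ET.Element("urlset", xmlns=SITEMAP_NS)
    for loc, lastmod, changefreq in entries:
        url = ET.SubElement(urlset, "url")
        ET.SubElement(url, "loc").text = loc
        ET.SubElement(url, "lastmod").text = lastmod        # W3C date format (YYYY-MM-DD)
        ET.SubElement(url, "changefreq").text = changefreq  # e.g. always, daily, weekly
    return ET.tostring(urlset, encoding="unicode")

xml = build_sitemap([
    ("http://www.example.com/", "2011-07-21", "daily"),
    ("http://www.example.com/about", "2011-06-01", "monthly"),
])
print(xml)
```

The resulting file would be uploaded to the site root (typically as /sitemap.xml) and submitted through each engine's webmaster tools.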

Tuesday, July 5, 2011

SEO Basics Overview

Google is a name almost everyone in the world knows, and with good reason: it is the world's most widely used search engine. Many people put up a website for one reason or another and believe that is the end of it. But the internet contains so much information that a person could spend several lifetimes reading only a fraction of it, so it needs the best organization possible. That task has fallen to search engines, the largest of which are Google, Yahoo!, and Bing. Getting your site in front of viewers comes down to how well it is optimized. SEO, or search engine optimization, is ideal for people who want to attract visitors to their site without paying outrageous prices for them. The sad truth is that when it comes to SEO advice you need to be very careful who you listen to, because there is a lot of damaging information out there. This article discusses some of the common mistakes people make when trying to optimize a site.

SEO takes time to work, but once it does it is almost self-sustaining, so patience is a must. Of course, once you've done the initial work, the payoff is lots of traffic. Everybody wants to be on the first page for a commercial keyword that is in demand, but you must be ready to build links, create unique content, and put in the time to get everything in place; otherwise you can simply forget about ranking for any keyword. The speed at which your site loads is also a factor, not only in your popularity with visitors but also in how the search engines view your site. If your pages are overloaded with messy, convoluted code, they will load more slowly in a visitor's browser, which is something Google doesn't like. Where possible, avoid inline styling on your pages and use an external CSS file instead. This small change can speed up your loading times and make your site much easier to update later. Remember that search engines aim to offer their own users the best possible results to suit their needs, so your aim is to find ways to appeal to what the search engines want. In essence, your site needs to match what your visitors are looking for and also fit what the search engines are looking for.

Getting incoming links too quickly is also a mistake: it sends a red flag to the search engines, and they'll penalize you for link spamming. A major myth going around right now is that your new site will catapult to the number one spot on the first page if your link is mass-distributed across the web. It doesn't work that way; search engines like natural link building. Are you willing to get your brand new site sandboxed and de-indexed because you didn't have the patience to build links naturally over time? The big secret is to build links slow and steady.

See, SEO isn't as complicated as it seems; just stick to the basics and you will be doing better than most. The more you practice optimizing your site, the better your results will get. So, now you have it: quality backlinks, internally linking related pages, on-site optimization, and patience.

Thursday, June 30, 2011

Google PageRank

PageRank is an algorithm patented by Google that measures a particular page's importance relative to other pages included in the search engine's index. It was invented in the late 1990s by Larry Page and Sergey Brin. PageRank implements the concept of link equity as a ranking factor: it treats a link to a page as a vote, indicating importance.

PageRank approximates the likelihood that a user, randomly clicking links throughout the Internet, will arrive at that particular page. A page that is arrived at more often is likely more important, and has a higher PageRank. Each page linking to another page increases the PageRank of that other page, and a link from a page that itself has higher PageRank passes along more value. You can read more details about the PageRank algorithm at http://en.wikipedia.org/wiki/PageRank.

To view a site's PageRank, install the Google toolbar (http://toolbar.google.com/) and enable the PageRank feature, or install the SearchStatus plugin for Firefox (http://www.quirk.biz/searchstatus/). One thing to note, however, is that the PageRank indicated by Google is a cached value, and is usually out of date.
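The random-surfer model can be sketched in a few lines of Python. This toy power-iteration version uses a made-up three-page link graph and the 0.85 damping factor from the original PageRank paper; Google's actual calculation is, of course, far more elaborate:

```python
# Toy power-iteration sketch of PageRank's random-surfer model.
# The three-page link graph below is invented purely for illustration.
def pagerank(links, damping=0.85, iterations=50):
    """links: dict mapping each page to the list of pages it links to."""
    pages = list(links)
    n = len(pages)
    rank = {p: 1.0 / n for p in pages}  # start with equal rank everywhere
    for _ in range(iterations):
        # (1 - damping)/n models the surfer jumping to a random page
        new_rank = {p: (1 - damping) / n for p in pages}
        for page, outlinks in links.items():
            share = rank[page] / len(outlinks)  # each link passes an equal share
            for target in outlinks:
                new_rank[target] += damping * share
        rank = new_rank
    return rank

# Page C receives links from both A and B, so it ends up with the highest rank.
ranks = pagerank({"A": ["B", "C"], "B": ["C"], "C": ["A"]})
```

Note how C outranks B even though both sit one click from A: C collects votes from two pages, and the vote from A itself carries extra weight because A is boosted by C's link back to it.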

PageRank values are published only a few times per year, and sometimes using outdated information. Therefore, PageRank is not a terribly accurate metric. Google itself is likely using a more current value for rankings.

PageRank is just one factor in the collective algorithm Google uses when building search engine results pages (SERPs). It is still possible for a page with a lower PageRank to rank above one with a higher PageRank for a particular query. PageRank is also relevance-agnostic, in that it measures overall popularity using links, not the subject matter surrounding them. Google also evaluates the relevance of links when calculating search rankings, so PageRank should not be the sole focus of a search engine marketer. Building relevant links will naturally contribute to a higher PageRank. Furthermore, building too many irrelevant links solely for the purpose of increasing PageRank may actually hurt a site's rankings, because Google attempts to detect and devalue irrelevant links that are presumably used to manipulate it. PageRank is also widely regarded by users as a trust-building factor, because users tend to perceive sites with a high value as more reputable or authoritative. Indeed, this is what PageRank is designed to indicate. This perception is encouraged by the fact that Google penalizes spammy or irrelevant sites (or individual pages) by reducing or zeroing their PageRank.

SEO Definition

The search engine industry innovates frequently, and so do consumer behaviors around discovery and sharing. Those changes require search marketers to take a fresh look at what search engine optimization (SEO) is and why companies should or should not engage in the practice.

Definitions of search engine optimization often focus on the mechanics:
“SEO considers how search engines work and what people search for. Optimizing a website primarily involves editing its content and HTML and associated coding to both increase its relevance to specific keywords and to remove barriers to the indexing activities of search engines.” (Wikipedia).

Even Google offers a definition of what an SEO is, along with guidelines:
“Many SEOs and other agencies and consultants provide useful services for website owners, including: Review of your site content or structure; Technical advice on website development: for example, hosting, redirects, error pages, use of JavaScript; Content development; Management of online business development campaigns; Keyword research; SEO training; Expertise in specific markets and geographies.”

Since the key components of how a search engine works are crawling, indexing, and sorting, those are the functional focus areas of most SEO efforts. Most experienced internet marketing professionals will admit, however, that this is a limited view of the value SEO brings.

What about link building and promotion of content? What about search for content that isn’t product or service oriented? What about search within closed networks? What about real-time search? What about niche search: vertical, local, mobile, multi-lingual? What about social search?

Code, site architecture, and server issues that affect how search engine bots interact with and index a website's content are certainly important, as are keyword research and the subsequent use of those keywords in tags, on-page copy, markup, and anchor text links between pages. These areas all fall under the realm of “on-page SEO.” The yang to that yin is “off-page SEO,” which is basically link building. For more of this kind of practical SEO advice, read “Basics of Search Engine Optimization”.

Defining SEO can be as simple as, “Optimizing digital content for better performance in search.”  That’s a broad definition and the implications and value from improved search performance can range from increased sales to lowered customer service costs. It really depends on what customers are searching for, whether available company content is optimized and if analytics are in place to benchmark and measure performance.

Consumers are prompted to use search in a variety of scenarios ranging from research to finding products for purchase.  In most cases, SEO consultants (like TopRank :) ) are hired by corporate marketing departments to improve the search visibility of products and services being marketed to customers. Improved search engine placement typically results in an increase in traffic (qualified by the search terms used) and an increase in sales.

Marketing departments fund most search engine optimization efforts, whether they are executed in-house, by outside consultants, or, as is increasingly common, by a combination of the two. Companies that take a holistic view of search and strategically approach the notion of “If it can be searched, it can be optimized” find themselves with an opportunity to improve not only marketing performance and efficiency via SEO, but other content areas as well.

Most companies engage, implement, and measure SEO efforts only to increase revenue as part of marketing. Those same companies often publish many other types of content on and off the site, including content that has an ROI but isn't part of marketing. Customers are searching for this content, and if it's not well optimized, they cannot find it. That spells opportunity.

Companies that implement keyword research and SEO efforts for their web sites holistically, typically realize a very desirable combination of benefits that include an increase in sales as well as a reduction of costs in certain areas. If not a reduction in costs, at least an improvement in efficiency and performance.

For example, most on-site search functions for company websites rate a C to D-minus. Google, on the other hand, can do quite well with such content. After the sale, customers often search for company contact info, product support, and customer-service-related info: “How to fix this or that” or “Where to re-order this or that part.” Increasing numbers of companies have invested in aggregating and publishing this kind of information, but the usability inherent in those systems is often flawed.

Making FAQ and knowledge base information available to external crawlers like Googlebot, Slurp, and MSNBot, along with systematic keyword optimization of that content, makes it more easily available to the customers who are looking for it. Companies invest in aggregating product and customer service/support information to reduce overall customer service costs (call centers are expensive) and, hopefully, to provide better service to customers in a way that is more convenient for them (i.e., 24/7 online). Making support content perform better in search can further those performance and cost-efficiency goals.

There are other examples I could share involving SEO for job listings, news content, and more, but I think you get the idea. Assess all content being published online (from text to digital assets) to determine the audience and whether improved search performance can help reach business goals. Those goals might be increased sales, branding/reputation, recruiting better employees while lowering recruiter costs, or increasing media coverage while lowering PR agency costs. The thread that binds this kind of SEO effort is that customers are searching for content being published on and off company websites (inside social networks, for example), but it's often difficult to find. Making content easier for customers to find can help multiple departments reach business goals.

In the end, whether search optimization efforts are for marketing, public relations, talent acquisition, customer service or consumer research, my preferred definition of SEO is a broad one and commercially focused: “Optimizing digital content for search engines and consumers to improve performance and reach business goals.”