Using Meta Descriptions

In Improve snippets with a meta description makeover on the Google Webmaster Central blog, Google explains how it chooses page description information and how it uses meta descriptions, which is valuable for controlling how your site's pages appear in Google search results. It is encouraging to see this thorough explanation, as I have been unsure about the benefit of meta descriptions and the priority one should give to customizing them for each page.

The meta description is one of the places from which Google (or any search engine) can take your page snippet, the short text preview displayed for each web result. Writing one is preferable because it gives you control over what is shown as the page description in search results, rather than leaving it to an extract from your page or other sources.

To summarize, it seems to me the key points for meta descriptions are:

  • Quality meta descriptions without keyword stuffing are more likely to be used as a page description.
  • Provide a different description for each page. Use site-level descriptions for the main page and other aggregation pages, and page-specific descriptions for everything else.
  • Descriptions should be human-readable. Use sentence structure where appropriate, or clearly tagged and separated information for permalink pages such as blog posts or product pages.
  • While not addressed in the Google article, length is touched on in the comments: keep your descriptions short so they are not cut off. Descriptions appear to be limited to between 140 and 160 characters (roughly two lines in Google's search results). If you must exceed that limit, keep the important information at the beginning of the description, and tweak descriptions when you see them truncated in Google's results. A rough sketch of trimming to that limit follows this list.
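To make the length point concrete, here is a minimal Python sketch of trimming a candidate description, cutting at a word boundary so the visible text does not end mid-word. The 155-character cap and the function name are my own assumptions, not anything Google specifies.

```python
# Trim a candidate meta description to an assumed snippet limit (155 chars,
# inside the 140-160 range observed in search results), cutting on a word
# boundary so the visible text does not end mid-word.
MAX_DESCRIPTION_LENGTH = 155  # assumed cap, not an official figure

def trim_description(text, limit=MAX_DESCRIPTION_LENGTH):
    text = " ".join(text.split())           # collapse whitespace and newlines
    if len(text) <= limit:
        return text
    cut = text.rfind(" ", 0, limit)         # last word boundary before the cap
    return text[:cut if cut > 0 else limit].rstrip(" ,;:") + "..."

print(trim_description("A detailed guide to choosing hosting for a small business site. " * 5))
```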

Considering this information, I need to look at an automated way to insert an appropriate meta description on every page. My previous standard was a site-level meta description on every page, or just the home page. If I cannot find a satisfactory plugin, I will add writing one to my list of development tasks. Once developed, this is something I can easily provide to clients to keep their meta descriptions appropriate without manual work. This is the type of value-add I like to offer my clients. If only I had more time to focus on these tasks.
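As a sketch of what that automation could look like, the snippet below prefers a page-specific excerpt and falls back to a site-level description, then emits the meta tag. The SITE_DESCRIPTION text, the 155-character cap, and the function name are placeholders of my own; a real plugin would hook into the publishing platform's templates instead.

```python
from html import escape

SITE_DESCRIPTION = "Example Co. builds web sites and tools for small businesses."  # hypothetical site-level fallback
SNIPPET_LIMIT = 155  # assumed cap, matching the length note above

def meta_description_tag(page_excerpt=""):
    """Build a <meta name="description"> tag, preferring page-specific text."""
    text = " ".join((page_excerpt or SITE_DESCRIPTION).split())[:SNIPPET_LIMIT]
    return '<meta name="description" content="%s">' % escape(text, quote=True)

# Aggregation pages (home, archives) fall back to the site-level description;
# permalink pages (posts, products) pass their own excerpt.
print(meta_description_tag())
print(meta_description_tag("How to pick hosting for a small business site, with a checklist."))
```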

Getting a Site Indexed

Something I have learned over the past few years while talking to friends and clients about new or existing web sites is how little is understood about how search engines index a site's content and how that content shows up in search results. Of course, that is the basis of the Search Engine Optimization (SEO) industry, of which I am not a big fan (that explanation is for a later post). A recent post by Matt Cutts of Google on the changes in the frequency of index updates is very interesting and shows how far the industry has come in just the last seven years.

Matt states that in 2000, when he joined Google, there was a three-to-four-month period during which they did not update their index at all, and another search engine went more than a year without updating its index (perhaps one of the casualties of the search engine shake-up). This meant that no new sites, and no new content from existing sites, would show up in searches until the next index update. It was mid-2000 when Google started regular monthly index updates, driving the search engine industry to provide accurate and fresh results for searchers.

Since then Google has been improving their index updates to the point where things can appear in the index only minutes after being posted. Of course, other search engines have had to follow suit. This is where I appreciate Google’s focus on their search customers (although content owners love fresh results as well).

Changes in technology have helped Google reach these new levels of freshness. Instead of Google's spiders having to crawl each site daily (impossible when they are indexing billions of pages), sites can ping Google when they have updates. This has become possible with the rise of RSS feeds and sitemaps, and a site does not have to be a traditional blog to use these techniques. A rough sketch of a sitemap ping follows.
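As an illustration, here is a rough Python sketch of a sitemap ping. The address follows the www.google.com/ping?sitemap= convention Google has documented for sitemap notifications, but treat the exact URL as an assumption and check current documentation; the sitemap location is a placeholder.

```python
# Notify Google that a sitemap has changed by requesting its ping endpoint.
# The endpoint below follows the documented www.google.com/ping convention;
# verify it before relying on it. The sitemap URL is a placeholder.
from urllib.parse import quote
from urllib.request import urlopen

SITEMAP_URL = "http://www.example.com/sitemap.xml"

def ping_google(sitemap_url):
    ping = "http://www.google.com/ping?sitemap=" + quote(sitemap_url, safe="")
    with urlopen(ping) as response:
        return response.getcode()   # 200 means the ping was accepted

if __name__ == "__main__":
    print(ping_google(SITEMAP_URL))
```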

In the past, clients would bring new projects and expect a site to be created and launched in two months, as well as indexed by all the major search engines with a high ranking on key search terms on launch day. When I explained that sites had to be submitted for crawling by the search engines, and that there was then a waiting period before they would be added to the index and appear in search results, for a total wait of four to six months, it often opened their eyes to the search engine industry. Many wanted to pay to be included in the index and listed as the #1 result, but after some explanation they would understand the reality of the web. As all were small organizations or individuals, I stressed the importance of focusing on their content and doing what they could with the search engines, without obsessing over their initial rankings. Some dropped their site project at this news; others went forward and discovered that the wait to be crawled and show up in the index was not as detrimental as they had feared. Now it is easy to set up a site with feeds and sitemap pinging, and you can be discovered and indexed in days or even hours. Then you can immediately work on building content and earning incoming links from valuable resources (not link exchanges) to increase your visibility.

More good general advice is provided in a Google Webmaster Central post on getting indexed, written for the Portuguese market (English version at the bottom) but relevant to every site. The top two points are critical: be a subject authority (write good content people are looking for) and keep the search engines informed of your site's updates, which are hopefully frequent. If you are not checking off these two points, all the other optimization will do little to gain and maintain visitors, no matter how high you get your site to rank.

Webmaster Tips from Google

I have a number of items marked in my feed reader to write about. They all relate to Google and good site practice for search engine visibility.

In the post The number of pages Googlebot crawls, the importance of a common, standard URL structure is noted. Googlebot crawls URLs and therefore sees http://www.example.com/ and http://www.example.com/index.html as two different URLs even though they may display the same content. While there are methods to redirect visitors or tell Google which page is preferred, it is easier to adopt a standard URL structure and use it consistently. By having only one URL for any page or piece of content, it is much more likely that sites linking to your content will use the correct URL, so your incoming links are not split between two or more different URLs. Keep this in mind when telling people how to link to your site, and if you do see incoming links using an incorrect URL, politely ask the source to update their link. A rough sketch of this kind of URL normalization follows.
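As a sketch of the idea, the Python snippet below maps a handful of equivalent forms (an index.html suffix, a mixed-case host name) to one canonical URL. The set of index file names is my own assumption, and a real site would also back this up with a permanent (301) redirect from the non-canonical forms.

```python
# Map equivalent URL forms to a single canonical URL so that, for example,
# http://www.example.com/ and http://www.example.com/index.html are treated
# as one address. The set of index file names is an assumption.
from urllib.parse import urlsplit, urlunsplit

INDEX_NAMES = {"index.html", "index.htm", "index.php", "default.htm"}

def canonical_url(url):
    scheme, host, path, query, _fragment = urlsplit(url)
    parts = path.split("/")
    if parts and parts[-1].lower() in INDEX_NAMES:
        parts[-1] = ""                       # /index.html -> /
    return urlunsplit((scheme, host.lower(), "/".join(parts) or "/", query, ""))

assert canonical_url("http://WWW.Example.com/index.html") == "http://www.example.com/"
assert canonical_url("http://www.example.com/blog/") == "http://www.example.com/blog/"
```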

In an older post on the Google Webmaster Central blog, a quick tutorial is provided on understanding the keywords (or common words) identified on Google's webmaster tools page for your site. If you have a website and have not signed up for a Google Webmaster Tools account, I encourage you to do so. This free service provides valuable insight into how Google sees your site, allowing you to correct issues and improve your site's standing.

For even more statistical information about your visitors, there is Google Analytics, which was recently updated with an improved interface and new features. While free, Google Analytics requires that your visitors' data be sent to and stored at Google. If this is a problem for you, alternatives such as Mint can be customized to provide whatever statistics you like, are hosted on your own server, and are available for a minimal cost.

Another development in search engine technology that will soon affect website owners is universal search. Google talked a little about universal search in this post from May, and it was analysed in depth in this post on Search Engine Land. The convergence of media types returned for a single search query will affect result rankings and increase the competition for the ever-sought higher positions. Universal search should not change how websites optimize themselves for search engines, however. I still believe that quality, relevant content is the cornerstone of your search foundation, followed by appropriate inbound links.