The Art and Science of Writing Good Titles

By Michael Gray | http://www.wolf-howl.com/seo/art-of-writing-good-titles/

Crafting a good title for your pages or blog posts is a problem many people struggle with. Do you focus on keywords or on attention-grabbing formulas? Hopefully this post will give you some insight into how to write better titles. The difficulty is that crafting titles is part science and part art, and you have to know when to lean in one direction or the other. To illustrate my point, I’m going to take the same concept and show you variations you would use for different purposes.

All three of these are similar but distinctly different posts. They will have different content and editorial styles …

Are you creating a page to drive sales, capture leads, or make conversions? If so, you want to focus strictly on keywords and give little, if any, thought to being sensational or even slightly creative. An example would be …



Merrick Lozano of PRLeap Gives Tips About Press Releases

By Michael Gray | http://www.wolf-howl.com/featured/tips-press-releases/

This is the third time we’re speaking with Merrick Lozano of PRLeap, so let’s dive right in. The last time we talked here was in 2007, when we spoke about local search. What’s new in the area of press releases that people should know about?

Thank you for having me back, Michael.

When we last spoke in 2007, the press release had just celebrated its 100th birthday. It had evolved into an effective tool for increasing a brand’s search visibility. You can use an online press release to reach customers and writers who are searching for the type of information you are writing about.

With the emergence of social networks, the press release has continued to evolve – showing its flexibility – as it becomes a tool for sparking conversations and engaging customers and influencers. The social media press release, also known as the social media release (SMR), bundles videos, pictures, links, and other social objects into a story ready to be distributed via online press release services like PRLeap.

This summer we upgraded our social media release template with the Facebook Like button and the Tweet button from Twitter to make it easy to spark a conversation in those respective communities. The impact was immediate: with only a few Likes and Tweets, a news release not only gets an increase in visitors from Facebook and Twitter, but it also gets a spike in search traffic.

… socializing a press release into a social media release makes it easier to spark conversations on communities where your audience is participating. You’ll get much better results if you help get the conversation started by liking, tweeting and submitting the social media release to target niche sites …

The benefits of socializing a press release are clear, but not all social media releases are equal. Traditionally, a press release was written for the press, which meant writing a news story in the third person. Giving the social media release the flexibility to be written in a conversational tone for most audiences makes it more engaging. But most press release services and newswires will not distribute or publish a release unless it’s free of direct address.



How to Diagnose and Improve Website Crawling

By Michael Gray | http://www.wolf-howl.com/seo/diagnose-improve-crawling/

When you are reviewing a website, whether for your own projects or for a client project, one of the important areas to review is crawlability. In this post I’d like to talk about some of the ways you can look for and diagnose crawling issues.

If your important pages aren’t within 2-3 pages of linking hubs on your website, you will have problems …

The first step in diagnosing a crawling problem is to use a simple [site:example.com] search and compare how many pages you really have with how many Google thinks you have. Bear in mind that this number is an estimate; what you are trying to do is get a rough idea of how many pages Google knows about, as Matt Cutts recently discussed in a Webmaster Central video.

If you have several hundred or several thousand pages but Google only shows 100, then you have a problem. Depending on how large the site is, an estimate within 10-30% of your actual page count is a good rule of thumb.

The second thing to look at is Webmaster Central. If you submit a sitemap, Google tells you how many URLs you submitted and how many are in the index. The closer those numbers are, the better. Don’t worry if it’s not a 100% match, because sometimes a sitemap includes pages that get blocked at the page level with a robots meta tag. At this point, you are just concerned with gross numbers.
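If you want to script that comparison, here is a minimal sketch: it counts the URLs in a standard XML sitemap and compares that against the indexed figure you pulled from a [site:] search or Webmaster Central. The sitemap URL and the indexed count are placeholders, not real values.

```python
# Minimal sketch: compare the URL count in your XML sitemap against the
# page count Google reports (from a site: search or Webmaster Central).
# SITEMAP_URL and GOOGLE_INDEXED are assumptions -- plug in your own.
import urllib.request
import xml.etree.ElementTree as ET

SITEMAP_URL = "http://www.example.com/sitemap.xml"  # hypothetical
GOOGLE_INDEXED = 850  # figure reported by [site:example.com] or Webmaster Central

NS = {"sm": "http://www.sitemaps.org/schemas/sitemap/0.9"}

with urllib.request.urlopen(SITEMAP_URL) as resp:
    tree = ET.parse(resp)

urls = [loc.text for loc in tree.getroot().findall("sm:url/sm:loc", NS)]
coverage = GOOGLE_INDEXED / len(urls)

print(f"Sitemap URLs: {len(urls)}, Google reports: {GOOGLE_INDEXED}")
print(f"Coverage: {coverage:.0%}")
if coverage < 0.7:  # the 10-30% rule of thumb from the text above
    print("More than ~30% of your pages appear to be missing; dig deeper.")
```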

Sitemap Statistics

If things are radically out of whack, you can download a table of the pages in the index from Webmaster Central and diagnose on a page-by-page level what is or isn’t in the index.
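A simple way to do that page-by-page diagnosis is a set difference between your own URL list and the exported table. This sketch assumes you have saved both as plain-text files, one URL per line; the file names are hypothetical.

```python
# Sketch: page-by-page diff between your full URL list and the table of
# indexed pages exported from Webmaster Central. File names are assumptions.
my_pages = set(open("all_site_urls.txt").read().split())
indexed = set(open("webmaster_central_indexed.txt").read().split())

print("Not in the index:")
for url in sorted(my_pages - indexed):
    print(" ", url)

print("Indexed but not in your list (check for duplicates/parameters):")
for url in sorted(indexed - my_pages):
    print(" ", url)
```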

Next, you want to try to do a full crawl of the website using something like Xenu. While it’s usually used to check for broken links, in the process it crawls the website. If you have a large website, you will want to limit the crawling.

Another product I like to use is Website Auditor. One of the interesting things about Website Auditor is that you can specify crawl depth, which is how deep you want a crawl to go. Start at the homepage and go only one level. Run it again, this time with 2 levels, then 3. Additionally, use your Webmaster Central report on most-linked pages (think of them as link hubs). If your important pages aren’t within 2-3 pages of the linking hubs on your website, you will have problems. IMHO it’s more important than ever to cultivate deep linking and to use that deep linking to spread your link equity, inbound trust, and authority wisely around your website.
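For illustration, here is a rough sketch of the kind of depth-limited crawl Xenu and Website Auditor perform, written with the requests and beautifulsoup4 libraries. It records each page’s click depth from the homepage; the start URL and limits are placeholders.

```python
# Sketch of a depth-limited crawl: breadth-first from the homepage,
# recording each page's click depth. Assumes `requests` and
# `beautifulsoup4` are installed; START_URL and limits are placeholders.
from collections import deque
from urllib.parse import urljoin, urldefrag, urlparse

import requests
from bs4 import BeautifulSoup

START_URL = "http://www.example.com/"  # hypothetical
MAX_DEPTH = 3     # like Website Auditor's crawl-depth setting
MAX_PAGES = 500   # limit the crawl on large sites

seen = {START_URL: 0}          # url -> click depth from the homepage
queue = deque([START_URL])
host = urlparse(START_URL).netloc

while queue and len(seen) < MAX_PAGES:
    url = queue.popleft()
    depth = seen[url]
    if depth >= MAX_DEPTH:
        continue
    try:
        html = requests.get(url, timeout=10).text
    except requests.RequestException:
        continue  # broken link; Xenu would report these
    for a in BeautifulSoup(html, "html.parser").find_all("a", href=True):
        link = urldefrag(urljoin(url, a["href"]))[0]
        if urlparse(link).netloc == host and link not in seen:
            seen[link] = depth + 1
            queue.append(link)

for url, depth in sorted(seen.items(), key=lambda kv: kv[1]):
    print(depth, url)
# Pages more than 2-3 clicks from your link hubs are the ones to worry about.
```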

In recent years Google has done away with the term/classification “supplemental index.” IMHO this was more of a public relations move, as they just grew tired of hearing from people who were upset that any part of their site was in the supplemental index–but I digress. Certain parts of your website aren’t as important as others or, as in the case of, say, a privacy policy, are important to people but not for rankings. To help you understand which pages Google thinks are important, look at the last crawl date in the Google cache.

Pages that have the most links are going to get crawled most frequently, as are pages with the most trust and authority. Pages that are linked to from those linking hubs, or trusted and authoritative hubs, will get crawled next most frequently. At each step away from the linking hubs, or authority points, crawl frequency will decrease–think of it like a classic PageRank model.
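To make that model concrete, here is a toy PageRank power iteration over a made-up link graph. This is not how Google actually schedules crawling; it just shows rank (and, by analogy, crawl attention) decaying as you move away from the hubs.

```python
# Toy PageRank power iteration over a small, invented link graph, to
# illustrate the "classic PageRank model" of crawl priority.
DAMPING = 0.85

graph = {  # page -> pages it links to
    "home": ["about", "blog", "privacy"],
    "about": ["home"],
    "blog": ["home", "post-1", "post-2"],
    "post-1": ["blog"],
    "post-2": ["blog", "post-1"],
    "privacy": [],
}

pages = list(graph)
rank = {p: 1.0 / len(pages) for p in pages}

for _ in range(50):  # power iteration until ranks settle
    new = {p: (1 - DAMPING) / len(pages) for p in pages}
    for p, outlinks in graph.items():
        if not outlinks:  # dangling page: spread its rank everywhere
            for q in pages:
                new[q] += DAMPING * rank[p] / len(pages)
        else:
            for q in outlinks:
                new[q] += DAMPING * rank[p] / len(outlinks)
    rank = new

for p in sorted(rank, key=rank.get, reverse=True):
    print(f"{p:10s} {rank[p]:.3f}")  # hubs come out on top
```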



How to Do A Content Audit of Your Website

By Michael Gray | http://www.wolf-howl.com/seo/content-audit-website/

If you have a website that’s been around for a few years and you’re looking for ways to make some improvements, one of the tactics I recommend is doing a content audit.

When you do a content audit you have a few goals in mind:

* Get rid of any low quality or unimportant pages
* Look for pages or sections that can be improved or updated
* Improve your rankings by more effectively using your link equity, internal anchor text, and interlinking your content

Get the Data

your inbound link equity can only support a certain number of pages …

The first thing you need to do is get an understanding of where your website currently stands. You’ll need a list of the pages of your website, the number of inbound links, and the number of visitors each page receives. If you are using Webmaster Central, you can export a spreadsheet of all the pages with the number of links. The next thing you have to do is add a column for page views. I like to use a timeframe of between a year and a year and a half.
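If you’d rather script the merge than hand it off, here is a minimal sketch using pandas. The file and column names are assumptions; adjust them to match whatever your Webmaster Central and analytics exports actually contain.

```python
# Sketch: combine the Webmaster Central links export with an analytics
# pageviews export into one audit spreadsheet. File and column names
# are assumptions -- match them to your actual exports.
import pandas as pd

links = pd.read_csv("webmaster_central_links.csv")   # columns: url, links
views = pd.read_csv("analytics_pageviews.csv")       # columns: url, pageviews

audit = links.merge(views, on="url", how="left")
audit["pageviews"] = audit["pageviews"].fillna(0).astype(int)  # unvisited pages
audit.to_csv("content_audit.csv", index=False)
print(audit.head())
```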

Depending on the number of pages your website has, it could take a while to get all this data. This is the perfect task for an intern or outsourced labor from a place like oDesk. I recently performed this task on a website that has 1,800 URLs. It cost me $75, and I had the data back in just over 24 hours.

Identify the Low-Performing Pages

The two primary factors I like to look at are how many links a post or page has and how much traffic it generated in the past 18 months. Any page that generated fewer than 100 page views is a candidate for deletion. Additionally, any page that attracted fewer than 25 links is also a candidate for deletion.
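Applied to the audit sheet from the previous step, flagging the candidates is a short filter. The thresholds come from the paragraph above, and the column names are the assumed ones from the earlier sketch.

```python
# Sketch: flag deletion candidates using the thresholds above -- fewer
# than 100 pageviews over the window, or fewer than 25 links. Column
# names follow the assumed layout from the "Get the Data" sketch.
import pandas as pd

audit = pd.read_csv("content_audit.csv")

candidates = audit[(audit["pageviews"] < 100) | (audit["links"] < 25)]
print(f"{len(candidates)} of {len(audit)} pages are deletion candidates")
candidates.to_csv("deletion_candidates.csv", index=False)
```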



Can Google Detect an Affiliate Website

By Michael Gray | http://www.wolf-howl.com

One of the questions that often comes up is: does Google hate affiliate websites, and are they penalized in the algorithm?

I’m also quite sure Google has an idea at what point, whether by percentage or by total number of links, a site becomes an affiliate website

The answer is slightly nuanced but, for simplicity’s sake, no: Google doesn’t hate affiliate websites. Nor have I seen any evidence that affiliate sites are penalized. What Google does hate is thin affiliate websites with little or no trust. However, a better question to ask is: can Google detect affiliate websites, and can they make it harder for affiliate websites to rank … ? But those are entirely different questions.

If you’ve read the leaked quality rater guide from 2009, you’ll see that Google has set up a lot of hurdles specifically making it harder for affiliate websites to “pass” the sniff test. One of the quickest and easiest ways Google can identify an affiliate website is through “naked” links to common affiliate programs like LinkShare, CJ, ShareASale, and others. But, really, how good can Google be at detecting those links? Well, here’s a publicly available free tool put out by Sitonomy that checks what types of programming tools are being used by a website.

Now if the folks at Sitonomy can detect that 4% of the* links on a page are from CJ, I’m positive that Google can as well. I’m sure Google can tell at the page level and for the site as a whole. I’m also quite sure Google has an idea at what point, whether by percentage or by total number of links, a site becomes an affiliate website. It would also be fairly easy to say that, once you cross that threshold, you need a higher level of trust to rank for competitive terms. This is one of the reasons I strongly disagree with Lori Weiman, who says affiliates should never cloak links.

UPDATED: the percentage is of total links scanned, not just links on the page; my bad.

So what are the takeaways here?

* Use a tool like Sitonomy to check your most important pages and see what it can find in the way of affiliate links (a rough sketch of this kind of check follows this list)
* Look into redirection tools that mask your links, and make sure you block them from search engine spiders
* Obfuscate some of your other links as well, even if they aren’t affiliate links: people should always be unsure of your intent
* Always make sure you comply with FTC regulations for disclosure. If needed, use a nice non-machine-readable graphic for maximum stealthiness
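For the first takeaway, here is a rough, Sitonomy-style sketch: fetch a page and measure what fraction of its links point at a few well-known affiliate tracking domains. The domain list is illustrative and nowhere near exhaustive, and the code assumes requests and beautifulsoup4 are installed.

```python
# Sketch of a Sitonomy-style check: count how many of a page's links
# point at well-known affiliate tracking domains. The domain list is
# illustrative only, far from exhaustive.
from urllib.parse import urljoin, urlparse

import requests
from bs4 import BeautifulSoup

AFFILIATE_DOMAINS = (
    "shareasale.com",          # ShareASale
    "click.linksynergy.com",   # LinkShare
    "anrdoezrs.net",           # one of Commission Junction's domains
)

def affiliate_link_share(page_url: str) -> float:
    """Return the fraction of links on page_url that look like affiliate links."""
    html = requests.get(page_url, timeout=10).text
    links = [urljoin(page_url, a["href"])
             for a in BeautifulSoup(html, "html.parser").find_all("a", href=True)]
    hits = [l for l in links
            if any(d in urlparse(l).netloc for d in AFFILIATE_DOMAINS)]
    return len(hits) / len(links) if links else 0.0

share = affiliate_link_share("http://www.example.com/")  # hypothetical URL
print(f"{share:.1%} of links look like affiliate links")
```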



Adsense: Why Bloggers Don’t Get It

By Michael Gray | //wolf-howl.com

In doing the research for my series of AdSense articles, two common ideas kept getting repeated:

* My AdSense ads are horrible; they only pay out (insert low dollar figure here)
* My AdSense CTR is horrible; I only get a (insert extremely low CTR here)

To be fair, these comments weren’t coming just from bloggers, but bloggers did make up an overwhelmingly large percentage. I think this stems from a misconception on the part of bloggers that they are entitled to a high payout and CTR. I’d like to spend a little time sharing my feelings on this subject. In the early days a blog may have been just an online diary or journal, but, like Nehru jackets, those days are gone. What a blog is now is a chronologically structured content management system, as opposed to the classic hierarchical structure of the web. Let’s be clear: you can still use a blog as your online diary or journal, but nowadays it’s just as likely to be used as a commercial blog. Yes, I did just say commercial blog, and no, the earth didn’t open under my feet and swallow me whole for saying it. Let’s take some time to look at your typical blog.

You may post about commerce-related subjects like your job, what you like to buy, or even your hobbies. However, these posts are all about your life; they are no more commercially viable or attractive than, say, Aunt Millie’s holiday newsletter.

Yes, we all have an Aunt Millie in our family. Every year she sends out a finely crafted newsletter, in a coordinating envelope she ordered from paperdirect.com, telling us all about her family. We learn how hard her husband works, how many activities her kids are in, and how good they are at them. We also read the details of how her scrapbooking business hasn’t taken off yet, but she promises to spend more time on it right after New Year’s.

So, if you were a business owner, would you want to advertise anywhere in Aunt Millie’s newsletter? Then why would a business want to pay you top dollar to advertise on your blog? What’s that, you say your blog gets (insert a high number here) readers per day, and surely that has to be worth something? Well, did you know Aunt Millie sends out over 800 copies of her holiday newsletter to 17 countries on 4 continents? Now, before you get all fired up, understand that I don’t have a problem with you having a personal blog or sharing it with the public. However, your expectation that it has value outside of your family, friends, and community is a serious misconception.



I Wish We Had Google Understand Not Google Instant Search

By Michael Gray | //wolf-howl.com

Earlier this week Google launched the latest iteration of the SERPs, Google Instant. While I, like everyone else, had fun playing with it and finding some of its holes, it’s really not a product that I think will succeed. To Google’s credit, I can’t say that I’ve ever heard people complain that Google takes too long to serve them results. What I do hear, and personally experience, is the wish that Google understood what I was looking for …

Google didn’t learn from the failed Google Wave and Sidewiki experiments that all this complexity isn’t what people want …

I understand why Google launched a product like Google Instant: they feel that, because they are smart enough at predicting what you are looking for, they can interpret your query after a word or two–or sometimes after just a few letters. They think they know you so well that they can guess what you want without being told. Without getting too involved in what’s going on behind the scenes, Google is using previous search volume to predict the most likely term(s) you are looking for. It’s a sophisticated leap forward in technology, to be sure, but it’s not something that solves a problem I hear people complain about. (As an aside, this does give a lot more context to the bizarre statement Eric Schmidt made a few weeks ago: “They want Google to tell them what they should be doing next.”)
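As a toy illustration of that prediction step (and only an illustration; whatever Google actually runs is vastly more sophisticated), here is a sketch that ranks historical queries by search volume and returns the top completions for a prefix. The data is invented.

```python
# Toy illustration of prefix prediction from historical search volume.
# The data and the rank-by-volume rule are stand-ins, not Google's system.
SEARCH_VOLUME = {
    "weather new york": 90_000,
    "weather forecast": 60_000,
    "web hosting": 40_000,
    "webmaster tools": 25_000,
}

def predict(prefix: str, k: int = 3) -> list[str]:
    """Return the k highest-volume past queries that start with prefix."""
    matches = [q for q in SEARCH_VOLUME if q.startswith(prefix)]
    return sorted(matches, key=SEARCH_VOLUME.get, reverse=True)[:k]

print(predict("we"))  # ['weather new york', 'weather forecast', 'web hosting']
```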

