Love: unraveling local search ranking factors. So you’re a small business (or a large one, for that matter), and after doing some reading on SEO, you’ve gone to Google Local Business Center and claimed your business’s listing. You’ve read about PageRank and the importance of attracting high quality incoming links with your sparkling content. But what other factors go into the ranking algorithms for local search? How do you climb up that 10-pack? How do you improve your “Location Prominence” score, the local search equivalent of PageRank? In this post, Mike Blumenthal takes a look at a Google patent to help provide insight into the factors that explicitly help determine this Location Prominence.
Potential Factors in Ranking a Website Highly for Location Specific Searches:
Incoming links – not simply directory links, but links from other authoritative sites; sites with a high PageRank or Location Prominence score.
Reviews – I’m particularly interested in how Google uses reviews as a factor in local search rankings. There are the metrics that are already quantified: the actual number of reviews a business has received on a site like Yelp, for example, and the rating itself, whether 3, 4 or 5 stars. But how do you quantify the content of the review? How do you turn “good”, “bad”, “efficient”, “okay”, “disgusting”, “spicy” or “pusillanimous” (maybe you rented a guard dog, alright) into a number? What’s the scale for all negative words? What’s the most negative word you can give a restaurant? Does that mean that word passes along a -100 score?
Citations – it’s not merely about links, but how many times your business and its accompanying address appear on a website, not as a link.
Information about the business – search engines want information. It helps them develop a rich tapestry of search results. They’re machines, not humans. They can’t decipher meaning like you and me. Providing the search engines with only a little information about your business is like photographing it with an inexpensive camera instead of a professional one. If you don’t participate in sites like Yelp and Google Local Business Center, comment on industry blogs, or add your business to Best of the Web, then you’re taking a picture of your business with a cheap camera. Google wants you to use that Nikon D3X! What’s the business’s annual revenue? How many employees does the business have? How long has the business been in existence, and how long has it been present in listings across the web?
Love: the need for speed! Recently, Google announced they were open sourcing a nifty Firefox add-on called Page Speed, integrated with another superb tool called Firebug. Page load time is a factor in quality score on the PPC side of life, and there have been rumblings for some time now about whether page load time plays, or will play, a role in natural search rankings. Let’s assume it doesn’t play a role in natural search rankings, though. Does that mean I should compress the images on my site, enable gzip compression or remove unused CSS from my site anyway? If you happen to have a site that takes a bit longer than usual to load, I’d vote yes. Users find pages that take too long to load annoying, which translates into users bouncing away. The thinking behind improving page load, and as a corollary the user experience, is driven by five best practices:
Optimizing caching – keeping your application’s data and logic off the network entirely
Minimizing round-trip times – reducing the number of serial request-response cycles
Minimizing request size – reducing upload size
Minimizing payload size – reducing the size of responses, downloads and cached pages
Optimizing browser rendering – improving the browser’s layout of a page
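The “minimizing payload size” practice above is easy to quantify. Here’s a minimal, self-contained Python sketch (the sample markup is invented for illustration) showing how much a gzip pass shrinks repetitive HTML:

```python
import gzip

def payload_savings(html: str) -> float:
    """Fraction of response bytes saved by gzip-compressing the body."""
    raw = html.encode("utf-8")
    compressed = gzip.compress(raw)
    return 1 - len(compressed) / len(raw)

# Repetitive markup, like most HTML, compresses extremely well:
page = "<li class='item'>product</li>" * 200
print(f"gzip saves {payload_savings(page):.0%} of the payload")
```

On real pages the savings vary, but text-heavy responses routinely shrink by well over half, which is why enabling gzip is such a cheap win.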
Aside: “…reducing…cached pages.” Hmm, interesting. Nofollow links to your About Us page, AND robots.txt them out?
Love: data, but don’t allow imperfect data to cause you to freeze and not act. One of my favorite lines from this post says there is no limit to the amount of data you can collect and store on the Internet, and it’s headache-inducingly correct. I’ve mentioned in previous posts the importance of collecting data, analyzing data and then providing an interpretation of that data for insight into what action should be taken, and I of course still feel that way, but I’m not a Quant. There’s a point where granular becomes so microscopic that the difference between dataset A and dataset B will not cause your client to change his or her decision. Therefore, you need to accept imperfection and act. I know we’re big into models and science and equations, but so was Wall Street, and we saw what happened there. Certainly collect your data, but don’t allow it to bog you down into indecision, and don’t allow incomplete data to bolster that indecision. After all, it’s all incomplete (esoteric alert!).
“How do you measure the effectiveness of your magazine ad? Now compare that to the data you have from DoubleClick. How about measuring the ability of your TV ad to reach the right audience? Compare that with measuring reach through Paid Search (or Affiliate Marketing, or …). Do you think you get more useful data from Nielsen’s TV panel of between 15k – 30k US residents to represent the diversity of TV content consumption of 200 million American television viewers?”
Love: social media for something other than retweeting, posting pictures or helping you acquire links. Social media websites work because they facilitate communication and sharing amongst users (and they allow us to talk about ourselves, of course). The good ones also work on a different level–user interface. Thinking about your website in this way, and incorporating these features, can help drastically improve your conversion rate. Remember, it’s all about the user, not you!
Love: scientifically proven ways to do anything. Who doesn’t want to be persuasive? You’re a business, right? You’re trying to tell your story in order to persuade the potential client to help you write the next chapter, right? A few favorites from the post:
Too many options necessitate selection, and hence frustration…
How restaurant mints are a personalized affair
Asking people to substantiate their decision will lead to higher commitment
It’s late Tuesday which means it’s time for Link Love Monday. Hopefully you had a solid, and if you were lucky, relaxing weekend. For the rest of us who entertained guests over the Memorial Day holiday, here’s to drinking lots of water and going to bed at 7:30 tonight! Today’s set of links leans toward the local side of search, but the general principles involved in optimizing for local can certainly be transferred to natural search. Let the link love flow:
Love: An aggregator that helps simplify the process of optimizing your local search presence. Last month Google kept it weird and got a lot more local by providing the sexy local 10-pack for a broader range of non-geo targeted keywords ([austin tacos] versus simply [tacos], for example). If you’re a small business, it’s even more important for you to claim your listings in Google, Yahoo! and MSN, as well as optimize your site for maximal local search exposure. GetListed.org, which has been around for a while now, provides a hub for you to start this process. Simply enter the name of your business, enter the zip code and you’ll be provided with an overview of your local presence — have you claimed your listing? Do you have reviews, citations, pictures or videos in your local listing?
Aside: speaking of reviews, looking for content ideas for your website? Check out your reviews on sites like Yelp.com. Users might tell you exactly the type of information you need to add to your site.
Love: Tracking your presence — it’s vital. Okay, so you’ve hopped into GetListed.org, claimed your listings, updated your website accordingly, noticed you’ve improved your lot in the local 10-pack and…now what? How do you know how well your listing in the local 10-pack is working? How much traffic is it driving to your site? You can’t manage what you don’t measure (as Bill Leake, our fearless leader here at Apogee, would say). Check out this excellent post on how to track both local traffic from the 10-pack found in the SERPs as well as traffic originating specifically from http://maps.google.com.
Love: How local search might work, so that you can turn your inquisitive how’s into actionable how-to’s. This post covers an important part of local search, and search in general — intention. What does a searcher actually mean when he or she types in a keyword, and how might you go about calculating that meaning or intention? The post uses an article by a University of Massachusetts researcher and two articles from Yahoo! Labs as the basis for the discussion.
“If you have a web site that offers goods or services or information tied to a particular location, the processes described in this paper are some that may help searchers stand a better chance of finding your site online the next time that they search for ‘attorney’s office,’ or ‘camping near shenandoah park,’ or ‘Macy’s Parade Hotel,’ or use some other query that may involve a geographical intent without including an actual location.”
Love: Customizing Google Analytics, tracking your presence (again) and mega-posts, of course. A great post covering 23 ways you can customize Google Analytics in order to tease out more of the information you need to create a full tapestry of your online presence. Track: full referring URLs, Universal Search traffic, downloads (PDFs, WMV, etc.) and more.
Love: A search friendly CMS. Seriously. It will save you heartache, pain, sleepless nights and money on Pepto Bismol. This is a question that’s asked fairly often by clients, particularly when they’re thinking about or are in the process of changing their CMS. It’s okay to be a control freak when it comes to your CMS — just control it.
So it begins. Welcome to the inaugural installment of Link Love Monday™ (alright, so it’s not really trademarked — is that illegal?) where I’ll pass along links I’ve found particularly stimulating. Ideally, all of the links I post would have been born last week, but monitoring industry blogs could be a full-time job and I already have one of those. So, some links will be a bit aged, having been perched in my bookmarks for a while, but they will taste that much better, while others might be born on the morning of Link Love Monday. Either way, it will be an evolving weekly post — which is to say it will likely become more organized, better branded (logo in the works) and more robust. So without further ado: Link: Using Analytics for Local Search Optimization
Love: The emphasis on selecting the proper keywords for your local search campaign. This is always important, but particularly important for small businesses where getting it right the first time can save time (a.k.a. “money”) on redoing a keyword list and the subsequent on-page changes.
“In this attorney’s case, they might quickly find that while “family law” is a formal term more preferred by their profession, more of their potential customers are likely searching for the term ‘divorce.’ And, in most cases, consumers are searching for ‘lawyers’ when trying to find listings of this type of business, rather than ‘attorneys.’”
Love: The use of the seven deadly sins to drive home the importance of partaking in social media in a responsible, authentic and sinless way. Remain outwardly focused with your social media — focused on the user. Don’t spam him, don’t ignore her, don’t clam up, open up. Each foray into the social media sphere is entirely different according to your business. Again, remain outwardly focused and adapt.
“7. Sloth: Ahhh the deadliest of all sins. Wanting it all but being too lazy to do what it takes. You have to connect with people, you have to write good stuff, you have to stay current and you have to be willing to show up and put the effort in.”
Love: Google’s decision to extend microformats into search results — the Internet will be better served with more and more structured data. What are microformats? Basically, information about information: metadata. This sort of markup allows you to tell search engines and other programs that the information contained in this HTML is, without a doubt, the name of my business, its location, phone number, fax number, et cetera. One of the more widely used microformats is the hCard — think of it as your business card for machines. Would you like to make one? Try this hCard Creator! Link: In Pursuit of Elegance: 12 Indispensable Tips
Love: Simplicity. Why? It’s important to your business — specifically your business as a website. People are on the Internet looking for information. Scavenging. Scanning. Scoopering. Your website does not need to mimic the hustle and bustle of Times Square. If users liken finding information on your website to finding Waldo, then you’re losing out.
“Study the best: Google, Apple, Lexus, and Ferrari. They understand that complexity is their best friend, not an enemy. They understand it, so they can exploit it. The Google interface is clean and simple though the algorithm is massively complex. Even Einstein understood this. E=mc² has an easy and immortal ring to it.”
Love: Tools and their ability to make your work easier. In this case, the work we’re talking about is adding multimedia to your blog posts that can entice users to keep coming back to your blog because it’s chock-full of awesome information, whether text, images or video. I took Apture on a test drive with my personal blog and found it worked well. It’s this simple: sign up, head to a blog post, highlight a word and an interface pops up that returns music, videos, maps, slideshows, Tweets, news and more related to the highlighted word. Link: Twitter Evolves
Love: thought-provoking posts – who doesn’t? And seriously, what would a link repository be without Oprah’s favorite social media platform Twitter? If only this post were about Oprah’s use of Twitter to disseminate fashion advice. Instead, we’ll go with Twitter and copyright laws.
“There are only 27^140 possible tweets, can I just copyright them all and then sue anybody who uses Twitter?”
Change is Good
The Google Adwords keyword tool has changed over time, and overall, the tool continues to be a great resource for search marketers. In this most recent change, the interface remained almost exactly the same, except that two of the columns of data were replaced with new sets of data. The keyword tool now displays:
the keyword(s) that were searched for plus similar words related to that query,
paid search advertiser competition,
local search volume for the most recent completed month (new),
global monthly search volume, the average monthly search volume worldwide (new).
For the last nine months or so, the two volume columns displayed were: the previous month’s search volume (not specific to your location) and an average for any month’s search volume. When the tool first launched a few years ago, search volumes were displayed in a bar graph format, and last summer Google added specific numbers to this tool, making it increasingly useful for SEM data-hungry junkies.
Local + Everything Else ≠ Global
The local column is designed to factor in my physical location and language, and display the local searches for the keywords in question. This local data will likely be helpful for both SEO and PPC research (keep in mind, these numbers reflect searches in the entire Google Network, so they are more of a guideline than an exact measurement). I am puzzled, though, about how the local search volume for a recent month is consistently notably higher than the global monthly search volume — how could this be? Barry Schwartz noted the same discrepancy in his blog post last week on Search Engine Land. I realize that sometimes the most recent month will have a higher volume than the average of the previous 12 months, but it seems to be the rule rather than the exception for the searches I tested.
Results for “black tea” in July 2008:
Both “black tea” and “organic black tea” return higher average search volume than search volume in the previous month (June). This is not surprising as black tea is usually served hot, and search volume may drop during the warm summer months.
Only one keyword displayed higher search volume for June than for the average month: “chinese black tea”.
Results for “black tea” in April 2009:
If I am doing research for a site in Austin, TX that sells black tea, based on this local data, I would definitely recommend writing some articles about the caffeine found in black tea, as well as the benefits of drinking black tea.
Although this is a small sample set, all of the returned keywords recorded higher search volume localized to Austin, Texas than the global monthly search volume. Maybe not an issue, but something to note and observe as this tool continues to be used.
As usual, these changes will require observation, testing, and analysis to determine how helpful, or how useless, they may be. In the meantime, I’ll keep wishing for the “perfect” keyword research tool!
I recently spoke with Jay Brock, a Natural Search Consultant at Apogee Search, about SEO keyword research. Jay provided insight into the importance of SEO keyword research, shared strategies for developing keyword lists, and highlighted tools to help evaluate keyword success.
Quality SEO keyword research should be the foundation for all SEO efforts. Since SEO can take six to nine months to show significant results, selecting effective keywords is crucial. Keyword research helps prevent your company from investing time and money in poorly developed tactics. Finding terms that will bring significant and relevant traffic is essential to your SEO campaign and overall profitability.
The keyword list development should be a collaborative effort. Jay suggests beginning with a brainstorm. When brainstorming, consider your target audience and how they search for you. Also, analyze paid search data for successes, and apply information learned from paid search campaigns to your natural search efforts. You should also take into account seasonality of products or services. And finally, include branded terms in your keyword list, because visibility is important on these terms too.
Once you have defined a list of keywords, there are three data points that will help you determine the value and competitiveness of your terms. They are as follows:
1. Monthly search volume can be found using various tools. The most widely used is Google AdWords. This free tool provides 12-month search volume and month-by-month search volume, which allows you to easily look at seasonality. SEO Book also provides a keyword tool, though it is more in-depth and not quite as user-friendly as the Google AdWords keyword research tool.
2. For sites with keyword phrases in the title tags, perform a Google search using “allintitle” as one word, followed by a colon, and the keyword phrase, for example — allintitle:books. The results will show the total number of sites that compete using that keyword phrase in their title tags.
3. For sites with keyword phrases in the anchor text, perform a Google search using “allinanchor” as one word, a colon, and the keyword phrase (example — allinanchor:books). You will see the number of competing websites with links pointing back to their sites using a specific keyword phrase.
With these three tips you can identify terms with a high search volume and low allintitle and allinanchor competition. These terms can drive considerable traffic with the least amount of competitors for keyword phrases.
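To make those three data points concrete, here is a toy Python sketch. The scoring formula and every number in it are invented for illustration; this is not an official metric, just one way to rank terms by demand relative to competition:

```python
def opportunity(volume: int, allintitle: int, allinanchor: int) -> float:
    """Monthly searches per competing page: higher means more demand
    relative to title-tag and anchor-text competition."""
    competition = max(allintitle + allinanchor, 1)  # guard against zero
    return volume / competition

# Hypothetical volumes and competition counts:
keywords = {
    "black tea": opportunity(90500, 450000, 380000),
    "organic black tea": opportunity(1900, 2400, 900),
}
best = max(keywords, key=keywords.get)
print(best)  # the broad term loses to the less-contested long-tail term
```

Here the broad head term has far more searches, but the long-tail term wins on searches per competitor, which is exactly the trade-off the three data points are meant to expose.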
Once your keyword list is defined you must decide which pages are most relevant to your keyword phrases. Select pages that already use the phrases in the content or are dedicated to a specific topic. On-page relevance is important for search engine rankings.
Remember to trust your instincts. Utilize tools and data, and also use your industry knowledge and consumer insights to define your keyword list.
Do you track the conversations about your business meandering through the Internet? You should. They are not only opportunities to connect with your customers and potential customers, but these conversations can act as link bait. The ultimate form of link bait – outside of a viral piece of content – is certainly the engaging and informative content of your website itself, but conversations don’t need to be 500 page novels. Conversations online can be as short as 140 characters (as they are on Twitter) or as long as the longest tail in the form of a blog comment. The entire Internet is a conversation. Figuring out where people are talking about your business and engaging these users, by both listening and providing information when necessary, can lead to improved customer relationships and increased brand awareness, and can serve as link bait to improve your search rankings and keep you in front of the competition that still thinks their online presence isn’t that important. So, how might you find these conversations?
Here’s a brief list of the communication coffee shops you should be visiting regularly:
I must admit that the gap between the time I opened my Twitter account and when I actually became a part of the community stretched months, and it wasn’t because I was too busy clipping my fingernails. Twitter does provide value. Twitter provides instantaneous information about anything and everything. Whopper Virgins? Check. Recent news? Check. Conversations about your company? Check. Head to http://search.twitter.com, type in your product or your company name, say hello and start the conversation.
Blogs are perhaps the ultimate conversation conduits of the Internet. They allow users to write exhaustively about life, love, lemmings and your product. Also, you can leave a link, which will help drive traffic and, if the link is not “nofollow,” help increase the link juice flowing to your page. Head to Technorati and, again, search for your product or company name and engage your constituents (and add a link)!
Forums and Message Boards
BoardReader allows users to perform searches for specific posts and forums about your product or company, as well as providing a topic profile for your search. Searching forums and message boards allows you to find communities of users that are highly relevant to your business and can be an ongoing source of interaction for business and link juice. Blogs, unless they are targeted at your product, can be less qualified in terms of an ongoing relationship than forums or message boards, but they can still provide link juice via incoming non-nofollow links.
Okay, so this isn’t a website that allows you to find conversations per se, but using Google Analytics can help you find conversations about your product or company that have already taken place (in addition to its somewhat important task of tracking visitors, goal conversions and other metrics for any SEO campaign worth a dime). While scanning the referrer traffic for a client recently, I noticed a decent amount of traffic referred to their website from bbc.co.uk. “Interesting,” I thought. I headed to Google and typed in “’www.Client’sURLHere.com’ site:bbc.co.uk” and…Voila! Someone had linked to the client’s website on the forums tucked inside the BBC website. Sure, the conversation had already taken place, but it was a naturally occurring (followed) link from a powerful domain (PR 9), which made the forums worth monitoring.
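If you’d rather not eyeball a referrer report line by line, a few lines of Python can tally an exported referrer list by domain so surprises like a forum link stand out. The URLs below are made up for illustration:

```python
from collections import Counter
from urllib.parse import urlparse

def top_referring_domains(referrer_urls, n=3):
    """Count referring URLs by hostname and return the most common."""
    domains = Counter(urlparse(url).netloc for url in referrer_urls)
    return domains.most_common(n)

# Hypothetical export from an analytics referrer report:
referrers = [
    "http://www.bbc.co.uk/dna/mb/F123?thread=456",
    "http://www.bbc.co.uk/dna/mb/F123?thread=789",
    "http://www.google.com/search?q=widgets",
]
print(top_referring_domains(referrers))
```

A domain that keeps reappearing in the tally is a conversation worth joining, or at least monitoring.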
This list is by no means exhaustive, but merely a starting point. I’m sure some will notice that I have not included Google Alerts. It’s certainly a good tracking utility, but it might not provide the sort of granular conversation tracking you are looking for, particularly when it comes to forums and message boards (not to mention the need to diversify your sources). In the end, people are talking about you and your product all over the World Wide Web. User engagement is the name of the game. You can increase your brand awareness and improve your rankings through links and your sales through direct contact.
It is said that in every market (even down markets) there is opportunity, and nobody wants to remind advertisers of that more than Google. And for good reason – media speculation abounds with optimistic predictions of SEM being relatively insulated from economic recession. As advertisers demand more accountability from their dollars and rein in once bountiful branding campaigns in preparation for a hard winter, direct marketing efforts take a more prominent role – with dollars shifting towards performance-based search.
The site contains a virtual menu of tools to help research, plan, and optimize campaigns. In times where every penny counts, there is a noticeable emphasis on the fact that many of these resources are free. It’s an impressive roster that offers a smorgasbord of sophisticated, no cost SEM applications:
Insights for Search
The greater psychology behind the site is spot-on since people crave more control and the ability to influence the outcome of their campaigns, companies, and lives in times of uncertainty. Google is reminding marketers that with help from the right resources and strategies they have the ability to turn an otherwise dismal 2009 into a year of opportunity.
On Wednesday, Yahoo! announced the release of their new Yahoo! Web Analytics (beta) tool, which stems from their acquisition of IndexTools this past May. Website owners who employ the new tool can expect powerful data and insights, as well as real-time reports and graphs that will help identify ways in which to improve a site’s overall performance.
“Nothing is worse for site owners—and consumers—than bad marketing or a lousy user experience,” states Jitendra Kavathekar on the Yahoo! blog. “Here’s an easy tool designed to combat them… and fast.”
Yahoo! also stated that they are expecting to release this new service in “phases” for the remainder of 2008 and through the first part of 2009. Their first major deployment was Yahoo! Small Business, in time for the upcoming busy holiday shopping season.
This will be a free service to clients and advertisers who accept the standard Yahoo! agreement.
“We have already started to roll Yahoo! Web Analytics out to advertisers who seek Yahoo!’s help to build custom micro-sites, as well as to third-party application developers who build widgets and other mini-apps for Yahoo! users via our developer network or our new Yahoo! Open Strategy tools,” Yahoo!’s blog states.
In order to receive future updates on Yahoo! Web Analytics, users are encouraged to complete a sign-up form on the Yahoo! Web Analytics (beta) homepage.
Google announced last week that it will start selling TV ads for NBC Universal’s cable stations, a historic opportunity for both the television network and the search engine giant.
The agreement would allow Google TV Ads to buy time on six of NBC’s cable networks, including MSNBC, Oxygen and CNBC. The companies will split the revenue from the commercials, although it is unknown how many ad spots NBC will make available to Google. Both companies have said there is a possibility that they will eventually extend the service to NBC’s local markets, including a Spanish-language network and multiple NBC-owned websites.
“In NBCU, we’ve found another partner that shares Google’s commitment to innovation, research and the power of technology,” states Google’s official media blog. “With valuable feedback from advertisers, agencies and inventory partners, we’ve continued to evolve and strengthen the Google TV Ads platform.”
“This is just a continuing effort to try to find ways to take the transactional burden out of the mix, so that we can, all of us, focus on making advertising more effective for our clients,” said Mike Pilot, NBC Universal’s president-sales and marketing. “That’s what they care about at the end of the day.”
According to recent press reports, Google has begun similar programs, or Web tools, for users to buy ad space for both radio and print.
Dart throwing, names in a hat, coin tosses, and binge drinking are all accepted formulas for devising a list of keywords for an SEM campaign. There are, however, differences between creating paid search keywords and natural search keywords.
The main difference between paid search keywords and natural search keywords is the breadth of the list you generate. With paid search, your keywords are at the mercy of a user’s search, which is unfortunate because a user’s intelligence pales in comparison to that of an algorithm. This is why paid search keywords need to be both precise and all-encompassing. Take, for instance, name-brand HDTVs. Keywords such as “Sony” and “Magnavox” need to be included, but those are all very high-volume keywords, and unless you are willing to pay five or more dollars a click, chances are those keywords won’t see the first page of results. If you instead tweak your keywords to match a probable user comparison thought process, a keyword like “Sony vs Magnavox” will be of more benefit to your overall campaign.
Another important thing to remember in keyword generation is that your customers are not looking for you; they are looking for your product or service, which means they will consider anyone who offers it. Bidding on competitors’ names has received some bad press in the past, but it’s a smart (and legal) way to be seen in the search engines. There is no need to use it as an assault on the competition, but having your name rank among their results will be of benefit to users shopping for your product or service.
Those were just two tips to keep in mind for keyword generation. For the actual generation itself, there is once again no way a human’s intellect can match that of a machine, which is why keyword generation websites can immensely help the process. There are many tools out there, some free (Google AdWords), that generate a list of keywords along with the number of impressions. Alongside this list should be a short list of negative keywords, which tell the search engines not to run your ad when an unwanted term appears in the query. For example, a pickle farmer selling jars online might add “recipe” as a negative keyword to avoid paying for clicks from searchers who want canning instructions rather than pickles.
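A negative keyword is easiest to picture as a filter. The Python sketch below is a toy model with invented keyword lists (real ad platforms match queries far more elaborately), but it captures the gating logic:

```python
def matches_ad(query: str, keywords, negatives) -> bool:
    """Run the ad only if the query contains a bid keyword
    and none of the negative keywords."""
    words = query.lower().split()
    has_keyword = any(k in query.lower() for k in keywords)
    has_negative = any(n in words for n in negatives)
    return has_keyword and not has_negative

bid_terms = ["pickles"]
negatives = ["recipe", "free"]
print(matches_ad("buy dill pickles online", bid_terms, negatives))  # True
print(matches_ad("dill pickles recipe", bid_terms, negatives))      # False
```

The second query still contains the bid keyword, but the negative term suppresses the ad, which is exactly the wasted-click scenario negatives exist to prevent.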
Developing your paid search keywords is actually the first step in generating keywords for a natural search campaign. Throughout the first months of a paid search campaign you will begin to see which keywords are outperforming the rest. The top five, along with a few long-tail keywords that perform well, should be considered for your natural search keyword list. Performance depends on a robust clickthrough rate as well as impressions. In natural search the only thing search engines care about is relevancy, which is why these keywords should be the 15–20 best keywords that define your site.
If you do not plan on running a paid search campaign preceding your natural search efforts, tools like Wordtracker are useful when trying to find high-volume keywords. However, these high-volume words should be carefully chosen. Terms like “real estate” are too broad, which is why “Louisville KY real estate” would work better for the algorithm retrieving your site. Also, try to stay away from prepositions and conjunctions. While words like “and,” “but” and “for” work for your human users in paid search campaigns, they are usually bypassed by the search engines.
In terms of keeping up with your competitors, viewing their source code to see the keywords they have used in places such as the title, headings, and descriptions is useful when generating keyword ideas for your own site.
Above all, it’s important to realize the monetary distinction between paid and natural search keywords. With paid search, a keyword’s cost is essentially a per-click price multiplied by the number of times it is clicked, and you can manage that number with targeted ad copy. For example, the keyword “SPAM prevention software” will be seen by a large market, but if your software is designed for enterprises only, creating copy targeted at enterprises will filter out clicks from the consumer market. With SEO, your keywords are more of an investment on the basis of impressions, which is why the same “SPAM prevention software” keyword might not be the best overall investment if only 10% of the total audience searching for SPAM prevention software is actually your market.
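That monetary distinction can be put into toy numbers. Every figure below is invented for illustration; the point is only that paid cost scales with clicks, while an SEO investment amortizes over the traffic it earns:

```python
def ppc_cost(cpc: float, clicks: int) -> float:
    """Paid search: total spend scales linearly with clicks."""
    return cpc * clicks

def seo_cost_per_visit(investment: float, monthly_visits: int, months: int) -> float:
    """Natural search: a fixed investment spread over the visits it earns."""
    return investment / (monthly_visits * months)

print(ppc_cost(2.50, 1000))                          # spend for 1,000 paid clicks
print(round(seo_cost_per_visit(6000, 1000, 12), 2))  # cost per organic visit
```

With these made-up figures, the paid clicks cost $2.50 apiece no matter how long the campaign runs, while the SEO investment gets cheaper per visit the longer the rankings hold.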
While I don’t want to explicitly say that paid search is an art while natural search is a science, it more or less shakes out that way. The majority of the time a user will not put much thought into entering his or her keywords, which is why it is important to think expansively in order to have all bases covered in a paid search campaign. Conversely, in natural search you are catering to an algorithm that is essentially a cold, heartless robot that wants nothing but results. Much like an elderly, widowed piano teacher.