Love: optimizing your universal search presence. The natural search results aren’t the only results that deserve optimization attention. Universal search requires a holistic approach to SEO: you also need to optimize for local, blog, video and image search, and, if you have an e-commerce website, product search, which appears in the results as shopping results. Check out the Product Search for Webmasters video from Google on how to go about optimizing for shopping results. You’ll also need a Google Base account to get started with the optimization.
Love: focusing your efforts on more important things…I kid (sort of): the news from Google that PageRank sculpting does not work the way SEOs thought is important. The Google man himself, Matt Cutts, explains it on his blog. The basics:
Your page has a PageRank score of 8
It has 4 outgoing links
Left as is, each link passes along 2 points of PageRank: 8 divided by 4
Previously, if 2 of those links pointed at less important pages, “Contact Us” and “About Us” for example, some SEOs would nofollow those links
In doing so, it was believed this allowed the other 2 links to pass along 4 points of PageRank each rather than 2
In fact, PageRank is divided among all outgoing links, nofollowed or not, so the followed links still pass along only 2 points each and the nofollowed links’ share simply evaporates; sculpting the flow now requires other techniques
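To make the arithmetic concrete, here’s a minimal sketch in Python of the flow-of-points intuition from the list above (simplified numbers, not Google’s actual formula):

```python
# Simplified PageRank-sculpting arithmetic; illustrative only.
page_rank = 8
links = ["Products", "Services", "Contact Us", "About Us"]
nofollowed = {"Contact Us", "About Us"}

# Old belief: PageRank splits among followed links only.
believed_share = page_rank / (len(links) - len(nofollowed))  # 4.0 points each

# Per Matt Cutts: PageRank splits among ALL outgoing links,
# and the nofollowed links' share simply evaporates.
actual_share = page_rank / len(links)                        # 2.0 points each

print(f"believed flow per followed link: {believed_share}")
print(f"actual flow per followed link:   {actual_share}")
print(f"PageRank that evaporates:        {len(nofollowed) * actual_share}")
```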
Here is an SEOmoz post on the topic that also weighs the pros and cons from the SEO perspective.
Love: simple and reliable tools that allow you to track your company and its keywords across multiple channels – blogs, microblogs (Twitter, FriendFeed), social bookmarks, comments (blog, forum or otherwise), news, video and more. It’s extremely easy to get bogged down with tools, just as it is with too much data. Personally, I tend to stick with those that are simple, efficient and reliable, and I don’t often switch unless a tool lets me deliver even more actionable insight – Occam wins. SocialMention is simple and reliable. Not only does it provide links to the blogs, blog comments, Q&A sites, social bookmarks and more that mention your company or keywords, but it also provides data on sentiment (positive versus negative mentions), reach (the number of unique authors mentioning the entered keyword) and other metrics.
Love: making your website not only findable, but actually accessible to everyone. Think of the user. Remain outwardly focused (just like phenomenal non-profits). Not everyone uses the same setup for surfing the Internet, so you should ensure nearly all users are able to actually find information on your site once they’ve found you on the search engines. There are a few pointers in this post that play a role in SEO:
Supply proper meta tags – small piece of the pie, but a piece of the pie nonetheless
Use accessible navigation – descriptive title and header tags provide keyword relevancy and help structure your site, which can improve Google’s ability to generate Sitelinks
Love: when you get information about search straight from the mouths of the leviathans. This whitepaper, distributed by Microsoft, details Bing’s features, the layout and structure of its search results page, and much more.
Love: unraveling local search ranking factors. So you’re a small business (or a large one for that matter), and after doing some reading on SEO, you’ve gone to Google Local Business Center and claimed your business’ listing. You’ve read about PageRank and the importance of attracting high quality incoming links with your sparkling content. But what other factors go into the ranking algorithms for local search? How do you climb up that 10-pack? How do you improve your “Location Prominence” score–the equivalent of PageRank? In this post, Mike Blumenthal takes a look at a Google patent to help provide insight into the factors that explicitly help determine this Location Prominence.
Potential Factors in Ranking a Website Highly for Location Specific Searches:
Incoming links – not simply directory links, but links from other authoritative sites; sites with a high PageRank or Location Prominence score.
Reviews – I’m particularly interested in how Google uses reviews as a factor in local search rankings. There are the metrics that are already quantified–the actual number of reviews a business has received on a site like Yelp, for example, and the rating itself: 3, 4 or 5 stars. But how do you quantify the content of a review? How do you turn “good”, “bad”, “efficient”, “okay”, “disgusting”, “spicy” or “pusillanimous” (maybe you rented a guard dog, alright) into a number? What’s the scale for all the negative words? What’s the most negative word you can give a restaurant? Does that word pass along a -100 score? (One naive approach is sketched after this list.)
Citations – it’s not merely about links, but about how many times your business name and its accompanying address appear on a website without being a link.
Information about the business – search engines want information. It helps them develop a rich tapestry of search results. They’re machines, not humans; they can’t decipher meaning like you and me. Giving the search engines scant information about your business is like photographing it with a cheap camera instead of a professional one. If you don’t participate in sites like Yelp and Google Local Business Center, comment on industry blogs, or add your business to Best of the Web, then you’re taking a picture of your business with that cheap camera. Google wants you to use the Nikon D3X! What’s the business’ annual revenue? How many employees does the business have? How long has the business been in existence, and how long has it been present in listings across the web?
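On the review question above, here’s one naive way a machine could turn review text into a number: a toy lexicon-based sentiment score. The word list and weights are invented for illustration and have nothing to do with Google’s actual method:

```python
# Toy lexicon-based sentiment scoring: sum hand-assigned word weights.
LEXICON = {
    "good": 1, "great": 2, "efficient": 1, "okay": 0,
    "bad": -1, "slow": -1, "rude": -2, "disgusting": -3,
}

def review_score(text: str) -> int:
    # Strip basic punctuation, lowercase, and sum known word weights.
    words = text.lower().replace(",", " ").replace(".", " ").split()
    return sum(LEXICON.get(word, 0) for word in words)

print(review_score("Great food, efficient service."))                    #  3
print(review_score("The waiter was rude and the soup was disgusting."))  # -5
```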
Love: the need for speed! Recently, Google announced it was open sourcing Page Speed, a nifty Firefox add-on integrated with another superb tool called Firebug. Page load time is a factor in quality score on the PPC side of life, and there have been rumblings for some time about whether page load time plays, or will play, a role in natural search rankings. Let’s assume it doesn’t, though. Does that mean I should compress the images on my site, enable gzip compression or remove unused CSS anyway? If you happen to have a site that takes a bit longer than usual to load, I’d vote yes. Users find pages that take too long to load annoying, which translates into users bouncing away. The thinking behind improving page load, and as a corollary the user experience, is driven by five best practices (the payload idea is made concrete in a sketch below):
Optimizing caching – keeping your application’s data and logic off the network entirely
Minimizing round-trip times – reducing the number of serial request-response cycles
Minimizing request size – reducing upload size
Minimizing payload size – reducing the size of responses, downloads and cached pages
Optimizing browser rendering – improving the browser’s layout of a page
Aside: “…reducing…cached pages.” Hmm, interesting. Nofollow links to your About Us page, AND robots.txt them out?
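To make the payload point concrete, here’s a minimal sketch (using only Python’s standard library) of how much gzip compression can shave off a repetitive HTML response; real savings depend on your actual markup:

```python
# Illustration of "minimizing payload size": gzip a chunk of
# repetitive HTML and compare the bytes a browser must download.
import gzip

html = ("<div class='product'><h2>Widget</h2><p>Lorem ipsum dolor "
        "sit amet, consectetur adipiscing elit.</p></div>\n") * 200
raw = html.encode("utf-8")
zipped = gzip.compress(raw)

print(f"uncompressed: {len(raw):,} bytes")
print(f"gzipped:      {len(zipped):,} bytes")
print(f"savings:      {1 - len(zipped) / len(raw):.0%}")
```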
Love: data, but don’t allow imperfect data to cause you to freeze and not act. One of my favorite lines from this post says there is no limit to the amount of data you can collect and store on the Internet, and it’s headache-inducingly correct. I’ve mentioned in previous posts the importance of collecting data, analyzing it and then providing an interpretation for insight into what action should be taken, and of course I still feel that way, but I’m not a Quant. There’s a point where granular becomes so microscopic that the difference between dataset A and dataset B will not cause your client to change his or her decision. Therefore, you need to accept imperfection and act. I know we’re big into models and science and equations, but so was Wall Street, and we saw what happened there. Certainly collect your data, but don’t allow it to bog you down into indecision, and don’t allow incomplete data to bolster that indecision. After all, it’s all incomplete (esoteric alert!).
“How do you measure the effectiveness of your magazine ad? Now compare that to the data you have from DoubleClick. How about measuring the ability of your TV ad to reach the right audience? Compare that with measuring reach through Paid Search (or Affiliate Marketing, or …). Do you think you get more useful data from Nielsen’s TV panel of between 15k – 30k US residents to represent the diversity of TV content consumption of 200 million American television viewers?”
Love: social media for something other than retweeting, posting pictures or helping you acquire links. Social media websites work because they facilitate communication and sharing amongst users (and they allow us to talk about ourselves, of course). The good ones also work on a different level–user interface. Thinking about your website in this way, and incorporating these features, can help drastically improve your conversion rate. Remember, it’s all about the user, not you!
Love: scientifically proven ways to do anything. Who doesn’t want to be persuasive? You’re a business, right? You’re trying to tell your story in order to persuade the potential client to help you write the next chapter, right? A few favorites from the post:
Too many options necessitate selection, and hence frustration…
How restaurant mints are a personalized affair
Asking people to substantiate their decision will lead to higher commitment
So it begins. Welcome to the inaugural installment of Link Love Monday™ (alright, so it’s not really trademarked — is that illegal?) where I’ll pass along links I’ve found particularly stimulating. Ideally, all of the links I post would have been born the previous week, but monitoring industry blogs could be a full-time job and I already have one of those. So, some links will be a bit aged, having perched in my bookmarks for a while, but they will taste that much better, while others might be born on the morning of Link Love Monday. Either way, it will be an evolving weekly post — which is to say it will likely become more organized, better branded (logo in the works) and more robust. So without further ado:
Link: Using Analytics for Local Search Optimization
Love: The emphasis on selecting the proper keywords for your local search campaign. This is always important, but particularly important for small businesses where getting it right the first time can save time (a.k.a. “money”) on redoing a keyword list and the subsequent on-page changes.
“In this attorney’s case, they might quickly find that while “family law” is a formal term more preferred by their profession, more of their potential customers are likely searching for the term ‘divorce.’ And, in most cases, consumers are searching for ‘lawyers’ when trying to find listings of this type of business, rather than ‘attorneys.’”
Love: The use of the seven deadly sins to drive home the importance of partaking in social media in a responsible, authentic and sinless way. Remain outwardly focused with your social media — focused on the user. Don’t spam him, don’t ignore her, don’t clam up, open up. Each foray into the social media sphere is entirely different according to your business. Again, remain outwardly focused and adapt.
“7. Sloth: Ahhh the deadliest of all sins. Wanting it all but being too lazy to do what it takes. You have to connect with people, you have to write good stuff, you have to stay current and you have to be willing to show up and put the effort in.”
Love: Google’s decision to extend microformats into search results — the Internet will be better served with more and more structured data. What are microformats? Basically, information about information: metadata. This sort of markup allows you to tell search engines and other programs that the information contained in this HTML is, without a doubt, the name of my business, its location, phone number, fax number, et cetera. One of the more widely used microformats is the hCard — think of it as your business card for machines. Would you like to make one? Try this hCard Creator!
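For the curious, an hCard is ordinary HTML with agreed-upon class names (vcard, fn, org, adr, tel and so on). Here’s a minimal example, wrapped in a Python string for easy templating; the business details are invented:

```python
# A minimal hCard: plain HTML plus microformat class names.
# The business details below are made up for illustration.
HCARD = """\
<div class="vcard">
  <span class="fn org">Acme Tea Shop</span>
  <div class="adr">
    <span class="street-address">123 Main St</span>,
    <span class="locality">Austin</span>,
    <abbr class="region" title="Texas">TX</abbr>
    <span class="postal-code">78701</span>
  </div>
  <span class="tel">512-555-0100</span>
</div>
"""
print(HCARD)
```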
Link: In Pursuit of Elegance: 12 Indispensable Tips
Love: Simplicity. Why? It’s important to your business — specifically your business as a website. People are on the Internet looking for information. Scavenging. Scanning. Scoopering. Your website does not need to mimic the hustle and bustle of Times Square. If users liken finding information on your website to finding Waldo, then you’re losing out.
“Study the best: Google, Apple, Lexus, and Ferrari. They understand that complexity is their best friend, not an enemy. They understand it, so they can exploit it. The Google interface is clean and simple though the algorithm is massively complex. Even Einstein understood this. E=mc2 has an easy and immortal ring to it.”
Love: Tools and their ability to make your work easier. In this case, the work we’re talking about is adding multimedia to your blog posts that can entice users to keep coming back because your blog is chock-full of awesome information, whether text, images or video. I took Apture for a test drive on my personal blog and found it worked well. It’s this simple: sign up, head to a blog post, highlight a word, and an interface pops up that returns music, videos, maps, slideshows, Tweets, news and more related to the highlighted word.
Link: Twitter Evolves
Love: thought-provoking posts – who doesn’t? And seriously, what would a link repository be without Oprah’s favorite social media platform, Twitter? If only this post were about Oprah’s use of Twitter to disseminate fashion advice. Instead, we’ll go with Twitter and copyright law.
“There are only 27^140 possible tweets, can I just copyright them all and then sue anybody who uses Twitter?”
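For fun, the quote’s arithmetic checks out under its own assumptions (a 27-symbol alphabet of 26 letters plus space, and exactly 140 characters):

```python
# Count the distinct 140-character tweets over a 27-symbol alphabet.
count = 27 ** 140
print(f"{count:.3e}")             # roughly 2.460e+200
print(len(str(count)), "digits")  # 201 digits
```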
For the third installment of the SEO Toolbox, we are featuring Enquisite Pro, a supplemental web analytics tool that gathers information from page views originating from search engines. It presents you with four reports that will improve your SEO analysis. The reports are as follows:
The Longtail report informs you of keywords, entry pages, geographic locations and organizations referring traffic to your site.
The Search Engine Comparison report allows you to compare more than one search engine and determine keyword opportunities for Yahoo! and MSN. It also includes traffic, actions and ranking page data.
The Links report provides your top referral URLs, link trends, and landing page details.
The Top Referrals report provides a visual graph of data that can be grouped and ungrouped, making it easier to zero in on relevant information.
Additionally, Enquisite is complementary to analytics tools like Google Analytics. It can augment your analysis with five distinguishing capabilities:
Streamlines data mining
Rankings page data available for keywords driving traffic/actions (even those not in your campaign)
International page rankings are available
Geographic capabilities as specific as zip codes
Ability to layer data
Enquisite Pro is a great tool to enhance the evaluation of your online efforts by providing actionable insights.
Change is Good
The Google Adwords keyword tool has changed over time, and overall, the tool continues to be a great resource for search marketers. In this most recent change, the interface remained almost exactly the same, except that two of the columns of data were replaced with new sets of data. The keyword tool now displays:
the keyword(s) that were searched for plus similar words related to that query,
paid search advertiser competition,
local search volume for the most recent completed month (new),
global monthly search volume, the average monthly search volume worldwide (new).
For the last nine months or so, the two volume columns displayed were the previous month’s search volume (not specific to your location) and an average of any month’s search volume. When the tool first launched a few years ago, search volumes were displayed in a bar graph format, and last summer Google added specific numbers to this tool, making it increasingly useful for data-hungry SEM junkies.
Local + Everything Else ≠ Global
The local column is designed to factor in my physical location and language, and display the local searches for the keywords in question. This local data will likely be helpful for both SEO and PPC research (keep in mind, these numbers reflect searches in the entire Google Network, so they are more of a guideline than an exact measurement). I am puzzled, though, about how the local search volume for a recent month is consistently notably higher than the global monthly search volume — how could this be? Barry Schwartz noted the same discrepancy in his blog post last week on Search Engine Land. I realize that sometimes the most recent month will have a higher volume than the average of the previous 12 months, but it seems to be the rule rather than the exception for the searches I tested.
Results for “black tea” in July 2008:
Both “black tea” and “organic black tea” return higher average search volume than search volume in the previous month (June). This is not surprising as black tea is usually served hot, and search volume may drop during the warm summer months.
Only one keyword displayed higher search volume for June than for the average month: “chinese black tea”.
Results for “black tea” in April 2009:
If I am doing research for a site in Austin, TX that sells black tea, based on this local data, I would definitely recommend writing some articles about the caffeine found in black tea, as well as the benefits of drinking black tea.
Although this is a small sample set, all of the returned keywords recorded higher search volume localized to Austin, Texas than the global monthly search volume. Maybe not an issue, but something to note and observe as this tool continues to be used.
As usual, these changes will require observation, testing, and analysis to determine how helpful, or how useless, they may be. In the meantime, I’ll keep wishing for the “perfect” keyword research tool!
The first featured tool is Xenu’s Link Sleuth, also called Xenu. Xenu is a computer program that tests websites for broken hyperlinks. It performs link verification on text, images, frames, plug-ins, backgrounds, local image maps, style sheets, scripts and Java applets. Xenu is an effective SEO tool because it is free, fast and accurate.
Xenu runs on Microsoft Windows, and under CrossOver on Mac. Xenu can recheck broken links, which is useful for weeding out temporary network errors. It supports SSL websites, offers partial testing of FTP and Gopher sites, and detects and reports redirected URLs. Reports can be emailed for convenience.
Xenu is not just for broken links. You can also use it to:
Identify duplicate content issues
Optimize page load time by finding the largest pages and images
Find least linked pages
Find pages with the most outgoing links
Discover deepest-buried pages and learn how easily they can be found
Recognize images that lack alternative text
Xenu’s Link Sleuth produces an expansive amount of data, so it is important to select settings appropriate to the data you need.
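For intuition about what a link checker does under the hood, here’s a minimal sketch using only Python’s standard library; it is an illustration of the idea, not Xenu’s implementation, and the URL is a placeholder:

```python
# Fetch a page, extract its anchors, and report links that don't resolve.
from html.parser import HTMLParser
from urllib.error import HTTPError, URLError
from urllib.parse import urljoin
from urllib.request import Request, urlopen

class LinkExtractor(HTMLParser):
    def __init__(self):
        super().__init__()
        self.links = []

    def handle_starttag(self, tag, attrs):
        href = dict(attrs).get("href")
        if tag == "a" and href:
            self.links.append(href)

def check_links(page_url):
    html = urlopen(Request(page_url)).read().decode("utf-8", "replace")
    parser = LinkExtractor()
    parser.feed(html)
    for href in parser.links:
        url = urljoin(page_url, href)  # resolve relative links
        try:
            urlopen(Request(url, method="HEAD"), timeout=10)
            print("OK     ", url)
        except (HTTPError, URLError) as err:
            print("BROKEN ", url, "->", err)

check_links("https://example.com/")
```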
Google SearchWiki was launched in November of 2008 with the purpose of personalizing searches and providing an opportunity to share comments among Google users regarding websites and search results. Since then, many SEM experts have been asking themselves, “where is Google going with this?” As with any new innovation, this new feature comes with pros and cons.
Danny Sullivan moderated a session on SearchWiki with Corey Anderson, a Google engineer, at the 2009 SMX West conference. The two of them managed to evoke many interesting points of discussion. According to Google’s representative, the personalized rankings (SearchWiki) are not currently being taken into consideration for Google’s primary search results. So, for all of you who started asking your friends and family to move your website into the number 1 position in their SearchWiki results: according to Google, this currently has no influence on the rankings. However, the possibility of these personalized rankings having an effect on search results has not been ruled out for the future.
One apparent negative impact Google’s SearchWiki will have on SEM professionals is the difficulty of gathering ranking data and drawing identifiable comparisons from analytics data based on those rankings, as each individual’s personalized rankings will shape that traffic. Ultimately, search rankings will matter much less if this becomes a reality.
On the other hand, Google will open up the competition with personalized rankings, as domain age will be an unlikely factor in how an individual searcher ranks sites in their personal listings. This will present the opportunity for new ecommerce and informational sites with high relevancy to a search topic to appear in the top rankings, as searchers will have the ability to judge what is the most relevant, regardless of the age of the domain.
This last point could ultimately be a plus for all web users. In order to gain rankings, website owners will be forced to include content relevant to the user rather than building sites primarily for the search engines. Users will now be choosing the websites they want to visit repeatedly, and this will show Google which sites are meeting users’ needs. Google will be giving more control to visitors; therefore, websites must be created with the primary goal of appealing to their target audience.
At this point, you may be asking yourself, “but what about spamming with negative or positive comments to push down your competitors?” Google made it clear that it will be reviewing the comments very carefully to catch any red-flag or black-hat behavior. This will hopefully ensure that this type of spamming will not be a problem for your website’s rankings.
Whether you like it or not, Google is constantly gathering information from everything you search for, regardless of whether you are logged in. Currently, Google alerts you when a search result has been customized according to previous searches and your location. However, many people may not realize that even when Google doesn’t inform you that the results have been customized, they may have been, most likely in a less obvious way. In certain instances, Google apparently judges it better to avoid user confusion than to alert the user to these customized results.
Google is also testing this SearchWiki functionality for its PPC ad space. While SEO folks are already working through the implications, PPC managers should be paying close attention as well, as this will most likely affect how quality scores and bid prices are determined in the future.
The World Wide Web is becoming more personal on a daily basis, and if websites are not prepared to handle this, they will no longer be able to compete effectively in the search space they may have dominated in the past.
Google Hot Trends is one of the most useful tools Google offers. It shows a list of keywords that have seen a spike in search traffic recently.
Unlike most search marketing tools, whose data is at least a day (if not months) old, Google Hot Trends generally shows data from the previous hour. That is about as close as it gets to “real time” in this industry.
This can be a powerful support tool for search campaigns that can monetize such traffic spikes (such as news sites or blogs).
More generally, Google Hot Trends provides a wonderful insight into what memes are bubbling up in the Internet soup. It usually includes current news, events, and the name of Hollywood’s “It Girl” plus the word [pictures]. And there are invariably pop culture references that go right over my head.
Imagine my surprise at yesterday’s list of keywords:
Did you catch that? Perhaps you should take a closer look at the fifth keyword:
Really? Didn’t that happen five years ago?
Oh yeah, it’s Super Bowl week.
As my buddy Jon Higby said, “Most people couldn’t tell you who was playing that year – but everyone remembers half time!”
Clearly, people are Feelin’ Kinda Sunday.
So, when coming up with keywords, be sure to consider the older keywords that might come back into vogue.
Just FYI, it was Super Bowl XXXVIII in February 2004, and the New England Patriots beat the Carolina Panthers 32-29 on an Adam Vinatieri field goal with four seconds left in the game.
At the beginning of the year, this post on the Apogee Search Marketing Blog made some predictions about search marketing in 2008. Before we try to make any predictions about 2009, let’s take a minute to review 2008’s search predictions compared to what actually occurred over the last 12 months.
Apogee’s 2008 search predictions were as follows:
Management tools become the cost of having a seat at the paid search table, rather than a competitive advantage. PPC management tools were certainly abundant in 2008. And, yes, they were almost necessary to a campaign’s success. Whether these tools were internal or external, focused on automated bid management, analyzing data or testing campaign variables, management tools freed up paid search managers’ time so they could focus on new opportunities, expansion and overall strategy.
Business/marketing acumen becomes more important to paid search management than technical prowess. While tools are great, tools just do what we tell them to do. Ultimately paid search managers have to set appropriate goals for marketers and outline the necessary steps to reach those goals. This year ad copy and landing page testing have gained popularity as marketers focus on increasing conversion rates. With new tools such as Google Website Optimizer (GWO), these tests are becoming easier to implement.
Search engines continue to provide better bid management functionality. Most tools vendors don’t react. Search engines have made many improvements in an effort to provide better bid management functionality in 2008, but despite all of the changes made this year, there is still a long way to go in providing reliable bid management functionality.
Google announced a new quality score method this year that determines CPC in “real-time,” as opposed to its tried and true static quality scores. It also allows for marketers to see first page bids rather than minimum bids.
AdWords Editor now allows users to download performance statistics so that analysis and adjustments can easily be made in the same interface. In addition, the newest 7.0 version allows users to see quality scores and first page bid estimates for keywords.
Yahoo! now allows marketers to view average rankings when in the bid editing page.
MSN Live Search released a desktop beta tool that is essentially an AdWords Editor for Microsoft.
Bid management tools are also still a bit behind the curve. While their automation saves paid search managers time by adjusting bids, they are slow to react to changes made by the search engines. Adjusting bids manually within the search engine’s interface is often more complicated than just using the free tools offered by the search engines. As for full-blown campaign management, we’re still not seeing many tools with the ability to handle that functionality yet.
Google extends its lead in the paid search market, either a little or a lot, depending upon how you measure the industry. Without a doubt, Google continues to be the leader in the paid search realm. comScore recently reported that in October 2008 Google Sites held 63.1% of all searches, compared to 58.5% in October 2007. Google’s revenue also increased 31% from third quarter 2007 to third quarter 2008, raking in $5.54 billion in Q3 2008. As for service offerings, Google rolled out tool after tool after tool aimed at helping paid search marketers in 2008. All of these tools successfully assist marketers in optimizing and expanding their paid search campaigns, allowing Google to maintain and grow its steady cash flow.
Local search continues to grow, but still has a difficult time providing substantive traffic in most markets. This year businesses flocked to Google Local Business Center. It has become “the” thing to do. As Universal Search rolled out throughout the year, local search optimization became even more visible and critical. In most industries and major cities, a location-specific business that is not on Google Local will basically be behind by the end of this year. Furthermore, Google’s Local Business Ads (LBAs), a version of paid ads that appear mostly on Google Maps, contributed heavily to local search’s growth in 2008.
Google rolls Click-to-Call in with its local search service, and still no one cares. Not much word about Click-to-Call this year; still no one cares. What has gained recognition in 2008 is phone call tracking for paid search campaigns. Companies such as ClickPath provide the ability to track calls down to the keyword level.
Google Pay Per Action gains traction with B2C advertisers, struggles with B2B advertisers. Google launched the Pay Per Action beta globally in June 2007, but phased it out in 2008, citing the DoubleClick/Performics acquisition as the reason.
Google Product Search (previously Froogle) celebrates its sixth birthday, remains in beta. Yes, Google Product Search is still in beta. During 2008, this product caught up with other comparison shopping engines by showing groups of similar products when a search is performed. This change caught some bloggers’ eyes when it first rolled out, but ironically, later in the year Google Product Search made it onto a list of search engines you’ve never heard of. An option that many companies have not yet tapped into is submitting services just as you would submit products.
Google continues to rail against paid links. The paid linking industry adjusts and continues to provide SEO benefit to its clients. SocialSpark was launched by PayPerPost in mid-2008, and the head of Google’s webspam team, Matt Cutts, says he actually likes IZEA’s new service. SocialSpark provides advertisers an opportunity to pay bloggers for a review but requires a nofollow link to the advertiser. Another paid link vendor, Text Link Ads (TLA), launched InLinks publicly in November. Throughout 2008, Google has commented and posted extensively that paid links are in violation of the FTC’s Guides Concerning Use of Endorsements and Testimonials in Advertising. SEO bloggers have been debating the impact of these changes as recently as the last few weeks. For whatever reason, Yahoo! and MSN haven’t been quite as vocal against paid linking in 2008. Yahoo! isn’t worried about the payment so much as the likelihood that a paid link doesn’t carry as much value as a non-paid link.
SEO becomes more metrics-driven as companies learn to measure their SEO performance. Absolutely. Tracking SEO leads and sales provides the ROI figures that are critical to include in a company’s overall marketing expenditure analysis. The tricky part is whether the company knows what the value of a lead is, or whether it can track natural search visitors all the way through to a sale. During 2008, fewer companies were concerned about rankings as they were forced to look more at the bottom line.
Rumors swirl about an imminent merger between Yahoo! and Microsoft, triggering a deluge of blog posts and nothing else. Yes, definitely. Talks between Yahoo! and Microsoft surfaced again early in the year but fizzled quickly. In mid-2008, Google stole Yahoo! away from Microsoft and grabbed everyone’s attention, but that too died out by the end of the year. Although Google and Yahoo! gave it a shot with a trial period during the spring, antitrust scrutiny and regulatory concerns ultimately caused Google to call it quits with Yahoo!.
The line between Search Engine Optimization and Social Media Marketing blurs further, except among those who actually know how to perform SEO and/or SMM. SEO and SMM definitely continue to be blurred in some circles (e.g., many marketers think creating a Facebook page will greatly help their search engine efforts). While creating a company profile in these outlets is a fairly straightforward process, actually promoting it is much trickier and requires a completely different set of goals and strategies. Tracking offline inquiries becomes an important consideration but is not yet mastered in most campaigns.
Alissa began by explaining that Google Website Optimizer is a free tool that website owners can use to determine the most effective combination of content on their site through a variety of tests. The results from these tests can then be used to maximize conversion rates among existing traffic to your site.
GWO can conduct both A/B testing and multivariate testing. A/B testing will test two different pages against one another, and multivariate testing will test the various elements on the page to determine the best combination.
Alissa then offered a few strategies to keep in mind when using Google Website Optimizer:
Ask big questions. You will only receive answers for the things that you actively test. You should test drastic changes rather than slight changes to obtain statistically significant data.
Spend time developing well-written copy to test.
Wait for significant data. GWO will indicate when a sample is large enough to be useful.
Alissa pointed out that there are different situations in which one would want to use A/B testing versus multivariate testing:
Multivariate tests compare different elements of a page. Google uses a full factorial test, so it is very easy to generate a large number of combinations even when testing just a few elements: three headlines, two images and two buttons already yield 3 × 2 × 2 = 12 combinations. These tests require a high amount of traffic to achieve statistical significance.
A/B testing is suited for sites with less traffic because it requires a much smaller sample size to achieve statistical significance (a minimal significance check is sketched below). This is the best option for testing a complete site redesign.
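As a companion to the sample-size point, here’s a minimal sketch (standard-library Python, not part of GWO) of judging an A/B test with a two-proportion z-test; the conversion numbers are hypothetical:

```python
# Two-sided p-value for "versions A and B convert at the same rate".
from math import erf, sqrt

def ab_test_p_value(conv_a, visits_a, conv_b, visits_b):
    p_a, p_b = conv_a / visits_a, conv_b / visits_b
    pooled = (conv_a + conv_b) / (visits_a + visits_b)
    se = sqrt(pooled * (1 - pooled) * (1 / visits_a + 1 / visits_b))
    z = (p_b - p_a) / se
    return 2 * (1 - 0.5 * (1 + erf(abs(z) / sqrt(2))))  # normal CDF tail

# Hypothetical: original page converts 40/1000, redesign 65/1000.
p = ab_test_p_value(40, 1000, 65, 1000)
print(f"p-value: {p:.4f}")  # below 0.05 -> unlikely to be random chance
```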
Finally, Alissa offered a few rules to follow when using Google Website Optimizer:
Wait for statistically significant data. As discussed earlier, initial results might reflect random chance, so wait until you have had enough traffic for the information to be significant.
Limit the elements being tested. If you create too many different combinations, it will take a great deal of time for GWO to produce meaningful results from your test.
Spend as much time developing content for a test as you would for a website. Even though this is a test, you are testing potential real options for your site. Do not get sloppy with the content just because it is a test.