Brands influence purchases; there is really no argument about that. The art of advertising has paired a brand with a single adjective ever since ads moved beyond basic product descriptions. For example: Coca-Cola=Classic, Gatorade=Quench, and Volvo=Safety. In today’s market, Google=Search and Bing=Decide. What would happen if brands were to disappear and only the bare-bones services and products, with no glitz or glamour, were left behind? Michael Kordahi has provided the public with just that: a way to compare search results without the big three’s branding.
What is BlindSearch?
Kordahi has taken the famous Coke vs. Pepsi blind taste test and applied it to modern-day technology. The introduction of BlindSearch gives Internet users the ability to see results from Google, Bing, and Yahoo! without any branding or layout cues.
BlindSearch presents the results in three side-by-side columns with a voting button at the top of each. Each column contains results from Google, Bing, or Yahoo! Users vote for the column that most closely matches their desired results. After voting, the buttons are replaced with the search engine logos to reveal which engine’s results were the best match. Don’t get me wrong, this isn’t anything brand new; there are other search engine comparison sites out there. However, I found this one particularly blog-worthy since the results are listed side by side rather than tabbed, as in Zuula (not to mention it’s a fun little test).
Why I’m writing about it
BlindSearch isn’t really a big deal per se. It is just a fun, quick way to see which search engine may be better for you without a hundred experts telling you which one you should use. On the plus side, BlindSearch helps beat out some lazy tendencies when doing deeper web research. Additionally, even at a base level, it may nudge some users into considering other search alternatives in their everyday routines.
In no way is BlindSearch a statistical tool that will be the end-all, be-all of the search engine supremacy argument. Even if it were 100% effective, it could produce the same outcome as the Coke vs. Pepsi test, in which Pepsi was generally better liked but Coke remained the dominant force in sales.
Kordahi himself is the first to acknowledge BlindSearch’s shortcomings, with a clear disclaimer on the BlindSearch homepage:
“The system has many flaws that I know about already, the primary one of interest is the lack of localisation. So, all searches are going through the US as US searches. The other deficiency worth noting is that there is much missing from the actual experience of using these search engines eg, image thumbnails, suggestions, refine queries, etc.”
There are other arguments that can be made against the accuracy, relevancy, and even importance of this experiment.
First, it could be that modern search engines are already too much alike in terms of results. Search engines have begun to emulate the leaders, taking what is effective and applying it to their own engines.
A simple comparison can be drawn with handbags. Designer handbags are extremely popular, and high-end stores carry these high-end bags. To provide an alternative, lower-end stores emulate the designs, colors, and patterns and sell similar bags for a cheaper price. After a bit of time, knockoff purses are released that are identical to the high-end originals and, if done correctly, can rarely be told apart. Basically, the best was emulated, and now even the competitors have a similar product.
A second argument emerges when viewing the top ten queries. The list shows a fairly generic breadth of searches; none of the ten shows any long-tail queries, and therefore the data does not really replicate normal search behavior. This could be due in part to lazy testing and a desire to get a result as quickly as possible to “test” a user’s search engine preference. It could also be due in part to the limited amount of data: only 600,000 queries to date.
Lastly, BlindSearch is the sort of site where users try to get off-the-wall results and test its boundaries and parameters. The site is small, and the audience is certainly a select sample.
Kordahi released the following results with roughly 8 weeks of data:
Google: 41%, Bing: 31%, Yahoo: 28%
Although an employee of Microsoft, Kordahi makes it abundantly clear that this project is not initiated by or affiliated with Microsoft.
2009 has thus far been the year of innovation in search engine marketing. The ever-growing popularity of Twitter and its newly introduced Twitter Search has opened the floodgates of what is now referred to as “Real-Time Search.” The demand for instant, relevant results has spawned a slew of new search engines such as Cuil, Wolfram Alpha, and the popular Bing. Google’s Larry Page has even said, “I have always thought we needed to index the web every second to allow real-time search.” Two of the newest real-time search engines, Collecta and CrowdEye, have done just that. Essentially, the functionality of these two new real-time search engines begs the question: how and why are they different from traditional search engines?
On June 18, 2009, the world was introduced to the first true real-time search engine: the no-frills, all-results Collecta. Breaking into the search engine market, Collecta draws information from blogs using WordPress; news services such as Fox, CNN, and Reuters; social networking sites like Twitter, Jaiku, and Identica; and even images from Flickr, as described on its homepage. The result of Collecta’s efforts is a simple user interface (UI) that displays real-time results in the left column and a preview on the right.
As described by TechCrunch, Collecta’s advantage over typical search engines rests in its use of a Web standard called Extensible Messaging and Presence Protocol (XMPP). Using XMPP rather than traditional HTTP allows data to travel from one party to another substantially faster, which lets Collecta deliver true real-time results to the public.
Attempting to use the service on launch day was less than thrilling. I made several searches and (after waiting an average of six minutes for a result) my questions were still unanswered. It seemed as though Collecta’s servers were not ready to handle the launch-day influx of users hungry for a taste of real-time results, and the company said as much via Twitter. Using the search engine a few days later, I noticed an almost instant return for my searches, but I still have a few reservations.
First and foremost, the results are not organized. Rather than seeing results ranked by relevancy, I see only the latest blurb regarding my query. This tightens the vise on top companies to not only be the first to obtain valuable information on a subject, but also be the first to Tweet or distribute it via a social media outlet, and then continue posting updates to ensure their presence on the results page. Secondly, finding answers to less popular questions is nearly impossible, since the success of a query is measured by its current relevance to the news. Collecta appears to be a great option for breaking news, sports, and current comments on products or brands, as described by Collecta’s CEO Gerry Campbell. At this point, it is the initial impression of users that will determine whether Collecta’s juice is worth the squeeze.
Sharing Collecta’s June 18, 2009 launch date is CrowdEye, an alternative real-time search engine. Similar to Collecta, CrowdEye scours the Web for real-time information through Twitter and provides up to date results on the newest topics.
So what is so different about CrowdEye? First and foremost, after more carefully reviewing CrowdEye’s FAQ, you will realize that “real-time” does not necessarily mean every Tweet as it happens. Instead, CrowdEye is only able to index “a large subset of tweets.” The disadvantage of this approach is that CrowdEye cannot produce second-by-second results as Collecta does.
Despite this small disadvantage, I find CrowdEye to be substantially more user-friendly. CrowdEye divides results into Popular Links and Tweets, allowing natural search results to appear alongside real-time social media results. The page also includes a graph of Tweet volume over time, which lets the user see the recent history of specific trends. Additionally, CrowdEye surfaces common words that you can click to create a filter and refine your search. Using CrowdEye for a broad range of searches answered my questions with relevant results through a nice mix of traditional and real-time search.
Constant developments by Facebook, Google, and other Web powerhouses ensure that the refinement of real-time search is far from over. To maximize exposure, a company must implement traditional search engine optimization, integrate social media tactics, and now, with the introduction of real-time search, continually post relevant information within seconds of obtaining it.
It seems as though Collecta and CrowdEye have laid a solid foundation from which future search engines can learn, adapt, and tweak to bring us exactly what we want from our Web queries. As with everything Web related, the market for real-time search will continue to evolve and we as consumers, owners and contributors must continue to stay afloat.
In a move that is pretty commonplace within the search industry, Microsoft has attempted to make a splash as a search engine yet again. This time, instead of simply adding new features to MSN or Live, the company launched an entirely new brand, Bing.com, at the end of May. So, what is Bing? What impact has it made? What does “Bing & decide” even mean?
Here are some quick & easy definitions, answers, thoughts, and more about Bing:
Bing is “a new approach to user experience and intuitive tools to help customers make better decisions, focusing initially on four key vertical areas: making a purchase decision, planning a trip, researching a health condition, or finding a local business.” Microsoft is attempting to diverge from the established entity we all know and love, the search engine. Steve Ballmer, Microsoft’s CEO, is quoted in the company’s official Bing press release as saying,
“Today, search engines do a decent job of helping people navigate the Web and find information, but they don’t do a very good job of enabling people to use the information they find.”
Hmm, I definitely need help navigating through billions of webpages, but do people really need help using information? I interpreted “Bing & decide” to mean that I search with Bing.com, and then I decide how to use the information I find. But thanks, Steve; I guess we do need help deciding what to do.
Market Share: Does Bing have a chance?
This is a great question, and one that won’t really have an answer for at least another six months or so. Ultimately, Microsoft needs to increase its search market share to attract advertisers and make more money. Non-search-industry geeks have to actually hear about Bing, go to Bing.com, use it, like what they see, remember to go back and search from their cell phones later, etc. Bing has launched a traditional advertising campaign to hopefully scoop up the masses. According to comScore, in the first week after Bing’s launch, Microsoft increased its usage among American searchers by almost 2%; other sources say Bing has squeaked past Yahoo! worldwide. Not bad, but only time will tell if Bing & Decide will really stick.
Shake up for Analytics
Web analytics tools have had to move quickly to tweak their products to measure and report Bing.com data. Webtrends and Omniture have released statements and posted on their blogs about how their customers can view analytics for Bing. Google Analytics first started reporting Bing as a search engine as of June 5, though it is sometimes still shown as a referring site as well.
Bing has introduced further complications for web analytics as well. Similar to Google’s longer snippets launched a few months ago, Bing searchers can hover to the right of a search listing to view content on the page and a potential navigation path even before clicking the listing. Since analytics code isn’t activated until a user clicks through to a site, these new features in search results may change the way a “website visit” is defined. Some are concerned that visits will decrease if searchers can preview each listing without visiting it, but the result of this change may not be as dismal as some have predicted. As mentioned in MediaPost, “Preliminary data suggests that bounce rates on Web sites have declined from people originating on Bing.” For the first 10 days of June, a site I track that has 30K unique visitors per month had a bounce rate that was 5% lower from Bing than from Google, which supports the notion that previews of listings can be a good thing for websites.
Although Microsoft’s CEO mentioned publicly that Bing.com was one of the only easy domains to purchase at a decent price, there is lots of speculation about the “true” meaning of B.I.N.G. I guess we’ll never know…
To me, Bing represents an attempt at healthy competition in the uncompetitive search industry. I’m all for it–as Celine Dion said (something that is a little too perfect for Microsoft and Bing), “I’m not in competition with anybody but myself. My goal is to beat my last performance.”
Okay, so the first installment of Link Love Monday was actually born to two proud parents on a Monday. The next installment hit on a Tuesday. Therefore, it’s only fitting that Link Love “Someday” occur on a Friday eventually. Things have been quite hectic around here to say the least, but due diligence will be paid to ensure a specific day of the week (most likely Friday) is linked with love. On to the links!

Link: Tracking Transactions Back to the Initial Referrer with Google Analytics
Love tracking your website’s progress at an even more granular level than Google Analytics normally allows. Let’s say you’re running a PPC campaign and a user clicks on your ad, heads to the page, but does not initially fill out a form or make a purchase. He or she needs time to mull. Some of us are mullers. We need time. The next day the user heads to Google, types in the name of your company, lands on your website, and then fills out a form or makes the purchase. By default, this transaction would be credited to organic search even though the user initially found you through your PPC campaign. The transaction can be tracked the other way as well, where the user initially finds you via natural search but returns via paid search and makes the purchase then. Either way, this will give you a better idea of how well your well-ranked website and paid search campaigns work in tandem.
“Google Analytics, by default, will attribute transactions to the last referrer. While this is all fine and good, there are some situations where you would really like to be able to track these transactions back to the initial referrer rather than the last referrer.”
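The difference between last-touch and first-touch attribution described in that quote can be sketched in a few lines of code. This is a minimal illustration only, not the actual Google Analytics cookie mechanics; the visit data and function name are invented for the example.

```python
# Minimal sketch of first-touch vs. last-touch attribution.
# The visit records and field names here are hypothetical, not the
# real Google Analytics cookie format.

def attribute(visits, model="last"):
    """Given one user's visits in chronological order, return the
    referrer that gets credit for the eventual transaction."""
    if not visits:
        return None
    visit = visits[0] if model == "first" else visits[-1]
    return visit["referrer"]

# A muller's journey: clicks a PPC ad on Monday, returns via an
# organic brand search on Tuesday and converts.
journey = [
    {"referrer": "google / cpc", "converted": False},
    {"referrer": "google / organic", "converted": True},
]

print(attribute(journey, model="last"))   # Google Analytics' default behavior
print(attribute(journey, model="first"))  # credit the initial (PPC) referrer
```

Under the default last-touch model, the organic visit gets all the credit; switching to first-touch reveals the PPC campaign that actually introduced the customer.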
Love optimizing your website after the organic or paid search click. Many potential clients come to us looking only for search engine optimization and/or paid search campaign management — and that’s perfectly okay. However, many people do not think about optimizing the site itself. Does the navigation bar remain in place as the user navigates through your site? Is there a visible call-to-action in the same spot across the entirety of the site? How complicated is the checkout process? All of these issues can cause users to leave your site due to confusion or frustration if they cannot find what they’re looking for in a timely manner. From my personal experience, if a website requires me to register to participate or make a purchase, the minute I see this requirement is the same minute I hit the back button (unless I cannot live without the product or participation). Check out these 12 tips that can make your checkout process that much more efficient for your users — you’ll likely improve your conversion rate.
“8. Keep the checkout interface simple - The checkout process is different to the rest of the browsing experience on your site. During this process your customers aren’t shopping — they’re making the purchase. This means all the browsing controls are redundant here and would only distract your customers from the task at hand. Eliminate these unnecessary elements — e.g. product category links, top products, latest offers, and so on — to keep the interface simple.”
Love the recurring theme of tracking your work. If you work with an agency that a) does not track your website’s progress through various metrics or b) tracks the website’s progress by handing you charts filled with numbers without explaining what they mean…well…then perhaps it’s time to do some shopping for yourself. Measuring your website’s progress through a combination of metrics, across multiple channels, and uncovering what those numbers mean is incredibly powerful. If there is one area where you can get a muscular leg up on your competition, it is early adoption of data-driven marketing and advertising initiatives.
“…Still, getting advertising agency employees to rely on data is difficult, agencies say. And as people trained on Wall Street migrate to Madison Avenue, executives anticipate battles between creative types and wonks. Traditional ad agencies still don’t have budgets that allow for a lot of digital experimentation, Mr. Herman says. He notes that most traditional agencies “make the bulk of their money in print, radio and television.” So even as this area becomes increasingly technology-driven, old ways of doing business and clients reluctant to embrace radically new approaches mean that the advertising culture won’t change overnight.”
Love your blog. Please. Don’t let it sit and waste away. Certainly participate in the latest social media platform if it fits within the scope of your overall online marketing strategy, but do NOT leave your blog heaving for breath on the roadside – especially if you “don’t have anything” to write about. Not only is figuring out what to write about as simple as taking these 10 tips and running with them, but your blog can serve as the hub of all of your social media efforts – direct users back to your blog and then track where they go from there. Talk to them. You can be even more authentic with 250 words as opposed to 140 characters.
“1. Grab your local newspaper – pick one column (it could be a news item or op-ed piece) and blog your own perspective on it.”
PS – that’s what we’re looking for, your perspective – subjectivity. “There are no facts, only interpretations.” Even if you don’t believe that, keep it in mind and it will help you write stirring blog posts.

Link: The Local Business Center Dashboard Opens Its Doors
Love more data on your Google Local Business listing! I know, you just can’t get enough data (alright, so maybe there is such a thing as data overload). Additional data includes:
Impressions: The number of times the business listing appeared as a result on a Google.com search or Google Maps search in a given period.
Actions: The number of times people interacted with the listing; for example, the number of times they clicked through to the business’ website or requested driving directions to the business.
Top search queries: Which queries led customers to the business listing; for example, are they finding the listing for a cafe by searching for “tea” or “coffee”?
Zip codes where driving directions originate: Which zip codes customers are coming from when they request directions to your location.
Love competition. Microsoft recently rolled out its new search engine, Bing, to the masses. It’s still too early to know whether or not people will change their search patterns and turn to Bing instead of Google (I doubt this will happen anytime soon, personally), but competition is good. Take a look at this article over at Search Engine Land to see how the search engines differ for a number of search terms. Side note: they’re certainly going after the David Letterman–watching demographic; it seemed Bing made an appearance at every commercial break last night.
For the third installment of the SEO Toolbox, we are featuring Enquisite Pro, a supplemental web analytics tool that gathers information from page views originating from a search engine. It presents you with four reports that will improve your SEO analysis. The reports are as follows:
The Longtail report informs you of keywords, entry pages, geographic locations and organizations referring traffic to your site.
The Search Engine Comparison report allows you to compare more than one search engine and determine keyword opportunities for Yahoo! and MSN. It also includes traffic, actions and ranking page data.
The Links report provides you top referral URLs, link trends, and landing page details.
The TopReferrals report can provide a visual graph of data that can be grouped and ungrouped, allowing evaluation of relevant information.
Additionally, Enquisite is complementary to analytics tools like Google Analytics. It can augment your analysis with five distinguishing capabilities:
Streamlines data mining
Rankings page data available for keywords driving traffic/actions (even those not in your campaign)
International page rankings are available
Geographic capabilities as specific as zip codes
Ability to layer data
Enquisite Pro is a great tool to enhance the evaluation of your online efforts by providing actionable insights.
Do you track the conversations about your business meandering through the Internet? You should. They are not only opportunities to connect with your customers and potential customers; these conversations can also act as link bait. The ultimate form of link bait – outside of a viral piece of content – is certainly the engaging and informative content of your website itself, but conversations don’t need to be 500-page novels. Conversations online can be as short as 140 characters (as they are on Twitter) or as long as the longest tail in the form of a blog comment. The entire Internet is a conversation. Figuring out where people are talking about your business and engaging those users, by both listening and providing information when necessary, can lead to improved customer relationships and increased brand awareness, and can serve as link bait to improve your search rankings and keep you in front of the competition that still thinks its online presence isn’t that important. So, how might you find these conversations?
Here’s a brief list of the communication coffee shops you should be visiting regularly:
Twitter

I must admit that the gap between the time I opened my Twitter account and when I actually became a part of the community stretched for months, and it wasn’t because I was too busy clipping my fingernails. Twitter does provide value. Twitter provides instantaneous information about anything and everything. Whopper Virgins? Check. Recent news? Check. Conversations about your company? Check. Head to http://search.twitter.com, type in your product or your company name, say hello and start the conversation.
Blogs

Blogs are perhaps the ultimate conversation conduits of the Internet. They allow users to write exhaustively about life, love, lemmings and your product. Also, you can leave a link, which will help drive traffic and, if the link is not “nofollow,” help increase the link juice flowing to your page. Head to Technorati and, again, search for your product or company name and engage your constituents (and add a link)!
Forums and Message Boards
BoardReader allows users to perform searches for specific posts and forums about your product or company, as well as providing a topic profile for your search. Searching forums and message boards allows you to find communities of users that are highly relevant to your business and can be an ongoing source of interaction and link juice. Blogs, unless they are targeted at your product, can be less qualified in terms of an ongoing relationship than forums or message boards, but they can still provide link juice via incoming non-nofollow links.
Google Analytics

Okay, so this isn’t a website that allows you to find conversations per se, but using Google Analytics can help you find conversations about your product or company that have already taken place (in addition to its somewhat important task of tracking visitors, goal conversions and other metrics for any SEO campaign worth a dime). While scanning the referrer traffic for a client recently, I noticed a decent amount of traffic referred to their website from bbc.co.uk. “Interesting,” I thought. I headed to Google and typed in “’www.Client’sURLHere.com’ site:bbc.co.uk” and…voila! Someone had linked to the client’s website in the forums tucked inside the BBC website. Sure, the conversation had already taken place, but it was a naturally occurring (followed) link from a powerful domain (PR 9), which made the forums worth monitoring.
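That referrer-mining step can be approximated with a short script. This is a hypothetical sketch, assuming you can export a flat list of referrer URLs from your analytics tool; the function name and sample URLs are invented for illustration.

```python
# Hypothetical sketch: tally referring hostnames from a list of
# referrer URLs (the kind of flat export an analytics tool might
# give you) to spot domains worth investigating for conversations.

from collections import Counter
from urllib.parse import urlparse

def top_referrers(referrer_urls, n=3):
    """Count referring hostnames and return the n most frequent."""
    hosts = Counter(urlparse(u).hostname for u in referrer_urls if u)
    return hosts.most_common(n)

# Invented sample data standing in for a real referrer export.
referrers = [
    "http://www.bbc.co.uk/dna/mbfood/F12345?thread=1",
    "http://www.bbc.co.uk/dna/mbfood/F12345?thread=2",
    "http://twitter.com/someuser/status/123",
]
print(top_referrers(referrers))
```

A domain that keeps showing up near the top of that tally (like bbc.co.uk in the anecdote above) is a good candidate for a follow-up `site:` search on Google to find the exact conversation.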
This list is by no means exhaustive, but merely a starting point. I’m sure some will notice that I have not included Google Alerts. It’s certainly a good tracking utility, but it might not provide the sort of granular conversation tracking you are looking for, particularly when it comes to forums and message boards (not to mention the need to diversify your sources). In the end, people are talking about you and your product all over the World Wide Web. User engagement is the name of the game. You can increase your brand awareness, improve your rankings through links, and grow your sales through direct contact.
According to Google Watch, last week Google’s CEO, Eric Schmidt, discussed an impressive $5.7 billion in sales during Q4 2008, another indicator that search marketing is still a viable advertising medium in shaky economic conditions. During that talk he revealed an enticing forecast about Google’s increasing emphasis on semantic search technology:
“Wouldn’t it be nice if Google understood the meaning of your phrase rather than just the words that are in the phrase? We’ve made a lot of discoveries in that area that are going to roll out in the next little while.”
First, what is semantic search? The best explanation I’ve seen comes compliments of ZDNet and their informational video:
“Semantic search uses the science of meaning in language—instead of just searching keywords, it checks the context of the words to return more relevant results.”
As the amount of information available online increases, there is a need for more sophisticated methods of finding valuable data for any given query. This isn’t an entirely new concept – as Google matures it gets “smarter” with matching queries to relevant results as it constantly improves its algorithm, incorporating the beginnings of semantic technology. But Google isn’t alone. A search engine that has received some buzz about its semantic search beta test is Hakia, founded in 2004. The ‘About Us’ page contrasts Hakia against other portals:
“Today’s search engines bring popular results via statistical ranking methods but a popular Web site may not always be credible, and a credible Web site may not always be popular. As a result, searchers suffer in many ways ranging from wasted search time to using misleading information. Hakia’s semantic technology provides a new search experience that is focused on quality, not popularity. Hakia’s quality search results satisfy three criteria simultaneously: They (1) come from credible Web sites recommended by librarians, (2) represent the most recent information available, and (3) remain absolutely relevant to the query.”
Whether you call it the next evolution in search or Web 3.0, the age of semantic search is closer than you think. I predict that these new developments will have significant implications for paid search and search engine optimization strategies, and will undoubtedly lead the way to a better search engine user experience.
When a new year rolls around, a flurry of activity usually occurs with businesses wanting to create new websites. On one hand, this makes sense – many times, marketing budgets are renewed, or maybe the holiday season went better than planned so there is extra cash to invest in the company. From a search engine perspective though, creating a new website for either of these two reasons is not necessarily a good idea.
Reasons why you don’t want to burn down your old site and start over from scratch:
Cost: Website design and development are expensive. For a mid-sized website (100-500 pages), a complete overhaul will probably cost you $5-10K. That cost doesn’t necessarily mean more conversions either – websites do not adhere to the “if you build it, they will come” idea.
Domain Age: Search engines prefer old domains, not old content. Domain age is one of over 100 variables that affect your site’s overall ability to rank for (and thus drive traffic through) keywords related to your business. If your site has been around since 2003 and your competitor’s has been around since 1999, their site will always score higher than yours on that variable.
Domain Authority: This variable in the search engine algorithm is one of the hardest to explain. Some important qualifiers are:
whether your site has past penalties
whether your site has accurate information
whether your site is linked within a “good neighborhood”
Aged Links: It’s no secret in the SEO world that links are a good idea. Usually for a company site, links have been naturally added to the site over time – maybe a new client released a press release about hiring your firm, perhaps an organization you joined added a link to your company on your bio page, or maybe your company joined the local BBB and received a link back to your company’s website. Those links have aged over the years, and the search engines take those into account when they are measuring the value of your site.
So, what should you do at the beginning of a new year? It’s not that a website facelift is a bad thing; in fact, it can be very beneficial to the success of your website. With a few tweaks, you can potentially double (or even triple) your conversion rate. Pursuing search engine optimization through link building and appropriate tagging can also drastically increase traffic to your site. Simply updating the content on your site on a regular basis will cause the search engines to notice and value your site even more for its fresh content.
By simply improving your current site rather than burning it down completely, you are protecting the value your website has already gained in the search engines over time, and this will ultimately boost the overall success of your website.
The year is almost done. I’ve noticed quite a number of “Best of 2008” posts floating around the search marketing industry, along with the proverbial prediction posts: what does 2009 hold for SEM? The SEO department here at Apogee batted this question back and forth during one of our team meetings to formulate a list of answers. I’m not going to delve into the entire list right now, but instead, talk a bit about one item in particular from that list: the personalization of search and rankings.
Let’s first start with a definition of personalized search. I asked a few friends not barricaded within the ivory tower of search engine marketing if they noticed they could manipulate search results on Google when logged into one of their Google accounts, or if they noticed the “Customized based on recent search activity” text at the top right corner of their search results. The consensus answer? “No.” So, as long as there is an unaware class, I am of it. Basically, personalized search is providing search results to users based on search history, search query intent and user location, and, in Google’s case, SearchWiki activity, among other signals.
Of course, increased personalized search on its own isn’t much of a bold prediction. Rather, predicting the degree to which it will infiltrate search marketing, and search engine rankings in particular, is how you grab headlines. Well, I’m not here to join the “Rankings R Dead” team or the “Personalization Won’t Matter Much” team – I know, the grey area is boring and oh so non-polarizing, but I’m not looking for true believers here. So it looks like I won’t be grabbing headlines.
So how will personalized search affect rankings and how will we utilize them in SEO?
Personalized search will have the most impact on those queries whose relevance depends heavily on location – restaurants and the like. Queries that are not location-specific will be affected only minimally, mostly through search history.
It’s logical that users receive different search results for a query such as “thai restaurants” if one lives in Ridgecrest, CA and another lives in Bremerton, WA. Location should be one of the primary signals in determining personalized results for these queries. However, it’s illogical that users receive completely different search results for queries such as “linguistics” based on location, but logical that search history slightly affect those rankings.
If search history reveals the person searching for “linguistics” frequents websites with videos, perhaps tweaking the algorithms so that websites with videos receive a bit of a boost makes sense. Nonetheless, the overriding desire here is for relevancy. Personalization does not occur in a vacuum where other users do not exist. Looking to the collective, through linking patterns, helps provide greater relevancy than merely one user’s search patterns.
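The idea above – a small, history-based nudge layered on top of a shared relevancy score – can be sketched in a few lines. This is purely illustrative: the signal name, the boost size, and the result format are my assumptions, not how Google actually implements personalization.

```python
# Hypothetical sketch of history-based re-ranking. A result is a tuple of
# (base_relevancy_score, has_video, url). If the user's search history
# suggests a preference for video, pages with video get a small boost;
# the collective relevancy score still dominates.

def personalize(results, user_prefers_video, boost=0.1):
    """Re-rank results, nudging video-heavy pages up for video-preferring users."""
    def adjusted(result):
        score, has_video, _ = result
        return score + (boost if user_prefers_video and has_video else 0.0)
    return sorted(results, key=adjusted, reverse=True)

results = [
    (0.90, False, "text-heavy-linguistics-site.example"),
    (0.85, True,  "linguistics-video-lectures.example"),
]

# With the preference signal on, the small boost is enough to reorder;
# with it off, the collective ranking stands unchanged.
print(personalize(results, user_prefers_video=True))
print(personalize(results, user_prefers_video=False))
```

Note that the boost is deliberately small relative to the base score – consistent with the point that personalization should not occur in a vacuum, and the collective signal should carry most of the weight.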
Also, I don’t think the changes will be drastic for those queries where search history can play a role in determining rankings, because Yahoo! is going to anonymize user data after 90 days, and I think Google and Microsoft will not only set a similar limit, but will be forced to do so eventually. If this plays out as predicted, search results will only be personalized to a certain extent.
Finally, how will smart SEOs utilize rankings as a metric for success?
The first example, location dependent queries, would generally fall into the local search realm. I don’t foresee a major change in this department except that local businesses will be required to seek out SEO services from their hometown agency. Rankings are not dead. They are alive, well, breathing, eating, sleeping and working for you.
For those queries potentially affected by search history, I don’t think the changes will be drastic. It seems illogical to personalize search results for the sake of personalization while detracting from relevancy. Wikipedia works so well because it looks to the collective – groups are much wiser than individuals on their own.
Will personalization force smart SEOs to adapt?
Of course. Will it potentially make tracking certain rankings harder? Yes, but harder does not mean impossible. However, proclaiming rankings dead is akin to saying search engines are dead. Search is a cornerstone of the Internet. So long as people need to find information about things they do not know, rankings will matter. So long as the search results are organized (ranked), rankings will matter because they bring exposure, traffic, leads, sales and revenue. Rankings should have always been looked at as a means to these ends.
Yahoo! Sponsored Search advertisers can now geo-target ads at the country, city or ZIP code level, according to an announcement yesterday on the Yahoo! Search Marketing Blog.
Yahoo! joins Ask.com as the only major search engine to offer specific ZIP code-based targeting to advertisers. Google AdWords currently allows advertisers to use ZIP codes as a basis for ad targeting (for example, “ads will show 20 miles around 78759”) and also allows advertisers to draw custom target areas to show their ads, but it does not offer exact ZIP code targeting.
Geo-targeting by ZIP code gives advertisers the opportunity for more relevant clicks on their ads, which means more conversions. “Keep in mind the more you target, the fewer users your ads may reach. Generally, you’re trading relevancy for volume,” as stated on the Yahoo! Search Marketing Blog. Yahoo! recommends that advertisers select a minimum of 10 ZIP codes to broaden the scope of their ads and avoid a situation in which an ad is too narrowly targeted and receives limited clicks.
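The mechanics are simple to picture: an ad is eligible for a searcher only if the searcher’s ZIP code is on the ad’s target list, and Yahoo!’s 10-ZIP recommendation is essentially a guard against lists that are too short to generate volume. The sketch below is an assumption-laden illustration of that logic – the data structure and function names are mine, not the Yahoo! Sponsored Search interface.

```python
# Illustrative model of exact ZIP-code targeting. MIN_ZIPS mirrors Yahoo!'s
# suggested minimum of 10 ZIP codes per campaign; everything else here is a
# hypothetical simplification, not a real ad-serving API.

MIN_ZIPS = 10

def eligible_ads(ads, user_zip):
    """Return only the ads whose target list contains the searcher's ZIP code."""
    return [ad for ad in ads if user_zip in ad["target_zips"]]

def too_narrow(ad):
    """Flag campaigns targeting fewer ZIPs than the suggested minimum."""
    return len(ad["target_zips"]) < MIN_ZIPS

ad = {"name": "Austin plumber", "target_zips": {"78759", "78727", "78758"}}

print(eligible_ads([ad], "78759"))  # this searcher is in a targeted ZIP
print(eligible_ads([ad], "10001"))  # a Manhattan searcher never sees the ad
print(too_narrow(ad))               # only 3 ZIPs targeted, below the suggested 10
```

The relevancy-versus-volume trade-off falls directly out of this model: every ZIP removed from `target_zips` shrinks the pool of searchers who can ever see the ad.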
Screenshot of Yahoo!’s new geo-targeting feature.
This new feature is still in beta, and as with any geo-targeted marketing, perfect accuracy is not guaranteed. Advertisers are free to opt out of the feature, which “is designed to help you hit the bullseye with your ads every time!” – though Yahoo! explains that accuracy of geo-targeting “may vary depending on the level of targeting selected, as well as other factors.”