Friday, February 23, 2007
It started out as a project that let you add Gmail and Google News headlines to the Google homepage. Google's personalized search feature was rolled out from Google Labs to the public in September 2006.
Google's personalized search reorders search results based on your history of past searches, giving more weight to topics that interest you.
This means that if you search for "fly fishing" your future results for a search on "bass" will be more heavily weighted toward fishing than the musical instrument, according to Avni Shah, Google product manager.
Why Is It Important Now?
Starting February 2nd, anyone who signs up for any Google service using a Google account (Gmail, AdSense, Google Analytics, Groups, Alerts, etc.) will automatically be enrolled in Google personalized search.
In the new system, a profile is created automatically. Google hasn't yet explained exactly how this is done, but it seems clear that it measures what you click on and then skews your results over time to favor sites in particular topic areas.
If you don't want to see personalized results, or you don’t want Google to see your personal activity, you need to sign out of your Google account.
Impact on SEO
Personalized search re-orders a user's search results primarily by noting the type of sites the user selects from the search results. Google can then give those sites a boost in the rankings shown to that particular user.
This change is good for SEO, especially for sites with good content. Good title tags and descriptions get users to click through to your site from the results page, but it's the content and usability of the site that keep visitors there longer. Assuming length of stay is part of the personalization algorithm, that engagement is what makes your site a top pick in personalized results.
So, as site relevance and stickiness become more important, you should consider some additional SEO tactics. For example, blogging and social marketing efforts could help. Also, offer visitors an easy way to bookmark your site; getting bookmarked will help your site be seen as important.
More and more people are signed into Google personalized search without knowing it. Privacy issues and SEO impact are the two big concerns here. Stay tuned to find out how the public and the industry react to these developments.
Wednesday, February 07, 2007
Working in marketing and advertising agencies taught me how to manage several types of projects for a number of clients all at the same time. The deadlines were always tight, there were always several resources to manage for any given project, and there was always an element of controlled chaos in the day-to-day operations. Everybody had their own expertise and specific interests. There were strategists, copywriters, artists, production managers, media buyers, etc., who were each one piece in a big puzzle. That’s the big difference between a traditional advertising agency and a search engine marketing agency: the SEM agency is composed of one team who all have the same focus and area of expertise.
Devoting all resources to search engine marketing enables the staff to become definitive experts in the SEM field and keep abreast of new trends and strategies as they develop. This in turn builds a strong, smart team who are all working and thinking together to get the best possible results for the clients. While the pace may be similar to a traditional marketing agency, with work being done for many different clients at once, the way the team pulls together to seamlessly deliver analysis, reports, research and recommendations is a much smoother process than having to pull in different resources for each type of project.
Working at a search engine marketing agency encompasses many of the aspects I love about marketing. There’s always something new to learn, a new web site and brand to delve into and new industries to discover. But most importantly, there’s the satisfaction of knowing that I’m part of a team that is genuinely great at what they do.
Thursday, February 01, 2007
Duplicate content is exactly what it sounds like – two or more occurrences of identical or nearly identical content appearing on the web. To give you a rough estimate, if 60% of the content of a page in question is the same as another, the search engines will most likely consider them duplicates of each other.
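The engines don't disclose how they measure "sameness," and the 60% figure is only a rough rule of thumb. As a purely illustrative sketch in Python, near-duplicates can be estimated by comparing overlapping word sequences ("shingles") between two pages:

```python
# Illustrative sketch only: search engines do not publish their
# de-duplication methods. This compares word n-grams ("shingles")
# and flags pages whose overlap crosses a hypothetical 60% threshold.

def shingles(text, n=3):
    """Return the set of n-word sequences appearing in the text."""
    words = text.lower().split()
    return {tuple(words[i:i + n]) for i in range(len(words) - n + 1)}

def similarity(a, b, n=3):
    """Jaccard similarity between the shingle sets of two texts."""
    sa, sb = shingles(a, n), shingles(b, n)
    if not sa or not sb:
        return 0.0
    return len(sa & sb) / len(sa | sb)

page1 = "the quick brown fox jumps over the lazy dog near the river bank"
page2 = "the quick brown fox jumps over the lazy dog near the old mill"

# Hypothetical 60% cutoff from the estimate above
is_duplicate = similarity(page1, page2) >= 0.6
```

Real systems are far more sophisticated (they hash shingles, ignore boilerplate, and so on), but the principle of measuring content overlap is the same.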
Why It’s a Problem
For search engines, a satisfactory customer experience is the main goal. Duplicate content detracts from that experience by filling the results list with the same content over and over again—not a good way to build searcher loyalty. Duplicate content also takes up valuable bandwidth. To fix the problem, search engines employ all kinds of tactics throughout the indexing process to filter out duplicates and only rank what they consider to be the "original" source of the content.
For websites, this is a problem because the search engines' de-duping process is by no means perfect. This means that another instance of your content may be ranked instead of the one housed on your site, possibly translating into a loss of traffic that is rightfully yours.
Common Causes of Duplicate Content
There are several ways that multiple versions of the same content can be created:
- Multiple URLs pointing to the same content. For example: domain aliases (additional domain names that point to the same content), mirrored sites (an exact copy of your site – an SEO no-no), or a change in URL structure (a site redesign that changes URLs from www.yoursite.com/about to www.yoursite.com/?about)
- Similar content on different pages. For example: two pages talking about the same product.
- Printer-friendly versions of content (URL is different, but content is the same except for the removal of navigation and graphics)
- Content syndication (articles, rss feeds) that results in your content being on other sites
- Canonicalization (http://www.yoursite.com vs. http://yoursite.com)
- Session IDs
- Scraper sites (sites that copy content from other sites, usually for the purpose of creating MFA ("Made for AdSense") sites designed solely to make a profit via click generation)
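Several of the causes above (canonicalization, domain aliases, changed URL structures) are usually fixed with server-side 301 redirects. As a sketch only, on an Apache server a .htaccess file along these lines (yoursite.com is a placeholder) would consolidate the non-www domain onto the www version:

```apache
# Hypothetical .htaccess sketch: permanently redirect yoursite.com
# to www.yoursite.com so only one URL ever serves the content.
RewriteEngine On
RewriteCond %{HTTP_HOST} ^yoursite\.com$ [NC]
RewriteRule ^(.*)$ http://www.yoursite.com/$1 [R=301,L]
```

The 301 status code tells the engines the move is permanent, so they can transfer the old URL's standing to the new one rather than treating the two as duplicates.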
Ways to Avoid Duplicate Content
While not an exhaustive list, below are some best practices to follow to help avoid creating duplicate content and confusing the search engines:
- Use 301 redirects whenever possible to point domains and URLs to one source of content
- Use robots.txt Disallow rules (or a "noindex" robots meta tag) to tell spiders not to crawl or index duplicate versions of content
- Reevaluate similar pages – do you really need both or can they be combined?
- Be the authoritative source of your syndicated content – ask that absolute links back to you are included in any feeds or articles
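For duplicates you can't redirect, such as printer-friendly pages or session-ID URLs, a robots.txt file can tell well-behaved spiders to skip them. A minimal sketch, assuming (hypothetically) that printer versions live under a /print/ directory and sessions use a sessionid parameter:

```text
# Hypothetical robots.txt sketch: the /print/ path and sessionid
# parameter are assumptions, not taken from any real site.
# Note: the "*" wildcard in Disallow paths is an extension honored
# by Google but not by every crawler.
User-agent: *
Disallow: /print/
Disallow: /*?sessionid=
```

Blocking the duplicate copies leaves the engines with only one crawlable version of each page, which is the one you want ranked.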
While everything these days seems to be about copying and redistributing the latest thing ad nauseam to make a quick buck (e.g. reality shows, boy bands, low-carb diets), the same does not hold true in the search engine world. The only way to benefit here is to claim sole ownership of your content and make it very clear to Google, Yahoo and the others that yours is the only version that should be shown in the results.