Tag Archives: SEO



Courtesy of Adam Singolda, MediaPost’s Video Insider.

In the mid ’90s, webmasters started optimizing their sites so that when a search engine sent its “spider” to crawl a page, the data would be properly extracted and made visible to users proactively searching for it. That was SEO.

Better visibility on search engines meant more users landing on your website’s content. More users landing on the website meant more revenue.

That discipline later evolved to also offer a paid option for getting users to your sites — now considered one of the primary money makers for search engines.

Fifteen years later, people still use search, true, but not as much as they used to, and, in my opinion, they will barely do so in the future.

Why? People have no idea what they want to do next, so how can they search for it?

The world is transforming from actively pursuing to passively discovering. People might search for an article or a video, but then discovery vehicles will get the user to bounce from one piece of content to another. In fact, I’m not even sure that search will remain the anchor, as it is today, for people to land on that first article or video. As an example, social channels are already gaining massive momentum and users are spending more time on them (Facebook versus Google).

The biggest asset on the Web, in my opinion, is “owning” where users go. Today it’s primarily Google through its search engine — a very lucrative business indeed. In the not-so-far future, I think that discovery tools — from social vehicles to recommendation engines spread all around the web content pages, offering people content they might like from the Web — will win.

If that’s true, the huge market of optimizing search and paying for it (SEO/SEM) will slowly transform into optimizing and paying for Discovery tools that own users’ attention and help navigate them to the “best next thing.”

I would call it discovery engine optimization (DEO).



MultiMedia Press Releases Get 77% More Views


Courtesy of Website Magazine.

Distributing press releases through services such as PR Newswire has for years been a highly effective marketing strategy for businesses on the Web. New data from PR Newswire, however, indicates a significant development regarding the effectiveness of today’s releases.

A recent update to the company’s Web analytics program enabled it to compare the copious data that details the activity press releases generate on PR Newswire.com. A closer inspection of the data confirmed that press releases with multimedia elements generate up to 77 percent more views than text-only releases.

PR Newswire’s research reveals that marketers can increase the number of views by 14 percent simply by adding a photo, and that including a video will raise that number to 20 percent. The percentage more than doubles to 48 percent more views with both a photo and a video, and adding additional elements such as audio or PowerPoint to photos, video and text will result in 77 percent more views than a text-only release.

The study determined that the increase in views is due to the fact that multimedia news releases (MNR) are more broadly distributed than text (non-MNR) press releases. Each element of a multimedia release is distributed separately and can attract its own audience on social networks and search engines. Videos, for example, are distributed to more than 70 video-specific portals.

The effect of distribution is illustrated clearly in the stark contrast between traffic sources for text press releases versus traffic sources for multimedia content. Search engines are the primary drivers of traffic to text press releases while other web sites are the primary drivers of traffic for multimedia content.

Multimedia news content is shared much more enthusiastically on social networks. The number is driven somewhat by the fact that multimedia press releases generally include a variety of sharable elements such as photos, video and slides in addition to text. The wide distribution of these elements as described previously also plays a part in driving the sharing process.

Nonetheless, the degree to which multimedia releases are shared more frequently than plain text is striking. Across the one-month sample of content on PR Newswire.com, multimedia releases were shared 3.53 times more often than text releases: text releases were shared, on average, 0.99 times per hour per release, while MNRs were shared, on average, 3.5 times per hour.

Multimedia content also has a longer shelf life, holding the audience’s interest for more than twice as long as text press releases. On average, text press releases generate visibility for 9.4 days, while multimedia press releases generate visibility for an average of 20 days. The higher degree of sharing also contributes to extending the message life.

The Two Sides of SEO


Today, I pass on to you this clever commentary I stumbled upon in Search Engine Land, courtesy of Bryson Meunier:

“Often, when people in the industry talk about the two sides of SEO, they’re talking about black hat and white hat tactics.

Having worked as an SEO since 2003 and in Internet marketing since 2000, both with Fortune 50 and mom and pop businesses with business goals as different as night and day, I think the distinction is deeper than just black hat and white hat.

It seems the best way to illustrate this is with a description of two SEOs, in the literary tradition of Goofus and Gallant:

Two Sides Of Link Building
This SEO refers to herself as a link builder, and spends all day checking reports from the software that automatically sends out reciprocal email requests. She doesn’t necessarily care if they’re effective or annoying to millions of people because she has a paycheck coming in and, hey, this is business.
That SEO convinced a client to permanently redirect a temporarily redirected domain, gaining more than 100,000 authoritative links in the process. That allowed the client to jump from page two to page one, where they have ranked consistently in the top 5 on a very competitive brand-agnostic keyword for the last two years, without adding the keyword to the title tag or the body copy, which would have conflicted with their style guidelines.

Two Sides Of EDU Links
This SEO goes out and celebrates at the end of the day because she has identified and secured links from three authoritative EDU domains in the course of the day.
That SEO has a client who works for a university that changed domains ten years ago and let the old domain expire instead of redirecting it. He is not having success talking to Educause about waiving its policy against re-acquiring expired domains so that the client can reclaim the thousands of old links that are rightfully theirs and could be helping them compete for competitive keywords. It is a rule they have made, because other university clients who find out what SEO is would want to do the same thing.

That SEO looked in vain in Google’s webmaster help center for answers on how to handle link recovery issues such as this, and found nothing. When he reached out to his company’s Google rep, she referred him to the webmaster forum, but he couldn’t post a question due to confidentiality issues.

Two Sides Of Goals and Metrics
This SEO can’t sleep because he’s anxious about whether his PR8 links that he bought will bring his toolbar PageRank score to 5/10 and allow him to report the good news to his client.

That SEO sleeps well knowing that she is meeting her goal of natural search impressions, clicks and conversions that she forecasted for the client at the beginning of the project, and implementation of recommendations is on track to help her reach her goals in the end.

Two Sides Of Allegiance
This SEO thinks Google is the enemy and writes in her blog and in social media outlets regularly about how hypocritical the search engines are.
That SEO thinks of herself as an extension of the search engine’s search quality team, and regularly reports competitors who violate the webmaster guidelines as part of the SEO process. That SEO uses search engines in life as much as anyone, and gets upset when the search results aren’t relevant. That SEO thinks having a rigorously controlled Google Webmaster certification program similar to the AdWords and Analytics programs would be a great trust signal that could help Google fix their current spam problem.

Two Sides Of Implementation
This SEO makes changes to his website all day and night without anyone knowing or caring what is done.

That SEO just got off a four hour conference call with Legal in order to explain how search engines work and why it’s going to be beneficial to the business to make the title tags more descriptive. Changes to the website will not happen for months.

Two Sides Of Process
This SEO finally goes to bed at 3am because he’s been scrolling through tweets all day. He didn’t actually make any changes to the website that he’s optimizing, and probably spent too much time tweeting back and forth with @WestchesterSEOCompany1234 about Matt Cutts’s cats, but tomorrow is another day.

That SEO has to keep a detailed project plan of what’s being done when so that all stakeholders in the SEO project will know what’s expected of them when, and SEO requirements will not delay the launch date of the web site or require additional resources that weren’t in the budget.

Two Sides Of Discourse
This SEO guru focuses on bare bones implementation in the service of getting the client to the top of the search results with available resources for however long the tactics work.

That SEO guru doesn’t have a lot of time to write articles or speak, as she spends most of her day realizing her natural search goals and planning for the future. But when she does contribute to the industry, it’s less about reverse engineering algorithms and more about creative ways to help her clients get more and better traffic by focusing on synergies between what SEOs and search engines need.

Which Side Are You On?
Ask yourself: what kind of SEO are you, and what kind of SEO do you want to be? In my experience, it’s very easy to be “this SEO” as the majority of SEO gurus out there are trying to sell SEO services to small businesses with authority issues that don’t have resources to compete fairly or find creative ways to help clients become more visible in natural search results.

But when I’m hiring an SEO to help our company help clients take their natural search visibility to the next level, I’m weeding out “this SEO” in the interview process and looking for “that SEO” with great communication skills who focuses on business value of natural search traffic, quality of execution and attention to detail, and has a knack for creative problem solving.

I’m not suggesting that there are only two types of SEOs. I think there’s a more nuanced explanation that’s closer to the truth. However, I’m simplifying the issue to prove a point.

In these examples, “this SEO” is the one that gets covered often in this industry because the barrier to entry is lower, but it’s also the example that has very little to do with my work as an SEO and the work of others like me.

Fortunately, publications like Search Engine Land are starting to fill the gap with columns like Industrial Strength, and SMX caters to “that SEO” by focusing certain sessions on using natural search to drive business value.

There are also great books that cater to this audience, like Vanessa Fox’s Marketing in the Age of Google and Audience, Relevance and Search: Targeting Web Audiences with Relevant Content. Unfortunately, these things are the exception to the rule, and the signal-to-noise ratio for someone in the SEO industry who wants to be the kind of SEO that I and others like me aspire to be is low.

If you are an SEO or you’re writing about SEO, please do your part to strengthen the signal by not assuming all SEOs are interested in what you consider to be SEO, and keep in mind that there are people out there who make a living as SEOs whose lives don’t resemble the lives of other SEOs in the slightest.

Opinions expressed in the article are those of the guest author and not necessarily Search Engine Land.


Bryson Meunier is an Associate Director of Content Solutions at Resolution Media, an Omnicom Media Group Company, and a primary architect of Resolution Media’s natural search product and Digital Behavior Analysis.

100% Organic: 25 Super Common SEO Mistakes

What follows are innocent mistakes that many SEOs make. Courtesy of Search Engine Land. Some of these things catch even the best of us…


1. Google AdWords Keyword Tool Set To Broad Match

The Google AdWords Keyword Tool defaults to “Broad match” mode, which yields useless data from an SEO perspective — useless in that the numbers are hugely inflated to include countless phrases incorporating the search term specified. For example, the Keyword Tool reports 30.4 million queries for “shoes”, but that includes multi-word phrases such as “dress shoes,” “leather shoes,” “high heeled shoes,” and even “horse shoes,” “snow shoes,” and “brake shoes.”

In Exact mode, the search query volume for “shoes” drops to 368,000. The difference between those numbers is striking, isn’t it? So always remember if you are doing keyword research for SEO in the AdWords Keyword Tool: untick the box next to Broad match and tick the box next to Exact.

2. Disallowing when you meant to Noindex

Ever notice listings in the Google SERPs (search engine results pages) without titles or snippets? That happens when your robots.txt file has disallowed Googlebot from visiting a URL, but Google still knows the URL exists because links were found pointing there. The URL can still rank for terms relevant to the anchor text in links pointing to disallowed pages. A robots.txt Disallow is an instruction to not spider the page content; it’s not an instruction to drop the URL from the index.

If you place a meta robots noindex meta tag on the page, you’ll need to allow the spiders to access the page so it can see the meta tag. Another mistake is to use the URL Removal tool in Google Webmaster Tools instead of simply “noindexing” the page. Rarely (if ever) should the removal tool be used for anything. Also note that there’s a Noindex directive in the REP (Robots Exclusion Protocol) that Googlebot obeys (unofficially). More on disallow and noindex here.
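The distinction can be sketched with Python’s standard-library robots.txt parser, which shows how a crawler interprets a Disallow rule. The domain and paths below are hypothetical:

```python
# Sketch: robots.txt blocks *crawling*, not *indexing*.
from urllib.robotparser import RobotFileParser

robots_txt = """\
User-agent: *
Disallow: /private/
"""

rp = RobotFileParser()
rp.parse(robots_txt.splitlines())

# A compliant crawler may not fetch anything under /private/ ...
print(rp.can_fetch("*", "https://example.com/private/page.html"))  # False

# ...but because it cannot fetch those pages, it never sees any
# <meta name="robots" content="noindex"> tag placed on them, so the
# URL can still appear in the index via anchor text in external links.
print(rp.can_fetch("*", "https://example.com/public.html"))  # True
```

This is why a page you want dropped from the index must stay crawlable: the spider has to reach the page to read the noindex tag.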

3. URL SERP Parameters & Google Instant

I just wrote about parameters you can append to Google SERP URLs. I’ve heard folks complain they aren’t able to add parameters to the end of Google SERP URLs anymore — such as &num=100 or &pws=0 — since Google Instant appeared on the scene. Fear not, it’s a simple matter of turning Google Instant off and URL parameters will work again.
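As a sketch, appending the parameters mentioned above is safer with Python’s urllib than with raw string concatenation, since existing query parameters are preserved. The query value is a hypothetical example:

```python
# Sketch: append SERP parameters (num=100 for 100 results, pws=0 to
# disable personalization) to a Google results URL.
from urllib.parse import urlencode, urlparse, parse_qsl, urlunparse

def add_serp_params(url, extra):
    """Merge extra query parameters into an existing URL."""
    parts = urlparse(url)
    query = dict(parse_qsl(parts.query))
    query.update(extra)
    return urlunparse(parts._replace(query=urlencode(query)))

url = "https://www.google.com/search?q=shoes"
print(add_serp_params(url, {"num": "100", "pws": "0"}))
# https://www.google.com/search?q=shoes&num=100&pws=0
```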

4. Not using your customer’s vocabulary

Your customer doesn’t use industry-speak. They’ve never used the phrase “kitchen electrics” in a sentence, despite the fact that it’s the industry-accepted term for small kitchen appliances. Your customer may not search in the way you think makes intuitive sense. For example, I would have guessed that the plural “digital cameras” would beat the singular “digital camera” in query volume — yet it’s the other way around according to the various Google tools.

Sometimes it is lawyers being sticklers that gets in the way — such as a bank’s lawyers insisting the term “home loan” be used and never “mortgage” (since technically the latter is a “legal instrument” that the bank does not offer). Many times the right choice is obvious but it’s internal politics or inertia keeping the less popular terminology in place (e.g. “hooded sweatshirt” when “hoodie” is what folks are searching for).

5. Skipping the keyword brainstorming phase

Too rarely do I hear that the site’s content plan was driven by keyword brainstorming. Keyword brainstorming can be as simple as using Google Suggest (which autocompletes as you type and is built into Google.com) or Soovle (which autocompletes simultaneously from Google, Bing, Yahoo, YouTube, Wikipedia, Amazon, and Answers.com). The idea is to think laterally.

For example, a baby furniture manufacturer discovers the popularity of “baby names” through looking at popular terms starting with “baby” and decides to build out a section of their site dedicated to related terms (“trends in baby names”, “baby name meanings”, “most overused baby names” etc.).

6. Mapping URLs to keywords, but not the other way around

It’s standard operating procedure to map all one’s site content to keyword themes (sometimes referred to as primary keywords, declared search terms, or gold words.) What’s not so common is to start with a target (i.e. most desired) keyword list and map each keyword to the most appropriate page to rank for that keyword and then optimize the site around the keyword-to-URL pairs.

For example, “vegan restaurants in phoenix” could be relevant to five different pages, but the best candidate is then chosen. The internal linking structure is then optimized to favor that best candidate, i.e. internal links containing that anchor text are pointed to the best candidate rather than spread out across all five. This makes much more sense than competing against oneself and none of the pages winning.
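A minimal sketch of such a keyword-to-URL map, with hypothetical URLs and relevance scores, might look like this:

```python
# Sketch: each target keyword is assigned one "best candidate" page so
# internal anchor text is concentrated rather than spread across every
# page that happens to be relevant. URLs and scores are illustrative.

candidates = {
    "vegan restaurants in phoenix": {
        "/phoenix/vegan-restaurants": 0.9,   # dedicated category page
        "/phoenix/restaurants": 0.6,
        "/blog/top-vegan-spots-phoenix": 0.5,
    },
}

def best_candidate(keyword):
    """Pick the single page to receive internal links for this keyword."""
    pages = candidates[keyword]
    return max(pages, key=pages.get)

print(best_candidate("vegan restaurants in phoenix"))
# /phoenix/vegan-restaurants
```

Internal links containing the target anchor text would then all point at the winner of this lookup, rather than being split five ways.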

7. Setting up a free hosted blog

Free hosted blog platforms like WordPress.com and Blogger.com provide a valuable service. Over 18 million blogs are hosted on WordPress.com. They’re just not a service I would sign up for if I cared about SEO or monetization. They aren’t flexible enough to let you install your own choice of plugins or themes/frameworks to trick out the blog with killer SEO. And for Heaven’s sake, don’t make your blog a subdomain of wordpress.com. For $10 per year, you can get a premium WordPress.com account under your own domain name.

Did you know putting AdSense ad units on your WordPress.com blog is against the service’s Terms & Conditions? Much better to get yourself a web host and install the self-hosted version of WordPress so you have full control over the thing.

8. Not properly disabling Google personalization

Not long ago, Google started personalizing results based on search activity even for users who are not logged in. For those who thought that logging out of Google was sufficient to get non-personalized results, I’ve got news for you: it isn’t. Click on “Web History” in the Google SERPs and then “Disable customizations based on search activity”. Or, for an individual query, you can add &pws=0 to the end of the Google SERP URL (but only if Google Instant is off; see above).

9. Not logging in to the free tools

Some of the web-based tools we all use regularly, such as Google Trends, either restrict the features or give incomplete (or less accurate) data if not logged in. The Google AdWords Keyword Tool states quite plainly: “Sign in with your AdWords login information to see the full list of ideas for this search”. It would be wise to heed the instruction.

10. Not linking to your top pages w/your top terms on your home page

The categories you display on your home page should be thought through in terms of SEO. Same with your tag cloud if you have one. And the “Popular Products” that you feature. In your mind translate “Popular Products” into “Products for which I most want to get to the top of Google.”

11. Not returning a 404 status code when you’re supposed to

As I mentioned previously, it’s important to return a 404 status code (rather than a 200 or 301) when the URL being requested is clearly bogus/non-existent. Otherwise, your site will look less trustworthy in the eyes of Google. And yes, Google does check for this.
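As an illustrative sketch (not any particular CMS), here is a toy Python server that returns a genuine 404 for unknown URLs instead of a soft 200. The paths and bodies are hypothetical:

```python
# Sketch: a bogus URL must get a real 404 status, not a "soft 404"
# (a 200 response carrying an error page).
import threading
import urllib.error
import urllib.request
from http.server import BaseHTTPRequestHandler, ThreadingHTTPServer

KNOWN_PATHS = {"/": b"home", "/about": b"about us"}  # hypothetical site map

class Handler(BaseHTTPRequestHandler):
    def do_GET(self):
        body = KNOWN_PATHS.get(self.path)
        if body is None:
            self.send_error(404)          # unknown URL -> genuine 404
            return
        self.send_response(200)
        self.send_header("Content-Length", str(len(body)))
        self.end_headers()
        self.wfile.write(body)

    def log_message(self, *args):         # keep the demo quiet
        pass

server = ThreadingHTTPServer(("127.0.0.1", 0), Handler)
threading.Thread(target=server.serve_forever, daemon=True).start()

def status(path):
    try:
        return urllib.request.urlopen(
            f"http://127.0.0.1:{server.server_address[1]}{path}"
        ).status
    except urllib.error.HTTPError as err:
        return err.code

print(status("/about"))         # 200
print(status("/no-such-page"))  # 404
```

You can run the same check against your own site with curl or a crawler: request a clearly bogus URL and confirm the status line says 404, not 200.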

12. Not building links to pages that link to you

Many amateur SEOs overlook the importance of building links to pages that link to their sites. For commercial sites, it can be tough to get links that point directly to your site. But once you have acquired a great link, it can be a lot easier to build links to that linking page and thus you’ll enjoy the indirect benefit.

13. Going over the top with copy and/or links meant for the spiders

Countless home pages have paragraphs of what I refer to as “SEO copy” below the footer (i.e., after the copyright statement and legal notices) at the very bottom of the page. Oftentimes they embed numerous keyword-rich text links within that copy. They may even treat each link with bold or strong tags. Can you get any more obvious than that? I suppose you could, if you put an HTML comment immediately preceding it that said “spider food for SEO!” (perhaps “Insert keyword spam for Google here” would be more apropos).

14. Not using the canonical tag

The canonical tag (errr, link element) may not always work but it certainly doesn’t hurt. So go ahead and use them. Especially if it’s an ecommerce site. For example, if you have a product mapped to multiple categories resulting in multiple URLs, the canonical tag is an easy fix.
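For reference, the link element goes in the head of every duplicate URL and points at the one version you want indexed (the URL below is a hypothetical example):

```html
<!-- In the <head> of each duplicate product URL, e.g.
     /widgets/blue-widget and /sale/blue-widget -->
<link rel="canonical" href="https://www.example.com/products/blue-widget" />
```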

15. Not checking your neighborhood before settling in

If you’re buying a home, you’d check out the area schools and the crime statistics, right? Why wouldn’t you do the same when moving into a new IP neighborhood? Majestic SEO has an IP neighborhood checker. This is especially important for the small-time folks. You don’t want to be on the same IP address (shared hosting) with a bunch of dodgy Cialis sites.

16. Doing too much internal linking

Don’t water down your link juice so much that only a trickle goes to each of your pages. An article page should flow PageRank to related topics not to everything under the sun (i.e. hundreds of links).
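A back-of-the-envelope sketch of the dilution, using the classic PageRank model in which a page splits its passable score evenly across its outlinks (the numbers are purely illustrative):

```python
# Sketch: why linking to everything under the sun leaves only a trickle
# of link equity per page. In the classic PageRank formulation a page
# passes (damping * its score) divided evenly among its outlinks.
page_rank = 1.0
damping = 0.85  # conventional damping factor

def share_per_link(outlinks):
    """Equity passed through each individual link."""
    return damping * page_rank / outlinks

print(round(share_per_link(10), 4))   # 0.085  -> focused linking
print(round(share_per_link(500), 4))  # 0.0017 -> a trickle per page
```

Real ranking is far more complex than this toy model, but the proportional intuition is the point: fifty times the links means one-fiftieth the equity per link.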

17. Trusting the data in Google webmaster tools

Ever notice that Google Webmaster Tools’ data doesn’t jibe with your analytics data? Trust your analytics data over the webmaster tools data.

18. Submitting your site for public site review at a conference where Google engineers are present

Doh! (Insert Homer Simpson voice here.) Unless you’re absolutely sure you have nothing weird going on within your site or link neighborhood, this is pretty much a suicide mission. Corollary: talking to Matt Cutts at a conference without covering your badge up with business cards. Note this mistake was contributed by a guy we’ll call “Leon” (you know who you are, “Leon”!)

19. Cannibalizing organic search with PPC

Paying for traffic you would have gotten for free? Yeah, that’s gotta hurt. I wrote about this before in Organic Search & Paid Search: Are they Synergistic or Cannibalistic?

20. Confusing causation with correlation

When somebody tells me they added H1 tags to their site and it really bumped up their Google rankings, the first question I ask is: “Did you already have the headline text there and just change a font tag into an H1, or did you add keyword-rich headlines that weren’t present before?” It’s usually the latter. The keyword-rich text at the top of the page bumped up the keyword prominence (causation). The H1 tag was a correlation that didn’t move the needle.

21. Not thinking in terms of your (hypothetical) Google “rap sheet”

You may recall I’ve theorized about this before. Google may not be keeping a “rap sheet” of all your transgressions across your network of sites, but they’d be foolish not to. Submitting your site to 800 spam directories over a span of 3 days is just plain stupid. If it’s easy enough to see a big spike in links in Majestic SEO, then it’s certainly easy enough for Google to spot such anomalies.

22. Not using a variety of anchor text

That just doesn’t look natural. Think link diversity.

23. Treating all the links shown in Yahoo Site Explorer as “followed”

Don’t ask me why Yahoo Site Explorer includes nofollowed links in its reports, but it does. Many of its users wrongly assume all of the links reported under the “Inlinks” tab are followed links that pass link juice.

24. Submitting a Reconsideration Request before EVERYTHING has been cleaned up

This may not be “super-common” because many SEOs have never submitted a “Reconsideration request” to Google. But if you have or plan to, then make sure everything — and I mean EVERYTHING — has been cleaned up and you’ve documented this in your submission.

25. Submitting to the social sites from a non power user account

Nothing goes flat faster than a submission from an unknown user with no history, no followers, no “street cred”. Power users still rule, Digg redesign or not.

Bonus tip: Stop focusing on low- (or no) value activities

Yes, I’ll beat on the meta keywords tag yet again. Google never supported it. All it does is hand free info to your competitors. Guaranteed there are items on your SEO to-do list like this that aren’t worth doing. Be outcome-focused, not activity-focused. Focus on what matters.

Of course this wasn’t an exhaustive list. There are many, many more. I could easily make this a three article series too. I will try to resist the temptation. 😉

What mistakes are you seeing your co-workers, clients, and competitors make? Share them in the comments!

Opinions expressed in the article are those of the guest author and not necessarily Search Engine Land.


Stephan Spencer is the Vice President of SEO Strategies at Covario. He was formerly the founder and president of natural search marketing firm Netconcepts (recently acquired by Covario), and he is the inventor of the GravityStream SEO proxy technology, now rebranded as Organic Search Optimizer. He is also an author of the O’Reilly book The Art of SEO, along with co-authors Rand Fishkin, Jessie Stricchiola, and Eric Enge. He blogs primarily on his own site, Stephan Spencer’s Scatterings.

Fascinating Social Media Facts For 2010



Social media is not just a social instrument of communication, and it is not only about people sharing ideas and thoughts with other people. It is the creation and exchange of user-generated content: the ability to transform broadcast media monologues into social media dialogues that spread, sometimes, faster and wider than television, radio or print. Compared to 2009, social media shows fantastic growth in terms of participation, penetration, usability, business and more.

Now that we are almost at the end of 2010, it is time to study and understand some of the social media facts and trends that evolved over the year. Scouting the web has brought together the following list of fascinating social media facts. Most of these facts are based on surveys (online or offline) over a sample size, which is also mentioned to keep each fact in perspective.

General Facts

1. Australia has the most established users of social media in the world, followed by the USA and the UK.

2. In terms of the impact of social networks on advertising, word of mouth is the popular option: 78% of customers trust peer recommendations on sites, while only 14% trust advertisements.

3. Advertising has also been greatly impacted by social media, with only 18% of traditional TV campaigns generating a positive return on investment.

4. Facebook and Blogspot, followed by MySpace, are the top sites visited by under-18s.

5. An average user becomes a fan of 2 pages every month.

6. 24 of the 25 largest newspapers are experiencing declines in circulation because the news reaches users in other formats.

7. 25% of search results for the world’s top 20 brands are linked to user-generated content.

8. *In a sample survey of 2884 people across 14 countries, 90% of participants know at least one social networking site.

9. * In a sample survey of 2884 people across 14 countries, 72% of the participating internet population is active on at least one networking site. The top 3 countries by membership in at least one networking site were Brazil (95%), USA (84%), and Portugal (82%).

10. * In a sample survey of 2884 people across 14 countries, users of social networking sites are saturated: connected people feel no need to further expand their membership on other social network sites.

11. * In a sample survey of 2884 people across 14 countries, users log in, on average, twice a day to social networking sites and 9 times a month to professional websites.

12. * In a sample survey of 2884 people across 14 countries, sending personal messages is the most popular online activity. The top 5 activities online are sending personal messages, watching photos, checking status, reacting to others’ status, and uploading pictures.

13. * In a sample survey of 2884 people across 14 countries, people have about 195 friends on average.


• *Online sample survey of 2,884 consumers spread over 14 countries between the age of 18 to 55 years old by Online Media Gazette.

Facebook Facts

14. Facebook has over 500 million users.

15. If Facebook were a country, it would be the world’s 3rd largest country.

16. An average Facebook user spends about 55 minutes a day on the site.

17. An average Facebook user spends about 6.50 hours a week on the site.

18. The average Facebook user spends 1.20 days a month on the site.

19. Facebook’s translation application supports over 100 languages.

20. There are over 900 million objects that people interact with (pages, groups, events and community pages).

21. The average user is connected to 80 community pages, groups and events.

22. The average user creates 90 pieces of content each month.

23. More than 30 billion pieces of content (web links, news stories, blog posts, notes, photo albums, etc.) are shared each month.

24. ** In a sample survey of 2884 people across 14 countries, Facebook was found to have the highest penetration. The top 3 sites are Facebook (51%), MySpace (20%), and Twitter (17%).

25. **Over 300,000 users translate the site through the translations application.

26. **Over 150 million people engage with Facebook on external websites every month.

27. **Two-thirds of comScore’s U.S. Top 100 websites and half of comScore’s Global Top 100 websites are integrated with Facebook.

28. **There are over 100 million active users accessing Facebook currently through their mobile devices.


• *Online sample survey of 2,884 consumers spread over 14 countries between the age of 18 to 55 years old by Online Media Gazette.
• ** Statistics from Facebook press office.

YouTube Facts

29. The most popular YouTube video, Justin Bieber’s “Baby” ft. Ludacris, has had over 374,403,983 views.

30. ** YouTube receives over 2 billion views each day.

31. ** 24 hours of video is uploaded to YouTube by users every minute.

32. ** 70% of YouTube users are from the United States.

33. ** More than half of YouTube’s users are under the age of 20.

34. ** To watch all the videos currently on YouTube, a person would need to live for around 1,000 years.

35. ** YouTube is available across 19 countries and in 12 languages.

36. ** Music videos account for 20% of uploads on YouTube.


• ** Statistics from YouTube press centre.

Blogger Facts

37. There are over 181 million blogs.

38. 34% of bloggers post opinions about products and brands.

39. ** 60% of bloggers are aged 18–44.

40. ** One in five bloggers updates their blogs every day.

41. ** Two thirds of bloggers are male.

42. ** Corporate blogging accounts for 14% of blogs.

43. ** 15% of bloggers spend 10 hours a week blogging.

44. ** More than half of all bloggers are married and/or parents.

45. ** More than 50% of bloggers have more than one blog.


• ** Statistics from Technorati’s State of the Blogosphere 2009.

Tweet Facts

46. 54% of bloggers post content or tweet on a daily basis.

47. 80% of Twitter users use Twitter on mobile devices.

48. In 2010, users have been posting over 50 million tweets per day.

49. Twitter’s 10 billionth tweet was posted in March 2010.

50. **There are over 110 million users of Twitter currently.

51. **180 million unique users access Twitter each month.

52. **More than 600 million searches happen on Twitter every day.


• Box Hill Institute (Social Media at Box Hill Institute)
• **Statistics from Twitter and the Chirp Conference.

LinkedIn Facts

53. Of LinkedIn’s 60 million users, half are from outside the US.

54. By March 2010 Australia alone had over 1 million LinkedIn users.

55. 80% of companies use LinkedIn as a recruitment tool.

56. **Every second a new member joins LinkedIn.

57. **Almost 12 million unique visitors visit LinkedIn every day.

58. ** LinkedIn has executives from all Fortune 500 companies.

59. ** 1 in 20 LinkedIn profiles belongs to a recruiter.


• ** Statistics from LinkedIn press centre and SysComm International.

Wikipedia Facts

60. If you were paid $1 every time an article was posted on Wikipedia, you would earn $156 per hour.

61. * The English Wikipedia has the most articles, at 3 million. It is followed by German (1.08 million), French (958,000), Italian (697,000), and Spanish (608,000).

62. **69% of users edit Wikipedia to fix errors.

63. **73% of Wikipedia users edit Wikipedia because they want to share knowledge.

64. ** 4.4% of Wikipedia’s editors hold PhDs, and 19% hold master’s degrees.

65. ** Bad weather usually results in more updates to Wikipedia.

66. **13% of the editors on Wikipedia are women.


Social Media Today
• * http://www.axleration.com/15-interesting-facts-about-wikipedia/
• ** http://pochp.wordpress.com/2010/08/16/surprising-facts-about-wikipedia/

Foursquare Facts

67. In its first year, Foursquare gained more than half a million users, 1.4 million venues, and 15.5 million check-ins.

68. * Foursquare is five times larger than Gowalla.

69. * Foursquare is growing 75% faster than Gowalla each day.

70. **Foursquare passed the 3 million users milestone in August 2010.

• *http://techcrunch.com/2010/07/07/foursquare-gowalla-stats/
• **http://www.crunchbase.com/company/foursquare

9. All Sources

Online sample survey of 2,884 consumers across 14 countries, aged 18 to 55, by Online Media Gazette.
Danny Brown Resources

About the Writer:

Sorav is a young entrepreneur who started his Internet Marketing career at the age of 17. He is among the pioneers of Social Media & Digital Marketing in India. He writes a Social Media and Digital Marketing blog and conducts Social Media training and workshops in cities across the globe. Connect with Sorav Jain on Twitter (@soravjain), Facebook, or LinkedIn.

About the Author

Sorav holds a Master’s in International Marketing Management from Leeds University Business School (U.K.) and is an alumnus of Loyola College, Chennai, both globally renowned institutions. He started his career at age 17 as an SEO executive and freelance content writer. At Leeds University Business School he was awarded the Best Market Research Presentation Award, the Leadership Award, the Class Champion Award ’08, and many other accolades.

Does Google Instant Mark the End of SEO?



by Chris Crum, courtesy of WebProNews

Google Instant Considerations for Search Marketing

A reporter (I believe she was from AdAge) attending Google’s Q&A about Google Instant pointed out that the new search feature tends to favor big brands. This isn’t really surprising, as it is these brands that are more likely to be searched for most often. After all, they’re big because people know them.

Do you think Google Instant is a threat to SEO? Share your thoughts.

iCrossing has a list of brands that come up when you enter each letter of the alphabet (not all are brands, but many are). A is for Amazon (not Apple), B is for Bank of America, M is for Mapquest (not Microsoft), N is for Netflix, P is for Pandora, V is for Verizon, and Y is for Yahoo.

You must keep in mind, however, that the instant results are personalized. Google takes into account things like your location and your surfing habits when providing you results.

Google Instant doesn’t necessarily make things any easier on small businesses, but it’s showing big brands in cases where Google probably would’ve suggested big brands anyway. If users do a lot of local searches, it’s possible that Google could show more local results (including small businesses) for those users, though I’m only speculating.

Steve Rubel says that Google Instant makes SEO irrelevant. “Here’s what this means,” he says. “No two people will see the same web. Once a single search would do the trick – and everyone saw the same results. That’s what made search engine optimization work. Now, with this, everyone is going to start tweaking their searches in real-time. The reason this is a game changer is feedback. When you get feedback, you change your behaviors.”

He’s not wrong about that, but I’m not sure that makes SEO irrelevant. Google has been showing different results to different users for quite a while now. This is really just an extension of that.

Businesses might want to try (and have other people try) doing searches for keywords that they would expect people to use to find their site. See what comes up (keep in mind the personalization) and work from there. Easier said than done, no doubt, but it’s something to consider. Think about what kinds of people will be interested in your products and what other kinds of searches they might be doing. It’s not a science, but again, perhaps something worth considering. It might mean getting to know your customers better, which can’t be a bad thing anyway. Maybe it means asking them to take surveys. Maybe it doesn’t.

The whole thing doesn’t help organic SEO’s case in the old SEO vs PPC debate. I’ll give Rubel that.

Speaking of PPC, Google says Google Instant changes the way it counts impressions. “It’s possible that this feature may increase or decrease your overall impression levels,” says Google’s Dan Friedman. “However, Google Instant may ultimately improve the quality of your clicks since it helps users type queries that more directly connect them with the answers they need.”

Trevor Claiborne of the Google Analytics Team says that Analytics users might notice some fluctuations in AdWords impression volume and traffic for organic keywords. “For example, you may find that certain keywords receive significantly more or fewer impressions moving forward,” he says.

You should read this post on the Google Webmaster Central blog. It says that impressions are measured in three ways: the traditional way; when a user clicks on a link that appears as they begin to type; and when a user stops typing and the results are displayed for a minimum of 3 seconds.

Sidenote: Google’s Matt Cutts weighed in on the whole will-Google-Instant-kill-SEO question. “Almost every new change at Google generates the question ‘Will X kill SEO?’ Here’s a video I did last year, but it still applies,” he says.

He says, however, that over time, it could change SEO. “The search results will remain the same for a query, but it’s possible that people will learn to search differently over time,” says Cutts. “For example, I was recently researching a congressperson. With Google Instant, it was more visible to me that this congressperson had proposed an energy plan, so I refined my search to learn more, and quickly found myself reading a post on the congressperson’s blog that had been on page 2 of the search results.”

Google Instant will likely become increasingly important to search marketing, because not only will it roll out to more countries (it’s starting in the U.S. and a select few others), but it will soon come to mobile and browser search boxes. Each of these factors will greatly increase how often Instant results are displayed.

The mobile factor actually has implications for Google retaining a substantial amount of mobile searches in general. The better (and quicker) Google can give results on any kind of query, the less reason users have to go to different apps to acquire certain information.

Google clearly said that ranking stays the same with Google Instant, but it will change the way people search. It will affect their search behavior, and that is what search marketers are going to have to think about more than ever. You should also consider that some people will simply deactivate the feature, leaving them open to Google’s standard results.

Tell us what you think of Google Instant. Do you like it or not?

29 SEO Mistakes



Here’s a quick summary of some common SEO mistakes prepared by Stephan Spencer of SEARCH ENGINE LAND:

(In the original checklist, each question had three checkbox columns: “N/A,” “Will stop,” and “Won’t stop.”)
1. Do you use pull-down boxes for navigation?      
2. Does your primary navigation require Flash, Java or Javascript to function?      
3. Is your web site done entirely in Flash or overly graphical with very little textural content?      
4. Is your home page a “splash page” or otherwise content-less?      
5. Does your site employ frames?      
6. Do the URLs of your pages include “cgi-bin” or numerous ampersands?      
7. Do the URLs of your pages include session IDs or user IDs?      
8. Do you unnecessarily spread your site across multiple domains?      
9. Are your title tags the same on all pages?      
10. Do you have pop-ups on your site?      
11. Do you have error pages in the search results (“session expired”, etc.)?      
12. Does your File Not Found error return a 200 status code?      
13. Do you use “click here” or any other superfluous copy for your hyperlink text?      
14. Do you have superfluous text like “Welcome to” at the beginning of your title tags?      
15. Do you unnecessarily employ redirects, or are they the wrong type?      
16. Do you have any hidden or small text meant only for the search engines?      
17. Do you engage in “keyword stuffing“?      
18. Do you have pages targeted to obviously irrelevant keywords?      
19. Do you repeatedly submit your site to the search engines?      
20. Do you incorporate your competitors’ brand names in your meta tags?      
21. Do you have duplicate pages with minimal or no changes?      
22. Does your content read like “spamglish”?      
23. Do you have “doorway pages” on your site?      
24. Do you have machine-generated pages on your site?      
25. Are you “pagejacking”?      
26. Are you cloaking?      
27. Are you submitting to FFA (“Free For All”) link pages and link farms?      
28. Are you buying expired domains with high PageRank scores to use as link targets?      
29. Are you presenting a country selector as your home page to Googlebot?

Worst practices explained: 

  1. Do you use pull-down boxes for navigation? Search engine spiders can’t fill out forms, even short ones with just one pull-down. Thus, they can’t get to the pages that follow. If you’re using pull-downs, make sure there is an alternate means of navigating to those pages that the spiders can use. Note this is not the same as a mouseover menu, where sub-choices show up upon hovering over the main navigation bar; that’s fine if done using CSS (rather than Javascript.)
  2. Does your primary navigation require Flash, Java or Javascript? If you rely on search engine spiders executing Flash, Java or Javascript code in order to access links to deeper pages within your site, you’re taking a big risk. The search engines have a limited ability to deal with Flash, Java and Javascript. So the links may not be accessible to the spiders, or the link text may not get associated with the link. Semantically marked up HTML is always the most search engine friendly way to go.
  3. Is your site done entirely in Flash or overly graphical with very little textual content? Text is always better than graphics or Flash animations for search engine rankings. Page titles and section headings should be text, not graphics. The main textual content of the page should ideally not be embedded within Flash. If it is, then have an alternative text version within div tags and use SWFObject to determine whether that text is displayed based on whether the visitor has the Flash plugin installed.
  4. Is your home page a “splash page” or otherwise content-less? With most websites, as mentioned above, the home page is weighted by the search engines as the most important page on the site (i.e., given the highest PageRank score.) Thus, having no keyword-rich content on your home page is a missed opportunity.
  5. Does your site employ frames? Search engines have problems crawling sites that use frames (i.e., where part of the page moves when you scroll but other parts stay stationary.) Google advises not using frames: “Frames tend to cause problems with search engines, bookmarks, emailing links and so on, because frames don’t fit the conceptual model of the Web (every page corresponds to a single URL).” Furthermore, if a frame does get indexed, searchers clicking through to it from search results will often find an “orphaned page”: a frame without the content it framed, or content without the associated navigation links in the frame it was intended to display with. Often, they will simply find an error page. What about “iFrames,” you ask? iFrames are better than frames for a variety of reasons, but the content within an iframe on a page still won’t be indexed as part of that page’s content.
  6. Do the URLs of your pages include “cgi-bin” or numerous ampersands? As discussed, search engines are leery of dynamically generated pages. That’s because they can lead the search spider into an infinite loop called a “spider trap.” Certain characters (question marks, ampersands, equal signs) and “cgi-bin” in the URL are sure-fire tip-offs to the search engines that the page is dynamic and thus to proceed with caution. If the URLs have long, overly complex “query strings” (the part of the URL after the question mark), with a number of ampersands and equals signs (which signify that there are multiple variables in the query string), then your page is less likely to get included in the search engine’s index.
  7. Do the URLs of your pages include session IDs or user IDs? If your answer to this question is yes, then consider this: search engine spiders like Googlebot don’t support cookies, and thus the spider will be assigned a new session ID or user ID on each page of your site that it visits. This is the proverbial “spider trap” waiting to happen. Search engine spiders may just skip over these pages. If such pages do get indexed, there will be multiple copies of the same pages, each taking a share of the PageRank score, resulting in PageRank dilution and lowered rankings. If you’re not quite clear on why your PageRank scores will be diluted, think of it this way: Googlebot will find minimal links pointing to the exact version of a page with a particular session ID in its URL.
  8. Do you unnecessarily spread your site across multiple domains? This is typically done for load balancing purposes. For example, the links on the JCPenney.com home page point off to www2.jcpenney.com, or www3.jcpenney.com, or www4.jcpenney.com and so on, depending on which server is the least busy. This dilutes PageRank in a way similar to how session IDs in the URL dilute PageRank.
  9. Are your title tags the same on all pages? Far too many websites use a single title tag for the entire site. If your site falls into that group, you’re missing out on a lot of search engine traffic. Each page of your site should “sing” for one or several unique keyword themes. That “singing” is stifled when the page’s title tag doesn’t incorporate the particular keyword being targeted.
  10. Do you have pop-ups on your site? Most search engines don’t index Javascript-based pop-ups, so the content within the pop-up will not get indexed. If that’s not good enough reason to stop using pop-ups, you should know that people hate them – with a passion. Also consider that untold millions of users have pop-up blockers installed. (The Google Toolbar and Yahoo Companion toolbar are pop-up blockers, too, in case you didn’t know.)
  11. Do you have error pages in the search results (“session expired” etc.)? First impressions count . . . a lot! So make sure search engine users aren’t seeing error messages in your search listings. Hotmail took the cake in this regard, with a Google listing for its home page that, for years, began with: “Sign-In Access Error.” Not exactly a useful, compelling or brand-building search result for the user to see. Check to see if you have any error pages by querying Google, Yahoo and Bing for site:www.yourcompanyurl.com. Eliminate error pages from the search engine’s index by serving up the proper status code in the HTTP header (see below) and/or by including a meta robots noindex tag in the HTML.
  12. Does your “file not found” error page return a 200 status code? This is a corollary to the tip immediately above. Before the content of a page is served up by your Web server, an HTTP header is sent, which includes a status code. A status code of 200 is what’s usually sent, meaning that the page is “OK.” A status code of 404 means that the requested URL was not found. Obviously, a file not found error page should return a 404 status code, not a 200. You can verify whether this is the case using a server header checker: into its form, enter a bogus URL at your domain, such as http://www.yourcompanyurl.com/blahblah. An additional, and even more serious, consequence of a 200 being returned for URLs that are clearly bogus/non-existent is that your site will look less trustworthy to Google (Google does check for this). Note that there are other error status codes that may be more appropriate to return than a 404 in certain circumstances, like a 403 if the page is restricted or a 503 if the server is overloaded and temporarily unavailable; a 200 (or a 301 or 302 redirect that points to a 200) should never be returned, regardless of the error, to ensure the URL with the error does not end up in the search results.
  13. Do you use “click here” or other superfluous copy for your hyperlink text? Wanting to rank tops for the words “click here,” eh? Try some more relevant keywords instead. Remember, Google associates the link text with the page you are linking to, so make that anchor text count.
  14. Do you have superfluous text like “Welcome To” at the beginning of your title tags? No one wants to be top ranked for the word “welcome” (except maybe the Welcome Inn chain!) so remove those superfluous words from your title tags!
  15. Do you unnecessarily employ redirects, or are they the wrong type? A redirect is where the URL changes automatically while the page is still loading in the user’s browser. Temporary (status code of 302) redirects — as opposed to permanent (301) ones — can cost you valuable PageRank. That’s because temporary redirects don’t pass PageRank to the destination URL. Links that go through a click-through tracker first tend to use temporary redirects. Don’t redirect visitors when they first enter your site at the home page; but if you must, at least employ a 301 redirect. Whether 301 or 302, if you can easily avoid using a redirect altogether, then do that. If you must have a redirect, avoid having a bunch of redirects in a row; if that’s not possible, then ensure that there are only 301s in that chain. Most importantly, avoid selectively redirecting human visitors (but not spiders) immediately as they enter your site from a search engine, as that can be deemed a “sneaky redirect” and can get you penalized or banned.
  16. Do you have any hidden or small text meant only for the search engines? It may be tempting to obscure your keywords from visitors by using tiny text that is too small for humans to see, or as text that is the same color as the page background. However, the search engines are on to that trick.
  17. Do you engage in “keyword stuffing”? Putting the same keyword everywhere, such as in every ALT attribute, is just asking for trouble. Don’t go overboard with repeating keywords or adding a meta keywords tag that’s hundreds of words long. (Why even have a meta keywords tag? They don’t help with SEO, they only help educate your competitors on which keywords you are targeting.) Google warns not to hide keywords in places that aren’t rendered, such as comment tags. A good rule of thumb to operate under: if you’d feel uncomfortable showing to a Google employee what you’re doing, you shouldn’t be doing it.
  18. Do you have pages targeted to obviously irrelevant keywords? Just because “britney spears” is a popular search term doesn’t mean it’s right for you to be targeting it. Relevancy is the name of the game. Why would you want to be number one for “britney spears” anyway? The bounce rate for such traffic would be terrible.
  19. Do you repeatedly submit your site to the engines? At best this is unnecessary. At worst this could flag your site as spam, since spammers have historically submitted their sites to the engines through the submission form (usually multiple times, using automated tools, and without consideration for whether the site is already indexed). You shouldn’t have to submit your site to the engines; their spiders should find you on their own — assuming you have some links pointing to your site. And if you don’t, you have bigger issues: like the fact your site is completely devoid of PageRank, trust and authority. If you’re going to submit your site to a search engine, search for your site first to make sure it’s not already in the search engine’s index and only submit it manually if it’s not in the index. Note this warning doesn’t apply to participating in the Sitemaps program; it’s absolutely fine to provide the engines with a comprehensive Sitemaps XML file on an ongoing basis (learn more about this program at Sitemaps.org).
  20. Do you incorporate your competitors’ brand names in your meta tags? Unless you have their express permission, this is a good way to end up at the wrong end of a lawsuit.
  21. Do you have duplicate pages with minimal or no changes? The search engines won’t appreciate you purposefully creating duplicate content to occupy more than your fair share of available positions in the search results. Note that a dynamic (database-driven) website inadvertently offering duplicate versions of pages to the spiders at multiple URLs is not a spam tactic, as it is a common occurrence for dynamic websites (even Google’s own Googlestore.com suffers from this), but it is something you would want to minimize due to the PageRank dilution effects.
  22. Does your content read like “spamglish”? Crafting pages filled with nonsensical, keyword-rich gibberish is a great way to get penalized or banned by search engines.
  23. Do you have “doorway pages” on your site? Doorway pages are pages designed solely for search engines that aren’t useful or interesting to human visitors. Doorway pages typically aren’t linked to much from other sites or much from your own site. The search engines strongly discourage the use of this tactic, quite understandably.
  24. Do you have machine-generated pages on your site? Such pages are usually devoid of meaningful content. There are tools that churn out keyword-rich doorway pages for you, automatically. Yuck! Don’t do it; the search engines can spot such doorway pages.
  25. Are you “pagejacking”? “Pagejacking” refers to hijacking or stealing high-ranking pages from other sites and placing them on your site with few or no changes. Often, this tactic is combined with cloaking so as to hide the victimized site’s content from search engine users. The tactic has evolved over the years; for example, “auto-blogs” are completely pagejacked content (lifted from RSS feeds). Pagejacking is a big no-no! Not only is it very unethical, it’s illegal; and the consequences can be severe.
  26. Are you “cloaking”? “Cloaking” is the tactic of detecting search engine spiders when they visit and varying the content specifically for the spiders in order to improve rankings. If you are in any way selectively modifying the page content, this is nothing less than a bait-and-switch. Search engines have undercover spiders that masquerade as regular visitors to detect such unscrupulous behavior. (Note that cleaning up search engine unfriendly URLs selectively for spiders, like Yahoo.com does on their home page by dropping their ylt tracking parameter from all their links, is a legitimate tactic.)
  27. Are you submitting to FFA (“Free For All”) link pages and link farms? Search engines don’t think highly of link farms and such, and may penalize you or ban you for participating in them. How can you tell link farms and directories apart? Link farms are poorly organized, have many more links per page, and have minimal editorial control.
  28. Are you buying expired domains with high PageRank scores to use as link targets? Google underwent a major algorithm change a while back to thwart this tactic. Now, when domains expire, their PageRank scores are reset to 0, regardless of how many links point to the site.
  29. Are you presenting a country selector as your home page to Googlebot? Global corporations sometimes present first-time visitors with a list of countries and/or languages to choose from upon entry to their site. An example of this is at EMC.com. This becomes a “worst practice” when this country list is represented to the search engines as the home page. Happily, EMC has done their homework on SEO and is detecting the spiders and waving them on. In other words, Googlebot doesn’t have to select a country before entry. You can confirm this yourself: do a Google search for “cache:www.emc.com” and you will see EMC’s U.S. home page.
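Worst practice #7 above (session or user IDs in URLs) is easiest to avoid at the application layer: keep session state in cookies for human visitors, and strip ID parameters from any URL you emit or canonicalize. A minimal sketch in Python; the parameter names in SESSION_PARAMS are illustrative assumptions, not a definitive list, so audit your own platform for the IDs it actually uses.

```python
from urllib.parse import urlsplit, urlunsplit, parse_qsl, urlencode

# Illustrative parameter names only -- your platform may use different ones.
SESSION_PARAMS = {"sessionid", "sid", "phpsessid", "jsessionid", "userid"}

def canonicalize(url):
    """Drop session/user-ID query parameters so each page maps to one stable URL."""
    parts = urlsplit(url)
    kept = [(k, v) for k, v in parse_qsl(parts.query)
            if k.lower() not in SESSION_PARAMS]
    # Rebuild the URL without the session noise (and without any fragment).
    return urlunsplit((parts.scheme, parts.netloc, parts.path, urlencode(kept), ""))

print(canonicalize("http://www.example.com/catalog?PHPSESSID=a1b2c3&category=shoes"))
# -> http://www.example.com/catalog?category=shoes
```

Serving (or linking to) only the canonicalized form means Googlebot sees one URL per page, so inbound PageRank is not split across dozens of session-stamped duplicates.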
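Worst practice #12 above (a "file not found" page that returns 200) is easy to test for yourself. The sketch below stands up a tiny, hypothetical local server that behaves correctly (200 for its home page, 404 for anything bogus) and probes it the way a server header checker would; point the same fetch_status probe at a made-up path on your own domain, and anything other than a 4xx code is a red flag.

```python
import http.server
import threading
import urllib.error
import urllib.request

def fetch_status(url):
    """Return the HTTP status code for a URL (urllib raises on 4xx/5xx)."""
    try:
        with urllib.request.urlopen(url) as resp:
            return resp.status
    except urllib.error.HTTPError as err:
        return err.code

class WellBehavedHandler(http.server.BaseHTTPRequestHandler):
    """Toy server doing the right thing: 200 for real pages, 404 for the rest."""
    def do_GET(self):
        status, body = (200, b"home") if self.path == "/" else (404, b"not found")
        self.send_response(status)
        self.send_header("Content-Length", str(len(body)))
        self.end_headers()
        self.wfile.write(body)

    def log_message(self, *args):  # keep the demo quiet
        pass

server = http.server.HTTPServer(("127.0.0.1", 0), WellBehavedHandler)
port = server.server_address[1]
threading.Thread(target=server.serve_forever, daemon=True).start()

print(fetch_status(f"http://127.0.0.1:{port}/"))          # a real page: 200
print(fetch_status(f"http://127.0.0.1:{port}/blahblah"))  # bogus URL: 404, not a "soft" 200
```

A misconfigured site would return 200 for the /blahblah request, which is exactly the soft-404 pattern the article warns erodes Google's trust in the domain.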
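Worst practice #15 above (wrong or chained redirects) can be audited by following each hop manually instead of letting the HTTP client hide them. This sketch, again run against a toy local server rather than a real site, records the status of every hop so a temporary 302 (which, per the article, passes no PageRank) stands out from a permanent 301.

```python
import http.client
import http.server
import threading
import urllib.parse

def redirect_chain(url, max_hops=10):
    """Follow redirects by hand, recording (status, url) for every hop."""
    chain = []
    for _ in range(max_hops):
        parts = urllib.parse.urlsplit(url)
        conn = http.client.HTTPConnection(parts.netloc, timeout=5)
        conn.request("GET", parts.path or "/")
        resp = conn.getresponse()
        resp.read()  # drain the body so the connection can be reused/closed cleanly
        chain.append((resp.status, url))
        location = resp.getheader("Location")
        conn.close()
        if resp.status in (301, 302, 303, 307, 308) and location:
            url = urllib.parse.urljoin(url, location)  # resolve relative Location
        else:
            break
    return chain

class RedirectingHandler(http.server.BaseHTTPRequestHandler):
    """Toy site: /old 302s (temporary) to /new; /moved 301s (permanent) to /new."""
    def do_GET(self):
        if self.path == "/old":
            self.send_response(302)
            self.send_header("Location", "/new")
        elif self.path == "/moved":
            self.send_response(301)
            self.send_header("Location", "/new")
        else:
            self.send_response(200)
        self.send_header("Content-Length", "0")
        self.end_headers()

    def log_message(self, *args):
        pass

server = http.server.HTTPServer(("127.0.0.1", 0), RedirectingHandler)
port = server.server_address[1]
threading.Thread(target=server.serve_forever, daemon=True).start()

for start in ("/old", "/moved"):
    statuses = [status for status, _ in redirect_chain(f"http://127.0.0.1:{port}{start}")]
    print(start, statuses)
```

Running the same audit on your own entry URLs shows at a glance whether any hop in the chain is a 302 that should be a 301, or whether the chain can be eliminated altogether.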