Thursday, March 20, 2014

New Title Tag Guidelines & Preview Tool

Google's recent SERP redesign may not seem like a big deal to the casual observer, but at least one change could have a real impact on SEOs. This post will explore the impact of the redesign on title tags, and define a new, data-driven length limit, but first, a new tool...

Title tag preview tool (2014 edition)

Pardon the reverse order of this post, but we wanted to put the tool first for repeat visitors. Just enter your title and the search query keywords (for highlighting) below to preview your result in the redesign:
(Interactive preview widget; note that the description font and size have not changed in the redesign, and descriptions still get cut off after roughly 160 characters.)

Note: Enter keyword phrases as natural queries, without commas. This preview tool only highlights exact-match text (not related concepts) and is only intended as an approximation of actual Google results.

How the redesign impacts titles

Google's redesign increased the font size of result titles, while keeping the overall container the same size. Look at the following search result both before and after the redesign:
The title on the top (old design) has a small amount of room to spare. After the redesign (bottom), it's lost six full characters. The old guidelines no longer apply, and so the rest of this post is an attempt to create a new set of guidelines for title tag length based on data from real SERPs.
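To get a feel for the tighter limit, here is a minimal sketch that truncates a title roughly the way the redesigned SERP would. The 55-character threshold is only an illustrative assumption; the real cutoff is pixel-based, since the font got larger while the container stayed the same width.

```python
# Rough title-length check for the redesigned SERP.
# ASSUMPTION: a ~55-character ceiling; the real limit is pixel-based,
# so treat this number as an approximation, not an official figure.
APPROX_CHAR_LIMIT = 55

def preview_title(title, limit=APPROX_CHAR_LIMIT):
    """Return the title roughly as it would display, truncated at a word boundary."""
    if len(title) <= limit:
        return title
    cut = title[:limit].rsplit(" ", 1)[0]
    return cut + " ..."

print(preview_title("New Title Tag Guidelines & Preview Tool for the 2014 Google SERP Redesign"))
```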

Tuesday, March 18, 2014

Google Humans Do Review Every Reconsideration Request

Over the past week or two, some people have been suggesting that Google does not manually review every single reconsideration request.
A new Google Webmaster Help thread has one such complaint, but the truth is, at least from what we are told, Google employees review 100% of all reconsideration requests.
I have been told that directly by Googlers, and Matt Cutts did a video on the topic a couple of years ago. That was before Google moved reconsideration requests into the manual action viewer; now all reconsideration requests have to be submitted via the manual action section, and thus all are reviewed by humans.
They might use some templated responses, but humans do click on, read, and paste the response.
Here is Matt's video:

Forum discussion at Google Webmaster Help.

Google Secure Search Going Global, So Is Not Provided

A WebmasterWorld thread links to a story that says Google has recently begun defaulting to secure, encrypted search worldwide.
Here is a statement from Niki Christoff, Google's Director of Corporate Communications:
The revelations of this past summer underscored our need to strengthen our networks. Among the many improvements we've made in recent months is to encrypt Google Search by default around the world. This builds on our work over the past few years to increase the number of our services that are encrypted by default and encourage the industry to adopt stronger security standards.
Honestly, I thought Google's secure search was already the default globally, based on my 93% not provided count. But I guess it will soon be 100%.
Yes, as Google defaults all search to SSL, it will strip out the referral query data, and marketers will lose out as well. Most don't care if it goes global because most have already lost 90%+ of it.
But we are eagerly awaiting what Google is going to announce with the upcoming not provided changes.
Forum discussion at WebmasterWorld.

Google Penalizes Italian, Spanish & German Link Networks

Google penalized a few more international link networks on Friday afternoon. As promised, Google went after Italian and Spanish networks and those who participated in them, as well as a couple more German link networks.
Earlier in the day on Friday, Matt Cutts, Google's head of search spam, tweeted that Google has "taken action" on another German link network, this one named efamous, plus a German agency network. This is the second time in almost two months that Google booted a German link agency and network. The thing is, efamous looks pretty legit, but I guess behind the scenes, in Google's mind, it was not?
Later in the day, and this may be somewhat historic because Matt Cutts didn't announce it first, Google penalized an Italian and a Spanish link network. Giacomo Gnecchi, a Google search quality analyst who has been with Google for about four years, tweeted it in Italian; much later, Matt Cutts retweeted it and then posted a translated version on Twitter. This shouldn't be a surprise, because Matt warned about it days before.

Google: Being Disavowed Won't Hurt Your Site

I am pretty sure we covered this before but the message is not getting out. If someone puts your site in their disavow link file, it will NOT have a negative impact on your rankings.
There are many link spammers trying to get their links removed from sites and are using very threatening emails and messages to encourage those sites to remove the links. One example is in a thread at Google Webmaster Help.
Here is part of the message:
We would like to bring your notice that failure to remove these links would require us to file a "Disavow Links" report with Google. Once we submit this report to Google, they may "flag" your site as "spammy" or otherwise if anything is not in compliance with their guidelines. The last thing we want is to have another web master go through this grief!
John Mueller from Google responds to the concern saying:
They are wrong. Having URLs from your website submitted in their disavow file will not cause any problems for your website. One might assume that they are just trying to pressure you. If the comment links they pointed to you are comment-spam that was left by them (or by someone working in their name) on your website, perhaps they are willing to help cover the work involved in cleaning their spam up?
I love how he outright calls them wrong and then goes on and suggests they pay up to remove the link they placed on their site. It is like throwing it back in their face and using a tactic the spammer would have used.
Forum discussion at Google Webmaster Help.

Google's Official Advice On Out Of Stock Products On E-Commerce Sites

A common question large sites and e-commerce sites have to ask themselves is what to do about product pages that are either temporarily out of stock or out of stock forever. It is a question we have asked a couple of times. In fact, Google's John Mueller gave his two cents in 2008.
Well, now, Google's Matt Cutts created a short video answering what you should do in three different situations.



Tuesday, March 11, 2014

SEO Elite Software

The secret tool that tens of thousands of the top ranked Google sites have been quietly using for years...
"Who Else Wants To Finally Get A #1 Google Ranking In As Little As 7 Days... And Drive A Minimum Of 789 Unique Visitors To Your Websites Per Day?"
Introducing the ONLY search engine optimization software
product that has been 100% PROVEN to dramatically
increase your rankings in Google, Yahoo, and MSN.
Above is a real-life screenshot of 1 of my many websites.
The site above gets, on average, 12,783 visitors PER DAY!


From: Brad Callen

Indianapolis, IN
Tuesday, March 11, 2014

Dear fellow internet marketer,
My name is Brad Callen. I've been marketing online since the early 2000s and have been fortunate enough to have generated millions of dollars solely from the internet.

Friday, March 7, 2014

Google Notifies Google Places Users Of Duplicate Listings Issues

Google is upgrading more and more businesses to the new Google local business dashboard and, during that process, is running into issues with old listings being duplicates.
Google is sending out emails to affected business owners with instructions on how to repair the duplicate issue and move forward with the upgrade.
Jade Wang from Google posted a snippet of the email in the Google Business Help forums and then described the issues, and how to address them, in detail.
The email reads:
We'd like to inform you that Google Places no longer accommodates more than one authorized owner per business location. Your account contains one or more listings that have been identified as duplicates of other listings and as a result, some of the information you provide will not be shown to Google users anymore...
There are two issues where this can be triggered:
(1) Your account and another account that you don't control became verified for the same business using the old Places dashboard.
(2) You may have verified the page multiple times using accounts you control.
In each case, Jade describes how to repair the issue so you can continue with your upgrade.
For more details, see the Google Business Help thread.
Forum discussion at Google Business Help.

Google: Don't Worry If Your Google Webmaster Tools Preview Is Wrong

Often, Google Webmaster Tools shows an image preview of your site's home page on the dashboard listing view of your site profiles.
What if the image of your site is wrong? Should you be concerned? Google's John Mueller implies it is really not something to be too concerned about.
In a Google Webmaster Help thread, one person was concerned because his mobile site was displayed in the preview and not his main site. John said, don't "give too much weight to these preview images."
I wouldn't give too much weight to these preview images -- they're not meant to be a representation of how Googlebot crawls the page. If you do see something wrong there, I'd still follow up to see where it came from, but it's not something that would be affecting how we index & rank your website. From what I can tell here, we do recognize your website appropriately, and can pick up the smartphone version. Since the homepage has a very different response size depending on the version that you serve, one way to dig into the details could be to check your server logs, comparing size with the user-agent that made the request, to double-check that your real users get the correct version.
These previews don't represent what Googlebot crawls, and they don't affect your indexing and ranking. It is likely just a cosmetic thing in Webmaster Tools.
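If you do want to dig into your logs the way John suggests, here is a minimal sketch that parses a standard "combined" format access log and compares average response sizes by rough user-agent bucket. The log path, the regex, and the bucket names are assumptions for illustration only.

```python
import re
from collections import defaultdict

# Assumes the common Apache/Nginx "combined" log format; adjust for your server.
LOG_LINE = re.compile(
    r'\S+ \S+ \S+ \[[^\]]+\] "[^"]*" \d{3} (?P<bytes>\d+|-) '
    r'"[^"]*" "(?P<ua>[^"]*)"'
)

def bucket(ua):
    """Very rough user-agent grouping, for illustration only."""
    ua = ua.lower()
    if "googlebot" in ua:
        return "googlebot"
    if "iphone" in ua or "android" in ua:
        return "mobile"
    return "desktop"

sizes = defaultdict(list)
with open("access.log") as fh:  # assumed log location
    for line in fh:
        m = LOG_LINE.match(line)
        if m and m.group("bytes") != "-":
            sizes[bucket(m.group("ua"))].append(int(m.group("bytes")))

for group, values in sizes.items():
    print("%s: %d hits, avg %.0f bytes" % (group, len(values), sum(values) / len(values)))
```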

Local SEO Checklist

WebmasterWorld moderator travelin cat posted an awesome, free local SEO cheat sheet on WebmasterWorld.
He said he has used this cheat sheet for "some time" and that if you take it step by step, in a careful and detailed manner, "this information should work for you as well."
Here are the instructions; for the cheat sheet itself, go to WebmasterWorld.
First, complete the following questionnaire BEFORE you begin to submit your website to the search engines, review websites and other citation sites. Some of these questions may not apply to your business, if that is the case, just leave them blank.
Be accurate and thorough. The most important thing to remember is to BE CONSISTENT! All of your submissions must have identical information or you will not get the search engine rankings that you will need to improve business.
After you have completed this questionnaire, use it and the information that it asked you to gather up, to submit to the 25 sites included below it.
Forum discussion at WebmasterWorld.

Google's Matt Cutts On Percolator, Dremel & Pregel

I doubt this will help you with your SEO and rankings, but hey, it is always fun hearing an engineer talk about some really technical details of how certain things are used at Google.
In his latest video, Matt answers how Google uses Percolator, Dremel, and Pregel.
Here are the technical documents on each:
Here is how Matt sums it up in four minutes:

Google: Keep URL Length Shorter Than 2,000 Characters

SEOs obsess about the smallest things, even how long is too long for a URL.
A Google Webmaster Help thread has SEOs and webmasters asking how long a URL can be. Google actually answered the question.
John Mueller of Google said that while there is "no theoretical length limit" and URLs can go on forever, Google does recommend you keep them under 2,000 characters. He wrote:
As far as I know, there's no theoretical length limit, but we recommend keeping URLs shorter than 2000 characters to keep things manageable.
It is interesting because DoubleClick, a Google company, maxes out at 2,000 characters for a URL. It also seems that GET requests commonly max out at around 2,000 characters and that Internet Explorer can't go beyond roughly 2,000 either.
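Checking your own URLs against that recommendation is trivial; here is a small sketch using Python's standard library, with the 2,000-character figure taken straight from John's answer.

```python
from urllib.parse import urlsplit

MAX_URL_LENGTH = 2000  # the recommendation from the thread, not a hard protocol limit

def url_length_report(url):
    """Report a URL's total length and how much of it is query string."""
    query = urlsplit(url).query
    status = "OK" if len(url) <= MAX_URL_LENGTH else "over the recommended limit"
    return "%d characters (%s); query string alone is %d characters" % (len(url), status, len(query))

print(url_length_report("http://www.example.com/category.php?category=gummy-candies&price=5-10"))
```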

Thursday, March 6, 2014

100 Free Instant Approval Directory Submission List March 2014

Directory submission is one of the best ways to get backlinks to your website or blog. Backlinks make a website more search engine friendly; more backlinks mean more popularity in the search engines. Directory websites are important both for getting backlinks and for getting indexed, which is why a directory submission list matters to webmasters trying to increase their rankings in Google search. But many free directory submission sites take a long time to index your website or blog. Below is a list of directories that help you get backlinks to your website or blog instantly, for free. Your website will be indexed after submission to the directory. Some of these sites do not have good PageRank, but they can still help webmasters get instant backlinks to their website or blog; some of the sites do have good PageRank.

100 Free Instant Approval Directory Submission List : March 2014

Instant Approval Directory Listing
Bedirectory.com
247webdirectory.com
highrankdirectory.com
archivd.com
Adbritedirectory.com
9dir.com
Addirectory.org
Clicksordirectory.com
ellysdirectory.com
Sublimedir.net
Ask-Directory.com
submissionwebdirectory.com
Upsdirectory.com
craigslistdir.org
hitwebdirectory.com
Bing-Directory.com
poordirectory.com
craigslistdirectory.net
Bestbuydir.com

Latest Social Bookmarking Sites March 2014

Tuesday, March 4, 2014

Matt Cutts Video: How Google Determines What’s A Paid Link

Google head of search spam Matt Cutts released a pretty detailed video discussing the Google webspam team’s criteria for determining whether a link is considered a paid link or not.
There are five basic criteria Google uses in this determination. The first is the most obvious: is the link explicitly for sale? The others are less obvious: how close the value is to money, whether it is a gift or a loan, who the intended audience is, and whether it comes as a surprise or not.

Explicit Link Sales

Links that are explicitly sold for money are the most obvious. A webmaster sells a link to another webmaster in exchange for a certain dollar payment. That is clearly a paid link, and Matt Cutts said that is the most common paid link example, by far.

Close To The Value Of Money

The next determination Google uses is to see how close the value is to money. For example, a gift card is pretty close to money in that it can often be exchanged for a dollar value. But if you give someone a free pen valued at $1, chances are that the value of that $1 pen won't influence the user. However, a free beer or a free software trial is worth far less to users than a $600 gift card.

Gift Vs. Loan

If you give someone a laptop versus loaning them a laptop, or gift someone a car versus loaning them a car, those are huge distinctions. Often, companies will loan a tech reviewer a device or a car so that they can properly review the item. But if you give them the item forever and never ask them to return it, that is closer to a paid link than a loan.

A new Googlebot user-agent for crawling smartphone content

Webmaster level: Advanced
Over the years, Google has used different crawlers to crawl and index content for feature phones and smartphones. These mobile-specific crawlers have all been referred to as Googlebot-Mobile. However, feature phones and smartphones have considerably different device capabilities, and we've seen cases where a webmaster inadvertently blocked smartphone crawling or indexing when they really meant to block just feature phone crawling or indexing. This ambiguity made it impossible for Google to index smartphone content of some sites, or for Google to recognize that these sites are smartphone-optimized.

A new Googlebot for smartphones

To clarify the situation and to give webmasters greater control, we'll be retiring "Googlebot-Mobile" for smartphones as a user agent starting in 3-4 weeks' time. From then on, the user-agent for smartphones will identify itself simply as "Googlebot" but will still list "mobile" elsewhere in the user-agent string. Here are the new and old user-agents:
The new Googlebot for smartphones user-agent:
Mozilla/5.0 (iPhone; CPU iPhone OS 6_0 like Mac OS X) AppleWebKit/536.26 (KHTML, like Gecko) Version/6.0 Mobile/10A5376e Safari/8536.25 (compatible; Googlebot/2.1; +http://www.google.com/bot.html)
The Googlebot-Mobile for smartphones user-agent we will be retiring soon:
Mozilla/5.0 (iPhone; CPU iPhone OS 6_0 like Mac OS X) AppleWebKit/536.26 (KHTML, like Gecko) Version/6.0 Mobile/10A5376e Safari/8536.25 (compatible; Googlebot-Mobile/2.1; +http://www.google.com/bot.html)
This change affects only Googlebot-Mobile for smartphones. The user-agent of the regular Googlebot does not change, and the remaining two Googlebot-Mobile crawlers will continue to refer to feature phone devices in their user-agent strings; for reference, these are:
Regular Googlebot user-agent:
Mozilla/5.0 (compatible; Googlebot/2.1; +http://www.google.com/bot.html)
The two Googlebot-Mobile user-agents for feature phones:
  • SAMSUNG-SGH-E250/1.0 Profile/MIDP-2.0 Configuration/CLDC-1.1 UP.Browser/6.2.3.3.c.1.101 (GUI) MMP/2.0 (compatible; Googlebot-Mobile/2.1; +http://www.google.com/bot.html)
  • DoCoMo/2.0 N905i(c100;TB;W24H16) (compatible; Googlebot-Mobile/2.1; +http://www.google.com/bot.html)
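If you branch on user-agents server-side, a quick sketch like this distinguishes the retiring Googlebot-Mobile strings from the new smartphone Googlebot, using only the strings listed above. Plain substring checks are a heuristic for illustration, not an official verification method.

```python
def classify_googlebot(ua):
    """Classify a user-agent string per the announcement above (heuristic only)."""
    if "Googlebot-Mobile" in ua:
        return "feature-phone Googlebot-Mobile"
    if "Googlebot" in ua and "Mobile" in ua:
        return "new smartphone Googlebot"
    if "Googlebot" in ua:
        return "regular Googlebot"
    return "not Googlebot"

new_smartphone_ua = (
    "Mozilla/5.0 (iPhone; CPU iPhone OS 6_0 like Mac OS X) AppleWebKit/536.26 "
    "(KHTML, like Gecko) Version/6.0 Mobile/10A5376e Safari/8536.25 "
    "(compatible; Googlebot/2.1; +http://www.google.com/bot.html)"
)
print(classify_googlebot(new_smartphone_ua))  # -> new smartphone Googlebot
```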

Faceted navigation best (and 5 of the worst) practices

Faceted navigation, such as filtering by color or price range, can be helpful for your visitors, but it’s often not search-friendly since it creates many combinations of URLs with duplicative content. With duplicative URLs, search engines may not crawl new or updated unique content as quickly, and/or they may not index a page accurately because indexing signals are diluted between the duplicate versions. To reduce these issues and help faceted navigation sites become as search-friendly as possible, we’d like to:


Selecting filters with faceted navigation can cause many URL combinations, such as http://www.example.com/category.php?category=gummy-candies&price=5-10&price=over-10

Background

In an ideal state, unique content -- whether an individual product/article or a category of products/articles --  would have only one accessible URL. This URL would have a clear click path, or route to the content from within the site, accessible by clicking from the homepage or a category page.
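One common tactic -- a sketch of just one option, not the full set of approaches the post discusses -- is to collapse every filtered URL back to a single canonical category URL by stripping the facet parameters. The parameter names below follow the example URL above and are assumptions.

```python
from urllib.parse import urlsplit, urlunsplit, parse_qsl, urlencode

FACET_PARAMS = {"price", "color", "sort"}  # assumed filter parameters for this example

def canonical_category_url(url):
    """Drop facet parameters so every filtered view maps to one canonical URL."""
    parts = urlsplit(url)
    kept = [(k, v) for k, v in parse_qsl(parts.query, keep_blank_values=True)
            if k not in FACET_PARAMS]
    return urlunsplit((parts.scheme, parts.netloc, parts.path, urlencode(kept), ""))

print(canonical_category_url(
    "http://www.example.com/category.php?category=gummy-candies&price=5-10&price=over-10"
))
# -> http://www.example.com/category.php?category=gummy-candies
```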

Infinite scroll search-friendly recommendations

Your site’s news feed or pinboard might use infinite scroll—much to your users’ delight! When it comes to delighting Googlebot, however, that can be another story. With infinite scroll, crawlers cannot always emulate manual user behavior--like scrolling or clicking a button to load more items--so they don't always access all individual items in the feed or gallery. If crawlers can’t access your content, it’s unlikely to surface in search results. 

To make sure that search engines can crawl individual items linked from an infinite scroll page, make sure that you or your content management system produces a paginated series (component pages) to go along with your infinite scroll. 


An infinite scroll page is made “search-friendly” when converted to a paginated series -- each component page has a similar <title> with rel=next/prev values declared in the <head>.

You can see this type of behavior in action in the infinite scroll with pagination demo created by Webmaster Trends Analyst, John Mueller. The demo illustrates some key search-engine friendly points:
  • Coverage: All individual items are accessible. With traditional infinite scroll, individual items displayed after the initial page load aren’t discoverable to crawlers.
  • No overlap: Each item is listed only once in the paginated series (i.e., no duplication of items).
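Here is a minimal sketch of what a component page's <head> markup could look like, generated per page in the series. The /items?page=N URL pattern is an assumption for illustration, not the demo's actual URLs.

```python
def component_head(page, total_pages, base="http://www.example.com/items"):
    """Build the <title> and rel=prev/next links for one component page."""
    lines = ["<title>Items, page %d of %d</title>" % (page, total_pages)]
    if page > 1:
        lines.append('<link rel="prev" href="%s?page=%d">' % (base, page - 1))
    if page < total_pages:
        lines.append('<link rel="next" href="%s?page=%d">' % (base, page + 1))
    return "\n".join(lines)

print(component_head(page=2, total_pages=5))
```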

3 tips to find hacking on your site, and ways to prevent and fix it

Google shows this message in search results for sites that we believe may have been compromised. You might not think your site is a target for hackers, but it's surprisingly common. Hackers target large numbers of sites all over the web in order to exploit the sites' users or reputation.

One common way hackers take advantage of vulnerable sites is by adding spammy pages. These spammy pages are then used for various purposes, such as redirecting users to undesired or harmful destinations. For example, we’ve recently seen an increase in hacked sites redirecting users to fake online shopping sites.

Once you recognize that your website may have been hacked, it’s important to diagnose and fix the problem as soon as possible. We want webmasters to keep their sites secure in order to protect users from spammy or harmful content.

3 tips to help you find hacked content on your site

  1. Check your site for suspicious URLs or directories
    Keep an eye out for any suspicious activity on your site by performing a “site:” search of your site in Google, such as [site:example.com]. Are there any suspicious URLs or directories that you do not recognize?
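As a rough companion to that manual check, here is a small sketch that compares the URLs you spot in a [site:example.com] search (pasted into a text file, one per line) against the URLs you actually publish, and flags anything left over. Both file names are illustrative assumptions.

```python
def load_urls(path):
    """Read one URL per line from a plain-text file."""
    with open(path) as fh:
        return {line.strip() for line in fh if line.strip()}

# found_urls.txt: URLs seen in the site: search; sitemap_urls.txt: URLs you publish
unexpected = load_urls("found_urls.txt") - load_urls("sitemap_urls.txt")
for url in sorted(unexpected):
    print("suspicious:", url)
```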

Saturday, March 1, 2014

You Can Fake Your Google +1 Counts With Redirects

Enrico Altavilla discovered a bug with how Google+ shows +1s for a page, which has already been patched by Google. It is pretty amazing and reminds me of how webmasters faked their PageRank back in 2005.
In short, by using a simple redirect, a page was able to pretend it was another page and use the +1s from the page it was redirecting to as its own.
So when a site had a weird JavaScript redirect to YouTube, Google+ thought the site was indeed YouTube, and that page inherited the +1s YouTube had.
Google +1s hijack
Again, this no longer works, but I figured I'd share the story so that if it happens again in a different way, we have something to look back at.

Google's Matt Cutts: Content Clarity Over Technical Content

There is an excellent video from Google's Matt Cutts on the question Should I focus on clarity or jargon when writing content?
The short answer is focus on clarity over jargon.
Matt explains that in most cases, having content that most people understand is way more important than having all the scientific and technical jargon about the topic you are covering. If you can't explain the topic to a novice, then the reader likely won't be able to understand your content.
Best case, start off explaining it in simple terms and get more technical as you go. But if you had to pick, it seems Matt is saying that content clarity is more important than detailed technical and scientific content, in most cases.
Here is the video:

Google's Matt Cutts Wants You To Send Him Examples Of Scraper Sites

Matt Cutts, Google's head spam guy, posted on Twitter that he wants you to submit reports and examples of scraper sites or URLs that are outranking the original source.
He made a Google Doc form where you can submit the report over here. The form asks for the source URL (i.e., the original source of the content), the URL of the page stealing the content, the search results page where it is being outranked, and just an agree link.
You should keep in mind, in January 2011, Google came out with an algorithm specifically designed to prevent scrapers from ranking well, i.e. the scraper algorithm.
The best example thus far was posted by +JonDunn with a tip from Dan Barker:
google scraper example
Classic!
Anyway, I assume this means Google is going to use this data to improve or create a new algorithm in the future.
Forum discussion at Twitter, WebmasterWorld & Google+.

Sunday, February 23, 2014

Bing Updates Webmaster Guidelines: Keyword Stuffing Now Off Limits

Last night, Bing updated their webmaster guidelines, adding a section about "keyword stuffing." Surprised it wasn't there from the onset? Yeah, me too, but truthfully, there are a ton of things they can and should add that are not currently there.
What is new? The section on keyword stuffing, which reads:
When creating content, make sure to create your content for real users and readers, not to entice search engines to rank your content better. Stuffing your content with specific keywords with the sole intent of artificially inflating the probability of ranking for specific search terms is in violation of our guidelines and can lead to demotion or even the delisting of your website from our search results.
I verified using various caching services that the paragraph was indeed not there a day or two ago.
That being said, the language is pretty strong. If you do use keyword stuffing techniques on your site, Bing may give your site a "demotion" or, even worse, "delist" your site from the Bing search results.
Hat tip to +GauravGupta2014 for informing me about this.

Google's Matt Cutts: We Tested Dropping Backlinks From Algorithm, It Was Much Worse

Matt Cutts' latest video has Google admitting that they did, and do, test their search results by turning off linkage data in their algorithm. Matt Cutts said the results would be "much much worse" if they actually did that in real life.
That does make sense, since Google's core algorithm was mostly based on links and PageRank, and they have spent all these years improving on it. They invested so much time and so many resources in using links to rank sites that dropping them now would make a mess.
It is funny, because a couple of weeks ago we asked you what you would do if Google dropped backlinks from the algorithm. So far we have over 300 responses: 34% said they would be very excited, 32% said they'd be curious, and 17% said they'd be very concerned.
Here is Matt's video on the topic:

Business Names Google Places Quality Guidelines Updated

Google has updated their Google Places quality guidelines once again, this time to clarify how you can name your business within Google Places/Google Local/Google Maps.
Jade Wang from Google pulled out the changes and posted them in the Google Places Help forums. The changes include:
  • Your title should reflect your business's real-world title.
  • In addition to your business's real-world title, you may include a single descriptor that helps customers locate your business or understand what your business offers.
  • Marketing taglines, phone numbers, store codes, or URLs are not valid descriptors.
  • Examples of acceptable titles with descriptors (in italics for demonstration purposes) are "Starbucks Downtown" or "Joe's Pizza Delivery". Examples that would not be accepted would be "#1 Seattle Plumbing", "Joe's Pizza Best Delivery", or "Joe's Pizza Restaurant Dallas".
Hopefully that clarifies things a bit better, because these guidelines are updated relatively frequently.

Monday, February 17, 2014

Google's Advice On Infinite Crawl Pages & SEO

Google's John Mueller, Maile Ohye, and Joachim Kupke co-authored a technical blog post on the Google Webmaster Blog on how to make infinite scroll pages more search engine friendly.
The issue, as you can understand, is that Googlebot and other crawlers can't scroll down a page and thus can't load more content with that action. In Google's blog post, they announced the definitive guide on how to make infinite scroll pages more search-friendly.
In short, Google is recommending that you convert the infinite scroll page to a paginated series by using the HTML5 History API. John even made a demo page of infinite scroll that is search engine friendly.

Google Drops New Webmaster Guideline On Not Blocking Google Ads

Yesterday we broke the news that Google had added a line to the technical requirements of its Webmaster Guidelines saying that you should not block Google ads.
It seems Google has now pulled the new line completely from their guidelines. It is unclear why, but I suspect that after I emailed them, they reviewed it and found it to be confusing as well.
The new guideline that was added for about 24 hours read:
Make efforts to ensure that a robots.txt file does not block a destination URL for a Google Ad product. Adding such a block can disable or disadvantage the Ad.
As I explained, it was confusing because Google specifically asks you to block other ads from being crawled, but here Google wants you to allow Google to crawl its own ads. You and I understand why, because Google uses landing page quality score as part of AdWords ranking, but still, the way they worded it is confusing.
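For anyone who wants to sanity-check their own setup against the since-removed guideline, here is a minimal sketch using Python's standard robots.txt parser to verify that a given ad destination URL is not blocked. The robots.txt rules, the user-agent token, and the landing-page URL are illustrative assumptions.

```python
from urllib.robotparser import RobotFileParser

robots_txt = """
User-agent: *
Disallow: /private/
"""  # assumed robots.txt content

rp = RobotFileParser()
rp.parse(robots_txt.splitlines())

landing_page = "http://www.example.com/landing/offer"  # assumed ad destination URL
if rp.can_fetch("AdsBot-Google", landing_page):  # assumed ads user-agent token
    print("Ad destination URL is crawlable.")
else:
    print("Warning: robots.txt blocks the ad destination URL.")
```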
Now, the language and the bullet point are completely gone. The guidelines are back to how they were the day before.
Google has not responded to my request for clarification as of yet.