
Monday, February 3, 2014

Google's Matt Cutts: Don't Use Article Directories For Link Building

In a short video yesterday, Google's Matt Cutts told webmasters and SEOs not to use article directory sites as a link building strategy.
Well, he more or less said not to use them, hinting at it toward the end of the video.

Here is the video:

Google Places Business Adds 1,000 Categories Internationally

Jade Wang from Google's Places business help team posted in the Google Business Help forums that Google has greatly expanded business category support by adding over 1,000 new categories internationally.
Previously, Google supported only a very limited number of categories for international (non-US) businesses because it was complicated to translate them all. Now Google has translated many more and can support the 1,000 new categories.
Jade said:
Today, we are taking a first step of many to improve categories that merchants can use to represent their businesses. Specifically, we’re adding over 1,000 new categories in the new Places dashboard. These categories are available globally and translated to every language Google supports.
Why not sooner? Jade explained:

DMOZ Drops Over 1 Million Sites From Directory?

Did you notice that DMOZ, one of the oldest and largest human-edited web directories, has removed over 1 million sites and 10,000 editors from its directory?
A DigitalPoint Forum thread first noticed it. If you look at the live site now, you will see 4,261,763 sites, 89,252 editors and over 1,019,865 categories listed in the footer. But if you go to the WayBackMachine archive, you will see 5,310,345 sites, 99,997 editors and over 1,019,508 categories.
Here are screenshots:
NOW: dmoz-new-sites
OLD: dmoz-old-sites
As you can see, DMOZ dropped about 1 million sites and 10,000 editors from its directory. There was no announcement about this, so I am not sure if it is just a glitch in the footer.

Sunday, January 26, 2014

Initial SEO Strategy: Tips That You Must follow Before Optimizing Your Site


Essential search engine optimization (SEO) is fundamental. SEO helps you position your site so it can be found at the most critical points in the buying process, or whenever someone needs your website.

What are search engines looking for? How can you build your site in a way that pleases both your visitors/customers and Google, Bing, and the other search engines? Above all, how can SEO help your web presence become more profitable?

"Skipping the basics and spending all your time and money on social and 'fancy stuff' is the same as skipping brushing your teeth and showering, but buying white strips and wearing expensive cologne," Shelby (Director of SEO, Chicago Tribune/435 Digital) said.

What is SEO, Basically?
The objective of foundational SEO isn't to trick or "game" the search engines. The purpose of SEO is to:
  • Create a great, seamless visitor experience. 
  • Communicate your intentions to the search engines so they can recommend your site for relevant queries.
Your Website is Like a Cake
Your links, paid search, and social media act as the icing, but your content, information architecture, content management system, and infrastructure act as the sugar that makes the cake. Without them, your cake is flavorless, boring, and gets thrown in the garbage.

Wednesday, January 22, 2014

Google Webmaster Tools Crawl Errors Reports Now Showing Errors On Final Redirect URL

In the past, we have seen occasional confusion by webmasters regarding how crawl errors on redirecting pages were shown in Webmaster Tools. It's time to make this a bit clearer and easier to diagnose! While it used to be that we would report the error on the original - redirecting - URL, we'll now show the error on the final URL - the one that actually returns the error code.

Let's look at an example:

 
URL A redirects to URL B, which in turn returns an error. The type of redirect and the type of error are unimportant here.
In the past, we would have reported the error observed at the end under URL A. Now, we'll instead report it as URL B. This makes it much easier to diagnose the crawl errors as they're shown in Webmaster Tools. Using tools like cURL or your favorite online server header checker, you can now easily confirm that this error is actually taking place on URL B.
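To see this for yourself outside of Webmaster Tools, you can follow the redirect chain by hand and check the status code of the final URL. Here is a minimal Python sketch using only the standard library; the example URL is a hypothetical placeholder, not one from the announcement:

    from urllib.parse import urljoin, urlparse
    import http.client

    def final_status(url, max_redirects=10):
        """Follow a redirect chain and report the final URL plus the code it returns."""
        for _ in range(max_redirects):
            parts = urlparse(url)
            conn_class = (http.client.HTTPSConnection if parts.scheme == "https"
                          else http.client.HTTPConnection)
            connection = conn_class(parts.netloc)
            path = parts.path or "/"
            if parts.query:
                path += "?" + parts.query
            connection.request("HEAD", path)
            response = connection.getresponse()
            location = response.getheader("Location")
            connection.close()
            if response.status in (301, 302, 303, 307, 308) and location:
                url = urljoin(url, location)  # keep following the redirect
                continue
            return url, response.status  # the final URL and the code it actually returns
        raise RuntimeError("Too many redirects")

    # Hypothetical example: /url-a redirects to /url-b, which returns a 404.
    # Webmaster Tools now reports that 404 against the final URL (/url-b).
    print(final_status("http://example.com/url-a"))

This mirrors what the report now shows: the error belongs to the URL that actually returns it, not to the redirecting URL.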
This change may also be visible in the total error counts for some websites. For example, if your site is moving to a new domain, you'll only see these errors for the new domain (assuming the old domain redirects correctly), which might result in noticeable changes in the total error counts for those sites.
Note that this change only affects how these crawl errors are shown in Webmaster Tools. Also, remember that having crawl errors for URLs that should be returning errors (e.g. they don't exist) does not negatively affect the rest of the website's indexing or ranking (as also discussed on Google+).
We hope this change makes it a bit easier to track down crawl errors, and to clean up the accidental ones that you weren't aware of! If you have any questions, feel free to post here, or drop by in the Google Webmaster Help Forum.

Google’s Matt Cutts: We Don’t Use Twitter Or Facebook Social Signals To Rank Pages

Google’s head of search spam, Matt Cutts, released a video today answering the question, “are Facebook and Twitter signals part of the ranking algorithm?” The short answer was no.
Matt said that Google does not give any special treatment to Facebook or Twitter pages; they are currently treated like any other pages, according to Matt Cutts.
Matt then addressed whether Google does any special crawling or indexing for these sites, such as indexing the number of likes or tweets a specific page has. Matt said Google does not do that right now. Why?
Google did do so at one point, and then they were blocked. I believe Matt was referring to Google's real-time search deal with Twitter expiring. Matt explained that Google put a lot of engineering time into it, and when they were blocked, that work and effort was no longer useful. So for Google to put more engineering time into this and then be blocked again just doesn't pay.
Another reason: Google is worried about crawling identity information at one point and then having that information change without Google seeing the update until much later. Having outdated information can be harmful to some people.

Tuesday, January 21, 2014

Was Expedia Penalized By Google? SearchMetrics Says So.

Yesterday morning Patrick Altoft tweeted to me that SearchMetrics is reporting Expedia lost about 25% of their Google traffic overnight.
SearchMetrics is indeed reporting this, and I confirmed it with Marcus Tober from SearchMetrics via email. I posted the story on the traffic drop at Search Engine Land and asked if it was related to the link buying allegations Expedia was surrounded by last month.
A Hacker News thread that I've been following for the past couple of weeks has more details about how Expedia may have been involved in link schemes and may have received an unnatural link penalty from Google. Google has not yet responded to my inquiries about the penalty, so I have no confirmation from Google or Expedia about it.
SEMrush, however, shows no drop-off in traffic.

Keep Writing Quality Content: SEO Bloggers React To Matt Cutts’ Claim “Guest Blogging Is Dead”

Google’s head of webspam, Matt Cutts, caused an uproar in the SEO community yesterday when he published a blog post on his personal blog claiming guest blogging for SEO purposes is dead.
In his post, Cutts offered a history of how guest blogging has moved from being a reliable source of high-quality content to now being overrun with spam.
“Guest blogging is done; it’s just gotten too spammy,” wrote Cutts. “In general I wouldn’t recommend accepting a guest blog post unless you are willing to vouch for someone personally or know them well.”
As Cutts’ words spread across the web, many SEO bloggers took to their own blogs to offer their take on the demise of guest blogging.
With so much being said on the topic, we’ve put together a round-up of industry reactions, summarizing comments from a selection of popular SEO bloggers.

Monday, January 20, 2014

Google Removed 350 Million Ads & Rejected 3 Million Publishers

Google announced on Friday their efforts to keep their ad network safe, in line and trustworthy. They shared some pretty crazy stats on what that means for their ad network:
  • Removed 350 million bad ads in 2013
  • Disabled 270,000 advertisers in 2013
  • Blacklisted more than 200,000 total publisher pages
  • Disapproved 3,000,000 attempts to join AdSense
  • Disabled 250,000 publisher accounts
Google published this infographic to show their efforts in a friendly way.
Forum discussion at WebmasterWorld.

Google Apologizes For The Hotel Listing Hijack In Google Places

I have to assume most of you have heard by now about the huge mess going on with Google Maps business listings in the hotel sector. Danny Sullivan at Search Engine Land, with his team, wrote up an awesome story explaining how thousands of hotel listings were hijacked in Google+ Local.
I can tell you this story was in the works for a few days, and Danny broke it just the other day. It is honestly shocking how something like this can happen to huge hotel chains. It is even more shocking how Google tries to sweep it under the rug. Yeah, I know Google Maps is plagued with issues, especially on the Google Places business listing side. But this is a huge mess.
In short, spammers somehow hijacked listings of hotel chains across the world, replacing each hotel's URL with a URL to book the room through their own affiliate site. This likely ended up costing the hotels a tremendous amount in affiliate fees; I wouldn't blame them if they refused to pay and ended up suing the affiliates who did this.
Here is an example showing one listing with a hijacked URL:
Google barely said anything, but now they have their community manager, Jade Wang, responding in a Google Business Help thread that hardly anyone looks at. She wrote:

Thursday, December 26, 2013

When You Have Bad Links Impacting Your Google Rankings, Which Do You Remove?

I see this all the time: a forum thread where a webmaster knows his rankings are suffering in Google because he was hit by Penguin, since he has a lot of really bad, manipulative links. A WebmasterWorld thread sums up the predicament such a webmaster is in:
(1) They hired an SEO company.
(2) That SEO company ranked them well for years.
(3) Then Penguin smashed their links.
(4) They no longer rank as well.
(5) They are upset with the SEO company.
(6) They need to figure out how to rank again.
(7) Removing the links is the only option.
(8) But removing the links that produced their initial good rankings won't help them rank immediately.
In this thread, the site owner sums it up as:
1) What is the sure proof way to make sure a link is 100% bad? 2) I don't want to remove all links cause I am worried my site will drop even more. I'm sure there are some semi-good links that might be helping.
3) After submitting disavow file, typically how long does it take to recover? We have two sites, one seems to be under penguin and panda updates and the other received a manual penalty for certain bad links for certain terms.
It is sad, indeed. But you need to disavow the links, that is for sure. Those links are not helping you and they are now hurting you. Remove the hurt. Then get people to link to you because they want to link to you.
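For reference, the disavow file you upload through Google's Disavow Links tool is just a plain text file: one URL per line, an optional "domain:" prefix to disavow an entire domain, and "#" for comments. A small sketch with made-up domains:

    # disavow.txt - hypothetical example
    # Lines starting with "#" are comments and are ignored.
    # Disavow every link from an entire domain:
    domain:spammy-article-directory.example
    domain:paid-link-network.example
    # Disavow a single URL:
    http://low-quality-blog.example/guest-post-with-keyword-anchor.html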
But which links should you remove? Which links are actually hurting you? That is the hard question. One SEO offered his advice:

Black Hats Prepare To Spam Google's Author Authority Algorithm

For the past six months, Google has been working on an algorithm to promote authorities on topics. In short, Google is going to try to figure out which authors or individuals are authorities on a specific topic and promote their content across any site, in some way. You can read more about it in the links above.
This morning, I spotted a thread at Black Hat World where "black hats" are seeking ways to exploit this algorithm by "faking" author authority.
This is how one explained it:
So Google now allows you to "tag" an author in your content. Good authors who are popular get extra ranking bonuses for their articles. So it seems very simple to me. Find a popular author in your niche, and tag him in your links to your content.
Extra link juice off someone else's work.

Sunday, December 22, 2013

Google December 19th Ranking Fluctuations Reported

So Google confirmed they reduced how often authorship snippets show in the results. We know that. Matt Cutts strongly implied that there was no update on December 17th, despite all the tracking tools lighting up on that date. The implication that Google minimizes algorithm updates before the holidays should apply to what I am seeing today - a lot of chatter, in some niches, about a Google update.
The key indicator I use is webmaster/SEO chatter. I check the chatter at WebmasterWorld and dozens of other forums, and the chatter picked up yesterday. Martin Ice Web, a Senior Member at WebmasterWorld who is based in Germany, is the loudest in claiming an update today. Many agree and see major changes, and there are many threads in the Google forums with individual complaints.
That being said, one of the tools I rarely show you, because it often doesn't match the other tools, is the DigitalPoint Keyword Tracker averages. It is a fairly new addition to the forum's sidebar, and it reports changes in actual rankings for hundreds of thousands of sites and, I'd say, millions of keywords. It is based on what webmasters enter into the tracking tool.
It showed a bit of a change on the 17th, but on the 19th it really skyrocketed, just like the forums did.
DigitalPoint Keyword Tracker averages
I emailed Google yesterday to find out if something specific is going on with rankings, but I have yet to hear back. It could be a refresh of Panda or something else, but I have no confirmation from Google.

Google's URL/Content Removal Tool Now A Wizard

Google has updated their URL removal tool to make it easier and smarter to remove content specifically from third-party web sites.
Google told us at Search Engine Land that the tool is now smarter: it analyzes the content of the URL you submit and lets you know what options you have based on the details of the cached result, the search results, and the actual live page.
You can remove complete pages if the page is no longer live or is blocked to spiders. You can remove content from a page if the content still shows in the Google cache but is no longer on the live page.
Here are some screenshots of Google's URL/Content Removal Tool Wizard.
Now, it may even work on soft-404s, so be careful. As WebmasterWorld's moderator said:

Saturday, December 21, 2013

Google’s Matt Cutts: Don’t Duplicate Your Meta Descriptions

Google’s Matt Cutts, the head of search spam, released a video today providing an SEO tip on meta descriptions. Matt said, do not have duplicate meta descriptions on your site.
Matt said it is better to have unique meta descriptions, or even no meta descriptions at all, than to show duplicate meta descriptions across pages.
In fact, Matt said he doesn't bother to write meta descriptions for his own blog.
In short, it is better to let Google auto-create snippets for your pages versus having duplicate meta descriptions.
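One simple way to audit this is to pull each page's meta description and flag any value that appears on more than one URL. Below is a rough sketch using only Python's standard library; the URL list is a placeholder and the parsing is deliberately minimal:

    from collections import defaultdict
    from html.parser import HTMLParser
    from urllib.request import urlopen

    class MetaDescriptionParser(HTMLParser):
        """Captures the content of <meta name="description"> if the page has one."""
        def __init__(self):
            super().__init__()
            self.description = None

        def handle_starttag(self, tag, attrs):
            attrs = dict(attrs)
            if tag == "meta" and (attrs.get("name") or "").lower() == "description":
                self.description = (attrs.get("content") or "").strip()

    def duplicate_descriptions(urls):
        """Returns {description: [urls, ...]} for descriptions reused across pages."""
        seen = defaultdict(list)
        for url in urls:
            parser = MetaDescriptionParser()
            with urlopen(url) as response:
                parser.feed(response.read().decode("utf-8", errors="replace"))
            if parser.description:
                seen[parser.description].append(url)
        return {text: pages for text, pages in seen.items() if len(pages) > 1}

    # Hypothetical usage:
    # print(duplicate_descriptions(["https://example.com/a", "https://example.com/b"]))

Any description that comes back attached to more than one URL is a candidate to rewrite, or simply to remove and let Google generate the snippet.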

Google Says It’s Now Working To ‘Promote Good Guys’

Google’s Matt Cutts says Google is “now doing work on how to promote good guys.”
More specifically, Google is working on changes to its algorithm that will make it better at promoting content from people who it considers authoritative on certain subjects.
You may recall earlier this year when Cutts put out the following video talking about things Google would be working on this year.


In that, he said, “We have also been working on a lot of ways to help regular webmasters. We’re doing a better job of detecting when someone is more of an authority on a specific space. You know, it could be medical. It could be travel. Whatever. And try to make sure that those rank a little more highly if you’re some sort of authority or a site, according to the algorithms, we think might be a little more appropriate for users.”
Apparently that’s something Google is working on right now.

Google: Duplicate Content Pollutes 25-30% Of The Web

In a Webmaster Help video, Google's Matt Cutts said that somewhere around 25% to 30% of the content on the web is duplicate content. He also said that Google does not treat duplicate content as spam unless it is clearly manipulative; Google simply groups the duplicates and picks one version to show in the search results.

Google: Authorship Works With Google+ Vanity URLs

Once you get your authorship working, you don't want to mess around and change URLs. Some won't touch it even when authorship isn't showing live in the search results, for fear that changing it might break something. That's even more true now that Google has reduced how often authorship is displayed in the results.
authorship markup
One question I get a lot from more advanced SEOs is whether to use the numeric URL or the custom/vanity URL from their Google+ profile, and whether to use the root, /about, or /posts URL. The answer is: it does not matter.
Google's John Mueller went on record about this topic this morning on Google+ saying:
I've seen this come up more since custom/vanity URLs for Google+ profiles have become more popular. Authorship works fine with vanity profile URLs, it works fine with the numeric URLs, and it doesn't matter if you link to your "about" or "posts" page, or even just to the base profile URL. The type of redirect (302 vs 301) also doesn't matter here. If you want to have a bit of fun, you can even use one of the other Google ccTLDs and link to that.
So there you have it, concrete information from Google on one of the scary topics to touch on.
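To illustrate why the URL form doesn't matter, here is a small hypothetical sketch (not Google's actual logic) that reduces numeric, vanity (+Name), /about, and /posts profile URLs to the same identifier:

    import re

    # Accepts numeric or vanity (+Name) Google+ profile URLs, with or without
    # an /about or /posts suffix. Hypothetical illustration only.
    PROFILE_URL = re.compile(
        r"https?://plus\.google\.com/(\+\w+|\d+)(?:/(?:about|posts))?/?$"
    )

    def profile_id(url):
        """Return the profile identifier, or None if this isn't a profile URL."""
        match = PROFILE_URL.match(url)
        return match.group(1) if match else None

    # Both forms point at a profile, so either works in a rel="author" link:
    print(profile_id("https://plus.google.com/+ExampleAuthor/posts"))         # +ExampleAuthor
    print(profile_id("https://plus.google.com/112233445566778899000/about"))  # numeric ID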

For more information, please see:
https://plus.google.com/authorship
https://support.google.com/webmasters/answer/1408986
https://support.google.com/plus/answer/2676340

Friday, December 20, 2013

Google: Your Various ccTLDs Will Probably Be Fine From The Same IP Address

Ever wondered if Google would mind if you had multiple ccTLD sites hosted from a single IP address? If you’re afraid they might not take kindly to that, you’re in for some good news. It’s not really that big a deal.
Google’s Matt Cutts may have just saved you some time and money with this one. He takes on the following submitted question in the latest Webmaster Help video:
For one customer we have about a dozen individual websites for different countries and languages, with different TLDs under one IP number. Is this okay for Google or do you prefer one IP number per country TLD?

“In an ideal world, it would be wonderful if you could have, for every different .co.uk, .com, .fr, .de, if you could have a different, separate IP address for each one of those, and have them each placed in the UK, or France, or Germany, or something like that,” says Cutts. “But in general, the main thing is, as long as you have different country code top level domains, we are able to distinguish between them. So it’s definitely not the end of the world if you need to put them all on one IP address. We do take the top-level domain as a very strong indicator.”

Tuesday, December 17, 2013

Google Structured Data Dashboard Adds Error Reports

Since we launched the Structured Data dashboard last year, it has quickly become one of the most popular features in Webmaster Tools. We’ve been working to expand it and make it even easier to debug issues so that you can see how Google understands the marked-up content on your site.
Starting today, you can see items with errors in the Structured Data dashboard. This new feature is a result of a collaboration with webmasters, whom we invited in June to register as early testers of markup error reporting in Webmaster Tools. We’ve incorporated their feedback to improve the functionality of the Structured Data dashboard.
An “item” here represents one top-level structured data element (nested items are not counted) tagged in the HTML code. They are grouped by data type and ordered by number of errors.
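As a concrete (hypothetical) illustration of what counts as one item: the microdata snippet below contains two itemscope elements, but only the outer Article is counted, because the nested Person is part of the Article rather than a top-level item.

    <article itemscope itemtype="http://schema.org/Article">
      <h1 itemprop="headline">Example headline</h1>
      <!-- Nested item: reported as part of the Article, not as a separate top-level item -->
      <span itemprop="author" itemscope itemtype="http://schema.org/Person">
        <span itemprop="name">Example Author</span>
      </span>
    </article>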


We’ve added a separate scale for the errors on the right side of the graph in the dashboard, so you can compare items and errors over time. This can be useful to spot connections between changes you may have made on your site and markup errors that are appearing (or disappearing!).
Our data pipelines have also been updated for more comprehensive reporting, so you may initially see fewer data points in the chronological graph.