Monday, February 17, 2014

Google: GoogleBot Follows Up To Five Redirects At The Same Time

Google's John Mueller said in a webmaster hangout on Friday that GoogleBot will follow up to five redirects at a time; past that, you are probably out of luck.
I don't believe we previously had a solid number on how many redirects Google will follow. This may be the first time Google has given a number for how many redirects it follows at one time, though I may be wrong.
We had Matt Cutts talk about PageRank dilution through redirects in the past.
Google's John Mueller said this 46 minutes and 3 seconds into the hangout embedded below:
Of course, this is useful information for SEOs when doing audits.
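For those audits, the chain length is easy to check programmatically. Here is a minimal sketch (the URLs and the `fetch` callback are hypothetical stand-ins for whatever HTTP client you use, not anything Google publishes) that counts redirect hops and flags chains longer than the five hops Mueller mentioned:

```python
from urllib.parse import urljoin

def count_redirect_hops(fetch, url, max_hops=5):
    """Follow a redirect chain and count the hops.

    fetch(url) must return (status_code, location_header_or_None).
    Raises RuntimeError if the chain exceeds max_hops redirects.
    """
    hops = 0
    while hops <= max_hops:
        status, location = fetch(url)
        if status not in (301, 302, 303, 307, 308) or location is None:
            return hops, url  # reached a non-redirect response
        url = urljoin(url, location)  # Location may be relative
        hops += 1
    raise RuntimeError(f"More than {max_hops} redirects - a crawler would likely give up")
```

Running this against each redirecting URL on a site during an audit quickly surfaces chains that are longer than they need to be.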

Google Says Switching To HTTPS Won't Change Your Page Rankings

A Google Webmaster Help thread has someone complaining that his site's rankings dropped, and the one thing he noticed was that his home page and 56 other pages are indexed in Google as the HTTPS version.
To which John Mueller of Google responded:
Before you get too focused on technical issues, I'd just like to add that going from https to http, or the other way around, generally won't noticeably change your pages' ranking.
So he is implying that something else may be to blame for the ranking drop, not the HTTPS issue being cited.
But is it true that switching from HTTPS to HTTP, or the other way around, "generally won't noticeably change your pages' ranking"?
About six months ago, we covered that making the switch is doable, but you need to do it right. Matt Cutts also posted a video on the topic in 2011. Of course, things change quickly in search, so I am not sure he would be as reserved if he made the video today:


Monday, February 10, 2014

SMO Active Users Email ID 2014

SMO Active Users Email ID Database

11webexpert@gmail.com,
11webpoint@gmail.com,
aannagates@gmail.com,
abdulsami134@gmail.com,
abhirajuts@gmail.com,
abhirampathak3@gmail.com,
adityakumarshrivastwa@gmail.com,
agarwalmana404@gmail.com,
ajayk7090@gmail.com,
alaina.wats@gmail.com,
alariceengel@gmail.com,
alastairalex1@gmail.com,
alexzendra889@gmail.com,
alissalori1@gmail.com,

Friday, February 7, 2014

Free German Classified Sites List 2014


Here is a free German classified sites list. These sites have high PageRank and are also among the most popular sites in Germany.

S.No | Website URL | PR | Alexa Rank
1    |             | 3  | 1636915
2    |             | 3  | 300
3    |             | 6  | 24716
4    |             | 6  | 112297
5    |             | 4  | 947101
6    |             | 2  | 72369
7    |             | 1  | 9378
8    |             | 6  | 7193
9    |             | 2  | 260736
10   |             | 3  | 1349501
11   |             | 5  | 2655125
12   |             | 4  | 1124957
13   |             | 3  | 49549
14   |             | 3  | 177199
15   |             | 3  | 177147

Australia Free Business Listing Sites 2014

Uk Free Business Listing Sites 2014

Monday, February 3, 2014

Google Places Adds A Slew Of Business Categories

Google is working on improving and expanding the business categories for Google Places For Business for countries around the world. In fact, they’re adding over 1,000 new categories in the Places dashboard.
Google Business community manager Jade Wang posted an announcement in the Google Product Forums (via Search Engine Roundtable), saying that the move comes based on feedback from merchants. She writes:

Google Warns German Webmasters That Paid Links Violate Google’s Guidelines

Google's head of search spam, Matt Cutts, posted on Twitter this morning another stern warning to German webmasters about a looming link penalty.
Matt Cutts tweeted:
A reminder (in German) that paid links that pass PageRank violate our guidelines: http://goo.gl/sHDdlC

The blog post is written in German on the German Webmaster blog, which basically says Google reserves the right to issue penalties for unnatural links. It then goes through the process of explaining the types of unnatural links and how to submit a reconsideration request if you were hit.

Google Is Not Broken

In spite of what many think, Google is not broken. But wait, naysayers will say, look at this search result, it stinks! This spammer is succeeding in ranking high; they emerged from nowhere and are now in the top three results!
It’s true — there are many such examples that you can point to. Making sense of this landscape can be quite confusing, but that’s what I will attempt to do in today’s post.
Firstly, there are two basic reasons why Google can be quite slow to address some of the problems you might find.

1. They Can Afford To Be Thoughtful And Patient

Why, you ask? They have dominant market share. Here is the December 2013 market share data from comScore:
comScore Search Market Share
comScore notes that "'Explicit Core Search' excludes contextually driven searches that do not reflect specific user intent to interact with the search results." In my experience, the practical impact of adjusting for this is that Google's search market share is a bit higher: most sites I look at get more than 67% of their organic search traffic from Google.

Can You Rank In Google Without Content?

A WebmasterWorld thread has a webmaster whose site doesn't have any real content. It is basically statistical downloads and specifications, downloadable as PDFs or ZIP files.
Can you rank web pages with no content at all in Google?
A good example of a page that ranks without having the exact words on it is the Adobe Reader page which ranks for [click here].
But what about a page with almost no content? Is it possible to rank on anchor text alone?
Yes, but it has to be for very obscure and non-competitive words.

Google's Matt Cutts: Don't Use Article Directories For Link Building

In a short video yesterday, Google's Matt Cutts told webmasters and SEOs not to use article directory sites for link building strategies.
Well, he kind of said not to use them, and hinted at it at the end by saying:

Here is the video:

Google Places Business Adds 1,000 Categories Internationally

Jade Wang from Google's Places business help team posted in the Google Business Help forums that they have greatly expanded business category support by adding over 1,000 new categories internationally.
Previously, they had a very limited number of categories for international (non-USA) businesses, because it was complicated to translate them all. Now Google has translated a ton of them and can support 1,000 new categories.
Jade said:
Today, we are taking a first step of many to improve categories that merchants can use to represent their businesses. Specifically, we’re adding over 1,000 new categories in the new Places dashboard. These categories are available globally and translated to every language Google supports.
Why not sooner? Jade explained:

DMOZ Drops Over 1 Million Sites From Directory?

Did you notice that DMOZ, one of the oldest and largest human-crafted web directories, has removed over 1 million sites and 10,000 editors from its directory?
A DigitalPoint Forum thread first noticed it. If you look at the live site now, you will see 4,261,763 sites, 89,252 editors and over 1,019,865 categories listed in the footer. But if you go to the Wayback Machine archive, you will see 5,310,345 sites, 99,997 editors and over 1,019,508 categories.
Here are screen shots:
NOW:
dmoz-new-sites
OLD:
dmoz-old-sites
As you can see, DMOZ dropped about 1 million sites and 10,000 editors from its directory. There was no announcement about this, so I am not sure if it is just a glitch in the footer.

Sunday, January 26, 2014

Initial SEO Strategy: Tips That You Must Follow Before Optimizing Your Site


Essential search engine optimization (SEO) is fundamental. SEO will help you position your site properly so it is found at the most critical points in the buying process, or whenever someone needs your website.

What are search engines looking for? How can you build your site in a way that pleases both your visitors/customers and Google, Bing, and the other search engines? Above all, how can SEO make your web presence more profitable?

"Skipping the basics and spending all your time and money on social and 'fancy stuff' is the same as skipping brushing your teeth and showering, but buying white strips and wearing expensive cologne," Shelby (Director of SEO, Chicago Tribune/435 Digital) said.

What is SEO, Basically?
The objective of foundational SEO isn't to trick or "game" the search engines. The purpose of SEO is to:
  • Create a great, consistent visitor experience.
  • Communicate your intentions to the search engines so they can recommend your site for relevant queries.
Your Website is Like a Cake
Your links, paid search, and social media act as the icing, but your content, information architecture, content management system, and infrastructure act as the sugar that makes the cake. Without it, your cake is flavorless, boring, and gets thrown in the garbage.

Wednesday, January 22, 2014

Google Webmaster Tools Crawl Errors Reports Now Showing Errors On Final Redirect URL

In the past, we have seen occasional confusion by webmasters regarding how crawl errors on redirecting pages were shown in Webmaster Tools. It's time to make this a bit clearer and easier to diagnose! While it used to be that we would report the error on the original - redirecting - URL, we'll now show the error on the final URL - the one that actually returns the error code.

Let's look at an example:

 
URL A redirects to URL B, which in turn returns an error. The type of redirect, and type of error is unimportant here.
In the past, we would have reported the error observed at the end under URL A. Now, we'll instead report it as URL B. This makes it much easier to diagnose the crawl errors as they're shown in Webmaster Tools. Using tools like cURL or your favorite online server header checker, you can now easily confirm that this error is actually taking place on URL B.
This change may also be visible in the total error counts for some websites. For example, if your site is moving to a new domain, you'll only see these errors for the new domain (assuming the old domain redirects correctly), which might result in noticeable changes in the total error counts for those sites.
Note that this change only affects how these crawl errors are shown in Webmaster Tools. Also, remember that having crawl errors for URLs that should be returning errors (e.g., they don't exist) does not negatively affect the rest of the website's indexing or ranking (also as discussed on Google+).
We hope this change makes it a bit easier to track down crawl errors, and to clean up the accidental ones that you weren't aware of! If you have any questions, feel free to post here, or drop by in the Google Webmaster Help Forum.
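As the post suggests, you can confirm where an error actually occurs with cURL or a header checker. Here is a small Python equivalent (standard library only; a sketch, not an official tool, and the URL you pass in is whatever you are diagnosing): it follows the redirect chain the way a crawler would and reports the final URL together with the status code that final URL returned.

```python
import urllib.request
import urllib.error

def final_url_and_status(url):
    """Follow redirects and report which final URL returns the status.

    Mirrors the new Webmaster Tools behavior: if URL A redirects to
    URL B and B errors, the error is attributed to B, not A.
    """
    try:
        resp = urllib.request.urlopen(url)  # urllib follows redirects by default
        return resp.geturl(), resp.getcode()
    except urllib.error.HTTPError as e:
        return e.url, e.code  # the error belongs to the final URL in the chain
```

So if `http://example.com/a` 301s to `/b` and `/b` returns a 404, this reports the 404 against `/b`, matching what the crawl errors report now shows.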

Google’s Matt Cutts: We Don’t Use Twitter Or Facebook Social Signals To Rank Pages

Google’s head of search spam, Matt Cutts, released a video today answering the question, “are Facebook and Twitter signals part of the ranking algorithm?” The short answer was no.
Matt said that Google does not give any special treatment to Facebook or Twitter pages. They are, in fact, currently treated like any other page, according to Matt Cutts.
Matt then answered if Google does special crawling or indexing for these sites, such as indexing the number of likes or tweets a specific page has. Matt said Google does not do that right now. Why?
They did at one point, and then they were blocked. I believe Matt was referring to Google's real-time search deal with Twitter expiring. Matt explained that they put a lot of engineering time into it, and then they were blocked, so that work and effort was no longer useful. For Google to put more engineering time into this only to be blocked again just doesn't pay.
Another reason: Google is worried about crawling identity information at one point, then having that information change without Google seeing the update until much later. Having outdated information can be harmful to some people.

Tuesday, January 21, 2014

Was Expedia Penalized By Google? SearchMetrics Says So.

Yesterday morning Patrick Altoft tweeted to me that SearchMetrics is reporting Expedia lost about 25% of their Google traffic overnight.
SearchMetrics is indeed reporting this, and I confirmed it with Marcus Tober from SearchMetrics via email. I posted the story on the traffic drop at Search Engine Land and asked if it was related to the link-buying allegations Expedia was surrounded by last month.
A Hacker News thread that I've been following for the past couple of weeks has more details about how Expedia may have been involved in link schemes and may have received an unnatural-link penalty from Google. Google has not yet responded to my inquiries about the penalty, so I have no confirmation from Google or Expedia about it.
SEM Rush shows no drop off in traffic: