It has been said that part of the new SEO is undoing the old SEO. It is against that backdrop that Marie Haynes works to get unnatural link penalties removed for sites that receive that dreaded message in Webmaster Tools. I met Marie almost a year ago and we quickly struck up a working relationship and friendship based on shared thinking on algorithmic penalties and their nuances, and on how to recover once afflicted. Marie and I also share an unhealthy infatuation with our beloved Toronto Maple Leafs. Marie was kind enough to take some time to answer questions for me about the state of the penalty space.
1) Although you have not been in the industry for a long period of time, you have quickly established yourself as one of the leading professionals working in the field of penalty removal. How did you get into search marketing and why were you drawn to the penalty aspect of the business?
A few years ago I was quite happy in my career as a veterinarian. I stumbled upon an SEO forum when I was trying to figure out how to get my veterinary advice website and my husband’s real estate site to rank higher in Google, and quickly became hooked. Two years ago I was pregnant with our second child and was placed on bed rest. I spent most of the time hanging out in SEO forums. One day, shortly after Penguin rolled out, I gave someone some advice on a forum, and it was bad advice. A senior member of the forum called me on it and I realized that there was an awful lot about Google penalties and algorithm changes that many people did not know. I made it my life’s mission to understand these things. When I felt I knew enough, I offered to look at a few sites for people who were experiencing traffic drops. At first I didn’t charge anything, but simply gave my opinion. As I started to get more confident in my diagnoses, I started to charge a small amount of money for my time. I got more and more business and kept raising my prices.
In mid-2012 a site owner asked me if I could remove his manual unnatural links penalty for him. Although I had read everything I could on the subject, I had never removed a penalty. I offered to do it for no charge and told him that if I succeeded he could pay me $300. It was one of my harder penalty removal jobs to date! This was prior to the disavow tool, and Google did not give any example links or much guidance at all on what they wanted site owners to do. Still, I managed to get the penalty removed and I now had someone who was willing to give me a glowing reference. Since then I have removed a large number of penalties and have consulted with hundreds of site owners who are experiencing problems with penalties and algorithm changes. While I still have my veterinary license, I can’t see myself ever stopping what I am doing now. I love this stuff so much!
2) There are all kinds of debates in the industry as it pertains to the need to perform a manual link audit when removing penalties rather than using automated tools to classify links. What are your thoughts on that? Do you think that these software vendors are trying to sell a “quick fix” or is there a place for them?
I think that there can be a place for automated link auditing tools alongside a manual audit. But I would never rely on them completely. I don’t believe that it’s possible for a tool to be 100% accurate. And if you’re trying to remove a link-based penalty, you need to be as accurate as possible. The frustrating thing, though, is that most website owners really have no clue how to audit their links. So I can see why they turn to tools, because really, the only other option they have is to pay someone to do a manual audit, and that can be very expensive.
With that being said, manual audits are not perfect either. I’ve definitely called some links natural that were unnatural and vice versa. Still, we do all of our audits completely by hand at the moment. We’re working on a tool to help us automate part of the process but I won’t release it until I am really happy with its accuracy!
3) What do you think triggers manual penalties? Do you think there is an algorithmic component to certain sites being flagged for manual review? If so, how come some sites that have been impacted by a manual action seem to escape algorithmic features such as Penguin? You would think that anything severe enough to warrant a manual action would also trigger something algorithmically?
There are probably a few ways that a site can get a manual review and subsequently a penalty. We know that manual spam reports can be filed. Google says that they review each report filed but don’t always act on them. Still, I do think that some manual reviews come about as a result of a report being filed. Recently, when My Blog Guest was penalized, we saw that thousands of sites that had any connection with MBG received manual penalties as well. While I wouldn’t say that this was “algorithmic”, I do think that Google could programmatically find sites that had a connection to MBG.
I think it’s also possible that Google has some sort of system whereby they flag sites that are selling links and then manually review the recipients of those links. So, let’s say they take down a blog network. They may have some sort of programmatic review that looks at every site that is linked from that network and then determines whether a manual review is a good idea.
All of that is just speculation though. The real answer to this question is, “I don’t know.” :)
4) Google has said that they receive about 5,000 reconsideration requests a week. Care to speculate how they scale that? Obviously there must be some thresholds that a site must reach before the penalty is revoked but sometimes it seems like some sites are easier than others. Do you think the human aspect of the process contributes to this?
I have heard from several reliable sources that there are multiple tiers when it comes to the webspam team. It appears that there is a junior tier that initially assesses your reconsideration request. They look at a subset of links that Google has determined to be unnatural and then see what you have done with them. Are any removed? Are they in your disavow? If it is obvious that you have not addressed a good number of links then they press the “rejected” button and you get a message of failure. But, if it appears that you may have done enough work to pass, then they pass you on to a higher level of the webspam team. I don’t know if this happens in every case or if it’s possible that the first tier team member can administer a pass. But, the second tier member can have a more detailed look at your situation. I believe that this person will look in greater detail at your spreadsheets if necessary.
You are right that some sites have penalties that are easier to remove than others. I believe that a lot of this depends on the level of manipulation that has been done in the past. The sites that Google seems to be really picky about are the ones that have had years and years of link building done on their behalf. I find it is often much harder to lift penalties on sites like payday loan sites than it is on a mom and pop business that simply hired a low budget SEO for a few months. With that being said, I’ve worked with some small businesses where Google was insanely picky about which links they wanted removed. In one case, the site owner had simply purchased two Fiverr link building gigs that resulted in difficult-to-find Russian forum profile links being created. Google would not lift the penalty until we tracked down a bunch of these using Google searches, as none of these links were in backlink checkers.
I think that sometimes sites can have difficulty removing a penalty because of bad luck. Google employee John Mueller has mentioned previously that the webspam team looks at a different subset of unnatural links each time they assess your link removal efforts. If you are unlucky enough to have that subset contain a large number of links that aren’t easy to find then you may end up failing. That brings up another pet peeve of mine and that is that Google often will give us example links that we can’t find anywhere. I wonder if those links are affecting sites that are hit with algorithmic problems like Penguin? Perhaps this is one of the reasons why we don’t see more Penguin recoveries.
5) Have you ever had Google reject a reconsideration request and provide natural links as examples? If so how do you handle that?
Yes. This has happened to me several times. Sometimes, even though a particular link is natural, it can be a clue that there are other similar links that are paid and still need to be addressed. And sometimes Google just gets it wrong. I recently had a case where I was auditing links manually and found that the site had a number of forum mentions from users of a particular video game. These users were finding value in some content that my client had produced. I told my client that this was something that they could use to their advantage in the future, as they could create more content like this and reach out to websites that centered around this game. Imagine my surprise when Google gave us one of these links as an example of an unnatural link! We contacted the owner of the site hosting the example link and she verified that the link was truly natural and that their users loved my client’s site!
What we did in this case was include our conversation with the site owner in a Google doc and referenced it in our reconsideration request. We told Google that we did not want to remove these links because they were good links. We did have additional unnatural links that needed to be removed, so we addressed those and Google did remove the site’s penalty even though we kept the natural links that Google had marked as unnatural.
6) There is a lot of chatter about the impact of negative SEO. Have you ever seen evidence of a site being penalized as a result of a negative SEO attack?
Ugh. I do not enjoy talking about negative SEO. I will commonly have clients complain that they were penalized because of negative SEO, but I have yet to see a case where negative SEO was the sole culprit. I do believe that it is possible that negative SEO could act as sort of a spam report. What I mean by that is that a site that already has a base of unnatural links could look even more unnatural if thousands of additional spam links were pointing at them.
Google says that they are pretty good at simply discounting negative SEO attacks. While I’ve had clients whose backlink profiles were littered with links as a result of hacking attempts and malware, I can’t think of a situation where negative SEO alone was responsible for a site’s penalty.
7) Were you at all surprised by how aggressively Google went after My Blog Guest and the sites that were using that service?
What surprised me the most about that situation was the case where Doc Sheldon’s site received a sitewide penalty because he hosted a couple of guest posts with manipulative links. I kind of feel like Google was trying to use him as an example to spread the word that guest posting for links must stop.
I was surprised at how many sites got manual penalties. That week I was bombarded with requests for help and in every case the site owners had taken part in MBG. Now, I know of many sites that used MBG and didn’t get penalized. I don’t believe that Google penalized everyone who used MBG but I think they manually reviewed a lot (if not all) of the sites that used the service. As mentioned before though, I don’t know how they accomplish so many manual reviews. Do they automate part of the process to find the worst offenders and then manually review those? Do they have an army of manual reviewers? I don’t know.
8) Any thoughts on what types of links might be next in Google’s cross-hairs?
I really think that the MBG manual penalties were a testing ground for the next Penguin iteration. I believe that Google is going to try to algorithmically devalue links in guest posts. I think that if they do this, the right thing to do would be to simply devalue any links that they can determine were obtained in guest posts as opposed to attaching any penalty to them. That way, if they inadvertently flag a good link as a link obtained by guest posting there is no major harm done. I think that the way that they will determine guest post links will be really complicated but I do think that it can be done. If I can look at a site’s backlink profile manually and determine that they’ve used guest posting (even from high quality blogs) as a way to build a lot of links, I’m sure that Google can find a way to do this algorithmically.
9) Do you still think we are going to be dealing with manual penalties on the scale we are seeing now or is this escalation of manual actions generating enough fear that it will change the SEO mindset to the point where scaled link building will become a thing of the past?
My gut instinct is that we’ve still got a couple more years left of wide scale manual penalties. From what I see in forum chatter, I don’t think that people are getting the message that manipulative link building has got to stop. Just this morning I saw a post in the Google help forums from a site owner who was convinced that links in spun articles on sites like EzineArticles and the like were totally natural as long as they didn’t contain keyword rich anchor text.
I do think that Google is trying to scare people, though, so that they won’t want to build links. I regularly talk to site owners who are afraid to link out to anyone, and that’s not good. One owner of an authoritative site asked me if he should be nofollowing all of his links when he cites another related website. And no! He should not! Eventually the message will get through to people that the old mindset of getting as many links as possible has got to stop, but that’s going to take quite a bit more time.
10) It has been six months since the last iteration of Penguin. Do you think we are likely to see a refresh soon? Do you think that Penguin is eventually going to get baked in to the algorithm so that we don’t have these huge events that occur when it does get rolled out?
I am guessing that a refresh will happen in the next few weeks. I really do feel like they are using the widespread manual penalties of guest posters as a testing ground for Penguin. If you recall, Google went after some large blog networks like Build My Rank in March of 2012 and then released Penguin in April of 2012. If my theory is right, then the manual penalties were testing ground for the algorithm changes.
Sadly, I do think that eventually Penguin will be baked into the algorithm just like Panda now is. I think it will take another year or two before we get to that point though.
11) What is the biggest mistake you see webmasters make when they fail to get a penalty revoked?
By far the most common reason for a site owner to fail to get a manual unnatural links penalty removed is not being thorough enough with their cleanup. Many people have a hard time grasping that any link that was made so that your PageRank will increase is an unnatural one. Also, links that were obtained by offering a product for review or as a result of widespread reciprocal linking are unnatural as well. I do a lot of reviews for site owners who are struggling to pass at reconsideration, and I will often see that there are a good number of unnatural links that were not removed or disavowed.
12) There are some who claim that they can get manual actions removed by just using the disavow tool and not doing any removals. Have you ever seen any evidence of this working?
This is a tough question to answer. I did have one case where we were unable to remove a single link because they were all part of a black hat link network. We disavowed everything and explained our plight to the webspam team. We got our penalty lifted without removing a single link.
But I have a tough time believing that simply disavowing is going to work regularly, because Google has gone to great lengths to tell us that just disavowing is not enough. I asked John Mueller about this claim (that disavowing would be enough to get a manual penalty removed) in a hangout, and he said that this would not be consistently possible. After speaking with Tim Grice on Twitter, who claims that he has removed a large number of penalties without removing any links, I convinced one of my clients to let me try this. We did a thorough audit and disavowed, filed for reconsideration, and failed. However, Google gave us example links of tough to find links from spam profiles. I did not get a chance to refile with just a simple disavow, as the client was eager to do a proper link removal and be as thorough as possible so we could prove to the webspam team that we were doing our absolute best. We did removals and got the penalty lifted… but who knows, would another disavow have done the trick?
Until Google tells us otherwise, we’ll still go about doing link removals for our clients with manual penalties.
13) Do you think the Toronto Maple Leafs will ever win the Stanley Cup in our lifetime?
Augh. I have been a Leafs fan for 40 years now. How sad is that? I was going to write a beautiful analogy of how the Leafs were like a huge, cumbersome website hit by Panda that keeps showing promise and then getting knocked down again… over and over and over again…
But instead, I think that I’ll answer this question by saying, “No.”
I feel your pain, Marie… I feel your pain…