How Google Changed the Internet for the Better
In 2011, Google issued the first Panda update to its search algorithm. The goal was to give users a better online experience by ensuring that the highest-quality, most relevant sites would sit at the top of search rankings. Before Panda, low-quality sites could manipulate the algorithm to improve their rankings, often at the expense of higher-quality sites. While the update punished low-quality sites with lower rankings, some of its changes affected higher-quality sites as well.
Duplicate content is content that appears multiple times under different URLs. Google's goal was to ensure that the original, high-quality site would rank higher in searches than the low-quality content farms that may have plagiarized its content. When the same content is available at several URLs, Google tries to display only the most relevant of those pages.
Low-quality sites that plagiarize content are usually disqualified from the top of the search rankings for a variety of reasons. But if your own site hosts duplicate content, no matter how high-quality it is, Google and other search engines must decide which version is most relevant. This can reduce visibility for all of your duplicate pages, since each one may show up in results only part of the time.
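To see how the same content can hide behind different URLs, here is a minimal sketch of finding duplicate pages by fingerprinting their text. The URLs and page bodies are hypothetical, and real duplicate detection is far more sophisticated; this only illustrates the idea of grouping URLs that serve identical content.

```python
import hashlib

def content_fingerprint(body: str) -> str:
    """Hash page text after collapsing case and whitespace, so pages
    with the same content match even if formatting differs slightly."""
    normalized = " ".join(body.lower().split())
    return hashlib.sha256(normalized.encode("utf-8")).hexdigest()

def find_duplicates(pages: dict) -> dict:
    """Return only the fingerprint groups shared by more than one URL."""
    groups = {}
    for url, body in pages.items():
        groups.setdefault(content_fingerprint(body), []).append(url)
    return {fp: urls for fp, urls in groups.items() if len(urls) > 1}

# Hypothetical crawl: the print version serves the same article text.
pages = {
    "https://example.com/article": "How Pandas Changed Search",
    "https://example.com/print/article": "How  pandas changed   search",
    "https://example.com/about": "About our site",
}
```

Running `find_duplicates(pages)` groups the article and its print version together, which is exactly the situation a search engine has to resolve when ranking.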
It’s easy to end up with duplicate content. In fact, an estimated 29% of all content on the Internet is duplicate. Mobile and print versions of pages carry different URLs from the original and are therefore considered duplicates. To combat this and ensure Google knows which page is the original (the one that should be used for search rankings), you can add a canonical tag to the duplicate pages or give them a 301 redirect that points Google to the main page.
User-generated content can be a great thing for your site, but it can also drag down your search rankings if it is low-quality or riddled with grammar and spelling errors. Everything from guest blogs to comments can affect your rankings, so it’s important to vet user-generated content carefully. Comments and guest blogs can be great resources for SEO, so removing them altogether could hurt your rankings; instead, moderating comments and editing guest blogs before publishing helps limit spam in your comment sections and poor-quality content on your site.
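A moderation queue does not have to be elaborate. Here is a minimal sketch of flagging suspicious comments for human review; the blocklist terms and link threshold are made-up heuristics, not a real spam-detection method, and anything flagged would still be judged by a person before publishing or deleting.

```python
import re

# Hypothetical heuristics: comments stuffed with links or containing
# blocklisted sales phrases are held for manual review.
BLOCKLIST = {"free money", "click here", "buy now"}
MAX_LINKS = 2

def needs_review(comment: str) -> bool:
    """Return True if the comment should be held for a human moderator."""
    text = comment.lower()
    link_count = len(re.findall(r"https?://", text))
    if link_count > MAX_LINKS:
        return True
    return any(term in text for term in BLOCKLIST)
```

An ordinary reader comment passes straight through, while link-stuffed or salesy comments wait in the queue, keeping low-quality text off the live page.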
You’d think that having more links to your site across the Internet would be a good thing, but not all backlinks are of equal quality. Links from sites with poor-quality content reflect poorly on your site as well. These include sites designed exclusively to generate links for SEO rather than to produce quality content for users.
There are several ways to get rid of links that could be hurting your Google search ranking: contacting the administrators of the sites in question and asking them to remove links to your page, or disavowing the bad links. Removing too many links can also harm your ranking, so investigate each link carefully to make sure it truly is spam before taking action.
Keep Your Site Panda-Proof
While the primary targets of the Google Panda update were low-quality sites using black-hat SEO tactics, some of the changes in Google’s algorithm affected higher-quality sites as well. If you’ve got high-quality content for your users, you’re already most of the way to Panda-proofing your site. The rest is optimizing your site for SEO using white-hat tactics and making sure your site is not affected by the poor quality of other sites.