As a web developer by day and author by night, I’m always amazed when authors and bloggers are afraid to repost guest posts they’ve written for other people’s sites onto their own site (or vice versa). A page on Google’s site about this kind of reused content specifically uses the term “content scraping”. What is it, and how do you avoid it?
Content scraping generally involves some sort of automated process that finds content on the internet and copies it to your site. It can also describe an aggregator, where all of a site’s content is compiled from other sites on the internet without any unique content of your own added.
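Under the hood, a scraper is nothing exotic: a script fetches someone else’s page and pulls the article text out of the markup so it can be republished. Here’s a minimal sketch using only Python’s standard library (the function names and sample HTML are my own invention, just for illustration):

```python
# A minimal sketch of what a content scraper does: parse a page's
# HTML and collect the text inside <p> tags -- the "content" that
# gets copied. Hypothetical example, not any real scraper's code.
from html.parser import HTMLParser

class ParagraphExtractor(HTMLParser):
    """Collects the text found inside <p> tags."""
    def __init__(self):
        super().__init__()
        self.in_paragraph = False
        self.paragraphs = []

    def handle_starttag(self, tag, attrs):
        if tag == "p":
            self.in_paragraph = True
            self.paragraphs.append("")

    def handle_endtag(self, tag):
        if tag == "p":
            self.in_paragraph = False

    def handle_data(self, data):
        if self.in_paragraph:
            # Append text that appears while we're inside a <p> tag.
            self.paragraphs[-1] += data

def scrape_paragraphs(html):
    """Return the non-empty paragraph texts from an HTML string."""
    parser = ParagraphExtractor()
    parser.feed(html)
    return [p.strip() for p in parser.paragraphs if p.strip()]

# In a real scraper the HTML would be fetched from someone else's
# site; here we use a hard-coded sample page instead.
sample_page = """
<html><body>
  <h1>Someone Else's Blog Post</h1>
  <p>First paragraph of the original article.</p>
  <p>Second paragraph of the original article.</p>
</body></html>
"""
print(scrape_paragraphs(sample_page))
```

Point a loop like this at every post on a site and you have an aggregator: a site built entirely from text lifted off other people’s pages.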
So, if you are a blogger who writes mostly unique content, or an author who writes guest posts and reposts them on your own website, chances are you don’t have to fear damaging your search ranking. Reposting an article on one or two sites is generally not enough to harm either site’s ranking.
But, if you’re really paranoid, just change a few sentences as is advised in this blog post.
The main point in understanding what unique content means from Google’s perspective is this: don’t make all of the content on your site a word-for-word copy of other sources. If you use aggregators, you must add unique content of your own or change the aggregated content in some small way. If you don’t know what an aggregator is, you’re probably safe.