
What Are the Consequences of Duplicate Content?

Hello expert friends,
If I use duplicate content on my site, what results will I get?
Please tell me!

SEO

2 Comment(s)

@jay_rogers Jan 07, 2016 — Yes, if you are a site owner you may suffer ranking and traffic losses, and search engines may return less relevant results. Search engines don't know which version(s) to rank for query results, and they don't know whether to direct the link metrics to one page or keep them split between multiple versions.
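The standard way to tell search engines which version should receive the consolidated link metrics is a canonical tag. A minimal markup sketch (the domain and path are hypothetical): each duplicate or parameterised URL declares the preferred version in its `<head>`.

```html
<!-- Placed in the <head> of every duplicate/parameterised URL,
     e.g. /shoes/?ref=home, /shoes/?sort=price.
     example.com is a hypothetical domain used for illustration. -->
<link rel="canonical" href="https://www.example.com/shoes/" />
```

With this in place, search engines are hinted to rank only the canonical URL and to attribute links pointing at the duplicates to it.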
@richardstevens Jan 10, 2016 — To understand why duplicate content is a problem, you need to see it from the search engine's point of view. Search engines need to crawl, analyze and index pages, evaluate the reputation of each page, and search their index fast enough to return results to users. Having lots of duplicate content on a website is bad for search engines, since they waste their resources on pages that usually have no significant value for users.

Matt Cutts, a well-known Google employee, mentioned in one Google Webmaster Help video that crawling a large part of the web requires a relatively small number of machines (more than 25, fewer than 1000). This means that crawling a website requires a relatively small amount of resources. Nevertheless, analyzing pages, evaluating links, and indexing is a much more time-consuming process. Those of you who have coded web spiders in the past know that the analysis requires a lot of CPU and memory compared to the web requests themselves. This is due to the complexity of the algorithms used in text analysis.
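To make the spider-side cost concrete, here is a toy sketch of exact-duplicate detection by hashing normalized page text. This is an illustration only: the URLs and page bodies are made up, and real engines use far more sophisticated near-duplicate techniques (e.g. shingling), not plain hashes.

```python
import hashlib

def normalize(text: str) -> str:
    # Crude normalization: lowercase and collapse whitespace so trivially
    # different copies of the same page produce the same fingerprint.
    return " ".join(text.lower().split())

def fingerprint(text: str) -> str:
    # Hash the normalized text; identical content -> identical digest.
    return hashlib.md5(normalize(text).encode("utf-8")).hexdigest()

# Hypothetical crawl results: two URLs serve the same content.
pages = {
    "/shoes": "Red Shoes  on sale",
    "/shoes?ref=home": "red shoes on SALE",   # duplicate under another URL
    "/hats": "Blue hats",
}

seen = {}        # fingerprint -> first URL seen with that content
duplicates = []  # (duplicate URL, original URL) pairs
for url, body in pages.items():
    fp = fingerprint(body)
    if fp in seen:
        duplicates.append((url, seen[fp]))
    else:
        seen[fp] = url

print(duplicates)  # → [('/shoes?ref=home', '/shoes')]
```

Every duplicate found this way is a page the engine crawled, analyzed, and hashed for nothing, which is exactly the wasted effort described above.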

Clearly, duplicate content is a problem for search engine users because it affects the quality of the search results. But why is this a problem for webmasters? Well, since this problem requires additional resources that cost the search engine companies money, they try to push webmasters and SEOs to help them solve it. And the cheapest way to solve it is to motivate webmasters to eliminate their duplicate pages.