In short, using noindex, follow with meta robots is by far the better option. Use this method to noindex the category and tag pages on your blog, so you keep only quality content in the index, avoid duplication, and still preserve the SEO value of your internal links.
First, let me explain what happens if you block Googlebot and most other search engines through robots.txt. Disallowing category and tag pages in robots.txt completely stops Googlebot and other crawlers from accessing those URLs. As a result, none of the links inside those pages can be discovered or followed by crawlers, so you lose a lot of internal linking value that could help your site rank better.
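For reference, a robots.txt rule that blocks these archives looks like the sketch below. The /category/ and /tag/ paths are an assumption based on the default WordPress permalink structure; adjust them to match your own site. Again, this is the approach the article advises against:

```
# robots.txt -- blocking category and tag archives (NOT recommended)
# Paths assume the default WordPress permalink structure.
User-agent: *
Disallow: /category/
Disallow: /tag/
```

Once a URL is disallowed here, crawlers never fetch the page at all, so they cannot see any meta robots tag on it or follow any of its internal links.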
If instead you use meta robots with noindex, follow, Google and most other search engines will not treat those pages as duplicates, which protects you from duplicate, low-quality, and thin content (such as a tag page containing only a single post), because such pages are simply not counted in the index. At the same time, crawlers can still access the pages and follow the links on them, so you do not lose the internal linking of your own site. Internal linking is important to search engines, so why throw it away? That is why you should clearly use noindex, follow with meta robots: it does not completely stop search crawlers from accessing your page. Crawlers can still fetch the page and follow its links, so you keep the SEO benefit of internal linking on your tag and category pages.
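The directive itself is a single tag in the page's head. A minimal sketch, assuming you can edit the template that renders your category and tag archives (in WordPress this is typically handled for you by an SEO plugin such as Yoast):

```
<!-- Placed inside the <head> of category and tag archive pages only -->
<!-- noindex: keep the page out of search results -->
<!-- follow: still crawl and pass value through the links on the page -->
<meta name="robots" content="noindex, follow">
```

Because the page is still crawlable, search engines read this tag, drop the page from their index, and continue following its internal links, which is exactly the behavior robots.txt blocking cannot give you.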