SEO Myths

There are a lot of misconceptions about SEO and web development. None of the following are true, but I regularly run into people who believe them.

  • An XML sitemap is needed for SEO; it helps rankings.
  • Google will only index the pages listed in your XML sitemap.
  • You can reset your SEO and have Googlebot do a fresh crawl to start over.
  • Googlebot crawls your entire site periodically.
  • It is a problem if Google doesn't index every page of a site.
  • Every error listed in Google Search Console needs to be fixed.
  • Now that Googlebot renders JavaScript, all Angular.js and React sites get crawled.
  • Structured data helps search engine rankings.
  • Disallowing pages in robots.txt prevents them from getting indexed. (See the sketch after this list.)
  • Duplicating content between different pages on your site could cause Google to penalize your site.
  • I can create SEO-friendly URLs for site search pages and link to them to get them ranked.
  • Adding URL parameters does not create new pages for search engines.
  • You need to change your DNS NS records to point to your web host.
  • Improving stats in Google Analytics without changing the site will help SEO.
  • I should look at Google's websites to get an idea of what to do for SEO.
  • nofollow can be used on internal links to control link juice.
  • Alexa rank can be used to judge which sites have good SEO.
  • Google uses the number of visitors to a site as a ranking factor.
  • If an SEO tool finds a problem with my site, it absolutely has to be fixed.
  • I should use "under construction" and "coming soon" notices.
  • Doing SEO is all about implementing meta tags.
  • Free web hosting is a good deal.
  • If other sites are doing it without getting penalized by Google, I can do it too.

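The robots.txt myth is worth a concrete illustration. Disallow only tells compliant crawlers not to fetch a page; it does not remove the URL from Google's index, and a disallowed URL can still appear in search results if other pages link to it. Here is a minimal sketch using Python's standard library and a made-up robots.txt for a hypothetical example.com; keeping a page out of the index actually requires a noindex meta tag or X-Robots-Tag header on a page that Googlebot is allowed to crawl.

from urllib import robotparser

# Hypothetical robots.txt for an example site: /private/ is disallowed.
robots_txt = """\
User-agent: *
Disallow: /private/
"""

parser = robotparser.RobotFileParser()
parser.parse(robots_txt.splitlines())

# Disallow only blocks crawling: a well-behaved bot won't fetch the page,
# but the bare URL can still be indexed if other pages link to it.
print(parser.can_fetch("Googlebot", "https://example.com/private/report.html"))  # False

# To keep a page out of the index, it must be crawlable AND carry a noindex
# signal, e.g. <meta name="robots" content="noindex"> or an X-Robots-Tag header.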