There are a lot of misconceptions about SEO and web development. None of the following are true, but I regularly run into people who believe them.
- An XML sitemap is needed for SEO; it helps rankings.
- Google will only index the pages listed in your XML sitemap.
- You can reset your SEO and have Googlebot do a fresh crawl to start over.
- Googlebot crawls your entire site periodically.
- It is a problem if Google doesn't index every page of a site.
- Every error listed in Google Search Console needs to be fixed.
- Now that Googlebot renders JavaScript, all Angular and React sites get crawled.
- Structured data helps search engine rankings.
- Disallowing pages in robots.txt prevents them from getting indexed.
- Duplicating content on different internal URLs could cause Google to penalize your site.
- Site search can power SEO-friendly pages.
- Adding URL parameters does not create new pages for search engines.
- You need to change your DNS NS records to point to your web host.
- Improving stats in Google Analytics without changing the site will help SEO.
- Google's own websites are a model example for SEO practices.
- nofollow can be used on internal links to control link juice.
- Alexa rank can be used to judge which sites have good SEO.
- Google uses the number of visitors to a site as a ranking factor.
- If an SEO tool finds a problem with my site, it absolutely has to be fixed.
- I should use "under construction" and "coming soon" notices.
- SEO is all about implementing meta tags.
- Free web hosting is a good deal.
- If other sites are doing it without getting penalized by Google, I can do it too.
- "SEO friendly" URLs are necessary for search engine rankings