Some website owners put a lot of stock in their Alexa rank. They hope that improving their Alexa rank will help their SEO. That just isn't the case.
Googlebot can now fully render pages and sites that are built with JavaScript. It is tempting to think that Google can now index all JavaScript sites, but that is not the case. You need to know how search engine crawlers index JavaScript sites because there are quite a few common pitfalls when trying to build a search engine friendly site in JavaScript.
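A minimal sketch of the kind of page that trips crawlers up (file and element names are hypothetical): the HTML body is empty until a script runs, so a crawler that does not execute JavaScript, or that times out before the script fires, sees no content at all.

```html
<!-- index.html: without JavaScript execution, this page has no visible content -->
<!DOCTYPE html>
<html>
  <head><title>Example page</title></head>
  <body>
    <div id="app"></div>
    <script>
      // The content only exists after this runs in a browser.
      // A crawler indexing the raw HTML sees an empty <div>.
      document.getElementById('app').textContent =
        'All of the page content is injected client-side.';
    </script>
  </body>
</html>
```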
If you create a page on your website (like /some-page.html), then it is just one page, even if URL parameters are added (like /some-page.html?foo=bar), right? The myth is that search engines see both of those URLs as a single page.
In reality, search engines have to treat URLs with parameters as if they could be completely separate pages. Some sites rely on URL parameters to show different content (e.g. /show?page=some-page and /show?page=other-page).
It is tempting to think that you can have Google forget everything it knows about your site and start over. However, Google has a very long memory. There is no way to reset everything that Google knows about your site.
Webmasters live in fear of the "duplicate content penalty." The myth is that having two URLs on your site that show the same content is an SEO disaster that will cause the rankings of your entire site to plummet.
I have heard several people say that they think your sitemap controls which pages from your website are on Google. In reality, XML sitemaps have little to do with which pages Google chooses to index. It is very common for Google to index pages that are not included in an XML sitemap.
If you disallow a page in robots.txt, Google may choose to index the page anyway. Google claims to honor robots.txt, so how is that possible?
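As a reminder of what a disallow rule looks like, a minimal robots.txt sketch (the path is hypothetical). A Disallow rule only blocks crawling, not indexing: Google can still list a blocked URL if other pages link to it, because it never fetches the page and therefore never sees any noindex directive on it.

```
# robots.txt: blocks crawling of /private/, but not indexing of its URLs
User-agent: *
Disallow: /private/
```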
Many SEO guides suggest creating XML Sitemaps. They either say or imply that sitemaps are needed to get Google to index your site and get good rankings. XML sitemaps do have some uses for SEO, but:
- XML sitemaps won't influence Google rankings.
- Google rarely chooses to index a URL that it only finds via an XML sitemap.
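For reference, a minimal XML sitemap sketch (the URL is hypothetical); listing a URL here neither boosts its rankings nor guarantees that it gets indexed:

```xml
<?xml version="1.0" encoding="UTF-8"?>
<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">
  <url>
    <loc>https://example.com/some-page.html</loc>
  </url>
</urlset>
```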
In the 1990s when the web was new, under construction notices were popular. Many sites would have them on unfinished pages. Some people are tempted to use them today, but there are some good reasons that their popularity has declined.
It is tempting to freak out when you open Google Search Console and find that it lists errors. They can look like big problems with your site that need to be fixed. However, most errors found in Google Search Console can be ignored.