If you run a crawler against your own site, it will generally crawl all your pages and then give you a report. It is tempting to think that Googlebot works the same way, but it doesn't. Googlebot doesn't crawl your entire site, wait for a while, and then come back and crawl your entire site again.
Some website owners put a lot of stock in their Alexa rank. They hope that improving their Alexa rank will help their SEO. That just isn't the case.
Googlebot can now fully render pages and sites that are built with JavaScript. It is tempting to think that Google can now index all JavaScript sites, but that is not the case. You need to know how search engine crawlers index JavaScript sites because there are quite a few common pitfalls in building a search-engine-friendly site with JavaScript.
If you create a page on your website (like /some-page.html), then it is just one page, even if URL parameters are added (like /some-page.html?foo=bar), right? The myth is that search engines see both of those URLs as a single page.
In reality, search engines have to treat URLs with parameters as if they could be completely separate pages. Some sites rely on URL parameters to show different content (e.g. /show?page=some-page and /show?page=other-page).
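When the parameters don't actually change the content, you can tell search engines which URL you prefer. A minimal sketch: putting a canonical link element in the head of /some-page.html?foo=bar consolidates it to the parameter-free URL (example.com is a placeholder):

<link rel="canonical" href="https://example.com/some-page.html">

Google treats the canonical link element as a strong hint rather than a directive, but in practice it usually consolidates the duplicates.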
It is tempting to think that you can have Google forget everything it knows about your site and start over. However, Google has a very long memory. There is no way to reset everything that Google knows about your site.
Webmasters live in fear of the "duplicate content penalty." The myth is that having two URLs on your site that show the same content is an SEO disaster that will cause the rankings of your entire site to plummet.
I have heard several people say that they think your sitemap controls which pages from your website are on Google. In reality, XML sitemaps have little to do with which pages Google chooses to index. It is very common for Google to index pages that are not included in an XML sitemap.
If you disallow a page in robots.txt, Google may choose to index the page anyway. Google claims to honor robots.txt, so how is that possible?
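The short answer: robots.txt controls crawling, not indexing. Take a rule like this (the path is a made-up example):

User-agent: *
Disallow: /private-page.html

Googlebot will not fetch the page, but if other sites link to that URL, Google can still index the bare URL without its content. Keeping a page out of the index requires a noindex robots meta tag, which only works if the page is crawlable in the first place.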
Many SEO guides suggest creating XML sitemaps. They either say or imply that sitemaps are needed to get Google to index your site and get good rankings. XML sitemaps do have some uses for SEO (a minimal example follows this list), but:
- XML sitemaps won't influence Google rankings.
- Google rarely chooses to index a URL that it only finds via an XML sitemap.
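For reference, a minimal sitemap is tiny; the URL and date here are placeholders:

<?xml version="1.0" encoding="UTF-8"?>
<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">
  <url>
    <loc>https://example.com/some-page.html</loc>
    <lastmod>2024-01-01</lastmod>
  </url>
</urlset>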
I generate a Let's Encrypt wildcard certificate for my domain name. On my home LAN, each device has a subdomain (device.example.com). When those devices have a web interface for administration, I enable HTTPS and give the device the Let's Encrypt certificate to use. This allows me to administer the devices without seeing scary security warnings from my browser.
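As a sketch of issuing the certificate, assuming certbot with a manual DNS-01 challenge (wildcard certificates require DNS validation; the actual tooling may differ):

$ certbot certonly --manual --preferred-challenges dns -d "example.com" -d "*.example.com"

Certbot then asks you to publish a TXT record at _acme-challenge.example.com to prove you control the domain.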
I installed FreshTomato on my router. Getting the certificate onto it is pretty easy if you use the script I wrote.
I wanted a music player with the following features:
- Inexpensive (less than $40)
- Low power (1 watt)
- Plugs into an audio system (no batteries, not a portable player)
- Runs 24/7
- Automatically starts playing again after losing power
- Expandable storage
- Web interface
I built it on a Raspberry Pi Zero W. It is a full computer that is only 2 inches long.
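Here is a sketch of the trickiest requirement above, resuming playback after a power loss. It assumes mpd (Music Player Daemon) as the player and mpc as its command-line client; both the player choice and the unit name are assumptions, not necessarily what I used:

# /etc/systemd/system/resume-playback.service (hypothetical unit)
[Unit]
Description=Resume music playback at boot
After=mpd.service network.target
Requires=mpd.service

[Service]
Type=oneshot
ExecStart=/usr/bin/mpc play

[Install]
WantedBy=multi-user.target

Enable it once with systemctl enable resume-playback.service and the Pi starts playing again every time power returns.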
I recently purchased a Honeywell Timer Switch from Amazon. The instructions that came with it for replacing an existing 3-way switch don't work. Here is how to install it in place of an existing 3-way switch.
It is often desirable to send an event to Google Analytics to track when somebody clicks on a link or submits a form. This is not easy to implement properly.
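The core difficulty is that clicking a link navigates away before the event reaches Google. A sketch assuming gtag.js (other Analytics setups differ); the trackable class name is made up:

document.querySelectorAll("a.trackable").forEach(function (link) {
  link.addEventListener("click", function () {
    // 'beacon' uses navigator.sendBeacon, so the hit survives
    // the page unload triggered by following the link
    gtag("event", "outbound_click", {
      event_label: link.href,
      transport_type: "beacon",
    });
  });
});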
Here is a simple Perl program that outputs a capital gamma character (Ɣ): print "\x{194}\n". When you run it, you will probably get a "Wide character in print" warning from Perl.
$ perl -e 'print "\x{194}\n"'
Wide character in print at -e line 1.
Ɣ
You can solve this using several different methods.
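One method, as a sketch: add an encoding layer to STDOUT with binmode, so Perl knows to encode wide characters as UTF-8 instead of warning:

$ perl -e 'binmode STDOUT, ":encoding(UTF-8)"; print "\x{194}\n"'
Ɣ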
I use external drives to back up everything on my Synology NAS. My data easily fits on a single large drive, so:
- I put two large drives in my NAS and use RAID 1 (mirrored disks)
- The entire backup (or even multiple backups) can fit on an external drive.