
Guide: Basics of Technical SEO


Gustavo-Woltmann

Sitemap.xml file. A good sitemap shows Google how to navigate your website (and how to find all your content!). If your site runs on WordPress, all you have to do is install Yoast SEO or Rank Math, and they’ll create a sitemap for you. Otherwise, you can use an online XML sitemap generation tool.
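For reference, a minimal sitemap.xml looks like this (the URLs and dates are placeholders):

```xml
<?xml version="1.0" encoding="UTF-8"?>
<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">
  <url>
    <loc>https://example.com/</loc>
    <lastmod>2024-01-15</lastmod>
  </url>
  <url>
    <loc>https://example.com/blog/some-post/</loc>
    <lastmod>2024-01-10</lastmod>
  </url>
</urlset>
```

Once it’s live, submit it in Google Search Console and reference it from robots.txt with a `Sitemap:` line so crawlers can find it.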

Proper website architecture. The crawl depth of any page should be lower than 4 (i.e. any given page should be reachable in no more than 3 clicks from the homepage). If pages sit deeper than that, improve your internal linking.

Serve images in next-gen format. Next-gen image formats (JPEG 2000, JPEG XR, and WebP) can be compressed a lot better than JPG or PNG images. Using WordPress? Just use Smush and it’ll do ALL the work for you. Otherwise, you can manually compress all images and re-upload them.
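If you’re handling images by hand rather than through a plugin, the standard way to serve WebP with a fallback is the HTML `<picture>` element (file names here are placeholders):

```html
<picture>
  <!-- Browsers that support WebP load this source -->
  <source srcset="/images/hero.webp" type="image/webp">
  <!-- Older browsers fall back to the JPG -->
  <img src="/images/hero.jpg" alt="Product hero shot" width="1200" height="630">
</picture>
```

Setting explicit `width` and `height` also helps avoid layout shift, which feeds into Google’s page experience metrics.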

Remove duplicate content. Google hates duplicate content and will penalize you for it. If you have any duplicate pages, just merge them (by doing a 301 redirect) or delete one or the other.
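A 301 redirect can be set up at the server level. For example, on Apache via .htaccess (the paths and domain are placeholders):

```apache
# .htaccess — permanently redirect the duplicate URL to the canonical one
Redirect 301 /old-duplicate-page/ https://example.com/canonical-page/
```

If you need to keep both pages live, a `rel="canonical"` link on the duplicate pointing at the preferred version is the softer alternative.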

Update your robots.txt file. Hide the pages you don’t want Google to index (e.g. non-public or unimportant pages). If you’re a SaaS, this would be most of your in-app pages.
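A typical robots.txt for a SaaS might look like this (the paths and domain are placeholders):

```
User-agent: *
Disallow: /app/
Disallow: /account/

Sitemap: https://example.com/sitemap.xml
```

Everything not listed under `Disallow` is crawlable by default, so you only need to list what you want kept out.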

Optimize all your pages per best practice. There are a number of general best practices that Google wants you to follow for your web pages (maintain keyword density, have an adequate # of outbound links, etc.). Install Yoast SEO or Rank Math and use them to optimize all of your web pages.

If you DON’T have any pages that you don’t want to be displayed on Google, you DON’T need robots.txt.

Gustavo Woltmann.
 
The views expressed on this page by users and staff are their own, not those of NamePros.
Remove duplicate content. Google hates duplicate content and will penalize you for it. If you have any duplicate pages, just merge them (by doing a 301 redirect) or delete one or the other.

Actually, they don’t “hate” dupe content, and unless you are blatantly scraping there is no “penalty.” However, it confuses them and gives them the dilemma of which copy to index and display; sometimes they give up and go with something else. It’s not a good thing, obviously, but it isn’t a “penalty.”

Update your 'robots.txt’ file.
Hide the pages you don’t want Google to index (e.g: non-public, or unimportant pages). If you’re a SaaS, this would be most of your in-app pages.

Robots.txt doesn’t prevent content from being indexed, it prevents it from being crawled. Big difference. A noindex directive prevents indexation.
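To act on that correction: the way to keep a page out of the index is a noindex directive, either as a meta tag in the page’s head or as an HTTP response header (both forms below are standard):

```html
<!-- In the page's <head>: -->
<meta name="robots" content="noindex">
```

Or, for non-HTML resources, the equivalent header: `X-Robots-Tag: noindex`. Note that Google has to be able to crawl the page to see the directive, so don’t also block that page in robots.txt, or the noindex will never be read.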
 