By now you probably feel that ChatGPT is everywhere – in your feed, in your mind… maybe even in your fridge.
However, that doesn’t change the fact that it can be darn useful for this industry.
Joe Hall recently shared an interesting guide for making some of your technical SEO processes more efficient, and it’s kind of mind-blowing.
So let’s get technical…
Create static XML sitemaps – It’s rare, but sometimes you encounter a bunch of static web pages managed by a custom content management system (CMS) that doesn’t auto-generate sitemaps.
If that happens, you may have to roll up your sleeves and create sitemaps manually.
Or, instead of writing one by hand, you can try the following ChatGPT prompt:
Create a valid XML sitemap that includes the following URLs: [list URLs]
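To give you an idea of what comes back, a minimal valid sitemap looks something like this (the example.com URLs below are just placeholders standing in for your actual pages):

```xml
<?xml version="1.0" encoding="UTF-8"?>
<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">
  <!-- One <url> entry per page; <loc> is the only required child element -->
  <url>
    <loc>https://www.example.com/</loc>
  </url>
  <url>
    <loc>https://www.example.com/about/</loc>
  </url>
</urlset>
```

Save the result as sitemap.xml at the root of your site and submit it in Google Search Console, and you’re done.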
Create JSON-LD schema – With AI, you don’t need to rely on unreliable automated JSON-LD generators to create this type of schema markup.
Instead, just type this: Generate JSON-LD schema for a local [business type] named [business name] at the address [address] with the phone number [phone number]
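The response should be something along these lines (the business details below are made up purely for illustration):

```json
{
  "@context": "https://schema.org",
  "@type": "LocalBusiness",
  "name": "Example Coffee Shop",
  "address": {
    "@type": "PostalAddress",
    "streetAddress": "123 Main Street",
    "addressLocality": "Springfield",
    "addressRegion": "IL",
    "postalCode": "62701"
  },
  "telephone": "+1-555-0199"
}
```

Drop it into your page inside a script tag with type="application/ld+json" and run it through a validator like Google’s Rich Results Test before publishing.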
Create crawl directives for robots.txt – Normally, you don’t have to edit the robots.txt file. But on the rare occasion that you do, you’ll have to be nitpicky.
For instance, what would you do to prevent crawlers from accessing a certain directory while still letting them crawl a subdirectory inside that blocked directory?
Well, just try this prompt: Create a robots.txt file that blocks Bing [or crawlers from any other entity] from crawling the /assets/ directory, but allows crawling the /assets/javascript/ directory.
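In that case, the output should boil down to a few lines like these (Bingbot is Bing’s crawler; swap in whichever user agent you’re targeting):

```
User-agent: Bingbot
Disallow: /assets/
Allow: /assets/javascript/
```

Major crawlers, including Bingbot and Googlebot, follow the more specific rule, so /assets/javascript/ stays crawlable while the rest of /assets/ is blocked.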
Awesome, right? If you want to see the output examples, as well as some more tips, check out the entire article. You may get some ideas of your own!