Currently we allow AI bots to crawl your sites by default. The main reason is that many people now use AI instead of traditional search to get answers, and AI can't cite your site(s) as an authority if it is blocked, which could reduce traffic to your site(s).
I'm wondering, though, if we should switch our default from opt-out to opt-in.
What do y’all think, should we stick with allowing AI bots by default and block them on request, or block AI bots by default and allow them on request?
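For context, "blocking AI bots" under either default typically comes down to serving a robots.txt with disallow rules for known AI crawler user agents. A minimal sketch might look like the following; the user agents shown (GPTBot, ClaudeBot, Google-Extended) are real published crawler names, but the exact list any host blocks is an assumption here and would need to be kept up to date:

```
# robots.txt — illustrative, not an exhaustive list of AI crawlers

# OpenAI's crawler
User-agent: GPTBot
Disallow: /

# Anthropic's crawler
User-agent: ClaudeBot
Disallow: /

# Google's AI-training opt-out token
User-agent: Google-Extended
Disallow: /

# Everything else may crawl normally
User-agent: *
Allow: /
```

Worth noting that robots.txt is advisory: well-behaved crawlers honor it, but it isn't enforcement, so some hosts pair it with user-agent or IP-based blocking at the server level.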
I believe it should be open by default. If you're selling a product or service, you 100% want AI bots to know about you. If customers were unaware that AI bots were blocked on their site, it could really screw up their business's potential.
No. I've noticed that AI hasn't done well at summarizing my site on search engines (it often mixes up my site with completely unrelated ones). I'm not sold on the idea, given how it digs up info on my projects and other people's work.
AI is the new search engine; block it to your own detriment. If you aren't performing well in AI summaries, I'd suggest investigating what the bots want. I'm not going to lay it all out, but it's the same story as always: a brochure site isn't going to cut it anymore.