The user agent filter list in the Apache reverse proxy for my Mastodon instance is starting to get slightly unwieldy...

Currently looks like this:

RewriteCond %{HTTP_USER_AGENT} "(\(GabSocial\/|centurybot9@gmail\.com|spider@seoscanners\.net|spider@seocompany\.store|;\ DotBot\/|SemrushBot\/|AhrefsBot\/|SEOkicks;|Baiduspider\/|fasthttp|python-requests\/2\.|;\ Bytespider|MJ12bot\/|SeznamBot\/|Scrapy\/)" [NC]
RewriteRule .* - [F]
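One way to keep this readable as it grows (a sketch, untested on this instance): split the single long regex into one `RewriteCond` per pattern, chained with `[OR]`, so entries can be added or removed without editing a giant alternation. The patterns below are just a few taken from the list above:

```apache
# One condition per blocked agent, case-insensitive, OR-chained.
# The last condition omits [OR] so the chain terminates correctly.
RewriteCond %{HTTP_USER_AGENT} "GabSocial/"  [NC,OR]
RewriteCond %{HTTP_USER_AGENT} "SemrushBot/" [NC,OR]
RewriteCond %{HTTP_USER_AGENT} "AhrefsBot/"  [NC,OR]
RewriteCond %{HTTP_USER_AGENT} "Bytespider"  [NC]
RewriteRule .* - [F]
```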



...should probably just add all the other known search engine spiders too.
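If the list is going to grow to cover every search engine spider, `mod_setenvif` plus a `Require` block may scale better than mod_rewrite: one line per bot, no regex chaining. A sketch, assuming Apache 2.4 with `mod_setenvif` loaded (the specific agents below are examples, not the actual list):

```apache
# Tag requests from known spiders, then deny anything tagged.
SetEnvIfNoCase User-Agent "Googlebot"   bad_bot
SetEnvIfNoCase User-Agent "bingbot"     bad_bot
SetEnvIfNoCase User-Agent "Baiduspider" bad_bot

<Location "/">
    <RequireAll>
        Require all granted
        Require not env bad_bot
    </RequireAll>
</Location>
```

This also keeps the bot policy separate from any other rewrite rules the proxy needs.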
I did start off with the notion of an actual "microblog": something public by default, where I assumed I'd write posts I'd want to be discoverable.
I've not been so sure about that for quite some time now.
