Cynicus Rex@lemmy.ml to Privacy@lemmy.ml · English · 10 months ago

How to block AI Crawler Bots using robots.txt file

www.cyberciti.biz

79 points · 63 comments
  • ɐɥO@lemmy.ohaa.xyz · 60 points · 10 months ago

    I disallow a page in my robots.txt and IP-ban everyone who goes there. That's pretty effective.
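    A minimal sketch of the trap, assuming a made-up path that nothing on the site links to:

```
# robots.txt — the path below is an arbitrary, unguessable name
User-agent: *
Disallow: /trap-8f3a9c/
```

    Any client that requests /trap-8f3a9c/ either ignored robots.txt or mined it for hidden paths; either way, the request can only come from a bot, so banning the source IP is low-risk.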

    • JackbyDev@programming.dev · 16 points · 10 months ago

      Did you ban it in your humans.txt too?

      • bountygiver [any]@lemmy.ml · 18 points · 10 months ago (edited)

        Humans typically don’t visit [website]/fdfjsidfjsidojfi43j435345 when there’s no button that links to it.

        • Avatar_of_Self@lemmy.world · 16 points · 10 months ago

          I used to do this on one of my sites that was moderately popular in the 00’s. I had a link hidden via JavaScript, so a user couldn’t click it (unless they disabled JavaScript and clicked it), though it was hidden pretty well for that case too.

          IP hits would be put into a log, and my script would add the /24 subnet of each offender to my firewall. I allowed specific IP ranges for some search engines.

          Anyway, it caught a lot of bots. I really just wanted to stop automated attacks and spambots on the web front.

          I also had a honeypot port that basically did the same thing. If you sent packets to it, your /24 was added to the firewall for a week or so. I think I just used netcat to append to yet another log and wrote a script to add those /24’s to iptables.

          I did it because I had so much bad noise in my logs from spambots; it was pretty crazy.
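          The /24-blocking step might look something like this Python sketch; the function names, the allowlisted search-engine range, and emitting iptables commands (rather than the commenter's actual script) are assumptions:

```python
import ipaddress

def to_slash24(ip: str) -> str:
    """Return the /24 network containing the given IPv4 address."""
    return str(ipaddress.ip_network(f"{ip}/24", strict=False))

def block_commands(offender_ips, allowlist=()):
    """Yield one iptables command per offending /24, skipping allowlisted ranges
    (e.g. a search-engine crawler range you want to keep serving)."""
    allowed = [ipaddress.ip_network(a) for a in allowlist]
    seen = set()
    for ip in offender_ips:
        net = ipaddress.ip_network(f"{ip}/24", strict=False)
        if net in seen or any(net.subnet_of(a) for a in allowed):
            continue
        seen.add(net)
        # A real deployment would run this via subprocess, or use an ipset/nftables set.
        yield f"iptables -A INPUT -s {net} -j DROP"
```

          Banning the whole /24 rather than the single IP trades some collateral damage for catching bot farms that rotate addresses within one subnet.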

          • Mikelius@lemmy.ml · 10 points · 10 months ago

            This thread has provided genius ideas I somehow never thought of, and I’m totally stealing them for my sites lol.

        • JackbyDev@programming.dev · 14 points · 10 months ago (edited)

          I LOVE VISITING FDFJSIDFJSIDOJFI435345 ON HUMAN WEBSITES, IT IS ONE OF MY FAVORITE HUMAN HOBBIES. 🤖👨

    • LazaroFilm@lemmy.world · 9 points · 10 months ago

      Can you explain this more?

      • JackbyDev@programming.dev · 25 points · 10 months ago

        Imagine posting a rule that says “do not walk on the grass” among other rules and then banning anyone who steps on the grass with the thought process that if they didn’t obey that rule they were likely disobeying other rules. Except the grass is somewhere that no one would see unless they actually read the rules. The rules were the only place that mentioned that grass.

        • Anonymouse@lemmy.world · 6 points · 10 months ago

          I like the Van Halen brown M&M version. https://www.smithsonianmag.com/arts-culture/why-did-van-halen-demand-concert-venues-remove-brown-mms-from-the-menu-180982570/

    • Onno (VK6FLAB)@lemmy.radio · 7 points · 10 months ago

      Is the page linked in the site anywhere, or just mentioned in the robots.txt file?

      • ɐɥO@lemmy.ohaa.xyz · 10 points · 10 months ago

        Only in the robots.txt.

        • Onno (VK6FLAB)@lemmy.radio · 8 points · 10 months ago

          Excellent.

          I think I might be able to create a fail2ban rule for that.
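          Such a rule could look like the following sketch; the trap path, jail name, and nginx log path are assumptions:

```ini
# /etc/fail2ban/filter.d/robots-trap.conf
[Definition]
failregex = ^<HOST> .+"(GET|POST|HEAD) /trap-8f3a9c/

# /etc/fail2ban/jail.local
[robots-trap]
enabled  = true
port     = http,https
filter   = robots-trap
logpath  = /var/log/nginx/access.log
maxretry = 1
bantime  = 604800
```

          With maxretry = 1, a single request to the trap path is enough to trigger the ban.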

    • asudox@lemmy.world · 6 points · 10 months ago

      Not sure if that is effective at all. Why would a crawler check the robots.txt if it’s programmed to ignore it anyway?

      • ɐɥO@lemmy.ohaa.xyz · 16 points · 10 months ago

        Because many crawlers seem to explicitly crawl “forbidden” pages.

      • Crashumbc@lemmy.world · 3 points · 10 months ago

        Google and script kiddies copying code…

      • MangoPenguin@lemmy.blahaj.zone · 1 point · 9 months ago

        You could also place the same page as a hidden link on your home page.
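        One way to hide such a link (the path and attributes here are assumptions; the path must match the robots.txt Disallow entry):

```html
<!-- Invisible to humans (not displayed, not focusable, hidden from screen
     readers), but present in the HTML that a naive scraper will parse. -->
<a href="/trap-8f3a9c/" style="display:none" tabindex="-1" aria-hidden="true">.</a>
```

        This catches scrapers that never fetch robots.txt at all but blindly follow every href they find.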

    • Dizzy Devil Ducky@lemm.ee · 4 points · 10 months ago

      I doubt it’d be possible in most ways due to the lack of server control, but I’m definitely gonna have to look this up to see if anything similar could be done on a Neocities site.

    • whyNotSquirrel@sh.itjust.works · 4 points · 10 months ago

      Smart.

    • spookedintownsville@lemmy.world · 2 points · 9 months ago

      Can this be done without fail2ban?

      • ɐɥO@lemmy.ohaa.xyz · 1 point · 9 months ago

        Probably. Never used it, though.

        • spookedintownsville@lemmy.world · 2 points · 9 months ago

          How did you do it? Looking to do this on my own site.

          • ɐɥO@lemmy.ohaa.xyz · 3 points · 9 months ago

            My website’s backend is written in Flask, so it was pretty easy to add.
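            The commenter's Flask code isn't shown; below is a framework-agnostic sketch of the same idea, with made-up names, that could be wired into a Flask before_request hook (using request.remote_addr as the IP):

```python
# Sketch of the trap-and-ban logic. In a real app the banned set would need
# persistence and eviction; here it is just an in-memory set.

TRAP_PATH = "/trap-8f3a9c/"   # must match the Disallow line in robots.txt
banned_ips: set[str] = set()

def check_request(ip: str, path: str) -> int:
    """Return the HTTP status for a request: ban on trap hits, 403 once banned."""
    if ip in banned_ips:
        return 403
    if path == TRAP_PATH:
        banned_ips.add(ip)
        return 403
    return 200
```

            The ban check runs before the trap check, so a banned client gets 403 for every subsequent request, not just the trap page.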

      • MangoPenguin@lemmy.blahaj.zone · 1 point · 9 months ago

        You should be able to do it with CrowdSec.
