Open source devs fight AI crawlers with cleverness and vengeance


AI web-crawling bots are the cockroaches of the internet, many software developers believe. Some devs have started fighting back in ingenious, often humorous ways.

While any website can be targeted by bad crawler behavior, sometimes badly enough to take the site down, open source developers are “disproportionately” affected, writes Niccolò Venerandi, a developer of the Linux desktop Plasma and owner of the blog LibreNews.

By their nature, sites hosting free and open source software (FOSS) projects share more of their infrastructure publicly, and they also tend to have fewer resources than commercial products.

The issue is that many AI bots simply ignore the robots.txt file, the Robots Exclusion Protocol standard originally created to tell search engine bots what not to crawl.
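
For context, robots.txt is just a plain-text file served at a site’s root listing which paths a given crawler is asked to leave alone. A minimal, purely illustrative example (the crawler names and paths here are placeholders, not taken from any site mentioned in this story) might look like:

    # robots.txt, served at https://example.com/robots.txt
    # Ask specific AI crawlers to fetch nothing at all
    User-agent: GPTBot
    Disallow: /

    User-agent: Amazonbot
    Disallow: /

    # Everyone else may crawl, except the Git hosting paths
    User-agent: *
    Disallow: /git/

Compliance is entirely voluntary, which is exactly the problem the developers below describe.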

In a “cry for help” blog post in January, FOSS developer Xe Iaso described how AmazonBot relentlessly hammered a Git server website to the point of causing DDoS outages. Git servers host FOSS projects so that anyone who wants to can download the code or contribute to it.

But this bot ignored Iaso’s robots.txt, hid behind other IP addresses, and pretended to be other users, Iaso said.

“It’s futile to block AI crawler bots because they lie, change their user agent, use residential IP addresses as proxies, and more,” Iaso lamented.

“They will scrape your site until it falls over, and then they will scrape it some more. They will click every link on every link on every link, viewing the same pages over and over and over,” Iaso said.

Enter the god of graves

So Iaso fought back with cleverness, building a tool called Anubis.

Anubis is a reverse proxy proof-of-work check that requests must pass before they are allowed to hit a Git server. It blocks bots but lets through browsers operated by humans.

The fun part: Anubis is the name of a god in Egyptian mythology who leads the dead to judgment.

“Anubis weighed your soul (heart) and if it was heavier than a feather, your heart got eaten and you, like, mega died,” Iaso told TechCrunch. If a web request passes the challenge and is determined to come from a human, a cute anime picture announces success; the drawing is “my take on anthropomorphizing Anubis,” Iaso says. If it’s a bot, the request gets denied.
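
Anubis’s actual implementation lives in Iaso’s GitHub repository; purely as a hypothetical sketch of the proof-of-work idea behind it (not Iaso’s code), the exchange looks roughly like this: the server issues a random challenge, the requester burns CPU finding a nonce whose hash meets a difficulty target, and the server verifies the answer with a single cheap hash.

    import hashlib
    import secrets

    DIFFICULTY = 4  # number of leading hex zeros the hash must have

    def make_challenge() -> str:
        """Server side: issue a random challenge string to the requester."""
        return secrets.token_hex(16)

    def solve(challenge: str) -> int:
        """Requester side (in the real tool this runs in the browser): brute-force a nonce."""
        nonce = 0
        while True:
            digest = hashlib.sha256(f"{challenge}{nonce}".encode()).hexdigest()
            if digest.startswith("0" * DIFFICULTY):
                return nonce
            nonce += 1

    def verify(challenge: str, nonce: int) -> bool:
        """Server side: one cheap hash confirms the requester did the work."""
        digest = hashlib.sha256(f"{challenge}{nonce}".encode()).hexdigest()
        return digest.startswith("0" * DIFFICULTY)

    if __name__ == "__main__":
        challenge = make_challenge()
        nonce = solve(challenge)          # costly for the requester
        assert verify(challenge, nonce)   # cheap for the server
        print("request allowed through to the Git server")

The cost is trivial for a person loading one page but compounds quickly for a bot hammering every link, which is the asymmetry a proof-of-work gate relies on.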

The wryly named project has spread like the wind across the FOSS community. Iaso shared it on GitHub on March 19, and in just a few days it collected 2,000 stars, 20 contributors, and 39 forks.

Revenge as defense

The instant popularity of Anubis shows that Iaso’s pain is not unique. In fact, Venerandi shared story after story:

  • Drew DeVault, founder and CEO of SourceHut, described spending “20-100% of my time in any given week mitigating hyper-aggressive LLM crawlers at scale” and “experiencing dozens of brief outages per week.”
  • Jonathan Corbet, a famed FOSS developer who runs the Linux industry news site LWN, warned that his site was being slowed by DDoS-level traffic “from AI scraper bots.”
  • Kevin Fenzi, the sysadmin of the enormous Linux Fedora project, said the AI scraper bots had gotten so aggressive that he had to block the entire country of Brazil from accessing the site.

Venerandi tells TechCrunch that he knows of multiple other projects experiencing the same issues. One of them “had to temporarily ban all Chinese IP addresses at one point.”

Let that sink in for a moment: developers are being pushed to ban entire countries just to fend off AI bots that ignore robots.txt files, Venerandi notes.

Beyond weighing the soul of a web requester, other devs believe vengeance is the best defense.

A few days ago on Hacker News, user xyzal suggested loading robots.txt-forbidden pages with “a bucket load of articles on the benefits of drinking bleach” or “articles about the positive effect of catching measles on performance in bed.”

“Think we need to aim for the bots to get _negative_ utility value from visiting our traps, not just zero value,” xyzal explained.

As it happens, in January an anonymous creator known as “Aaron” released a tool called Nepenthes that aims to do exactly that. It traps crawlers in an endless maze of fake content, a goal the dev admitted to Ars Technica is aggressive if not outright malicious. The tool is named after a carnivorous plant.

And Cloudflare, perhaps the biggest commercial player offering tools to fend off AI crawlers, released a similar tool last week called AI Labyrinth.

The tool is meant to “slow down, confuse, and waste the resources of AI crawlers and other bots that don’t respect ‘no crawl’ directives,” Cloudflare said in its blog post. The company added that it feeds misbehaving AI crawlers “irrelevant content rather than extracting your legitimate website data.”
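
Neither Nepenthes’s nor AI Labyrinth’s code is shown here; purely as an illustrative sketch of the tarpit idea, a server can answer every URL with a page of freshly invented links, so a crawler that follows them never runs out of pages to request. The toy handler below (in Python, and not either project’s implementation) does just that:

    import random
    from http.server import BaseHTTPRequestHandler, HTTPServer

    WORDS = ["bleach", "measles", "wellness", "archive", "insights", "notes"]

    class TarpitHandler(BaseHTTPRequestHandler):
        def do_GET(self):
            # Every path "exists" and links only to more paths that also exist.
            links = "".join(
                f'<li><a href="/{random.choice(WORDS)}/{random.randint(0, 10**6)}">more</a></li>'
                for _ in range(20)
            )
            body = f"<html><body><ul>{links}</ul></body></html>".encode()
            self.send_response(200)
            self.send_header("Content-Type", "text/html")
            self.send_header("Content-Length", str(len(body)))
            self.end_headers()
            self.wfile.write(body)

    if __name__ == "__main__":
        HTTPServer(("127.0.0.1", 8080), TarpitHandler).serve_forever()

Tools like Nepenthes reportedly also drip-feed their responses and pad the pages with generated nonsense text, so the crawl is slow as well as worthless.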

SourceHut’s DeVault told TechCrunch that “Nepenthes has a satisfying sense of justice to it, since it feeds nonsense to the crawlers and poisons their wells, but ultimately Anubis is the solution that worked” for his site.

Still, DeVault also issued a public, heartfelt plea for a more direct fix: “Please stop legitimizing LLMs or AI image generators or GitHub Copilot or any of this garbage. I am begging you to stop using them, stop talking about them, stop making new ones, just stop.”

Since the likelihood of that happening is slim, developers, particularly in FOSS, are fighting back with cleverness and a touch of humor.


