
Decoder with Nilay Patel

How the Wayback Machine is fighting linkrot

2018.286 - 2038.418 Nilay Patel

That's made a lot of people suddenly aware of something called robots.txt, the file that tells third-party crawlers and other automated tools which pages on a website they're allowed to visit. Lots of websites are now making changes to block these scrapers, which has called into question one of the oldest and most widely used practices on the open web, one that's vital for preservation.
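As an aside for readers unfamiliar with the mechanics: robots.txt is a plain-text file served at a site's root, listing per-crawler allow and disallow rules that well-behaved bots check before fetching pages. A minimal sketch of how a polite crawler reads one, using Python's standard-library `urllib.robotparser` (the bot names and URL here are hypothetical, for illustration only):

```python
from urllib.robotparser import RobotFileParser

# A hypothetical robots.txt that blocks one scraper while
# leaving the rest of the site open to other crawlers.
robots_txt = """\
User-agent: ExampleAIBot
Disallow: /

User-agent: *
Allow: /
"""

parser = RobotFileParser()
parser.parse(robots_txt.splitlines())

# A crawler checks its own user-agent string against the rules
# before deciding whether to fetch a page.
print(parser.can_fetch("ExampleAIBot", "https://example.com/article"))  # False
print(parser.can_fetch("SomeOtherBot", "https://example.com/article"))  # True
```

Note that compliance is entirely voluntary: robots.txt is a convention, not an enforcement mechanism, which is why the practice Patel describes depends on crawlers choosing to honor it.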
