Decoder with Nilay Patel
How the Wayback Machine is fighting linkrot
Nilay Patel
That's made a lot of people suddenly aware of something called robots.txt, the file that dictates which web pages third-party crawlers and other automated tools are allowed to visit on a website. Lots of websites are now making changes to block these scrapers, which has called into question one of the oldest and most widely used practices on the open web, one that's vital for preservation.
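For illustration, a minimal robots.txt is just a list of rules matched against a crawler's user-agent string. A site that wanted to block an AI scraper while still allowing archival crawling might publish something like the sketch below (the user-agent names are examples, not a recommendation, and compliance is voluntary on the crawler's part):

User-agent: GPTBot
Disallow: /

User-agent: ia_archiver
Allow: /

Nothing technically enforces these rules; robots.txt is an honor-system convention, which is exactly why its role in web preservation is now in question.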