
Did you just put pressure on archive.org to take down the link? The archive.org link is no longer working and it says "This URL has been excluded from the Wayback Machine." https://web.archive.org/web/20201104050026if_/https://github...


You don't have to pressure them to remove a page. As I remember it, all you needed to do was add a line to your robots.txt to have a page excluded, and you could also just request that a page you own be excluded.
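For reference, the rule looked roughly like this (I'm going from memory; ia_archiver is the Internet Archive's crawler user agent, and the Wayback Machine's handling of robots.txt has reportedly changed since then):

    User-agent: ia_archiver
    Disallow: /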


I know about that because I use robots.txt on my personal website to exclude pages, but how do you automatically exclude links that were already archived?



