Answer a question about this article:
Since 1996, they have been archiving cached pages of web sites onto their large cluster of Linux nodes. They revisit sites every few weeks or months and archive a new version if the content has changed. Sites can also be captured on the fly by visitors who are offered a link to do so. The intent is to capture and archive content that otherwise would be lost whenever a site is changed or closed down. Their grand vision is to archive the entire Internet.
When does the Wayback Machine save a copy of a website?
if the content has changed