The New York Times has contributed to what seems to be a running series over the past couple of days about how technology is portrayed in the media. Here’s David E. Sanger and Eric Schmitt (no, not him):
Using “web crawler” software designed to search, index and back up a website, Mr. Snowden “scraped data out of our systems” while he went about his day job, according to a senior intelligence official. “We do not believe this was an individual sitting at a machine and downloading this much material in sequence,” the official said. The process, he added, was “quite automated.” […]
Among the materials prominent in the Snowden files are the agency’s shared “wikis,” databases to which intelligence analysts, operatives and others contributed their knowledge. Some of that material indicates that Mr. Snowden “accessed” the documents. But experts say they may well have been downloaded not by him but by the program acting on his behalf.
My “Spidey senses” tell me that this was “probably wget”. Crack “reporting”.
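For anyone wondering how exotic this “quite automated” process really is: not very. Here’s a toy crawler, a sketch of the same mechanism the story describes — fetch a page, follow its links, save everything — in a couple dozen lines of Python. (Or, if the senior intelligence official is to be believed, one `wget --recursive`. The function names here are mine, not anything from the reporting.)

```python
# Toy recursive crawler: fetch a start page, follow every link,
# keep a copy of each page visited. Standard library only.
import urllib.request
from html.parser import HTMLParser
from urllib.parse import urljoin


class LinkParser(HTMLParser):
    """Collect the href of every <a> tag on a page."""

    def __init__(self):
        super().__init__()
        self.links = []

    def handle_starttag(self, tag, attrs):
        if tag == "a":
            for name, value in attrs:
                if name == "href" and value:
                    self.links.append(value)


def crawl(start, limit=100):
    """Breadth-first fetch of `start` and every page it links to.

    Returns {url: html}. `limit` caps the number of fetches so the
    toy doesn't wander off and mirror the whole intranet.
    """
    seen, queue, pages = set(), [start], {}
    while queue and len(seen) < limit:
        url = queue.pop(0)
        if url in seen:
            continue
        seen.add(url)
        try:
            html = urllib.request.urlopen(url).read().decode("utf-8", "replace")
        except OSError:
            continue  # dead link; a real crawler would log and move on
        pages[url] = html
        parser = LinkParser()
        parser.feed(html)
        # Resolve relative links against the current page before queuing.
        queue += [urljoin(url, href) for href in parser.links]
    return pages
```

That’s the whole trick. No “individual sitting at a machine and downloading this much material in sequence” required.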