Snowden used common web crawler tool to collect NSA files
The famous whistleblower Edward Snowden used inexpensive and freely available software to gain access to at least 1.7 million secret files, The New York Times reported, quoting senior intelligence officials investigating the breach.
The collection process was “quite automated,” a senior intelligence official revealed: Snowden used “web crawler” software to “search, index and back up” files. The program simply kept running while Snowden went about his daily routine.
“We do not believe this was an individual sitting at a machine and downloading this much material in sequence,” the official said.
Investigators concluded that Snowden’s attack was not highly sophisticated and should have been easily detected by special monitors. A web crawler can be programmed to move from website to website via the links embedded in each document, copying everything it comes across.
According to the report, the whistleblower configured the crawler appropriately, specifying which subjects to search for and how far to follow the links. In the end, Snowden was able to access 1.7 million files, including documents on internal NSA networks and internal “wiki” materials used by analysts to share information across the world.
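The mechanics the report describes — following embedded links from page to page, filtering by subject, and stopping after a configured number of hops — amount to a basic breadth-first crawl. A minimal sketch in Python, using an in-memory link graph instead of real network requests (all names and data here are illustrative assumptions, not the actual tool Snowden used):

```python
from collections import deque

def crawl(pages, start, keywords, max_depth):
    """Breadth-first crawl of a link graph (illustrative only).

    pages maps a page name to (text, [names of linked pages]).
    Links are followed up to max_depth hops from the start page,
    and any page whose text mentions one of the keywords is
    "backed up" into the returned dict.
    """
    copied = {}
    seen = {start}
    queue = deque([(start, 0)])
    while queue:
        page, depth = queue.popleft()
        text, links = pages.get(page, ("", []))
        if any(k in text for k in keywords):
            copied[page] = text          # copy the matching document
        if depth < max_depth:            # stop at the configured depth
            for link in links:
                if link not in seen:     # never visit a page twice
                    seen.add(link)
                    queue.append((link, depth + 1))
    return copied
```

The two knobs the report mentions — which subjects to look for and how far to follow links — correspond to `keywords` and `max_depth` here; a real crawler would fetch pages over the network and parse links out of HTML, but the traversal logic is the same.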
Snowden reportedly had full access to the NSA’s files as part of his job as a technology contractor in Hawaii, managing computer systems in a remote outpost that focused on China and North Korea.
Officials added that the files were accessible because the Hawaii outpost had not been upgraded with the latest security measures.
The web crawler Snowden used was similar to, but not as advanced as, Googlebot, the crawler Google uses to visit billions of web pages and download their contents for fast search results.
The whistleblower’s activity did raise some flags while he was working in Hawaii, prompting questions about his work, but he was able to successfully ward off criticism.
“In at least one instance when he was questioned, Mr. Snowden provided what were later described to investigators as legitimate-sounding explanations for his activities: As a systems administrator he was responsible for conducting routine network maintenance. That could include backing up the computer systems and moving information to local servers, investigators were told,” according to the report.
Snowden admitted in June to taking an undisclosed number of documents, which over the past six months have regularly served as the basis for high-profile international media reports on the US National Security Agency and its British counterpart, GCHQ. He was subsequently granted political asylum by Russia and now resides in Moscow.
The leaks have unveiled a number of previously unreported NSA operations, including those involving dragnet surveillance programs that put the digital lives of millions, if not billions, of individuals across the world into the possession of the US government.
Re: Snowden used common web crawler tool to collect NSA files
Did you also know that GCHQ attempted a DDoS attack on Anonymous networks, such as AnonyOps? They seem to think the #HTP hashtag might have been run by GCHQ this whole time, and nobody knew until the attack took place.
"The only necessity for the triumph of evil is when the good men do nothing..." - Albert Einstein