NSA Taps Into Google, Yahoo Data

In May of 2013, former National Security Agency contractor Edward Snowden fled the United States with classified documents revealing some of the most sophisticated and prolific spying on the public in American history. The PRISM program he divulged is an extensive campaign that uses classified intelligence directives to acquire "metadata" from major Internet players like Google and Yahoo. Since then, Snowden has brought to light myriad programs of similar ilk, all geared toward data collection in the name of intelligence efforts.

In a recent leak, however, it was revealed that PRISM's scope pales in comparison to the NSA's international data mining project, known by the codename MUSCULAR and run in tandem with the British GCHQ. The program, it was shown, taps the linkages between Google and Yahoo data centers, mining entire data flows and shipping the data back to NSA warehouses at Fort Meade.

The NSA program exploits a structural flaw in the two companies' architecture. Yahoo and Google maintain high speeds through decentralized data centers spanning multiple continents and connected by thousands of miles of fiber-optic cable. To maximize performance, these data centers continuously sync information between repositories, including whole user accounts and email indexes.

To obtain the information it wanted, the NSA needed to circumvent exemplary data security protocols, including 24-hour guards, biometric identity verification, and heat-sensitive cameras at the data centers. According to the article in the Washington Post, company sources had reason to believe that their internal networks were safe.

Despite these measures, a weakness was uncovered. An internal NSA slide show leaked by Snowden contained a hand-drawn diagram outlining the transition point between Google's internal networks and user computers. The drawing highlighted Google's front-end servers as the weak point, noting that these servers actively decrypt incoming traffic and could be exploited for data acquisition.

Neither company was aware of the backdoor intrusion. Both acknowledged and acquiesced to front-end requests for data but maintained that their internal networks were secure. Google's vice president for security engineering, Eric Grosse, had even announced plans to encrypt the linkages between data centers, despite the presumption that they were secure.

Since the leak, both companies have reacted with outrage. Google's chief legal officer, David Drummond, remarked on the subject: "We have long been concerned about the possibility of this kind of snooping, which is why we have continued to extend encryption across more and more Google services and links, especially the links in the slide." Yahoo commented: "We have strict controls in place to protect the security of our data centers, and we have not given access to our data centers to the NSA or to any other government agency."

Legally speaking, the NSA is exploiting a loophole related to international espionage practices. While Congressional oversight has limited domestic spying, international monitoring remains far less inhibited. Because the data centers of the two Internet giants span multiple continents, interception of these data flows occurs on foreign soil, outside the domestic restrictions that Congress imposed through the FISA Amendments Act of 2008, and is instead governed by the executive authority that covers overseas collection. This international monitoring occurs with the cooperation of the British GCHQ.
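To make the weak point concrete, here is a minimal Python sketch of the pattern the slide describes: a front end that terminates TLS and relays the request to a backend in cleartext, followed by the post-leak variant that encrypts the internal hop as well. Everything here (the names, the addresses, the relay logic) is illustrative, not Google's actual infrastructure.

```python
import socket
import ssl

def relay_plaintext(client: ssl.SSLSocket, backend_addr: tuple) -> None:
    """Pre-leak pattern: TLS ends at the front end, so the internal hop
    travels in cleartext -- the span a tap like MUSCULAR could read."""
    request = client.recv(4096)           # already decrypted by the TLS layer
    with socket.create_connection(backend_addr) as backend:
        backend.sendall(request)          # plaintext on the data-center link

def relay_encrypted(client: ssl.SSLSocket, backend_addr: tuple,
                    ctx: ssl.SSLContext) -> None:
    """Post-leak fix of the kind Google announced: wrap the internal hop
    in TLS too, so a tap between data centers sees only ciphertext."""
    request = client.recv(4096)
    with socket.create_connection(backend_addr) as raw:
        with ctx.wrap_socket(raw, server_hostname=backend_addr[0]) as backend:
            backend.sendall(request)
```

The difference is a single layer of wrapping, which is why encrypting the inter-datacenter links closes this particular hole without touching the physical security the companies already had in place.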
The UK agency maintains a data cache that can hold three to five days of traffic before the storage is recycled. During this window, NSA software uses search terms to sift the desirable data from the dregs (a toy version of this sifting appears in the sketch at the end of this post). The data, once identified, is shipped via fiber-optic cable to the warehouses at Fort Meade. This information, the agency claims, has produced intelligence leads against "hostile foreign governments." For now, that assertion of intelligence value remains largely unsubstantiated, likely due to the classified nature of such leads.

The scope of the MUSCULAR program shows in the volume of search terms used to sift the acquired data. According to the records, these inquiries span 100,000 terms, more than double the number used in the PRISM program, and the volume indicated in the Washington Post's documents topped 181 million records over a 30-day period. The data acquired includes who sent or received emails, the subjects of those emails, when and where they were sent, and the text, audio, and video content of the messages.

The program strikes a nerve with both companies because of its unique nature. Both organizations were willing participants in the collection of data through front-end means, but the back-end intrusion is uncharacteristically aggressive. Google, as mentioned, will move to encrypt its internal networks; Yahoo has not indicated whether it will do the same.

The ramifications of these revelations are yet to be seen, but it is likely that public sentiment will echo the negative reaction to the PRISM documents. Ultimately, the continued exposure of agency programs demonstrates the interconnected and heavily monitored nature of our digital communications, a fact that can no longer go unacknowledged.
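As a rough illustration of that sifting step, the sketch below scans a bounded, rolling cache of intercepted records for a list of search terms; the deque's maxlen stands in for the three-to-five-day window after which GCHQ's storage is recycled. The record format and the terms are invented for the example.

```python
from collections import deque
from typing import List, Set

# Stand-in selectors; the real term list reportedly ran to 100,000 entries.
SEARCH_TERMS: Set[str] = {"target@example.com", "example-selector"}

def sift(cache: deque, terms: Set[str]) -> List[str]:
    """Return every cached record that contains at least one search term."""
    return [record for record in cache if any(term in record for term in terms)]

# A rolling buffer: once full, each append silently evicts the oldest record,
# mirroring a cache that can only hold a few days of traffic.
cache: deque = deque(maxlen=100_000)
cache.append("mail to target@example.com subject: itinerary")
cache.append("unrelated packet payload")

matches = sift(cache, SEARCH_TERMS)   # matches only the first record
```

Even this toy version makes the scale problem visible: matching 100,000 terms against millions of records a month is serious engineering, which helps explain why a multi-day cache is needed at all.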
