
Nov 12 2013

NSA and GCHQ spied on OPEC too

The German publication Der Spiegel has published yet another revelation from the Edward Snowden documents. These reveal that the NSA and the British GCHQ have been spying on the OPEC countries.

Documents disclosed by whistleblower Edward Snowden reveal that both America’s National Security Agency (NSA) and Britain’s Government Communications Headquarters (GCHQ) have infiltrated the computer network of the Organization of the Petroleum Exporting Countries (OPEC).

In January 2008, the NSA department in charge of energy issues reported it had accomplished its mission. Intelligence information about individual petroleum-exporting countries had existed before then, but now the NSA had managed, for the first time, to infiltrate OPEC in its entirety.

Saudi Arabia’s OPEC governor is also on the list of individuals targeted for surveillance, for which the NSA had secured approval from the secret Foreign Intelligence Surveillance Court. The documents show how careful the Americans were to suspend their surveillance when the Saudi visited the United States. But as soon as he had returned to Riyadh, the NSA analysts began infiltrating his communications once again.

The British, who also targeted OPEC’s Vienna headquarters, were at least as successful as the NSA. A secret GCHQ document dating from 2010 states that the agency had traditionally had “poor access” to OPEC. But that year, after a long period of meticulous work, it had managed to infiltrate the computers of nine OPEC employees by using the “Quantum Insert” method, creating a gateway into OPEC’s computer system. GCHQ analysts were even able to acquire administrator privileges for the OPEC network and gain access to two secret servers containing “many documents of interest.”

It is also clear that they were not collecting just metadata but were collecting everything, including the messages themselves.

When you couple this with the reports of spying on Brazilian oil companies and Belgian telecommunications companies, the protestations of the US and UK that they have to spy to foil terrorism ring increasingly hollow. This is government espionage conducted to gain a competitive advantage over trade competitors, plain and simple.

8 comments


  1. trucreep

    The fact that they haven’t foiled any terrorist attacks just adds more to the pile. They fight transparency and accountability because if we were able to get a clear picture of the ineptitude that is going on here, people would be outraged. Of course, we have a pretty clear picture already unfortunately…

  2. Lofty

    How dare you suggest spying on everyone is illegal?
    “I thought whistleblowers revealed things that were illegal.”

  3. Acolyte of Sagan

    Right, enough’s enough.
    It will be far simpler now to publish a list of everybody not being spied on.

  4. lanir

    So… Who reaps the benefits of industrial spying anyway? I’m at least as interested in who all the assholes involved are as I am in who they’re spying on. And it sounds like it will tie in nicely with the oil companies we keep paying because they’re so broke and needy they’re all among the richest and most profitable companies on the planet.

  5. Nick Gotts

    The current cant from our lords and masters in both the USA and the UK is, and has been for the past few decades, that free markets and the fair competition they bring about benefit everyone. The extensive commercial espionage revealed by Snowden makes it clear that they don’t believe a word of it.

  6. TomeWyrm

    Technically nobody? If your communication passed through the US, it’s almost certainly being stored somewhere by the NSA. If not, they’re trying really REALLY hard to make my statement the reality. I mean, that’s literally the driving mission of General Keith Alexander. He wants to find the needle in a haystack by collecting the entire haystack, in case he might need it later.

    If you want a list of people not under a targeted program… that’s still going to be bigger than the list of people that are (or have been) targets… even if the list is still “anybody we can think of that might have useful information”

  7. Marcus Ranum

    they were not collecting just metadata

    The whole “metadata” story is a lie; it makes absolutely no sense to just collect the metadata from a message, without having the ability to retrieve the entire message if the metadata matching system flags it as being interesting. What they are saying, and have been saying all along, if you read between the lies, is that they collect everything and extract metadata from it, which is searched automatically. Now, the next trick is that anything can be metadata – you search the text for keywords that may be indicators of terrorism and the value (“has terrorism keywords” == TRUE) is now metadata. The Subject: line? Metadata. Etc.

    A hypothetical design for a system like PRISM looks like this:
    - Messages are collected at the “edge” of the collection network.
    - Messages are parsed and metadata is extracted; this almost certainly includes Bayesian classifiers that estimate the probability a message is interesting, based on pre-trained codexes.
    - Audio stream data is run through a speech-to-text engine (it may be forwarded to a separate collector with hardware assist for this purpose) and metadata is extracted (again, probably via a Bayesian classifier); at this point the text produced from the audio message is probably “metadata” too.
    - Collected messages are stored locally at the edge node where they were originally gathered, or at a local storage node.
    - The metadata, plus a pointer to the stored message, is forwarded to the core analysis system. Note that this dramatically decreases the amount of data the core has to manage; the actual content is “not looked at” unless it is retrieved from the edge system. There is a good chance that the word “collecting” is being distorted to mean “brought into the core”, not simply, uh, collecting.
    - The core analysis system stores and indexes the metadata.
    - Large-scale traffic-analysis sweeps and probes are run over the data as it is stored in the core. Typical algorithms here would be straight-up matching, as well as semantic forest analysis. The core probably does a potential-match analysis in which a message is not simply assigned a boolean value such as interesting/not interesting, but rather a partial value: interesting, not interesting, or may be interesting. Messages that “may be interesting” dwell in the matching engine for a long time, pending matches against other documents that push them into the interesting or not-interesting category.
    - The traces of interesting messages are presented to analysts.
    - If the analyst requests more information, the edge system where the original data was stored is queried to retrieve it.
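
    The edge/core split described above can be sketched in a few dozen lines of Python. This is purely illustrative: all of the names (`EdgeNode`, `Core`, `KEYWORDS`) and the toy keyword classifier are my own stand-ins, not anything from the Snowden documents. The point it demonstrates is the one in the comment: the full message body never leaves the edge, yet content-derived values (a keyword flag, the subject line) travel to the core as “metadata”, along with a pointer for later retrieval.

    ```python
    # Hypothetical sketch of an edge/core collection architecture.
    # Names and keyword list are invented for illustration only.
    import hashlib

    KEYWORDS = {"bomb", "attack"}  # stand-in for a real classifier

    class EdgeNode:
        """Collects full messages, stores them locally, forwards only metadata."""
        def __init__(self, name):
            self.name = name
            self.store = {}  # pointer -> full message body

        def collect(self, sender, recipient, subject, body):
            pointer = hashlib.sha256(body.encode()).hexdigest()[:12]
            self.store[pointer] = body  # full content stays at the edge
            # Anything derivable from the content can become "metadata":
            # here, a boolean keyword flag and the Subject: line itself.
            return {
                "from": sender,
                "to": recipient,
                "subject": subject,
                "has_keywords": any(k in body.lower() for k in KEYWORDS),
                "edge": self.name,
                "pointer": pointer,
            }

        def retrieve(self, pointer):
            # Invoked only when an analyst requests the full message.
            return self.store[pointer]

    class Core:
        """Stores and indexes metadata; never holds message bodies."""
        def __init__(self):
            self.index = []

        def ingest(self, metadata):
            self.index.append(metadata)

        def interesting(self):
            return [m for m in self.index if m["has_keywords"]]

    edge = EdgeNode("edge-01")
    core = Core()
    core.ingest(edge.collect("a@example.org", "b@example.org",
                             "lunch", "meet at noon"))
    core.ingest(edge.collect("c@example.org", "d@example.org",
                             "urgent", "the attack is planned"))
    hit = core.interesting()[0]
    print(hit["subject"])                 # only metadata reaches the core
    print(edge.retrieve(hit["pointer"]))  # full body fetched from the edge
    ```

    Under this design, a claim that the core “only collects metadata” is technically true while the complete message remains retrievable on demand, which is exactly the distortion the comment describes.
    
    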

  8. Marcus Ranum

    for the past few decades, that free markets and the fair competition they bring about benefit everyone

    The US gov’t used to complain loudly and point the wagging finger of blame at the French intelligence service for passing information about Boeing-versus-Airbus proposals to Airbus. Because, you know, those French play dirty, n’est-ce pas?
