The Symantec source-code drama has played out over the past six weeks and is finally settling into a stable position. The short summary: an Anonymous-aligned operator under the handle YamaTough was in possession of source code for several Symantec products from 2006, reportedly stolen from a third party in India to which Symantec had provided the code for compliance review. YamaTough demanded a fifty-thousand-dollar ransom from Symantec not to release the code; the FBI ran a sting in which an agent posed as a Symantec representative and negotiated with YamaTough; YamaTough released portions of the code anyway, partly out of suspicion that the negotiation was a sting and partly because the AntiSec rhetoric of the past year would have made accepting the ransom inconsistent with the theatre. The released code includes Norton Antivirus 2006 and pcAnywhere 2006. Symantec had pre-emptively advised users to disable pcAnywhere on 17 January, before the source actually leaked, which strongly suggests they knew the pcAnywhere code was compromised and what it would expose. Patches for pcAnywhere were issued through January and early February; customers who run pcAnywhere have been on a difficult timeline.

The third-party-review angle is the part that interests me most, because it reframes a question that comes up often in vCISO conversations. Symantec gave the source code to the Indian CBI for compliance review of products being sold into the Indian market — a perfectly normal arrangement, the kind of thing that any large security vendor doing international business has to do for various national regulators. The implicit assumption was that the CBI would treat the source code with appropriate care. The CBI evidently did not, and the source ended up in the hands of an Anonymous-affiliated operator working with Indian-based collaborators. There was a six-year window between the theft and the release; the public disclosure forced Symantec into a rushed patching cycle for products whose 2006 source was suddenly public.

The operational lesson here is one I have been writing into engagement reports for some clients but had not stated as cleanly until now. When you give your source code, your sensitive operational documentation, or your incident-response procedures to a third party for any purpose — regulatory compliance, vendor due diligence, customer satisfaction surveys, audit work — you have effectively delegated the security of that material to whatever operational discipline the third party happens to have. Most third parties do not have your operational discipline. The structural answer for materials this sensitive is to control what is shared, escrow it through a secure-review service rather than handing copies over, and assume that anything actually shared will at some point be in less-secure hands than your own. The FBI conference-call leak last month illustrates the same failure mode on the verbal-coordination side; this incident is the source-and-document-side version. Most organisations are not yet operating on that basis.

For the engagements I run through Hedgehog, I have been adding a "third-party material handling" line to the standard scope. The questions are: what sensitive material has been shared with which third parties; under what controls; with what visibility into the third party's handling. Browne Jacobson's external auditors, News International's regulators, Towry's compliance reviewers, the various professional-body oversight committees that come up in legal-and-financial engagements — all of these are third parties that have at various times received sensitive material. Several clients have not previously considered this surface as part of their threat model.

The other thing this incident illustrates is the operational maturation of the AntiSec rhetoric. The negotiation theatre with the FBI agent, the deliberate decision to release the code regardless of the ransom outcome, the public commentary on Pastebin throughout — all of it follows the same template that we have been seeing since the LulzSec campaign. The structural goal is not financial gain; it is reputational damage and political theatre. This makes the "pay the ransom" defence pathway less viable than it would be against straightforward criminal extortion, because there is no acceptable outcome for the AntiSec operator that involves the material not being released. The defensive answer is the one we keep arriving at: assume the material is going public, and work back from there.

The pcAnywhere situation is the part that has had the most direct operational impact on my clients. Several of the secondments and Hedgehog clients have pcAnywhere deployed somewhere in the estate, mostly for legacy-equipment remote-management purposes. Symantec's advisory to disable pcAnywhere until patches were available was, given the 2006-source-code situation, completely reasonable, but it left a number of operations teams with the question of how to manage the affected equipment in the interim. The migration paths are mostly to standard remote-administration alternatives — RDP through a properly-secured jump host, SSH for the Unix-side equipment, vendor-specific remote-management for industrial-control equipment. None of these are difficult; the operational pain has been the urgency of the timeline rather than the technical complexity of the migration. Brian Krebs has been tracking the pcAnywhere fallout in detail and is the right source for the operational chronology.
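Before any of those migrations can be scheduled, operations teams need an inventory of where pcAnywhere is actually listening. As a minimal sketch of that triage step — assuming pcAnywhere's default TCP data port of 5631, and with the host list standing in for whatever the CMDB actually holds — something like this is enough to sweep a subnet:

```python
import socket

PCANYWHERE_DATA_PORT = 5631  # pcAnywhere's default TCP data port


def pcanywhere_listening(host, port=PCANYWHERE_DATA_PORT, timeout=1.0):
    """Return True if `host` accepts TCP connections on the given port."""
    try:
        with socket.create_connection((host, port), timeout=timeout):
            return True
    except OSError:
        return False


def sweep(hosts):
    """Return the subset of `hosts` still exposing the pcAnywhere port."""
    return [h for h in hosts if pcanywhere_listening(h)]


if __name__ == "__main__":
    # Hypothetical estate list; in practice this comes from asset records.
    for host in sweep(["192.0.2.10", "192.0.2.11"]):
        print(host)
```

This only finds endpoints on the default port, so it is a first pass rather than a complete answer; hosts reconfigured onto non-standard ports need the asset-record check as well.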

The wider question — what happens to the customers of any large security vendor whose source code leaks — is one I expect to come back to over the rest of 2012. Symantec is unlikely to be unique. The structural exposure of any vendor whose source is held by multiple regulatory and audit parties, scattered across a decade of reviews, is more substantial than vendors are publicly acknowledging. The next leak of this shape will not be a six-year-old codebase; it will be something more current. The defensive position needs to anticipate that.

The next post is probably the Hedgehog SOC discussion, which has finally moved from "thinking about it" to "deciding". Or whatever happens with the Megaupload trial, which is now scheduled for August in the Eastern District of Virginia.

