Friday, March 30, 2007

Auditing Privacy Part 2 - Risk Assessment of Data Loss

The easy way to assess privacy risks is to focus on the impact of data theft on the organization, treating the private data as a corporate asset. There are well-documented methods for identifying vulnerabilities in the means of collecting, storing, and sharing the data. Similarly, there are methods for identifying and listing threats to the data (hackers, "insiders," and negligent loss). The impacts will likely shake out along the lines of direct costs (postage, call center, and other incident response costs), potential legal and regulatory actions, and reputation damage. (For an example, Protegrity assessed the TJX data breach at $1.7 billion; though TJX was not strictly a privacy issue, it has parallels*.)

This would be the easy way, but it may not produce the most accurate results. The problem lies in identifying the impacts of a privacy breach. The attribute of "privacy" assigned to the data is what makes the data valuable and worthy of protection. However, "privacy" is not an attribute that belongs to the corporation, but to the individual the data describes. So an assessment of the corporation's risk from a privacy loss should start by looking at the impact of the loss on the individual.

Why do many corporations, when disclosing losses of tremendous amounts of data, appear to suffer only short-term damage to their reputations? I posit that the potential damage to a corporation is proportional to the actual damage to the privacy of the individuals described in the lost data. (See Guin v. Brazos.)

The real impact of a privacy incident on individuals has been hidden behind a cloud of security-vendor fear mongering and media-induced panic. The common problem with the data is that it equates data loss with a privacy breach. Identity theft, properly defined, is likely a higher-impact, lower-frequency event than is commonly reported.

The SB1386-style disclosure laws have been a boon for identifying the frequency of data loss, but the information that must be disclosed does little to help identify the impact. An auditor concerned strictly with compliance would have to assign equal risk to any loss of private data. But the auditor should take the risk assessment to the next step and focus on the individuals, identifying the risks that lead to actual harm to their privacy. The compliance risk is equivalent for the loss of a laptop carrying an encrypted database of private data and for the same database being heisted, unencrypted, off a web server by a criminal intent on exploiting the identities. The real risk to the privacy of the individuals described in the database is clearly different.
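To make that distinction concrete, here is a minimal sketch of the two risk views. All of the weights and dollar figures are invented for illustration and are not drawn from any audit standard:

```python
# Hypothetical sketch: compliance risk vs. real privacy risk for two
# disclosure scenarios. All numbers are invented for illustration.

def privacy_risk(p_exploitation, harm_per_record, records):
    """Expected harm to the individuals described in the data."""
    return p_exploitation * harm_per_record * records

RECORDS = 100_000
COMPLIANCE_RISK = 1.0  # a disclosure law treats any loss the same

scenarios = {
    # Encrypted laptop lost at an airport: exploitation is unlikely.
    "encrypted laptop, negligent loss": privacy_risk(0.001, 500, RECORDS),
    # Unencrypted database taken by a criminal intent on fraud.
    "unencrypted server, targeted theft": privacy_risk(0.50, 500, RECORDS),
}

for name, risk in scenarios.items():
    print(f"{name}: compliance risk = {COMPLIANCE_RISK}, "
          f"expected harm to individuals = ${risk:,.0f}")
```

The compliance view scores both scenarios identically; the individual-centered view separates them by orders of magnitude.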

Beyond the risk of a data loss, the auditor should also consider the equally important risks of the collection of private data and the dossier-ification of data. More on that later.

*Why the high risk to TJX? Though not strictly a privacy issue, the related damages are a loss to a third party (the banks) rather than to TJX itself.

"Some would call this good fortune" from s2art

Tuesday, March 27, 2007

Impacted Molars II

Panopticonistas Cyveillance say ID theft is so bad, we are all going to die. Seems like shutting down copyright scofflaws got a little too Web 1.0 for them, so they've unleashed their vicious crawling spiders on a search for contraband identities. And guess what they found out? EVERYBODY'S IDENTITY IS ALREADY PWN'D! Now that they've collected this data, I'm curious what they are going to do with all those credit card numbers, SSNs, and mothers' maiden names. Did they help shut down the sites hosting the illicit data? Did they notify the victims? This sort of research is on an odd ethical footing. I hope they get it all sorted before they do their research on other forms of digital contraband.

California Secretary of State Debra Bowen kicks ass in the name of privacy for Californians. She gets privacy, and maybe even cares about the citizens of California. I wish she could impart some of her knowledge to the Texas county clerks.

CDT publishes their draft Privacy Principles for Identification. Seem pretty much like Fair Information Practices to me, which is not necessarily a bad thing.

Fake Teeth Resting on Image of Monk courtesy jsdart

Monday, March 26, 2007

Insider Threat Assessment

Step one: Play a crappy new-agey cover of "All Along the Watchtower."

Thursday, March 22, 2007

Panopticon Enabled Desktops Increase Productivity!

From Dark Reading, the joys of workforce monitoring software with Ascentive!:

"We call it 'workforce activity management,'" says Schran. "Our latest edition provides all the insight necessary to eliminate time-wasting, increase productivity, and protect private company data."
Or, in the words of Ascentive's VP of Customer Relations Jeremy Bentham,

Morals reformed - health preserved - industry invigorated - instruction diffused - public burthens lightened - Economy seated, as it were, upon a rock - the gordian knot of Gramm Leach Bliley and Sarbanes-Oxley are not cut, but untied - all by a simple idea in Software Architecture!
More from Dark Reading:

Perhaps even more importantly, employee monitoring tools can deter workers from insider activities such as data theft or unauthorized file access, Schran adds. "If your employees are downloading files to a USB device, our software will record that action," he says. "Our data has already been used in evidentiary proceedings in court."

But I prefer the hot buzz on this product from their EU Product Evangelist Michel Foucault:

The heaviness of the old 'houses of security', with their fortress-like architecture, could be replaced by the simple, economic geometry of a 'house of certainty'. The efficiency of power, its constraining force have, in a sense, passed over to the other side - to the side of its surface of application. He who is subjected to a field of visibility, and who knows it, assumes responsibility for the constraints of power; he makes them play spontaneously upon himself; he inscribes in himself the power relation in which he simultaneously plays both roles; he becomes the principle of his own subjection. By this very fact, the external power may throw off its physical weight; it tends to the non-corporal; and, the more it approaches this limit, the more constant, profound and permanent are its effects: it is a perpetual victory that avoids any physical confrontation and which is always decided in advance.

And they say security software people don't read post-structuralist French philosophers. Heck, Foucault is all around you! I'm running a Jacques Derrida Packet Sniffer & Deconstructor right now! Or am I?

Tuesday, March 20, 2007

Auditing Privacy Part 1 - Ethics and the Canon

It would comfort many compliance auditors to discover the ultimate checklist and tear after their organization's privacy program, collecting tick marks and developing the dreaded deficiency finding. I say to them, "Google is your friend." For the more enlightened internal auditor, the first step in evaluating their organization's privacy practices should be a step back.

The Canon
There are best practices, and there are benchmarks. There are torts, laws, and rational fear of the irrational regulator. However, for most every auditable area there is also The Canon. Take a file to the gilded crust of Sarbanes-Oxley and the PCAOB (and all their works and all their ways), and you eventually uncover the Generally Accepted Accounting Principles. Take a snowblower to the myriad layers of dust and ash of the Code of Federal Regulations, and if you squint and hold your head just right, you'll see a vague outline of the Decalogue. And somewhere below the ornate filigree and baroque ornamentation of HIPAA, Gramm Leach Bliley, and SB1386 is the shape of the Fair Information Practices of the US Department of Health, Education and Welfare, 1973.

From the link above, here are the five practices of the modern privacy canon:

  1. Collection limitation
  2. Disclosure
  3. Secondary usage
  4. Record correction
  5. Security

These five principles will be your mantra for your audit. They will guide your questions and inform your issues. Advanced practitioners may choose from the following according to their path:

The AICPA's 10 Generally Accepted Privacy Principles

The OECD Guidelines on the Protection of Privacy and Transborder Flows of Personal Data

The Ethos
Like the Torah, the Sermons of Buddha, the Qur'an, the Gospels, or Fermat's Principle, a canon is only meaningful if applied. You must ask the CEO, the CIO, the Chief Marketeer, the General Counsel, and listen, and interpret their answers accordingly. Are the principles used as values to guide their decisions, obstacles to be worked around, or are they simply unknown? Read your corporate policies regarding privacy. Do you see in them evidence of the Fair Information Practices, or do they appear to be more oriented to a specific set of industry specific regulations? Interview the folks who handle the data. Do they treat the data with the care they would treat their own? The answers to these questions will begin to lead you to determining if your organization has the ethical basis for a privacy program.

What Does This Mean?
A compliance-oriented organization may maintain reasonable concordance with the Fair Information Practices without even knowing what they are. However, the organization may be reactive and inefficient. The organization's privacy direction will be dictated by outside entities rather than developed from within.
An organization with a firm foundation in privacy practices, coupled with an ethical duty to privacy, will be more efficient, more effective, and will retain a better reputation in the face of an incident.

Monday, March 19, 2007

I Am Not A Cop

A couple of posts on the role of internal audit in the information security controls of a company got me thinking.
First, Anton describes an "auditor as policing agent" model:

  • InfoSec develops controls.
  • Operations operationalizes them.
  • Audit goes around with a checklist to make sure they got done.
Farnum at Computerworld comments, as does Rothman.

The issue I have with this model is that even if the controls InfoSec develops are inadequate, they could still be well implemented. InfoSec should take ownership of the controls, ensure they are implemented, and monitor their performance after they are implemented. When the auditor comes along, he or she should be looking not only at the implementation, but at whether the system as designed by InfoSec achieves the level of risk reduction acceptable to the board. Unlike the crime, systems-development, or drug-prescription analogies, information security is an ongoing management process.

So I'm looking through rose-colored glasses rather than my usual green eyeshade, but I'm not going to play Kavanaugh to a bunch of Mackeys.

Thursday, March 15, 2007

More Questions Than Answers

This evening has been spent practicing for my SXSW day show: a brief discussion about privacy for which some auditors will be getting CPE. As a result, I have also spent the evening listening to my voice slowly decay into a burbling croak.

But I was happy that IT Security published what's on their blog feed. Some good stuff there, and I'm definitely subscribing to fellow Texan McKeay's keenly honed published thoughts. He nailed the county clerk bit better than I could. I could have saved some electrons and blood vessels if I had read him first.

Speaking of privacy, my favorite bass player received in the mail a solicitation to participate in a clinical study of some new medicine that replaces some prescription med. The suggested way to sign up was to go to a URL: http://MYWIFESNAME.DRUGCOMPANYNAME.COM. That seemed odd. Half of me wants to do some DNS-fu on the beast, see what names I can get (if any), and see what information I can gather. The other half of me is mildly outraged but barely has the energy to google for others in equivalently mild states of outrage. The third half feels like having a scotch and going to bed. Strictly for medicinal purposes. In Balvenie veritas.
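For the curious, the DNS-fu in question might look something like this sketch. The domain and the name list are placeholders I made up, not the actual company, and whether such a query leaks anything depends entirely on how the company provisioned its subdomains:

```python
# Hypothetical sketch: checking whether a drug company provisioned
# personalized subdomains for other common first names. Every hit would
# suggest a real person received a solicitation. Placeholder names only.
import socket

DOMAIN = "drugcompany.example.com"  # invented placeholder domain
COMMON_NAMES = ["mary", "linda", "susan"]

def resolves(hostname):
    """Return True if the hostname has a DNS address record."""
    try:
        socket.gethostbyname(hostname)
        return True
    except socket.gaierror:
        return False

for name in COMMON_NAMES:
    host = f"{name}.{DOMAIN}"
    print(host, "resolves" if resolves(host) else "does not resolve")
```

Which is exactly why tying an identifying name to a public hostname is a questionable design choice in the first place.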

Wednesday, March 14, 2007

Repost Redux: Special SXSW Edition

Having read a few additional commentaries, I began to think some more on two issues I posted about earlier.

Greg Abbott vs. The County Clerks
Mordaxus at Emergent Chaos says we need to chill, which made me wonder if there was less to this issue than I previously thought. The more I think of it, though, the less appealing the whole mess appears. The clerks routinely sell the data in their charge to data brokers. The Open Records Act (Texas' FOIA) allows the clerks to charge for the records. By redacting the confidential parts, the data would be less attractive to the brokers, and the clerks' revenue stream might dry up.
The clerks are digitizing and distributing information on the Internet beyond the scope of its original purpose, and counter to Texas law. I don't have a problem holding these folks accountable to the law and their duty as custodians of the data. I will be having a beer or three at SXSW, though, probably at the Yard Dog and at Woody's.

The Hacker vs. The Corporation
Both Emergent Chaos and Ars Technica have things to say about the study I posted about yesterday. EC posted a link to the study, but after reading it, I don't think I've changed my mind. I am, in fact, more confused about the purpose of the study than before. The distinction between "hacker" and "corporate malfeasance" does not strike me as being as interesting as the distinction between "stolen" and "lost." The question for me as a consumer remains a question of risk. Am I more likely to suffer damage to my reputation or finances if my personal data is "lost" or if it is "hacked"? No doubt frequency is part of the equation, but so are the capabilities and intentions of the threat.

Photo of the Casting Couch in action by me.

Tuesday, March 13, 2007

Charts 'n Graphs

From Pogo, this article from Physorg on the classic Evil Hacker v. Evil Suit dilemma. From the article:

If Phil Howard’s calculations prove true, by year’s end the 2 billionth personal record – some American’s social-security or credit-card number, academic grades or medical history – will become compromised, and it’s corporate America, not rogue hackers, who are primarily to blame. By his reckoning, electronic records in the United States are bleeding at the rate of 6 million a month in 2007, up some 200,000 a month from last year.

Goodness. This article seems to do more damage than good in increasing awareness of the privacy issue. The key bit of data that seems to be missing is the damage. More from the article:
Malicious intrusions by hackers make up a minority (31 percent) of 550 confirmed incidents between 1980 and 2006; 60 percent were attributable to organizational mismanagement such as missing or stolen hardware; the balance of 9 percent was due to unspecified breaches.
So, how many fraudulent charges were made, fake IDs manufactured, or reputations horribly disfigured by each category? The author of the study adds:

"And the surprising part is how much of those violations are organizationally prompted – they’re not about lone wolf hackers doing their thing with malicious intent."

So, would you rather Big Nameless Credit Card Company notify you:

A. that your name/credit card/SSN/date of birth were lost at an airport while stored on an encrypted laptop hard drive


B. that Lone Wolf Hacker sniped your digits off their server (running unpatched IIS 2.0 on unpatched Win98)

Of course I can't prove that either scenario is inherently more dangerous for the consumer. I can just shake my angry fist at the data.

Thursday, March 8, 2007

SSN Panic, Texas Style

Here's the Computerworld run-down. And here's the Attorney General's letter (worth reading) and the proposed bill, Texas HB 2061, to change the law so that all the county clerks don't get thrown in jail.

The AG letter says it fourteen different ways: NO, YOU CANNOT RELEASE SSNs, quoting an imperial raftload of laws, state and federal, on why not, and on why you should even be asking the question. The clerks need to grab a big ol' Sharpie and start their redactin'. Shut down your infonet tube, and stop selling your goods to some skanky information brokers from the desolate wasteland known as "Not Texas." Good on the OAG. Shame on the collective elected doofi who are trying to find them an out.
I can only take solace in knowing the traditional efficiency and effectiveness of Our Lege.

This fiasco is an example of why privacy principles, rather than mere compliance, are important to an organization. Even if the Ft. Bend clerks were ignorant of the law, their actions reflected a disregard for the citizens they are charged to serve.

Wednesday, March 7, 2007

Learn to Play Sonic Reducer

I was going to write about this article on Dark Reading, that includes this power-quote of insight and mind-blowitude:

"A lot of blogs now have become very big on the Internet," noted OSC Director Douglas Naquin in an interview with The Washington Times.

...but I figured my time (and yours) would be better spent learning to play "Sonic Reducer" with Cheetah Chrome.

E flat, C sharp, and lots of feedback.

Photo of Mr. Chrome from John Santanello

Tuesday, March 6, 2007

It's the Crime, Not the Tool

Tim Wilson at Dark Reading on IT Security: The New Big Brother:

"To identify potential insider threats, IT must monitor end users' behavior by scanning email, tracking network activity, and even watching employees for "trigger" events that might cause disgruntlement. Right now, I'm working on a story about ways corporations might monitor their employees outside the workplace to determine whether their out-of-office conduct might cause data leaks."
This is how the TSA dealt with the "insider threat" (i.e., passengers) on airplanes. Like the TSA, Mr. Wilson's focus appears to be on the tools used to commit the crime (box cutters, e-mail, 3 oz. containers of fluid, USB drives) rather than on the crime itself. Schneier has harped on this non-stop since 9/11. The proposed regime of surveillance will result in myriad false positives and in employees as happy as your average passenger who has to remove his shoes and toss his shampoo and nail clippers into the trash at the security checkpoint.

In addition, what qualifies your IT security department to distinguish the legitimate from the suspicious? How many eyes does the CEO want looking at legitimate confidential traffic? This filtering and monitoring scheme seems to increase the risk of exposure rather than decrease it.

Part of the solution does not involve any IT at all. Supervisors supervise. Their job is to monitor employee activities. Managers should ensure this happens.

Another part is development of an ethical culture within the corporation, where people have a channel to report if someone is acting "hinky." Internal and external auditors and ethics officers play an important role in an ethical environment. All the monitoring software in the world couldn't have prevented Enron, but an internal auditor put a stop to it.

Monday, March 5, 2007

Privacy and Security Lessons from Criminal Enterprises: The Corner & PCI

Either you have heard the stories, or you have encountered firsthand the difficulty of convincing an organization's leaders to take adequate precautions to ensure the privacy of identity-related data and to maintain the integrity, confidentiality, and availability of their information assets. Privacy and security have to be marketed to management, since privacy and security are "non-functional" without an "ROI." As a last-ditch effort, privacy and security can be pitched as a compliance effort: these activities must be performed to satisfy the requirements of an independent, potentially hostile third party.

Nonetheless, criminal organizations, which by definition care not one whit about compliance and have a vigorous appreciation of the bottom line, focus significant effort on the privacy of personal data and the security of transactions and communications. Consider, for example, the following story of touts, runners, ground stashes,* and the electronic processing of credit cards.

The typical drug transaction occurs thusly:

  • Junkie finds slinger. Junkie's selection may be based on the Slinger's reputation, effectiveness of the Touts, past business practices or location.
  • Slinger takes order, collects cash from Junkie.
  • Slinger signals the order to a Runner.
  • Runner distributes product to Junkie, either from minimum amount on person, or collected from ground stash.
  • Junkie moves on to consume product.
So the slinger is the payment processor, and the runner is the merchant. Both will be held accountable for inventory, and the separation of duties not only minimizes the compliance risk (i.e., being observed by law enforcement) but also provides an accounting control. The corner boy who put out the package knows that even if the slinger and the runner collude, the collusion will result in a wrong count at the end of the day.
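The end-of-day count works as a simple reconciliation control, and it can be sketched in a few lines (the quantities and prices here are invented for the sketch):

```python
# Sketch of the end-of-day "count" as a reconciliation control. Cash
# collected and product remaining are checked independently against
# the package the corner boy put out.

def count_is_right(package_size, price, slinger_cash, runner_remaining):
    """True if cash collected and product left account for the package."""
    units_sold = package_size - runner_remaining
    return slinger_cash == units_sold * price

# Honest day: 100 vials put out, 40 left over, $600 collected at $10 each.
print(count_is_right(100, 10, 600, 40))   # True

# Collusion: the runner skims 5 vials and the slinger covers for him.
# The package gives them away: 65 vials are gone, but only 60 sales'
# worth of cash comes back.
print(count_is_right(100, 10, 600, 35))   # False
```

Neither party can hide the skim without also forging the independent record held by the other, which is the whole point of separating the duties.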

So what part of this transaction is so hard for folks like TJX to understand? A couple of items to consider:
  • Although the merchant may mitigate risk by gaining distance from the transaction (Verified by Visa, PayPal), the merchant is more interested in the customers than the Slinger is in the Junkies. The merchant and the processor want to keep all that secondary data and compile it, and convert it into cash. The Slinger wants only not to get burned by a counterfeit bill.
  • No one is responsible for the "count" on credit card transactions. Unlike the corner, the matching of goods, customer and payment is out of order in electronic commerce, with each party shirking responsibility for the transaction.
  • Each has to deal with impostors, though. The seller of baking soda is the "phisher" of the drug trade.

Next, yelling "5-0" as an intrusion detection mechanism.

*Taken largely from Simon & Burns's terrific book The Corner, or from most episodes of Simon's The Wire.

Friday, March 2, 2007

Impacted Molars: Insurance, Banks and Godzilla

A Risk Management & Assessment Deathmatch

Gunnar Peterson's interpretation of Warren Buffett's risk management.


The Bank Lawyer's outstanding post on bank risk managers and regulators


Alex's Godzilla pandemic risk deflation.

Thursday, March 1, 2007

One, but he gets 3 hours credit.

The official TAMU account of a hack into their authentication system.

The Eagle has the most entertaining coverage of Aggie Hack 07.

"We learn from our mistakes," said Pierce Cantrell, vice president and associate provost for information technology. "These are complicated systems, and there is a huge learning curve. It's a computer cat-and-mouse game in this business, and I think we do a really good job handling account security."

Provost Cantrell is a member of the Tom & Jerry school of threat assessment. It's all about cheese and butcher knives and tails in light sockets. You get some soot on your face after the mouse hands you dynamite, sure, but what can you do? Despite what Tom may say, Jerry is really doing a heckuva job.

From the trenches comes another approach:

[Executive director of computing and information services] Putnam said he's unsure why anyone would want to break into the university computer system, but hackers try to test their limits and see how far they can get into a secure system.

"You can speculate, but that's all you can do," he said. "It's like why do you climb a mountain? Because it's there."

Director Putnam is more of the Edmund Hillary school of threat assessment. It's so effing ineffable why these meddling kids would want to monkey with the authentication mechanism of Aggie U that you are just spinning your wheels looking into it. To paraphrase Nigel Tufnel, some mysteries are better left unsolved.

The appropriate aggie joke is left as an exercise for the reader.