Showing posts with label privacy. Show all posts

Wednesday, October 6, 2010

The Professional

An interesting narrative, trapped unfortunately behind a pay wall, comes from the Chronicle of Higher Education - "Chapel Hill Researcher Fights Demotion After Security Breach"

A cancer researcher's database gets potentially pwned (two years from incident to discovery), spurring the usual breach notification process.  Her bosses cut the researcher's pay and reduced her status from full to associate professor.  The justification was that she, as principal investigator, was responsible for the security of the personal data entrusted to her by the subjects of the study.

The meat from the article (emphasis added):

The provost also accused her of assigning server-security duties to an inexperienced staff member, who failed to install important patches and upgrades, and of not providing the staff member with the training needed. Ms. Yankaskas countered that the staff member, who has since left, had worked for the university's technology office and that the employee never submitted a formal request for additional training.
"I had an employee who I trusted who told me things were OK," she added. "I would have no way to get on the computer and tell if it was secure. Unless I assumed my employee was lying to me, I don't know what I could have done."
Working in the Public Interest
I believe that there is another option.  Some folks in charge of security are not liars, but are incompetent.  And, yes, it is hard to tell them apart.

If it was money that was stolen, and someone said "I have no way of telling if the books were correct.  I trusted the accountant.  He was an experienced bank teller," what would be the response?  Why didn't you hire a forkin' CPA?  CPAs have professional knowledge and ethical obligations, and if they fail to meet them, you can have their license pulled.

Not so with security folks.  Why is it acceptable for others to manhandle your personal, private data more cavalierly than your accounting records?

I'm tempted to start my rant on certification, pseudo-science and "computer forensic professionals," but I'll save it for the next post.


Wednesday, September 22, 2010

Risk a Harm?

Interesting post and comments on privacy risk from Solove at Concurring Opinions.  Despite being raised by a pack of feral solicitors, I can't claim to understand all the legal theories involved.  I'm attracted to the liquidated damages idea for a number of reasons, including the ability to build a reserve or get underwriting to mitigate potential incidents.  

Harms at Risk

On the other hand, this is where the disclosure rules suck.   For example, an organization loses track of a hunk of physical media containing a couple hundred thousand records of personally identifiable information (but no financial information - no bank or credit card account numbers).   In this example, there is a very high probability that the media was subsequently destroyed.  Are the individuals identified on the media well served by being notified?

Imagine there was a method to calculate the likelihood of financial damage to the individual due to the loss of the media.  Let's imagine that there is less than a 1% chance that the information will be used in a crime in the next 2 years, and that the chance decreases by half every year that follows.  However, if it is used in a crime, it is likely that the crime will be of significant impact - a genuine fraud involving false credentials that would take more than $100,000 for the victim to unravel.   Is notifying the victim of the risk, and making him feel uneasy (since humans perceive risk differently than equations), responsible?
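The hypothetical above reduces to a back-of-the-envelope expected-loss calculation. Here's a sketch using the made-up numbers from the paragraph (1% over the first two years, halving annually thereafter, a $100,000 impact) - all figures are illustrative, not real breach data:

```python
IMPACT = 100_000.0        # assumed cost to the victim if the fraud occurs
P_FIRST_TWO_YEARS = 0.01  # chance the data is used in a crime in years 1-2
DECAY = 0.5               # annual probability halves each following year

def expected_loss(horizon_years: int = 20) -> float:
    """Expected dollar loss per individual over the given horizon."""
    total_p = P_FIRST_TWO_YEARS         # years 1-2 as a single lump
    p_year = P_FIRST_TWO_YEARS * DECAY  # probability in year 3
    for _ in range(3, horizon_years + 1):
        total_p += p_year
        p_year *= DECAY
    return total_p * IMPACT

# The geometric tail sums to another 1%, so the expected loss per
# individual converges to roughly 2% of $100,000 - about $2,000.
```

Two thousand dollars of expected loss per person is not nothing, but whether mailing a notice that induces years of unease is the right response to it is exactly the question.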

Or is this just an excuse for me to illustrate a post with a picture of Harms at risk?  

Tuesday, October 6, 2009

Sociables


When I read this commentary on privacy from Andrea Di Maio of Gartner, I was mildly surprised that people still think like this - that privacy is tied to secrecy.

Bob Blakley responds at the Burton Group. I agree with his analysis, so it must be brilliant. The back and forth in the comments is worth reading.

Monday, March 17, 2008

Relative Position and Privacy


Ed Felten recently wrote two posts on the market's failure to deliver privacy, and how corporations and consumers should respond. According to Felten:

There’s an obvious market failure here. If we postulate that at least some customers want to use web services that come with strong privacy commitments (and are willing to pay the appropriate premium for them), it’s hard to see how the market can provide what they want.
In the follow-up, Felten describes a standard contract and a sort of privacy escrow protocol to protect individuals against the desperate actions of a cratering start-up.

The more I read and think about privacy, the less compelling I find the theory that an individual's privacy has a value that can be exchanged on the market. Frank Pasquale wrote at Concurring Opinions that in the market model, you trade your privacy for efficiency and convenience, using Gmail as an example:
[C]onsider the type of suspicions that might result if you were applying to a new job and said "By the way, in addition to requiring 2 weeks of vacation a year, I need to keep my email confidential." The bargaining model is utterly inapt there. . . . just as it would have been for women to "bargain" for nondiscrimination policies, or mineworkers to bargain, one by one, for safety equipment.
He concludes that people who trade their privacy will outcompete those who do not, and that
"[a] collective commitment to privacy may be far more valuable than a private, transactional approach that all but guarantees a 'race to the bottom.' " The paper he cites on cost benefit analysis and relative position was interesting (to me at least) when read in terms of privacy. From the abstract:
When a regulation requires all workers to purchase additional safety, each worker gives up the same amount of other goods, so no worker experiences a decline in relative living standards. The upshot is that an individual will value an across-the-board increase in safety much more highly than an increase in safety that he alone purchases.
"Privacy" can be substituted for "safety." Can "security" also be considered in this context? Is it already?

Tuesday, October 23, 2007

Tonight, We Dine in Utica!

So, despite a workload that would stun an ox, I still manage to read my Internet privacy stories. Like this one from Ars Technica about the University of Utica and their Secret Service data wrangling on identity theft.

I click over to the .edu to read what they had to say in the original text. But, curiously enough, they asked me for my contact information. Well, o.k. - but what is your privacy policy? I hit the link to their privacy policy. This is madness! No. This Is Utica!

Friday, September 7, 2007

Howls of Derisive Laughter, Bruce!


By now, we all know that the concentric perimeter devised by the kangaroo jockeys assigned to protect the best and brightest of Asia and the Pacific were ineffective against comedian pranksters. (Perilocity has the lowdown.)

But what if they had been REAL pranksters? The NYC could teach those koala huggers a lesson in deterring those cats. They successfully defended the Republican National Convention against a variety of threats ranging from partial nudity, Johnny Cash impersonators, poetry, wet T-shirts and rock 'n roll. I'm confident that a couple of pranksters with a Canadian flag and a limo would not have escaped the attention of The Finest, and would have at least one entry in a database. And, oh, yes, their data would be aggregated, sooner or later. Yes.

I guess my point is two-fold:

1. A system meant to trap terrorists may not trap your prototypical Prankster 2.0, just as a system designed to trap thieves may not trap auditors. (I believe I have railed on this before.)

2. A system meant to trap terrorists may also trap Johnny Cash impersonators.

Monday, August 20, 2007

I Feel That It's Almost Crime


Imagine Monster put a click-through license on the malware, adjusted the privacy policy a tad (include an opt-out for additional "services"), and voila! It's not a privacy breach, it's an additional revenue stream! The 1.6M bits of Monster job hunter data is at least as hot as the Glengarry leads.

Imagine that the Certegy/Fidelity records were not sent on a wild cascading romp through the land of data brokery by the actions of a rogue database administrator, but through a perfectly legal contract. (As Mr. Certegy assures us, the data was sold to legitimate data brokers.) So the whole thing is just a crossed "T" or dotted "I" away from being 110% on the up and up. Instead of a class action, we'd be talking steak knives and Eldorados!

It's just semantics. "Data broker" = "Identity Thief." "Lead Generation" = "Privacy Breach."
It's all the same. But the Yukon keeps me up all night, and it feels like it's almost crime.

Thursday, August 2, 2007

Impacted Molars: Pay Hell Gettin' It Done Edition


Random Eye-tooth:
I've been reading the Counterinsurgency Manual, and I'm figuring there is some analogue to a corporate approach to minimize the "insider threat."

Extraction:
Mr. Loblaw describes a grisly example of privacy abuse in a recent decision du jour, selecting the choicest text of a 6th Circuit decision so I don't have to. But I will.

As the plaintiffs’ complaint explains, prisoners have threatened and taunted the officers, often incorporating the plaintiffs’ social security numbers (which they have committed to memory) into the taunts. Some prisoners wrote the social security numbers of some of the plaintiffs on slips of paper that they threw out of their cells.
Now that's what I call abuse of NPI, a sort of SSN gassing. But do the plaintiffs get relief? No.

[T]he guards’ social security numbers are not sensitive enough and the threat of retaliation from prisoners was not substantial enough to warrant constitutional protection.
Ride the NPI Country:
Courtesy of the continual compendium of privacy-related outrages, i.e., Pogo, comes this story that hashes ID crime stats. The conclusion it appears to draw is that Big Sky Country is a den of ID thieves. All the big increases in identity crime occur in North Dakota and Montana, with the notable exception of Springfield, IL, which can be attributed to Groundskeeper Willie and Apu. Considering that there are more people in my MSA than in all of Montana or North Dakota, I wish I could get a thorough look at the stats. But not so badly that I'm going to request data from a "marketing@" e-mail address, which ID Analytics requires.

Computer Security for Trainables:
From the Chronicle tech blog, the winners of Educause's security awareness video contest. I dunno. These videos will not be a part of my infosec counterinsurgency program. No beat, can't dance to 'em.


Bonus:
"Sweet fancy moses": the whole shocking story. Discuss.

Monday, July 16, 2007

Privacy is a Technological Imperative


My seasonal July funk has been working on me and my attitude, but not so much that I can't find some perverse humor in the slashdot discussion on privacy as a biological imperative.

Ms. Sweeney's correlation of privacy to the stealth predators require to stalk and consume prey was latched onto by the /.ers like an antelope at a watering hole. I don't see it myself. There is a fundamental difference between the biological need to eat and the personal need for privacy. The development of information technologies creates the need for personal identity, and creates the tools to destroy it. Examples include the portable camera (which drove Warren & Brandeis to define the right to privacy in the context of the US Constitution), the telephone, punch cards and TCP/IP.

These aren't new or original thoughts, but just how I see it.



Lion enjoying a private moment courtesy hannes.steyn.

Monday, May 7, 2007

Throwing Scorpion Out With the Frog Water


Declan McCullagh says that the federal government is unlikely to implement the National Research Council's privacy recommendations, in particular, a privacy commissioner, because it isn't in the federal government's scorpion-like nature. Ars Technica also has coverage. (And why must it always be a czar?)

The US is having the same issue with privacy legislation that it had with television resolution. We adopted early, because we needed to see our Felix the Cat on the airwaves, and 441 lines of resolution were all that NBC could muster in 1941. Likewise, the privacy principles developed by the US government in the 1970s were developed too soon, when databases were just creeping out of the punch-card era. US privacy law ends up like broadcast TV sets - an archaic lo-res standard - while other parts of the world lagged behind but adopted a more advanced standard. Think of Europe's Privacy Directive as PAL.

From what I've read of the NRC's paper (the Executive Summary), it seems they are going for a full blown HiDef 1080p Dolby Surround sort of privacy regime. Just as the networks dragged their feet on the 441 lines of resolution until they were forced to move ahead with HD by the FCC, so will industry drag their feet on privacy until a privacy czar, prince or archbishop cajoles them into the 21st century. I'm being optimistic, but at least the frog was committed.



Lo-Res Felix from FelixtheCat.com

Thursday, April 26, 2007

Go Ask Alec Baldwin


SSL apostate Ian G. refers to an article on estimation of loss due to a privacy breach.

I think we are measuring the wrong thing, and operating on these assumptions is dangerous.

From the article, a Forrester analyst says:


"After calculating the expenses of legal fees, call centers, lost employee productivity, regulatory fines, stock plummets, and customer losses, it can be dizzying, if not impossible, to come up with a true number."
The $90 - $305 range smacks of too much precision and not enough accuracy. Only software project managers can get away with ranges like that. These numbers are more harmful than worthwhile. Most of these factors are not driven by record count (legal fees, stock plummets or lost productivity). Record-specific costs are generally lower (call center and postage - and if you lose enough records, you don't even have to mail notices). So let's just call it BTUs per furlong and call it a day. And I don't think "customer losses" is as important in assessing the risk as "loss to the customer."
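To see why a flat per-record figure misleads, consider a toy cost model (the dollar amounts are my own invention, purely for illustration): a large fixed component for legal fees, PR and lost productivity, plus a small per-record component for notification and call-center work.

```python
FIXED_COSTS = 2_000_000.0  # assumed legal fees, PR, lost productivity
PER_RECORD = 5.0           # assumed postage and call-center cost per record

def cost_per_record(n_records: int) -> float:
    """Average cost per lost record for a breach of n_records."""
    return (FIXED_COSTS + PER_RECORD * n_records) / n_records

small_breach = cost_per_record(10_000)     # $205 per record
large_breach = cost_per_record(1_000_000)  # $7 per record
# The same incident machinery yields wildly different per-record
# "costs" depending on breach size - the quoted range is an artifact
# of dividing mostly fixed costs by a varying record count.
```

If fixed costs dominate, a per-record metric tells you more about the size of the breach than the cost of one.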

The next Forrester quote underlines the problem I have with the general corporate thinking about privacy breaches:
"Previously, when a company had a data breach, a response team would fix the problem and test the mitigation, then the company would resume normal activities. Now we have to spend time on public relations efforts, as well as assuring both customers and auditors that new processes are in place to guard against such breaches in the future."
The reason you could get away with just fixing it and moving on was that the company didn't lose anything it owned. What it lost was owned by its customers. Losing one bit of highly sensitive data about one litigious customer could cause more damage than a dozen laptops filled with the SSNs of 10 million people.

It's the "loss to the customer" that will drive your high dollar PR and legal efforts, which have scale, and can dwarf your call center and postage costs in an afternoon.

I'd like to take the data, rehash it according to type of breach, sensitivity of data and litigiousness of customer. Then I think you'd start on the road to a meaningful metric.

Friday, March 30, 2007

Auditing Privacy Part 2 - Risk Assessment of Data Loss


The easy way to assess privacy risks is to focus on the impact of data theft on the organization by treating the private data as a corporate asset. There are well-documented methods to identify the vulnerabilities in the means of collecting, storing and sharing the data. Similarly, there are methods to identify and list the threats to the data (hackers, "insiders," and negligent loss). The impacts will likely shake out along the lines of direct costs (postage, call center, other incident response costs), potential legal and regulatory actions, and reputation damage. (For an example, Protegrity assessed the TJX data breach at $1.7 billion; though TJX was not strictly a privacy issue, it has parallels*.)


This would be the easy way, but it may not produce the most accurate results. The problem lies in identifying the impacts of a privacy breach. The attribute of “privacy” assigned to the data is what makes the data valuable, and worthy of protection. However, "privacy" is not an attribute that belongs to the corporation, but to the individual the data describes. So an assessment of the risk to the corporation from a privacy loss should start by looking at the impact of the loss on the individual.


Why do many corporations, when disclosing losses of tremendous amounts of data, appear to suffer only short-term damage to their reputation? I posit that the potential damage to a corporation is proportional to the actual real damage to the privacy of the individuals described in the lost data. (See Guin v. Brazos.)


The real impact of a privacy incident on individuals has been hidden behind a cloud of security-vendor fear mongering and media-induced panic. The common problem with the data is the equation of data loss with a privacy breach. Identity theft, properly defined, is likely a higher-impact, lower-frequency event than is commonly reported.


The SB1386-style disclosure laws have been a boon to identifying the frequency of data loss, but the information that has to be disclosed does little to help identify the impact. An auditor concerned strictly with compliance would have to assign equal risk to any loss of private data. But the auditor should take the risk assessment to the next step and focus on the individuals, identifying the risks that lead to actual harm to the privacy of individuals. The compliance risk is equivalent for the loss of a laptop carrying an encrypted database of private data and for the same database being heisted, unencrypted, off a web server by a criminal with the intent to exploit the identities. The real risk to the privacy of the individuals described in the database is clearly different.
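The distinction can be sketched with invented weights (every number here is an assumption of mine, not an established actuarial figure): a compliance-only score counts lost records regardless of circumstances, while an individual-harm score discounts encrypted losses and weights attacker intent.

```python
def compliance_risk(records: int) -> float:
    # Disclosure-law view: every lost record of private data counts the same.
    return float(records)

def individual_harm_risk(records: int, encrypted: bool,
                         malicious_intent: bool) -> float:
    # Illustrative exploitation probabilities, invented for this sketch.
    if encrypted:
        p_exploit = 0.0001   # attacker must also break the crypto
    elif malicious_intent:
        p_exploit = 0.5      # stolen specifically to exploit identities
    else:
        p_exploit = 0.01     # misplaced media, probably destroyed
    return records * p_exploit

laptop = individual_harm_risk(100_000, encrypted=True, malicious_intent=False)
heist = individual_harm_risk(100_000, encrypted=False, malicious_intent=True)
# compliance_risk is identical for both incidents; the harm scores
# differ by orders of magnitude (~10 vs. 50,000 expected victims).
```

The point isn't the particular weights, it's that any harm-focused score separates the two incidents that a pure compliance score treats as identical.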


Beyond the risk of a data loss, the auditor should also consider the equally important risks of the collection of private data and the dossier-ification of data. More on that later.




*Why the high risk to TJX? Though not strictly a privacy issue, the damages related are an issue of a loss to a third party - the banks - rather than TJX itself.



"Some would call this good fortune" from s2art



Tuesday, March 27, 2007

Impacted Molars II


Occlusal
Panopticonistas Cyveillance say ID theft is so bad, we are all going to die. Seems like shutting down copyright scofflaws got a little too Web 1.0 for them, so they've unleashed their vicious crawling spiders on a search for contraband identities. And guess what they found out? EVERYBODY'S IDENTITY IS ALREADY PWN'D! Now that they've collected this data, I'm curious as to what they are going to do with all those credit card numbers, SSNs and mothers' maiden names. Did they help shut down the sites hosting the illicit data? Did they notify the victims? This sort of research is on odd ethical footing. I hope they get it all sorted before they do their research on other forms of digital contraband.

Distal
California Secretary of State Debra Bowen kicks ass in the name of privacy for Californians. She gets privacy, and maybe even cares about the citizens of California. I wish she could impart some of her knowledge to the Texas county clerks.

Mandibular
CDT publishes their draft Privacy Principles for Identification. Seem pretty much like Fair Information Practices to me, which is not necessarily a bad thing.



Fake Teeth Resting on Image of Monk courtesy jsdart

Thursday, March 22, 2007

Panopticon Enabled Desktops Increase Productivity!


From Dark Reading, the joys of workforce monitoring software with Ascentive!:

"We call it 'workforce activity management,'" says Schran. "Our latest edition provides all the insight necessary to eliminate time-wasting, increase productivity, and protect private company data."
Or, in the words of Ascentive's VP of Customer Relations Jeremy Bentham,

Morals reformed - health preserved - industry invigorated - instruction diffused - public burthens lightened - Economy seated, as it were, upon a rock - the gordian knot of Gramm Leach Bliley and Sarbanes-Oxley are not cut, but untied - all by a simple idea in Software Architecture!
More from Dark Reading:

Perhaps even more importantly, employee monitoring tools can deter workers from insider activities such as data theft or unauthorized file access, Schran adds. "If your employees are downloading files to a USB device, our software will record that action," he says. "Our data has already been used in evidentiary proceedings in court."


But I prefer the hot buzz on this product from their EU Product Evangelist Michel Foucault:

The heaviness of the old 'houses of security', with their fortress-like architecture, could be replaced by the simple, economic geometry of a 'house of certainty'. The efficiency of power, its constraining force have, in a sense, passed over to the other side - to the side of its surface of application. He who is subjected to a field of visibility, and who knows it, assumes responsibility for the constraints of power; he makes them play spontaneously upon himself; he inscribes in himself the power relation in which he simultaneously plays both roles; he becomes the principle of his own subjection. By this very fact, the external power may throw off its physical weight; it tends to the non-corporal; and, the more it approaches this limit, the more constant, profound and permanent are its effects: it is a perpetual victory that avoids any physical confrontation and which is always decided in advance.

And they say security software people don't read post-structuralist French philosophers. Heck, Foucault is all around you! I'm running a Jacques Derrida Packet Sniffer & Deconstructor right now! Or am I?

Tuesday, March 20, 2007

Auditing Privacy Part 1 - Ethics and the Canon

It would comfort many compliance auditors to discover the ultimate checklist and tear after their organization's privacy program, collecting tick marks and developing the dreaded deficiency finding. I say to them, "Google is your friend." For the more enlightened internal auditor, the first step in evaluating their organization's privacy practices should be a step back.

The Canon
There are best practices, and there are benchmarks. There are torts, laws, and rational fear of the irrational regulator. However, for almost every auditable area there is also The Canon. Take a file to the gilded crust of Sarbanes-Oxley and the PCAOB (and all their works and all their ways), and you eventually uncover the Generally Accepted Accounting Principles. Take a snowblower to the myriad layers of dust and ash of the Code of Federal Regulations, and if you squint and hold your head just right, you'll see a vague outline of the Decalogue. And somewhere below the ornate filigree and baroque ornamentation of HIPAA, Gramm-Leach-Bliley and SB1386 is the shape of the Fair Information Practices of the US Department of Health, Education and Welfare, 1973.

From the link above, here are the five practices of the modern privacy canon:

  1. Collection limitation
  2. Disclosure
  3. Secondary usage
  4. Record correction
  5. Security
These five principles will be your mantra for your audit. They will guide your questions and inform your issues. Advanced practitioners may choose from the following according to their path:

The AICPA's 10 Generally Accepted Privacy Principles

The OECD Guidelines on the Protection of Privacy and Transborder Flows of Personal Data


The Ethos
Like the Torah, the Sermons of Buddha, the Qur'an, the Gospels, or Fermat's Principle, a canon is only meaningful if applied. You must ask the CEO, the CIO, the Chief Marketeer, the General Counsel, and listen, and interpret their answers accordingly. Are the principles used as values to guide their decisions, obstacles to be worked around, or are they simply unknown? Read your corporate policies regarding privacy. Do you see in them evidence of the Fair Information Practices, or do they appear to be more oriented to a specific set of industry specific regulations? Interview the folks who handle the data. Do they treat the data with the care they would treat their own? The answers to these questions will begin to lead you to determining if your organization has the ethical basis for a privacy program.

What Does This Mean?
A compliance oriented organization may maintain reasonable concordance with Fair Information Practices without even knowing what they are. However, the organization may be reactive, and inefficient. The organization's privacy direction will be dictated by outside entities, rather than developed within.
An organization with a firm foundation in privacy practices, coupled with an ethical duty to privacy, will be more efficient, more effective, and will retain a better reputation in the face of an incident.

Tuesday, March 13, 2007

Charts 'n Graphs

From Pogo, this article from Physorg on the classic Evil Hacker v. Evil Suit dilemma. From the article:


If Phil Howard’s calculations prove true, by year’s end the 2 billionth personal record – some American’s social-security or credit-card number, academic grades or medical history – will become compromised, and it’s corporate America, not rogue hackers, who are primarily to blame. By his reckoning, electronic records in the United States are bleeding at the rate of 6 million a month in 2007, up some 200,000 a month from last year.


Goodness. This article seems to do more damage than good in increasing awareness of the privacy issue. The key bit of data that seems to be missing is the damage. More from the article:
Malicious intrusions by hackers make up a minority (31 percent) of 550 confirmed incidents between 1980 and 2006; 60 percent were attributable to organizational mismanagement such as missing or stolen hardware; the balance of 9 percent was due to unspecified breaches
So, how many fraudulent charges were made, fake IDs manufactured or reputations horribly disfigured by each category? The author of the study adds:

"And the surprising part is how much of those violations are organizationally prompted – they’re not about lone wolf hackers doing their thing with malicious intent."

So, would you rather Big Nameless Credit Card Company notify you:

A. that your name/credit card/SSN/date of birth were lost at an airport while stored on an encrypted laptop hard drive

OR

B. that Lone Wolf Hacker sniped your digits off their server (running unpatched IIS 2.0 on unpatched Win98)

Of course I can't prove that either scenario is inherently more dangerous for the consumer. I can just shake my angry fist at the data.

Thursday, March 8, 2007

SSN Panic, Texas Style


Here's the Computerworld run-down. And here's the Attorney General's letter (worth reading) and the proposed bill to change the law Texas HB 2061 so as all the county clerks don't get thrown in jail.

The AG letter says it fourteen different ways: NO, YOU CANNOT RELEASE SSNs, quoting an imperial raftload of laws, state and federal, on why, and on why you should even be asking the question. The clerks need to grab a big ol' Sharpie and start their redactin'. Shut down your infonet tube, and stop selling your goods to some skanky information brokers from the desolate wasteland known as "Not Texas." Good on the OAG. Shame on the collective elected doofi who are trying to find them an out.
I can only take solace in knowing the traditional efficiency and effectiveness of Our Lege.

This fiasco is an example of why privacy principles, rather than mere compliance, are important to an organization. Even if the Ft. Bend clerks were ignorant of the law, their actions reflected a disregard for the citizens they are charged to serve.

Tuesday, March 6, 2007

It's the Crime, Not the Tool


Tim Wilson at Dark Reading on IT Security: The New Big Brother:

"To identify potential insider threats, IT must monitor end users' behavior by scanning email, tracking network activity, and even watching employees for "trigger" events that might cause disgruntlement. Right now, I'm working on a story about ways corporations might monitor their employees outside the workplace to determine whether their out-of-office conduct might cause data leaks."
This is how the TSA dealt with the "insider threat" (i.e., passengers) on airplanes. Like the TSA, Mr. Wilson's focus appears to be on the tools that commit the crime (box cutters, e-mail, 3 oz. containers of fluid, USB drives) rather than the crime itself. Schneier has harped on this non-stop since 9/11. The proposed regime of surveillance will result in myriad false positives and employees as happy as your average passenger who has to remove his shoes and toss his shampoo and nail clippers into the trash at the security checkpoint.

In addition, what qualifies your IT security department to distinguish what is legitimate from what is suspicious? How many eyes does the CEO want looking at legitimate confidential traffic? This filtering and monitoring scheme seems to increase the risk of exposure rather than decrease it.

Part of the solution does not involve any IT at all. Supervisors supervise. Their job is to monitor employee activities. Managers should ensure this happens.

Another part is development of an ethical culture within the corporation, where people have a channel to report if someone is acting "hinky." Internal and external auditors and ethics officers play an important role in an ethical environment. All the monitoring software in the world couldn't have prevented Enron, but an internal auditor put a stop to it.

Monday, March 5, 2007

Privacy and Security Lessons from Criminal Enterprises: The Corner & PCI


Either you have heard the stories, or you have encountered first hand the difficulty of convincing an organization's leaders to take adequate precautions to ensure the privacy of identity-related data, and to maintain the integrity, confidentiality and availability of their information assets. Privacy and security have to be marketed to management, since privacy and security are "non-functional" without a "ROI." As a last-ditch effort, privacy and security can be pitched as a compliance effort: these activities must be performed to satisfy the requirements of an independent, potentially hostile third party.

Nonetheless, criminal organizations, which by definition care not one whit about compliance, and which have a vigorous appreciation of the bottom line, focus significant efforts on the privacy of personal data and the security of transactions and communications. For example, consider the following story of touts, runners, ground stashes* and the electronic processing of credit cards.

The typical drug transaction occurs thusly:

  • Junkie finds slinger. Junkie's selection may be based on the Slinger's reputation, effectiveness of the Touts, past business practices or location.
  • Slinger takes order, collects cash from Junkie.
  • Slinger signals the order to a Runner.
  • Runner distributes product to Junkie, either from minimum amount on person, or collected from ground stash.
  • Junkie moves on to consume product.
So the slinger is the payment processor, and the runner is the merchant. Both will be held accountable for inventory, and the separation of duties not only minimizes the compliance risk (i.e., being observed by law enforcement), but also provides an accounting control. The corner boy who put out the package knows that even if the slinger and the runner collude, the collusion will result in a wrong count at the end of the day.
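The end-of-day count described above can be sketched in a few lines. This is a hypothetical reconciliation of my own devising - the quantities are invented - but it shows why the control works: cash and inventory must move together, and neither the slinger (cash) nor the runner (product) controls both sides of the ledger.

```python
def count_reconciles(units_supplied: int, units_remaining: int,
                     unit_price: float, cash_collected: float) -> bool:
    """End-of-day check: cash collected must match units moved."""
    units_sold = units_supplied - units_remaining
    return cash_collected == units_sold * unit_price

# An honest day balances; skimming by a colluding slinger and runner
# shows up as a short count, since the supplied quantity is known
# to the corner boy who put out the package.
honest = count_reconciles(100, 40, 10.0, 600.0)   # balances
skimmed = count_reconciles(100, 40, 10.0, 550.0)  # wrong count
```

The same shape of control - independent custody of cash and goods, reconciled by a third party - is exactly what the bullet points below say is missing from card-not-present commerce.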

So what part of this transaction is so hard for folks like TJX to understand? A couple items to consider:
  • Although the merchant may mitigate risk by gaining distance from the transaction (Verified by Visa, PayPal), the merchant is more interested in the customers than the Slinger is in the Junkies. The merchant and the processor want to keep all that secondary data and compile it, and convert it into cash. The Slinger wants only not to get burned by a counterfeit bill.
  • No one is responsible for the "count" on credit card transactions. Unlike the corner, the matching of goods, customer and payment is out of order in electronic commerce, with each party shirking responsibility for the transaction.
  • Each has to deal with impostors, though. The seller of baking soda is the "phisher" of the drug trade.

Next, yelling "5-0" as an intrusion detection mechanism.


*Taken largely from Simon & Burns's terrific book The Corner, or most episodes of Simon's The Wire.

Sunday, February 25, 2007

Everyday Privacy and Security: The Drug Store

After a conversation with a friend, I thought I'd cite some examples of how privacy and security impact day-to-day life. Here's the first in the series; though I admit, fully dissecting the CMEA would take more time than I have. My ear is still ringing and Battlestar is on in 20 minutes.

The scenario:
Last week I went to see the doctor about my tendinitis and a persistent ringing in my right ear. I rarely go to the doctor, so you must take my word that these were annoying, persistent and painful conditions, resulting in grouchiness, restlessness, nonsensicalitude and Irritable Spouse Syndrome (ISS). I was processed through the HMO machine like a burger at Jack in the Box, with a shot of cortisone in my arm and an Rx for some OTC pseudoephedrine.

At Walgreens, I scan the aisles for Sudafed, a rare purchase since I'm not normally an allergy sufferer. I pick up a card for the store-branded Wal-Phed and head over to the pharmacy. The pharmacist asks for my driver's license. I show it to her, figuring it's an age requirement. She asks me to take it out of my wallet. I hand it to her, and she types my information into the cash register. She asks me to sign what looks like a receipt. What for? I'm paying cash. It's the law. It's for the Wal-Phed. So I pay her the $3.50 or so, grab the receipt and my license, and leave.

What Just Happened Here:
An ingredient in the Wal-Phed is used to manufacture bathtub methamphetamine (speed/crank). To stem this scourge, the Combat Methamphetamine Epidemic Act (CMEA, part of the USA PATRIOT Improvement and Reauthorization Act of 2005) placed additional controls on the retail sale of ephedrine, pseudoephedrine, and phenylpropanolamine.
Consumers have to show ID and be tracked by retailers so they get just enough to take care of a stuffy nose, but not enough to start up a meth lab. The retailers, in turn, have to protect the privacy of their congested customers according to the law, thusly:

C) PRIVACY PROTECTIONS.—In order to protect the privacy of individuals who purchase scheduled listed chemical products, the Attorney General shall by regulation establish restrictions on disclosure of information in logbooks under subparagraph (A)(iii). Such regulations shall— ‘‘(i) provide for the disclosure of the information as appropriate to the Attorney General and to State and local law enforcement agencies; and ‘‘(ii) prohibit accessing, using, or sharing information in the logbooks for any purpose other than to ensure compliance with this title or to facilitate a product recall to protect public health and safety.
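The purchase-limit side of this scheme is simple enough to sketch. The CMEA caps retail sales at 3.6 grams of pseudoephedrine base per buyer per day and 9 grams per 30-day period; the logbook data model and function below are hypothetical illustrations, not any retailer's actual system.

```python
from datetime import date, timedelta

# Assumed CMEA retail limits (grams of pseudoephedrine base).
DAILY_LIMIT_G = 3.6
THIRTY_DAY_LIMIT_G = 9.0

def may_sell(logbook, buyer_id, grams, today):
    """Check a proposed sale against prior logbook entries.

    logbook is a list of (buyer_id, sale_date, grams) tuples already
    recorded at the register -- the same log the statute above says
    must be kept private.
    """
    day_total = sum(g for b, d, g in logbook
                    if b == buyer_id and d == today)
    month_total = sum(g for b, d, g in logbook
                      if b == buyer_id and today - d <= timedelta(days=30))
    return (day_total + grams <= DAILY_LIMIT_G
            and month_total + grams <= THIRTY_DAY_LIMIT_G)

log = [("TX-DL-1234", date(2007, 2, 20), 2.4)]
assert may_sell(log, "TX-DL-1234", 2.4, date(2007, 2, 25))      # within limits
assert not may_sell(log, "TX-DL-1234", 2.4, date(2007, 2, 20))  # daily cap hit
```

Note that enforcing the cap is exactly what forces the retailer to keep the identifying data in the first place.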

The Data Walgreens Now Has On Me:
Well, my name and my Texas driver's license information (DOB, address, glasses wearer, motorcycle rider). According to the DEA website, I could also show my passport, or, if I were under 18, my report card. They also know that I bought Wal-Phed and paid cash.


What About the Data Now?
Good question. The CMEA states that the retailer has to keep it for 2 years. There is also a raft of conflicting state laws, some requiring the logbooks to be kept electronically. The retailers' association raises concerns regarding HIPAA, tracking consumer behavior (e.g., can Walgreens send me a coupon for Wal-Phed now?) and real-time tracking versus logbook maintenance. Ever since it went behind the counter, pseudoephedrine sales have decreased, so does it really matter anymore?
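The two-year retention window amounts to a purge rule. A minimal sketch, assuming the same hypothetical (buyer, date, grams) logbook entries as above:

```python
from datetime import date, timedelta

# Assumed CMEA retention window: roughly two years.
RETENTION = timedelta(days=2 * 365)

def purge(logbook, today):
    """Drop logbook entries older than the retention window."""
    return [(b, d, g) for (b, d, g) in logbook if today - d <= RETENTION]

log = [("A", date(2005, 1, 1), 2.4),   # stale, should be purged
       ("B", date(2007, 1, 1), 2.4)]   # recent, must be kept
assert purge(log, date(2007, 2, 25)) == [("B", date(2007, 1, 1), 2.4)]
```

Of course, nothing in the statute compels a retailer to purge at the two-year mark, which is part of the worry below.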

Everyday Privacy For Me?
  • Walgreens knows I ride a motorcycle because my ear rings.
  • This data from a cash transaction will be maintained for two years.
  • It may or may not be subject to any privacy rules, depending on when/if the DEA writes the regulation.
  • I may have no recourse if Walgreens decides to use the information in a way to which I haven't consented.
  • I may have no recourse if Walgreens loses, misplaces, or sells the information to unsavory third parties.