Prying Information Loose and Dealing with Loose Information

A sociology of information triptych this morning. Disclosure laws that fail to fulfill their manifest/intended function, the secret work of parsing public information, and the pending capacity to record everything all bear on the question of the relationship between states and information.

In a 21 Jan 2012 NYT article, “I Disclose … Nothing,” Elisabeth Rosenthal (@nytrosenthal) suggests that despite increasing disclosure mandates we may not, in fact, be more informed. Among the obviating forces: information overload; a dearth of interpretive expertise; the tendency of organizations to hide behind “you were told…”; formal rules that give organizations a blueprint for playing with technicalities (as, she notes, Republican PACs have done, using name changes and re-registration to “reset” their disclosure obligation clocks); routinization (as in the melodic litanies of side effects in drug adverts); and the simple fact that people are often in no position to act on information even when it is abundantly available and unambiguous. On the other side, the article notes that there is a whole “industry” out there — journalists, regulators, reporters — that can data-mine the disclosure information even if individuals cannot take advantage of it.

Rachel Martin’s (@rachelnpr) piece on NPR’s Weekend Edition Sunday, “CIA Tracks Public Information For The Private Eye,” describes almost the mirror image of this: how intelligence agencies are building infrastructure for finding patterns in, and making sense of, the gadzillions of bits of public information that just sit there for all to see. It’s another case that hints at an impossibility theorem about “connecting the dots” a priori.

And finally, in another NPR story, “Technological Innovations Help Dictators See All,” Rachel Martin interviews John Villasenor about his paper, “Recording Everything: Digital Storage as an Enabler of Authoritarian Governments,” on the idea that data storage has become so inexpensive that there is no reason for governments (they focus on authoritarian ones, but there is no reason to limit it to them) not to collect everything (even if, as the first two stories remind us, they may currently lack the capacity to do anything with it). I wonder whether surveillance uptake and data rot will prove to be competing tendencies.

The first piece suggests research questions: what are the variables that determine whether disclosure is “useful”? what features of disclosure rules generate cynical work-arounds? if “more is not always better,” what is? can we better theorize the relationship between “knowing,” openness, transparency, disclosure, and democracy than we have so far?

The second piece really cries out for an essay capturing the irony of how the information pajamas get turned inside out, with the spy agency trying to see what’s in front of everyone (we are reminded, in a perverse sort of way, of Poe’s “The Purloined Letter”). Perhaps we’ll no longer associate going “under cover” with the CIA.

And the alarm suggested in the third piece is yet another entry under what I (and maybe others) have called the informational inversion — when the generation, acquisition, and storage of information dominates by orders of magnitude our capacity to do anything with it.

Do Organizations that ‘Fess Up Do Better?

Geoffrey W. McCarthy, a retired chief medical officer for the V.A., wrote a letter to the NYT on 9 August, in response to an article on radiation overdoses in medical tests, about two approaches to how organizations manage information about organizational errors. He notes that the issue illustrates the contradictions between “risk management” and “patient (or passenger or client or consumer) safety.”

He notes that the risk manager will say “don’t disclose” and “don’t apologize” because these could put the organization at legal or financial risk. A culture of safety and organizational improvement, though, would say “fully disclose,” not because it will help the patient, but because it is a necessary component of organizational change. The organization has to admit the error if it is going to avoid repeating it, he asserts.

This suggests a number of sociology of information connections, but we’ll deal with just one here. This example points to an alternative to the conventional economic analysis of the value of information. The usual approach is to “price” the information in terms of who controls it and who could do what with it (akin to the risk manager’s thinking above). But here we see a process value — the organization itself might change if it discloses the information (independent, perhaps, of the conventional value of disclosure or non-disclosure). One could even imagine an alternative pricing scheme that says “sure, Mr. X might sue us, but by disclosing the information we are more likely to improve our systems in a manner that lets us avoid this mistake in the future (along with the risk it poses to us and the costs it might impose on society).” Why pour resources into hiding the truth rather than into using the information to effect change?

One rebuttal to this says that an organization can do both, and maybe so. Another would say that this is just mathematically equivalent to what would happen in litigation (perhaps through punitive damages).

But I think that Mr. McCarthy is onto something in terms of “information behaviors.” There are, I expect, a whole bunch of “internal externalities” associated with what we decide to do with information. In other places I’ve examined the relational implications of information behavior. This points to another family of effects: organizational. More to come on this.

Notification and the Public Sphere

Working today on the outline for a chapter on “notification and the public sphere.”  In previous chapters the focus was notification and the maintenance of relationships among individuals. In this chapter I look at the broader distribution of information in society and the institutions that give rise to it.

The raw material I am working with runs the gamut from sunshine and freedom of information laws, mandatory disclosure regulations, discovery in legal contexts, state-mandated notification, truth and reconciliation commissions, emergency warning systems, and diplomatic protocol to gag rules and privacy standards. Generically, I’m thinking of these as “information institutions.”

This is admittedly a big bucket of diverse phenomena; today’s work was a first stab at grouping and categorizing and discovering underlying dimensions that organize these things as manifestations of basic informational forms.

Here are my preliminary categories.

Sunshine, Stickers, Labels, and Report Cards. Laws and rules that say that the state and private and public actors cannot keep (all) secrets. Some of these are things like sunshine laws that promote accountability or combat corruption, others are disclosure rules that address information asymmetry between producers and consumers or between service providers and the public. This category resonates with the “is more information always better” posts that have appeared here previously.

Structured Honesty: Social Organization of Informational Equality. Being able to say “I don’t have to tell you” is an important manifestation of inequality with both material and symbolic consequences. In various forms, the capacity to maintain some control over the disposition of some information is widely recognized as a key component of autonomous personhood. This category includes institutions that collectively enforce (true) information sharing — from legal rules of discovery to truth commissions. It is, I think, distinct from the previous and next categories, but I’m still working on a rigorous way to distinguish them. The “democracy and the information order” posts that have appeared previously would fall into this category (6 August 2008, 20 September 2007, 22 May 2007, 11 March 2007).

The Social Organization of Omniscience (includes warning systems). These can be distinguished from the disclosure examples: in those cases, one entity either has the information and just needs to be compelled to release it, or controls access to the information and needs to be compelled to collect and release it. By contrast, this category includes cases where the information is dispersed and we organize a means to detect, aggregate, and channel it, or where a special channel is set aside so that one type of information (perhaps a rare one) can take precedence. Examples: ER doctors who must report abuse, abortion providers who must provide parental notification for minors, emergency warning systems (tornado, hurricane, tsunami), airport announcements that recruit everyone as a lookout for unattended bags (see also the post on children as spies).

Protocol. In diplomacy, for example, protocol strongly regulates who may speak with whom. As in computer communication protocols, these institutions allow us to tie systems together.

Socially Sanctioned Non-Telling. This is almost the opposite of the first category (leaving an interesting space in between) — secrets that are socially organized: gag rules and sealed agreements, trade secrets, intellectual property regimes, government classification systems (top secret, etc.), official secrets acts, privacy standards.

The Economy and Information: Does More Info Make the World a Better Place?

This week’s serial superlatives in things economic — each day the “events of recent days” were “the most stunning thing to happen since the thirties” — have led to lots and lots of hand-wringing and calls for new kinds or amounts of regulation. And a lot of what folks are saying has to do with information — more of it, in public, is what we need!

This brings to mind two things I’ve recently read. One is a 2007 book by Fung, Graham, and Weil called Full Disclosure: The Perils and Promise of Transparency (Cambridge University Press). It is a report on an empirical study of 18 cases of what they call “targeted transparency” — legislated requirements that corporations (or other private entities) disclose specific information so that the public can make informed choices about their products, services, etc. They looked at the history of disclosure as public policy, why it emerged when it did, whether it’s likely to continue to expand, and whether, and under what conditions, it works. In a nutshell, they conclude that it works well in some cases, not at all in others. The process is always political and it works when the results of the political process produce a system that is “user oriented” and “sustainable.” I’ll post a full review of the book here in the near future.

The other piece I was reminded of was by Malcolm Gladwell in a January 2007 New Yorker: “Open Secrets: Enron, intelligence, and the perils of too much information.” In it Gladwell builds on, among others, the work of Yale law professor Jonathan Macey who, in a review article about the Enron debacle, argued that the problem was not information that Enron hid, but the fact that no one could put together the puzzle pieces represented by the information it disclosed.

I recommend both Fung et al. and the Gladwell piece as grist for your thought mill this week.

See also this old post on the “is more better” question.