Friday, July 11, 2008

Psychology of Intelligence Analysis

In the course of working with some of Endeca's more interesting clients, I started reading up on how the intelligence agencies address the challenges of making decisions, especially in the face of incomplete and contradictory evidence. I ran into a book called Psychology of Intelligence Analysis by former CIA analyst Richards Heuer. The entire book is available online, or you can hunt down a hard copy of the out-of-print book from your favorite used book seller.

Given the mixed record of the intelligence agencies over the past few decades, you might be wondering if the CIA is the best source for learning how to analyze intelligence. But this book is a gem. Even if the agencies don't always practice what they preach (and the book makes a good case as to why), the book is an excellent tour through the literature on judgment and decision making.

If you're already familiar with work by Herb Simon, Danny Kahneman, and Amos Tversky, then a lot of the ground he covers will be familiar--especially the third of the book that enumerates cognitive biases. I'm a big fan of the judgment and decision making literature myself. But I still found some great nuggets, particularly Chapter 8 on Analysis of Competing Hypotheses. Unlike most of the literature, which focuses exclusively on demonstrating our systematic departures from rationality, Heuer hopes to offer at least some constructive advice.

As someone who builds tools to help people make decisions using information that may be not only incomplete and contradictory, but also challenging to find in the first place, I'm very sensitive to how people's cognitive biases affect their ability to use these tools effectively. One of the HCIR '07 presentations, by Jolie Martin and Michael Norton (who have worked with Max Bazerman), showed how the way information was partitioned on retail web sites drove decisions; that is, re-organizing the same information affected consumers' decision processes.

It may be tempting for us on the software side to wash our hands of our users' cognitive biases. But such an approach would be short-sighted. As Heuer shows in his well-researched book, people not only have cognitive biases, but are unable to counter those biases simply by being made aware of them. Hence, if software tools are to help people make effective decisions, it is the job of us tool builders to build with those biases in mind, and to support processes like Analysis of Competing Hypotheses that try to compensate for human bias.
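Heuer's procedure is straightforward to operationalize: lay out the competing hypotheses as columns and the evidence as rows, score how consistent each piece of evidence is with each hypothesis, and then rank the hypotheses by how much evidence contradicts them rather than by how much supports them. Here's a minimal sketch in Python of what such an ACH matrix might look like inside a tool; the hypotheses, evidence items, and scores are invented placeholders, not an example from the book.

```python
# A minimal sketch of an Analysis of Competing Hypotheses (ACH) matrix,
# loosely following the procedure in Heuer's Chapter 8. The hypotheses,
# evidence, and scores below are hypothetical placeholders.

# Following Heuer, hypotheses are ranked by how much evidence contradicts
# them, so consistent and neutral evidence contribute nothing to the score.
CONSISTENT, NEUTRAL, INCONSISTENT = 0, 0, -1

hypotheses = ["H1: insider leak", "H2: external breach", "H3: accident"]

# Each piece of evidence maps to one score per hypothesis, in order.
matrix = {
    "E1: off-hours login in access logs": [CONSISTENT, CONSISTENT, INCONSISTENT],
    "E2: no malware found on the servers": [CONSISTENT, INCONSISTENT, CONSISTENT],
    "E3: similar incident at a partner firm": [NEUTRAL, CONSISTENT, NEUTRAL],
}

def inconsistency_scores(matrix, hypotheses):
    """Sum each hypothesis's (negative) inconsistency scores across all evidence."""
    totals = [0] * len(hypotheses)
    for scores in matrix.values():
        for i, score in enumerate(scores):
            totals[i] += score
    return dict(zip(hypotheses, totals))

# The least-contradicted hypothesis ranks first. Ties are a signal to go
# hunting for more diagnostic evidence, not to pick a favorite.
ranking = sorted(inconsistency_scores(matrix, hypotheses).items(),
                 key=lambda pair: pair[1], reverse=True)
for hypothesis, score in ranking:
    print(f"{score:3d}  {hypothesis}")
```

The key design choice, following Heuer, is that confirming evidence contributes nothing to the ranking; only inconsistencies count, which guards against our natural tendency to pile up support for a favored hypothesis.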

3 comments:

Anonymous said...

Daniel,

I enjoyed reading that and look forward to your continued examination of these topics. I studied Heuer in the mid-1980s, when he had a tiny following of grad students and professors at the Naval Postgraduate School in Monterey (he would guest lecture from time to time). His writings had a huge impact on me, and I've often wondered why we humans have such a hard time learning those lessons. I wonder how Heuer would analyze the recent IndyMac meltdown, for example. In hindsight, the very smart leaders of that organization were making very dumb decisions, probably due to their biased assessment of what would happen in the future.

Anyway, this is a very important dialog, and I would appreciate you continuing to examine these thoughts. And if your examination results in conclusions that make their way into the IT enterprise, we will all benefit from them.

v/r,
Bob

Daniel Tunkelang said...

Bob,

Thanks for the kind words. While I've read many of the authors Heuer cites (benefiting in part from my exposure to some of the greats of judgment and decision making at CMU), I didn't realize until recently that they had such a following in the intelligence community.

The unfortunate paradox of researching cognitive biases is that we cannot overcome them through awareness alone. Despite all of my training, my instincts betray me unless I can recognize my own cognitive biases and overcome them through conscious rationality.

But as Heuer points out, there is hope. We're sometimes able to recognize others' biases even though we are blinded by our own--which suggests an effective way to work in teams. We can also explore different framings of problems to overcome anchoring and other biases induced by framing.

My own hope is that software tools can help us through their own immunity to cognitive bias (which suggests that an ideal decision support agent should not try to pass the Turing test). But that requires establishing enough rigor around our understanding of cognitive biases for our tools to serve effectively as rationality prostheses. I hope that vision will drive the next generation of R&D in decision support systems.

Daniel

David Fauth said...

Daniel,

As someone working in the intelligence community, I can report that Heuer's work was mentioned this past week as something that should be mandatory reading for all employees.

I too am interested in your continued work in this area.
