There's a nifty little article in today's Chronicle of Higher Education about a biologist thinking about national security. You can find it here.
The thrust of the article is that natural creatures (the subject of the article is apparently very fond of the octopus as an example) have defense systems that are decentralized and adaptable. They don't rely on central control, but on distributed systems that can autonomously react to new situations, including situations that haven't been thought of yet. It's a very effective approach.
This idea isn't a new one. Others studying organizational effectiveness have demonstrated that you're much better off having disparate, interdisciplinary groups working on problems than having rigid silos all controlled from the top. Studies of creativity have found that mixing people from very different backgrounds in the same space, with few constraints, can produce amazing results.
The problem isn't that these ideas aren't right. The problem is that they run counter to two very powerful forces, one common to the modern human condition, the other inherent in any system that has a governance structure.
The first problem, put simply, is trust. In order for a distributed, flexible, autonomous system to work, we have to trust each other. In particular, I have to trust that Team A, sent out to secure area X or deal with problem Y, will do just fine without someone checking in on them constantly, making them write reports, and otherwise interfering with their work. "Accountability" is the positive spin we've put on this, but anybody who's worked in a large organization knows that "accountability systems" often stifle both creativity and effective work.
Then there's the governance problem. Organizations that involve some kind of hierarchy are almost invariably self-selecting. The people who aspire to positions of "leadership" are, far too often, those whose motive is to be "in charge". They enjoy telling other people what to do, or they think they know better than others how those others should do their jobs. I have worked for some egregious examples over my career; I'm sure many of you have, too.
So when you go to an organization - a government, a corporation, an "institution" - and tell it to develop a distributed system of autonomous responders, you're telling those same people to give up their power, their control, their authority. And it is that power that, very often, motivated them to get where they are in the first place. So they scoff. They find reasons why they shouldn't do it. They hide behind high-minded arguments about accountability and tradition. And the result is a lot of wasted effort and really lousy outcomes.
At least the biologists can tell us why.