A Quote by I. J. Good

When discussing complex systems like brains and other societies, it is easy to oversimplify: I call this Occam's lobotomy. — I. J. Good
Some brains are easy to hack into, and other brains are nearly impossible to hack into because they are so complex.
Any attempt to reduce the complex properties of biological organisms or of nervous systems or of human brains to simple physical and chemical systems is foolish.
Societies would _not_ be better off if everyone were like Mr Spock, all rationality and no emotion. Instead, a balance - a teaming up of the internal rivals - is optimal for brains. ... Some balance of the emotional and rational systems is needed, and that balance may already be optimized by natural selection in human brains.
Technically speaking, since our complex societies are highly susceptible to interferences and accidents, they certainly offer ideal opportunities for a prompt disruption of normal activities. These disruptions can, with minimum expense, have considerably destructive consequences. Global terrorism is extreme both in its lack of realistic goals and in its cynical exploitation of the vulnerability of complex systems.
I finished by saying that it struck me that all the ethical systems I was discussing were after the fact. That is, that people act as they are disposed to, but they like to feel afterwards that they were right and so they invent systems that approve of their dispositions.
Public figures talk and act as if environmental change will be linear and gradual. But the Earth's systems are highly complex, and complex systems do not respond to pressure in linear ways.
If there's one thing government needs desperately, it's the ability to quickly try something, pivot when necessary, and build complex systems by starting with simple systems that work and evolving from there, not the other way around.
So Occam's razor - Occam says you should choose the explanation that is most simple and straightforward - leads me more to believe in God than in the multiverse, which seems quite a stretch of the imagination.
We use and need to use both systems in complex political societies, and we oscillate in our commitments, because both oligarchy and rule by the will of masses have their bad points, as the ancient philosophers all knew.
We must adjust our value systems and work to modify today's societies, in which economic interests are carried to the extreme and irrationally produce not merely objects, but weapons of war. These societies don't care about the destruction of the planet and mankind as long as they earn profits - it can't go on like this.
What is important is that complex systems, richly cross-connected internally, have complex behaviours, and that these behaviours can be goal-seeking in complex patterns.
There are a few societies that show signs of having been very rational about the physics of construction and the physics of real life. Some of the old middle-Eastern societies had downdraft systems over whole cities, and passive, rapid-evaporation ice-making systems. They were rational people using good physical principles to make themselves comfortable without additional sources of energy.
One can expect the human race to continue attempting systems just within or just beyond our reach; and software systems are perhaps the most intricate and complex of man's handiworks. The management of this complex craft will demand our best use of new languages and systems, our best adaptation of proven engineering management methods, liberal doses of common sense, and a God-given humility to recognize our fallibility and limitations.
Managers are not confronted with problems that are independent of each other, but with dynamic situations that consist of complex systems of changing problems that interact with each other. I call such situations messes. Problems are extracted from messes by analysis. Managers do not solve problems, they manage messes.
The more we learn of the true nature of non-human animals, especially those with complex brains and corresponding complex social behavior, the more ethical concerns are raised regarding their use in the service of man - whether this be in entertainment, as "pets," for food, in research laboratories, or any of the other uses to which we subject them.
As we build systems that are more and more complex, we make more and more subtle but very high-impact mistakes. As we use computers for more things and as we build more complex systems, this problem of unreliability and insecurity is actually getting worse, with no real sign of abating anytime soon.