Top 24 Quotes & Sayings by Hans Christian von Baeyer

Explore popular quotes and sayings by Hans Christian von Baeyer.
Last updated on December 18, 2024.
Hans Christian von Baeyer

Hans Christian von Baeyer is Chancellor Professor of Physics at the College of William and Mary. His books include Information: The New Language of Science; Warmth Disperses and Time Passes: The History of Heat; and QBism: The Future of Quantum Physics.

Born: 1938
The smell of subjectivity clings to the mechanical definition of complexity as stubbornly as it sticks to the definition of information.
As every bookie knows instinctively, a number such as reliability - a qualitative rather than a quantitative measure - is needed to make the valuation of information practically useful.
We don't know what energy is, any more than we know what information is, but as a now robust scientific concept we can describe it in precise mathematical terms, and as a commodity we can measure, market, regulate and tax it.
If quantum communication and quantum computation are to flourish, a new information theory will have to be developed.
If the intensity of the material world is plotted along the horizontal axis, and the response of the human mind is on the vertical, the relation between the two is represented by the logarithmic curve. Could this rule provide a clue to the relationship between the objective measure of information, and our subjective perception of it?
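The rule alluded to here is conventionally known as the Weber-Fechner law. A standard statement, with the sensitivity constant k and the threshold intensity I_0 supplied purely for illustration, is:

$$ R = k \log\frac{I}{I_0} $$

Doubling the stimulus I adds the same fixed increment k log 2 to the response R, so each further increase in intensity is perceived less strongly than the last.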
Underneath the shifting appearances of the world as perceived by our unreliable senses, is there, or is there not, a bedrock of objective reality?
The solution of the Monty Hall problem hinges on the concept of information, and more specifically, on the relationship between added information and probability.
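To make the point concrete, here is a minimal Monty Hall simulation in Python (the function name and trial count are illustrative, not von Baeyer's). It shows how the host's added information shifts a switcher's chance of winning from 1/3 to 2/3:

```python
import random

def monty_hall(switch: bool, trials: int = 100_000) -> float:
    """Estimate the win probability for sticking vs. switching."""
    wins = 0
    for _ in range(trials):
        car = random.randrange(3)        # door hiding the car
        pick = random.randrange(3)       # contestant's first pick
        # Host opens a door that is neither the pick nor the car.
        opened = next(d for d in range(3) if d != pick and d != car)
        if switch:
            # Switch to the one remaining closed door.
            pick = next(d for d in range(3) if d != pick and d != opened)
        wins += (pick == car)
    return wins / trials

print(f"stick:  {monty_hall(switch=False):.3f}")   # ~0.333
print(f"switch: {monty_hall(switch=True):.3f}")    # ~0.667
```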
If you don't understand something, break it apart; reduce it to its components. Since they are simpler than the whole, you have a much better chance of understanding them; and when you have succeeded in doing that, put the whole thing back together again.
In fact, an information theory that leaves out the issue of noise turns out to have no content.
Numbers instill a feeling for the lie of the land, and furnish grist for the mathematical mill that is the physicist's principal tool.
This is not what I thought physics was about when I started out: I learned that the idea is to explain nature in terms of clearly understood mathematical laws; but perhaps comparisons are the best we can hope for.
To put it one way, a collection of Shakespeare's plays is richer than a phone book that uses the same number of letters; to put it another, the essence of information lies in the relationships among bits, not their sheer number.
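A rough way to see this is lossless compression, which exploits statistical relationships among symbols. This sketch, using Python's zlib with made-up sample strings, compares two texts of the same length; the repetitive one collapses to far fewer bytes because its characters carry less information:

```python
import zlib

prose = ("To be, or not to be, that is the question: whether 'tis nobler "
         "in the mind to suffer the slings and arrows of outrageous fortune, "
         "or to take arms against a sea of troubles and by opposing end them.")
phone_book = ("5551234 " * (len(prose) // 8 + 1))[:len(prose)]  # same length

for label, text in [("phone book", phone_book), ("prose", prose)]:
    compressed = len(zlib.compress(text.encode(), level=9))
    print(f"{label}: {len(text)} chars -> {compressed} bytes")
```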
Science has taught us that what we see and touch is not what is really there.
In order to understand information, we must define it; but in order to define it, we must first understand it. Where to start?
Entropy is not about speeds or positions of particles, the way temperature and pressure and volume are, but about our lack of information.
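The standard way to make this precise is the Gibbs-Shannon form of entropy, where the p_i are the probabilities we assign to the system's possible microstates, that is, a measure of what we do not know (the notation is supplied here for illustration):

$$ S = -k_B \sum_i p_i \ln p_i $$

When one microstate is certain (some p_i = 1), S vanishes; entropy, like our ignorance, is greatest when all microstates are equally likely.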
The switch from 'steam engines' to 'heat engines' signals the transition from engineering practice to theoretical science.
Paradox is the sharpest scalpel in the satchel of science. Nothing concentrates the mind as effectively, regardless of whether it pits two competing theories against each other, or theory against observation, or a compelling mathematical deduction against ordinary common sense.
Both induction and deduction, reasoning from the particular to the general and back again from the universal to the specific, form the essence of scientific thinking.
An electron is real; a probability is not.
Information gently but relentlessly drizzles down on us in an invisible, impalpable electric rain.
The problem of defining exactly what is meant by the signal velocity, which cropped up as long ago as 1907, has not been solved.
For generations, field guides to plants and animals have sharpened the pleasure of seeing by opening our minds to understanding. Now John Adam has filled a gap in that venerable genre with his painstaking but simple mathematical descriptions of familiar, mundane physical phenomena. This is nothing less than a mathematical field guide to inanimate nature.
Time has been called God's way of making sure that everything doesn't happen at once. In the same spirit, noise is Nature's way of making sure that we don't find out everything that happens. Noise, in short, is the protector of information.
Claude Shannon, the founder of information theory, invented a way to measure 'the amount of information' in a message without defining the word 'information' itself, nor even addressing the question of the meaning of the message.
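Shannon's measure can be written down in a few lines. This sketch (the function name is ours) computes the entropy of a message from symbol frequencies alone, with no reference to what the message means:

```python
from collections import Counter
from math import log2

def shannon_entropy(message: str) -> float:
    """Bits per symbol, computed purely from symbol frequencies."""
    counts = Counter(message)
    n = len(message)
    return -sum(c / n * log2(c / n) for c in counts.values())

print(shannon_entropy("aaaa"))            # 0.0 bits/symbol: fully predictable
print(shannon_entropy("abab"))            # 1.0 bit/symbol: two equally likely symbols
print(shannon_entropy("to be or not"))    # higher: more varied symbols
```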