Top 257 Computing Quotes & Sayings - Page 5

Explore popular Computing quotes.
Last updated on December 11, 2024.
The Relativity theory, the Copernican upheaval, or any great scientific convulsion, leaves a new landscape. There is a period of stunned dreariness; then people begin, antlike, the building of a new human world. They soon forget the last disturbance. But from these shocks they derive a slightly augmented vocabulary, a new blind spot in their vision, a few new blepharospasms or tics, and perhaps a revised method of computing time.
Access to supercomputers. The science is well ahead of our ability to implement it. It's quite clear that if we could run our models at a higher resolution we could do a much better job - tomorrow - in terms of our seasonal and decadal predictions. It's so frustrating. We keep saying we need four times the computing power. We're talking just 10 or 20 million a year - dollars or pounds - which is tiny compared to the damage done by disasters. Yet it's a difficult argument to win.
You have as much computing power in your iPhone as was available at the time of the Apollo missions. But what is it being used for? It’s being used to throw angry birds at pigs; it’s being used to send pictures of your cat to people halfway around the world; it’s being used to check in as the virtual mayor of a virtual nowhere while you’re riding a subway from the nineteenth century.
I've found that in life, and certainly in music, it's all just a series of occurrences in which you are constantly assessing the potential outcome of your decisions based on past precedent. As an aside, this is apparently the defining factor in science that separates artificial intelligence from human consciousness. One thing that our brains are really good at is taking a bunch of answers and extrapolating the question, whereas in computing, you input a question and it will provide an answer.
Only the heart knows the correct answer. Most people think the heart is mushy and sentimental. But it's not. The heart is intuitive; it's holistic, it's contextual, it's relational. It doesn't have a win-lose orientation. It taps into the cosmic computer - the field of pure potentiality, pure knowledge, and infinite organizing power - and takes everything into account. At times it may not even seem rational, but the heart has a computing ability that is far more accurate and far more precise than anything within the limits of rational thought.
Right up till the 1980s, SF envisioned giant mainframe computers that ran everything remotely, that ingested huge amounts of information and regurgitated it in startling ways, and that behaved (or were programmed to behave) very much like human beings... Now we have 14-year-olds with more computing power on their desktops than existed in the entire world in 1960. But computers in fiction are still behaving in much the same way as they did in the Sixties. That's because in fiction [artificial intelligence] has to follow the laws of dramatic logic, just like human characters.
Meaning and value depend on human mind space and the commitment of time and energy by very smart people to a creative enterprise. And the time, energy, and brain power of smart, creative people are not abundant. These are the things that are scarce, and in some sense they become scarcer as the demand for these talents increases in proportion to the amount of abundant computing power available.
The NeXT purchase is too little too late. The Apple of the past was an innovative company that used software and hardware technology together to redefine the way people experienced computing. That Apple is already dead. Very adroit moves might be able to save the brand name. A company with the letters A-P-P-L-E in its name might survive, but it won't be the Apple of yore.
I think that it's extraordinarily important that we in computer science keep fun in computing. When it started out, it was an awful lot of fun. Of course, the paying customers got shafted every now and then, and after a while we began to take their complaints seriously. We began to feel as if we really were responsible for the successful, error-free perfect use of these machines. I don't think we are. I think we're responsible for stretching them, setting them off in new directions, and keeping fun in the house. I hope the field of computer science never loses its sense of fun.
Beauty is more important in computing than anywhere else in technology because software is so complicated. Beauty is the ultimate defense against complexity. ... The geniuses of the computer field, on the other hand, are the people with the keenest aesthetic senses, the ones who are capable of creating beauty. Beauty is decisive at every level: the most important interfaces, the most important programming languages, the winning algorithms are the beautiful ones.
The digital revolution is far more significant than the invention of writing or even of printing.
The better we get at getting better, the faster we will get better.
In 20 or 30 years, you’ll be able to hold in your hand as much computing knowledge as exists now in the whole city, or even the whole world.
The rate at which a person can mature is directly proportional to the embarrassment they can tolerate.
The key thing about all the world’s big problems is that they have to be dealt with collectively. If we don’t get collectively smarter, we’re doomed.
There is no reason products and services could not be swapped directly by consumers and producers through a system of direct exchange – essentially a massive barter economy. All it requires is some commonly used unit of account and adequate computing power to make sure all transactions could be settled immediately. People would pay each other electronically, without the payment being routed through anything that we would currently recognize as a bank. Central banks in their present form would no longer exist – nor would money.
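As a minimal illustrative sketch only (not drawn from the quote), the settlement model described above can be caricatured as a shared ledger denominated in a common unit of account, where a payment is final the instant both balances are updated and no bank sits in between; the Ledger class and its method names below are hypothetical, chosen purely for illustration.

```python
# Illustrative sketch (not from the quote): a toy direct-settlement ledger.
# All names here (Ledger, open_account, settle) are hypothetical.

class Ledger:
    """Balances held in a commonly used unit of account; no intermediary bank."""

    def __init__(self):
        self.balances = {}  # participant -> balance in the common unit of account

    def open_account(self, participant, balance=0.0):
        self.balances[participant] = balance

    def settle(self, payer, payee, amount):
        """Settle a payment immediately: debit the payer, credit the payee."""
        if self.balances.get(payer, 0.0) < amount:
            raise ValueError("insufficient balance; nothing is settled")
        self.balances[payer] -= amount
        self.balances[payee] += amount


# A consumer pays a producer directly, with instant settlement.
ledger = Ledger()
ledger.open_account("producer", 0.0)
ledger.open_account("consumer", 100.0)
ledger.settle("consumer", "producer", 25.0)
print(ledger.balances)  # {'producer': 25.0, 'consumer': 75.0}
```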
The way Moore's Law occurs in computing is really unprecedented in other walks of life. If the Boeing 747 obeyed Moore's Law, it would travel a million miles an hour, it would be shrunken down in size, and a trip to New York would cost about five dollars. Those enormous changes just aren't part of our everyday experience.
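To get a feel for the scale of that comparison, here is a rough back-of-the-envelope sketch (mine, not from the quote), assuming capability doubles every two years over roughly two decades and assuming a 747 baseline of about 600 mph and a few hundred dollars for a ticket to New York; every figure below is an assumption chosen for illustration.

```python
# Back-of-the-envelope sketch (not Myhrvold's own arithmetic) of the scale of
# change the quote gestures at. The baseline figures and time span are assumptions.

years = 22                    # assumed span of sustained doubling (early 70s to early 90s)
doubling_period = 2.0         # assumed: capability doubles roughly every two years
factor = 2 ** (years / doubling_period)   # 2**11 = 2048x over the span

cruise_speed_mph = 600        # assumed 747 cruise speed
fare_to_new_york = 400        # assumed ticket price in dollars

print(f"Improvement factor: roughly {factor:,.0f}x")
print(f"747 speed at that pace: roughly {cruise_speed_mph * factor / 1e6:.1f} million mph")
# Even a factor of ~100 would already push the fare down to a few dollars, so the
# quote's "about five dollars" is, if anything, conservative by this arithmetic.
print(f"Fare at that pace: under ${fare_to_new_york / factor:.2f}")
```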
Every disruptive innovation is powered by a simplifying technology, and then the technology has to get embedded in a different kind of a business model. The first two decades of digital computing were characterized by the huge mainframe computers that filled a whole room, and they had to be operated by PhD Computer Scientists. It took the engineers at IBM about four years to design these mainframe computers because there were no rules. It was an intuitive art and just by trial and error and experimentation they would evolve to a computer that worked.
If we were able to put every single solitary cancer cell that has a genomic - had their genome done in one place, we have the computing capacity to go in and look at what are the similarities and dissimilarities that make them work and don't work. And every expert will tell you, it is probably gonna exponentially increase the capacity to be able to find, A, cures, B, vaccines, and C, turn some cancers into chronic diseases, rather than it cost you your life.
The object of geometry in all its measuring and computing, is to ascertain with exactness the plan of the great Geometer, to penetrate the veil of material forms, and disclose the thoughts which lie beneath them. When our researches are successful, and when a generous and heaven-eyed inspiration has elevated us above humanity, and raised us triumphantly into the very presence, as it were, of the divine intellect, how instantly and entirely are human pride and vanity repressed, and, by a single glance at the glories of the infinite mind, are we humbled to the dust.
"Not fair? Oh, I'm sorry, I get this lovely laptop computing device when all you get is the ability to walk, control your hands, and know you'll survive until your eighteenth birthday." Then the kid was going, "Uh, I didn't mean..." But Tad wasn't done yet. While the whole class watched in horror, he put his hands through the metal support braces on the arms of his wheelchair and forced himself to stand up. Then he took a shaky little step to the side, gestured toward the chair, and said, "Why don't you take a turn with the laptop? You can even have my seat."