Top 257 Computing Quotes & Sayings - Page 4

Explore popular Computing quotes.
Last updated on April 21, 2025.
If you think about computing, there isn't just one way to compute, just like there's not just one way to move around. You can have shoes, you can have a car, you can have a bicycle, submarine, rocket, plane, train, glider, whatever. Because you have one doesn't mean you get rid of another one... But PCs continue to be important.
Items of interest will be located, identified, monitored, and remotely controlled through technologies such as radio-frequency identification, sensor networks, tiny embedded servers, and energy harvesters - all connected to the next-generation internet using abundant, low-cost, and high-power computing.
The infrastructure of today's computing environment is for good or ill structured around the Windows operating system. Can you get around that? Yes you can, but you have to work harder to do that and most individuals and most corporations are not going to make that extra effort.
[The] dynamics of computational artifacts extend beyond the interface narrowly defined, to relations of people with each other and to the place of computing in their ongoing activities. System design, it follows, must include not only the design of innovative technologies, but their artful integration with the rest of the social and material world.
The increasing presence of cloud computing and mobile smart phones is driving the digitization of everything across both consumer and enterprise domains. It is hard to imagine any area of human activity which is not being reengineered under this influence, either at present or in the very near future.
The computing world is very good at things that we are not. It is very good at memory. — Eric Schmidt
In the 'Disruptive Broadcasting' space, TV on IP networks is now just another application in a broadband world. We have already seen the transformation of the computing and communications industry with respect to traditional telecom. Now, history is repeating itself with traditional broadcasting.
Gene therapy technology is much like computing technology. We had to build the supercomputer which cost $8 million in 1960. Now everyone has technologies that work predictably and at a cost the average person can afford.
People constantly face problems they've never seen before, and they have to solve them somehow. So a million people come up with a million solutions that are just a little bit different. If computing is being done by fewer resources, there will be enormous security gains by pushing things into standard practices.
The best programs are written so that computing machines can perform them quickly and so that human beings can understand them clearly. A programmer is ideally an essayist who works with traditional aesthetic and literary forms as well as mathematical concepts, to communicate the way that an algorithm works and to convince a reader that the results will be correct.
Every kid coming out of Harvard, every kid coming out of school now thinks he can be the next Mark Zuckerberg, and with these new technologies like cloud computing, he actually has a shot.
I don't think it's a lack of will. I think it's an issue of what people view as constitutional rights under the Fourth Amendment, number one, and what customers and business partners expect around the world from secure computing systems. And it's a difference of view.
Cloud computing offers individuals access to data and applications from nearly any point of access to the Internet, offers businesses a whole new way to cut costs for technical infrastructure, and offers big computer companies a potentially giant market for hardware and services.
These are the men who, without virtue, labour, or hazard, are growing rich, as their country is impoverished; they rejoice, when obstinacy or ambition adds another year to slaughter and devastation; and laugh, from their desks, at bravery and science, while they are adding figure to figure, and cipher to cipher, hoping for a new contract from a new armament, and computing the profits of a siege or tempest.
The most important application of quantum computing in the future is likely to be a computer simulation of quantum systems, because that's an application where we know for sure that quantum systems in general cannot be efficiently simulated on a classical computer.
Alan Turing gave us a mathematical model of digital computing that has completely withstood the test of time. He gave us a very, very clear description that was truly prophetic.
Essentially, we're always trying to reduce latency. As you try to reduce the latency of the experience, you can only get it down so far before we start running into the limitations of game engines, computing, the intensity of the experience you're trying to compute.
The other terror that scares us from self-trust is our consistency; a reverence for our past act or word, because the eyes of others have no other data for computing our orbit than our past acts, and we are loath to disappoint them.
No one has a monopoly on knowledge the way that, say, IBM had in the 1960s in computing, or that Bell Labs had through the 1970s in communications. When useful knowledge exists in companies of all sizes and also in universities, non-profits and individual minds, it makes sense to orient your innovation efforts to accessing, building upon and integrating that external knowledge into useful products and services.
With the advent of computing, human invention crossed a threshold into a world different from everything that came before. The computer is the universal machine almost by definition, machine-of-all-trades, capable of accomplishing or simulating just about any task that can be logically defined.
Computer games tend to be boys' games, warlike games with more violence. We have not spent enough time thinking through how to encourage more girls to be involved in computing before coming to college so they can see a possible career in information technology.
We couldn't build quantum computers unless the universe were quantum and computing. We can build such machines because the universe is storing and processing information in the quantum realm. When we build quantum computers, we're hijacking that underlying computation in order to make it do things we want: little and/or/not calculations. We're hacking into the universe.
Many jobs at Google require math, computing, and coding skills, so if your good grades truly reflect skills in those areas that you can apply, it would be an advantage. But Google has its eyes on much more.
Computing has gone from something tiny and specialized to something that affects every walk of life. It doesn't make sense anymore to think of it as just one discipline. I expect to see separate departments of user interface, for example, to start emerging at universities.
By 2020, most home computers will have the computing power of a human brain. That doesn't mean that they are brains, but it means that in terms of raw processing, they can process bits as fast as a brain can. So the question is, how far behind that is the development of a machine that's as smart as we are?
In some far-off distant time, when the twentieth-century history of primitive computing is just a murky memory, someone is likely to suppose that devices known as logic gates were named after the famous co-founder of Microsoft Corporation.
The 21st century has more potential than perhaps any other in our brief evolutionary history. We stand on the cusp of computing, genetic and energy generation breakthroughs that were only recently in the realm of science-fiction. A golden age of humanity is tantalisingly within our grasp.
The Internet is a computing platform built on top of core technology. Applied technology is what gets built on top of that: It's Web services.
What we're really trying to do is have heterogeneous systems really become the foundation of our computing going forward. And that's the idea that you make every processor and every accelerator a peer processor.
Before 'Dilbert,' I tried to become a computer programmer. In the early days of computing, I bought this big, heavy, portable computer for my house. I spent two years nights and weekends trying to write games that I thought I would sell. Turns out I'm not that good a programmer, so that was two years that didn't work out.
Augmented reality will drive all things like chat, social networking, photos, videos, organizing data, modeling, painting, motion capture, and visual programming. Every form of computing will be combined together and unified in a single platform.
What happens to people like myself, who have been involved with computing for a long time, is that you begin to see how many of the 'new' ideas are simply old ones coming back into view on the swing of the pendulum, with new and faster hardware to back it up.
The hope is that, in not too many years, human brains and computing machines will be coupled together very tightly, and that the resulting partnership will think as no human brain has ever thought and process data in a way not approached by the information-handling machines we know today.
Such is modern computing: everything simple is made too complicated because it's easy to fiddle with; everything complicated stays complicated because it's hard to fix.
For me, it matters that we drive technology as an equalizing force, as an enabler for everyone around the world. Which is why I do want Google to see, push, and invest more in making sure computing is more accessible, connectivity is more accessible.
The human brain became large by natural selection (who knows why, but presumably for good cause). Yet surely most "things" now done by our brains, and essential both to our cultures and to our very survival, are epiphenomena of the computing power of this machine, not genetically grounded Darwinian entities created specifically by natural selection for their current function.
At every juncture, advanced tools have been the key to a new wave of applications, and each wave of applications has been key to driving computing to the next level.
Much of my work has come from being lazy. I didn't like writing programs, and so, when I was working on the IBM 701 (an early computer), writing programs for computing missile trajectories, I started work on a programming system to make it easier to write programs.
Computing is not about computers any more. It is about living. Whatever big problem you can imagine, from world peace to the environment to hunger to poverty, the solution always includes education... We need to depend more on peer-to-peer and self-driven learning. The laptop is one important means of doing that.
I think the increased ubiquity of the internet and networked computing in general allowed me to have some tether no matter where I was geographically. I could log in to a computer from anywhere in the world and access the same information and the same people. It allowed me to transcend the physical differences.
If everybody would agree that their current reality is A reality, and that what we essentially share is our capacity for constructing a reality, then perhaps we could all agree on a meta-agreement for computing a reality that would mean survival and dignity for everyone on the planet, rather than each group being sold on a particular way of doing things.
Here is what the world looked like in 2000... there were no plug and play solutions for ecommerce/warehouse management and customer service that could scale... which means that we had to employ 40+ engineers. Cloud computing did not exist, which means that we had to have a server farm and several IT people to ensure that the site did not go down.
Did Google know much about media? Or Amazon about commerce? Tesla about cars? SpaceX about rockets? EBay about classifieds? What did I know about computing when I started Sun Microsystems? We should celebrate these entrepreneurs, not pillory them for fighting entrenched incumbent industries that have political influence and money.
Software unification. So that I no longer care what computing device I pick up, whether it's a laptop or desktop, whether it's one I own or one in a public place, whether it has a small screen or a large screen.
Instruction tables will have to be made up by mathematicians with computing experience and perhaps a certain puzzle-solving ability. There need be no real danger of it ever becoming a drudge, for any processes that are quite mechanical may be turned over to the machine itself.
Millennials, and the generations that follow, are shaping technology. This generation has grown up with computing in the palm of their hands. They are more socially and globally connected through mobile Internet devices than any prior generation. And they don't question; they just learn.
Microsoft and Dell have been building, implementing and operating massive cloud operations for years. Now we are extending our longstanding partnership to help usher in the new era of cloud computing, by giving customers and partners the ability to deploy the Windows Azure platform in their own datacenters.
In the world of computers and just devices in general, the lifespan, or the shelf life, is relatively short just because technology moves so fast and the costs drop so quickly and the power, whether it's computing power or memory, rises very, very quickly.
If our American women are going to work to put food on the table and pay for the mortgage, then we better make sure that they get put into jobs that pay well and that pay their worth. That's why I'm such a huge advocate about computing jobs, because those are the jobs.
When you have a large amount of data that is labeled so a computer knows what it means, and you have a large amount of computing power, and you're trying to find patterns in that data, we've found that deep learning is unbeatable.
I can see a day soon where you'll create your own college degree by taking the best online courses from the best professors from around the world - some computing from Stanford, some entrepreneurship from Wharton, some ethics from Brandeis, some literature from Edinburgh - paying only the nominal fee for the certificates of completion.
Productivity is grounded in the PC. Where does the computing power come from? How would you run 'USA Today' without PCs? Run a hospital without PCs? People don't want products, they want solutions.
Personal computing today is a rich ecosystem encompassing massive PC-based data centers, notebook and Tablet PCs, handheld devices, and smart cell phones. It has expanded from the desktop and the data center to wherever people need it - at their desks, in a meeting, on the road or even in the air.
A lot of the progress in machine learning - and this is an unpopular opinion in academia - is driven by an increase in both computing power and data. An analogy is to building a space rocket: You need a huge rocket engine, and you need a lot of fuel.
As devices continue to shrink and voice recognition and other kinds of alternative user-interfaces become more practical, it is going to change how we interact with computing devices. They may fade into the background and just be around, allowing us to talk to them just as we would some other trusted companion.
I can't think of anything that isn't cloud computing with all of these announcements. The computer industry is the only industry that is more fashion-driven than women's fashion. Maybe I'm an idiot, but I have no idea what anyone is talking about. What is it?
Cloud computing, smartphones, social media platforms, and Internet of Things devices have already transformed how we communicate, work, shop, and socialize. These technologies gather unprecedented data streams leading to formidable challenges around privacy, profiling, manipulation, and personal safety.
The Eee Pad Transformer Prime is a category-defining product. Powered by Tegra 3, it launches us into a new era of mobile computing, in which quad-core performance and super energy-efficiency provide capabilities never available before. With Transformer Prime, ASUS has once again led the industry into the next generation.
It is easy to predict that some of the discoveries of research directed towards Grand Challenges - but only the most unexpected ones, and at the most unexpected times - will be the basis of revolutionary improvements in the way that we exploit the power of our future computing devices.
Considering that we live in an era of evolutionary everything (evolutionary biology, evolutionary medicine, evolutionary ecology, evolutionary psychology, evolutionary economics, evolutionary computing), it was surprising how rarely people thought in evolutionary terms. It was a human blind spot. We look at the world around us as a snapshot when it was really a movie, constantly changing.