Top 1200 Computer Graphics Quotes & Sayings

Explore popular Computer Graphics quotes.
Last updated on November 7, 2024.
There has to be the popcorn genre element, or I don't engage the same way. I like action and vehicle design and guns and computer graphics as much as I like allegory. It's a constant balancing game. I want audiences to be on this rollercoaster that fits the Hollywood mould, but I also want them to absorb my observations.
CGI is to me like watching a cartoon. It can be effective, if it's done well. A lot of times you don't feel any real risk. You're watching a bunch of computer-generated graphics.
At Harvard, I worked for some time as a researcher in a lab for computer graphics and spatial analysis, which is one of the birthplaces for what we do. — © Jack Dangermond
The visual team of 'Blade Runner' - one of the last big fantasy movies to be made without much computer graphics finery - worked directly for Scott, who sketched each of his prolific ideas on paper (they were called 'Ridley-grams').
Clearly, if we'd had the kind of computer graphics capability then that we have now, the Star Gate sequence would be much more complex than flat planes of light and color.
I play a lot of computer games. I love computer graphics. I've had Pixar in me for a long time.
Hollywood is a special place; a place filled with creative geniuses - actors, screenwriters, directors, sound engineers, computer graphics specialists, lighting experts and so on. Working together, great art happens. But in the end, all artists depend on diverse audiences who can enjoy, be inspired by and support their work.
No computer network with pretty graphics can ever replace the salespeople that make our society work.
The key questions will be: Are you good at working with intelligent machines or not? Are your skills a complement to the skills of the computer, or is the computer doing better without you? Worst of all, are you competing against the computer?
Since the beginning of the computer age, there has been immense development in computer intelligence but exactly zero development in computer consciousness.
Imagine you are writing an email. You are in front of the computer. You are operating the computer, clicking a mouse and typing on a keyboard, but the message will be sent to a human over the internet. So you are working before the computer, but with a human behind the computer.
There are about a dozen great computer graphics people and Jim Blinn is six of them.
... what is faked [by the computerization of image-making], of course, is not reality, but photographic reality, reality as seen by the camera lens. In other words, what computer graphics have (almost) achieved is not realism, but rather only photorealism - the ability to fake not our perceptual and bodily experience of reality but only its photographic image.
There are many innovators hard at work seeking to perfect alternatives to meat, milk, and eggs. These food products will, like computer-generated graphics or photography or sound systems, just keep getting better and better until there is little difference between an animal-based protein and a plant-based one, or farm-produced versus cultured meat. That will make it easy for people to make the kinds of choices that will usher in a world with far less violence.
An artist creates songs and timeless moments that are reflections that impact culture, and you can do that in any way - with guitars, ukulele, a computer. So, that will never die. It's always the artist behind the computer, not the computer.
My reading is extremely eclectic. Lately I've been teaching myself computer graphics, so I'm reading a lot about that. I read books of trivia, of facts.
I got my computer. The great thing about the computer is that you only need enough money to buy a computer and some food, and you're all right. I don't have to go to premières.
I was the first to advocate the Web. But I am very troubled by this thing that every kid must have a laptop computer. The kids are totally in the computer age. There's a whole new brain operation that's being moulded by the computer.
When I use a direct manipulation system whether for text editing, drawing pictures, or creating and playing games, I do think of myself not as using a computer but as doing the particular task. The computer is, in effect, invisible. The point cannot be overstressed: make the computer system invisible.
I think 'The Lost World' could've been a successful movie except for the fact that it pre-dated the good special effects and computer graphics. — © John Rhys-Davies
When I first started in the industry, there were - this is prior to the era of computer graphics and all these digital tools - there were some pretty rigid, technologically imposed limitations about how you shoot things, because if you didn't shoot 'em the right way, you couldn't make the shot work.
If you ask anybody at Cyber Command or look at any of the job listings for openings for their positions, you'll see that the one thing they don't prioritize is computer network defense. It's all about computer network attack and computer network exploitation at Cyber Command.
I think miniatures are still superior to a lot of computer graphics.
I came in during the era of models, motion control, and optical printers. ILM had just started its own computer graphics division, after the Lucasfilm computer division had been sold off and became Pixar.
Currently computer graphics are used a great deal, but it can be excessive.
I was asking questions which nobody else had asked before, because nobody else had actually looked at certain structures. Therefore, as I will tell, the advent of the computer, not as a computer but as a drawing machine, was for me a major event in my life. That's why I was motivated to participate in the birth of computer graphics, because for me computer graphics was a way of extending my hand, extending it and being able to draw things which my hand by itself, and the hands of nobody else before, would not have been able to represent.
If you need to do a movie where you have an army of 10,000 soldiers, that's a very difficult thing to shoot for real. It's very expensive, but as computer graphics techniques make that cheaper, it'll be more possible to make pictures on an epic scale, which we haven't really seen since the '50s and '60s.
When we considered what to do with the graphics capability of the Wii, we put more attention and focus on the ability to create new experiences rather than the quality of the graphics.
We're also looking a lot at graphics and video. We've done a lot on a deep technical level to make sure that the next version of Firefox will have all sorts of new graphics capabilities. And the move from audio to video is just exploding. So those areas in particular, mobile and graphics and video, are really important to making the Web today and tomorrow as open as it can be.
Until I reached my late teens, there was not enough money for luxuries - a holiday, a car, or a computer. I learned how to program a computer, in fact, by reading a book. I used to write down programs in a notebook and a few years later when we were able to buy a computer, I typed in my programs to see if they worked. They did. I was lucky.
... the reason we think that computer graphics technology has succeeded in faking reality is that we, over the course of the last hundred and fifty years, have come to accept the image of photography and film as reality.
I've never been much of a computer guy, at least in terms of playing with computers. Actually, until I was about 11 I didn't use a computer for preparing for games at all. Now, obviously, the computer is an important tool for me preparing for my games. I analyze when I'm on the computer, either my games or my opponents'. But mostly my own.
I've always been very interested in the question of how computation can fundamentally advance the things that we can see. This led me to have a fascination with medical imaging, especially things like MRI and scanning, and eventually computer graphics.
Our music is always, as you know, very spacey - computer graphics, music, images, lyrics, and visual art we make ourselves, or that we make with artists. And it's all synchronized.
A smartphone is a computer - it's not built using a computer - the job it does is the job of being a computer. So, everything we say about computers, that the software you run should be free - you should insist on that - applies to smart phones just the same. And likewise to those tablets.
I always loved both 'Breakout' and 'Asteroids' - I thought they were really good games. There was another game called 'Tempest' that I thought was really cool, and it represented a really hard technology. It's probably one of the only colour-vector screens that was used in the computer graphics field at that time.
With all the hype that computer graphics has been getting, everybody thinks there's nothing better than CGI, but I do get a lot of fan mail saying they prefer our films to anything with CGI in it. I'm grateful for that, and we made them on tight budgets, so they were considered B-pictures because of that. And, now here we are, and they've outlasted many so-called A-pictures.
I think the brain is essentially a computer and consciousness is like a computer program. It will cease to run when the computer is turned off. Theoretically, it could be re-created on a neural network, but that would be very difficult, as it would require all one's memories.
In view of all the deadly computer viruses that have been spreading lately, Weekend Update would like to remind you: when you link up to another computer, you're linking up to every computer that that computer has ever linked up to.
What is the central core of the subject [computer science]? What is it that distinguishes it from the separate subjects with which it is related? What is the linking thread which gathers these disparate branches into a single discipline? My answer to these questions is simple - it is the art of programming a computer. It is the art of designing efficient and elegant methods of getting a computer to solve problems, theoretical or practical, small or large, simple or complex. It is the art of translating this design into an effective and accurate computer program.
The attribution of intelligence to machines, crowds of fragments, or other nerd deities obscures more than it illuminates. When people are told that a computer is intelligent, they become prone to changing themselves in order to make the computer appear to work better, instead of demanding that the computer be changed to become more useful.
My biggest challenges when I first started out were not having a computer or camera or Wi-Fi! The computer and the camera had to be borrowed, and there were times that I used the computer at the library, and I literally sat outside people's houses to steal their Internet connections.
In the early 1970s, I headed to graduate school at the University of Utah and joined the pioneering program in computer graphics because I realized that's where I could combine my interests in art and computer science.
All problems in computer graphics can be solved with a matrix inversion. — © Jim Blinn
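Blinn's quip is only half a joke: inverting a transform matrix really is a routine step in graphics work, for instance when mapping a point from world space back into an object's local space. A minimal sketch of that standard operation, using NumPy (the model matrix here is a made-up example):

```python
import numpy as np

# Hypothetical 4x4 model matrix: a translation by (2, 0, 0).
model = np.array([
    [1.0, 0.0, 0.0, 2.0],
    [0.0, 1.0, 0.0, 0.0],
    [0.0, 0.0, 1.0, 0.0],
    [0.0, 0.0, 0.0, 1.0],
])

# A point in world space, in homogeneous coordinates.
world_point = np.array([3.0, 1.0, 0.0, 1.0])

# Inverting the model matrix maps the point back to object space.
object_point = np.linalg.inv(model) @ world_point
print(object_point)  # [1. 1. 0. 1.]
```

Undoing the (2, 0, 0) translation takes the world-space x coordinate from 3 back to 1, leaving the other coordinates unchanged.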
Loads of computer graphics equals a terrible video in my book.
While Hollywood's computer graphics quality is world-class, their production pipelines are a mess of non-standard tools and labor-intensive processes driven by the mantra of maximizing quality regardless of cost. It's very Balkanized.
The only thing I do on a computer is play Texas Hold 'Em, really. Obviously my cell phone is a computer. My car is a computer. I'm on computers every day without actively seeking them out.
The original AMD GCN architecture allowed for one source of graphics commands, and two sources of compute commands. For PS4, we've worked with AMD to increase the limit to 64 sources of compute commands - the idea is if you have some asynchronous compute you want to perform, you put commands in one of these 64 queues, and then there are multiple levels of arbitration in the hardware to determine what runs, how it runs, and when it runs, alongside the graphics that's in the system.
ILM was the first company that I had worked at that had a computer-graphics division.
A family living at the poverty level is unlikely to be able to afford a computer at home. Even with a computer, access to the Internet is another significant expense. A child might borrow a book from a public library; but it is not possible to take a computer home.
Stanley Kubrick knew we had good graphics around MIT and came to my lab to find out how to do it. We had some really good stuff. I was very impressed with Kubrick; he knew all the graphics work I had ever heard of, and probably more.
What I proposed was a computer that would be easy to use, mix text and graphics, and sell for about $1,000. Steve Jobs said that it was a crazy idea, that it would never sell, and we didn't want anything like it. He tried to shoot the project down.
The best graphics are about the useful and important, about life and death, about the universe. Beautiful graphics do not traffic with the trivial.
I think a nerd is a person who uses the telephone to talk to other people about telephones. And a computer nerd therefore is somebody who uses a computer in order to use a computer.
I worked for seven years doing computer graphics to pay my way through graduate school - I have no romance with computer work. There's no amount of phony graphics and things making sound effects on the screen that can change that.
My particular aesthetic of light and color and design wouldn't change as a result of working with computer graphics rather than with slit scan or miniatures. — © Douglas Trumbull
Every time you turn on your new car, you're turning on 20 microprocessors. Every time you use an ATM, you're using a computer. Every time I use a set-top box or game machine, I'm using a computer. The only computer you don't know how to work is your Microsoft computer, right?
Innovate, integrate, innovate, integrate, that's the way the industry works, ... Graphics was a stand-alone graphics card; then it's going to be a stand-alone graphics chip; and then part of that's going to get integrated into the main CPU.
What, then, is the basic difference between today's computer and an intelligent being? It is that the computer can be made to see but not to perceive. What matters here is not that the computer is without consciousness but that thus far it is incapable of the spontaneous grasp of pattern - a capacity essential to perception and intelligence.