A Quote by Craig R. Barrett

Innovate, integrate, innovate, integrate, that's the way the industry works, ... Graphics was a stand-alone graphics card; then it's going to be a stand-alone graphics chip; and then part of that's going to get integrated into the main CPU.
Power, of course, is very important, but really, there's a bundle of technologies you have to have to make sure they're highly integrated, so you have to have a modem, you have to have connectivity, you have to have GPS, you have to have graphics, you have to have a CPU.
We're also looking a lot at graphics and video. We've done a lot on a deep technical level to make sure that the next version of Firefox will have all sorts of new graphics capabilities. And the move from audio to video is just exploding. So those areas in particular, mobile and graphics and video, are really important to making the Web today and tomorrow as open as it can be.
It is difficult to make good scalable use of a CPU like you can of a graphics card. You certainly don't want 'better or worse' physics or AI in your game.
When we considered what to do with the graphics capability of the Wii, we put more attention and focus on the ability to create new experiences rather than the quality of the graphics.
I worked for seven years doing computer graphics to pay my way through graduate school - I have no romance with computer work. There's no amount of phony graphics and things making sound effects on the screen that can change that.
The best graphics are about the useful and important, about life and death, about the universe. Beautiful graphics do not traffic with the trivial.
When I step back and look at what's important to AMD, it's about graphics leadership - visual computing leadership - as well as a strong computing experience. We have the capability to integrate those two together.
The original AMD GCN architecture allowed for one source of graphics commands, and two sources of compute commands. For PS4, we've worked with AMD to increase the limit to 64 sources of compute commands - the idea is if you have some asynchronous compute you want to perform, you put commands in one of these 64 queues, and then there are multiple levels of arbitration in the hardware to determine what runs, how it runs, and when it runs, alongside the graphics that's in the system.
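The queue-and-arbitration idea described in that quote can be sketched as a toy model. The C++ below is purely illustrative and does not use any real PS4 or GNM API: the Command struct, its priority field, and the single-level arbiter are inventions of this sketch, and the hardware described above arbitrates at multiple levels while running compute alongside graphics.

    // Toy model of 64 compute-command queues with a simple arbiter.
    // Not the PS4/GNM API; names and priority scheme are illustrative.
    #include <array>
    #include <cstdio>
    #include <queue>
    #include <string>

    struct Command {
        std::string name;
        int priority;  // lower value = more urgent (an assumption of this sketch)
    };

    int main() {
        constexpr int kComputeQueues = 64;  // the limit quoted above
        std::array<std::queue<Command>, kComputeQueues> queues;

        // The application drops asynchronous compute work into any queue.
        queues[0].push({"particle_update", 2});
        queues[7].push({"light_culling", 1});
        queues[63].push({"post_blur", 3});

        // A toy arbiter: repeatedly pick the most urgent front command
        // across all queues and "dispatch" it. Real arbitration is
        // multi-level and happens in hardware; this models only the
        // selection idea.
        while (true) {
            int best = -1;
            for (int i = 0; i < kComputeQueues; ++i) {
                if (!queues[i].empty() &&
                    (best < 0 || queues[i].front().priority <
                                     queues[best].front().priority)) {
                    best = i;
                }
            }
            if (best < 0) break;  // all queues drained
            std::printf("dispatch %s from queue %d\n",
                        queues[best].front().name.c_str(), best);
            queues[best].pop();
        }
        return 0;
    }

Running the sketch dispatches light_culling, then particle_update, then post_blur: which queue a command sits in does not decide when it runs; the arbiter does, which is the point of the design being described.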
Stanley Kubrick knew we had good graphics around MIT and came to my lab to find out how to do it. We had some really good stuff. I was very impressed with Kubrick; he knew all the graphics work I had ever heard of, and probably more.
My philosophy is, I can't make every product that can possibly use a high-performance CPU and graphics. Why shouldn't I enable others, in a positive fashion, to leverage AMD IP in more places?
Even if I have to stand alone, I will not be afraid to stand alone. I'm going to fight for you. I'm going to fight for what's right. I'm going to fight to hold people accountable.
What I've found is that a lot of people in the media industry tend to use Macs because they're so good for graphics and music.
First we thought the PC was a calculator. Then we found out how to turn numbers into letters with ASCII — and we thought it was a typewriter. Then we discovered graphics, and we thought it was a television. With the World Wide Web, we've realized it's a brochure.
Public discussions are part of what it takes to make changes in the trillions of graphics published each year.
All of these technologies that we are putting together... our memory technology, our CPU, our graphics architecture, our GPUs - all that is being applied to where the data is. You can almost predict where Intel will be in the future. It will be where data resides.