A Quote by Fei-Fei Li

When I was a graduate student in computer science in the early 2000s, computers were barely able to detect sharp edges in photographs, let alone recognize something as loosely defined as a human face.
In the early 1970s, I headed to graduate school at the University of Utah and joined the pioneering program in computer graphics because I realized that's where I could combine my interests in art and computer science.
Because computers have memories, we imagine that they must be something like our human memories, but that is simply not true. Computer memories work in a manner alien to human memories. My memory lets me recognize the faces of my friends, whereas my own computer never even recognizes me. My computer's memory stores a million phone numbers with perfect accuracy, but I have to stop and think to recall my own.
A science only advances with certainty when the plan of inquiry and the object of our researches have been clearly defined; otherwise a small number of truths are loosely laid hold of, without their connexion being perceived, and numerous errors are taken up, without our being enabled to detect their fallacy.
When you start in science, you are brainwashed into believing how careful you must be, and how difficult it is to discover things. There's something that might be called the 'graduate student syndrome'; graduate students hardly believe they can make a discovery.
As a child, I did what any normal kid who grew up without any electricity would do - I spent countless hours working on a computer wired to my parents' car battery... and learned how to code. This natural passion for computers led me into the Internet market during the late 1990s and early 2000s.
There was a golden period that I look back upon with great regret, in which the cheapest of experimental animals were medical students. Graduate students were even better. In the old days, if you offered a graduate student a thiamine-deficient diet, he gladly went on it, for that was the only way he could eat. Science is getting to be more and more difficult.
I was lucky to get into computers when it was a very young and idealistic industry. There weren't many degrees offered in computer science, so people in computers were brilliant people from mathematics, physics, music, zoology, whatever. They loved it, and no one was really in it for the money.
The first thing, I think: I was building computers. I started to build a computer at home when I was 17 or 18, an IBM-compatible computer, and then I started to sell computers. I sold a computer to a company called Ligo, I think, and they were selling systems which became a blockbuster.
I started doing science when I was effectively 20, a graduate student of Salvador Luria at Indiana University. And that was - you know, it took me about two years, you know, being a graduate student with Luria, to decide I wanted to find the structure of DNA; that is, DNA was going to be my objective.
My father was a graduate student at Oxford in the early 1960s, where the conventions and etiquette of clothing were crucial to the pervasive class consciousness of the place and time.
The term "informatics" was first defined by Saul Gorn of University of Pennsylvania in 1983 (Gorn, 1983) as computer science plus information science used in conjunction with the name of a discipline such as business administration or biology. It denotes an application of computer science and information science to the management and processing of data, information and knowledge in the named discipline.
Starting early and getting girls on computers, tinkering and playing with technology, games and new tools, is extremely important for bridging the gender divide that exists now in computer science and in technology.
Computer science is one of the worst things that ever happened to either computers or to science.
I realized very early in life what my abilities and limitations were, and foreign languages was definitely one of my limitations. With strenuous effort, I just barely passed my French class at Harvard so I could graduate.
With both people and computers on the job, computer error can be more quickly tracked down and corrected by people and, conversely, human error can be more quickly corrected by computers. What it amounts to is that nothing serious can happen unless human error and computer error take place simultaneously. And that hardly ever happens.
I was never as focused in math, science, computer science, etcetera, as the people who were best at it. I wanted to create amazing screensavers that did beautiful visualizations of music. It's like, "Oh, I have to learn computer science to do that."