A Quote by Clay Shirky

Algorithms don't do a good job of detecting their own flaws. — © Clay Shirky
As algorithms push humans out of the job market, wealth and power might become concentrated in the hands of the tiny elite that owns the all-powerful algorithms, creating unprecedented social and political inequality. Alternatively, the algorithms might themselves become the owners.
There's never a mistake in the universe. So if your partner is angry, good. If there are things about him that you consider flaws, good, because these flaws are your own, you're projecting them, and you can write them down, inquire, and set yourself free. People go to India to find a guru, but you don't have to: you're living with one. Your partner will give you everything you need for your own freedom.
A good job is more than just a paycheck. A good job fosters independence and discipline, and contributes to the health of the community. A good job is a means to provide for the health and welfare of your family, to own a home, and save for retirement.
It's good to have flaws; it's about learning to love your flaws.
The problem with Google is you have 360 degrees of omnidirectional information on a linear basis, but the algorithms for irony and ambiguity are not there. And those are the algorithms of wisdom.
I put my flaws on front street. So the world accepted my flaws, so I don't have any flaws.
Any kind of run-of-the-mill flaws that are easily solved, to me, are boring. Situational flaws, for example. I like flaws that are rooted in a deep distrust in people because of a lack of love.
When Spotify launched in the U.S. in 2011, it relied on simple usage-based algorithms to connect users and music, a process known as 'collaborative filtering.' These algorithms were more often annoying than useful.
Once you see the problems that algorithms can introduce, people can be quick to want to throw them away altogether and think the situation would be resolved by sticking to human decisions until the algorithms are better.
These algorithms, which I'll call public relevance algorithms, are, by the very same mathematical procedures, producing and certifying knowledge. The algorithmic assessment of information, then, represents a particular knowledge logic, one built on specific presumptions about what knowledge is and how one should identify its most relevant components. That we are now turning to algorithms to identify what we need to know is as momentous as having relied on credentialed experts, the scientific method, common sense, or the word of God.
It's just a compulsion to create something new and stay busy. I don't know how to do anything else. It was never exactly right. Those records came out in spite of their flaws. And because of their flaws they were good.
In deep learning, the algorithms we use now are versions of the algorithms we were developing in the 1980s, the 1990s. People were very optimistic about them, but it turns out they didn't work too well.
What the Chronics are - or most of us - are machines with flaws inside that can't be repaired, flaws born in, or flaws beat in over so many years of the guy running head-on into solid things that by the time the hospital found him he was bleeding rust in some vacant lot.
The trick is finding a person whose flaws don't drive you crazy...you know...someone whose flaws you can live with...someone who can stand your flaws, too.
Data dominates. If you've chosen the right data structures and organized things well, the algorithms will almost always be self-evident. Data structures, not algorithms, are central to programming.
It's difficult to make your clients understand that there are certain days that the market will go up or down 2%, and it's basically driven by algorithms talking to algorithms. There's no real rhyme or reason for that. So it's difficult. We just try to preach long-term investing and staying the course.