A Quote by William Gibson

A graphic representation of data abstracted from the banks of every computer in the human system. Unthinkable complexity. Lines of light ranged in the nonspace of the mind, clusters and constellations of data. Like city lights, receding.
You have to imagine a world in which there's this abundance of data, with all of these connected devices generating tons and tons of data. And you're able to reason over the data with new computer science and make your product and service better. What does your business look like then? That's the question every CEO should be asking.
In theory, there is nothing the computer can do that the human mind cannot do. The computer merely takes a finite amount of data and performs a finite number of operations upon them. The human mind can duplicate the process.
The promoters of big data would like us to believe that behind the lines of code and vast databases lie objective and universal insights into patterns of human behavior, be it consumer spending, criminal or terrorist acts, healthy habits, or employee productivity. But many big-data evangelists avoid taking a hard look at the weaknesses.
For every job, you require a certain mindset. To be a teacher, one should be knowledgeable. To be a software engineer, you should know computer data systems analysis, computer languages, etc. So, my mindset is not aligned with politics.
I love computer programmers. They have a very beautiful definition of complexity as 'the capacity to transmit the maximum information with the minimum data'.
The computer is here to stay, therefore it must be kept in its proper place as a tool and a slave, or we will become sorcerer's apprentices, with data, data everywhere and not a thought to think.
Integral to the orb is our low cost long-range wireless radio data system and a protocol that allows us to send this data over 90% of the US population every 15 minutes throughout the day.
The biggest mistake is an over-reliance on data. Managers will say if there are no data they can take no action. However, data only exist about the past. By the time data become conclusive, it is too late to take actions based on those conclusions.
The fact that all normal children acquire essentially comparable grammars of great complexity with remarkable rapidity suggests that human beings are somehow specially designed to do this, with data-handling or 'hypothesis-formulating' ability of unknown character and complexity.
One of the myths about the Internet of Things is that companies have all the data they need, but their real challenge is making sense of it. In reality, the cost of collecting some kinds of data remains too high, the quality of the data isn't always good enough, and it remains difficult to integrate multiple data sources.
A data scientist is that unique blend of skills that can both unlock the insights of data and tell a fantastic story via the data.
Basically, if you want to have a computer system that could pass the Turing test, it, as a machine, is going to have to be able to self-reference and use its own experience and the sense data it's taking in to create its own understanding of the world, and use that as a reference point for all new sense data coming in.
"Data! Data! Data!" he cried impatiently. "I can't make bricks without clay."
Go out and collect data and, instead of having the answer, just look at the data and see if the data tells you anything. When we're allowed to do this with companies, it's almost magical.
Biases and blind spots exist in big data as much as they do in individual perceptions and experiences. Yet there is a problematic belief that bigger data is always better data and that correlation is as good as causation.
A great deal of creativity is about pattern recognition, and what you need to discern patterns is tons of data. Your mind collects that data by taking note of random details and anomalies easily seen every day: quirks and changes that, eventually, add up to insights.