Top 60 Quotes & Sayings by Hannah Fry

Explore popular quotes and sayings by the British mathematician Hannah Fry.
Last updated on December 24, 2024.
Hannah Fry

Hannah Fry is a British mathematician, author, lecturer, radio and television presenter, podcaster and public speaker. She is Professor in the Mathematics of Cities at the UCL Centre for Advanced Spatial Analysis. She studies the patterns of human behaviour, such as interpersonal relationships and dating, and how mathematics can apply to them. Fry delivered the 2019 Royal Institution Christmas Lectures.

Every criminal-justice system has to find some kind of balance between protecting the rights of innocent people falsely accused of crimes and protecting the victims of crimes.
There's actually an awful lot of mathematics that goes into designing a railway, keeping it running, making sure everything runs optimally. Every time you need something to be optimal there's going to be some mathematics at play.
Everything we're doing online is being not just monitored, but that information is being packaged up and sold and resold to manipulate us.
When designing algorithms as a business owner, your incentive is your profit, something for your business; it's not an incentive to maximise something for the individual.
I do think we're at a point in our history where almost all of the big, grand, challenges faced by the human race are those that demand a scientific solution: climate change; access to clean water; over-crowding; plastic waste.
No weather forecaster can tell you for sure when to wear a rain slicker, stock up on canned goods, or evacuate a city that's in a cyclone's path. All forecasters can offer is their best guess at the atmosphere of the future, whispered by the simulated blue marble and wrapped up in uncertainty.
Algorithms and data should support the human decision, not replace it.
We have this imbalance where the people who are making algorithms aren't talking to the people who are using them. And the people who are using them aren't talking to the people who are having decisions made about their lives by them.
In our urge to automate, in our eagerness to adopt the latest innovations, we appear to have developed a habit of unthinkingly handing over power to machines.
We should actively be thinking about what our inventions would look like if exploited by someone with less of a moral compass and decide if the world would really be better off with them in it.
The best couples, or the most successful couples, are the ones with a really low negativity threshold. These are the couples that don't let anything go unnoticed and allow each other some room to complain.
But for me, true art can't be created by accident. There are boundaries to the reach of algorithms. Limits to what can be quantified. Among all of the staggeringly impressive, mindboggling things that data and statistics can tell me, how it feels to be human isn't one of them.
We literally hand over our most private data, our DNA, but we're not just consenting for ourselves, we are consenting for our children, and our children's children. Maybe we don't live in a world where people are genetically discriminated against now, but who's to say in 100 years that we won't?
On average, the higher the novelty score a film had, the better it did at the box office. But only up to a point. Push past that novelty threshold, and there's a precipice; the revenue earned by a film fell off a cliff.
All around us, algorithms provide a kind of convenient source of authority: an easy way to delegate responsibility, a short cut we take without thinking.
History is littered with examples of objects and inventions with a power beyond their professed purpose. Sometimes it's deliberately and maliciously factored into their design, but at other times, it's a result of thoughtless omissions.
One of the problems maths struggles with is that it's invisible. We haven't got explosions on our side.
You can't assess the value of an innovation in isolation, you have to consider whose hands it's in.
But the threat of a pandemic is different from that of a nerve agent, in that a disease can spread uncontrollably, long after the first carrier has succumbed.
Human emotion isn't neatly ordered and rational and easily predictable.
What seems obvious to one person wouldn't occur to another. Your perspective is hard coded into the work that you create.
Every time you shop online, every time you sign up for a newsletter, or register on a website, or enquire about a new car, or fill out a warranty card, or buy a new home, or register to vote - you are unwittingly handing over a small clue as to who you are and how you behave.
I think people have this hang-up from school that maths is this dusty old textbook that was finished hundreds of years ago, and all the answers are in the back. Whereas in my job I struggle to find anything that maths can't offer an interesting perspective on.
When you don't have diversity in the creative process, you inevitably end up with a single, narrow perspective in the output.
And anytime a programmer makes a decision about how to deal with data, how to average it or clean it, you're imparting more of your own bias on it.
We're living in an age where new technology offers gigantic upsides - artificial intelligence has the potential to diagnose cancer, catch serial killers and reduce prison populations.
Once you see the problems that algorithms can introduce, people can be quick to want to throw them away altogether and think the situation would be resolved by sticking to human decisions until the algorithms are better.
You can harvest any data that you want, on anybody. You can infer any data that you like, and you can use it to manipulate them in any way that you choose. And you can roll out an algorithm that genuinely makes massive differences to people's lives, both good and bad, without any checks and balances.
Curating our data is valuable. Like 23andMe - while selling us the chance to know whether we're Vikings or whatever, they're amassing these huge DNA databases that are unimaginably valuable. Get people to pay you to add their DNA to this database. Genius!
At some point in the future - possibly the very near future - Britain will be hit by a deadly pandemic, and its impact could be utterly devastating.
I spend quite a lot of time thinking about how curated our information is. What we watch, what we read, what we buy, often who we talk to, is all shaped and influenced by some kind of a mathematical algorithm.
The Gottman Institute's study about arguments in long-term relationships concludes that couples with the best chance at long-term success are the ones with a low negativity threshold: if something's wrong, they speak up about it immediately. That's something I've taken on board.
When it comes to love, making long-term decisions is a risky business. Sooner or later, most of us decide to leave our carefree bachelor or bachelorette days behind us and settle down.
Writing about 'What is art?' is not something I ever thought I'd be doing.
There's barely any aspect of our modern lives that hasn't had a mathematical contribution at some point and yet, if you asked the average person, they might think that maths is just difficult, irrelevant and uninteresting.
A century ago the Spanish flu confounded scientists and devastated whole regions, but while today's society has air travel and an enormous, heterogeneous population, we also have antibiotics, fantastic communication networks and, perhaps most crucially, more data than ever.
But as soon as Facebook decided that they wanted to become purveyors of news, suddenly you have these highly personalized newsfeeds where everything is based on what your friends like, what you like, things that you've read in the past.
Because for me, equations and symbols aren't just a thing. They're a voice that speaks out about the incredible richness of nature and the startling simplicity in the patterns that twist and turn and warp and evolve all around us, from how the world works to how we behave.
Because, ultimately, we can't just think of algorithms in isolation. We have to think of the failings of the people who design them - and the danger to those they are supposedly designed to serve.
It's true that you can't take an individual rain droplet and say where it's come from or where it's going to end up. But you can say with pretty good certainty whether it will be cloudy tomorrow.
I'm a mathematician. I can trade in facts about false positives and absolute truths about accuracy and statistics with complete confidence.
As the law catches up and the battle between corporate profits and social good plays out, we need to be careful not to be lulled into a false sense of privacy.
In every community, there are a number of 'social super-spreaders' among us. Long-suspected and emphatically confirmed by our data, these are people who - by dint of their job, or lifestyle, or perhaps even genetic makeup - would be more dangerous in the instance of a pandemic than the average person.
I certainly think there are some skills we'll lose as we hand things over to automation. I can barely remember my own phone number now, let alone the long list of numbers I used to know, and my handwriting has completely gone to pot.
So my favorite online dating website is OkCupid, not least because it was started by a group of mathematicians.
The invisible pieces of code that form the gears and cogs of the modern machine age, algorithms have given the world everything from social media feeds to search engines and satellite navigation to music recommendation systems.
I'm an academic. I did my PhD in fluid dynamics and now I work at University College London in an interdisciplinary department looking at patterns of human behaviour in urban settings.
Technology on its own isn't good or evil.
Wind depends on temperature. Temperature depends on pressure. And pressure depends on wind. It's an intricate mathematical tapestry that is far too intertwined to unpick by hand.
I don't like the algorithms that don't do what they claim they do.
The weather doesn't respect political or geographic boundaries: we're all living under the same sky. And so weather prediction has been a marvel not only of technology but also of international cooperation.
In medicine, you learn about ethics from day one. In mathematics, it's a bolt-on at best. It has to be there from day one and at the forefront of your mind in every step you take.
Imagine life without any algorithms at all: you wouldn't be able to do anything. This is already completely encompassing. We have a habit of over-trusting what mathematicians or computer scientists tell us to do, without questioning it; too much faith in the magical power of analysis.
One of the first things I did when I finished my Ph.D. was work with the police to look at what happened during the London riots in 2011, which took over the city.
Our long-range predictions - especially those which anticipate extreme-weather events - rely on an assumption that the future will be similar to the past. Lose that, and we lose the tools that have allowed us to prepare for such eventualities.
The future doesn't just happen. We are building it, and we are building it all the time.
I'm writing a book called 'The Indisputable Existence of Santa Claus' about the maths of Christmas: how to set up a secret Santa so it's totally fair; how to decorate your tree mathematically; how to win at Monopoly.
Whenever we haven't got enough information to make decisions for ourselves, we have a habit of copying the behaviour of those around us.
People are often quite lazy. We like taking the easy way out - we like handing over responsibility, we like being offered shortcuts that mean we don't have to think.
If we permit flawed machines to make life-changing decisions on our behalf - by allowing them to pinpoint a murder suspect, to diagnose a condition or take over the wheel of a car - we have to think carefully about what happens when things go wrong.