
Equations have built giants like Google. Who will find the next billion dollars in math? | David Sumpter


In 1998, a computer science PhD student named Larry Page filed a patent for internet search based on an obscure piece of math. The method, known today as PageRank, found the most relevant web pages faster and more accurately than anything that had come before. The patent, originally held by Stanford, was sold in 2005 for shares that are now worth more than $1 billion. Page’s company, Google, is now valued at more than $1 trillion.

It was not Page, nor Google co-founder Sergey Brin, who created the math described in the patent. The equation they used is at least 100 years old and is based on the properties of matrices (mathematical structures similar to a spreadsheet of numbers). Similar methods were used by Chinese mathematicians more than two millennia ago. Page and Brin realized that by calculating what is called the stationary distribution of a matrix describing connections on the World Wide Web, they could find the most popular sites faster.
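The core idea can be sketched in a few lines. This is a toy illustration of finding the stationary distribution by power iteration, on a made-up four-page "web" (the link matrix here is invented for the example, not any real data):

```python
import numpy as np

# Toy web: entry M[i, j] is the probability of following a link
# from page j to page i, so each column sums to 1.
M = np.array([
    [0.0, 0.5, 0.5, 0.0],
    [0.5, 0.0, 0.5, 1.0],
    [0.5, 0.0, 0.0, 0.0],
    [0.0, 0.5, 0.0, 0.0],
])

# Power iteration: repeatedly applying M drives any starting
# distribution towards the stationary distribution of the matrix.
rank = np.full(4, 0.25)   # start with equal rank everywhere
for _ in range(100):
    rank = M @ rank

print(rank)   # page 1, with the most incoming links, ranks highest
```

The stationary distribution is the ranking: pages that many well-linked pages point to end up with the largest entries.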

Applying the right equation can suddenly solve an important practical problem and completely change the world we live in.

The story of PageRank is neither the first nor the most recent example of a little-known piece of math transforming technology. In 2015, three engineers used the idea of gradient descent, dating back to the French mathematician Augustin-Louis Cauchy in the mid-19th century, to increase the time people spent watching YouTube by 2,000%. Their equation transformed the service from a place we visited for a few funny clips into a major consumer of our viewing time.
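Cauchy's method itself fits in a few lines: step repeatedly downhill along the gradient until you reach a minimum. A minimal one-variable illustration (nothing like YouTube's actual recommendation system, which applies the same idea to millions of parameters):

```python
# Gradient descent on f(x) = (x - 3)^2, whose gradient is 2(x - 3).
# Each step moves x a little way downhill, towards the minimum at x = 3.
def gradient_descent(x, learning_rate=0.1, steps=100):
    for _ in range(steps):
        gradient = 2 * (x - 3)
        x -= learning_rate * gradient
    return x

x_min = gradient_descent(x=0.0)
print(round(x_min, 4))   # prints 3.0: the minimum of f
```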

Since the 1990s, the financial industry has been built on variations of the diffusion equation, attributed to various mathematicians including Einstein. Professional gamblers use logistic regression, developed by the Oxford statistician Sir David Cox in the 1950s, to ensure that they win at the expense of less mathematically savvy punters.
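At its heart, logistic regression passes a weighted score through one S-shaped curve to turn it into a probability. A minimal sketch with made-up numbers (the intercept, slope, and rating gap below are illustrative, not a real betting model):

```python
import math

def logistic(z):
    # The logistic function squeezes any score into a probability in (0, 1).
    return 1 / (1 + math.exp(-z))

# Hypothetical fitted coefficients relating the gap in two teams'
# ratings to the stronger team's chance of winning.
intercept, slope = -0.2, 0.5
rating_gap = 1.5
p_win = logistic(intercept + slope * rating_gap)
print(round(p_win, 3))   # roughly 0.634
```

A gambler with better-fitted coefficients than the bookmaker's odds imply can, over many bets, come out ahead.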

There are good reasons to expect there are more billion-dollar equations out there: generations-old mathematical theorems with the potential for new applications. The question is where to look for the next one.

A few candidates can be found in mathematical work from the later part of the 20th century. One comes in the form of fractals, patterns that are self-similar, repeating on many different scales, like the branches of a tree or the shape of a head of broccoli. Mathematicians developed a comprehensive theory of fractals in the 1980s, and there was a craze for applications that promised to store data more efficiently. Interest died down until recently, when a small community of computer scientists began showing how fractals can produce the most amazing, weird, and wonderful patterns.
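Self-similarity can be made concrete with the classic Koch curve: each step replaces every line segment with four segments a third as long, so the same detail repeats at every scale, like the branching tree above. A small sketch of the arithmetic:

```python
# Koch curve bookkeeping: each iteration multiplies the number of
# segments by 4 and divides their length by 3, so the total length
# grows by a factor of 4/3 at every level of detail.
def koch_segments(depth):
    segments, length = 1, 1.0
    for _ in range(depth):
        segments *= 4
        length /= 3
    return segments, segments * length   # segment count, total curve length

for d in range(5):
    print(d, koch_segments(d))
```

That the length keeps growing without bound, while the curve stays in a finite region, is exactly the strangeness that fractal theory tamed.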

Another area of mathematics still seeking a lucrative application is chaos theory, the best-known example of which is the butterfly effect: if a butterfly flaps its wings in the Amazon, you need to know about it to predict a storm in the North Atlantic. More generally, the theory tells us that, to accurately predict storms (or political events), we need to know every little disturbance of the air across the entire planet. An impossible task. But chaos theory also points to repeatable structures. The Lorenz attractor is a trajectory that, although chaotic, traces out somewhat regular and recognizable patterns. Given the uncertainty of the times we live in, perhaps it is time to revive these ideas.
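The Lorenz attractor comes from just three coupled equations. A crude simulation (simple Euler steps with the standard textbook parameters, accurate enough to show the behaviour, not for serious forecasting):

```python
# The Lorenz system: the trajectory never repeats exactly, yet keeps
# circling the same butterfly-shaped region of space.
def lorenz_step(x, y, z, dt=0.01, sigma=10.0, rho=28.0, beta=8.0 / 3.0):
    dx = sigma * (y - x)
    dy = x * (rho - z) - y
    dz = x * y - beta * z
    return x + dx * dt, y + dy * dt, z + dz * dt

state = (1.0, 1.0, 1.0)
for _ in range(5000):
    state = lorenz_step(*state)
print(state)   # chaotic, but still confined to the attractor
```

Tiny changes to the starting point give a completely different trajectory, yet every trajectory stays on the same recognizable shape: that is the "repeatable pattern" inside the chaos.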

Some of my own research has focused on self-propelled particle models, which describe motions similar to flocks of birds and schools of fish. I now apply these models to better coordinate tactical formations in football and to spot players moving in ways that create more space for themselves and their teammates.
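A standard self-propelled particle update, in the spirit of the well-known Vicsek model, is simple to write down (this is a generic textbook sketch, not the author's football models; all the parameter values are illustrative):

```python
import math
import random

# Each agent moves at fixed speed and turns towards the average heading
# of its neighbours, plus a little noise. That alone is enough for a
# coherent "flock" to emerge on the unit square (with wrap-around edges).
random.seed(0)
N, speed, radius = 30, 0.05, 0.3
pos = [(random.random(), random.random()) for _ in range(N)]
angle = [random.uniform(-math.pi, math.pi) for _ in range(N)]

def step(pos, angle):
    new_angle = []
    for xi, yi in pos:
        # average heading of everyone within the interaction radius
        near = [a for (x, y), a in zip(pos, angle)
                if (x - xi) ** 2 + (y - yi) ** 2 < radius ** 2]
        mean = math.atan2(sum(math.sin(a) for a in near),
                          sum(math.cos(a) for a in near))
        new_angle.append(mean + random.uniform(-0.1, 0.1))  # small noise
    new_pos = [((x + speed * math.cos(a)) % 1.0,
                (y + speed * math.sin(a)) % 1.0)
               for (x, y), a in zip(pos, new_angle)]
    return new_pos, new_angle

for _ in range(50):
    pos, angle = step(pos, angle)
```

Swap "align with neighbours" for rules fitted to tracking data and the same machinery describes players keeping a defensive line or opening up space.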

Another related model is the reinforced random walk, which captures how ants build trails and how slime molds form transport networks. This model could take us from the computers of today – which have central processing units (CPUs) that perform calculations and separate memory chips to store information – to new forms of computing in which computation and memory are part of the same process. Like the trails of ants and slime molds, these new computers would benefit from being decentralized. Computationally difficult problems, especially in AI and computer vision, could be broken down into smaller subproblems and solved faster.
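The reinforcement mechanism is easy to sketch: the more a path has been used, the more likely it is to be used again. In this toy version (invented numbers, two competing paths) the reinforcement is superlinear, which is what makes one trail lock in, much as ant pheromone trails do:

```python
import random

# Reinforced random walk on two competing paths. Choosing a path adds
# "pheromone" to it; choice probability scales with pheromone squared,
# so an early lead snowballs and one trail comes to dominate.
random.seed(42)
trail = [1.0, 1.0]                 # equal pheromone on both paths at first
for _ in range(1000):
    weights = [t ** 2 for t in trail]
    pick = 0 if random.random() < weights[0] / sum(weights) else 1
    trail[pick] += 1.0             # reinforce the chosen path

print(trail)
```

Note how the memory of the system lives in the trail itself, not in any central store: that is the sense in which computation and memory become part of the same process.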

Whenever there is a revolutionary application of an equation, a whole range of imitations follows. The current boom in artificial intelligence is primarily driven by just two equations – gradient descent and logistic regression – combined to create what is known as a neural network. But history shows that the next big leap forward doesn’t come from repeatedly using the same mathematical trick. Rather, it comes from a completely new idea, read from the more obscure pages of the math book.

The challenge of finding the next billion-dollar equation is not simply a matter of knowing every page of that book. Page spotted the right problem to solve at the right time, and he persuaded the more theoretically inclined Brin to help him find the math to solve it. You don’t have to be a math whiz yourself to put the subject to good use. You just need a feel for what equations are and what they can and can’t do.

Mathematics still holds many hidden intellectual and financial riches. It is up to all of us to try to find them. The search for the next billion-dollar equation is on.

  • David Sumpter is Professor of Applied Mathematics at Uppsala University, Sweden, and author of The Ten Equations that Rule the World: And How You Can Use Them Too
