I feel like Quanta just has Terry Tao on speed dial at this point, and that he loves to answer the call. They always ask him about this stuff for their math articles and he always gives great answers.
> That Kelley and Meka managed to spot the strength of once-overlooked ideas shows the often fitful nature of mathematical progress — a quality that to Tao is more of a blessing than a curse. “It’s not always the case that math just gets harder and harder and harder,” he said. “Thank God.”
What is the largest AP-3-free set for, say, the integers from 1 to 100? What is the largest N where we've computed it explicitly / how expensive is that to do?
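For small n this is answerable by direct search (the exact maxima are tabulated in the OEIS, A003002 if I recall correctly). The naive search space is exponential in n, which is why exact values for larger N come from serious computer searches rather than a formula. A minimal brute-force sketch:

```python
from itertools import combinations

def max_3ap_free(n):
    """Size of the largest subset of {1, ..., n} containing no 3-term
    arithmetic progression, by exhaustive branch-and-bound search."""
    best = 0

    def extend(chosen, start):
        nonlocal best
        best = max(best, len(chosen))
        for x in range(start, n + 1):
            # Prune: even taking every remaining number can't beat the best.
            if len(chosen) + (n - x + 1) <= best:
                return
            # Since x exceeds everything chosen, x can only be the largest
            # term of a new 3-AP: a, b, x is an AP exactly when x == 2*b - a.
            if any(2 * b - a == x for a, b in combinations(chosen, 2)):
                continue
            extend(chosen + [x], x + 1)

    extend([], 1)
    return best
```

This gives 1, 2, 2, 3, 4 for n = 1..5 (e.g. {1, 2, 4, 5} has no 3-term AP) and 5 for n = 9, but it slows down fast; I wouldn't push this naive version much past n of a few dozen.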
Issue here: Click-bait headlines. Uh, I go to the Quanta Web site ~once a day. I respect Quanta enough not to get torqued at any of their headlines. Also there I pay attention to what fields of content (usually science) and not much to the headlines. For the fields, I tend to pass over medical, social, psychological, and biological sciences and stay with math, physics, cosmology, and engineering, and it is easy enough to guess the field of the content without getting torqued at the headlines. Soooo, net, at Quanta I don't object to their headlines.
For the article under discussion, yup, it is in the field of "combinatorics". I bumped into that field while scheduling the fleet at FedEx, in grad school, and in some business problems since. Result: Combinatorics is one heck of a challenge, both in the theory of the pure math (and the field can quickly become pure math, e.g., number theory) and also in the applications in business. In practice we squeak by with a little in theory and a lot in intuitive heuristics and relatively blind enumeration (that exploits powerful computing).
So, it would be nice, maybe a biggie in various respects, to have some major progress in combinatorics.
So, in such an attack, sure, it would be nice for someone some afternoon to have some big insight, bigger than heap sort instead of bubble sort, the fast Fourier transform instead of the direct approach, or error-correcting codes instead of just three copies, etc. So, here is an invitation: Get a sharp, soft pencil, a big, soft eraser, a pad of paper, lean back, put feet up, and have some of the needed big ideas!!! All are invited and welcome!!!
After some decades, we are still looking for that big insight!! So, what now? How to modify the attack?
One way is to chip away at the problem wherever we can see how to get something new and apparently relevant or connected, or just anywhere in combinatorics -- or just follow the standard good research criteria of "new, correct, and significant" while staying at least within sight of the ballpark of combinatorics.
So, with that approach, the results in the current Quanta article are an example of the desired and welcome progress.
> Though both Bloom and Sisask had other pressing matters to attend to at the time they received the email — Bloom had just adopted a puppy, while Sisask was in the middle of moving — they quickly set to work verifying the new paper.
Neural networks were first prototyped in the 1950's [1]. How practical was it to study them back then?
Aleksandr Lyapunov's theory of stability for dynamical systems, which forms the bedrock for much of control theory, was first proposed in 1892 and went unnoticed until the 1930's [2]. How practical was it for Lyapunov to study that topic?
It's virtually impossible to predict which mathematical tools will be "practical" or "useful" in the future. The point is that if someone needs the tool down the road, they'll have access to the solution due to the efforts of these mathematicians.
[1]: https://en.wikipedia.org/wiki/Neural_network#History
[2]: https://en.wikipedia.org/wiki/Lyapunov_stability#History
Mathematics continually surprises us when seemingly disparate areas suddenly become linked, sharing some sort of relationship that was not appreciated until much later.
Some think mathematics is something like a physical law, or a property of the universe. So figuring out such things may be like fundamental physics research: hard to quantify the value, but there is value. Seemingly unrelated developments in mathematics have had a habit of helping to solve problems in physics in the past, at least. I don't think anyone really expected the exciting new mathematics of statistics in the 18th century to one day be relevant to physics. But it would be.
I've spent a lot of time and brain cycles over the last decade writing software, entire products actually, that now sit as motionless bits on a disk somewhere, unused and useless.
In the process of creating them I had a lot of fun, honed my existing skills and picked up new ones, broke out of various boxes I had been stuffed into in terms of how to architect systems, and just generally worked out my creative muscles.
I've also written a lot of software that is useful and used, and in a practical sense makes (lots of) money.
It's meaningless to break out the time and cycles spent on one vs. the other. The necessary skills were enabled by one another. Moreover, it would have been largely impossible to determine a priori which would be which.
Am I an impractical person because I've spent time and cycles on the former?
A hash table's double hashing [1] probes slots in an arithmetic progression. Improving the bounds of this theorem could lead to faster, smaller hash tables. This is the first wild speculation I could come up with.
[1] https://en.wikipedia.org/wiki/Double_hashing
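For the curious, a toy sketch of why double hashing's probe sequence is an arithmetic progression (h1 and h2 below are illustrative stand-ins, not real hash functions):

```python
def double_hash_probes(key, m):
    """Probe sequence for double hashing on a table of size m:
    slot_i = (h1(key) + i * h2(key)) mod m, i.e. an arithmetic
    progression mod m with first term h1 and common difference h2."""
    h1 = key % m              # toy primary hash
    h2 = 1 + key % (m - 1)    # toy secondary hash: nonzero step
    return [(h1 + i * h2) % m for i in range(m)]
```

When m is prime the step is coprime to m, so the progression visits every slot exactly once before repeating; e.g. `double_hash_probes(10, 7)` is `[3, 1, 6, 4, 2, 0, 5]`, a permutation of all seven slots.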
> What's the point of spending "time" (brain cycles) on such problems?
What's the point of anything?
> Is it just a mathematician's version of brain teaser?
All of mathematics can be characterised as "brain teasers". Make of that what you will.
> What useful development comes out of such?
I'm not qualified to say for this particular example. However, it seems quite number-theoretic, and number theory has given us such "practical" things as asymmetric encryption and universal computation (Turing's famous paper from 1936 was titled "On Computable Numbers", after all).
I have a friend working on lidar for a military contractor. He uses these exact theorems to design beam pulses for measuring distances of asteroids and stuff. The idea is to construct beams so that you can tell by the autocorrelation function the exact distance away an object is. It turns out that these weird kinds of integer sequences can help you design beams that give the autocorrelation function this property.
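The lidar specifics aren't spelled out here, but the general trick (design a pulse whose autocorrelation has one sharp peak, so the echo's delay, and hence the range, can be read off unambiguously) can be illustrated with a Barker code, a better-known family than whatever progression-based sequences the contractor actually uses:

```python
def autocorr(seq):
    """Aperiodic autocorrelation of a +/-1 sequence at each nonnegative lag."""
    n = len(seq)
    return [sum(seq[i] * seq[i + lag] for i in range(n - lag))
            for lag in range(n)]

# The length-13 Barker code: its autocorrelation peaks at 13 at lag 0,
# while every other lag has magnitude at most 1, so a matched filter
# localizes the echo (and thus the round-trip time) very sharply.
barker13 = [1, 1, 1, 1, 1, -1, -1, 1, 1, -1, 1, -1, 1]
ac = autocorr(barker13)
```

Here `ac[0]` is 13 and every sidelobe satisfies |ac[lag]| <= 1, which is exactly the "tell the distance from the autocorrelation" property the parent describes.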
It's normally extremely hard to guess the applications of this kind of research. Case in point: almost all modern cryptography relies on maths that was not obviously applicable at the time it was developed.
>In late 2021, Kelley and Meka were analyzing the chances that a team of players in a certain cooperative game would be able to win, a standard type of computer science problem. It occurred to them that techniques from research on the size of progression-free sets might be helpful. But they found it easier to directly study those techniques than to apply them to the cooperative game. “My best idea for how to make progress on this problem [was] to actually improve the tool itself, not to use it in a more clever way,” Kelley said.
I wish they went into more detail about how this problem was relevant to the other problem.
Most of the other comments focus on how and why theory research often leads to "practically useful" outcomes down the line, so I'll instead say that not all research has to be practical. Sometimes people do things just because they're interesting and fun.
Why do people play board games, or video games, or sports? Why do people read fiction? Why do people have hobbies which strictly distract from productive work time? The answer to any of these questions is the same as the answer for "Why do people do theoretical research?": it's fun.
Eh, someone just suggested that a bound of this form might break some cryptographic assumptions for finite fields of (large) prime order. (This is not obvious to me, but it's a point that I think is worth considering, and already would show that such a result is useful.)
It's very hard to know how these things end up being useful. Even if a specific piece of knowledge isn't useful by itself, it can lead to things that are useful down the line, and it's very hard to know in advance which pieces will do so; for example, think of work on propositional logic, which became the bedrock of computation, or work on elliptic curves (cryptography), PDEs (physics), quantum physics (transistors), photonics (communications), etc.
Some fields have pretty "fast-to-application" times, but it's not always clear which ones will _not_ yield useful results. Plus, and here's the real, honest answer: all of this shit is very fun! Why not do it?
I'm also a (somewhat) practical person, and that guides many of the problems I try to solve, but sometimes there's just such a tantalizing puzzle that it's hard to resist! It's pretty damn hard to work "only" on "useful" stuff and not end up just an OK academic/programmer/etc.; you have to let yourself get nerd-sniped into doing random things. With very high probability, these little "side-quests" end up being very useful down the line, for one reason or another.
There is no direct practical reason. Other people are giving you the long term outlook view, why it is valuable in aggregate, but if you were aiming for immediately practical results neither math nor CS would fit the bill.
I can't speak for others but for me math is kind of like a brain teaser on crack. It is immensely satisfying to think about and I would imagine most mathematicians get a similar kick out of it. Not a mathematician myself mind you.
Some number theory results about sequences and progressions can be applied to create more efficient algorithms to approximate the solutions to multi-dimensional integrals that come up in physics (especially solid state physics on lattices).
Solid state physics problems of that sort are used in the process of engineering new materials (e.g. a more efficient dielectric for capacitors, a magnetic material that can store memory data at higher temperatures without destabilizing, a higher-temperature superconductor, etc.). I don't know if this particular result has such an application, but I wouldn't be surprised if it did.
In addition to solid state physics, "special functions" show up in many areas of applied math and physics, including the design and calibration of, and the analysis of data from, MRI machines and antennas. Many special functions, including the hypergeometric functions, can be approximated more efficiently using fancy number theory algorithms.
There is also a broad category of physics and chemistry problems that are best solved by Monte Carlo simulations, and sometimes Monte Carlo approximations can be made using many fewer iterations by choosing a set of points within the sample space that satisfy number theoretic constraints.
Also, physics models often inspire machine learning models so it's not impossible that some niche ML algorithm could eventually benefit from a faster algorithm that uses this result or a derivative of it.
None of these directly answers your question about a definitive application for this particular number theory result, but I hope it gives you an idea of what it could potentially be used for. The closest analogy I know of is that certain multi-dimensional Monte Carlo integral approximations can be done orders of magnitude more quickly if you choose lattice points whose coordinates satisfy constraints similar-ish to the ones described for this problem.
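To make that last paragraph concrete, here is a minimal sketch of one such construction: a rank-1 "Fibonacci lattice" rule, a classic quasi-Monte Carlo point set whose number-theoretic structure (coordinates generated by an integer multiplier mod n) spreads points through the square far more evenly than random sampling does:

```python
def fib_lattice(k):
    """Rank-1 lattice in [0, 1)^2 with n = F_k points and generator
    g = F_{k-1}, where F_i are Fibonacci numbers. The coordinates
    (i/n, (i*g mod n)/n) are determined by number theory, not chance."""
    fibs = [1, 1]
    while len(fibs) <= k:
        fibs.append(fibs[-1] + fibs[-2])
    n, g = fibs[k], fibs[k - 1]
    return [(i / n, (i * g % n) / n) for i in range(n)]

def lattice_estimate(f, pts):
    """Equal-weight cubature rule over the lattice points."""
    return sum(f(x, y) for x, y in pts) / len(pts)

# Estimate the integral of x*y over the unit square (true value 0.25)
# using only 144 structured points.
pts = fib_lattice(11)
est = lattice_estimate(lambda x, y: x * y, pts)
```

With 144 points the estimate lands close to 0.25; plain Monte Carlo typically needs far more samples for comparable accuracy, and the gap widens in higher dimensions, which is the whole appeal for the physics integrals mentioned above.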
Understanding the fundamental nature of numbers and sets will allow a person to avoid working on solutions to their "real world" problems which would be precluded by that nature.
Take cryptography, for instance. Why large prime numbers? Why elliptic curves? If you understand the fundamental mathematics behind factoring products of large primes, then you have a basis for believing that a cryptographic system built on large primes will be reasonably secure.
Many of the developments you use everyday are based on the understanding of just such "brain teasers" :-).
Before cryptography, all of number theory was considered just one brain teaser after another. Without those brain teasers we wouldn't have secure private communications.
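As a toy illustration of that dependence (textbook RSA with the classic tiny-prime example; insecure and for exposition only, since real systems use padding and enormous primes): the public operations need only n and e, while recovering the private exponent requires knowing n's prime factors.

```python
# Textbook RSA with tiny primes -- insecure, illustration only.
p, q = 61, 53
n = p * q                 # public modulus; factoring n breaks the scheme
phi = (p - 1) * (q - 1)   # computable only if you know p and q
e = 17                    # public exponent, coprime to phi
d = pow(e, -1, phi)       # private exponent: e*d == 1 (mod phi); Python 3.8+

msg = 65
cipher = pow(msg, e, n)   # encrypt: uses only the public pair (n, e)
plain = pow(cipher, d, n) # decrypt: needs d, hence the factorization
```

Here `plain` comes back as 65: Euler's theorem, a "brain teaser" result from the 1700s, is what guarantees the round trip works.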
Sometimes, discovering these proofs makes the math easier. If we know property X holds for all Ys, and X is easier to compute, you can use X where you previously had to use Y. Or, if you know X is true for all Ys, you can skip verifying that your particular Y has property X, because we now know it must.
And it's math all the way down in computers. So anything that makes math easier or quicker has the potential to affect anything a computer can process.
Let's just hope that the politicians who control the flow of resources are not as shortsighted as you, or we are doomed. Computers and all their software would not exist without this "boring and pointless" mathematics from long before.
Understanding mathematics before we understand what problems it solves happens all the time. Maybe nothing will come of this, but because we can't know that, it's worth researching.
People didn't believe there was non-Euclidean geometry and not long after the taboo was broken, we had General Relativity. Einstein was a genius, but he did not invent the math for GR. Mathematicians like Riemann, Minkowski, Hilbert, Ricci, Levi-Civita and others created the mathematics for it first.
Finding the area under a curve or the tangent line of an arbitrary curve were considered two unconnected brain teasers until Calculus was invented by Newton and Leibniz and applied to gravity by Newton.
The "Math First" approach has done more than enough to prove its continued viability and without it you wouldn't have been able to type that here.
For a long time certain mathematicians took great pride in studying the most useless part of their field, prime numbers. Without their studies we would not have modern cryptographic techniques (RSA, SSL, HTTPS). And no bitcoin either.
Mathematicians aren't concerned with that at all. If something they do ends up being useful to science and engineering, then wonderful. But it's not the point.
All fundamental research is odd like that. Someone has to make significant progress on a 'useless' topic before some semblance of usefulness starts becoming visible.
Well, you can now use this as a tool: if you can reduce part of a problem to constructing a progression-free set, you can use these bounds to establish the problem's complexity.
The ways I've seen combinatorics (and graph theory!) interact with software engineering is typically showing that a problem is too expensive to solve at scale.
We're mortals; the only relevant question is "is it a better use of brain cycles than X?"
Considering that vastly greater amounts of brain cycles are devoted to making people watch one more TikTok video, I think this kind of brain-scratching is more useful than much of what most of us do.
My master's thesis was about a problem for which I could only find an exponential-time solution, but some of the underlying mathematical problems haven't been studied or solved in depth yet, so I'm patiently waiting for a solution.
Maybe the question is what useful developments haven’t come from mathematical play originally, and can’t trace their lineage to abstract brain teasers?
You must be confused: the title specifies computer scientists and mathematicians, so the default assumption here should be that if it is useful, no one is going to realize it for some decades.
This is why America's obsession with trying to boost math scores is possibly a waste of time and money. No amount of math literacy will ever approach anything like this. Those who are gifted and motivated enough will learn the material anyway and there are enough professors who can teach it.
Professionals are so far above and beyond anything done by non-professionals. It's not like that with reading or writing, where knowing how to read means you can, in theory, appreciate almost any book.
I think more emphasis should be put on learning the basics, which enough kids already find hard. No reason to try to make high schoolers learn algebra 1 & 2.
The authors of the paper are academics in university CS departments with heavy math backgrounds (one with a B.S. in mathematics, another was a visiting professor at MIT dept of mathematics). I don't understand what you mean or how it relates to the article or surrounding circumstances.
pnut | 3 years ago: Humanity already tried thinking it was better than the selfless pursuit of knowledge; that path leads nowhere good.
aaomidi | 3 years ago: I can already see it being useful for verifying entropy and for verifying the anonymization of statistical data.
BeetleB | 3 years ago: Same use as the rest of mathematics: entertainment for those interested in the topic.
permo-w | 3 years ago: Did you have a bad maths teacher at some point?