
Is P=NP a Grave Matter?

October 31, 2023


Our favorite problem moribund?

The photo at right was taken by a friend—with thanks—in San Carlos, California, last weekend. We do not know who put out the Halloween display. My late colleague Alan Selman sported a license plate that declared P NE NP (NE for “not equal to”). Dick and I last gave our thoughts in several posts in 2020 after waffling on whether we believe “equal” or “not equal.”

This Halloween we discuss a different matter: Is work on P vs. NP whistling past the graveyard?

And a second question is suggested by the punning last words of Mercutio in Romeo and Juliet. Always a jokester despite a mortal wound, he says:

“Ask for me tomorrow, and you shall find me a grave man.”

The pun is on “grave” meaning “serious.” Certainly P vs. NP has been central to our field’s formative period. But will it stay that way tomorrow?

Consider, for instance, the career of Adam Tauman Kalai, whom we just featured. His DBLP page shows recent papers on:

  • Algorithmic fairness.

  • Large language models.

  • Loss minimization in neural networks.

  • Automated code generation.

  • Low-rank matrices.

These are all vital areas, and it is wonderful that theory opens a window on them. But they are not going to break through on P vs. NP. Well, low-rank matrices are important to lower bounds, so that is a possibility. We’ve kept our ears to the ground for word of new ideas on P vs. NP. The closest we’ve seen is recent work on the Minimum Circuit Size Problem, which is covered at the end of this wonderful recent long-form article in Quanta surveying the state of complexity theory.

Has the Specter of NP-Hardness Faded?

Complexity theory used to represent a check on scaling up computers to solve moderately larger cases of important hard computational problems. The famous 1979 book by Michael Garey and David Johnson, Computers and Intractability: A Guide to the Theory of NP-Completeness, opened with a parable of a project worker needing to explain to the boss why a task is hard.

Well, nothing since has prevented computers from scaling up majorly. Right now Google Chrome is telling me (Ken) that one of my tabs—just one of a dozen tabs I have open—is taking up 419 megabytes of memory. A tab I just opened for tracking tonight’s World Series game grabbed 130 MB.

What’s more, the hard problems themselves have yielded in many significant, sizable instances. We noted the success of SAT solvers seven years ago already, and likewise that of new Traveling Salesman algorithms. An analogy is that the game of Go is PSPACE-hard on {n \times n} boards, but the standard {19 \times 19} board is plenty complex for us—and conquerable by computers.
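To make the contrast concrete, here is a toy brute-force SAT search in Python. It is our own minimal sketch, not any production solver; the clause encoding (signed integers for literals, DIMACS-style) is just a convention for the example. It runs in 2^n time, which is exactly the baseline that modern clause-learning solvers beat so handily on structured instances:

```python
from itertools import product

def brute_force_sat(clauses, n):
    """Try all 2^n assignments over variables 1..n.

    Each clause is a list of nonzero ints: k means variable k is true,
    -k means variable k is false (DIMACS-style literals)."""
    for bits in product([False, True], repeat=n):
        if all(any(bits[abs(lit) - 1] == (lit > 0) for lit in c) for c in clauses):
            return bits
    return None

# (x1 or x2) and (not x1 or x3) and (not x2 or not x3)
print(brute_force_sat([[1, 2], [-1, 3], [-2, -3]], 3))  # (False, True, False)
```

Each added variable doubles this loop’s work, yet solvers exploiting structure routinely handle industrial instances with millions of variables that this loop could never touch.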

Other disciplines have found that they can grow around problems that are “Big” but not intractable. Weather forecasting has long regarded kilometer-scale resolution as its “holy grail,” but limitations other than computing power have pushed progress toward other areas that our computers are ready for. Deep learning results come from long training computations. The underlying model may be a small number of levels of threshold circuits, yet we have not clamped any tight upper bounds on the power of the corresponding complexity class TC⁰ either. Large language models (LLMs), which result from months of computation, are able to solve problems in a universal manner, with varying degrees of confidence, to be sure.
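For readers unfamiliar with TC⁰: its basic gate is a weighted threshold, the same primitive as a single artificial neuron. A toy Python sketch (the function names are ours, purely illustrative) shows one threshold gate computing MAJORITY, a function already beyond the weaker class AC⁰:

```python
def threshold_gate(inputs, weights, bias):
    """Fire (output True) iff the weighted sum of the inputs reaches the bias."""
    return sum(w * x for w, x in zip(weights, inputs)) >= bias

def majority(bits):
    """MAJORITY as a single threshold gate: all weights 1, bias = ceil(n/2)."""
    n = len(bits)
    return threshold_gate(bits, [1] * n, (n + 1) // 2)

print(majority([1, 0, 1, 1, 0]))  # True: three of five inputs are set
```

TC⁰ circuits stack constantly many layers of such gates, and the point in the text is that we cannot rule out surprising power even at a few layers.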

President Joe Biden just rolled out a sweeping executive order Monday that aims to monitor and regulate the risks of artificial intelligence while also harnessing its potential. This marks the latest effort to address a rapidly evolving technology that has sparked concern among world leaders. He said nothing comparable about complexity theory. The AI warnings stand quite apart from P vs. NP. Perhaps he should have included complexity? Perhaps marrying LLMs to SAT solvers and game programs will change the world even more radically than AI is expected to do now. Is this possible?

Is P =? NP the Right Question?

The focus on whether SAT is in polynomial time hides a more desperate reality: In over half a century of trying, we really have not proved a super-linear lower bound on the complexity of any natural problem.

  • There are problems in NP that require very slightly more than linear time on a standard multitape Turing machine. But the reason leans mightily on the limited nature of Turing tapes, and even then the bounds fail when the Turing machine is allowed planar tapes, let alone some kind of random access.

  • Not even those problems have super-linear lower bounds on Boolean circuit size. No such bound is known even for problems in {2^{O(n)}} exponential time, which are provably not in polynomial time. We covered one recent effort at general circuit lower bounds here.

  • Walter Baur and Volker Strassen proved {\Omega(n\log n)} bounds on the number of arithmetical circuit gates—indeed, of multiplication gates—needed for some simple {n}-variable functions like {f(x_1,\dots,x_n) = x_1^n + \cdots + x_n^n}. But maybe the “size” of an {n}-variable polynomial of total degree {d} should be reckoned as {N = n\log d} anyway. Then nobody knows a super-linear lower bound in terms of {N}.
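For this particular family the Baur–Strassen bound is tight: repeated squaring computes each x_i^n with O(log n) multiplications, so about n log n in all. A quick Python count (our own bookkeeping, charging one multiplication per squaring plus one per set bit of the exponent) confirms the matching upper bound:

```python
def pow_mult_count(n):
    """Multiplications used by right-to-left square-and-multiply for x^n."""
    count = 0
    e = n
    while e > 1:
        count += 1 + (e & 1)  # one squaring, plus one multiply if the low bit is set
        e >>= 1
    return count

# For f = x_1^8 + ... + x_8^8: each x_i^8 costs 3 multiplications
# (x^2, x^4, x^8), so 8 * 3 = 24 multiplication gates overall.
print(pow_mult_count(8))  # 3
```

So the best multiplication-gate count we can prove necessary, Omega(n log n), is exactly what the naive method already achieves.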

The real question is:

Why do the barriers to proving nontrivial lower bounds slam down right at linear to begin with?

This question involves us in models and “worlds” that are a lot less robust and approachable than the complexity notions by which our field developed.

What Hope to Solve P vs. NP?

We are coming on the 30th anniversary of the “Natural Proofs” barrier. It was discussed here.

The essence is that natural attempts to prove lower bounds for one hard problem—if they succeed—convert into actual algorithms that give upper bounds for another hard problem. In the case of discrete logarithm, the two problems are the same. Thus the effort to push lower bounds by “natural” means can actually be self-defeating.

We have noticed this self-defeating phenomenon in other veins. Completeness and hardness results often exploit succinct representations of core predicates. Those predicates embody expansive power in small programs or circuits—power that makes it hard to prove that those small programs or circuits lack general solving power.

We have remarked on particular attempts with self-defeating elements:

  • here in regard to the “polynomial method” for lower bounds;

  • here in regard to circuit-based attempts on P {\neq} NP;

  • here in regard to the MCSP problem;

  • here in regard to permanent versus determinant; and

  • here on the “geometric complexity” effort led by Ketan Mulmuley.

Mulmuley’s program had real novelty and seeming potential to resolve P vs. NP when it was introduced, for reasons I gave in my own 2002 survey of it. But Ketan himself admitted that he didn’t see resolution on any shorter timescale than 150 years. Maybe there is hope in efforts like this by David Zuckerman that relax the problem in a way that may dodge some barriers and apply high-powered combinatorics.

Open Problems

The vital meta-problems are: how long will the P versus NP problem stay open—and how long will it stay inviting?


[some word and link fixes; added mention of Zuckerman’s work at end]

28 Comments
  1. Frank Vega permalink
    November 1, 2023 4:59 am

    \textbf{Sometimes the wind blows where nobody suspects.}

    In August 2023, a proof of NP \subseteq NSPACE(\log^{2} n) was accepted by the MICOPAM conference as participant 79:

    https://micopam.com/people/participant-list/

    This proof was obtained under the assumption that the output tape is unbounded (maybe this assumption is like the “spherical cow”). See the details here:

    https://www.researchgate.net/publication/370426542_NP_on_Logarithmic_Space

    Nevertheless, it’s hard to gain people’s attention about science when you are only a “lover of” (an amateur). Certainly, I have posted constantly on LinkedIn over the last month (October 2023) that my proof of the Riemann hypothesis has passed the first scrutiny of one of the top math journals (I believe this unexpected event could be a first in math history). See the details here:

    https://www.researchgate.net/publication/374144255_On_Nicolas_Criterion_for_the_Riemann_Hypothesis

    I have also shared this good news with my closest friends and colleagues such as Patrick Solé:

    https://www.researchgate.net/profile/Patrick-Sole

    and he told me, “Hi Frank, where is it the catch?”. Nevertheless, I have received his support on the current revision of the paper, where I needed his help.

    Today, you could also be reading this comment and ask yourself: “Where is it the catch?” Then, I would again answer with the first sentence of this comment:

    \textbf{Sometimes the wind blows where nobody suspects.}

  2. November 2, 2023 8:32 am

    Ask for me tomorrow, and you shall find me a grave man.
    — Romeo and Juliet • Mercutio 3.1.97–98

    Maybe it’s time for a reanimation

  3. Ivan Gusev permalink
    November 7, 2023 6:53 am

    I heard about a prophecy that anyone who proves RH will immediately die. Yet everybody who studied it professionally lived a very long time. How much do you believe it?

    • botsinadeqardinuta permalink
      November 8, 2023 6:20 am

      Do you have a source on the RH prophecy?

  4. Light Yagami permalink
    November 8, 2023 6:19 am

    There are still no known superpolynomial lower bounds on Extended Resolution proofs. I don’t think there are any strong barriers* to interpreting CNFs as bipartite graphs and parsing them for short ER proofs.

    You could even guess that the time complexity would be quadratic — 2SAT can be solved in linear time, the ER productions** would have to have a treewidth larger by one than the Resolution productions, so closing the ER productions would require finding cycles in a graph, which is linear time, and you would have to do it for each clause, so the total running time and space would be expected to be quadratic.

    • The potential barriers I’ve noticed are that: 1) parsing context-free graph grammars in general is NP-complete; 2) variables would have to be reused in proofs; 3) the graph grammar could fail to be context-free.

    ** ER productions would take two clauses with a shared variable $(L_1 \lor x)\land( \lnot x \lor L_2)$ to three clauses with two variables $(L_1 \lor x_1 \lor x_2)\land( \lnot x_1 \lor L_2)\land( \lnot x_2 \lor L_2)$. To match an ER production, the literals in $L_2$ have to be matched, and they could be separated by a distance of O(n) after many other productions.

  5. Penny permalink
    November 8, 2023 7:43 am

    I recently came across an interesting paper on the P!=NP problem (https://arxiv.org/abs/2302.09512v8). Research in theoretical computer science has primarily revolved around computability (whether an algorithm exists) and computational complexity (whether an efficient algorithm can be designed). This paper introduces a new perspective that seems to highlight another question: whether an algorithm can be designed at all. For combinatorial problems, exhaustive algorithms are already defined by the problem itself, so they are not “designed algorithms”. Hence, the essence of algorithm design is essentially replacing exhaustive algorithms with non-exhaustive ones.

  6. Penny permalink
    November 8, 2023 7:44 am

    So, given a specific problem, should we research in this order: Is there an algorithm? → If so, can we design an algorithm? → If so, can we design an efficient algorithm? It looks like whether we can design an algorithm is more fundamental and needs to be considered before discussing algorithm efficiency, but it seems that we’ve overlooked this issue before… E.g., whether or not we can design a corresponding algorithm for a problem is something we don’t know and rarely study.

  8. Javaid Aslam permalink
    November 10, 2023 8:32 pm

    I would like to quote Einstein:
    “It’s not that I’m so smart, it’s just that I stay with problems longer.”

    I wish the strategy that is applied to computers for deep learning were also applied to the effort to resolve the P vs. NP problem.
    The belief in “P != NP” stems from a tacit assumption that all the combinatorics of any polynomially bounded structure are well understood, with P != NP as the underlying axiom.
    It is arrogance.

  9. Frank Vega permalink
    November 15, 2023 8:43 pm

    \textbf{Sometimes bad things could happen for the better.}

    Although my Riemann hypothesis proof passed the first scrutiny of one of the top math journals in number theory, it was rejected by the Editorial Board in the end.

    Something very similar happened in July 2021, when the same journal rejected my manuscript about the Riemann hypothesis; we improved and resubmitted it to The Ramanujan Journal, and it was finally published by Springer in May 2022:

    https://link.springer.com/article/10.1007/s11139-022-00574-4

    Consequently, I did the same this time (improved and resubmitted) with another prestigious journal. This is the final manuscript, submitted in November 2023:

    https://www.researchgate.net/publication/375667085_Criterion_for_the_Riemann_Hypothesis

    The only difference this time is that we are dealing with the final proof of the Riemann hypothesis. Rephrasing a famous quote from the film A Beautiful Mind:

    “I need to believe that something extraordinary is possible because of:”

    \textbf{Sometimes bad things could happen for the better.}

  11. Frank Vega permalink
    November 16, 2023 6:09 am

    \textbf{Sometimes unlucky things could happen for the better.}

    Although my Riemann hypothesis proof passed the first scrutiny of one of the top math journals in number theory, it was rejected by the Editorial Board in the end.

    Something very similar happened in July 2021, when the same journal rejected my manuscript about the Riemann hypothesis; we improved and resubmitted it to The Ramanujan Journal, and it was finally published by Springer in May 2022:

    https://link.springer.com/article/10.1007/s11139-022-00574-4

    Consequently, I did the same this time (improved and resubmitted) with another prestigious journal. This is the ultimate manuscript, based on the Solé and Planat criterion and submitted in November 2023:

    https://www.researchgate.net/publication/375667085_Criterion_for_the_Riemann_Hypothesis

    The only difference this time is that we are dealing with the definite and real proof of the Riemann hypothesis. Rephrasing a famous quote from the film A Beautiful Mind:

    “I need to believe that something extraordinary is possible because of:”

    \textbf{Sometimes unlucky things could happen for the better.}

  12. Frank Vega permalink
    November 22, 2023 5:11 am

    This might be the simplest way to prove the Riemann hypothesis using the Patrick Solé and Michel Planat criterion #mathematics #breakthrough23 #news

    https://www.researchgate.net/publication/375800780_Simple_Proof_for_the_Riemann_Hypothesis

  13. Frank Vega permalink
    November 25, 2023 5:51 am

    This might be the simplest way to prove the Riemann hypothesis #mathematics #breakthrough23 #news

    https://www.researchgate.net/publication/375667085_Criterion_for_the_Riemann_Hypothesis

  14. Frank Vega permalink
    November 27, 2023 11:21 pm

    The Riemann hypothesis is a conjecture that the Riemann zeta function has its zeros only at the negative even integers and complex numbers with real part 1/2. This is the most important unsolved problem in pure mathematics. In August-November 2023, we state a new criterion for the Riemann hypothesis. Using our criterion, we prove that the Riemann hypothesis is true.

    A preliminary version is (or will be) available in the proceedings of MICOPAM 2023 (Date: August 2023):

    https://micopam.com/micopam-2023/

    A much better version was submitted for peer-review in a top math journal (Date: November, 2023):

    https://www.researchgate.net/publication/375960862_Short_Proof_for_the_Riemann_Hypothesis

  16. Frank Vega permalink
    November 28, 2023 12:50 am

    The Riemann hypothesis is a conjecture that the Riemann zeta function has its zeros only at the negative even integers and complex numbers with real part 1/2. This is the most important unsolved problem in pure mathematics. In May 2022, The Ramanujan Journal published my article about a few steps forward in trying to solve this problem:

    Robin’s criterion on divisibility
    https://link.springer.com/article/10.1007/s11139-022-00574-4

    In August-November 2023, we state a new criterion for the Riemann hypothesis based on Patrick Solé and Michel Planat results. Using our criterion, we prove that the Riemann hypothesis is true.

    A preliminary version is (or will be) available in the proceedings of MICOPAM 2023 (Date: August 2023):

    MICOPAM 2023 (Participant 79)
    https://micopam.com/micopam-2023/

    A much better version was submitted for peer-review in a top math journal (Date: November, 2023):

    Short Proof for the Riemann Hypothesis
    https://www.researchgate.net/publication/375960862_Short_Proof_for_the_Riemann_Hypothesis

    Rephrasing a famous quote from the film A Beautiful Mind:

    “I need to believe that something extraordinary is possible…”

  17. Frank Vega permalink
    November 30, 2023 5:24 am

    The Riemann hypothesis is considered by many to be the most important unsolved problem in pure mathematics. There are several statements equivalent to the problem. In November 2023, we state a new criterion for the Riemann hypothesis. Using our criterion, we prove that the Riemann hypothesis is true:

    https://www.researchgate.net/publication/376001711_New_Criterion_for_the_Riemann_Hypothesis

  18. Frank Vega permalink
    December 7, 2023 4:31 am

    In November 2023, we state a new criterion for the Riemann hypothesis. Using our criterion, we prove that the Riemann hypothesis is true. #mathematics #research #news

    https://www.researchgate.net/publication/376001711_New_Criterion_for_the_Riemann_Hypothesis

  19. Frank Vega permalink
    December 12, 2023 9:04 am

    The Riemann hypothesis is considered by many to be the most important unsolved problem in pure mathematics. Using our criterion on superabundant numbers (based on Ramanujan’s work), we prove that the Riemann hypothesis is true:

    https://www.researchgate.net/publication/376416052_Riemann_Hypothesis_on_Superabundant_Numbers

  20. Frank Vega permalink
    December 14, 2023 9:59 am

    The Riemann hypothesis is considered the holy grail of mathematics. The eminent mathematician David Hilbert once said: “If I were to awaken after having slept a thousand years, my first question would be: has the Riemann Hypothesis been proven?” In 1913, Grönwall found what is known today as Grönwall’s function. Years later, the great Ramanujan found that this function had a certain relationship with the Riemann hypothesis. Ramanujan went on to study certain numbers about which his own mentor Hardy confessed to a colleague: “Even Ramanujan could not make highly composite numbers interesting.” Ramanujan had a very long manuscript on highly composite numbers, but parts of it were not published until they were found stored in a Cambridge library at the end of the 20th century. During all those years, the prolific mathematician Erdős rediscovered the properties of these numbers, and they began to be called superabundant and colossally abundant numbers. The French mathematicians Robin and Nicolas developed in their research a closer relationship between these numbers and the Riemann hypothesis. We found another close relationship between the Riemann hypothesis and these kinds of numbers using Grönwall’s function, which announces that the Riemann hypothesis is indeed true:

    https://www.researchgate.net/publication/376489146_An_Equivalent_to_the_Riemann_Hypothesis

    The End 🙂

  21. Frank Vega permalink
    December 16, 2023 7:20 am

    In December 2023, I submitted two Riemann hypothesis proofs to two very prestigious math journals:

    https://www.researchgate.net/publication/376001711_New_Criterion_for_the_Riemann_Hypothesis

    and

    https://www.researchgate.net/publication/376489146_An_Equivalent_to_the_Riemann_Hypothesis

  22. Frank Vega permalink
    December 18, 2023 9:10 pm

    In December 2023, we submitted two Riemann hypothesis proofs to a very prestigious math journal:

    https://www.researchgate.net/publication/376606745_An_Equivalent_to_the_Riemann_Hypothesis

    and

    https://www.researchgate.net/publication/376607859_New_Criterion_for_the_Riemann_Hypothesis
