{\huge Transcomputational Problems}
Problems beyond brute-force search
Hans-Joachim Bremermann was a mathematician and biophysicist. He is famous for a limit on computation, Bremermann's limit, which is the maximum computational speed of a self-contained system in the material universe.
Today Ken and I want to talk about the limit and why it is not a limit.
A transcomputational problem is a problem that requires processing of more than $latex {10^{93}}&fg=000000$ bits of information. Our friends at Wikipedia say:
Any number greater than this is called a transcomputational number. This number is, according to Bremermann, the total number of bits processed by a hypothetical computer the size of the Earth within a time period equal to the estimated age of the Earth. The term transcomputational was coined by Bremermann.
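For a rough sense of where $latex {10^{93}}&fg=000000$ comes from, here is a back-of-envelope calculation; the round figures are ours and not necessarily the exact values Bremermann used. His limit is about $latex {c^2/h \approx 1.36 \times 10^{50}}&fg=000000$ bits per second per kilogram of mass, the Earth has a mass of roughly $latex {6 \times 10^{24}}&fg=000000$ kilograms, and taking its age as roughly $latex {10^{10}}&fg=000000$ years, or about $latex {3 \times 10^{17}}&fg=000000$ seconds, gives about

$latex {\displaystyle (1.36 \times 10^{50}) \times (6 \times 10^{24}) \times (3 \times 10^{17}) \approx 2.4 \times 10^{92} < 10^{93}}&fg=000000$

bits processed in total.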
Pro
What is interesting is that he thought about ``transcomputational'' problems in 1962. Yes, that is almost a decade before the P=NP problem was stated. See his paper for details.
He noted back then that certain problems were beyond any reasonable brute-force search. In his own words:

The experiences of various groups who work on problem solving, theorem proving and pattern recognition all seem to point in the same direction: These problems are tough. There does not seem to be a royal road or a simple method which at one stroke will solve all our problems. My discussion of ultimate limitations on the speed and amount of data processing may be summarized like this: Problems involving vast numbers of possibilities will not be solved by sheer data processing quantity. We must look for quality, for refinements, for tricks, for every ingenuity that we can think of. Computers faster than those of today will be a great help. We will need them. However, when we are concerned with problems in principle, present day computers are about as fast as they ever will be. We may expect that the technology of data processing will proceed step by step---just as ordinary technology has done. There is an unlimited challenge for ingenuity applied to specific problems. There is also an unending need for general notions and theories to organize the myriad details.
Quite insightful for a paper that dates from a decade before Cook-Karp-Levin on P=NP. It also predates the limits associated with Jacob Bekenstein's bound on the information capacity of a finite region of space and with Rolf Landauer's principle.
One wonders what might have happened if Bremermann's paper had been better known in our theory community. Ken notes that the Russian theory community highlighted the question of perebor---brute-force search. But he senses that the emphasis was on problems for which brute-force search could be necessary in the abstract, rather than on limits tied to Earth-scale considerations like Bremermann's.
Con
Of course there are several issues that his analysis missed. One is quantum computation---I believe all his calculations depend on a classical view of computation. There are several other points that we can raise in an attempt to beat his ``limit.''
$latex {\bullet }&fg=000000$ Change the algorithms: Of course his limit could be applied to computing primality, for example. The brute-force method is hopeless for even modest-sized numbers. Yet we know methods that are much better than brute force, and so we can easily beat his limit; see the sketch below.
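Here is a minimal Python sketch of the contrast, not production code: trial division takes on the order of $latex {\sqrt{n}}&fg=000000$ steps, while the Miller-Rabin test runs in time polynomial in the number of digits of $latex {n}&fg=000000$.

```python
# A minimal sketch, not production code: brute-force trial division
# versus the probabilistic Miller-Rabin test.
import random

def is_prime_brute(n):
    """Trial division: on the order of sqrt(n) steps -- hopeless for large n."""
    if n < 2:
        return False
    d = 2
    while d * d <= n:
        if n % d == 0:
            return False
        d += 1
    return True

def is_prime_miller_rabin(n, rounds=40):
    """Miller-Rabin: time polynomial in the number of digits of n."""
    if n < 2:
        return False
    for p in (2, 3, 5, 7, 11, 13):
        if n % p == 0:
            return n == p
    # Write n - 1 = 2^r * d with d odd.
    d, r = n - 1, 0
    while d % 2 == 0:
        d //= 2
        r += 1
    for _ in range(rounds):
        a = random.randrange(2, n - 1)
        x = pow(a, d, n)
        if x in (1, n - 1):
            continue
        for _ in range(r - 1):
            x = pow(x, 2, n)
            if x == n - 1:
                break
        else:
            return False  # a is a witness that n is composite
    return True  # n is prime with overwhelming probability

# A 39-digit Mersenne prime: instant here, far beyond trial division.
print(is_prime_miller_rabin(2**127 - 1))  # True
```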
Steve Wozniak visited Buffalo yesterday as a UB Distinguished Speaker and, in a small group session attended by Ken, told his standard anecdote about the first program he wrote, which was to solve the Knight's Tour problem on an $latex {8 \times 8}&fg=000000$ chessboard. He first coded a brute-force solution trying all knight moves at each step, but realized before he hit ``run'' that it would take about $latex {10^{25}}&fg=000000$ years. This awakened him, he said, to the fact that good algorithms have to go hand-in-hand with good hardware.
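For the Knight's Tour itself, a simple heuristic already shows the gap. The following is a sketch of Warnsdorff's rule---not the method from the anecdote, just an illustration that a little algorithmic insight replaces $latex {10^{25}}&fg=000000$ years of search with a fraction of a second.

```python
# A sketch of Warnsdorff's rule for the Knight's Tour on an 8x8 board:
# always move to the unvisited square with the fewest onward moves.
# This is a heuristic, not the brute-force search of the anecdote, and
# it finds a tour almost instantly when it succeeds.
MOVES = [(1, 2), (2, 1), (2, -1), (1, -2), (-1, -2), (-2, -1), (-2, 1), (-1, 2)]

def knight_tour(start=(0, 0), n=8):
    visited = {start}
    tour = [start]
    r, c = start
    for _ in range(n * n - 1):
        candidates = []
        for dr, dc in MOVES:
            nr, nc = r + dr, c + dc
            if 0 <= nr < n and 0 <= nc < n and (nr, nc) not in visited:
                onward = sum(1 for er, ec in MOVES
                             if 0 <= nr + er < n and 0 <= nc + ec < n
                             and (nr + er, nc + ec) not in visited)
                candidates.append((onward, nr, nc))
        if not candidates:
            return None  # the heuristic can occasionally get stuck
        _, r, c = min(candidates)  # fewest onward moves; ties broken by position
        visited.add((r, c))
        tour.append((r, c))
    return tour

tour = knight_tour()
print(len(tour) if tour else "stuck")  # 64 squares if the tour completes
```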
$latex {\bullet }&fg=000000$ Change the answers: Another method is to change what we consider to be an answer. Approximation algorithms are of course one important example: allow the answer to be near the optimal one. This has opened the floodgates, greatly increasing the class of problems that we can solve; a small example follows below.
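As a tiny illustration, here is the classic 2-approximation for Vertex Cover: finding a minimum cover is NP-hard, but taking both endpoints of every edge in a greedily built maximal matching gives a cover at most twice the optimum.

```python
# A 2-approximation for Vertex Cover: repeatedly take both endpoints of an
# edge neither of whose endpoints is covered yet (a greedy maximal matching).
# The result is at most twice the size of a minimum vertex cover.
def vertex_cover_2approx(edges):
    cover = set()
    for u, v in edges:
        if u not in cover and v not in cover:
            cover.add(u)
            cover.add(v)
    return cover

# Example: a 5-cycle.  The cover found has size 4; the optimum is 3.
print(vertex_cover_2approx([(0, 1), (1, 2), (2, 3), (3, 4), (4, 0)]))
```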
$latex {\bullet }&fg=000000$ Change the problems: Another method is to change the problems that we attack. In many cases we can avoid general problems and exploit the special structure of a problem. Examples that come to mind include: replace dense matrices by sparse ones; replace arbitrary graphs by planar ones or those with restricted minors; and replace data analysis of arbitrary data sets by analysis of data generated with specific noise, such as Gaussian noise. A small sparse-matrix sketch follows below.
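For instance, here is a small sketch, in plain Python with a made-up row-list format, of a matrix-vector product that exploits sparsity: the work is proportional to the number of nonzero entries rather than to $latex {n^2}&fg=000000$.

```python
# A sketch of exploiting sparsity: store each row as a list of
# (column, value) pairs for its nonzeros, so a matrix-vector product costs
# time proportional to the number of nonzeros rather than n^2.
def sparse_matvec(rows, x):
    """rows[i] is a list of (column, value) pairs for the nonzeros of row i."""
    return [sum(value * x[col] for col, value in row) for row in rows]

# A 4x4 matrix with only 5 nonzero entries.
rows = [[(0, 2.0)], [(1, 3.0), (3, 1.0)], [(2, 4.0)], [(0, -1.0)]]
print(sparse_matvec(rows, [1.0, 1.0, 1.0, 1.0]))  # [2.0, 4.0, 4.0, -1.0]
```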
$latex {\bullet }&fg=000000$ Change the world: We have posted about the idea of a world without true randomness, presenting Leonid Levin's proof that SAT is nearly polynomially solvable in such a world. That post offered the weaker idea that every efficient generator of SAT instances might be solved by Levin's algorithm on all but finitely many instances. The finite bound might be huge, but the fact of Levin's algorithm would carry weight: it would solve everything else based solely on the principle that ``nothing succeeds like success.'' We can put this idea in concrete terms like Bremermann's:
Could we live in a world where the act of creating an instance that requires processing more than $latex {10^{93}}&fg=000000$ bits of information requires processing more than $latex {10^{93}}&fg=000000$ bits of information?
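To make the structure of Levin-style search concrete, here is a toy Python sketch under assumptions of our own: the candidate ``programs'' are just simple guessing strategies rather than a true enumeration of all programs, and a single fixed budget stands in for the doubling schedule of the real algorithm. The point it illustrates is the ``nothing succeeds like success'' principle: every proposed answer is verified, so a bad candidate can waste time but can never give a wrong answer.

```python
# A toy sketch of the structure behind Levin-style universal search for SAT.
# The candidate "programs" below are made-up guessing strategies, not a real
# enumeration of all programs, and a fixed budget replaces the doubling
# schedule of the true algorithm.  The essential feature survives: candidate
# i gets a time share of about 2^-(i+1), and every proposed answer is
# verified, so a bad candidate can only waste time, never mislead us.
import itertools, random

def satisfies(clauses, assignment):
    """CNF check: each clause is a list of signed variable indices, e.g. [1, -2]."""
    return all(any((lit > 0) == assignment[abs(lit)] for lit in clause)
               for clause in clauses)

def candidate(i, clauses, nvars, steps):
    """Hypothetical i-th candidate solver: random guessing seeded by i."""
    rng = random.Random(i)
    for _ in range(steps):
        guess = {v: rng.random() < 0.5 for v in range(1, nvars + 1)}
        if satisfies(clauses, guess):
            return guess
    return None

def levin_style_search(clauses, nvars, budget=1 << 16):
    for i in itertools.count():
        steps = budget >> (i + 1)  # candidate i gets a 2^-(i+1) share
        if steps == 0:
            return None  # budget exhausted; the real algorithm would double it
        result = candidate(i, clauses, nvars, steps)
        if result is not None and satisfies(clauses, result):
            return result  # verified, so it is certainly correct

# (x1 or x2) and (not x1 or x2) and (not x2 or x3)
print(levin_style_search([[1, 2], [-1, 2], [-2, 3]], nvars=3))
```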
We note that Larry Stockmeyer proved that every Boolean circuit capable of deciding a certain class of logical formulas $latex {f}&fg=000000$ that fit into 8 lines of 80-column ASCII text must have more gates than atoms in the observable universe. But this does not rule out a little algorithm $latex {A}&fg=000000$ solving every $latex {f}&fg=000000$ that we could generate---unless we spend the time to cycle through every legal formula that fits into 640 characters.
Open Problems
Are there realistic limits on computation of the type that Bremermann postulated? What are the right limits in light of today's insights into computation?