{\huge A Magic Madison Visit}

To give a Hilldale Lecture and learn about fairness and dichotomies

\vspace{.5in}

Jin-Yi Cai was kind enough to help get me, Dick, invited last month to give the Hilldale Lecture in the Physical Sciences for 2017-2018. These lectures are held at The University of Wisconsin-Madison and are supported by the Hilldale Foundation. The lectures started in 1973-1974, which is about the time I started at Yale University---my first faculty appointment.

Today Ken and I wish to talk about my recent visit and then appreciate something about Jin-Yi's work on ``dichotomies'' between polynomial time and $latex {\mathsf{\#P}}&fg=000000$-completeness.

The Hilldale Lectures have four tracks: Arts & Humanities and Physical, Biological, and Social Sciences. Not all have a speaker each year. The Arts & Humanities speaker was Yves Citton of the University of Paris last April, and in Social Sciences, Peter Bearman of Columbia spoke on Sept. 28, three weeks before I did. Last year's speaker in the Physical Sciences track was Frank Shu of Berkeley and UCSD on the economics of climate change. Before him came my former colleagues Peter Sarnak and Bill Cook. Dick Karp was invited in 2004, and mathematicians Barry Mazur and Persi Diaconis came between him and Cook.

I am delighted and honored to be in all this company. We all may not seem to have much to do with the physical sciences but let's see. I spoke on ($latex {\cdots}&fg=000000$) a subject for another time---let this post be about my hosts.

My Visit

The other highlight of my visit was meeting with the faculty of the CS department and also with the graduate students. I hope they enjoyed our discussions as much as I did.

One topic that came up multiple times is the notion of fair algorithms. This is a relatively new notion and is being studied by several researchers at Wisconsin. The area has its own blog. The authors of that blog also wrote a paper titled, ``On the (Im)Possibility of Fairness,'' whose abstract we quote:

What does it mean for an algorithm to be fair? Different papers use different notions of algorithmic fairness, and although these appear internally consistent, they also seem mutually incompatible. We present a mathematical setting in which the distinctions in previous papers can be made formal. In addition to characterizing the spaces of inputs (the ``observed'' space) and outputs (the ``decision'' space), we introduce the notion of a construct space: a space that captures unobservable, but meaningful variables for the prediction. We show that in order to prove desirable properties of the entire decision-making process, different mechanisms for fairness require different assumptions about the nature of the mapping from construct space to decision space. The results in this paper imply that future treatments of algorithmic fairness should more explicitly state assumptions about the relationship between constructs and observations.

There is a notion of fairness in distributed algorithms but this is different. The former is about the allocation of system resources so that all tasks receive due processing attention. The latter has to do with due process in social decision making where algorithmic models have taken the lead. Titles of academic papers cited by a paper by three of the people I met in Madison and someone from Microsoft speak to why the subject has arisen:

Also among the paper's 29 references are newspaper and magazine articles whose titles state the issues with less academic reserve: ``Websites vary prices, deals based on users’ information''; ``Who do you blame when an algorithm gets you fired?''; ``Machine bias: There's software used across the country to predict future criminals. And it’s biased against blacks''; ``The dangers of letting algorithms enforce policy.'' Even a May 2014 statement by the Obama White House is cited.

Yet also among the references are papers familiar in theory: ``Satisfiability modulo theories''; ``Complexity of polytope volume computation'' (by Leonid Khachiyan no less); ``On the complexity of computing the volume of a polyhedron''; ``Hyperproperties'' (by Michael Clarkson and Fred Schneider); ``On probabilistic inference by weighted model counting.'' What's going on?

The Formal Method

What's going on can be classed as a meta-example of the subject's own purpose:

How does one formalize a bias-combating concept such as fairness without instilling the very kind of bias one is trying to combat?

We all can see the direction of bias in the above references. You might think that framing concepts to apply bias in the other direction might be OK but there's a difference. Bias in a measuring apparatus is more ingrained than bias in results. What we want to do---as scientists---is to formulate criteria that are framed in terms apart from those of the applications in a simple, neutral, and natural manner. Then we hope the resulting formal definition distinguishes the outcomes we desire from those we do not and stays robust and consistent in its applications.

This is the debate---at `meta' level---that Ken and I see underlying the two papers we've highlighted above. We blogged about Kenneth Arrow's discovery of the impossibility of formalizing a desirable ``fairness'' notion for voting systems. The blog guys don't find such a stark impossibility theorem but they say that to avoid issues with analyzing inputs and outcomes, one has to attend also to some kind of ``hidden variables.'' The paper by the Madison people tries to ground a framework in formal methods for program verification, which connects to probabilistic inference via polytope volume computations.

Many other ingredients from theory can be involved. The basic idea is determining sensitivity of outcomes to various facets of the inputs. The inputs are weighted for relevance to an objective. Fairness is judged according to how well sensitivity corresponds to relevance and also to how the distribution of subjects receiving favorable decisions breaks according to low-weight factors such as gender. Exceptions may be made by encoding some relations as indelible constraints---the Madison plus Microsoft paper gives as an example that a Catholic priest must be male.
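
To make the idea concrete, here is a minimal sketch in Python---our own toy illustration, not code from any of the papers above; the scoring rule, feature names, and relevance weights are all hypothetical. It estimates per-feature sensitivity of a decision rule by perturbing one feature at a time, compares that to each feature's stated relevance, and checks how favorable decisions split across a low-relevance attribute.

```python
# Toy sketch (hypothetical rule and features, not from the cited papers):
# measure per-feature sensitivity of a decision rule and compare it to the
# stated relevance weights; then break down acceptance rates by a
# low-relevance attribute such as gender.
import random
random.seed(0)

FEATURES  = ["test_score", "experience", "gender"]      # hypothetical features
RELEVANCE = {"test_score": 0.6, "experience": 0.4, "gender": 0.0}

def decide(person):
    # Hypothetical decision rule; a fair rule should ignore "gender".
    score = 0.6 * person["test_score"] + 0.4 * person["experience"]
    return score >= 0.5

population = [{"test_score": random.random(),
               "experience": random.random(),
               "gender": random.choice([0, 1])} for _ in range(10000)]

# Sensitivity: how often does perturbing one feature change the outcome?
for feat in FEATURES:
    changed = 0
    for p in population:
        q = dict(p)
        q[feat] = 1 - q[feat] if feat == "gender" else random.random()
        changed += decide(p) != decide(q)
    print(f"{feat}: sensitivity={changed / len(population):.2f}  "
          f"relevance={RELEVANCE[feat]}")

# Breakdown: acceptance rate by the low-relevance attribute.
for g in (0, 1):
    group = [p for p in population if p["gender"] == g]
    rate = sum(decide(p) for p in group) / len(group)
    print(f"gender={g}: acceptance rate={rate:.2f}")
```

In this toy rule the gender feature has zero sensitivity and the two groups' acceptance rates match up to sampling noise; a rule that leaked the low-relevance attribute would show a mismatch in one check or the other.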

Thus we see Boolean sensitivity measures, ideas of juntas and dictators, constraint satisfaction problems, optimization over polytopes---lots of things I've known and sometimes studied in less-particular contexts. My Madison hosts brought up how Gaussian distributions are robust for this analysis because they have several invariance properties, including invariance under rotations. This recent Georgetown thesis mixes in even more theory ideas. The meta-question is:

Can all these formal ingredients combine to yield the desired outcomes in ways whose scientific simplicity and naturalness promote confidence in them?

Thus wading in with theory to a vast social area like this strikes us as a trial of ``The Formal Method.'' Well, there is the Hilldale Social Sciences track...

Did ``hidden variables'' bring quantum to your mind? We are going there next, with Ken writing now.

A Magic Madison Upper Bound

We covered Jin-Yi's work in 2014 and 2012 and 2009. So you could say that in 2017 we're ``due,'' but here we'll settle for a short remark. All of these were on dichotomy theorems. For a wide class $latex {\mathcal{C}}&fg=000000$ of counting problems in $latex {\mathsf{\#P}}&fg=000000$ his dichotomy is that every problem in $latex {\mathcal{C}}&fg=000000$ either belongs to polynomial time or is $latex {\mathsf{\#P}}&fg=000000$-complete. There are no in-between cases---that's the meaning of dichotomy.

Jin-Yi's answer to a question that arose last month between Dick and me brought home to us both how wonderfully penetrating and hair-trigger this work is. Dick had added some contributions to the paper covered in the 2009 post and was included on the paper's final 2010 workshop version. Among its results is the beautiful theorem highlighted at the end of that post:

Theorem 1 There is an algorithm that given any $latex {K > 0}&fg=000000$ and formula for a quadratic polynomial $latex {f(x_1,\dots,x_n)}&fg=000000$ over $latex {\mathbb{Z}_K}&fg=000000$ computes the exponential sum

$latex \displaystyle Z(f) = \sum_{x_1,\dots,x_n \in \mathbb{Z}_K} \omega^{f(x_1,\dots,x_n)} = \sum_{j=0}^{K-1} F_j \omega^j &fg=000000$

exactly in time that is polynomial in both $latex {n}&fg=000000$ and $latex {\log K}&fg=000000$. Here $latex {\omega = e^{2\pi i/K}}&fg=000000$ and $latex {F_j}&fg=000000$ means the number of arguments on which $latex {f}&fg=000000$ takes the value $latex {j}&fg=000000$ modulo $latex {K}&fg=000000$.

That the time is polynomial in $latex {\log K}&fg=000000$ not $latex {K}&fg=000000$ is magic. We can further compute the individual weights $latex {F_j}&fg=000000$ by considering also the resulting expressions for $latex {Z(2f),Z(3f),\dots,Z((K-1)f)}&fg=000000$. Together with $latex {\sum_j F_j = K^n}&fg=000000$ they give us $latex {K}&fg=000000$ equations in the $latex {K}&fg=000000$ unknowns $latex {F_j}&fg=000000$ in the form of a Vandermonde system, which is always solvable. Solving that takes time polynomial in $latex {K}&fg=000000$, though, and we know no faster way of computing any given $latex {F_j}&fg=000000$.
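
Here is a brute-force sanity check of that recovery step---a toy computation in Python that does not implement the polynomial-time algorithm of Theorem 1, and uses an arbitrary made-up quadratic $latex {f}&fg=000000$. It evaluates the sums $latex {Z(cf)}&fg=000000$ for $latex {c = 0,\dots,K-1}&fg=000000$ directly (the case $latex {c=0}&fg=000000$ is just $latex {\sum_j F_j = K^n}&fg=000000$) and then solves the Vandermonde system to recover the counts $latex {F_j}&fg=000000$.

```python
# Brute-force toy check (not the algorithm of Theorem 1): the values Z(c*f),
# c = 0,...,K-1, form a Vandermonde system whose solution recovers the F_j.
import itertools
from collections import Counter
import numpy as np

K, n = 5, 3                      # modulus and number of variables
omega = np.exp(2j * np.pi / K)   # primitive K-th root of unity

def f(x):                        # an arbitrary toy quadratic over Z_K
    x1, x2, x3 = x
    return (x1*x1 + 2*x1*x2 + 3*x2*x3 + x3 + 4) % K

values = [f(x) for x in itertools.product(range(K), repeat=n)]
F_true = Counter(values)         # F_j = #{x : f(x) = j}, for checking only

# Z(c*f) = sum_x omega^(c*f(x)) = sum_j F_j * omega^(c*j)
Z = np.array([sum(omega**((c * v) % K) for v in values) for c in range(K)])

# Vandermonde matrix V[c][j] = omega^(c*j); solving V F = Z recovers the F_j.
V = np.array([[omega**((c * j) % K) for j in range(K)] for c in range(K)])
F_rec = np.linalg.solve(V, Z).real.round().astype(int)

print("recovered:", list(F_rec))
print("direct:   ", [F_true[j] for j in range(K)])
```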

When $latex {K}&fg=000000$ is fixed, however, polynomial in $latex {n}&fg=000000$ is all we need to say. So the upshot is that for any fixed modulus, solution counting for polynomials of degree $latex {d = 2}&fg=000000$ is in $latex {\mathsf{P}}&fg=000000$. Andrzej Ehrenfeucht and Marek Karpinski proved this modulo primes and also that the solution-counting problem for degree $latex {d = 3}&fg=000000$ is $latex {\mathsf{\#P}}&fg=000000$-complete even for $latex {K = 2}&fg=000000$. So the flip from $latex {\mathsf{P}}&fg=000000$ to $latex {\mathsf{\#P}}&fg=000000$-complete when $latex {d}&fg=000000$ steps from $latex {2}&fg=000000$ to $latex {3}&fg=000000$ is an older instance of dichotomy. The newer one, however, is for polynomials of the same degree $latex {d = 2}&fg=000000$.

Dichotomy and Quantum

I wrote a post five years ago on my joint work with Amlan Chakrabarti for reducing the simulation of quantum circuits to counting solutions in $latex {\{0,1\}^n}&fg=000000$. One motive is to identify which subclasses of quantum circuits might yield tractable cases of counting. The classic---pun intended---case is the theorem that all circuits of so-called Clifford gates can be simulated in classical polynomial time (not even randomized). I observed that such circuits yield polynomials $latex {f}&fg=000000$ over $latex {\mathbb{Z}_4}&fg=000000$ that are sums of terms of the form

$latex \displaystyle x^2 \qquad\qquad\text{or}\qquad\qquad 2xy. &fg=000000$

These terms are invariant on replacing $latex {x}&fg=000000$ by $latex {x+2}&fg=000000$ or $latex {y}&fg=000000$ by $latex {y+2}&fg=000000$ modulo $latex {4}&fg=000000$. Hence for such $latex {f}&fg=000000$ there is an exactly $latex {2^n}&fg=000000$-to-$latex {1}&fg=000000$ correspondence between solutions in $latex {\{0,1\}^n}&fg=000000$ and those in $latex {\mathbb{Z}_4^n}&fg=000000$. Since counting the latter is in $latex {\mathsf{P}}&fg=000000$ by Theorem 1, the stabilizer theorem follows.
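
A small numerical check of that correspondence, on an arbitrary toy polynomial of this shape (made up for illustration, not produced from an actual Clifford circuit):

```python
# Toy check: when f over Z_4 is a sum of terms x^2 and 2xy, shifting any
# variable by 2 leaves f unchanged mod 4, so each count F_j = #{x : f(x) = j}
# over Z_4^n is exactly 2^n times the corresponding count over {0,1}^n.
import itertools
from collections import Counter

n = 3

def f(x):                        # a made-up "Clifford-style" polynomial
    x1, x2, x3 = x
    return (x1*x1 + 2*x1*x2 + 2*x2*x3 + x3*x3) % 4

full   = Counter(f(x) for x in itertools.product(range(4), repeat=n))
binary = Counter(f(x) for x in itertools.product(range(2), repeat=n))

for j in range(4):
    assert full[j] == (2**n) * binary[j]
print("Z_4 counts:", dict(full), " binary counts:", dict(binary))
```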

Adding any non-Clifford gate makes a set that is universal---i.e., has the full power of $latex {\mathsf{BQP}}&fg=000000$. The gates I thought of at the time of my post all bumped the degree up from $latex {2}&fg=000000$ to $latex {3}&fg=000000$ or more. A related but different representation by David Bacon, Wim van Dam, and Alexander Russell gives a dichotomy of linear versus higher degree. The controlled-phase gate, however, is non-Clifford but in my scheme produces polynomials $latex {f'}&fg=000000$ as sums of terms of the form

$latex \displaystyle x^2 \qquad\qquad\text{or}\qquad\qquad xy. &fg=000000$

Those are quadratic too, so Theorem 1 counts all the solutions in polynomial time. Does this make $latex {\mathsf{BQP = P}}&fg=000000$? The hitch is that quantum needs counting binary solutions---and having $latex {xy}&fg=000000$ not $latex {2xy}&fg=000000$ defeats the above exact correspondence.
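
The same toy check with a bare $latex {xy}&fg=000000$ term (coefficient $latex {1}&fg=000000$, not $latex {2}&fg=000000$) shows the correspondence breaking:

```python
# With a cross term xy of coefficient 1 the shift-by-2 invariance fails, and
# the counts over Z_4^n are no longer 2^n times the binary counts.
import itertools
from collections import Counter

n = 2

def g(x):
    x1, x2 = x
    return (x1*x1 + x1*x2) % 4   # made-up example with a bare xy term

full   = Counter(g(x) for x in itertools.product(range(4), repeat=n))
binary = Counter(g(x) for x in itertools.product(range(2), repeat=n))
print("Z_4 counts:   ", {j: full[j] for j in range(4)})
print("2^n * binary: ", {j: (2**n) * binary[j] for j in range(4)})
```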

I thought maybe the counting problem for quadratic-and-binary could be intermediate---perhaps at the level of $latex {\mathsf{BQP}}&fg=000000$ itself. But Jin-Yi came right back with the answer that his dichotomy cuts right there: this 2014 paper with his students Pinyan Lu and Mingji Xia has a general framework for CSPs that drops down to say the problem is $latex {\mathsf{\#P}}&fg=000000$-complete. Thus the state of play is:

  1. Counting all solutions of quadratic polynomials over $latex {\mathbb{Z}_4}&fg=000000$ is easy.
  2. Counting the binary solutions however is hard: $latex {\mathsf{\#P}}&fg=000000$-complete.
  3. Counting the binary solutions when every term of the form $latex {xy}&fg=000000$ has coefficient $latex {2}&fg=000000$ is easy again.

Thus the difference between easy and general quantum circuits hangs on that `$latex {2}&fg=000000$'---a coefficient, not an exponent---as does factoring (not?) being in $latex {\mathsf{P}}&fg=000000$. Of course this doesn't mean quantum circuits are $latex {\mathsf{\#P}}&fg=000000$-complete---they are generally believed not to be even $latex {\mathsf{NP}}&fg=000000$-hard---but rather that saying $latex {f'}&fg=000000$ has terms of the form $latex {x^2}&fg=000000$ and $latex {xy}&fg=000000$ captures ostensibly more than $latex {\mathsf{BQP}}&fg=000000$.

Might there be some other easily-identified structural properties of the $latex {f'}&fg=000000$ produced (say) by circuits of Hadamard and controlled-phase gates that make the counting problem intermediate between $latex {\mathsf{P}}&fg=000000$ and $latex {\mathsf{\#P}}&fg=000000$, if not exactly capturing $latex {\mathsf{BQP}}&fg=000000$? Well, the dichotomy grows finer and richer and stronger with each new paper by Jin-Yi and his group. This feeds into Jin-Yi's most subtle argument for believing $latex {\mathsf{\#P \neq P}}&fg=000000$ as we said it here, echoing reasons expounded by Scott Aaronson and others for $latex {\mathsf{NP \neq P}}&fg=000000$: if the classes were equal there would be no evident basis for our experiencing such fine and sharp distinctions.

Open Problems

Trying to make up for a blog pause while we were busy with a certain seasonal event, we offer several open problems: