13

Is there a practical way to compute the permanent of a large ($91 \times 91$) $(0,1)$ matrix?

I have tried to use the MATLAB function written by Luke Winslow, which works well for smaller matrices, but it gets nowhere at this size.

UPDATE: I am thinking of the incidence matrix of a projective plane of order 9. Does that help?

  • Is there any symmetry or reduction that you can exploit? (An obvious example would be that it has a block submatrix all with the same entry.) Gerhard "Ask Me About Binary Matrices" Paseman, 2013.04.28 – Gerhard Paseman Apr 28 '13 at 20:10
  • Have you tried Brian Butler's code?

    https://sites.google.com/site/brianbutlerengineer/home/research

    (scroll to bottom of page)

    – Carlo Beenakker Apr 28 '13 at 20:35
  • 2
    My first guess would be no, unless the matrix has some properties that help (for instance, if it is the adjacency matrix of a planar graph): in general, computing permanents is a hard problem (cf. http://en.wikipedia.org/wiki/Permanent_is_sharp-P-complete ) – Victor Kleptsyn Apr 28 '13 at 20:47
  • Do you need the exact value or would an approximation suffice? –  Apr 28 '13 at 20:58
  • If your matrix is an integer matrix, it should be possible to compute the last few bits in the base 2 expansion. Maybe that would be helpful? – John Wiltshire-Gordon Apr 28 '13 at 23:04
  • 1
    If the matrix is extremely sparse it could be possible, but it is much bigger than the problems that are usually solvable. – Brendan McKay Apr 29 '13 at 02:39
  • Perhaps there is a small submatrix of this that might be hard to compute. If you tell us more about the instance, perhaps we can suggest something. Gerhard "Ask Me About Smaller Cases" Paseman, 2013.04.28 – Gerhard Paseman Apr 29 '13 at 02:44
  • If I understand Ryser's formula correctly, you can use it to great effect, since it will amount to (something like) 9^91. Gerhard "Invites Verification Of This Estimate" Paseman, 2013.04.29 – Gerhard Paseman Apr 29 '13 at 13:33
  • I now understand Ryser's formula better, and the answer will not be as simple. However, you can use it to approximate the permanent, and I am guessing the answer will be near 9^91. Gerhard "Give Or Take A Googol" Paseman, 2013.04.29 – Gerhard Paseman Apr 29 '13 at 15:31
  • @quid an approximation might do, depending how good it is. – Felix Goldberg Apr 29 '13 at 16:16
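Ryser's formula, mentioned in the comments above, is the standard exact method, and a minimal sketch makes clear why $n = 91$ is out of reach: it sums over roughly $2^{91}$ column subsets. Here is the plain inclusion-exclusion version in Python, without the Gray-code speedup (function name is mine, for illustration only):

```python
from itertools import combinations

def ryser_permanent(A):
    """Exact permanent of an n x n matrix via Ryser's formula:
    per(A) = (-1)^n * sum over nonempty column subsets S of
             (-1)^{|S|} * prod_i (sum_{j in S} A[i][j]).
    Runs in O(2^n * n^2) time -- hopeless for n = 91, fine for small n."""
    n = len(A)
    total = 0
    for r in range(1, n + 1):
        for cols in combinations(range(n), r):
            prod = 1
            for row in A:
                prod *= sum(row[j] for j in cols)
            total += (-1) ** r * prod
    return (-1) ** n * total
```

For example, `ryser_permanent([[1, 1], [1, 1]])` returns 2, the number of permutations of a $2 \times 2$ all-ones matrix.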

2 Answers

8

The answer is unfortunately probably no, but there are a few things you could try.

There are algorithms that run in time polynomial in the value of the permanent, meaning that the permanent can be computed quickly if its value is small, but that is not going to help you for the incidence matrix of a projective plane of order 9, whose permanent will be huge. I think getting the exact value in your example is hopeless unless you can exploit the geometric structure to dramatically reduce the size of the computation.
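One scrap of exact information is cheap, though, as a comment above observes: over GF(2) the signs in the determinant vanish, so the permanent mod 2 equals the determinant mod 2, computable by Gaussian elimination in $O(n^3)$ time. A sketch (function name is mine):

```python
def permanent_mod2(A):
    """Permanent of A mod 2. Since -1 = 1 over GF(2), per(A) = det(A) mod 2,
    which Gaussian elimination over GF(2) computes in O(n^3)."""
    M = [[x & 1 for x in row] for row in A]
    n = len(M)
    for col in range(n):
        # Find a pivot row with a 1 in this column.
        pivot = next((r for r in range(col, n) if M[r][col]), None)
        if pivot is None:
            return 0  # singular over GF(2), so the permanent is even
        M[col], M[pivot] = M[pivot], M[col]
        # Clear the column below the pivot (XOR = addition mod 2).
        for r in range(col + 1, n):
            if M[r][col]:
                M[r] = [a ^ b for a, b in zip(M[r], M[col])]
    return 1  # full rank: all pivots are 1, so det mod 2 is 1
```

This only tells you the parity, but it is one of the few things about the permanent that scales to $n = 91$ without trouble.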

If you only want an approximation, then there's a little bit more hope. Famously, Jerrum, Sinclair and Vigoda exhibited a fully polynomial randomized approximation scheme. This was a fantastic theoretical achievement, but in practice the algorithm is not all that fast (having high exponents and multiplicative constants) and is not at all trivial to implement.

Your best bet may be to look at recent work on using belief propagation to approximate the permanent. I am not up to speed on the latest developments but I would start by contacting the authors of "Approximating the Permanent with Fractional Belief Propagation" for suggestions.

Timothy Chow
  • Timothy, when you write that there are algorithms that run in time polynomial in the value of the permanent, I assume you mean for nonnegative matrices only! For general matrices, I believe even deciding whether the permanent is (say) 0 or 1 is #P-hard under nondeterministic reductions. – Scott Aaronson Apr 29 '13 at 21:32
  • Scott, yes, I was assuming 0-1 matrices (or more generally nonnegative matrices) throughout; perhaps I should have been more explicit. The Jerrum-Sinclair-Vigoda result is also for nonnegative matrices only. – Timothy Chow Apr 30 '13 at 02:10
  • A general estimation method that I think would work here starts by defining any backtrack program that counts the matchings in this bipartite graph. Then estimate the number of nodes at the right level of the search tree by using Knuth's method that involves tracing random paths. This gives an unbiased estimator, though it can be hard to get a reliable estimate of the precision. – Brendan McKay Apr 30 '13 at 11:09
  • Brendan, the trouble with Knuth's method is that even though the mean of the estimator is correct, in many cases the distribution is highly skewed, so that with only polynomially many samples you will, with high probability, get a gross underestimate of the tree size. I can't be sure, but based on my past experience with similar problems, that's what I'd predict would happen for Felix's problem. This behavior of Knuth's algorithm is one reason why it's so remarkable that the MCMC methods give you guaranteed error bounds in polynomial time. – Timothy Chow Apr 30 '13 at 14:32
  • 2
    Yes, I've seen that happen too, but I've also had some successes. One thing that I've found to help a lot is to collapse some number of lower levels in the tree into one level (so when you get to that level you perform an exhaustive search). That takes care of the situation where most paths die out just before the end. Another modification that should help here would be to define the tree to exclude all useless branches (using a flow algorithm for example) so that no paths die out at all. It will be much slower but also much less skewed I think. – Brendan McKay Apr 30 '13 at 16:09
  • 2
    (Ran out of characters) There is another method where you act like you are generating the whole search tree, but toss a coin at each node to decide whether or not to make its children. The bias of the coin, perhaps different at each level, is adjusted in advance so that what remains is not too large but still has a fair number of leaves. I've made some spot-on predictions using this method, but I've never seen it analyzed. I don't think it is equivalent to Knuth's method. – Brendan McKay Apr 30 '13 at 16:15
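Knuth's random-path estimator discussed above can be made concrete for counting perfect matchings of a 0-1 matrix: walk down the search tree matching one row at a time, pick each extension uniformly at random, and multiply the branching factors along the path. The mean over trials is exactly the permanent, though, as the comments warn, the variance can be enormous. An illustrative sketch only (function name is mine):

```python
import random

def knuth_matching_estimate(A, trials=10000, rng=random):
    """Unbiased estimate of per(A) = number of perfect matchings in the
    bipartite graph with 0-1 biadjacency matrix A, via Knuth's
    random-path tree-size estimator."""
    n = len(A)
    total = 0
    for _ in range(trials):
        used = set()
        weight = 1
        for i in range(n):
            # Columns still available to match row i.
            choices = [j for j in range(n) if A[i][j] and j not in used]
            if not choices:
                weight = 0  # dead end: this path contributes nothing
                break
            weight *= len(choices)  # branching factor at this level
            used.add(rng.choice(choices))
        total += weight
    return total / trials
```

On a matrix where every path survives, such as the all-ones matrix, every trial returns the exact count; the skewness Timothy Chow describes shows up when most random paths die out early while a few carry huge weights.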
1

The general problem is very hard (#P-complete in complexity terms). A quick web search found this article that might be helpful:

none
  • 11
  • A WSEAS conference proceedings? That does not necessarily mean the paper is a load of rubbish, but one should exercise a great deal of caution. – Emil Jeřábek Apr 29 '13 at 11:44