Arc Forum
1 point by lacker 5945 days ago | link | parent

A bit nitpicky, but memoization alone won't make the solution optimal, since you are dealing with very large numbers. n! has O(n log n) digits, so in the limit it will take at least O(n log n) time just to calculate n! from n * (n-1)!.
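
For example, a quick check (Python sketch, just to show the growth; the exact constants don't matter):

  import math

  # Digit count of n! versus n*log10(n); since log10(n!) <= n*log10(n),
  # the digit count never exceeds n*log10(n) + 1.
  for n in [10, 100, 1000]:
      print(n, len(str(math.factorial(n))), round(n * math.log10(n)))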


1 point by jmatt 5945 days ago | link

I agree memoization isn't optimal.

But I don't know about your big-O estimate. Since the triangle is constructive and you are memoizing, calculating n! from n * (n - 1)! will take O(1). It'll be the cached (n - 1)! value times n.
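
Roughly what I mean, as a Python sketch (the cache is just a dict here):

  # Memoized factorial: each new n! is a single multiply of the cached (n-1)! by n.
  fact_cache = {0: 1}

  def fact(n):
      if n not in fact_cache:
          fact_cache[n] = n * fact(n - 1)
      return fact_cache[n]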

-----

1 point by lacker 5940 days ago | link

Doing a single multiplication is not O(1) when the numbers have more than a constant number of digits. I'm not precisely sure how long multiplying large numbers takes, but it's at least linear in the number of digits involved. And (n-1)! has O(n log n) digits.
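
A rough illustration in Python (timings will vary by machine and interpreter; computing the factorials is just setup for the demo):

  import math, time

  # Time the single multiply n * (n-1)! as (n-1)! gets longer.
  for n in [10_000, 100_000, 1_000_000]:
      prev = math.factorial(n - 1)          # stands in for the memoized value
      start = time.perf_counter()
      _ = n * prev
      print(n, time.perf_counter() - start)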

-----

1 point by eds 5945 days ago | link

I don't follow how n! has O(n log n) digits. Mind explaining?

-----

2 points by lacker 5940 days ago | link

No prob. Sorry if I go on at too much length here ;-)

So a useful formula for counting digits: a positive integer x has floor(log10(x)) + 1 digits. You can figure this out yourself by asking where the digit-counting function bumps up a notch: at 10, 100, 1000, etc. So asymptotically the number of digits in x is O(log x).
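
For instance (Python; math.log10 is floating point, so don't trust this for huge x, but it shows the formula):

  import math

  # floor(log10(x)) + 1 matches the actual digit count.
  for x in [1, 9, 10, 99, 100, 12345]:
      print(x, len(str(x)), math.floor(math.log10(x)) + 1)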

So n! has O(log n!) digits. The trickier part is figuring out or knowing that O(n log n) = O(log n!). Using log ab = log a + log b you can expand out the factorial:

  O(log n!) = O(log (n * (n-1) * ... * 2 * 1))
            = O(log n + log (n-1) + ... + log 2 + log 1)
            = O(n log n)
In case the last step isn't clear, you can do this splitting-in-half bounding trick. Since each element in the sum is at most log n, you can bound from above with

  log n + log (n-1) + ... + log 2 + log 1 < n log n
And if you just take the larger half of the list you can bound from below with

  log n + log (n-1) + ... + log 2 + log 1 > log n + log (n-1) + ... + log (n/2)
                                          > (n/2) log (n/2)
which is itself O(n log n). So O(log n!) = O(n log n).
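
If you want to see the sandwich numerically (Python sketch; the base of the log only changes the constant, so log10 is fine):

  import math

  # (n/2)*log10(n/2) <= log10(n!) <= n*log10(n)
  for n in [10, 100, 1000, 10000]:
      log_fact = sum(math.log10(k) for k in range(2, n + 1))
      print(n, (n / 2) * math.log10(n / 2), log_fact, n * math.log10(n))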

In general, the rules of thumb for reducing an expression like O(log n!) are:

  1. complicated expressions inside factorials are ugly, so simplify them
  2. O(sum of n things) is usually O(n * the biggest thing)
Make sense?

-----

1 point by kens 5945 days ago | link

You're multiplying O(n) numbers, each of which is O(log n) digits long, so the result is O(n log n) digits long.
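
One quick way to see it (Python sketch): the digit count of a product never exceeds the sum of the factors' digit counts.

  import math

  n = 100
  digits_of_factors = sum(len(str(k)) for k in range(1, n + 1))  # O(n) numbers, O(log n) digits each
  digits_of_product = len(str(math.factorial(n)))
  print(digits_of_product, digits_of_factors)  # 158 <= 192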

-----

1 point by shader 5945 days ago | link

Usually, it seems to be either (n log n) or (n log n) - 1 digits.

And in this case I would usually leave off the O, as that usually refers to the performance of an algorithm in time or space. I suppose you could construe the number of digits to be "space", but multiplying O(n) numbers doesn't make that much sense.

-----