Thursday, 24 September 2009

BINOMIAL MATRIX (II)

If $latex A_n$ is a square matrix with elements:

$latex \displaystyle a_{i,j}=\binom{i+k}{j}$

Then:

$latex \displaystyle |A_{n}|=\binom{n+k}{n}$
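Before the proof, the claim can be sanity-checked numerically. A minimal Python sketch with exact rational arithmetic, assuming $latex k$ is a non-negative integer (the `det` helper is mine, not part of the original notebook):

```python
from fractions import Fraction
from math import comb

def det(m):
    """Determinant by Gaussian elimination over exact Fractions."""
    m = [[Fraction(x) for x in row] for row in m]
    n = len(m)
    d = Fraction(1)
    for c in range(n):
        # find a non-zero pivot in column c
        p = next((r for r in range(c, n) if m[r][c] != 0), None)
        if p is None:
            return Fraction(0)
        if p != c:
            m[c], m[p] = m[p], m[c]
            d = -d
        d *= m[c][c]
        for r in range(c + 1, n):
            f = m[r][c] / m[c][c]
            for j in range(c, n):
                m[r][j] -= f * m[c][j]
    return d

# a_{i,j} = C(i+k, j); check |A_n| = C(n+k, n) for small n and k
for n in range(1, 7):
    for k in range(0, 6):
        A = [[comb(i + k, j) for j in range(1, n + 1)] for i in range(1, n + 1)]
        assert det(A) == comb(n + k, n)
print("det(A_n) = C(n+k, n) verified for n <= 6, k <= 5")
```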


Proof:

The $latex \displaystyle A_{n}$ matrix can be decomposed as the product of a lower triangular matrix, $latex \displaystyle L_{n}$, and an upper triangular matrix, $latex \displaystyle U_{n}$:

$latex \displaystyle A_{n}=L_{n}*U_{n}$

Example:

$latex \displaystyle A_{3}=\left(\begin{array}{ccc} 1+k & \frac{1}{2} k (1+k) & \frac{1}{6} (-1+k) k (1+k) \\ 2+k & \frac{1}{2} (1+k) (2+k) & \frac{1}{6} k (1+k) (2+k) \\ 3+k & \frac{1}{2} (2+k) (3+k) & \frac{1}{6} (1+k) (2+k) (3+k) \end{array} \right)$

$latex \displaystyle L_{3}=\left(\begin{array}{ccc} 1 & 0 & 0 \\ \frac{2+k}{1+k} & 1 & 0 \\ \frac{3+k}{1+k} & \frac{2 (3+k)}{2+k} & 1 \end{array} \right)$

$latex \displaystyle U_{3}=\left( \begin{array}{ccc} 1+k & \frac{1}{2} k (1+k) & \frac{1}{6} (-1+k) k (1+k) \\ 0 & \frac{2+k}{2} & \frac{1}{3} k (2+k) \\ 0 & 0 & \frac{3+k}{3}\end{array} \right)$

After some mathematical "plumbing", we can propose the decomposition:

$latex \displaystyle l_{i,j}=\frac{i + k}{j + k} \binom{i - 1}{j - 1}$

$latex \displaystyle u_{i,j}=\binom{j - 1}{i - 1} \binom{i + k}{j} {\binom{k + i - 1}{i - 1}}^{-1}$

If this decomposition is correct, then the matrix product should equal $latex \displaystyle a_{i,j}$:

$latex \displaystyle \sum_{r=1}^n{l_{i,r} \cdot u_{r,j}}=\sum_{r=1}^{min(i,j)}{l_{i,r} \cdot u_{r,j}}$

Where:

$latex \displaystyle l_{i,r}=\frac{i + k}{r + k} \binom{i - 1}{r - 1}$

$latex \displaystyle u_{r,j}=\binom{j - 1}{r - 1} \binom{r + k}{j} {\binom{k + r - 1}{r - 1}}^{-1}$

The product $latex \displaystyle l_{i,r} \cdot u_{r,j}$ can be simplified, by rewriting the binomial coefficients in terms of the gamma function, to:

$latex \displaystyle l_{i,r} \cdot u_{r,j}=\frac{(i+k)\cdot r}{i \cdot j}\cdot\binom{i}{r}\binom{k}{j-r}$

$latex \displaystyle \sum_{r=1}^{min(i,j)}{l_{i,r} \cdot u_{r,j}}=\sum_{r=1}^{min(i,j)}{\frac{(i+k) r}{i \cdot j}\cdot\binom{i}{r}\binom{k}{j-r}}=\\ \frac{(i+k)}{i \cdot j} \cdot \sum_{r=1}^{min(i,j)}{r\binom{i}{r}\binom{k}{j-r}}$

This last expression can be simplified, with the help of:

$latex \displaystyle r\binom{i}{r}=i \binom{i-1}{r-1}$, Vandermonde's convolution (see reference [2]), and the absorption formula (see reference [3]), to:

$latex \displaystyle \sum_{r=1}^{min(i,j)}{l_{i,r} \cdot u_{r,j}}=\binom{i+k}{j}=a_{i,j}$
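The proposed factors can also be checked numerically. A short Python sketch with exact rational arithmetic, again assuming $latex k$ is a non-negative integer (the function names `l` and `u` are mine):

```python
from fractions import Fraction
from math import comb

def l(i, j, k):
    # l_{i,j} = (i+k)/(j+k) * C(i-1, j-1)
    return Fraction(i + k, j + k) * comb(i - 1, j - 1)

def u(i, j, k):
    # u_{i,j} = C(j-1, i-1) * C(i+k, j) / C(k+i-1, i-1)
    return Fraction(comb(j - 1, i - 1) * comb(i + k, j),
                    comb(k + i - 1, i - 1))

# check sum_r l_{i,r} * u_{r,j} = C(i+k, j) entrywise
n = 6
for k in range(0, 5):
    for i in range(1, n + 1):
        for j in range(1, n + 1):
            s = sum(l(i, r, k) * u(r, j, k) for r in range(1, n + 1))
            assert s == comb(i + k, j)
print("L_n * U_n = A_n verified for n = 6, k <= 4")
```

Note that $latex l_{i,i}=1$ for every $latex i$, as required by the Doolittle form used below.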

Now we can easily calculate $latex |A_n|$, because the LU decomposition used is the so-called Doolittle decomposition, in which the matrix $latex L_{n}$ has all ones on its diagonal.

$latex \displaystyle |A_{n}|=\prod_{i=1}^{n}{l_{i,i}*u_{i,i}}=\prod_{i=1}^{n}{u_{i,i}}=\prod_{i=1}^{n}{\frac{i+k}{i}}=\binom{n+k}{k}=\binom{n+k}{n}$


Archives:[a]-092609-Binomial Matrix (II).nb


References:
[1]-John H. Mathews and Kurtis Fink, 2004 - Module for PA = LU Factorization with Pivoting
[2]-Matthew Hubbard and Tom Roby - Pascal's Triangle From Top To Bottom - the binomial coefficient website- Catalog #: 3100003.
[3]-Matthew Hubbard and Tom Roby - Pascal's Triangle From Top To Bottom - the binomial coefficient website- Catalog #: 2400002.



Tuesday, 22 September 2009

BINOMIAL MATRIX (I)

If we use yesterday's idea, with slight variations, we can create new expressions for matrices whose elements are related to binomial coefficients. For instance:

Let $latex U_n$ be a square matrix with elements:

$latex \displaystyle u_{i,j}=\binom{j}{i}$

$latex U_n$ is an upper triangular matrix with all its diagonal elements equal to $latex 1$, similar to Pascal's triangle (which I prefer to call Tartaglia's triangle).

Example:

$latex U_{10}= \left( \begin{array}{cccccccccc} 1 & 2 & 3 & 4 & 5 & 6 & 7 & 8 & 9 & 10 \\ 0 & 1 & 3 & 6 & 10 & 15 & 21 & 28 & 36 & 45\\ 0 & 0 & 1 & 4 & 10 & 20 & 35 & 56 & 84 & 120 \\ 0 & 0 & 0 & 1 & 5 & 15 & 35 & 70 & 126 & 210 \\ 0 & 0 & 0 & 0 & 1 & 6 & 21 & 56 & 126 & 252 \\ 0 & 0 & 0 & 0 & 0 & 1 & 7 & 28 & 84 & 210 \\ 0 & 0 & 0 & 0 & 0 & 0 & 1 & 8 & 36 & 120 \\ 0 & 0 & 0 & 0 & 0 & 0 & 0 & 1 & 9 & 45 \\ 0 & 0 & 0 & 0 & 0 & 0 & 0 & 0 & 1 & 10 \\ 0 & 0 & 0 & 0 & 0 & 0 & 0 & 0 & 0 & 1 \end{array} \right)$

And, of course:

$latex |U_{n}|=1$

If we multiply this matrix by its transpose, we get a symmetric matrix with:

$latex |A_{n}|=|U_{n}^{T}*U_{n}|=|U_{n}|^2=1$

Example:

$latex A_{10}=U_{10}^{T}*U_{10}$

$latex \displaystyle A_{10}=\\ \\ \left( \begin{array}{cccccccccc} 1 & 2 & 3 & 4 & 5 & 6 & 7 & 8 & 9 & 10 \\ 2 & 5 & 9 & 14 & 20 & 27 & 35 & 44 & 54 & 65 \\ 3 & 9 & 19 & 34 & 55 & 83 & 119 & 164 & 219 & 285 \\ 4 & 14 & 34 & 69 & 125 & 209 & 329 & 494 & 714 & 1000 \\ 5 & 20 & 55 & 125 & 251 & 461 & 791 & 1286 & 2001 & 3002 \\ 6 & 27 & 83 & 209 & 461 & 923 & 1715 & 3002 & 5004 & 8007 \\ 7 & 35 & 119 & 329 & 791 & 1715 & 3431 & 6434 & 11439 & 19447 \\ 8 & 44 & 164 & 494 & 1286 & 3002 & 6434 & 12869 & 24309 & 43757 \\ 9 & 54 & 219 & 714 & 2001 & 5004 & 11439 & 24309 & 48619 & 92377 \\ 10 & 65 & 285 & 1000 & 3002 & 8007 & 19447 & 43757 & 92377 & 184755 \end{array}\right)$

So we have constructed a new matrix with determinant equal to one:

$latex \displaystyle a_{i,j}=\sum_{r=1}^n{u_{i,r}^{*} \cdot u_{r,j}}=\sum_{r=1}^{min(i,j)}{u_{r,i} \cdot u_{r,j}}$

$latex \displaystyle a_{i,j}=\sum_{r=1}^{min(i,j)}{\binom{i}{r} \binom{j}{r}}=-1+\sum_{r=0}^{i}{\binom{i}{r} \binom{j}{r}}=\binom{i+j}{i}-1=\binom{i+j}{j}-1$
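This closed form can be verified by building $latex U_{10}$ and multiplying it by its transpose; a small Python sketch (variable names are mine):

```python
from math import comb

n = 10
# u_{i,j} = C(j, i): the upper triangular "Tartaglia" matrix
U = [[comb(j, i) for j in range(1, n + 1)] for i in range(1, n + 1)]
# A = U^T * U, computed entrywise
A = [[sum(U[r][i] * U[r][j] for r in range(n)) for j in range(n)]
     for i in range(n)]
# check a_{i,j} = C(i+j, i) - 1
for i in range(1, n + 1):
    for j in range(1, n + 1):
        assert A[i - 1][j - 1] == comb(i + j, i) - 1
print("a_{i,j} = C(i+j, i) - 1 verified up to n = 10")
```

The bottom-right entry reproduces the $latex 184755=\binom{20}{10}-1$ seen in the example above.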

Note(i): For a proof of this binomial identity, see reference [1] or [2].

OEIS Related Sequences:

Row/Column   Sequence
1            A000027
2            A000096
3            A062748
4            A063258
5            A062988
6            A124089
7            A124090
8            A165618
9            A035927
10           -


References:
[1]-Ronald L. Graham, Donald E. Knuth, and Oren Patashnik - Concrete Mathematics (Reading, Massachusetts: Addison-Wesley, 1994) - Identity (5.32) in Table 169.
[2]-Matthew Hubbard and Tom Roby - Pascal's Triangle From Top To Bottom - the binomial coefficient website- Catalog #: 3100004.

Monday, 21 September 2009

BETA FUNCTION MATRIX DETERMINANT



If $latex A_n$ is a square matrix of order $latex n$, whose elements are defined as:

$latex \displaystyle a_{i,j}=\frac{1}{\beta(i,j)}$, where $latex \beta$ is the beta function. Then:

$latex |A_{n}|=n!$


Proof:

If we use the Cholesky method, we can decompose this matrix as:

$latex A_{n}=U_{n}^{T}*U_n$

Where $latex U_n$ is an upper triangular matrix.

But instead of applying the algorithm to a generic case, we are going to propose a factorization and afterwards check that this decomposition generates the appropriate matrix. This means using software to speed up the proof, for example:

$latex \displaystyle A_{5}=\left(\begin{array}{ccccc} 1 & 2 & 3 & 4 & 5 \\ 2 & 6 & 12 & 20 & 30 \\ 3 & 12 & 30 & 60 & 105 \\ 4 & 20 & 60 & 140 & 280 \\ 5 & 30 & 105 & 280 & 630 \end{array} \right)= U_{5}^{T}* \left( \begin{array}{ccccc} 1 & 2 & 3 & 4 & 5 \\ 0 & \sqrt{2} & 3 \sqrt{2} & 6 \sqrt{2} & 10 \sqrt{2} \\ 0 & 0 & \sqrt{3} & 4 \sqrt{3} & 10 \sqrt{3} \\ 0 & 0 & 0 & 2 & 10 \\ 0 & 0 & 0 & 0 & \sqrt{5} \end{array} \right)$

The elements of $latex U_n$ seem to be:

$latex \displaystyle u_{i,j}=\sqrt{i}\cdot \binom{j}{i}$

$latex U_{n}^T$ is the transpose of $latex U_n$, so:

$latex \displaystyle u_{i,j}^{*}=u_{j,i}=\sqrt{j} \cdot \binom{i}{j}$

If we multiply both triangular matrices:

$latex \displaystyle a_{i,j}=\sum_{r=1}^n{u_{i,r}^{*} \cdot u_{r,j}} =\sum_{r=1}^n{r \cdot \binom{i}{r} \binom{j}{r}}=\sum_{r=1}^{min(i,j)}{r \cdot \binom{i}{r} \binom{j}{r}}$

This last binomial sum has a closed form equal to the reciprocal of the beta function (see proof in reference [1]):

$latex \displaystyle a_{i,j}=i \cdot \binom{i+j-1}{j-1}=\frac{1}{\beta(i,j)}$
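The identity between the $latex U_{n}^{T}*U_n$ product and the reciprocal beta function is an integer identity, so it can be checked directly; a brief Python sketch (the helper name `a` is mine):

```python
from math import comb

def a(i, j):
    # entrywise product of U_n^T and U_n: sum_r r * C(i, r) * C(j, r)
    return sum(r * comb(i, r) * comb(j, r) for r in range(1, min(i, j) + 1))

# check against the closed form 1/beta(i, j) = i * C(i+j-1, j-1)
for i in range(1, 11):
    for j in range(1, 11):
        assert a(i, j) == i * comb(i + j - 1, j - 1)
print("sum_r r*C(i,r)*C(j,r) = 1/beta(i,j) verified for i, j <= 10")
```

For instance, $latex a_{4,5}=4\binom{8}{4}=280$, matching the entry in the $latex A_5$ example above.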

Now that we have found this decomposition for $latex A_n$, it follows that the decomposition is unique and that $latex A_n$ is Hermitian and positive definite.

$latex \displaystyle |A_{n}|=|U_{n}^{T}*U_{n}|=|U_{n}|^{2}=\left(\prod_{i=1}^{n}{u_{i,i}}\right)^{2}=\prod_{i=1}^{n}{u_{i,i}^2}=\prod_{i=1}^{n}{\left(\sqrt{i}\cdot\binom{i}{i}\right)^2}=\prod_{i=1}^{n}{i}=n!$

Archives:[a]-092109-Beta Determinant.nb


References:
[1]-Ronald L. Graham, Donald E. Knuth, and Oren Patashnik - Concrete Mathematics (Reading, Massachusetts: Addison-Wesley, 1994) - page 181, Problem 5.
[2]-A000142 - Factorial numbers: n! - The On-Line Encyclopedia of Integer Sequences.

Tuesday, 8 September 2009

CURIOUS SERIES-002

This is the Dirichlet convolution $latex \displaystyle \omega*1$:

$latex \displaystyle a(n)=\sum_{d|n}{\omega(d)}$

Where $latex \displaystyle \omega$ is the number of distinct prime factors function.

This function, $latex \displaystyle \omega(n)$, is additive:

$latex \displaystyle \omega(n)=\omega(d \cdot n/d ) \leq \omega(d)+\omega(n/d)$

Where equality holds iff $latex GCD(n/d,d)=1$.

If we sum over all divisors:

$latex \displaystyle \sum_{d|n}{\omega(n)} \leq \sum_{d|n}{\bigg(\omega(d)+\omega(n/d)\bigg)}=\sum_{d|n}{\omega(d)}+\sum_{d|n}{\omega(n/d)}=2 \sum_{d|n}{\omega(d)} $

$latex \displaystyle \omega(n)\cdot \sum_{d|n}{1}=\omega(n)\cdot\tau_{2}(n) \leq 2 \sum_{d|n}{\omega(d)}=2\cdot a(n)$

If $latex s$ is a squarefree number and $latex d|s$, then $latex GCD(d,s/d)=1$, and the number of divisors is $latex \displaystyle \tau_{2}(s)=2^{\omega(s)}$; then:

$latex \displaystyle a(s)=\omega(s)\cdot 2^{\omega(s)-1}$

For every number $latex n>1$:

$latex \displaystyle a(n)=\sum_{d|n}{\omega(d)} \leq \sum_{d|n \; d\neq 1}{\omega(n)}= \omega(n) \cdot \sum_{d|n \; d\neq 1}{1}= \omega(n)\cdot(\tau_{2}(n)-1) $

$latex \displaystyle \omega(n)\cdot(\tau_{2}(n)-1) \geq a(n) \geq \bigg\lceil \frac{\omega(n)\cdot \tau_{2}(n)}{2} \bigg\rceil$
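These bounds, and the exact squarefree value, can be checked by brute force. A small Python sketch (helper names are mine; trial division, so only suitable for small $latex n$):

```python
from math import isqrt

def omega(n):
    # omega(n): number of distinct prime factors, by trial division
    count, d = 0, 2
    while d * d <= n:
        if n % d == 0:
            count += 1
            while n % d == 0:
                n //= d
        d += 1
    return count + (1 if n > 1 else 0)

def a(n):
    # a(n) = sum of omega(d) over the divisors d of n (A062799)
    return sum(omega(d) for d in range(1, n + 1) if n % d == 0)

def is_squarefree(n):
    return all(n % (d * d) for d in range(2, isqrt(n) + 1))

for n in range(2, 1000):
    tau = sum(1 for d in range(1, n + 1) if n % d == 0)
    w = omega(n)
    # ceil(omega(n)*tau_2(n)/2) <= a(n) <= omega(n)*(tau_2(n)-1)
    assert (w * tau + 1) // 2 <= a(n) <= w * (tau - 1)
    if is_squarefree(n):
        # exact value on squarefree arguments: a(s) = omega(s)*2^(omega(s)-1)
        assert a(n) == w * 2 ** (w - 1)
print("bounds and squarefree formula hold for 2 <= n < 1000")
```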


NOTE: Sequences in OEIS:

$latex a(n)=A062799(n)$

$latex \omega(n)=A001221(n)$

$latex \tau_{2}(n)=A000005(n)$

References:

[1]-A062799 - Inverse Moebius transform of A001221, the number of distinct prime factors of n - The On-Line Encyclopedia of Integer Sequences.
[2]-A001221 - Number of distinct primes dividing n (also called omega(n)) - The On-Line Encyclopedia of Integer Sequences.
[3]-A000005 - d(n) (also called tau(n) or sigma_0(n)), the number of divisors of n - The On-Line Encyclopedia of Integer Sequences.