Fast Algorithm for Generating Ascending Compositions
arXiv:1903.10797v1 [math.CO] 26 Mar 2019
Mircea Merca
Constantin Istrati Technical College
Grivitei 91, 105600 Campina, Romania
mircea.merca@profinfo.edu.ro
Abstract
In this paper we give a fast algorithm to generate all partitions of a
positive integer n. Integer partitions may be encoded as either ascending
or descending compositions for the purposes of systematic generation. It
is known that the ascending composition generation algorithm is substan-
tially more efficient than its descending composition counterpart. Using
tree structures for storing the partitions of integers, we develop a new
ascending composition generation algorithm which is substantially more
efficient than the algorithms from the literature.
1 Introduction
Any positive integer n can be written as a sum of one or more positive inte-
gers λi , n = λ1 + λ2 + · · · + λk . If the order of integers λi does not matter,
this representation is known as an integer partition; otherwise, it is known as a
composition. When λ1 ≤ λ2 ≤ · · · ≤ λk , we have an ascending composition.
If λ1 ≥ λ2 ≥ · · · ≥ λk then we have a descending composition. We note that
partitions are more often than not defined as descending compositions, and this
is also the convention used in this paper. In
order to indicate that λ = [λ1 , λ2 , . . . , λk ] is a partition of n, we use the
notation λ ⊢ n introduced by G. E. Andrews [2]. The number of all partitions of a
positive integer n is denoted by p(n) (sequence A000041 in OEIS [8]).
The choice of the way in which the partitions are represented is crucial
for the efficiency of their generating algorithm. Kelleher [4, 5] approaches the
problem of generating partitions through ascending compositions and proves
that the algorithm AccelAsc is more efficient than any previously known
algorithm for generating integer partitions.
In this study we show that the tree structures presented by Lin [6] can be
used to efficiently generate ascending compositions in standard representation.
The idea of using tree structures to store all partitions of an integer is based
on the fact that two partitions of the same integer may have several parts in
common. Lin [6] created the tree structures according to the following rule: the
root of the tree is labeled with (1, n), and (x′, y′) is a child of the node (x, y) if
and only if
\[
x' \in \left\{ x,\, x+1,\, \ldots,\, \left\lfloor \frac{y}{2} \right\rfloor,\, y \right\}
\quad\text{and}\quad
y' = y - x'.
\]
If x′ = y then (x′, y′) = (y, 0) is a leaf node. It is obvious that any leaf node
has the form (x, 0), 0 < x ≤ n.
[Figure 1: the partition tree of 5]
In order to list a partition stored in the tree, we keep, from the path
that connects the root node with the leaf node, only the leaf node and the nodes
that are followed by their left child. For instance, (1, 5)(1, 4)(1, 3)(2, 2)(2, 0) is a
path that connects the root node (1, 5) to the leaf node (2, 0). From this path,
the node (1, 3) is deleted when listing because it is followed by the node (2, 2),
which is its right child. Keeping only the first value from each remaining pair, we
get the ascending composition [1, 1, 2, 2].
[Figure 2: the partition strict binary tree of 5]
Considering, on the one hand, the way in which the partition tree of n was
created, and on the other hand, the rule according to which we can convert any
ordered tree into a binary tree, we can deduce the rule according to which we
can directly create the partition strict binary tree of n. The root of partition
strict binary tree of n is labeled with (1, n − 1), the node (x_l, y_l) is the left child
of the node (x, y) if and only if
\[
x_l = \begin{cases} x, & \text{if } 2x \le y, \\ y, & \text{otherwise,} \end{cases}
\qquad\text{and}\qquad
y_l = y - x_l,
\]
and the node (x_r, y_r) is the right child of the node (x, y) if and only if
\[
x_r = \begin{cases} x + 1, & \text{if } x + 2 \le y, \\ x + y, & \text{otherwise,} \end{cases}
\qquad\text{and}\qquad
y_r = x + y - x_r.
\]
By convention, we consider that the partition [n] has this property, and we denote the number of
these partitions by p^{(t)}(n, m). For instance, p^{(2)}(15, 3) = 7, namely 15 has seven
partitions with parts at least as large as 3 in which the first part is at least twice
the second part: [15], [12, 3], [11, 4], [10, 5], [9, 3, 3], [8, 4, 3] and [6, 3, 3, 3].
When ⌊n/(t + 1)⌋ < m ≤ n, it is easily seen that p^{(t)}(n, m) = 1.
Theorem 2.1. Let n, m and t be positive integers so that m ≤ ⌊n/(t + 1)⌋. Then
\[
p^{(t)}(n, m) = p^{(t)}(n, m + 1) + p^{(t)}(n - m, m). \tag{1}
\]
The number 12 has four partitions with parts at least as large as 3 in which the
first part is at least twice as large as the second part: [12], [9, 3], [8, 4] and [6, 3, 3],
that is, p^{(2)}(12, 3) = 4. The number 15 has three partitions with parts at least as
large as 4 in which the first part is at least twice as large as the second part:
[15], [11, 4] and [10, 5], that is, p^{(2)}(15, 4) = 3.
Theorem 2.2. Let n, m and t be positive integers so that m ≤ ⌊n/(t + 1)⌋. Then
\[
p^{(t)}(n, m) = 1 + \sum_{k=m}^{\lfloor n/(t+1) \rfloor} p^{(t)}(n - k, k).
\]
Proof. We expand the term p^{(t)}(n, m + 1) from relation (1) and take into
account that p^{(t)}(n, ⌊n/(t + 1)⌋ + 1) = 1.
Theorem 2.3. Let n, m and t be positive integers so that t > 1 and m ≤ ⌊n/(t + 1)⌋. Then
\[
p^{(t)}(n, m) = p^{(t-1)}(n, m) - p^{(t-1)}(n - t, m).
\]
Proof. We prove the theorem by induction on n. For n = t + 1 we have
m = 1. Considering that p^{(t)}(t + 1, 1) = 2, p^{(t-1)}(t + 1, 1) = 3 and
p^{(t-1)}(1, 1) = 1, the identity is verified for n = t + 1. We suppose now that the
identity is true for any integer n′, t < n′ < n. By Theorem 2.2 and the induction
hypothesis we can write
\begin{align*}
p^{(t)}(n, m) &= 1 + \sum_{k=m}^{\lfloor n/(t+1) \rfloor} \left( p^{(t-1)}(n - k, k) - p^{(t-1)}(n - t - k, k) \right) \\
&= 1 + \sum_{k=m}^{\lfloor n/(t+1) \rfloor} p^{(t-1)}(n - k, k) - \sum_{k=m}^{\lfloor n/(t+1) \rfloor} p^{(t-1)}(n - t - k, k).
\end{align*}
Considering that
\[
p^{(t-1)}(n - k, k) = 1, \quad \text{for } \left\lfloor \frac{n}{t+1} \right\rfloor < k \le \left\lfloor \frac{n}{t} \right\rfloor,
\]
and
\[
p^{(t-1)}(n - t - k, k) = 1, \quad \text{for } \left\lfloor \frac{n}{t+1} \right\rfloor < k \le \left\lfloor \frac{n - t}{t} \right\rfloor,
\]
we get
\[
p^{(t)}(n, m) = \sum_{k=m}^{\lfloor n/t \rfloor} p^{(t-1)}(n - k, k) - \sum_{k=m}^{\lfloor (n-t)/t \rfloor} p^{(t-1)}(n - t - k, k),
\]
and, by Theorem 2.2, the right-hand side equals p^{(t-1)}(n, m) − p^{(t-1)}(n − t, m).
For example, the number 15 has three partitions with parts at least as large as 3
in which the first part is at least three times as large as the second part: [15], [12, 3]
and [9, 3, 3]; that is, p^{(3)}(15, 3) = p^{(2)}(15, 3) − p^{(2)}(12, 3) = 7 − 4 = 3.
The numbers p^{(t)}(n, m) are thus a direct generalization of the terminal
compositions developed by Kelleher [4, Section 5.4.1].
We denote by p^{(t)}(n) the number of partitions λ ⊢ n that have the property
λ1 ≥ t · λ2 . It is clear that p^{(t)}(n) = p^{(t)}(n, 1). We then immediately have the
following result.
Corollary 2.4. Let n and t be positive integers, so that n > t > 1. Then
\[
p^{(t)}(n) = p^{(t-1)}(n) - p^{(t-1)}(n - t).
\]
The following two corollaries then follow easily from Corollary 2.4.
Corollary 2.5. Let n be a positive integer. Then the number of partitions of
n that have the first part at least twice as large as the second part is
\[
p^{(2)}(n) = p(n) - p(n - 2).
\]
For instance, p^{(2)}(5) = p(5) − p(3) = 7 − 3 = 4, which means that the number 5
has four partitions that have the first part at least twice as large as the second
part: [5], [4, 1], [3, 1, 1] and [2, 1, 1, 1]. The sequence p^{(2)}(n) appears in OEIS [8]
as A027336. Corollary 2.5 is known and can be found in Kelleher [5, Corollary
4.1].
Corollary 2.6. Let n be a positive integer. Then the number of partitions of
n that have the first part at least three times as large as the second part is
\[
p^{(3)}(n) = p(n) - p(n - 2) - p(n - 3) + p(n - 5).
\]
Proof. For n ∈ {1, 2, 3} the formula is immediately proved. For n > 3 we apply
Corollary 2.4 and Corollary 2.5.
For example, p^{(3)}(5) = p^{(2)}(5) − p^{(2)}(2) = 4 − 1 = 3, which means that
the number 5 has three partitions that have the first part at least three times as
large as the second part: [5], [4, 1] and [3, 1, 1]. The sequence p^{(3)}(n) appears
in OEIS [8] as A121659.
The next theorem allows us to deduce an upper bound for p^{(3)}(n).
Theorem 2.7. Let n be a positive integer. Then the coefficients of q^n in the
series expansion of
\[
\frac{1 - q - q^2 + q^5}{(q; q)_\infty}
\]
are non-positive.
Proof. We have
\begin{align*}
\frac{1 - q - q^2 + q^5}{(q; q)_\infty}
&= \frac{(1 - q)(1 - q^2) - q^3 (1 - q^2)}{(q; q)_\infty} \\
&= \frac{1}{(q^3; q)_\infty} - \frac{q^3}{(1 - q)(q^3; q)_\infty} \\
&= \sum_{n=0}^{\infty} \frac{q^{3n}}{(q; q)_n} - \sum_{n=0}^{\infty} \frac{q^{3n+3}}{(1 - q)(q; q)_n} \\
&= 1 + \sum_{n=1}^{\infty} \frac{q^{3n}}{(q; q)_n} - \sum_{n=1}^{\infty} \frac{q^{3n}(1 - q^n)}{(1 - q)(q; q)_n} \\
&= 1 + \sum_{n=1}^{\infty} \frac{q^{3n}}{(q; q)_n} \left( 1 - \frac{1 - q^n}{1 - q} \right) \\
&= 1 + \sum_{n=2}^{\infty} \frac{q^{3n}}{(q; q)_n} \left( -q - q^2 - \cdots - q^{n-1} \right).
\end{align*}
Since the coefficients of 1/(q; q)_n are non-negative, every term of the last sum
has non-positive coefficients, and the proof follows.
Proof. To prove this inequality we apply Corollary 2.4 and Theorem 2.8.
Algorithm 1 Inorder traversal of strict binary tree
Require: root
1: initialize an empty stack S
2: v ← root
3: c ← true
4: while c do
5: while node v has left child do
6: push node v onto stack S
7: v ← left child of node v
8: end while
9: visit v
10: if stack S is not empty then
11: pop node v from stack S
12: visit v
13: v ← right child of node v
14: else
15: c ← false
16: end if
17: end while
Partition strict binary trees are special cases of strict binary trees, and
Algorithm 1 is a general algorithm for inorder traversal of strict binary trees.
Adapting Algorithm 1 to the special case of partition strict binary trees leads
to more efficient algorithms for inorder traversal of these trees. Next we show
how these algorithms can be obtained.
Let (x, y) be a node of the partition strict binary tree such that 2x ≤ y. If
(x_l, y_l) is the left child of the node (x, y) and has the property 2x_l > y_l, then
(x_l, y_l) is the root of a strict binary subtree with exactly three nodes: (x, y − x),
(y − x, 0) and (y, 0). If (x_r, y_r) is the right child of the node (x, y) and has the
property 2x_r > y_r, then (x_r, y_r) is the root of a strict binary subtree in which all
the left descendants are leaf nodes. This observation allows us to traverse the
partition strict binary tree in inorder with the help of a stack onto which we push
only those nodes (x, y) that have the property 2x ≤ y. In this way, we get
Algorithm 2 for inorder traversal of the partition strict binary tree of n.
The number of inner nodes of the form (x, y) with the property
2x ≤ y is equal to the number of partitions of n that have at least two parts
and whose first part is at least twice as large as the second part. Algorithm 2
executes p^{(2)}(n) − 1 push operations on the stack S (line 6) and as many
pop operations (line 15). Any inner node of the form (x, y) with the property
2x > y has only a left child, which is a leaf node. A leaf node that is a left child
is always visited in line 10 and, with the help of Corollary 2.5, we deduce that
the number of these nodes is p(n − 2). A leaf node that is a right child is visited
in line 13. It follows that the total number of iterations of the while loop in
lines 4-21 is p^{(2)}(n).
Algorithm 2 Inorder traversal of partition strict binary tree - version 1
Require: n
1: initialize an empty stack S
2: (x, y) ← (1, n − 1)
3: c ← true
4: while c do
5: while 2x ≤ y do
6: push node (x, y) onto stack S
7: (x, y) ← left child of node (x, y)
8: end while
9: while x ≤ y do
10: visit (y, 0), (x, y)
11: (x, y) ← right child of node (x, y)
12: end while
13: visit (x + y, 0)
14: if stack S is not empty then
15: pop node (x, y) from stack S
16: visit (x, y)
17: (x, y) ← right child of node (x, y)
18: else
19: c ← false
20: end if
21: end while
Thus, reducing the number of operations executed on the stack gives a more
efficient algorithm for traversing the partition strict binary tree. The number
of stack operations can be reduced even further.
Let (x, y) be a node of the partition strict binary tree such that 3x ≤ y. If
(x_l, y_l) is the left child of the node (x, y) and has the property 3x_l > y_l, then
2x_l ≤ y_l and (x_l, y_l) is the root of a strict binary tree whose left subtree is
a strict binary tree in which all the left children are leaf nodes. This observation
allows us to modify Algorithm 2 in order to get a new algorithm for inorder
traversal of partition strict binary trees. Thus, in Algorithm 3 we push onto the
stack S only those inner nodes (x, y) that have the property 3x ≤ y.
The number of nodes (x, y) with the property 3x ≤ y in the partition
strict binary tree of a positive integer n is equal to the number of partitions of
n that have at least two parts and whose first part is at least three times as large
as the second part. In this way, Algorithm 3 executes p^{(3)}(n) − 1 push operations
on the stack S (line 6) and as many pop operations (line 25). In other words,
lines 6-7 are executed p^{(3)}(n) − 1 times, as are lines 25-27. The total number of
iterations of the while loop in lines 9-18 is p^{(2)}(n) − p^{(3)}(n), namely p^{(2)}(n − 3).
The leaf nodes that are right children are visited in lines 16 and 23. The number
of these nodes is p^{(2)}(n). Considering that line 16 is executed p^{(2)}(n − 3)
times, we deduce that the total number of iterations of the while loop in
lines 4-31 is p^{(3)}(n).
node is visited if and only if the visited node is the left child of the node from
the stack. Taking into account the fact that partition strict binary trees have
been obtained by converting partition trees, it is clear that, when visiting a leaf
node, the content of the stack together with the leaf node represents a partition.
Here a_k represents the top of the stack. If we give up visiting the inner nodes in
Algorithm 2 and replace the visit of a leaf node by the visit of the array
(a_1, a_2, . . . , a_k), we get Algorithm 4 for generating ascending compositions in
lexicographic order.
Algorithm 4 is presented in a form that allows easy identification of the
correspondence between the operations executed on the stack S and the operations
executed on the array (a_1, a_2, . . . , a_k), as well as of the movement operations in
the tree. Thus, lines 7-9 are responsible for pushing nodes onto the stack and
for moving into the left subtree. Nodes are popped from the stack in lines 26-28,
and the movement into the right subtree is realized in lines 18-19 and 29-30. A
slightly optimized version of Algorithm 4 is presented in Algorithm 5.
We note that Algorithm 5 is rewritten without using the if statement and
the boolean variable c. Obviously, the statements from the first branch of
the if statement could not be eliminated, but they were rearranged around the
visit statement (line 20). This was possible because the initialization of
the variable k was changed. As Algorithm 5 executes fewer assignment statements
and does not contain the if statement, the time required to execute Algorithm
5 is less than the time required to execute Algorithm 4.
Comparing the algorithm AccelAsc described and analyzed by Kelleher
[4, 5] with Algorithm 5, we note that Algorithm 5 is a slightly modified version
of the algorithm AccelAsc. In fact, we have two presentations of the same
algorithm; the differences are due to the different ways in which the two
algorithms were derived.
The remarks made on the number of operations executed in Algorithm 2
allow us to determine how many times the assignment statements are executed
and how many times the boolean expressions are evaluated in Algorithm 5.
Theorem 4.1. Algorithm 5 executes 4p(n) + 4p^{(2)}(n) assignment statements
and evaluates p(n) + 3p^{(2)}(n) boolean expressions.
Proof. The total number of iterations of the while loop in lines
4-23 is p^{(2)}(n), and this loop contains 5 assignment statements. The total
number of iterations of the while loop in lines 5-9 is p^{(2)}(n) − 1,
and this loop contains 3 assignment statements. The total number of
iterations of the while loop in lines 11-17 is p(n − 2), and this
loop contains 4 assignment statements. Taking into account the first 3
assignment statements of the algorithm, we get the number of executed
assignment statements: 8p^{(2)}(n) + 4p(n − 2). The boolean expression
in line 4 is evaluated p^{(2)}(n) + 1 times, the boolean expression in line 5 is
evaluated p^{(2)}(n) + p^{(2)}(n) − 1 times, and the boolean expression in line 11 is
evaluated p^{(2)}(n) + p(n − 2) times. Thus, the boolean expressions of the
algorithm are evaluated 4p^{(2)}(n) + p(n − 2) times. According to Corollary 2.5,
the proof is finished.
If in Algorithm 3 we perform the conversions made in Algorithm 2, we get
Algorithm 6 for generating ascending compositions in lexicographic order.
16: visit a1 , a2 , . . . , au
17: p←x+1
18: q ←y−p
19: while p ≤ q do
20: at ← p
21: au ← q
22: visit a1 , a2 , . . . , au
23: p←p+1
24: q ←q−1
25: end while
26: at ← y
27: visit a1 , a2 , . . . , at
28: x←x+1
29: y ←y−1
30: end while
31: while x ≤ y do
32: ak ← x
33: at ← y
34: visit a1 , a2 , . . . , at
35: x←x+1
36: y ←y−1
37: end while
38: y ←x+y−1
39: ak ← y + 1
40: visit a1 , a2 , . . . , ak
41: k ←k−1
42: x ← ak + 1
43: end while
statements: 5p^{(3)}(n) + 4p^{(2)}(n) + 4p(n − 2). The boolean expression in
line 4 is evaluated p^{(3)}(n) + 1 times, the boolean expression in line 5 is
evaluated p^{(3)}(n) + p^{(3)}(n) − 1 times, the boolean expression in line 12 is
evaluated p^{(2)}(n) times, and the boolean expressions in lines 19 and 31
are evaluated p^{(3)}(n) + p(n − 2) times. It follows that the boolean expressions
of the algorithm are evaluated 4p^{(3)}(n) + p^{(2)}(n) + p(n − 2) times. By Corollary
2.5 and Corollary 2.6 the proof is finished.
Theorem 4.3. Algorithm 6 is more efficient than Algorithm 5.
Proof. According to Theorem 4.1 and Theorem 4.2, it is sufficient to show that
By Corollary 2.4, we can rewrite this inequality as p^{(2)}(n) ≤ 4p^{(2)}(n − 3). Using
Corollary 2.4 and Corollary 2.8, we obtain
The evolution of the ratios r1(n) and r2(n), for 1 < n ≤ 1500, can be seen in
Figure 3. The graph was realized in Maple [3] and shows that the best performance
of Algorithm 6 compared to Algorithm 5 appears for 50 ≤ n ≤ 150.
Now let us measure the CPU time for a few values of n. To do this we encoded the
algorithms in C++, and the programs obtained with Visual C++ 2010 Express
Edition were run on three computers under similar conditions. The processor of
the first computer is an Intel Pentium Dual T3200 2.00 GHz, the processor of the
second computer is an Intel Celeron M 520 1.60 GHz, and the processor of the
third one is an Intel Atom N270 1.60 GHz.
CPU time is measured when the program runs without printing out ascending
compositions. We denote by t1(n) the average time for Algorithm 6 obtained
after ten measurements, by t2(n) the average time for Algorithm 5 obtained after
ten measurements, and by r(n) the ratio of t1(n) and t2(n),
\[
r(n) = \frac{t_1(n)}{t_2(n)}.
\]
Figure 3: Evolution of performances of Algorithm 6 compared to Algorithm 5
In Table 1 we present the results of these measurements together with the
ratios r1(n) and r2(n).
Analyzing the data presented in Table 1, we see that the ratio r2(n) is
a good approximation of the ratio r(n) obtained experimentally on the computers
with Intel Pentium Dual or Intel Celeron processors. We note that the ratio
r(n) obtained on the computer with the Intel Atom processor is well approximated
by the ratio r1(n).
Algorithm 6 is the fastest algorithm for generating ascending compositions
in lexicographic order in the standard representation, and it can be considered
an accelerated version of the algorithm AccelAsc presented by Kelleher [4, 5].
References
[1] G. E. Andrews, Enumerative proofs of certain q-identities, Glasgow Math.
J. 8(1), 33–40 (1967).

[6] R. B. Lin, Efficient Data Structures for Storing the Partitions of Integers,
The 22nd Workshop on Combinatorics and Computation Theory, Taiwan
(2005).

[7] L. Livovschi and H. Georgescu, Sinteza şi analiza algoritmilor, Editura
Ştiinţifică şi Enciclopedică, Bucureşti (1986).