1. My Goal
I want to estimate the runtime of an algorithm. To do so I need to calculate the number of combinations it has to process.
2. Question
The number of combinations that have to be processed is the number of unique distributions of k items into m buckets. Each bucket has its own capacity ci. The k items are guaranteed to fit, i.e. 0 <= k <= sum(ci). Each item has size 1 and the items are indistinguishable (think of tennis balls as items and stacks to place them in as buckets). That means a valid combination consists of m numbers xi with sum(xi) = k and 0 <= xi <= ci.
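For example, with m = 2, capacities (c0, c1) = (2, 3), and k = 3, the valid combinations are (x0, x1) = (0, 3), (1, 2) and (2, 1), so the count is 3.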
How can I calculate the number of combinations?
3. What I Tried
(Sorry, I'm a programmer, so I start indexing at 0, but I still say "first" when I mean the item with index 0.)
I'm at a loss here. As an example, say the first bucket has a capacity of only 1 (c0 = 1) and the second bucket is bigger and can hold 5 items (c1 = 5).
For the first item, I have m choices of where to put it. But depending on whether I put it into bucket 0 or bucket 1, I'm left with either m-1 or m choices for the second item. This dependence makes it difficult for me to find a closed form for the number of combinations.
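To spell that out with these two buckets and k = 2: if the first item goes into bucket 0, the second item can only go into bucket 1, giving (x0, x1) = (1, 1). If the first item goes into bucket 1, the second item can go into either bucket, giving (1, 1) again or (0, 2). So there are three item-by-item placement paths but only two unique combinations, which is why simply multiplying the per-item choices does not work.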
I have an algorithm that iterates over all combinations, so I can check the results of a formula against it. However, I would like to estimate the runtime in advance; depending on the problem size, it is usually somewhere between a second and several hours.
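For reference, my brute-force check is roughly equivalent to the following simplified sketch (written here in Python just for illustration; the function name and example values are not from my actual code):

```python
from itertools import product

def count_combinations_bruteforce(capacities, k):
    # Enumerate every tuple (x0, ..., x_{m-1}) with 0 <= xi <= ci
    # and count those whose entries sum to exactly k.
    ranges = (range(c + 1) for c in capacities)
    return sum(1 for x in product(*ranges) if sum(x) == k)

# Example from above: c0 = 1, c1 = 5, k = 2 -> (1, 1) and (0, 2), i.e. 2 combinations
print(count_combinations_bruteforce([1, 5], 2))  # 2
```

This enumerates on the order of prod(ci + 1) candidate tuples, which is exactly why I want a formula for the count instead of computing it by enumeration.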