
So I have x items, each with its own appearance ratio for building a pattern, and I'm trying to determine the smallest possible pattern that matches those ratios. The appearance ratios of all items total 1.

This is a bit of a complicated question to get my head around so I'll try and explain with examples…

For example, there may be four letters (a, b, c, d) with respective ratios of 0.4, 0.3, 0.2, 0.1, which would require a minimum pattern size of 10 (as 0.1 is the smallest ratio, and 1 / 0.1 = 10). The resulting pattern would be aaaabbbccd.

In another example, 0.4, 0.4, 0.2 would give a minimum pattern size of 5 (aabbc).

However, an example of 0.3, 0.3, 0.4 would require a pattern size of 10 (aaabbbcccc), because 0.4 cannot be divided by 0.3 without resulting in a decimal.

What algorithm could calculate the pattern size based on the input ratios?

Essential

1 Answer


As Qiaochu Yuan posted, what you're looking for is the least common denominator of the ratios involved. Many thanks to him - the post I previously had here did not deal with ratios whose decimal expansions are infinite. I have corrected this post accordingly. We can find the LCD of a set of ratios by writing the ratios as fractions with integer numerators over any common denominator, and then dividing that denominator by the gcd of the numerators.

For example, with $.3, .3, .4 = \frac{3}{10}, \frac{3}{10}, \frac{4}{10}$, $\gcd(3, 3, 4) = 1$, so the pattern size is calculated to be $\frac{10}{1} = 10$.

By contrast, for the example $.2, .4, .4 = \frac{2}{10}, \frac{4}{10}, \frac{4}{10}$, $\gcd(2, 4, 4) = 2$, so the pattern size is calculated to be $\frac{10}{2} = 5$.

For an additional example, $.35, .35, .3 = \frac{35}{100}, \frac{35}{100}, \frac{30}{100}$, $\gcd(35, 35, 30) = 5$, so the pattern size is calculated to be $\frac{100}{5} = 20$. The pattern is, correspondingly, ${\bf aaaaaaabbbbbbbcccccc}$.
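Since you mention JavaScript, here is a minimal sketch of this calculation, assuming you already have the ratios written as integer numerators over a common denominator (the function names `gcd` and `patternSize` are just illustrative, not anything standard):

```javascript
// Greatest common divisor of two non-negative integers (Euclidean algorithm).
function gcd(a, b) {
  return b === 0 ? a : gcd(b, a % b);
}

// Pattern size = common denominator divided by the gcd of the numerators.
// Example: numerators [3, 3, 4] over denominator 10 -> 10 / gcd(3, 3, 4) = 10 / 1 = 10.
function patternSize(numerators, denominator) {
  const g = numerators.reduce(gcd);
  return denominator / g;
}

console.log(patternSize([3, 3, 4], 10));     // 10
console.log(patternSize([2, 4, 4], 10));     // 5
console.log(patternSize([35, 35, 30], 100)); // 20
```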

Alex Wertheim
  • Many thanks for the reply @AWertheim. I've been studying what gcd is this morning and how to get the gcd of multiple integers (I'm trying to input this all into a javascript program for a game I'm making), and I got that part working with recursion. Although I can't quite figure out how you're going from the decimals to integers. In machine code, would you just be looking for any decimals then multiplying all values by 10 until you no longer detect decimals, and use that multiple as the denominator to divide by the gcd? – Essential May 13 '13 at 17:53
  • My pleasure, @Essential! Yes, that sounds like a good strategy to me, AND it avoids the problem of worrying about infinite decimal expansions. Let me know if you have any other questions! – Alex Wertheim May 13 '13 at 19:52
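A rough JavaScript sketch of the strategy discussed in these comments, assuming the ratios have finite decimal expansions (the function name `patternSizeFromRatios` and the epsilon tolerance are my own choices, not from the thread):

```javascript
// Scale the ratios by 10 until every value is an integer, then divide the
// scale factor by the gcd of the scaled values. Only terminates for ratios
// with finite decimal expansions; the epsilon guards against floating-point
// noise (e.g. 0.3 * 10 === 2.9999999999999996 in IEEE doubles).
function patternSizeFromRatios(ratios) {
  const isInteger = (x) => Math.abs(x - Math.round(x)) < 1e-9;
  let scale = 1;
  let values = ratios.slice();
  while (!values.every(isInteger)) {
    scale *= 10;
    values = ratios.map((r) => r * scale);
  }
  values = values.map(Math.round);
  const gcd = (a, b) => (b === 0 ? a : gcd(b, a % b));
  const g = values.reduce(gcd);
  return scale / g;
}

console.log(patternSizeFromRatios([0.4, 0.3, 0.2, 0.1])); // 10
console.log(patternSizeFromRatios([0.4, 0.4, 0.2]));      // 5
console.log(patternSizeFromRatios([0.3, 0.3, 0.4]));      // 10
```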