My card game only has a roll for accuracy.

Which means that at maximum accuracy, a weapon always does a fixed amount of damage.

In my other two games, everything was still a random number.

The question is: should I think of something to make every weapon "random", even if the accuracy is 100%?

How would I do that?

Because weapons that roll 8/8 always do 100% damage.

Don't you also have a QUANTITY (as in "number of units")? That is variable too. All my units roll a random (1d8) accuracy that ranges between 20% and 100%. Basically you merge three (3) squadrons into one (1) platoon and roll 3d8 for accuracy, choosing how to assign the results. So in my version it's rarely 100%. The odds are 1 in 8 per die, across 3 dice.

Sure, there is a chance, but it's much rarer to roll 100%. So I don't have this problem, and I also have "custom" d6s for certain accuracy rolls.

These dice vary in their probabilities as well, so again I don't have this issue either (whether with standard rolls or custom ones).

**Note #1:** I'm not sure how to compute the probability of rolling a "1" on at least one of three d8s. My first thought was 1/8 + 1/8 + 1/8 = 3/8 = 37.5%, but that double-counts rolls where more than one die shows a 1. The correct value is 1 − (7/8)³ = 169/512 ≈ 33%.
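A quick exact check of that probability (a sketch, not part of any game's rules):

```python
from fractions import Fraction
from itertools import product

# Probability of at least one "1" when rolling three d8s.
# The naive sum 1/8 + 1/8 + 1/8 double-counts rolls with two or more 1s,
# so take the complement of "no die shows a 1" instead.
p_miss_one_die = Fraction(7, 8)          # one d8 is not a 1
p_at_least_one = 1 - p_miss_one_die**3   # complement over three dice

print(p_at_least_one)          # 169/512
print(float(p_at_least_one))   # ~0.330 (about 33%, not 37.5%)

# Brute-force check over all 8^3 = 512 outcomes
hits = sum(1 for roll in product(range(1, 9), repeat=3) if 1 in roll)
print(hits, "/", 8**3)         # 169 / 512
```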

**Note #2:** For ME there is no "maximum accuracy"; it is always RANDOM using a d8 (as I explained above: 3d8, and you choose how to distribute the rolled values among your three squadrons). There are both a DMG (damage) and a QTY (unit count), but accuracy is ALWAYS random. That's just how I simplified it in my version.

That is what the 3d6 and 3d8 were for: having only 3 dice while still getting a big variety of results across all the cards.

However, the 100% accuracy cards would always hit 100%.

Now, I have been thinking: should the exact same roll also do something for the damage?

No, that would make the game imbalanced.

A 100% accuracy would have 1 to 8 damage.

A 12.5% accuracy would have 1 damage.

The average damage factor between the lowest and highest would not be 8/1 = 8.

It would be 4.5/0.125 = 36 when comparing the expected damage of the 100% accuracy (100% × 4.5 average) to the 12.5% accuracy (12.5% × 1).

The true weight factor would be:

100% of 4.5 = 4.5

Maybe... I could do this, and calculate the new factors for each accuracy: a chance to hit, where a hit has a damage value equal to the average over the accuracy's whole damage range.

d8:

1) 12.5% of 1.0 = 0.125

2) 25.0% of 1.5 = 0.375

3) 37.5% of 2.0 = 0.750

4) 50.0% of 2.5 = 1.250

5) 62.5% of 3.0 = 1.875

6) 75.0% of 3.5 = 2.625

7) 87.5% of 4.0 = 3.500

8) 100.0% of 4.5 = 4.500

d6:

1) 16.7% of 1.0 = 0.167

2) 33.3% of 1.5 = 0.500 (This is suitable for the game)

3) 50.0% of 2.0 = 1.000

4) 66.7% of 2.5 = 1.667

5) 83.3% of 3.0 = 2.500

6) 100.0% of 3.5 = 3.500
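Both tables can be reproduced with a few lines (a sketch; `expected_damage_table` is my own name, and it assumes the damage average for accuracy step k is the mean of the range 1..k, as in the tables above):

```python
# Expected damage per accuracy step: hit chance × average damage over 1..k.
def expected_damage_table(sides):
    rows = []
    for k in range(1, sides + 1):
        accuracy = k / sides         # chance to hit
        avg_damage = (1 + k) / 2     # mean of the damage range 1..k
        rows.append((k, accuracy, avg_damage, accuracy * avg_damage))
    return rows

for sides in (8, 6):
    print(f"d{sides}:")
    for k, acc, dmg, ev in expected_damage_table(sides):
        print(f"{k}) {acc:5.1%} of {dmg} = {ev:.3f}")
```

This also shows where the factor of 36 comes from: the top d8 row gives 4.5 and the bottom gives 0.125, and 4.5/0.125 = 36.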

***

Another approach is a third set of dice to roll: one that would act as a factor on any die that was a hit, just like in my prototype game, where accuracy could be 100% but the damage roll would be d6-2, d6-1 or d6.
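For reference, the averages of those prototype damage dice can be worked out directly. One assumption of mine, not stated above: a negative result might or might not be clamped to 0, so both readings are shown.

```python
from fractions import Fraction

# Average damage of a d6 with a flat modifier (d6-2, d6-1, d6).
def avg(modifier, clamp):
    faces = [f + modifier for f in range(1, 7)]
    if clamp:
        faces = [max(0, f) for f in faces]   # negative results count as 0
    return Fraction(sum(faces), 6)

for mod in (-2, -1, 0):
    print(f"d6{mod:+d}: raw avg {float(avg(mod, False)):.3f}, "
          f"clamped avg {float(avg(mod, True)):.3f}")
```

Only d6-2 differs between the two readings (raw 1.5 vs clamped ~1.667), since d6-1 never goes below 0.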

I don't know what else I could do.

A d6 where the result is actually a factor on the hits?

If 3d6 or 3d8 are rolled, each accuracy would have 0, 1, 2 or 3 hits.

Maybe roll 3 coloured dice, and all accuracies with 1 hit are influenced by the white die.

All accuracies with 2 hits are influenced by the white and grey dice.

And all accuracies with 3 hits are influenced by the white, grey and black dice.

The result would be a factor of 0, 1 or 2 for each die.

The 100% accuracy now still has a chance of doing 0 damage. And if the white die is a hit while the grey and black dice are misses, the 100% accuracy would deal relatively less damage than a lower accuracy that had only 1 hit.
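A rough simulation of the coloured-die idea. Everything here is my own sketch of the proposal: I assume each unlocked coloured die maps uniformly to a factor of 0, 1 or 2, and that the factors from the unlocked dice simply add up.

```python
import random

# Each accuracy "unlocks" as many coloured dice as it scored hits (0..3);
# each unlocked die contributes a factor of 0, 1 or 2 (assumed uniform).
def damage_factor(hits, rng):
    return sum(rng.choice([0, 1, 2]) for _ in range(hits))

rng = random.Random(42)
trials = 100_000
for hits in range(4):
    avg = sum(damage_factor(hits, rng) for _ in range(trials)) / trials
    print(f"{hits} hit(s): average factor ≈ {avg:.2f}")  # expectation is hits × 1.0
```

Since the mean factor per die is 1, the average total factor just equals the number of hits, while the spread grows with each extra die, which is the variability being asked for.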

So now the player rolls 3d6 (red) and 3d8 (blue) for the cards, then another 3d6 (white + grey + black).

What do you think?

Then the player rolls