Hi, let me introduce myself first. I was once a proud C++ beginner, but that was years ago; now I'm back to being a complete noob. Five years or so ago, I took a beginners' class in which I eventually learned complexity analysis and how to code a Tower of Hanoi. I still have that code and tried to alter it, together with some simple recursive exercises, to make what I want. But I failed miserably, because I can't even wrap my head around the code I wrote all those years ago.
Now for my problem. I would like to code something that gives me the chances of failure and success for the Crafting skill in Dungeons and Dragons, but I've got no idea where to begin. I thought something like this would be available on the internet already, but I can't find it, so I want to make it myself. Can you give me a clue about how to tackle this problem?
Let me explain Crafting in D&D:
A craftable item has two variables: its Difficulty Class (DC) and its price.
A player has two variables: his skill (e.g. '5') and the die roll (1 to 20).
To craft an item, you roll the die, add your skill and compare the total (the "result") with the DC of the item (and one week passes in the game).
- If the result is 5 or more below the DC, you fail and the attempt ends for good.
- If the result is 1 to 4 below the DC, nothing happens (you spend the time, however).
- If the result is equal to or greater than the DC, you collect points, so to say. With enough points (price times 10) your item is done; if not, you have to keep going until you have enough points. The number of points you collect depends on the DC and your result (it's result times DC; see the sketch after this list).
Extra options:
- Instead of letting one week pass per roll, you can choose to roll every in-game day; you collect points seven times faster, but you need ten times the points (price times 100) that way.
- You may choose to add 10 to the DC, making it more difficult to succeed, but collecting more points when you do succeed.
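To check that I understand my own rules, here is how I would encode a single check in C++ (just a sketch of my understanding; the names and types are my own invention):

```cpp
// Outcome of a single crafting check (one week, or one day in the fast variant).
enum class Outcome { Fail, Nothing, Success };

struct CheckResult {
    Outcome outcome;
    int points;  // points earned by this check (0 unless Success)
};

// roll: the d20 roll (1-20), skill: the player's skill, dc: the item's DC.
CheckResult resolveCheck(int roll, int skill, int dc) {
    int result = roll + skill;
    if (result <= dc - 5) return {Outcome::Fail, 0};     // 5 or more below the DC: stop for good
    if (result < dc)      return {Outcome::Nothing, 0};  // 1 to 4 below the DC: only time lost
    return {Outcome::Success, result * dc};              // at or above the DC: result times DC points
}
```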
So a 10 gp item with DC 5 gives the following options for a player with skill 5 (I double-check the DC 15 numbers with a small brute-force snippet after this list):
- In weeks, DC 5: possible results are 6-25, all successes, giving 30-125 points per week, 100 needed for completion. Chance to complete in one week: 30% (you need result times 5 ≥ 100, i.e. a roll of 15-20, six numbers out of 20). Chance of failure: 0%.
- In days, DC 5: possible results are 6-25, all successes, giving 30-125 points per day, 1000 needed for completion. Expected to be done in 1000/((30+125)/2) ≈ 13 days.
- In weeks, DC 15: failure on rolls 1-5 (25%), nothing happens on rolls 6-9 (20%), success on rolls 10-20 (55%), giving 225-375 points (result times DC), so any single success covers the 100 points needed. Chance to complete in exactly one week: 55%; in two weeks: 20% × 55% = 11%; in three weeks: 20% × 20% × 55% = 2.2%; etc. Chance to fail in the first week: 25%; in the second week: 20% × 25% = 5%; in the third: 1%; etc.
- In days, DC 15: same per-roll chances, giving 225-375 points, 1000 needed for completion. Expected to be done in 1000/(0.55 × (225+375)/2) ≈ 6 days (chance of failing somewhere in those 6 days: 1 - 0.75^6 ≈ 82%).
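To double-check those DC 15 per-roll numbers, I brute-forced the 20 equally likely rolls (this reuses the resolveCheck sketch from above):

```cpp
#include <cstdio>

int main() {
    const int skill = 5, dc = 15;
    int fails = 0, stalls = 0, successes = 0, totalPoints = 0;
    for (int roll = 1; roll <= 20; ++roll) {
        CheckResult r = resolveCheck(roll, skill, dc);  // sketch from above
        if (r.outcome == Outcome::Fail)         ++fails;
        else if (r.outcome == Outcome::Nothing) ++stalls;
        else { ++successes; totalPoints += r.points; }
    }
    // With 20 equally likely rolls, each count is worth 5 percentage points.
    std::printf("fail %d%%, nothing %d%%, success %d%%, avg points per success %d\n",
                fails * 5, stalls * 5, successes * 5, totalPoints / successes);
    return 0;
}
```

This prints 25%, 20%, 55% and an average of 300 points, matching (225 + 375)/2 above.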
That's what I can calculate off the top of my head.
Eventually I want to get the chances of "completing the item by week/day N" (or rather, for each week/day from 1 to N) and "failing by week/day N", and then compare those across all the different options.
However, I don't know how to think this through. Should I try all possibilities within a maximum number of weeks/days, and count the failures and completions for each week/day?
Should it be iterative or recursive?
Or should I calculate the probabilities directly?
How should I start on this?
To make a simple example: if I pick a random number 1, 2 or 3, stop when I pick 1, pick again when I pick 2, and count the times I pick 3, how should I code it to get the chance that I pick 3 twice before I pick a 1? And how many repetitions would I need?
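For that toy example, the only approach I can come up with myself is brute-force simulation: run the experiment many times and count, something like this (again just a sketch):

```cpp
#include <cstdio>
#include <random>

int main() {
    std::mt19937 rng(std::random_device{}());
    std::uniform_int_distribution<int> pick(1, 3);  // random number 1, 2 or 3

    const int trials = 100000;  // more repetitions give a more stable estimate
    int wins = 0;               // runs where two 3s came up before a 1
    for (int t = 0; t < trials; ++t) {
        int threes = 0;
        while (true) {
            int n = pick(rng);
            if (n == 1) break;                               // stop at 1: this run is lost
            if (n == 3 && ++threes == 2) { ++wins; break; }  // second 3 before any 1: won
            // n == 2: nothing happens, pick again
        }
    }
    std::printf("estimated chance: %.3f\n", (double)wins / trials);
    return 0;
}
```

For this toy case I can also reason it out: the 2s change nothing, so every relevant pick is a 1 or a 3 with equal chance, and the answer should be (1/2)^2 = 1/4; the simulation indeed hovers around 0.25 with 100,000 repetitions. But I don't know whether simulating like this or calculating directly is the better way to attack the real problem.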
I figured it doesn't have to be neatly done for a grade or anything.
So instead of trying all possibilities, which would be 20^(weeks or days), I wrote a simple script in Matlab that tries the actual crafting once, using random rolls, repeats that 10,000 times, and records where it stops each time.
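Translated to C++ (the language I want to get back into), my Monte Carlo idea looks roughly like this; the per-check rules are the ones I described above, while the names and the 50-week cutoff are arbitrary choices on my part:

```cpp
#include <cstdio>
#include <random>

// Simulate one crafting attempt: returns the week it completed (positive),
// the week it failed (negative), or 0 if neither happened within maxWeeks.
int craftOnce(std::mt19937& rng, int skill, int dc, int pointsNeeded, int maxWeeks) {
    std::uniform_int_distribution<int> d20(1, 20);
    int points = 0;
    for (int week = 1; week <= maxWeeks; ++week) {
        int result = d20(rng) + skill;
        if (result <= dc - 5) return -week;       // failed for good
        if (result >= dc) points += result * dc;  // success: collect points
        if (points >= pointsNeeded) return week;  // item is done
    }
    return 0;  // still crafting after maxWeeks
}

int main() {
    std::mt19937 rng(std::random_device{}());
    const int trials = 10000, maxWeeks = 50;
    const int skill = 5, dc = 15, pointsNeeded = 100;  // 10 gp item, weekly rolls

    int completedIn[maxWeeks + 1] = {0}, failedIn[maxWeeks + 1] = {0};
    for (int t = 0; t < trials; ++t) {
        int w = craftOnce(rng, skill, dc, pointsNeeded, maxWeeks);
        if (w > 0) ++completedIn[w];
        else if (w < 0) ++failedIn[-w];
    }
    // Accumulate into "completed by week N" / "failed by week N" chances.
    int done = 0, fail = 0;
    for (int week = 1; week <= 5; ++week) {
        done += completedIn[week];
        fail += failedIn[week];
        std::printf("week %d: completed %.1f%%, failed %.1f%%\n",
                    week, 100.0 * done / trials, 100.0 * fail / trials);
    }
    return 0;
}
```

Is something along these lines a sensible way to go, or is there a smarter approach with exact probability calculations?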