@Mikey Boy
Thanks!
kingkush wrote:
"But yes i do completely agree if you understand icy1's example that is currently the best and easiest way to do it if you're looking to shuffle the vector."
Yeah, I don't want to shuffle the vector (or change the vector at all), just pick a random element. I suppose one could copy the vector and shuffle the copy, but that would probably be less efficient?
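For what it's worth, this is the kind of thing I had in mind: a minimal sketch (with a made-up vector v) that picks one element using <random> without copying or modifying the vector at all:

#include <iostream>
#include <random>
#include <vector>

int main()
{
    std::vector<int> v {10, 20, 30, 40};   // hypothetical example data

    // One engine, seeded once; reuse it for every pick.
    std::mt19937 gen{std::random_device{}()};

    // Distribution over the valid indices of v.
    std::uniform_int_distribution<int> dist(0, static_cast<int>(v.size()) - 1);

    int picked = v[dist(gen)];             // v itself is never touched
    std::cout << picked << '\n';
}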
I also came across the rand() and srand() functions on YouTube and wanted to ask what the problem with them is. I tested them with a self-written dice program.
#include <iostream>
#include <ctime>
#include <cstdlib>
using namespace std;

int main()
{
    int one{0}, two{0}, three{0}, four{0}, five{0}, six{0};
    int z;
    int i = 0;
    int N;

    cout << "how often do you want to roll the dice?" << endl;
    cin >> N;

    srand(static_cast<unsigned int>(time(0)));  // seed once with the current time

    while (i < N) {
        z = (rand() % 6) + 1;                   // one roll: 1..6
        if      (z == 1) one++;
        else if (z == 2) two++;
        else if (z == 3) three++;
        else if (z == 4) four++;
        else if (z == 5) five++;
        else             six++;
        i++;
    }

    cout << "1 was rolled so many times: " << one   << " with probability P= " << (double)one/N   << endl;
    cout << "2 was rolled so many times: " << two   << " with probability P= " << (double)two/N   << endl;
    cout << "3 was rolled so many times: " << three << " with probability P= " << (double)three/N << endl;
    cout << "4 was rolled so many times: " << four  << " with probability P= " << (double)four/N  << endl;
    cout << "5 was rolled so many times: " << five  << " with probability P= " << (double)five/N  << endl;
    cout << "6 was rolled so many times: " << six   << " with probability P= " << (double)six/N   << endl;
    return 0;
}
Now I understand that rand() yields a random int between 0 and RAND_MAX (implementation-defined, but at least 32767).
rand() % n + 1 gives a random number between 1 and n.
I see the problem that, due to the modulo, lower ints are a bit more likely than higher ints.
If rand() only gave numbers between 0 and 15, for example, rand() % 6 would give numbers between 0 and 5, but 0, 1, 2 and 3 a bit more often.
For the actual rand() % n, however, this bias seems to be negligible as long as RAND_MAX >> n.
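To make the 0..15 example concrete, here is a tiny check that counts how often each remainder of % 6 shows up over that range:

#include <iostream>

int main()
{
    // Pretend rand() could only return 0..15 and count each remainder of % 6.
    int count[6] = {};
    for (int r = 0; r <= 15; ++r)
        ++count[r % 6];

    for (int k = 0; k < 6; ++k)
        std::cout << k << " occurs " << count[k] << " times\n";
    // prints 3, 3, 3, 3, 2, 2 -> remainders 0..3 are slightly more likely
}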
Is there another problem with using this 'easy' way, or is it just that @Ganado's way simply avoids this bias?
The n in my code would be 400, so the easy way might be good enough there. A sketch of the bias-free alternative follows below.
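For comparison, this is a minimal sketch of a bias-free roll using std::uniform_int_distribution (I assume Ganado's suggestion goes in roughly this direction; the <random> distributions don't have the modulo problem):

#include <iostream>
#include <random>

int main()
{
    // A "400-sided die" using <random> instead of rand() % 400.
    std::mt19937 gen{std::random_device{}()};
    std::uniform_int_distribution<int> die(1, 400);   // every value equally likely

    for (int i = 0; i < 10; ++i)
        std::cout << die(gen) << ' ';
    std::cout << '\n';
}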
edit: I will just check it with a die with 400 sides :D
edit2: So after 100 billion throws of a 400-sided die using rand() and srand(), the relative frequencies of the different sides fall into two groups: the lower numbers come up a bit more often than the higher numbers, while within each group the frequency is constant. So the modulo effect is visible (at least with this number of throws - I don't know yet how many 'throws' I am going to use in my actual program).
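The two groups match what the arithmetic predicts. A small sketch, assuming rand() returns 0..RAND_MAX uniformly (RAND_MAX depends on the implementation, commonly 32767):

#include <cstdlib>
#include <iostream>

int main()
{
    // How the two frequency groups arise from rand() % 400.
    const long long values = static_cast<long long>(RAND_MAX) + 1; // e.g. 32768
    const int n = 400;

    long long per_face = values / n;   // rand() values hitting every face
    long long extra    = values % n;   // the first 'extra' faces get one value more

    std::cout << "faces 1.." << extra << " map to " << per_face + 1 << " rand() values\n";
    std::cout << "faces " << extra + 1 << ".." << n << " map to " << per_face << " rand() values\n";
}

With RAND_MAX = 32767 this prints that faces 1..368 are hit by 82 rand() values and faces 369..400 by only 81, a ratio of about 1.012, which would explain the two groups in the measured frequencies.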
Testing Ganado's code now with my dice.