I'm having trouble testing my code. The goal is to enter an array of numbers and have the code find the smallest one. It compiles just fine, but the testing isn't working out so well.
Line 14 defines size2, which is only used on line 22.
Line 16 uses i for the size of the array. This i is not const, and it has no connection to size2.
Line 18 defines a new i that shadows the i of the outer scope. Curiously, this i is unsigned while the other i and size2 are signed. Given what these variables represent, one would expect all of them to be unsigned.
Line 18 also calculates the size of the array. That is fine and dandy, but the size of the array is already stored in the (now invisible, shadowed) variable that was set on line 15 and used on line 16.
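To make the shadowing concrete, here is a minimal, self-contained sketch of the pattern being described; the names and values are illustrative, not the poster's actual code:

```cpp
#include <iostream>

int main() {
    int i = 5;                          // outer i: signed, holds the intended size
    double array[5] = {3.0, 1.0, 4.0, 1.0, 5.0};

    {
        // a new, unsigned i shadows the outer one for the rest of this block
        unsigned int i = sizeof(array) / sizeof(array[0]);
        std::cout << "inner i: " << i << '\n';  // 5, from the sizeof calculation
    }

    std::cout << "outer i: " << i << '\n';      // back to the signed outer i
    return 0;
}
```

Inside the inner block, every mention of i refers to the unsigned copy; the signed outer i, which holds the real size, is unreachable until the block ends.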
On line 22, size2, having been set separately, could be larger than the array. Indexing with it would then be an out-of-bounds error.
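A straightforward way to avoid both problems is to keep a single size that is derived from the container itself. Here is a sketch, assuming the goal stated in the question (read n numbers, print the smallest):

```cpp
#include <algorithm>
#include <cstddef>
#include <iostream>
#include <vector>

int main() {
    std::size_t n = 0;                  // one unsigned size, used everywhere
    std::cout << "How many numbers? ";
    std::cin >> n;

    std::vector<double> numbers(n);
    for (std::size_t i = 0; i < numbers.size(); ++i) {
        std::cin >> numbers[i];         // the loop bound comes from the vector,
    }                                   // so it can never exceed the array

    if (!numbers.empty()) {
        std::cout << "Smallest: "
                  << *std::min_element(numbers.begin(), numbers.end()) << '\n';
    }
    return 0;
}
```

Because the bound comes from numbers.size() rather than a separately maintained size2, the out-of-bounds scenario described above cannot arise.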