I have my program all but done, but I can't get past this last part. I need to have the user input their height as a decimal. For example, someone who is 5 foot 11 inches would need to enter it as 5.11, and then I need to display their height in inches. Since I'm 5 foot 11, I would enter 5.11 and it should display 71 inches. I have no idea how to do this.
Despite the decimal confusion, you could write a few lines of code: take the integer part of the input and multiply it by 12, then subtract that same integer from the input, multiply the remainder by 100, and add it to the previous result. So, 5.11 -> 5 x 12 = 60; 5.11 - 5 = 0.11; 0.11 x 100 = 11; 60 + 11 = 71.
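A minimal sketch of that idea, assuming the program is a C++ console app reading from std::cin (the question doesn't say which language). One caveat the code has to handle: 5.11 - 5 is not exactly 0.11 in binary floating point, so round the result instead of truncating it.

```cpp
#include <cmath>
#include <iostream>

int main() {
    double height = 0.0;                  // e.g. 5.11 meaning 5 feet 11 inches
    std::cin >> height;

    int feet = static_cast<int>(height);                                // 5.11 -> 5
    int inches = static_cast<int>(std::round((height - feet) * 100));   // ~0.11 * 100 -> 11

    std::cout << feet * 12 + inches << " inches\n";                     // 60 + 11 = 71
}
```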
Be careful with single-digit inch values. Would 5' 1" be written as 5.1 or 5.01?
Another way to do this: instead of treating the whole thing as a float, treat it as two integer values separated by a dot. Read the feet integer, multiply by 12, skip over the dot, and read the inches integer. See the sketch after the example below.
5.1 -> 5 x 12 = 60, ignore the '.', inches = 1, + 60 = 61.
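A minimal C++ sketch of that approach (again assuming console input): read the feet, read one throwaway character for the dot, then read the inches.

```cpp
#include <iostream>

int main() {
    int feet = 0, inches = 0;
    char dot = '\0';                      // throwaway variable that absorbs the '.'
    std::cin >> feet >> dot >> inches;    // "5.11" -> feet = 5, inches = 11

    std::cout << feet * 12 + inches << " inches\n";   // 5 * 12 + 11 = 71
}
```

Note that with this parsing, "5.1" and "5.01" both come out as inches = 1, since the stream extraction ignores the leading zero, which sidesteps the single-digit ambiguity mentioned above.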