The machine has limits. A standard IEEE 754 double carries about 15-17 significant decimal digits. On top of that, 22/7 only matches pi for about three significant digits (3.142857... vs. 3.141592...). A ten-byte (80-bit extended) long double buys you a few more, roughly 18-19 digits, but that's nowhere near 100. If you want big precision, you'll need a non-atomic variable type, which means writing your own or using a freebie arbitrary-precision class (GMP and MPFR are the usual suspects) -- and those perform like the whales they are.
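You can see it for yourself with a quick sketch (plain standard C++, nothing fancy):

```cpp
#include <cmath>
#include <iomanip>
#include <iostream>
#include <limits>

int main() {
    // digits10 = decimal digits the type can hold without loss
    std::cout << "double:      " << std::numeric_limits<double>::digits10      << " digits\n"; // typically 15
    std::cout << "long double: " << std::numeric_limits<long double>::digits10 << " digits\n"; // 18 on x86

    std::cout << std::setprecision(20);
    std::cout << "22/7 = " << 22.0 / 7.0      << '\n'; // 3.142857142857142793...
    std::cout << "pi   = " << std::acos(-1.0) << '\n'; // 3.141592653589793116...
    // they agree only through 3.14 -- three significant digits
}
```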
setprecision() will accept pretty much any int you hand it, but past the type's real precision (about 17 digits for a double) the extra digits in your output are noise from the decimal expansion of the nearest binary value, not more pi.
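Here's a quick demo of that, same setup as above:

```cpp
#include <cmath>
#include <iomanip>
#include <iostream>

int main() {
    const double pi = std::acos(-1.0); // pi rounded to the nearest double

    // Ask for far more digits than a double can hold:
    std::cout << std::setprecision(50) << pi << '\n';
    // Typical output: 3.141592653589793115997963468544185161590576171875
    // Only the first ~16 digits are pi; the rest is the exact decimal
    // expansion of the nearest binary double, not more pi.
}
```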