Would it be possible to have a few powerful central CPUs in a building and have everyone share them? What I was thinking of is something like an Arduino sitting where your CPU normally goes, wirelessly transmitting data to a central processor that would then send the processed data back. The only problems I could think of are network bandwidth, and that small internet issues would crash all of the computers in the building.
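That round trip — ship raw input over the network, let a central box do the work, get the result back — can be sketched with plain sockets. Everything below is illustrative (the loopback address, the port, and the toy "square a number" workload stand in for whatever real processing the central CPU would do):

```python
import socket
import threading

HOST, PORT = "127.0.0.1", 5055   # illustrative; a real setup would use the building LAN
ready = threading.Event()

def server():
    """The shared central 'CPU': receive raw data, process it, send the result back."""
    with socket.socket(socket.AF_INET, socket.SOCK_STREAM) as srv:
        srv.setsockopt(socket.SOL_SOCKET, socket.SO_REUSEADDR, 1)
        srv.bind((HOST, PORT))
        srv.listen()
        ready.set()                    # signal the client it's safe to connect
        conn, _ = srv.accept()
        with conn:
            data = conn.recv(1024)     # raw input from the thin client
            answer = int(data) ** 2    # stand-in for the heavy processing
            conn.sendall(str(answer).encode())

def offload(value):
    """The thin client: transmit the data, wait for the processed answer."""
    with socket.socket(socket.AF_INET, socket.SOCK_STREAM) as cli:
        cli.connect((HOST, PORT))
        cli.sendall(str(value).encode())
        return int(cli.recv(1024))

threading.Thread(target=server, daemon=True).start()
ready.wait()
result = offload(12)   # the client never computes 12**2 itself
print(result)          # 144
```

The bandwidth worry from the question shows up here too: the cost of the two network hops has to stay small relative to the work being offloaded, or the central CPU buys you nothing.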
So in other words you want a wireless CPU instead of one directly hooked up to your motherboard? That seems like more trouble than it's worth. If you are concerned about a laptop being slow or overheating, I would suggest you look into a desktop. Both laptops and desktops have their ups and downs.
@ OP: Yes, not only is this possible, but it's becoming increasingly popular in medium and large businesses in the US. I've actually evaluated a few configurations of this idea myself using a Raspberry Pi, Citrix Receiver, and Oracle's VirtualBox. It's a cool idea that takes a bit of planning to get going, but it eliminates so many failure points that it would be ridiculous not to at least try it. Is there anything specific about the concept that you wanted to know?
I think the idea of thin client computing is gaining more and more traction as the years go by. Already, there are so many online software-as-a-service alternatives to things that usually were done inside large, mostly offline software packages. Koding is a great example: you basically get a virtual installation of Ubuntu that lives in the cloud, and you can code on it from anywhere with an internet connection. Eventually, I think it'll get to the point where almost all computers are just thin clients, and you just type in your credentials and go. Of course, the lack of an internet connection is still a limiting factor in these scenarios.
@ OP: Yes, with a thin client the only things you need locally are a motherboard with a NIC, CPU, RAM, PSU, and a boot loader. The end user's workstation can be had for $100 - $200 USD. All of the heavy lifting is pushed off onto a central server. There is also the added bonus of long-term power savings, since a thin client machine uses about as much power as the old incandescent light bulbs used to. When all is said and done, this is a solid idea that is set to take off in the very near future. Personally, the more I look at this idea, the more I start thinking that, with the possible exception of CAD, running full PCs in an office environment was probably a market failure.
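The power-savings claim is easy to sanity-check with back-of-envelope numbers. Every figure below is an assumption for illustration, not a measurement: thin clients are commonly quoted in the 10-20 W range, office desktops well above 100 W, and electricity here is priced at an assumed $0.12/kWh:

```python
# Illustrative figures only - check your own hardware's actual draw and local rates.
THIN_CLIENT_W = 15        # roughly an incandescent-bulb level of draw (assumed)
DESKTOP_W = 150           # a typical office tower under light load (assumed)
HOURS_PER_YEAR = 8 * 250  # one shift, ~250 working days
PRICE_PER_KWH = 0.12      # assumed USD electricity price

def yearly_cost(watts):
    """Electricity cost per seat per year at the assumed rate."""
    return watts / 1000 * HOURS_PER_YEAR * PRICE_PER_KWH

savings = yearly_cost(DESKTOP_W) - yearly_cost(THIN_CLIENT_W)
print(f"~${savings:.2f} saved per seat per year")
```

Even with these rough numbers the savings are per seat, so they scale with headcount across a whole building.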
Yes, it's possible, and many places do it already. A friend of mine actually did it on his tablet just using Microsoft's Remote Desktop. His gaming desktop sat plugged in at home in an idle state with Remote Desktop running and connected to the internet; when he was at college, he ran a Remote Desktop app over the internet and just hooked up to his PC at home...
Then he could search for anything he wanted using his home internet rather than the college proxy; he could do proper tasks, browse, or save notes on his home PC, or even run a few light games (light because you don't have keyboard/mouse control from a tablet).
There are plenty of places, though, that have extremely powerful servers capable of doing any heavy processing required, pretty much just sending a picture back to a cheap-ass netbook that barely has enough processing power to turn on.
The benefit is that your fast, non-portable PC does all the work, while your really cheap portable netbook can handle larger tasks at a decent speed (because it's not actually doing them at all).
My high school computer science teacher wanted to do this - have one or two main servers in the back room while all the students just got wireless monitors.
OP, that's an astute observation. In fact, the industry has gone from that model (centralized computers) to localized computers and back again.
The issue is the difference between the cost of bandwidth and the cost of computing. In the early days of computers, processing power was expensive, so we had mainframe computers and distributed terminals that accessed them. The PC revolution happened when the cost of computing dropped low enough that it was economical to put the processor on the user's desk. About 15 years ago the cost of bandwidth got really cheap, so now it is sometimes cheaper to have computers in data centers and connect to them remotely.
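That pendulum can be written down as a toy per-seat break-even comparison. Every dollar figure below is an assumption for illustration; the point is only that whichever side is cheaper at current prices wins, which is why the industry has swung back and forth:

```python
# Toy break-even model; all numbers are assumptions, not real market prices.
LOCAL_PC_COST = 600          # per-seat desktop purchase price (USD, assumed)
THIN_CLIENT_COST = 150       # per-seat thin client hardware (USD, assumed)
SERVER_COST_PER_SEAT = 200   # each seat's share of the central server (USD, assumed)
NETWORK_COST_PER_SEAT = 100  # extra switching/bandwidth per seat (USD, assumed)

centralized = THIN_CLIENT_COST + SERVER_COST_PER_SEAT + NETWORK_COST_PER_SEAT
print("centralized wins" if centralized < LOCAL_PC_COST else "local PCs win")
```

In the mainframe era the network term was effectively huge, so local terminals plus central compute only worked because CPUs were even more expensive; when PCs got cheap the local side won; as bandwidth got cheap the centralized side started winning again.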