Think about what kind of hardware they would need to run all those VMs. Virtual machines are slow; ten instances of Crysis running at the same time would need something like 32 cores, 24 GB of RAM and a sizable array of GPUs - at least 10.
100 players? 1,000? 10,000? Sure, 1,000 players on a single datacenter isn't much, but a big city needs a few of them. 3-4 centers, each able to host games for up to 20k players, would mean 20,000 * 3.2 = 64,000 CPU cores, 20,000 * 2 GB = 40,000 GB of RAM, 20k GPUs and tons of motherboards per center. One GPU eats around 100 W (very optimistic - in reality it is more like 150-200 W), so that's 0.1 kW * 20,000 = 2,000 kW, or 2 MW. One CPU core consumes around 20 W, which makes about 1.28 MW for the CPUs. RAM is quite cheap power-wise, but the motherboards, hard disks (SSDs would be suicide) and PSUs take their share. GPUs and CPUs total 3.28 MW; add HDDs, RAM and motherboards and you get around 4 MW. Factor in ~80% PSU efficiency and you are at roughly 5 MW at the wall. All of that turns into heat inside the center, and that heat has to be pumped back out, so consumption lands somewhere around 5.5-6 MW once cooling is added. Sure, the average consumption will be lower than the peak.
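If you want to poke at those numbers yourself, here is a minimal back-of-envelope sketch; the per-player figures, the "other hardware" fraction and the cooling overhead are rough assumptions carried over from the estimate above, not measured values:

```python
# Back-of-envelope power estimate for one 20k-player datacenter,
# using the (optimistic) per-player figures assumed above.

PLAYERS          = 20_000
CORES_PER_PLAYER = 3.2      # ~32 cores per 10 game instances
GPUS_PER_PLAYER  = 1
GPU_WATTS        = 100      # optimistic; 150-200 W is more realistic
CPU_CORE_WATTS   = 20
OTHER_WATTS_FRAC = 0.22     # RAM, HDDs, motherboards as a rough fraction
PSU_EFFICIENCY   = 0.80
COOLING_OVERHEAD = 0.15     # assumed extra draw for cooling the building

gpu_mw   = PLAYERS * GPUS_PER_PLAYER * GPU_WATTS / 1e6           # 2.0 MW
cpu_mw   = PLAYERS * CORES_PER_PLAYER * CPU_CORE_WATTS / 1e6     # 1.28 MW
it_mw    = (gpu_mw + cpu_mw) * (1 + OTHER_WATTS_FRAC)            # ~4 MW
wall_mw  = it_mw / PSU_EFFICIENCY                                # ~5 MW
total_mw = wall_mw * (1 + COOLING_OVERHEAD)                      # ~5.7 MW

print(f"GPUs: {gpu_mw:.2f} MW, CPUs: {cpu_mw:.2f} MW")
print(f"IT load: {it_mw:.2f} MW, at the wall: {wall_mw:.2f} MW")
print(f"With cooling: {total_mw:.2f} MW per datacenter")
```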
Then add the bandwidth. One client takes about 10 Mbps, so 20k clients add up to 200 Gbps per center - not bad.
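Same kind of estimate for the network side, assuming a flat 10 Mbps stream per client:

```python
# Aggregate egress bandwidth for one datacenter,
# assuming a steady 10 Mbps video stream per client.
PLAYERS         = 20_000
MBPS_PER_CLIENT = 10

total_gbps = PLAYERS * MBPS_PER_CLIENT / 1000   # 200 Gbps
print(f"Egress per datacenter: {total_gbps:.0f} Gbps")
```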
And yeah, the USA has 48 contiguous states, and one state would need multiple hosts - call it around 150 datacenters just for games. Add Europe to that and you are at around 300 datacenters (though smaller ones).
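And the site count, purely to show where the 150/300 figures come from (the three-sites-per-state density is just an assumption):

```python
# Rough datacenter count, assuming ~3 regional sites per US state
# to keep latency down, and a comparable build-out for Europe.
US_STATES       = 48
SITES_PER_STATE = 3     # assumption, not a sourced figure

us_sites    = US_STATES * SITES_PER_STATE   # ~150 (rounded)
total_sites = us_sites * 2                  # ~300 with Europe added
print(us_sites, total_sites)
```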
Hardware upgrades. Driver updates. Software made in-house to run all the stuff.
There is no company that could do this effectively. Sure, hosting a few thousand tic-tac-toe or chess games would be possible, but who would pay for that? There are free Java implementations all over the net.
People like to play multiplayer games, and this kind of approach makes real-time multiplayer nearly unplayable: on top of the normal server round trip you add capture, encoding, streaming and decoding of the video, so the input lag becomes unbearable.
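To see why, here is a rough latency budget for one streamed frame; every number in it is an illustrative assumption, but the shape of the problem doesn't change much if you tweak them:

```python
# Illustrative input-latency budget for a streamed game frame.
# All figures are rough assumptions, not measurements.
LATENCY_MS = {
    "client input -> datacenter": 20,   # one-way network, nearby datacenter
    "game simulation + render":   16,   # one frame at 60 fps
    "video encode":               10,
    "datacenter -> client":       20,   # one-way network back
    "video decode + display":     15,
}

total = sum(LATENCY_MS.values())
print(f"Added input lag: ~{total} ms on top of normal display latency")
```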