The game industry has always hated the second-hand market: when players resell the media, the publisher and developer see none of that revenue.
Here's the "final solution" to that pesky problem.
Before everyone goes "Wow!", it is nothing more than virtual desktop technology, which itself is nothing more than a combo of virtualization plus remote-desktop-type technologies. Could be cool. Imagine someone creates a VM "in the cloud" that is stable and known to work with the game, streams the visuals to you, and carries your controls back to the cloud. Support costs for the game maker could drop drastically.
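Here's a minimal sketch of the server-side loop I'm imagining, with zlib standing in for whatever real video codec they'd actually use; everything here is illustrative, not how OnLive works:

```python
import time
import zlib

# Toy model of one server-side streaming session: render a frame,
# compress it, "send" it, repeat at the target frame rate. zlib is
# only a stand-in for a real video codec (H.264 or similar).
WIDTH, HEIGHT, FPS = 1280, 720, 30
FRAME_PERIOD = 1.0 / FPS

def render_frame(tick: int) -> bytes:
    # Placeholder for the game's framebuffer: 24-bit RGB pixels.
    pattern = bytes((tick + i) % 256 for i in range(64))
    return pattern * (WIDTH * HEIGHT * 3 // 64)

def stream_one_second() -> None:
    sent = 0
    for tick in range(FPS):
        start = time.perf_counter()
        frame = render_frame(tick)        # the game renders on the server
        packet = zlib.compress(frame, 1)  # encode (codec stand-in)
        sent += len(packet)               # this would go out on the wire
        # Sleep out the rest of the frame period, like a real pacer.
        remaining = FRAME_PERIOD - (time.perf_counter() - start)
        if remaining > 0:
            time.sleep(remaining)
    print(f"~{sent * 8 / 1e6:.1f} Mbit of compressed video in one second")

stream_one_second()
```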
How could it fail? Streaming an HD game to a house without stutter, without hitting Comcast's 250 GB monthly cap, etc...
Anyone else see pros/cons with this idea?
BTW, this isn't similar to the Phantom or other consoles of yore. Those failed because, as I understand it, apart from shaky management, publishers were not willing to stream the assets to a console. This system does not stream/download the game itself; it only streams the A/V output of the game being played in the cloud.
I wonder if and how they are virtualizing the PS3 or Xbox 360? Or will it be PC games only?
The other interesting facet of this is that people won't have to shell out for a high-powered computer to play the most recent games. If the bandwidth is there, you can play any game on a bottom-of-the-line computer (or on your TV using their set-top box). Also, this could help cut piracy if the subscription fees for the service are reasonable. I suspect many people who would rather pirate a game than buy it for $60 would be willing to pay a couple of dollars per hour of gameplay instead.
I don't think I agree with your statement that this is "nothing more than" virtual desktop technology - the video compression they have to do to make this work at all is pretty hellacious. It's my understanding that remote desktops work by sending drawing commands across the network rather than actually streaming full-motion video of the desktop. I could be wrong, though.
I'm fairly skeptical about the quality of gaming experience that will be provided by this. From what I've read it sounds like they have it working right now on an internal network with a few hundred users. The real Internet is a jungle, and lag + packet loss + bandwidth issues across a few hundred miles of real Internet do add up.
I don't think I agree with your statement that this is "nothing more than" virtual desktop technology - the video compression they have to do to make this work at all is pretty hellacious
Well, there is a virtual machine on the cloud end and the desktop is transmitted, compressed, to the user. I agree that they are forced to use advanced compression techniques by virtue of trafficking in HD content (like Hulu or Netflix) to make this work effectively, and yes, that would be a big step above VNC-ish kinds of compression. I am curious whether they developed a unique solution or licensed/used a canned approach that already exists. I am assuming the latter, which is why I said "nothing more than". Totally guessing, though.
What would actually be different is how they virtualize machines to play games at high resolution, when most current virtualization solutions target the lowest common denominator. Or if/how they virtualize PS3 or Xbox 360 games. Now that would be a real differentiator and barrier to entry for competitors...
I am curious whether they developed a unique solution or licensed/used a canned approach that already exists. I am assuming the latter, which is why I said "nothing more than". Totally guessing, though.
This article seems to be saying that they've developed their own video compression technology.
if/how they virtualize PS3 or Xbox 360 games. Now that would be a real differentiator and barrier to entry for competitors...
Yeah, it would be. I haven't heard any information about this. However, I do notice that neither Sony, Microsoft, nor Nintendo is on the list of publishers they've struck deals with, and all the games they currently have available seem to be PC ones...
The Kotaku article does seem to indicate that. I guess a custom solution proved better than a pre-existing HD compression codec (or NIH struck again!). They seem to have a full set of publishers behind them, but no OS/console maker, as you noted.
I hope it works. It seems like a cool service, and, if anything, will finally test and put to rest the piracy/pricing conundrum (does piracy make prices higher? without piracy, would prices be lower?) that has wasted many forum-hours...
I don't believe in this. At all.
They are hyping the technology with example games like Crysis, arguing that you won't have to buy the latest hardware to enjoy the latest games in all their glory.
I read somewhere that they claim their latency to be close to 1 ms. How could that even be remotely possible when latency over the internet is usually around 50-100 ms?
Shooters like Crysis, Quake or CS always have problems with network lag to some degree. There are clever techniques that try to hide this, but it is there. These techniques run on the client, usually preserving the appearance of a responsive system even if a latency spike makes the other players jump around jerkily now and then.
But with this system, all input processing and rendering is done on the server, so the FULL latency is visible to the player. To keep the framerate acceptable, the server would need to render the next frame while sending the current one, introducing even more latency. We are now talking about 100-200 ms. Imagine shooting your opponent with a 200 ms delay: a fifth of a second between pulling the trigger and seeing the result.
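Rough back-of-the-envelope on where that 100-200 ms comes from; every per-hop number here is a guess on my part, not anything they've published:

```python
# Illustrative round-trip budget for remote-rendered input; all
# figures are assumptions, not OnLive's published numbers.
budget_ms = {
    "client input -> server (network)":  40,
    "input processing + render":         33,  # one frame at 30 fps
    "encode":                             1,  # their claimed encode time
    "pipeline next frame while sending": 33,  # the extra buffered frame
    "server -> client (network)":        40,
    "client decode + display":           17,
}
total = sum(budget_ms.values())
print(f"input-to-photon latency: ~{total} ms")  # ~164 ms
```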
For casual games, I'd say it would work fine. But then I can't see any advantage over web based distribution either.
The article I saw claimed a latency of 1ms for just the video encoding part. I don't think they're claiming the network latency is that low.
But I agree with your main point - while this may work in controlled conditions over a small internal network it's hard to imagine it would be a great gaming experience over the real Internet. But I guess we'll see.
It's my understanding that remote desktops work by sending drawing commands across the network rather than actually streaming full-motion video of the desktop.
That's how the X server works, but remoting technologies like the one used in Windows XP, and virtually all 3rd-party solutions, do send bitmap images.
The casino industry has already been pushing this technology for quite some time now. Many virtual lottery terminals already use this server-side technology, and online poker (in casinos) is to follow as well, although those solutions are LAN-based. I'm not at all surprised someone decided to take it a step further and bring it into the PC gaming industry.
Personally, I think the idea is moot. ISPs have shown that they will no longer tolerate the fat-pipe dream we all wish we could have. Any bandwidth-demanding technology will have to be approved by the ISPs first, and last I checked, they are in no way going to eliminate the caps. On top of that, IPTV is taking off big now, so add in all those HDTV shows and movies you're downloading. The Internet could be such a beautiful, media-rich place, but the corporate morons would rather we progress at a snail's pace before we evolve to that next stage. Until then, this service will be lucky to generate enough revenue to stay afloat.
remoting technologies like that used in Windows XP and virtually all 3rd party solutions do send bitmap images.
Sure, but they still have the advantage of a frequently small dirty rectangle, plus rarely having to update at 30 or 60 fps for an extended period of time. I certainly have not seen streaming-video artifacts when I use Remote Desktop in XP; also, playing video files over Remote Desktop doesn't really work. HD streaming video with real-time encoding (not encoding with a few seconds' latency, or streaming from a pre-encoded source) isn't really something we've seen before, AFAIK.
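To put numbers on that difference, a quick sketch; the dirty-rectangle size and update rates are just illustrative guesses:

```python
# Raw (pre-compression) data rates: a typical remote-desktop update
# versus full-motion game video. Rect size and rates are guesses.
BPP = 3  # 24-bit RGB

dirty_rect = 300 * 200 * BPP * 5      # 300x200 px rectangle, 5 updates/s
full_motion = 1280 * 720 * BPP * 30   # whole 720p frame, 30 times/s

print(f"remote desktop: ~{dirty_rect / 1e6:.1f} MB/s raw")   # ~0.9 MB/s
print(f"game stream:    ~{full_motion / 1e6:.1f} MB/s raw")  # ~82.9 MB/s
```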
Indeed, remoting wasn't designed for video. I think they (MS) have some sort of QoS to prevent saturating the CPU and network. Some remoting software like RealVNC lets you rip along at a blazing 120 fps (the default config). It brings an office network to its knees, but for the most part you can remote-test games and such on it, or otherwise cut everyone off from internet radio.
Their claim of encoding in 1 ms seems possible to me with the right hardware. They probably have the digital out from the video card fed into a dedicated video encoder, which gets fed back to the user. Their encoding algorithm is probably really basic too. At bitrates of 1.5 Mbps (480p) and 5.0 Mbps (720p), that's a compression ratio of ~80% compared to the original RGB data, less if they use YUV with 4:2:0 subsampling. Such ratios have been around for a long time. With dedicated hardware, even the older codecs could pull this off. And at low resolutions, no one will notice.
Thinking about the network side: at 30 frames per second, user input needs to register every 33 milliseconds. With the same prediction code commonly used on the client side, they could absorb some of that on their end; the gamer might not even notice losing one or two updates. However, at 100+ milliseconds (approx. 3 updates) the delay will definitely be noticeable, especially in fast-paced, action-oriented games. If these guys team up with ISPs to serve locally from the routing stations, they could pull this off easily. As it stands, we can only sit by and watch... them fail.
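Quick sanity check on those update counts, assuming a fixed 30 fps server tick:

```python
# How many 30 fps updates a given network delay swallows.
frame_ms = 1000 / 30  # ~33.3 ms between updates
for delay_ms in (33, 66, 100, 200):
    print(f"{delay_ms} ms delay ~= {delay_ms / frame_ms:.1f} updates")
```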
At bitrates of 1.5 Mbps (480p) and 5.0 Mbps (720p), that's a compression ratio of ~80% compared to the original RGB data
Hmm. That's a lot less compression than I thought. I think your numbers are off, though - for uncompressed RGB, 24 bits per pixel, 720x480, 30fps I calculate a data rate of about 250 Mbits/sec. If they're compressing that down to 1.5 Mbits/sec that's more than a 99% compression ratio. Unless I made a silly mistake! This kind of compression ratio is certainly achievable - MPEG-4 codecs such as DivX do even better than that (but DivX is not built for streaming, of course). You make a good point about using dedicated video encoding hardware. I wonder if that is also virtualized? Can they really have a separate GPU and video encoder for every active player?
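Here's the arithmetic spelled out, in case anyone wants to check me:

```python
# Redoing the compression-ratio arithmetic from the post above.
for label, width, height, stream_mbps in (("480p", 720, 480, 1.5),
                                          ("720p", 1280, 720, 5.0)):
    raw_bps = width * height * 24 * 30       # pixels x bit depth x fps
    reduction = 1 - stream_mbps * 1e6 / raw_bps
    print(f"{label}: raw {raw_bps / 1e6:.0f} Mbit/s -> "
          f"{stream_mbps} Mbit/s is a {reduction:.1%} reduction")
```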
Yeah, my bad: I forgot to factor in the animation; I was thinking about a single image (duh!). I would assume it's a 1:1 relationship, so one game would occupy one unit. They're probably using blades, which were the popular choice (and not that expensive) back when I was working on something like this. With the kind of operation OnLive is running, they probably got a sweet deal. The video encoder unit could handle several units at once to minimize costs: if they can encode in 1 ms, the hardware can service 33 units and stay in sync with a 30 fps target (assuming that is their target framerate). A KVM switch could handle cycling between each unit on a timed basis.
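The multiplexing math would look like this, assuming perfect scheduling, which real hardware won't give you:

```python
# If one encoder finishes a frame in 1 ms, how many game instances
# can it time-slice at 30 fps?
encode_ms = 1.0
frame_period_ms = 1000 / 30  # ~33.3 ms per stream per frame
print(f"~{int(frame_period_ms / encode_ms)} streams per encoder")  # 33
```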
Will never happen for online games. Maybe for turn based games, if even for them.
First, lag. Second, bandwidth. Third, maintenance costs.
Agree that lag is a problem. Bandwidth is a problem for some, but not all. Comcast's 250 GB monthly cap would be hit after about 55 hours of play at 10 Mbps, presumably used full-throttle. (I wonder what the bandwidth use is for a game like Crysis, quoted in Kotaku's article?) Many players put in more than 50 hours of gaming in a month...
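For anyone who wants to check the cap math, using the bitrates quoted in this thread:

```python
# Hours of play before hitting a 250 GB monthly cap.
CAP_GB = 250
for mbps in (5, 10):
    gb_per_hour = mbps * 3600 / 8 / 1000  # Mbit/s -> GB per hour
    print(f"{mbps} Mbps: {gb_per_hour:.2f} GB/h, "
          f"cap hit after ~{CAP_GB / gb_per_hour:.0f} h")
```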
What do you mean by "maintenance"? Maintenance, as I think of it, is actually an argument strongly in favor of this model. Publishers build a game that can run on a single virtual machine; once OnLive has "certified" it, the VM can be replicated as needed. The player doesn't have to fight his own machine, and if you buy their device, they have minimal hardware to support. It's a win all around? Where do you see potential problems?
Think about what kind of hardware they need to run all those VMs. Virtual machines are slow; 10 instances of Crysis running at the same time would require something like 32 cores, 24 GB of RAM, and a huge array of GPUs - at least 10.
100 players? 1,000? 10,000? Yeah, 1,000 players on a single datacenter is not much, but a big city needs a few of them. 3-4 centers, each with capacity to host up to 20k players, would mean 20 * 3.2 * 1000 CPU cores (64,000), 20 * 2 * 1000 GB of RAM (40,000), 20k GPUs, and tons of motherboards. One GPU eats around 100 W of power (very optimistic here; in reality it is more like 150-200 W), which makes 0.1 kW * 20k = 2,000 kW, or 2 MW. One CPU core consumes around 20 W, so 1.28 MW for the CPUs. RAM is quite cheap power-wise, but the motherboards, hard disks (SSDs would be suicide) and PSUs take their share. GPU and CPU total 3.28 MW; add HDDs, RAM and motherboards and you get around 4 MW. Factor in the 80% efficiency of the PSUs and you are around 5 MW. That is 5 MW of heat inside the center that has to be gotten back out, so consumption goes to around 5.5-6 MW once you add the cooling (arithmetic redone in the sketch below). Sure, the average consumption will be lower, though.
Then add the bandwidth: one client takes 10 Mbps, so 20k clients make 200 Gbps. Not bad.
And yeah, the USA has 48 contiguous states, and one state would need multiple hosting sites. 150 datacenters just for games? Add Europe and you're at around 300 datacenters (though smaller ones).
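```python
# The power estimate above, redone with explicit units (watts).
# Per-unit draws and overheads are the same rough guesses.
players = 20_000
cores = players * 3.2          # 32 cores per 10 instances -> 64,000
gpu_w, core_w = 100, 20        # optimistic per-GPU / per-core draw

compute_w = players * gpu_w + cores * core_w  # 2 MW + 1.28 MW
misc_w = 0.7e6                                # disks, RAM, boards (guess)
wall_w = (compute_w + misc_w) / 0.8           # 80% PSU efficiency
total_w = wall_w * 1.15                       # rough cooling overhead

print(f"compute:      {compute_w / 1e6:.2f} MW")  # 3.28 MW
print(f"at the wall:  {wall_w / 1e6:.1f} MW")     # ~5.0 MW
print(f"with cooling: ~{total_w / 1e6:.1f} MW")   # ~5.7 MW
```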
Hardware upgrades. Driver updates. Software made in-house to run all the stuff.
There is no company that can do this effectively. Sure, hosting a few thousand games of tic-tac-toe or chess would be possible, but who would pay for that? There are free Java implementations all over the net.
People like to play multiplayer games. This kind of approach makes it impossible to play real-time multiplayer games, since the input lag is unbearable.
From some subsequent reading: since they only push out compressed video and take simple input back, they claim they can operate fine at 1280x720 at 5 Mbps. (Which, I think, is what they aim for with set-top gaming to hit the widest audience; I don't get the feeling they will offer full 1080p HD.) Of course, I'll be the first to argue that hardcore gamers may not be satisfied with that resolution, but as a near-hardcore gamer myself, I'd be willing to give up some resolution for better stability and (presumably) quick access to a library of games.
Furthermore, I think you are being too conservative on your VM/core ratio. Assuming a well-designed operational backbone, they are balancing someone playing Crysis against someone playing a less taxing game. Even if you put all the Crysis players on a single 32-core server at two cores per VM, you would still get at least 16 VMs on that server. That's a 60% increase over your numbers, which means a lot at the scale you were discussing. Given that a game like Crysis could run on one core (assuming 720p), the average would necessarily be even higher, further reducing your datacenter numbers.
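The packing math, for reference; the cores-per-VM figures are just the guesses from this thread:

```python
# VM packing on one 32-core box under different cores-per-VM guesses.
SERVER_CORES = 32
for cores_per_vm in (3.2, 2, 1):  # pessimistic -> optimistic
    vms = int(SERVER_CORES / cores_per_vm)
    print(f"{cores_per_vm} cores/VM -> {vms} instances per server")
```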
The mystery would be how they handle GPU/rendering. I haven't figured that one out; obviously, they have. I don't think they will rack one server per player; that would be cost-prohibitive. I wonder if they offload rendering to some of those NVidia render boxes. Or is it similar to AMD's "Fusion Render Cloud"?
As for bandwidth costs and energy costs, that's all very manageable. YouTube, Hulu and NetFlix do just fine, don't they? Why can't someone else do the same with "streaming games" instead of "streaming movies"?
Localized latency and bandwidth (read: ISPs) will be the nail-in-the-coffin issue. Sometimes, between 7 and 9pm, my Comtastic connection turns Craptastic as the whole neighborhood gets online and the oversold bandwidth gets chewed up like a ham sandwich. If that's the situation for most people, I can see their market being so restricted that they'll never get off the ground.
Just an interesting followup on this. A couple days ago Gamespot did an interview with the CEO. Worth checking out. Viewing all running games in the menu like that is sick!
There are a lot of thread-enders on the technical side of this that people have already highlighted, but here's a much more down-to-earth reason this will never work.
Given the ludicrous tech and bandwidth that needs to be supplied to pretty much every home on the planet, how much do you think the subscription fee is likely to be? And more importantly, how many months would it take before you could have bought your own Xbox or whatever and saved money - and gotten a lag-free, 100%-uptime experience to boot?
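Toy break-even calculation; both prices are made up on my part, since nothing has been announced:

```python
# Hypothetical subscription versus buying a console outright.
# Both prices are placeholders, not announced figures.
CONSOLE_PRICE = 300.0  # hypothetical one-time hardware cost
MONTHLY_FEE = 15.0     # hypothetical streaming subscription
print(f"subscription equals the console price after "
      f"{CONSOLE_PRICE / MONTHLY_FEE:.0f} months")  # 20 months
```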
And streaming a game is not like streaming a movie. Some boss computer has to be generating that game, and it has to be a computer better than yours. For everyone!
It's a scam. There has to be some bean-counter in this operation that knows the sums don't add up.