Why Cloud Computing Gaming is Going to Happen


Eurogamer writes an entertaining piece on why Cloud Computing gaming is a fantasy. One humorous excerpt:

More than that, OnLive overlord Steve Perlman has said that the latency introduced by the encoder is 1ms. Think about that; he’s saying that the OnLive encoder runs at 1000fps. It’s one of the most astonishing claims I’ve ever heard. It’s like Ford saying that the new Fiesta’s cruising speed is in excess of the speed of sound. To give some idea of the kind of leap OnLive reckons it is delivering, I consulted one of the world’s leading specialists in high-end video encoding, and his response to OnLive’s claims included such gems as “Bulls***” and “Hahahahaha!” along with a more measured, “I have the feeling that somebody is not telling the entire story here.” This is a man whose know-how has helped YouTube make the jump to HD, and whose software is used in video compression applications around the world.

To Summarize the Major Technical Hurdles

  • Processing: Hardware requirements scale linearly with the number of concurrent users. To support a million concurrent users, you would need a million computers at a data center running the games, which would be an outrageously expensive infrastructure and maintenance burden.
  • Video Encoding: Movie streaming sites require lots of preprocessing time to prepare videos for download. H.264 video compression is computationally expensive. Doing it in *real-time*, once per user, would be insane.
  • Network Latency and Lag: Movies are non-interactive, and buffering smooths out latency issues. Interactive games can’t buffer video content (beyond sub-second micro-buffering), so any small latency problem translates directly into end-user lag. Extra lag can ruin high-precision reflex games like competitive fighters and shooters.
  • Bandwidth: Real-time video requires too much bandwidth. What about bandwidth caps and network fees? (A rough sketch of the numbers follows this list.)
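
To put rough numbers on a few of these hurdles, here is a quick back-of-envelope sketch in Python. The frame rate, bitrate, and the 1ms encoder figure (the claim quoted above) are illustrative assumptions, not published specs:

    # Back-of-envelope numbers for streamed gameplay (illustrative assumptions,
    # not published OnLive specs).

    FPS = 60                      # target frame rate
    FRAME_BUDGET_MS = 1000 / FPS  # time available per frame, end to end

    # A claimed 1ms encoder latency implies the encoder finishes a frame in
    # 1/1000 of a second, i.e. it could in principle run at 1000fps.
    claimed_encoder_ms = 1
    implied_encoder_fps = 1000 / claimed_encoder_ms

    print(f"Per-frame budget at {FPS}fps: {FRAME_BUDGET_MS:.1f}ms")
    print(f"A 1ms encode implies ~{implied_encoder_fps:.0f}fps encoding throughput")

    # Rough bandwidth: a 720p stream at a hypothetical 5 Mbit/s bitrate.
    bitrate_mbps = 5
    gigabytes_per_hour = bitrate_mbps / 8 * 3600 / 1024
    print(f"{bitrate_mbps} Mbit/s of video is roughly {gigabytes_per_hour:.1f} GB per hour of play")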

Counter-points

  • Processing: This is similar to video game lounges, which rent individual gaming terminals by the hour. What happens when a hundred people show up and there are only twenty terminals? They have to wait or leave. And if the lounges are consistently popular, they expand. An Internet service like the one OnLive is proposing can likewise start small and grow naturally from there, and it can take advantage of economies of scale like blade servers or similar technologies. (A rough capacity sketch follows this list.)
  • Video Encoding: Complain all you want about how hard real-time H.264 encoding is: if a rinky-dink old webcam can do live video streaming, I’m sure there are compression-theory hot shots out there who can replicate this decade-old technology, or maybe even improve upon it.
  • Network Latency and Lag: Have you played online action games? These games use the exact same Internet with the exact same latency issues. People have been playing Internet multiplayer games for over a decade, and lag is just an occasional annoyance.
  • Bandwidth: This type of service requires roughly the same bandwidth that Internet video and movies do. While this is still an issue, plenty of people already watch Internet video regularly.
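
To make the “start small and grow” point concrete, here is a hypothetical capacity estimate. The subscriber count, peak-concurrency ratio, and sessions-per-server figure are assumptions chosen only to show the arithmetic, not anything OnLive has announced:

    # Hypothetical capacity planning for a streamed-gaming service.
    # All numbers are illustrative assumptions.

    subscribers = 1_000_000
    peak_concurrency = 0.10    # assume at most 10% of subscribers play at the same time
    sessions_per_server = 4    # assume one blade can host a few game sessions

    peak_users = int(subscribers * peak_concurrency)
    servers_needed = -(-peak_users // sessions_per_server)  # ceiling division

    print(f"Peak concurrent players: {peak_users:,}")
    print(f"Servers needed at peak:  {servers_needed:,}")

The point is not the specific numbers, but that capacity only has to cover peak concurrent players rather than every subscriber at once, and it can grow as the user base does.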

Bottom Line

Every one of these technical obstacles was solved years ago. Live video streams? Latency-tolerant Internet games? Bandwidth-intensive HD video? Server-side computer clusters? This stuff is beyond wild theories and daring research papers. This is old hat. Climb out from underneath your desks, step out of your closets, and enjoy the future.


Written by: Darrin - Contributing Editor


  1. #1 by mcloki on March 30th, 2009

    I’m in the skeptical camp.
    While I believe that games like Bejeweled, chess, poker, and most non-twitch games can be handled, I’ll wait to see if SOCOM can be handled in this manner. I know they demoed Crysis, but I’m betting that was on an intranet, not the Internet.
    The latency and network issues are the problem. Carriers switching network addresses hourly, bandwidth caps being inconsistently applied nationwide: all of these things add up. If this system works, why wouldn’t they repurpose it, strike deals with every major movie studio, and deliver movies over the web? The money is there for the taking. It would be technically easier, since it would just be VOD over IP, or “Vodio”.
    The concept is great, and if it works I will sign up, but I need to see it to believe it.
    And on the third of your counter-points: the lag issue on consoles fills the forums. With consoles, all that is being delivered is the positional data of the other characters’ movements; your console then uses local data to recreate those movements. OnLive is delivering a stream of video downstream while getting your positional inputs upstream. Then it has to compute that data, render a frame 60 times per second, compress it to a proprietary video and audio format, and send that frame over the Internet to your console, which then decodes the frame and renders it to your screen. Wow, that’s a lot of work and a lot of interworking parts that can go wrong. How is chat handled?
    Again, maybe there are geniuses who can do this, but if this were to happen, these guys would need the processing power of every PS3 sold at their disposal. That’s a hell of a lot of hardware they are going to need.
    I’ll go with you halfway, Darrin. Games that don’t need split-second reflexes will work under this method; Killzone 3 may be a stretch. But in five years, with a fiber Internet connection, I may be willing to change my mind.
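
For a sense of how the moving parts described in the comment above add up, here is a rough end-to-end latency budget for a single streamed frame. Every stage timing is an illustrative assumption, not a measured figure:

    # Rough end-to-end latency budget for one streamed frame.
    # All stage timings are illustrative assumptions, not measurements.

    stages_ms = {
        "input travels upstream to server": 20,
        "game simulation + render":         17,   # roughly one 60fps frame
        "video encode":                      5,
        "video travels downstream":         20,
        "decode + display":                 10,
    }

    total = sum(stages_ms.values())
    for stage, ms in stages_ms.items():
        print(f"{stage:<35} {ms:>3} ms")
    print(f"{'total input-to-photon lag':<35} {total:>3} ms")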

  2. #2 by darrin on March 30th, 2009

    Today’s Internet multiplayer games use much less bandwidth than a full-video feed does, but latency has the exact same impact on either approach.

    Simple games, like cards and checkers, as well as simple strategy/RPG/action games, are dominated by in-browser Flash. They could do the same thing with an OnLive system, but what’s the point? It would require more expensive infrastructure and maintenance for nothing.

    Also, I’m skeptical about them doing action games on this. Gamers want razor-sharp resolution, a glass-smooth frame rate, and millisecond-level response times.

    But for MMO-type games, like PlayStation Home, no one cares about slight drops in that stuff. If lag is 10ms instead of 1ms, or the frame rate drops to movie-quality 24FPS (or less), no one cares. If you can eliminate load times, zoning pop-in, downloading, video buffering, and having to patch and maintain client software, those would be huge gains. It wouldn’t just be replicating the current gaming experience on a new infrastructure; it would be making dramatic improvements.

  3. #3 by mcloki on March 30th, 2009

    The push for this technology won’t come from gamers; it will come from publishers. Companies these days want your money, and none of these OnLive games can be resold. That’s a big deal, especially to game companies. The money alone is going to push publishers this way. OnLive needs to create a standard hardware spec and get it placed into set-top boxes or directly into televisions. Imagine Sony HDTVs that come out of the box with OnLive built in. The more I think about it, the more I think OnLive is just looking to get bought out.

  4. #4 by John on March 30th, 2009

    Existing online games are only sending the equivalent of keystrokes. That’s a few hundred bytes per second: bytes, not kilobytes or megabytes. And people already complain about lag.

    Want to transfer the megabytes per second of HD gaming at 60 FPS while adding no more than a frame of lag? Even on a Gigabit LAN you would have trouble with it.

    For anything fast-paced this is pure bull, or, er… a VC money pit if you wish. And their 1ms lag is server-side, not what gamers will actually experience.

    But for things like Home, yeah, it could work. But would it be economical? There is a reason why most games go for peer-to-peer and client-side work: it’s wayyy cheaper.
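
A quick comparison of the two data rates being contrasted here, using illustrative assumptions for both:

    # Traditional multiplayer traffic vs. a streamed video feed.
    # Both rates are illustrative assumptions.

    input_bytes_per_sec = 300          # a few hundred bytes/s of input/position data
    video_bitrate_bps = 5_000_000      # hypothetical 5 Mbit/s HD game stream
    video_bytes_per_sec = video_bitrate_bps / 8

    ratio = video_bytes_per_sec / input_bytes_per_sec
    print(f"Traditional multiplayer: ~{input_bytes_per_sec} B/s")
    print(f"Streamed HD video:       ~{video_bytes_per_sec:,.0f} B/s")
    print(f"The video stream is roughly {ratio:,.0f}x more data per second")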

  5. #5 by Emrah on March 30th, 2009 [ 7319 Points ]

    The latency (lag) in multiplayer games currently works differently than it will on OnLive. In current “client-side” gaming, when you jump, you jump. When you shoot, you shoot. The server then decides whether or not to honour your actions, depending on your network status and the enemy’s status. That’s why the enemy falls a little later than he should in, e.g., Killzone 2 and several other current games.

    Effects of network latency and packet losses are remedied to some extent using Client-Side Prediction:
    http://en.wikipedia.org/wiki/Client-side_prediction

    What’s happening on the server is ALWAYS different (even if only slightly) from what you are actually seeing in the game, except in situations where the network lag is less than a few milliseconds (gaming on a LAN). That’s why you may die wondering how you could have, when you unloaded a full clip into an enemy but he ended up killing you. What actually happened was that he started shooting earlier on his client side, and his actions were honoured before yours on the server side; this can easily be seen in COD4’s kill-cam.

    However, since they aim for the comfort of a single-player experience, OnLive will have this latency problem given the current state of our Internet infrastructure. It will probably NOT have this problem 7-8 years from now; latency and bandwidth will be much less of a problem by then. But other problems of scaling will most probably remain.

    BTW, multiplayer will be very easy to tackle if they get single player to work. In fact, it would be awesome, since all players would receive the same game state, as the game servers would talk to each other with ultra-low latency when games are built for the platform.
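
A minimal sketch of the client-side prediction idea linked above: the client applies its own inputs immediately, then reconciles with the authoritative server state when the (delayed) confirmation arrives. The class name, the 1D movement model, and all numbers are illustrative assumptions, not any particular engine’s code:

    # Minimal client-side prediction sketch (1D movement, fixed step size).
    from collections import deque

    SPEED = 5.0  # units moved per input step (arbitrary)

    class PredictingClient:
        def __init__(self):
            self.position = 0.0     # what the player sees right now
            self.pending = deque()  # inputs sent to the server but not yet acknowledged
            self.next_seq = 0

        def apply_input(self, direction):
            """Apply the input locally right away and remember it for reconciliation."""
            self.position += direction * SPEED
            self.pending.append((self.next_seq, direction))
            self.next_seq += 1
            # (the input would also be sent to the server here)

        def on_server_state(self, ack_seq, server_position):
            """The server reports its authoritative position after processing input ack_seq."""
            # Drop inputs the server has already processed...
            while self.pending and self.pending[0][0] <= ack_seq:
                self.pending.popleft()
            # ...then replay the ones still in flight on top of the authoritative state.
            self.position = server_position
            for _, direction in self.pending:
                self.position += direction * SPEED

    # Example: the client presses "right" three times before hearing back from the server.
    client = PredictingClient()
    for _ in range(3):
        client.apply_input(+1)
    print(client.position)  # 15.0, shown immediately despite network lag
    client.on_server_state(ack_seq=0, server_position=5.0)
    print(client.position)  # still 15.0: the two unacknowledged inputs were replayed

This is what lets your own character feel instant in today’s games even on a laggy connection; a pure video stream has no local state to predict with, so the round trip shows up directly in your controls.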

  6. #6 by Trieloth on March 30th, 2009

    Hmm, nope, won’t happen… you’re saying Mario, Link, Master Chief, Kratos, etc. will just be thrown to the side in favor of a bunch of multiplatform titles? I could see it happening if the big wigs (M$, Sony, NIN) started doing it. But some people… a lot of people don’t have fast enough Internet. Christ, my phone company told me my 1.5 meg connection is blazing fast and it’s the fastest they offer. Another thing: people like hard copies of games. It’s gonna end up like the Jaguar/Neo Geo; only the rich will have it. But predicting people and the future is IMPOSSIBLE, so good luck.

  7. #7 by Hentaku on March 30th, 2009

    I remain certain that it isn’t going to happen anytime soon. We’ll talk again in about 10 years.

  8. #8 by Devils on March 30th, 2009

    Just wanted to thank Darrin for providing something to read on this site besides stuff about mockers. Please do more of this, mate.

  10. #10 by John on March 31st, 2009

    > It will probably NOT have this problem 7-8 years from now; latency and
    > bandwidth will be much less of a problem by then.

    I doubt it. While bandwidth has improved tremendously since the 56K days, *latency* has not, and in many cases it’s worse. Pings of 50-100ms are commonplace nowadays, and worse pings, even on the latest fiber connections, are not unheard of.

    >BTW, multiplayer will be very easy to tackle
    >if they get single player to work

    This isn’t as easy. The perceived lag in multiplayer is influenced by the combination of lags from all players. This is why you tend to see high-ping players getting kicked from private servers, and why private servers are so highly valued for clan training.

  11. #11 by Emrah on March 31st, 2009 [ 7319 Points ]

    @john: I agree, but if they could come up with a response time for all players involved that is almost as good as local play, which is what they claim, multiplayer would be a piece of cake, as the game would be hosted in the same place as the video servers.

    It is just that their claim is not achievable on today’s infrastructure. A tenth of a second of latency is VERY frustrating to any player, since your game state will also be updated a tenth of a second late, unlike in current games, which predict the current state on the client side.

  12. #12 by P5ycH0 on March 31st, 2009

    I like the idea. Not for the games, but for the improvements ISPs need to make in order to cope with this. They have been slacking the last couple of years.

  13. #13 by yodaddy on March 31st, 2009

    I’ve got 15 megs down and 1 up… does that count for anything?

  14. #14 by George on March 31st, 2009

    I don’t buy this: “Hardware requirements scale linearly with the number of concurrent users.”

    I’ve seen it scale better than that where I work: you get more than twice the number of users if you double the hardware. For example, we recently doubled server capabilities and are now hosting about 6000 users on one machine; previously we were only able to host 2000 users on hardware half as good.

    I think it’s more like car engines: the total horsepower is far greater than the sum of the parts.

    8-core CPUs will be mainstream very soon, and I expect we will see smaller, cheaper “mini-clouds” not too far away. You can already get 4-motherboard rack-mount clusters (four 2-processor boards in a 2U chassis).
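
Plugging the numbers from this comment into a quick check of the “better than linear” claim:

    # Users hosted per unit of hardware, before and after the upgrade
    # described above (figures taken from the comment).

    before_users, before_hw = 2000, 1   # original hardware
    after_users, after_hw = 6000, 2     # "doubled" hardware

    print(f"Before: {before_users / before_hw:.0f} users per hardware unit")
    print(f"After:  {after_users / after_hw:.0f} users per hardware unit")
    print(f"Doubling the hardware gave {after_users / before_users:.1f}x the users")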

  15. #15 by Darrin on March 31st, 2009 [ 17143 Points ]

    OK, so latency is an issue today. During a latency bump in today’s Internet games, the game will continue to control and render smoothly (all done locally), but it will fall out of sync with the server and will jarringly resync when the network connection is restored.
    With server-side game rendering, the multiplayer sync problems wouldn’t be any worse than they are today, but latency problems would also affect the player’s own character control (since player input is processed remotely rather than locally).

    George, I agree that you can probably build hardware setups that scale better than linearly. But even in the linear worst case, that wouldn’t be a deal-breaker, so it’s not worth debating.

