|
#4 |
|
Do you have any articles to back up your claims? I'm seriously interested to know whether this is feasible, because I've had this idea ever since I learned of remote desktop. The serious problem lies in speed. If they can overcome that, it should be possible. Here's a quick list of what should be common sense:

1. Latency. With this system, every controller input has to take this loop: input -> internet transfer (significant latency) -> server PC running the game -> capture and encode video -> internet transfer again (more latency) -> decode video -> output. There's no way around it; this introduces significantly more latency than the typical model of input -> console -> output.

2. Bandwidth. 5 Mbps for HD gaming works out to about 2.25 GB of total transfer per hour of use, before counting any network activity for multiplayer (the sketch after this post runs the numbers). Bandwidth caps are cropping up everywhere; the highest tier from Rogers (the largest ISP in Canada) is 95 GB per month, and most people here have 50 GB caps. That is ridiculously expensive.

3. Quality. 5 Mbps is a ridiculously low bitrate for reasonable HD quality, especially considering this will be encoded in real time, which means single-pass with many of the quality-improving heuristics disabled because they're too expensive. By comparison, Blu-ray encodes (with the same codecs) run around 30 Mbps. You'll get extensive compression artifacts in the game, from blurring to macroblocking.

4. Cost. To run modern PC games, you need beefy hardware. That's precisely the cost they're trying to avoid, but they can't; they're just shifting the upfront capital from you to them, so they can charge you monthly fees to access it over the internet with a laggy, blurry result. It'll cost a fortune to build any kind of large system capable of many simultaneous games of, say, Crysis or Halo. Ridiculously expensive, not to mention ridiculously power-hungry and HOT (and hard and expensive to cool).

5. Scalability. Games have usage spikes when they're released; think Halo 3 or GTA4 at launch. It's impossible for them to have enough "servers" available for these kinds of high-demand, spiky-usage games for everyone to use at once. Which means you'll be paying a monthly subscription and dealing with lots of "sorry, server too busy" messages or long queues to use the product you're paying for. I know for a fact they're not going to want to increase their capacity tenfold just to cover the peaks and valleys of usage, but they'd have to in order not to piss off their customers.

That's it in a nutshell. But as I said, this really should be kind of intuitive.
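To make point 2 concrete, here's a quick back-of-envelope script. It's a minimal sketch assuming a constant 5 Mbps stream and decimal gigabytes; the cap figures are the Rogers numbers from the post above.

```python
# Back-of-envelope check of the bandwidth point. The 5 Mbps rate and
# the Rogers caps come from the post; treating 1 GB as 10^9 bytes is
# my assumption.

STREAM_MBPS = 5.0               # advertised HD stream, megabits per second
SECONDS_PER_HOUR = 3600

bytes_per_hour = STREAM_MBPS * 1e6 / 8 * SECONDS_PER_HOUR
gb_per_hour = bytes_per_hour / 1e9
print(f"Data transferred per hour of play: {gb_per_hour:.2f} GB")  # ~2.25 GB

for cap_gb in (95, 50):         # the monthly caps mentioned above
    hours = cap_gb / gb_per_hour
    print(f"A {cap_gb} GB cap is used up after ~{hours:.0f} hours of gaming")
```

At roughly 42 hours on the 95 GB tier, about an hour and a half of play per day would exhaust the entire monthly allowance before any other household traffic.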
#5 |
|
Credibility. I want articles on OnLive. Gamespot has the streaming video of the Q&A with the OnLive guys, and they didn't say ANYTHING substantial. Lots of dodging, squirming, and vague niceties.

"Agreed, but maybe they found a way to beat this problem?"

Yes. Not only have they found a way to reinvent the entire internet infrastructure overnight, they've found a way to eliminate internet latency! Not only THAT, but their 5 Mbps won't count toward anyone's bandwidth caps, because it is MAGIC. Not only THAT, but they've invented a new codec better than everything thousands of PhDs and specialized companies have produced in the past decade, such that its 5 Mbps HD stream is completely indistinguishable from an uncompressed local image stream!

"Build server farms in the north pole?"

Yes! Because that would help with their latency problem!

"There are ways to control this if it were to become a problem."

No, there's not. Which is precisely why you didn't say what they are! The only way to "control" this is for them to buy enough servers to handle peak usage, which is an insane proposition. MS had to spend billions of dollars just to set up the Xbox Live infrastructure, which only handles the network operations; that's nowhere near the same magnitude as running the whole show like they're proposing. And even then, with billions of dollars and a mature infrastructure, Xbox Live still has issues during peak times (e.g., holidays).

I don't think you can comprehend all of the issues here. They're not minor things they can "work around". If you understood how the internet works, how video codecs work, and how cloud computing works, you'd realize how ridiculous this concept is.

"All of the problems I am aware of, but perhaps there is a way to make it work. That's the problem with you Asher, you take it as it is, you don't dream."
#7 |
|
Well, the PS3 lets you play PSP and PS1 games remotely using your PSP. I've tried it and it's not too bad. If current-gen consoles had remote play enabled (which they do not), I suppose it's possible, and a possibility for the future.

I too think this isn't feasible. If we assume this succeeds, how are they going to run the millions of games that gamers will play? Asher might have an extreme P.O.V., and while the idea sounds nice, I do question the practicality of it. Plus the controller looks like crap. :-p Still, I'd like to see someone argue in favor of why it will be possible. It would be interesting to hear.
#10 |
|
If you didn't mind utterly shitty resolution, it would be pretty easy, actually. It's really the resolution and refresh rate that cause the main problem, both in the servers' ability to serve the games and in bandwidth. Remote computing and all of its variants work well in the business world, where you don't need constant refreshes, so you're really only streaming a small amount of data at any one point; the latest POC we tested along those lines was something like 180 KB/s at 1440x900, for example. Think about that: 1440x900 is just under 1.3 million pixels. Refreshing the entire screen 30 times a second means about 39 million pixels PER SECOND, and at even 16-bit color (2 bytes per pixel) that's roughly 78 MB/s uncompressed. To get that down to 180 KB/s, you'd drop to maybe 15 FPS (~39 MB/s for full refreshes), compress to hell and gone (say 8:1, still ~5 MB/s), and then refresh only 3 to 4% of the screen per update. No way you can do that with a video game; even Civ would not look decent with that, and it's probably one of the modern games most easily adapted to this.
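Running that arithmetic explicitly (my numbers, using the post's rough factors of 16-bit color, 8:1 compression, and a ~3.7% partial refresh):

```python
# The 1440x900 arithmetic from the paragraph above, done explicitly.
# The 16-bit color depth, 8:1 compression ratio, and ~3.7% partial
# refresh are the post's rough factors, not measured values.

W, H = 1440, 900
BYTES_PER_PIXEL = 2                      # 16-bit color

frame_bytes = W * H * BYTES_PER_PIXEL    # one full frame: ~2.6 MB
rates = {
    "30 FPS, uncompressed": frame_bytes * 30,
    "15 FPS, uncompressed": frame_bytes * 15,
    "15 FPS, 8:1 compression": frame_bytes * 15 / 8,
    "15 FPS, 8:1, ~3.7% of screen per update": frame_bytes * 15 / 8 * 0.037,
}

for label, bytes_per_sec in rates.items():
    print(f"{label}: {bytes_per_sec / 1e6:.2f} MB/s")
# The last line lands right around the 0.18 MB/s (180 KB/s) the POC used.
```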
Now, if you wanted to do something with server-based games that was actually feasible and accomplished something useful, you could distribute the computing between the desktop and the server: let the server handle physics calculations, for example, which might be difficult for a lower-spec PC, and leave the PC to handle the images and such. That's not so different from how MMOs work, though I don't think they tend to have difficult physics calculations; AFAIK they handle the game processing on the server end (i.e., what events occur, who wins battles, etc.) and pass the results on to the client, whose primary purpose is to display images and accept/translate user input. I'm not sure you could sell such a service meaningfully, though, unless you had a REALLY good physics system; and even then it would probably stay in the MMO realm, I imagine. A sketch of that split follows.
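A minimal sketch of what that split might look like, with the server as the simulation authority and the client as a thin renderer. All names and the message format here are hypothetical, not any real MMO's protocol.

```python
# Server owns the authoritative simulation (physics, who wins battles);
# the client only sends input and renders the results it receives.

import json
from dataclasses import dataclass, asdict

@dataclass
class WorldState:
    tick: int
    positions: dict  # entity id -> [x, y]

def server_tick(state: WorldState, inputs: dict) -> WorldState:
    """Server side: advance the simulation one step using client inputs."""
    new_positions = {}
    for eid, (x, y) in state.positions.items():
        dx, dy = inputs.get(eid, (0.0, 0.0))
        new_positions[eid] = [x + dx, y + dy]
    return WorldState(state.tick + 1, new_positions)

# Client side: send input up, receive serialized state down, draw it.
state = WorldState(0, {"player1": [0.0, 0.0]})
state = server_tick(state, {"player1": (1.0, 0.5)})
print(json.dumps(asdict(state)))  # {"tick": 1, "positions": {"player1": [1.0, 0.5]}}
```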
#11 |
|
Some games now are disconnecting the rendering of the game from the main game loop for various reasons, the most recent example being Killzone 2. Because of this, there's a VERY minute perceived "lag", an inconsistency between manipulating the controls and movement on the screen. We're talking about 10 ms or less, but it already has many Killzone 2 players up in arms: the controls feel "mushy" or "disconnected", and it impacts the playability of the game. Now take that sensation, increase it by an order of magnitude or more (which is internet latency alone), add compression artifacts on top, and you've got a real stinker on your hands.
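To put that order-of-magnitude claim in perspective, here's a hypothetical latency budget for the streamed loop described in post #4. Every figure below is an illustrative guess on my part, not a measurement of OnLive or any real service.

```python
# A hypothetical latency budget for a streamed-game input loop.
# All numbers are illustrative guesses.

LOCAL_DECOUPLING_LAG_MS = 10        # the Killzone 2-style lag discussed above

cloud_loop_ms = {
    "input -> internet -> server (one way)": 40,
    "simulate + render one frame (60 Hz)": 17,
    "capture + encode video": 15,
    "server -> internet -> client (one way)": 40,
    "decode + display": 10,
}

total = sum(cloud_loop_ms.values())
print(f"Local render-decoupling lag: ~{LOCAL_DECOUPLING_LAG_MS} ms")
print(f"Hypothetical streamed loop:  ~{total} ms")   # ~122 ms
print(f"Roughly {total // LOCAL_DECOUPLING_LAG_MS}x the lag players already complain about")
```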
#14 |
|
Well, I mean the lag between you and another player's machine. Since both games run on the same machine, there's no lag between them... (well, light-speed delay, I guess). There are 4 datacentres planned in the US alone, not to mention that you won't necessarily be playing only against other OnLive members (unless they wall off the multiplayer, which would be incredibly stupid).
#16 |
|
Well, if you include both the user-to-server lag and the server-to-server lag, the total will be at LEAST as bad, and potentially 2x as bad.

Right now you incur lag just once: with a 100ms ping (pretending that's equivalent to lag), you're delayed 100ms in receiving actions from your friend. He does something; you find out 100ms later. Right? With this system, once he does something, it takes his him-to-server lag (say 80ms), then the server-to-server lag (minimal), then the server-to-you lag (on average the same as his, say 80ms). Two lags for every action instead of one, and probably more in total: 160ms in this case. Of course, if these servers were the same ones your traffic would normally pass through, it would just be one half of the lag plus the other half, but that is not going to be the case... I was noting that the server-to-server portion of the lag would be minimal, at the cost of increased user-to-server lag.
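The same point as plain arithmetic, a trivial sketch using the post's example pings (not measurements):

```python
# Two-hop lag through a streaming datacenter vs. a direct connection.
# Ping values are the post's example figures.

direct_ping_ms = 100        # you <-> friend, conventional multiplayer

you_to_server_ms = 80       # you <-> streaming datacenter
friend_to_server_ms = 80    # friend <-> streaming datacenter
server_to_server_ms = 0     # same machine or datacenter: negligible

streamed_ms = friend_to_server_ms + server_to_server_ms + you_to_server_ms
print(f"Conventional: friend's action visible after ~{direct_ping_ms} ms")
print(f"Streamed:     friend's action visible after ~{streamed_ms} ms")  # 160 ms
```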
#19 |
|
I made the mistake of reading some of the comments in the original article. This one by Motorola29 in particular hurts my brain.
"Welcome to the apocalypse of gaming. Cripple the economy even more by shutting down any competition between sony, microsoft, and nintendo. Investing money into something you can't touch is a mistake. I'm glad to see our society is moving towards socialism. Now we can't even play a game anymore without some major corporation playing big brother and tracking everything we buy, sell, watch, and how we pay for it. I hope this thing fails for the sake of the game industry."

WTF? The first paragraph kills me, because this thing won't shut down competition between the three, and because people have been investing in Google relatively happily for a while now. The second bit suggests the guy doesn't know WTF socialism is. A corporation finding out how people purchase doesn't equate to public/state ownership of capital and production. So many ****ing people misuse "socialism" because they're too ****ing dim to know what it really is, and so they apply it to everything because the ****ing talking Republican heads are using it.