1. #1
    IImayneII's Avatar Senior Member
    Join Date
    Mar 2014
    Posts
    1,060

    30 fps "better?"

I am seriously starting to ask myself what Ubisoft is smoking... I thought technology was moving forwards, not backwards. Clearly not the case at Ubisoft...

The games industry is "collectively" moving away from 60fps.
    Share this post

  2. #2
    It was an interesting read.

^ Why can't everyone relax like me? I am sure that everything will be OK.
    Share this post

  3. #3
    BlueBadger400's Avatar Trials Community Specialist
    Join Date
    Nov 2013
    Location
    Winland
    Posts
    592
Fusion General discussion is not the right place to talk about other Ubisoft games or their graphical settings. Thread moved.
    Share this post

  4. #4
    TeriXeri's Avatar Senior Member
    Join Date
    Mar 2014
    Posts
    1,163
I can see 30fps working for certain genres if it allows boosting other things like object/view distances & effects.

Not so much for a game that works in milliseconds like Trials, or a fast-paced multiplayer shooter, as I can definitely see a difference between a 30fps video (YouTube) and a 60fps one (recorded locally).
(Might be a bit different in reality, as a YouTube video generally cuts every other frame, but still.)
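For a rough sense of that trade-off, here's a quick back-of-the-envelope sketch (my own illustration, not from any post in this thread) of the per-frame time budget at each target rate, i.e. how many milliseconds of rendering work 30fps frees up for effects and draw distance:

```python
# Rough frame-time budgets: milliseconds available per frame at a
# given target frame rate, and the extra budget 30fps buys over 60fps.
def frame_budget_ms(fps):
    return 1000.0 / fps

budget_30 = frame_budget_ms(30)  # ~33.3 ms per frame
budget_60 = frame_budget_ms(60)  # ~16.7 ms per frame
extra = budget_30 - budget_60    # ~16.7 ms freed for effects/distance

print(f"30fps budget: {budget_30:.1f} ms")
print(f"60fps budget: {budget_60:.1f} ms")
print(f"extra per frame at 30fps: {extra:.1f} ms")
```

In other words, a 30fps target roughly doubles the per-frame rendering budget, which is where the "more shiny" headroom comes from.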

But I see this more as a case of making games compatible with consoles rather than pushing the industry forward; we all know a single high-end GPU pretty much costs more than a console right now.

A complete mid-range "gaming" PC including a screen probably still costs more than €500, and that won't last many years.

It will be a while yet before we see games run in full glory on all the new innovations being pushed out, like 4K resolution at 120+ Hz; that would need a LOT more hardware than, say, 720p at 30fps. Even compared to 1080p at 60fps it's a huge step up.

RAM will greatly improve with the new DDR4 architecture, and CPUs are pretty much going 6/12/16-core soon (with cheaper-than-ever pricing), but GPU development/pricing doesn't seem to improve at the same rate, as high-end GPUs keep getting more and more expensive.
SSDs took a long time to drop in price per GB relative to speed; now they are getting cheaper and larger, but that took a few years to settle.
    Share this post

  5. #5
    IImayneII's Avatar Senior Member
    Join Date
    Mar 2014
    Posts
    1,060
    Originally Posted by BlueBadger400 Go to original post
Fusion General discussion is not the right place to talk about other Ubisoft games or their graphical settings. Thread moved.
Sorry for posting in the wrong thread; I thought I was in the general - general discussions.




Yes, 30fps can work for certain genres, but I am just amazed at how low the industry is willing to set the standard these days. There are already enough games that support 1080p/60fps, so when they release sub-par games and don't even officially support these settings, it feels more like a lack of optimization and a grab for more profit.
    Share this post

  6. #6
    En0-'s Avatar Trials Developer
    Join Date
    Jun 2011
    Posts
    1,775
If you go to 30fps instead of 60fps, you can do something more shiny, higher quality. That's a trade-off in one way or the other. There is no "one truth"; there is one truth per game.
    Share this post

  7. #7
    IImayneII's Avatar Senior Member
    Join Date
    Mar 2014
    Posts
    1,060
    Originally Posted by En0- Go to original post
If you go to 30fps instead of 60fps, you can do something more shiny, higher quality. That's a trade-off in one way or the other. There is no "one truth"; there is one truth per game.
While I can agree with that for some games, it is not something the industry should strive for, in my opinion. And the argument in the article that "you don't gain that much from 60 fps and it doesn't look like the real thing. It's a bit like The Hobbit movie, it looked really weird" makes no sense at all... Feels like that guy doesn't even know what he is talking about, or just wants to delude the masses.

I don't even know why he is comparing video game fps to a movie; it makes no sense... And 30fps is more cinematic? Lol, he can't be serious...


    He also says "If the game looks gorgeous, who cares about the number?" Most people care because 60fps makes games smoother and more responsive. Some people actually care about gameplay before graphics.


The only thing this article does is show how willing they are to lower quality and excuse it with an "it's more cinematic" / "shiny is better" argument, which doesn't even make sense. There are enough games already out there at 60fps/1080p that look gorgeous.

And to be honest, the only reasons I see for doing this are either making more profit, or that they mainly design their games for consoles nowadays (which can barely handle 60fps/1080p); with all the obvious PC ports these days, I don't see this being far from the truth.
    Share this post

  8. #8
    RetiredRonin's Avatar Senior Community Manager
    Join Date
    Sep 2012
    Location
    Ubisoft NC Office
    Posts
    7,343
    Wall of Text incoming!

    Also:

    The opinions expressed in this post are my personal opinion and do not necessarily reflect those of RedLynx or Ubisoft.

    Viewer discretion is advised.

    Originally Posted by bassline001 Go to original post
While I can agree with that for some games, it is not something the industry should strive for, in my opinion. And the argument in the article that "you don't gain that much from 60 fps and it doesn't look like the real thing. It's a bit like The Hobbit movie, it looked really weird" makes no sense at all... Feels like that guy doesn't even know what he is talking about, or just wants to delude the masses.

I don't even know why he is comparing video game fps to a movie; it makes no sense... And 30fps is more cinematic? Lol, he can't be serious...
30FPS IS more cinematic than 60FPS.

The Hobbit in 48fps HFR was considered a failure by critics. People did NOT seem to like it at all, which isn't that surprising, as most films have a frame rate of 24fps. When that rate is raised, people experience the Soap Opera Effect, which is pretty jarring for most viewers. Most soap operas (or daytime dramas) are filmed on 60fps digital cameras and broadcast at the same frame rate. While more "lifelike" and fluid, most viewers have been trained to treat hue-, saturation- and contrast-adjusted footage at 24fps as the standard; other frame rates look wrong, so people generally dislike them.

The reason The Hobbit was mentioned was that people complained it looked like it was filmed with a regular camcorder and not a "real camera", which, in turn, made them feel the production was cheap and tainted their fun. People with newer televisions can experience this at home: watch part of your favorite movie at its normal frame rate, then turn on your TV's motion interpolation. While I KNOW that the higher frame rate means a smoother picture, my previous conditioning has instructed me that the high frame rate looks artificial somehow.

The fluidity of 60+ FPS and the ability of PCs and consoles to produce some fairly striking visuals is also moving us into the Uncanny Valley. Short summary: as things look more real, our brains start REALLY picking out small issues with disproportionately high criticism, even disgust and revulsion.

    Originally Posted by bassline001 Go to original post
    He also says "If the game looks gorgeous, who cares about the number?" Most people care because 60fps makes games smoother and more responsive. Some people actually care about gameplay before graphics.
I can only speak from personal experience, but I've never had an issue playing games at 30fps. Each week when we stream, we play Trials at 30fps through OBS; it's barely noticeable to me (if you notice it more, that's cool; I don't notice it much anymore, but I'm north of 30 now...). I have no issue as long as the game is playable to my acceptable standard. Then again, I also game on a 47-inch TV from 8 feet away and can't notice a difference between 900p and 1080p, and can barely see the difference between 720p and 1080p.

I'm not sure if that is because it isn't noticeable, or because my eyes are bad. I've never really had eye trouble, and I am in fact writing this post on the same TV and can read my words perfectly well, but... I guess only an optometrist would really know.

    Originally Posted by bassline001 Go to original post
The only thing this article does is show how willing they are to lower quality and excuse it with an "it's more cinematic" / "shiny is better" argument, which doesn't even make sense. There are enough games already out there at 60fps/1080p that look gorgeous.
    Is it lowering quality, or is it the highest graphical fidelity with a smooth frame rate? Teams don't select "inferior" graphics options when they can achieve higher ones. That just doesn't make any sense to me.

    Originally Posted by bassline001 Go to original post
And to be honest, the only reasons I see for doing this are either making more profit, or that they mainly design their games for consoles nowadays (which can barely handle 60fps/1080p); with all the obvious PC ports these days, I don't see this being far from the truth.
    I don't want to say that it's easier to design games for consoles, but there is only ONE set of hardware for the Xbox One, and one set of hardware for the PS4. If you make a game and it runs perfectly well on your X1 and PS4, there's a pretty high likelihood that it will run the same on all XB1 or PS4 consoles. PCs have damned near limitless possible configurations. From the parts in my house I could make about 30 different configurations.

    I'm not sure how you think developers would expect to make more profit by lowering quality. I just don't understand the logic.

    The opinions expressed in this post are my personal opinion and do not necessarily reflect those of RedLynx or Ubisoft.

    Viewer discretion is advised.
    Share this post

  9. #9
    IImayneII's Avatar Senior Member
    Join Date
    Mar 2014
    Posts
    1,060
Saying 30fps is "more cinematic" doesn't make any sense, because it's comparing something new to what you have grown up with. It is not a technical argument; it's nostalgia. Frame rate alone doesn't make something feel more cinematic: lighting, motion blur, camera movement... all these things affect the feel of what you see. And just because some people didn't like it doesn't mean it's actually worse than lower frame rates. More blur while panning is not "better", in my opinion. It's like the people who complained HD had too much detail when it first came out... but now everyone wants HD.

And the 'cinematic' argument will not stand much longer, because very soon filmmakers will be capturing films at 60fps. Only nostalgic people will keep using this ridiculous argument.

    ------------------------------------------------------------------------------------------------------------------------------------
    And yes, companies will lower their standards to have better reviews, more sales, more profit...
    Insomniac Games did research a while back regarding framerates.
    They found that:

    *A higher framerate does not significantly affect sales of a game
    *A higher framerate does not significantly affect the reviews of a game

In particular, they found a clear correlation between graphics scores in reviews (where provided) and final scores, and no such correlation between frame rate and either the graphics scores or the final scores. There was no correlation between gameplay scores and final scores either; however, gameplay scores do appear to be influenced by graphics scores, i.e. better-looking games appear to be more "fun" to reviewers in general.

Their conclusions clearly reflect that their research wasn't about gameplay; it was about reviews and sales. That is also reflected in the way the poll was set up, which skewed the outcome.
    ------------------------------------------------------------------------------------------------------------------------------------


If devs want higher resolutions, why would they use 30fps, since it makes the screen blurry during movement and renders the extra detail useless? That doesn't make sense to me, to be honest. A higher frame rate gives you a better picture and less latency...

Saying you don't notice 30fps vs 60fps is highly subjective, and probably because you are used to playing games at 30fps. I have been playing at 60fps for a while now, and I am not eager to go back to 30fps. Even if you don't see the difference between 720p and 1080p, 60fps compared to 30fps at any resolution is a lot better. I would even prefer 60fps at 720p over 30fps at 1080p. And saying you barely see a difference between those resolutions undercuts the idea that graphics are more important than frame rate, since it is fps that actually makes gameplay better, not resolution.
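To put rough numbers on the responsiveness point (my own simplified illustration, not from the thread; the two-frame pipeline figure is an assumption, real engines vary):

```python
# Hypothetical sketch of why fps affects responsiveness: an input that
# just misses a frame is sampled on the next one, so the worst-case
# input-to-display delay is roughly a couple of frame intervals.
def worst_case_input_delay_ms(fps, pipeline_frames=2):
    frame_ms = 1000.0 / fps
    return pipeline_frames * frame_ms

print(worst_case_input_delay_ms(30))  # ~66.7 ms at 30fps
print(worst_case_input_delay_ms(60))  # ~33.3 ms at 60fps
```

Under that assumption, halving the frame rate roughly doubles the worst-case input delay regardless of resolution, which is why fps matters for feel in a way resolution doesn't.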

example videos


Yes, making games for consoles is probably a lot easier since there is only one hardware setup. But I don't really care how companies design their games for consoles, because I own a PC. And that doesn't mean PC gamers have to suffer because games are designed for consoles first. I bought a PC because I love the customization and higher settings. If the gaming industry is now collectively going back to only officially supporting 30fps, what is the point in playing on a PC, when 60fps is clearly attainable in PC gaming? That is the exact reason I bought one; if I wanted low resolutions with low fps, I would have bought a console. This shift back to 30fps should not be accepted, in my opinion. Especially not for PC gaming.



And well, it's not hard to see that companies these days (not the same as developers) mainly get their profits from hype and promotion. Just look at how much money is spent on promotion: in some cases almost, or even exactly, as much as it costs to create the games... and those are the games with high sales numbers.

Look at where the Wii U stands: barely any promotion was done, and this reflects in its sales. The Xbox One/PS4 have already surpassed the Wii U in sales, and the Wii U has been out a lot longer. Yes, this isn't purely down to promotion, but Nintendo even had to remind people the Wii U is not a peripheral but an all-new console.

Look at Watch Dogs: it promoted insane graphics in screenshots/demos but ultimately delivered a sub-par product at launch, while easily selling over 8 million copies.


The more promotion and hype you have these days, the more you will sell, even for products that don't deliver on their promises. That's just a fact.
    Share this post

  10. #10
Teri, the new GTX 970 is pretty good price/performance-wise and capable of 4K (although not maxed out, but is that even needed at 4K?) for only roughly 350 dollars, and it's a high-end beast. I'll probably build a PC with it soonish, maybe.

Concerning being more cinematic: I didn't notice much difference between The Hobbit and other movies, but I don't watch many movies at all.
If 30fps is done to be more cinematic, then why the hell aren't they using 24fps?

As for games, a high frame rate is way more important because they're interactive. Can people play at 30fps? Absolutely, in every genre, and on console I don't think most people will even notice.
Or maybe they'll notice but won't really know what exactly is wrong.
What's really important in games is smoothness and lag, which are closely related to frame rate. Today's graphics capabilities are getting way better, and high frame rates will become less necessary with new technologies like G-Sync, which allows a variable refresh rate and makes lag less noticeable.
So the only thing left will be input lag, which will only be very important for "professionals", and they will easily want stuff like 120fps.
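The variable-refresh point can be sketched with some simple arithmetic (my own illustration under stated assumptions; the function names are hypothetical, and real display pipelines have extra overheads): with fixed-interval vsync a finished frame waits for the next refresh tick, so its effective display time rounds up to a multiple of the refresh interval, while a G-Sync-style display shows it as soon as it's ready.

```python
import math

def vsync_display_ms(render_ms, refresh_hz=60):
    # Fixed-interval vsync: a frame that misses a refresh tick waits
    # for the next one, so display time rounds UP to a multiple of
    # the refresh interval.
    interval = 1000.0 / refresh_hz
    return math.ceil(render_ms / interval) * interval

def variable_refresh_display_ms(render_ms):
    # G-Sync-style variable refresh: the panel refreshes when the
    # frame is ready, so display time tracks render time.
    return render_ms

print(vsync_display_ms(17.0))            # a 17 ms frame occupies a ~33.3 ms slot
print(variable_refresh_display_ms(17.0)) # shown after ~17 ms
```

So a frame that barely misses the 16.7 ms budget costs a whole extra refresh under fixed vsync, which is exactly the stutter variable refresh removes.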

As for hype and marketing, I don't think the Wii U failed because Nintendo didn't throw enough money at marketing.
    Share this post