🛈 Announcement
Greetings! The Division forums are now archived and accessible in read-only mode; please go to the new platform to discuss the game.
  1. #21
    Licher.Rus, Senior Member
    Join Date
    Jan 2017
    Location
    Russia
    Posts
    937
    Originally Posted by CategoryTheory
    I think it's better thought of as, "Fast food chains serve what they serve; if you don't like it you shouldn't be patronizing them."

    Alternatives, such as the console versions of the games, are available. If you don't like the costs (monetary and otherwise) of using those instead, well, that's a choice you've made.

    It's fair to demand a certain level of quality from PC developers, but it really does sound like most people here are demanding a level of perfection that simply cannot be achieved on PCs in the real world.
    Fast food would be the equivalent of some indie project you bought on Steam on sale. And yes, there it's fine.

    Definitely not full (over)priced AAA games.
     1 person found this helpful

  2. #22
    CategoryTheory, Senior Member
    Join Date
    Feb 2015
    Location
    Tokyo, Japan
    Posts
    675
    Originally Posted by Licher.Rus
    Fast food could be equal to some indie project that you bought at Steam on sale. And yes, there it's fine.
    You misinterpreted the metaphor. It's not the game that's the fast food here, it's the platform at a particular price point. (Yeah, I realize the metaphor is strained by this point, but I didn't choose it.) If you want better-tasting food (fewer crashes), move from the fast food restaurant (PC) to a slightly more gourmet option (a console) and pay the price (controller instead of mouse, lack of graphics and control configurability, even more $$ if the game is an older release).

    Actually, I wouldn't be surprised if indie games (at least ones with reasonably competent developers) are generally more reliable than triple-A, simply because they usually try to do a lot less and so are less likely to bring out all the odd configuration problems in a system.

    Definitely not full (over)priced AAA games.
    This pretty much demonstrates the problem. On a platform where reliability is extremely difficult, people complain in the same breath that not enough money is being spent on reliability and that the game costs too much.

  3. #23
    YodaMan 3D, Senior Member
    Join Date
    Feb 2013
    Posts
    8,108
    Originally Posted by CategoryTheory
    All but the last sentence is quite correct. But it's not just about profits, it's also about what you're not doing when you're spending your time on bugs. Be honest, would you really be on board with Massive saying that there would be no new content or anything but bugfix updates for the next year because they'd decided to put their entire development team on bugfixes? Would you trade all the new content from the past year for a year of bugfixes? (I know I certainly wouldn't.)


    You could try getting together with your friends and raising a few hundred thousand dollars a year for some more developers dedicated to reliability and bugfixing.

    I am pretty firmly convinced that most gamers neither want to pay significantly more nor accept significantly fewer features in trade for more reliability. Like me, they prefer rolling the dice on a reasonably cheap game over spending a lot of money, and they just move on to the next game if they happen to get unlucky. (They may not say this, but they demonstrate it in how they actually behave.)

    Again, my usual disclaimer: this doesn't mean that Ubisoft and Massive couldn't be doing a better job with the resources they have. I'd bet dollars to doughnuts that their software development and release processes are not hugely better than industry average, which means that there's a lot of room for improvement. But no matter how much effort they put in and how much money they spend, they'll never get truly reliable software as long as their users insist on running their program along with essentially random collections of the game's dependencies and other software in what amounts to a completely uncontrolled environment (i.e., on their personal Windows PCs).
    1st, I would have serious questions if they had to put their full team on researching bugs. I would also add that even with their full team they haven't exactly been raining new content. They've supposedly been working on Avatar since TD, and they just took on Star Wars. I doubt their full team is even working on anything TD related.

    2nd, I know I would prefer not to spend any more money on broken content. They might be surprised, if they actually released content with minimal or no bugs, how much players would be willing to spend on games from a company that didn't treat bugs as acceptable.

    3rd, what's the difference between buying one expensive game you'd play for years with your friends, with the occasional DLC or upgrade, and spending the same amount, if not more, on cheap game after cheap game every week?

    4th, as long as players accept sub-par video games, the industry won't strive to improve. Why strive to remove bugs when players will accept them and make excuses for you?
     2 people found this helpful

  4. #24
    Merphee, Volunteer Moderator
    Join Date
    Nov 2016
    Location
    Discord: Merphee#2325
    Posts
    3,821
    Originally Posted by Robert-of-Hague
    Having the kind of troubles we had is of course a big problem for us customers but for the developers as well.

    I have been in the ICT-business for 40 years (now retired) and about 10 years of those 40 as a software engineer.
    If there is one thing I learned in all those years is that there is no such thing as bug-free software.
    Even in a small software module, let's say 100 lines of code, there is a high probability of a bug. Some bugs can be found with intensive testing, but some can only be found in the field.

    If you consider the amount of code in a game like Div-2, in combination with the amount of code in the different operating systems of all the platforms this game runs on, it would be a miracle of galactic proportions if it were ever bug-free.

    And to make it even worse: every time a bit of software is changed or added, new bugs are introduced. Testing doesn't solve this; even with intensive testing you can't find all the bugs. The only way for a software product to get even near a bug-free state is to never change anything. No hardware changes and no software changes.

    So people, get used to it.
    Welcome to the world of ICT.
    I think players are aware that software bugs are inevitable. Deadlines combined with many different complex systems - something is surely going to slip through no matter what kind of budget a game has.

    Regarding Division 2, I believe the concern comes from old bugs that still haven't been fixed, depending on their severity.
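    To make the quoted point about field-only bugs concrete, here is a toy sketch in Python (hypothetical code, not from any real game) of a function that passes every in-house test and still crashes on an input that only ever shows up in the field:

```python
def average_frame_time(samples):
    """Mean frame time in milliseconds over a list of samples."""
    # Looks obviously correct, and passes every "normal" test...
    return sum(samples) / len(samples)

# In-house tests only ever feed it realistic, non-empty sample lists:
assert average_frame_time([10.0, 20.0, 30.0]) == 20.0

# In the field, a machine with telemetry disabled hands over an empty
# list, and dividing by len([]) == 0 crashes the client:
#   average_frame_time([])  ->  ZeroDivisionError
```

    No amount of testing against the inputs you thought of protects you from the inputs you didn't, which is why some bugs only surface after release.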
     1 person found this helpful

  5. #25
    There are a lot of people at Massive and Ubisoft that are responsible for providing us (customers) a gaming experience that is commensurate with what was advertised, stable, and performs well within the stated system requirements.

    Every single one of these people is employed with job responsibilities aligned with that overall mission.

    Bugs happen. But there are entire frameworks that, used properly, prevent critical bugs from getting past development testing, past QA testing, and into production.

    Every software developer working these days knows about interdependencies: with third-party libraries, OSes, etc., and certainly with other products in the studio's portfolio.

    What happened here doesn't happen without people failing at their job responsibilities. The Massive and Ubisoft Connect product owners need to sit down and figure out how they failed -- and how not to do it again.

    That is what needs to happen. What doesn't need to happen is for the customer base to "learn" how they should temper their reasonable expectations.

    It is not unreasonable to expect two teams under the same studio to actually do their jobs. And if those teams in fact did not fall down on their job responsibilities -- then layers of management above them certainly did.
     5 people found this helpful

  6. #26
    Licher.Rus, Senior Member
    Join Date
    Jan 2017
    Location
    Russia
    Posts
    937
    Originally Posted by CategoryTheory
    You misinterpreted the metaphor. It's not the game that's the fast food here, it's the platform at a particular price point. (Yeah, I realize the metaphor is strained by this point, but I didn't choose it.) If you want better-tasting food (fewer crashes), move from the fast food restaurant (PC) to a slightly more gourmet option (a console) and pay the price (controller instead of mouse, lack of graphics and control configurability, even more $$ if the game is an older release).

    Actually, I wouldn't be surprised if indie games (at least ones with reasonably competent developers) are generally more reliable than triple-A, simply because they usually try to do a lot less and so are less likely to bring out all the odd configuration problems in a system.


    This pretty much demonstrates the problem. On a platform where reliability is extremely difficult, people complain in the same breath that not enough money is being spent on reliability and that the game costs too much.
    So the whole PC platform is kinda fast food to you?

    Wow, that's nice, even if you remember that the latest consoles aren't even close in price to even a mediocre PC.

    So, you're saying that going to a fast food place and buying spoiled food there at restaurant prices is fine?

    But I don't think that PC is fast food. What a strange idea...

  7. #27
    As1r0nimo, Senior Member
    Join Date
    Mar 2019
    Posts
    2,036
    Originally Posted by Sircowdog1
    Seems like there's a lot of that going around. People who are just now coming into the forums to make excuses for Ubi/Massive without knowledge of the full history of just how inexcusably sloppy the development and implementation of this specific game has been.

    Sure, in a general textbook sense, some of what they're saying is accurate. Bugs happen. No big deal. But the repeated failures of both the actual code of this game and the overall design theory... it's happened so often and so consistently that "bugs happen" doesn't work as an excuse anymore.
    Basically, you speak for the whole IT programmer community. Ok.
    This is fine then. Continue to ignore common sense.

  8. #28
    Sircowdog1, Senior Member
    Join Date
    Mar 2016
    Posts
    3,953
    Originally Posted by As1r0nimo
    Basically, you speak for the whole IT programmer community. Ok.
    This is fine then. Continue to ignore common sense.
    What is it lately with people taking something low-key and assuming it means some kind of broad, sweeping concept?

    I specifically said Ubi/Massive in regards to THIS game. Not the entire IT industry. Not all programmers everywhere. Not every single game in existence.

    How is it ignoring common sense to point out that the development of The Division 2 has been sloppy, filled with exploits and bugs far and above what would normally be encountered in a game with such a well-established dev team from a major publisher?

    I'll answer that for you: It's not. Not unless you're using Bethesda as your measuring stick. In which case I have to wonder what rock you've been living under where you think that bottom of the barrel development is standard.
     1 person found this helpful

  9. #29
    CategoryTheory, Senior Member
    Join Date
    Feb 2015
    Location
    Tokyo, Japan
    Posts
    675
    Originally Posted by Licher.Rus
    But I don't think that PC is fast food. What a strange idea...
    As I said, the metaphor is strained at best. It's not one I would have picked.

    But the key takeaway here is that if you think of a console as a platform, you can think of PCs as millions of subtly different platforms. (This is a bit broad, but captures the general idea.) This makes getting a certain level of reliability on PCs much more expensive than on consoles, so you would expect that, to achieve an equivalent level of reliability, the PC game would have to be significantly more expensive. That probably wouldn't fly from a marketing point of view (I can just imagine the screaming in this forum if Ubi said that all PC gamers had to pay even 20% more than console gamers for the same game), so they sell it for the same price and either put the same amount of work into reliability (thus getting less on the PC side) or cross-subsidise from console version sales.

    Maybe imagine it more this way. You run a cafe that sells coffee and tea. Coffee of the same quality as your tea costs you twice as much per cup, but the coffee drinkers, who refuse to drink tea, will scream their bloody heads off and boycott your shop if you charge more for a cup of coffee than for a cup of tea. So how do you handle this?

  10. #30
    It is natural that some game update causes a bug that leads to crashes for more than just a few people.
    It is natural that this bug gets corrected ASAP, or within a few days.
    It is NOT natural that this f***ing bug still exists and is nearly 2 months old!
     4 people found this helpful