Wii U Couldn't Run AC: Unity

simplyTravis

Lamer Gamers Podcast Co-Host
#51
Haha, reading all of this stuff about 1080p, 60fps, and GPUs reminds me of this video about iPhones vs. Android back when people would argue over GBs.

Note: I forgot how much rough language is in here...beware.


Just because you heard a game has to be 1080p and 60fps or it isn't "next-gen" doesn't mean it can't still be gorgeous! Look at Mario Kart 8! It is amazing to look at even without full-on anti-aliasing. Most people play on crappy cheap LCD TVs with terrible ghosting or processing that adds lag already, which makes most of this stuff pointless anyway. Ok, enough rant. I'm out. (drops mic)
 

Laer_HeiSeiRyuu

Well-Known Member
#52
It's just poorly disguised fanboys arguing with graphics whores who realize the PS4 and Xbox One aren't good deals, especially now that many of the benefits consoles used to have over PCs have disappeared. Now it's literally just convenience, and even then, PCs are far more convenient than they've ever been and consoles are more PC-like.

Kind of like Android vs. iOS arguments.

Customization vs interface
 

GaemzDood

Well-Known Member
#53
Ubisoft really needs to fire their awful PR department. Resolution is in no way tied to the CPU. Also, the AI is fucking terrible and most of the NPCs are set dressing, despite what their PR said. The Wii U could run this game at native 720p...if it were optimized. Though considering this game's code is a travesty on other platforms, it would probably run at a horrible framerate on Wii U too.

@Mike D., GPGPU setups are actually the future. I suggest you read this.

@Goodtwin, you do realize that if the PS4 used a more powerful CPU, it probably would've cost about as much as the Philips CD-i, right? Here's the thing: even if the CPU is bottlenecked, the way the PS4 is set up lets you use the GPU (which is pretty high end, so there's a lot of headroom) for CPU-related tasks. As for your comparison with tablet CPUs, the PS4 is leagues better than any tablet due to its memory bandwidth, bus bandwidth, memory type, etc. While it's true that its CPU frequency is only somewhat better than mobile devices', it makes up for that with everything else and Sony's primary focus on the GPU/APU setup. It's the most powerful $400 system ever. At worst, the PS4's CPU limits open-world titles with lots of interactive NPCs to 30 FPS, which isn't that bad.

Also, the PS4's CPU is better than the Cell in terms of actual gaming, as opposed to non-gaming/specialized programming. On top of that, the PS3's RSX chip was terrible at rendering loads of enemies with high polygon counts plus additional environmental effects. Games such as Dead Rising 2 (they rendered the same number of zombies and effects as the 360 and PC releases, hence why it's 1024x576 and runs abysmally), Ninja Gaiden Sigma 2 (they had to reduce the enemy count and enemy polygon counts and remove the exaggerated blood effects to get 60 FPS), Bayonetta (oh god, where do I begin), and Mafia II (no blood puddles and less dense foliage) proved this.

And oh yeah, even the most powerful current-gen home consoles struggle with 1080p / 60fps, and they're going to be around for a while.
The PS4 GPU doesn't struggle with 1080p and 60 FPS. Look no further than the fact that there are more 1080p/60 FPS AAA PS4 games than 1080p/60 FPS AAA Xbox One games.

This is partially why I built an AMD rig rather than an Intel one. The PS4 and X1 both have 8-core APUs, and my rig has the AMD FX-8320, which is also 8 cores, although without the embedded graphics chip. I figured that if and when developers tailor their games better for the current-gen platforms, it'll translate better to AMD-equipped PCs in the near future. That, and AMD FX chips are perfectly capable of CPU-intensive tasks to boot.



For eight years, developers were tailoring their engines to hardware that came out in 2005 and 2006. Those systems did not have DSPs or dedicated processors for the OS, let alone dedicated RAM set aside for the OS. And when it comes to CPUs, the 7th-gen consoles could not do out-of-order execution, whereas all three current-gen consoles, including the Wii U, can.

The CPU in the Wii U has plenty of power underneath, if developers are willing to work with it. Core for core, as I said earlier, it's as powerful as the A10 APUs in the PS4/X1; it simply has fewer cores. And rather than x86, the CPU is PPC, which is the same architecture family the Xbox 360 used. So with that last part in mind, when developers find it difficult to get games running or optimized well, it's their own damn fault. That's especially apparent when you factor in all the indie developers having little issue getting games running how they want. Sure, you could argue that most indie titles are not as graphically intensive, but it's not as though the feature sets and other assets are from 2004 either. Giana Sisters, for example, uses DX11-level features in its Wii U version, and so do a few other indie titles.

And then there's Nano Assault Neo, which used only one core of the CPU and, with barely any optimization, still ran at 60fps at 720p. Shin'en even said the game could've been optimized 40-50% more than it currently is.

NFS: MW is another prime example. Criterion looked at the Wii U's hardware, found out where it was good and bad, and adapted their engine to it. In the end, it clearly showed the Wii U was more capable than the PS3 and 360. Hell, Trine 2, a frickin' launch title, proved that back in 2012.

I could go on and on about this, and I did on IGN during the first year or so of the Wii U, but there were a select few who simply could not grasp it. One guy in particular kept saying how the PS4's CPU was so high-tech and revolutionary, and no matter what I said, he insisted the Wii U's CPU was minuscule compared to the "mighty" A10 APU. I gave up after a while because he just wouldn't listen. Oh, and he also said x86 was so much more advanced and ahead of the game compared to PPC, which is honestly very debatable. The issue is more that PPC is hardly used by anyone except Nintendo, so not many developers have their engines tailored to that architecture.

Ok, I'm done for now as we've turned this thread into a tech thread concerning hardware specs and capabilities.
The PS4 isn't using an AMD A10 CPU.
 

GaemzDood

Well-Known Member
#54
It's just poorly disguised fanboys arguing with graphics whores who realize the PS4 and Xbox One aren't good deals, especially now that many of the benefits consoles used to have over PCs have disappeared. Now it's literally just convenience, and even then, PCs are far more convenient than they've ever been and consoles are more PC-like.

Kind of like Android vs. iOS arguments.

Customization vs interface
Except PCs aren't even using GPGPU setups yet. You'll be waiting two years for that.
 

GaemzDood

Well-Known Member
#55
The truth is Ubisoft is tipping its hand about all the consoles. Talking about how 30fps is more cinematic than 60fps? That's fine and all, but gameplay is better at 60fps, so let's not act like 30fps is being chosen over 60fps by preference; they are choosing 30fps because at 60fps they wouldn't be able to get the visual fidelity nearly as nice. These PR statements are insane. They don't want to come out and tell people that they didn't buy cutting-edge hardware, but...you didn't buy cutting-edge hardware. The GPU powering the X360 was about a year ahead of its PC counterparts, not in outright flops performance, but in feature set for sure. The tri-core PPC CPU was also pretty darn powerful for 2005. These new consoles are not bringing that same cutting-edge tech with them this time, and people expecting 1080p/60fps with increased fidelity are in for disappointment. With a game like AC Unity, we are talking about a lot of downscaling to make it work, so I don't disagree that the Wii U wouldn't get the same experience, but that doesn't negate the fact that if the Wii U were selling a lot better, and third-party games were selling better on the platform, there would be a good chance Unity would have made it onto the console.
I take it you haven't seen my thread about the PS4's GPU?
 

GaemzDood

Well-Known Member
#56
Exactly, and it's sad that good gameplay and an overall good game concept seem to be so far down the priority list, much lower than visuals and presentation values. If your cutscenes are eating up fifty percent of your game's budget, something is wrong with your priorities.
Actually, all four of them are big priorities. This trend of outsider knowledge is bugging me.
 

EvilTw1n

Even my henchmen think I'm crazy.
Moderator
#61
The PS4 GPU doesn't struggle with 1080p and 60 FPS.
Except when it, y'know, does struggle to even keep 30 fps. Running an up-res of The Last of Us, or FIFA, or NBA 2K at those arbitrary benchmarks, though, isn't and shouldn't be taxing for it.
 

GaemzDood

Well-Known Member
#62
Except when it, y'know, does struggle to even keep 30 fps. Running an up-res of The Last of Us, or FIFA, or NBA 2K at those arbitrary benchmarks, though, isn't and shouldn't be taxing for it.
Assassin's Creed: Unity struggles to hold 30 FPS on PC if you use anything higher than basic FXAA. It's a poorly coded game. Shadow of Mordor is locked at 30 FPS and it's far more impressive.

Also, there are far more demanding games that target 60 FPS on PS4. Lots of them, including PlanetSide 2.
Oh come on.......a title screen? Please Gaemz come on brother. :mfacepalm:
My PC couldn't run the title screen of the first Witcher game.
 

mattavelle1

IT’S GOT A DEATH RAY!
Moderator
#63
Assassin's Creed: Unity struggles to hold 30 FPS on PC if you use anything higher than basic FXAA. It's a poorly coded game. Shadow of Mordor is locked at 30 FPS and it's far more impressive.

Also, there are far more demanding games that target 60 FPS on PS4. Lots of them, including PlanetSide 2.

My PC couldn't run the title screen of the first Witcher game.
Ok that's neither here nor there on the Witcher game. I owned RE5 for 360. LMAO I can't believe I'm talking about this but anyhow. The title screen did nothing that the Wii couldn't handle.
 

Goodtwin

Well-Known Member
#64
@GaemzDood

Yea, I understand that using an i5 processor would have upped the cost a lot, but that doesn't change the fact that the Jaguar is a low-end CPU, and developers can only offload so much to the GPU. Much of the CPU's workload does not map well to GPU compute, and it's still a programming technique in its infancy. The PS4 is a well-designed console, I'm not bashing it, but it's not high-end hardware either. Developers are already finding the limitations, and the progression that happened last gen won't be nearly as pronounced this gen. Improvements will be more subtle for sure.
 

GaemzDood

Well-Known Member
#65
Ok that's neither here nor there on the Witcher game. I owned RE5 for 360. LMAO I can't believe I'm talking about this but anyhow. The title screen did nothing that the Wii couldn't handle.
I actually did own a laptop that was on the same level as the Wii. It couldn't run the title screens of most AAA games without crashing.
@GaemzDood

Yea, I understand that using an i5 processor would have upped the cost a lot, but that doesn't change the fact that the Jaguar is a low-end CPU, and developers can only offload so much to the GPU. Much of the CPU's workload does not map well to GPU compute, and it's still a programming technique in its infancy. The PS4 is a well-designed console, I'm not bashing it, but it's not high-end hardware either. Developers are already finding the limitations, and the progression that happened last gen won't be nearly as pronounced this gen. Improvements will be more subtle for sure.
Low end =/= low level. Naughty Dog explains it really well.
 

Goodtwin

Well-Known Member
#66
He says it's better than Cell, and for most CPU tasks he would be right. But that's still comparing the Jaguar to the Cell; compare the Jaguar to even an i3 and its performance just doesn't stack up.
 

Shoulder

Your Resident Beardy Bear
#67
He says it's better than Cell, and for most CPU tasks he would be right. But that's still comparing the Jaguar to the Cell; compare the Jaguar to even an i3 and its performance just doesn't stack up.
This. What developers CAN do, however, is understand what sort of hardware they have and work with it to make their games better optimized for the job. If there's one thing consoles are very good at, it's optimization. Just look at games such as Halo 4, Uncharted 3, GOW: Ascension, and GTAV. Under PC circumstances, those games would have buckled under the specs the 360 and PS3 were running (and only 512MB of RAM), but thanks to great developers utilizing the hardware efficiently, we got great results. Playing GTAV on PS3 was a pleasant experience overall, and I had few qualms about how it performed.
 

Goodtwin

Well-Known Member
#68
Nintendo is really good at not only optimizing their software, but managing development right from the start and not overextending themselves early on. If you can fake certain effects, bake in shadows, use more efficient lighting and so on, then that's what you need to do to make sure you hit your performance target. If you are knee-deep in development and your game is running at 30fps when your target is 60fps, odds are there won't be enough optimizations left to close that gap. You have to be in control of your resources from the start. The majority of Nintendo's games aren't using overly sophisticated shader techniques, but the way Nintendo puts everything together ends up looking really nice.

Even in Bayonetta, the game runs at 60fps, and even though it looks nice, the lighting and mapping techniques are pretty limited. The shadows outside of characters are not real-time but baked, and light sources are minimal, with much of the lighting baked into the textures as well. That's how you hit a 60fps target, though. Bayonetta 2, on the other hand, uses much more advanced lighting techniques, the areas are typically much larger, and the game struggles to hold its 60fps target as strictly as the original did.
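To make the baking trade-off concrete, here is a minimal, purely illustrative C++ sketch. The light/texel structs and function names are made up for this post (they are not from Bayonetta or any real engine): the expensive per-light loop runs once offline, and the per-frame cost collapses to a lookup that no longer scales with the number of lights.

#include <cstdio>
#include <vector>

// Hypothetical data: a point light and a surface sample ("texel") in world space.
struct Light { float x, y, z, intensity; };
struct Texel { float x, y, z; };

// Direct lighting with a simple inverse-square falloff (no occlusion, for brevity).
float evaluateLighting(const Texel& t, const std::vector<Light>& lights) {
    float total = 0.0f;
    for (const Light& l : lights) {
        float dx = t.x - l.x, dy = t.y - l.y, dz = t.z - l.z;
        float distSq = dx * dx + dy * dy + dz * dz + 1e-4f;  // avoid divide-by-zero
        total += l.intensity / distSq;
    }
    return total;
}

// "Baking": run the expensive loop once, offline, for every texel in the lightmap.
std::vector<float> bakeLightmap(const std::vector<Texel>& texels,
                                const std::vector<Light>& lights) {
    std::vector<float> lightmap;
    lightmap.reserve(texels.size());
    for (const Texel& t : texels)
        lightmap.push_back(evaluateLighting(t, lights));
    return lightmap;
}

int main() {
    std::vector<Light> lights = {{0.0f, 5.0f, 0.0f, 100.0f}, {10.0f, 5.0f, 0.0f, 50.0f}};
    std::vector<Texel> texels = {{0.0f, 0.0f, 0.0f}, {5.0f, 0.0f, 0.0f}, {10.0f, 0.0f, 0.0f}};

    std::vector<float> lightmap = bakeLightmap(texels, lights);  // offline/tool-time step

    // At runtime the equivalent of this is just a texture fetch; the frame cost
    // no longer depends on how many lights existed when the map was baked.
    std::printf("baked value at texel 1: %.3f\n", lightmap[1]);
    return 0;
}

Real bakers also precompute occlusion and bounce light, which is exactly why the offline step can afford to be expensive while the 60fps runtime budget stays intact.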
 

GaemzDood

Well-Known Member
#69
He says it's better than Cell, and for most CPU tasks he would be right. But that's still comparing the Jaguar to the Cell; compare the Jaguar to even an i3 and its performance just doesn't stack up.
Doesn't really matter tbh. Consoles are becoming way less CPU dependent. Hell, I could even envision a CPU-free future for consoles. Also, the Jaguar is better than the Cell all around. The Cell was a glorified vector processor.
 

GaemzDood

Well-Known Member
#70
Nintendo is really good at not only optimizing their software, but managing development right from the start and not overextending themselves early on. If you can fake certain effects, bake in shadows, use more efficient lighting and so on, then that's what you need to do to make sure you hit your performance target. If you are knee-deep in development and your game is running at 30fps when your target is 60fps, odds are there won't be enough optimizations left to close that gap. You have to be in control of your resources from the start. The majority of Nintendo's games aren't using overly sophisticated shader techniques, but the way Nintendo puts everything together ends up looking really nice.

Even in Bayonetta, the game runs at 60fps, and even though it looks nice, the lighting and mapping techniques are pretty limited. The shadows outside of characters are not real-time but baked, and light sources are minimal, with much of the lighting baked into the textures as well. That's how you hit a 60fps target, though. Bayonetta 2, on the other hand, uses much more advanced lighting techniques, the areas are typically much larger, and the game struggles to hold its 60fps target as strictly as the original did.
Bayonetta 2 only really dipped during the prologue fight and a few cinematics. Other than that, it's extremely smooth.
 

Goodtwin

Well-Known Member
#71
Bayonetta 2 only really dipped during the prologue fight and a few cinematics. Other than that, it's extremely smooth.
I noticed dips throughout the game, but I am a framerate whore, so I tend to notice even minor ones. It's not like it's bad; it's just that the game probably averages something closer to 45fps than 60fps. During the battles, Platinum actually turns the area into a small arena: basically, you can no longer see the objects outside the battle area. I think they did this to maintain 60fps during the battles. It's clever, and a lot of people probably didn't even notice, but it's a smart way to reduce load and hold the framerate when the player is most likely to care.
 

EvilTw1n

Even my henchmen think I'm crazy.
Moderator
#72
Doesn't really matter tbh. Consoles are becoming way less CPU dependent. Hell, I could even envision a CPU-free future for consoles. Also, the Jaguar is better than the Cell all around. The Cell was a glorified vector processor.
Cell was a specialist, with a different design philosophy than what replaced it.

On the future of CPUs, that's anyone's guess. Mobile computing is driving modern CPU design. If a future console maker wants to use off-the-shelf parts in the next decade or so, they'll probably still use a CPU.
During the battles, Platinum actually turns the area into a small arena: basically, you can no longer see the objects outside the battle area. I think they did this to maintain 60fps during the battles. It's clever, and a lot of people probably didn't even notice, but it's a smart way to reduce load and hold the framerate when the player is most likely to care.
Huh. Yup, I didn't even really think about that. Very clever, Platinum.
 

GaemzDood

Well-Known Member
#73
I noticed dips throughout the game, but I am a framerate whore, so I tend to notice even minor ones. It's not like it's bad; it's just that the game probably averages something closer to 45fps than 60fps. During the battles, Platinum actually turns the area into a small arena: basically, you can no longer see the objects outside the battle area. I think they did this to maintain 60fps during the battles. It's clever, and a lot of people probably didn't even notice, but it's a smart way to reduce load and hold the framerate when the player is most likely to care.
Now that you say that, I've been thinking about a clever technique that could get open-world games to run at 60 FPS on the PS4 and Xbox One without being made by based Kojima. Simply load a large portion of the area you're in; once you leave that area, stop rendering it until you come back. It would cause some in-game area loading a la Oblivion on the 360, but it seems like a way to take stress off the CPU. That, or there's always the option to simulate a 60 FPS image while the game still plays with 33ms response times, a la that one Force Unleashed II 360 demo.

Also, to me, 45 FPS feels like 60 FPS with minor camera stutter, and I'm an all-around graphics/framerate whore.
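For what it's worth, the arena trick Goodtwin describes and the zone idea above boil down to the same bookkeeping: tag everything with a zone and drop whatever isn't in (or next to) the player's current zone from the update and render lists. A rough, hypothetical C++ sketch of that idea (the names and zone layout are invented for illustration; none of this is Platinum's or Ubisoft's actual code):

#include <cstdio>
#include <string>
#include <unordered_map>
#include <vector>

// Hypothetical world object tagged with the zone it lives in.
struct WorldObject {
    std::string zone;
    bool active;  // whether it gets simulated/rendered this frame
};

// Keep only the player's current zone (plus its neighbours) resident; everything
// else is skipped until the player comes back, which is where the CPU savings are.
void updateZoneCulling(const std::string& playerZone,
                       const std::unordered_map<std::string, std::vector<std::string>>& neighbours,
                       std::vector<WorldObject>& objects) {
    const std::vector<std::string>* adjacent = nullptr;
    auto it = neighbours.find(playerZone);
    if (it != neighbours.end())
        adjacent = &it->second;

    for (WorldObject& obj : objects) {
        bool keep = (obj.zone == playerZone);
        if (!keep && adjacent)
            for (const std::string& z : *adjacent)
                if (obj.zone == z) { keep = true; break; }
        obj.active = keep;  // inactive objects skip AI, physics and draw submission
    }
}

int main() {
    std::unordered_map<std::string, std::vector<std::string>> neighbours = {
        {"market", {"bridge"}}, {"bridge", {"market", "cathedral"}}, {"cathedral", {"bridge"}}};
    std::vector<WorldObject> objects = {
        {"market", false}, {"bridge", false}, {"cathedral", false}};

    updateZoneCulling("market", neighbours, objects);
    for (const WorldObject& o : objects)
        std::printf("%s: %s\n", o.zone.c_str(), o.active ? "active" : "unloaded");
    return 0;
}

The "load" half of the idea, streaming a zone's assets back in off disc before the player arrives, is the harder part in practice, and is roughly what the Oblivion-style cell loading mentioned in the post was doing.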
 

Shoulder

Your Resident Beardy Bear
#74
Now that you say that, I've been thinking about a clever technique that could get open-world games to run at 60 FPS on the PS4 and Xbox One without being made by based Kojima. Simply load a large portion of the area you're in; once you leave that area, stop rendering it until you come back. It would cause some in-game area loading a la Oblivion on the 360, but it seems like a way to take stress off the CPU. That, or there's always the option to simulate a 60 FPS image while the game still plays with 33ms response times, a la that one Force Unleashed II 360 demo.

Also, to me, 45 FPS feels like 60 FPS with minor camera stutter, and I'm an all-around graphics/framerate whore.
The only issue with games running at 45fps or so on a 60Hz display is that you introduce stutter, which makes the game chug a bit even if the framerate itself isn't that bad. This happens because the framerate is no longer in sync with the refresh rate of the display.

That's why games aim for 30 or 60fps: both divide evenly into 60Hz (and 120Hz). For some games it can be better to maintain a locked 30fps than a framerate that hovers between 45 and 60fps, although Bayonetta 2 is an exception. Slower-paced games can get away with 30fps with none of that stutter, while faster-paced games benefit from the higher framerate.

Nvidia does have its G-Sync displays, which basically sync the refresh rate to the framerate of the game, so a game running at 45fps would not stutter the way it does on a non-G-Sync display.
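A quick way to see the judder being described: at 45fps a new frame is ready every 22.2ms, but a vsynced 60Hz panel can only swap every 16.7ms, so some frames stay on screen for one refresh and others for two. This small, self-contained C++ simulation (plain arithmetic, no real display or API involved) prints that uneven delivery pattern:

#include <cmath>
#include <cstdio>

// Simulate presenting a steady 45fps render feed on a vsynced 60Hz display.
// Each finished frame is shown at the next 16.67ms refresh boundary, so frames
// end up on screen for a mix of one refresh (16.7ms) and two (33.3ms).
// At a locked 30fps every frame gets exactly two refreshes, which is why an
// even 30 can look smoother than an uneven 45.
int main() {
    const double refresh = 1000.0 / 60.0;  // 16.67 ms between vblanks
    const double frame   = 1000.0 / 45.0;  // 22.22 ms to render each frame

    double lastShown = 0.0;
    for (int i = 1; i <= 9; ++i) {
        double ready = i * frame;  // when the frame finishes rendering
        // Next vblank at or after 'ready' (tiny epsilon guards float round-off).
        double shown = std::ceil(ready / refresh - 1e-9) * refresh;
        std::printf("frame %d: ready %6.2f ms, displayed %6.2f ms, gap since previous %5.2f ms\n",
                    i, ready, shown, shown - lastShown);
        lastShown = shown;
    }
    return 0;
}

From the second frame on, the gaps settle into a repeating 16.7 / 16.7 / 33.3 ms pattern, which is the stutter in question; variable-refresh displays like G-Sync (and AMD's FreeSync) avoid it by making the panel wait for the frame instead of the other way around.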
 

GaemzDood

Well-Known Member
#75
The only issue with games running at 45fps or so on a 60Hz display is that you introduce stutter, which makes the game chug a bit even if the framerate itself isn't that bad. This happens because the framerate is no longer in sync with the refresh rate of the display.

That's why games aim for 30 or 60fps: both divide evenly into 60Hz (and 120Hz). For some games it can be better to maintain a locked 30fps than a framerate that hovers between 45 and 60fps, although Bayonetta 2 is an exception. Slower-paced games can get away with 30fps with none of that stutter, while faster-paced games benefit from the higher framerate.

Nvidia does have its G-Sync displays, which basically sync the refresh rate to the framerate of the game, so a game running at 45fps would not stutter the way it does on a non-G-Sync display.
It's really not that bad, honestly. If a game is presented like a 60 FPS title, generally runs around that range, and isn't running at a jerky unlocked framerate that sits in the 28-44 FPS range a la Dark Souls II on the 360 or both CoD games on Wii U, then the gameplay benefits and it still looks a lot smoother in spite of minor judder. I made a thread about it on IGN a while ago.
 