Wii U Couldn't Run AC: Unity

#1
--
https://www.redbull.com/en/games/stories/1331682984950/assassin’s-creed-unity-preview-and-interview

One platform you won’t be slaying counter-revolutionaries on with a friend, however, is the Wii U. While Nintendo’s console received the last two Creed instalments, and is getting a belated port of Watch Dogs, Ubisoft are dropping it for Unity, and with good reason, says Amancio. It just couldn’t handle the graphics, the scale and the vision of Unity.

“It couldn’t, it really couldn’t. I mean this is why we from the beginning, this was going to be a new-gen-only title, because the crowds aren’t aesthetic, they actually have impact. If we did anything to hinder that or to reduce that it would have a detrimental impact, it wouldn’t be the same experience. I don’t think that would be fair to fans, to sell the same game but with different levels of experience. Even the seamless nature of the series and the scale of the game right, we couldn't do that. We never load Paris. It wouldn't be possible, in our minds we’d be cheating fans by providing a lesser version of the same game.”
--

Well that's unfortunate. Looks like Nintendo really did skimp out when deciding what hardware to put in the Wii U. The fact that they had to charge $350 for this thing when it launched only a little less than 2 years ago probably says that they made poor design choices. It also says that they didn't adequately prepare for the future when designing the GamePad. Not a lot of accurate information has come out about what went on between Nintendo and third parties behind closed doors, but regardless, if the Wii U can't even run a late-2014 game up to par, I think it's going to be Wii-like third party support from here on out, which could be fine for us if Nintendo is able to keep putting out great first-party games, but real shitty for Nintendo, seeing as the Wii U isn't exactly selling itself like the Wii did.

I hope Nintendo has the capability to support the Wii U into 2017 at the very least, because the idea of a stop-gap console doesn't sound very appealing. As a college student who just wants to buy a console that I know will be supported for long enough, if Nintendo can't manage to at least land on its feet with the Wii U, I'm honestly just not going to be as compelled to get the next platform as soon as I did the Wii U.

Thoughts?
 

Ex-Actarus

Well-Known Member
#4
Maybe it's true, but then why isn't the other Assassin's Creed game, Rogue, coming to the Wii U?

And how come Watch Dogs was made for both last-gen and current-gen consoles?

Nah, I don't believe Ubisoft here. I just think it was a business decision, not a technical one...
 

EvilTw1n

Even my henchmen think I'm crazy.
Moderator
#5
Huh. This is really interesting. Because they're not talking about some cutting edge graphical wonkery. They're talking very specifically about the impact of the crowds.

If the crowds aren't just there for scale* (in which case you could just fog in or subtract people), we're talking about more complex AI. Which would indicate what would traditionally be a CPU-heavy game.
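To put toy numbers on that (my own back-of-the-napkin sketch; the per-agent costs are invented, nothing from Ubi's codebase):

[code]
// Toy illustration of why "crowds with impact" are a CPU problem,
// not a rendering one. All costs are made-up numbers.
#include <cstdio>

int main() {
    const double frame_budget_ms = 1000.0 / 30.0; // 33.3 ms per frame at 30fps

    // Hypothetical per-agent CPU cost per frame, in milliseconds.
    const double dumb_agent_ms  = 0.0005; // animation + collision only
    const double smart_agent_ms = 0.005;  // + pathfinding, reactions, scheduling

    for (int crowd = 1000; crowd <= 10000; crowd *= 10) {
        printf("%5d NPCs: dumb crowd %5.1f ms, 'impactful' crowd %5.1f ms (budget %.1f ms)\n",
               crowd, crowd * dumb_agent_ms, crowd * smart_agent_ms, frame_budget_ms);
    }
    // AI cost scales linearly with crowd size, so a 10x jump in per-agent
    // smarts eats the whole frame budget long before the GPU breaks a sweat.
    return 0;
}
[/code]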

The PS4 and One don't have powerful, exotic CPUs. Even on the PS4, which seems to be the undisputed performance champ this gen, devs have access to six of the eight cores of a pretty vanilla CPU. X86 relies more on raw power, PPC less so, fwiw, and the respective consoles are built as such. And as FuzzyWuzzyGames told us, X86 to PPC isn't freaking brain surgery.

So it's highly unlikely that the CPU is the excuse.

If we give them the benefit of the doubt, that this won't be like a traditionally programmed console game? They'd be relying on GPGPU. And again, these aren't terribly exotic GPUs we're talking about. As FWG told us, the feature set of the Wii U's GPU is comparatively modern, too. And as a modern GPU, general processing with it is part of the point (which we heard about endlessly a few years ago) - that's why it's paired with a "slower" CPU, just like PS4 and XB One. So I highly, highly doubt they've done some witchcraft GPGPU programming that is impossible to replicate on another device.

On top of all of that? Every single freaking time I've heard a developer talk about "next-gen artificial intelligence," they've let me down and been talking out of their ass.

Weak sauce, Ubi. Just say "it won't sell." It's the truth.

Sorry...didn't mean to write an article here.

[*From Hyrule Warriors, we already know the Wii U can handle rendering hundreds of dumb enemies attacking.]
I am very suspicious. If the Wii could play games like CoD4, then it is hard to believe that the Wii U cannot run AC Unity.
Also, this.
 

Ex-Actarus

Well-Known Member
#6
@Mike D., great insight and analysis!

Another PR fiasco day for Ubisoft:
- The Wii U hardware can't handle Unity (yeah... right o_O)
- Parity for Unity at 30fps / 900p on PS4 and Xbone (ouch!!!)
- The Crew slightly delayed... (probably for the better, with Horizon 2 and Driveclub around)

Not a good day at the office for Guillemot :rolleyes:...
 

Goodtwin

Well-Known Member
#7
It's all about the return on investment, and the investment to port Unity to Wii U would be much greater than it was for the previous two games, and we know they weren't happy with the sales of those ports. Like Koenig said, if the COD games could run and maintain the experience on Wii, then this could be done on Wii U without destroying the core experience. Would there be compromises? Sure, absolutely, but just as we've seen with countless cross-gen games, it can be done.

With all that said, Nintendo did invest a lot into the GamePad, and although I like it, it's hard to deny that if that money had been spent on building a console closer to X1 spec, the porting cost would have been reduced. The question is, if the Wii U had been comparable to the X1 in terms of performance, would third party games sell any better on the platform? Sales have been terrible for these games. For games that sell millions and millions of copies on the Xbox and PlayStation, it's hard to imagine them ever doing as well on a Nintendo platform, even if the userbase were about the same size. It really comes down to software sales. Ubisoft couldn't give a shit that the Wii U has only sold 7 million units; if their AC games had been selling a million-plus units every year, then Unity would get a port this year, even if compromises had to be made.
 

Trexio

New Member
#8
Honestly I wouldn't really care about Unity being on Wii U or not. I mean it'd be cool, but seeing as how it is a next gen game and it runs a lot on the consoles themselves I can see them not bringing Unity to WiiU and that's understandable. However, that does not mean that the Wii U is "under powered" for Rogue. Ubisoft may be bitching how Nintendo players don't really play AC on the Wii U and "waste" money on it, but I feel like making Rogue available for Wii U wouldn't be a problem if it wasn't a damn port. They really just need to make a good quality AC that's compatible for the Wii U. Financially I don't think it'd take too much of their money as Unity and Rogue on last and next gen would sell very well in my opinion so in that sense, making a Wii U version would be nothing to them but they simply won't have any of that anymore because they think we "love" Just Dance.
 

EvilTw1n

Even my henchmen think I'm crazy.
Moderator
#9
great insight and analysis!
Why thanks. But to be fair, I'm not the first person to talk to on tech matters. All info referenced isn't terribly hard to find (if you know where to look); those are just my best guesses based upon that very specific response from Ubi.
It's all about the return on investment, and the investment to port Unity to Wii U would be much greater than it was for the previous two games, and we know they weren't happy with the sales of those ports. Like Koenig said, if the COD games could run and maintain the experience on Wii, then this could be done on Wii U without destroying the core experience. Would there be compromises? Sure, absolutely, but just as we've seen with countless cross-gen games, it can be done.

With all that said, Nintendo did invest a lot into the GamePad, and although I like it, it's hard to deny that if that money had been spent on building a console closer to X1 spec, the porting cost would have been reduced. The question is, if the Wii U had been comparable to the X1 in terms of performance, would third party games sell any better on the platform? Sales have been terrible for these games. For games that sell millions and millions of copies on the Xbox and PlayStation, it's hard to imagine them ever doing as well on a Nintendo platform, even if the userbase were about the same size. It really comes down to software sales. Ubisoft couldn't give a shit that the Wii U has only sold 7 million units; if their AC games had been selling a million-plus units every year, then Unity would get a port this year, even if compromises had to be made.
Well, like you said, even with the porting cost reduced, I don't think it'd have made much of a difference. If Assassin's Creed averages about 250K sales, that's not enough. COD was different on Wii, because the games could push a million copies, give or take. If COD only pushed 200K copies, they would have stopped making it on Wii.

What gets me here is that Ubi response about the crowds. It just doesn't make sense. The Wii U can render a lot of onscreen NPCs. So the artificial intelligence has got to be doing something more computationally taxing than just milling around(*).

Even if Ubi had just said "well, the game would probably have to be sub-720p, and we don't want to do that," I'd understand. That would at least make sense.

*
That said, this is where I think it's also worth addressing the PS4 and One's headroom. I don't get how people are still treating these consoles like they've got nuclear-powered innards. They don't. They just don't. These aren't highly customized chipsets; even with a closed development ecosystem, they're still relatively inexpensive, mid-tier PC tech. No matter how you slice it, the performance gap isn't anywhere close to what it was last gen, and the respective architecture of each console is way closer now. Plus, all of the development resources expended on learning how to squeeze performance out of a highly customized CPU like Cell mean precisely dick now.

Not having to reprogram around a really exotic CPU like Cell already makes porting easier. It's not a matter of power and never has been. It's money. If a dev thinks they can make money, they will. That's why devs invested so much into figuring out the PS3's guts, even though it was expensive to do so. They don't think they can make money on Wii U in the same way they do on PS/XB, and they're right.

EDIT for spoiler 2:

One of the comments on the front page is what I'm talking about. I'm surprised people are still trying to use clockrate as a measuring stick to begin with - the megahertz myth isn't hard to google - but even if you do? It's still laughable.

"The CPUs have more cores."
Yes, cores that work far less efficiently. X86 is nowhere close to cutting edge.

"The PS4 GPU is 6x more powerful."
Which is a drop in the bucket compared to last gen (remember nVidia saying the PS3's GPU was 50x as powerful as the PS2's?). It's not negligible, but calling this gen's performance benchmarks a "wide hardware gap" is stretching the meaning of those words to the breaking point.
 

Shoulder

Your Resident Beardy Bear
#10
This whole nonsense has nothing to do with the capabilities of the system, and everything to do with whether the game sells well on a particular platform. ACIII, ACIV, SC:B, they all sold poorly on Wii U; no other way of saying it, really.

If the Wii U were the most successful of the three systems out right now, Ubisoft would have no problem adapting their engine to work with the Wii U's hardware. Devs did it back during the 6th gen for the PS2; it sure as hell could happen for the Wii U.

This also explains why AC: Rogue is not making its way onto the Wii U either when it's coming to the PS3 and 360. Horsepower has nothing to do with this.
 

Juegos

All mods go to heaven.
Moderator
#11
I'm all for making sure you don't compromise your game by forcing it to run on weaker hardware (and I wish devs would forget the PS3 and X360 exist already, to this end), but this sounds sketchy, and I won't believe it until I see it.

I'm with the others here, it's probably just about the revenue projections for a Wii U version not being appealing, but they can't come out and say it because god forbid their fans find out that Ubisoft is a business and not a magical funhouse factory.
 

Odo

Well-Known Member
#12
As you've all said, it doesn't make sense at all.

Plus, Ubisoft is even more confusing now. Are they skipping Wii U because "we don't buy mature games" or because it's weak?

I think Activision's attitude is much better than Ubisoft's. They just said "no new COD for U". That's fine. Ubisoft won't stop making excuses and pushing this idea that the Wii U is not in the same generation as the PS4/X1.
 

EvilTw1n

Even my henchmen think I'm crazy.
Moderator
#13
:mfacepalm: :mlaugh::mlaugh: :mfacepalm:

So I couldn't help it. Did a little googling on Unity and found this:

http://metro.co.uk/2014/10/07/assas...on-xbox-one-and-ps4-to-avoid-debates-4895590/

So that's...interesting.

One, the game is most likely locked at 900p / 30fps. Because these behemoth titans of next-gen power are...y'know...not exactly that.

Two, they're blaming it on - *drumroll* - the CPUs.

I hate to say I told ya so on the A.I., buuuuuut...

====
‘Technically we’re CPU-bound,’ he added. ‘The GPUs are really powerful, obviously the graphics look pretty good, but it’s the CPU [that] has to process the AI, the number of NPCs we have on screen, all these systems running in parallel.’

‘We were quickly bottlenecked by that and it was a bit frustrating, because we thought that this was going to be a tenfold improvement over everything AI-wise, and we realised it was going to be pretty hard. It’s not the number of polygons that affect the frame rate. We could be running at 100fps if it was just graphics, but because of AI, we’re still limited to 30 frames per second.’
====
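His "100fps if it was just graphics" line is easy to sanity-check. With CPU and GPU work running in parallel, the frame rate is set by whichever side finishes last. A quick sketch (my numbers, inferred from the quote, not Ubisoft's actual timings):

[code]
// Back-of-the-envelope check on the "CPU-bound" quote. With CPU and GPU
// work pipelined, frame time is set by the slower of the two. Numbers are
// inferred from "100fps if it was just graphics" vs. "limited to 30fps".
#include <algorithm>
#include <cstdio>

int main() {
    const double gpu_ms = 10.0; // rendering alone: ~100fps, per the quote
    const double cpu_ms = 33.3; // AI + NPCs + parallel systems: ~30fps

    const double frame_ms = std::max(gpu_ms, cpu_ms);
    printf("GPU-only: %3.0f fps\n", 1000.0 / gpu_ms);
    printf("Actual:   %3.0f fps (the CPU is the bottleneck)\n", 1000.0 / frame_ms);
    // Halving the GPU work buys nothing here; the CPU still caps you at 30fps.
    return 0;
}
[/code]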


(Really wish I had made that article now and rushed it out.)

We all read about devs griping on the Wii U's slow CPU, but this is the first time I can remember one saying the same sort of thing about the PS4 or One.

This also suggests that GPGPU is not a panacea like some have suggested for next-gen hardware. There may come a day where a synthesis occurs in processing units, but in the physical universe we inhabit, you still have to put that CPU to work. And don't let multiple cores fool you - the X86 processors of the PS and XB are modest.

So sure, sales on the Wii U are probably the reason for no AC this year, but it could also be because they're using all available resources to work on the CPU bottlenecks on PS/XB.
 

EvilTw1n

Even my henchmen think I'm crazy.
Moderator
#15
So Ubisoft is in a way kind of blaming the specs of the PS4 and Xbone.

That's funny; actually a guy from Crytek was kind of complaining about the PS4/Xbone specs as well.

Source: http://www.gamespot.com/articles/crytek-says-its-getting-increasingly-difficult-to-/1100-6422746/
Well, to be fair, you can always trust a dev who has pushed the boundaries of PC visuals to be disappointed in consoles. That's pretty much a given. Hell, he's complaining about PC GPUs that can't push 4K! Geez, Crytek. Most of the gaming populace is trying to get around to 1080p sets first. Adoption of 50"+ sets for Ultra HD is still a few years in the future, and the tipping point will be enough people buying a huge enough set, and sitting really, really close to it, to watch content that doesn't yet exist. And oh yeah, even the most powerful current-gen home consoles struggle with 1080p / 60 fps, and they're going to be around for awhile.

But this part is telling:

====
"As opposed to the times of the original Crysis, we as an industry have reached a quality level now where it is getting increasingly more difficult to really wow people."
====

Well yeah. It's called diminishing returns. Plenty of people here have argued about the extent to which it's happening, but it IS real. The visual bar is very, very high, and the higher it gets, the harder the improvements are to discern.
 

Ex-Actarus

Well-Known Member
#16
@Mike D. yes, I agree that the law of diminishing returns is definitely kicking in.

Having said that, I would say that most devs, especially Crytek, have pushed the technical side of things too much instead of focusing on the artistic one.

Last gen, games like Journey (an indie game) and Kirby's Epic Yarn (on the less powerful platform, the Wii) were praised for their incredible graphics! You might think that devs would learn from that... NOPE!

I mean, the most impressive-looking game I've seen this gen to this day is Zelda Wii U. I don't care about the technical specs or the amount of special effects. When Aonuma snapped his fingers and I saw that background, my jaw dropped to the floor!

Actually, Wii U games like Yoshi's Woolly World or Splatoon impressed me a lot. I'm pretty sure the artistic touch is the main reason why.

OK, Arkham Knight looks great, but that reveal can't even come close to the reveal of Arkham Asylum! Same for The Witcher 3: it looks gorgeous, but The Witcher 2 was just something else visually when it came out...

Diminishing returns indeed... So push the Artistic button, Crytek!


 

Goodtwin

Well-Known Member
#17
No surprise that developers are already finding the limits of the CPU on PS4 and X1; they simply didn't go with high-end CPUs this gen. What's funny is that some people are actually surprised: the Jaguar is a low-power CPU designed for tablets. They got away with it by going with 8 cores. Core for core, the PPC750 wouldn't get trounced by a Jaguar core, but the PS4/X1 have more of them.

If the developers truly are hitting the wall with the X1/PS4 CPU, then I suppose I can see a port to Wii U being a pretty troublesome task. For all we know, these NPCs may be maxing out 3-4 Jaguar cores. It seems like they were pretty focused on making them interactive this time, so I suppose I can see that not working too well on Wii U. Where the PR bullshit hits the wall is with AC Rogue. We know that could run on Wii U, but for financial reasons, it's not going to happen. Why even talk about Unity and Wii U when you know you're not even going to run the code for Rogue through a compiler and slap it on a disc like you did with AC3 and AC4?

Wii U and X1 are the only consoles this gen to really try and bring something other than prettier versions of last-gen games. Nintendo failed to win big with the GamePad, and Microsoft fell flat with Kinect. Sony is victorious, and honestly it's with a console that offers little more than a supercharged PS3. People can talk about big advancements in AI and physics, but the Jaguar is no i5 or i7, and it's the CPU that handles that load. GPGPU is no replacement for a good CPU. The latency of a GPU simply makes it a terrible match for things like AI.
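For anyone wondering why GPUs map so poorly onto branchy AI, here's a toy model (mine, not real GPU code): GPUs run threads in lockstep groups, and when threads in a group take different branches, the group ends up paying for every path.

[code]
// Toy model of branch divergence on GPU-style lockstep hardware.
// Not real GPU code; the costs are illustrative only.
#include <cstdio>

int main() {
    const int wave_size = 64;  // threads a GPU executes in lockstep
    const int path_cost = 10;  // cost of one branch of the AI logic

    // Uniform crowd: all 64 lanes pick the same behavior -> one path paid.
    int uniform_cost = path_cost;

    // "Impactful" crowd: lanes scatter across 8 behaviors (flee, gawk,
    // fight...) -> the lockstep group executes every divergent path in turn.
    int behaviors = 8;
    int divergent_cost = behaviors * path_cost;

    printf("wave of %d agents, uniform:   cost %d\n", wave_size, uniform_cost);
    printf("wave of %d agents, divergent: cost %d (%dx worse)\n",
           wave_size, divergent_cost, divergent_cost / uniform_cost);
    // A CPU core branches per agent with no lockstep penalty, which is
    // (part of) why GPGPU isn't a drop-in replacement for CPU-side AI.
    return 0;
}
[/code]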
 

Shoulder

Your Resident Beardy Bear
#18
No surprise that developers are already finding the limits of the CPU on PS4 and X1; they simply didn't go with high-end CPUs this gen. What's funny is that some people are actually surprised: the Jaguar is a low-power CPU designed for tablets. They got away with it by going with 8 cores. Core for core, the PPC750 wouldn't get trounced by a Jaguar core, but the PS4/X1 have more of them.

If the developers truly are hitting the wall with the X1/PS4 CPU, then I suppose I can see a port to Wii U being a pretty troublesome task. For all we know, these NPCs may be maxing out 3-4 Jaguar cores. It seems like they were pretty focused on making them interactive this time, so I suppose I can see that not working too well on Wii U. Where the PR bullshit hits the wall is with AC Rogue. We know that could run on Wii U, but for financial reasons, it's not going to happen. Why even talk about Unity and Wii U when you know you're not even going to run the code for Rogue through a compiler and slap it on a disc like you did with AC3 and AC4?

Wii U and X1 are the only consoles this gen to really try and bring something other than prettier versions of last-gen games. Nintendo failed to win big with the GamePad, and Microsoft fell flat with Kinect. Sony is victorious, and honestly it's with a console that offers little more than a supercharged PS3. People can talk about big advancements in AI and physics, but the Jaguar is no i5 or i7, and it's the CPU that handles that load. GPGPU is no replacement for a good CPU. The latency of a GPU simply makes it a terrible match for things like AI.

Interestingly enough, however, the Wii U has the CPU and GPU on the same package, so latency wouldn't be as affected compared to a CPU and GPU sitting far apart on the board. The PS4 and X1 have them on the same die. I think the issue with utilizing GPGPU to its fullest is that developers simply haven't adapted their engines to work well with it. In theory, GPGPU would allow you to run an entire game using just the GPU itself, with little to nothing being used on the CPU. There have been prototype demos showcasing the power of GPGPU, so it's definitely there. The Frostbite engine for Battlefield was very CPU-intensive, but the new Frostbite 3 is more GPU-intensive, so I believe it's better adapted to GPGPU.

With regards to the Wii U, it has a DSP for sound, so nothing has to run on the CPU (some very sound-intensive games will easily use one or more cores). Also, it has a separate processor, specifically an ARM core, that's used for the OS. In other words, the little 3-core PPC750 CPU can be used entirely for games. And when you factor in GPGPU, it should in theory allow more CPU-intensive games to run on it. The big issue of course is adapting a game engine to fully take advantage of the hardware, which not many have done.

That being said, Shin'en hope to accomplish that with FAST Racing Neo, and I think Slightly Mad plan to do that as well with Project CARS. But I'm talking about technical showcases here, not art styles like what @Ex-Actarus was bringing up. Remember this technical breakdown of Wind Waker?

http://www.polycount.com/forum/showthread.php?t=104415

The takeaway from this is that Nintendo love using trickery to get the desired effect, whereas a lot of developers prefer to just brute-force their way to make something work. I think this is part of Nintendo's philosophy as a whole: you can take things at face value and go on with your life, but when you dive deeper, you discover there's more going on and more thought put in than you'd expect.

I particularly love the shadow effect on the stairs from that thread. It isn't a shadow at all, but instead a darker shade of texture. Little things like that just fascinate me.
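For fun, here's that stair trick in miniature (my own sketch of the idea from the polycount thread, not Nintendo's actual shader):

[code]
// Toy version of the Wind Waker stair trick: the "shadow" is just a
// pre-darkened texel, so the runtime cost is zero compared to a real
// shadow-map lookup per pixel. Illustration only.
#include <cstdio>

struct Color { float r, g, b; };

// Expensive path: sample a shadow factor and attenuate, every frame.
Color shadeWithShadowMap(Color albedo, float shadowFactor) {
    return { albedo.r * shadowFactor, albedo.g * shadowFactor, albedo.b * shadowFactor };
}

// The trick: the artist bakes the darker shade into the texture itself,
// once, at authoring time. At runtime you just read the texel.
Color bakeShadowIntoTexture(Color albedo) {
    const float bakedTint = 0.6f; // hand-picked darker shade
    return { albedo.r * bakedTint, albedo.g * bakedTint, albedo.b * bakedTint };
}

int main() {
    Color stone   = { 0.8f, 0.75f, 0.7f };
    Color runtime = shadeWithShadowMap(stone, 0.6f);
    Color baked   = bakeShadowIntoTexture(stone); // same look, no runtime work
    printf("runtime: %.2f %.2f %.2f\n", runtime.r, runtime.g, runtime.b);
    printf("baked  : %.2f %.2f %.2f\n", baked.r, baked.g, baked.b);
    return 0;
}
[/code]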
 

Koenig

The Architect
#19
That is one thing I truly do despise about multiplatform/PC games; developers never bother to fully optimize their products for the systems they release them on, and it is only going to get worse as development costs continue to skyrocket.
 

EvilTw1n

Even my henchmen think I'm crazy.
Moderator
#20
No surprise that developers are already finding the limits of the CPU on PS4 and X1; they simply didn't go with high-end CPUs this gen. What's funny is that some people are actually surprised: the Jaguar is a low-power CPU designed for tablets. They got away with it by going with 8 cores. Core for core, the PPC750 wouldn't get trounced by a Jaguar core, but the PS4/X1 have more of them.

If the developers truly are hitting the wall with the X1/PS4 CPU, then I suppose I can see a port to Wii U being a pretty troublesome task. For all we know, these NPCs may be maxing out 3-4 Jaguar cores. It seems like they were pretty focused on making them interactive this time, so I suppose I can see that not working too well on Wii U. Where the PR bullshit hits the wall is with AC Rogue. We know that could run on Wii U, but for financial reasons, it's not going to happen. Why even talk about Unity and Wii U when you know you're not even going to run the code for Rogue through a compiler and slap it on a disc like you did with AC3 and AC4?

Wii U and X1 are the only consoles this gen to really try and bring something other than prettier versions of last-gen games. Nintendo failed to win big with the GamePad, and Microsoft fell flat with Kinect. Sony is victorious, and honestly it's with a console that offers little more than a supercharged PS3. People can talk about big advancements in AI and physics, but the Jaguar is no i5 or i7, and it's the CPU that handles that load. GPGPU is no replacement for a good CPU. The latency of a GPU simply makes it a terrible match for things like AI.
It's interesting to me, though, that the answer that both Sony and MS chose this gen for their CPUs was to just throw cores at the problem. I remember when the specs on these boxes leaked, people were so surprised about the RAM and the CPU cores - because most games on PC still don't take up that much RAM, and still don't use that many CPU cores (devs build around PCs having two or four CPU cores available, usually).

The OS overhead is one answer for it (almost instantly switching between tasks was a priority). The other, I guess, is that these X86 chips just aren't that powerful.

But if a PC version of a game is built with other CPUs in mind that aren't six-core affairs...well, it still makes me doubtful that Unity is impossible to run on Wii U. It's probably more that it'd require investing money to do it well, and the return on investment isn't there.

I'll believe all this amazingly advanced AI when I see it, though.
Interestingly enough, however, the Wii U has the CPU and GPU on the same package, so latency wouldn't be as affected compared to a CPU and GPU sitting far apart on the board. The PS4 and X1 have them on the same die. I think the issue with utilizing GPGPU to its fullest is that developers simply haven't adapted their engines to work well with it. In theory, GPGPU would allow you to run an entire game using just the GPU itself, with little to nothing being used on the CPU. There have been prototype demos showcasing the power of GPGPU, so it's definitely there. The Frostbite engine for Battlefield was very CPU-intensive, but the new Frostbite 3 is more GPU-intensive, so I believe it's better adapted to GPGPU.
I think you could be right on devs just not having their software tailored around GPGPU yet - console developers spent quite a few years learning how to wring performance out of the CPUs in the 360 and PS3 (and that experience isn't worth much right now).

But running everything on the GPU? That's getting into computing conjecture I'm ill-equipped to discuss, but it would seem to be beyond the current consoles. They may be built around more powerful GPUs, but they're still built around using those CPUs, too. And with the tools presently available, it makes you wonder if they can offload something like AI scripting to the GPU.

Then again...if Unity was built specifically with these consoles in mind? It suggests there's only so much you can do with GPGPU, and that a good ol' CPU is still better suited to some tasks.
 

Shoulder

Your Resident Beardy Bear
#21
It's interesting to me, though, that the answer that both Sony and MS chose this gen for their CPUs was to just throw cores at the problem. I remember when the specs on these boxes leaked, people were so surprised about the RAM and the CPU cores - because most games on PC still don't take up that much RAM, and still don't use that many CPU cores (devs build around PCs having two or four CPU cores available, usually).

The OS overhead is one answer for it (almost instantly switching between tasks was a priority). The other, I guess, is that these X86 chips just aren't that powerful.

But if a PC version of a game is built with other CPUs in mind that aren't six-core affairs...well, it still makes me doubtful that Unity is impossible to run on Wii U. It's probably more that it'd require investing money to do it well, and the return on investment isn't there.

I'll believe all this amazingly advanced AI when I see it, though.
This is partially why I built an AMD rig rather than an Intel one. The PS4 and X1 both have 8-core APUs, and my rig has the AMD FX-8320, which is also 8 cores, although without the embedded graphics chip. I figured that if and when developers tailor their games better for the current-gen platforms, it'll translate better to AMD-equipped PCs in terms of running games in the near future. That, and AMD FX chips are perfectly capable in CPU-intensive tasks to boot.

I think you could be right on devs just not having their software tailored around GPGPU yet - console developers spent quite a few years learning how to wring performance out of the CPUs in the 360 and PS3 (and that experience isn't worth much right now).

But running everything on the GPU? That's getting into computing conjecture I'm ill-equipped to discuss, but it would seem to be beyond the current consoles. They may be built around more powerful GPUs, but they're still built around using those CPUs, too. And with the tools presently available, it makes you wonder if they can offload something like AI scripting to the GPU.

Then again...if Unity was built specifically with these consoles in mind? It suggests there's only so much you can do with GPGPU, and that a good ol' CPU is still better suited to some tasks.
Developers spent 8 years tailoring their engines for hardware that came out in 2005 and 2006. Those systems did not have DSPs or dedicated processors for the OS, let alone dedicated RAM set aside for the OS. And when it comes to CPUs, the 7th-gen consoles could not do out-of-order execution, whereas all three current-gen consoles, including the Wii U, can.

The CPU in the Wii U has plenty of power underneath, if developers are willing to work with it. Core for core, as I said earlier, it's as powerful as the Jaguar cores in the PS4/X1; it simply has fewer of them. And rather than x86, the CPU is PPC, which is what the Xbox 360's architecture was. So with that last part in mind, when developers are finding it difficult to get games running or optimized well, it's their own damn fault. And this is especially apparent when you factor in all the indie developers having little issue getting games running how they want. Sure, you could argue that most indie titles are not as graphically intensive, but it's not as though the feature sets and other assets are from 2004 either. Giana Sisters, for example, uses DX11 features for the Wii U version, and so do a few other indie titles.

And then there's Nano Assault Neo, which used only one core of the CPU and wasn't optimized much at all, yet still ran at 60fps @ 720p. Shin'en even said the game could've been optimized 40-50% further.

NFS: MW is another prime example. Criterion looked at the Wii U's hardware, found out where it was good and bad, and adapted their engine to it. In the end, it clearly showed the Wii U was more capable than the PS3 and 360. Hell, Trine 2, a frickin' launch title, proved that back in 2012.

I could go on and on about this, and I did on IGN during the first year or so of the Wii U, but there were a select few who simply could not grasp it. One guy in particular continued to say how the PS4's CPU was so high-tech and revolutionary, and no matter what I said, he insisted the Wii U's CPU was minuscule compared to the "mighty" A10 APU. I gave up after a while because he just wouldn't listen. Oh, and he also said x86 was so much more advanced and ahead of the game compared to PPC, which is honestly very debatable. The issue is more that PPC is hardly used by anyone except Nintendo, so not many developers have their engines tailored to that architecture.

Ok, I'm done for now as we've turned this thread into a tech thread concerning hardware specs and capabilities.
 

Majorbuddah

My real name is Dolemite
#22
Conspiracy theories ITT. Sometimes a cigar is just a cigar. Why build a new, scaled down and crappier version of AC just for Wii U when everyone knows it'll sell like shit? Instead of blaming ubi and ea for these issues, it's (long past) time to start blaming Nintendo for making bad decisions with their hardware and business strategies. Been saying this since 2011, y'all. Iwata needs to go.
 

Majorbuddah

My real name is Dolemite
#23
I hope you guys recognize that Nintendo is already working on concepts for their next console, and under Iwata they'll just fuck it up again.
 

EvilTw1n

Even my henchmen think I'm crazy.
Moderator
#24
Conspiracy theories ITT. Sometimes a cigar is just a cigar. Why build a new, scaled down and crappier version of AC just for Wii U when everyone knows it'll sell like shit? Instead of blaming ubi and ea for these issues, it's (long past) time to start blaming Nintendo for making bad decisions with their hardware and business strategies. Been saying this since 2011, y'all. Iwata needs to go.
Sorry dude, either you're reading the wrong thread, or you're just not reading what's being posted. The bolded point is kind of what everyone here has been saying. We all acknowledge it won't sell, and that's why it isn't coming. If that's what Ubi came out and said, I certainly wouldn't say they're wrong. But blaming it on "advanced crowd AI" is pretty damn lame.

The whole thing has proven a bit educational on the CPU front, though...
I could go on and on about this, and I did on IGN during the first year or so of the Wii U, but there were a select few who simply could not grasp it. One guy in particular continued to say how the PS4's CPU was so high-tech and revolutionary, and no matter what I said, he insisted the Wii U's CPU was minuscule compared to the "mighty" A10 APU. I gave up after a while because he just wouldn't listen. Oh, and he also said x86 was so much more advanced and ahead of the game compared to PPC, which is honestly very debatable. The issue is more that PPC is hardly used by anyone except Nintendo, so not many developers have their engines tailored to that architecture.
Precisely. It's only news to anyone who thought X86 chips were super advanced based upon the number of cores on the spec sheet.
I hope you guys recognize that Nintendo is already working on concepts for their next console, and under Iwata they'll just fuck it up again.
The gaming R&D for all of the big three are all doing the same thing right now, yes. Whether any of them eff it up will be revealed at a later date.
 

Goodtwin

Well-Known Member
#25
The truth is, Ubisoft is tipping its hand about all the consoles. Talking about 30fps being more cinematic than 60fps? That's fine and all, but gameplay is better at 60fps, so let's not act like they're choosing 30fps over 60fps for artistic reasons; they're choosing 30fps because at 60fps they wouldn't be able to get the visual fidelity nearly as nice. These PR statements are insane. They don't want to come out and tell people that they didn't buy cutting-edge hardware, but... you didn't buy cutting-edge hardware. The GPU powering the X360 was about a year ahead of its PC counterparts. Not in outright flops performance, but in feature set for sure. The tri-core PPC CPU was also pretty darn powerful for 2005. These new consoles aren't bringing that same cutting-edge tech with them this time, and people expecting 1080p/60fps with increased fidelity are in for disappointment. With a game like AC Unity, we're talking a lot of downscaling to make it work, so I don't disagree that the Wii U wouldn't get the same experience, but that doesn't negate the fact that if the Wii U were selling a lot better, and third party games were selling better on the platform, there would be a good chance Unity would have made it onto the console.
 
#26
yes, I agree that the law of diminishing returns is definitely kicking in.

Having said that, I would say that most devs, especially Crytek, have pushed the technical side of things too much instead of focusing on the artistic one.

Last gen, games like Journey (an indie game) and Kirby's Epic Yarn (on the less powerful platform, the Wii) were praised for their incredible graphics! You might think that devs would learn from that... NOPE!

I mean, the most impressive-looking game I've seen this gen to this day is Zelda Wii U. I don't care about the technical specs or the amount of special effects. When Aonuma snapped his fingers and I saw that background, my jaw dropped to the floor!

Actually, Wii U games like Yoshi's Woolly World or Splatoon impressed me a lot. I'm pretty sure the artistic touch is the main reason why.

OK, Arkham Knight looks great, but that reveal can't even come close to the reveal of Arkham Asylum! Same for The Witcher 3: it looks gorgeous, but The Witcher 2 was just something else visually when it came out...

Diminishing returns indeed... So push the Artistic button, Crytek!
I agree with this, mostly because devs are always trying to make movie-quality games and it really bugs me. I mean, Bloodborne, Arkham Knight, even the new Rainbow Six impressed me, but I don't really care about graphical differences. I think that games should be treated more as an art form, like Nintendo and other devs have been doing. I'm almost disgusted with games trying to be more than what they are. Games are games, high-quality graphics or not. It always comes down to the gameplay.
 

Goodtwin

Well-Known Member
#27
I agree with this, mostly because devs are always trying to make movie-quality games and it really bugs me. I mean, Bloodborne, Arkham Knight, even the new Rainbow Six impressed me, but I don't really care about graphical differences. I think that games should be treated more as an art form, like Nintendo and other devs have been doing. I'm almost disgusted with games trying to be more than what they are. Games are games, high-quality graphics or not. It always comes down to the gameplay.
Exactly, and it's sad that good gameplay, and an overall good game concept in general, seems to be low on the priority list, much lower than visuals and presentation values. If your cutscenes are eating up fifty percent of your game's budget, something is wrong with your priorities.
 

Odo

Well-Known Member
#29
The truth is, Ubisoft is tipping its hand about all the consoles. Talking about 30fps being more cinematic than 60fps? That's fine and all, but gameplay is better at 60fps, so let's not act like they're choosing 30fps over 60fps for artistic reasons; they're choosing 30fps because at 60fps they wouldn't be able to get the visual fidelity nearly as nice. These PR statements are insane. They don't want to come out and tell people that they didn't buy cutting-edge hardware, but... you didn't buy cutting-edge hardware. The GPU powering the X360 was about a year ahead of its PC counterparts. Not in outright flops performance, but in feature set for sure. The tri-core PPC CPU was also pretty darn powerful for 2005. These new consoles aren't bringing that same cutting-edge tech with them this time, and people expecting 1080p/60fps with increased fidelity are in for disappointment. With a game like AC Unity, we're talking a lot of downscaling to make it work, so I don't disagree that the Wii U wouldn't get the same experience, but that doesn't negate the fact that if the Wii U were selling a lot better, and third party games were selling better on the platform, there would be a good chance Unity would have made it onto the console.
To sum it up: Ubisoft is just making excuses, nothing more.
 

EvilTw1n

Even my henchmen think I'm crazy.
Moderator
#30
I haven't been reading. Sorry for being a lazy shit poster.
Pshaw. :p
The truth is, Ubisoft is tipping its hand about all the consoles. Talking about 30fps being more cinematic than 60fps? That's fine and all, but gameplay is better at 60fps, so let's not act like they're choosing 30fps over 60fps for artistic reasons; they're choosing 30fps because at 60fps they wouldn't be able to get the visual fidelity nearly as nice. These PR statements are insane. They don't want to come out and tell people that they didn't buy cutting-edge hardware, but... you didn't buy cutting-edge hardware. The GPU powering the X360 was about a year ahead of its PC counterparts. Not in outright flops performance, but in feature set for sure. The tri-core PPC CPU was also pretty darn powerful for 2005. These new consoles aren't bringing that same cutting-edge tech with them this time, and people expecting 1080p/60fps with increased fidelity are in for disappointment. With a game like AC Unity, we're talking a lot of downscaling to make it work, so I don't disagree that the Wii U wouldn't get the same experience, but that doesn't negate the fact that if the Wii U were selling a lot better, and third party games were selling better on the platform, there would be a good chance Unity would have made it onto the console.
Heh, I hadn't read that "cinematic" line earlier.

I actually do kinda agree on the line about The Hobbit (that movie looked like shit in HFR, IMO - Rivendell looked like a Lisa Frank lunchbox), but videogames aren't movies. No Assassin's Creed game has looked like a movie because they aren't shooting actual humans through a camera at 24 fps and displaying it in a darkened theater with natural motion blur. So you might as well try to make a game look its best on its own terms, and 30 fps is perfectly workable for a third-person action game that isn't full of twitchy action, especially if you're trying to jam in as many effects as possible at a high resolution. Eye candy has a price, so they ought to just be honest about it.

More interesting is the comment on the rest of the industry.

"So I think collectively in the video game industry we're dropping that standard because it's hard to achieve, it's twice as hard as 30fps, and its not really that great in terms of rendering quality of the picture and the image."

Man, I wish Vessel of Light was still around here. He'd be handing out "told ya so's" and have every right to. Devs have had to pick between stability and eye candy in every gen since the dawn of 3D videogame worlds. In every generation, they've picked eye candy.

But anywho...the comments on GAF ought to be riotous about this.
 

Shoulder

Your Resident Beardy Bear
#31
Pshaw. :p

Heh, I hadn't read that "cinematic" line earlier.

I actually do kinda agree on the line about The Hobbit (that movie looked like shit in HFR, IMO - Rivendell looked like a Lisa Frank lunchbox), but videogames aren't movies. No Assassin's Creed game has looked like a movie because they aren't shooting actual humans through a camera at 24 fps and displaying it in a darkened theater with natural motion blur. So you might as well try to make a game look its best on its own terms, and 30 fps is perfectly workable for a third-person action game that isn't full of twitchy action, especially if you're trying to jam in as many effects as possible at a high resolution. Eye candy has a price, so they ought to just be honest about it.

More interesting is the comment on the rest of the industry.

"So I think collectively in the video game industry we're dropping that standard because it's hard to achieve, it's twice as hard as 30fps, and its not really that great in terms of rendering quality of the picture and the image."

Man, I wish Vessel of Light was still around here. He'd be handing out "told ya so's" and have every right to. Devs have had to pick between stability and eye candy in every gen since the dawn of 3D videogame worlds. In every generation, they've picked eye candy.

But anywho...the comments on GAF ought to be riotous about this.
GAF always has something to say. Just this morning, I checked out a thread concerning FAST Racing Neo, and someone actually complained about the screenshots, saying how Shin'en are reusing textures and other assets. Well, what other fucking developer DOESN'T do that?

It got better though, so I'll just link the thread for you to read. It's quite good. I thought someone said that when Shin'en said "smoke and mirrors," it meant they had to fake certain things to get it to work. Well, no shit, Sherlock. Every developer will use tricks to get the desired result. Developers hardly ever just brute-force their way to the desired look these days.

http://67.227.255.239/forum/showthread.php?p=132654551
 

Goodtwin

Well-Known Member
#32
GAF always has something to say. Just this morning, I checked out a thread concerning FAST Racing Neo, and someone actually complained about the screenshots, saying how Shin'en are reusing textures and other assets. Well, what other fucking developer DOESN'T do that?

It got better though, so I'll just link the thread for you to read. It's quite good. I thought someone said that when Shin'en said "smoke and mirrors," it meant they had to fake certain things to get it to work. Well, no shit, Sherlock. Every developer will use tricks to get the desired result. Developers hardly ever just brute-force their way to the desired look these days.

http://67.227.255.239/forum/showthread.php?p=132654551

Shin'en is reusing a ton of assets. In the scene with all the trees, if you look closely, it's basically just a few unique assets used over and over again. They're hiding this a little by tightly packing them together, and then arranging them at various angles to the camera. Shin'en is a group of five guys; this is to be expected. They don't have 100 people sitting in an office making nothing but assets for two years. Faking effects is normal, and if you can fake it and the player isn't likely to notice, that saves performance for other effects to better improve the game's overall aesthetics. Every game developer does this, and striking the correct balance is a highly desirable skill.

Developers are going to make the same decision this gen that they did last gen: resolution and framerate are not going to take priority over shader effects. It would not surprise me if a game comes out at 720p/30fps that is considered the best-looking game on either console. Resolution and framerate are expensive, and those resources typically go further on quality assets, lighting, shadows, and particle effects.
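The arithmetic backs that up. Pixels per second scale with both resolution and framerate, so the jumps are bigger than the marketing bullet points make them sound (simple pixel counting; real rendering costs aren't perfectly linear):

[code]
// Raw fill-rate arithmetic behind "resolution and framerate are expensive":
// pixels per second scale with both, so 1080p60 is 4.5x the work of 720p30.
#include <cstdio>

int main() {
    struct Mode { const char* name; long w, h; int fps; };
    const Mode modes[] = {
        { "720p30",  1280,  720, 30 },
        { "900p30",  1600,  900, 30 },
        { "1080p30", 1920, 1080, 30 },
        { "1080p60", 1920, 1080, 60 },
    };
    const double base = 1280.0 * 720.0 * 30.0; // 720p30 as the baseline
    for (const Mode& m : modes) {
        double pps = double(m.w) * m.h * m.fps;
        printf("%-8s %6.1f Mpix/s (%.2fx 720p30)\n", m.name, pps / 1e6, pps / base);
    }
    return 0;
}
[/code]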

That's why I think Nintendo could really blow people away with a 30fps game. Mario 3D World and Mario Kart 8 both run flawlessly at 60fps, and resource demands are cut in half when you go to 30fps. Expect to be impressed with Zelda: they will likely choose 30fps, and with the gained experience and knowledge of the console, they should have one impressive game on their hands. Even on the CPU side of things, Zelda SS was developed on a console that had 1/6th of the Wii U's CPU prowess. Expect Hyrule to be much more alive than in the past.
 

EvilTw1n

Even my henchmen think I'm crazy.
Moderator
#34
GAF always has something to say. Just this morning, I checked out a thread concerning FAST Racing Neo, and someone actually complained about the screenshots, saying how Shin'en are reusing textures and other assets. Well, what other fucking developer DOESN'T do that?
More importantly, what developer of a small-budget, closed-course racer doesn't reuse assets? Scenery will be flying past you in a blur. FFS, use your heads, GAF.
 

Shoulder

Your Resident Beardy Bear
#35
More importantly, what developer of a small-budget, closed-course racer doesn't reuse assets? Scenery will be flying past you in a blur. FFS, use your heads, GAF.
And in that thread, Shin'en specifically mentions they are using motion blur in the game to begin with, so you won't even notice the reused textures anyway. I tell ya, GAF, some days you have some brilliant discussions, but the majority of the time it's just full of derp.
 

EvilTw1n

Even my henchmen think I'm crazy.
Moderator
#36
And in that thread, Shin'en specifically mentions they are using motion blur in the game to begin with, so you won't even notice the reused textures anyway. I tell ya, GAF, some days you have some brilliant discussions, but the majority of the time it's just full of derp.
I have learned way more on tech from you guys here (and the interview that was done with FWG) than from just about anything I've read at GAF. Outside of guesstimating flops, the reason to visit GAF is for the meltdowns.
 

Shoulder

Your Resident Beardy Bear
#37
I have learned way more on tech from you guys here (and the interview that was done with FWG) than from just about anything I've read at GAF. Outside of guesstimating flops, the reason to visit GAF is for the meltdowns.
I honestly did get a lot of my know-how from GAF in those GPU and CPU die threads from a while back. There were some very intelligent guys on there, and I simply followed up some of their info by researching it myself. That being said, I'm hardly an expert on all the technical details. But as I said earlier, the Wii U's horsepower is fine. And quite frankly, I honestly think Nintendo have almost given up on AAA 3rd party developers, so they are making headway with indie devs, which I think is reminiscent of early Nintendo. They obviously want to train a lot of these people to make great games, so they provide input and help when needed.

To some people, Indie devs are not enough, but what's the difference between an Indie game, and one which comes from a AAA 3rd party dev? Budget, and that is it. If Indie games have taught us anything, it's that gameplay fun factor does not necessarily increase with your wallet size.

Don't get me wrong, I love some of the bigger-budgeted games out there, but what I've been saying for a couple of years now is that indie developers will become a driving force in this industry. Some will remain small, whereas others will become much bigger and create bigger games down the road. Call them the next generation of video game developers. That's my take.
 

EvilTw1n

Even my henchmen think I'm crazy.
Moderator
#38
I honestly did get a lot of my know-how from GAF in those GPU and CPU die threads from a while back. There were some very intelligent guys on there, and I simply followed up some of their info by researching it myself.
I really wish I could say the same. Outside of the flop guesstimates, I got zilch from them. There are some truly intelligent "insiders" at GAF, it's just that they are a very small part of the whole forum.

Completely agree with you on indies, though. Indies are stepping in to deliver what used to be the mid-tier releases ("A" or "AA"). In another decade, I can see indies leaving behind retro-themed, 8-and-16-bit arcade-y games en masse for deeper, dozen-hour plus adventures. I await the day when a dev like Ganbarion gives us Pandora's Tower II all by their lonesome, direct on the eShop. It's already happening, but it's going to get bigger.
 

Goodtwin

Well-Known Member
#39


As you can see, the Cell processor actually trumps the Jaguar when it comes to cloth physics calculations. I don't think there is any denying that the CPUs in the PS4 and X1 are budget processors. It's just kind of funny, for all the crap that the Wii U's CPU got, to then see these "next-gen" CPUs struggle to outclass the Cell. Granted, the Espresso CPU powering the Wii U would probably come up short on this test compared to even the X360, but not by a ton. Obviously the point of the article is that the GPUs can not only offload this work but do it extremely well, yet it still gives good insight into the performance of the Jaguar CPU. The Wii U can do this as well, just not nearly as well. The Wii U's GPU is a VLIW5 setup, while the PS4 and X1 use the more modern GCN architecture. VLIW5 is a fine setup for graphics rendering, pretty efficient, but for compute operations such as the one above, we wouldn't be talking nearly the same return. Perhaps not even enough to surpass the Cell.
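Cloth is the poster child for GPU compute because every particle update is the same small, branch-free math run over thousands of particles. A minimal sketch of the idea (mine, not the code from that article):

[code]
// Why cloth physics suits GPGPU: identical, branch-free math per particle,
// trivially parallel across thousands of elements. Toy Verlet integrator.
#include <vector>
#include <cstdio>

struct P { float x, y, px, py; }; // position + previous position (Verlet)

void step(std::vector<P>& cloth, float dt) {
    const float g = -9.8f;
    // The same few operations for every particle: ideal for wide hardware.
    for (P& p : cloth) {
        float vx = p.x - p.px, vy = p.y - p.py;
        p.px = p.x; p.py = p.y;
        p.x += vx;
        p.y += vy + g * dt * dt;
    }
    // (A real sim would also relax spring constraints between neighbors;
    // that's still uniform math, just a few more passes.)
}

int main() {
    std::vector<P> cloth(100000, P{ 0.f, 10.f, 0.f, 10.f });
    for (int i = 0; i < 60; ++i) step(cloth, 1.f / 60.f);
    printf("particle 0 after 1s of freefall: y = %.2f\n", cloth[0].y);
    return 0;
}
[/code]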

I will be interested to see how Watch Dogs turns out on Wii U. Ubisoft really did give themselves time on this one, so I think there is an outside chance that the Wii U build is actually competent. The Wii U has twice the available memory, and a more modern GPU that, regardless of theoretical flops, seems to outperform the PS3 and 360 GPUs, even if only slightly. Just the memory should make it much easier for the developers to have lots of NPCs running around. Remember how in Zelda: MM the 4MB RAM Expansion Pak was required? Why? Because of the AI scripts. When you look at these games, the AI scripts aren't that much more sophisticated than the ones in MM. OK, maybe that's stretching it a bit, but you get my point. Bottom line is this: if Ubisoft actually invested the 5 months into optimizing the Wii U build, it should at least have higher-res textures like in NFS: Most Wanted, and a framerate that sticks close to 30fps. If they can't hit that target, then they should have just quit before they ever started.
 

Goodtwin

Well-Known Member
#40
I really wish I could say the same. Outside of the flop guesstimates, I got zilch from them. There are some truly intelligent "insiders" at GAF, it's just that they are a very small part of the whole forum.

Completely agree with you on indies, though. Indies are stepping in to deliver what used to be the mid-tier releases ("A" or "AA"). In another decade, I can see indies leaving behind retro-themed, 8-and-16-bit arcade-y games en masse for deeper, dozen-hour plus adventures. I await the day when a dev like Ganbarion gives us Pandora's Tower II all by their lonesome, direct on the eShop. It's already happening, but it's going to get bigger.
Yep, you're going to see a resurgence of the medium-sized development studios. Your average development team size during the PS2/GC era was around 50 people; that wouldn't even cover the artists on today's teams. You're going to see some of these small indies grow, and we will see more mid-level releases from them over time. Child of Light was created by a small team over at Ubisoft; there is no reason an indie team of similar size couldn't create games like that. Or even 3D platformers and RPGs like Pandora's Tower and The Last Story. Development tools and game engines are actually making this easier. It only becomes a huge task when you're trying to create thousands of high-quality assets, but it's not nearly as bad if you're content with Pandora's Tower in HD. I still think High Voltage Software needs to embrace this, and release The Grinder across all platforms digitally.
 

Odo

Well-Known Member
#41
I really wish I could say the same. Outside of the flop guesstimates, I got zilch from them. There are some truly intelligent "insiders" at GAF, it's just that they are a very small part of the whole forum.

Completely agree with you on indies, though. Indies are stepping in to deliver what used to be the mid-tier releases ("A" or "AA"). In another decade, I can see indies leaving behind retro-themed, 8-and-16-bit arcade-y games en masse for deeper, dozen-hour plus adventures. I await the day when a dev like Ganbarion gives us Pandora's Tower II all by their lonesome, direct on the eShop. It's already happening, but it's going to get bigger.
Yep, you're going to see a resurgence of the medium-sized development studios. Your average development team size during the PS2/GC era was around 50 people; that wouldn't even cover the artists on today's teams. You're going to see some of these small indies grow, and we will see more mid-level releases from them over time. Child of Light was created by a small team over at Ubisoft; there is no reason an indie team of similar size couldn't create games like that. Or even 3D platformers and RPGs like Pandora's Tower and The Last Story. Development tools and game engines are actually making this easier. It only becomes a huge task when you're trying to create thousands of high-quality assets, but it's not nearly as bad if you're content with Pandora's Tower in HD. I still think High Voltage Software needs to embrace this, and release The Grinder across all platforms digitally.

I see a great future!

All the 3rd parties that ignore Nintendo (and that Nintendo ignores) aren't going to be as important as they are now.

The big guys of today were once small teams, and the indies of today are going to be the Capcoms and Atluses of the future.
 

Laer_HeiSeiRyuu

Well-Known Member
#42
I hope you guys recognize that Nintendo is already working on concepts for their next console, and under Iwata they'll just fuck it up again.
And how could they fuck it up? The Wii U's designed very well. It just isn't a good fit for the current climate of the industry, because muh corruption can't work with Nintendo, muh Wii U not selling, muh muh muh
 

Laer_HeiSeiRyuu

Well-Known Member
#43
All the people disappointed with the hyped-up critical flops of the year, starting with Titanfall, are acting like babies lol.

Meanwhile they're ignoring the platform that's been consistently great because "muh hardware preconceptions, muh library content, muh PS4 will have over twice as many games within 2 years, muh muh muh" (funny thing is, it doesn't even have half that software announced yet, and the tentpoles have been disappointing because the first party studios aren't putting out anything worth shit).

As long as game engines are still tailored toward last gen (cough Fox Engine cough), shit isn't gonna improve.

But that doesn't mean a thing regarding how fun your game is or how competent it is.
 

Juegos

All mods go to heaven.
Moderator
#45
Man, that Assassin's Creed Unity thing is really getting blown out of proportion.

It would have been better for Ubisoft to just let it go.

Source: http://www.neogaf.com/forum/showthread.php?t=913010
Holy crap. I see people here mention the fps and resolution wars, and I always think they're exaggerating and that it's really just a few people here and there that really bash the other console for the slight performance differences. But that thread is completely filled with that shit, there's so much butthurt over whether the PS4 could have had a slightly higher framerate or resolution if only Ubisoft would stop being "lazy" with their code optimization.

I mean, I get the whole taking-advantage-of-the-hardware aspect, but the difference would be so minuscule between the versions of the PS4 and Xbone, and for what purpose? It's not like PS4 fans are going to stop buying Ubisoft's games because they are 900p instead of 1080p (not that most of them would even notice).

What I see in that thread above, is Sony fans just salty they don't get something to gloat about to Xbone owners.

A semi-related image:

 

Goodtwin

Well-Known Member
#47
It comes down to a loud minority of people. The whole resolution and framerate showdowns are really only discussed and argued by the fanboys. Forums do not give an accurate representation of the gaming market. Look how many people here own games like COD, AC, and Batman for Wii U; the percentage of forum participants who buy third party games is far higher than in the Wii U userbase as a whole. It's the same with the X1 vs PS4 and the whole "parity" dilemma. PlayStation fanboys expect their console to win in all these DF showdowns, so if a developer chooses a target that both platforms can run, they feel let down.
 

Ex-Actarus

Well-Known Member
#48
I agree there is definitely a vocal minority here. But at the same time, Ubisoft, like other developers, is to blame here. They are the ones consistently bringing resolution into the conversation... 1080p here, 60fps there... Give me a break!

Nintendo, rightly so, has NEVER been carried away with that marketing bullshit. They just show their games, such as Mario 3D World, Wind Waker HD or Mario Kart 8. The games look great! Period! And nobody cares about resolutions or whatever other technical aspects...

Assassin's Creed Unity looks absolutely stunning... But now some gamers just keep the resolution in mind. And Ubisoft made things even worse with a succession of terrible communications on the resolution.
 

Ex-Actarus

Well-Known Member
#49

EvilTw1n

Even my henchmen think I'm crazy.
Moderator
#50
Assassin's Creed Unity looks absolutely stunning... But now some gamers just keep the resolution in mind. And Ubisoft made things even worse with a succession of terrible communications on the resolution.
It's the Digital Foundry-ization of the "console wars." But the gaming media is also at fault, reporting stuff like this as if it's really that important.
 