From Cinemablend: Wii U eDRAM bandwidth could be 563.2 GB/s or more
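
(For context, here is one way to arrive at that headline number. A minimal sketch, assuming the rumored 550 MHz GPU clock and a hypothetical 8192-bit eDRAM interface; neither is a confirmed spec.)

```python
# Back-of-envelope check on where a 563.2 GB/s figure could come from.
# Assumptions (speculation, not confirmed specs): eDRAM clocked at the
# rumored 550 MHz GPU clock, on a hypothetical 8192-bit internal interface.
clock_hz = 550e6                  # assumed clock
bus_bits = 8192                   # assumed interface width
bytes_per_cycle = bus_bits // 8   # 1024 bytes moved per cycle
print(clock_hz * bytes_per_cycle / 1e9)  # 563.2 (GB/s) -> matches the headline
```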

Yes, the Wii U's power supply can deliver that much power, but that doesn't mean the Wii U hardware is designed to draw that much. The original Wii's power supply could deliver far more power than the Wii ever demanded. You always get an oversized power adapter; they never cut it that close.

Yes, Sony and many others lied about how their games would look. It is funny how they fooled everyone with KZ Shadow Fall: it's rendering fewer pixels than true 720p, and in the process they convinced everyone that the PS4 is a monster, a true next-gen 1080p machine. LOL, I guess not, not if you want to push the fidelity. All of a sudden Ryse on X1 seems a little more impressive.

I will be honest: although I don't agree with a lot of Megefenix's theories, they have brought to my attention that the tech experts were making far too many assumptions based on how the ports were performing. Developers complained about the CPU, not the GPU, and yet the GPU was still targeted as the reason why ports were running worse on Wii U. It took a lot of reading over at Beyond3D, but that's truly the foundation behind the argument; it's the main reason the assumption moved from 320 SPUs to 160 SPUs. When you factor in just how much developers leveraged the VMX units on the Xenon and the SPEs in the Cell for graphics processing, it's not really a shocker that the Wii U doesn't blow them away with a more modern 352 GFLOPS GPU. The Expresso is a good general-purpose CPU for running traditional CPU code, but it can't come close to matching the SIMD performance of the VMX units in the Xenon or the SPEs in the Cell. So far, I honestly can't say I have seen a Wii U game that really surpasses games like Killzone 3 and The Last of Us, so I am inclined to believe that the RSX+Cell combo is most likely at least as capable as the Wii U's GPU. Keep in mind, the Cell can manipulate data to perform high-level DX effects.
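
(For what it's worth, the 352 GFLOPS figure and the 320-vs-160 SPU debate are tied together by simple arithmetic. A quick sketch, assuming the commonly cited 550 MHz clock and 2 FLOPs per shader ALU per cycle, i.e. one fused multiply-add; both are assumptions, not confirmed specs.)

```python
# How the SPU-count assumption maps to the GFLOPS figures in the thread.
clock_hz = 550e6     # assumed GPU clock
flops_per_alu = 2    # one fused multiply-add = 2 floating-point ops/cycle
for spus in (320, 160):
    gflops = spus * flops_per_alu * clock_hz / 1e9
    print(f"{spus} SPUs -> {gflops:.0f} GFLOPS")
# 320 SPUs -> 352 GFLOPS (the figure quoted above)
# 160 SPUs -> 176 GFLOPS (the revised Beyond3D estimate)
```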

Both the Cell and the Xenon were powerhouses when it came to running games, but hardware technology has changed a lot in the last few years. More work can be rendered or outright calculated on the GPU thanks to the general-purpose compute capabilities of modern GPUs, so the role of the CPU has been dwindling, and I think this is why Nintendo went with a more anemic, yet still capable, CPU. With the GPU able to do more of what the CPU normally would do, the need for a big CPU is reduced in the first place. Unfortunately, most developers have become so used to designing their engines for systems like the PS3 and 360 that the heavy role of the CPU remains prominent. The Wii U, PS4, and X1 all have CPUs that are designed to have less of a role in game design, because the GPU is able to do a lot more these days.
It would not surprise me if someday the CPU is ousted entirely and the GPU handles everything. In fact, some game demos/prototypes have already been designed to run only on the GPU. There should be videos on YouTube about it.
Eh. People said that about CPUs before. Lol.

The Wii U's CPU is actually pretty fine. But it's partly why they can't put Pikmin 3 online.
 

EvilTw1n

Even my henchmen think I'm crazy.
Moderator
With the GPU able to do more of what the CPU normally would do, the need for a big CPU is reduced in the first place. Unfortunately, most developers have become so used to designing their engines for systems like the PS3 and 360 that the heavy role of the CPU remains prominent.
Back in the old Wii U tech threads, this is one of the points that sort of emerged. Before, everyone thought, "well, the PS4/One/Wii U consoles aren't that different from PCs, so it shouldn't be hard to push them to their respective limits," but that assumes a developer is primarily used to building for a PC, not a console. All of the people who have been primarily working on console games are in the midst of a learning curve.
 

Goodtwin

Well-Known Member
The Cell practically doubled the PS3's processing power. The GPU was still better for the majority of graphics rendering tasks, but anything that benefited from SIMD instructions, the Cell and even the Xenon could crunch pretty well. CPU and GPU flops are not created equal, but you still get a ton of assistance from the Cell and Xenon that the Expresso can't deliver. The Expresso is going to be a lot better at serial code and branching code. Developers coded around the issues on the current-gen consoles. For the developers who build Wii U games from the ground up, I think Shin'en was right: it will be a much more straightforward process.
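
(To illustrate the serial-vs-SIMD point in a rough way, here is a toy sketch: the same per-vertex math done one element at a time versus as one wide, data-parallel pass, the style VMX/SPE-heavy engines leaned on. The numpy stand-in and the sizes are arbitrary choices for illustration, not a claim about any console's pipeline.)

```python
# Toy comparison: scalar, one-vertex-at-a-time math (the kind of serial code
# Expresso handles well) versus one wide data-parallel pass over the same data.
import time
import numpy as np

verts = np.random.rand(200_000, 3).astype(np.float32)  # arbitrary vertex buffer
scale = np.float32(0.5)

t0 = time.perf_counter()
serial = [(x * scale, y * scale, z * scale) for x, y, z in verts]  # one vertex at a time
t1 = time.perf_counter()
vectorized = verts * scale  # whole array in one data-parallel operation
t2 = time.perf_counter()

print(f"serial: {t1 - t0:.3f}s, vectorized: {t2 - t1:.3f}s")
```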
 
Yes, the Wii U's power supply can deliver that much power, but that doesn't mean the Wii U hardware is designed to draw that much. The original Wii's power supply could deliver far more power than the Wii ever demanded. You always get an oversized power adapter; they never cut it that close.

Yes, Sony and many others lied about how their games would look. It is funny how they fooled everyone with KZ Shadow Fall: it's rendering fewer pixels than true 720p, and in the process they convinced everyone that the PS4 is a monster, a true next-gen 1080p machine. LOL, I guess not, not if you want to push the fidelity. All of a sudden Ryse on X1 seems a little more impressive.

I will be honest: although I don't agree with a lot of Megefenix's theories, they have brought to my attention that the tech experts were making far too many assumptions based on how the ports were performing. Developers complained about the CPU, not the GPU, and yet the GPU was still targeted as the reason why ports were running worse on Wii U. It took a lot of reading over at Beyond3D, but that's truly the foundation behind the argument; it's the main reason the assumption moved from 320 SPUs to 160 SPUs. When you factor in just how much developers leveraged the VMX units on the Xenon and the SPEs in the Cell for graphics processing, it's not really a shocker that the Wii U doesn't blow them away with a more modern 352 GFLOPS GPU. The Expresso is a good general-purpose CPU for running traditional CPU code, but it can't come close to matching the SIMD performance of the VMX units in the Xenon or the SPEs in the Cell. So far, I honestly can't say I have seen a Wii U game that really surpasses games like Killzone 3 and The Last of Us, so I am inclined to believe that the RSX+Cell combo is most likely at least as capable as the Wii U's GPU. Keep in mind, the Cell can manipulate data to perform high-level DX effects.
Considering that Nintendo has aimed for performance and efficiency since the GameCube, I pretty much doubt they are going to waste so much power. Wasting 1 to 5 watts may be plausible, but 20 watts or more? That's very hard to believe. There would have been no reason to put so much efficiency into the power supply if you were going to throw away about a third of the power it can provide.
 

Goodtwin

Well-Known Member
Power supplies have always been oversized. Get over it. The Wii U, even when being pushed, won't be pulling over 40 watts. I'm sure somebody will test the power draw when Mario Kart 8 comes out. If that game only pulls 35 watts, then we will know for sure it's never going over 40 watts.


Just looked at my Wii's power supply: a Class 2, 52-watt power supply. The Wii pulls about 20 watts max.
 
Power supplies have always been oversized. Get over it. The Wii U, even when being pushed, won't be pulling over 40 watts. I'm sure somebody will test the power draw when Mario Kart 8 comes out. If that game only pulls 35 watts, then we will know for sure it's never going over 40 watts.


Just looked at my Wii's power supply: a Class 2, 52-watt power supply. The Wii pulls about 20 watts max.
Aren't there games on the console pulling 60 watts?

They did tests on GAF about that, you know.

They said that the console pulled relatively low amounts running games like Darksiders 2 and Deus Ex HR
 
Power supplies have always been oversized. Get over it. The Wii U, even when being pushed, won't be pulling over 40 watts. I'm sure somebody will test the power draw when Mario Kart 8 comes out. If that game only pulls 35 watts, then we will know for sure it's never going over 40 watts.
Just looked at my Wii's power supply: a Class 2, 52-watt power supply. The Wii pulls about 20 watts max.


In which games? Zelda: Twilight Princess? Wasn't that a GameCube port?
The Xbox 360 was already using most of the watts available from its power supply in a test with Gears of War. See here: http://www.hardcoreware.net/reviews/review-356-2.htm
"

First let's take a look at power consumption of all three consoles during video game play. I tried several games to find one that uses a good amount of power during play, but not excessively so. For the PS3, I used NBA Live 2K7. For XBOX 360, Gears of War was used
. For Wii, it was The Legend of Zelda: Twilight Princess
, and for the PC, PREY was used.

As I mentioned on the first page, games were loaded and played for at least 30 minutes, and the middle 30 minutes of data was used to determine the average, peak, and low wattage.

Without further ado:

"
See: 186 watts out of the 203 watts of the Xbox 360's power supply, and that's without counting the USB ports. The Wii test was done with Twilight Princess, which is just a GameCube port with the addition of motion controls, so it was to be expected to draw only a few watts. For a real test we would have to use Resident Evil: The Darkside Chronicles, The Last Story, or other games.
normal;">wii u already was said to draw an avrage of 45 watts in games by iwata himeself, and this article abouyt the efficiency of the power suply is good evidence, they auint gonna waste 20 watts or more when already 10 watts or less will be wasted and dissipated as heatplease, when making assuptions provide the full story, not half the convenient one for your statements. Wii power draw was tested with a gamecube port, what did you expect? wii is about 1.5 to 2x more powerful thsan gamecube, do the math and tell us the average power consumption for a real wii game like resident evil darside chronicles or last story

Power supplies have always been oversized. Get over it.
And Gears of War draws almost every bit of the watts found on the Xbox 360's power supply, but Gears of War isn't a simple port. So why wouldn't a real Wii U game, made from the ground up, be able to do the same when the power supply's efficiency is confirmed to be >=85%?
 
How much the console uses internally and how much the supply draws from the wall are two different things.
I know, but how much the Wii U uses internally also depends on the number of stream cores, TMUs, and other components that are under load during gameplay. Obviously the ports don't squeeze every component the Wii U has; for that you would need a ground-up game that makes sure to use all the shaders, TMUs, and everything else.
There are already 12 watts or a little less being dissipated as heat, and I doubt Nintendo would waste as much as 20 watts out of the 63.75 watts (85% of 75 watts) or more that the power supply can provide. If the Xbox 360 took advantage of most of what its power supply had available, I don't see why Nintendo wouldn't. Iwata also mentioned an average of 45 watts for games, so 53 watts, or a little less than that, isn't impossible considering both Iwata's statement and the power supply's efficiency.
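
(As a sanity check on that arithmetic, following the post's reading that the >=85% efficiency applies to the adapter's 75 W rating; whether the rating refers to input or output power is itself an assumption.)

```python
# Sanity check on the post's numbers: 85% of the 75 W rating, compared
# against Iwata's stated ~45 W average game draw.
rating_w = 75.0
efficiency = 0.85                       # claimed minimum adapter efficiency
deliverable_w = rating_w * efficiency   # 63.75 W under the post's reading
iwata_avg_w = 45.0                      # Iwata's stated average game draw
print(deliverable_w, deliverable_w - iwata_avg_w)
# 63.75 W deliverable, leaving ~18.75 W of headroom over the stated average
```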
 