Nintendo Switch Spec Thread

Cubits

Well-Known Member
Oh yeah, I never thought it would have 6 gigs (although that would push its game-available RAM past that of the Xbone!). I think it's more likely to be 2x2 primarily for the added bandwidth over the X1's standard single chip.

@Goodtwin I find Eurogamer's specs to be questionable because it doesn't make sense that the system would need active cooling if it was clocked so low in docked mode.

The Pixel C tablet runs the X1 SoC with the A57 cores at 1.9GHz, and even though it lacks the fan of the Shield TV, it seems to avoid thermal throttling.
If that's actually true, then why would Nintendo need to resort to a fan to handle the same SoC running at half the speed?! The aluminium casing of the Pixel might dissipate more heat than plastic, but the Switch also packs its own significant passive heatsink and has vents that let ambient airflow regulate the internal temperature. The Pixel is a closed box.

There has to be a reason to warrant that fan. Either the Switch is running a much less efficient processor than the Pixel, it's WAY worse at dissipating the lower amount of thermal output, or the Eurogamer numbers are wrong.
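The half-the-speed argument rests on how dynamic power scales with clock and voltage. A rough sketch of the standard CMOS relation (P ∝ f·V²) — the specific ratios below are purely illustrative, not Switch measurements:

```python
# Rough CMOS dynamic-power scaling: P is proportional to f * V^2.
# Ratios are illustrative only; real voltage/frequency curves are nonlinear.

def rel_power(freq_ratio, voltage_ratio):
    """Power relative to the full-clock baseline."""
    return freq_ratio * voltage_ratio ** 2

# Halving the clock alone halves dynamic power; if the lower clock also
# permits a lower core voltage (say 0.8x), the saving is larger still:
print(rel_power(0.5, 1.0))  # 0.5
print(rel_power(0.5, 0.8))  # ~0.32
```

Which is exactly why a fan at half the Pixel C's clocks looks strange on paper.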
 
Last edited:

Shoulder

Your Resident Beardy Bear
Oh yeah, I never thought it would have 6 gigs (although that would push its game-available RAM past that of the Xbone!). I think it's more likely to be 2x2 primarily for the added bandwidth over the X1's standard single chip.

@Goodtwin I find Eurogamer's specs to be questionable because it doesn't make sense that the system would need active cooling if it was clocked so low in docked mode.

The Pixel C tablet runs the X1 SoC with the A57 cores at 1.9GHz, and even though it lacks the fan of the Shield TV, it seems to avoid thermal throttling.


If that's actually true, then why would Nintendo need to resort to a fan to handle the same SoC running at half the speed?! The aluminium casing of the Pixel might dissipate more heat than plastic, but the Switch also packs its own significant passive heatsink and has vents that let ambient airflow regulate the internal temperature. The Pixel is a closed box.

There has to be a reason to warrant that fan. Either the Switch is running a much less efficient processor than the Pixel, it's WAY worse at dissipating the lower amount of thermal output, or the Eurogamer numbers are wrong.
Isn't throttling usually done because of too much heat? If so, that's one of the big reasons I see it as a concern for Nintendo, and hence why a fan was warranted. I'm actually inclined to believe that the numbers themselves are not technically wrong, but more just outdated. It'll definitely be interesting to see what happens once the embargoes get lifted and groups such as Digital Foundry get their hands on the system to examine it.
 

Goodtwin

Well-Known Member
Pixel C throttles significantly when gaming. It will reduce clocks to about 450MHz after about 30 minutes of a more demanding game. Also, running games on Android does not give full hardware utilization. Games on Switch will, and better hardware utilization means more heat. Even the Nvidia Shield TV throttles its GPU clocks up and down during benchmarks done by a member over at GAF. Everyone assumed that the clocks were locked on Shield TV, but that turned out to be false. The CPU clocks have to remain constant between portable and docked; you can't have game logic running slower in portable mode. The four A57 cores consume about 2 watts at 1GHz, and this makes sense based on the max power draw in portable mode being 6 watts, enough to drain the battery in 2.5 hours.

There is a phrase in mobile devices: race to sleep. Basically, mobile processors have impressive peak performance, but it's not sustainable. They are expected to work hard for short bursts and then clock way down, hence going to sleep.
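The battery estimate above can be sanity-checked with quick arithmetic. The ~16Wh battery capacity (4310mAh at 3.7V) is an assumption taken from commonly reported Switch specs, not from this post:

```python
# Sanity check of the portable power-draw estimate.
# Assumption (not from the post): the Switch battery is roughly
# 4310 mAh at 3.7 V, i.e. about 16 Wh, as commonly reported.

battery_wh = 4310 / 1000 * 3.7   # ~15.9 Wh
draw_watts = 6                   # estimated max portable draw

hours = battery_wh / draw_watts
print(f"Runtime at {draw_watts} W: {hours:.1f} h")  # ~2.7 h, close to the quoted 2.5
```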

Sent from my SM-G360V using Tapatalk
 
Last edited:

Shoulder

Your Resident Beardy Bear
Pixel C throttles significantly when gaming. It will reduce clocks to about 450MHz after about 30 minutes of a more demanding game. Also, running games on Android does not give full hardware utilization. Games on Switch will, and better hardware utilization means more heat. Even the Nvidia Shield TV throttles its GPU clocks up and down during benchmarks done by a member over at GAF. Everyone assumed that the clocks were locked on Shield TV, but that turned out to be false. The CPU clocks have to remain constant between portable and docked; you can't have game logic running slower in portable mode. The four A57 cores consume about 2 watts at 1GHz, and this makes sense based on the max power draw in portable mode being 6 watts, enough to drain the battery in 2.5 hours.

There is a phrase in mobile devices: race to sleep. Basically, mobile processors have impressive peak performance, but it's not sustainable. They are expected to work hard for short bursts and then clock way down, hence going to sleep.

Sent from my SM-G360V using Tapatalk
Yeah. Sustained vs. peak performance is key here; the latter is what smart devices rely on. Consoles, PCs, handhelds, etc. all rely on sustained performance for extended periods of time. And also, yes, it is very interesting to know the Shield TV even throttles its performance.

That should mean that, from the lack of throttling alone (provided no other changes to the chip), the Switch would still be as powerful as or more powerful than the Shield TV, despite the possible reduction in clocks.
 

Goodtwin

Well-Known Member
I have read comments from a developer over at Beyond3D really talking up just how beneficial a low-level API is compared to Android. He also spoke of the low-level optimization developers will spend months perfecting on dedicated gaming hardware, something they would never do for a small userbase like the Shield TV's. So yes, Switch will outperform Shield TV even with lower clocks.

Keep in mind that even the Jaguar cores in the PS4 don't run at their max clock speed (2GHz), and its thermal limitations are far less restrictive than the Switch's. Even the Shield TV looks roomy compared to the Switch after seeing the teardown.



Sent from my SM-G360V using Tapatalk
 

Goodtwin

Well-Known Member
http://www.eurogamer.net/?topic=digital_foundry

Looks like portable mode gets a new performance profile, bringing the GPU clock up from 307MHz to 384MHz. Docked performance has apparently stayed the same. Nintendo is still insisting that the standard profile be used whenever possible, most likely to save battery life, but this performance profile is available to developers.

I see this being useful for developers who have no intention of shooting for 1080p docked. I could see many developers with AAA titles choosing to render at 720p both portable and docked, and simply using higher settings in docked mode, perhaps with better AA. I could totally see this with a game like Steep from Ubisoft. Lowering settings for portable won't be an eyesore because of the small screen, and docked 720p will make it much easier to have near-PS4 visuals, albeit at 720p. It's a 25% bump in GPU grunt in portable, so we are now looking at 196 Gflops single-precision performance and 393 Gflops half-precision performance. Docked stays the same at 393 Gflops single-precision and 786 Gflops half-precision.
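Those Gflops figures follow directly from the clocks and the X1's 256 CUDA cores; here's a quick reproduction (the exact 307.2MHz/768MHz clocks are the commonly cited Switch figures, slightly more precise than the rounded numbers above):

```python
# Reproducing the Gflops figures: 256 CUDA cores, an FMA counted as
# 2 FLOPs per core per cycle, and FP16 running at twice the FP32 rate.

CORES = 256
FLOPS_PER_CYCLE = 2  # fused multiply-add

def gflops(clock_mhz, half_precision=False):
    g = clock_mhz * 1e6 * CORES * FLOPS_PER_CYCLE / 1e9
    return 2 * g if half_precision else g

print(gflops(384))               # ~196.6 -> portable single precision
print(gflops(384, True))         # ~393.2 -> portable half precision
print(gflops(768))               # ~393.2 -> docked single precision
print(round(384 / 307.2 - 1, 2)) # 0.25   -> the 25% portable bump
```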
 

Shoulder

Your Resident Beardy Bear
I was trying to remember: whatever happened with that rumor of the two AMD chips, one of them supposedly for a gaming application? I know some folks used that as evidence to suggest the Switch was going to use an AMD chip, but of course that never happened.

I also seem to recall that up until it was officially announced, there were some people thinking the Tegra chip was just a placeholder, and AMD would be the chip to use, or even something else.
 

Koenig

The Architect
Do you guys have any idea why the load times in Breath of the Wild are (still) so long on the Switch? They are slightly faster than on the Wii U, but still much longer than I thought a cartridge would take. I suspect this means it has more to do with how the hardware handles the data than with the data transfer itself...

Again, what do you guys think?
 

Goodtwin

Well-Known Member
I think some of it has to do with the game not really utilizing the extra RAM the Switch has. The game has to work with the 1GB the Wii U has, and with it being a direct port, it may not utilize the extra RAM, at least not to the extent you would think. I remember the same being true with 360 ports to Wii U: twice the memory, but we never saw much if any improvement because of it. I remember Splinter Cell having a higher-definition texture pack on the 360 because it could stream the textures from the hard drive, and I was like, WTF? Having an extra 512MB of RAM should have made this easily possible on Wii U.

Sent from my SM-G360V using Tapatalk
 

Goodtwin

Well-Known Member
http://techinsights.com/about-techinsights/overview/blog/nintendo-switch-teardown/

It was all but confirmed anyway, but they have determined that the Switch does indeed use a stock Tegra X1 processor. I know it's not sexy, but it makes a ton of sense. The performance and power consumption requirements all line up pretty damn well with Nintendo's priorities. Custom processors typically cost hundreds of millions to develop. In this case, Nintendo is simply buying Tegra X1's with no large R&D expense. My theory is that Nintendo received not only a stellar deal on the Tegra X1 itself, but that software support from Nvidia was part of the package. Developers are gushing over how easy the Switch is to develop for, and I am certain a lot of this is thanks to Nvidia's tools and API.

In the form factor the Switch is in, the TX1 is just about as good as it gets. You're only going to get so much power from a device that pulls 11 watts. Even sitting next to the already very compact Wii U console, the Switch is a tiny device by comparison. Dollars and cents, the stock Tegra X1 makes sense.
 

Shoulder

Your Resident Beardy Bear
http://techinsights.com/about-techinsights/overview/blog/nintendo-switch-teardown/

It was all but confirmed anyway, but they have determined that the Switch does indeed use a stock Tegra X1 processor. I know it's not sexy, but it makes a ton of sense. The performance and power consumption requirements all line up pretty damn well with Nintendo's priorities. Custom processors typically cost hundreds of millions to develop. In this case, Nintendo is simply buying Tegra X1's with no large R&D expense. My theory is that Nintendo received not only a stellar deal on the Tegra X1 itself, but that software support from Nvidia was part of the package. Developers are gushing over how easy the Switch is to develop for, and I am certain a lot of this is thanks to Nvidia's tools and API.

In the form factor the Switch is in, the TX1 is just about as good as it gets. You're only going to get so much power from a device that pulls 11 watts. Even sitting next to the already very compact Wii U console, the Switch is a tiny device by comparison. Dollars and cents, the stock Tegra X1 makes sense.
And of course, GAF is losing their fucking minds over this news. What is somewhat interesting is that nVidia said the chip is custom, and yet those folks at GAF say this proves it is not. Well, just for the sake of argument, isn't it possible to customize a chip internally in terms of how it operates? One thing some people were talking about is the use of the 4 A57 cores and the 4 A53 cores. Basically, they say the X1 can only use the A53 cores or the A57 cores, not all 8 at the same time (if I understand this right). And then my next question becomes: why? Is there something hardware-related that prevents that, or is it simply a software thing?

I only ask about the cores thing because I would think it would be possible for the A53 cores to be used for the OS stuff (which would be running in the background), and the A57 cores for games. And when you factor in that the Switch will be updated down the road to record video natively from the system itself (among other things), I doubt those A53 cores are just sitting there twiddling their thumbs. Nintendo would have asked nVidia to use those cores for something. I doubt there is really unused real estate going on here.

EDIT: Reading further, someone did bring up the question I asked, and someone else mentioned how it was more hardware-related. Obviously, the details of how this works are beyond me, so I don't know if it's possible to use both sets of cores at the same time under any circumstances.
 

Shoulder

Your Resident Beardy Bear
Dude, GAF loses their collective mind over anything. The meltdowns are the stuff of legend.

But, y'know, insiders.
What I do find rather amusing is that when you factor in everything, the X1 does make the most sense; not just from a graphical perspective, but from a business standpoint as well. nVidia had a slew of Tegra X1's lying around, and they sold them to Nintendo for a good deal. Nintendo buys them, alters some aspects (RAM and clock speeds, from the sounds of it), and adds its own custom stuff around the hardware itself (touchscreen, HD rumble, etc).

One of the more interesting tidbits I read from that GAF thread was that USB-C is rather expensive right now, so the 60-dollar price of the dock itself is not so farfetched, all things considered. Not to mention that a simple HDMI-to-USB-C adapter will not work on the Switch, and we do know there is a chip of some kind that communicates with the Switch to unlock "docked" mode, as well as recharge the battery.
 

theMightyME

Owner of The Total Screen
What I do find rather amusing is that when you factor in everything, the X1 does make the most sense; not just from a graphical perspective, but from a business standpoint as well. nVidia had a slew of Tegra X1's lying around, and they sold them to Nintendo for a good deal. Nintendo buys them, alters some aspects (RAM and clock speeds, from the sounds of it), and adds its own custom stuff around the hardware itself (touchscreen, HD rumble, etc).

One of the more interesting tidbits I read from that GAF thread was that USB-C is rather expensive right now, so the 60-dollar price of the dock itself is not so farfetched, all things considered. Not to mention that a simple HDMI-to-USB-C adapter will not work on the Switch, and we do know there is a chip of some kind that communicates with the Switch to unlock "docked" mode, as well as recharge the battery.
If it was $60 I wouldn't be SO offended, but it is $90.
 

Goodtwin

Well-Known Member
The A53 cores can only work independently from the A57 cores. So yes, it's not possible to use them heterogeneously. We have also seen documentation showing one A57 core reserved for the OS. The custom part was just PR. Yes, they customized the clock speeds, but that's it.

Sent from my SM-G360V using Tapatalk
 

theMightyME

Owner of The Total Screen
The A53 cores can only work independently from the A57 cores. So yes, it's not possible to use them heterogeneously. We have also seen documentation showing one A57 core reserved for the OS. The custom part was just PR. Yes, they customized the clock speeds, but that's it.

Sent from my SM-G360V using Tapatalk
Usually you can't run all cores in an octo-core chip because of heat... I remember a ways back when Samsung made their first octo-core, somebody rooted the phone and set all cores as active; the chip ran super fast for like a minute and then fried.
 

Shoulder

Your Resident Beardy Bear
One thing about that "custom" aspect of the X1: isn't the notion that the chip can run in both undocked and docked modes part of that? Was the chip really capable of that natively, or was the software modified to allow that to happen with the OS and API? Like we said, the RAM, the change in clocks (to prevent throttling), and the docked/undocked modes would mean nVidia isn't technically lying, even though the real estate on the chip itself doesn't appear any different from an X1 in a Shield TV.

From a business perspective, the word custom can mean anything from a total overhaul of the chip to something as simple as reducing clocks. It is therefore altered from its stock form, which is very true here.

But as usual, GAF (like ET said) is always losing their minds.
 

Shoulder

Your Resident Beardy Bear
I think a lot of the custom is also likely to be in the firmware, as I believe there was a statement about Nvidia working closely with Nintendo on that
Which is also what I think is the case here. Custom doesn't just mean alterations to the chip itself, which is what I'm hearing mostly about from people.

It'd be no different than Ferrari coming out with a special edition 488 GTB, but rather than fiddle with the engine or transmission, they instead did a bunch of software related things with traction control, the navigation system, the UI of the gauge clusters, etc, and calling it a custom Ferrari. Technically speaking, it would still be custom, but not custom as most would think.
 

Goodtwin

Well-Known Member
It's just PR. It costs lots of money to go custom, and to this day the Tegra X1 is a top dog in mobile graphics processing. A lot of expensive phones and tablets come with newer processors that still cannot surpass the Tegra X1.

Sent from my SM-G360V using Tapatalk
 

theMightyME

Owner of The Total Screen
It's just PR. It costs lots of money to go custom, and to this day the Tegra X1 is a top dog in mobile graphics processing. A lot of expensive phones and tablets come with newer processors that still cannot surpass the Tegra X1.

Sent from my SM-G360V using Tapatalk
those cpu cores are routinely bested by processors from apple, qualcomm, and samsung, but NOTHING compares to what the x1 is doing with its gpu cores on mobile... but that is also because heavy gpu ability is counter-intuitive to the primary needs of smartphones... the x1 was originally designed to be used in cars and non-mobile systems like tv set top boxes, both of which can afford greater power drain and heat... the only comparable use of an x1 to the switch is probably the google pixel-c... and that was designed as a laptop first, so still a bit different...

I think there likely is custom work on the chip, but it is probably more in the firmware and how the chip is designed to operate, part of which is the clock speeds... mobile chips tend to peak and then throttle, whereas the switch stays consistent... that is more than just a clock speed adjustment, that is an operation adjustment I would think... the x1 was also designed with android hooks in mind, and I imagine that was cleaned up for nintendo's much lighter OS... all of that should require firmware optimizations... and maybe even a few small changes to the hardware architecture
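That peak-then-throttle versus fixed-clock distinction can be put in a toy model. All the clocks and timings below are made-up illustrative numbers, not measurements of any real device:

```python
# Toy model of "peak then throttle" vs. a fixed console-style clock.
# All clocks and timings are illustrative, not measurements.

def tablet_clock(t_minutes, peak=1000, floor=450, throttle_after=30):
    """Run at peak until the chassis heats up, then throttle (MHz)."""
    return peak if t_minutes < throttle_after else floor

def console_clock(t_minutes, fixed=384):
    """Fixed clock chosen low enough to be held indefinitely (MHz)."""
    return fixed

# Average clock over a 60-minute play session:
tablet_avg = sum(tablet_clock(t) for t in range(60)) / 60
console_avg = sum(console_clock(t) for t in range(60)) / 60
print(tablet_avg, console_avg)  # 725.0 384.0
```

The tablet's average looks higher, but a game has to be designed for the floor it throttles down to; the console's lower fixed clock is something developers can actually count on every frame.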
 

Shoulder

Your Resident Beardy Bear
those cpu cores are routinely bested by processors from apple, qualcomm, and samsung, but NOTHING compares to what the x1 is doing with its gpu cores on mobile... but that is also because heavy gpu ability is counter-intuitive to the primary needs of smartphones... the x1 was originally designed to be used in cars and non-mobile systems like tv set top boxes, both of which can afford greater power drain and heat... the only comparable use of an x1 to the switch is probably the google pixel-c... and that was designed as a laptop first, so still a bit different...

I think there likely is custom work on the chip, but it is probably more in the firmware and how the chip is designed to operate, part of which is the clock speeds... mobile chips tend to peak and then throttle, whereas the switch stays consistent... that is more than just a clock speed adjustment, that is an operation adjustment I would think... the x1 was also designed with android hooks in mind, and I imagine that was cleaned up for nintendo's much lighter OS... all of that should require firmware optimizations... and maybe even a few small changes to the hardware architecture
Ultimately, people (and by people, I mean GAF) are split as to what this means. On the one hand, it's still the most capable handheld device; on the other, it's nowhere near as capable as the PS4 or Xbox One. But then you have to consider the fact that, again, this is a hybrid device. It is accomplishing the tasks of two separate devices: one that can be hooked up (docked, in this case) to your TV natively for comfy couch gaming on your big-ass TV, AND one you can take with you to bed, on the plane, to work, the bus, the train, a taxi, a funeral (the latter of which might be slightly frowned upon though), the list goes on.
 

Goodtwin

Well-Known Member
The bottom line is the Switch must operate under the limitations of a portable, because the form factor is as such. Even the dream that was the Tegra Parker still wasn't going to be as powerful as an Xbox One. It would have been closer, but ultimately still well short. We know the Tegra X1 cannot operate at max clocks without a nice big heatsink and cooling fan; the Shield TV throttles. So you have to assume that a Tegra Parker within such a device would also need to lower its clock speeds substantially from the theoretical max.

The Xbox One S shrunk its processor from 28nm to 16nm FinFET and still pulls about 60 watts. The Switch pulls less than 12 watts unless it's charging the battery. We are talking 1/5th the power consumption here. The idea that Switch was going to be neck and neck with the Xbox One was a pipe dream. I admit, I was buying into some of the hype, but it just wasn't reality. The Tegra X1 greatly surpasses the Wii U and other last-gen consoles, but falls well short of the PS4/Xbone.

If a developer wants to port games from those consoles to Switch, there will certainly be compromises, but the sentiment that the chasm is so large that ports are impossible is a complete joke. Call of Duty Black Ops and Modern Warfare 3 on Wii are proof, and that was even tougher, because the Wii didn't support programmable shaders at all. If there is money to be made, ports will happen. If you're concerned about western third-party support, the specs thread is the wrong place to frequent; for that, you would need to keep a close eye on the sales thread. Switch continues to sell well, so get ready for Call of Duty to make a return to a Nintendo console.
 

theMightyME

Owner of The Total Screen
The bottom line is the Switch must operate under the limitations of a portable, because the form factor is as such. Even the dream that was the Tegra Parker still wasn't going to be as powerful as an Xbox One. It would have been closer, but ultimately still well short. We know the Tegra X1 cannot operate at max clocks without a nice big heatsink and cooling fan; the Shield TV throttles. So you have to assume that a Tegra Parker within such a device would also need to lower its clock speeds substantially from the theoretical max. The Xbox One S shrunk its processor from 28nm to 16nm FinFET and still pulls about 60 watts. The Switch pulls less than 12 watts unless it's charging the battery. We are talking 1/5th the power consumption here. The idea that Switch was going to be neck and neck with the Xbox One was a pipe dream. I admit, I was buying into some of the hype, but it just wasn't reality. The Tegra X1 greatly surpasses the Wii U and other last-gen consoles, but falls well short of the PS4/Xbone. If a developer wants to port games from those consoles to Switch, there will certainly be compromises, but the sentiment that the chasm is so large that ports are impossible is a complete joke. Call of Duty Black Ops and Modern Warfare 3 on Wii are proof, and that was even tougher, because the Wii didn't support programmable shaders at all. If there is money to be made, ports will happen. If you're concerned about western third-party support, the specs thread is the wrong place to frequent; for that, you would need to keep a close eye on the sales thread. Switch continues to sell well, so get ready for Call of Duty to make a return to a Nintendo console.
EXACTLY!

and as you said, the wii didn't support programmable shaders at all, and the switch probably has a more modern shader set than the xbone and ps4 do, even with less power...

the idea that games cannot be ported, or even that the compromises are that significant, is just kind of stupid... graphics have been good enough since the 360... now it is all like "darn, it runs at a slightly lower resolution and isn't AS stunningly gorgeous, while still being stunningly gorgeous"

something that grows even more apparent when you look at nintendo's bread and butter, which is more cartoony games
 

simplyTravis

Lamer Gamers Podcast Co-Host
A lot of people are surprised that the Switch can put out what it can with "just a Tegra X1". I think what a lot of people don't realize is how games run on Android vs the Switch. Android runs apps in a system of virtual environments (which is how it can run on nearly anything), and that makes for an incredibly lossy system that will never punch as hard as the hardware will allow. The Switch should get really low-level access without having to go through all of that emulation-like junk Android does. I'm sure that is what Nvidia means by customizing. I would still rather have a new chip, but it is interesting to see how truly powerful the Tegra X1 is when unleashed.
 

Shoulder

Your Resident Beardy Bear
A lot of people are surprised that the Switch can put out what it can with "just a Tegra X1". I think what a lot of people don't realize is how games run on Android vs the Switch. Android runs apps in a system of virtual environments (which is how it can run on nearly anything), and that makes for an incredibly lossy system that will never punch as hard as the hardware will allow. The Switch should get really low-level access without having to go through all of that emulation-like junk Android does. I'm sure that is what Nvidia means by customizing. I would still rather have a new chip, but it is interesting to see how truly powerful the Tegra X1 is when unleashed.
A new chip would be fine, but if the cost-to-benefit ratio is not there, there's little point in trying to go with something better. And if you think about it, there really isn't anything else out there that can match the X1 in terms of sustained performance, with the exception of the X2, but that is only just starting to make the rounds. Given the timing of the system launch, the partnerships, and so on, the Tegra X1 makes the most sense from both a practical and a business standpoint. Nintendo, I think, made the right call.
 

DarkDepths

Your friendly neighbourhood robot overlord
A lot of people are surprised that the Switch can put out what it can with "just a Tegra X1". I think what a lot of people don't realize involves how it works with Android vs Switch. Android is made of a system of virtual environments (which is how it can run on nearly anything.) It makes for an incredibly lossy system that will never punch as hard as the hardware will allow. The Switch should get really low-level access without having to go through all of that emulation-like junk Android does. I'm sure that is what Nvidia mean by customizing. I would still rather a new chip, but it is interesting to see how truly powerful the Tegra 1 is unleashed.
Not that you are wrong in principle, but to be clear, Android apps don't necessarily *have* to run in what you've called "virtual environments". They can get the same access to the metal that any software running on a PC or Switch can.

I think the biggest difference is just reliability of expectations. On the Switch, you know what features you have to work with because every Switch is the same. You also know what percentage of resources you have access to. On Android, you never know what the user's device will support, or what kind of resources you'll have access to. You might end up on a high-end Galaxy that's bogged down by a million background services sending your current location to Walmart.
 

Goodtwin

Well-Known Member
Nvidia themselves ported Tomb Raider 2013 to Shield TV. Digital Foundry showed a solid 30fps, but with the typical frame-pacing issues common to 30fps games on Android. A decent example of what to expect from down-ports to Switch.


Sent from my SM-G360V using Tapatalk
 

Odo

Well-Known Member
A question for you guys, about the size of the world in a game. Of course, BotW is already enormous, but Nintendo knows how to do it the right way.

The way Ubisoft programmes a world like Watch_Dogs 2's, for instance: can the Switch handle a port of a game like that without too much development effort?
 

Goodtwin

Well-Known Member
A question for you guys, about the size of the world in a game. Of course, BotW is already enormous, but Nintendo knows how to do it the right way.

The way Ubisoft programmes a world like Watch_Dogs 2's, for instance: can the Switch handle a port of a game like that without too much development effort?
Huge open-world games ran on the 360 and PS3, so I see no reason why they can't port a game like Watch Dogs 2 to Switch. Will there be obvious compromises in fidelity, resolution, and draw distance? Sure, and a lot of these ports will probably look more like a 360 title than a PS4 one, but the core game will be intact.

Sent from my SM-G360V using Tapatalk
 

Odo

Well-Known Member
Huge open-world games ran on the 360 and PS3, so I see no reason why they can't port a game like Watch Dogs 2 to Switch. Will there be obvious compromises in fidelity, resolution, and draw distance? Sure, and a lot of these ports will probably look more like a 360 title than a PS4 one, but the core game will be intact.

Sent from my SM-G360V using Tapatalk
Yes, but why has Ubisoft stated that the Wii U couldn't handle Assassin's Creed Unity's world? What I got back then is that the problem with the Wii U wasn't only fidelity, resolution, and graphics, but also loading too many things like NPCs, assets, elements, behaviours, etc.

Since the Switch is just a more powerful Wii U, wouldn't a game like Watch Dogs 2 be hard for it to handle as well?
 

Shoulder

Your Resident Beardy Bear
Yes, but why has Ubisoft stated that the Wii U couldn't handle Assassin's Creed Unity's world? What I got back then is that the problem with the Wii U wasn't only fidelity, resolution, and graphics, but also loading too many things like NPCs, assets, elements, behaviours, etc.

Since the Switch is just a more powerful Wii U, wouldn't a game like Watch Dogs 2 be hard for it to handle as well?
I think for Ubisoft, it was all the AI scripts. Given how many issues they had with performance on the PS4 and Xbone, I would imagine the Wii U would've struggled even more. Now, had the engine itself been made with the Wii U in mind, the situation might've turned out differently, but my thinking is the engine they used for Unity was not made for the Wii U. You might suggest that any engine could be made to work on the Wii U, but you then have to consider the added cost and labor to get it working, and most publishers, including Ubisoft, likely thought it wasn't worth the effort, given how badly ACIII and AC: BF both sold on the Wii U.
 

theMightyME

Owner of The Total Screen
Yes, but why has Ubisoft stated that the Wii U couldn't handle Assassin's Creed Unity's world? What I got back then is that the problem with the Wii U wasn't only fidelity, resolution, and graphics, but also loading too many things like NPCs, assets, elements, behaviours, etc.

Since the Switch is just a more powerful Wii U, wouldn't a game like Watch Dogs 2 be hard for it to handle as well?
Whenever a company says something like that, they mean "as is"... in that the GAME could run on the Wii U... but not as is... Changes would have to be made, changes that are not that big of a deal to the gamer, but that cost more money for the publisher, money they aren't willing to spend because 3rd-party sales on the Wii U sucked, as did the system's install base.

It is purely a money issue... It isn't worth the extra money it would cost Ubisoft to make it work

Business, not tech
 

Goodtwin

Well-Known Member
Switch has a CPU far more capable than the Wii U's. It has advanced out-of-order execution, and the NEON FPU shoots all over the Wii U's paired-singles SIMD performance.

The truth is, AI can be as demanding or as simple as the developer wants. It can be very demanding, even for an i5 quad-core processor. Go check out some advanced strategy games on PC: there may only be a few characters on screen, and yet the game can melt an i5. It would be possible to create a single AI character that completely buckled the PS4 and X1. How? Give that AI a million different potential options and watch things slow to a crawl. AC Unity was poor choices by the developer: tons of AI scripts that didn't make the game feel much different from any other AC game.
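To illustrate with a toy sketch (a hypothetical utility-based AI, not from any actual engine): the cost of one agent scales with how many options it has to evaluate each frame, so a single agent with a huge decision space can cost as much as a whole crowd of simple ones.

```python
import time

def pick_action(agent_state, options):
    """Utility-based AI: score every option and take the best.
    Cost is O(len(options)) per agent per frame."""
    best, best_score = None, float("-inf")
    for opt in options:
        # Stand-in for a real utility function (pathing, line of sight, etc.)
        score = -abs(agent_state - opt)
        if score > best_score:
            best, best_score = opt, score
    return best

# One agent with a huge option space costs as much as
# a thousand agents with a handful of options each.
few_options = list(range(10))
many_options = list(range(1_000_000))

t0 = time.perf_counter()
for _ in range(1000):            # 1000 simple agents
    pick_action(5, few_options)
simple = time.perf_counter() - t0

t0 = time.perf_counter()
pick_action(5, many_options)     # 1 "smart" agent
smart = time.perf_counter() - t0
print(f"1000 simple agents: {simple:.4f}s, 1 complex agent: {smart:.4f}s")
```

Same hardware, same loop; the only difference is how much the designer asked the AI to consider.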

Sent from my SM-G360V using Tapatalk
 

Goodtwin

Well-Known Member
Splatoon 2 has certainly raised questions about Switch hardware performance. According to DF, Splatoon 2 runs at a rock-solid 60fps, but the docked resolution is only 720p native, and portable uses a dynamic resolution that dips as low as 548p while maintaining that rock-solid 60fps. From my personal eye test, I am seeing improved lighting and shadows, and character models seem to have a higher poly count, but overall this isn't a significant upgrade over the original in the tech department. So how hard is Splatoon 2 pushing the Switch? Well, according to DF, the Switch pulls 3 watts less running Splatoon 2 docked than Zelda: BotW, which tells us that Splatoon 2 is not fully utilizing the hardware. The question is why? My personal guess is that the dev team is prioritizing a solid 60fps above all else and won't sacrifice framerate under any circumstances. I suspect we will see improvements in the final build; I wouldn't be surprised to see a dynamic 900p docked and a more consistent 720p portable.
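For anyone curious how a dynamic resolution like Splatoon 2's portable mode might work, here's a toy controller (my own guess at the logic, not Nintendo's actual code): drop the render scale when a frame runs long, creep back up when there's headroom. 548p out of 720p works out to roughly a 0.76 scale floor.

```python
def next_resolution_scale(frame_ms, target_ms=16.7, scale=1.0,
                          min_scale=0.76, step=0.05):
    """Toy dynamic-resolution controller: shrink the render scale when a
    frame misses the 60fps budget, grow it back when there's headroom."""
    if frame_ms > target_ms:
        scale = max(min_scale, scale - step)   # missed budget: drop res
    elif frame_ms < target_ms * 0.9:
        scale = min(1.0, scale + step)         # easy frame: claw res back
    return scale

scale = 1.0
for frame_ms in [15.0, 18.2, 19.0, 17.5, 14.0]:
    scale = next_resolution_scale(frame_ms, scale=scale)
    print(f"{frame_ms:5.1f} ms -> render at ~{int(720 * scale)}p")
```

The point is that framerate is the fixed target and resolution is the slack variable, which matches what DF measured.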

Mario Kart 8 Deluxe runs at a locked 1080p 60fps, so why is Splatoon 2 struggling to follow suit? Racing games have the luxury of being very predictable: sudden panning of the camera typically doesn't happen, and the next frame to be drawn is easy to anticipate. Basically, racing games make it easier for developers to get good control over their code. It's far easier to optimize code for a racer than for most other genres outside of 2D platformers.

Snake Pass just released on Switch today, and I am getting conflicting info. The developer had previously told Eurogamer that the Switch build would run at 1080p 30fps docked, but I am hearing reports suggesting 720p docked. It sounds like the framerate is very smooth, but if it turns out to be 720p native, then I believe there is a bottleneck in the Tegra X1 when targeting 1080p, most likely the memory bandwidth. I want to see confirmation on Snake Pass before passing final judgement. It would be easy to confuse a 1080p image with a 720p one if it used some pretty blurry FXAA like a lot of PS3 games did.
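Just to put rough numbers on why 1080p is so much heavier than 720p (my own back-of-envelope, nothing from DF): every per-pixel cost, whether fill rate, shading, or framebuffer bandwidth, scales with the pixel count, so the ratio of pixel counts is a decent first-order estimate of the extra load.

```python
# Pixel-count ratio as a first-order estimate of per-pixel workload.
def pixels(width, height):
    return width * height

ratio = pixels(1920, 1080) / pixels(1280, 720)
print(f"1080p pushes {ratio:.2f}x the pixels of 720p")  # 2.25x
```

So a game that just fits its bandwidth budget at 720p needs roughly 2.25x more per-pixel throughput at 1080p, which is exactly where a shared-memory chip like the X1 would pinch first.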
 

Shoulder

Your Resident Beardy Bear
Splatoon 2 has certainly raised questions about Switch hardware performance. According to DF, Splatoon 2 runs at a rock-solid 60fps, but the docked resolution is only 720p native, and portable uses a dynamic resolution that dips as low as 548p while maintaining that rock-solid 60fps. From my personal eye test, I am seeing improved lighting and shadows, and character models seem to have a higher poly count, but overall this isn't a significant upgrade over the original in the tech department. So how hard is Splatoon 2 pushing the Switch? Well, according to DF, the Switch pulls 3 watts less running Splatoon 2 docked than Zelda: BotW, which tells us that Splatoon 2 is not fully utilizing the hardware. The question is why? My personal guess is that the dev team is prioritizing a solid 60fps above all else and won't sacrifice framerate under any circumstances. I suspect we will see improvements in the final build; I wouldn't be surprised to see a dynamic 900p docked and a more consistent 720p portable.

Mario Kart 8 Deluxe runs at a locked 1080p 60fps, so why is Splatoon 2 struggling to follow suit? Racing games have the luxury of being very predictable: sudden panning of the camera typically doesn't happen, and the next frame to be drawn is easy to anticipate. Basically, racing games make it easier for developers to get good control over their code. It's far easier to optimize code for a racer than for most other genres outside of 2D platformers.

Snake Pass just released on Switch today, and I am getting conflicting info. The developer had previously told Eurogamer that the Switch build would run at 1080p 30fps docked, but I am hearing reports suggesting 720p docked. It sounds like the framerate is very smooth, but if it turns out to be 720p native, then I believe there is a bottleneck in the Tegra X1 when targeting 1080p, most likely the memory bandwidth. I want to see confirmation on Snake Pass before passing final judgement. It would be easy to confuse a 1080p image with a 720p one if it used some pretty blurry FXAA like a lot of PS3 games did.
Watching some quick videos on YT, I can't quite make out whether it's 720p or 1080p, but what it does look like is that the game is using AA, because I don't see very many jaggies. This of course could mean 1080p, or perhaps 720p with AA in use.

Nevertheless, I look forward to the onslaught of folks on YT who say the following words if you die in the game: "Snake? Snake?! SNAAAAAAAKE?!"
 

Goodtwin

Well-Known Member
Watching some quick videos on YT, I can't quite make out whether it's 720p or 1080p, but what it does look like is that the game is using AA, because I don't see very many jaggies. This of course could mean 1080p, or perhaps 720p with AA in use.

Nevertheless, I look forward to the onslaught of folks on YT who say the following words if you die in the game: "Snake? Snake?! SNAAAAAAAKE?!"
I was thinking the same exact shit. Game looks blurry for 1080p, but no jaggies at all. I am thinking some heavy FXAA is in play here.

Sent from my SM-G360V using Tapatalk
 

Shoulder

Your Resident Beardy Bear
I was thinking the same exact shit. Game looks blurry for 1080p, but no jaggies at all. I am thinking some heavy FXAA is in play here.

Sent from my SM-G360V using genital warts
Yeah, I noticed the blurriness too, and at first thought it was a blurry capture, but then wondered if there's some heavy FXAA-type effect going on. Reminds me of Quantum Break in a way.
 

Juegos

All mods go to heaven.
Moderator
I really can't sympathize with the disappointment over how Splatoon 2 looks. Personally, I thought the game looked stunning on my monitor. Yes, I could notice some aliasing here and there, like with the Wii U version (though this game definitely looked cleaner), but overall I just thought it looked better than Overwatch has ever looked for me, and that's a game I play every day on my PC with a cocktail of settings that gives me 60 fps. Obviously I could get Overwatch on my PC to look even better by throwing money at it, but my point is that Splatoon 2 on Switch already looks better to me than the best-looking multiplayer FPS I've played on my PC.

Artstyle, animation, and attention to detail are everything. Even with how demanding a game like DOOM is to run on Ultra settings, for example, I bet it would still look amazing on the Switch by turning some effects down and targeting 60 fps (just like it still looked beautiful on my PC with a cocktail of low and medium settings). On the other hand, no amount of tech is ever going to make COD look as good as these games.
 

Odo

Well-Known Member
I don't have enough knowledge to judge, but the only thing I can say is that it looks beautiful considering the art style they're going for. Besides, I can barely see any difference when I watch those comparison videos. Those PS4/XBO comparisons mean nothing to me. I try to spot the differences and see nothing but some small shadows here and there.
 

EvilTw1n

Even my henchmen think I'm crazy.
Moderator
Artstyle, animation, and attention to detail are everything.
I remember you and I were talking before about BotW's animations versus Horizon's, and since I don't play Ubi games very much, I was surprised at just how bad (and stiff) some of Ubi's animations are. DF videos won't usually pass technical judgment on that sort of thing (their wheelhouse is pixel counting, framerate counting, and looking for jagged edges), even though it's just as important. Hell, animation quality is more important than resolution.

It's those little things Splatoon gets right: the feel of the movement mechanics married to the animation of the movement. It's the sort of detail that easily slips through the cracks. Perhaps a discussion for a thread of its own.
 

Goodtwin

Well-Known Member

On PlayStation 4 we're looking at 1536x864, while Switch drops all the way to 1200x675 in docked mode and something in the region of 844x475 while playing detached. While lower than expected, this works out better than you would expect due to Unreal Engine's excellent temporal anti-aliasing tech, combined with the soft materials used in the game.
Considering Sumo only had three months to port the Switch build, the results are pretty darn good. I was pretty shocked to see just how low the portable resolution was, but it sounds like the temporal AA works very well at eliminating jaggies. The fact that the PS4 build doesn't hit native 1080p and runs at 30fps bodes well for Unreal 4 multiplatform games on Switch. If the PS4 build were running at native 1080p and 60fps, you could argue that a more demanding game might struggle to down-port to Switch, but the evidence strongly suggests that games using Unreal 4 will transition to Switch fairly easily. There are compromises here: the resolution is a lot lower, and the settings seem to be lower across the board. Shadows are certainly more refined on PS4, but overall the game looks very comparable on Switch.
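Putting DF's numbers in perspective with simple pixel-count arithmetic (my own math, not from the article):

```python
# How each Snake Pass render resolution compares to native 1080p.
modes = {
    "PS4 (1536x864)":            (1536, 864),
    "Switch docked (1200x675)":  (1200, 675),
    "Switch portable (844x475)": (844, 475),
}
native = 1920 * 1080
for name, (w, h) in modes.items():
    print(f"{name}: {w * h / native:.0%} of 1080p")
```

So even the PS4 is only rendering about 64% of a 1080p frame, with Switch docked at roughly 39% and portable near 19%, and it's the temporal AA doing the work of hiding those gaps.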

The haters are of course damage controlling this; suddenly Snake Pass is a bad PS4 port. LOL. Seriously, there are ass hats downplaying the Switch version by saying the PS4 build is a bad port. I have to say, this is a new one for me: people using the bad-port excuse for the superior version of a game. The developer obviously spent all their time on the Switch build, right... oh wait, they only got dev kits in January.
 