Nintendo Switch Spec Thread

Goodtwin

Well-Known Member
http://www.anandtech.com/show/8329/revisiting-shield-tablet-gaming-ux-and-battery-life

Clock speeds for these mobile processors are deceiving. Take a look at the Tegra K1 powering the Shield Tablet. Even on a benchmark intended to stress-test the processor, the Tegra K1 rarely operated at max clock speed. These processors are designed to operate in short bursts: clock up very high, complete the task, and clock back down. When the Shield Tablet is stress-tested for a half hour, it hits its thermal limits and throttles down significantly. The bottom line is that gaming hardware runs a sustained workload for hours on end, very different from the intermittent workloads on mobile devices. Yes, these devices play games, but what does utilization actually look like? From the looks of it, pretty low in most games. The Tegra K1 would only clock at about 450 MHz for most of the Android games that weren't specifically designed for the Shield.
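To put toy numbers on the burst-versus-sustained point (the ~450 MHz figure is from the article above; the peak clock and burst fraction are just illustrative guesses, not measurements):

peak_mhz = 850        # roughly K1-class peak GPU clock, assumed for illustration
typical_mhz = 450     # what most Android games actually ran at, per the article
burst_fraction = 0.1  # suppose the GPU only needs peak clocks 10% of the time

avg_mhz = burst_fraction * peak_mhz + (1 - burst_fraction) * typical_mhz
print(avg_mhz)  # 490.0 -- the advertised max clock barely moves the average

A console has to spec for the sustained case, which is why a fixed, lower clock is the honest number.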

Even with the Shield Console that uses the Tegra X1, which clocks at 1 GHz, how often does the device see full utilization? We could be seeing the Tegra X1 clocking well under max clocks most of the time, even in games like Doom BFG. If similar tests were available for the Shield Console, we might not be so surprised about the lower clock speeds for the custom Tegra powering Switch.
 

Koenig

The Architect
Out of curiosity, how much of a price difference would using the Maxwell instead of the Pascal architecture imply for the development of the Switch? If the price difference is large enough, I suspect that the rumor will turn out to be true.
 

tekshow

Active Member
Just wait for Jan 12; all these rumors are getting out of hand on the internet.
Realistically that's what I'm waiting on. I care less and less about the specs on a sheet and more about what the Switch can do and the games that will be available.

I made this point elsewhere but I really am fine with third parties showing up with minimal support. That's not my ideal Switch scenario, but after three generations of hardware I'm truthfully used to it. During that time period I've figured out how to find my third party gaming and work it into the *budget.


*I don't have a budget :mpray:
 

Goodtwin

Well-Known Member
Out of curiosity, how much of a price difference would using the Maxwell instead of the Pascal architecture imply for the development of the Switch? If the price difference is large enough, I suspect that the rumor will turn out to be true.
Manufacturing cost is a minimal difference; it comes down to what Nvidia is charging to license the chip to Nintendo. Reasonable conclusion leads me to believe Nintendo got a really good deal on this chip. Consoles also tend to lock in their tech well ahead of release; it's rare to incorporate the most current tech.

Sent from my SM-G360V using Tapatalk
 

Shoulder

Your Resident Beardy Bear
Manufacturing cost is a minimal difference; it comes down to what Nvidia is charging to license the chip to Nintendo. Reasonable conclusion leads me to believe Nintendo got a really good deal on this chip. Consoles also tend to lock in their tech well ahead of release; it's rare to incorporate the most current tech.

Sent from my SM-G360V using genital warts
This. What likely occurred here was that nVidia needed to sell a bunch of these Tegra X1 chips since the newer iterations were coming out soon, and those new chips would be manufactured on the 16nm FinFET process rather than the 20nm of the Tegra X1. Meanwhile, Nintendo laid out their plans and intentions for customizing the X1 chip to their specifications. And when you think about what nVidia posted after the Switch reveal, and the recent rumors about the clock speeds, there are some distinctive clues as to what happened here.

Since the system appears to be using a fan (per the latest info GT explained), and yet the clock speeds seem to be reduced, the chip itself in its current state (if nothing else was altered) would only be drawing a few watts of power, maybe only a couple. What nVidia said about the "500-man years" makes me believe that some stuff was taken out of the X1 chip, but other stuff was put in. As we know, Nintendo loves to tinker with their hardware, so it's possible they might've added CPU cores to the mix, and/or added CUDA cores for better capabilities. It is also possible that Pascal features were added, and just maybe (although unlikely) it's currently sitting on a 16nm process rather than 20nm. There would be almost no way it would be increased in size to 28nm (something some GAFers were asking about, which is ridiculous).

I get the impression this custom Tegra chip has its clock speeds reduced for better sustained performance, but with other hardware beefed up, such as extra cores, to help offset the reduction. And if I understand this right, with the clock speed reduced but more GPU cores added, you can get more performance than by simply upping the clock.
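For a sense of the arithmetic (the core counts and clocks below are hypothetical, not rumored specs): peak FP32 throughput is cores x 2 ops per clock (a fused multiply-add) x clock speed, so widening the chip can more than make up for a lower clock.

# Hypothetical configs, just to show the cores-vs-clock trade-off
def peak_gflops(cuda_cores, clock_mhz):
    return cuda_cores * 2 * clock_mhz / 1000  # 2 FP32 ops per core per clock (FMA)

print(peak_gflops(256, 1000))  # stock-TX1-style config: 512.0 GFLOPS
print(peak_gflops(384, 800))   # wider but slower: 614.4 GFLOPS

And since power rises steeply with clock speed (voltage has to rise with it), the wide-and-slow configuration usually wins on battery too.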

I would love for nVidia to provide some detailed specs for the Switch when the time comes.
 

Goodtwin

Well-Known Member
I doubt the TX1 was modified that much. It's hard to use a TX1 for dev kits and then turn around and use a chip with more cores. I'm betting removal of the A53 cores is likely, and potentially some of Pascal's compression techniques were added. I also think 28nm HPC Plus is possible. It could be a cheaper process than the bastard 20nm, and from what I am hearing TSMC is charging a pretty decent premium for 16nm FinFET. Another potential reason to need a little bit of fan cooling when in portable mode.

Sent from my SM-G360V using Tapatalk
 

Shoulder

Your Resident Beardy Bear
I doubt the TX1 was modified that much. It's hard to use a TX1 for dev kits and then turn around and use a chip with more cores. I'm betting removal of the A53 cores is likely, and potentially some of Pascal's compression techniques were added. I also think 28nm HPC Plus is possible. It could be a cheaper process than the bastard 20nm, and from what I am hearing TSMC is charging a pretty decent premium for 16nm FinFET. Another potential reason to need a little bit of fan cooling when in portable mode.

Sent from my SM-G360V using genital warts
But isn't the 28nm being phased out entirely? Wouldn't restarting that only be more expensive than simply leaving the chip as is? It just seems to me it's more work than is necessary, especially if using 28nm means you need a fan for cooling (not saying using 20nm doesn't require a fan).

The removal of the A53 cores is a possibility though, given it's looking more like it has the A57 (A72 is another possibility, but unlikely I think) cores instead.

So in my exploration, I found this newer thread on GAF where someone discovered some files in the UE4 ini. Basically, it presents some presets for making games on the Switch in both handheld and docked mode, although what the differences between the presets are hasn't been answered yet. But one interesting tidbit is the screen percentage in the files, which indicates what most of us are thinking: the screen is 720p, but when docked, the Switch renders at 1080p (or whatever the resolution of the TV is; it can be reduced to 66% for handheld).

http://www.neogaf.com/forum/showthread.php?t=1327012

Keep in mind though this only concerns UE4, and does not reflect all scenarios of how games could end up running on different engines. But to sum it all up, a slightly beefed-up Wii U in handheld mode is what it's looking like.

It would support the idea that the Switch essentially has two configurations, but beyond that, we still don't know what it means for real-world capabilities. But one thing we can say almost for sure is that if your game is on UE4, porting over to the Switch has been made a lot easier now.
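That 66% figure lines up neatly with the 720p panel, by the way; here's the quick math (my own arithmetic, not something pulled from the leaked files):

# UE4's r.ScreenPercentage scales the render resolution relative to the output
docked = (1920, 1080)
screen_pct = 66
handheld_render = tuple(round(d * screen_pct / 100) for d in docked)
print(handheld_render)  # (1267, 713) -> effectively the 1280x720 panel
print(720 / 1080)       # ~0.667, i.e. where the ~66% comes from

So the preset looks like a straightforward "render at the panel's native size" toggle rather than anything exotic.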
 

sjmartin79

White Phoenix of the Crown
I don't understand much about the specs and clock speeds and GPU and CPU. (I know math and finance... and a kabillion tidbits of useless entertainment trivia.) But this thread and other threads by you guys make all this much easier to understand, where I feel like I can almost grasp what is important and why.

So, thank you!
 

Goodtwin

Well-Known Member
I guess I am of the mind that it's best to wait and see what January 13 shows us. We can look at specs and assume it's going to prevent ports, but what if the specs are legit, but Assassin's Creed and COD show up during the reveal? So much haterade is being spewed right now, declaring western support is now impossible. A week ago Ubisoft, Bethesda and From Software were all saying positive things about performance. Are we to believe they didn't know the specs? LOL, we may be surprised just how well games scale when they have modern hardware, even if it's far less powerful.

Sent from my SM-G360V using Tapatalk
 

sjmartin79

White Phoenix of the Crown
I guess I am of the mind that it's best to wait and see what January 13 shows us. We can look at specs and assume it's going to prevent ports, but what if the specs are legit, but Assassin's Creed and COD show up during the reveal? So much haterade is being spewed right now, declaring western support is now impossible. A week ago Ubisoft, Bethesda and From Software were all saying positive things about performance. Are we to believe they didn't know the specs? LOL, we may be surprised just how well games scale when they have modern hardware, even if it's far less powerful.

Sent from my SM-G360V using genital warts
Honestly, this is what I've been thinking.
 

mattavelle1

IT’S GOT A DEATH RAY!
Moderator
I don't understand much about the specs and clock speeds and GPU and CPU. (I know math and finance... and a kabillion tidbits of useless entertainment trivia.) But this thread and other threads by you guys make all this much easier to understand, where I feel like I can almost grasp what is important and why.

So, thank you!
Same, I'm the same. And even as good as our guys here are, it just about puts my head on the brink of hurting.
 

Goodtwin

Well-Known Member
http://www.realworldtech.com/tile-based-rasterization-nvidia-gpus/

@mattavelle1

If you want to be bored out of your skull, watch that video. LOL

Seriously though, for those who are interested the above link is pretty educational. This is basically how the Tegra X1 gets away with having lower memory bandwidth. The GPU renders the image in tiles, instead of all at once like AMD cards. Tiled rendering has been a thing for a while, but was only really supported in mobile processors. Starting with Maxwell, Nvidia decided to design their processors with a focus on mobile, knowing that it's easier to scale up than down. Why does this matter? Typically a GPU is reading and writing data back and forth to main memory constantly as it renders each frame; Maxwell doesn't do this. Instead it processes a small portion of the frame internally in its own memory cache, and once complete, sends the completed tile out to main memory. This greatly reduces traffic to main memory, and thus reduces memory bandwidth requirements.
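A toy sketch of the idea (this is the general tiled-rendering pattern, not Nvidia's actual pipeline): all the per-pixel read-modify-write work happens in a small on-chip buffer, and main memory only sees one write per finished tile.

TILE = 16  # hypothetical tile size in pixels

def render_tiled(width, height, draws):
    framebuffer = {}  # stands in for main memory
    for ty in range(0, height, TILE):
        for tx in range(0, width, TILE):
            tile = [[0] * TILE for _ in range(TILE)]  # on-chip cache: cheap to touch
            for draw in draws:
                draw(tile, tx, ty)        # overdraw and blending stay on-chip
            framebuffer[(tx, ty)] = tile  # one write-out per tile
    return framebuffer

fb = render_tiled(64, 64, [lambda tile, tx, ty: None])  # 16 tiles, each written once

With the immediate-mode approach, every overlapping draw would hit main memory instead of that little tile buffer, which is exactly the bandwidth the Tegra doesn't have to spare.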

People also need to understand that Maxwell in the Tegra X1 is not only far more modern than the GCN AMD architecture powering the PS4/X1, but it can basically be considered the beta version of Pascal. The Tegra X1 used "2nd generation" Maxwell, and is the only Maxwell chip to support double-rate FP16 calculations. Not saying Pascal didn't make further improvements, but much of the energy savings that Nvidia markets with Pascal is really thanks to the 16nm FinFET process those chips are manufactured on. I'm also guessing that if beneficial, the more advanced delta compression techniques used in Pascal could be incorporated into the custom Tegra X1 powering the Switch.

I'm also hearing from a developer over at Beyond3D that the Maxwell architecture shits all over the AMD GCN architecture in terms of polygon/geometry performance. There have been some concerns about this in portable mode, but from the sounds of it, even in portable mode at 300 MHz the performance in this area shouldn't be compromised.

Guys, there is a reason for the talk of Nvidia flops being 1.5x AMD flops; the Maxwell architecture is superior in every way. Combine this with the ability to implement FP16 shaders, something I am hearing can be used nearly 70% of the time, and we are talking about a mobile platform that even in mobile mode outclasses the Wii U by a decent margin. The Wii U GPU is 176 GFLOPS and used VLIW5 shaders, much less efficient than the GCN the XB1/PS4 use, so I would guesstimate converting the Tegra flops to Wii U flops would be the equivalent of 350 GFLOPS. This is no exact science, just me trying to convert the improvements made in architectures over the years into comparable metrics.
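Writing that guesstimate out as arithmetic (every factor below is a rough claim from this thread, not a measurement):

# Back-of-envelope only; all factors are guesses from the discussion above
cores = 256
portable_clock_ghz = 0.3                # the rumored ~300 MHz portable GPU clock
fp32 = cores * 2 * portable_clock_ghz   # ~154 GFLOPS peak FP32
fp16_factor = 0.3 * 1 + 0.7 * 2         # FP16 usable ~70% of the time at double rate
arch_factor = 1.5                       # the "Nvidia flops = 1.5x AMD flops" claim
print(round(fp32 * fp16_factor * arch_factor))  # 392 "equivalent" GFLOPS

That lands in the same ballpark as the ~350 figure, and either way it's comfortably above the Wii U's 176.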

I want to close this post with the fact that last week Dark Souls 3 was running on Switch; the news that the Tegra processor clocks lower than expected doesn't change that. Real in-game performance is what matters, and things look favorable. I think the January 13 reveal is going to shake things up. The haters are gonna hate, but what if COD and Assassin's Creed were shown at the reveal? For all those naysayers claiming the clock speeds are proof that third parties won't/can't support Switch, good Lord how I would love to see them eat crow.
 

mattavelle1

IT’S GOT A DEATH RAY!
Moderator
http://www.realworldtech.com/tile-based-rasterization-nvidia-gpus/

@mattavelle1

If you want to be bored out of your skull, watch that video. LOL

Seriously though, for those who are interested the above link is pretty educational. This is basically how the Tegra X1 gets away with having lower memory bandwidth. The GPU renders the image in tiles, instead of all at once like AMD cards. Tiled rendering has been a thing for a while, but was only really supported in mobile processors. Starting with Maxwell, Nvidia decided to design their processors with a focus on mobile, knowing that it's easier to scale up than down. Why does this matter? Typically a GPU is reading and writing data back and forth to main memory constantly as it renders each frame; Maxwell doesn't do this. Instead it processes a small portion of the frame internally in its own memory cache, and once complete, sends the completed tile out to main memory. This greatly reduces traffic to main memory, and thus reduces memory bandwidth requirements.

People also need to understand that Maxwell in the Tegra X1 is not only far more modern than the GCN AMD architecture powering the PS4/X1, but it can basically be considered the beta version of Pascal. The Tegra X1 used "2nd generation" Maxwell, and is the only Maxwell chip to support double-rate FP16 calculations. Not saying Pascal didn't make further improvements, but much of the energy savings that Nvidia markets with Pascal is really thanks to the 16nm FinFET process those chips are manufactured on. I'm also guessing that if beneficial, the more advanced delta compression techniques used in Pascal could be incorporated into the custom Tegra X1 powering the Switch.

I'm also hearing from a developer over at Beyond3D that the Maxwell architecture shits all over the AMD GCN architecture in terms of polygon/geometry performance. There have been some concerns about this in portable mode, but from the sounds of it, even in portable mode at 300 MHz the performance in this area shouldn't be compromised.

Guys, there is a reason for the talk of Nvidia flops being 1.5x AMD flops; the Maxwell architecture is superior in every way. Combine this with the ability to implement FP16 shaders, something I am hearing can be used nearly 70% of the time, and we are talking about a mobile platform that even in mobile mode outclasses the Wii U by a decent margin. The Wii U GPU is 176 GFLOPS and used VLIW5 shaders, much less efficient than the GCN the XB1/PS4 use, so I would guesstimate converting the Tegra flops to Wii U flops would be the equivalent of 350 GFLOPS. This is no exact science, just me trying to convert the improvements made in architectures over the years into comparable metrics.

I want to close this post with the fact that last week Dark Souls 3 was running on Switch; the news that the Tegra processor clocks lower than expected doesn't change that. Real in-game performance is what matters, and things look favorable. I think the January 13 reveal is going to shake things up. The haters are gonna hate, but what if COD and Assassin's Creed were shown at the reveal? For all those naysayers claiming the clock speeds are proof that third parties won't/can't support Switch, good Lord how I would love to see them eat crow.
YOU'RE A beast!
 

Shoulder

Your Resident Beardy Bear
And see, this is why talking only in FLOPS is becoming more and more useless as a figure. It's more arbitrary than it used to be, much like how clock speed has become compared to years past, when everything was single-core. Sure, it still has merits, but this nVidia FLOPS vs. AMD FLOPS business is like the old saying, "Our Germans are better than your Germans."
 

mattavelle1

IT’S GOT A DEATH RAY!
Moderator
http://www.realworldtech.com/tile-based-rasterization-nvidia-gpus/

@mattavelle1

If you want to be bored out of your skull, watch that video. LOL

Seriously though, for those who are interested the above link is pretty educational. This is basically how the Tegra X1 gets away with having lower memory bandwidth. The GPU renders the image in tiles, instead of all at once like AMD cards. Tiled rendering has been a thing for a while, but was only really supported in mobile processors. Starting with Maxwell, Nvidia decided to design their processors with a focus on mobile, knowing that it's easier to scale up than down. Why does this matter? Typically a GPU is reading and writing data back and forth to main memory constantly as it renders each frame; Maxwell doesn't do this. Instead it processes a small portion of the frame internally in its own memory cache, and once complete, sends the completed tile out to main memory. This greatly reduces traffic to main memory, and thus reduces memory bandwidth requirements.

People also need to understand that Maxwell in the Tegra X1 is not only far more modern than the GCN AMD architecture powering the PS4/X1, but it can basically be considered the beta version of Pascal. The Tegra X1 used "2nd generation" Maxwell, and is the only Maxwell chip to support double-rate FP16 calculations. Not saying Pascal didn't make further improvements, but much of the energy savings that Nvidia markets with Pascal is really thanks to the 16nm FinFET process those chips are manufactured on. I'm also guessing that if beneficial, the more advanced delta compression techniques used in Pascal could be incorporated into the custom Tegra X1 powering the Switch.

I'm also hearing from a developer over at Beyond3D that the Maxwell architecture shits all over the AMD GCN architecture in terms of polygon/geometry performance. There have been some concerns about this in portable mode, but from the sounds of it, even in portable mode at 300 MHz the performance in this area shouldn't be compromised.

Guys, there is a reason for the talk of Nvidia flops being 1.5x AMD flops; the Maxwell architecture is superior in every way. Combine this with the ability to implement FP16 shaders, something I am hearing can be used nearly 70% of the time, and we are talking about a mobile platform that even in mobile mode outclasses the Wii U by a decent margin. The Wii U GPU is 176 GFLOPS and used VLIW5 shaders, much less efficient than the GCN the XB1/PS4 use, so I would guesstimate converting the Tegra flops to Wii U flops would be the equivalent of 350 GFLOPS. This is no exact science, just me trying to convert the improvements made in architectures over the years into comparable metrics.

I want to close this post with the fact that last week Dark Souls 3 was running on Switch; the news that the Tegra processor clocks lower than expected doesn't change that. Real in-game performance is what matters, and things look favorable. I think the January 13 reveal is going to shake things up. The haters are gonna hate, but what if COD and Assassin's Creed were shown at the reveal? For all those naysayers claiming the clock speeds are proof that third parties won't/can't support Switch, good Lord how I would love to see them eat crow.
So in regular people's terms: there is STILL A CHANCE that the internals of the Switch could at least come close to the X1? I'm saying docked; I know you already said it will be quite a bit more than Wii U already.

Break it down in your simplest terms. :mgrin: Where exactly do you think the Switch will fall? Close to X1? Right in between X1 and Wii U?
 

Odo

Well-Known Member
And see, this is why talking only in FLOPS is becoming more and more useless of a figure. It's more arbitrary than it used to be, much how clock speed has become compared to years past when everything was single core. Sure, it still has merits, but this nvidia FLOPS vs. AMD FLOPS business is like the old saying, "Our Germans are better than your Germans."
So does it mean that Switch is more powerful than 64 bits?
 

Shoulder

Your Resident Beardy Bear
So does it mean that Switch is more powerful than 64 bits?
Like what @Goodtwin was saying, nVidia is more efficient with their hardware compared to AMD, so they get more out of their architectures than AMD does. There is something to be said when nVidia almost always trounces AMD in performance in the PC world.

As far as the bits go, that is still a mystery. If the memory bus is 64-bit, then the memory bandwidth of the whole system would be [I believe] 25.6GB/sec, but if it were instead 128-bit, that number would be doubled. What is interesting is some look at that number, forget about the architecture entirely, and think the Switch will be memory-starved (compared to the PS4, which is around 150GB/sec). But again, as GT said, with the Tegra processor and the way nVidia does things, they can manage the bandwidth restrictions so efficiently that it would be as though the bandwidth itself is a lot more than it actually is. That's one basic way of putting it.
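For what it's worth, the 25.6GB/sec figure is just standard LPDDR4 math (the bus width is the rumor; the formula is generic):

# bandwidth = transfers per second x bytes per transfer
transfers_per_sec = 3_200_000_000   # LPDDR4-3200: 3200 mega-transfers/sec

def bandwidth_gb_s(bus_width_bits):
    return transfers_per_sec * (bus_width_bits / 8) / 1e9

print(bandwidth_gb_s(64))    # 25.6 GB/s
print(bandwidth_gb_s(128))   # 51.2 GB/s, the doubled figure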

Or if you want to think of this in car terms, which I love doing, imagine you have an 8.4L V10 engine with 640HP, which is a lot of power. Now imagine an engine with slightly less horsepower, a 3.8L twin-turbo V6 with 545HP. Sure, it has less horsepower on paper, but it is able to put that power down more efficiently compared to the massive V10. I am of course comparing the Dodge Viper to the Nissan GT-R in this case, but it's about how you put that power down and get the most out of it. I look at console hardware in the same vein. You can have this massive power under the hood, but if you cannot get that power down correctly, or efficiently, then you're just wasting some of that potential power on paper. I am not saying the Switch will be as powerful as the PS4; I'm just saying that the on-paper clock speed, memory bandwidth, and other numbers don't mean it's going to be weak sauce.

In the end, we'll let the games do the talking come next month, which is now less than a month away. :D
 

Goodtwin

Well-Known Member
http://m.neogaf.com/showthread.php?t=1327012

http://nintendoeverything.com/diffe...able-switch-modes-spotted-in-unreal-engine-4/

They're talking about UE4 ver. 4.14, which I think is the most updated version for X1 and PS4, being found in the code for Switch. Or maybe they're saying it's the version right below that. @Goodtwin @Shoulder some more info to sift through.
This is further proof that the reported clock speeds do not paint the whole picture. The profiles suggest that Unreal 4 defaults to medium settings in console mode, compared to high settings on PS4. In portable mode, settings default to low. This suggests a ton of scalability for those using Unreal 4. Developers can basically take their work and, with the flip of a switch, convert it to portable. We are talking shorter draw distances, lighting and shadows at lesser quality, with things like shadows potentially being baked instead of rendered in real time.
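A toy illustration of what a per-mode preset amounts to (the names and values here are made up for the example, not the contents of the leaked files, though sg.* scalability cvars are how UE4 exposes this sort of thing):

# Hypothetical per-mode scalability presets
PRESETS = {
    "docked":   {"sg.ShadowQuality": 2, "sg.ViewDistanceQuality": 2, "sg.EffectsQuality": 2},
    "handheld": {"sg.ShadowQuality": 1, "sg.ViewDistanceQuality": 1, "sg.EffectsQuality": 1},
}

def apply_mode(mode):
    for cvar, value in PRESETS[mode].items():
        print(f"{cvar}={value}")  # a real engine would push these to the renderer

apply_mode("handheld")

One preset swap and the same content ships in both modes; that's the "flip of a switch" part.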

Sent from my SM-G360V using Tapatalk
 

Koenig

The Architect
One thing I am a little confused on in regards to the recent Maxwell vs Pascal rumor is just how energy-efficient they are compared to each other. From what I have heard, Pascal would be more energy-efficient in addition to having increased processing capabilities when compared to the Maxwell architecture. Is this correct? And if so, what kind of a difference are we talking about here?
 

Goodtwin

Well-Known Member
2nd generation Maxwell, what the Tegra X1 uses, is basically the alpha build for Pascal. Pascal does have further improvements, but overall they are very similar. A lot of the energy efficiency improvements are believed to come from Pascal hardware moving to 16nm FinFET, not from the architecture itself. Bottom line is this: 256 2nd generation Maxwell cores are competitive with 256 Pascal cores. Not enough of a difference to matter much. Maxwell is still far more advanced than the GCN AMD cores powering the PS4/X1.

Sent from my SM-G360V using Tapatalk
 

Shoulder

Your Resident Beardy Bear
This is further proof that the reported clock speeds do not paint the whole picture. The profiles suggest that Unreal 4 defaults to medium settings in console mode, compared to high settings on PS4. In portable mode, settings default to low. This suggests a ton of scalability for those using Unreal 4. Developers can basically take their work and, with the flip of a switch, convert it to portable. We are talking shorter draw distances, lighting and shadows at lesser quality, with things like shadows potentially being baked instead of rendered in real time.

Sent from my SM-G360V using genital warts
Here's another thing to consider. Yes, some of those graphics details will be toned down for the 6-inch screen, but at that screen size with a 720p resolution, you might not notice the reduction in graphical fidelity as much as you'd think.
 

Koenig

The Architect
I'm not concerned about the visuals during portable mode; rather, I am very concerned about whether or not the baseline set by the portable mode will easily match the competition in terms of development and performance. Granted, I have heard rumors that the Switch is even easier to develop for than the PS4 and X1, which would be a pleasant turn of events.
 
Takashi Mochizuki reports that the Nintendo Switch will be 1080p when undocked and 1440p when docked. This report better be true, otherwise the Nintendo Switch will just die in an instant if it is not on par with the Xbox One or PS4.
 
Why does it not need current-gen power to succeed? Tell me at least 10 reasons why such a basic requirement, and a core reason why the Wii U failed, is considered """""""""""unneeded"""""""""""" for a console that could have the same problem again?
 

Shoulder

Your Resident Beardy Bear
:mfacepalm: No it doesn't. They make videos like that as clickbait for the weak.
Clickbait can have a strong influence on the weak-minded.

But again, we don't know to what extent the Tegra chip is modified. We don't know if more cores were added, we don't know if Pascal features were put in, we don't know what manufacturing process it's going to be on; we don't know. It's like, people need to calm the fuck down until Jan. 12th. All we know is it's a Tegra X1, and its clock speed was reduced. That does not tell us all there is. It can give us ideas, but there are still some unknown variables, as I mentioned a couple sentences back.

Ultimately, I'm letting the games do the talking.
 
LMAO, the fact that I said the Nintendo Switch needs to be at least the power of the Xbox One or PS4 did not get your attention all this time?

Wii U failed because it was not up to the power of those systems. Now look at the Switch, again in the same situation.
 

Koenig

The Architect
LMAO, the fact that I said the Nintendo Switch needs to be at least the power of the Xbox One or PS4 did not get your attention all this time?

Wii U failed because it was not up to the power of those systems. Now look at the Switch, again in the same situation.
Along with a multitude of even greater problems.

While I do agree that the Switch should ideally be on par with the PS4 and X1, let's not forget that the system must first function as a handheld before its console function can be developed for, since they both have to work properly. As is, a handheld more powerful than the Wii U is a notable (and marketable) feat that should not be underestimated.

In regards to third party support, there are two main things that will affect said support for the system:

1. The number of Switch units sold.
2. How easy the platform is to develop for.

As long as (1) is met, the Switch will get third party support, whereas (2) will determine how well said games will be optimized for the system. We already know that major developer engines such as UE4 work well on the system, which alleviates that concern; what remains to be seen is whether or not the Switch will sell enough units to merit 3rd party interest.

Even though the Wii U was weak, the vast majority of PS4 and X1 games could have been ported to the system (although development would have been a bit more difficult); the problem was that the Wii U simply had not sold enough units to justify support in the eyes of most 3rd parties. (Although I would argue that it would have been worth the risk.)
 
LMAO, the fact that I said the Nintendo Switch needs to be at least the power of the Xbox One or PS4 did not get your attention all this time?

Wii U failed because it was not up to the power of those systems. Now look at the Switch, again in the same situation.
I'm not gonna argue the same crap I did with you last night in the chat; I'm not a record you can spin round and round. You can scroll back the chat yourself to see where you accepted the faults in this argument.
It's not a bad view, it's just one that we have discussed directly with you already.


As an aside, if you can't accept this reality:
so...

1. this is still rumor
2. we already saw zelda running on switch at a healthier frame-rate than on wii u
3. anybody who expected the switch to be MORE powerful than xbone was delusional
4. it doesn't NEED to be more powerful than an xbone... because...
a. Nintendo's own games do not require more than what wii u did to look great
b. they were NEVER going to get 3rd party parity anyways
c. this is the generation of scalability, and even if performance isn't a match for the competition, the architecture is, and that is all that they need for a port, considering this is by far the least necessary generation jump in history
5. The switch will unify nintendo's library of games that has been split between systems since the original gameboy, which means...
a. about twice the number of first party exclusives as either system line had before, including both Pokemon, and full 3d Zelda
b. no unnecessary extraneous versions of games, which frees up development for even more new games
c. the switch will be home to the great 3rd party support that their portables have had, in addition to what we got on Nintendo consoles that didn't have great 3rd party support, like the n64 and wii u... so it won't just be Pokemon AND Zelda... it will be Pokemon AND Zelda AND Bayonetta AND Monster Hunter.
6. Let's not forget that by being a home system that can seamlessly be taken on the road, the switch becomes an attractive proposition to western 3rd parties that would otherwise not even give Nintendo a second glance. Skyrim might be an old game, but a bethesda dev said the switch was the best hardware demo he has ever seen, and they wouldn't have even considered the skyrim port for the wii u, which also could have run it.... so that shows that there IS a point of interest for even the western 3rd parties. Also, as a handheld device, it will be second to none in power, so for every comment about it being the weakest console, you could equally argue that it is the strongest portable by a notable measure

for me, the concerns about a hybrid were always these
- how affordable will it be
- can it surpass the wii u, by even a little
- how long will the battery last

we still don't have a clue for the price, it is notably more powerful than the wii u, and if the clockspeeds are true, the battery performance should be better than we thought

so long as the system holds to that, and isn't like $400 or something, I will be super happy... it will have addressed my 3 concerns while taking a gigantic leap in the direction nintendo has needed to move in for several generations

fuck the clock speeds... you saw zelda:botw on Fallon... did it look great? that is all that matters

fuck western 3rd parties... they would have ignored it anyways, and by presenting an option the competition doesn't have they might even find themselves with BETTER 3rd party support than if they had tried to compete directly

do the games look and run good? yes
can you play it as a handheld and a console? yes
will it have support that would otherwise be divided between 2 separate system all on one device? yes
will it be the best around in performance and there for you to brag about on the school yard? fuck you if you need that
will it play games that are already available on other systems you already own? who gives a fuck
will it have a ton of exclusives that can only be played on it? YES!
then you have some serious blinders on. I want to believe most have; I think at least here we were all pretty level-headed enough to see it since the reveal. Some see it and are all for it, while others don't like the direction at all but don't deny that it's what's happening.

You've been flip-flopping to a crazy degree, even when we've corrected you way back when to not expect this to touch XB1 or PS4 levels. I believe you've chosen to ignore and forget those moments solely to make posts like this where you can act shocked and appalled.
 

Shoulder

Your Resident Beardy Bear
LMAO, the fact that I said the Nintendo Switch needs to be at least the power of the Xbox One or PS4 did not get your attention all this time?

Wii U failed because it was not up to the power of those systems. Now look at the Switch, again in the same situation.
The Wii U failed not because of power, dude. Understand that. It was the name itself, the lack of marketing, the game droughts, etc. All those things contributed to a much greater concern for the Wii U than the horsepower. Did it have some effect? Absolutely, but it was not THE reason the Wii U failed.

If horsepower were the reason, then the PS2 would've failed compared to the GCN and Xbox of the day. Sony came in guns blazing with a shitload of marketing, a catchy name, and despite it being a bitch to program for, developers were willing to develop for it because it sold well. Horsepower was not the reason.
 

Koenig

The Architect
The Wii U failed not because of power, dude. Understand that. It was the name itself, the lack of marketing, the game droughts, etc. All those things contributed to a much greater concern for the Wii U than the horsepower. Did it have some effect? Absolutely, but it was not THE reason the Wii U failed.

If horsepower were the reason, then the PS2 would've failed compared to the GCN and Xbox of the day. Sony came in guns blazing with a shitload of marketing, a catchy name, and despite it being a bitch to program for, developers were willing to develop for it because it sold well. Horsepower was not the reason.
Lets not forget that the Gamepad itself was often a detriment to the system in the eyes of many.
 

Goodtwin

Well-Known Member
If Switch were a straight-up console that was significantly underpowered, and Nintendo's library were still split between portable and console, and Japanese developers primarily developed for the portable instead of the console, then yes, it would be a very big problem. But the factors surrounding Switch's success seem much more reminiscent of Nintendo's path to success with their portables.

I would love to see western third parties support Switch, but these rumored specs are not in any way going to make it impossible. RAM is an area that can be tough to work around, but Switch is rumored to have 3.2GB available for games compared to 5.5GB on PS4. Cut texture resolution in half and you're good to go. The CPU is far more modern and has superior SIMD support. As we have seen with countless cross-gen games, Titanfall, Tomb Raider, GTA V, and two Call of Duty games, it's far from impossible to scale games from PS4/X1 to PS3/360-level hardware.
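The "cut texture resolution in half" trick buys more than you'd think, because texture memory scales with the square of the resolution (generic sizes below, not actual game budgets):

# Uncompressed RGBA8 as a simple example; block-compressed formats scale the same way
def texture_mib(width, height, bytes_per_texel=4):
    return width * height * bytes_per_texel / (1024 * 1024)

print(texture_mib(2048, 2048))  # 16.0 MiB
print(texture_mib(1024, 1024))  # 4.0 MiB -> halving each dimension is a 4x saving

So halving the big textures alone claws back a lot of that 5.5GB-to-3.2GB gap.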

Sent from my SM-G360V using Tapatalk
 

Juegos

All mods go to heaven.
Moderator
But the factors surrounding Switch's success seem much more reminiscent of Nintendo's path to success with their portables.
Damn, that's true. It's a lot like when the DS was going to get killed by the PSP, or the 3DS by the PSV. Both the DS and 3DS did have bad launches, but once there were enough good games and a good enough price to justify them, they blew up. The Switch is poised to come out as a powerful handheld with a unified Nintendo library, at a hopefully reasonable price, and with Nintendo games being the most appealing they've been since the NES era - I don't see how the Switch isn't successful in its first year.
 

theMightyME

Owner of The Total Screen
Damn, that's true. It's a lot like when the DS was going to get killed by the PSP, or the 3DS by the PSV. Both the DS and 3DS did have bad launches, but once there were enough good games and a good enough price to justify them, they blew up. The Switch is poised to come out as a powerful handheld with a unified Nintendo library, at a hopefully reasonable price, and with Nintendo games being the most appealing they've been since the NES era - I don't see how the Switch isn't successful in its first year.
Not to mention the switch has a ton of hype. The kind that the wii u never had.
 

Goodtwin

Well-Known Member
There does seem to be a lot more buzz surrounding Switch than Wii U, not to mention people already understand what Switch is; there is no confusion at all this time.

As much as I enjoy discussing specs, your average gamer isn't really all that interested. People are going to be impressed with how good Mario Kart looks, not pick it apart because it's not nearly as technically advanced as Uncharted 4. For years there was no way a portable could play on the TV without the results being a pixelated mess. This is no longer the case. Mobile hardware is now rendering HD content with the same feature set consoles and PCs use.

Even the Vita with its 35 GFLOPS GPU was pretty close to bridging the gap. Playing Killzone: Mercenary on the Vita's 5-inch screen looked like a PS3 game. Sure, if you blew that image up on a big LCD TV the compromises would be more noticeable, but on the small screen it looks great.

Launching Switch with Nintendo's biggest Zelda game of all time is going to set the stage. This is a serious gaming platform, not just casual and mobile titles. Get the Switch out of the gate fast and all concerns about specs go away. The chasm between the Wii and 360/PS3 was far more significant, and look how many games were ported to Wii.

Sent from my SM-G360V using Tapatalk
 

Goodtwin

Well-Known Member
A simple way to really appreciate just how much the small screen lends itself to scaled-back visuals: play a few Wii games on the GamePad. Playing Mario Galaxy on my 42-inch LCD shows some age, but throw that image down on the GamePad and the game looks damn close to Mario 3D World. Even games like COD Black Ops look really dated on the TV, but throw them onto the GamePad and suddenly they look far more modern. Try it out; it can put things into perspective real quick.

Sent from my SM-G360V using Tapatalk
 

Shoulder

Your Resident Beardy Bear
A simple way to really appreciate just how much the small screen lends itself to scaled-back visuals: play a few Wii games on the GamePad. Playing Mario Galaxy on my 42-inch LCD shows some age, but throw that image down on the GamePad and the game looks damn close to Mario 3D World. Even games like COD Black Ops look really dated on the TV, but throw them onto the GamePad and suddenly they look far more modern. Try it out; it can put things into perspective real quick.

Sent from my SM-G360V using genital warts
If there's one thing we can learn from Nintendo after all these years, it's that they know how to psychologically prepare us, whether it be from a game design standpoint (World 1-1 in SMB), or how specs don't matter so much when you play things on a smaller screen compared to your TV.

Interesting fact to note: All those games we see on everyone's smartphones on their retina displays? A lot of them don't even use the full native resolution of the screen, and are either upscaled, or simply running at a lower resolution entirely. How many people actually notice this? *mic drop*
 

Koenig

The Architect
I wonder if the Switch will be difficult to get ahold of at launch like the Wii, or be overstocked like the Wii U...

You guys have any particular plans for securing a unit at launch?
 