Nintendo Switch Spec Thread

Koenig

The Architect
The thing is though, the Wii handled that kind of item count seamlessly, so I am having a hard time swallowing that RAM was the real issue.
 

Shoulder

Your Resident Beardy Bear
I'm thinking it's the fact that you have to account for those extra 12 items to possibly be used at the same time. Imagine that every character has two sets of 3 red shells each, and may pick up more from an item box while using their current items. That's potentially 72+ red shells moving around the map, tracking opponents and causing them to tumble over on hit. Or imagine that some of these items are lightning bolts, bullet bills, bombs, and whatever else. It could get hectic, and could be a source of massive slowdown on the Wii U.

But I'm not a programmer and don't really know how items would be implemented in a game like this, so I'm purely speculating.
This is likely the answer. More items at once means more assets on screen, and thus requires more ram to store all those assets waiting to be processed by the CPU. Now the quantity of ram needed to pull this off is another story, and one I don't think we'll ever truly find out.
 

DarkDepths

Your friendly neighbourhood robot overlord
This is likely the answer. More items at once means more assets on screen, and thus requires more ram to store all those assets waiting to be processed by the CPU. Now the quantity of ram needed to pull this off is another story, and one I don't think we'll ever truly find out.
The thing is though, the actual amount of RAM is bound to be quite small, and we can make reasonable estimates. The graphical assets, which would account for the most, can be instanced, so it basically doesn't change as the number of instances increases. What you're left with is a small amount of data necessary for the individual items' states. Take a red shell, for example. It probably has the following properties:
  • instance id (32 bit integer)
  • position (96 bit, 3-component floating point vector)
  • velocity (96 bit, 3-component floating point vector)
  • target instance id (32 bit integer)
  • life span (32 bit integer)
That's a total of 288 bits, or 36 bytes, for a single Red Shell. If you have the hypothetical 72 of them going at once, that's still only 2,592 bytes, or about 2.5 KB. Given that the Wii U has 1 GB of available RAM, 2.5 KB seems like nothing in comparison.

Of course, there is a lot of RAM needed for other things going on in the game, but still... if you can't find a few KB for a core component of the game, something has gone horribly wrong!

And in fact, we have to keep in mind that this is only twice what could theoretically be done anyway. So it's not like they couldn't find 2.5 KB; it's more like they couldn't find an additional 1.25 KB, which makes it even more bizarre, I think.
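For what it's worth, that back-of-the-envelope layout can be sanity-checked with Python's struct module. This is purely a sketch of the speculative field list from the post above, not Nintendo's actual implementation:

```python
import struct

# Speculative red shell state: instance id (uint32), position (3x float32),
# velocity (3x float32), target instance id (uint32), life span (int32).
# "=" uses standard sizes with no padding.
RED_SHELL = "=I3f3fIi"

per_shell = struct.calcsize(RED_SHELL)  # 36 bytes, i.e. 288 bits
total = 72 * per_shell                  # 2592 bytes for 72 shells at once
print(per_shell, total)
```

Whatever padding or extra bookkeeping a real engine adds, the order of magnitude stays in the low kilobytes.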
 

theMightyME

Owner of The Total Screen
The thing is though, the Wii handled that kind of item count seamlessly, so I am having a hard time swallowing that RAM was the real issue.
I think it is less that ram was holding it back from that, and more that with the additional ram, they upped the performance and then said "ummm what else can we do, we still have power" but seeing as how that came from a developer at Nintendo I am going to guess that they know more than us about the ram needed for multiple items...

but yeah... I think it was less "we can finally do this thing we couldn't"... and more "we found some more things to do with what we had extra"
 

Shoulder

Your Resident Beardy Bear
The thing is though, the actual amount of RAM is bound to be quite small, and we can make reasonable estimates. The graphical assets, which would account for the most, can be instanced, so it basically doesn't change as the number of instances increases. What you're left with is a small amount of data necessary for the individual items' states. Take a red shell, for example. It probably has the following properties:
  • instance id (32 bit integer)
  • position (96 bit, 3-component floating point vector)
  • velocity (96 bit, 3-component floating point vector)
  • target instance id (32 bit integer)
  • life span (32 bit integer)
That's a total of 288 bits, or 36 bytes, for a single Red Shell. If you have the hypothetical 72 of them going at once, that's still only 2,592 bytes, or about 2.5 KB. Given that the Wii U has 1 GB of available RAM, 2.5 KB seems like nothing in comparison.

Of course, there is a lot of RAM needed for other things going on in the game, but still... if you can't find a few KB for a core component of the game, something has gone horribly wrong!

And in fact, we have to keep in mind that this is only twice what could theoretically be done anyway. So it's not like they couldn't find 2.5 KB; it's more like they couldn't find an additional 1.25 KB, which makes it even more bizarre, I think.
Because...Nintendo.
 

DarkDepths

Your friendly neighbourhood robot overlord

theMightyME

Owner of The Total Screen
so here is a longshot crackpot theory...

we talked about the idea of using the usb3 port on the switch for external processing, this was shut down by the techies here as the usb protocol isn't fast enough... but a techie user on neogaf pointed out that it is only the usb 3 protocol that isn't fast enough, and that the hardware usb 3 port could be used, in conjunction with a custom protocol, which is not unheard of, to provide a specialized function that CAN be fast enough for such use... this user even pointed out that part of the reasoning for the slower speed on the protocol is data loss over the length of the cable... but an external power device that fits inside the dock would have a cable length of only a few cm, if that.

now there is a comment from Yoshiaki Koizumi, who says that a non-dock solution for video out will not work, perhaps implying that the video out on the switch when docked itself is also a special protocol rather than usb3

when I first had the idea of an external gpu fitting inside the dock my idea wasn't that nintendo WOULD do it, but rather that they COULD, and planned for the possibility, in the same way nearly every nintendo system has extra ports and such that are, more often than not, left unused.

so perhaps this is more momentum in that direction...

http://gizmodo.com/heres-the-box-that-can-turn-your-puny-laptop-into-a-gra-1724958260
this link is about using an external gpu over thunderbolt, which runs over the usb-c connector...

apparently usb 3.1 uses the same connector as thunderbolt, so what if the hardware in the dock is 3.1... or as the neogafer posited.. a custom protocol

so before the debating starts.. let me clarify

1. I do not think nintendo WILL do this, but rather that they considered it when designing the dock, and left the possibility open
2. I am perfectly happy with what the switch spits out already, I just like speculating
3. I do not give a shit about western 3rd parties, nor do I think Nintendo "needs" to court them, I think this ideology stems from archaic precedent that might no longer be relevant with the direction nintendo has moved in.
 

Koenig

The Architect
To me it's not so much a question of whether or not it is possible, but whether or not it is actually a good idea. I would personally and objectively have to say no.
 

theMightyME

Owner of The Total Screen
To me it's not so much a question of whether or not it is possible, but whether or not it is actually a good idea. I would personally and objectively have to say no.
I agree, my theory is that SOMEBODY at nintendo likes the idea, and everyone else humored him/her, because... why not... maybe it will be a good idea in 4 years
 

Shoulder

Your Resident Beardy Bear
so here is a longshot crackpot theory...

we talked about the idea of using the usb3 port on the switch for external processing, this was shut down by the techies here as the usb protocol isn't fast enough... but a techie user on neogaf pointed out that it is only the usb 3 protocol that isn't fast enough, and that the hardware usb 3 port could be used, in conjunction with a custom protocol, which is not unheard of, to provide a specialized function that CAN be fast enough for such use... this user even pointed out that part of the reasoning for the slower speed on the protocol is data loss over the length of the cable... but an external power device that fits inside the dock would have a cable length of only a few cm, if that.

now there is a comment from Yoshiaki Koizumi, who says that a non-dock solution for video out will not work, perhaps implying that the video out on the switch when docked itself is also a special protocol rather than usb3

when I first had the idea of an external gpu fitting inside the dock my idea wasn't that nintendo WOULD do it, but rather that they COULD, and planned for the possibility, in the same way nearly every nintendo system has extra ports and such that are, more often than not, left unused.

so perhaps this is more momentum in that direction...

http://gizmodo.com/heres-the-box-that-can-turn-your-puny-laptop-into-a-gra-1724958260
this link is about using an external gpu over thunderbolt, which runs over the usb-c connector...

apparently usb 3.1 uses the same connector as thunderbolt, so what if the hardware in the dock is 3.1... or as the neogafer posited.. a custom protocol

so before the debating starts.. let me clarify

1. I do not think nintendo WILL do this, but rather that they considered it when designing the dock, and left the possibility open
2. I am perfectly happy with what the switch spits out already, I just like speculating
3. I do not give a shit about western 3rd parties, nor do I think Nintendo "needs" to court them, I think this ideology stems from archaic precedent that might no longer be relevant with the direction nintendo has moved in.
Well, to think of it another way, USB-C can also be Thunderbolt 3 since Thunderbolt 3 uses the same connector. So in that sense, it wouldn't be completely out of the realm of possibility. This is also what I posted earlier about the possibility of the Supplemental Computing Device that made rounds on GAF last week.

Now, in terms of bandwidth, TB3 has a maximum of 40 Gigabits/sec, which is only 5GB/sec, but it would allow the possibility of such a thing.

But Fried being Fried just completely shot it down thinking there was undeniably no possibility of it ever happening with the Switch.
 

DarkDepths

Your friendly neighbourhood robot overlord
what about the resolution section... an option for 30fps at 4k... drop the frame rate in half... quadruple the resolution

that seemed off to me
It wouldn't surprise me if it were true. The GameCube reportedly had special circuitry for outputting 3D images with a never released LCD screen attachment.

Note that applications can't make use of the higher resolution, according to the document. I imagine the hardware is capable of doing it, but Nintendo said "no".

Whether that's something that's made available in the future, perhaps with an add-on GPU for extra oomph, or it just sits dormant for eternity, I don't know. But I can definitely imagine the hardware being capable of doing it.
 

EvilTw1n

Even my henchmen think I'm crazy.
Moderator
A bunch of interesting stuff here:
http://nintendoeverything.com/switc...kit-capcom-wants-to-make-aaa-games-much-more/

Switch dev kit price, ease of porting to the system, Capcom and Nintendo working together throughout the hardware development process, Capcom wants to make AAA games for the system and wants to put the RE engine on Switch.
Lots of interesting tidbits in there. The hardware structure having some Wii U similarities fits right in line with Iwata's words awhile back about "adequately absorbing" the Wii U's architecture. Porting by two people in a month bodes well for this really being a friendly, not-needing-a-phonebook-sized-manual-to-understand sort of device. It's interesting to hear there was so much back and forth with Capcom. With Wii U, it seemed to be EA. I think Capcom is a more important company for Nintendo to have a strong working relationship with.

(...and yes, I'm saying that purely for Monster Hunter.)
 

Koenig

The Architect
Lots of interesting tidbits in there. The hardware structure having some Wii U similarities fits right in line with Iwata's words awhile back about "adequately absorbing" the Wii U's architecture. Porting by two people in a month bodes well for this really being a friendly, not-needing-a-phonebook-sized-manual-to-understand sort of device. It's interesting to hear there was so much back and forth with Capcom. With Wii U, it seemed to be EA. I think Capcom is a more important company for Nintendo to have a strong working relationship with.

(...and yes, I'm saying that purely for Monster Hunter.)
I wonder how many development companies Nintendo worked with during the initial development of the Switch; hopefully there were many.
 

EvilTw1n

Even my henchmen think I'm crazy.
Moderator
http://m.imgur.com/a/xZd1W
Photos of an opened up Switch.
Apparently points to 2nd Gen Maxwell architecture, whatever that means.
Also that battery is probably about as big as it could possibly be given the form factor.
@Goodtwin @DarkDepths

I see only the fan, the big-ish battery, the compactness of it all, but I'm sure you guys see much more.

http://www.neogaf.com/forum/showthread.php?t=1345524
http://www.neogaf.com/forum/showpost.php?p=230604967&postcount=53
Through 6 pages, it mostly devolved into a "you don't understand modern hardware design because of your statement on batteries!"-sort of thing. So. Par for the GAF course.
 
Last edited:

DarkDepths

Your friendly neighbourhood robot overlord
@Goodtwin @DarkDepths

I see only the fan, the big-ish battery, the compactness of it all, but I'm sure you guys see much more.

http://www.neogaf.com/forum/showthread.php?t=1345524
http://www.neogaf.com/forum/showpost.php?p=230604967&postcount=53
Through 6 pages, it mostly devolved into a "you don't understand modern hardware design because of your statement on batteries!"-sort of thing. So. Par for the GAF course.
My biggest initial takeaway is that there appears to be very limited airflow. It's kind of hard to tell the depth of the impression, but I'm imagining not a lot of space between the components and the back cover.

I'm sure they've done ample testing, I just hope it stands up to the test of time.
 

Juegos

All mods go to heaven.
Moderator
My biggest initial takeaway is that there appears to be very limited airflow. It's kind of hard to tell the depth of the impression, but I'm imagining not a lot of space between the components and the back cover.

I'm sure they've done ample testing, I just hope it stands up to the test of time.
So you're saying we're going to have to blow into the air vents before playing a game. :mgeek:
 

mattavelle1

IT’S GOT A DEATH RAY!
Moderator
That sweet Nintendo depth we always talk about in their games is now being applied to their hardware, their little secrets lying everywhere, like blowing in the vents.

How do they do it :mconfuse:
 
I would recommend blowing in your video game console vents in general.
My Wii U red ringed and I was able to save it by getting an air can and blasting it in all holes.
:^)
 

EvilTw1n

Even my henchmen think I'm crazy.
Moderator
My biggest initial takeaway is that there appears to be very limited airflow. It's kind of hard to tell the depth of the impression, but I'm imagining not a lot of space between the components and the back cover.

I'm sure they've done ample testing, I just hope it stands up to the test of time.
Now you're making me worry. :p But active cooling at all seemed a bit of a surprise at first blush, so I guess we'll take what we can get.

Also, a bit more info if you dig into GAF.
 

Goodtwin

Well-Known Member
After more and more reading, the chip could be more or less a stock Tegra X1. Thraktor at Gaf was able to identify the memory modules (we think; the pictures weren't easy to read), which look to be 4GB of RAM on a 64-bit bus, meaning 25.6GB/s bandwidth.

The cooling system is pretty slick, using a heat pipe to transfer the heat from the Tegra chip to a heat sink near the exhaust vents. The Switch is tightly packed, but there should be adequate flow for a fan that is really only designed for modest air flow. The Surface uses the same type of fan. It may seem minimal, but even moving a few cfm across a heat sink is far superior to passive cooling.

Looking at how big the battery is, and how tightly everything fits together, there is no question Nintendo chose the biggest battery they could manage in the form factor. The Switch is tightly engineered.

Sent from my SM-G360V using Tapatalk
 

DarkDepths

Your friendly neighbourhood robot overlord
@Goodtwin @EvilTw1n

Agreed that the cooling system for the CPU seems pretty slick. I'm more concerned about heat outside of the ducted components. Battery + memory + bus controllers = heat, and it doesn't really look like it has anywhere to go.
 

Goodtwin

Well-Known Member
@Goodtwin @EvilTw1n

Agreed that the cooling system for the CPU seems pretty slick. I'm more concerned about heat outside of the ducted components. Battery + memory + bus controllers = heat, and it doesn't really look like it has anywhere to go.
Pretty standard for tablets, so I am confident that Switch can adequately dispose of the heat. Somebody at Gaf mentioned the long charging time for Switch could be to keep the battery from getting real hot when charging.

Sent from my SM-G360V using Tapatalk
 

Shoulder

Your Resident Beardy Bear
I'm skimming through the GAF thread right now, and I think Nintendo and nVidia did all they could given the space constraints. I mean, DAMN that is very tightly packed. No wasted space at all, and as others have pointed out already, the biggest battery they could stuff in that tiny packaging.

All the engineers who worked on this thing deserve a standing ovation for how well designed this piece of kit really is.

 

Koenig

The Architect
Now my next question is this: How durable will the system be? Considering its portable nature, I sure hope it is able to survive the inevitable drop onto a concrete surface.
 

DarkDepths

Your friendly neighbourhood robot overlord
Pretty standard for tablets, so I am confident that Switch can adequately dispose of the heat. Somebody at Gaf mentioned the long charging time for Switch could be to keep the battery from getting real hot when charging.

Sent from my SM-G360V using genital warts
True enough. My concern stems more from the fact that tablets tend to not have particularly performant RAM, nor did the 3DS. I should stress though, I'm not really concerned. It's just a little nugget sitting in the back of my mind.
 

Cubits

Well-Known Member
So it's pretty good news that the battery isn't soldered in! Literally anyone who can wield a screwdriver can swap it out if it ever becomes a problem. It's a dead standard looking part right down to the plug, i might even be able to source a slightly juicier alternative once i have the measurements.



And after the rollercoaster of back-and-forthing from next-gen pascal to underclocked X1, NeoGAF and Reddit are currently imploding with another possible leak from China, which is claiming some really whacky specs for the SoC:



The split core speeds in undocked mode are really weird (could be a reporting bug?), as it was assumed that the customisation would be dropping the 4x A53 cores. Also, the disparity between undocked and docked CPU potential is significant, possibly too significant to provide a comparable gaming experience between handheld and docked modes?

And the FLOP outputs are staggering for a portable (no phone or tablet, at all, is touching it)! Corrected for architecture differences, the docked mode is almost Xbone-powerful, thanks largely to the doubling of CUDA cores over the stock X1. That's mental! Downscaling the graphics for portable mode might take some serious finagling, but if this is real then it's understandable how FromSoft might have DS3 running in some capacity, at least in docked mode.

If the GPU is really that grunty, then it makes me wonder how it wouldn't be throttled tremendously by the relatively meagre RAM bandwidth. Nintendo likes to produce balanced hardware, so i'd have to assume there's some way they're closing the gap between the expected ~26GB/s and the PS4's 140GB/s. Granted, they're not even using the same type of RAM, but either the PS4 is using brute force, or nvidia has something clever going on.

I'm still thinking the SoC is more likely to be similar to that found in the shield TV, that would be plenty of grunt for what we've seen running on the thing, but the leaks out of china have been pretty spot-on so far and this diagnostic screen looks pretty legit.
 
Last edited:

Shoulder

Your Resident Beardy Bear
So it's pretty good news that the battery isn't soldered in! Literally anyone who can wield a screwdriver can swap it out if it ever becomes a problem. It's a dead standard looking part right down to the plug, i might even be able to source a slightly juicier alternative once i have the measurements.



And after the rollercoaster of back-and-forthing from next-gen pascal to underclocked X1, NeoGAF and Reddit are currently imploding with another possible leak from China, which is claiming some really whacky specs for the SoC:



The split core speeds in undocked mode are really weird (could be a reporting bug?), as it was assumed that the customisation would be dropping the 4x A53 cores. Also, the disparity between undocked and docked CPU potential is significant, possibly too significant to provide a comparable gaming experience between handheld and docked modes?

And the FLOP outputs are staggering for a portable (no phone or tablet, at all, is touching it)! Corrected for architecture differences, the docked mode is almost Xbone-powerful, thanks largely to the doubling of CUDA cores over the stock X1. That's mental! Downscaling the graphics for portable mode might take some serious finagling, but if this is real then it's understandable how FromSoft might have DS3 running in some capacity, at least in docked mode.

If the GPU is really that grunty, then it makes me wonder how it wouldn't be throttled tremendously by the relatively meagre RAM bandwidth. Nintendo likes to produce balanced hardware, so i'd have to assume there's some way they're closing the gap between the expected ~26GB/s and the PS4's 140GB/s. Granted, they're not even using the same type of RAM, but either the PS4 is using brute force, or nvidia has something clever going on.

I'm still thinking the SoC is more likely to be similar to that found in the shield TV, that would be plenty of grunt for what we've seen running on the thing, but the leaks out of china have been pretty spot-on so far and this diagnostic screen looks pretty legit.
I was checking out that GAF thread last night, and some of the more well-known technical guys were insistent that the memory was 4GB of 32-bit modules on a 64-bit bus, which gives that 25.6GB/sec bandwidth number we keep hearing about.

Some other GAFers, however, were saying that since a stock Shield uses a single 3GB RAM module, it is possible Nintendo is using two of those for a total of 6GB of LPDDR4 on a 128-bit bus, which equates to almost 60GB/sec of bandwidth, which is very significant over the other figure.
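Both figures fall straight out of bus width × transfer rate. A sketch of the arithmetic; the 3200MT/s and 3733MT/s LPDDR4 data rates are my assumptions chosen to reproduce the quoted numbers, not confirmed Switch specs:

```python
def peak_bandwidth_gbs(bus_bits, mega_transfers_per_s):
    # Bytes per transfer (bus width / 8) times transfers per second,
    # expressed in GB/s.
    return bus_bits / 8 * mega_transfers_per_s / 1000

print(peak_bandwidth_gbs(64, 3200))   # 25.6 GB/s  (rumored 64-bit config)
print(peak_bandwidth_gbs(128, 3733))  # ~59.7 GB/s ("almost 60GB/sec")
```

So the two camps aren't really arguing about the math, just about which bus width the photos actually show.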

If what you are suggesting is true with the X1 having more CUDA cores than a stock X1, then it's possible Nintendo went with more ram as overhead, but I'm not so sure. Based on the leaked photos, GAFers were looking at the serial numbers, and while it is not confirmed, it does suggest the Switch has 4GB, and not 6GB like some were theorizing.

There are clues from developers that the Switch might be more capable than we are expecting, based on how some were saying they could not believe the Switch was that cheap. Yes, cheap. Not expensive, but cheap. When the price was announced, some were quick to say it is too expensive, but we know nothing of the internals of the hardware; the developers do, and they said it was cheap.

Honestly, I'm just very curious when the system does launch, and someone takes some die photos of the chip, so we can actually see what was customized for it.
 

sjmartin79

White Phoenix of the Crown
I was checking out that GAF thread last night, and some of the more well-known technical guys were insistent that the memory was 4GB of 32-bit modules on a 64-bit bus, which gives that 25.6GB/sec bandwidth number we keep hearing about.

Some other GAFers, however, were saying that since a stock Shield uses a single 3GB RAM module, it is possible Nintendo is using two of those for a total of 6GB of LPDDR4 on a 128-bit bus, which equates to almost 60GB/sec of bandwidth, which is very significant over the other figure.

If what you are suggesting is true with the X1 having more CUDA cores than a stock X1, then it's possible Nintendo went with more ram as overhead, but I'm not so sure. Based on the leaked photos, GAFers were looking at the serial numbers, and while it is not confirmed, it does suggest the Switch has 4GB, and not 6GB like some were theorizing.

There are clues from developers that the Switch might be more capable than we are expecting, based on how some were saying they could not believe the Switch was that cheap. Yes, cheap. Not expensive, but cheap. When the price was announced, some were quick to say it is too expensive, but we know nothing of the internals of the hardware; the developers do, and they said it was cheap.

Honestly, I'm just very curious when the system does launch, and someone takes some die photos of the chip, so we can actually see what was customized for it.


*Nods encouragingly* Mmmm hmmm. Yes, that's what I thought too.
(Yeah, I don't get all the technical stuff, but I like the fact that you guys do.)
 

EvilTw1n

Even my henchmen think I'm crazy.
Moderator


*Nods encouragingly* Mmmm hmmm. Yes, that's what I thought too.
(Yeah, I don't get all the technical stuff, but I like the fact that you guys do.)
That's one of the benefits of our forum, I think (you get people who know what they're talking about, without the 85% of users who don't but post semantics arguments, like on GAF). I'm happy to read Cubits/DD/GT/Shoulder/et al. because I wanna learn from them. It's one of the things we were able to preserve from IGN - most of my very meager understanding of design and technology comes from Indy83 and VesselOfLight. It's a healthy part of a forum, I think.

It'll be interesting once this thing is released into the wild and people are analyzing the SoC itself.
 

Goodtwin

Well-Known Member
There is still a ton of speculation being done over at Gaf, most of it based on very little. The size of the chip along with the memory chips seems to point to the Tegra X1. The fact that there are two RAM modules isn't that odd. I believe LPDDR4 RAM is not offered in 4GB chips; once Nintendo opted for more than 3GB, two chips were mandatory. It seems most likely that the Nvidia API and tools are really, really good. It's not that crazy, really. Not that long ago we were seeing PS4 and Xbox One share games with previous-generation consoles. Switch still far exceeds previous-gen hardware, and an easy-to-develop-for platform is more important than brute force for a good deal of games in development these days.

Sumo Digital has Snake Pass at 1080p 30fps on Switch, and 1080p 60fps on PS4. Otherwise the games look very similar.

Edit:
I was wrong. Samsung offers single chips as large as 8GB, and does indeed have a 4GB chip. It does seem odd to use two chips, unless they are in fact using a 128-bit bus.


Sent from my SM-G360V using Tapatalk
 
Last edited:

simplyTravis

Lamer Gamers Podcast Co-Host
There is still a ton of speculation being done over at Gaf, most of it based on very little. The size of the chip along with the memory chips seems to point to the Tegra X1. The fact that there are two RAM modules isn't that odd. I believe LPDDR4 RAM is not offered in 4GB chips; once Nintendo opted for more than 3GB, two chips were mandatory. It seems most likely that the Nvidia API and tools are really, really good. It's not that crazy, really. Not that long ago we were seeing PS4 and Xbox One share games with previous-generation consoles. Switch still far exceeds previous-gen hardware, and an easy-to-develop-for platform is more important than brute force for a good deal of games in development these days.

Sumo Digital has Snake Pass at 1080p 30fps on Switch, and 1080p 60fps on PS4. Otherwise the games look very similar.

Edit:
I was wrong. Samsung offers single chips as large as 8GB, and does indeed have a 4GB chip. It does seem odd to use two chips, unless they are in fact using a 128-bit bus.


Sent from my SM-G360V using genital warts
I seem to remember that RAM is often better utilized by using 2 smaller modules opposed to 1 large one. Maybe that is the reason?
 

Goodtwin

Well-Known Member
I seem to remember that RAM is often better utilized by using 2 smaller modules opposed to 1 large one. Maybe that is the reason?
The most popular consensus seems to come down to cost. Two 2GB chips could actually be cheaper than a single 4GB chip. I also heard that the interface might be cheaper as well, so cost is most likely the reason.

Sent from my SM-G360V using Tapatalk
 

Cubits

Well-Known Member
I was checking out that GAF thread last night, and some of the more well-known technical guys were insistent that the memory was 4GB of 32-bit modules on a 64-bit bus, which gives that 25.6GB/sec bandwidth number we keep hearing about.

Some other GAFers, however, were saying that since a stock Shield uses a single 3GB RAM module, it is possible Nintendo is using two of those for a total of 6GB of LPDDR4 on a 128-bit bus, which equates to almost 60GB/sec of bandwidth, which is very significant over the other figure.
Oh yeah, i totally forgot that the shield TV used a single chip! I thought it was a 2x2 setup, which should have yielded significantly better bandwidth, which is why i was worried there was some bizarre bottleneck going on.

But the switch clearly has two chips, and there's no way they're going to be slower if the GPU is going so extreme. So nothing to worry about there.

On the CPU front, the A57 cores in the X1/Switch are, clock-for-clock, significantly more powerful than the jaguar cores in the ps4/xbone. If they're clocked to 2ghz as it's shown here, then the four core setup will be within reach of the full PS4 CPU for the vast majority of tasks (and WAY ahead on big single-thread tasks).

4xA57 @ 2GHZ - https://browser.primatelabs.com/v4/cpu/search?q=shield tv
4xJaguar @ 2GHZ - https://browser.primatelabs.com/v4/cpu/search?utf8=✓&q=a6-5200

Of course, the limiting factor will be the handheld mode, but there might be enough capacity to pull it off!
 

Goodtwin

Well-Known Member
Well, I'm sticking with Eurogamer's clock speeds as 90% likely at this point. So how well do four 1GHz A57 cores hold up to eight 1.6GHz Jaguar cores? The benchmarks posted by Cubits make me think not too terribly at all. The ARM FPU units are actually really good, so SIMD code should perform pretty well on Switch, compared to the lackluster SIMD performance of the Wii U CPU.

All in all, I think Switch will split the difference between current-gen and last-gen consoles fairly well. I like the in-game results we are seeing. If Switch sells well, even Western developers will be channeling some level of support to Switch. Will we see Battlefield? Maybe not, but games that make more sense, like Garden Warfare, are certainly possible on Switch.

Sent from my SM-G360V using Tapatalk
 

Shoulder

Your Resident Beardy Bear
Oh yeah, i totally forgot that the shield TV used a single chip! I thought it was a 2x2 setup, which should have yielded significantly better bandwidth, which is why i was worried there was some bizarre bottleneck going on.

But the switch clearly has two chips, and there's no way they're going to be slower if the GPU is going so extreme. So nothing to worry about there.

On the CPU front, the A57 cores in the X1/Switch are, clock-for-clock, significantly more powerful than the jaguar cores in the ps4/xbone. If they're clocked to 2ghz as it's shown here, then the four core setup will be within reach of the full PS4 CPU for the vast majority of tasks (and WAY ahead on big single-thread tasks).

4xA57 @ 2GHZ - https://browser.primatelabs.com/v4/cpu/search?q=shield tv
4xJaguar @ 2GHZ - https://browser.primatelabs.com/v4/cpu/search?utf8=✓&q=a6-5200

Of course, the limiting factor will be the handheld mode, but there might be enough capacity to pull it off!
Pretty much. But yeah, I honestly don't think the Switch will have 6GB of RAM, but when you think about how similar the Switch and the Shield TV are (at least at first glance), you'd almost think it would be cost-effective to stay with the same type of RAM module, but then add one more to compensate. Based on what we read about Capcom having some influence on the hardware of the system, it is at least possible that the Switch originally had 3GB of RAM, but Capcom said that wasn't enough, so at minimum they bumped it up to 4GB.

Now, part of the reason some GAFers are holding onto the 4GB figure is not just the rumors/leaks, but also the recent photos. They were looking at the serial/code numbers on the RAM modules, and while you cannot definitively say for sure what the numbers/letters are, you can get a good idea of what you're seeing, and from the sound of it, it does appear to be 4GB, which is still more than the Shield TV. 6GB would obviously be a good way to help future-proof the system, but who knows?

Just for the sake of argument though, I think it would be interesting to see the Switch having 6GB of ram just to confuse the know-it-all types on GAF...which is almost everyone there. :mcool:
 