The RTX 40 series is in full swing, and we are still awaiting the budget-friendly mid- and low-end segments of the Ada Lovelace architecture. Even so, rumours are starting to spill about the upcoming RTX 50 series GPUs, codenamed Blackwell, claiming more than double the performance of the 40 series graphics cards:


**NVIDIA won't follow AMD with its chip design.**


AMD's RDNA 3 GPUs are built on an MCM (multi-chip module) design which, in the simplest of terms, splits the GPU into several smaller chiplets packaged together instead of one large monolithic die. AMD uses a similar chiplet approach in its Ryzen CPUs, and its Ryzen X3D models go further by stacking extra cache on top of the die. The approach has proven more efficient, improving yields and making it easier to pack in more transistors without manufacturing one enormous chip. Early rumours suggested that the RTX 50 series would adopt this technology. However, NVIDIA is convinced it can still compete with a traditional monolithic GPU die. Well-known leaker [Kopite7kimi](https://twitter.com/kopite7kimi/status/1549382169302564865) pointed to a traditional die nearly a year ago. Even so, at some point, multi-chip designs are likely to become an industry-wide standard in CPU and GPU manufacturing.

**2x performance boost and specifications.**


The RTX 50 series is rumoured to be more than twice as fast as the 40 series across the board. According to RedGamingTech's private sources, Blackwell GPUs will feature a massive overhaul of the CUDA architecture, although it is described as a partial redesign rather than a ground-up rebuild. "The Blackwell SM (Streaming Multiprocessor) units are expected to get a new structure, and further optimizations and additions will be made to Ray/Path Tracing hardware units."

For those who might be confused about **Path Tracing**: it's a more "complete" form of Ray Tracing. While Ray Tracing calculates light bounces for a fixed set of rays and effects chosen by developers, Path Tracing does the same on a much larger scale: it traces full light paths from every light source, producing far more accurate and realistic ray-traced reflections, shadows and ambient occlusion. The technology is also far more demanding. So far it has only been introduced in Portal with RTX and is on its way as a free update to Cyberpunk 2077. To put things into perspective and help you understand how demanding the technology is, an RTX 4090 runs Cyberpunk 2077 with Path Tracing enabled at around 16 FPS on average without DLSS and Frame Generation.
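If you want a feel for why that difference matters, here is a minimal, purely illustrative Python sketch. Every function name and the toy one-value "scene" are made up for this example; a real renderer works on actual geometry and materials, but the shape of the two loops captures the idea: ray tracing follows a few fixed bounces, while path tracing averages many randomly sampled light paths per pixel.

```python
import random

# Hypothetical, heavily simplified stand-ins for a real renderer's
# intersection and shading routines; they exist only so the sketch runs.
def trace_to_nearest_surface(ray):
    # Pretend every ray hits a grey surface that emits a little light.
    return {"albedo": 0.7, "emission": 0.1}

def next_ray_direction(hit):
    # A real renderer would pick reflection/shadow rays (ray tracing)
    # or sample the surface's BRDF (path tracing); this is a stand-in.
    return random.random()

def classic_ray_trace(ray, bounces=2):
    """Developer-tuned ray tracing: follow a small, fixed number of
    bounces for specific effects (reflections, shadows), then stop."""
    colour, weight = 0.0, 1.0
    for _ in range(bounces):
        hit = trace_to_nearest_surface(ray)
        colour += weight * hit["emission"]
        weight *= hit["albedo"]
        ray = next_ray_direction(hit)
    return colour

def path_trace(ray, max_depth=16, samples=64):
    """Path tracing: average many random light paths per pixel, each
    followed until it fades out, so all lighting emerges from one loop."""
    total = 0.0
    for _ in range(samples):
        colour, weight, depth, r = 0.0, 1.0, 0, ray
        while depth < max_depth and weight > 0.01:
            hit = trace_to_nearest_surface(r)
            colour += weight * hit["emission"]
            weight *= hit["albedo"]
            r = next_ray_direction(hit)
            depth += 1
        total += colour
    return total / samples

if __name__ == "__main__":
    print("fixed-bounce ray trace:", classic_ray_trace(ray=0.0))
    print("path trace estimate:  ", path_trace(ray=0.0))
```

The samples-times-bounces loop inside `path_trace` is where the extra GPU cost comes from, which is why dedicated Path Tracing hardware and upscaling technologies matter so much for this rendering mode.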

The RTX 50 series is said to support **GDDR7** memory, but it might instead incorporate Samsung's new **GDDR6W** memory, depending on which is more cost-effective and offers higher capacities. With games demanding more and more VRAM, we can expect high-end models to ship with **32GB** or even more video memory. The RTX 50 series is also rumoured to use the **PCIe Gen 5** interface and offer **3GHz+** clock speeds.

There is no news about new versions of DLSS or Frame Generation. However, we expect both technologies to receive upgrades geared towards Path Tracing rendering.

**NVIDIA will use an expensive 3nm node.**


NVIDIA plans to shrink its process from 5nm on the 40 series to 3nm on the 50 series GPUs, which should again improve power efficiency and performance. TSMC has commented that 3nm node orders are at capacity and that the manufacturing process is going better than expected. Unfortunately, industry insiders point out that the new node is bound to be roughly 25% more expensive than the 5nm node used for the RTX 40 series. According to pricing leaks, a 5nm wafer costs $16,000, while a 3nm wafer is supposed to cost $20,000. This will inevitably translate into another generational price hike for next-gen GPUs.
https://stock-checker.com/uploads/RTX50Series3nmChipLeak.png
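As a quick sanity check, the 25% figure follows directly from the leaked wafer prices. The short sketch below redoes the arithmetic; the dies-per-wafer number is a hypothetical placeholder included only to show how wafer cost flows into per-chip cost, not a leaked value.

```python
# Rough wafer-cost arithmetic based on the leaked figures above.
WAFER_COST_5NM = 16_000  # USD per wafer (leaked figure)
WAFER_COST_3NM = 20_000  # USD per wafer (leaked figure)

increase = (WAFER_COST_3NM - WAFER_COST_5NM) / WAFER_COST_5NM
print(f"Wafer cost increase: {increase:.0%}")  # -> 25%

# Hypothetical example: if a wafer yielded ~80 usable dies on both nodes,
# the raw silicon cost per GPU die would rise by the same proportion.
DIES_PER_WAFER = 80  # placeholder, not a leaked value
print(f"5nm die cost: ${WAFER_COST_5NM / DIES_PER_WAFER:,.0f}")
print(f"3nm die cost: ${WAFER_COST_3NM / DIES_PER_WAFER:,.0f}")
```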

**The cooler design will stay the same.**


Since NVIDIA is sticking with a traditional monolithic die for its GPUs, it is widely speculated that we won't see a new cooler design for the Founders Edition models. The improved 40 series FE cooler was highly praised and a significant step up from the 30 series. If power and thermal requirements stay within similar limits on the RTX 50 series, NVIDIA might keep the same design to save on engineering time and manufacturing costs.

**Blackwell series reveal at GTC 2024.**


NVIDIA releases its next-gen GPUs roughly every two years, and NVIDIA's Ian Buck has semi-confirmed that there will be a significant announcement at GTC 2024. However, this most likely refers to a server-level introduction first: GTC 2024 will mark two years since the introduction of NVIDIA Hopper, a server-level GPU architecture. Even so, it should at least give us a glimpse of what to expect from the gaming line-up.

**Disclaimers**
[Jakub Dominik](https://twitter.com/Hexagon90x) is a gaming and technology journalist for Stock Checker. Images used in this article belong to Digitimes.