
Nvidia could be preparing an affordable $200 GPU to take on AMD

Nvidia could be set to introduce the GeForce GTX 1630 graphics card, which is apparently Team Green’s answer to certain entry-level boards from archrival AMD.

According to a report from VideoCardz, the GTX 1630 is due to become the first x30 variant of the long-standing GTX series.

Nvidia GeForce GTX graphics card inside a PC.

While Nvidia did release the GT 730 and GT 1030 boards in the past, those models date back to 2014 and 2017, respectively. VideoCardz also points out that Nvidia's last-gen, Turing-based GTX 1650 launched back in April 2019.

After three years, it seems Team Green is finally set to introduce a much-needed refresh, and it comes at an opportune time for the GPU giant. Although GPU prices are starting to stabilize, many cards still cost more than their manufacturer's suggested retail price (MSRP). As such, there's huge demand for entry-level graphics cards, particularly sub-$200 models.


The report notes that AMD, which launched its Radeon RX 6500 XT and RX 6400 boards within a similar time frame, has recently drawn attention to Nvidia's offerings in the low-end GPU segment.

AMD released a chart arguing that its Radeon RX 6000-series GPUs deliver better performance per dollar than Nvidia's competing cards, including the RTX 30 range. Notably, the older GTX 1650 and GTX 1050 Ti were singled out as offering inferior performance against the RX 6500 XT and RX 6400, respectively.

In any case, VideoCardz observed that Nvidia has apparently shunned the low-end market amid the skyrocketing popularity of its more powerful midrange options. The RTX 3050, for example, only hit store shelves in January 2022, more than a year after the first Ampere-powered graphics cards became available.

That said, there's a caveat: The card still can't be purchased at its $249 MSRP, and instead sells for inflated prices of around $330.

A comparison of certain Nvidia GPUs against AMD's boards. Image: VideoCardz/AMD

As for the GeForce GTX 1630, the card is all but confirmed to replace the GTX 1050 Ti, according to VideoCardz's sources. As a result, expect a price point below the $190 mark, which is the current market rate for the GTX 1650.

Although the website was unable to get its hands on technical specifications, it speculated that the GTX 1630 could be a TU117-based (Turing) model with a sub-75-watt power requirement and updated GDDR6 memory.

There has been a considerable amount of discussion surrounding the sub-$200 GPU market as of late. For example, Intel’s desktop Arc Alchemist range of boards could introduce some new options for gamers looking for an affordable entry-level video card, but it’s a long wait before these products hit the market.

And of course, next-gen is just a few months away. We've heard that the RDNA 2-based Radeon RX 6900 XT, which retails for $999, will be followed by a model that delivers similar performance for half the cost.

If rumors like these turn out to be true, it may make more sense to simply wait a little while longer before upgrading or building a new gaming setup.

Update:

VideoCardz has now obtained the exact technical specifications, as well as a launch date, for the GeForce GTX 1630. According to its sources, the card will be based on Nvidia's 12nm Turing TU117-150 silicon.

As such, its specifications differ from those of the TU117-300-powered GTX 1650: the core count drops to 512 CUDA cores (down from 896), and the memory bus narrows to 64-bit (from 128-bit).

That said, one area where the GTX 1630 will reportedly beat the GTX 1650 series is boost clock, which is said to reach 1800 MHz.

The GTX 1630 will also reportedly come with 4GB of GDDR6 memory clocked at 12Gbps, which, across the 64-bit bus, would push bandwidth to 96 GB/s.
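For reference, that 96 GB/s figure follows from the usual GDDR bandwidth formula: bus width in bits divided by 8, multiplied by the per-pin data rate. The short sketch below is only a back-of-the-envelope check of the rumored numbers (the helper name is ours, not anything Nvidia or VideoCardz has published):

```python
# Back-of-the-envelope check of the rumored GTX 1630 memory bandwidth.
# Peak bandwidth (GB/s) = (bus width in bits / 8) * effective data rate (Gbps per pin).

def memory_bandwidth_gbs(bus_width_bits: int, data_rate_gbps: float) -> float:
    """Peak theoretical bandwidth in GB/s for a GDDR memory interface."""
    return (bus_width_bits / 8) * data_rate_gbps

# Rumored GTX 1630: 64-bit bus with 12Gbps GDDR6.
print(memory_bandwidth_gbs(64, 12))   # 96.0 GB/s

# For comparison, the original GDDR5 GTX 1650: 128-bit bus at 8Gbps.
print(memory_bandwidth_gbs(128, 8))   # 128.0 GB/s
```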

Finally, VideoCardz's sources suggest a May 31 launch date for the board, so expect the product to make an appearance at Nvidia's Computex keynote.
