The GTX 1660 Ti is a terrible, terrible name. It's confusingly similar to Nvidia's existing GTX 1060, implying it's some sort of weird halfway house between that and the GTX 1070, when in fact it's actually part of Nvidia's new Turing family - the same GPU architecture inside their new RTX 20-series cards - making it, in effect, much more closely related to the RTX 2060 than anything else. It's a mess, really, and isn't helped by Nvidia's somewhat weak justification that a new 16-series makes sense because 16 is closer to 20 than 10 (seriously).

But! Naming conventions aside, the GTX 1660 Ti is actually a pretty swish graphics card - and potentially even a best graphics card contender for those after top notch quality at 1920x1080. Priced from £260 / $279, that's £70 / $70 less than the cheapest RTX 2060, and much more in line with what the GTX 1060 originally cost all those years ago before graphics card prices went a bit loopy.

And yet this little tiddler (or not so tiddly in the case of Asus' ROG Strix OC edition) isn't just more powerful than the GTX 1060. It's also even nippier than the GTX 1070, and in a lot of cases brushes up against the RTX 2060. Which ain't bad for a card under £300 / $300.

Of course, a lot of you might be wondering: why not just call this thing the RTX 2050 and have done with it? There's good reason for that, and it's because the GTX 1660 Ti's Turing GPU doesn't actually have any of the RTX stuff that makes the RTX cards RTX - stuff like RT Cores for real-time ray tracing, or Tensor Cores for Nvidia's performance-boosting DLSS tech. None of that's here for the GTX 1660 Ti, hence why it's been dubbed a GTX card instead of an RTX one.

The GTX 1660 Ti will come in all sorts of shapes and sizes, making it a great fit for mini ITX cases as well as regular ATX PCs.

What it does have, however, is Turing's fancy pants shaders, allowing for other performance-enhancing techniques such as variable rate shading, motion adaptive shading and content adaptive shading. Similar to DLSS, these take some of the load off the GPU by identifying which bits of a scene don't need as much rendering (or colouring in, so to speak) as others, allowing the GPU to focus on what does need filling in, such as detail-heavy control panels, instead of putting lots of wasted effort into rendering simpler objects such as blank walls. You can read more about how it all works in my RTX guide, but from my own experience of it in Wolfenstein II, there's barely any loss in overall visual quality, and you get a frame rate boost of around 15-20fps in the process.

Not that Wolfenstein II is particularly hard to run in the first place, mind, but if it's indicative of the kind of boost you'll see in other games, it certainly sounds promising. The problem is that, like most of the new features ushered in on Nvidia's RTX cards, it needs specific developer implementation and isn't just available by default, leaving us with very little to actually test. In fact, Wolfenstein II is the only game that supports it right now, but Nvidia have assured me that more will follow. Still, I'd argue this isn't such a big deal for the GTX 1660 Ti, as the card's excellent raw performance and comparatively low price are what really make it such an attractive proposition over Nvidia's current 10-series.

Not all GTX 1660 Ti cards will have the same configuration of display outputs, either. The Asus Phoenix comes with a DVI output, for example, while their ROG Strix OC edition swaps it for a second DisplayPort.
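To make the content adaptive shading idea above a bit more concrete, here's a minimal Python sketch of the core decision it describes: split a frame into tiles, measure how much detail each tile contains, and shade flat regions (blank walls) at a coarser rate than busy ones (control panels). This is purely illustrative - the tile size, variance threshold, and rate labels are invented for the demo, and Nvidia's real implementation lives in the driver and hardware, not in application code like this.

```python
TILE = 4  # tile size in pixels (hypothetical value for the demo)

def tile_variance(pixels, x0, y0, size):
    """Mean squared deviation of luminance within one tile."""
    vals = [pixels[y][x]
            for y in range(y0, y0 + size)
            for x in range(x0, x0 + size)]
    mean = sum(vals) / len(vals)
    return sum((v - mean) ** 2 for v in vals) / len(vals)

def choose_shading_rates(pixels, threshold=10.0):
    """Return a grid of per-tile shading rates: '1x1' (full rate) for
    detail-heavy tiles, '2x2' (one shade shared by four pixels) for
    flat regions that won't visibly suffer from coarser shading."""
    h, w = len(pixels), len(pixels[0])
    rates = []
    for y0 in range(0, h, TILE):
        row = []
        for x0 in range(0, w, TILE):
            var = tile_variance(pixels, x0, y0, TILE)
            row.append("1x1" if var > threshold else "2x2")
        rates.append(row)
    return rates

# A tiny 4x8 "frame": a flat wall on the left, noisy detail on the right.
frame = [
    [50, 50, 50, 50, 10, 90, 20, 80],
    [50, 50, 50, 50, 85, 15, 70, 30],
    [50, 50, 50, 50, 25, 95,  5, 60],
    [50, 50, 50, 50, 75, 35, 88, 12],
]
print(choose_shading_rates(frame))  # → [['2x2', '1x1']]
```

The flat left tile has zero variance and gets the coarse 2x2 rate, while the noisy right tile keeps full 1x1 shading - the same "spend effort where it shows" trade-off the hardware feature makes every frame.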