PNY offers Geforce RTX30 series graphics cards

Earlier today, graphics card powerhouse nVidia announced its new Geforce RTX 30 series cards and they look pretty damn good, if I do say so myself.

I was contemplating picking up an RTX2060 or RTX2070 later this year. nVidia for some reason hasn't made New Zealand pricing available, but it sounds like an RTX3070 will cost around $AU800 (probably closer to $NZ850), so while not cheap, the new cards seem competitively priced compared to what the RTX2000 series cards cost at launch.

The RTX3090, however, sounds like it’ll need a small mortgage to cover the cost so I suspect it’s not considered a consumer-level card.

Hot on the heels of nVidia's announcement, memory and GPU manufacturer PNY has announced its own line-up of RTX 30 series cards in its XLR8 Gaming series: the RTX 3090, RTX 3080 and RTX 3070, all powered by the all-new NVIDIA Ampere architecture.

nVidia says the new RTX 30 Series GPUs, the second generation of RTX, feature new RT Cores, Tensor Cores and streaming multiprocessors, bringing stunning visuals, amazingly fast frame rates and AI acceleration to games and creative applications.

In terms of overclocking and RGB customisation, PNY says its XLR8 Gaming GeForce RTX 30 Series is compatible with PNY's VelocityX overclocking software, which lets you customise and monitor critical stats like core clock, memory clock, core temperature, fan speed and RGB lighting, aiming for the perfect balance of performance and efficiency.
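VelocityX is PNY's own Windows utility, but if you're curious how that kind of monitoring works, NVIDIA's NVML library exposes the same read-only stats on any GeForce card. Below is a minimal sketch (not VelocityX itself) using the pynvml Python bindings, assuming they're installed; it only reads the clocks, temperature and fan speed and doesn't touch any overclocking or RGB settings.

```python
# Minimal sketch: poll the kinds of stats an overclocking tool monitors
# (core clock, memory clock, temperature, fan speed) via NVIDIA's NVML library.
# Read-only - nothing here changes clocks or fan curves.
import pynvml

pynvml.nvmlInit()
try:
    handle = pynvml.nvmlDeviceGetHandleByIndex(0)  # first GPU in the system

    name = pynvml.nvmlDeviceGetName(handle)
    if isinstance(name, bytes):  # older bindings return bytes, newer return str
        name = name.decode()

    core_clock = pynvml.nvmlDeviceGetClockInfo(handle, pynvml.NVML_CLOCK_GRAPHICS)  # MHz
    mem_clock = pynvml.nvmlDeviceGetClockInfo(handle, pynvml.NVML_CLOCK_MEM)        # MHz
    temp = pynvml.nvmlDeviceGetTemperature(handle, pynvml.NVML_TEMPERATURE_GPU)     # degrees C
    fan = pynvml.nvmlDeviceGetFanSpeed(handle)                                      # % of max

    print(f"{name}: core {core_clock} MHz, memory {mem_clock} MHz, {temp}C, fan {fan}%")
finally:
    pynvml.nvmlShutdown()
```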

Here’s what PNY has to offer in the range:

PNY XLR8 Gaming GeForce RTX 3090

    • 24GB memory
    • Triple-fan cooler
    • PCIe 4.0
    • GDDR6X
    • EPIC-X RGB™
    • Overclocking: via VelocityX Software

PNY XLR8 Gaming GeForce RTX 3080

    • 10GB memory
    • Triple-fan cooler
    • PCIe 4.0
    • GDDR6X
    • EPIC-X RGB
    • Overclocking: via VelocityX Software

PNY XLR8 Gaming GeForce RTX 3070

    • 8GB memory
    • Triple-fan and dual-fan variants
    • PCIe 4.0
    • GDDR6
    • EPIC-X RGB on the triple-fan version
    • Overclocking: via VelocityX Software

PNY says its RTX3090 will be available from late September, the RTX3080 from mid-September and the RTX3070 from mid-October, from mWave.com.au in Australia and from www.pbtech.co.nz/ in New Zealand.

So … I’m sticking with my GTX660Ti and here’s why

A while back, I wrote about contemplating upgrading my current GTX660Ti graphics card with either the GTX950 that I'd won in a YouTube competition (yeah, I know, right?) or something like a GTX1060 or a Radeon RX480. I'm getting back into PC gaming and, rightly or wrongly, I didn't think my current GPU was up to the task.

Sounds like a simple thing, right? Well, not really, as things worked out.

I've swapped out graphics cards before – it's one of the easiest things you can upgrade on a PC: You simply remove the old card from the PCI-E slot on your PC's motherboard, slot in the new one, connect the power and then boot up your computer. Easy.

Well, not as far as installing the GTX950 went. Long story short, I didn't get a signal to my monitor with the new card installed (the fans on the 950 didn't even power up), but when I put my 660Ti back in, things were sweet. It seems the original GTX950 was faulty, so after months of emails with MSi support I eventually got a replacement card and installed it, crossing my fingers in the process.

This new GTX950 didn't work either. I visited the nVidia ANZ forums with my problem. It's a great community and I got a lot of good suggestions, but none of them worked. Someone suggested looking for a new motherboard, which was an option, but I was hoping for a simple fix. So, I swallowed my pride and did what many PC enthusiasts wouldn't want to do: took it to my local computer repair guy.

Long story short (again), after a couple of days with the technician it seems that my Intel DZ77GA-70K motherboard – a four-year-old board that is no longer supported by Intel (thanks for that) – just won't accept the newer GTX950.

The GTX 660Ti is based on nVidia's Kepler architecture while the GTX950 uses the newer Maxwell architecture, and it seems that my Intel board can't be updated to accommodate the newer card. Frankly, that sucks on Intel's part. How hard would it be for them to issue a BIOS update that accepts the newer card? (I'm not a programmer or computer scientist, so I've no idea how hard that would actually be.)

It's frustrating, but I don't have the funds to upgrade my motherboard – which would also mean new RAM and a new CPU (because the current CPU won't work on a new board) – as well as buy a new GPU. So, at this point in time, I'm sticking with the GTX660Ti. I think I'm happy with that, too.

It's a great card: it's got 3GB of VRAM, but at four years old it's just not considered cutting edge anymore.

That said, I picked up Titanfall 2 for PC the other day (I took a punt) and, you know what? I can run it on my GTX660Ti on medium to high settings (mostly high) and I'm getting consistent frame rates (I haven't run FRAPS or anything to determine exactly what FPS I'm getting, but it's running as smooth as butter).

The minimum nVidia GPU for Titanfall 2 is a GTX660 while the recommended one is the GTX1060, so I'm not far off the minimum, but it's all running mighty smooth to me. Sure, I've had a couple of crashes to the desktop, but that's part and parcel of PC gaming, right?

I also completed Gears of War 4 last month with a mixture of mostly high settings and it was sitting around the 45FPS mark (the PC version of GOW4 is amazingly customisable, which helps). It seems my four-year-old card might still have a little bit of life in it yet.

I’m now contemplating whether my PC would actually be up to Dishonored 2 but I’ll think about that one. It might be one for the consoles, perhaps, and one that pushes the GTX660Ti one step too far.

Watch The Witcher 3 running on my Geforce 660Ti

OK, so last night I used nVidia's ShadowPlay video capture software to record just under 10 minutes of gameplay from The Witcher 3: Wild Hunt. I just wanted to show you how the game looks on what is considered the minimum-spec GPU for the game.

As I said yesterday, I'm running the game on what I consider to be an ageing GPU – a Geforce GTX660Ti – but it seems to handle the game OK.

Everything is set to medium and I have locked the frame rate to 30FPS to ensure a consistent experience. Things look nicer on medium settings than on low, especially the grass and other foliage. It's just a pretty game, to be honest.

Sadly, I forgot to activate the FPS counter while I was playing so I can't see what the frame rates were doing, but everything seemed smooth and very much playable. There was no combat, so I can't show what happens during heavy fights, but if I get the chance over the next day or so (work commitments dependent) I'll record some more footage with the FPS counter running.
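In the meantime, if you want hard numbers rather than eyeballing it, a frametime log does the job even when the in-game counter gets forgotten. Here's a quick, minimal sketch under a couple of assumptions: the CSV is in the style a capture tool like PresentMon writes (a MsBetweenPresents column of per-frame times in milliseconds), and the file name is just a placeholder.

```python
# Minimal sketch: average FPS and 1% lows from a frametime log.
# Assumes a CSV with a "MsBetweenPresents" column (PresentMon-style output);
# "witcher3_frametimes.csv" is a placeholder file name.
import csv
import statistics

frametimes_ms = []
with open("witcher3_frametimes.csv", newline="") as f:
    for row in csv.DictReader(f):
        frametimes_ms.append(float(row["MsBetweenPresents"]))

# Average FPS is just the reciprocal of the mean frame time.
avg_fps = 1000.0 / statistics.mean(frametimes_ms)

# 1% low: the effective frame rate of the slowest 1% of frames.
slowest = sorted(frametimes_ms)[-max(1, len(frametimes_ms) // 100):]
one_percent_low_fps = 1000.0 / statistics.mean(slowest)

print(f"Average FPS: {avg_fps:.1f}")
print(f"1% low FPS:  {one_percent_low_fps:.1f}")
```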

Any questions, post a comment below and I’ll do my best to answer. Thanks for watching.