So, rather than try and build a PC at this point in time when GPUs are priceless unicorns, I bought a gaming laptop a couple of months ago and have basically just been using it as a desktop.
This is the laptop in question. Without going into too much detail, it is pretty good for the price, and even though I have run into a couple of issues with it, such as the Wi-Fi card being a total potato, I sort of just threw money at it until the problems disappeared. Until I encountered Nvidia Optimus.
What is Optimus?
Essentially, it is designed to improve battery life in laptops that contain both an integrated and a dedicated GPU. Rather than having the dedicated GPU (in my case the 1660 Ti) hooked up to the screen displaying everything, it uses the 1660 Ti as a heavy lifter and passes its output to the integrated GPU, which sends the images to the screen. The idea is that when you're not using the big boi GPU, the integrated graphics do just fine and you don't have to be constantly powering your dedicated GPU just to perform simple tasks.
So why is Optimus so terrible?
It was designed at a point in time when anything you could put into a laptop wasn't much more powerful than integrated graphics. The issue is that manufacturers have kept implementing Optimus even though the majority of gaming laptop GPUs out there are now powerful enough that the integrated card becomes the bottleneck: it can't keep up with everything being sent through it to the screen. The result is lower frame rates and stuttering compared to just using your dedicated graphics card directly.
In my case, my screen and HDMI port are both wired to the integrated card, with no way to directly use the Nvidia card for either. The only way to bypass this is with a USB-C to DisplayPort cable and an external monitor. So, I spent over £200 on a monitor and whatnot to see if it would work, and I've run the benchmarks to see what kind of difference it makes.
The Results
Bear in mind that the 1660 Ti is not a particularly powerful GPU. There are laptops out there with 30-series GPUs that manufacturers are still shipping with Optimus.
DX9 general rendering
- Optimus - 84.9 FPS
- 1660 Ti - 97.9 FPS
Reflection handling
- Optimus - 91.4 FPS
- 1660 Ti - 131.0 FPS
UserBenchmark GPU score (percentile among all 1660 Tis tested)
- Optimus - 57th percentile
- 1660 Ti - 94th percentile
There are other results I could list here, but the gaps are smaller and you get the picture. There isn't a single metric where the card with Optimus enabled even came close to matching its own direct output. As further anecdotal evidence, I noticed a 10-15 FPS increase in Battlefield, and a very lightweight Tetris game I play jumped from around 1500-2000 FPS to a constant 4000+.
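To put those headline numbers in perspective, here's a quick back-of-the-envelope sketch (using only the figures listed above) of the relative FPS gain from driving the 1660 Ti directly instead of routing through Optimus:

```python
# Percent FPS gain from bypassing Optimus, per the benchmark figures above.
# Each entry is (FPS with Optimus, FPS with the 1660 Ti driving the display).
benchmarks = {
    "DX9 general rendering": (84.9, 97.9),
    "Reflection handling": (91.4, 131.0),
}

for name, (optimus_fps, direct_fps) in benchmarks.items():
    gain = (direct_fps / optimus_fps - 1) * 100
    print(f"{name}: +{gain:.1f}% FPS going direct")
# DX9 general rendering: +15.3% FPS going direct
# Reflection handling: +43.3% FPS going direct
```

So even on the most modest metric, the dedicated card picks up a ~15% free speedup just from not being funneled through the integrated GPU.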
All of these performance increases came from plugging a cable into an external monitor, because manufacturers are too lazy/greedy to add a switch that lets you choose which GPU drives the display. It really does make me think the whole point of this is to let companies advertise a longer battery life on their laptops whilst still listing the GPU you think you're getting.
tl;dr if you're thinking about buying a laptop make sure it doesn't have Optimus