
So, rather than try to build a PC at a time when GPUs are priceless unicorns, I bought a gaming laptop a couple of months ago and have basically just been using it as a desktop.

[This is the laptop in question](https://www.amazon.co.uk/ASUS-FA506IU-5-4600H-GeForce-Fortress/dp/B089LXXP6X). Without going into too much detail, it's pretty good for the price, and even though I ran into a couple of issues with it, such as the wifi card being a total potato, I sort of just threw money at it until the problems disappeared. That worked, until I ran into Nvidia Optimus.

**What is Optimus?**

Essentially, it's designed to improve battery life in laptops that have both an integrated and a dedicated GPU. Rather than hooking the dedicated GPU (in my case a 1660 Ti) straight up to the screen, Optimus uses the 1660 Ti for the heavy lifting and passes the finished frames to the integrated GPU, which sends the images to the screen. The idea is that when you're not using the big boi GPU, the integrated graphics do just fine and you don't have to keep powering the dedicated GPU just to perform simple tasks.
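If you're curious which GPU is actually servicing a given program, here's a minimal sketch of one way to check (assuming the `glfw` and `PyOpenGL` Python packages, which aren't mentioned anywhere in this post; this is just an illustration): it creates a hidden OpenGL context and prints the renderer string, which under Optimus reflects whichever GPU the driver assigned to the app.

```python
# Hypothetical probe: create a hidden OpenGL context and report which GPU
# services it. Requires the glfw and PyOpenGL packages.
import glfw
from OpenGL.GL import glGetString, GL_VENDOR, GL_RENDERER

glfw.init()
glfw.window_hint(glfw.VISIBLE, glfw.FALSE)  # no on-screen window needed
window = glfw.create_window(64, 64, "gpu-probe", None, None)
glfw.make_context_current(window)

# Under Optimus this may report the integrated GPU or the Nvidia dGPU,
# depending on which one the driver routed this process to.
print("Vendor:  ", glGetString(GL_VENDOR).decode())
print("Renderer:", glGetString(GL_RENDERER).decode())

glfw.terminate()
```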

**So why is Optimus so terrible?**

It was designed at a point in time when nothing you could put into a laptop was much more powerful than integrated graphics. The issue is that manufacturers have kept implementing Optimus even though the majority of gaming laptop GPUs out there are now powerful enough to be bottlenecked by the integrated card, which can't handle everything being pushed through it to the screen. The result is lower frame rates and stuttering compared to just using your graphics card directly.
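To get a feel for why that hand-off hurts at high frame rates, here's some back-of-the-envelope arithmetic (the numbers are illustrative, not measurements from my machine): every finished frame has to be copied from the dedicated GPU over to the integrated GPU's framebuffer before it can be displayed.

```python
# Illustrative sketch: rough bandwidth consumed just copying finished frames
# from the dedicated GPU to the integrated GPU's framebuffer under Optimus.
WIDTH, HEIGHT = 1920, 1080  # a typical 1080p laptop panel
BYTES_PER_PIXEL = 4         # 8-bit RGBA

for fps in (60, 144, 300):
    gb_per_s = WIDTH * HEIGHT * BYTES_PER_PIXEL * fps / 1e9
    print(f"{fps:>3} FPS -> ~{gb_per_s:.2f} GB/s of frame copies")
```

That copy traffic competes with everything else crossing the bus, which is part of why things get worse the faster the dedicated card can render.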

In my case, the screen and the HDMI port are both connected to the integrated card, with no way to use the Nvidia card directly for either. The only way to bypass this is a USB-C to DisplayPort cable into an external monitor. So, I spent over £200 on a monitor and whatnot to see if it would work, and I've run the benchmarks to see what kind of difference it makes.

**The Results**

Bear in mind that the 1660 Ti is not a particularly powerful GPU. There are laptops out there with 30 series GPUs that manufacturers are still shipping *with Optimus*.

DX9 general rendering

  • Optimus - 84.9 FPS
  • 1660 Ti direct - 97.9 FPS

Reflection handling

  • Optimus - 91.4 FPS
  • 1660 Ti direct - 131.0 FPS

UserBenchmark GPU score (percentile among all 1660 Tis tested)

  • Optimus - 57th percentile
  • 1660 Ti direct - 94th percentile

There are other results I could list here, but the gaps are smaller and you get the picture. There wasn't a single metric where the 1660 Ti running through Optimus came even close to its direct-output performance. As further anecdotal evidence, I noticed a 10-15 FPS increase in Battlefield, and a very lightweight Tetris game I play jumped from around 1500-2000 FPS to a constant 4000+.
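Out of curiosity, here's a tiny sketch that turns the two FPS results above into percentage gains (this is just arithmetic on the numbers already listed, nothing new):

```python
# Percentage gains computed from the benchmark numbers listed above.
results = {
    "DX9 general rendering": (84.9, 97.9),   # (Optimus, 1660 Ti direct)
    "Reflection handling":   (91.4, 131.0),
}

for name, (optimus, direct) in results.items():
    gain = (direct - optimus) / optimus * 100
    print(f"{name}: {gain:.0f}% faster without Optimus")
```

That works out to roughly 15% for general rendering and over 40% for reflection handling.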

All of these performance increases came from plugging a cable into an external monitor, because manufacturers are too lazy/greedy to add a switch that lets you choose which GPU to use. It really does make me think the whole idea behind this is to let companies advertise a longer battery life on their laptops, whilst still listing the GPU you *think* you're getting.

tl;dr: if you're thinking about buying a gaming laptop, make sure it doesn't have Optimus.


3 comments

[–] E-werd 2 points (+2|-0)

> It really does make me think the whole idea behind this is to let companies advertise a longer battery life on their laptops

Well, that is exactly and unabashedly the reason.

[–] ScorpioGlitch 1 point (+1|-0)

It is utterly ridiculous that you're getting better performance through a USB-C-connected monitor, though you're not listing what the resolutions were for the two methods. Also, it might be worth noting that the built-in monitor might have its own bottleneck in the pipeline/cable from the card to the display.

[–] PMYA [OP] 2 points (+2|-0)

Both screens are 1080p, so there shouldn't be any difference in performance in that regard. There appear to be two reasons why performance is better now. One is that the 1660 Ti isn't being limited by the integrated card at all, and the other is that the CPU is doing less work. Now that the integrated graphics aren't rendering anything, the CPU is utilised a lot better in games that are harsh on it. Battlefield is one example; another is emulating Breath of the Wild with a Wii U emulator, where the lowest FPS in the most stressful parts of the game is now 57 vs 47 before, with the lighter parts running roughly 15 FPS higher. The difference is even more pronounced in CPU-bound games that already had high performance: CSGO is now running at 280-300 FPS, which is 100-150 FPS better than under Optimus.

To me, the whole thing makes no sense. People don't really buy gaming laptops for long battery life; nobody is playing Battlefield 5 on battery, and if you're doing something heavy, it's going to be plugged in.