GPU for Lightroom AI processing (2024)

DRODZ

New Member
Joined
Jun 4, 2023
Messages
2 (0.01/day)
  • Jun 4, 2023
  • #1

Lightroom AI Denoise is taking 5 min to process a 45 MB file. Resource Monitor shows memory and CPU below 25%. I am using an AMD 5700G (integrated graphics) and it is at 100%.
So I temporarily used an RX 6650 XT (8 GB) and the time dropped to 1 min. I am now in search of a new discrete GPU. I have done a ton of research but cannot find which features are important for AI processing. Is it VRAM, base clock, boost speed, memory speed, core speed, tensor cores, CUDA cores, DirectX 12 Ultimate (12_2), RAM bandwidth, etc.? It is hard to compare these cards when I don't know which specs matter to LR AI and Photoshop. ALL reviews seem focused on frames per second.
I am sure LR and other apps will be using more AI very soon, so I want to future-proof as much as possible. I am not looking so much for a GPU recommendation as for an understanding of which specs are critically important to this type of processing. I know the RTX 4000 series is not getting great reviews for the price, but it seems the new architecture (Ada Lovelace) could be important. Would a GPU that is good for video editing be the type of card I should focus on?
Side note: while Denoise is processing, my screen flickers. From what I've read this is common. Comments?
Thanks for any help you can provide.

QuietBob

Joined
Sep 21, 2020
Messages
1,514 (1.12/day)

System Specs

Processor: 5800X3D -30 CO
Motherboard: MSI B550 Tomahawk
Cooling: DeepCool Assassin III
Memory: 32GB G.SKILL Ripjaws V @ 3800 CL14
Video Card(s): ASRock MBA 7900XTX
Storage: 1TB WD SN850X + 1TB ADATA SX8200 Pro
Display(s): Dell S2721QS 4K60
Case: Cooler Master CM690 II Advanced USB 3.0
Audio Device(s): Audiotrak Prodigy Cube Black (JRC MUSES 8820D) + CAL (recabled)
Power Supply: Seasonic Prime TX-750
Mouse: Logitech Cordless Desktop Wave
Keyboard: Logitech Cordless Desktop Wave
Software: Windows 10 Pro
  • Jun 5, 2023
  • #2

A DX12-capable GPU with a minimum of 8 GB of VRAM is recommended for Lightroom. The new AI features use machine learning models that take advantage of the dedicated AI accelerators present in current-gen GPUs.

Any RTX 40-series or RX 7000-series graphics card fits the criteria, with higher-end models providing faster AI acceleration.
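As an aside, the 8 GB bar is easy to check in a script on an NVIDIA system by parsing `nvidia-smi` query output. A minimal sketch; the helper names are made up, and it works on a captured output line rather than a live query:

```python
def parse_vram_mib(csv_line: str) -> int:
    """Parse one line of `nvidia-smi --query-gpu=memory.total
    --format=csv,noheader` output, e.g. "8192 MiB" -> 8192."""
    value, unit = csv_line.strip().split()
    assert unit == "MiB", f"unexpected unit: {unit}"
    return int(value)

def meets_lightroom_bar(csv_line: str, minimum_gib: int = 8) -> bool:
    """True if the reported VRAM is at least `minimum_gib`
    (8 GB per the recommendation above)."""
    return parse_vram_mib(csv_line) >= minimum_gib * 1024

# An 8 GB card just meets the bar, a 4 GB card does not:
print(meets_lightroom_bar("8192 MiB"))  # -> True
print(meets_lightroom_bar("4096 MiB"))  # -> False
```

On AMD cards you would read the equivalent figure from `rocm-smi` or the OS's GPU info panel instead.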

DRODZ

New Member
Joined
Jun 4, 2023
Messages
2 (0.01/day)
  • Jun 5, 2023
  • #3

QuietBob said:

A DX12-capable GPU with a minimum of 8 GB of VRAM is recommended for Lightroom. The new AI features use machine learning models that take advantage of the dedicated AI accelerators present in current-gen GPUs.

Any RTX 40-series or RX 7000-series graphics card fits the criteria, with higher-end models providing faster AI acceleration.

Great, that confirms what I have been assuming from my reading. Is there an easy way to evaluate each vendor's "version" of the RTX 40-series models? They vary so much in price. I know some are factory overclocked, and they vary in the number of fans and in aesthetics.

QuietBob

Joined
Sep 21, 2020
Messages
1,514 (1.12/day)

  • Jun 5, 2023
  • #4

I can't provide a GPU tier recommendation for Adobe AI Denoise specifically. Deep learning models in general are very compute-intensive and can use huge amounts of video memory. Still, a benchmark of a competing AI denoising algorithm shows little difference between a $250 and a $1,400 GPU.

If you're looking for information on particular card models, check out the in-depth graphics cards reviews on this site.

damric

Joined
Feb 17, 2010
Messages
1,518 (0.29/day)
Location
Azalea City

System Specs

System Name: Main
Processor: Ryzen 5950X
Motherboard: B550 PG Velocita
Cooling: Water
Memory: Ballistix
Video Card(s): RX 6900 XT
Storage: T-FORCE CARDEA A440 PRO
Display(s): Samsung UE590
Case: QUBE 500
Audio Device(s): Logitech Z623
Power Supply: LEADEX V 1KW
Mouse: Cooler Master MM710
Keyboard: Huntsman Elite
Software: Windows 11 Pro
Benchmark Scores: https://hwbot.org/user/damric/
  • Jun 5, 2023
  • #5

You can adjust the graphics clocks on your 5700G's integrated GPU or your 6650 XT to experiment and see what helps, i.e. whether your software is more core-clock bound or bandwidth bound. If you need help with that, this is the forum to ask.
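One way to read the results of such an experiment is to compare the speedup against the size of the clock change. A rough sketch; the function and the numbers in the example are illustrative, not from any actual run:

```python
def clock_sensitivity(time_before_s: float, time_after_s: float,
                      clock_before_mhz: float, clock_after_mhz: float) -> float:
    """How much of a clock bump shows up as speedup.

    Raise only the core clock (or only the memory clock), re-run the
    denoise, and feed both timings in. A value near 1.0 means the
    workload scales with that clock domain (it is the bottleneck);
    near 0.0 means that clock barely matters."""
    speedup = time_before_s / time_after_s
    clock_ratio = clock_after_mhz / clock_before_mhz
    return (speedup - 1.0) / (clock_ratio - 1.0)

# Illustrative: a 10% memory overclock that cuts a 60 s denoise
# to 55.5 s would point at a mostly bandwidth-bound workload:
print(round(clock_sensitivity(60.0, 55.5, 1750, 1925), 2))  # -> 0.81
```

Repeat the run a few times and average, since denoise times jitter a bit from run to run.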

GeorgeB

New Member
Joined
Aug 5, 2023
Messages
1 (0.00/day)
  • Aug 5, 2023
  • #6

Topaz runs much faster than LR Denoise, even with integrated graphics. My 12th-gen Intel NUC with an i5 CPU and Iris graphics processes Canon CR3 files from the R6 in under 10 seconds, versus LR Denoise, which takes 5+ minutes on the same files. I'm guessing this is why there's so little performance difference with Topaz across various machines: it's fast on all of them. However, I find the LR Denoise results much better, as Topaz makes people's skin look a bit too artificial and waxy for my taste.

For general info, DxO's PRIME XD noise processing takes about 75 seconds on the NUC. Totally acceptable for me, given the exceptional quality. Still, I'd like to keep up with Lightroom, as Denoise did an amazing job on some school science fair pics.

Bigfootmax

New Member
Joined
Sep 27, 2023
Messages
1 (0.00/day)
  • Sep 27, 2023
  • #7

DRODZ said:

Lightroom AI Denoise is taking 5 min to process a 45 MB file. Resource Monitor shows memory and CPU below 25%. I am using an AMD 5700G (integrated graphics) and it is at 100%.
So I temporarily used an RX 6650 XT (8 GB) and the time dropped to 1 min. I am now in search of a new discrete GPU. I have done a ton of research but cannot find which features are important for AI processing. Is it VRAM, base clock, boost speed, memory speed, core speed, tensor cores, CUDA cores, DirectX 12 Ultimate (12_2), RAM bandwidth, etc.? It is hard to compare these cards when I don't know which specs matter to LR AI and Photoshop. ALL reviews seem focused on frames per second.
I am sure LR and other apps will be using more AI very soon, so I want to future-proof as much as possible. I am not looking so much for a GPU recommendation as for an understanding of which specs are critically important to this type of processing. I know the RTX 4000 series is not getting great reviews for the price, but it seems the new architecture (Ada Lovelace) could be important. Would a GPU that is good for video editing be the type of card I should focus on?
Side note: while Denoise is processing, my screen flickers. From what I've read this is common. Comments?
Thanks for any help you can provide.

I'm on the same page: looking for an optimal, future-proof GPU for photo and video editing. I'm still waiting for some other components for my new custom desktop, but here are some data from my recent experiments with AI Denoise in ACR on my two current systems: 1) an Acer Swift X 14 laptop (late 2021 model) with an AMD Ryzen 5 5600U, 16 GB of DDR4 RAM, and an RTX 3050 mobile (4 GB); 2) an old PC with an Intel Core i7-3770, 32 GB of DDR3 RAM, and a Sapphire Nitro+ Radeon RX 580 (8 GB). For the experiment I used 10 compressed RAW files from my Sony A7R IVa with moderate noise (shot just before dawn at ISO 800); each file is 60-ish megapixels. I opened these 10 files in ACR in Photoshop and set AI Denoise to 50%. The results are the following: 1) laptop: 13 min (1 min 18 sec per photo); 2) PC: 20 min 10 sec (roughly 2 min per photo).
Bear in mind that the RTX 3050 mobile goes neck and neck with the RX 580 in common synthetic gaming tests. On the other hand, the Ryzen 5 5600U scores 15,470 in PassMark while the i7-3770 scores just 6,400, so the CPU also matters for AI Denoise. Then I installed a GTX 1080 Ti in my old PC. The 1080 Ti is twice as fast as the RX 580 (according to technical.city). No wonder it cut the time exactly in half on my old PC: 1 min per 60 MP RAW file. I bought this MSI GTX 1080 Ti Gaming X for just $160, so it is good bang for the buck. I will run the same experiment on my new PC when I assemble it (based on an AMD Ryzen 9 7950X) and will let you know the results. I am also considering an RTX 3090, which you can buy used in the $600-700 range.
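For anyone repeating this, the per-photo figures above are just batch time divided by photo count. A throwaway helper, using only the timings reported in this post:

```python
def per_photo_seconds(minutes: int, seconds: int, n_photos: int) -> float:
    """Average seconds per photo for a batch denoise run."""
    return (minutes * 60 + seconds) / n_photos

# Timings from the 10-photo batches above:
print(per_photo_seconds(13, 0, 10))   # laptop, RTX 3050 mobile -> 78.0 (1 min 18 s)
print(per_photo_seconds(20, 10, 10))  # old PC, RX 580 -> 121.0 (~2 min)
```

Averaging over a 10-photo batch also smooths out the one-time model-load overhead that skews single-photo timings.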
