Replies: 1 comment
Yes, scene detection does not support NVIDIA hardware acceleration AFAIK; it happens inside ffmpeg on the CPU. LosslessCut only uses the GPU for possibly accelerating playback, as well as UI animations.
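For context, ffmpeg's scene detection is based on the `select` filter's per-frame `scene` change score, which is computed in software. A minimal sketch of such an invocation (the file name and the 0.4 threshold are placeholders, not necessarily LosslessCut's exact command):

```shell
# Print info for frames whose scene-change score exceeds 0.4.
# The scene score is computed on the CPU; no decoder/encoder hwaccel
# applies to this filter. input.mp4 is a placeholder file name.
ffmpeg -i input.mp4 -filter:v "select='gt(scene,0.4)',showinfo" -f null -
```

The timestamps logged by `showinfo` correspond to detected scene cuts; lowering the threshold makes detection more sensitive.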
When I ran the scene detection option, I noticed that the ffmpeg process pushed my CPU usage up to 95%, but not the GPU, even though I have enabled NVIDIA NVENC (the "Enable HEVC" option) and set LosslessCut to use the NVIDIA discrete GPU in the Windows graphics settings.

How can I use the full capabilities of the GPU when using LosslessCut?