
I would like to be able to open Activity Monitor and see 0 in the GPU% column for that process.

All I have is Intel integrated graphics on an Intel-based Mac. I'm willing to entertain virtualization, modifying the system, or modifying the apps if needed, but I'd prefer a simpler solution if there is such a thing.

Can I run a process and make it think that I have no GPU?

bmike
minseong
  • This would be a variation of your existing question where the GPU cap would be set to 0%. This can’t be done from a user perspective. – Allan Apr 18 '23 at 18:31
  • @Allan I thought this could be easier than something variable. Just turn it on or off, block it or allow it – minseong Apr 18 '23 at 18:41
  • Let's see how this goes - no need to close it before we get dozens of eyes on both... – bmike Apr 18 '23 at 18:44
  • The GPU is very different from a CPU and having an app utilize it requires very specific calls to be made to it. This is done through the GPU driver. If it’s an Intel Mac then either it’s calling the GPU libraries supplied by Intel (integrated graphics) or the 3rd party libraries (AMD Radeon, for example). Now with Apple Silicon, things are much more locked down. There’s no “user switch” for these things. – Allan Apr 18 '23 at 18:44
  • @Allan what if I run it in some kind of VM? – minseong Apr 18 '23 at 18:45
  • Everything is scheduled for a “slice” of the GPU’s time. When you bombard it, obviously it will consume resources, but it doesn’t make it “unavailable” to requests from other applications. It will just grind everything down to a snail’s pace. – Allan Apr 18 '23 at 18:47
  • As for the VM, it depends on how the VM virtualizes graphics. If it creates a GPU but passes the commands through to the existing GPU, it won’t help. But if the GPU gets virtualized using CPU cycles, it might do what you want, but I can’t speak to overall performance of the system. – Allan Apr 18 '23 at 18:49
  • Many apps use little to no GPU when they can't be seen onscreen - nothing to composite. Background it, minimise or shift to another Space. If that doesn't work, the app itself is just poorly coded. – Tetsujin Apr 18 '23 at 18:53
  • The comments here regarding apps using specific calls directly to the GPU driver supplied by Intel or Radeon are not accurate for most apps. Almost all apps use the GPU through the Apple-supplied standardised interface. This is the case for both Intel and Apple Silicon platforms. – jksoegaard Apr 18 '23 at 18:59

1 Answer


Yes, an easy way of doing this is to use virtualisation. You have indicated in your question that this would be acceptable to you.

Simply set up a virtual machine using whatever virtualisation program you choose (e.g. VMware Fusion, Parallels Desktop, VirtualBox) - and make sure to disable its GPU virtualisation (this is usually a simple checkbox that you need to ensure is unchecked).

On VMware Fusion this is achieved in the Settings window under "Display" - make sure that the box "Accelerate 3D graphics" is unchecked.
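
If you prefer to check or set this outside the GUI, the same option is stored in the virtual machine's .vmx configuration file. A minimal sketch, assuming VMware Fusion and a VM called "NoGPU" (the path and VM name are placeholders, and the exact key can differ between Fusion versions):

    # ~/Virtual Machines.localized/NoGPU.vmwarevm/NoGPU.vmx
    # Disable 3D acceleration so guest graphics are rendered in software
    mks.enable3d = "FALSE"

Make the edit while the virtual machine is powered off, otherwise Fusion may overwrite the change.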

Note: This will not make the program think that there's no GPU at all. For that you would need to remove the virtual GPU entirely, which is possible, but it would mean that you have no native screen output from the virtual machine (which is normally not desirable). If you're running a "server style" application, this could be perfectly acceptable to you.

Also note: Even though the program does not have direct GPU access, of course anything that makes something appear on your monitor is, in some very slight way, going to use your GPU. However, for any sane use case of this, what you want is to cut the program off from running its own code on the GPU, and this will achieve that goal.

UPDATE:

From your other questions and comments, I have discovered that your request is really about Minecraft. The rendering system used by Minecraft supports a software renderer (i.e. using the CPU only instead of the GPU). So without any macOS-specific changes, it is possible to get Minecraft running with a software renderer. Enabling the software renderer is easy: run the Java version with the -Dorg.lwjgl.opengl.Display.allowSoftwareOpenGL=true parameter.
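
For illustration, a sketch of launching the Java edition from the command line with that flag set - the jar name and memory setting are placeholders, and in practice you would normally add the flag to the "JVM Arguments" field of your launcher profile instead:

    # Placeholder launch command; only the -D property itself is the real LWJGL flag
    java -Xmx2G -Dorg.lwjgl.opengl.Display.allowSoftwareOpenGL=true -jar minecraft-client.jar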

Unfortunately the hardware renderer will always be preferred if available - so you need a way to change that preference. There's no method for that made available to ordinary users, but if you know programming or have a programmer that can help you, it is possible to make that change.
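
To illustrate the idea only (this is a hedged sketch, not actual LWJGL source - the real change has to be made inside LWJGL's display/context creation code), the modification boils down to reading the existing system property and then requesting only non-accelerated pixel formats when it is set:

    // Sketch: the system property name is the real LWJGL 2 flag, but the class,
    // the printouts and the "skip accelerated formats" idea stand in for the
    // change a programmer would make inside LWJGL itself.
    public class SoftwareOnlyCheck {
        public static void main(String[] args) {
            boolean softwareOnly =
                Boolean.getBoolean("org.lwjgl.opengl.Display.allowSoftwareOpenGL");
            if (softwareOnly) {
                // In a patched LWJGL: request only non-accelerated (software)
                // pixel formats instead of merely allowing them as a fallback.
                System.out.println("Software renderer would be forced");
            } else {
                System.out.println("Hardware renderer would still be preferred");
            }
        }
    }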

jksoegaard
  • Do you know of a lightweight virtualisation program that could make it as trivial as possible, and with minimal overhead, to run just one program like this that is already on my system? – minseong Apr 18 '23 at 19:17
  • There's no real "lightweight" solution for that, I'm afraid. You will have to use something like VirtualBox, Parallels Desktop, VMware Fusion or similar to run a full operating system that you can run your app on top of. – jksoegaard Apr 18 '23 at 20:40
  • @theonlygusti I have added an update now that I found out that the app in question is Minecraft. – jksoegaard Apr 18 '23 at 20:46
  • Thanks for all your help and digging deeper. How would a programmer enable the software renderer? Do you have to mod Minecraft itself? – minseong Apr 19 '23 at 00:26
  • it's not exactly "simple" – user253751 Apr 19 '23 at 09:49
  • It is pretty simple actually - it's just a command line parameter when starting Minecraft. – jksoegaard Apr 19 '23 at 12:28
  • @theonlygusti I have added to my answer how to enable the software renderer. However, as I wrote, this is only one of the two steps necessary - you also have to disable the hardware renderer. – jksoegaard Apr 19 '23 at 12:28
  • How would a programmer disable the hardware renderer? – minseong Apr 19 '23 at 15:00
  • By modifying LWJGL so that it turns off the hardware rendering code when allowSoftwareOpenGL is set to true. – jksoegaard Apr 19 '23 at 23:46
  • When would the hardware renderer not be available? – minseong Apr 24 '23 at 15:21
  • When the computer doesn't have one. – jksoegaard Apr 24 '23 at 21:15