I've created a simple app with a Direct3D 11 device and swap chain (IDXGISwapChain). All it does is clear the screen with a color and call Present(0, 0) on the swap chain. The app handles the fullscreen/windowed transition itself (I passed DXGI_MWA_NO_WINDOW_CHANGES to the IDXGIFactory object). After switching from windowed mode to fullscreen, the VS graphics profiler shows a drop in frame rate from ~8000 FPS (windowed) to ~1500 FPS (fullscreen). I think I'm resizing my buffers properly (in response to WM_SIZE), and I'm not getting any warnings about presentation inefficiencies in the debug output. The mode used to create the swap chain is obtained by enumerating the supported modes of the output device and selecting the matching resolution. Isn't fullscreen mode supposed to be more efficient? As I understand it, it can simply flip buffers instead of doing a bit blit. The code is here in case you want to take a look.
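For reference, a minimal sketch of the setup described above (Windows-only fragment, not a complete program; `swapChain`, `context`, `rtv`, and `hwnd` are assumed to already exist from device/swap-chain creation):

```cpp
#include <d3d11.h>
#include <dxgi.h>

// Tell DXGI not to monitor the window's message queue; the app
// handles Alt+Enter / fullscreen transitions itself.
IDXGIFactory* factory = nullptr;
swapChain->GetParent(__uuidof(IDXGIFactory), (void**)&factory);
factory->MakeWindowAssociation(hwnd, DXGI_MWA_NO_WINDOW_CHANGES);
factory->Release();

// Per frame: clear the back buffer and present without vsync.
const float clearColor[4] = { 0.0f, 0.2f, 0.4f, 1.0f };
context->ClearRenderTargetView(rtv, clearColor);
swapChain->Present(0, 0); // SyncInterval 0, no flags
```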
FPS is a bad measure of performance; it's nonlinear. 1500 FPS is about 0.00067 seconds per frame, and 8000 FPS is 0.000125 seconds per frame. That's a difference of roughly 0.00054 seconds, which is not actually much, and it could be entirely attributable to the fact that windowed versus fullscreen presentation takes fairly different code paths on the GPU. Fullscreen is not, however, necessarily more efficient. That doesn't mean you aren't doing something yourself to incur that small delta, but I don't think you should worry too much about it. – Oct 23 '14 at 15:53