I checked the OpenGL version using the OpenGL Extensions Viewer 6.0.8.1 from here. It reports core features up to and including version 4.6 supported; a few ARB 2015 features are unsupported.
What else could be disabling the simulation player?
Many notebooks are equipped with more than one graphics processor.
Is it possible that Cura does not use the NVIDIA chip at all?
On 1/31/2020 at 6:49 PM, tinkergnome said: Many notebooks are equipped with more than one graphics processor.
Is it possible that Cura does not use the NVIDIA chip at all?
The notebook does in fact have two graphics processors: an Intel HD Graphics 4000, which appears to actually drive the displays, and an NVIDIA GeForce GT 730M, which is used for rendering. As set up, it also has two USB-to-DVI adapters.
However, my point is that neither the hardware configuration nor, as far as I am aware, the software has changed, apart from a Windows Insider Preview update. This setup was working and displaying the simulation preview a few weeks ago, but now it doesn't.
Is there any diagnostic information available from the program to indicate why it has stopped working and what I might do to re-enable it? Alternatively, is there an option to force the mode on, on the assumption that the software's detection of the capabilities is wrong?
---------------------------------------------------------
Update:
I tried disabling the Simulation View plugin and this removed the Layer View slider! Re-enabling Simulation View brought it back, but I am now suspicious that the installation is corrupted.
I tried a complete uninstall (removing the configuration) and reinstall, but still don't have Simulation View working again.
Edited by StarNamer
18 minutes ago, StarNamer said: Is there some diagnostic information available from the program
The OpenGL driver and version actually used are written to the "cura.log" file (in the Cura configuration folder).
Just search for "OpenGL" or for the "ERROR" keyword; this may give some hints.
Here is an example of what it looks like:
[...] - DEBUG - [MainThread] UM.View.GL.OpenGL.__init__ [112]: Initialized OpenGL subsystems.
[...] - DEBUG - [MainThread] UM.View.GL.OpenGL.__init__ [113]: OpenGL Version: 4.1.0 NVIDIA 376.54
[...] - DEBUG - [MainThread] UM.View.GL.OpenGL.__init__ [114]: OpenGL Vendor: NVIDIA Corporation
[...] - DEBUG - [MainThread] UM.View.GL.OpenGL.__init__ [115]: OpenGL Renderer: GeForce GTX 960M/PCIe/SSE2
[...] - DEBUG - [MainThread] UM.View.GL.OpenGL.__init__ [116]: GLSL Version: 4.0.0
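To automate this check, something like the following sketch pulls the relevant lines out of cura.log. The Windows path shown is an assumption (the version folder in particular may differ between Cura releases), so adjust it to your own configuration folder:

```python
import os

def opengl_and_error_lines(log_text):
    """Return the log lines mentioning OpenGL or ERROR, as suggested above."""
    return [line for line in log_text.splitlines()
            if "OpenGL" in line or "ERROR" in line]

# Hypothetical location of cura.log on Windows; the version folder may differ.
log_path = os.path.expandvars(r"%APPDATA%\cura\4.4\cura.log")

if os.path.exists(log_path):
    with open(log_path, encoding="utf-8", errors="replace") as f:
        for line in opengl_and_error_lines(f.read()):
            print(line)
```

The same filter works on any pasted log excerpt, which makes it easy to compare runs before and after a driver change.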
...and just to be sure: does it work in compatibility mode or not at all?
Every log entry I could find reports that it's using the Intel HD Graphics 4000:
2020-02-02 19:57:05,262 - DEBUG - [MainThread] UM.View.GL.OpenGL.__init__ [111]: Initialized OpenGL subsystems.
2020-02-02 19:57:05,271 - DEBUG - [MainThread] UM.View.GL.OpenGL.__init__ [112]: OpenGL Version: 4.0.0 - Build 10.18.10.5100
2020-02-02 19:57:05,278 - DEBUG - [MainThread] UM.View.GL.OpenGL.__init__ [113]: OpenGL Vendor: Intel
2020-02-02 19:57:05,284 - DEBUG - [MainThread] UM.View.GL.OpenGL.__init__ [114]: OpenGL Renderer: Intel(R) HD Graphics 4000
2020-02-02 19:57:05,290 - DEBUG - [MainThread] UM.View.GL.OpenGL.__init__ [115]: GLSL Version: 4.0.0
However, if I run the OpenGL Extensions Viewer, it reports only on the NVIDIA GeForce GT 730M:
Renderer: GeForce GT 730M/PCIe/SSE2
Vendor: NVIDIA Corporation
Version: 4.6.0 NVIDIA 425.31
Shading language version: 4.60 NVIDIA
Max texture size: 16384 x 16384
Max vertex texture image units: 32
Max texture image units: 32
Max geometry texture units: 32
Max anisotropic filtering value: 16
Max viewport size: 16384 x 16384
Max Clip Distances: 8
Max samples: 32
GL Extensions: 372
How do I tell Cura to use it?
OK. The NVIDIA Control Panel has override settings to globally default to the NVIDIA GPU (instead of selecting automatically) and to set it for specific applications. After setting Cura to use the NVIDIA GPU, Simulation View works.
2020-02-02 20:17:11,872 - DEBUG - [MainThread] UM.View.GL.OpenGL.__init__ [111]: Initialized OpenGL subsystems.
2020-02-02 20:17:11,878 - DEBUG - [MainThread] UM.View.GL.OpenGL.__init__ [112]: OpenGL Version: 4.1.0 NVIDIA 425.31
2020-02-02 20:17:11,884 - DEBUG - [MainThread] UM.View.GL.OpenGL.__init__ [113]: OpenGL Vendor: NVIDIA Corporation
2020-02-02 20:17:11,890 - DEBUG - [MainThread] UM.View.GL.OpenGL.__init__ [114]: OpenGL Renderer: GeForce GT 730M/PCIe/SSE2
2020-02-02 20:17:11,896 - DEBUG - [MainThread] UM.View.GL.OpenGL.__init__ [115]: GLSL Version: 4.0.0
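To confirm which GPU is active after a change like this, a small sketch can extract the vendor field from the log; the parsing below assumes the "OpenGL Vendor:" line format shown in this thread, and the sample line is the one reported above:

```python
import re

def opengl_vendor(log_text):
    """Extract the reported OpenGL vendor from Cura log text, if present."""
    match = re.search(r"OpenGL Vendor:\s*(.+)", log_text)
    return match.group(1).strip() if match else None

# Sample log line after forcing the NVIDIA GPU in the NVIDIA Control Panel:
sample = (
    "2020-02-02 20:17:11,884 - DEBUG - [MainThread] "
    "UM.View.GL.OpenGL.__init__ [113]: OpenGL Vendor: NVIDIA Corporation\n"
)
print(opengl_vendor(sample))  # NVIDIA Corporation
```

If this still prints "Intel" after setting the per-application override, the override did not take effect and the Control Panel setting is worth re-checking.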
Thanks for the guidance on where to look to find out what Cura was actually using.
My conclusion is that I actually disabled it when I downloaded and installed the latest compatible NVIDIA driver.
Edited by StarNamer
FYI, I tried installing the latest NVIDIA drivers (425.31) but it made no difference.