How to switch the default graphics card from integrated to dedicated

Enytjoo 24066 12
The content has been translated from Polish to English. See the original version of the topic.
  • #1 18195561
    Enytjoo
    Level 6  
    Posts: 7
    Rate: 3
Hello, I have a problem. Not long ago I bought a new HP Desktop computer (460-p202nw), and there are two graphics cards in it:
    1. Integrated Intel(R) HD Graphics 630
    2. Dedicated Radeon(TM) 520

    The problem is that all games run on the integrated Intel HD Graphics 630 and the Radeon 520 is not used at all. I know that in "Advanced display settings" you can set a given game to high performance so that it runs on the Radeon 520, but unfortunately I cannot do that for every game, e.g. Minecraft.

    So what can I do to make the dedicated graphics card the default one, and preferably turn off the integrated one?
  •
  • #2 18195583
    bubu1769
    Level 43  
    Posts: 8039
    Help: 1173
    Rate: 2342
In the BIOS settings it should be possible to select the primary graphics card.
  •
  • #3 18195586
    Enytjoo
    Level 6  
    Posts: 7
    Rate: 3
Unfortunately I don't know much about these topics. Would you be able to help me?
  • #4 18195593
    bumble
    Level 40  
    Posts: 7189
    Help: 608
    Rate: 1183
Wait, where is the monitor connected? Is this a laptop? The monitor should be connected to the dedicated card, and the BIOS will handle it; do not touch anything, leave it on auto. You can turn off the integrated card, but if the dedicated one fails and you lose the monitor signal, the computer will not start and you will have to reset the BIOS. I don't know this prebuilt PC; does it have only one monitor output?
  • #5 18195597
    Enytjoo
    Level 6  
    Posts: 7
    Rate: 3
It is a desktop computer; the monitor is simply connected to the computer.
  •
  • #6 18195600
    bubu1769
    Level 43  
    Posts: 8039
    Help: 1173
    Rate: 2342
But which output?
    Is it on the motherboard or on the graphics card?
  • #7 18195601
    Enytjoo
    Level 6  
    Posts: 7
    Rate: 3
I don't quite understand; there is a photo in the attachment.
    Attachments:
• 71340224_660752667745875_7295389663248777216_n.jpg (79.59 KB)
  •
  • #8 18195609
    bumble
    Level 40  
    Posts: 7189
    Help: 608
    Rate: 1183
There is probably one monitor output, like in laptops. You have Radeon drivers installed. There should be a graphics card panel in the taskbar next to the clock, and that is where you set which programs use the dedicated card. By default, fullscreen games should use it. You can also check that in the card's activity panel. I don't know exactly what the AMD panel looks like. There may also be an option in the BIOS to disable the integrated card, although it may not exist here.
  • #9 18195617
    Enytjoo
    Level 6  
    Posts: 7
    Rate: 3
Well, even fullscreen games run on the integrated one. I know you can set a game to high performance in the graphics card settings and then it starts on the dedicated one, but not every game can be set that way. I have now read that since this is UEFI, maybe something can be done there so that the dedicated card is the default and preferably the integrated one is turned off :P
  • #10 18195629
    bumble
    Level 40  
    Posts: 7189
    Help: 608
    Rate: 1183
The integrated card can be turned off in the BIOS. On the other hand, in the activity panel you choose which programs use the dedicated card, and it really makes no sense for all of them to use it. Check a performance benchmark of these two cards. Is this Intel really that bad? I have an 8th-gen i5 and it's OK even for games, e.g. LoL.
  • #11 18195641
    Enytjoo
    Level 6  
    Posts: 7
    Rate: 3
1. If I wanted to switch off this integrated card in the BIOS, how would I do it?
    2. If I did disable the integrated card, would the computer lose performance?

    Added after 1 minute:

    And from what I saw on the monitor, there are probably 2 inputs.
    Attachments:
• 71658054_1495251783958902_8381194263676846080_n.jpg (74.82 KB)
  • #12 18195782
    sylweksylwina
    Moderator of Computers service
    Posts: 13170
    Help: 1875
    Rate: 2335
The monitor is connected correctly via HDMI; there is nothing to change there.
    1. Disabling the integrated card in UEFI is rather impossible, since the image from the Radeon probably passes through it.
    2. You can point to the executable file of a given game/application like this:
(Screenshots attached showing the Graphics settings procedure.)
  • #13 18195812
    Enytjoo
    Level 6  
    Posts: 7
    Rate: 3
But, for example, in GTA Online (RP) I cannot play, because even when I add it the game still runs on the Intel.

    Added after 1 minute:

    Or another example: Minecraft. Adding the Minecraft launcher this way does nothing, because Minecraft is actually launched by some javaw.exe. There are many other examples.

Topic summary

✨ The discussion revolves around a user experiencing issues with their HP Desktop 460-p202nw, which has both an integrated Intel HD Graphics 630 and a dedicated Radeon 520 graphics card. The user seeks to switch the default graphics card from the integrated to the dedicated one, as games are primarily running on the Intel card. Suggestions include checking BIOS settings to potentially disable the integrated card, ensuring the monitor is connected to the dedicated card, and using the Radeon control panel to set specific applications to utilize the dedicated GPU. However, concerns are raised about the feasibility of disabling the integrated card due to potential performance impacts and the necessity of the integrated card for output. The user also notes difficulties in forcing certain games, like Minecraft, to run on the dedicated GPU.
Generated by the language model.

FAQ

TL;DR: On HP 460‑p202nw with 2 GPUs, games default to Intel HD 630; fix by forcing apps to High performance. "The problem is that all games run on this integrated Intel ... and the Radeon 520 is not used at all." [Elektroda, Enytjoo, post #18195561]

Why it matters: This FAQ helps Windows users with dual‑GPU HP desktops make games and apps use the dedicated Radeon for better performance.


How do I force a game to use the AMD Radeon 520 on Windows 10/11?

Use Windows Graphics settings. It’s a three‑mode, per‑app control.
  1. Settings > System > Display > Graphics settings.
  2. Add the game’s .exe, click Options.
  3. Choose High performance, Save, then restart the game. These preferences override automatic selection for that executable. “Graphics settings for apps in Windows”
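Behind that Settings UI, Windows stores these per-app choices as string values under the registry key `HKCU\Software\Microsoft\DirectX\UserGpuPreferences` (value name = full path to the .exe; data `GpuPreference=2;` means High performance, `1;` Power saving, `0;` Let Windows decide). As a rough sketch assuming that key layout, the same preference can be scripted; the game path below is a placeholder:

```python
# Sketch: build the "reg add" command that applies a per-app GPU preference.
# Value name = full exe path; data "GpuPreference=2;" = High performance.
KEY = r"HKCU\Software\Microsoft\DirectX\UserGpuPreferences"

def gpu_pref_command(exe_path: str, mode: int = 2) -> str:
    """Return a reg.exe command forcing `exe_path` onto the chosen GPU mode."""
    if mode not in (0, 1, 2):  # 0 = let Windows decide, 1 = power saving, 2 = high perf
        raise ValueError("mode must be 0, 1 or 2")
    return f'reg add "{KEY}" /v "{exe_path}" /t REG_SZ /d "GpuPreference={mode};" /f'

# Hypothetical game path -- substitute the real executable:
print(gpu_pref_command(r"C:\Games\MyGame\game.exe"))
```

Run the printed command in an elevated prompt (or just use the Settings UI); restart the game afterwards so the preference takes effect.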

Do I need to move my HDMI cable to the Radeon 520?

Yes, when your PC has both motherboard and GPU outputs. Plug the monitor into the discrete card’s HDMI/DP port on the expansion slot bracket. This ensures the display pipeline uses the dedicated GPU. Avoid the motherboard video port for gaming. [Elektroda, bumble, post #18195593]

Can I disable the Intel HD 630 iGPU in BIOS/UEFI on this HP?

Often no. Many OEM designs route the dGPU through the iGPU (muxless). As one expert noted, "It is rather impossible to disable the integrated UEFI card, since the image from the radeon probably goes through it." Use per‑app selection instead. [Elektroda, sylweksylwina, post #18195782]

Will disabling the integrated GPU reduce performance or cause issues?

It usually won’t boost performance and can cause recovery headaches. If the dGPU fails or loses drivers, you may have no display until you reset BIOS/CMOS. Keep the iGPU enabled and control usage per application instead. [Elektroda, bumble, post #18195593]

Why do Minecraft or GTA still use Intel after I assign High performance?

Launchers often start a different executable. Minecraft Java runs via a Java runtime (e.g., javaw.exe), not just the launcher. Add the actual runtime or game process in Graphics settings or Radeon Software. Then set it to High performance. [Elektroda, Enytjoo, post #18195812]
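The Minecraft launcher typically ships its own Java runtime, and the exact path varies by installation. A small sketch like the following lists every Java executable under a folder so each one can be added in Graphics settings; the search root below is a common but hypothetical location:

```python
# Sketch: find every javaw.exe / java.exe under a runtime folder so each can
# be added in Windows Graphics settings. The search root is an assumption --
# the Minecraft launcher often keeps its runtimes under .minecraft\runtime.
from pathlib import Path

def find_java_binaries(root: str) -> list:
    """Recursively collect Java executables beneath `root`."""
    base = Path(root)
    hits = []
    for name in ("javaw.exe", "java.exe"):
        hits.extend(base.rglob(name))
    return sorted(hits)

# Hypothetical location -- adjust to your install:
for exe in find_java_binaries(r"C:\Users\you\AppData\Roaming\.minecraft\runtime"):
    print(exe)
```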

How do I add the right executable when a launcher uses another process?

Add the real game binary, not the launcher. In Graphics settings, click Browse and select the game’s actual .exe from its install folder. If the game uses a runtime, add that file too. Pick High performance for each entry and restart the app. “Graphics settings for apps in Windows”

How can I verify which GPU my game is using?

Open Task Manager > Processes. Right‑click the header and enable the GPU Engine column. It shows which GPU (e.g., GPU 0 iGPU, GPU 1 dGPU) each process uses. The Performance tab labels GPUs so you can match engine numbers. “Task Manager GPU performance”
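The same per-process data Task Manager shows is exposed through the "GPU Engine" performance counters, which can be sampled with the built-in `typeperf` tool. As a sketch, assuming the usual instance-name format where a `phys_N` token identifies the physical GPU, engine utilization can be tallied per card; the sample values below are made up:

```python
# Sketch: group "GPU Engine" counter instances by physical GPU ("phys_N" in
# the instance name) -- the same counters Task Manager's GPU Engine column
# reads. Instance-name format is an assumption based on typical output of:
#   typeperf "\GPU Engine(*)\Utilization Percentage" -sc 1
import re
from collections import defaultdict

def utilization_by_gpu(samples: dict) -> dict:
    """Sum engine utilization per physical GPU from {instance_name: percent}."""
    totals = defaultdict(float)
    for instance, pct in samples.items():
        m = re.search(r"phys_(\d+)", instance)
        if m:
            totals[f"GPU {m.group(1)}"] += pct
    return dict(totals)

# Made-up sample: one process with engines on two physical GPUs.
sample = {
    "pid_4321_luid_0x0_0x9ACE_phys_0_eng_0_engtype_3D": 12.5,
    "pid_4321_luid_0x0_0x9ACE_phys_1_eng_0_engtype_3D": 63.0,
}
print(utilization_by_gpu(sample))  # shows which physical GPU is doing the work
```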

Is there a global switch to make Radeon 520 the default for everything?

No universal toggle exists on modern hybrid systems. Windows offers per‑app Graphics preferences with three modes. Assign High performance to games and heavy apps. Leave everyday apps on Power saving or Let Windows decide. “Graphics settings for apps in Windows”

Can BIOS switch the primary display to the dedicated GPU?

Some BIOS menus expose a primary graphics option. If available, select PCIe/PEG/Discrete as the primary. Not all HP consumer desktops provide this. If absent, use Windows Graphics settings and correct monitor cabling. [Elektroda, bubu1769, post #18195583]

Fullscreen still uses Intel. What else should I try?

Update AMD Radeon drivers, then set your game to High performance in Radeon Software or Windows Graphics settings. Add every related .exe the launcher spawns. Restart the game after changes. This resolves most hybrid‑GPU routing issues. “AMD Switchable Graphics Application Settings”

Does HDMI vs DVI matter for GPU selection?

Port type does not decide GPU selection. Port location does. Use the discrete GPU’s video outputs on the expansion card, not the motherboard’s ports, to ensure the Radeon handles rendering. [Elektroda, bubu1769, post #18195600]

Why doesn’t a UEFI toggle to disable the iGPU appear on this PC?

Many prebuilt systems use a muxless design, where the iGPU stays in the display path. In such systems, the firmware hides full disable options to keep the pipeline functional. Use per‑app controls instead. [Elektroda, sylweksylwina, post #18195782]
Generated by the language model.