Almost all motherboards have an HDMI output port on the back, so you probably will not have an issue there. However, not all CPUs have integrated graphics. If your CPU does have integrated graphics but you still do not see anything on your display, the HDMI port may simply be disabled. So, how do you enable motherboard HDMI?
Whether you want to use your motherboard’s HDMI port for your primary display or for a secondary one, the process is much the same.
It is also great if you want to add a new display but your graphics card only has a single HDMI port and you do not want to buy an adapter. After all, why waste money if you can make it work for free?
Here are all the steps that you should follow if you want to enable your motherboard’s HDMI port.
Check if Your CPU Has Integrated Graphics
Assuming that you already checked whether your motherboard has an HDMI port, the next most important step is to check if your CPU can even use that port. Without integrated graphics, there is nothing to drive the motherboard’s HDMI output, so the port will stay dark no matter what you do.
If you already know the exact CPU model that you have in your system, you can check the manufacturer’s website and find the information there. If not, here is how you can check it in Windows 10:
- In Windows 10, go to Start and open Settings (The gear icon above the Power button).
- Select “System”, then on the left side go to “About”.
- You can see the processor model name under “Device specifications”. Many AMD CPUs with integrated graphics say “with Radeon Vega Graphics” right in the name. If you are on Intel, or if your AMD CPU does not say this, do the following step.
- Do a Google search by typing the processor manufacturer and model, go to the manufacturer’s official website, and check the “Processor Graphics” part of the specs.
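Before you even open the spec page, the naming conventions themselves give a strong hint: AMD APUs usually mention Radeon graphics or carry a “G” suffix, while Intel’s “F” suffix marks desktop chips that have no integrated graphics. Here is a minimal sketch of that rule of thumb; the suffix checks are common conventions, not guarantees, so always confirm on the manufacturer’s website:

```python
# Heuristic guess for integrated graphics based on the CPU model name.
# The suffix rules below are naming conventions, not a guarantee --
# always confirm on the manufacturer's spec page.

def likely_has_igpu(cpu_name: str) -> bool:
    """Guess whether a CPU has integrated graphics from its model name."""
    name = cpu_name.lower()
    # AMD APUs usually advertise the iGPU right in the name.
    if "radeon" in name or "vega" in name:
        return True
    # AMD Ryzen "G" suffix (e.g. Ryzen 5 3400G) marks desktop APUs.
    if "ryzen" in name and name.split()[-1].endswith("g"):
        return True
    # Intel "F" suffix (e.g. Core i5-9400F) marks parts WITHOUT an iGPU;
    # most other desktop Core chips include Intel HD/UHD Graphics.
    if "intel" in name or "core" in name:
        return not name.split()[-1].endswith("f")
    return False

print(likely_has_igpu("AMD Ryzen 5 3400G with Radeon Vega Graphics"))  # True
print(likely_has_igpu("Intel(R) Core(TM) i5-9400F"))                   # False
print(likely_has_igpu("Intel(R) Core(TM) i7-9700K"))                   # True
```

Treat a “True” here only as a prompt to double-check the “Processor Graphics” section of the official specs.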
If you have an older version of Windows, you can do it through the Device Manager. Here is how:
- Go to Start.
- Type “Device Manager” and open it. Alternatively, you can find it in the Control Panel.
- Go to the section that says “Display adapters”.
- You will see either your dedicated graphics card, integrated graphics, or both here.
Don’t panic if you do not see your integrated graphics card right away. If you are using a dedicated graphics card, your motherboard has most likely disabled the onboard graphics automatically. The following step will help you with this.
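On any Windows version you can also list the adapters from PowerShell with `Get-CimInstance Win32_VideoController`, which returns the same names Device Manager shows. As a sketch of how you might sort those names into integrated and dedicated GPUs (the keyword lists below are illustrative assumptions, not an exhaustive database):

```python
# Sketch: classify the adapter names that Device Manager (or
# Win32_VideoController) reports under "Display adapters".
# The keyword lists are illustrative, not exhaustive.

INTEGRATED_HINTS = ("intel(r) hd", "intel(r) uhd", "iris", "vega",
                    "radeon(tm) graphics")
DEDICATED_HINTS = ("geforce", "rtx", "gtx", "radeon rx")

def classify_adapter(name: str) -> str:
    """Return 'dedicated', 'integrated', or 'unknown' for an adapter name."""
    lowered = name.lower()
    if any(hint in lowered for hint in DEDICATED_HINTS):
        return "dedicated"
    if any(hint in lowered for hint in INTEGRATED_HINTS):
        return "integrated"
    return "unknown"

for adapter in ("NVIDIA GeForce GTX 1660", "Intel(R) UHD Graphics 630"):
    print(adapter, "->", classify_adapter(adapter))
```

If only the dedicated entry shows up, that matches the situation described above: the board has switched the onboard graphics off, and the next section fixes it.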
Enable Integrated Graphics in BIOS
Now that you have established that your CPU does in fact have onboard graphics, you will need to enable it through the BIOS. Some motherboard manufacturers, such as Asus, include a setting that lets the dedicated and onboard graphics run at the same time.
With Asus motherboards, it is called “iGPU Multi-Monitor” and you should make sure that it has been enabled. Other manufacturers use similar terms for this setting, such as “Enable IGPU”, “Enable integrated graphics”, or simply “Integrated graphics”.
If you do not know how to open your BIOS and find these settings, here is a generalized guide on how you can open it and where to look for it:
- You first have to turn off your computer for this. Save all important data and shut the system down.
- After the system has turned off, turn it on again. As always, there will be a short POST screen before your computer boots up. During that time, repeatedly press the key that opens the BIOS setup; it is usually shown on the screen. The most common keys are F1, F2, F8, F12, Delete, and Escape. You can find the exact key in your motherboard’s user manual or on the motherboard manufacturer’s website.
- Once you are in the BIOS, you need to find the integrated graphics settings. Be careful not to modify anything else as it can break your computer if you do not know what you are doing, but feel free to browse around the different menus. Usually, you can find the integrated graphics settings easily in one of the menus. If not, find the “Advanced settings” and look for “Graphics configuration” or similar.
- Once you have found it, enable the iGPU, Multi-Monitor, or Integrated Graphics setting. If there is a description of the setting, make sure it states that enabling it allows both integrated and discrete graphics to drive displays at the same time.
- When you are done, save the settings and close the BIOS. Your computer will reboot as normal.
Once you have enabled the integrated graphics, it is now time to connect the new display. In case you are not familiar with computers, the following step is helpful.
Connect the New Display
The next obvious step, after you have checked that your CPU has integrated graphics and enabled it, is to use the HDMI port. You will need a standard HDMI cable to do that. If you do not have one, you can buy one at any nearby electronics store for a few bucks. If your monitor does not have an HDMI port, an adapter will work as well.
In case you do not know how to connect your monitor to your motherboard HDMI, here is how it is done:
- Plug in the HDMI (or other) cables to your monitor(s).
- If you are using two monitors, plug the primary monitor into the graphics card. The graphics card ports are usually horizontal and located toward the bottom rear of your computer case.
- Plug the second monitor into your motherboard’s HDMI port. The motherboard’s ports sit in the I/O panel on the rear of the case, usually above the graphics card ports, and unlike the graphics card’s ports they are typically oriented vertically.
- Turn on your computer and enjoy your multi-monitor setup.
Even though your monitor works now, you still need to update your integrated graphics card drivers. If your resolution looks weird and you can’t change it or if the monitor is not working in the first place, then the next step is for you.
Update Your Integrated Graphics Drivers
This step is very easy and, unlike dedicated graphics card drivers, does not require you to download an installer yourself. It is done through the Windows Device Manager:
- Go to Start.
- Type “Device Manager” and open it. You can also find it in the Control Panel.
- Go to the section that says “Display adapters” and find your integrated GPU. If you do not see it there, look under the “Other devices” section, where it may appear as an unknown device.
- Right-click the device and click “Properties”.
- Go to “Driver” and “Update Driver”. Then click on “Search automatically for updated driver software” and wait for it to finish downloading and installing.
- Restart your computer.
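To confirm the update actually took, you can note the driver version (under “Driver” in the device’s Properties) before and after. Driver versions are dotted numbers such as “27.20.100.8935”, which compare naturally as tuples of integers. A small sketch, using made-up example version strings:

```python
# Sketch: compare two driver version strings to confirm an update took.
# The version numbers below are examples, not real driver releases.

def parse_version(version: str) -> tuple:
    """Turn a dotted version string into a comparable tuple of ints."""
    return tuple(int(part) for part in version.split("."))

old, new = "26.20.100.7262", "27.20.100.8935"
if parse_version(new) > parse_version(old):
    print("driver was updated")
```

Comparing tuples rather than raw strings matters because, as strings, "9" would sort after "10".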
You can also let Windows Update handle this, but it sometimes fails to update the driver or does not detect the new device as your onboard graphics. Once you are done with all these steps, you are pretty much good to go. You can find more information about your motherboard HDMI below.
The Impact on Performance of Integrated + Dedicated Graphics
Using your motherboard HDMI port is a great idea, especially if you do not have any free ports on your dedicated graphics card. You may wonder about the hit on the performance that you can expect from this method as opposed to using your graphics card for all displays.
The truth is that using your graphics card for all displays is a bit better in terms of both practicality and performance, but using your motherboard HDMI port is not that bad either. Your integrated GPU relies on system RAM, so you will see a minor hit to available memory. How much depends a lot on what you are doing.
You should not see usage above 1 GB of RAM when watching videos at 1080p. But if you want to play two games at the same time, then the performance will depend a lot on how powerful your computer is.
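A quick back-of-envelope calculation shows why ordinary desktop use stays cheap. The framebuffers for a 1080p display at 32-bit color only need a few tens of megabytes; the driver reserves more on top of that (the frame buffer size set in the BIOS), but nothing close to the full gigabyte:

```python
# Back-of-envelope: shared RAM needed just for the second monitor's
# framebuffers at 1080p with 32-bit color. The driver carves out more
# on top (the BIOS "frame buffer size"), but this shows why plain
# desktop use stays well under 1 GB.

width, height, bytes_per_pixel = 1920, 1080, 4
single_buffer = width * height * bytes_per_pixel   # one full frame
triple_buffered = 3 * single_buffer                # typical buffering

print(f"one frame: {single_buffer / 2**20:.1f} MiB")        # ~7.9 MiB
print(f"triple buffered: {triple_buffered / 2**20:.1f} MiB")
```

Video decoding and games add working buffers of their own, which is why heavier workloads eat noticeably more shared memory.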
If you are playing a game and want to get the most out of your system to get the best FPS possible, you can simply unplug the secondary monitor. Your onboard graphics will enter sleep mode. Once you are done with your intense gaming session, remember to plug the monitor back in.
Advantages of Using Motherboard HDMI
The biggest advantage of using the motherboard HDMI port is the possibility of connecting multiple displays. If you do not have a second HDMI port on your graphics card or if all of your ports are populated, you can connect the new display to the motherboard.
Even if you have a triple monitor setup and use all three HDMI ports on your graphics card, you may still want to plug in a TV as well for some couch gaming with a controller or to watch YouTube.
Having multiple monitors connected has several benefits. It will speed up your workflow, increase productivity, allow for faster multitasking, and so much more. Since most people are working from home due to the ongoing pandemic, maximizing your productivity will give you more time to spend with your family and relax.
And if you are a gamer, having a second display to monitor your hardware utilization or perhaps having a wiki open is a great idea. If you are playing Cyberpunk 2077 and find yourself constantly using your phone to learn about the game, the second monitor will speed up this process, provided that you can afford the minor performance impact due to the RAM usage.
The upside of using the onboard graphics is that you reduce the load on the graphics card, since it no longer has to drive two monitors. This usually means you will have more VRAM to work with, which is especially important now that many games ship high-resolution textures.
Disadvantages of Using Motherboard HDMI
The main disadvantage, compared to driving both monitors from the dedicated graphics card, is that you will not be able to span games across multiple monitors. On AMD, for example, you will be giving up the Eyefinity technology, which has some benefits of its own.
Also, you will notice that your integrated graphics are using some of the system RAM. This is because onboard graphics, unlike a dedicated GPU, do not have their own VRAM; they have to make do with shared system RAM instead. That matters for users who only have 4 GB or even 8 GB, but people with 16 GB or more will probably not even notice it.
Enabling the motherboard HDMI port is not difficult. Most of the time, all you have to do is simply enable the integrated graphics in BIOS if you have a CPU with a GPU. If it does not work right away, then it is most likely the old driver causing the issue. Updating the integrated graphics driver is easy and fuss-free if you follow all of the steps above.
Using your graphics card for your primary display and your motherboard HDMI to drive the second monitor has some advantages and disadvantages. You will have slightly less RAM available, but your graphics card will be free to focus on your primary display. Either way, you will now have a second monitor that you can use for many things.
Many people are still working from home due to COVID, so increasing your productivity by getting a second display is a great idea. And if you are a gamer, you may want to have a YouTube video or Wiki open so that you can follow a guide while playing at the same time. Also, you can use the second display to check if your hardware is being utilized properly.