General Tech · Bugs & Fixes · Posted 16 Aug 2022
I've got a very strange and annoying problem whose source I can't pinpoint exactly, involving a new 1080p monitor (a refurbished Dell UZ2315) and a graphics card (a Sapphire RX470 Mining Edition) I bought recently (early January). I never mined with the card; I bought it because it was a powerful GPU I could get cheaply to replace my five-year-old R7 240 now that cryptocurrency mining is losing popularity. Being only a month old, it's still at the beginning of its warranty period, so in the worst case, if it turns out to be a hardware issue, I can send it back.
Until now I had an old 17" 1440x900 Acer monitor with only a VGA input, which I connected to the only DVI port of my graphics card through a DVI-D → HDMI adapter chained into an HDMI → VGA adapter I already had, and it worked (and still works) without any problem.
With the refurbished Dell monitor, however, I get strange issues, but only with my graphics card. The monitor has one VGA port, two HDMI ports, and one DisplayPort.
If I connect the DVI port of my RX470 to one of the HDMI ports or to the VGA port, I pretty much can't use the 1080p resolution on Windows: with the VGA adapter the entire screen permanently shows randomly coloured noise, and on an HDMI port the monitor very rarely gets in sync, so the screen stays black almost all the time. If by chance the monitor does accept the 1080p signal, red dots appear in dark areas. However, if I reduce the resolution even just to the next compatible one down (1600x900 on Windows), it works perfectly over both VGA and HDMI.
And the strangeness doesn't stop there. During boot, when GRUB displays at 1080p, it works perfectly, but if I boot into openSUSE the symptoms change compared to Windows. On openSUSE I can use all available resolutions (1080p is displayed), but whatever resolution I pick, lots of red dots show up in dark areas of the image, like 1080p on Windows but here at every resolution, including the 1440x900 of my old monitor (which I tried again right after and which still displays correctly at that resolution). These red dots don't appear totally randomly: if, for example, I take a screenshot of a frame that generates some of them and move the window, the dots stay at the same position within the screenshot, following the picture instead of staying fixed on the screen. And if I display that screenshot on my old monitor, the dots don't appear, so they're not part of the frame buffer rendered by the GPU.
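One diagnostic worth trying on the openSUSE side, if the dots are a signal-integrity problem, is a 1080p mode with a lower pixel clock. Here's a minimal sketch, assuming an X11 session with xrandr available; the output name HDMI-A-0 is hypothetical (run `xrandr` first to find the real one). It registers the CVT reduced-blanking timing for 1920x1080 @ 60 Hz (138.50 MHz pixel clock instead of the standard 148.50 MHz) and switches to it:

```python
import subprocess

# Hypothetical output name -- run `xrandr` and substitute the real one
# (it will look like DVI-D-0, HDMI-A-0, DisplayPort-0, ...).
OUTPUT = "HDMI-A-0"

# CVT reduced-blanking timing for 1920x1080 @ 60 Hz, as printed by
# `cvt -r 1920 1080 60`: 138.50 MHz pixel clock instead of the standard
# 148.50 MHz, which slightly relaxes the demands on the link.
MODE = "1920x1080R"
MODELINE = ["138.50", "1920", "1968", "2000", "2080",
            "1080", "1083", "1088", "1111", "+hsync", "-vsync"]

def run(cmd):
    """Echo and execute an xrandr invocation, failing loudly on error."""
    print("$", " ".join(cmd))
    subprocess.run(cmd, check=True)

run(["xrandr", "--newmode", MODE] + MODELINE)        # register the timing
run(["xrandr", "--addmode", OUTPUT, MODE])           # attach it to the output
run(["xrandr", "--output", OUTPUT, "--mode", MODE])  # switch to it
```

If the dots disappear (or thin out) with the reduced-blanking mode, that would point at the link rather than the GPU's rendering, consistent with the screenshot observation above.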
Finally, I tried connecting the Dell monitor to my iGPU using the DVI-D output on my motherboard (with the same HDMI adapter) and also the VGA output, and both work perfectly at 1080p. The same goes for my PS4, so the problem comes neither from the cables nor from the adapters; only the video output of my RX470 used with this specific monitor misbehaves, whether digital or analog (I'd like, if possible, to find a third monitor to see whether I can reproduce some of these behaviours). I also stress-tested the graphics card and it ran smoothly, and I tried the second BIOS (of the four) by setting the switch, but the results are the same.
Does anybody have any idea where exactly this could come from (the video output or the monitor)? But then why does 1440x900 fail on the Dell monitor over both analog and digital, when it's fine on the Acer? It's the highest resolution common to both of them. The VGA adapter has only a few centimetres of cable on the HDMI side, so an overly long cable can't be the cause, and if the output signal were messy I should get the same converted output on both monitors, no? Also, why can I use every resolution on Linux with the Dell (all of them messy), while on Windows only 1080p is unusable and the other resolutions are all fine, and why does none of this happen with the old monitor? Is it really the DVI output? Too many adapters weakening the TMDS signal too much, even with the 15-20 cm cable of the VGA adapter? Something generating too much electrical noise in this specific configuration? An improper configuration or sync problem occurring only between the RX470 and the Dell monitor? A GPU BIOS issue? (I tried flashing the BIOS of the gaming Sapphire Nitro, but the card wouldn't work correctly at all, so for now I've reverted to the original BIOS.)
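On the "too many adapters weakening the signal" hypothesis, a quick back-of-the-envelope comparison of pixel clocks shows why a marginal link could fail only at the top mode. This is a rough sketch; the horizontal/vertical totals are the common CEA-861/CVT/VESA timings for each mode and should be treated as illustrative:

```python
# Pixel clock = horizontal total x vertical total x refresh rate.
# Totals below are common published timings for each mode; treat the
# exact figures as illustrative rather than authoritative.
modes = {
    "1920x1080 @ 60 (CEA-861)": (2200, 1125, 60),
    "1600x900  @ 60 (CVT-RB)":  (1760,  926, 60),
    "1440x900  @ 60 (VESA)":    (1904,  934, 60),
}

for name, (htotal, vtotal, hz) in modes.items():
    print(f"{name}: {htotal * vtotal * hz / 1e6:6.2f} MHz")

# Output (approximate):
#   1920x1080 @ 60 (CEA-861): 148.50 MHz
#   1600x900  @ 60 (CVT-RB):   97.79 MHz
#   1440x900  @ 60 (VESA):    106.70 MHz
```

1080p at 60 Hz needs roughly 148.5 MHz, about 40-50% more than 1600x900 or 1440x900, so a link degraded by the adapter chain could plausibly sync cleanly at the lower modes and only fall apart at the highest one. It doesn't explain by itself why 1440x900 also shows dots on the Dell under Linux, though.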
Apart from this specific configuration everything works perfectly, but it's annoying: right now I have the RX470 connected to the old 1440x900 monitor while the iGPU is connected to the 1920x1080 monitor, whereas the reverse would be far better (except that then I'd be stuck at 1600x900 as the highest cleanly usable resolution).
If anyone has any idea of what could cause this and how to fix it, it would be very much appreciated. Thanks in advance for your help :)
For information, in case it helps pinpoint the cause, here are briefly the references of the components involved in my current configuration that may be related to this issue: