GP discusses the latest CPUs & GPUs


  1. Post
    BURN_BABY wrote:
    I thought G-Sync had to have a special module in the monitor to work? Or is this the 'Freesync monitor + Nvidia graphics card' implementation?
    There are now three versions of “G-Sync”:

    1. G-Sync using a hardware module that’s built into the monitor (only works via DisplayPort) and requires an Nvidia graphics card.

    2. G-Sync using AMD’s Freesync - the screen and graphics driver must be compatible, as this is a software solution (works over DisplayPort and HDMI).

    3. G-Sync using the HDMI Forum’s Variable Refresh Rate (VRR) standard - the screen and graphics driver must be compatible, as this is a software solution (works over HDMI).

    Nvidia has chosen Option 3 to bring G-Sync to TVs. I think the main reason is that it’s an HDMI 2.1 standard feature, which means easy future support across a range of products and wide support from TV manufacturers.

    Option 2, AMD’s Freesync, will probably die out on TVs now. Only Samsung supports it, while game consoles, Nvidia GPUs and probably all other TV makers use Option 3. And while AMD GPUs do not support Option 3, AMD has said they will add it in a driver update at some point (probably sooner rather than later, now that Nvidia has beaten them to the punch).

  2. Post
    Where will the new consoles fit in that list, with their Navi GPUs? TVs blocking out whatever tech the consoles support would be a dud move. I bet 1,000,000x as many people game on consoles plugged into TVs as there are PC gamers.

  3. Post
    Fragluton wrote:
    Where will the new consoles fit in that list, with their Navi GPUs? TVs blocking out whatever tech the consoles support would be a dud move. I bet 1,000,000x as many people game on consoles plugged into TVs as there are PC gamers.
    Next-gen consoles are an unknown quantity; we only know that both support variable refresh gaming. The current Xbox One X supports both HDMI Forum VRR and Freesync, so right now you get variable refresh gaming from an Xbox One X whether you plug it into a Samsung or an LG TV (and both have the same limitation that the variable refresh range only starts at 40Hz, so any game running under 40fps still tears). So maybe they'll just do both - after all, even if all TVs used VRR, some gamers may still want to plug their consoles into a Freesync desktop monitor, so supporting both would be nice.
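    The 40Hz floor described above can be sketched as a tiny helper (the function name and the 120Hz ceiling are illustrative assumptions, not from any console SDK):

    ```python
    def in_vrr_window(fps, floor_hz=40, ceiling_hz=120):
        """Return True if the display can match this framerate adaptively.

        Outside the window (e.g. a game dipping below the 40Hz floor
        mentioned above), the TV falls back to fixed refresh, so tearing
        or stutter returns.
        """
        return floor_hz <= fps <= ceiling_hz
    ```

    So a game holding 45fps stays tear-free (`in_vrr_window(45)` is True), while one dipping to 35fps falls out of the window (`in_vrr_window(35)` is False).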

    Also, get hyped for the 3950X. And maybe get your order in when they pop up, because Ryzen supplies could take a hit: TSMC is under stress from 7nm demand and has just increased product lead times from two months to six months. Which is pretty hilarious, because just a few months ago TSMC was saying they had spare 7nm capacity, and now they don't have enough.

    https://www.techpowerup.com/259289/t...t-availability

    Last edited by SirGrim; 18th September 2019 at 9:20 am.

  4. Post
    If they're having trouble keeping up, it must mean Zen 2 is a roaring success. Pretty happy for AMD; I just hope Intel can keep up, otherwise we'll still be stuck in the same situation as the last few years, just with the names reversed.

  5. Post
    SirGrim wrote:
    There are now three versions of “G-Sync”:

    1. G-Sync using a hardware module that’s built into the monitor (only works via DisplayPort) and requires an Nvidia graphics card.

    2. G-Sync using AMD’s Freesync - the screen and graphics driver must be compatible, as this is a software solution (works over DisplayPort and HDMI).

    3. G-Sync using the HDMI Forum’s Variable Refresh Rate (VRR) standard - the screen and graphics driver must be compatible, as this is a software solution (works over HDMI).

    Nvidia has chosen Option 3 to bring G-Sync to TVs. I think the main reason is that it’s an HDMI 2.1 standard feature, which means easy future support across a range of products and wide support from TV manufacturers.

    Option 2, AMD’s Freesync, will probably die out on TVs now. Only Samsung supports it, while game consoles, Nvidia GPUs and probably all other TV makers use Option 3. And while AMD GPUs do not support Option 3, AMD has said they will add it in a driver update at some point (probably sooner rather than later, now that Nvidia has beaten them to the punch).
    I don't think Freesync will die out on TVs. Freesync is built on VESA Adaptive-Sync, is an open standard and royalty-free, is compatible with the AMD-based consoles, and works over DisplayPort 1.2a and up and HDMI 1.4 and 2.0.

  6. Post
    tl;dr: Just capping the framerate to keep GPU usage under 95% gives the best low-latency experience, better than either technology from Nvidia or AMD.
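    The idea behind that kind of cap can be sketched as a toy loop (a minimal Python illustration; real limiters like RTSS or driver-level caps work at a much lower level):

    ```python
    import time

    def run_capped(render_frame, target_fps, frames):
        """Render a fixed number of frames, sleeping off the rest of each
        frame budget so the GPU never runs saturated back-to-back.

        That idle slack is the headroom that keeps the render queue short
        and latency low.
        """
        frame_budget = 1.0 / target_fps
        for _ in range(frames):
            start = time.perf_counter()
            render_frame()  # the actual per-frame work
            remaining = frame_budget - (time.perf_counter() - start)
            if remaining > 0:
                time.sleep(remaining)  # idle time = GPU usage below 100%
    ```

    With a 100fps cap, ten frames of trivial work still take at least ~0.1s of wall time, because the loop sleeps away the unused budget.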




    Last edited by SirGrim; 20th September 2019 at 7:35 pm.

  7. Post
    What coolers are folks using with AM4, ideally a 3700X? I've got a 240mm AIO and it's loud; I think it's struggling with the 3700X after upgrading from a 1700X.

  8. Post
    Soldier_Five wrote:
    What coolers are folks using with AM4, ideally a 3700X? I've got a 240mm AIO and it's loud; I think it's struggling with the 3700X after upgrading from a 1700X.
    https://www.computerlounge.co.nz/sho...id-cooling-kit

    Got that on a 3600; can't hear it at all in an open-air case.

  9. Post
    Soldier_Five wrote:
    What coolers are folks using with AM4, ideally a 3700X? I've got a 240mm AIO and it's loud; I think it's struggling with the 3700X after upgrading from a 1700X.
    I've got an H100i Platinum with a 3700X and it's fine - 26.5°C water temp right now.

  10. Post
    If a 240mm AIO is struggling, something is not quite right. I'd be looking at voltages under load (are temps actually high?). Is it pump noise or fan noise? Quieter fans might help.

  11. Post
    Soldier_Five wrote:
    What coolers are folks using with AM4, ideally a 3700X? I've got a 240mm AIO and it's loud; I think it's struggling with the 3700X after upgrading from a 1700X.
    I had to play around with the Windows power settings for my 3700X, as my 240 was doing the same - constantly revving up and down. The CPU would idle at around 50°C and 1.5V. You could also set up a more conservative fan curve.

  12. Post
    The 16-core 3950X has been delayed again.

    It didn't launch with the rest of the lineup; the 16-core part was moved back to September.
    And now AMD has announced that it's not happening this month either - the 3950X ETA is now the end of November.

    I'm pretty sure I saw a post in the yolo thread from someone who purchased a system and was just waiting on the 3950X to build it... hope he has a backup plan.
    Last edited by SirGrim; 21st September 2019 at 7:56 am.

  13. Post
    So it looks like it's constantly running near max boost clock, around 4366MHz, even when I'm just in Windows and CPU load is sitting around 2-3%. Voltages appear to be a little on the high side, 1.428V. Do I just go into the BIOS and tinker? Not really one for fiddling with overclocking bits.

  14. Post
    Soldier_Five wrote:
    So it looks like it's constantly running near max boost clock, around 4366MHz, even when I'm just in Windows and CPU load is sitting around 2-3%. Voltages appear to be a little on the high side, 1.428V. Do I just go into the BIOS and tinker? Not really one for fiddling with overclocking bits.
    I was lazy and just used Gigabyte EasyTune with mine. I've set it to a 3.8GHz boost on all cores, but I personally don't need more than that.

    I'm sure your mobo will have some sort of overclocking software to use. Or you could just tinker in the BIOS.

  15. Post
    My 3600 did this until I changed the Windows power plan from High Performance to Balanced. Idle clocks went from 4.2GHz to 2GHz and the fans quieted down.


  16. Post
    Flashed the ABBA BIOS last night. Now my max boost has increased from 4250MHz to 4550MHz. Pretty happy really.

  17. Post
    Wow, that's a massive boost. What motherboard?

    Hardware Unboxed showed that pairing Ryzen 3000 with cheaper boards resulted in lower boost clocks, whereas the high-end stuff was already getting near the advertised boost. So it could be that the cheap boards are the ones showing the big clock-speed gains with the ABBA BIOS.

  18. Post
    It's an X570 Aorus Pro WiFi, so not exactly the highest-spec X570. It shouldn't matter though; every X570 board should be able to reach the advertised boost clocks, and if they can't, the advertised boost clock should be lowered.

  19. Post
    Same board as mine, and yes, my 3900X is definitely boosting higher on single core.

  20. Post
    With the constant 3950X delays, the new Cascade Lake-X parts could be a viable alternative as well.

    Last-gen parts have been out of touch with 2019 prices since Ryzen 3000's launch, with next-gen Threadripper also coming - but Intel says 10th-gen Cascade Lake-X will have up to double the price/performance ratio. We know the performance isn't going up - they only have about a 100MHz clock-speed bump over last gen - so that means the prices are dropping by half, or close to it. For example, the current 14-core X part retails around $2k; cut that in half and its 10th-gen equivalent could come in just slightly above the 3900X's price. You also get the extra benefits of AVX-512, tons of PCIe lanes and quad-channel memory, with the downsides that it will use more power and make more heat.
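    Working through that arithmetic (the ~$2k figure is the poster's estimate, and the function is just an illustration, not anything Intel published):

    ```python
    def implied_price(current_price, perf_ratio, price_perf_gain):
        """Price implied by a claimed price/performance improvement.

        If price/perf doubles while performance stays roughly flat
        (perf_ratio ~= 1.0), the price itself must roughly halve.
        """
        return current_price * perf_ratio / price_perf_gain

    # Current 14-core HEDT part at ~$2000, ~flat performance, 2x price/perf:
    print(implied_price(2000, 1.0, 2.0))  # -> 1000.0
    ```

    Which is how a ~$2k part ends up in the same neighbourhood as the 3900X's launch price.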

    So it appears Cascade Lake-X will be used to compete with Ryzen 3000 by cutting its prices in half. Could be interesting times ahead - competition is good.

  21. Post
    Cascade Lake is having fab issues. Most of the current stock has been relegated to enterprise use.

  22. Post
    Keep in mind that when Intel claims twice the price/performance, they are probably talking about one benchmark and one particular CPU. To believe anything more is naive.

    They can't make the chips as cheaply as AMD can, that's for sure.

  23. Post
    Also, it will only last a month before it gets blitzed by the new Threadripper.
    Last edited by Siris Le Osiris; 24th September 2019 at 11:31 am.

  24. Post
    Hitting a 4591MHz boost on my 3900X now. 9MHz under the advertised speed is something I can live with, especially since I just have a "budget" X570 board. I'm glad AMD has been able to redeem themselves here; the discrepancy in boost clocks was a real stain on an otherwise smooth launch of an amazing product.

  25. Post
    I'm guessing most people missed this from last year, but there's once again a third x86 manufacturer.

    https://en.wikipedia.org/wiki/Zhaoxin

    A Chinese brand that's a joint venture with the old third x86 manufacturer, VIA. Apparently their 8-core chip (16nm) last year matched the quad-core i5-7400, some Lenovo laptops already use their processors, and they're planning a new chip release this year using DDR5 and PCIe 4.0 on 7nm.

    Pretty exciting stuff - I learned about it today when CPUID announced that CPU-Z now reports their processors.