GP discusses the latest CPUs & GPUs

Thread Rating: 5.00 average (5 votes)
Results 15,626 to 15,650 of 15,873

  1. Post
    SirGrim wrote:
    GPU of the year has arrived, I know Eva would buy 10 in a heartbeat.

    This special Waifu edition 5700xt is only sold in China (I don't know if I should feel sad or relieved)

    Attachment 227670
    Thank god it doesn't have half-naked anime characters on it like the old cards used to, so cringeworthy. At least this one's just using a unique colour scheme; I'd like to see more makers mix up their patterns.

  2. Post
    Ballistica wrote:
    Thank god it doesn't have half-naked anime characters on it like the old cards used to, so cringeworthy. At least this one's just using a unique colour scheme; I'd like to see more makers mix up their patterns.
    Like the PC equivalent of Itasha almost?

  3. Post
    In case someone gets the urge to buy it
    https://detail.tmall.com/item.htm?sp...ne=taobao_shop

  4. Post
    Ballistica wrote:
    Thank god it doesn't have half-naked anime characters on it like the old cards used to, so cringeworthy.
    You spoke too soon... look at the backplate and box lmao!

    Attachment 227673

    Attachment 227674

  5. Post
    Th3WhiteKnight wrote:
    Like the PC equivalent of Itasha almost?
    I have no idea what this means

    SirGrim wrote:
    You spoke too soon... look at the backplate and box lmao!

    Attachment 227673

    Attachment 227674
    Oh dear Lord I did.

  6. Post
    *gets excited about new gen*

    *realizes he still does not have anything worth upgrading for until next year*

    eh maybe 4th gen Ryzen?

  7. Post
    3950x reviews trickling out.

    She's a warm beast. It averaged around 4.3GHz in Gamers Nexus's tests, only hitting 4.7GHz for a couple of seconds. They did manage an all-core 4.4GHz overclock, which boosts performance in tests at the cost of an extra 100W of heat. At 4.4GHz a 360mm AIO was not enough, with the temperature nearly at 100°C. At stock settings an AIO is fine, but overclocking really needs a custom loop. Gaming performance is as expected: the 3900X is better for gaming, while the 3950X is better for workloads.

    https://youtube.com/M3sNUFjV7p4
    Last edited by SirGrim; 15th November 2019 at 4:20 am.

  8. Post
    Why would you realistically need to overclock it?

    Stock it uses the same power as a 9900K in Blender, while providing twice the core count and near enough twice the performance. It runs at lower voltage than the 3900X, so the silicon is clearly better. What a beast!

    Overall it looks to game better than the 3900X, especially if you're not gaming at 1080p, which no one with a 3950X would, ever. Many games are just very poorly optimised, so there is only so much brute force can do in those games. The newer the game, the better the AMD chips perform, which to me says game developers are finally taking AMD seriously and actually optimising for them.

    https://youtu.be/wmqT2-2seT0
    Last edited by Fragluton; 15th November 2019 at 10:05 am.

  9. Post
    Fragluton wrote:
    Why would you realistically need to overclock it?
    Why would you realistically need a desktop 16 core CPU?

    If you need to ask you aren't YOLO enough.

  10. Post


    What a beast. My sister does 4K video editing on an i7; this thing would be bloody awesome for her. Looks like it uses less power than the 3900X? Very WTF.

  11. Post
    SirGrim wrote:
    Why would you realistically need a desktop 16 core CPU?

    If you need to ask you aren't YOLO enough.
    Anyone that does video rendering will enjoy big core counts. Gamers don't need 16 cores, but that isn't what the 3950X is aimed at. Currently have a mate asking me 3900x or 3950x for his work / gaming system. Hard call. That said, he currently runs a 7700k, so both choices are a win.

  12. Post
    The only thing that bugs me is that even the silicon the 3950X gets could be better. Someone needs to kill the server market so all the best silicon hits the desktop chips for us.

  13. Post
    Fragluton wrote:
    Anyone that does video rendering will enjoy big core counts. Gamers don't need 16 cores, but that isn't what the 3950X is aimed at. Currently have a mate asking me 3900x or 3950x for his work / gaming system. Hard call. That said, he currently runs a 7700k, so both choices are a win.
    Well, an all-core overclock seems to grant 10% extra performance in work tasks. So if you need the extra 4 cores from 12 to 16 that badly, another 10% from overclocking is not far-fetched.

    Your friend is using a quad core for work? That's unfortunate.
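    As a back-of-the-envelope check of that claim, here's a toy sketch (assuming perfect core scaling, which real workloads won't hit, so treat it as a best case):

```python
# Rough best-case uplift estimate: extra cores plus an all-core overclock.
# Assumes perfect core scaling, which no real workload achieves.
def uplift(base_cores, new_cores, oc_gain=0.10):
    core_scaling = new_cores / base_cores   # ideal gain from extra cores
    return core_scaling * (1 + oc_gain)     # OC gain stacks multiplicatively

total = uplift(12, 16)  # ~1.47x over a stock 12-core in the ideal case
```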

    nzbleach wrote:

    What a beast. My sister does 4K video editing on an i7; this thing would be bloody awesome for her. Looks like it uses less power than the 3900X? Very WTF.
    There is a bit of variance in the reviews. Some reviewers are getting low temps and showing lower power draw than the 3900X; others are getting the reverse. Either there is wide variance in quality between the chips, or once again there are motherboard BIOS problems afoot.
    Last edited by SirGrim; 15th November 2019 at 10:24 am.

  14. Post
    7700K still has the edge in older games. I'm tempted to hold onto mine till Ryzen 4.

  15. Post
    one_red_god wrote:
    7700K still has the edge in older games. I'm tempted to hold onto mine till Ryzen 4.
    Still a great chip. It holds its resale value well, so now might be the time to sell it though.

  16. Post
    SirGrim wrote:
    Well, an all-core overclock seems to grant 10% extra performance in work tasks. So if you need the extra 4 cores from 12 to 16 that badly, another 10% from overclocking is not far-fetched.

    Your friend is using a quad core for work? That's unfortunate.
    I never said he needed the extra 4 cores that badly, but I really don't see the need to overclock such a chip. If the power usage were the same, sure, but it's not, so you need to look at more exotic cooling solutions. At stock you can just use a decent AIO cooler and job done. As for the quad core, yeah, I'm surprised he hasn't upgraded it, but I think that particular rig is upgraded every two or so years. He's still rocking a 1080Ti because the 2080Ti was priced so stupidly.

    7700K as mentioned is still pretty good in games, paired with a 1080Ti I can see why he wasn't fussed about upgrading till something properly worthwhile was released.

  17. Post
    Fragluton wrote:
    7700K as mentioned is still pretty good in games, paired with a 1080Ti I can see why he wasn't fussed about upgrading till something properly worthwhile was released.

  18. Post
    Am I supposed to watch that and work out how it relates to the quote? lol

    Ignoring the fact it's likely another poorly done PC port of a game, give me a tl;dr. GamersNexus have covered some random bugs with the game.

    Screenshot from GamersNexus. You miss out on the average FPS but still match the 9900K minimum framerates. Not a bad result really; minimum FPS is a lot more important than average IMO, at least when you're north of 60FPS. The 8700K low results are actually worse than the 7700K, lol'd. tl;dr, shit game to use for comparisons.
    Attachment 227719
    Last edited by Fragluton; 15th November 2019 at 11:29 am.

  19. Post
    7700K is fine in RDR2; the GPU is far more important. Sure, if you need 120FPS or something it's a bit slower, but you are unlikely to find that a common scenario.

  20. Post
    Ballistica wrote:
    I have no idea what this means
    It translates from Japanese to English as "Painful Car"

    It's a car thing and that card is like the pc equivalent.

    https://en.wikipedia.org/wiki/Itasha

  21. Post
    Th3WhiteKnight wrote:
    It translates from Japanese to English as "Painful Car"

    It's a car thing and that card is like the pc equivalent.

    https://en.wikipedia.org/wiki/Itasha
    By "painful" I thought it was an insult, but Wikipedia suggests that's just what it's called. I guess it's like Otaku in the sense that it's literally negative but not really for those who call themselves that?

  22. Post
    Some 3DCenter users found that, without any announcement, Nvidia enabled SLI CFR mode in their latest drivers.
    This "multi-gpu checkerboard rendering" mode apparently works in any DX11 or DX12 game and requires the two GPUs to use an NVLink bridge; it does not require the game developer to support SLI, as it's handled in the driver.

    Here is Metro Exodus running in DX12 with this new CFR multi-GPU mode enabled in the driver. The game doesn't officially support DX12 multi-GPU, yet it's working fine with a 30 to 60% performance uplift compared to a single 2080 Ti.

    Since this feature wasn't announced, I wonder if it's an error. It could be Nvidia laying the groundwork for future MCM chiplet-architecture graphics cards, which are rumored to be coming to the RTX 4000 line-up in 2021/2022.
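    For anyone unfamiliar with the idea, here's a toy sketch of how checkerboard-style frame splitting works (this is just an illustration of the tile assignment, not Nvidia's actual driver logic):

```python
# Toy sketch of checkerboard frame rendering (CFR): the frame is cut into
# tiles, and the two GPUs take alternating tiles like the squares of a
# checkerboard, so neither GPU renders one contiguous (and possibly
# unevenly complex) half of the scene.
def assign_tiles(cols, rows):
    """Return a grid where each tile is owned by GPU 0 or GPU 1."""
    return [[(x + y) % 2 for x in range(cols)] for y in range(rows)]

grid = assign_tiles(4, 4)  # each GPU owns exactly half of the 16 tiles
```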

    Last edited by SirGrim; 20th November 2019 at 11:56 am.

  23. Post
    Did they report any microstutter etc?

  24. Post
    Not sure, the 3DCenter forums are all in German; I'm just reposting what I found on reddit.

    I did go to the 3DCenter link, and the poster put up some screenshots where you can see the frametime graph (I've linked them below). They look fairly smooth, though I'm not certain what microstutter looks like, as I've never encountered it before.

    https://www.imagebanana.com/s/big/1576/swNzLALT.png
    https://www.imagebanana.com/s/1576/eTwo9UeD.html
    https://www.imagebanana.com/s/1576/xpHE7f2k.html
    https://www.imagebanana.com/s/1576/hhUwWNmz.html
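    For what it's worth, microstutter shows up as uneven frametimes even when the average FPS looks fine. A crude way to check a frametime log (the numbers below are made-up examples, not from those screenshots):

```python
# Crude microstutter check: compare the worst single frametime to the
# average. Ratios near 1.0 mean even pacing; big ratios mean hitching.
def stutter_ratio(frametimes_ms):
    """Worst frame vs the average frame, from a list of frametimes in ms."""
    avg = sum(frametimes_ms) / len(frametimes_ms)
    return max(frametimes_ms) / avg

smooth = [16.6, 16.8, 16.5, 16.7]    # steady ~60 FPS pacing
stuttery = [8.0, 25.0, 8.5, 24.5]    # same average FPS, uneven pacing
```

Both lists average out to roughly the same FPS, which is exactly why average-FPS numbers can hide microstutter.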

  25. Post
    SirGrim wrote:
    Some 3DCenter users found that, without any announcement, Nvidia enabled SLI CFR mode in their latest drivers.
    This "multi-gpu checkerboard rendering" mode apparently works in any DX11 or DX12 game and requires the two GPUs to use an NVLink bridge; it does not require the game developer to support SLI, as it's handled in the driver.

    Here is Metro Exodus running in DX12 with this new CFR multi-GPU mode enabled in the driver. The game doesn't officially support DX12 multi-GPU, yet it's working fine with a 30 to 60% performance uplift compared to a single 2080 Ti.

    Since this feature wasn't announced, I wonder if it's an error. It could be Nvidia laying the groundwork for future MCM chiplet-architecture graphics cards, which are rumored to be coming to the RTX 4000 line-up in 2021/2022.

    That's cool! I miss SLI/CF setups; I had 2x GTX 760s in SLI a few years back.