GP discusses the latest CPUs & GPUs


  1. Post
    Fragluton wrote:
    All I wanted for Xmas was an affordable GPU that can do 144 Hz 1080p, in ultra, in all my games. Maybe next Xmas...
    I just want that at 1440p, holding a minimum of 144 fps ultra in everything I play. My 1080 Ti can't do that.

  2. Post
    I am hoping that by this Xmas I can get a GPU that can do 4K 60 FPS ultra with some headroom, at half the current price of the 2080 Ti. My Strix 1080 Ti will have to do till then, with some settings turned down for new games.

  3. Post
    SirGrim wrote:
    Doesn't quite clear up the question people want answered: will Zen 3 launch in 2020 or 2021?
    You missed the last sentence when you quoted:

    Rather than ask me the question three times, Ian [laughs], let me be clear: you will see Zen 3 in 2020!
    Sounds like they think CES wasn't the appropriate venue this year and decided to stick to laptops instead. Guess it makes sense, since most people think of laptops as gadgets and CES is mostly a gadget show.

    Guess the next time we may hear something from AMD/Nvidia on graphics cards is GTC in late March? Gonna be a pretty boring GPU Technology Conference if there aren't any new GPUs to announce, lol.

  4. Post
    Maybe AMD will do an Intel and sit on the tech till the competition releases their next thing.

  5. Post
    AMD will actually have something to release though. Intel have nothing this year.

  6. Post
    Fragluton wrote:
    Yeah, they'll be smiling all the way to the bank. 3xxx series cards will be priced at least as badly as 2xxx series cards if AMD doesn't bring something to the table. RIP your wallet. Sounds like you're waiting for the next big thing, given the 3950X is an amazing chip but you're not keen now?

    All I wanted for Xmas was an affordable GPU that can do 144 Hz 1080p, in ultra, in all my games. Maybe next Xmas...
    Oh, it does everything I want. I'm just hoping Zen 3 will bring the price down a notch; I won't be upgrading until I come home in April/May.

  7. Post
    The vast majority of the consumer base for graphics cards is in the <US$300 bracket. nVidia bets, probably rightly, that the people chasing the best performance will pay a premium for it. AMD focused the RX 570/580/590 and 5700/5700 XT where they did because that's where the majority of the customer base is. AMD has scarcer engineering resources than nVidia, so it makes perfect sense to dedicate those resources where they will generate the most return. Sure, it means that nVidia has the premier halo products and dominates the high end, and that's not great for hardcore consumers who want the traditional "high-end" tier (like the 2070/2070 Super, as opposed to the "enthusiast" tag that applies to the 2080/Ti) to become cheaper, but it makes business sense for AMD.

    Also doesn't help that CUDA is so ingrained in machine learning and medical technology now that it's hard for AMD to get a foot in the door in the enterprise space, which is where the really big bucks come from.

  8. Post
    Th3WhiteKnight wrote:
    I just want that at 1440p, holding a minimum of 144 fps ultra in everything I play. My 1080 Ti can't do that.
    Depending on what your definition of "affordable" is and what games you play, that's asking for a lot. The 2080 Ti does not do 144 fps @ 1440p Ultra in just about any modern game, or even in some not-so-modern ones. GPUs have become progressively more capable over time, but it seems like 1440p 144 fps at Ultra won't be a standard expectation for a while yet, even for top-end cards.

    This is an area where competition from AMD would be useful. That the 1080 Ti performs basically the same as, or very slightly below, the 2080 is pretty crappy, but it's not like the Vega 56/64 or Radeon VII were real competitors, if only because you couldn't find those cards even if you wanted one.

  9. Post
    azarat wrote:
    Depending on what your definition of "affordable" is and what games you play, that's asking for a lot. The 2080 Ti does not do 144 fps @ 1440p Ultra in just about any modern game, or even in some not-so-modern ones. GPUs have become progressively more capable over time, but it seems like 1440p 144 fps at Ultra won't be a standard expectation for a while yet, even for top-end cards.

    This is an area where competition from AMD would be useful. That the 1080 Ti performs basically the same as, or very slightly below, the 2080 is pretty crappy, but it's not like the Vega 56/64 or Radeon VII were real competitors, if only because you couldn't find those cards even if you wanted one.
    2560 × 1440 = 3.7 MP
    3840 × 2160 = 8.29 MP
    144 / 60 = 2.4
    8.29 / 3.7 = 2.24

    On a raw pixels-driven calculation, 2560×1440 @ 144 Hz > 3840×2160 @ 60 Hz, and the 2080 Ti can only JUST manage the latter in FFXV, a 2016 game.
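    A quick sketch of that pixel-rate arithmetic, for anyone who wants to run the numbers themselves (pure pixels per second, ignoring per-pixel shading cost):

    ```python
    # Raw pixel-throughput comparison: 1440p @ 144 Hz vs 4K @ 60 Hz.
    # Pixels per second is a crude proxy for GPU load (per-pixel shading
    # cost varies by game), but it shows the scale of the ask.

    def pixels_per_second(width: int, height: int, refresh_hz: int) -> int:
        return width * height * refresh_hz

    qhd_144 = pixels_per_second(2560, 1440, 144)  # ~531 million px/s
    uhd_60 = pixels_per_second(3840, 2160, 60)    # ~498 million px/s

    print(f"1440p144: {qhd_144 / 1e6:.0f} Mpx/s")
    print(f"2160p60:  {uhd_60 / 1e6:.0f} Mpx/s")
    print(f"ratio:    {qhd_144 / uhd_60:.2f}")    # ~1.07: 1440p144 is the harder target
    ```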
    [Attached chart: FFXV 4K benchmark results]

    It could manage it in Shadow of War and Battlefield 1, though.
    [Attached charts: Shadow of War and Battlefield 1 benchmark results]
    https://www.anandtech.com/show/13346...ition-review/6

    Although on a competition front, here comes Intel:
    While CES 2020 technically doesn’t wrap up for another couple of days, I’ve already been asked a good dozen or so times what the neatest or most surprising product at the show has been for me. And this year, it looks like that honor is going to Intel. Much to my own surprise, the company has their first Xe discrete client GPU, DG1, at the show. And while the company is being incredibly coy on any actual details for the hardware, it is none the less up and running, in both laptop and desktop form factors.
    https://www.anandtech.com/show/15364...ed-at-ces-2020

  10. Post
    The 1st-gen Intel Xe DG1 (what's up with these names...) will be sold as a "software development GPU" and not for gaming. At CES they did have a demo of it playing Destiny 2 at 1080p 30 fps, but it's not a gaming GPU; its current ability to run games seems to be on the same level as 2013's PlayStation 4 (~1.84 TFLOPS).
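    For reference, that console figure comes from the conventional peak-FP32 formula (2 FLOPS per shader per clock, i.e. one fused multiply-add); a quick sketch using the PS4's published specs:

    ```python
    # Peak FP32 throughput = 2 FLOPS (one FMA) x shader count x clock.
    # PS4 GPU: 18 CUs x 64 shaders/CU = 1152 shaders at 800 MHz.

    def peak_tflops(shaders: int, clock_ghz: float) -> float:
        return 2 * shaders * clock_ghz / 1000.0

    print(f"PS4: {peak_tflops(1152, 0.8):.2f} TFLOPS")  # 1.84
    ```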

  11. Post
    I thought Intel are making a gaming GPU?

  12. Post
    SirGrim wrote:
    The 1st-gen Intel Xe DG1 (what's up with these names...) will be sold as a "software development GPU" and not for gaming. At CES they did have a demo of it playing Destiny 2 at 1080p 30 fps, but it's not a gaming GPU; its current ability to run games seems to be on the same level as 2013's PlayStation 4 (~1.84 TFLOPS).
    nzbleach wrote:
    I thought Intel are making a gaming GPU?
    They're planning both, and more. The full product stack.

    [Attached images: the Xe product stack slide and the LP-branded DG1 hardware]
    The discrete one didn't have an external power connector, so it draws less than 75 W (the most a PCIe slot alone can supply). And there was a laptop one. Assuming they're running a reference design (one chip for both), you'd slot it in around GT 1030 territory, which would make its performance competitive, and I don't imagine they will find it difficult to scale up.

    It looks like they're targeting IPC for gaming and raw throughput for everything else, based on the LP designation in the picture.

    Whether they succeed is a different question.

    Edit:
    "Software Development Vehicle", not "Software Development GPU": it's a card for ISVs to develop their software against (tuning for the GPU), rather than one to develop software on. This will probably include games and engines in active development.

    Dubbed the “DG1 Software Development Vehicle”, the desktop DG1 card is being produced by Intel in small quantities so that they can sample their first Xe discrete GPU to software vendors well ahead of any retail launch. ISVs regularly get cards in advance so that they can begin development and testing against a new GPU architecture, so Intel doing the same here isn’t too unusual. However it’s almost unheard of for these early sample cards to be revealed to the public, which goes to show just how different of a tack Intel is taking in the launch of its first modern dGPU.

  13. Post
    Well, my GTX 1080 just died. RIP. Can't afford a new GPU for a while either. Back to console gaming...

  14. Post
    False alarm! Turned out my HDMI cable had come loose! Lol

  15. Post
    Huzzah, glad to hear you don't need to go peasant mode.

  16. Post
    So you just jumped to "my graphics card broke" without checking anything first?

  17. Post
    brand wrote:
    So you just jumped to "my graphics card broke" without checking anything first?
    Pffft, troubleshooting is for nerds.

  18. Post
    Graphics cards are one-time-use disposables, right?

  19. Post
    Why vilify someone for making a little mistake like that? Not like we haven't all had derp moments.

  20. Post
    dr.m0x wrote:
    Why vilify someone for making a little mistake like that? Not like we haven't all had derp moments.
    "Vilify"??

    Yes, they're an inhuman monster, worse than Hitler and Pol Pot put together.

  21. Post
    Not even going to bother playing this game with you. Have fun kiddo.

  22. Post
    In an interesting move, AMD has decided to fight off the $50 price cut to the RTX 2060 by releasing a new BIOS file for the RX 5600 XT.

    The new BIOS seems to increase all the clocks. It's not yet clear whether vendors will release it for existing 5600 XT cards or just load it onto new retail cards. It shouldn't be too long till the BIOS files pop up in the TechPowerUp database, and I'd hazard a guess you can just upgrade your existing 5600 XT with it for free.



  23. Post
    If true, it's nice to see VRAM go up ahead of next-gen games arriving.

    The specs, though, are a bit "safe" for a leak: only a minor boost to core count over the RTX 2080.
    So that would mean Ampere would need 1) a significant IPC boost over Turing and 2) higher clock speeds than Turing; the rough numbers below illustrate why.
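    As a back-of-the-envelope check (all numbers made up, purely to illustrate the relationship): if performance scales roughly as cores × clock × IPC, a small core-count bump means most of a big generational gain has to come from the clock × IPC side.

    ```python
    # Toy scaling model: relative perf ~ cores x clock x IPC.
    # All figures below are hypothetical placeholders, not leaked specs.

    def required_clock_ipc_uplift(perf_target: float, core_ratio: float) -> float:
        """Combined clock x IPC gain needed to hit perf_target given core_ratio."""
        return perf_target / core_ratio

    # Example: a 40% generational performance target with only 10% more cores
    uplift = required_clock_ipc_uplift(1.40, 1.10)
    print(f"clock x IPC must improve by ~{(uplift - 1) * 100:.0f}%")  # ~27%
    ```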

  24. Post
    SirGrim wrote:
    In an interesting move, AMD has decided to fight off the $50 price cut to the RTX 2060 by releasing a new BIOS file for the RX 5600 XT.
    Hmm, that is an interesting one. Sandbagging their own products until competition heats up. Good for those with the cards, but a shitty move IMO to sandbag them in the first place.