GP discusses the latest CPUs & GPUs


  1. Post
    Ah, Frag is using Hardware Unboxed's 1440p results for his 23% average

    Well done on the cherry-picking, using 1440p benchmarks on a card clearly made for 4K, where every single CPU bottlenecks it at 1440p

    You should have just said it’s 5% better and used the 1080p benchmarks

  2. Post
    Figured most people would be gaming at 1440p high refresh TBH. It's not really cherry-picking when most gamers with those cards are probably running that resolution. Let's look at your situation: 1440p gaming monitor and 4K TV, perhaps your TV is 100Hz. But most games won't be getting near that without turning settings down. Most are probably averaging not much more than 60fps with dips to 50fps. Sounds like fun.

    If you spent less time copying and pasting articles, you'd know that Hardware Unboxed use current and highly requested games for all their benchmarks.

    SirGrim wrote:
    Decided to try some Far Cry 5 DLC; noticed 4K Ultra has jumped from 51fps to 83fps
    Used to play the game on High with 90% scaling to get 60fps - the same settings now get 103fps

    Laughs in 23% average
    Hardware Unboxed put the 2080 Ti at 28% faster at 4K vs the 1080 Ti in FC5. I think I'll go with their results, as they keep all testing consistent. You'd do anything to prove a point, on the other hand. Laughs.
    Last edited by Fragluton; 11th February 2019 at 9:57 pm.
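For what it's worth, the two claims aren't actually in conflict: the quoted Far Cry 5 numbers are a single-game, single-scene uplift, while the 23% figure is a 35-game average. A quick sanity check on the fps numbers quoted above (illustrative Python only; nothing here reflects HUB's methodology):

```python
def pct_uplift(old_fps, new_fps):
    """Percentage frame-rate gain going from old_fps to new_fps."""
    return (new_fps / old_fps - 1) * 100

# SirGrim's quoted Far Cry 5 numbers
print(round(pct_uplift(51, 83), 1))   # 4K Ultra: 62.7
print(round(pct_uplift(60, 103), 1))  # High, 90% scaling: 71.7
```

So a genuine ~63% gain in one title can coexist with a ~23% average across a large suite; both numbers are "right" for what they measure.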

  3. Post
    Fragluton wrote:
    Which game are you cherry-picking to get 30%+? Hardware Unboxed, over 35 games, put the gain at 23%. Which is alright, but nothing to write home about when you're paying such a big increase to get that performance. Anyone not "salty" at Nvidia's extreme pricing is obviously just happy to be ripped off. Which is fair enough too. Myself though, I game at 1080p UW, so I don't need top-end gear. I also think the AMD option of Radeon VII is shit too. Does that make me salty at AMD too? It probably does. I think it's pretty fair to be salty at the current state of the GPU market. 2c etc.
    I was talking about raw compute performance (OpenCL and the like), and 30% is probably on the low side.

    too many variables when comparing games (as you and Grim have pointed out)

  4. Post
    Fragluton wrote:
    Figured most people would be gaming at 1440p high refresh TBH. It's not really cherry-picking when most gamers with those cards are probably running that resolution. Let's look at your situation: 1440p gaming monitor and 4K TV, perhaps your TV is 100Hz. But most games won't be getting near that without turning settings down. Most are probably averaging not much more than 60fps with dips to 50fps. Sounds like fun.

    If you spent less time copying and pasting articles, you'd know that Hardware Unboxed use current and highly requested games for all their benchmarks.



    Hardware Unboxed put the 2080 Ti at 28% faster at 4K vs the 1080 Ti in FC5. I think I'll go with their results, as they keep all testing consistent. You'd do anything to prove a point, on the other hand. Laughs.
    Laughs in 1080p

    https://www.guru3d.com/articles_page...review,18.html

    Next time just quote 1080p numbers

  5. Post
    Ah, caught out with your 60%+ claimed gains, when HUB only manages 28%. Keep on spouting your fake news. No one buys a 2080 Ti for 1080p, which is why I didn't even look for those results. Keep trying.

  6. Post
    Not to mention the gigantic improvement in smoothness, which you won't see in your HUB average fps

    [Attached image: fwu1e21v.ay4.png]

    Pretty impressive when you consider the 2080 is not much more powerful than the 1080 Ti but has a massive improvement in perceived smoothness. The same goes for the 2080 Ti.

  7. Post
    What's the total benchmark time, for context? Could be 26 seconds, could be 5 minutes. One is bad for anything other than the 2080 Ti; the other would probably not be noticeable. I don't find 60fps to be smooth in either case, so it's all a moot point for those wanting high FPS (and who have screens that can take an input higher than 60Hz). If you want 60Hz at 4K, get a 2080 Ti for sure. Without turning settings down, that's about the max you'll likely get in many current games.
    Last edited by Fragluton; 12th February 2019 at 1:16 am.
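The smoothness argument above comes down to frame-time consistency: an average fps figure can't show spikes, which is why reviewers also report metrics like 1% lows. A minimal sketch of the difference, using made-up frame-time traces purely for illustration:

```python
def avg_fps(frametimes_ms):
    """Average fps over a run, from per-frame render times in milliseconds."""
    return 1000 * len(frametimes_ms) / sum(frametimes_ms)

def one_percent_low_fps(frametimes_ms):
    """fps equivalent of the slowest 1% of frames (a common smoothness metric)."""
    worst = sorted(frametimes_ms, reverse=True)
    n = max(1, len(worst) // 100)
    return 1000 * n / sum(worst[:n])

# Two hypothetical 100-frame traces: one steady, one with occasional spikes.
smooth = [16.7] * 100                # ~60 fps, every single frame
stutter = [12.0] * 95 + [60.0] * 5   # faster on average, but with visible hitches

print(avg_fps(smooth), one_percent_low_fps(smooth))    # ~59.9, ~59.9
print(avg_fps(stutter), one_percent_low_fps(stutter))  # ~69.4, ~16.7
```

Note the stuttery trace actually wins on average fps while feeling far worse in play, which is exactly why an average alone can hide the smoothness difference being argued about.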

  8. Post
    Argh, just canceled my Radeon VII order and convinced myself to at least look at the 2080, as they are bringing more features to Linux next update.
    Then AMD drops this
    https://www.techpowerup.com/252475/a...y-tracing-edge

    I wonder if SR-IOV will work now....

  9. Post
    The title of that article is confusing. Decoded, it says "AMD unlocks compute and workstation features to make the Radeon VII more competitive against the RTX 2080's gaming-focused real-time ray tracing".

    TPU needs to rework that headline

    Is this not just AMD adding something missing from launch, just like UEFI?

    https://www.techpowerup.com/252476/a...o-uefi-support

    In what is turning out to be a massive QA oversight by AMD, people who bought retail Radeon VII graphics cards report that their cards don't support UEFI, and that installing the card in their machines causes their motherboard to engage CSM (compatibility support module), a key component of UEFI firmware that's needed to boot the machine with UEFI-unaware hardware (such as old storage devices, graphics cards, NICs, etc.).
    Also, the GTX 1060's replacement has been found in the wild

    Last edited by SirGrim; 12th February 2019 at 4:19 pm.

  10. Post
    SL1CKSTA wrote:
    Argh, just canceled my Radeon VII order and convinced myself to at least look at the 2080, as they are bringing more features to Linux next update.
    Then AMD drops this
    https://www.techpowerup.com/252475/a...y-tracing-edge

    I wonder if SR-IOV will work now....
    So graphics professionals are going to snap these cards up, reducing availability and inflating the price?

  11. Post
    It was always intended for this use, just repackaged now as an all-purpose card.

  12. Smile
    Wow, is that really a 1660? Or is that a joke of some sort?

    I thought it was RTX 20XX from now on.

    Edit--

    Has to be a joke, but the box would be worth heaps if it's a one-off.
    Last edited by Magic Robertson; 12th February 2019 at 6:21 pm.

  13. Post
    Magic Robertson wrote:
    Wow, is that really a 1660? Or is that a joke of some sort?

    I thought it was RTX 20XX from now on.

    Edit--

    Has to be a joke, but the box would be worth heaps if it's a one-off.
    It’s real

    More than enough leaks have been saying the same thing for a while now, and it's up for preorder in Russia.
    It's a 2060 but without the RT and Tensor cores, and it likely has fewer CUDA cores and is clocked slightly lower than the 2060 so as not to step on its toes.

    So it's likely a bit slower than the 2060 but still a fair bit above the 1060.
    It just won't have support for DLSS or ray tracing.

    The name is weird because it was probably not a planned product. It only got made because people complained the 2060 was too expensive, so they had to make something for entry-level gamers; they couldn't just stop selling the 1060.
    Last edited by SirGrim; 12th February 2019 at 6:47 pm.

  14. Post
    Magic Robertson wrote:
    Wow, is that really a 1660? Or is that a joke of some sort?

    I thought it was RTX 20XX from now on.

    Edit--

    Has to be a joke, but the box would be worth heaps if it's a one-off.
    No, it's not a joke. "NVIDIA’s GeForce RTX lineup completed with the GeForce RTX 2060 and NVIDIA is now moving to offer more affordable graphics cards under their GeForce GTX family. Featuring the same Turing GPU architecture, the new GeForce GTX graphics cards will exclude ray tracing, but feature faster shading performance through the enhanced GPU design while utilizing the 12nm process node"

  15. Post
    Here is the back of the box




    And here are the specs


  16. Post
    sorceror wrote:
    here is a 21.5" 4k monitor - https://www.apple.com/nz/shop/produc...ine-4k-display

    not very usable for the majority of people but the colours on it are amazing - easily best i've seen on a monitor.
    I have a 15" 4K laptop display, easily the worst idea a PC manufacturer has ever had. Any increase in picture quality from HiDPI is quickly offset by the vast majority of software being completely unreadable unless you turn the resolution down to 1080p or use the magnification tool.

  17. Post
    GTX 1660ti PCB
    Oh man she's a tiny thing!


  18. Smile
    SirGrim wrote:
    GTX 1660ti PCB
    Oh man she's a tiny thing!
    I like smaller cards TBH, just as long as there's not much heat difference compared to their larger brothers.

    Don't mind losing 5°C for the space saving (anything above that gets to be too much).

  19. Post
    SirGrim wrote:
    GTX 1660ti PCB
    Oh man she's a tiny thing!

    Oh damn, no shots of the VRMs

  20. Post
    Pretty keen on a 1660 Ti if it fits my budget!

    Now that NVIDIA supports FreeSync, I can use either brand of GPU with my screen

    Also planning on building an ITX system, and the 1660 is looking good for smallness

  21. Post
    Reports of DLSS not looking too great in Metro?

  22. Post
    nzbleach wrote:
    Reports of DLSS not looking too great in Metro?
    DLSS makes everything look like a blurry mess

  23. Post
    No, DLSS is the best thing ever

  24. Post
    I just read this, seems appropriate to chuck in here:

    TPU wrote:
    AMD Doesn't Believe in NVIDIA's DLSS, Stands for Open SMAA and TAA Solutions

    A report via PCGamesN places AMD's stance on NVIDIA's DLSS as a rather decided one: the company stands for further development of SMAA (Enhanced Subpixel Morphological Antialiasing) and TAA (Temporal Antialiasing) solutions on current, open frameworks, which, according to AMD's director of marketing, Sasa Marinkovic, "(...) are going to be widely implemented in today's games, and that run exceptionally well on Radeon VII", instead of investing in yet another proprietary solution. While AMD pointed out that DLSS' market penetration was a low one, that's not the main issue of contention. In fact, AMD decides to go head-on against NVIDIA's own technical presentations, comparing DLSS' image quality and performance benefits against a native-resolution, TAA-enhanced image - they say that SMAA and TAA can work equally as well without "the image artefacts caused by the upscaling and harsh sharpening of DLSS."

    Of course, AMD may only be speaking from the point of view of a competitor that has no competing solution. However, company representatives said that they could, in theory, develop something along the lines of DLSS via a GPGPU framework - a task for which AMD's architectures are usually extremely well-suited. AMD seems to take the edge off its DLSS-dismissing moves, however, as AMD's Nish Neelalojanan, a Gaming division exec, talks about potential DLSS-like implementations across "Some of the other broader available frameworks, like WindowsML and DirectML", and says that these are "something we [AMD] are actively looking at optimizing… At some of the previous shows we've shown some of the upscaling, some of the filters available with WindowsML, running really well with some of our Radeon cards." So whether it's an actual image-quality philosophy, or just a competing technology's TTM (time to market) play, only AMD knows.

  25. Post
    This sums up my thoughts on this: proprietary technology like this is never for the benefit of the consumer, only the manufacturer. A good result would be it ending up like the recent adaptive sync back-down.
    [Attached image: 1ejnt9.jpg]

    DLSS seems like another way of turning quality settings down to medium. Which, funnily enough, likely results in the same performance gains. Nothing I've seen image-wise so far has me convinced it's even worth bothering with.
    Last edited by Fragluton; 15th February 2019 at 5:22 pm.