Latest CPU & GPU Discussion (Intel/AMD/Nvidia)


  1. Post
    it’s only $350
    Only? wut.

    I don't buy the leaks regarding a GTX 1160 @ $249-299 with RTX 2060 performance minus RTX... can't see it happening, eh. Why would anyone buy the 2060 if it's true?

  2. Post
    So what will happen first?

    Intel releasing an 8-core 10nm desktop CPU?
    Or Computer Lounge updating their website?


  3. Post
    I’d bet on Intel, at least they have provided a ballpark by saying Q2 2019.
    As for CL's website, it's been "coming soon" for several years.
    I believe CL is using the trademarked version of "soon" famously invented by Blizzard.

    Edit: so apparently r/AMD reckons they know how much 16GB of HBM2 costs.
    Safe to say the Radeon VII could have been a lot cheaper had they used GDDR5X, with the 16GB of HBM2 costing over US$200.

    AMD could have priced it well under the 2080 had they used cheaper memory
    Last edited by SirGrim; 10th January 2019 at 5:41 pm.

  4. Post
    IMHO AMD really wasted their CES keynote chance. They had the whole tech world waiting in anticipation. Instead they released a consumer fill-in card that offers no performance or price advantage over its competitor, and kept their 3000-series powder dry with little new info. They should have prepped better; I know for sure Nvidia, Intel etc. would have seized the limelight in the same position and knocked their competitors in the gonads.

  5. Post
    I'm pretty happy now that I got my 2700X at launch and that secondhand 1080 Ti STRIX in October.

  6. Post
    Fragluton wrote:
    But if the screen can operate 100% fine on AMD cards, it's pretty obvious where the actual fault lies.
    I hate Nvidia as much as anyone else, but your argument that if their "G-Sync Compatible" drivers (adaptive sync) aren't compatible with every single FreeSync monitor then they suck seems weird - both companies implement adaptive sync differently, so incompatibility seems inevitable in purely practical terms, well before we need to bring Nvidia's BS into it.

    Bloody wrote:
    IMHO AMD really wasted their CES keynote chance
    The products themselves are fine, but until we see them stocked at retail prices we won't know for sure how exciting they are.

    IMO the Radeon VII only needs to land between 1080 and 2080 performance - hopefully not too far off the 2080 - at a price of ~$1200, and Nvidia won't be relevant without dropping some prices big time - the cost savings of not going Intel/Nvidia/G-Sync for a full rig would probably buy you a second Radeon VII...

    They were never going to compete for the 1-2% market share of the 2080 Ti, and they don't have to. Maybe they'll try with a 1000W dual-GPU card.
    Last edited by iRoN; 10th January 2019 at 6:20 pm.

  7. Post
    iRoN wrote:
    I hate Nvidia as much as anyone else, but your argument that if their "G-Sync Compatible" drivers (adaptive sync) aren't compatible with every single FreeSync monitor then they suck seems weird - both companies implement adaptive sync differently, so incompatibility seems inevitable in purely practical terms, well before we need to bring Nvidia's BS into it.
    My point is more that blaming the flickering on the monitor seems off the mark. Do you think it is a fault with the screen or the Nvidia driver, then? I'm just saying I think it is their driver. And showcasing a monitor being unusable with their driver reflects badly on the screen, since they say it's "not compatible" with their hardware/driver combo. Seems like marketing BS to me. I'd call the same BS if AMD said you can use FreeSync on a G-SYNC monitor, then displayed a G-SYNC monitor flickering like that while using an AMD card, when the G-SYNC monitor doesn't experience the same flickering in other use cases.

    FWIW, I prefer the G-SYNC requirements for adaptive sync. It holds products to a high standard to get the sticker. Do I think a screen with a narrow adaptive sync window is good? Nope. I genuinely think the whole problem comes down to drivers here. Sure, the hardware isn't as good, but if it was that bad you'd expect it to be a well-known issue.

    I'm not sure I said every single FreeSync monitor? I can only really speak of the monitor I have, which featured in the video, where they said even with Radeon cards it has the same problem. Yet I've had mine a very long time and have seen none of this problem. Seems legit. I don't have a problem with the incompatibilities BTW, just the "has the same problem on Radeon" type carry-on. It's misleading at minimum.
    Last edited by Fragluton; 10th January 2019 at 6:36 pm.

  8. Post
    Fragluton wrote:
    My point is more that blaming the flickering on the monitor seems off the mark. Do you think it is a fault with the screen or the Nvidia driver, then? I'm just saying I think it is their driver.
    iRoN wrote:
    both companies implement adaptive sync differently, so incompatibility seems inevitable
    ^ That's my point. It doesn't matter if you want to say the driver or the monitor is at "fault", the point is they're not compatible, and why would they be when their implementation of adaptive sync was never designed for the monitor's specific hardware and vice versa?

    G-Sync is not identical to FreeSync, and hamfisting it onto monitors without the G-Sync control unit is going to return mixed results

  9. Post
    Bobs wrote:
    So what will happen first?

    Intel releasing an 8-core 10nm desktop CPU?
    Or Computer Lounge updating their website?

    The team at CL have their fingers and toes crossed that we'll be the ones launching first!

    We've literally been working around the clock so that we can launch the site as soon as possible, but as it turns out, it is a much bigger job than we had initially anticipated, and even the most highly-regarded web developers hit obstacles that cause setbacks.

    The saying 'good things take time' could not be more accurate in this case

  10. Post
    iRoN wrote:
    G-Sync is not identical to FreeSync, and hamfisting it onto monitors without the G-Sync control unit is going to return mixed results
    Correct.

    What isn't correct is saying that the monitor in question has the same problem with a FreeSync-capable Radeon card. That is what they are saying in that video - basically, that adaptive sync causes problems with Nvidia AND Radeon cards, which is not true. It's no skin off my nose; I just think misleading information parroted by tech guys is a bad thing.

    I think the new Vega part is way overpriced too, uses too much power, and overall just seems to be another compute card marketed as a gaming card. I'd pick a 2080 over the new Vega card TBH, though the adaptive sync problems I'd then have with my monitor make that a no-go. If they manage to get the drivers sorted and I can use a 2080 no worries, then I'd totally do it.

  11. Post
    Fragluton wrote:
    What isn't correct is saying that the monitor in question has the same problem with a FreeSync-capable Radeon card.
    Ohhhhhhhhhh yeah okay, you're good - I didn't catch that part at the end, and I am at least 98% sure that dude is competent enough to know that's spin as well - untrustworthy BS.

    Fragluton wrote:
    I think the new Vega part is way overpriced too, uses too much power, and overall just seems to be another compute card marketed as a gaming card.
    That's probably because it technically is just another prosumer card. It looks like the Radeon Instinct MI50 with +54 MHz and PCIe 3.0 instead of 4.0.

  12. Post
    HELL KNIGHT wrote:
    I'm pretty happy now that I got my 2700X at launch and that secondhand 1080 Ti STRIX in October.
    Yeah, for sure.

    When the 1080 Ti launched it was seen as super expensive.
    Fast forward to 2019 and the 1080 Ti is probably one of the best-value cards.

  13. Post
    iRoN wrote:
    That's probably because it technically is just another prosumer card. It looks like the Radeon Instinct MI50 with +54 MHz and PCIe 3.0 instead of 4.0.
    Hah, I didn't spot that card - explains a lot! One day they might give us a high-end dedicated gaming GPU. Maybe.

  14. Post
    AMD's CEO says they are working on ray tracing.

    Maybe this is why all the leaks said they would announce Navi - when the RTX cards came out, Navi got pulled from being shown so AMD could take it back to the drawing board and build ray tracing in?

    https://www.gizmodo.com.au/2019/01/a...ndows-laptops/

  15. Post
    No one really cares about ray tracing man.

  16. Post
    Nvidia will make people care

  17. Post
    nzbleach wrote:
    No one really cares about ray tracing man.
    Yeah, not at the moment, given the poor launch it got. One day it will be nice - a couple of generations away.

  18. Post
    It isn't going to gain any real traction till mid-range cards can run it comfortably - there isn't much incentive for devs to implement it fully till then.
    Last edited by HELL KNIGHT; 11th January 2019 at 10:03 pm. Reason: typo

  19. Post
    Especially when using RT halves your FPS and isn't a clear-cut visual improvement - the glowy edges etc.

    RT needs more work. The devs will get better at using it and drivers will improve it, but as above, not until the next gen or two will it be widely adopted or used. We all saw the same deal with PhysX, DX11/12 etc.

    Sure, if you have the cash and must have the fastest card atm, get the 2080 Ti, but getting it for RT is not a reasonable purchase for the majority of consumers.

  20. Post
    Bloody wrote:
    not until the next gen or two will it be widely adopted or used. We all saw the same deal with PhysX, DX11/12 etc.
    Yeah just look at all the PhysX games released this year...

    EDIT: I meant last year
    Last edited by brand; 11th January 2019 at 11:04 pm.

  21. Post
    brand wrote:
    Yeah just look at all the PhysX games released this year...

    EDIT: I meant last year
    Never really understood PhysX.
    Devs' own game engines can do physics just as well as, if not better than, Nvidia's tools.

    PhysX originally required an extra piece of hardware in your rig, then an extra GPU.
    But once that went away, what's the point of PhysX anymore?

    Whereas ray tracing is a DX12 feature, so it's open to anyone - AMD can enable DX12 ray tracing tomorrow on its cards if they want to. You might only get 10fps, but the point is they can if they want to.
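
    Roughly, that vendor-neutral check looks like this - an untested sketch against the DX12 headers, and SupportsRaytracing is just a helper name I made up:

    Code:
    #include <d3d12.h>

    // Ask any DX12 device whether its driver exposes DXR (DirectX Raytracing).
    // Nothing here is Nvidia-specific - AMD's driver would simply have to
    // report a raytracing tier.
    bool SupportsRaytracing(ID3D12Device* device)
    {
        D3D12_FEATURE_DATA_D3D12_OPTIONS5 options5 = {};
        if (FAILED(device->CheckFeatureSupport(D3D12_FEATURE_D3D12_OPTIONS5,
                                               &options5, sizeof(options5))))
            return false;
        return options5.RaytracingTier != D3D12_RAYTRACING_TIER_NOT_SUPPORTED;
    }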
    Last edited by SirGrim; 12th January 2019 at 12:04 am.

  22. Post
    1) Popular game engines use PhysX under the hood, so devs actually use it all the time - it's the default in Unreal Engine (rough sketch of that boilerplate below).

    2) PhysX is open source now, so it's open to anyone too.
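
    For anyone curious what "under the hood" amounts to, this is roughly what an engine integration sets up at startup - an untested sketch against the open-sourced PhysX 4.x SDK, with the thread count and gravity as placeholder values:

    Code:
    #include <PxPhysicsAPI.h>

    using namespace physx;

    static PxDefaultAllocator     gAllocator;
    static PxDefaultErrorCallback gErrorCallback;

    int main()
    {
        // Core SDK objects every integration creates once at startup.
        PxFoundation* foundation = PxCreateFoundation(PX_PHYSICS_VERSION,
                                                      gAllocator, gErrorCallback);
        PxPhysics* physics = PxCreatePhysics(PX_PHYSICS_VERSION, *foundation,
                                             PxTolerancesScale());

        // A scene that steps entirely on the CPU - the default path in the
        // engines mentioned above.
        PxSceneDesc sceneDesc(physics->getTolerancesScale());
        sceneDesc.gravity       = PxVec3(0.0f, -9.81f, 0.0f);
        sceneDesc.cpuDispatcher = PxDefaultCpuDispatcherCreate(4); // 4 worker threads
        sceneDesc.filterShader  = PxDefaultSimulationFilterShader;
        PxScene* scene = physics->createScene(sceneDesc);

        // One 60 Hz simulation step.
        scene->simulate(1.0f / 60.0f);
        scene->fetchResults(true); // block until the step completes

        scene->release();
        physics->release();
        foundation->release();
        return 0;
    }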

  23. Post
    Argh, you're right - Unity, UE4 and CryEngine.
    But it all runs off the CPU and not the GPU - could explain why no one pops the PhysX logo on games from those engines, for the most part.
    You can run the hardware-accelerated GPU version of PhysX on Nvidia cards too, but by default it's a CPU-based physics library in those engines - the GPU opt-in is sketched below.
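
    If I've read the SDK right, opting in to the GPU path is just a couple of scene flags on top of the CPU setup sketched in the post above - untested, and only meaningful on CUDA-capable Nvidia hardware:

    Code:
    // Continuing from the CPU-only sceneDesc in the previous sketch:
    // route rigid body simulation and broad-phase onto the GPU via CUDA.
    PxCudaContextManagerDesc cudaDesc;
    PxCudaContextManager* cudaMgr = PxCreateCudaContextManager(*foundation, cudaDesc);

    sceneDesc.cudaContextManager = cudaMgr;
    sceneDesc.flags |= PxSceneFlag::eENABLE_GPU_DYNAMICS; // solve rigid bodies on the GPU
    sceneDesc.broadPhaseType = PxBroadPhaseType::eGPU;    // GPU broad-phase as well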

    Not much uses hardware-accelerated PhysX anymore:
    https://en.m.wikipedia.org/wiki/List..._PhysX_support

  24. Post
    AdoredTV thinks the demoed chip that matched the 9900K with unfinalised clocks is the Ryzen 5 part, so a mid-level chip at best - with a potential 65W TDP, which is realistic since you can already get 8C/16T AMD chips at that TDP. Upgrade paths for those sitting on the AM4 platform could well end up quite tasty. Potential 9900K+ performance on my first-gen setup would suit me fine, for now.

    Sauce
    https://www.youtube.com/watch?v=g39dpcdzTvk
    Last edited by Fragluton; 12th January 2019 at 12:38 pm.

  25. Post
    Thinking of going ahead and buying the AMD 2700X... or should I go with the Intel equivalent?