
Anthracite

Members
  • Content Count

    93
  • Donations

    $0.00 
  • Joined

  • Last visited

Community Reputation

92 Good

Flight Sim Profile

  • Commercial Member
    No
  • Online Flight Organization Membership
    none
  • Virtual Airlines
    No


  1. I appreciate this stance, and if you can afford it, why not? I buy whatever offers a reasonable delta between performance and price. I had the i5 2500K, bought it new, and it only started struggling recently. That’s a good 8 or so years. The alternative at the time was the 2600K: barely an improvement in gaming. That’s true now too; a 9900K or 10900K is barely an improvement over a 9600K or 10600K once the chip is overclocked. That’s also true of AMD: in gaming there isn’t much benefit to going higher than even the 3300X now. I’m on the 3600 because the 9600K was £100 more, but ultimately the performance delta isn’t really there. As long as your CPU can keep up with your GPU you’re golden. Perhaps if I won the lottery I’d buy an xx80 Ti, but even then it would leave a foul taste in my mouth. As another poster aptly put it, it’s price gouging at that point. I feel like if I buy something that I consider overpriced, even if I can afford it, I’m giving them permission to keep overpricing stuff.
  2. Yeah, I heard this. I wasn’t too happy with the Matisse refresh; I want to see a 4000-series CPU that can tackle Intel in the gaming space. The 3300X basically annihilates anything below a 10600/9600 in gaming at half the cost of a 9600. All that’s left for AMD, now that they’re dominating everything outside of gaming, is to dominate gaming. Either way, I’m happy to sit on my 3600 until that happens.
  3. Your system is still reasonable. I only recently upgraded from an i5 2500K and GTX 970. It’s your GPU that’s lacking the most. If you’re playing at 1080p, then waiting for the next generation and sticking in a lower-end RDNA2 or Ampere GPU with a new 16 or 32 GB RAM kit would give you an acceptable frame rate for the lowest cost (around £400). I would highly recommend upgrading your whole system if possible though: either the latest 10-series Intel CPUs (the 10600K would be the best bang for buck from Intel) or waiting for the soon-to-be-released AMD Ryzen 4000 desktop units (no information so far).

Regardless of which upgrade path you take, I would definitely invest in a solid state drive. It will dramatically reduce loading times. In your current system that would be a SATA SSD, but a future upgrade would also allow M.2 drives. You’d load and boot 5+ times as fast as on your current HDD.

So if you decide to keep your current system but upgrade it, you’d be looking at £400 and could achieve acceptable frame rates at a lower resolution. If you want a higher resolution or frame rate, you need to replace your system. If you replaced it, you’d be looking at £1.3k–£1.5k depending on your choices, for a system capable of 1440p max settings or 4K with sub-60 fps. (This is about what I spend every 8–10 years.) For a system capable of 4K and max settings you’d be looking at £2.5k+ (but the £1.5k system could achieve it with a GPU upgrade a couple of years down the line, which, if you included selling the GPU from the £1.5k system, would probably keep the total system cost still comfortably lower than £2k). For 1080p max you could achieve it for less than £1k, but it depends on how the prices of the new hardware are affected by the current economic situation. Choosing this option would also likely limit any upgrade path considerably, as each of the cheaper components would slow down any individual component change. Upgrading from this path in the future would likely require an upgraded motherboard. (Again, the £1.5k system could upgrade comfortably for a couple of GPU generations.)
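The staged-upgrade arithmetic in that last paragraph can be sketched out. Only the £1.5k and £2.5k tiers come from the post; the future GPU price and resale value below are made-up placeholder figures, purely to show how the "comfortably lower than £2k" claim works:

```python
# Rough sketch of the staged upgrade path described above.
# full_4k_now and base_build come from the post; later_gpu and
# gpu_resale are hypothetical, illustrative numbers.
full_4k_now = 2500   # buy a 4K-max-settings system today
base_build = 1500    # buy the 1440p-max system now...
later_gpu = 700      # ...add a new GPU a couple of years later (assumed)
gpu_resale = 250     # ...and sell the outgoing GPU (assumed)

staged_total = base_build + later_gpu - gpu_resale
print(f"staged path: £{staged_total} total, vs £{full_4k_now} up front")
```

With those placeholder figures the staged route lands under £2k, which is the shape of the argument the post is making.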
  4. It depends what kind of system you want. Consoles are sold at a loss and contain some proprietary firmware and hardware to help elevate them. They are able to keep pace with a mediocre gaming rig because of this, and certainly represent great value if you enjoy games at 30 fps. They won’t be replacing a PC any time soon; as stated above, a lot of their performance is achieved by keeping the software as lightweight as possible and tightly controlling the firmware. So while it’s theoretically possible to turn them into regular PCs, as soon as you do that you’re going to affect their performance. It won’t happen for the foreseeable future unless we all move our data and software to the cloud.

Regarding specs, I’ve never bought a flagship GPU. I’ve never struggled with games using xx70 Nvidia GPUs within the generational cycle. There’s a big alarm that rings in my head when I feel suits dipping their greedy fingers into my wallet; that’s what I hear when I see the price of xx80+ GPUs. Their price/performance tanks. Even the xx70 GPUs of the current generation have been overpriced. Hopefully that’s remedied and they return to pre-RTX levels with Ampere, but even if they don’t, the xx60 is a great unit.

It’s important to remember that if you have a fast GPU but a slow CPU, you will slow the GPU down. That’s true of system memory and your motherboard too, to a lesser degree. For example: if you’re buying a 3600 but you want a 2080 Ti, remember that your CPU is in control of your system’s scheduling, and the faster it can send things for the GPU to process, the faster your GPU can process them. The 3600 would probably steal some of the potential performance of an RTX 2080 Ti, hitting its price/performance even harder. This works the other way around too: great CPU with a lower-tier GPU? The CPU has to wait for the GPU. While the RTX 2080 Ti would still be the best GPU for your system, you’d want to leverage the performance it’s offering properly, otherwise you’re wasting money.
All of these contribute to real-world, measurable performance and stability. Focus on the hardware for your chosen budget; your money will go further and you’ll be much happier. With that in mind, the 3300X coupled with an RTX 2070 Super would make for a stellar system, capable of running anything at 1440p, and if you dropped it down to a 1660 Super it would be even cheaper and easily handle 1080p. That’s within the current generation and solely focused on gaming. You could get everything you needed for not a whole lot more than what the new consoles will cost. If you wait for the next-gen hardware that will release just before or just after the consoles, you’ll be looking at a ~20% cost delta for a system that will be slightly faster in games (with more granularity on options and wider peripheral support), and if it’s your daily driver, the combination of both would more than make up the 20%. It’s all about your personal situation. If you’d rather have a dedicated console for games and then a ~£250 laptop or desktop as your home PC, you’ll be spending a similar amount as you would on a gaming PC. If you only need a phone for your emails and social media or whatever, then the console is a great prospect, as it is if you already have a stable home computer. If you need any help with your hardware choices, lots of us here with our fingers on the pulse are happy to spread the knowledge 🙂
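The CPU/GPU pairing argument above boils down to one rule: the slower component caps your frame rate. A toy sketch of that idea (the frame-rate numbers are invented purely for illustration, not benchmarks of any real parts):

```python
def effective_fps(cpu_fps, gpu_fps):
    """The pair delivers only what the slower component can sustain."""
    return min(cpu_fps, gpu_fps)

# Invented illustrative numbers, not benchmarks:
print(effective_fps(cpu_fps=110, gpu_fps=160))  # CPU-bound: prints 110
print(effective_fps(cpu_fps=110, gpu_fps=70))   # GPU-bound: prints 70
```

The extra 50 fps the flagship GPU could have delivered in the first case is simply thrown away, which is why pairing a mid-range CPU with a flagship GPU hits the GPU’s price/performance so hard.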
  5. I’d wait. The new AMD processors will drop this year, and with the way AMD has been tearing up the consumer processor and HEDT markets, you may miss out on something amazing. The 10 series is basically a redone 9 series. Very lacklustre.
  6. It’s expected. I imagine that there are one or two Chinese companies stealing all the assets and code with the intention of re-releasing it as their own game. It happens often.
  7. Would anyone like to create a dummy account we could all share? 😄
  8. The legal ramifications of not preventing the dissemination of copyrighted works, or something like that. Look at the recent “The Last of Us 2” controversy for the kinds of things websites, channels and companies would have to endure were they to host the content.
  9. In many of the screenshots it says “offline scene mode” or something similar on the watermarks.
  10. Aye, we all agreed. We just also said that we’re sure Asobo knows it’s wrong too lol
  11. Yeah, lots of leaks there. Weird that it says "offline scene mode" on his watermarks. Easy enough to find if you want to look. They’re on Bilibili, a Chinese video website.
  12. Two years is a relatively short amount of time, and it would depend on use. At 2 hours a day, I’d expect him not to experience anything noticeable until around 4 years, provided he’s playing a lot of motion content. It depends on how long you want it to last; many people are happy upgrading after 5 years, some want longer. I believe LG OLED panels are rated to last 100,000 hours, but their quality degrades substantially more over that time because of burn-in if they’re displaying static content. This is particularly true of the red pixels. All panels tend to lose brightness and colour over time, but the compounds in OLED are more prone to it. If you’re interested in OLED and can wait a year, Samsung is looking to bring QD-OLED out in 2021 according to the rumours. That’s assuming corona doesn’t affect their timescale. These will have better efficiency, brightness, colour and lifespan.
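The usage arithmetic behind that estimate is easy to sanity-check with the numbers given (2 hours a day against the 100,000-hour rated life cited above; note that burn-in from static content can show up far sooner than rated life alone suggests):

```python
def hours_used(hours_per_day, years):
    """Total panel-on hours accumulated over a number of years."""
    return hours_per_day * 365 * years

RATED_LIFE = 100_000  # hours, the figure cited above for LG OLED panels

for years in (2, 4, 5):
    used = hours_used(2, years)
    print(f"{years} years at 2 h/day: {used} h "
          f"({100 * used / RATED_LIFE:.1f}% of rated life)")
```

Even five years of 2 h/day use is only a few percent of the rated life, which is why static-content burn-in, not total hours, is the practical limit.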
  13. HDR on monitors at the moment is a joke compared to an OLED TV or even a QLED TV. The issue isn’t how bright the screen can get, but how many dimming zones it has. Most consumer-grade screens do not have enough dimming zones to offer a decent implementation of HDR. The Asus ProArt range (at about £4,000) offers just over 1,100 local dimming zones; a £500 HDR monitor might give you 8. So yeah, DisplayHDR 600/1000 can be okay as an introduction, but I worry how many people are going to think HDR is a bit “meh” because of substandard implementations that don’t have good local dimming. I’d consider the ProArt to have the minimum number of local dimming zones required for a good implementation of HDR. I’d also advise against using OLED screens for computers, because the quality would degrade pretty quickly due to screen burn-in. OLED looks great, but there’s a reason monitors aren’t utilising the technology right now: it’s not a great fit for what monitors display.
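Putting the two data points from that comparison side by side makes the gap concrete (only the zone counts and prices above come from the post; "cost per zone" is just one illustrative way to frame them):

```python
# The two examples cited above: zone counts and approximate prices.
monitors = [
    ("Asus ProArt",        1100, 4000),  # "just over 1,100" zones, ~£4,000
    ("typical £500 HDR",      8,  500),  # "might give you 8" zones
]

for name, zones, price in monitors:
    print(f"{name}: {zones} dimming zones, ~£{price / zones:.1f} per zone")
```

The budget monitor costs more per dimming zone despite being an eighth of the price, which is the post’s point: cheap "HDR" screens simply don’t have the zone density to do HDR justice.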