Alan Wake 2 is an unexpected visual marvel even on older GPUs (xfire.com)
177 points by mdotk on Oct 29, 2023 | 235 comments



Have not heard the name xfire in a long, long time. It was the initial (IIRC) de facto gaming messaging service back in the early 2000s. You could download skins (like Winamp) and it would track your game-time hours and show running game status.

Looks like they pivoted into an obscure news site. Will always have fond memories.


xfire, roger wilco, teamspeak, kali, ventrilo, mumble, icq. Relics of better times.


Teamspeak and Mumble are still alive. Definitely not as popular as they used to be, but not dead for sure.

Mumble is still big in EVE Online afaik, due to some funky auth processes with EVE accounts.

Teamspeak forums are still pretty active too, and updates are still being released at a regular pace. I'm still keeping Teamspeak as the primary voice chat for my small gaming community, and I hear from my users that TS3 voice quality is superior to Discord's. Plus, if something happens with the server, I can fix issues myself, since I host the TS3 server on my own hardware. Which is not the case with Discord: if you have issues with a Discord server, your only choice is to wait.


I think Mumble's popularity in the EVE scene is due to its support for talking to multiple channels simultaneously and its ability to do hierarchical voice channels. This is required for large-scale battle coordination. Battle group commanders can talk to each other and then command their ships without the risk that anyone "lower level" talks over them, and so on.


This also allows for hilarity: whispering to users in other channels, who then respond aloud to users who couldn't hear the initial message.


I've been hosting a Teamspeak server for more than half my life; it's acted as the communication hub for the majority of my time gaming.

It's fallen off in terms of features and UX. But I feel the warm and fuzzies knowing that my memes and conversations aren't being fed into some neural network owned by Microsoft.

I also love that I can just change the voice codec to whatever I please without me or my users forking over a monthly fee.


Mumble is popular because it's free, lean on resources, low on latency, and has incredible audio quality.


I believe teamspeak is also pretty popular in the milsim community due to its support for mods in games like Arma 3


Oh yeah, ACRE2, which simulates real-life radios to the point of tracing radio signal paths, accounting for power output, terrain, buildings, antenna plugged into radio, antenna radiation pattern and other factors. Absolutely insane mod.


GameSpy. The original video game match-maker. Also the original home of another relic, Penny Arcade.


A friend of mine self hosts a teamspeak server and REFUSES to use discord. Absolutely won't have it. I play along, I respect the self hosting. And it works, we can chat and game.


Entirely different Xfire, as far as I can tell. The two are unrelated.


I remembered that xfire too. So many applications back then had broad and seemingly global audiences but none seem to have stuck around. It seems like everything now is about getting funding and growth. Back then it just seemed like people making cool applications.


Ah I remember Xfire, pretty sure the biggest draw for me was the FPS counter it gave you.

I doubt this is related to the current domain, though. https://en.wikipedia.org/wiki/Xfire#Video_game_and_pop_cultu...


I came here to post this. For a second I thought Xfire was alive once more.

https://en.wikipedia.org/wiki/Xfire

I remember.


I don't think anyone from the original xfire is involved in this new xfire.com gaming news site; it looks like it's now owned by a New Zealand company called Enoki Limited.


> the minimum requirements for Alan Wake 2 are 1080p at 30FPS with low graphics, with an Intel i5-7600K CPU, RTX 2060 GPU, and 16GB RAM

I think we have different definitions of "older" then. Does the game run decently on something like an RX 580?

Ok, the game runs at 2 fps even at low res on an RX 580: https://www.youtube.com/watch?v=bbM433WwY8U


I actually rebuilt a computer with an RX580 instead of RX570 this year (as well as a Ryzen 7 1700), because everything else is way more expensive at local stores (vs 140 EUR for the 580), to the point where current gen hardware doesn't feel worth it: I don't catch myself wanting to play any of the modern games that look "amazing" (can't tell characters apart from the scenery in some visually cluttered games), but rather ones that have consistent performance/art style and don't make the fans spin up much.

That said, it's kind of a shame that not many games let you actually pull back the graphical fidelity if you want to, despite the engines themselves often scaling back even to mobile devices. Maybe I'll upgrade in 3-5 years when Wirth's law has caught up way more. Then again, I can kind of understand companies wanting their games not to look "bad", no matter the graphics settings; it just means that indie games are for me, not AAA titles.


I've run a lot (~120,000+) of 470, 480, 570, 580 GPUs, and one kind of depressing thing is that they are all complete snowflakes. It is literally luck of the draw on performance.

I would suggest learning how to tune them if you really want to pull some performance out.


Pulling back graphical fidelity is a no go. Art team sizes can scale up so much faster than anything else, and AAA competes on how much it can spend/scale up compared to other AAA games or other offerings (like independent/small team games).


Good for them! I hope the majority of players enjoy the advancements in modern titles, if they have adequate hardware.

I'll just enjoy games like Valheim, No Man's Sky, Cloudpunk and other stylized ones.


The game is basically unplayable if your GPU does not support mesh shaders, which means AMD RX 6xxx / Nvidia RTX 2xxx series or newer (or the GTX 1650, which is in reality a Turing card, so RTX 2xxx silicon but very low end).

The game will tell you that during startup.
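
For the curious, detecting support is a tiny check against the D3D12 feature API. A minimal sketch of what such a startup check might look like (illustrative only, not Remedy's actual code):

    #include <windows.h>
    #include <d3d12.h>

    // Returns true if the device supports mesh shaders (Tier 1 or better).
    bool SupportsMeshShaders(ID3D12Device* device) {
        D3D12_FEATURE_DATA_D3D12_OPTIONS7 options7 = {};
        if (FAILED(device->CheckFeatureSupport(D3D12_FEATURE_D3D12_OPTIONS7,
                                               &options7, sizeof(options7)))) {
            return false; // runtime/driver too old to even know this struct
        }
        return options7.MeshShaderTier >= D3D12_MESH_SHADER_TIER_1;
    }

A game can run this once at startup and show a warning instead of crashing later, which is presumably what AW2 is doing.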


The 6-year-old RX 580 has moved from "older" to just plain old at this point.

As a midrange card from 3 years ago, the RTX 2060 is getting long in the tooth; the 5-year-old RTX 2070 is well above the minimum specs. Which seems perfectly reasonable for a new game. You don’t have to upgrade constantly, just put off some titles until you do.


> Which seems perfectly reasonable for a new game.

Right, I'm okay with that. But "being a visual marvel even on a PS5-era GPU" sounds not at all the same as "being a visual marvel even on older GPUs" to me. I do consider the RTX 20xx series modern GPUs, even if they are not the highest range for sure.


It also runs and looks good on a GTX 1650 at low settings, which was a $150 card from 3 years ago. Which I think is more their point.


Almost 5 years, actually: https://en.wikipedia.org/wiki/List_of_Nvidia_graphics_proces...

And to a sibling comment: RX 480 is not 6 years old but actually coming up on 8 years here. Yes, these GPUs are quite old now.

People don't want to hear it but at some point graphics technology is going to have to move forward from Pascal and Polaris. You can't stay behind on legacy technology forever.

AMD knows this very well themselves: https://www.youtube.com/watch?v=fqLea0QUW1k&t=176

As Roy says: Don't be cynical, developers are not going to just stop working. But they're not going to be bound to supporting legacy pathways and obsolete APIs/feature-standards forever either. This is literally the "when developers commit from the ground-up" that AMD was referring to.

Remember, DX12U is just DX 12.2; it would have been a full API release in the past, and that eventually does come along with the old stuff falling out of support. It was a pretty shortsighted move to downplay all these features for all these years, and sooner or later some big titles are going to utilize them to an extent that makes a legacy fallback impossible or un-performant.

The only difference is that Nvidia was first on the draw with this DX feature set.

GN made the case for the importance of early testing and early adoption of these feature sets back in 2016. Then they lost their mind over Turing... https://www.youtube.com/watch?v=3qiQH29KAXg


Meaningful improvements stalled years ago, it's not exactly the 90s anymore. Games from the mid-2010s still look perfectly fine. "It's old" isn't an excuse if there's no real benefit to be had from upgrading.


Comparing midrange settings on modern titles vs older titles at ultra high settings is hardly a fair comparison. You really should be comparing games of each generation on hardware of that generation or hardware that’s powerful enough to maximize frame rates and settings on both.

2015’s Bloodborne looks quite dated when you use the kind of mid-range settings most gamers would have at the time. We can’t see what 2023 titles look like on 2026+ hardware, but I think people looking back will notice a bigger gap than it currently seems like.


A big enough gap that you'd pay hundreds of dollars to cross it?


I’m in no hurry.

Tossing a low-end graphics card into a new PC unlocks a significantly better back catalog every time, with the added benefit of having the most patched version at discounted prices.


Is playing starfield really such a big deal that you'd pay hundreds of dollars, when you have a perfectly fine iGPU?

If you don't buy DX12.2-capable hardware you don't get the DX12.2 features, it's really that simple. Buying Fermi or TeraScale or some other ancient hardware will always be cheaper, but studios aren't just going to not release games because someone is clinging to their 4870. And the situation is not different in kind when we're talking about the DX12.0 to 12.2 transition instead of DX9 to DX11 or DX11 to DX12.

If you've upgraded to a current NVIDIA GPU in the last 5 years, or an AMD card in the last 2 years, you got the capability. And when you next upgrade, you'll get the capabilities on AMD too, unless you go out of your way to buy older cards without support, as many people did with DX12.2 (because reviewers lost their minds over Turing).

It is, again, super funny to look back at how uncontroversial this all used to be, when it was AMD who was ahead on DX feature support. Like yeah if you buy a DX11-focused card like Maxwell then your DX12 experience is going to be shit. Doctor it hurts when I do this.

https://www.youtube.com/watch?v=fqLea0QUW1k&t=176

https://www.youtube.com/watch?v=3qiQH29KAXg

Developers aren't going to not release games just because some people insist on running 10+ year upgrade cycles, or run esoteric hardware with some different tradeoffs because they don't want to pay for newer stuff. I literally don't understand why it's an issue or a confusing topic, other than people having a bit of a rough time with the realization that Turing wasn't a pointless waste of a generation after all.

Reviewers kinda did it dirty. You can argue that it was overpriced at first (the 2070 was still pretty good tho), but buying a used 1080 Ti instead of a 2070S to save $50 was a mistake, and buying a 2080 Ti for $950 was an outright bargain. There were plenty of deals that got shouted down in the whole "Turing BAD!" tantrum from reviewers and gamers, and under the general lack of understanding that indeed the cost-trend aspect of Moore's law was over and the price trend was gonna be upwards.

https://youtu.be/tu7pxJXBBn8?t=273

These sorts of videos are just shameful and utterly incorrect to look back on, and it's absurd to think it's the same Steve who made the above videos 2 years before. Nuclear reaction from tech media aside, almost everything Tom's said in their article was correct and has come to pass; it's interesting to re-read the article with 5 years of hindsight. This is exactly the sort of "cost to buying outdated tech" that Steve mocks, and the people who followed Steve's "long-term value doesn't real, buy the older cheaper thing without DX12.2 support" advice will inevitably have to upgrade sooner as a result.

Pascal and Polaris have already received a lot of grace from developers, in the abstract it is crazy that tuning for 8-year-old cards is still such a significant aspect of gamedev today. But there will eventually be a point when Polaris and Pascal fall off, and then eventually can't even launch the game, just like older generations before them. And it seems like it's going to be a very rapid offramp over the next 2 years as studios move onto true current-gen titles and finally stop supporting base-tier PS4/XB1 hardware, because these cards are very very overdue for replacement.

The transition to upscaling being a standard part of the pipeline is going to be another of these "gamer moments" too. People are dead-set on it being "just to boost your framerate", but even if you go back to the original launch presentation, it was always framed as making it possible to run much more intensive effects than otherwise could be. Being behind on this part of the pipeline is problematic for AMD, because it is going to be more and more directly relevant to actual game performance and not just an optional extra. Rendering 4x as many raw pixels is going to throw off all the optimizations etc.


> If you don't buy DX12.2-capable hardware you don't get the DX12.2 features, it's really that simple. Buying Fermi or TeraScale or some other ancient hardware will always be cheaper, but studios aren't just going to not release games because someone is clinging to their 4870. And the situation is not different in kind when we're talking about the DX12.0 to 12.2 transition instead of DX9 to DX11 or DX11 to DX12.

Again, the question isn't whether they should release new games (of course they should!) but whether those new games are meaningfully improved by relying on this new tech (hint: they aren't).

> It is, again, super funny to look back at how uncontroversial this all used to be, when it was AMD who was ahead on DX feature support. Like yeah if you buy a DX11-focused card like Maxwell then your DX12 experience is going to be shit. Doctor it hurts when I do this.

It was uncontroversial because back in the day the new tech led to real, visible improvements that you could tell without needing a side-by-side on/off comparison, not just spec wankery. GPU brand has nothing to do with it. The law of diminishing returns kicks in, and the S-curve slows down. It happens to every category of tech eventually.

> buying a 2080 Ti for $950 was an outright bargain.

Again, you're talking about a $950 purchase for new tech that improves the level of eye candy by maybe 1%. There is no market for that. There is no possible world where that tradeoff is worth it.

But you're never going to understand that, because all you do is look at a spec sheet, where "bigger number must obviously be betterer".


> "It's old" isn't an excuse if there's no real benefit to be had from upgrading.

you are literally seeing the benefit from upgrading right now, mesh shaders increase performance and reduce VRAM utilization substantially.

the game has a non-mesh-shader fallback, but obviously in that case you don't get the performance or VRAM benefit. And then people get upset that it doesn't perform well... but somehow don't make the obvious connection there.

People basically decide a priori that it doesn't have any performance benefits and then discard any evidence to the contrary: it can't be a performance benefit, the legacy pathway is just unoptimized!

they used up all the ~~glue~~ optimization, on purpose!
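
To make the mechanism concrete: with mesh shaders, geometry is pre-chopped into small "meshlets" that the GPU can cull wholesale before doing any per-vertex work, and index data shrinks to small meshlet-local indices. A generic sketch of the data layout (field names and limits are typical of mesh shader renderers, not Northlight's actual structures):

    #include <cstdint>
    #include <vector>

    // One meshlet = a small geometry cluster that a single mesh shader
    // thread group expands; whole clusters can be frustum- or
    // occlusion-culled before any vertex work happens.
    struct Meshlet {
        uint32_t vertexOffset;   // into vertexIndices below
        uint32_t triangleOffset; // into localTriangles below
        uint32_t vertexCount;    // typically <= 64
        uint32_t triangleCount;  // typically <= 126
    };

    struct MeshletMesh {
        std::vector<Meshlet>  meshlets;
        std::vector<uint32_t> vertexIndices;  // meshlet-local -> global vertex
        std::vector<uint8_t>  localTriangles; // 3 bytes per triangle instead
                                              // of 3 x 32-bit global indices
    };

The small local indices are one source of the VRAM savings; the cluster-level culling is where the frame time goes down.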


> you are literally seeing the benefit from upgrading right now, mesh shaders increase performance and reduce VRAM utilization substantially.

And yet, games were able to get on just fine without them for decades. Remedy's previous games worked just fine without them!

What's so special about this one that suddenly prevents it from running acceptably without it? Would anything of value have been lost if they had just reused the Control engine?


what’s so special about quake 3 that it needs hardware transform and lighting? Previous games worked fine without it, even previous games from the same company!

I dunno dude, which pebble is responsible for the avalanche? Graphics standards move forward, why wouldn’t you have coded it in DX9 instead of DX11?

Also Alan Wake 1 came out in like 2008 or something, why would we literally want to be stuck in 2008 level graphics forever?


> what’s so special about quake 3

It's very much obvious to everyone that there was something special about q3, but there's a lot less of the special about the games in the current year. This comparison is unfair and manipulative.


Can you reliably tell the difference in a double blind test?

And choosing to compare against AW1 rather than a newer title which still didn't require this kind of bullshit (like Control) says enough about what kind of faith you come into this with.


Can I tell the difference between 2008 games and 2022 games in a blind test?

Yes. I had a 7850, then a 280. I was an early adopter of G-Sync etc. I returned a 295x2 unopened (for $450 or something, regret) to game on a $270 XB270HU IPS and a 780 Ti for $185 after Maxwell launched. I've had a 1080, I've had a 1080 Ti; I know what I'm talking about.


> Can I tell the difference between 2008 games and 2022 games in a blind test?

That wasn't the question.


I've heard that Remedy has used[1] the D programming language in the past. Is it used on Alan Wake 2?

1. https://ubm-twvideo01.s3.amazonaws.com/o1/vault/gdceurope201...


Seems likely; they are apparently still using the same engine they developed for Quantum Break (Northlight).


The author of that left Remedy in 2017, so I wouldn’t be surprised if they replaced the D code by now. They did at Facebook while I was there.


Most certainly, yes.


Art direction trumps graphics horsepower every time. Tears of the Kingdom is a great example of this. Alan Wake and Control are a great combination of style and visual dazzle.


Visually, I find TOTK extremely underwhelming at 720p on the half-baked potato they call a Switch. It’s like 15 years behind in every way. Nonetheless… to your point, the fact that it’s not worse is a miracle, and a testament to the significance of well-rounded art direction.


Seeing it emulated shows you how amazing the art direction is - it scales extremely well and it's indisputable how beautiful it looks.

One of the hugest misses of Nintendo, to be honest. The art styles they develop are designed to scale down to their hardware, but they also scale way up, and they never provide you an official way to take advantage, even on future hardware. The art team is being done a bit dirty in this respect.


Since there is no 'future hardware' announced yet, how can you state Nintendo will never provide you with an official way to take advantage of it? It seems Nintendo regularly re-releases games on new consoles, often with graphical improvements. For instance, you can play Zelda: Skyward Sword in HD on the Switch.


Never is a long time, but it's not likely to be the next iteration (which was demoed in private at Gamescom). Mostly because the Switch graphics API is very tightly coupled to the specific Tegra chips they use.


You mean you don't get an upgraded version for free when buying a new console. While I'd like that, Nintendo is not big on 'for free' ;)


No, I mean that there wouldn't be backwards compatibility, and Nintendo usually doesn't re-release games immediately (with the notable exception of Mario Kart 8 and some simultaneous releases like Twilight Princess and Breath of the Wild).


Most games are still stuck on their older consoles. Even well known games like Super Mario Galaxy 1/2 and Xenoblade Chronicles X. They port only their most popular stuff. This is why piracy is necessary in the long term. Platform holders don't care about game preservation.


Super Mario Galaxy 1 was part of the 'Super Mario 3D All-Stars' pack for Switch. It's hard to take a case for illegal copying seriously if you start with incorrect examples.

I'm convinced that as long as people want those games they will be made available eventually, but probably not on the timeline you want. I think some kind of 'continued availability' provision in copyright law would be more realistic than an appeal to resort to piracy.


> Super Mario Galaxy 1 was part of the 'Super Mario 3D All-Stars' pack for Switch.

It was on sale for a limited time and apparently got delisted. Better than nothing, I guess, but if you missed it, then it might as well not exist.

My point still stands: pirated copies of those games are available 24/7. As GabeN said, piracy is a service problem, and Nintendo fails to provide a better service.


Still doesn't give you any right. Lobby your politicians if you want better copyright laws. Being a pirate will only strengthen the case for those who believe strong copyright enforcement is necessary.


So your argument is that if pirates don't stop it empowers people who hate pirates?

If pirates stop, there's no alternatives to get a lot of media in many cases. There is zero incentive to stop because you can't get worse than nothing. Nothing being what is available in many cases for older media.

You want to bolster your argument that pirates should stop? Then don't take away all legal means of getting something. I have zero sympathy for copyright holders who refuse to make an IP available anymore yet litigate against those who do.


> it scales extremely well and it's indisputable how beautiful it looks.

I disagree - I had a google and found [0]. This is emulated, at 4k with raytracing. It's a good decade behind what we're seeing come out these days IMO. It's dated, suffers from the hallmarks of not-quite-HDR (that lovely shiny surface that we all learned to love during the late 360/PS3 days).

[0] https://www.youtube.com/watch?v=rA7BAbMQU3c


Tears of the Kingdom came so late in the lifecycle of the Switch, and the next Zelda is so far off, that I wonder if Nintendo will do some type of light re-release on Switch 2 (or whatever it is called). I read that they did a closed-door demo of Tears of the Kingdom running on Switch 2 hardware.


They most likely will. The same happened with the previous Zelda game.


> It’s like 15 years behind

Did you expect 4090 performance on a system you can pick up for less than $300?

Finished TOTK last week and had a lot of fun with it. I'm not blind and can clearly see jaggies, frame drops and resolution drops, but nowhere in the game did that really interfere with the core gameplay or make it impossible to enjoy it.

While newer systems have more detail, most of the time it's imperceptible since you don't have time to look at the scenery anyway and when it matters (e.g. when rendering faces) it still doesn't look realistic to me. Burning a kilowatt just to get a bit more detail or framerate for the same old gameplay just sounds insane to me.


> While newer systems have more detail, most of the time it's imperceptible since you don't have time to look at the scenery anyway and when it matters

But higher resolutions never were about more detail. At least not for TVs (and thus consoles).

At the end of the SD era (576p) I had a 28" TV. At the end of the HD era (1080p) I had a 50" TV. That's a very similar pixel density. We're now partway through the 4K era (2160p) and I currently have a 77" TV. 8K TVs are not here yet in any significant numbers; my last 4K TV will likely be around 100". When we move to 8K it won't be because we want a more detailed image, it's because we want a wider field of view. We need more pixels so our screens can get larger.
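
Quick sanity check on that, assuming 16:9 panels throughout (including a late-SD-era widescreen set), where panel height is roughly 0.49 x the diagonal:

    576 rows / (0.49 x 28")   = ~42 rows per inch
    1080 rows / (0.49 x 50")  = ~44 rows per inch
    2160 rows / (0.49 x 100") = ~44 rows per inch

So a ~100" 4K set lands right back at the same density, which is the point.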


I think pixel density is a strange metric for a screen you're watching as a whole (instead of reading text from a part of the screen). If you want a bigger screen you can take the same content and upscale it (early digital cinema showed mainly 2K movies, and that was about as sharp as film). These days lots of "4K" Blu-ray discs just contain an upscaled 2K movie and most people don't seem to be bothered.

Also, I don't think the market will ever get to 100" TVs as the norm. You're an outlier: most people have other priorities (they don't want the TV to dominate the room, don't have room, don't want to pay for the extra energy required, etc.).


“when rendering faces […] it still doesn’t look realistic to me”

This gets to the heart of the matter for me. No matter how much hair sways like real hair, skin wrinkles like real skin, or eyes get a glassy look when emotion suggests they should, real humans can spot weirdness from miles away.

That person you see walking towards you, barely visible in the dark and without your glasses on? The tiniest stiffness in the way they made that last step, the way their elbow shifted, have raised your awareness and you’re now considering them a threat, and will watch them carefully and keep your distance.

Or the way they get up from the table they are sitting at clues you into the fact they have a pulled muscle in their lower back or upper right leg and you instinctively watch a little closer in case they fall and you can help somehow.

The scowl and glare of a woman who is playfully admonishing her kids and encouraging them to play along.

While we asymptotically approach ‘perfect’ rendering and simulation, we are still just alongside it, unable to climb out of the uncanny valley.

A game can look beautiful with some creativity and care in its visuals, and it seems like nearly everyone who’s played Tears of the Kingdom or Breath of the Wild is enamoured with their beauty.

I often play just to travel around and enjoy the beautiful landscapes. Even The Depths is enchantingly beautiful, reminding me of what you can see while scuba diving. I wonder if this was intentional.

So yes, while I do appreciate a super high resolution game with rock solid megahertz framerates and physically correct lighting, these are additive in their enhancement, not multiplicative.

Spending effort on these that could have been spread more evenly across art direction, gameplay, music and other aspects serves only NVidia, AMD and those who enjoy ‘high fidelity’ visuals or the game (in its own right) of chasing hardware that is capable of running these games.

To me, there is a parallel with the 'audiophile', who lusts after higher and higher 'fidelity' and perpetually upgrades their equipment but only ever listens to a playlist titled "Songs to test headphones with". Spine-tingling cymbal wash and awe-inducing bass sweeps are amazing.

Others are here for the remaining 90% of the music, and are happy to listen on their phones, and don’t even notice when a live performance isn’t a clone of album tracks.

I personally fall into both camps. I love to put on some headphones and listen to ‘audiophile’ music, but that’s an hour or two every few months. Most of the time I’m having a blast with medium-fidelity, ultra-fun music … and gameplay.


> I often play just to travel around and enjoy the beautiful landscapes. Even The Depths is enchantingly beautiful, reminding me of what you can see while scuba diving. I wonder if this was intentional.

You just made me wish for a Wind Waker like game with a full underwater world ;)


All that, yet modern games can't render mirrors.


They can, but it's really expensive so most devs avoid it. Currently the performance hit is not worth it.

Games could render realtime reflections using various tricks even 20 years ago.

I remember that in Hitman: Blood Money, enemies could spot you in a mirror and mirrors were functional. It's a game from 2006, and mirrors were placed only in small rooms.
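
For reference, the classic trick is rendering the scene a second time from a camera mirrored across the mirror's plane into a texture, then sampling that texture on the mirror surface. The plane-reflection matrix is standard math; here's a rough sketch using GLM (illustrative, not from any particular engine):

    #include <glm/glm.hpp>

    // Reflection matrix for the plane n.x + d = 0 (n must be normalized).
    // Note GLM is column-major: m[column][row].
    glm::mat4 MakeReflectionMatrix(const glm::vec3& n, float d) {
        glm::mat4 m(1.0f);
        m[0][0] = 1.0f - 2.0f * n.x * n.x;
        m[1][0] = -2.0f * n.x * n.y;
        m[2][0] = -2.0f * n.x * n.z;
        m[3][0] = -2.0f * n.x * d;
        m[0][1] = -2.0f * n.y * n.x;
        m[1][1] = 1.0f - 2.0f * n.y * n.y;
        m[2][1] = -2.0f * n.y * n.z;
        m[3][1] = -2.0f * n.y * d;
        m[0][2] = -2.0f * n.z * n.x;
        m[1][2] = -2.0f * n.z * n.y;
        m[2][2] = 1.0f - 2.0f * n.z * n.z;
        m[3][2] = -2.0f * n.z * d;
        return m;
    }

    // Mirrored pass: viewMirrored = view * reflection. Remember to flip the
    // triangle winding (reflections invert handedness) and clip geometry
    // behind the mirror plane before compositing onto the mirror quad.

The catch is exactly the cost mentioned above: that second scene render roughly doubles the work for every mirror in view, which is why old games confined mirrors to small rooms.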


Modern games have been rendering mirrors with ray tracing for a couple of years already.


Exactly. It runs on a "potato" yet it still looks better than many AAA games and their generic bland art direction and janky animations.


It’s also a AAA game, it’s just running on a potato and looks the part. It would be a better game if it was developed for hardware from this decade.


Lots of people forget that Breath of the Wild was developed for the Wii U. The engine for TOTK is actually two generations old.


But Control is one of the most graphically intense games of the pre-Unreal 5 era. And it was the demo game for Nvidia's latest graphics features - raytracing, DLSS.

They didn't neglect the style and art direction, but it absolutely required graphical horsepower. Maybe, though, you need top-tier art direction to sell the sizzle of top-shelf graphics.


I played Control on a PS4 Slim and it still looked phenomenal without all the raytracing stuff.


I managed to play Control and really enjoy it on an RX590 and an Intel i5 processor.


It didn’t require it. It just was one of the first major titles that supported full ray tracing. If you didn’t use that, then it could run on much more modest systems and still look great.


It's clearly not the best setup, but I'm currently playing Control on a Steam Deck, and I very much like what it manages to pull off on such light hardware.


Control's art direction is beautiful and its atmosphere really unsettling. It's a unique game. I'm in love with almost every game Remedy develops.


"visual marvel" and no gameplay images? No frame breakdowns? No even comparisons of high/low settings at fps and rigs.

Is this an Ad?


I wanted to see some of it with the maximum settings and found this video channel where they just walk around (no talking or fighting) without abrupt movements. The game looks fine, but the ground litter looks very flat when lit by the flashlight. The light also seems to illuminate through water to hit e.g. the bottom of a stream, which they show in the video a bit, but there's no reflection of the light off the water's surface in realtime. I also noticed the tree geometry regularly changing complexity at about 3-4 meters away in an abrupt way. After playing 2077 with the new ray reconstruction enabled, this is a great-looking game but not pushing the envelope for realism.

https://www.youtube.com/watch?v=E_dXLgmdqeM


Strangely, Digital Foundry don't [yet] have a video out with the max settings that a PC can provide.

They have a PS5 review[1], and a 'this is how best to get console quality settings on a PC' comparison with a PS5 on performance settings.[2]

1. https://www.youtube.com/watch?v=JawxvOF__4Q 2. https://www.youtube.com/watch?v=QrXoDon6fXs


Not a gamer, so pardon my ignorance.

Does it always rain in the game? How depressing.


Well, it is set in Washington...


And "older GPUs" are the GeForce RTX 2060 / Radeon RX 6600?


The lowest end of the offerings from 2 generations ago? Yeah.


Yeah, right.

I can run basically any game at 1080p with at least 60fps with my 1080ti, except this game.

What they did with the mesh shaders is basically the equivalent of releasing a raytracing-only game.

So now I have to wait until GPUs become affordable again, which might not happen for a couple of years still. Shame, as this was the game I was the most excited about this entire year.

And no, spending money to get the same (or worse) GPU which supports mesh shaders wouldn't be worth it.


That GPU came out almost 7 years ago; it's only natural its lack of modern features starts being apparent. I still remember not being able to run new games once Shader Model 3 became standard. It's sad, but what can you do?


The problem is that there are no good upgrade paths for the 1080 Ti. It's roughly as powerful as a 3060 Ti, but there's no point upgrading to a card with similar performance and less VRAM. So the only upgrade path with a noticeable performance gain is to buy an RTX 3090 or RTX 4090, and those are very expensive.


One small caveat: the "came out" date is irrelevant; the relevant date is the date an alternative appeared. In this case the real info is that the replacement came out ~5 years ago (or ~4, depending on the definition).

And the situation is worse for AMD users: it's only been 3 years since a compatible board came out.


Sure, but there was a new generation of consoles in the meantime. The example I gave of Shader Model 3.0 had to do with the same thing, new consoles appeared.

The real culprit here was the chip shortage, cryptocurrency bubble bursting, rising inflation, and overall instability throughout the world (deteriorating China and US relations, for example).

All in all, it's just how things have always worked out, with the exception of external factors that had unforeseen consequences.

I'm sure Remedy put in a ton of work, and is trying to raise the bar in graphical fidelity. Decisions like these had to be made early in the game dev stage, since all the game assets would've taken them into account.

At the end of the day, gimping progress in real time CG for the sake of compatibility with a 7 year old card makes no sense.


Your GPU is 6 years, 7 months old. As an analogy, consider someone in 2007 objecting that their 2001 GeForce3 Ti500 can't run Crysis/Mass Effect/etc. The PS4 generation really messed with the usual conventions and expectations of PC upgrade cycles.

(I appreciate that GPU prices have crept up and up over time, though.)


For most people, Moore's law died around 2013 with Haswell or even Ivy Bridge. In 2007, 2001 was forever ago (in particular, 2001-07 involved the jump from the space-heater Pentium 4 to proper 64-bit multicore chips such as the Athlon 64 X2 and Intel Core 2 Duo/Quad, as well as the growth of dedicated gaming video cards; even the GeForce3 was uncommon in 2001, while by 2007 everyone had a video card).

In 2007, CPUs and GPUs were still getting twice as fast (perceptibly) each year. That hasn't been true for a while. Other than lacking a TPM, my i7-3770k machine with a GTX 970 runs as fast in desktop use as my i9-9900k + 2070 Super, and that machine (which dates to... 2019, I think?) still plays new game releases at 1440p just fine.

Recall that most games are designed for the XBox Series X (released 2020) and PS5 (released 2020) and still target that caliber of GPU performance.


The mesh shader issue isn't a Moore's law raw-performance issue. It's an issue with the hardware not supporting specific graphics pipelines. The software fallback is slower than dedicated hardware, like how AES-NI on a CPU is many times faster at doing AES. That's why the parent comment likens it to newer GPUs' hardware ray tracing features.

"Caliber of GPU" is not just performance, but also features. The Xbox Series supports mesh shaders (and the PS5 has its equivalent); Nvidia 10-series and AMD RDNA1 GPUs and older do not. This YouTube video compared two GPUs released around the same time, one with less performance but mesh shader support and one with more performance but no mesh shader support: https://youtu.be/UiduP4Y7RSw


> Recall that most games are designed for the XBox Series X (released 2020) and PS5 (released 2020) and still target that caliber of GPU performance.

This cycle is extremely unusual though, the length of the cross-gen period has been dramatically extended by both covid shortages and also disruptions to the game development pipeline from covid and the Russian war among other things.

Once developers are no longer forced to validate for literal base-tier PS4 hardware from 2012 (8 Jaguar netbook cores and an underclocked 7850, oh my), GTX 970-tier hardware is going to immediately drop off a cliff. And frankly even Pascal is not going to age well. There were lots of improvements and features in Turing that got downplayed by reviewers from 2018-2022, and now they are really starting to come into play (and would have done so earlier had it not been for the pandemic).

You can hardly say we’ve even seen next-gen games at this point, tbh. Even CP2077 is a cross-gen title - which in many ways functionally means “last-gen”.


I went on a quest last year to reduce latency on my desktop and while coding, using an 8 kHz polling-rate mouse, a pro gamer keyboard, and two 1440p 165 Hz monitors. I read a fascinating article here on Hacker News about it (in November 2022; I’d have to look up the article). Anyway, it feels great. If you’re wanting things to feel faster, consider these upgrades! I highly recommend them. For the first few weeks it’d feel like I had started typing before I started typing, which was a weird experience. I’d just become so used to a ton of latency.


The lesson here is that you need to bite the bullet and get top of the line at least once

Because then you always have some capital within the hardware to upgrade again without breaking the bank

But if you upgrade mid life cycle or even worse: to a mid range card that’s already mid life cycle, then you’re always getting less life span and having to do full upgrades again


> I can run basically any game at 1080p with at least 60fps with my 1080ti

No, you can’t.


There's lots and lots of other great games to play, so don't stress if you miss one.


Have you considered geforce now? A lot more affordable than a GPU. You get a 4080 for $20 a month, no commitment.


You can buy an RTX 3080 for $400 or a 3090 for $650-700 right now. Those can both run the game fine. I got mine from some gamer kid in a beat up van at the gas station and it’s working great.


> You can buy an RTX 3080 for $400 or a 3090 for $650-700 right now

New or with 10 000+ hours of 100% load mining coins ?


Stable load is better for GPU longevity than cycling between hot and cold repeatedly. Also, lots of mining farms will undervolt their cards for more efficiency. Getting a used mining card is probably better than getting one from a gamer.


> Stable load is better for GPU longevity than cycling between hot and cold repeatedly.

Same for cars but I'll buy a 30k km car over a 300k km car every day


I spent a bit of time fiddling with the graphics settings for 2880x1600 last night, and settled on 30FPS at max settings and 1280x800 internal rendering, which looks pretty good. This is with a 3080, a 7800X3D, and 64GB of RAM, but wow, the game looks really, really nice. It’s 30FPS in the forest and 60FPS in town, which is interesting. All that foliage takes a lot, apparently.

I had to take the settings way, way down to get to 60FPS, and I don’t know, as I get older I prefer things to look nice and smudged vs. high frame rates and crummy.

I’m just enjoying the heck out of walking around the forest and staring at necrotizing obese ray traced wieners. The details are astounding!

Also, if you’re looking to upgrade your GPU, it’s a great time. I bought this one off some kid for $400 and 3090s are selling for $650.


The fact that a game runs at almost 720p and hits 30fps in demanding areas on high-end last-gen hardware is slightly concerning. Are developers not optimizing their games anymore?


It is actually quite well optimised. It is one of the best looking video games ever made.

Maxing everything means enabling path tracing which is a massive performance hit. Basically to do that you need a 4090 or a cheaper 40xx card and use frame generation.

The “PS5/Series X equivalent” settings are a mixture of low/medium without any hardware ray tracing (I think it still does some software ray tracing, similar to UE5's Lumen), using FSR2 with post-processing effects done before the upscaling (at low resolution). On PC, if you use high post-processing effects, they happen after the upscaling and thus at much higher resolution (this is the “heaviest” option outside of the ray tracing stuff).

Remedy hasn’t made a game that targets “native” resolution since Max Payne 2. Basically, starting with the first Alan Wake game they have used lower internal resolutions and spent the budget on other graphical effects. (On PC you still always had the option to use native resolution.)
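
To make the ordering point concrete, here's a toy sketch of the two paths (all names made up for illustration, not Remedy's API):

    #include <cstdint>

    struct Texture { uint32_t w, h; };
    struct Scene {};

    Texture Render(const Scene&, uint32_t w, uint32_t h) { return {w, h}; }
    Texture Upscale(const Texture&, uint32_t w, uint32_t h) { return {w, h}; } // FSR2/DLSS-style
    Texture PostFX(const Texture& t) { return t; } // bloom, grain, depth of field...

    // Console-equivalent path: post effects run at the low internal resolution.
    Texture ConsolePath(const Scene& s) {
        return Upscale(PostFX(Render(s, 1280, 720)), 3840, 2160);
    }

    // PC "high post processing" path: effects run after the upscale, at output resolution.
    Texture PcHighPath(const Scene& s) {
        return PostFX(Upscale(Render(s, 1280, 720), 3840, 2160));
    }

Same internal render either way; the only difference is whether the post-processing pass touches 1280x720 or 3840x2160 pixels, which is why the PC "high" option is so much heavier.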


I think it’s just the reality of how many bells and whistles there are, especially with ray tracing. It looks really good, and it doesn’t seem unoptimized (generally you’d expect weird hitches in stuff like that; I definitely noticed that in Baldur’s Gate 3 with lighting effects). There’s a sumptuousness and wetness and warmth in all the right places to the environments that really feels like you’re walking along a path at sunset in an old-growth forest.

The feature sets involving AI in the newest cards are really the key, I think, and why my last-gen premium card is lagging behind. DLSS frame generation on the 4080 and 4090 especially is just revolutionary in how it works. The last-gen cards just can’t quite keep up. It’s like magic, making frames out of seemingly thin air.

In my opinion this is a good problem to have! I’d rather have games pushing the state of the art than sticking to what the PS5 and Xbox whatever it’s called these days can do.


The comment mentions they're running at max settings. So even current-gen high-end hardware might not be the target. As long as the lower settings scale down fine to lower end hardware, who cares about max settings? That's what the original article is about.

Max settings also means enabling path tracing, which Nvidia's current-gen 4000 series has better hardware acceleration for. So you can't really compare performance to previous hardware which doesn't have that hardware acceleration. No amount of optimization is going to fix that, and it should probably be turned off if you're not on a 4000-series or newer card.

In that same way, AMD's path tracing hardware acceleration falls behind Nvidia's. A current-gen high-end AMD RX 7900 XTX does worse than Nvidia's last-gen not as high-end 3070 in Cyberpunk 2077's path tracing update. https://youtu.be/cSq2WoARtyM?t=503


Video games are slapped together because deadlines must be hit. The most popular titles can cost >$100 million to produce so there is huge pressure to get it out the door. Customers expect cutting edge graphics which contributes to production budget bloat (so does marketing). A lot of PC games are console first and ported over. The new console generation (PS5) means more VRAM is available so games expand to fill it. Just like how Blu-ray was a large increase over DVD capacity. Later 50GB was a ceiling commonly bumped up against. Once 50GB was exceeded why not have a 100GB install size since you’ve blown past the Blu-ray limit anyway?


This is not true about AW2 though - it's pretty clear that it's a PC-first title. Consoles run the game at medium-low settings. Max settings are for future PCs. Even the 4090 can't run it @ native 4k 60 fps with everything maxed out.


I've been playing with a 3090. I play in 1440p 21:9 ultrawide. So far it seems that medium settings with DLSS 720p and raytracing on medium is the most solid one. On high settings it's mostly fine, but in the forest it gets slower, and sometimes very slow (like 5-10 fps) when there's additional effects like smoke or something.


Alan Wake sits very comfortably at the top of my personal chart as the best game experience I have ever had.

I truly look forward to a year or so from now when the game finally makes its way to Steam, fully patched and ready to go for Linux play. Until then, I'm content with playing the vast catalogue of non-exclusive games available.


It's unlikely that it will ever come to Steam; it was funded and published by Epic, so it's basically a second-party game. I don't see it coming to Steam unless EGS closes. Remedy made a deal with Epic to publish 3 games; some people say Alan Wake Remastered was the first, and Alan Wake 2 is the second.


It’s not “some people say”. It’s their yearly/quarterly reports saying that. (Also it’s two games)

https://investors.remedygames.com/app/uploads/2023/03/remedy...

They also have 2 more games with 505, the Max Payne remakes with Rockstar, and one game with Tencent.


Other people corrected me about the Epic deal being only 2 games, but for some reason I always forget that and say 3, because that's what I remember.

The 2 games with 505 are Control 2 and Project Condor (Control coop spin-off).

The Tencent game is Project Vanguard, a live service game, hinted to be related to Control, and the Keystone AWE.


This game is published by Epic, it likely won't come to Steam for a very long time.


Ughhhh, I won’t shoot the messenger but that is so annoying to hear.


I'll gladly wait. Don't see the rush.


It was funded by Epic - so it's unlikely to be a timed exclusive.


I guess they'd rather have people pirate it then.


Game not being available on steam is a pretty weak excuse for piracy. Epic simply doesn't want to give 30% to a competing store.


Interesting, it may never go to Steam.


I'll wait


Your determination is unmatchable


I'm a PC gamer. I couldn't give a toss what frame rate I'm getting, or what the graphics benchmark is.

I get the fascination of tinkering with the build, and optimising stuff, and so on. But eventually it's got to come down to actually playing the game, right? This article is a classic case in point: not once, in the entire article, does it talk about the actual game. It only talks about graphics performance. Who buys games just to run them at 60fps at 4K? Surely at some point you've got to actually, y'know, play the game and enjoy that game?


The overwhelming majority of game reviews address the content of the games, but there are reviewers that concentrate specifically on graphics performance because the difference between a low end and high end gaming PC can easily be $3k+, and half that is in the GPU nowadays. People use articles like this to decide what to buy to get the optimal experience in games they want to play, not which games they want to play in the first place.

Alan Wake 2 merits specific consideration on this front because its system requirements seemed to suggest it wouldn't even be playable on a lot of popular configurations.


I think you are right, but what you are ragging on is the meta of PC gaming. It's like an old muscle car nut trying to optimize the top speed of their car, acceleration, etc. They enjoy driving it, but the tweaking is an additive experience. It may eventually become the experience altogether.

If this isn't you, consoles or mid-tier PCs are a great fit.

I would say I straddle the two - if I buy great hardware, I expect to capitalize on it. But I don't overclock shit or care exactly what speed my RAM is, nor do I need the fastest SSD or top tier everything.


Due to how my head is wired, I've discovered that I enjoy gaming a lot more after I switched to console two years ago: all settings are curated (for better and worse). I'm not even sure why I didn't switch earlier.

The only reasons I can think of are fear of the controller and slow load times due to mechanical HDD on last-gen systems.

I don't spend time away from gameplay trying to get optimal setting. I'm just the type of neurotic that I will do that stuff even though I absolutely don't enjoy it.

I can jump directly into gaming without anxiety over how my hardware is falling behind. With my PS5 being in my living room, gaming is also time away from my desk, which I, much like doing anything on a Windows PC, associate with my IT jobby job.

A console is an appliance. It's like a toaster or dishwasher, except it lets me play games without any work.

With PS5, I also don't have to worry about weird glitches with audio through HDMI in my living room setup, since that's how I prefer to play. And PS5 now also renders Dolby Atmos, without any glitches.

These are of course all me-problems, but I thought I'd mention it here, since gaming is too much fun to be confined to the tuning-happy garage mindset of PC.


Yeah, it feels like a parallel hobby - it's not actually gaming, because the game doesn't matter; it's optimising the hardware. The game is just the benchmarking tool for the graphics setup.


Offering a good faith alternative explanation, I would hope that the main reason many people obsess over benchmarks and performance metrics is because GPUs are damn expensive — buyers would want to know that the money they're spending has tangible benefits. I'd like to think that once someone actually loads a game in order to play it, they'll actually focus on the game.

(As a parallel example, I love the idea of seeing films in 3D, 4K, HDR etc, but I rarely notice the difference within ten minutes of the film starting. There's a reason why digital cinema got along fine at 2K for so long: extra detail really, really, really doesn't matter.)


I think you’d care if you were getting less than 30 fps and the game is virtually unplayable.


I started PC gaming back in the 90's. <30 fps is perfectly playable, depending on the game. Some strategy and RP games are slideshows and play fine.


It was for me as well; like, I was happy to be able to play Morrowind at ~15 fps at the time. But we can do better now; back then, hardware was a limiting factor, but now if a game runs like shit, it's always the developer's fault.

Recent example: Cities: Skylines 2; one reason it runs like shit is that the pedestrians still get fully rendered teeth even though you're far away and can't see them. Why do they even have teeth? It's a city-building sim.

Or the latest Pokemon game, it looks like it could've been made 20 years ago in terms of visuals, but it runs terribly.


Some say that less than 120fps is literally unplayable.

I remember when CoD 4 being 60fps on Xbox 360 was a big deal.

Virtually unplayable is in the eye of the beholder.


Those people are ridiculous. You can have high fps but inconsistent frame timing, which makes games feel terrible. I think that was the bigger issue with CoD.


I know plenty of "gamers" whose #1 obsession is how good the game looks to them, not how well it plays. They generally dislike games with non-photorealistic rendering.


There's a big slice of the gamer market that treats games and GPUs like an interest in sports cars. The benchmarks and the visual spectacle are an interest in themselves. These are subjective preferences.


I mean, ultimately, yeah -- the gameplay is the thing!

But I think there's most definitely room for articles that nerd out on some specific technical aspect of the work and I don't think they imply that the actual "fun factor" of the game is unimportant.


I absolutely love and appreciate folks that push tech boundaries in gaming, but I absolutely agree - let's not lose sight of the reason why we make and play games!


A company's "give a shit" level is not confined to one area of development. If they cared about the gamer when it comes to graphics they probably cared about the gamer when it comes to gameplay.


If anyone is playing on Linux with an AMD GPU and has texture issues (missing FBI text on the characters' jackets is the giveaway), it can be fixed by applying a patch to mesa-git: https://github.com/HansKristian-Work/vkd3d-proton/issues/175.... It should hopefully be implemented soon. I think it's limited to RDNA2.


Graphics aside, is it a good game? Do you like it? My wife and I enjoy playing through story games, but only if they’re not dumb.

Spiderman 2 is awesome. It was one of the few preorders that have ever worked out for me. So I’m cautiously optimistic about recent mainstream games, and the preview for Alan Wake looks neat.


The first game was good. This one is not. I don’t understand how a big game like this could have you just walking around doing nearly nothing for several hours straight. Every puzzle and lock solution is basically on a sticky note next to it in-game, removing what minimal challenge the game presents. Movies are more engaging.

As for the story, it’s just Alan Wake 1 retold but louder and in your face about it.


> I don’t understand how a big game like this could have you just walking around doing nearly nothing for several hours straight

Alan Wake 1 had a lengthy walking-around/watching-cutscenes first part, so this sounds good to me: I like walking simulators (e.g. Gone Home, Firewatch). Not every game has to be about blowing things to smithereens or solving over-rehashed puzzles.

The most recent example for me was Scorn: it was fantastic until I got to the area where you get a thinly veiled shotgun thing with mandatory combat, plus those rotating puzzles over and over whose essential logic has been seen a thousand times (in stark contrast with the first big room's environmental puzzle, whose logic wasn't that complex but was immersive), at which point I dropped the game as it was just killing the mood for me.


> I don’t understand how a big game like this could have you just walking around doing nearly nothing for several hours straight

Isn’t that exactly how Alan Wake 1 starts out?


I started AW1 the other day because it had been on my list after playing through Control. There's definitely a lot of walking around and cut scenes but I wouldn't say you do nothing. You do have to fight people, collect items and do some very simple puzzles.


Huh fair enough thanks for clarifying. How does Alan Wake 1 hold up?


Thank you! Exactly what I was hoping to hear, unless it was actually good. Thanks for saving us from disappointment.


The first couple hours are intentionally slow. It's not "doing nothing" it's building suspense and laying out the foundation for the story.

I don't understand why people need to hear whether games are "worth it" based on someone else's opinion.


It helps to make a decision about whether it's worth buying. Of course opinions can vary wildly, which is why looking up multiple reviews and/or videos might be a wise idea.

As an example, I've struggled to get people to fully play one of my favorite platformers of all time, VVVVVV. And then the much vaunted Ocarina of Time bores me to tears and I can't push myself to get through it.


It makes sense to make a purchase decision based on others' opinions, yes, but what I don't understand is why sillysaurusx seems to have already decided based on one response, and without knowing how (dis)similar Drybones's taste in games is.

Alan Wake is a great game, but I consider Alan Wake 2 to be much better; I also enjoyed Control more than Alan Wake and enjoyed all aspects of Quantum Break. Where you fall on those games will probably most affect how much you enjoy Remedy's latest.

How much you enjoy unorthodox multimedia narrative storytelling, psychological and cosmic horror meta-narrative weirdness, and how much you accept being confused by design jankiness in certain spots as part of the experience: stuff like that will decide how much you enjoy this game. Personally, its only real flaw is that mystery solving can be a little too handholdy and bruteforce-able.


No, it makes zero sense to predetermine one's artistic interpretation based on outside influence. Absolutely none.

The idea that anyone needs another person to tell them what is likable is just sad.


> As an example, I've struggled to get people to fully play one of my favorite platformers of all time, VVVVVV. And then the much vaunted Ocarina of Time bores me to tears and I can't push myself to get through it.

VVVVVV appears to be a fairly low-budget indie game with graphics that haven't aged very well compared to modern indie games. I'm sure it was fun, and it seems it was very well received when it came out.

Ocarina of Time was a AAA game that came out during the first real mainstream transition toward 3D graphics. It effectively kickstarted the RPG and free-roam genres while still presenting a typically polished Zelda experience. It was fun when I played it (when it came out) but it has not aged very well compared to modern games. If you didn't play it when it came out you're probably not going to enjoy it - but at least you can respect its impact on gaming.

Anyway the point is that games don't age well and similar to music, people don't tend to like other people's favorite games unless they're highly aligned in the first place.

Reviews are pretty pointless as well but in general you can tell in the first 2 hours of a game if you're going to like it or not, at which point you can choose to return it or not return it.


VVVVVV holds up perfectly. Good level design, good physics, good gimmick at the core of its game play. The graphics were designed to be evocative of the Commodore 64 era. It's hard for that to really age when that was the intent in the first place.

Ocarina of Time bored me in 1998. It still does. It hasn't "aged poorly" in my view; it was never good in the first place. (Yes, it's my opinion. :P)


> Ocarina of Time bored me in 1998. It still does

Oh my, so I am not alone. I always felt like it was terrible, especially compared to the GB and SNES ones, and whatever games were out on PC/PS/DC around that time (not so much about the graphics, but the game's pacing, controls, and mechanics). I feel the same about GoldenEye.

I feel like these games got a lot of success not because of what they are but because it exposed a chunk of players (Nintendo die-hards) to a type of game that wasn't previously available to them on their favourite platform.


VVVVVV was released during the initial wave of indie game popularity. It may not be the most influential of that group, but it's more relevant than it probably seems from a present-day standpoint.


Okay I’ll take your word for it. Like I said it appears to have been well received. Gameplay looks interesting.


VVVVVV has extremely simple and polished gameplay; what makes it interesting is how the same gameplay mechanic is constantly challenged and reinvented through very clever level design. It looks like a metroidvania as far as exploration is concerned but is the complete opposite in terms of progression: the only thing that locks you out of an area is pure player skill, reapplying what you learned to go further. "Oh, you can do this" instead of "oh, I unlocked this and so can now access that".

It was an instant classic for me.


You defeat your own point. OOT is nearly universally adored, and you don't like it. So others' opinions are worthless when it comes to determining your own taste.

Experience the art and make up your own mind. You either like it or you don't. The end.


Random internet people are, hilariously, all you can even trust now.

Reviewers have been pretty questionable of late, with a few obvious gaffes, but Starfield was the last straw for me. A wall of 10s for reviews, but the game is a clear 6, maybe 7 on a good day.


> Reviewers have been pretty questionable of late, with a few obvious gaffes, but Starfield was the last straw for me. A wall of 10s for reviews, but the game is a clear 6, maybe 7 on a good day.

Pretty much all the reviewers I follow gave Starfield reviews explicitly mentioning how boring it is and how obsolete it feels for 2023.

So which "reviewers" are you quoting and why aren't you reading the ones that match your taste in games?

And how the heck are random people on the internet more trustworthy to you? There are thousands of people who lost their shit because Starfield didn't get perfect 10/10 scores.


Starfield looked incredibly boring from the trailers and preview videos. I had to shake my head and roll my eyes at the review scores. I’ll stick to Halo if I want to constantly jump and shoot aliens.


I must be about 50h into Starfield and I'm on NG+1.

I found it extremely boring at first. I hugely disliked the potato graphics, the clunky animations and the rubbery faces.

Now I find I pick it up to kill time and have fun while doing so. The story is not incredibly deep, but it's interesting enough. The gameplay is not 2023 AAA quality, but it's decent enough. The endless interruptions for fast travel and loading (even for crossing doors and riding lifts! :-o ) are pretty low effort, but they don't bother me enough to stop.

Pretty much all aspects of the game are good enough but never great. I'd probably give it a 6/10 but with a caveat that it's something I keep going back to.

This is the first time I'm actually playing a Bethesda game by the way. I tried Skyrim a few times and always found it extremely boring and clunky. I tried Fallout 76 and it was the same. This is the 3rd of their games I'm trying to play and so far it's been going ... quite ok! And I'm glad I'm playing it, it's fun despite its (many) shortcomings!


I was confused about Starfield until I saw https://youtu.be/lHiP5OPZ2sA?feature=shared , which explains what it actually is: Fallout, but in space. Also apparently it takes about 12 hours to get into it.


That's my take on it, and it's what I found so increasingly boring about those games.

Starfield is Fallout, but in space; Skyrim is Oblivion, but in the north; Fallout is Oblivion, but in a post-nuke retrofuture; and Oblivion is Morrowind, but in a thinly veiled Roman Empire setting.

I mean, from gameplay to quests, it's the exact same thing, reskinned. E.g. porphyric hemophilia was cool in Morrowind, but the exact same setup + quests is reproduced over and over again in subsequent games (incl. across franchises!). I'm half wondering (and would not be surprised) if Starfield has vampires as well.

It's nice if you enjoy the thing a lot (good for fans! I'm all for them enjoying it), but it's otherwise so repetitive that what was fun back then isn't anymore, and a fresh coat of paint increasingly failed to save the later entries.


Why do you need to trust anyone else at all? It's art. Experience it for yourself. Make your own decisions.

Going into the experience with your expectations already dialed in defeats the entire purpose of experiencing art.


> Thank you! Exactly what I was hoping to hear, unless it was actually good. Thanks for saving us from disappointment.

Eh let's be real - you decided you didn't like that game the moment you saw any sort of press about it.


It is good, but you have to go into it with different expectations. Skill Up has the best review of it that I've seen [0]: the game is actually a 4th-wall-breaking meta commentary on the story-driven and mystery game genre as a whole, with live-action parts interspersed with the gameplay and cutscenes.

[0] https://www.youtube.com/watch?v=jh1vq0SljoU


I hate meta. It always sucks.


> Thank you! Exactly what I was hoping to hear, unless it was actually good.

So you were just fishing around to confirm your preconceived bias that it would be bad?

(Pretty much all reviews in the media have been universally VERY good.)


The first game was more action oriented while this one is more like survival horror. I think it's a refreshing change from pushing back the hordes as it's much more of a tense and atmospheric slow-burn.

Personally I'm hugely fond of it, but I absolutely love the weirdness of Remedy's extended universe, and for me it's less about the gameplay and more about the story and atmosphere. So my bias couldn't be more apparent, but I think there's a lot more to it than what you're dismissing it for.


I liked Control; there's a lot of story that you can discuss. It's like if someone made a game about SCPs but with an actual overarching story.


As a counterpoint I’m two and a half hours in and like the pacing. I’ve been drinking in the scenery and just had some really creepy stuff happen. I really like the whole “descent into madness” trope in my games though, and this definitely is scratching that itch.


You should draw no conclusions about quality or how well a game will run at launch from a game by another studio.


I didn't like it; it feels like the story lost the Twin Peaks charm.


I have a 2070 Super GPU and realized that for most modern games, the bottleneck is my CPU. It seems like most games have pushed a lot of complexity onto the CPU these days.


The GPU in the PS5 is roughly equivalent to a 2070 super. You'll probably find pretty much everything that also releases on console will run just fine until the next console generation becomes the new performance target.


It is beautiful even on my 2070. But strangely mirrors are completely broken.


Have you tried toggling Vampire Mode?


Yes, because they are using SSR (screen-space reflections) for mirrors on lower settings. SSR can only reflect what's already rendered on screen, and a mirror mostly shows what's behind the camera, which screen-space data can't provide, so the reflection falls apart.


I really tried to enjoy the first game, but it was too derivative of its influences (Twin Peaks, In the Mouth of Madness, Stephen King novels) for me to get through. Control did a much better job with this, including its retelling of the Alan Wake narrative. I had hoped the sequel would be better, but from everything I've seen, they've doubled down on the pastiche. Hopefully the next Control game won't fall victim to the same.


Most of pop culture is derivative, and the benchmarks we use reveal more about our age than about true originality.

That said, the references to Control in this game definitely feel a bit hamfisted. You even encounter a familiar character from Control, and they seem to have a lot less depth than in the original game.


I mean, there’s a reason we have the notion of pastiche vs. influence. Some works are derivative or unoriginal in a way that damages their artistry. Twin Peaks was wildly influential, but not everything it inspired does the same kind of cut-and-paste stuff that Alan Wake does. Control is, in fact, a better example of this - it’s thoroughly Lynchian without regurgitating so much of the specifics.


For me the main issue with the first Alan Wake was the awful controls. I stopped playing when it became clear I was spending more time wrestling with the game than actually playing.


Yes, the controls were quite clunky and unfortunately the remastered edition did little to fix them.


If you're up for it, I recommend the review of Alan Wake 2 by YouTuber "SkillUp"; it makes some good cases for why someone might enjoy it. He also compares it to the studio's other games.


For me it was the other way around. I loved the Alan Wake 1 story: it was clear and easy to follow, with enough mystery. Control was a mess; I didn't understand what I was achieving or had to do. The fact that it was "open world" inside the offices of that agency made it even worse. I skipped it and couldn't continue, despite the fact that I would have loved to learn more about Alan Wake's backstory.


Control (the base game) was anti-climactic, but I wouldn't call it a mess. The story isn't really about the main character, and that certainly affects the aforementioned ending, but everything that's going on is explained in the media you encounter throughout the Oldest House.


I really like the visual artistic direction... but the Mind Place feels like a Jira simulator.


Started playing it yesterday with ray tracing turned off. Still absolutely stunning!


Are people really that invested in the ray-tracing hype being so necessary for visuals?

From what I've seen, unless you're comparing RTX ON vs OFF side by side, it's not noticeable enough to make a difference.


Alan Wake 2 doesn't really have the option to disable ray tracing, only to disable hardware ray tracing. Software ray tracing is pretty good for diffuse GI nowadays.


It depends how games use it. Spiderman 2, for example, uses ray tracing for reflections, which are very important in New York (glass buildings, water, cars), and even uses it to draw rooms inside the skyscrapers as you climb them. It's important enough that ray tracing cannot be turned off even on the lowest quality settings.


No raytracing needed for the window interiors; it's shader wizardry:

https://www.alanzucconi.com/2018/09/10/shader-showcase-9/
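
For the curious, here's a minimal single-ray sketch of the interior-mapping idea described in that article, written in plain C. Real implementations run per fragment in a pixel shader and use the hit to sample a room texture or cubemap; all names here are illustrative, not taken from any engine or the article.

    /* Rooms are a virtual 3D grid behind the building facade. For a view
       ray entering at the window, find the nearest virtual ceiling/floor,
       side-wall, or back-wall plane along the ray; the closest hit is the
       "room surface" whose texture you'd sample. */
    #include <math.h>
    #include <stdio.h>

    /* Distance along the ray to the next virtual room plane on one axis.
       o = ray origin component, d = ray direction component,
       size = room size on that axis. */
    static float next_plane_t(float o, float d, float size)
    {
        if (fabsf(d) < 1e-6f) return INFINITY;
        float plane = (d > 0.0f) ? floorf(o / size + 1.0f) * size
                                 : ceilf(o / size - 1.0f) * size;
        return (plane - o) / d;
    }

    /* Which virtual surface does the ray hit first?
       0 = side wall, 1 = floor/ceiling, 2 = back wall. */
    static int interior_hit(float ox, float oy, float oz,
                            float dx, float dy, float dz,
                            float rx, float ry, float rz)
    {
        float tx = next_plane_t(ox, dx, rx);
        float ty = next_plane_t(oy, dy, ry);
        float tz = next_plane_t(oz, dz, rz);
        if (tx <= ty && tx <= tz) return 0;
        if (ty <= tx && ty <= tz) return 1;
        return 2;
    }

    int main(void)
    {
        /* 4x3x5 m rooms; an eye ray enters a window and heads inwards. */
        const char *surf[] = { "side wall", "floor/ceiling", "back wall" };
        int hit = interior_hit(0.5f, 1.5f, 0.0f,   /* point on the pane */
                               0.2f, -0.1f, 1.0f,  /* view direction    */
                               4.0f, 3.0f, 5.0f);  /* room dimensions   */
        printf("ray hits: %s\n", surf[hit]);
        return 0;
    }

The trick is that no interior geometry exists at all: the planes are implied by the grid math, which is why it's so cheap compared to ray tracing actual rooms.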


According to Fitzgerald, ray/path tracing is used for interiors in the most recent release:

[0]: https://youtu.be/fuu_wseJnIE?t=3m26s


Insomniac is incredible. I have great appreciation for studios willing to maintain their own engine and consistently push technological boundaries. Same with Remedy.


With the horsepower behind Alan Wake 2, Control 2 is gonna be nuts


That was for the first, which needed to run on a PS4.

Spiderman 2 uses a very different system.


This article is from 2018 and talking about the original Spider-Man PS4 game. And I think it's still wrong - as far as I know they used cubemaps for the interiors in that game.

So I think this article is mostly just an ad for a Unity addon.


The article gives examples of the cube-mapped interiors: https://forum.unity.com/threads/interior-mapping.424676/#pos...

You’re right about the new one though, they’ve apparently gone to raytracing for Spider-Man 2’s windows. I wonder if they’ll stick to raytracing always enabled if they do a PC port of this one.


You can do 4K@30 for the most part on a 4090 with max settings, no supersampling or frame generation, with max ray tracing on. In the early bit in the forest it'd dip down to 21-24 FPS though, so I dropped the render resolution to 1440p.

Though if you flip on frame generation at 4K max settings, you can do 60+ FPS.


Started playing Max Payne for the first time last night - great game so far.

Remedy know what they are doing


You mean the first Max Payne? Oh man, you are in for a great time! Parts 1 and 2 are absolute classics. There are so many quotable lines, and the atmosphere and attention to detail are amazing.


In stark contrast to Cities Skylines 2


Interestingly both come from Finland. I guess the dev teams didn't share performance tips!


Is there one of those "breakdown of a frame" articles? I'd be curious to understand the secret sauce.


I haven't seen one of those for years, is there anyone still doing them? They are great. They're a huge amount of work though so I wouldn't expect one for a new game.



Nice, thank you.



Thank you.



An article about how visually amazing the game is on older GPUs, without showing more than 3 very dark screenshots that don't really show much. Sometimes I really wonder how people pick their article images.


What kind of performance are you all getting?

On my RTX 2070 Super, I played the first hour so far at a render resolution of 1080p (with DLSS upscaling to 4K), medium settings, no ray tracing, at roughly 30-40 FPS.


I think this game is one of the best pieces of storytelling ever created, and it's so weird and arthouse that I can hardly believe there's an audience for it.


Too bad the tech was wasted on a horror game. Tons of people don’t want to play those.


Guess it's finally time to upgrade my R9 Fury from 2015..


Wish they hadn't made this an Epic exclusive.


Supposedly it even runs on the Steam Deck.


All that power and they still fail to position the camera directly behind the back... Hopefully the same mistake won't be repeated in the Max Payne remakes.


Even Max Payne 1 had the titular character slightly to the left. It's on purpose.


I’m not sure I’ve played a game where putting it directly behind doesn’t feel terribly uncomfortable. The player character ends up in the way of what people are looking at and moving towards.


I've gotten used to it, but I never liked it, because the direction of movement is decoupled from the direction of the camera. Moving the mouse left and right also swivels the camera around a moving, invisible point looking inwards, instead of staying around the character and looking outwards (see the sketch below).

Also, is the obstruction you speak of really an issue? Not once did I think "God damn, move over!" at the character I was controlling from straight above-behind in ye olde days.
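
To make the distinction concrete, a toy top-down sketch in plain C (illustrative names only, not from any engine): the classic camera orbits the character directly, while the over-the-shoulder camera orbits a pivot offset sideways from the character, which is the "moving invisible point" above.

    #include <math.h>
    #include <stdio.h>

    /* Place a camera `dist` units behind a pivot along the yaw direction
       (2D top-down view, yaw in radians, view direction = (sin, cos)). */
    static void orbit(float px, float py, float yaw, float dist,
                      float *cx, float *cy)
    {
        *cx = px - sinf(yaw) * dist;
        *cy = py - cosf(yaw) * dist;
    }

    int main(void)
    {
        float yaw = 0.3f, dist = 4.0f;
        float cx, cy;

        /* Classic third-person camera: the orbit pivot IS the character
           (placed at the origin), so mouse swivel circles the character. */
        orbit(0.0f, 0.0f, yaw, dist, &cx, &cy);
        printf("centered cam: (%.2f, %.2f)\n", cx, cy);

        /* Over-the-shoulder: the pivot is shifted perpendicular to the
           view direction, so the character sits left of screen centre
           and the orbit point moves with that shoulder offset. */
        float shoulder = 0.6f;
        float px = cosf(yaw) * shoulder;   /* pivot pushed to the     */
        float py = -sinf(yaw) * shoulder;  /* character's right side  */
        orbit(px, py, yaw, dist, &cx, &cy);
        printf("shoulder cam: (%.2f, %.2f)\n", cx, cy);
        return 0;
    }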


Camera behind the player is preferable for platformer games where you are concerned about your movement.

Over the shoulder (offset behind the player) was popularized by Resident Evil 4. It makes sense because you want to focus on what's in front of you, with a clear canvas for aiming, while also keeping the character in view as part of the game.


Remedy sold off Max Payne to focus on their other IP.


They are officially developing the Max Payne remakes


> The hopes are high that Alan Wake 2's optimization and performance will only continue to improve with the release of post-launch updates, including multiple expansions.

The idiocy of people. A game must be in a shipping-ready state when it launches. End of story. If you wait for a game and buy it on day one, you do that because you want to experience the whole thing at that very moment, not wait for possible updates that would elevate it to a level where it runs much better on your hardware.


Seems entirely reasonable to both enjoy something now and hope it becomes even more enjoyable.


After you watch a movie, do you hope for something more? The cut scenes, maybe, which are always horrible?


Games are a different medium than movies. When was the last time you spent twenty hours watching a movie? My library is full of games I have spent a hundred hours or more on. So yeah, I expect a game to be good the day I buy it. But I also like it when teams update a game after launch.


It was an example, and the same applies to games. I never replayed a game because of a patch, never waited for any kind of update, and never will. When a game is released, that's the state I'm interested in, that's what I will base my reviews on, end of story.


That’s fine. Nobody cares about your reviews if they refer to a version of the game that no longer exists though.


Oh people nowadays and their bs.


Oh, older people who are unable to adapt to changing conditions.


Aside from the problem of jumping between media types, yes?

Have you never watched a film and wanted them to make a prequel, sequel or anything else in the same imagined universe? Never read a book and wanted more in the series?

As for fixes, if you watched a film and there was some awkward CGI in it would you think "wow I really hope they never improve that"?


You are talking about a new game in the same universe. That takes years to come together. You cannot just sit down and say, OK, here are a few more hours of content for you. That takes time, a lot of time, and effort from a multitude of people. That's why 99% of DLCs are trash.

There is no such thing in movies as bad CGI for its time. No one in their right mind would ruin tens of millions in investment with bad CGI nowadays. CGI reflects the state of the art, or a level that is acceptable/affordable and won't knock the viewer out of their immersion.

Maybe for people who work in the industry and have an eye for what to look for, the CGI might have issues, but viewers won't notice anything odd.

Also... movies either work or they don't. I even rewatch movies from the 30s. There are movies, like Fritz Lang's M (1931), that were decades ahead of the industry in pacing and storytelling and are still (re)watchable almost 100 years later.


> A game must be in a shipping ready state when it launches. End of story.

I agree, but reality doesn't care about our opinions; it is how it is.


Most of them are in a shipping ready state.



