Long-lived, high-activity nuclear waste amounts to less than 3,500 m³ (roughly one Olympic swimming pool), and that is since the start of civil nuclear power generation in the 1950s. Worldwide.
20 swimming pools of total waste isn't that impressive. I don't want to live near it, but I'm sure we can find a place to put it that will have minimal impact on people's lives.
Exactly. The waste isn't really a problem. But it doesn't have to be waste, that's the point. All that U-235 sitting in 'spent' fuel silos? You can get 60x-100x its original energy output by feeding it to next-gen reactors. So cool.
I guess you mean the "super hot for centuries" minor actinides (Np-237, Am-241/243, Cm-242/244/245, etc.)? These are less than 1% of global waste, but next-gen reactors can still eat them. The majority of the waste (95%+) is uranium, then plutonium, which next-gen reactors also eat.
Currently renovating our house; everything will be KNX based. It works offline, with no servers needed (not even within the house), though one is nice for visualization. It's a standard with 500+ vendors of compatible hardware. Highly recommended.
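For flavor, a minimal sketch of driving a KNX actuator from Python with the open-source xknx library (the group address and device name here are made-up examples); the point is that it talks straight to the bus through a KNX/IP gateway, no cloud or in-house server required:

    import asyncio
    from xknx import XKNX
    from xknx.devices import Light

    async def main():
        xknx = XKNX()          # discovers a KNX/IP gateway on the local network
        await xknx.start()
        # "1/0/1" is a hypothetical group address for a switching actuator
        light = Light(xknx, name="Living room", group_address_switch="1/0/1")
        await light.set_on()
        await asyncio.sleep(1)
        await light.set_off()
        await xknx.stop()

    asyncio.run(main())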
The Swiss, like the French, have hundreds of different cheeses, so calling one specific cheese "Swiss cheese" is like asking "Why is the American car powered by an electric motor?" while talking about one specific Tesla model, when there are hundreds of other cars manufactured in the US.
Off-Topic: Thank you for not only providing a stellar service with Fastmail, but also for contributing back to the OSS ecosystem and the specifications/RFC work. This takes a lot of time and we all benefit from this work. It helps many people/small IT shops to run a system outside of the "big ones". Again, thank you.
My kids play Fortnite on a PS4. It works, they are happy, and the rendering looks really good to me (but I am an old guy). Usually the only problem while playing is the stability of the Internet connection.
We also have a lot of fun playing board games and card games, simple stuff in terms of design; there, the gameplay is the fun factor. Yes, better hardware may bring more realism, more of x or y, but my feeling is that the real long-term driver is the quality of the gameplay, like the quality of the storytelling in a good movie.
Every generation thinks the current generation of graphics won't be topped, but I think you have no idea what putting realtime generative models into the rendering pipeline will do for realism. We will finally get rid of the uncanny valley effect with facial rendering, and the results will almost certainly be mindblowing.
Every generation also thinks that the uncanny valley will be conquered in the next generation ;)
The quest for graphical realism in games has been running into a wall of diminishing returns for quite a while now (see hardware raytracing: all that effort for slightly better reflections and shadows, yay?). What we need most right now is more risk-taking in gameplay from big-budget games.
I think the inevitable near future is that games are not just upscaled by AI, but they are entirely AI generated in realtime. I’m not technical enough to know what this means for future console requirements, but I imagine if they just have to run the generative model, it’s… less intense than how current games are rendered for equivalent results.
I don't think you grasp how many GPUs are used to run world-simulation models. It is vastly more compute-intensive than the currently dominant realtime rendering of rasterized triangles.
Yeah, which is pretty slow due to the need to autoregressively generate each image frame token in sequence. And leading diffusion models need to progressively denoise each frame. These are very expensive computationally. Generating the entire world using current techniques is incredibly expensive compared to rendering and rasterizing triangles, which is almost completely parallelized by comparison.
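To put rough numbers on that (purely illustrative assumptions, not measurements), a back-of-envelope sketch of what autoregressive per-frame generation would cost:

    # Illustrative only; patch size, model size and fps are assumptions.
    PATCH = 16                      # assume one token per 16x16 pixel patch
    W, H = 1920, 1080               # a 1080p frame
    tokens_per_frame = (W // PATCH) * (H // PATCH)     # ~8,000 tokens

    params = 1e9                    # assume a small 1B-parameter frame model
    flops_per_token = 2 * params    # ~2 FLOPs per parameter per generated token
    fps = 60

    tflops_needed = tokens_per_frame * flops_per_token * fps / 1e12
    print(f"~{tflops_needed:.0f} TFLOP/s just to emit tokens")   # prints ~965
    # A current console GPU offers on the order of 10 TFLOP/s for the whole game.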
Okay you clearly know 20x more than me about this, so I cannot logically argue. But the vague hunch remains that this is the future of video games. Within 3 to 4 years.
I don't think that will ever happen, due to extreme hardware requirements. What I do see happening is that only an extremely low-fidelity scene is rendered, with only basic shapes and little or no texturing, which is then filled in by AI. DLSS taken to the extreme: not just resolution but the whole stack.
I’m thinking more about procedural generation of assets. If done efficiently enough, a game could generate its assets on the fly and plan for future areas of exploration. An asset doesn’t have to be regenerated every time the player moves around: just once, then it’s cached until it’s no longer needed.
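A tiny sketch of that caching idea (the function names, chunk size and radius are hypothetical placeholders): generate assets lazily per world chunk, keep them in an LRU cache, and pre-warm the chunks around the player.

    from functools import lru_cache

    @lru_cache(maxsize=256)          # evicts chunks that haven't been needed recently
    def generate_chunk_assets(cx: int, cz: int) -> dict:
        # Expensive procedural / generative work happens only on the first visit.
        return {"mesh": f"mesh_{cx}_{cz}", "texture": f"tex_{cx}_{cz}"}

    def chunks_around(px: float, pz: float, radius: int = 2, size: float = 64.0):
        cx, cz = int(px // size), int(pz // size)
        for dx in range(-radius, radius + 1):
            for dz in range(-radius, radius + 1):
                yield cx + dx, cz + dz

    def on_player_moved(px: float, pz: float) -> None:
        # Pre-generate (or just hit the cache for) areas the player might explore next.
        for chunk in chunks_around(px, pz):
            generate_chunk_assets(*chunk)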
Even if you could generate real-time 4K 120 Hz gameplay that reacts to a player's input and the hardware doesn't cost a fortune, you would still need to deal with all the shortcomings of LLMs: hallucinations, limited context/history, prompt injection, no real grasp of logic / space / whatever the game is about.
Maybe if there's a fundamental leap in AI. It's still undecided if larger datasets and larger models will make these problems go away.
I actually think many of these are non-issues if devs take the most likely path, which is simply a hybrid approach.
You only need to apply generative AI to game assets that do not do well with the traditional triangle rasterization approach. Static objects are already at a practically photorealistic level in Unreal Engine 5. You just need to apply enhancement techniques to things like faces. Using the traditionally rendered face as a prior for the generation would prevent hallucinations.
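As a rough sketch of what "rendered face as a prior" could look like offline (the model id, prompt, file names and strength value are assumptions, and this is nowhere near real-time): a low-strength img2img pass that enhances the engine's frame rather than inventing a new one.

    import torch
    from diffusers import StableDiffusionImg2ImgPipeline
    from PIL import Image

    pipe = StableDiffusionImg2ImgPipeline.from_pretrained(
        "stabilityai/stable-diffusion-2-1", torch_dtype=torch.float16
    ).to("cuda")

    rendered = Image.open("ue5_face_render.png").convert("RGB")  # hypothetical engine output

    enhanced = pipe(
        prompt="photorealistic human face, natural skin detail",
        image=rendered,        # the traditionally rendered face acts as the prior
        strength=0.25,         # low strength: stay close to the rendered geometry
        guidance_scale=7.0,
    ).images[0]
    enhanced.save("enhanced_face.png")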
Good luck trying to tell a "cinematic story" with that approach, or trying to prevent the player from getting stuck and unable to finish the game, or even just reproducing and fixing problems, or getting consistent results when the player turns their head and then turns it back, etc. etc. ;)
There's a reason why such "build your own story" games like Dwarf Fortress are fairly niche.
Yes, that's something I failed to address in my post. I myself have also been happier playing older or just simpler games than chasing the latest AAA with cutting edge graphics.
What I see as a problem, though, is that the incumbent console manufacturers, sans Nintendo, have been chasing graphical fidelity since time immemorial as the main attraction of new console generations, and may have a hard time convincing buyers to purchase a new system once they can't eke out impressive gains in this area. Maybe they will successfully transition into something more akin to what Nintendo does and focus on delivering killer apps, gimmicks and other innovations every new generation.
Or perhaps they will slowly fall into irrelevance and everything will converge on PC/Steam (I doubt Microsoft can pull off whatever plan they have for the future of Xbox), any half-decent computer will run any game for decades to come, and Gabe Newell becomes the richest person in the world.
I can't figure out what Microsoft's strategy is with the ROG Ally X or whatever it's called. The branding is really confusing, even on just the devices. It gets even more confusing with Xbox on PC.
Are they planning on competing with Steam, and that's how they'll make money? I have a Steam Deck and I've zero interest in the ROG Ally; Windows 11 is bad enough on my work PC.
That's the Nintendo way. Avoiding the photorealism war altogether by making things intentionally sparse and cartoony. Then you can sell cheap hardware, make things portable etc.
Handheld devices like the Switch, Steam Deck, etc. are really the future.
Phones count to some extent too, but gaming on a phone vs gaming on a dedicated handheld is a world of difference.
Give it a few generations and traditional consoles will be obsolete. I mean, we literally have a lot of people enjoying indie games on the Steam Deck right now.
Unreal engine 1 looks good to me, so I am not a good judge.
I keep thinking there is going to be a video game crash soon, from oversaturation of samey games. But I'm probably wrong about that. I just think that's what Nintendo had right all along: if you commoditize games, they become worthless. We have an endless choice of crap now.
In 1994, at age 13, I stopped playing games altogether. Endless 2D fighters and 2D platformers were just boring. It took playing Wave Race and GoldenEye on the N64 to drag me back in. They were truly extraordinary and completely new experiences (me and my mates never liked Doom).
Anyway, I don't see this kind of shift ever happening again. In fact, talking to my 13-year-old nephew confirms what I (probably wrongly) believe: he's complaining there's nothing new. He's bored of Fortnite and Minecraft and whatever else. It's like he's experiencing what I experienced, but I doubt a new generation of hardware will change anything.
> Unreal engine 1 looks good to me, so I am not a good judge.
But we did hit a point where the games were good enough, and better hardware just meant more polygons, better textures, and more lighting. The issue with Unreal Engine 1 (or maybe just games of that era) was that the worlds were too sparse.
> over saturation of samey games
So that's the thing. Are we at a point where graphics and gameplay in 10-year-old games is good enough?
> Are we at a point where graphics and gameplay in 10-year-old games is good enough?
Personally, there are enough good games from the 32-bit generation of consoles, and before, to keep me from ever needing to buy a new console, and these are games from ~25 years ago. I can comfortably play them on a MiSTer (or whatever PC).
Nearly, if the graphics aren’t adding to the fun and freshness of the game. Rewatching old movies instead of seeing new ones is already a trend, and video games are a ripe medium for the same thing.
Now I'm going to disagree with myself... there came a point where movies started innovating in storytelling rather than in the technical aspects (think Panavision). Anything that was SFX-driven is different, but the stories movies tell, and how they tell them, kept changing even where the technology was already there.
I get so sad when I hear people say there’s no new games. There are so many great, innovative games being made today, more than any time in history. There are far more great games on Steam than anyone can play in a lifetime.
Even AAAs aim to create new levels of spectacle (much like blockbuster movies), even if they don’t innovate on gameplay.
The fatigue is real (and I think it’s particularly bad for this generation raised to spend all their gaming time inside the big 3), but there’s something for you out there; the problem is discoverability, not a lack of innovation.
Hmm, wrong? If everyone can make a game, the floor rises, which makes the "industry standard" for a game really high.
While I agree with you that if everything is A, then A doesn't mean anything, the problem is that A doesn't vanish; it just moves up to another, higher tier.
You probably have a point, and it's not something I believe completely. My main problem, I think, is that I have seen nothing new in games for at least 20 years.
Yokoi: When I ask myself why things are like this today, I wonder if it isn’t because we’ve run out of ideas for games. Recent games take the same basic elements from older games, but slap on characters, improve the graphics and processing speed… basically, they make games through a process of ornamentation.
Do not worry, I am mentoring a young engineer on my team. It is painfully hard to get him to improve his code, because it works. It is badly structured, with lots of small "impedance mismatches" and lots of small security issues, all in three Python files.
I have a team of 10 engineers, and the quality of the code they produce together with the LLM of the day correlates even more strongly with their experience.
My impression over the past 6 months (before that, we had no "official" access to LLMs) is that they increase the gap between junior and experienced developers.
Note that this is my limited impression from a team of 10 engineers. It matches Simon's feeling, in a good way for you!