This is bad, very bad. A complete and utter disrespect for the people who poured their blood and sweat into creating this content for them, for free. Migrating only "top" content is insufficient and ignorant: the value of a wiki is almost entirely in the long tail. The vast majority of Unreal Engine is undocumented, and you hit all kinds of issues few have seen before. If it weren't for the community creating content for them, the engine would be unusable for many users. To rub salt in the wound, they aren't even telling you exactly why they are doing this. Why is making a read-only archive so hard, at the very minimum? Didn't expect this from Tim Sweeney's company.
I posted recently about how bad UE4's C++ documentation is, and how it drives people to Unity. They have been promising to make it better for seven-ish years, from what I gather in the forums. What is with them?
Looks like the post got updated with some reasoning:
> Why can’t we put a read-only archive online currently?
The Wiki, even in its read-only state, was presenting security risks, and it was deemed necessary to take it offline.
I did this for a forum I used to host for many years. I crawled versions for people (using a user with the same rights as they had, so it wouldn't contain anything extra), complete with static assets and referenced images, and zipped that up so they could browse the old posts locally and reminisce. It worked really well.
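Roughly, that recipe is two commands (hostname hypothetical; the flags are standard wget):

```
# Mirror the site, rewriting links for offline browsing and pulling in
# the CSS/JS/images each page needs.
wget --mirror --convert-links --adjust-extension \
     --page-requisites --no-parent \
     https://forum.example.com/

# Bundle the result so it can be downloaded and browsed locally.
zip -r forum-archive.zip forum.example.com/
```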
Maybe the people at UE need to be taught how to use wget.
The only other "security risk" I can imagine such a read-only wiki could present is if it documents something that could be considered "risky" from a security perspective.
That would make this more of a censorship situation.
> We still have the data, and as mentioned above, we will work to migrate the top content into various official resources. In the meantime, we’re investigating how we can make this content available, even in a rudimentary way.
Static content needs a web server and a way to move the content to the server (FTP, SFTP, rsync...), plus an SSL certificate, a DNS entry, DDoS protection, and a way to manage credentials for all of those things. It's not zero risk at all. There are plenty of ways to attack a static site. The only way static is safer is that you're not executing scripts on the server (which is a massive win, I don't want to downplay that aspect); everything else is the same as a dynamic site.
That's still a lame excuse though. They could have outsourced all the "hard" stuff to a free Netlify account.
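For the record, once someone has the scraped files, publishing them that way is about one command with the real netlify-cli (the directory name here is hypothetical):

```
# Deploy a directory of static HTML to production on a Netlify site.
# 'wiki-dump/' is a hypothetical directory holding the scraped pages.
netlify deploy --prod --dir=wiki-dump
```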
Like what problem, specifically, can you not mitigate? If you can't just host static content on a new, empty server and get it going, how does the rest of the world work at all?
I'm in a similar boat. What other similar tools have you found yourself using? For example, I've been learning the hell out of wget, but I find my tool library lacking, or lacking in tools I trust.
The thing that seems super strange to me is that they don't seem to have warned people they would do this. The first comment is:
> This isn't very helpful, Amanda! I know that the wiki wasn't optimal, but there were many wiki pages developers like me had bookmarked for years because they contained comprehensive and easy information, which is now missing. Why not just keep the wiki read-only online? Just to retain the old pages? I'm pretty lost right now without some of these articles and I don't understand why the only option you had was to completely disable it. Please think about opening it up again just for read. I don't care about the maintenance mode, but the wiki was an important learning point, which is now gone.
If you don't want to support the wiki, that's fine; you don't owe anyone hosting. But if you're going to dump it, at least give someone the opportunity to scrape the site and host it themselves.
Lesson I've long learned: never bookmark anything. Bookmarks are to temporarily remember a URL, not to archive content. If the content is important or meaningful to you, save the page.
Yes this. Grab the 'singlefile' add-on[0] and train yourself to hit that instead of bookmarking. You'll be much happier if you ever need the information again!
Ah, right, of course, so that I can later access that page locally and install the extension whenever I need it. Then with that extension installed, I could even save that extension page!
I'm surprised at myself for shilling a service I don't even use and don't have any stake in, but https://pinboard.in/ lets you bookmark things and then also archive those things so they don't go away.
Absolutely. When it comes to sites with mostly static content, I have a habit of archiving them with HTTrack, though I wish there were a more modern solution with better support for active content.
For some inexplicable reason, some pages pull in content through script tags. So you archive a page, the original goes away, and when you visit your copy you notice all the broken links pointing to JavaScript files that are gone as well.
Archiving even just a single webpage is non-trivial. You're dependent on the internet archive if you want a reliable backup.
Such a solution presents a bit of a challenge though, given that you'd be 1) broadcasting a security issue, and 2) possibly compounding it by presenting your audience with some really disagreeable news.
I can at least see why they'd hesitate to leave things up, depending on the anticipated risk and likelihood of addressing it in reasonable time.
Edit: Downvote if it makes you feel better, but this is really how groups execute on problems like this without taking time away from other important projects. "Security issue? Extensive fixes needed? Take it down!"
Can’t you just lock the system down and isolate it enough so the security vulnerability is a non-issue? Certainly there’s an ops solution to things like this.
wget or other site rippers can just make static content out of it. You can write a script to put a notice/header at the top and host it on nginx .. or an S3 bucket.
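A sketch of that, assuming the mirror already sits in a local wiki-mirror/ directory and using a made-up bucket name:

```
# Inject a read-only notice right after each page's <body> tag;
# '&' in the replacement re-inserts the matched tag itself.
find wiki-mirror -name '*.html' -exec sed -i \
  's|<body[^>]*>|&<p><b>Archived copy. No longer maintained.</b></p>|' {} +

# Push the static tree to an S3 bucket configured for website hosting.
aws s3 sync wiki-mirror/ s3://ue4-wiki-archive/ --acl public-read
```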
They could provide the public or a trusted third party with a database dump, or put it on a separate hosting service. With a bit of effort, either course would relieve them of security concerns.
It shouldn't be hard to save a static copy of the actual HTML and replace the dynamic site with that though. Or at least give others enough time to do the same.
It's not hard, except when it is and it takes weeks of work. From my own experience:
1) httrack wasn't able to crawl anything, for some reason; wget managed only with special flags available in the latest version (roughly the invocation sketched after this list).
2) All the links end up broken. Wikis have their own linking system between pages that is processed on the fly; an archive with every link hardcoded to wikidump.example.com/wikidump/page really doesn't work as a drop-in replacement.
3) More challenges with saving pictures and attachments.
4) Unicode characters in some pages that broke the page and/or the crawler.
5) Infinite crawling and a ton of useless pages. Consider that every URL with a query string is a separate page. /page?version=50&comparewith=49
6) Crawling a large number of documents takes forever; an hour could be wasted on each try. Consider tens or hundreds of thousands of unique URLs to save (see point five). I really wish the crawler could parallelize and run on the same host as the wiki.
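For anyone attempting the same, the flag soup I ended up needing looked roughly like this (URL and reject patterns hypothetical; --reject-regex needs a reasonably recent wget):

```
# --restrict-file-names=windows : sidestep exotic/unicode filename breakage (point 4)
# --reject-regex                : skip per-revision query-string URLs (point 5)
# --wait                        : throttle politely; wget itself won't parallelize (point 6)
wget --mirror --convert-links --adjust-extension --page-requisites \
     --restrict-file-names=windows \
     --reject-regex '[?&](action|version|diff|oldid)=' \
     --wait=1 \
     https://wiki.example.com/
```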
Good points. Perhaps I should have said, "It should be very possible," or similar instead. I'm sure there would be challenges, but I would expect them to be surmountable.
Sounds like someone accidentally deleted it and they have no backup. Instead of admitting to not having a backup, they can just say it was intentionally shutdown, and ask their staff to salvage whatever is available from archives.
My first thought as well, but my second thought was why wouldn't they own up to it? Surely they know that owning up to something like this earns them much more respect and positivity from the community than "taking it down" for no good reason, or worse trying to cover it up.
In my experience this sort of decision is always driven by sales / marketing people deciding they want to funnel the users into some other part of the site that nobody currently uses because it's not as good.
Not necessarily sales/marketing, but an Old Thing is not a new thing you'll be praised for, it's just another annoying cost that comes out of your budget. Maintenance is boring.
Building a New Thing comes with excitement and praise.
Microsoft has done this for so long that their own people strongly recommend using URLs under https://aka.ms/, their long-term link-maintenance service, so that when yet another "exciting" change happens to the entire Microsoft web site you can still find all the vital documentation. Maybe their "Knowledge Base" articles, for example, will become a wiki again, and then a social networking site, and then a blog the week after, and then a different wiki with newly inscrutable URLs. But the aka.ms link can be updated so that you don't need to spend an hour navigating.
The more important maintenance becomes to a company's actual financial health the more senior management rebel and become sure their destiny is to radically reinvent the company. If directors did their actual job (working for the shareholders, for whom "exciting" aka "massively loss making" isn't the goal) the very next thing you'd hear after the CEO announcing the company has a new name, new branding, new logo, would be the Chairman arriving to tell everybody actually it's not a new name, new branding or new logo, just a new CEO, sorry, no big leaving do he was escorted off site by security.
Some marketing people cannot grok user-generated content and absolutely freak out over the perceived lack of control. It's completely irrational but I've had pinhead marketing droids kill my content before with no replacement plan. It's infuriating.
"Your bonuses are tied to the amount of traffic your blog posts generate" is such a 2020s KPI. And I've heard of more than one person whose compensation package has those metrics.
I think corporate upper management is the only echelon of business capable of such waste and cynicism. Maybe they noticed that traffic was going to the wiki and not to the dedicated support pages.
> But to just destroy information, information about your own product, is…well, it's stupid. Profoundly so.
From a purely historical/retrocomputing point of view, I'm really disappointed IBM took the legacy Library Server site offline.
It was full of fascinating detritus... old OS/2 manuals, various ancient layered products that IBM dreamed up once upon a time back in the 1990s or early 2000s and which never went anywhere (DDM, DCE, etc)...
Why couldn't they just donate the contents to archive.org or something like that, if they don't want to host it anymore?
At least they still have the offering information site online – https://www-01.ibm.com/common/ssi/ – full of product announcements from the 1980s.
Their support site also still has a bunch of articles from the '90s in the database. You can get them to show up in searches on their own support site, but they're not browsable and hence not indexed by Google.
It had been in read-only maintenance mode for quite a while. Then they just took the whole thing down with no warning. The Wayback Machine copy unfortunately seems to be missing a lot of articles.
Really stupid. UE4 documentation is generally crap, and the wiki resources were invaluable. The information might be outdated, but it at least gave you a starting point for figuring out where to look in the source for more.
Since I started using Ogre3D, I've always had a hard time settling into feature-rich engines like Unreal or Unity.
I don't know how often giving beginners access to a space shuttle leads to a successful project that can compete with non-indie game developers.
There is also a fine line between an indie team of developers who can benefit from those tools, and experienced game developers who would not need them.
It seems Unreal and Unity are just very capable but cheap tools that are well-marketed towards students and beginners. The problem is, once those developers have learned to use those tools, they are still unable to develop a game without them, which is a huge win for Unity and Unreal.
Generally I tend to believe Unreal and Unity only enable developers to make games that will never be able to compete with more established and skilled game developers. I think it's a pretty sad situation, because initially I really believed indie games were able to compete with those big studios, but they're not, and I think Unity and Unreal are responsible for this. It seems the whole FOSS mantra/paradigm/philosophy has a lot of trouble penetrating the field of game development, maybe because games are heavily monetized towards consumers, unlike other software. It bothers me.
As someone who's written engines, worked in bare bones do-everything-yourself frameworks, and used Unity, not everyone has the time or ability to sit down and code almost everything from scratch. Some people have simple ideas that involve mostly gluing parts together and big engines are fine for that. Some can program everything if they want to, but they want a decent physics, rendering, and user input engine from the get-go so they can get to work on their ideas and not endless boilerplate code.
Most indie gamedev projects I've seen die involve people who are primarily programmers who get stuck in framework hell. They're endlessly trying to build and tweak basic features for a basic framework, when any major engine has all of those features out of the box or with a 5 second package download.
I've had to get myself out of the habit of doing everything myself, admitting defeat, and downloading existing packages. Oftentimes someone out there has something better than what I wanted to make and easily tweakable, saving weeks of time and still allowing me to give it my own touch.
Even for a total amateur who won't use 1% of the features, Unity and Unreal have two huge things: support and a community. You can ask a question anywhere about anything and generally there's someone who can help and even give a precise solution. That's huge.
I like reinventing the wheel on the game logic side. The payoff for novel ideas is often pretty high. Not everything is a tiled side scrolling platformer or a top down RPG. However, with graphics it's the opposite for me. The amount of setup needed to even show a single triangle feels like a waste of time. You're tied to a single graphics API so you either have to write the code again and again for each API or you use an engine. The payoff is pretty low because spending hundreds of hours on a custom graphics engine doesn't make your game stand out unless you are very skilled.
Of course this doesn't mean you shouldn't write your own graphics engine for fun and as a learning opportunity but if your goal is making a finished game then you should avoid falling into this trap.
That was one of the reasons (among others) that I never went beyond demoscene-like stuff: I always went too deep into engine work and never produced anything really interesting as a game.
Anyone who wants to make games should use a pre-made engine. If the game is at all interesting, people will play it, even if the technology sucks.
What engines are you suggesting make it easier to create games that compete? My experience is that UE4 and Unity both enable indie developers to make very high quality games; the only real limitation is how much effort you put into the art. UE4, while hard to code for, is still orders of magnitude less work than coding all the rendering, animation, and hardware logic from scratch. There are of course other engines, but they are either devoid of the features you need to compete with AAA titles, or have severe performance limitations.
Someone else already mentioned Godot, which I think is fantastic and a lot of fun for making 2d games. I have not tried to use it for 3d, I understand its new renderer is making it more competitive with more advanced engines.
edit: btw, I don't really agree with anything GP said. I just wanted to plug these two nice open source alternatives. I personally (as a casual hobbyist) did not really mesh with Unity or Unreal, for different reasons but I definitely understand what they offer to both beginners and serious businesses.
For example, there are a few indie games that managed to get a lot of sales because they innovated in terms of technology or ideas, like Factorio or Minecraft. Yet those games are pretty ugly; they're not "HD". Meanwhile there are too many indie games that are HD with high-poly graphics.
Indie devs don't have money, so they obviously cannot compete on art and content. They have to make games nobody else is making, and not hesitate to use pixel art or low-poly content so they can concentrate on the gameplay. They cannot compete with big studios on content; it's too expensive in time.
This is the most important thing people forget about games: the content doesn't matter. 3D artists don't matter. Games are not CG movies. Unless you're Kojima or Final Fantasy, nobody cares about cinematics. Game developers must focus on the gameplay and stop advertising fancy graphics. This has been a never-ending problem of video games since the 3D era: too much focus on graphics, and no effort on gameplay, game balance, game theory, mechanics, reward systems, difficulty, game lifespan, etc.
> What engines are you suggesting
If you're not aiming for bleeding-edge graphics, you can use a graphics engine, or make your own. For physics and other stuff, there are plenty of libraries. I'm just saying Unreal and Unity are not engines; they're frameworks/platforms. They impose too many constraints, and you can't do everything you want with them. Not to mention the IP or business side.
Also interested in this question. I recently started on my first game project after 4 years of being a professional C# dev and chose unreal because it seemed to be the quickest path to success for a small multiplayer arena battle game. Seems like coding my own networking layer, renderer, animation system, etc. would take way longer.
The biggest issue for me is the fact that they're rewriting the graphics stack. The churn is bad enough, but I also just don't like "fixed"/"simplified"/"helpful" tools that hide the underlying platform. Unity's shader language is extremely ugly, but at least I can use raw GLSL if I have to. I've had to use custom pragmas to get certain acceleration features working on Samsung hardware, which doesn't seem possible in Godot. Hopefully the Vulkan updates will bring more flexibility.
Hi, I'm not very proficient in graphics programming, but the Godot docs say this:
> Godot uses a shading language similar to GLSL ES 3.0. Most datatypes and functions are supported, and the few remaining ones will likely be added over time.
> Unlike the shader language in Godot 2.x, this implementation is much closer to the original.
Godot is relatively new and definitely "not there yet", but at least with its open nature you can do `git clone godot-doc.git` and no top manager can take it away from you.
I think you're discounting the value of the tools Unity and UE offer to experienced developers/teams, and ignoring that outside a few exceptions, Indies haven't really been able to compete with AAA Devs since the PS2 era, when games more or less got programmed from scratch, or were adapted from the brittle code of previous titles.
Nearly every major studio or publisher has a similar toolset they've either built from scratch (RED Engine, Frostbite, Anvil, Decima, Id Tech, etc.) or license (like Unreal) and built on top of. Years of testing, R&D, and workflow refinement goes into making these toolchains extensible and useful for teams of all skill levels and functions, as well as to make them scale well to the needs of different titles.
The trade-off with these tools is that their tremendous breadth can make working with them on complex projects its own knowledge domain for smaller teams, even as it abstracts away many of the complexities that come with developing your own engine.
If you're an engineering-oriented developer who has the luxury of developing for a very restricted set of platforms and the time to debug your own tooling, with narrow, well-defined graphical requirements, a clear vision, and a technically inclined art team, then using a framework like Ogre makes perfect sense. Lightweight frameworks are a joy to work with, and you only have to add what you need on top of them to get the job done.
But iteration is slower, and you may spend months getting your tooling where it needs to be if you're going to work with a team.
Good luck onboarding new artists and game designers though. First you have to worry about training. After that, compatibility. Artists tend to have a workflow that works best for them, and even using open file formats, and established toolchains, they've got a gift for finding edge cases in your system. Your map editing toolchain also has to work for both the artists, and the designers.
Conversely, a mature engine like UE, or Unity has a wealth of crowdsourced documentation, and it's almost impossible to trip over an issue that someone else hasn't already triaged before you. New team members are almost guaranteed to know how to fulfill their responsibilities within the constraints of the engine's toolset, so they can get to iterating on prototypes much faster.
They're also typically extensible enough that the engineering guy(s) can put whatever efforts they would have contributed to designing a rendering engine, tools, and debugging platform issues into adding features unique to their title.
The feature set on these behemoths may be overwhelming, but it's more or less on par with what the 'pros' are using, so just by adopting one you're virtually eliminating your technical capability gap with them. There is still a gap: with respect to tooling, indies simply lack access to and experience with parametric modelling tools like Houdini, which greatly increase the efficiency of content generation.
The rest of that gap comes down to experience and manpower. Experience can be fixed, but few indies are able to throw the number of bodies at a project that someone like EA or Ubisoft can.
Engines allow anyone to make AAA level experiences with AAA levels of graphical fidelity now.
The output gap has become about art and content, something no indie can effectively compete with in terms of volume.
I agree that developing on large engines can cause you to hit a wall, and the engine essentially becomes the developer's world. But I think the overall proportion of people who go further is the same, even if the proportion of people cluelessly noodling with the low-cost space shuttle they've been given and putting out garbage increases.
People incapable of competing have been allowed to join the market. But the democratization of engines has also given those with the potential to be great a much lower barrier to entry onto the development scene.
Excellent post, but in the context of this discussion, I disagree on one point:
> a mature engine like UE, or Unity has a wealth of crowdsourced documentation, and it's almost impossible to trip over an issue that someone else hasn't already triaged before you
C++ in UE4 is a nightmare to work with because you have to scour forums for a day and a half to find somebody who might have mentioned the name of the function you need. Great engine aside from that, but it is a pretty big deal. With Unity, however, everything is on the first page of search results. Incredibly valuable!
Thanks, and quite right. An oversight on my part. Admittedly, programming for Unreal's well outside my realm of experience. I tried it some time ago, but mostly played with it as an art tool. Its coding conventions seemed more opinionated relative to Godot and Unity's, which turned me off, because I was only really looking for a fun toy in a domain outside my experience.
> Engines allow anyone to make AAA level experiences with AAA levels of graphical fidelity now.
They allow it, but those indie devs don't seem to compete with AAA games. What's the catch then? I think that it's performance, game design, experience, etc. It's pointless to compete with big game studios on the same types of games.
> AAA level experiences
Sorry but what exactly is this? That's not what makes indie games interesting. And I don't think indie game devs will really achieve those "AAA" things.
Maybe I am wrong, but I think Unreal Engine will let you do anything, and you can actually compete with the big studios performance- and feature-wise.
The only reason someone can't actually compete, is man-hours available. Some things are just very time-consuming to implement.
But this last point applies to any engine, even the ones the big studios use.
About open source, just check Godot. And Godot is a competitor to Unity, in the 'learn this engine and you will have to keep using it' market. Which I think is fine for Indies.
As a noob, I've found learning C# in Unity pretty comprehensive. I can also look at how real code works by reading through what others have done. And the YouTube tutorial selection is huge. At the end of the day, I can't wait to know enough to jump ship to Godot.
Unity is in a comfortable position, with newer lines of business in VR, architecture, and animation, along with its strong position on the long tail of desktop, console, and mobile gaming. They have the marketplace to beat and a userbase beyond comparison. I believe this enables Google-esque behavior, and it's disconcerting at best.
My son was really into Unity development for a while, but he got discouraged when they deprecated their entire networking stack without providing a suitable replacement (since August 2018) and are even removing support from old LTS releases.
For a multi-billion dollar company to suddenly take down a wiki that hundreds of man-months went into creating, that is visited millions of times each year, with no warning or archive- that is open user hostility. They can certainly afford to keep it around in read-only mode as a static site. An intern could run wget and have a mirror up in a few days tops. If there is unmoderated content they are worried about, they can afford to clean it up. This is wrong.
> they deprecated their entire networking stack without providing a suitable replacement
It was their second networking stack already, and both had been riddled with problems. The last stack is still open-sourced on Bitbucket, and you can choose it as a starting point for your own networking stack.
Some promises, like what Unity networking was trying to achieve (hassle-free real-time game multiplayer without dedicated servers), are just not achievable, and your customers are better off if you admit it. It's much worse when you buy into the marketing hype and start discovering structural problems that require a total rewrite close to release.
I've worked with Unity since 2009, and in 2017/2018 I implemented a custom multiplayer solution for an open-world RPG without a dedicated server, which was originally written on that exact stack. Never had a worse burnout in my life.
I was mixed up about who owned the wiki, sorry folks. Unity networking is still broken. My son did try to move on to Unreal, but he never made anything of substance with it. He got interested in developing with 6502's and got out of 3D games for now.
It would've been trivial to crawl the wiki and dump the resulting WARC files into the Wayback Machine to provide a permanent archive. This is just apathetic laziness on their part.
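To put that in concrete terms: wget can emit WARC as it crawls, so capturing the whole thing would have been an extra flag or two. A sketch, with the wiki URL written from memory:

```
# Crawl the wiki, recording every request/response into Wayback-compatible
# WARC files (plus a CDX index) alongside a browsable local mirror.
wget --mirror --page-requisites \
     --warc-file=ue4-wiki \
     --warc-cdx \
     https://wiki.unrealengine.com/
```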
I used the wiki extensively in my last UE4 project. It had its warts, but it also had valuable information that did not exist anywhere else. Taking this down without a torrent mirror or a grace period is phenomenally harmful to the community. Bad move!
The lack of a static copy of the wiki really sucks but it's understandable that a mediawiki install would be pulled indefinitely. Mediawiki racks up a dozen CVEs in an average year and even a single one of those is an opportunity to perform watering hole attacks on every UE licensee. Getting RCE on a single UE customer's machine is an opportunity for million+-dollar industrial espionage - it's not uncommon for someone to get a copy of a game's source code and try to extort the developer for cash. We generally only know about the cases where the extortion fails...
It's possible that really aggressive security measures could mostly prevent that, but even if you were to patch weekly, that won't stop someone from pairing an undisclosed MediaWiki attack with some other attack that isn't well-known. A game studio's machines are probably running LTS versions of Firefox or Chrome with a slower update cadence, which potentially means multiple days of vulnerability even after an exploit is patched.
Also, now that Epic processes credit card payments (Epic Store, etc) it's possible the mediawiki install would prevent them from passing PCI-DSS audits.
Microsoft did a similar thing with the ASP.NET site. There were quite a lot of old articles on ASP.NET WebForms that were really good references; if you were working with someone who was new to WebForms, you could just point them to a particular article and say "read through this, it covers almost everything about how to do this".
Bonehead move. Leave it up as read-only and mark which pages are out of date so users can look for up-to-date information elsewhere. I hope they come to their senses and re-upload a read-only archive of the documentation.
Not TWiki; it looks like MediaWiki to me. I wonder what attracts people to MediaWiki anyway. Horrible and insecure code all over, easy to break into, and only maintainable with massive manual administration costs. And hundreds of Wikipedia editors.
By contrast, I once maintained PhpWiki, which never had any security problems, and all the instances known to me still work fine after decades. Not much need for massive manual intervention, lots of admin plugins, and XSS attacks are impossible. I ran backup jobs for the DB (Berkeley DB was 30x faster than MySQL) and as an HTML archive. So even if you have to take it down for PHP maintenance, you can trivially replace it with a read-only dump without any action buttons and without any PHP.
"security risk" - So they're just trying to hand-wave a bluff at a huge community of developers? No one believes this. edit: also I feel bad for this community manager having to lie and apologize
I am the co-maintainer of an open-source project called AwesomeWM. I took down our wiki years ago due to:
* Constant vandalism
* Dubious user-created content rendering computers non-functional
* Trolling edits meant to break things for people copy/pasting shell commands
* SPAM
* Maintaining the wiki
Before that, we forced users to log in for edits, then forced moderator approval for everything, then forced moderator approval for new accounts. Then we gave up and retired the wiki.
So no, wikis are not free content. They are a pain, especially when your community tends to have many trolls/hostile individuals, like the gaming community. It's not "downright lies" all the time.
Completely off-topic, but thank you for your work on AwesomeWM, together with all other contributors! It is a fantastic piece of software and I always use it on all my Linux installs.
About maintaining wikis: that is indeed a problem. In addition, most wiki software I've used has extremely clunky administrative tools, which make moderation far more challenging than it needs to be.
I used to maintain a tiny private wiki for a previous job, and even in a very small operation (10s of users), it was a disproportionately large maintenance burden.
I agree with your points, and there are a lot of constraints for running a wiki. Especially on a volunteer basis.
Indeed it is not free content; the volunteers who edited the UE4 wiki must be pretty disappointed. But Epic isn't broke, and the vague reasoning they offer is insufficient to me and many others.
p.s. Coincidentally I am a daily user of AwesomeWM. Thanks for your efforts!
"We hear your concerns and frustrations. And we missed an opportunity to give you notice prior to the Wiki page coming down, and for that we apologize.
So why can’t we put a read-only archive online currently? The Wiki, even in its read-only state, was presenting security risks, and it was deemed necessary to take it offline.
We still have the data, and as mentioned above, we will work to migrate the top content into various official resources. In the meantime, we’re investigating how we can make this content available, even in a rudimentary way.
Some individuals from the community have already reached out to explore a community-hosted Wiki. We are following up with these folks and are evaluating next steps to see what may be possible here as well. "
Well, you always learn new ways to express incompetence. They do know that you can render wikis into static HTML pages, right?
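MediaWiki even ships a maintenance script for dumping content, and a plain crawl of the rendered pages works too. A minimal sketch, assuming shell access to the install (URL hypothetical):

```
# Export the current revision of every page as XML, using the maintenance
# script that ships with MediaWiki (run from the wiki's install root).
php maintenance/dumpBackup.php --current > wiki-dump.xml

# Or, with no shell access at all: snapshot the rendered HTML, skipping
# edit/history actions so only the article pages are captured.
wget --mirror --convert-links --page-requisites \
     --reject-regex 'action=(edit|history)' \
     https://wiki.example.com/
```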