If Wikipedia just accumulated everything it could, we'd end up with a billion articles 90% of which are vandalized, outdated, mistaken crap. The potential to misinform would be huge. They can only successfully scale the corpus at the rate at which they gain volunteer editors (who are far rarer than people who just write one article about their favorite obscure thing, and then leave).
It is frankly easier to get a fake article past the editors than a real one. The editors generally do not check articles for accuracy or quality (beyond what a high-schooler with a dictionary could do) — they just fastidiously apply a checklist of rules.
Also, I think it would be easier to recruit editors if being an editor were more about making the site good and less about dealing with the head cases who currently edit the site. (Nothing personal intended against any individual editors who may be reading — it's just a general reflection of how things look to people who have casually tried editing.)
This still sounds like a technical limitation. A page that gets ten hits a year should not demand as much editorial oversight as one that gets a million. Who cares if a long tail article gets vandalized if no one sees it?
It sounds like a missing UI. Something like the ability to show recent changes sorted by the pages' popularity. Too simple?
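To make the idea concrete, here is a minimal sketch of that triage queue. All of the data and field names here are made up for illustration — this is not Wikipedia's actual API, just a demonstration that sorting recent changes by page popularity is a few lines of code:

```python
# Hypothetical sketch: triage a recent-changes review queue by page popularity.
# The change entries and view counts below are invented examples.

recent_changes = [
    {"page": "Obscure 1970s synthesizer", "editor": "anon"},
    {"page": "Python (programming language)", "editor": "anon"},
    {"page": "List of minor rivers in Ohio", "editor": "anon"},
]

# Annual page views (made-up numbers).
page_views = {
    "Obscure 1970s synthesizer": 10,
    "Python (programming language)": 1_000_000,
    "List of minor rivers in Ohio": 250,
}

# Review the most-viewed pages first: vandalism there misinforms
# the most readers, while a ten-hits-a-year page can wait.
triaged = sorted(
    recent_changes,
    key=lambda change: page_views.get(change["page"], 0),
    reverse=True,
)

for change in triaged:
    print(change["page"], page_views.get(change["page"], 0))
```

With real view-count data, editors could work the queue from the top and let the long tail sit unreviewed at essentially no cost to readers.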
If something is vandalized or mistaken, delete it; if it is outdated, there should be some kind of long-term archiving policy. But I fail to understand the reasoning behind deleting articles that are factually correct, even if rather obscure. What harm do they do?
Either all articles are high quality and there's no room for esoteric knowledge, or the entire system will collapse under the weight of misinformation? That's an obvious false dichotomy.