I don't think aggregators obviate the need for RSS. I don't really like the idea of the masses entirely determining what content I'm aware of. Furthermore, aggregators simply don't offer the sheer volume of content that an RSS reader with a good couple hundred subscriptions delivers. And if you're into niche content like, say, arXiv's math.OA, you pretty much need RSS.
How much time does it take you to sort through it all? That's the tradeoff you make with aggregators vs. RSS. Personally, I'm also subscribed to a few hundred feeds on Google Reader, and I usually go through them by first deleting everything older than a day, then sorting by "magic". But that approach has major limitations: I lose occasional posts and older but still interesting content, and "magic" just skews the sort towards what I've read before (maybe by keywords?), so I miss out on new and interesting content.
RSS - you have high volume and high breadth, but it takes a long time to look through it and find the things that are interesting.
Aggregators - the "most interesting" posts are right there on the front page, but the focus is very narrow and the selection is filtered or skewed by the community.
I think what we're really looking for is the most interesting content that is customized for us, but it's a big problem to tackle. Newsblur (http://news.ycombinator.com/item?id=1869136) was an interesting idea; not sure where that's gone since November's post.
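To make the "skews towards what I've read before" problem concrete, here's a toy sketch of a keyword-overlap ranker. This is purely hypothetical illustration, not how Reader's "magic" sort or Newsblur actually work: items resembling past reading float to the top, which is exactly why genuinely new topics sink.

    # Toy sketch: rank unread feed items by keyword overlap with titles
    # already read. Hypothetical illustration only, not Google Reader's
    # or Newsblur's actual ranking.
    from collections import Counter
    import re

    def keywords(title):
        """Lowercased words of 4+ letters, as a crude keyword extractor."""
        return set(re.findall(r"[a-z]{4,}", title.lower()))

    def rank_unread(unread_titles, read_titles):
        """Sort unread titles by how many familiar keywords they contain."""
        familiar = Counter()
        for title in read_titles:
            familiar.update(keywords(title))
        def score(title):
            return sum(familiar[w] for w in keywords(title))
        return sorted(unread_titles, key=score, reverse=True)

    if __name__ == "__main__":
        read = ["Operator algebras and free probability",
                "New results on von Neumann algebras"]
        unread = ["A survey of free probability",
                  "Show HN: My weekend project",
                  "Classification of von Neumann factors"]
        for title in rank_unread(unread, read):
            print(title)

Run it and the familiar-looking math posts come out on top while the novel item drops to the bottom, which is the limitation described above.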
And where do the aggregators get the news items they aggregate?
They often get their news from blogs, news websites, word of mouth, and web searches. RSS is very useful for keeping up with some of these sources.
Sometimes aggregators do get their news from other aggregators, but at some point that incestuous chain has to be broken and someone has to read the original news somewhere and pass it on to an aggregator.
Finally, most aggregators (including HN) themselves publish their news via RSS (or Atom). That's how I read HN headlines, and virtually every other news source I regularly follow.
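For instance, HN's own feed lives at https://news.ycombinator.com/rss. A minimal sketch of pulling headlines from it, assuming the third-party feedparser library is installed (pip install feedparser):

    # Minimal sketch: read HN headlines over RSS with feedparser.
    import feedparser

    feed = feedparser.parse("https://news.ycombinator.com/rss")
    for entry in feed.entries[:10]:
        print(entry.title)
        print("  " + entry.link)

Any feed reader does essentially the same thing on a schedule across all your subscriptions.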