
That wouldn't work sufficiently well; unless the benchmark is quite complex, it would almost always be better to implement a custom in-memory datastore than to use a traditional backend. Even if persistence is required, the intentionally small dataset means you can get away with raw memory dumps or even an mmap variant. And you wouldn't use a full-fledged JSON serializer, just limited string concatenation.
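For the trivial case, "limited string concatenation" could be as little as the sketch below (Java, with hypothetical field names; it skips escaping entirely, which is exactly the kind of corner a benchmark-special implementation gets to cut):

    // Minimal sketch: emit a fixed-shape JSON payload by string concatenation
    // instead of running a general-purpose serializer over an object graph.
    // Only safe if the message is known not to contain quotes or backslashes.
    public final class HandRolledJson {
        static String render(int id, String message) {
            return new StringBuilder(32 + message.length())
                    .append("{\"id\":").append(id)
                    .append(",\"message\":\"").append(message)
                    .append("\"}")
                    .toString();
        }

        public static void main(String[] args) {
            System.out.println(render(42, "Hello, World!")); // {"id":42,"message":"Hello, World!"}
        }
    }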

In any case, the way I read it, the requirement doesn't mean you have to make anything particularly heavy, just that you need to use a representation that's general enough to be usable in other scenarios without rewriting the entire app.

Of course, a little bit of extra encouragement in the form of a benchmark that's slightly harder to game wouldn't be bad.



Custom datastores (custom at the framework level, not necessarily at the application level) have been very useful for me in the past for getting high performance.

For example, I wrote a little in-memory DB for use in a .NET framework to avoid having to deserialize an object graph per request. Instead, it navigated a byte array and plucked out just the data needed.
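A rough sketch of that idea (in Java rather than .NET, and with a made-up record layout): keep the records as one flat byte array and read individual fields at known offsets, so nothing gets deserialized until it's actually needed.

    import java.nio.ByteBuffer;
    import java.nio.charset.StandardCharsets;

    // Illustrative only: records stored back to back in one byte array,
    // each laid out as a 4-byte big-endian id, a 2-byte name length, and
    // the UTF-8 name bytes. Fields are plucked out on demand.
    public final class FlatRecordStore {
        private final byte[] data;    // all records, laid out back to back
        private final int[] offsets;  // start offset of each record

        FlatRecordStore(byte[] data, int[] offsets) {
            this.data = data;
            this.offsets = offsets;
        }

        int idOf(int record) {
            return ByteBuffer.wrap(data, offsets[record], 4).getInt();
        }

        String nameOf(int record) {
            int len = ByteBuffer.wrap(data, offsets[record] + 4, 2).getShort() & 0xFFFF;
            return new String(data, offsets[record] + 6, len, StandardCharsets.UTF_8);
        }
    }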

Another time, I wrote a compact in-memory representation of a trie, using bit-packing and a few other tricks, to get an order of magnitude reduction in memory usage, making it possible to cache a lot more of a data set.
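Something along these lines (a minimal Java sketch; the layout and field widths are illustrative, not the original implementation): each node becomes a single long in a flat array instead of a heap object with child pointers.

    // Bit-packed trie node layout (illustrative):
    //   bits 0-15: char, bit 16: end-of-word flag,
    //   bits 17-39: first-child index, bits 40-62: next-sibling index (0 = none)
    public final class PackedTrie {
        private final long[] nodes;
        private final int rootIndex;

        PackedTrie(long[] nodes, int rootIndex) {
            this.nodes = nodes;
            this.rootIndex = rootIndex;
        }

        static long pack(char c, boolean isWord, int firstChild, int nextSibling) {
            return (long) c
                    | ((isWord ? 1L : 0L) << 16)
                    | ((long) firstChild << 17)
                    | ((long) nextSibling << 40);
        }

        private char charOf(int i)      { return (char) (nodes[i] & 0xFFFF); }
        private boolean isWordAt(int i) { return ((nodes[i] >>> 16) & 1) != 0; }
        private int childOf(int i)      { return (int) ((nodes[i] >>> 17) & 0x7FFFFF); }
        private int siblingOf(int i)    { return (int) ((nodes[i] >>> 40) & 0x7FFFFF); }

        boolean contains(String word) {
            int node = rootIndex;                        // first node on the current level
            for (int i = 0; i < word.length(); i++) {
                char c = word.charAt(i);
                while (node != 0 && charOf(node) != c) node = siblingOf(node);
                if (node == 0) return false;             // no node with this character
                if (i == word.length() - 1) return isWordAt(node);
                node = childOf(node);                    // descend one level
            }
            return false;
        }

        public static void main(String[] args) {
            // Hand-built trie for {"to", "tea", "ten"}; index 0 is a "none" sentinel.
            long[] nodes = {
                    0L,
                    pack('t', false, 2, 0),
                    pack('o', true,  0, 3),
                    pack('e', false, 4, 0),
                    pack('a', true,  0, 5),
                    pack('n', true,  0, 0),
            };
            PackedTrie trie = new PackedTrie(nodes, 1);
            System.out.println(trie.contains("tea")); // true
            System.out.println(trie.contains("te"));  // false
            System.out.println(trie.contains("toe")); // false
        }
    }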

Why go to the database and instantiate a classic object if you can avoid it?



