In-browser RAW Processing: How We Did It (pics.io)
74 points by Come-rad on May 14, 2014 | 30 comments


If I were to need RAW processing in the browser I'd probably have compiled libraw using emscripten.

It's pretty easy ( http://blog.bitops.com/blog/2013/06/04/webraw-asmjs/ ) and provides a proven set of demosaicing algorithms that work well, in addition to support for a lot more cameras (Sony springs to mind).
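
For a flavour of what driving such a build looks like from page JS, here's a rough sketch (the exact exported names depend on the emscripten version and build flags; I'm assuming the virtual FS and callMain are exported, and `buffer` is an ArrayBuffer you've already read from a file input):

  // Sketch only: driving an emscripten-compiled dcraw from page JS.
  // Assumes the build exposes the virtual filesystem (FS) and callMain;
  // exact exports vary between emscripten versions and build options.
  var raw = new Uint8Array(buffer);        // RAW bytes, e.g. from a FileReader
  FS.writeFile('/photo.cr2', raw);         // drop the file into MEMFS
  Module.callMain(['-T', '/photo.cr2']);   // dcraw -T writes a TIFF
  var tiff = FS.readFile('/photo.tiff');   // Uint8Array with the result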

At any rate this looks like it was a quite fun exercise, and the blog post is well written.


We also played with dcraw running on asm.js. It's essentially a black box with no control over the process. Our current implementation is faster because it uses platform-specific optimisations. I personally spent a lot of time optimising it to run well in JavaScript.


It would be interesting to know how big the libraw asm.js file is. One downside is usually that it gets quite big.


In the demo blog post I linked, it looks like libraw is around 3MB, but the blogger who wrote that post didn't pay much attention to eliminating cruft, and the script is additionally not minified. Minified and compressed it's under 450k.

http://dev.tag.is/rawson.js/ seems to be an emscripten-to-JS compiled dcraw.c, and while less functional than libraw it's under 500kb.


Just tried http://dev.tag.is/rawson.js/ and it doesn't work (it just says "Reading file", the green bar is full, but nothing's happening).

The OP site is quite slow but does work: it produces a JPEG in the end (using Canon 6D original CR2 files).

"LightRoom in the browser" would be the greatest thing ever.


Which Lightroom features are the most meaningful for you? What do you think?


Any idea why they didn't go with an emscripten compile of libraw or rawspeed? These formats are well understood by now. libraw (dcraw based) can convert practically any format in existence.


Hi, I'm part of the team. In that case we would get a "black box". We need full control over the process.


Come-rad is right: emscriptened code is not very maintainable, so we can't rely on it. More than that, we use a lot of JS-specific performance optimisations that allow us to run faster than any of those autogenerated libs.
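
To give a flavour of what "JS-specific optimisations" can mean here (an illustration of the general pattern, not our actual code): flat typed arrays, preallocated output buffers, and keeping indices integer-shaped for the JIT.

  // Illustrative only, not the pics.io code: the kind of JS-friendly
  // pattern that tends to run fast in current engines.
  var w = 4000, h = 3000;
  var bayer = new Uint16Array(w * h);            // one flat plane of sensor data
  var rgba  = new Uint8ClampedArray(w * h * 4);  // preallocated, reused output
  for (var y = 1; y < h - 1; y++) {
    var row = (y * w) | 0;                       // |0 keeps the index an int32
    for (var x = 1; x < w - 1; x++) {
      var v = bayer[row + x];                    // contiguous typed-array reads
      // ... per-pixel work writes into `rgba` without allocating objects ...
    }
  }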


Interesting. If you're worried about correctness, I'd just do integration testing: get a bunch of raw files from different manufacturers and test them with rawspeed on the desktop and in the browser, comparing output. If it's performance, though, it seems strange that something like rawspeed wouldn't optimize well when 3D engines are being ported to asm.js.
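
Concretely, the test could be as dumb as rendering each sample file with the reference decoder and with the browser pipeline, then diffing the pixel buffers (the helper below is made up for illustration):

  // Hypothetical helper for the suggested integration test:
  // compare two 8-bit RGB buffers and report the worst per-channel error.
  function maxChannelDiff(reference, produced) {
    var worst = 0;
    for (var i = 0; i < reference.length; i++) {
      var d = Math.abs(reference[i] - produced[i]);
      if (d > worst) worst = d;
    }
    return worst;
  }
  // e.g. per test file: assert(maxChannelDiff(refPixels, jsPixels) <= 2);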

The alternative seems a little crazy to me. Reimplementing a full raw library in JS is a huge task. I've just spent a total of 6-7 hours getting the basics of MRW support into rawspeed, and those formats are just insane; they change continuously between models from the same manufacturer. And even within a single model you'll get crazy variations depending on camera settings.

Best of luck.


AFAIK those 3D engines were specially prepared to be ported to JS. Our needs are a bit different from moving 3D objects, building scenes and blending textures. Sure, it's very similar, but our "textures" are bigger, and we need to look at more than one pixel at a time to do the demosaic correctly. Creating a cool WebGL 3D effect is not the same as doing, say, JPG decompression.


If the emscriptened code does the right thing, then its output is object code, no more in need of by-hand maintenance than the object code from a compiler, right?

Anyway, I'd be interested to see how much faster you are (at comparable image quality) than one of the suggested libraries. If you are in fact a lot faster, I'd be curious to know what it is that you're doing that you think makes it run significantly faster in current JavaScript engines than translated code.



Would be interesting to hear what their demosaic algorithms look like. Being gflops-limited means a lot of the image signal processing algorithms need to be dramatically simplified in some regards, but having large memory buffers to work in gives them plenty of room to implement more complex algorithms in other ways.

The libraw / dcraw-based demosaic algorithms are, I am told, very good; it'd be interesting to compare the image quality that they get (if they reinvented the wheel) vs. libraw, vs. the camera's on-board ISP.

Either way, having an in-browser RAW experimentation platform could be a very interesting tool; even if for nothing else, using it for education could be very neat. Good work!


Before starting, we investigated a ton of theory and a lot of libraries. Rawspeed is my personal favourite and we compared our results against it. Right now raw.pics.io implements a simple bicubic interpolation algorithm, which is a good speed/quality trade-off. Sure, we're thinking about AHD, gradient-based and other algorithms, but we have to deal with performance first.
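
For readers who haven't looked at demosaicing, a deliberately simplified illustration (bilinear, not the bicubic we run, and not our actual code) of why neighbouring pixels matter: on an RGGB Bayer mosaic, the green value at a red-filtered site has to be interpolated from its four green neighbours.

  // Simplified illustration only (bilinear, not the bicubic used on the site):
  // green at a red-filtered site, averaged from the four green neighbours.
  // `bayer` is a flat Uint16Array of width `w`; bounds checks omitted.
  function greenAtRedSite(bayer, w, x, y) {
    return (bayer[(y - 1) * w + x] + bayer[(y + 1) * w + x] +
            bayer[y * w + (x - 1)] + bayer[y * w + (x + 1)]) / 4;
  }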


Any plan to support Sony ARW?

Am I understanding this right: the conversion is done in the browser, so rather than using something like dcraw, which is quick, featureful and already exists, you've reimplemented the whole thing in JS? (I do get that it's a trade-off concerning what you want to enable.) Do you see this as viable for use on tablets and such? Today's tablets, or are you targeting the tablets of, say, 2015+?


Sony holds a good share of the market, so yes, we'll take it on. But for now our priority is speeding things up. We just started optimising and have already reduced processing time by 50%, but that isn't enough, obviously.

You are correct about our approach (I'm referring to dcraw here).

>Do you see this as viable for use on tablets and such?

It's a question of hardware; most of what we build can run in browsers on tablets (in theory). We actually don't see much benefit in editing on a tablet, but an iPad can be a good device for photo management and sharing.


Hm, if I can't edit wherever I happen to be (I need my laptop/workstation) -- what's the benefit of editing in the browser vs. just using an application and some form of cloud sync? (Actually, since editing is local in the browser, there's no "sync" needed for feature parity: one could just "automagically" download on demand and/or upload on save, or use davfs...?)


>just using an application and some form of cloud sync

Setups like this usually suck. These workflows are hard to maintain, especially for photographers who aren't very tech-savvy.

>what's the benefit of editing in the browser

You mean from a user perspective? Then: no installation, freedom to choose your platform, collaborative editing, etc. Looking at the bigger picture, a web version of RAW processing software can easily be integrated into any web service: Google+, Dropbox...


>just using an application and some form of cloud sync

>> Setups like this usually suck. These workflows are hard to maintain, especially by not so tech-clever photographers.

Maybe. Then again, I don't really see the need for syncing going away in the near future -- it's still hard to "just upload" the results of a single photo session/trip/whatever. Even if you're great at deleting obviously bad shots, it's pretty hard to keep the number of photos below the low hundreds (i.e. 2+ GB of raws).

I suppose some will prefer to sync up once, then have "the cloud" maintain the working set of images (as bad ones are deleted and cropped/edited/black-and-white copies are created). Does sound like a lot of data going up and down, though.

>> what's the benefit of editing in the browser

> You mean from a user perspective?

> Then "no installation", freedom choosing platform,

Fair enough -- I see how this can be a benefit -- it also highlights how much more comfortable my life has become after I gave up on dualbooting and just stuck with Debian as my main desktop/workstation setup. But that might not be for everyone.

>> collaborative editing, etc.

Do you plan on supporting some form of non-destructive editing, where changes can be easily propagated?

>> If we talk in a greater perspective... Web version of software for RAW processing can be easily integrated into any web service: Google+, Dropbox...

I'd like to see that :-) I still think there's a bandwidth problem though -- I have trouble managing my RAWs locally, and I can't imagine it working (yet) with the actual image data living only in the cloud -- and I'm not sure caching gigabytes of data on the client is an acceptable solution (mostly because of poor UIs for controlling the cache, purging parts of it, etc.).

I'd be happy to be proved wrong, however :-)


  raw.pics.io doesn't support your browser
IE 11 supports WebGL (and is a pretty decent HTML5 browser)


The problem is not only with WebGL. Unfortunately we don't support IE 11 yet, but we're working hard on it.
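
For what it's worth, the usual way a page decides whether WebGL is available looks roughly like this (a generic check, not necessarily the exact gating logic on raw.pics.io):

  // Generic WebGL availability check; raw.pics.io's actual checks may differ.
  function hasWebGL() {
    try {
      var canvas = document.createElement('canvas');
      return !!(canvas.getContext('webgl') ||
                canvas.getContext('experimental-webgl'));
    } catch (e) {
      return false;
    }
  }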


Hey, hackers. Here's the first picture from the post in higher resolution – http://blog.pics.io/wp-content/uploads/2014/05/8bd2a4633b08b...

In case some of you want to put it as a desktop wallpaper :)


Images like this come from a bug in our JPG decompression algorithm. I liked this one and used it as the post's title photo. =))


Really interesting that you put so much emphasis on the proprietary formats.

As a Pentax (K-5, K-5II, K-3) and Nokia (Lumia 1020) user, I use nothing but DNG. Sure I could squeeze a few more bytes from a memory card with PEF, but I prefer my records to be in a standardised format.


We love DNG and I personally always recommend it. Unfortunately a lot of manufacturers don't support DNG, and that's why we support CR2/NEF.


Is their site seriously broken or what? I keep getting no images loading, and scrolling to the bottom I get a ton of links to handbags, purses, shoes and fake Oakleys.


Loads for me. Don't see spam links. I can send the post over to you in Pocket. What's your email?


I think this is really cool. Do you support compressed DNG? I used that a lot when moving raw files across the internet; it comes out a lot smaller while still retaining 12-bit raw data.

The next part: processing the debayered picture in the browser. I like ACR a lot; the highlight recovery is the best I've seen... plans?


If you're talking about lossless DNG compression, yes, we support it. If you're talking about lossy DNG compression, no.
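
For anyone curious how the two cases differ on disk: to the best of my knowledge they're distinguished by the TIFF Compression tag (tag 259) in the DNG's IFDs, roughly like this (tag values as I recall them from the TIFF/DNG specs, so treat this as a sketch):

  // Sketch: Compression tag (259) values relevant to DNG, to the best of
  // my knowledge (1 = uncompressed, 7 = lossless JPEG, 34892 = the lossy
  // JPEG variant introduced in DNG 1.4).
  var DNG_COMPRESSION = {
    1: 'uncompressed',
    7: 'lossless JPEG (the supported "compressed DNG" case)',
    34892: 'lossy JPEG (DNG 1.4, not supported)'
  };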

We've already added Exposure Compensation to http://raw.pics.io, and we're thinking about upcoming features. Which features would be most useful for you?



