Motion Stills – Create Looping GIFs from Live Photos (googleblog.com)
106 points by tambourine_man on June 8, 2016 | 51 comments


Link to underlying CV research breakthrough:

Auto-Directed Video Stabilization with Robust L1 Optimal Camera Paths

http://www.cc.gatech.edu/cpl/projects/videostabilization/

Stunning amount of optimization required to make virtual cinematography possible in real-time on a mobile device!
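
For anyone curious what "L1 optimal" means here: the paper poses stabilization as minimizing the L1 norm of the virtual camera path's first, second, and third derivatives, subject to the crop window staying inside the original frame, which naturally produces static, constant-velocity, and constant-acceleration segments. Below is a toy 1-D sketch of that idea using cvxpy; it's my own illustration rather than Google's code, and the weights and the 20 px bound are made up.

  # Toy 1-D sketch of L1-optimal camera path smoothing (illustration only).
  # The real method operates on 2-D frame transforms with crop-window
  # inclusion constraints; the weights below are arbitrary.
  import numpy as np
  import cvxpy as cp

  np.random.seed(0)
  shaky = np.cumsum(0.5 + 2.0 * np.random.randn(120))  # slow pan + hand shake

  p = cp.Variable(shaky.shape[0])                           # smoothed path
  obj = cp.Minimize(10 * cp.sum(cp.abs(cp.diff(p, 1)))      # velocity
                    + 1 * cp.sum(cp.abs(cp.diff(p, 2)))     # acceleration
                    + 100 * cp.sum(cp.abs(cp.diff(p, 3))))  # jerk
  # Stand-in for "crop window stays inside the frame": the virtual path
  # may deviate at most 20 px from the recorded one.
  cp.Problem(obj, [cp.abs(p - shaky) <= 20]).solve()

  correction = p.value - shaky  # per-frame offset to warp each frame by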


Wow... Google actually made an iOS-only app?

(I know it's because of fragmentation, and they only have to optimize for a handful of GPUs instead of the very fragmented Android landscape. Still...)


They have released iOS-only apps before (see Gboard), but this one specifically works with the iPhone 6s's Live Photos feature, where the phone records a three-second movie clip starting 1.5 seconds before and ending 1.5 seconds after you press the shutter button. It makes sense that this would be an iOS-only app given that most Android phones don't have a Live Photos feature.


In a way it's kind of a healthy thing. Google research isn't so locked down that they're prevented from releasing something cool that only works on the platforms of direct competitors.


Google Photos will already make looping animated gifs from burst/series photos. I think this specifically does it with the burst/series photos that make up Apple's Live Photos. Essentially, as far as I can tell, it's an iOS-only app because the specific format of Live Photos is iOS-only.

(please correct me if I'm wrong)


It's not due to fragmentation; it works with the iOS Live Photos feature on the 6s and 6s+. Android doesn't have such a feature (at the moment; I'm sure it will soon).


iPhone 6s only


Why should they preach to the converted?


Just the other day I was looking for a good app to turn Live Photos into easily shareable images. I feel a well-taken Live Photo brings so much more life to a moment than a still photo does. My current workflow was GIF Toaster to turn them into GIFs, then uploading the result to Giphy and sharing that. Painful and slow, really. This new Google app is AMAZING for what I wanted to do, plus the stabilization makes the resulting images even better. Goddammit, Google, well done!


GIF Toaster is pretty bad; I made an app called Lively (http://lively.tinywhale.net) to convert Live Photos to GIFs/movies. It doesn't have fancy image stabilization, but you can trim the "live" part or pick a frame out of a Live Photo :). It's free to try.


It won't have the fancy stabilization of Google's new app, but you could build your old method in Workflow.

https://workflow.is/workflows/b960986caaca4ea29cfa3fcfe1aec6...

Just swap in Upload to Imgur at the end where that example shows Quick Look. Or if you want to get fancy you could have it ask what to do after converting.


I thought it was going to create an animation from a still photo, and of course assumed that it would be _smoothly_ looping. It did neither.


I have a released app about which a rare downvoter complained, "I thought it would be a standalone database manager," when the app is nothing of the kind and never claimed to be.

Likewise: right up front, this app clearly addresses Live Photos, which is a specific iOS/iPhone 6s feature. It does not claim to do what you lament it doesn't do.


Yes, so did I, and since it's from Google Research I was expecting it to use some obscure deep learning algorithm that had been known for years but that nobody had taken the time to optimize for GPUs.

Still, it was an interesting read, with few technical details but just enough. Lately I'd been looking into stabilization with OpenCV, and this seems like a fun "let's redo it" project.
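
If anyone else wants a starting point, the basic OpenCV recipe is: track features frame to frame, accumulate the estimated motion into a trajectory, smooth that trajectory, and warp each frame by the difference. A rough sketch of that pipeline is below; it's my own simplification with a plain moving average (nothing like the L1 optimization Google describes), and "input.mp4" is just a placeholder.

  import cv2
  import numpy as np

  cap = cv2.VideoCapture("input.mp4")
  ok, prev = cap.read()
  prev_gray = cv2.cvtColor(prev, cv2.COLOR_BGR2GRAY)

  motions = []  # per-frame (dx, dy, dtheta)
  while True:
      ok, frame = cap.read()
      if not ok:
          break
      gray = cv2.cvtColor(frame, cv2.COLOR_BGR2GRAY)
      # Track corners from the previous frame into the current one.
      pts = cv2.goodFeaturesToTrack(prev_gray, maxCorners=200,
                                    qualityLevel=0.01, minDistance=30)
      nxt, status, _ = cv2.calcOpticalFlowPyrLK(prev_gray, gray, pts, None)
      m, _ = cv2.estimateAffinePartial2D(pts[status == 1], nxt[status == 1])
      motions.append([m[0, 2], m[1, 2], np.arctan2(m[1, 0], m[0, 0])])
      prev_gray = gray

  # Accumulate into a trajectory, smooth it, and keep the per-frame
  # corrections you'd feed to cv2.warpAffine when re-rendering each frame.
  trajectory = np.cumsum(motions, axis=0)
  kernel = np.ones(15) / 15
  smoothed = np.column_stack([np.convolve(trajectory[:, i], kernel, mode="same")
                              for i in range(3)])
  corrections = np.asarray(motions) + (smoothed - trajectory)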


Does anyone know why Google hasn't brought Live Photos to Android yet? And why, when I upload a Live Photo from my iPhone to Google Photos, and then visit that photo in a web browser, there's no way to see the animation?


If I had to guess..?

Live Photos requires good UI design to make it useful and easy to use, and you can bet that Apple has patented the existing (novel) UI for Live Photos on iOS.

So Google has to (a) come up with a way of doing it that's just as good, (b) do it in a non-infringing way, and (c) do it in a way that doesn't feel like a copycat and damage their brand.

Google is not great at UI/UX (they're not bad, just not great; they're solid, but it's not really their focus), so I can imagine this presents a compound, complex problem for them once all the other factors are added.


I guess they don't care enough, especially since HTC had live photos before iOS did and Google just ignored the whole feature.


Downloaded it and tried it out. I have to say it works really well. The stabilization is quite good, and you can export the end result as either a GIF or a video with sound (because Live Photos themselves have sound). Good job, Google.


Even better: it automatically stabilizes the clips while you're scrolling through your collection, so it has practically finished the conversion before you even see what you might want to convert.


So, how are Live Photos different from 3-second video clips (other than resolution)?

There is no technical reason this shouldn't work with short clips or sequences of still images.


The critical UX difference is that in Live Photo mode the camera is continuously recording; the idea is that you take a photo and get the 1.5 seconds before and after it.

This differs from "3 seconds of video" in that it starts recording BEFORE you decide to.

It's all about the UX.


In a technical sense? They are no different.

In a UX sense? A lot different. In an emotive sense? Also a lot different.

Live Photos are different to most users rather than being different on a pure functionality level.


Very similar to "Cliplets," the UIST 2012 best paper from Microsoft Research: http://research.microsoft.com/en-us/um/people/hoppe/proj/cli...


There's some overlap, but what impresses me is that they're doing camera tracking (hard) in near real time and automatically (even harder) on an iPhone. Camera tracking is used frequently in feature-film visual effects, and it's usually a very manual, specialist-intensive process with very esoteric (and slow) tools...


You can do it yourself on a 7-year-old laptop in real time: http://www.robots.ox.ac.uk/~gk/PTAM/

It's a bit hilarious that Microsoft hired the PTAM author just to put him in... the MSN software group :D


What's also impressive to me is that Google's solution seems robust, meaning it does a good job in the face of outliers like lens flare, exposure changes, and the like. I see a lot of research with canned results where they optimize for the 6 or so clips they test with, but it's an order of magnitude harder to make it robust enough to release to the general public.


It's 2016, and we still only have 256-color GIFs...


Inside a GIF file, the palette can be redefined in between chunks of image data. Using that trick, you can pull off 24-bit colour GIFs. There are two problems with it: most GIF editors don't use this trick, and it makes GIFs grow significantly (50 MB GIF, anyone?).
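
As an example of the idea, you can get close to this in Python by quantizing every frame to its own 256-colour palette before writing the GIF. Whether the encoder actually emits a per-frame local colour table (rather than collapsing everything into one global table) depends on the library and version, so treat this as a sketch; the frame filenames are placeholders.

  from PIL import Image

  # Give every frame its own best 256 colours instead of one shared palette.
  frames = [Image.open("frame_%03d.png" % i).convert("RGB") for i in range(30)]
  paletted = [f.quantize(colors=256) for f in frames]
  paletted[0].save("out.gif", save_all=True, append_images=paletted[1:],
                   duration=40, loop=0)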

Rather than pushing an old technology, I'm surprised they didn't go for a better, more modern alternative like WebM.


On all 4chan boards, webms have pretty much completely replaced gifs.


We have other formats like animated pngs (and most browsers support them), but they don't seem that popular.


"256 colors ought to be enough for anybody"


This is surprisingly nice. I like live photos, but they've always been hard to share. This not only makes it easy, but the image stabilization really improves the result (maybe I just have a shaky hand).


I wonder if this helps or hurts the folks at Flixel. These look a lot like cinemagraphs, and they're free/easy to make.


This is a neat app but the failure state when it can't figure the motion out is downright trippy.

I have a live photo of waves coming in on the beach that looks like something from Inception.


haha isn't that a good thing?


It's great that Google is happy to build iOS apps, I just really wish they'd stop pushing material design on platforms other than their own.


/rant

I am so tired of comments like yours.

Google does a ton of UX research, so I am sure they have a much better understanding of whether normal iOS users find their apps usable or not.

If you dislike their design, don't use their apps. It's not pre-installed or forced upon you in any way.


I think your comment is more aggressive than is necessary.

My preference is for mobile apps that feel at home on the platform they're installed on. In my opinion Google builds high quality apps that I want to (and often do) use, but I have a disagreement with their style choices. This is a legitimate opinion to hold.

When an app is as good as (say) Google Maps, the fact that I disagree with some decisions isn't going to stop me using it. But for apps of more borderline utility, it might do.


That makes no sense. There is no "pushing" going on. An app can be designed with any style or approach. I'm not sure how you expect Google to design their apps... perhaps get in touch with them and offer your advice.


I know an app can be designed with any approach; I just happen to disagree with theirs.

However, there's also a perfectly plausible possibility that Google's use of Material Design on iOS is less about well-researched decisions and more about getting people used to the Android aesthetic so that they switch. I don't know if this is what's actually happening, but I don't think it's entirely outlandish, and it's what it feels like as a user of their products.


Material Design acting as a vehicle of infiltration, contaminating the minds of users on other platforms, making them abandon their devices to join the rival tech giant? That's a good one.


When you export as a GIF, how do you stop it putting the little watermark in the bottom right corner?


In the settings there is a toggle for the watermark.


Ah I see now, thanks for that. For those as lost as me, to get to the settings, you have to scroll to the very first photo and then 'pull down'.


And pull way down. I had to slide-walk several fingers to force it.


Excellent! Thank you for this algorithm! Will implement it in my hyperlapse stabilization suite!


Yes, but can it use an HTML5 video format instead of GIF?


It can export an H.264 video, with or without sound.


It would be great to have the stabilisation as a mobile/desktop app for video.



IMHO, it would be even better to have this as a service that I can submit media to and get these kinds of results back from. Does anyone know of such a service? No knock on Motion Stills; I think Google has made an awesome app.



