Several have tried to "accelerate" Android, like Myriad with Dalvik Turbo, but it's not so simple. The ART benchmarks are tantalizing, but how much battery life do you give up to get 2X performance in a synthetic benchmark? Hopefully ART is at least as efficient as Dalvik, but nobody outside Google has measured that yet.
Any potential improvement in the VM needs to show that it doesn't give up in other performance dimensions what it gains in JIT'ed code speed.
I'd wager HotSpot's JIT would beat the pants off Dalvik's JIT, and make your battery hot while doing it.
Dalvik's JIT is designed differently from other JITs: it deliberately compiles less code, focusing on the code that has the highest impact. Dalvik's bytecode interpreter alone already delivers what Google has claimed is a 2X improvement over interpreting standard Java bytecode, and before Android 2.3 that was how Android ran all code.
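To illustrate the "compile only what's hot" idea, here's a minimal C sketch; the names and the threshold are hypothetical, not Dalvik's actual internals. A piece of code stays interpreted until an execution counter marks it as hot, and only then is JIT effort spent on it:

    /* Toy sketch of counter-based hot-code detection, the idea behind
       "compile less code, but the code with the highest impact".
       HOT_THRESHOLD and all names here are made up for illustration. */
    #include <stdint.h>
    #include <stdio.h>

    #define HOT_THRESHOLD 50   /* hypothetical: executions before compiling */

    typedef struct {
        uint32_t exec_count;
        int compiled;
    } TraceInfo;

    static void execute_trace(TraceInfo *t, int trace_id) {
        if (t->compiled) {
            printf("trace %d: running native code\n", trace_id);
            return;
        }
        if (++t->exec_count >= HOT_THRESHOLD) {
            /* Hot: spend JIT effort here; cold code stays interpreted. */
            t->compiled = 1;
            printf("trace %d: compiling\n", trace_id);
            return;
        }
        printf("trace %d: interpreting\n", trace_id);
    }

    int main(void) {
        TraceInfo loop_body = {0, 0};
        for (int i = 0; i < 60; i++)
            execute_trace(&loop_body, 1);
        return 0;
    }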
To really run a CPU hot, you need to hand-craft a workload that keeps all of the execution units (ALU, FPU, SIMD) busy by interleaving assembly instructions. That is nontrivial, and it won't happen "by accident" as a side effect of aggressive optimization. Have a look at the cpuburn program for one example.
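As a rough sketch of what such a workload looks like (this is not cpuburn, which is hand-tuned assembly; this toy assumes an x86 CPU with SSE2 and a compiler that keeps the three streams in the loop), here's one loop that feeds the integer ALU, the scalar FPU, and the SIMD unit at the same time:

    /* Toy CPU-burn kernel: independent integer, scalar FP, and SSE2 SIMD
       streams in one loop, in the spirit of cpuburn. x86-only sketch. */
    #include <emmintrin.h>   /* SSE2 intrinsics */
    #include <stdint.h>
    #include <stdio.h>

    int main(void) {
        uint64_t ival = 0x9e3779b97f4a7c15ULL;      /* integer ALU stream  */
        double   fval = 1.000000001;                /* scalar FPU stream   */
        __m128d  vval = _mm_set1_pd(1.0000000001);  /* SIMD stream         */
        __m128d  vmul = _mm_set1_pd(0.9999999999);
        __m128d  vadd = _mm_set1_pd(1e-12);

        for (long i = 0; i < 100000000L; i++) {
            ival = ival * 6364136223846793005ULL + 1442695040888963407ULL;
            fval = fval * 1.0000000003 + 1e-12;
            vval = _mm_add_pd(_mm_mul_pd(vval, vmul), vadd);
        }

        /* Print the results so the compiler can't discard the loops. */
        double out[2];
        _mm_storeu_pd(out, vval);
        printf("%llu %f %f\n", (unsigned long long)ival, fval, out[0]);
        return 0;
    }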
That video is tantalizing, but it's dated 2010. If it works as well as advertised, then why has it taken three years to be part of Android? I'd love to see ART work out though. And the video is almost worth watching just for the very awesome French accent.
https://www.youtube.com/watch?v=tEAz9fRoDmA&list=UUowARqiVYk...