Hacker News

An interdisciplinary mashup I’m curious about is using videogame engines (e.g. Unreal) for quality real-time 3D rendering responding to e.g. audio signal (music) or motion, and creating cool art installations / live performances based on that.
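The audio-reactive part of this idea boils down to mapping an audio feature to a render parameter on every frame, regardless of engine. A minimal, engine-agnostic sketch in plain Python (the function names and tuning constants are my own illustration, not from any particular tool):

```python
import math

def rms(frame):
    """Root-mean-square energy of one audio frame (samples in [-1, 1])."""
    return math.sqrt(sum(s * s for s in frame) / len(frame))

def energy_to_scale(frame, base=1.0, gain=2.0):
    """Map frame energy to a scale factor for some visual element.

    `base` and `gain` are arbitrary tuning constants; a real installation
    would also smooth this value over time to avoid flicker.
    """
    return base + gain * rms(frame)

# A louder frame drives a larger scale than a quieter one.
frame = [math.sin(2 * math.pi * 440 * n / 48000) for n in range(1024)]
print(energy_to_scale(frame) > energy_to_scale([0.1 * s for s in frame]))  # True
```

In practice an engine would hand you the audio buffer (e.g. via its audio API) and you would feed the resulting scalar into a material, transform, or particle parameter each frame.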


On a smaller scale, this is definitely a thing that's coming. Take the OP-Z for example, which has a dedicated Unity engine track (in addition to sample, synth, MIDI, FX, performance, and even DMX tracks).

Realtime ray tracing (see e.g. NVIDIA Iray) is being patched into many realtime 3D systems at the moment, and I suspect it will become more popular for projection mapping art installations this year.


Notch is used for art installations and live performances (Rammstein, Billie Eilish, etc.), with stage visuals rendered in real time. It started as a tool for making graphics demos, so it's pretty close to a video game engine.

https://www.notch.one/

https://www.youtube.com/watch?v=TaEoAJw_0Zc


I recently saw this mockup of a desktop environment done in Unity[0]. Maybe not the best use case, but interesting nonetheless.

I've long dreamt of putting together an intuitive CAD tool equivalent to SketchUp in Unity. I think it's probably doable.

[0]: https://old.reddit.com/r/Unity3D/comments/ey5eta/i_made_an_o...


Eevee is quite interesting. The performance isn't quite there yet when compared with Unreal, but it is built directly into Blender.


Check out the work of Keijiro Takahashi: https://www.keijiro.tokyo/


This has been done in various forms for a very long time. Winamp visualizations were common back in the day. Look at TouchDesigner if you want to see one of the ways large concert panels are driven.


I know about TouchDesigner, but it is not a game engine, and to my knowledge it doesn't really specialize in fully real-time, high-resolution, detailed 3D rendering.



