An interdisciplinary mashup I’m curious about is using video game engines (e.g. Unreal) for high-quality real-time 3D rendering that reacts to, say, an audio signal (music) or motion, and creating cool art installations / live performances based on that.
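For a rough idea of the plumbing that usually sits behind this, here is a minimal sketch of the common "analyze audio, send a control value to the engine over OSC" pattern. It assumes Python with the sounddevice and python-osc packages, and an engine configured to listen for OSC on port 9000; the /audio/level address and the port are made up for illustration, and the engine-side mapping (e.g. to a material or blueprint parameter) is not shown.

    # Sketch: stream mic/line-in audio, compute a loudness value per block,
    # and send it to a render engine (Unreal/Unity/Notch/etc.) over OSC.
    # The OSC address and port are placeholders; the engine must be set up
    # separately to receive them and map the value to a visual parameter.
    import numpy as np
    import sounddevice as sd
    from pythonosc.udp_client import SimpleUDPClient

    client = SimpleUDPClient("127.0.0.1", 9000)  # hypothetical engine OSC listener

    def callback(indata, frames, time, status):
        # RMS amplitude of the current block: roughly "how loud is it right now"
        level = float(np.sqrt(np.mean(indata[:, 0] ** 2)))
        client.send_message("/audio/level", level)

    with sd.InputStream(channels=1, samplerate=48000, blocksize=512, callback=callback):
        input("Streaming audio levels over OSC; press Enter to stop.\n")

In practice people often send a handful of band energies or beat triggers instead of one RMS value, but the overall shape (audio analysis outside the engine, control data in via OSC/MIDI/DMX) stays the same.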
On a smaller scale, this is definitely a thing that's coming. Take the OP-Z for example, which has a dedicated Unity engine track (in addition to sample, synth, MIDI, FX, performance and even DMX tracks).
Real-time ray tracing (see e.g. NVIDIA Iray) is being patched into many real-time 3D systems at the moment, and I suspect it will become more popular for projection-mapping art installations this year.
Notch is used for art installations and live performances (Rammstein, Billie Eilish, etc.), with stage shows rendered in real time. It started as a tool for making graphics demos, so it's pretty close to a video game engine.
This has been done in various forms for a very long time. Winamp visualizations were common back in the day. Look at TouchDesigner if you want to see one of the ways large concert panels are driven.
I know about TouchDesigner, but it is not a game engine, and to my knowledge it does not really specialize in fully real-time, high-resolution, detailed 3D rendering.