As early as 10th grade I remember asking my GIS teacher, "Why is ArcGIS so sluggish and aliased, but in my 3D animation class the viewports are antialiased and smooth like butter?" (I didn't use those words, but that's what I meant.)
I've been a bit obsessed with interdisciplinary mashups ever since. Robotics, for example, badly needs to stop writing its own tools and use far more of what already exists in GIS. And the disciplines of geography need to do the same with the amazing vector and raster tools out there.
It's really awesome seeing Blender used as a GIS view.
I want doppler weather radar data, with its height slices, cast over a 3D topo map of the terrain. We can sometimes see the effects of terrain on weather in the real world, but zooming out and getting perspectives on real recorded data would be entrancing to watch. I'm sure there's a lot to be learned from such things too, but I just want to see it better.
This seems like something ideal for development by NOAA/NWS, with the resulting code operationalized as part of weather.gov and also released on GitHub.
The 3D animation software used by TV news weather departments could do all of the above as far back as 2000, so it should be possible to replicate in a more open way.
An interdisciplinary mashup I’m curious about is using videogame engines (e.g. Unreal) for quality real-time 3D rendering responding to e.g. audio signal (music) or motion, and creating cool art installations / live performances based on that.
On a smaller scale, this is definitely a thing that's coming. Take the OP-Z for example, which has a dedicated Unity engine track (in addition to sample, synth, MIDI, FX, performance and even DMX).
Realtime ray tracing (see e.g. NVIDIA Iray) is being patched into many realtime 3D systems at the moment, and I suspect it will become more popular for projection-mapping art installations this year.
Notch is used for art installations and live performances (Rammstein, Billie Eilish, etc) as stage shows rendered in real time. It started as a tool to make graphics demos, so pretty close to a video game engine.
This has been done in various forms for a very long time. Winamp visualizations were common back in the day. Look at Touch Designer if you want to see one of the ways large concert panels are driven.
I know about Touch Designer, but it is not a game engine and to my knowledge does not really specialize in fully real-time high-resolution detailed 3D rendering.
That's basically how all technological and science progress works. You take a few existing dots and connect them together. Connecting dots between fields that are further apart is less common and often has more impact.
I got into GIS because my public high school happened to be a "GIS magnet school." It turned out that was basically the only thing I actually felt interested in doing well at.
My university had a geography program where I focused on GIS and remote sensing among other minor focuses (you really have to be multidisciplinary in geography). By the time I finished my Master's they had a dedicated Geomatics program which was a lot more GIS heavy.
I had a couple of GIS courses in university, along with a few mapping and survey courses. We also used ArcGIS extensively for different projects in other courses and for our year-long research project. I had never heard of GIS in high school, though.
The amount of academic capture by Esri is crazy. Though to their credit, my profs forced us to use QGIS a lot and to learn enough Python to write our own workflows, both in QGIS and in ModelBuilder. I'd say most students still just became GIS button-pushers, but some definitely broke out of that capture and began making their own tools.
On the other hand, there are people like me, for whom the GIS course at university was basically "set up a Postgres DB, load test data, add a feature in QGIS", and in professional work I had to rediscover pretty much everything.
If someone with proper education in the field would look at my topology checking code, they would die a bit inside.
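For anyone curious what that kind of naive check can look like, here's a toy self-intersection test for a polygon ring in pure Python. Illustrative only (names are mine): it flags proper crossings and ignores the collinear/touching edge cases that real topology libraries like GEOS handle.

```python
# Toy polygon-ring self-intersection check: test every pair of
# non-adjacent edges for a proper crossing via 2D cross products.

def cross(o, a, b):
    """2D cross product of vectors o->a and o->b."""
    return (a[0] - o[0]) * (b[1] - o[1]) - (a[1] - o[1]) * (b[0] - o[0])

def segments_cross(a, b, c, d):
    """True if segment a-b properly crosses segment c-d
    (shared endpoints and collinear overlaps are not detected)."""
    d1, d2 = cross(c, d, a), cross(c, d, b)
    d3, d4 = cross(a, b, c), cross(a, b, d)
    return (d1 > 0) != (d2 > 0) and (d3 > 0) != (d4 > 0)

def ring_self_intersects(ring):
    """True if the closed ring (list of (x, y) vertices) crosses itself."""
    n = len(ring)
    edges = [(ring[i], ring[(i + 1) % n]) for i in range(n)]
    for i in range(n):
        for j in range(i + 2, n):
            if i == 0 and j == n - 1:
                continue  # first and last edges are adjacent via wraparound
            if segments_cross(*edges[i], *edges[j]):
                return True
    return False
```

For example, `ring_self_intersects([(0, 0), (1, 1), (1, 0), (0, 1)])` flags the classic "bowtie" ring, while a plain square passes.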
As a software developer that has never had any GIS classes, what is the best way to get started in this?
Every time I see something as cool as this, I want to learn it and find some way to apply it and create something cool, but I never seem to find a project I can dive into or a way to apply it in the real world. (But I'm sure this is because my surroundings just aren't interested in it.)
I honestly just sort of fell into a job where it was relevant. I don't know that there is anything inherently interesting about it. It's mostly just lat/lon coordinates and putting stuff on maps.
Same for programming IDEs. My old P2 350 could compute a real-time simulation of soft bodies over NURBS, but doing anything in Eclipse took 5 seconds. Does not compute.
Looks good. I made a site, which I submitted to Show HN last week[1], that lets you view any 1km square of the UK's public lidar data at 1m resolution, using three.js.
Someone did ask about an export feature. What format would the export have to be to be imported into this?
Thanks. The 3D view is generated from the latest raw data, but the 2D map tiles are all pre-rendered en masse, so this spot must have been added to the data since the tiles were last generated, which admittedly has been a while - it's a long job.
Similarly, I've been hoping for a few people to pick up 'BlenderCAD' to add accurate sizing/positioning and dimensions to Blender. For this sort of thing, SketchUp is still (a lot) better.
Nice. In retrospect, Blender choosing Python for its UI scripting and API was a really smart bet, especially given its adoption in the scientific community. This kind of "mashup" is really great.
This looks great! I've been looking at tools to do viewshed analysis. Already being familiar with Blender, this is just the type of project I was looking for.
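For readers wondering what's under the hood of a viewshed: at its core it's a line-of-sight test, walking from the observer toward each target cell and tracking the steepest slope seen so far. A toy sketch on a plain height grid (names are mine; real tools such as GRASS's r.viewshed also account for earth curvature and refraction):

```python
import math

def visible(grid, observer, target, eye_height=1.8):
    """Line-of-sight test on a 2D elevation grid of (row, col) cells.

    Walk from observer to target; the target is visible only if the
    slope to it is at least as steep as the steepest slope to any
    cell in between (an intermediate ridge blocks the view).
    """
    (r0, c0), (r1, c1) = observer, target
    eye = grid[r0][c0] + eye_height
    steps = max(abs(r1 - r0), abs(c1 - c0))
    if steps == 0:
        return True
    max_slope = -math.inf
    for i in range(1, steps + 1):
        t = i / steps
        r = round(r0 + (r1 - r0) * t)  # nearest cell along the ray
        c = round(c0 + (c1 - c0) * t)
        dist = math.hypot((r1 - r0) * t, (c1 - c0) * t)
        slope = (grid[r][c] - eye) / dist
        if i == steps:
            return slope >= max_slope
        max_slope = max(max_slope, slope)

# A full viewshed just repeats this test from one observer to every cell.
```

On a flat row `[[0, 0, 0, 0, 0]]` the far cell is visible from cell (0, 0); put a 10m spike in the middle and it no longer is.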
I'm curious about what's visible at the horizon. I wonder: does the 3D model import take into account the curvature of the earth, or is it a projection onto a plane to start with?
The raw data is in lat/long and elevation. You use a tool like QGIS to project the raw data into xyz data [0]. QGIS and the projection model determine the coordinates of the data that get rendered. Blender just renders the xyz data.
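In QGIS the reprojection is done by PROJ under a chosen CRS; as a toy illustration of what that projection step does (and why earth curvature gets flattened out of the result), here is the forward Web Mercator formula in plain Python, with elevation passed through unchanged. Function and constant names are my own:

```python
import math

EARTH_RADIUS = 6378137.0  # WGS84 semi-major axis, metres

def lonlat_to_mercator(lon, lat, elev=0.0, radius=EARTH_RADIUS):
    """Project (lon, lat) in degrees to Web Mercator x, y in metres,
    carrying elevation through as z.

    The projection flattens the globe onto a plane, so the curvature
    is gone from the resulting xyz - which is why a heightmap render
    of projected data won't show the horizon dipping away.
    """
    x = math.radians(lon) * radius
    y = math.log(math.tan(math.pi / 4 + math.radians(lat) / 2)) * radius
    return x, y, elev
```

For example, longitude 180 maps to x of about 20,037,508 m, the familiar edge of Web Mercator world tiles.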
Seeing the height-map retrieval makes me wonder: are there any (non-AR) games that have a game world that's not just based on the real world, but rather an import of real-world GIS data?
VBS3 by Bohemia Interactive Simulations can import, and I believe even continually live-load, terrain built by mixing real-world data with approximations of trees and whatnot from satellite images... but it's not for the public, military only. The civilian branch of the company is BI, who make Arma 3; that supports real-world terrain imports via mods.
Ubisoft/Massive Entertainment's "Tom Clancy's The Division" (New York: Brooklyn and Manhattan) and "Tom Clancy's The Division 2" (Washington DC) both have extensive GIS-imported locations, cleaned up with LIDAR and post-processing for trees/vehicles. I know there is a video about Division 2's use of GIS, as I remember seeing it, but I can't find it at the moment. I think it might be a Game Developers Conference video.
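A rough sketch of the usual DEM-to-game pipeline: normalize real elevations to a 16-bit heightmap and hand it to the engine's terrain importer (Unreal and Unity both accept 16-bit heightmaps, with the real-world elevation range restored via a vertical scale setting in-engine). A minimal NumPy version, with the file-format details left out and names that are my own, not from any studio's tooling:

```python
import numpy as np

def dem_to_heightmap(dem):
    """Normalize a DEM (2D array of elevations in metres) to uint16,
    the range a typical game-engine terrain importer expects."""
    dem = np.asarray(dem, dtype=np.float64)
    lo, hi = dem.min(), dem.max()
    if hi == lo:
        return np.zeros(dem.shape, dtype=np.uint16)  # flat terrain
    return ((dem - lo) / (hi - lo) * 65535.0).astype(np.uint16)

# heightmap = dem_to_heightmap(my_dem)
# heightmap.tofile("terrain.r16")  # raw 16-bit, one common import format
```

Note that casting truncates, so the midpoint of the elevation range lands on 32767 rather than rounding up.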