In November 2020, I sent an email to about 1400 people asking for help in understanding what they saw as the most pressing current issues with geospatial data and technology. Thanks to everyone who responded!
As a general description of those who responded:
Roughly two-thirds are project managers or implementers of geospatial technology, while one-third are involved in proposal writing and business development.
Interestingly, there was an even split between folks who work at small (<10 people), medium (10 to 100), and large (>100) organizations. What this tells me is that these perspectives are well distributed across organizations and do not depend on where you work or who your boss is.
In characterizing some of the biggest challenges with geospatial data and technology, the three most difficult areas that emerged were:
Keeping up with hardware: cameras, platforms, GPS, drones, tablets, mobile devices, etc.
Internal IT infrastructure: PCs, servers, VPNs, security, firewalls, etc.
Data formats: CAD, GIS, imagery, LiDAR, PDFs, internal formats, and transferring formats back and forth
Meanwhile, almost 80% of respondents said that cloud-based technologies (effectively, using data online) were not an issue for them or their organization. If that is the case, then it appears the “cloud revolution” is not ongoing… it’s over! Which I find interesting, given the persistent pervasiveness of file-based data delivery requirements and the ongoing need for specialized, proprietary software across our industry.
So, while we love our cloud-based tech and evidently aren’t scared or threatened by it, it’s not clear to me why we are not leveraging it fully. It’s almost like someone back in the 1970s saying they absolutely love their new TV but don’t see any problem with walking across the room to change the channel.
The Big Reveal: Managing large data volumes and integrating that data between proprietary software platforms did show up as a recurring theme. However, one pattern emerged overall in the discussion of the single biggest challenge when it comes to geospatial data and technology: the lack of a consistent understanding of geodesy. Previously relegated to the realm of a small group of scientists and land surveyors, geodesy is now everyone’s problem. And I must concur that most of the issues I encounter on data projects are fundamentally geodesy issues. And that comes before we even get to things like data standards and accuracy.
Correctly defined datums, geoids, projections, and coordinate systems seem like so much mathematical mumbo jumbo to most of us.
What we want is for all that stuff to have been figured out behind the curtain, so everything just “works”.
And it usually does. Of course…until it doesn’t.
To most of us, these things are just settings that should be automatically accounted for or adjusted when necessary. But they can wreak havoc on any data project, particularly when comparing two different datasets (cue the uncontrolled shuddering and shaking). When asked recently what advice I would have for young geospatial professionals entering the field, I answered: “1) Learn to code, and 2) learn as much geodesy as you can.”
I once had a surveyor go straight to the client and tell them my data was off by over 100 ft! Um, check your datum and geoid settings, please? We were using federal project standards and he was using a local grid. And of course, we were both right.
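To make the datum-and-geoid point concrete, here is a minimal sketch of the one relationship almost everyone trips over: GPS measures heights above the ellipsoid, while most project specs want heights above the geoid (roughly, mean sea level). The numbers below are illustrative assumptions, not data from any real site or from the dispute above; in practice you would look up the geoid undulation from a model such as GEOID18.

```python
# Sketch: why ignoring the geoid can put elevations off by roughly 100 ft.
# A GPS receiver reports ellipsoidal height h; the orthometric
# (geoid-referenced) height is
#   H = h - N
# where N is the geoid undulation (height of the geoid above the ellipsoid).

FT_PER_M = 3.28084  # international feet per meter

def orthometric_height_m(h_ellipsoidal_m: float, geoid_undulation_m: float) -> float:
    """Convert an ellipsoidal height to an orthometric height via H = h - N."""
    return h_ellipsoidal_m - geoid_undulation_m

# Illustrative values only: an ellipsoidal height of 250.0 m at a site where
# the geoid undulation is -30.0 m (a typical magnitude in the lower 48).
h = 250.0   # meters above the ellipsoid (what the receiver reports)
N = -30.0   # assumed geoid undulation in meters (from a geoid model)

H = orthometric_height_m(h, N)        # height above the geoid
error_ft = abs(h - H) * FT_PER_M      # vertical discrepancy if you ignore N

print(f"Ellipsoidal height: {h:.1f} m")
print(f"Orthometric height: {H:.1f} m")
print(f"Ignoring the geoid leaves you off by about {error_ft:.0f} ft")
```

With these assumed numbers, skipping the geoid term alone produces a vertical discrepancy of nearly 100 ft, the same order as the disagreement in the story above, without anyone's measurements being wrong.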