There's something of a product space to support "buyers", specifically around your distinction between "software integrator" and "knowledge worker / domain expert." BTW, "buy vs. build" is the common engineering-management language for this sort of thing, I think: https://en.wikipedia.org/wiki/Component_business_model ... There's some silliness in this space too, since software itself is already meant to be maximally flexible; "sellers" spending lots of time building flexibility is arguably a smell.
To motivate some of the machinery: systemically it's kind of a no-brainer given how financing and oversight work, I suppose. Overseers want to see early results, which you can get from a knowledge worker rehashing or tweaking an existing solution, and then the integrators get to play 'pick up sticks'.
So the integrator is the "buyer" because they have opted to defer product expertise ("buy" meaning "not build", regardless of whether there's even any real cost, e.g., a FOSS solution).
The buy side has popular tools like linters and static analyzers; this is pretty huge. If you look at the "cloud native" space, treating many of the cloud software players as "integrators" of software that's intended to be distributed, public, always-on, and priced at 'ad-supported', the buy side has the "Security and Compliance" layer: https://landscape.cncf.io/ ... A lot of money is flowing there too, since it simplifies precisely what you mention, so you're in good company. Granted, "integrator work" might be a bit less specialized than the sorts of analyses these tools perform, but it's the same problem, applied not to abstract "domain expertise" but to "domain expertise of deploying existing software solutions."
It might not be popular thinking, but doc tools like CodeViz, doxygen, any worthwhile IDE, and other tools in that space probably sit somewhere between this space and the next one:
OTOH, managers are sort of 'friendly buyers' of their employees' expertise, so you can also look at the tools for requirements elicitation, definition, capture, scheduling, prioritizing, and lifecycle management, thinking specifically of stories, epics, kanban, and presumably legacy systems like DOORS, which I have no experience with. If you want to avoid too much philosophizing and defer to some pretty broad experience, SWEBOK has a lot of words about this more 'pre-compiled' approach, as opposed to the 'just-in-time' approach most agile-type firms use for their usually simple value props.
The same sorts of things might apply here, though: identifying whether "coverage exists" for domain-expert-produced prototypes, and specifying the form of those in, say, the domain you mention, signal processing. Basically TDD: does the shipped code match the tests? Does the integrator know enough to write such tests?
Essentially, this is using third-party machinery to establish acceptance criteria and clear targets that ensure the "buyer-builder" contract is fulfilled.
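To make that concrete, here's a minimal, hypothetical sketch of that TDD-style contract: the domain expert's prototype serves as the oracle, and the integrator's shipped code has to reproduce it on fixed test vectors. All names here (`reference_fir`, `shipped_fir`) are illustrative, not from any real codebase.

```python
def reference_fir(signal, taps):
    """Domain expert's prototype: direct-form FIR convolution.
    Deliberately simple and readable; correctness over speed."""
    out = []
    for n in range(len(signal)):
        acc = 0.0
        for k, tap in enumerate(taps):
            if n - k >= 0:
                acc += tap * signal[n - k]
        out.append(acc)
    return out

def shipped_fir(signal, taps):
    """Integrator's implementation. Here it's just a stand-in that
    delegates to the prototype; in practice this would be the
    optimized/production code under test."""
    return reference_fir(signal, taps)

def acceptance_test(impl, tol=1e-9):
    """The 'buyer-builder' contract: impl must match the expert's
    prototype on agreed test vectors, within a tolerance."""
    signal = [0.0, 1.0, 0.0, 0.0, 2.0, -1.0]
    taps = [0.5, 0.25, 0.25]
    expected = reference_fir(signal, taps)
    actual = impl(signal, taps)
    return len(actual) == len(expected) and all(
        abs(a - e) <= tol for a, e in zip(actual, expected)
    )

assert acceptance_test(shipped_fir)
```

The point isn't the filter; it's that the prototype itself is the specification, so the integrator doesn't need to re-derive the domain expertise to know whether the contract is met.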
Thanks for this thoughtful analysis. Your framing of the 'buyer vs builder' dynamic really resonates with what I saw at Raytheon and explains a lot. We weren't just struggling with a technical handoff - we were watching an organization transform from a builder to a buyer of expertise.
The part about 'overseers wanting early results' is especially relevant to defense. Program managers need to show progress to keep funding, which incentivizes quick integration of existing solutions over deep technical work. I saw this firsthand when our most sophisticated signal processing work was subcontracted out.
But here's where defense differs from commercial software: When you're building complex systems like radar or signal processing pipelines, the 'buy' approach has serious limitations. You can't just integrate your way to novel capabilities. Those retired experts we had to bring back? They represented irreplaceable domain knowledge that can't be purchased off the shelf.
The tools you mention (linters, static analyzers, requirement tracking) are valuable, but they don't solve the core problem: you need people who deeply understand both the mathematics and the implementation. In my current role, I see PhDs struggling because they lack this foundational knowledge - no amount of process or tools can bridge that gap.
What I think we need is a model that preserves deep technical expertise while still allowing for integration of existing solutions. But the current funding and organizational structures in defense make this really hard to achieve.
Well, naively, then, what you might be describing is a new language; if the software itself can't capture the details of the expertise, that may imply something simply isn't being expressed.
Languages evolve with expertise, specialties are tailored to and can be permanently and effectively captured; and not just captured, but reused as in the case of libraries.
Perhaps there's some core capability missing from the MATLAB ecosystem that isn't obvious on the surface. What I know of MATLAB is that it's primarily focused on simplifying DAQ/processing cycles and making them performant, but not necessarily on associating concepts with use-cases and correctness/effectiveness (and probably optimality, as an important thought), say in the way objects and types compartmentalize data transformation for user-facing software applications with understandable UIs.
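As a hypothetical sketch of what associating concepts with correctness could look like: carry the domain concept (here, a sample rate) with the data in a type, so operations must keep it consistent instead of leaving it in a comment or a loose variable. This is plain Python for illustration, not a claim about anything MATLAB offers.

```python
from dataclasses import dataclass

@dataclass(frozen=True)
class TimeSeries:
    """Raw samples plus the domain concept they're meaningless without."""
    samples: tuple
    sample_rate_hz: float

def decimate(ts: TimeSeries, factor: int) -> TimeSeries:
    """Downsample by an integer factor; the type forces us to update
    the sample rate alongside the data, so they can't drift apart."""
    if factor < 1:
        raise ValueError("factor must be >= 1")
    return TimeSeries(ts.samples[::factor], ts.sample_rate_hz / factor)

ts = TimeSeries(tuple(range(8)), sample_rate_hz=48_000.0)
ds = decimate(ts, 2)
assert ds.sample_rate_hz == 24_000.0
assert ds.samples == (0, 2, 4, 6)
```

The `frozen=True` means a `TimeSeries` can't be mutated into an inconsistent state after construction; every transformation has to produce a new, internally consistent value.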
Programming language adjustments sound good on the surface but don’t really cut deep enough, as far as the problem domain is concerned.
This is both research and implementation of a military grade solution.
Think about it this way: both the electrical engineering and the mathematics you are combining are (a) cutting edge in their respective fields (for the most part) and (b) cutting edge in their combination. Finding good ways of expressing that geometric structure as a programming language feature or a subroutine library is 10 years and many more applications (read: real-world tests) down the road from where the original poster's work for the government takes place.
And ipunchghosts seems to have encountered what other people in his position convey behind closed doors: The researchers are sacrificial pawns and even sacrificial chess queens. Your mileage may vary, of course.
Again, this is probably a pretty naive take (and without regard to the amount or quality of outcomes), but looking at the way academic research occurs, thinkers are often exposed to failures and asked to explain them or expound on the underlying philosophy, rather than to participate in the full engineering cycle. That participation can take different forms, such as hands-on development, but particularly in engineering it's viewed with skepticism.
I think there's even a take where you could frame the patent system as a way to profit from failing to explain ideas effectively.
Edit: maybe the consideration, then, is which roles are missing. If there's a way to improve off-the-shelf model performance faster than Moore's law, what is it? Scaling to teams with more specialized roles: simulation, model architecture / quantization specialist, systems-level, hardware-match specialist / minimization? Some way to compose models, or perform operations on their contents?
I guess I'm wondering if other researchers in the DoD share the same sentiment as me. I find it hard to believe they don't, but it's something that's not talked about much because, from my point of view, there aren't many researchers with 20 YoE left in the DoD. If I am wrong, point me to them, as I want to join their ranks!