Requirements-First Mindset
Defense programs are increasingly recognizing the value of human-centered design, but many of the most consequential product decisions are still made before designers or end-users enter the picture. If defense wants more usable, effective, and trustworthy systems, human-centered design methodology needs to move upstream, ahead of the point where requirements are locked, rather than remain a downstream implementation detail.
TL;DR
The DoD’s acquisition reform language increasingly mirrors core design principles.
Definitions of “iterative” and “outcomes-based” are being operationalized right now.
Many defense programs still define requirements before validating warfighter input.
Research is frequently approached as training demonstration rather than qualitative inquiry.
Design competency is largely absent from acquisition workforce transformation efforts.
The opportunity for design to influence acquisition policy has never been more open than it is now.
The Reform Already Speaks Design
Last month, the Department of Defense published its Acquisition Transformation Strategy. The strategy calls for “iterative approaches” and frames success as “rewarding outcomes over process.” It describes a development culture centered on “speed, innovation, and accountability,” language that, candidly, has not reflected much of my experience working on defense programs.
I have spent the last seven years making the case for human-centered design in defense. I have advocated versions of that argument to program managers, contracting officers, engineering teams, primes, startups, and DoD stakeholders alike. For the first time, however, I find myself reading versions of those same arguments reflected back through official policy language. The reform increasingly speaks the language of design, but whether that language becomes meaningful practice will depend largely on who participates in shaping its implementation.
The distinction matters because policy language and implementation are not the same thing. The phrase “outcomes over process” may sound straightforward, but people in different roles will interpret it very differently. A contracting officer may hear flexibility within an otherwise familiar procurement structure. A designer hears something else entirely: a process where requirements evolve through observation, prototyping, testing, and learning from the people expected to use the resulting capability operationally.
The same tension exists around the word “iterative.” Historically, iteration inside defense acquisition has often meant successive engineering builds focused on incremental functionality within a relatively fixed requirements envelope. In design practice, iteration means something more foundational: the requirements themselves remain open to change as understanding deepens through engagement with operators. Both interpretations can exist under the same language while leading to very different outcomes.
The Executive Order on acquisition modernization earlier this year made explicit what the government is attempting to change: procurement systems that move too slowly, require too many approvals, and struggle to integrate emerging commercial technology at operational speed. The Acquisition Transformation Strategy operationalizes that intent, and branch-level implementation guidance is already beginning to follow. At nearly every layer of the reform effort, the language of design is present. My concern is that the reform risks framing iteration primarily through the lens of engineering process improvement while overlooking the role human-centered design methods play in defining better outcomes in the first place.
The Inverted Process
For seven years, the most persistent challenge I have encountered in defense design is what I refer to as an inverted product development process.
Programs define requirements before meaningful design work begins, often basing them on incremental advances to legacy systems, prior doctrine, or engineering and production assumptions rather than on current warfighter needs. By the time a designer becomes involved, the problem has already been named and the solution heavily constrained. Design becomes a production layer rather than a strategic partner helping shape product direction alongside operators.
This problem exists regardless of the design maturity of program leaders, contractors, or consultants because it is enforced by the procurement structure itself. When programs require finalized requirements before meaningful research and design strategy occur, the inversion becomes difficult to overcome. You cannot improve requirements through operator insight if the structure of the program does not allow those insights to meaningfully influence the requirements before they are locked.
As I often describe it, the process becomes a technical solution looking for a human problem. I have seen this happen repeatedly in practice.
On one program, our team was given unusual flexibility to conduct meaningful research with active soldiers early enough to influence the direction of the product itself. We built high-fidelity prototypes, observed operators performing realistic mission tasks, and iterated based on what we learned. What emerged from those sessions changed our understanding of how the product should function and integrate into real operational workflows.
Several key assumptions embedded within the written requirements did not align with how operators actually wanted to interact with the platform. Workflows that appeared logical on a whiteboard became cumbersome, inefficient, or simply incorrect once placed in the hands of the people expected to use them operationally. To their credit, several DoD stakeholders encouraged us to continue pursuing what operators were responding to rather than rigidly adhering to assumptions that no longer appeared correct. As a designer, it was one of the most exciting experiences I have had in defense work because it felt like human-centered design functioning successfully within the system.
Eventually, however, we encountered the hard stop. The requirements still had to be fulfilled, and I was told directly that changing them would require the signature of a four-star general and an act of Congress.
That moment fundamentally changed how I think about defense product development because it clarified something important: if design enters after requirements are finalized, much of the most consequential decision-making has already happened.
The current acquisition reform creates genuine space to challenge that inversion. But the mechanisms that will determine whether that happens (contract structures, evaluation criteria, workforce competencies, and implementation guidance) are being defined right now. Historically, those conversations have occurred largely without design practitioners’ involvement, creating a meaningful gap between the intent of the reform and the expertise required to operationalize it successfully.
Questions Versus Answers
One of the most common patterns I encounter on defense programs is that organizations approach user engagement with a presentation and training mindset rather than an inquiry and iteration mindset.
Program teams demonstrate systems, explain workflows, teach operators how the product works, and then gather generic feedback afterward. While that process may satisfy contractual requirements for user engagement, it leaves an enormous amount of usability insight on the table: training someone to use a system is fundamentally different from observing how they naturally think, behave, hesitate, adapt, struggle, or fail within it.
Part of the challenge is that operators themselves can be surprisingly difficult research participants without the right methods in place. Highly trained warfighters are often conditioned to project competence, adaptability, and confidence. Many are accustomed to working around friction or accepting cumbersome workflows as normal. When shown prototypes, the response is frequently some version of: “Yeah, this works. I could use this.”
Getting beneath that surface layer requires actual design research practice. It requires contextual inquiry, observation without explanation, iterative testing, and the ability to recognize that what users say and what users actually struggle with are not always the same thing.
Some of the richest research opportunities I have experienced in defense environments emerged because the design team intentionally created conditions where things could become confusing, frustrating, or even fail in controlled ways. Those moments often revealed far more about how operators naturally thought and behaved than polished demonstrations ever could.
I think often about a quote I heard from an Army Forward Observer with two decades of service and years of combat experience. During an interview, he said to me: “The things we are given are built by a bunch of nerds wearing khakis and polo shirts, sitting in conference rooms with no idea what we actually need.”
That statement has stayed with me because it captures the central tension remarkably well. Defense organizations are filled with intelligent, experienced people attempting to solve difficult problems, but there is often enormous distance between the environments where systems are conceived and the environments where they are ultimately used. The further upstream decisions are made without meaningful operator engagement, the greater the likelihood that systems become capability-centered rather than human-centered.
The Workforce Gap
The acquisition reform correctly identifies a workforce problem. Defense organizations need people capable of operating inside faster, more iterative, commercially integrated acquisition environments. Recent conversations around workforce transformation increasingly focus on modernizing contracting, agile methodologies, and accelerating delivery timelines. Those investments are necessary, but they are not sufficient.
What is still largely absent from the conversation is design competency itself: research methods that engage operators before requirements are finalized, prototyping that makes ideas testable before engineering commitments are locked, and validation methods that measure operator behavior rather than specification compliance.
These are not adjacent capabilities. They are the methods through which user-defined outcomes are actually produced. Without them, “outcomes over process” risks becoming a faster version of the same capability-centered acquisition model rather than a fundamentally more human-centered one.
I have led programs where contextual inquiry fundamentally changed the design through nothing more than a simple prototype placed in front of operators. In one case, users made it immediately clear that the most important information was on the wrong screen. We redesigned accordingly. That insight would never have emerged from a requirements document alone. It required someone who knew how to ask non-leading questions, observe user behavior, and synthesize those observations into actionable design direction.
The acquisition reform needs that person, and at the moment, design competency is still not meaningfully represented in workforce transformation conversations.
One of the more interesting findings from DARPA’s Explainable AI program was that teams combining human-computer interaction expertise with computer science consistently outperformed teams approaching the problem from computer science alone. That insight came from inside the DoD itself, yet it has not meaningfully appeared in acquisition workforce planning discussions.
The Window Is Closing
Defense acquisition has been reformed before. Each cycle introduced meaningful improvements in specific domains before eventually being absorbed into the larger institutional structure around it. The current moment, however, feels different.
Pressure is converging from multiple directions at once: executive leadership, acquisition strategy reform, branch-level implementation, commercial technology acceleration, and a new generation of increasingly vocal warfighters. Definitions established over the next year will likely harden into evaluation criteria, contracting structures, training curricula, and workforce expectations for years to come. What “iterative” means in process, what “user outcomes” means in evaluation criteria, and whether human-centered design methods become recognized as a formal acquisition competency are all being decided right now.
The opportunity for design practitioners to influence those definitions may never be more available than it is in this moment. At the same time, there is an honest complication embedded in that statement. The argument for design’s seat at the table assumes there are designers prepared and willing to take it.
In seven years working in this space, the number of designers actively pursuing defense work has felt remarkably small. Not because the work lacks meaning. I would argue the opposite. But the design community has historically gravitated toward different kinds of industries, different kinds of products, and different ideas about where design effort should create value.
Opening the seat is necessary. Finding people willing to fill it is a different challenge entirely.
Still, for the first time in my career, I can point directly to official policy language beginning to reflect many of the arguments design practitioners have been making for years. The perceived value of design is evolving inside the defense industry, and what happens next may determine whether this reform becomes procedural change or genuine transformation.
Further Reading
Modernizing Defense Acquisitions and Spurring Innovation in the Defense Industrial Base — The White House
Defense Department Acquisition Transformation Strategy — USNI News
The Army’s 2025 Acquisition Reforms — U.S. Army Acquisition Support Center
Talent Development to Unleash the DoD Acquisition Workforce — DefenseScoop
Defense Acquisition Reform: Persistent Challenges — U.S. GAO
UX Research Methods Overview — Nielsen Norman Group