The Pentagon's Joint All-Domain Command and Control data fabric promises to unify sensor feeds across Army, Navy, Air Force, Marines, and Space Force. Here's what actually works and what's still vendor hype.

The Department of Defense is spending billions on Joint All-Domain Command and Control (JADC2) data fabric architecture. The pitch sounds compelling: unify sensor data from all services into a single, real-time operating picture. Let an Army fire control system use Navy radar data. Let Space Force satellite feeds inform Air Force targeting. Break down the stovepipes that have plagued joint operations since Desert Storm.
As someone who's spent years navigating Navy ERP integration challenges and dealing with data systems that barely talk to each other within the same service, I'm watching JADC2 implementation with equal parts hope and skepticism. The technology exists. The operational need is real. But the organizational, political, and technical challenges are massive.
Let's cut through the vendor hype and talk about what's actually happening on the ground.
JADC2 isn't a single system—it's an architectural concept for connecting sensors, decision-makers, and shooters across all domains (air, land, sea, space, cyber). The data fabric is the connective tissue: a mesh of networks, protocols, and standards that let heterogeneous systems share data in real-time.
Think of it as building a universal translator for military systems.
Each service operates its own command and control ecosystem, often with multiple generations of systems running simultaneously. Some use Link 16 tactical data links. Others use proprietary protocols. Many can't share data outside their security enclaves without manual intervention.
JADC2 data fabric aims to create interoperability without forcing every service to rip and replace existing systems. The goal is to build adapters, gateways, and federated data layers that translate between systems.
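As a deliberately simplified illustration of that adapter pattern, here is a minimal Python sketch that translates a made-up legacy radar message into a hypothetical common track schema. The field names, units, and schema are assumptions for illustration only, not any actual JCF, Link 16, or program-of-record format:

```python
from dataclasses import dataclass

# Hypothetical common track schema -- illustrative only, not an actual DoD format.
@dataclass
class CommonTrack:
    track_id: str
    lat_deg: float
    lon_deg: float
    altitude_m: float
    timestamp_utc: float   # seconds since Unix epoch
    source_system: str

def adapt_legacy_track(msg: dict) -> CommonTrack:
    """Gateway adapter: translate a (made-up) legacy radar message into the
    common schema without modifying the legacy system itself."""
    return CommonTrack(
        track_id=f"{msg['site']}-{msg['tn']}",  # namespace track numbers by site
        lat_deg=msg["lat"],
        lon_deg=msg["lon"],
        altitude_m=msg["alt_ft"] * 0.3048,      # this legacy feed reports feet
        timestamp_utc=msg["t"],
        source_system="LEGACY_RADAR_X",
    )

track = adapt_legacy_track({"site": "KADENA", "tn": 4021, "lat": 26.36,
                            "lon": 127.77, "alt_ft": 32000, "t": 1700000000.0})
print(track.track_id, round(track.altitude_m, 1))  # KADENA-4021 9753.6
```

The point of the pattern is that all translation logic lives in the gateway, so the legacy system needs no software change, only the adapter needs re-accreditation when the common schema evolves.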
Here's where vendor pitches meet operational reality.
Military sensor data spans classification levels: Unclassified, Controlled Unclassified Information (CUI), Secret, Top Secret, and various compartmented programs (TS/SCI). The data fabric needs to enforce classification boundaries while enabling cross-domain sharing where authorized.
Current solutions involve cross-domain solutions (CDS) and data diodes—essentially one-way data pipes that prevent sensitive data from flowing to lower classifications. But real-time sensor fusion requires bidirectional queries. How do you let a tactical planner at Secret query TS satellite data without exposing raw TS feeds?
The technical answer: differential privacy, query result sanitization, and federated learning models that share insights without sharing raw data. The practical answer: it's still being figured out in pilot programs, and accreditation timelines are measured in years.
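A toy sketch of the query-sanitization idea: the requester receives only a derived aggregate, never raw records above their clearance. Real cross-domain solutions are accredited guard devices, not application code like this, and the classification model and record shapes below are assumptions:

```python
# Toy classification lattice -- real programs use far richer markings.
LEVELS = {"UNCLASS": 0, "CUI": 1, "SECRET": 2, "TOP_SECRET": 3}

def sanitized_query(records, requester_level):
    """Return raw records only at or below the requester's level, plus a
    derived aggregate that accounts for higher-classified records without
    exposing them."""
    max_level = LEVELS[requester_level]
    visible = [r for r in records if LEVELS[r["classification"]] <= max_level]
    hidden_count = sum(1 for r in records
                       if LEVELS[r["classification"]] > max_level)
    return {
        "raw_records": visible,                               # releasable rows
        "total_contact_count": len(visible) + hidden_count,   # derived insight only
    }

records = [
    {"id": 1, "classification": "SECRET", "pos": (26.3, 127.7)},
    {"id": 2, "classification": "TOP_SECRET", "pos": (26.4, 127.9)},
]
result = sanitized_query(records, "SECRET")
print(result["total_contact_count"], len(result["raw_records"]))  # 2 1
```

The Secret-level planner learns there are two contacts in the area, but the raw TS record never crosses the boundary.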
JADC2 presentations love to show seamless global connectivity. Reality is messier. Tactical networks operate in Denied, Degraded, Intermittent, and Limited (DDIL) bandwidth environments.
A Navy destroyer in the South China Sea might have satellite comms, but bandwidth is constrained and latency is high. An Army forward operating base might rely on tactical radios with kilobits-per-second throughput. Space Force ground stations have reliable connectivity, but cross-domain accreditation limits real-time sharing.
The data fabric needs to work within these constraints rather than assume them away.
I've seen Navy systems where a simple budget report transmission requires manual courier because the data crosses classification boundaries. Now imagine trying to do real-time sensor fusion across services in a contested electromagnetic environment. The technical challenges are significant.
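One common pattern for DDIL links is store-and-forward with strict prioritization: buffer outbound traffic while the link is down, then spend each brief connectivity window on the most urgent messages first. A minimal sketch, with made-up message priorities and byte budgets:

```python
import heapq
import itertools

class StoreAndForwardQueue:
    """Buffer messages during link outages; drain highest-priority first
    when a narrow connectivity window opens."""

    def __init__(self):
        self._heap = []
        self._seq = itertools.count()  # tie-breaker keeps FIFO within a priority

    def enqueue(self, priority: int, payload: bytes):
        # Lower number = more urgent.
        heapq.heappush(self._heap, (priority, next(self._seq), payload))

    def drain(self, budget_bytes: int):
        """Send whatever fits in this window's byte budget, strictly by
        priority: if the most urgent message doesn't fit, nothing else jumps
        the queue."""
        sent = []
        while self._heap and len(self._heap[0][2]) <= budget_bytes:
            _, _, payload = heapq.heappop(self._heap)
            budget_bytes -= len(payload)
            sent.append(payload)
        return sent

q = StoreAndForwardQueue()
q.enqueue(2, b"routine status report, 30 bytes...")
q.enqueue(0, b"track update")
sent = q.drain(budget_bytes=20)
print(sent)  # [b'track update'] -- only the urgent, small message fits
```

The strict-priority rule is a design choice: it guarantees urgent traffic is never starved by opportunistically filling the window with smaller, lower-priority messages.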
Every service—hell, every program office—defines sensor data differently. An Air Force radar track includes different metadata than a Navy surface radar track. GPS coordinates might use different datums. Timestamps might not account for leap seconds consistently.
The Joint Common Foundation (JCF) is attempting to standardize data formats, but adoption is slow. Legacy systems weren't built with federation in mind. Adding adapters to translate proprietary data into common schemas requires software updates, testing, and accreditation—processes that take months or years.
From my experience with Navy ERP systems, even getting two financial systems within the same command to agree on a transaction format is painful. JADC2 is trying to do this across five services, dozens of major systems, and hundreds of legacy platforms.
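The timestamp problem alone is illustrative. GPS time does not apply leap seconds, so it currently runs 18 seconds ahead of UTC (an offset that changes whenever a new leap second is added). A fusion layer has to normalize timescales before correlating tracks; this sketch assumes both feeds already share an epoch and differ only by the leap offset:

```python
# GPS-to-UTC normalization sketch. The 18 s offset is correct for dates after
# the 2016-12-31 leap second; a real system would consult a maintained
# leap-second table rather than a hard-coded constant.
GPS_UTC_LEAP_OFFSET_S = 18

def gps_seconds_to_utc(gps_epoch_seconds: float) -> float:
    """Convert a GPS-timescale timestamp to UTC seconds for track fusion."""
    return gps_epoch_seconds - GPS_UTC_LEAP_OFFSET_S

# Two sensors reporting the same instant in different timescales:
utc_sensor = 1_700_000_000.0
gps_sensor = 1_700_000_018.0
print(gps_seconds_to_utc(gps_sensor) == utc_sensor)  # True
```

An 18-second error is harmless in a budget report and catastrophic when correlating tracks of a target moving at Mach 1, which covers roughly six kilometers in that time.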
The successful implementations I've seen follow a pragmatic path: pick a narrow operational problem, prove value, and expand from there.
JADC2 spending is distributed across service-specific programs and joint efforts. Here's where the major contracts are flowing:
Army: The Integrated Battle Command System (IBCS) is furthest along. Northrop Grumman holds the prime contract, with subcontractors including Raytheon and Lockheed Martin. IBCS demonstrates multi-domain integration by linking Patriot, THAAD, and Sentinel radars into a unified air defense picture.
Navy: The Navy's Integrated Fire Control-Counter Air (NIFC-CA) program connects Aegis destroyers, E-2D Hawkeye aircraft, and SM-6 missiles. Lockheed Martin and Raytheon are major players. The Navy is also investing in Project Overmatch, a data-centric architecture aimed at fleet-wide sensor fusion.
Air Force: The Advanced Battle Management System (ABMS) is the Air Force's contribution to JADC2. Multiple vendors are competing in this space, including L3Harris, Collins Aerospace, and Palantir. The Air Force is taking a modular approach, running frequent experiments to test different architectures.
Joint Programs: DISA (Defense Information Systems Agency) is building network infrastructure to support JADC2, including secure cloud environments and cross-domain solutions. AWS, Microsoft Azure, and Google Cloud are all competing for DoD cloud contracts with IL4/IL5 accreditation.
Edge and AI Vendors: Companies like Palantir, Anduril, and Scale AI are positioning edge computing and AI-driven fusion as enablers for JADC2. Their pitch: let AI pre-process sensor data at the edge, transmit only actionable intelligence over constrained networks.
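The edge pre-processing pitch can be sketched simply: filter and summarize locally, then uplink a compact product instead of raw sensor frames. The field names and threshold below are illustrative assumptions, not any vendor's actual pipeline:

```python
def summarize_for_uplink(detections, min_confidence=0.8):
    """Edge pre-processing sketch: keep only high-confidence detections and
    ship a compact summary over the constrained link, not raw frames."""
    actionable = [d for d in detections if d["confidence"] >= min_confidence]
    return [{"cls": d["cls"],
             "lat": round(d["lat"], 3),    # ~100 m precision is enough to cue
             "lon": round(d["lon"], 3)}
            for d in actionable]

raw = [
    {"cls": "surface_contact", "confidence": 0.93, "lat": 26.3612, "lon": 127.7681},
    {"cls": "clutter",         "confidence": 0.41, "lat": 26.4000, "lon": 127.8000},
]
print(summarize_for_uplink(raw))
# [{'cls': 'surface_contact', 'lat': 26.361, 'lon': 127.768}]
```

A few dozen bytes of summary instead of megabytes of raw imagery is the difference between usable and unusable over a kilobits-per-second tactical radio.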
The total JADC2 investment is hard to quantify because funding is distributed across service budgets, but estimates range from $10-20 billion over the next decade. Whether that's enough depends on how aggressively the DoD tackles legacy system integration.
JADC2 data fabric shares architectural principles with commercial data mesh implementations (domain-owned data, federated access, decentralized governance), but the constraints are wildly different: classification boundaries, DDIL networks, and decades of legacy platforms have no real commercial analog.
The DoD could still learn from those implementations in areas like federated query engines, automated data lineage tracking, and developer-friendly APIs. But the security and legacy constraints mean commercial solutions can't be adopted wholesale.
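For reference, the federated-query idea from commercial data mesh looks roughly like this: each domain keeps its own store, and a thin fabric layer fans a query out and merges results. In a real mesh the fan-out would be authenticated remote API calls; this sketch uses in-memory stores and invented track records:

```python
from concurrent.futures import ThreadPoolExecutor

def query_domain(domain_name, store, predicate):
    """In a real mesh this would be a remote call to a domain-owned API;
    here the 'store' is just a list of records."""
    return [dict(r, domain=domain_name) for r in store if predicate(r)]

def federated_query(domains: dict, predicate):
    """Fan the query out to every domain in parallel and merge the results."""
    with ThreadPoolExecutor() as pool:
        futures = [pool.submit(query_domain, name, store, predicate)
                   for name, store in domains.items()]
        return [row for f in futures for row in f.result()]

domains = {
    "air":     [{"track": "A1", "speed_kts": 450}],
    "surface": [{"track": "S7", "speed_kts": 18}],
}
fast = federated_query(domains, lambda r: r["speed_kts"] > 100)
print(fast)  # [{'track': 'A1', 'speed_kts': 450, 'domain': 'air'}]
```

The key property is that no central copy of the data exists: each domain answers from its own store, which maps cleanly onto the "don't rip and replace" constraint, though adding per-domain classification enforcement is exactly where the commercial pattern stops being sufficient.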
JADC2 is hard enough within U.S. forces. Now add NATO allies, Five Eyes partners, and coalition operations.
Allied sensor sharing requires navigating releasability caveats, classification equivalencies, and national data sovereignty rules.
NATO is developing federated mission networks (FMN) to enable coalition sensor sharing, but progress is slow. The UK's Multi-Domain Integration (MDI) program aims for interoperability with U.S. JADC2, but technical integration is years away.
From my experience supporting Navy operations with allied partners, even simple data exchanges require manual approval chains and sanitization. Automating this in a JADC2 context requires not just technical solutions but diplomatic agreements on data sovereignty and classification equivalency.
Despite the challenges, some JADC2 capabilities are operational:
Army IBCS: The Integrated Battle Command System successfully integrated multiple radar types during operational testing. It's not full JADC2, but it proves that multi-sensor fusion within a single domain (air defense) is achievable.
Navy NIFC-CA: The Navy has demonstrated extended-range engagements where surface ships use targeting data from E-2D aircraft to guide SM-6 missiles beyond radar horizon. This is sensor fusion across platforms in a contested environment.
Project Convergence: The Army's annual Project Convergence exercises bring together joint and allied forces to test data-sharing scenarios. Recent exercises included AI-driven target recognition and automated sensor-to-shooter handoffs.
Link 16 upgrades: Tactical data links like Link 16 are being upgraded to handle higher data rates and new message formats, expanding their role as connective tissue for JADC2.
These successes share common traits: narrow operational scope, mature platforms, and incremental integration rather than attempting everything at once.
The future of JADC2 depends heavily on edge computing—processing sensor data where it's collected rather than backhauling everything to centralized data centers.
Edge deployments enable local sensor fusion, sharply reduced backhaul bandwidth, and continued operation in DDIL environments.
Vendors like Anduril are building ruggedized edge servers designed for forward operating bases. Palantir's Apollo platform aims to manage software deployments across edge nodes with intermittent connectivity. NVIDIA is positioning its Jetson AGX for military edge AI workloads.
But edge computing in military environments faces unique challenges, from hardware ruggedization to the accreditation and delivery of software updates across intermittently connected nodes.
From my work on Navy systems, I know that getting software updates approved and deployed to shipboard systems takes months. Doing this for edge nodes in Army forward operating bases, with DevSecOps automation and continuous authorization, is a major cultural shift.
JADC2's biggest challenge isn't technical—it's organizational.
Building and operating a federated data fabric requires expertise in distributed systems, data engineering, security accreditation, and DevSecOps automation.
Most military units don't have these skills organic to their formations. The services are hiring contractors to fill gaps, but sustainable JADC2 requires building internal capability.
The culture gap is equally significant. JADC2 requires services to share data by default rather than hoarding it. It requires program offices to build to common standards rather than optimizing for their specific requirements. It requires flag officers to prioritize joint interoperability over service-specific capabilities.
These are organizational challenges that can't be solved by technology alone.
If I were advising a program office implementing JADC2, here's what I'd recommend:
Start with high-value, narrow use cases: Don't try to federate all sensor data. Pick a specific operational problem—like integrated air and missile defense—and solve it well.
Build on working systems: IBCS and NIFC-CA prove that multi-sensor fusion is achievable. Replicate their approaches in other domains before attempting cross-domain integration.
Invest in edge AI: Sensor fusion at the edge reduces bandwidth requirements and enables DDIL operations. This is where commercial AI innovation can directly support JADC2.
Automate accreditation: Continuous ATO processes with security controls embedded in infrastructure are non-negotiable for achieving speed at scale.
Train your people: Contractors can build initial capability, but units need organic expertise to operate and evolve the system.
Accept incremental progress: Full JADC2 across all services and all sensors won't happen in this decade. Measure success by operational capability delivered, not by architectural purity.
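The "automate accreditation" recommendation above usually takes the form of policy-as-code: security controls expressed as automated checks that run on every deployment, so the system is continuously re-verified instead of re-accredited by hand. A minimal sketch with made-up control names:

```python
# Policy-as-code sketch: a pipeline gate that re-verifies required security
# controls on every deployment. Control names and values are illustrative,
# not drawn from any actual RMF control baseline.
REQUIRED_CONTROLS = {
    "encryption_at_rest": True,
    "tls_min_version": "1.2",
    "audit_logging": True,
}

def check_controls(deploy_config: dict):
    """Return the list of failed controls; an empty list means the gate passes."""
    return [control for control, required in REQUIRED_CONTROLS.items()
            if deploy_config.get(control) != required]

cfg = {"encryption_at_rest": True, "tls_min_version": "1.0", "audit_logging": True}
print(check_controls(cfg))  # ['tls_min_version']
```

Because the check is code, it runs on every commit and every edge node, which is what makes a continuous ATO plausible at scale where manual paperwork is not.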
JADC2 data fabric is essential for future joint operations, but it's going to be messier and slower than PowerPoint charts suggest. The technology is largely proven in commercial contexts and narrow military applications. The remaining challenges are organizational and cultural, compounded by legacy systems and security constraints.
Successful implementations will come from programs that focus on operational needs, integrate incrementally, and invest in both technology and people. Failed implementations will chase perfect architectures while ignoring practical constraints.
As someone who's wrestled with data integration in Navy financial systems—where the stakes are audits, not combat—I know how hard this is. JADC2 is that problem multiplied by classification levels, service boundaries, and life-or-death operational tempo.
The sensors exist. The networks are improving. The data standards are being written. What remains is the hard work of implementation, accreditation, and cultural change.
If the DoD can pull it off, JADC2 will fundamentally change how joint forces operate. If it stumbles, we'll have spent billions on yet another grand architecture that couldn't overcome organizational inertia.
Place your bets accordingly.
Working on JADC2 integration or multi-domain data challenges? I'm always interested in talking to people solving these problems in the real world rather than conference presentations. Drop me a line.