Reliable power, information overload, size and weight, and interpreting old-fashioned infantry hand signals top the list of research priorities for digitizing the warfighter.
The first recorded war took place between Sumer and Elam in Mesopotamia around 2700 BC, but archaeological evidence shows a history of violent mass conflict spanning more than 12,000 years, roughly when humans began the transition from hunter-gatherers to farmers and builders.
For that entire history, the brunt of war has fallen on the foot soldier, with little change in the basics of individual ground combat beyond personal weapons, organization, and training. For all but the last few decades, these warriors were generally the least educated, most disposable members of society, sometimes dismissed as "cannon fodder."
That began to change in the U.S. military in the 20th century, as the development of a cadre of well-trained noncommissioned officers provided leadership in the moment-to-moment actions on the battlefield. Little else changed, however, until the 21st century, when infantry began receiving individual communications, navigation and location equipment, computing capability, sensors, personal armor, and precision-guided munitions.