Public:ARHUDFM Features Summary

From Wiki Furtherium


==Live-Synthetic Training Environment (L-STE) Integration==
{{Main Article|[[Synthetic Training Environment]]}}
*connectivity with STE One World Terrain (OWT), Bohemia Interactive Simulations (BISim)
{{#lst:Intelligence_Surveillance_and_Reconnaissance_Systems|L-STE-Abstract}}
*mission preparation, mission rehearsal, mission analysis, mission debriefing
*C4I training (Cross-Domain Interaction, coordinating ground maneuver with fast air)
*training courses for the C4I staff with varied and customizable training paths
*support for the functional and operational integration of heterogeneous tools (existing tank training simulators, VR tank and combat vehicle simulators with a camera mounted on a robot, flight simulators, etc.)
*training and expert certification


==ISR Systems Integration==

Revision as of 09:41, 3 September 2024

Furtherium UG / Furtherium Inc.

Whitepaper for experts of the U.S. Army Futures Command and the Cyber Innovation Hub der Bundeswehr (CIHBw)

Description of the application, features, and main characteristics of the upcoming electronic device ARHUDFM: Augmented Reality Head-Up Display Fullface Mask


Revised on August 18, 2024

Notes:

  1. This document covers only application and technical issues. Answers to questions about development, distribution, business model, cost, unit economics, competitors, and other topics can be found in the public FAQ section of our Wiki at https://wiki.furtherium.com
  2. This is the printed version of the electronic document located at https://wiki.furtherium.com/wiki/Public:ARHUDFM_Features_Summary

Please contact Basil Boluk at basil.boluk@furtherium.com if you have any questions or comments.

Furtherium is listed on the Vulcan defense innovation technology scouting platform, sponsored by USSOCOM, as a fully networked solution on GovCloud (US).

DUNS: 030938614

CAGE: 9AMZ2

Abstract

Further information: Public:ARHUDFM Manifesto, Public:Graphical User Interface, Public:Applications, Public:DoD_Pains

The Augmented Reality Head-Up Display Fullface Mask (ARHUDFM) is a complex device that is not easy to understand at first glance, because there are few relevant examples to compare it with. We therefore suggest that you first read the information on the Public:ARHUDFM Manifesto page at the link above.

This page describes the structure and contents of the GUI (Graphical User Interface) used in the ARHUDFM device.

For a brief description of the functions of each application, please see the Public:Applications page.

Introduction

At present, the armed and security forces specialize on the basis of acquired competencies and skills as well as the equipment used. For ISR tasks, however, it is very important to consider the human factor and the unexpected: often a specialist with the right qualifications and the right equipment will not be in the right place at the right time. Weather, terrain, and remoteness create barriers to early detection and identification of threats, and longer-range threat-detection tools, including satellites and mechanized stations, have limitations of their own. Each ARHUDFM user, by contrast, acquires previously unattainable capabilities: local day and night visual surveillance with photo and video transmission; transmission of coordinates to his own command and simultaneously, directly, to other domain commands for long-range precision targeting; local radio-signal reconnaissance and direction finding (direction and range of a radio-signal or jamming source); detection of most aircraft types, including small UAVs; detection of artillery crews and force concentrations, mine traps, and minefields; and other ISR tasks.

Reconnaissance, tracking, and detection capabilities are greatly enhanced by the active introduction of miniaturized surveillance drones and robots (land, waterborne, and aerial), from which data can be obtained only over short distances, including in automatic mode and with inter-domain one-to-many and many-to-many communication on different frequency bands. For example, an assault team can correct the work of helicopter and fighter aviation (fast air) and artillery units.

Another of our developments can play a great role in escorting various tactical operations: the Intelligence Surveillance Reconnaissance Airship Drone with Radio Relay (ISRADRR), an aerostatically unloaded drone operating at 900-5,500 m altitude. It carries a payload of tropospheric VHF/UHF communication equipment, transverters for frequency conversion of radio waves, and independent radio relay and radar equipment (active and passive radars). Its main advantages are cost and energy efficiency, long continuous operation while holding a geostationary position, the ability to evade quickly and maneuver under threat, and low cost compared with any other airborne platform. It matters here because it forms inexpensive, stable communication channels for VHF/UHF radio users (including ARHUDFM) with omnidirectional antennas, not only beyond line of sight but also over long-distance links of tens of thousands of kilometers. This is a separate large topic that we do not consider in this document.

It should be clarified that each user himself is not only an operator, but also a sensor, collecting and analyzing information from the battlefield. Aggregated information in a continuous flow should go to analytical centers for subsequent analysis of big data using artificial intelligence. All major military-technical corporations and military-engineering centers, as well as the laboratories of many universities, are developing in this direction. Below we give some examples that are available in the public media.

Electronic warfare is a fight for control of the spectrum, which militaries use for situational awareness, communications, weapons guidance and more. This contest is increasingly important as more advanced technologies are deployed on the battlefield and troops try to minimize their signatures to avoid detection. This also means sifting through all the noise in the congested electromagnetic spectrum to understand and prioritize specific targets.

Electronic warfare and battlefield management systems based on artificial intelligence have played a major role in recent years and an even greater role in the years to come. “We’re going to be designed to deliver long-range sense and effect, integrated SIGINT, EW and radio frequency-enabled cyber. But we’re going to do it at longer range and at echelons that do not have that capability organically at this time: division, corps and theater,” Maj. Joseph Fink, also an assistant product manager with the terrestrial spectrum warfare portfolio, said at the same event. “We also have different target sets. We have different signals of interest at each echelon. And those are being developed in real time.” [1]

And that’s what really lays the groundwork for the Army to be able to actually achieve these programs as part of a family of systems. Moreover, the system will connect with other reconnaissance systems in an attempt to shorten the sensor-to-shooter timeline, which involves rapidly delivering sensitive data from sensors to the platforms or individuals who take action. Modernization of the Army’s networks and underlying computer infrastructure is among the service’s most pressing priorities.

But combat ISR missions are not the only priorities in the development of the ARHUDFM. The device also has protective, auxiliary, communication, and organizational features. In creating the concept, we proceeded from the immediate needs of human systems in the Army and Navy: first of all defense, counter-terrorist operations, and security, but also civilian rescue and improving the effectiveness of medical personnel. This development will be useful to all professionals outside the office. We are creating a unique platform at an affordable cost to make these technologies widely usable. Of course, for unique or sophisticated purposes we will work in partnership, integrate with other technologies, and create interfaces so that other developers can integrate with our system.

Lessons from Ukraine, Middle East and potential fights against China and Russia

The U.S. Defense Department is preparing for potential fights against China and Russia. The two world powers have constructed anti-access and area-denial infrastructure in an attempt to counter U.S. strengths and keep at bay forces or weapons that could overwhelm them. The growth of armaments and the concentration of technology in China's military sector concern most experts. Open confrontation between NATO and Russia is possible, but such a confrontation with China is even more likely. The experience of the war in Ukraine has shown that only a highly organized, well-trained, and well-equipped military, together with effective cross-domain cooperation and flexible tactics based on accurate intelligence, can find and exploit enemy vulnerabilities with maximum damage to the enemy. Just as importantly, this approach significantly reduces personnel and equipment losses.

A prime example is Ukraine, where even “six months ago, the environment looked very different than it looks now,” said Mark Kitz (Program Executive Officer, PEO IEW&S). Both cyberwarfare and electronic warfare have played roles in the bloody Russia-Ukraine conflict, leaving government websites paralyzed, command-and-control methods jeopardized, and GPS signals jammed. [2]

Given the Army's need for flexibility and readiness, reinforced by the philosophy known as multi-domain operations, the U.S. Department of Defense does not expect to purchase large quantities of hard equipment. Instead, experts believe that small batches of upgradeable and modifiable kit will be the right choice because the threat will be constantly changing. At the same time, systems are needed that will adapt quickly to changes and, on the other hand, will not require a significant investment. Thus, the ARHUDFM's useful life of 4 years and the SaaS model of providing software on a subscription basis is, in our opinion, the optimal strategy that fits the current environment and the needs of the NATO armed forces.

“We have got to be able to have systems or capabilities that can adapt,” Kitz said. “What we’re trying to understand in these prototyping activities is: How can we get to an adaptable system?”

Main Features

Field of View (FOV)

Head-Up Display (HUD)

  • effect of total reflection, without glowing holographic waveguide
  • FOV 105°h and 33°v
  • no vergence-accommodation conflict
  • DLP technology displays smoother (4K UHD) images without jitter, with perfect geometry, superior grayscale linearity, higher contrast, and a wider color gamut
  • dynamically changing transparency of the screen areas
Optical scheme using dielectric mirror, refraction and reflection coatings and films

With our ARHUDFM product, we have found an elegant solution to the problem of dimming parts of the screen to display contrasting images during the day and in artificial light, so that the user can constantly view his surroundings without degradation, even in bright sunlight - for example, thermal images or video from a digital spotting device, drone, robot, other users' cameras, or cameras mounted on armored vehicles.

  1. Inner lens, outer surface. The outer surface of the inner lens carries a high-reflection (HR) layer - a dielectric (Bragg mirror) design - to better reflect the projector rays: the azure wave at 490-520 nm[3] (#0087BD), the yellow wave at 575-585 nm (#FFD300), and the red wave at 680-700 nm (#C40233);
  2. Inner lens, inner surface. 58 dielectrically separated screen sections - Fading Pads (Carbon Nanotube Composite Films with Switchable Transparency) - are applied here; when an electrical potential is applied, they produce a complete blackout for better reproduction of the full-color image;
  3. Outer lens, inner surface, top layer. A film layer providing 99.9% UV protection.
  4. Fading pad traces.
    Outer lens, inner surface, bottom layer. An IR filter with 99.99% protection against IR laser detector emissions (700-900 nm waves) used against optical sights, binoculars, night vision goggles, rangefinders, camera lenses, and Electro-Optical and Infrared sensors[4][5][6][7][8].
  5. Outer lens, outer surface, bottom layer. An anti-reflective (AR) coating significantly improves image contrast and reduces light glare (a demasking effect).
  6. Outer lens, outer surface, top layer. An anti-scratch (AS) coating (Diamond-Like Carbon - DLC).

Optical multilayer nanocoatings are produced using a magnetron sputtering machine (PVD).
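As a rough illustration of the dielectric (Bragg) mirror design described above, the sketch below computes quarter-wave layer thicknesses for the three reflection bands. The refractive indices (TiO2/SiO2-like values) are illustrative assumptions, not ARHUDFM material specifications.

```python
# Sketch: quarter-wave layer thicknesses for a dielectric (Bragg) mirror.
# Material indices below are generic high/low values, not ARHUDFM specs.

def quarter_wave_thickness_nm(center_wavelength_nm: float, refractive_index: float) -> float:
    """Physical thickness of a quarter-wave optical layer: t = lambda0 / (4 n)."""
    return center_wavelength_nm / (4.0 * refractive_index)

# Reflection bands named in the text: azure ~505 nm, yellow ~580 nm, red ~690 nm.
for band, wl in [("azure", 505.0), ("yellow", 580.0), ("red", 690.0)]:
    t_high = quarter_wave_thickness_nm(wl, 2.40)  # high-index layer (e.g. TiO2)
    t_low = quarter_wave_thickness_nm(wl, 1.46)   # low-index layer (e.g. SiO2)
    print(f"{band}: high-index {t_high:.1f} nm, low-index {t_low:.1f} nm per layer")
```

Stacks of such alternating pairs are exactly what magnetron sputtering (PVD) deposits in practice; the reflectance band centers on the design wavelength of each pair.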

Hands-Free Control

  • voice control, inaudible from outside (via the mask's built-in microphone)
  • speech-to-text and voice playback
  • hand tracking
  • joystick and buttons

The main problems addressed: the effect on concentration when performing multiple simultaneous actions, and the fact that both hands are occupied controlling weapons and technical equipment.

Power Capacity and Active Cooling System

  • With the ARHUDFM mask, we have thought of this: the non-removable battery is a 12-cell, 16.8 V pack rated 16,500 mAh, or 277 Wh at 14.8-16.8 V (LiMgCoAl 26650 cells, 3.7 V / 5,500 mAh), which is equivalent to a runtime of over 7 hours. In special cases, when the continuous operating life of the device must be extended, the user can connect a power bank (4x12, max. 48 LiMgCoAl 26650 3.7 V / 5,500 mAh cells, ~1,065.6 Wh, 9.9 lbs / 4.6 kg) carried in the backpack without removing it, simply by reaching into the pocket and inserting the cable into the connector on the back of the mask.
    • For example: Apple MacBook Pro M1 chip based - 8-core CPU with 4 perform­ance cores and 4 efficiency cores, 8-core GPU, 16-core Neural Engine, 49.9‑watt‑hour lithium‑polymer battery[9], 7-10 hours at medium duty, 3-4 hours at maximum duty;
    • For example: Apple MacBook Pro M3 chip based - 8-core CPU with 4 performance cores and 4 efficiency cores, 10-core GPU, 16-core Neural Engine, 66.5-watt‑hour lithium‑polymer battery[10], 9-12 hours at medium duty, 5-6 hours at maximum duty;
    • For example: ARHUDFM Model One, Intel Core i7-14700HX, 20-core CPU (8 P-cores at 5.2 GHz, 12 E-cores at 3.7 GHz), 32-core GPU, 128-core NPU (Loihi 2), Gaudi 3 AI Accelerator (optional, 1 per squad or 1 per platoon), 277.2-watt-hour lithium-polymer battery, 7-10 hours at medium duty, 3-5 hours at maximum duty; with the power bank, 35-72 hours at medium duty, 15-28 hours at maximum duty;
  • Another problem is cooling the electronic components inside a sealed case, especially the processors (CPU, NPU), the graphics processing unit (GPU), the RAM chips, and the battery. Air cooling breaks the seal, letting in moisture and dust, and is therefore impossible in a sealed design; liquid cooling is impractical for compact wearable devices. Using aluminum for the outer casing is also ineffective: direct sunlight heats the casing until thermal protection kicks in and the processor throttles down to a complete shutdown, and an aluminum housing strengthens the user's signature to enemy radio detection equipment. We solve the problem with a hybrid circuit consisting of a compact active cooler and heatsink inside and an efficient heatsink outside, connected in a closed loop with digital control.
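The battery figures above can be sanity-checked with simple arithmetic. The sketch below assumes a 4s3p layout of 3.7 V / 5,500 mAh cells (inferred from the stated 16.8 V / 16,500 mAh figures) and an invented 35 W average draw; neither the cell topology nor the power draw is a confirmed ARHUDFM specification.

```python
# Sketch: back-of-envelope pack energy and runtime for the mask battery.
# The 4s3p layout and 35 W average draw are assumptions for illustration.

def pack_energy_wh(series: int, parallel: int, cell_v: float, cell_ah: float) -> float:
    """Pack energy in watt-hours for a series x parallel cell arrangement."""
    return series * cell_v * parallel * cell_ah

def runtime_h(energy_wh: float, avg_draw_w: float) -> float:
    """Runtime in hours at a constant average power draw."""
    return energy_wh / avg_draw_w

nominal = pack_energy_wh(4, 3, 3.7, 5.5)      # 244.2 Wh at 3.7 V nominal
full_charge = pack_energy_wh(4, 3, 4.2, 5.5)  # 277.2 Wh at 4.2 V full charge
print(f"nominal {nominal:.1f} Wh, full-charge {full_charge:.1f} Wh")
print(f"~{runtime_h(nominal, 35.0):.1f} h at an assumed 35 W average draw")
```

The full-charge figure reproduces the 277 Wh quoted in the text, and 244 Wh at 35 W gives roughly the "over 7 hours" runtime claimed.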

Situational Awareness (SA)

Further information: Night Vision, Computer Vision Control

  • optical and digital zoom up to 24x
  • stereo camera and rangefinding (without using an unmasking laser rangefinder)
  • rear view camera (optional)
  • HD, HDR, SWIR, LWIR multi sensor cameras
  • drone, robot, and external cameras view (other viewpoints (POV), other users' camera view, digital aiming device camera view)
  • night vision (SWIR, passive-only terrain perception, with no IR source visible to others) - an IR source can be mounted separately (Picatinny rail) so that it does not unmask the user
  • thermal image vision (LWIR, ambient and object temperature monitoring)
  • mixed vision (HD / HDR / IR / SWIR cam + LWIR thermal image cam + LiDAR (optional) / sonar dots map (optional), silhouette highlighting - moving, different ambient temperature, dimensions)
  • synthetic day / twilight / night vision (used for a clear display, allowing the user to focus on decision making instead of recognizing fuzzy images)
  • no-visibility navigation (LiDAR and sensors separately)
  • Computer Vision detection, recognition, and location
  • Computer Audition detection, recognition, and location (incl. GFL - Gunfire Locator)
  • Identification Friend or Foe (IFF), incl. 360° navigation grid
  • vitals body sensors data (oxygen level, heart rate, body temperature, and hydration level sensor data-driven monitoring; also a view of other users' data, incl. health, fitness, fatigue, sickness, fall monitoring, and hazmat exposure)
  • key monitor indicators and images before the eyes

The tools used today are not always at one's fingertips. In addition, some of them have significant size and weight and occupy useful volume in the outfit.

Communication (Comms)

  • Cross-Domain Interaction (CDI), Multi-Domain Operations (MDO), Combined Joint All-Domain Command & Control (CJADC2): one-to-many and many-to-many communication solutions
  • voice call, inaudible from outside (via the mask's built-in MEMS microphone)
  • text messages (chat), including speech-to-text and voice playback and auto-translation: important for reducing traffic and the number and duration of communication sessions
  • icon messages (instant Icon Messaging System): warnings, short commands, rapid comms
  • image exchange (images, videos, screenshots, auto-confirmation of completion)
  • videoconferencing (mission briefing and debriefing, virtual mentor)
  • merging communication channels without PTT button
  • LOS / BLOS stealth multi-band communications (highly-directional antenna, Digital Beamforming, dynamic Tx power, hopping frequencies)

The main problems: lack of frequencies, congested channels, communication quality, radio interference, lack of line of sight and repeatedly reradiated signal, lack of cross-domain communication without intermediaries one-to-many and many-to-many.
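One way to picture the "hopping frequencies" item above is a keyed, time-slotted channel schedule that both ends of a link compute independently, so no hop plan ever travels over the air. This is a generic FHSS sketch; the channel grid, key, and keyed-hash scheme are illustrative assumptions, not the ARHUDFM waveform.

```python
# Sketch: a deterministic frequency-hopping schedule. Both radios derive the
# same channel sequence from a shared key and a synchronized time slot.
# The 225 MHz / 25 kHz channel grid is an assumed example band plan.
import hashlib

CHANNELS_MHZ = [round(225.0 + 0.025 * i, 3) for i in range(200)]

def hop_channel(shared_key: bytes, slot: int) -> float:
    """Map (key, time slot) to a channel via a keyed hash."""
    digest = hashlib.sha256(shared_key + slot.to_bytes(8, "big")).digest()
    index = int.from_bytes(digest[:4], "big") % len(CHANNELS_MHZ)
    return CHANNELS_MHZ[index]

key = b"example-mission-key"
schedule = [hop_channel(key, s) for s in range(5)]
print(schedule)  # both ends compute an identical sequence
```

A real waveform would add guard intervals, slot-timing recovery, and collision handling; the point here is only that synchronized pseudo-random hopping needs no channel coordination traffic.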

Navigation (Nav)

  • easy GNSS positioning (Teseo III which supports GPS, Glonass, Galileo, BeiDou and QZSS) [11]
  • maps and mapping
  • compass
  • route planning and path tracking (Artificial Intelligence)
  • geolocation position exchange (own, friend, and enemy positions, incl. Cross-Domain Interaction)
  • auto-tracking of own positions and execution of tasks (Artificial Intelligence)
  • connectivity with ATAK-MIL system

Intelligence, Surveillance, Reconnaissance (ISR)

  • SIGINT Software Defined Radio (16-bit SDR) VLF, LF, MW, HF, VHF, UHF and L-band 1 kHz to 8 GHz range Scanner-Receiver and Spectrum Analyzer
  • Radio Direction Finding system (RDF) - detects the coordinates of radio-transmitting sources and allows them to be identified, reported, and exchanged
  • passive radar detection (mobile Anti-UAV Defence System)
  • external active phased radar connection (remote mobile radar, UAV equipped with radar)
  • capture of enemy fire positions and missile launch positions (direct and indirect fire positions)
  • motion and flashes detection
  • missile engines heat & smoke traces
  • detection of booby traps (tripwires, radio, infrared), pressure mines (via external ground-penetrating radar - GPR)
  • face and object recognition
  • emotion assessment
  • 19 external MEMS cardioid microphones, "flat/dome" acoustic arrays, Gunfire Locator
  • ambient sound filter (noise filtering and threat detection; UAVs, helicopters, engine sound, human speech identification; direct and reflected sound waves identification; determining the direction in which to search for the sound source)
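As an illustration of the Radio Direction Finding item above, the sketch below triangulates an emitter from two bearings taken at known positions, in a planar approximation. All coordinates and bearings are invented; a fielded RDF system would fuse many noisy bearings (e.g. by least squares) rather than intersect exactly two.

```python
# Sketch: locating an emitter by intersecting two bearing rays.
# Bearings are degrees clockwise from north; positions are (east, north).
import math

def triangulate(p1, brg1_deg, p2, brg2_deg):
    """Intersect two bearing rays from known observer positions."""
    # North-referenced bearing -> (east, north) unit direction.
    d1 = (math.sin(math.radians(brg1_deg)), math.cos(math.radians(brg1_deg)))
    d2 = (math.sin(math.radians(brg2_deg)), math.cos(math.radians(brg2_deg)))
    denom = d1[0] * d2[1] - d1[1] * d2[0]
    if abs(denom) < 1e-9:
        raise ValueError("bearings are parallel; no unique fix")
    t = ((p2[0] - p1[0]) * d2[1] - (p2[1] - p1[1]) * d2[0]) / denom
    return (p1[0] + t * d1[0], p1[1] + t * d1[1])

# Observer A at the origin sees the emitter at 045; observer B, 10 km east, at 315.
fix = triangulate((0.0, 0.0), 45.0, (10.0, 0.0), 315.0)
print(fix)  # approximately (5.0, 5.0) km
```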

Firing Assistance (FA)

  • targets auto-capture and tracking (target lock & track, target detection)
  • ballistic calculation (instant evaluation of external factors, distance, angle, satellite weather data, and multiple modes of aiming correction)
  • speed measurement and lead (preemption) calculation
  • instant aiming (integration with the digital and laser aiming devices, accurate shooting with both eyes open)
  • target hit tracking and corrections
  • easy access to shot history
  • shooter's field of view and especially peripheral areas are not obstructed by the scope
March Reticles MTR

In the ARHUDFM device, we use not only a complex ballistic-calculation algorithm with 19+ parameters but also Machine Learning based on the history of shots in a given area under similar conditions, including by other shooters. We use Computer Vision to track the point of impact and to make the image as clear as possible for the shooter at high magnification (24x), correcting for climatic variables such as mirage. Subtensions are measured in MIL/MOA and remain constant at all magnifications (the number of MOA within one division decreases with magnification: one subdivision equals 4 MOA at 10x, 2 MOA at 20x, and 1 MOA at 40x).

And, of course, we have thought about the shooter's own user experience. The shooter sees the image from the mask's stereo camera along with the reticle and the sight-line point. If the digital aiming device allows it, corrective adjustments (MOA) are made automatically after the target has been captured; otherwise the user sees the calculated values in MOA and in the number of clicks for the specified reticle at the specified magnification. If the sight line is outside the view-area boundary, the system tells him which way to move it and how far (important at high magnification). After a test shot, the shooter sees the corrective adjustments on the same screen and can immediately fire again, which is very important for a moving target. Some shooters do this "by eye", and some lose time making additional corrective adjustments to the scope; the ARHUDFM shows the necessary correction instantly. The shooter must only keep the target in the field of view of the stereo camera at all times. He also retains visual situational awareness around him, including peripheral vision and the rear view, and does not need to cover one eye. This shooting technique is much easier and faster. In addition, there is no need for a second number, a spotter, and the shooter does not become vulnerable during aiming and firing.
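The subtension scaling described above can be stated in a few lines of arithmetic. The sketch assumes a 1/4-MOA click value, which is common on target turrets but is an assumption here, not a confirmed ARHUDFM parameter.

```python
# Sketch of the subtension scaling described in the text: the angular value
# of one reticle division shrinks inversely with magnification
# (4 MOA at 10x, 2 MOA at 20x, 1 MOA at 40x, i.e. a 40 MOA-x constant).

def moa_per_division(magnification: float) -> float:
    """Angular size of one reticle subdivision at a given zoom."""
    return 40.0 / magnification

def clicks_for_correction(correction_moa: float, moa_per_click: float = 0.25) -> int:
    """Turret clicks needed for a given MOA correction (assumed 1/4-MOA clicks)."""
    return round(correction_moa / moa_per_click)

assert moa_per_division(10) == 4.0
assert moa_per_division(20) == 2.0
assert moa_per_division(40) == 1.0
print(clicks_for_correction(2.5))  # 2.5 MOA at 1/4-MOA clicks -> 10 clicks
```

This is the conversion the mask performs for the shooter when it displays a correction both in MOA and in click counts for the selected reticle and magnification.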

The calibrated digital aiming device's spatial gyroscope and the other two sensors must be used to accurately match the sighting line to the camera image of the mask. The mathematical values of the sighting line are transmitted to the computing module of the mask. The aiming algorithm itself sets the necessary corrections and changes the displayed sighting line in the view from the mask cameras (a circle with a crosshair on the outside). We want to highlight that ARHUDFM uses a mathematical way of matching the sighting line and the reticle of the mask, not a graphic "by eye" one like in ENVG-B. The digital zoom uses a 4K, 9K, or 12K stereo camera and the Computer Vision algorithm for pixel smoothing with outline extraction.

After each shot, following analysis of the impact point, one of two modes is used:

  • the sighting line is not readjusted, but the necessary corrections are displayed on the reticle of the mask camera view (for experienced shooters)
  • the sighting line is readjusted each time and the shooter aligns the sighting line with the crosshairs of the reticle of the mask camera view

The image quality of a sighting camera (the FWS-I, for example) is usually not very good (640x480 VOx, 1x FOV 18°h 13.5°v, 3x FOV 6°h 4.5°v, monochrome BW display, BW dot), including those using a thermal imaging module; it is only relevant for close-in cover fire (up to 100 yards) when the mask cameras are not used. When aiming by the view from the sight camera, corrective adjustments must be made on the aiming device itself.

In ARHUDFM we do not use laser and infrared sources so as not to de-mask the shooter's firing position. Measurement of distance, course, angular velocity and angular elevation of the target (SLH) above the true horizon is performed by a stereo camera, gyroscope, accelerometer, and magnetometer with high accuracy during day and night. In the ARHUDFM the measurements are taken with a HD stereo camera and at night with a thermal imaging camera.

During night and day, the thermal imaging camera in mixed view with HD stereo camera image allows, via Computer Vision, to create a point map and an accurate silhouette of objects, even for very distant and camouflaged objects. The accurately calibrated and magnification-independent reticle and silhouettes with different digital zoom indices in the mixed reality view, in addition to the accurate calculation of the stereo camera angles to the line of sight, allow high accuracy of distance and angular velocity measurements. This achieves a significantly lower measurement error compared to existing optical devices.
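The passive stereo ranging described above reduces, in the ideal pinhole model, to Z = f·B/d: range from focal length, camera baseline, and pixel disparity. The baseline, focal length, and disparity below are invented for illustration and are not ARHUDFM specifications.

```python
# Sketch: passive stereo ranging (no laser), pinhole-camera model.
# Geometry values are assumed examples, not device specs.

def stereo_range_m(baseline_m: float, focal_px: float, disparity_px: float) -> float:
    """Range to a matched feature: Z = f * B / d (focal length in pixels)."""
    if disparity_px <= 0:
        raise ValueError("disparity must be positive")
    return focal_px * baseline_m / disparity_px

# Assumed geometry: 0.15 m baseline, 4000 px focal length (4K sensor, narrow FOV).
print(f"{stereo_range_m(0.15, 4000.0, 1.2):.0f} m")  # 1.2 px disparity -> 500 m
```

The formula also shows why sub-pixel disparity estimation (the "pixel smoothing with outline extraction" mentioned earlier) matters: at long range the disparity is a fraction of a pixel, and the range error grows with the square of the distance.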

Electronic Planning and Management Tool (Task Management - TM)

  • quick tasks and auto-confirmation of completion
  • task control and checklists (ongoing, set by me, assisting, following)
  • video and image recording, proof of completion
  • task delegation and status monitoring
  • reporting and auto-tracking
  • connectivity with C4I, EW systems and Battle Management Systems
  • distributed p2p computing when running neural networks for complex AI, CV tasks (integrating the resources of tens and hundreds of users)
  • real-time access to library and reference materials
  • patient records and diagnostic cards (EHR integration)

Respiratory and Facial Protection (RFP)

  • The product can be used without the removable lower mask module, which protects the respiratory system, acoustically isolates the user's voice, and integrates a hands-free drinking system. Through many studies, we have concluded that 100% of users need the lower part of the mask even if the environment is safe to breathe, so as not to disturb others while continuously using voice control and communications.
  • The respiratory protection rating is one of the highest. It uses two interchangeable ULPA filters that provide a 99.9995% level of protection against gases, aerosols, dust, bacteria, and viruses. This is a multipurpose type of filter that has an additional filling of treated activated carbon. The filters are IP67 or better protected against moisture.
  • The total area of the two filters is oversized so that breathing resistance is as low as possible - lower than that of counterparts commercially available from leading manufacturers such as 3M, MSA, and Dräger. It is our intention that the wearer be able to breathe without difficulty even during intense physical exertion.
  • CPAP (continuous positive airway pressure) and DPAP (dynamic positive airway pressure) make breathing easier during stress and heavy exertion as well as elements of non-invasive pulmonary resuscitation.
  • To protect the respiratory system from carbon monoxide (CO), a special modification of the removable filters will be developed.
  • In addition, an adapter will be used to attach a compressed air reducer for SCBA (Self Contained Breathing Apparatus) systems instead of the cover of one of the filters. The second cover, when using the SCBA, blocks the access of contaminated air.
  • The mask protects the face in the area of the eyes, ears, forehead, and back of the head from shards of glass, wood chips, stones, and sandy debris, which can have high kinetic energy and cause superficial and penetrating wounds to the face and head. Together with the lower respiratory protection module, the face is fully protected. We have reinforced the outer layer of the mask with a 2.8 mm thick aramid fiber (Kevlar) composite layer. This significantly reduces the kinetic energy of fragments in flight while increasing the weight of the product only insignificantly. The front and side parts of the head are additionally protected; the back of the head is quite well protected by a ballistic helmet.
  • The problem of lens fogging, which for a period of tens of seconds to several minutes completely deprives the user of optical situational awareness, is solved as follows. The mask has two visor lenses with an air insulation barrier between them and an elastic seal around the edges that prevents air exchange and the ingress of water vapor. When the mask is used together with the lower part, another seal around the face contour prevents air access to the inner lens surface. Together they form an inert system of staggered temperature transitions from the face around the eyes to the inner lens and then to the outer lens, separated by two insulated air chambers. When moving from a cooler to a warmer environment, no condensation forms on the lens surface, because the lens system retains its thermal inertia longer and does not experience a drastic temperature difference. In the cold, exhaled warm air is led away through the insulating skirt to the exhaust valve. In fog, condensation settles on the cooler surfaces, but since the lens system is much more thermally inert than, for example, the lenses of typical eyeglasses, the outer lens does not cool enough for a dew point to form on its outer surface. The physical principle of counteracting condensation is always the same: at the temperature contrast between the body and the outside air, the dew point must fall within the thermally insulating barrier, which itself has a water and vapor barrier preventing the penetration of water vapor.
  • The air space around the eyes is also ventilated when the lower part of the mask is used. The air pipes that connect the filters and the obturation area have technological openings with elastic plugs. During inhalation, purified air moves to the obturation area and, by the Bernoulli effect, some air is also drawn from the sealed area around the eyes, where a slight vacuum occurs. The outlet valve prevents outside air from entering and bypassing the filter. The obturation-area inlet valves prevent air from moving back in during exhalation. During the pause between breaths, the pressure in the air chamber around the eyes is balanced through the connection to the filter chambers.
  • Reduced visibility to thermal imaging cameras in the infrared spectrum. The mask shields the open areas of facial skin from thermal observation, reducing the temperature contrast that a thermal imaging camera records between surrounding objects and body parts. As is well known, the face is the warmest part of the human body.
  • Technological openings are provided in the headphone area so that the person does not lose acoustic awareness when using the mask. In the back of the mask, there are technological openings for ventilation of the electronic components with the airflow direction from top to bottom. This provides additional active ventilation of the user's under-helmet space and, on the other hand, directs warm air to the neck area, which also provides comfort in use, especially during the cooler seasons.
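The anti-fogging argument above rests on keeping the dew point inside the insulating barrier rather than on a lens surface. A minimal sketch of the underlying calculation, using the standard Magnus approximation; the temperature and humidity values are illustrative, not measured ARHUDFM figures.

```python
# Sketch: dew-point estimate via the Magnus approximation.
# b and c are standard Magnus coefficients; inputs are example values.
import math

def dew_point_c(temp_c: float, rel_humidity_pct: float) -> float:
    """Dew point in Celsius for a given air temperature and relative humidity."""
    b, c = 17.62, 243.12
    gamma = math.log(rel_humidity_pct / 100.0) + b * temp_c / (c + temp_c)
    return c * gamma / (b - gamma)

# Exhaled air at roughly 35 C and 90% RH condenses on any surface below ~33 C,
# so the insulated dual-lens stack must keep the inner lens warmer than that.
print(f"dew point: {dew_point_c(35.0, 90.0):.1f} C")
```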

Safety and Protection of User Health

  • The center of mass of the ARHUDFM is centered around the head, so no torque is felt despite the comparable mass of the device. According to military experts, weight reduction is desirable but not critical; the generated torque, however, is a serious disadvantage. For example, night vision devices currently in use also produce high torque (their weight is about the same), but when such a device is not in use it can be folded upward, almost completely eliminating this negative effect.
  • We do not use external cables in the ARHUDFM product; all modules are housed compactly inside the mask, and even the Software Defined Radio module is integrated. The exceptions are the external antenna, the flexible drinking system pipe, and the flexible compressed air container pipe (for firefighters). However, these lines run mostly along the back and do not require hanging loops. Other devices may optionally be connected, but these connections, too, will not critically hinder crawling, shooting in the supine position, or the safe use of ammunition in the front chest pockets.

Vitals Body Sensors and First Aid

  • Sensor locations: shoulder, forearm, wrist, thigh, lower leg, neck, abdomen, chest, mask obturator airway. Wireless communication of sensors with the controlling MCU. Sensor cuffs use flexible electronics technology.
  • Oxygen Level. Skin-contact oxygenation sensors are integrated into the cuffs alongside the other sensors.[12][13][14] This makes it possible to analyze the level of blood flow in specific parts of the body. The system also includes a sensor built into the mask obturator airway for continuous monitoring of carbon dioxide (CO2) concentration, respiration rate (RR), blood oxygen saturation (SpO2), pulse rate (PR), and Minimum Alveolar Concentration (MAC) values. It uses micro-optical technology for reliable, safe, and simple direct flow monitoring.
  • Heart Rate and Blood Pressure. Blood pressure is measured by a special sensor using ultrasound. At the same time, a chemical sensor releases special substances to induce perspiration and then studies the biochemical values of the secreted fluid (it measures caffeine, alcohol and lactic acid levels). A third sensor, which measures blood glucose levels, uses a weak electrical current to analyze the composition of the tissue fluid.[15]
  • Body Temperature. Body temperature sensors are built into the cuffs. The measurements establish the relative level of overload, fatigue, sickness, hypothermia, or overheating for a given user.
  • Hydration Level. The integrated drinking system has a sensor whose readings over time, taking into account physical activity and other vital signs (temperature, respiration rate, pulse rate, oxygen level, glucose level), generate recommendations for the user. Another sensor analyzes the level of exhaled water vapor.
  • Fall Monitor. Electronic sensors such as the accelerometer and gyroscope, which are integrated into the cuffs with other sensors, are part of a separate circuit under microcontroller control. The program analyzes nonspecific changes in body position, head position, vital signs and limb movements. When necessary, it generates a command for an intra-network alarm message requesting evacuation and indicating geo-location. The message contains an encrypted symbolic indication of the probable condition and a time countdown.
  • Hazmat Exposure. Hazardous environment sensors are built into the bottom of the mask, installed in front of the air filter. The user will receive a timely warning of the danger. The symbol message will also be transmitted internally to other users and commanders.
  • First Aid. If blood loss is detected in the limbs, the tourniquet mechanism is automatically actuated by a pre-tensioned spring, which also loosens at necessary intervals.[16]
  • As a promising idea, together with experts from the field of emergency medicine, a technology is being developed for preinjection of microcatheters in the groin, thigh, and forearm before cuffs with sensors are put on. In this case, the cuff design will include an aramid fiber casing containing 120 to 260 ml of saline. When blood loss is detected, the sterile sealed catheter connector will be opened and the pressurized solution will begin to flow to the desired area.
  • All of the above is based on existing serial technology.
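The fall-monitor logic described above (an impact spike followed by immobility, then an alarm request) can be sketched as a simple threshold detector. This is an illustrative sketch only; the thresholds, sampling rate, and function names are assumptions for the example, not values from the actual device.

```python
import math

# Hypothetical thresholds; real values would come from field calibration.
IMPACT_G = 3.0        # acceleration spike suggesting a fall (in g)
STILL_G_DELTA = 0.2   # deviation from 1 g still treated as "no movement"
STILL_SECONDS = 10    # immobility window that triggers the alarm

def magnitude(sample):
    """Acceleration magnitude in g from an (x, y, z) accelerometer sample."""
    x, y, z = sample
    return math.sqrt(x * x + y * y + z * z)

def detect_fall(samples, rate_hz=50):
    """Return the sample index of a suspected fall, or None.

    A fall is flagged when an impact spike is followed by a sustained
    near-motionless period (magnitude ~1 g with little variation), after
    which the real system would send the encrypted evacuation message.
    """
    still_needed = STILL_SECONDS * rate_hz
    for i, s in enumerate(samples):
        if magnitude(s) >= IMPACT_G:
            window = samples[i + 1 : i + 1 + still_needed]
            if len(window) == still_needed and all(
                abs(magnitude(w) - 1.0) <= STILL_G_DELTA for w in window
            ):
                return i
    return None
```

In practice the controlling MCU would fuse gyroscope, head-position, and vital-sign data as described above; the accelerometer-only version here only shows the shape of the decision.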

ARHUDFM ecosystem

C4I, Electronic Warfare, CJADC2 and Battlefield Management Systems Integration

redirect Main Article: WGR (Workgroups) section, Battlefield Management Systems, Workgroups

  • Tactical Intelligence Targeting Access Node (TITAN), Joint Tactical Terminal (JTT), Distributed Common Ground System – Army (DCGS-A) - (PEO IEW&S) [17][18][19]
  • DAF Battle Network (Advanced Battle Management System (ABMS)) - US Air Force CJADC2 System [20][21]
  • Project Overmatch - US Navy CJADC2 System [22]
  • Project Convergence - US Army CJADC2 System [23]
  • Project Maven (Maven Smart System) - NGA System (Palantir) [24][25][26][27], Open DAGIR (Open Data and Applications Government-owned Interoperable Repositories) [28][29], RDER (Rapid Defense Experimentation Reserve) [30]
  • All Source II (ASII) and GOTHAM - (Palantir Technologies). AI based software that allows analysts to parse vast amounts of data and quickly provide leaders the latest battlefield information. The All Source II application is expected to be deployed as part of the Command Post Computing Environment, a tailorable mission command suite operated and maintained by soldiers. “ASII is integrated into, and built to interoperate with, the Command Post Computing Environment, which not only reduces the amount of hardware the Army intelligence community is required to maintain, but it also provides a streamlined way to deliver timely intelligence to the commander,” Col. Christopher Anderson, the project manager for intelligence systems and analytics, said in a statement. [32] AIP System - (Palantir Technologies)[33]
  • Artificial intelligence and cloud-analytics company BigBear.ai won an Army contract to roll out the service’s Global Force Information Management system, meant to provide service leaders with an automated and holistic view of manpower, equipment, training and readiness. The system, referred to as GFIM, will consolidate 14 aging applications and provide real-time data to up to 160,000 Army users. It will also automate a slate of tasks that were previously done manually, such as determining unit readiness. [34]
  • The projects — essentially siblings — are known as the Terrestrial Layer System-Brigade Combat Team, TLS-BCT (Lockheed Martin Co., General Dynamics Mission Systems); the Terrestrial Layer System-Echelons Above Brigade, TLS-EAB (Lockheed Martin Co.); and the Multi-Function Electronic Warfare-Air Large, or MFEW-AL (Lockheed Martin Co.). The U.S. Army is assembling a family of systems to provide soldiers with electronic warfare, signals intelligence and cyber capabilities that they can employ from near and far, on the ground and in the air. “We don’t expect these systems to do their own mission; they have to be used in tandem, whether we’re using an MFEW-Air Large in the air to get range [or] looking at long-range precision fires with [TLS-EAB],” William Utroska, who works with the Army’s Program Executive Office Intelligence, Electronic Warfare and Sensors (PEO IEW&S), told reporters in August at Aberdeen Proving Ground, Maryland. “That’s one of the points I wanted to make upfront,” he said. “We foresee these systems being mutual, used in tandem, in order to provide the commander the best effects.” Together, the three systems are expected to boost soldier situational awareness and efficacy in future fights, potentially against technologically savvy opponents such as China and Russia. [1][35]
  • The BCT system is more tactically focused and must be hosted on a platform common to tactical units with more mobility and the capability to interoperate with Army Mission Command Systems. The TLS-BCT is considered a next-generation platform for the Army that brings the service one step closer to satisfying the joint all-domain operations philosophy, a holistic approach to planning and fighting. The system is designed to boost awareness and offer troops more offensive options that can deny or degrade enemy systems. [36]
  • The EAB is an electromagnetic attack and collection system that integrates cyber, signal intelligence and electronic warfare capabilities. Put together, the system’s various capabilities can be used to give soldiers indications and warnings about their surroundings across thousands of miles. Coupled with the TLS Brigade Combat Team system, soldiers at all levels will have the ability to support Multi-Domain Operations with EW situational awareness and effects options. Compared to the BCT system, the EAB system will be focused on the higher and more strategic echelons of the Army, which require longer ranges and the ability to interoperate with a broader range of adjacent service, coalition and national and strategic partners, according to the Army’s Project Manager Electronic Warfare & Cyber office. [35]
  • Torch-X C4ISR solution - (Elbit Systems) provides a sophisticated and advanced framework for a wide range of complex and large-scale applications. With seamless integration of sensors, effectors and communication systems, and full support for manned and unmanned autonomous platforms, the advanced systems offer enhanced cross-force coordination, strategic planning capabilities, comprehensive battle management, tactical operations, survivability and lethality. Torch-X C4ISR solutions feature AI-based decision support tools to reduce cognitive load at all echelons, facilitating optimal decision-making and planning processes. The solutions are based on Elbit Systems’ E-CIX modular framework, an open architecture design that provides the development environment and can accommodate third party applications for future growth and requirements. [37]
  • WIN BMS - (Elbit Systems) is an essential add-on to virtually any combat vehicle mounted sensor or weapon system forming coordinated battle teams that perform their tasks with optimum precision. WIN BMS supports every requirement of battalion-and-below tactical units, meeting all their operational needs, including direct fire engagement & maneuver, indirect fire support, intelligence and logistics. In addition to its combat networking capabilities, this “super” system of systems provides commanders and crewmen with simplified operational interface, enhanced situational awareness and data communication capabilities. Elbit Systems was chosen by the IMOD to serve as prime contractor for the IDF program of Battle Management Systems for Battalion Combat Teams. [38]
  • Advanced Battle Management System (ABMS) - (Northrop Grumman). Northrop Grumman is developing and integrating leading-edge artificial intelligence and machine learning solutions into large, complex, end-to-end mission systems that are essential to our national security. [39]
  • Integrated Air and Missile Defense Battle Command System (IBCS) - (Northrop Grumman). The Integrated Air and Missile Defense Battle Command System is a revolutionary command-and-control (C2) system developed to deliver a single, unambiguous view of the battlespace. [40]
  • Command and control system TacNet - (Rheinmetall). TacNet provides a common operational picture shared by dismounted troops, tactical vehicles of all types and command posts. To secure successful outcomes, tactical units have to be able to move, shoot and communicate. A state-of-the-art command and weapon engagement system, TacNet supports these capabilities while simultaneously opening up new possibilities. [41]
  • SCORPION combat information system - (ATOS). The challenge of a command system is to deliver in an immediately understandable and exploitable form, in guaranteed time, the information useful to the person who will decide and lead the action (e.g., command orders, evacuation requests). [42]
  • BNET IP SDR - (Rafael). Meeting the most critical challenges of a modern digitized battlefield, BNET creates a seamless, unified network for all land, sea and air units ‒ enabling reliable, high-speed connectivity for data, voice and video on-the-move, with no bandwidth limitations. These capabilities provide the basis for the rapid, precision closing of sensor-to-shooter loops. [43]
  • INTeACT Combat Management System - (BAE Systems). INTeACT provides warship crews with all the information they need to track, analyse and respond to threats in combat, as well as the ability to co-ordinate resources in other operations such as intelligence gathering and humanitarian assistance, both independently or as part of multinational coalitions. Incorporating weapon control systems and a Datalink capability, INTeACT supports planning, tactical picture compilation, decision-making and weapon control. [44]
  • Network Tactical Common Data Link (NTCDL) - (BAE Systems). NTCDL improves data exchange between platforms across land, sea, and air to provide warfighters with an integrated tactical network. Our system goes a step further by allowing data sharing, exchange, transfer, or distribution in real-time across military assets such as aircraft, ships, and unmanned vehicles. By bringing together a greater volume of data, our system allows operators to effectively communicate command and control protocols among forces to maintain an advantage. [45]
  • Joint All-Domain Command and Control (JADC2) - (Raytheon). Defending against peer nations and other evolving threats will depend upon our speed to collect, analyze and use data to inform decision-making. We have to make the best decision possible faster than our adversaries in order to win. Tomorrow’s battles will span every domain simultaneously. JADC2 will provide seamless integration connecting mission-critical military platforms and systems worldwide across all domains – air, land, sea, cyber and space. [46]
  • Leonardo DRS (Leonardo). These tactical mission systems are designed to provide ultra-reliable, on-the-move computing capability with the latest cyber secure technologies for mission-critical applications in harsh environments. In addition, the technology also delivers compatibility with other allied militaries using related Leonardo DRS systems, including the United States (MFoCS-II), United Kingdom (Bowman), Australia, Bahrain, and the United Arab Emirates. [47]
  • 9Land Battle Management System - (SAAB). The 9Land Battle Management System (BMS) is a tactical command and control system from Saab that lets you utilise the full potential of your forces by increasing the level of awareness in all units, at all times. Based on an open integration platform, it enables easy adaptation to change and facilitates swift integration of legacy systems and 3rd-party products. Built as one coherent system, it is scalable to your specific needs. [48]
  • T-BMS Tactical Battlefield Management System - (Thales). Provides automated reporting and graphical orders dissemination. Integrates with secure data communications, IP interfaces for other communications media, and peripheral interfaces for sensor displays. Reconnaissance, Surveillance, and Target Acquisition capabilities. Post-mission debriefing and After Action Review. [49]
  • Battle Management System (BMS), SitaWare - (BWI, Systematic). As a technology shared by the armed forces, the Battle Management System is a central component in the digitization of land-based operations. The system is designed to ensure that the Bundeswehr can exchange information interoperably and seamlessly between command posts, units and allies during operations. The Battle Management System, which is already in use in 30 countries, was tailored to the Army's specific requirements. [50][51]
  • Iris - (Rebellion) delivers a comprehensive view of threat environments by detecting and tracking objects of interest across domains. It fuses multi-source sensor data to maintain a persistent, near-real time view of adversary activities and inform threat deterrence. [52]
  • Helsing - AI-based Defense System. Its platform aims to provide the clearest picture possible in any operating environment. Helsing's software will employ artificial intelligence to combine data from infrared, video, sonar, and radio frequencies gathered from military vehicle sensors to generate a real-time picture of battlefields. [53]

We are researching integrations with these platforms in order to extend them to the personal devices of the majority of troops in all domains, both in tactical operations and on active duty.


Live-Synthetic Training Environment (L-STE) Integration

redirect Main Article: Synthetic Training Environment


ISR Systems Integration

redirect Main Article: DVC (Integrated devices) section, Intelligence Surveillance and Reconnaissance Systems

  • Target and Designation Acquisition Suite (TRIDENT) - (Elbit Systems) [54]
  • Dismounted Joint Fires - (Elbit Systems) [55]
  • Surveillance Sensor Star SAFIRE® 380X and other Airborne Systems - (Teledyne FLIR) [56][57]
  • Land Systems - (Teledyne FLIR) [58]
  • Counter-UAS - (Teledyne FLIR) [59]
  • Maritime Systems - (Teledyne FLIR) [60]
  • CBRNE Detectors - (Teledyne FLIR) [61]
  • Tactical Solutions - (Teledyne FLIR) [62], (Garmin) [63], (Wilcox) [64][65]

We are researching integrations with these platforms in order to extend them to the personal devices of the majority of troops in all domains, both in tactical operations and on active duty.


Communication Channels

  • connectable external handheld VHF/UHF radio
  • Software Defined Radio (SDR) MANET network for VHF/UHF 30-512 MHz Receiver-Transmitter
  • Wi-Fi Direct (p2p network) for integrated computing, up to 200 m / 250 Mbps, IEEE 802.11a, g, or n, 2.4 and 5 GHz
  • Bluetooth 5.3 for protected near-field communications, resistant to radio interference, low signal strength 2.5 mW, 128-bit AES encryption, up to 400 m / 2 Mbps, IEEE 802.15.1, 2.4 GHz
  • WLAN 10.7-12.5 GHz Ku-band[66], up to 120 Mbps download / 20 Mbps upload / 40 ms latency.
  • WLAN / Wi-Fi, up to 100 Mbps download / 10 Mbps upload / 14 ms, IEEE 802.11, 2.4 and 5 GHz
  • LTE / 5G FR1 (410-7125 MHz) and FR2 (24-71 GHz), up to 4.9 Gbps
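To illustrate how a device might arbitrate among the channels listed above, here is a hedged sketch of a link selector that prefers the lowest-latency available link meeting a bandwidth requirement. The link table loosely echoes the figures above, but the selection policy, the latency numbers for LTE/5G, and all names are hypothetical, not the product's actual algorithm.

```python
# Hypothetical link table: (name, max_downlink_mbps, typical_latency_ms).
# Figures approximate the channel list above; 5G latency is an assumption.
LINKS = [
    ("5G FR2", 4900.0, 10),
    ("Ku-band WLAN", 120.0, 40),
    ("Wi-Fi", 100.0, 14),
    ("Wi-Fi Direct", 250.0, 5),
    ("SDR MANET", 1.0, 30),
    ("Bluetooth", 2.0, 8),
]

def pick_link(available, need_mbps, max_latency_ms):
    """Pick the lowest-latency available link that meets the requirement.

    Returns the link name, or None if no available link qualifies
    (in which case a real device might fall back to the SDR MANET
    for voice and icon messages only).
    """
    candidates = [
        (name, mbps, lat)
        for name, mbps, lat in LINKS
        if name in available and mbps >= need_mbps and lat <= max_latency_ms
    ]
    if not candidates:
        return None
    return min(candidates, key=lambda c: c[2])[0]
```

For example, with only Wi-Fi and the SDR MANET in range, a 50 Mbps video stream would be routed over Wi-Fi, while the narrowband MANET would be rejected.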

The Software Defined Radio produces significantly less traffic and interference, with shorter sessions and higher quality and security. It supports broadband one-to-many transmission, is compatible with analog radio transmitters, shows the entire radio panorama at a given location, analyzes the spectrum, identifies weak signals and sources of radio interference (an ISR function), and records radio intercepts thanks to its fast ADC.

SDR Receiver, RF Scanner and Spectrum Analyser

The SDR (Software Defined Radio) module is a single-tuner, wideband, full-featured 16-bit SDR covering the RF spectrum from 2 kHz to 8 GHz. It has three antenna ports: two use SMA connectors and operate across the 2 kHz to 3.2 GHz range, and the third uses a BNC connector operating up to 200 MHz.[67] The module provides enhanced performance with additional and improved pre-selection filters, improved intermodulation performance, a user-selectable DAB notch filter, and more software-selectable attenuation steps. It also introduces a special HDR (High Dynamic Range) mode for reception within selected bands below 2 MHz; HDR mode delivers improved intermodulation performance and fewer spurious responses in those challenging bands.

  • Covers all frequencies from 2 kHz through VLF, LF, MW, HF, VHF, UHF and L-band to 3.2 GHz, with no gaps
  • Passive radar A-, B-, C-, D-bands
  • Receive, monitor and record up to 8 MHz of spectrum at a time
  • Performance below 2 MHz substantially enhanced – improved dynamic range and selectivity
  • Software selectable choice of 3 antenna ports
  • Connected to Software Defined Antenna (SDA) - 3x 64-patch phased array highly-directional Digital Beamforming System
  • Enhanced ability to cope with extremely strong signals
  • External clock input for synchronisation purposes, or connection to GPS reference clock for extra frequency accuracy
  • Excellent dynamic range for challenging reception conditions
  • Calibrated S meter/ RF power and SNR measurement
  • Fast search for active radio frequencies (RF surveying), up to 100 channels per second, including mobile communications, ready-made patterns for precise identification (modulation type, pulse type, power). In group mode, the search tasks are distributed among several users and are not duplicated. Found frequencies are memorized for subsequent periodic scanning. Tasks for scanning previously found frequencies are also distributed among several users. This multiplies the productivity of radio scanning. The scanning tasks are performed automatically in the background and do not require the constant attention of the user. The user can analyze the spectrogram and "waterfall" of frequencies, measure power, control the operation of filters, determine the sources of radio frequency interference (RFI/EMC detection)
  • The most common unattended mode of operation is automatic channel scanning, identification of signal type and waveform (including pattern recognition by means of Computer Vision), localization of the radio transmission source, friend-or-foe identification, and then notification of the user and reporting to the command server.
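The group-mode distribution of scan tasks described above (frequencies split among several users without duplication) could be realized as a deterministic round-robin assignment. This is an illustrative sketch under that assumption; the function name and scheme are not the product's actual algorithm.

```python
def split_scan_tasks(frequencies_khz, users):
    """Distribute scan frequencies across users without duplication.

    Deduplicating and sorting first makes the assignment deterministic,
    so every device in the group computes the same split and no
    frequency is scanned twice; found frequencies would then be
    memorized and re-scanned periodically under the same scheme.
    """
    assignment = {u: [] for u in users}
    for i, f in enumerate(sorted(set(frequencies_khz))):
        assignment[users[i % len(users)]].append(f)
    return assignment
```

Because each device derives the split independently from shared data, no coordination traffic is needed beyond sharing the frequency list itself.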

SDR Receiver-Transmitter (Comms Mode)

  • 30 MHz to 512 MHz frequency range
  • Modulation type: FM, AM, CPM (and other)
  • Channel bandwidth: 25 kHz, 250 kHz
  • Waveforms: LOS FM/AM (STANAG 4204/4205), WF40 (VHF/UHF MANET waveform), HW20 (VHF EPM waveform)
  • Communication encryption: AES, key length up to 2048 bits
  • Multi channel radio system
  • Simultaneous voice and IP data
  • Virtual voice channels
  • True MANET network for VHF/UHF
  • Position reporting system (GPS, Galileo, Glonass)
  • AES based COMSEC/TRANSEC
  • Up to 10 W power output
  • Mission Modules support
  • Backward compatibility with the RF20 radio system

Instant (Icon) Messaging System (IMSG)

The system consists of several groups of iconographic symbols. In many ways it resembles the system of road signs, which are concise in form and are perceived instantly, prompting the appropriate reaction. Such a system is perceived much faster than a text or voice command; an example is the sign language used in special operations units. Unlike a voice call, the session is much shorter and does not overload the communication line. The small data packet is undemanding of data rate, and the digital packet with its delivery notification always has only two states: the message was received or it was not. There are no problems with illegible or ambiguous messages, and there is no need to state who is transmitting. The method is inherently concise and eliminates unnecessary information, relieving both the communication channel and the users' attention. This communication system is particularly important for international contingents and during joint exercises.

The sender of the message, by voice command (the user's voice is acoustically isolated from the surroundings by the obturator airway and the mask body) or by gesture, specifies: the group or individual recipient (by alphanumeric code), a command or command response (see examples below), and clarifying information (course, distance, quantity, reason, code, object name). By default, if no recipient is specified, each subsequent message is addressed to the same recipient as the previous one. Likewise, if no clarifying information is specified, none is transmitted. In this way communication is further accelerated. Pronounced: <user> (<usergroup>) + <command> + <info>

The user who sent the message sees to whom the last message was not delivered. The interface shows the data of the users who have not sent a confirmation message back. The interface also shows the last messages received from users.
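A minimal sketch of the message grammar above, <user> (<usergroup>) + <command> + <info>, with its two default rules (reuse the previous recipient; omit absent info) might look like the following. The recipient codes, token handling, and function name are hypothetical illustrations, not the device's actual protocol.

```python
# Hypothetical recipient codes; the real system would use the unit's
# alphanumeric user/group call signs.
KNOWN_RECIPIENTS = {"A1", "A2", "GRP7"}

def parse_imsg(tokens, last_recipient=None):
    """Parse <recipient?> <command> <info?> into a compact message dict.

    If no recipient is given, the message goes to the previous recipient;
    if no clarifying info is given, the info field is omitted entirely,
    keeping the transmitted packet as small as possible.
    """
    tokens = list(tokens)
    recipient = last_recipient
    if tokens and tokens[0] in KNOWN_RECIPIENTS:
        recipient = tokens.pop(0)
    if not tokens:
        raise ValueError("command is mandatory")
    msg = {"to": recipient, "cmd": tokens.pop(0)}
    if tokens:
        msg["info"] = " ".join(tokens)
    return msg
```

The receiving side would acknowledge with a single delivery-confirmation packet, which is what lets the sender's interface show exactly who has not yet confirmed.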

Some examples of voice commands in English used in the Icon Messaging System ("+i" where clarifying information is provided by default):[68][69][70]

Notification Reply Command Request Report ACP-131 Z-codes
Warning! Threat +i Online Total silence Call <user> Transmitting target coordinates +i ZAL: Alter your wavelength.
Warning! Civilians +i Standby Cancel Report Fault report +i ZAR: This is my .#. request (or reply).
Warning! Friendly forces +i WILCO (will comply) Cleared Picture report Injury report +i ZDE *: Message ... undelivered, *..
Warning! Gas +i CANNOT +i Recleared Requesting command +i Ammo report +i ZEK: No answer is required.
Cell leader <user> Clarify me Disregard Requesting support +i Picture report +i ZDG: Accuracy of following message(s) (or message ...) is doubtful. Correction or confirmation will be forthcoming.
Hurry Up Done Cover me by fire Requesting evacuation +i Monitoring report +i ZEL: This message is correction to message ... (transmitted by ...).
Rally Point +i Check it +i Follow me Requesting medical aid +i Motion report +i ZET: Message ... has been protected and no further action by ... is required.
No info Don't follow me Requesting clarification +i Radio monitoring data report +i ZEV: Request you acknowledge message ... / Message (or message ...) is acknowledged.
No connection +i Dividing +i Requesting comm channel +i Meteo report +i ZIP: Increase power.
Target coordinates confirmed +i Formation +i Requesting target coordinates +i ZOM: Delivery of this message by mail in lieu of broadcast permissible (to ...).
Target destruction confirmed +i Find shelter Requesting ammunition +i ZUE: Affirmative (Yes).
Ready to fire Take up defenses Requesting radio scanning +i ZUG: Negative (No).
Confirm Take position +i Requesting meteo report +i ZUH: Unable to comply.
Negative Change position +i ZUJ: Stand by.
Do surveillance +i ZWF: Incorrect.
Perform tactical reconnaissance +i ZWG: You are correct.
Watch out! Fire ZWH: Try again.
Charge +i ZWI: Answer last question (or question ...).
Apply disguise +i
Freeze
Go here +i
Go away +i
Keep distance +i

User Interface

redirect Further information: Public:Graphical User Interface, Public:Voice User Interface

Menu

The screen is divided into outline menu areas at the top, left and right, and bottom. The menu is not shown in its entirety. Only the part of the menu that is currently being called up is displayed (to reduce unnecessary information on the screen and to avoid obstructing the user's attention). Some types of menus: main menu sections, statuses (on and off status of functions, battery charge, signal level, etc.), widgets.

Widgets

There are 2 types of widgets for running applications. The first type does not require dimming the screen area and is clearly visible to the user at any time of day, in any weather, and in any environment. Typically, these widgets provide the user with key information and should not significantly limit the visibility of the surrounding reality. The other widgets, by contrast, require dimming certain areas of the screen.

Widgets that require a dimmed screen area: compass, communication with other users, navigation grid, friend-or-foe system, ballistic parameters and fire assistant, task management, drone / robot remote control.

Screen fading areas

A total of 58 independent fading areas (fading pads) are provided: in the central area of the screen, one each in the upper left and right halves, and two each in the lower left and right halves. Fading zones are never located in the peripheral vision area or in the upper and lower central areas of the screen. Also, fading zones never restrict the visibility of both eyes at the same time.

Voice Control

The voice control module is based on one of the open-source projects with a large user base. This improves the reliability of the system, since voice control is regarded as preferable to the other input methods.

The LLM is fine-tuned on the local device and adapts to the speech of the individual user, learning to better understand their instructions and to deliver hints and recommendations more clearly.

Hand tracking control and virtual keyboard

Similarly, this module is based on one of the open-source projects. In principle it is less convenient for the user, because input via the virtual keyboard is slower than voice input with speech-to-text conversion. For menu management and operations with widgets, however, this module is also convenient.

Joystick and virtual keyboard control

This is primarily a backup method of control and input in case one of the previous methods malfunctions. The joystick is soft and well calibrated; it can be operated with one finger, without removing gloves, to navigate menus and perform virtual keyboard input.

Ease of Use

ARHUD-OCP-Tactical-Helmet-F

We strongly oppose all external connections and cables, which can cause inconvenience and danger. However, in individual cases, there is no alternative. We have therefore arranged the connection points in such a way that free-hanging cables and flexible tubes are located as far back as possible so that they cannot be accidentally caught by the user. The mask can be connected to:

  • drinking system tube, located in a backpack, through the sealed connector on the right side, near the right-hand filter cover
  • UHF / VHF external antenna cable from a backpack, via BNC connector in the occipital area
  • mask battery charging cable via a rear connector (the internal battery usually lasts 7-10 hours; with a power bank, 35-72 hours at medium sustained duty or 15-28 hours at maximum duty)
  • handheld portable radio through the 4-pin connector on the left earpiece (built-in headset, not required if using built-in UHF / VHF Software Defined Radio)
  • air supply tube to pressure reducer in place of filter cover left or right (usually used by firefighters and rescue workers)
  • external device (SDA, power flashlight, LiDAR, other sensors) via NVG mount on the front of the mask top

We are researching promising options for troop security, such as the use of sensors (detectors) to remotely detect explosive tripwires (booby traps), pressure mines, and improvised explosive devices (IEDs) triggered by proximity. In the future, if successful, we will make such external devices available as an option for attachment via NVG mounts.

The following electronic components are built into the mask for a variety of applications:

  • LED flashlight with an adjustable light level and several modes (including SOS signal)
  • loudspeaker with an adjustable sound level which can be used for voice transmission over longer distances than normal and in noisy environments
  • external microphone amplifies ambient sound to a level audible to the human ear and is part of the Ambient Sound Filter, which allows the software not only to select and suppress certain carrier frequencies but also to generate recommendations for the user
  • internal microphone is built into the obturator airway of the mask and allows the user to use the voice to communicate and for voice control while remaining inaudible to surrounding people
  • headphones have the function of sound reproduction, protect against high noise levels, protect the user's ears and part of the head from outside influences, and have technological openings that allow the user to remain acoustically aware without the use of electronic assistants
  • two HD / IR cameras in front and one rear camera enable recording of video and still images for monitoring, analysis, zooming, and transmission to other users; they also serve Computer Vision services and, in stereo camera mode, support measurements. Using the infrared illumination emitter, the cameras transmit night images visible in the infrared spectrum (night vision)
  • front-facing thermal imaging camera transmits video and static images to the user, showing the temperature of the environment

The mask is designed for long-term wear, so it contains a long-life battery (over 7-10 hours). The comfort of long-term use differentiates the mask from most other devices worn on the head. In particular, the mask is differentiated by its very low breathing resistance while protecting the respiratory organs and very low pressure on the soft tissues of the face and head. To prevent the user from experiencing dehydration, a drinking system is integrated into the mask. The design of the mask protects the user's eyes from direct sunlight and the visor of the mask is tinted.

Mask Characteristics

ARHUD-OCP-Tactical-Helmet-without mask-45R

Despite the apparently high weight of the mask (approx. 1.3 kg or 2.8 lbs; 0.9 kg or 2.0 lbs without the removable lower mask module) together with a Class 2 ballistic helmet (1.5 kg or 3.3 lbs), the weight is evenly distributed across the head and produces low torque, significantly lower than that of modern night vision devices with substantial frontal protrusion. According to military experts, torque is significantly more important than static mass. For comparison, here are the masses of helmet configurations that are widely used today:

  • Class 2 helmet (IIIA NIJ) (1.5 kg or 3.3 lbs) + night vision device (1.2 kg or 2.7 lbs) = 2.7 kg or 6.0 lbs
  • firefighter's helmet (1.5 kg or 3.3 lbs) + flashlight (200 g or 0.4 lbs) + full-face mask (650 g or 1.4 lbs) + compressed air reducer (300 g or 0.7 lbs) + radio headset (145 g or 0.3 lbs) = 2.8 kg or 6.2 lbs
  • Titanium or composite class 2 helmet for special operations forces with visor (2.5-4.5 kg or 5.5-9.9 lbs) and radio headset = 2.7-4.7 kg or 6.0-10.2 lbs
  • motorcycle rider integral helmet (1.3-1.6 kg or 2.9-3.5 lbs) + radio headset (145 g or 0.3 lbs) + aerodynamic pressure at speeds >100 mph (4-9 kg or 8.8-19.8 lbs) = 5.4-10.7 kg or 11.9-23.6 lbs
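The torque argument above can be made concrete with a toy moment calculation. The center-of-gravity offsets below are assumptions for illustration only, not measured values for any of the listed equipment:

```python
G = 9.81  # gravitational acceleration, m/s^2

def neck_torque_nm(items):
    """Pitch torque about the head's pivot: sum of m*g*d for each item,
    where d is the forward (sagittal) offset of the item's CG in metres."""
    return sum(mass_kg * G * offset_m for mass_kg, offset_m in items)

# Assumed CG offsets: a front-mounted night vision device hangs well
# forward of the pivot, while mask + helmet mass is spread around the head.
nvg_setup = [(1.5, 0.02),   # helmet, roughly balanced
             (1.2, 0.12)]   # night vision device, far forward (assumed)
mask_setup = [(1.5, 0.02),  # helmet
              (1.3, 0.03)]  # ARHUDFM, mass distributed (assumed)

print(round(neck_torque_nm(nvg_setup), 2))   # front-heavy configuration
print(round(neck_torque_nm(mask_setup), 2))  # lower torque despite similar mass
```

Even with comparable static mass, the short moment arm of the distributed configuration yields a fraction of the front-heavy torque, which is the point the comparison above is making.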

We have high hygienic requirements for the product. The materials do not cause allergic reactions. Water and dust protection is class IP67 or higher. The electronic components are sealed in a rugged housing and are actively cooled by an aluminum radiator on the top of the housing and on the inner surface at the front. The mask has interchangeable elements: 2 air filters, 2 filter covers, and 1 face contour seal. The mask can be cleaned with water and alcohol sprays on the outside and inside; however, we do not recommend alcohol-containing products for the transparent visor, as they may cause clouding.

We also provide different sets of functionality depending on the user's specialization. For example, some modifications add extra digital communications, advanced vision and hearing capabilities, and digital assistants, while others omit capabilities that are not required. All physical product modifications are similar; they differ in the composition of electronic modules and in performance.

The size of the mask is adjusted to the size and shape of the head using the rotating parts on the back of the mask with a locking mechanism. There are 14 soft, wear-resistant cushioning pads that minimize the perceived pressure on the soft tissues of the forehead and occiput. Their area and geometry are large enough to hold the mask firmly on the head and to distribute the pressure evenly, so that it does not cause discomfort or restrict blood flow to the soft tissues of the head. Four rigid tensioners connect the front and back of the mask. To remove the mask, squeeze the locking latches behind the ears with two fingers. This is easily done one-handed, with or without gloves. Until the latches are released, the mask remains on the head and will not fall off. Once removed, the mask can be folded and reduced in size.

Mask Specific Features

ARHUD-OCP-Tactical-Helmet-L

The mask, like any other mask or goggles, limits the lower peripheral field of view by about 30%. We consider this acceptable and will conduct further research and testing to reduce this effect where possible.

When the projection system is used, a polarized light beam is directed onto the visor screen from the inside at an angle of about 47° to the line of sight (between the outgoing ray and the surface normal). The user's face is only slightly illuminated, which has no effect on the user's visibility in the dark. The source of the beam can be seen by others only at close range in the dark, and only if the user lifts their head significantly. To reduce sunlight shining into the screen (visor), the mask has a solar cover. The visor is made of transparent polyurethane, comparable in strength to polycarbonate. As with any mask or goggles, the visor may produce glare when the sun or artificial light sources are low.

Relative native energy spectra of the long (R), middle (G), and short (B) wavelength primaries of the digital light projector (DLP) have dominant wavelengths of 611, 549, and 470 nm, respectively. The inner lens has a light transmittance of about 70% and is colored in mass to enhance reflection of the corresponding wavelengths of light and to greatly reduce refraction effects. The focal point of the virtual image is formed at a distance of about 530 mm, so the user experiences no adverse accommodation sensation (vergence-accommodation conflict) and can clearly perceive objects located at different distances.
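The 530 mm virtual-image distance can be translated into the optical demands the eye actually experiences, which is what the vergence-accommodation discussion above is about. The 63 mm interpupillary distance below is an assumed population average, not a product parameter:

```python
import math

def accommodation_demand_diopters(distance_m):
    """Accommodative demand of an image plane at the given distance."""
    return 1.0 / distance_m

def vergence_angle_deg(distance_m, ipd_m=0.063):
    """Binocular convergence angle for an assumed average
    interpupillary distance of 63 mm."""
    return math.degrees(2.0 * math.atan((ipd_m / 2.0) / distance_m))

image_plane = 0.530  # virtual image distance of the projection system, m
print(round(accommodation_demand_diopters(image_plane), 2))  # ~1.89 D
print(round(vergence_angle_deg(image_plane), 2))
```

An image plane at roughly arm's length keeps both demands in a comfortable range, so switching gaze between the overlay and real objects at various distances causes little accommodation strain.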

We provide for integration with ballistic helmets and helmets for firefighters and rescuers. The design of the helmets has been developed and will be manufactured by our partners, who currently have extensive experience in manufacturing such products.

Controls:

  • buttons on the left and right earpiece bodies
  • joystick on right earpiece body
  • hand tracking control
  • voice control

User Cases

Military

Further information: Public:User Cases

  1. Army, Marine Corps, Coast Guard, Navy and Air Force operational units[71][72][73][74][75][76]:
    • SA - extends vision capabilities, avoids friendly fire, enables timely wound detection
    • Comms - close coordination within the unit, especially when using the Icon Messaging System at close range and in radio silence mode, at all levels of command and with other domains
    • Nav - precisely following the system's route plan and recommendations, informed by user experience
    • ISR - providing security at longer ranges from enemy forces and assets, including timely detection and destruction of enemy UAVs
    • FA - significantly more accurate target engagement and ammunition savings, more effective air target engagement
    • TM - faster coordination and two-way communication with command at different levels
    • EW&BMS - access to many times more real-time theater information
    • RFP - effective protection of respiratory system, eyes, face from irritants (smoke, powder and chemical gases, dust), better voice recognition during noise, ability to speak in stealth, built-in drinking system
  2. Army and Marine Corps reconnaissance teams, scouts - Researchers at the USAF Research Lab (Calhoun, Draper et al.) found an approximately two-fold increase in the speed at which UAV sensor operators found points of interest using AR. Better execution of SIGINT and other ISR tasks: identifying threats, recognizing objects at a distance, and determining enemy and own coordinates to avoid friendly fire. Easier orientation in unfamiliar terrain, plus tracking and coordinating with each other from a distance.
  3. Special operations forces, rangers, Navy SEALs, airborne forces - More effective and coordinated actions during tactical missions. More room for ammunition. Greater ability to engage in long-range combat and to withdraw stealthily. Greater ability to detect mine traps and minefields.
  4. Patrol and cover units, Patrol services of aircraft, naval, regional forces bases, National Guard operational units, Operational units of military police - Significantly more effective surveillance tasks using drones and robots along with SA capabilities and facial recognition technology.
  5. Combat engineers and EOD sappers - Safer work with and without robots, including work using external ground-penetrating radar (GPR) and high-frequency radar to search for metal objects on the surface, and computer vision to detect signs of a possible threat.
  6. Snipers and fire spotters - More accurate ballistic calculation and aiming. Capable of working without a fire spotter.
  7. Machine gunners and grenade launcher operators (incl. on helicopters, cutters, and navy ships) - More accurate hits and quicker threat removal. Saves ammo.
  8. Artillery crews, Mortar crews - More accurate spotting using drones and operational data from nearby units. More coordinated combat teamwork thanks to seamless communication.
  9. Combat vehicle and tank crews - A combination of all-around cameras and augmented reality can be used on board combat vehicles and tanks as an all-around vision system.
  10. Military personnel of Navy ships combat units - On board naval vessels, AR can allow watchstanders on the bridge to constantly monitor important navigational information as they move around the bridge or perform other tasks. Communication with each member of a combat unit and faster command execution.
  11. Military personnel of Navy ships technical units - With the introduction of IoT systems, sensors will allow equipment to be tracked in real time and routine tasks to be performed on schedule. While machines and mechanisms are in operation, technicians can promptly receive instructions from the knowledge base together with the relevant electronic logbook entries of previous technicians. Large machines are difficult to maintain because they are layered and structured; AR allows people to look through a machine as if by X-ray, pointing them immediately to a problem, including with the device's thermal imaging camera.
  12. Flight deck crew of aircraft carriers - Better coordination and higher safety on deck.
  13. Helicopters pilots and crew - Good all-around visibility, night vision, marker assisted piloting for landing and maneuvering in difficult weather conditions and when threatened.
  14. Paramedics and doctors of medical service, surgeons - Real-time wound information for troops. Communication with each soldier. Easy navigation and location of each soldier. Operational help system and knowledge base reference information. Remote expert advice (audio, video), video demonstration from the front cameras.

ARHUD-user-interface-fire-01 (old prototype)

Civilian services

ARHUD-user-interface-surgery (old prototype)
ARHUD-user-interface-ambulatory (old prototype)
  1. Coast Guard cutter and ship units (ships, cutters, helicopters) - radio direction finding of radio signal sources, computer vision to assist recognition on the water surface.
  2. Customs and Border protection units - *
  3. Immigration and Customs Enforcement units - *
  4. Police, Sheriff's, SWAT, FBI, DEA, ATF, USSS, USMS operational units - *
  5. Police and Sheriff's patrol services - *
  6. Patrol services of private security companies - *
  7. Professional and volunteer firefighting crews - thermal imaging camera and mixed video, more accurate communication, access to reference materials on building layout and construction, and reliable navigation in low visibility using smoke-penetrating light beams.
  8. Civilian paramedics, doctors and surgeons - *

* For civilian services, the custom cases are similar to the previous ones above.

Abbreviations and Conventions

  • AI (Artificial Intelligence) - is intelligence—perceiving, synthesizing, and inferring information—demonstrated by machines. Example tasks in which this is done include speech recognition and understanding human speech, computer vision, recommendation systems, automated decision-making. Machine learning (ML), a fundamental concept of AI research since the field's inception, is the study of computer algorithms that improve automatically through experience. Natural language processing (NLP) allows machines to read and understand human language. Machine perception is the ability to use input from sensors (such as cameras, microphones, wireless signals, and active lidar, sonar, radar, and tactile sensors) to deduce aspects of the world. Applications include speech recognition, facial recognition, and object recognition. Computer vision is the ability to analyze visual input. Many problems in AI (including in reasoning, planning, learning, perception, and robotics) require the agent to operate with incomplete or uncertain information. AI researchers have devised a number of tools to solve these problems using methods from probability theory. Modern neural networks model complex relationships between inputs and outputs and find patterns in data. They can learn continuous functions and even digital logical operations. Deep learning uses several layers of neurons between the network's inputs and outputs. The multiple layers can progressively extract higher-level features from the raw input. Deep learning often uses convolutional neural networks for many or all of its layers. In a convolutional layer, each neuron receives input from only a restricted area of the previous layer called the neuron's receptive field. This can substantially reduce the number of weighted connections between neurons, and creates a hierarchy similar to the organization of the animal visual cortex. 
In a recurrent neural network (RNN) the signal propagates through a layer more than once; thus, an RNN is an example of deep learning. RNNs can be trained by gradient descent; however, long-term gradients that are back-propagated can "vanish" (tend to zero) or "explode" (tend to infinity), which is known as the vanishing gradient problem. The long short-term memory (LSTM) technique prevents this in most cases.
  • AR (Augmented Reality) - is an interactive experience that combines the real world and computer-generated content. Augmented reality is largely synonymous with mixed reality. There is also overlap in terminology with extended reality and computer-mediated reality.
  • ARHUDFM (Augmented Reality Head-Up Display Fullface Mask) - the device whose features are described in this document.
  • Comms (Communication) - is usually defined as the transmission of information between humans or non-living entities such as computers. For human communication, an important distinction is between verbal and non-verbal communication.
  • CDI/CDS (Cross-Domain Interaction / Cross-Domain Solutions) - consists of trusted software components and a secure operating system, running on a secure hardware platform. Data flowing from a high-security domain to a lower-security domain is filtered to ensure that no data is shared in violation of the security policies. Data flowing from a lower-security domain to a higher-security domain is sanitized to prevent malicious data from entering the secure network. The term also refers to the concept the Department of Defense has developed to connect sensors from all branches of the armed forces into a unified network powered by artificial intelligence. These branches include the Air Force, Army, Marine Corps, and Navy, as well as the Space Force.
  • CV (Computer Vision) - includes methods for acquiring, processing, analyzing, and understanding digital images, and extracting high-dimensional data from the real world in order to produce numerical or symbolic information, e.g. in the form of decisions.
  • EP&MT (Electronic Planning and Management Tool / Task Management) - an independent electronic system integrated into larger BMS systems. Directly responsible for preparing and delegating operational tasks within the mission, as well as receiving, monitoring, and analyzing execution results.
  • C4I Systems - acronym for Command, Control, Communications, Computers, and Intelligence.
  • EW&BMS (Electronic Warfare and Battlefield Management Systems) - is any action involving the use of the electromagnetic spectrum (EM spectrum) or directed energy to control the spectrum, attack an enemy, or impede enemy assaults. The purpose of electronic warfare is to deny the opponent the advantage of—and ensure friendly unimpeded access to—the EM spectrum. EW can be applied from air, sea, land, and/or space by crewed and uncrewed systems and can target communication, radar, or other military and civilian assets. A battlefield management system (BMS) is a system meant to integrate information acquisition and processing to enhance command and control of a military unit.
  • FA (Firing Assistance) - an electronic-software system that performs sophisticated ballistic calculations automatically or with the user's assistance, as well as corrections based on hit tracking. It can also be integrated with a digital sight.
  • IFF (Identification Friend or Foe) - is a tool within the broader military action of Combat Identification (CID), the characterization of objects detected in the field of combat sufficiently accurately to support operational decisions. The broadest characterization is that of friend, enemy, neutral, or unknown. CID not only can reduce friendly fire incidents, but also contributes to overall tactical decision-making.
  • IMSG (Instant Messaging System) - short-range radio communication (most often up to 400 m, very close range), also used in radio silence mode to avoid radio interception. The specific feature of this method is the regulation of messages and their transformation into graphic images (icons) for maximally rapid and clear understanding.
  • ISR (Intelligence Surveillance Reconnaissance) - Information is collected on the battlefield through systematic observation by deployed soldiers and a variety of electronic sensors. Surveillance, target acquisition and reconnaissance are methods of obtaining this information. The information is then passed to intelligence personnel for analysis, and then to the commander and their staff for the formulation of battle plans. Intelligence is processed information that is relevant and contributes to an understanding of the ground, and of enemy dispositions and intents.
  • ISRADRR (Intelligence Surveillance Reconnaissance Airship Drone with Radio Relay) - aerostatic unloaded drone operating at 900-5500 m altitude, carrying a payload of equipment for tropospheric communication in the VHF/UHF band, transverters for frequency conversion of radio waves, and independent radio relay and radar equipment (active and passive radars). Its main advantages are cost effectiveness and energy efficiency, long continuous operation while maintaining a geostationary position, the ability to evade quickly and perform complex maneuvers when threatened, and low cost compared with other airborne platforms.
  • L-STE (Live-Synthetic Training Environment) - Future live training solutions must simulate the effects of both direct and indirect fire weapons with an emphasis on simulation realism. Live training systems need to provide the user with an objective assessment of direct and indirect fire engagements, taking into account cumulative equipment damage, personnel casualties, and appropriate actions to treat casualties. Additionally, live training simulations need to be realistic enough for Soldiers to have confidence in their training performance evaluations. Training evaluations for FoF engagements need to be based on accurate hit/kill probabilities, damage effects, and casualty assessments.
  • ML (Machine Learning) - Machine learning algorithms build a model based on sample data, known as training data, in order to make predictions or decisions without being explicitly programmed to do so.
  • p2p (peer-to-peer) - computing or networking is a distributed application architecture that partitions tasks or workloads between peers. Peers are equally privileged, equipotent participants in the network. They are said to form a peer-to-peer network of nodes.
  • RFP (Respiratory and Facial Protection) - features and characteristics that protect the respiratory system and soft tissues of the face.
  • SA (Situational Awareness) - knowing what is going on around you, is critical for good decision making in many environments. It is formally defined as: “the perception of the elements in the environment within a volume of time and space, the comprehension of their meaning, and the projection of their status in the near future”. An alternative definition is that situation awareness is adaptive, externally-directed consciousness that has as its products knowledge about a dynamic task environment and directed action within that environment.
  • SDR (Software Defined Radio) - is a radio communication system where components that have been traditionally implemented in analog hardware (e.g. mixers, filters, amplifiers, modulators/demodulators, detectors, etc.) are instead implemented by means of software on a personal computer or embedded system.
  • SIGINT (Signals intelligence) is intelligence-gathering by interception of signals, whether communications between people (communications intelligence—abbreviated to COMINT) or from electronic signals not directly used in communication (electronic intelligence—abbreviated to ELINT).
  • TM (Task Management) - is the process of managing a task through its life cycle. It involves planning, testing, tracking, and reporting. Effective task management requires managing all aspects of a task, including its status, priority, time, human and resource assignments, recurrence, dependency, notifications, and so on. These can be lumped together broadly into the basic activities of task management.
  • UAV/UAS (Unmanned Aerial Vehicle / Unmanned Aerial Systems) - strike and reconnaissance drones, quadcopters, helicopters, octacopters, fuselage-type cruise drones, two-winged bionic UAV, four-winged ornithopter UAV, aerostatic unloaded drones with variable thrust vector propulsion (ISRADRR). Land, waterborne and aerial drones. Remotely-piloted, fully autonomous, hybrid. [77][78]
  • VHF/UHF (Very High Frequencies / Ultra High Frequencies) - radio communication in the frequency range 30-300 MHz / 300-3000 MHz, includes voice calls, text messages, photo and video, digital information.
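As a rough illustration of the kind of computation behind the FA (Firing Assistance) entry above, here is a minimal point-mass trajectory sketch with an assumed quadratic-drag coefficient. Real ballistic solvers use measured drag functions (e.g. G1/G7 models), wind, and atmosphere data; everything numeric here is illustrative:

```python
import math

def drop_with_drag(v0, distance, drag_k=0.0009, g=9.81, dt=0.001):
    """Point-mass sketch: integrate quadratic drag on the horizontal
    velocity (deceleration = drag_k * v^2, coefficient assumed) and
    gravity until the projectile covers `distance` metres.
    Returns the vertical drop in metres (vy measured downward)."""
    x = drop = 0.0
    vx, vy = v0, 0.0
    while x < distance:
        speed = math.hypot(vx, vy)
        vx -= drag_k * speed * vx * dt  # drag slows the horizontal motion
        vy += g * dt                    # gravity accumulates downward speed
        x += vx * dt
        drop += vy * dt
    return drop

# A 900 m/s round at 500 m drops noticeably more than the vacuum
# estimate g*t^2/2, because drag stretches the time of flight.
print(round(drop_with_drag(900.0, 500.0), 2))
```

The firing solution is then the elevation correction that cancels this drop at the measured range, which is why accurate rangefinding and hit tracking feed directly into FA accuracy.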

FAQ

See also

Public External Sections: Public Wiki Sections: Not-Public Wiki Sections:

Note: Unless otherwise stated, whenever the masculine gender is used, both men and women are included.

See also product details

Hardware Details: Functional Apps Details: Executive Apps Details: Service Apps Details:


References

  1. 1.0 1.1 Colin Demarest, Family affair: US Army pursues synced electronic warfare systems, (defensenews.com, Oct 08, 2022)
  2. Colin Demarest, Jam, spoof and spy: US Army looks to energize electronic warfare, (c4isrnet.com, Oct 10, 2022)
  3. Wikipedia, "Natural Color System"
  4. Military Aerospace, Optical warfare: technology emerges to see the enemy, and to blind him
  5. DARPA, Wearable Laser Detection and Alert System
  6. Laser Focus World, Military researchers ask industry to develop electro-optical imaging sensors for SWaP-constrained uses (Oct 5, 2023)
  7. Transvaro, Sniper Detection Systems
  8. ScienceDirect, Electronic warfare in the optical band: Main features, examples and selected measurement data
  9. Apple, MacBook Air (M1, 2020) - Technical Specifications
  10. Apple, MacBook Air (15-inch, M3, 2024) - Technical Specifications
  11. Francesco Virlinzi, "GNSS Positioning", st.com, (STMicroelectronics International N.V., 39, Chemin du Champ des Filles, 1228 Plan-Les-Ouates – Geneva - Switzerland, February 19, 2019)
  12. Kara Manke, "Skin-like sensor maps blood-oxygen levels anywhere in the body", (Berkeley News, Nov 7, 2018)
  13. Arias Research Group
  14. "Skin-like sensor maps blood-oxygen levels anywhere in the body", (Science Daily, Nov 7, 2018)
  15. Juliane R. Sempionatto, Muyang Lin, Lu Yin, Ernesto De la paz, Kexin Pei, Thitaporn Sonsa-ard, Andre N. de Loyola Silva, Ahmed A. Khorshed, Fangyu Zhang, Nicholas Tostado, Sheng Xu & Joseph Wang, "An epidermal patch for the simultaneous monitoring of haemodynamic and metabolic biomarkers", (Nature, Feb 15, 2021)
  16. "Tourniquet: Stop Bleeding After A Gunshot Wound", (Emergency Life, Nov 25, 2019)
  17. PEO IEW&S, PM IS&A
  18. Breaking Defense, Palantir wins contract for Army TITAN next-gen targeting system (Mar 6, 2024)
  19. Breaking Defense, TITAN Tag
  20. Breaking Defense, In new JADC2 overhaul, Air Force pushes all-encompassing ‘DAF Battle Network’ (Mar. 23, 2023)
  21. Breaking Defense, CJADC2 Tag
  22. Breaking Defense, Navy’s Project Overmatch eager for new software, ahead of schedule, admiral says (Apr. 9, 2024)
  23. Breaking Defense, Project Convergence: The Army’s tech showcase for the future (Apr. 19, 2024)
  24. Wikipedia, Project Maven
  25. Breaking Defense, New contract expands Maven AI’s users ‘from hundreds to thousands’ worldwide, Palantir says (May 30, 2024)
  26. Breaking Defense, Project Maven Tag
  27. Breaking Defense, Open Architecture Tag
  28. Breaking Defense, Open DAGIR: DoD plans July industry day, experiments for new CJADC2 command apps (May 31, 2024)
  29. Defense.gov, CDAO Announces New Approach to Scaling Data, Analytics and AI Capabilities (May 30, 2024)
  30. Breaking Defense, EXCLUSIVE: Pentagon reveals 5 more funded RDER projects, including a top Marine priority (Aug 20, 2024)
  31. PEO IEW&S, PM IS&A
  32. Colin Demarest, Palantir wins contract to help Army quickly process battlefield data, (c4isrnet.com, Oct 19, 2022)
  33. Palantir Technologies, "Palantir AIP Platform", "AIP" (Demo video), (Palantir, Apr. 26, 2023)
  34. Colin Demarest, BigBear.ai delivering US Army digital info system with Palantir’s help, (c4isrnet.com, Sep 30, 2022)
  35. 35.0 35.1 Catherine Buchaniec, Lockheed Martin, General Dynamics win Army electronic warfare contract, (c4isrnet.com, Aug 18, 2022)
  36. Colin Demarest, Lockheed nabs $59 million order for Stryker cyber, electronic warfare suite, (c4isrnet.com, Jul 14, 2022)
  37. Elbit Systems, Torch-X C4ISR
  38. Elbit Systems, Battle Management Systems (BMS)
  39. Northrop Grumman, Advanced Battle Management (ABMS)
  40. YouTube, Delivering Advanced, Survivable Warfighter Capabilities Across All Domains: (IBCS)
  41. Rheinmetall, Command and control system TacNet
  42. ATOS, Data at the heart of the battlefield
  43. Rafael, BNET IP SDR
  44. BAE Systems, INTeACT Combat Management System
  45. BAE Systems, NTCDL
  46. Raytheon, Joint All-Domain Command and Control, JADC2
  47. Leonardo, Leonardo DRS
  48. SAAB, 9Land Battle Management System
  49. Thales, T-BMS Tactical Battlefield Management System
  50. BWI, Battle Management System (BMS)
  51. CIR Bw, Battle Management System - CIRCyber- und Informationsraum digitalisiert
  52. Rebellion, Iris
  53. Helsing, Spotify-Gründer Ek steckt 100 Millionen Euro in Künstliche Intelligenz fürs Militär
  54. Elbit Systems, TRIDENT, YouTube: TRIDENT Target Acquisition Suite
  55. Elbit Systems, Torch-X, Torch-X Fires, YouTube: Dismounted Joint Fires
  56. Teledyne FLIR, Star SAFIRE 380X, YouTube: 380X De-Scintillation Filter | Airborne Law Enforcement (ALE) EO/IR Surveillance
  57. Teledyne FLIR, Airborne Systems
  58. Teledyne FLIR, Land Systems
  59. Teledyne FLIR, Counter-UAS
  60. Teledyne FLIR, Maritime Systems
  61. Teledyne FLIR, CBRNE Detectors
  62. Teledyne FLIR, Tactical Solutions
  63. Garmin, Foretrex 801 / 901
  64. Wilcox, Combat and ILS Systems, YouTube
  65. Wilcox, RAPTAR Xe, YouTube: RAPTAR Xe
  66. Todd E. Humphreys, Peter A. Iannucci, Zacharias M. Komodromos, Andrew M. Graff, Signal Structure of the Starlink Ku-Band Downlink, (Department of Aerospace Engineering and Engineering Mechanics, The University of Texas at Austin; Department of Electrical and Computer Engineering, The University of Texas at Austin; Jan 19, 2023)
  67. SDRplay RSP2 Pro, RSPdx, (SDRplay)
  68. Radio Exchange Glossary: File:Radio Exchange Glossary.pdf
  69. File:Z-Signals.pdf Z Codes – Military and Commercial:
  70. Q- and Z-codes NATO, ACP 131
  71. Wikipedia, "United States Army"
  72. Wikipedia, "United States Army Forces Command"
  73. Wikipedia, "List of United States Marine Corps MOS"
  74. Wikipedia, "List of United States Navy ratings"
  75. Wikipedia, "List of United States Coast Guard ratings"
  76. Wikipedia, "Air Force Specialty Code"
  77. Unmanned aerial vehicle (Wikipedia)
  78. U.S. military UAS groups, (Wikipedia)

External links