
How the ACS CyberVAN Works

Figure 1 illustrates the high-level concept of CyberVAN. A simulator such as QualNet or ns-3 runs a simulation of a network, and every node in the simulated network has a corresponding VM outside it. The figure traces the journey of a packet sent from node A to node B. The packet could be sent by any application running on node A's VM to node B's VM, e.g., tactical applications like JBC-P and CPOF, or enterprise applications like web browsers. The packet is forwarded from the VM representing node A to a Layer 2 switch (step 1), and then to the machine hosting the simulator (step 2). There, the CyberVAN software injects the packet into the simulated network at the node representing node A (step 3); from the simulator's perspective, it is as if the packet had originated at that node in the simulation. The packet travels through the simulated network toward the node representing node B (step 4). Upon arrival at that node, the CyberVAN software extracts the packet from the simulator and delivers it via the Layer 2 switch (step 5) to node B (step 6). All of this is completely transparent to the communicating applications on nodes A and B. In our experiments, the latency introduced by this transparent forwarding process was measured at less than 3 microseconds.
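The six-step journey described above can be sketched in code. This is an illustrative trace only; the class and function names below are hypothetical and are not part of the actual CyberVAN software:

```python
from dataclasses import dataclass

@dataclass
class Packet:
    src: str
    dst: str
    payload: bytes
    trace: list  # records each hop, for illustration only

def forward_via_simulation(pkt, sim_route):
    """Trace a packet along CyberVAN's transparent forwarding path
    (hypothetical sketch; sim_route lists intermediate simulated hops)."""
    pkt.trace.append("vm:" + pkt.src)           # originates at node A's VM
    pkt.trace.append("l2-switch")               # step 1: VM -> Layer 2 switch
    pkt.trace.append("sim-host")                # step 2: switch -> simulator host
    pkt.trace.append("inject@sim:" + pkt.src)   # step 3: injected at sim node A
    for hop in sim_route:                       # step 4: traverses simulated network
        pkt.trace.append("sim:" + hop)
    pkt.trace.append("extract@sim:" + pkt.dst)  # extracted at sim node B
    pkt.trace.append("l2-switch")               # step 5: delivered via the switch
    pkt.trace.append("vm:" + pkt.dst)           # step 6: arrives at node B's VM
    return pkt
```

The applications on the two VMs see only the endpoints of this trace; everything between is handled by the testbed.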

Figure 1: CyberVAN transparently forwards packets through the simulated network

Large-Scale Experimentation with CyberVAN’s TimeSync

Using a simulated network in a testbed has the drawback that a high-fidelity simulation of a large network may run slower than real time, causing timeouts in applications running on the hosts. There is thus a need to synchronize time across the hosts and the simulated network. CyberVAN provides a TimeSync capability for this purpose [17]. TimeSync is a suite of technologies that enables tight time synchronization across different testbed components (e.g., VMs, emulated processes, and discrete event simulators). The concept behind TimeSync is to have time ticks from a single time controller in the testbed drive all elements under test, which is particularly useful when the testing environment contains multiple time sources that may deviate from one another.

For example, a large-scale high-fidelity simulation may run slower than real time: if a simulation covering m seconds of simulated time takes n*m seconds of wall-clock time to complete, where n > 1, then its average slow-down factor is n. In general, the rate of time advancement varies widely over the course of a simulation, since it is determined by the number of discrete events the simulator must handle. The idea behind TimeSync is to slow down the VMs whenever the simulation runs slower than real time. To accomplish this, TimeSync provides a sampler that periodically samples the simulation's rate of time advancement and predicts the rate for the next interval, called the "time dilation factor". The time dilation factor is sent to the time controller, which then adjusts both the time value and the rate of time advancement of the components under its control, until the next synchronization event.

Because there is always some error in predicting how fast the simulation will progress during the next synchronization interval, the controller compensates for the prediction error when it adjusts the rate of advancement for the following interval. Our previous studies have shown that in 95% of cases, the prediction error was under 1 ms. Our current implementation of TimeSync is embedded in the Xen hypervisor, so no operating system modifications are needed to enable TimeSync.
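The sampler/controller interaction can be sketched as follows. This is a minimal illustration of the concept under simplifying assumptions (a single controller, instantaneous rate changes); the real TimeSync implementation lives inside the Xen hypervisor, and none of these names are its actual API:

```python
def dilation_factor(sim_advance, wall_advance):
    """Observed rate of simulation-time advancement during a sampling
    interval; e.g. 0.6 means the simulation ran at 60% of real time."""
    return sim_advance / wall_advance

class TimeController:
    """Hypothetical controller that drives the VM-visible clock."""

    def __init__(self):
        self.vm_time = 0.0  # virtual time presented to the VMs
        self.rate = 1.0     # current clock rate applied to the VMs

    def synchronize(self, predicted_rate, actual_sim_time):
        # Compensate for last interval's prediction error by snapping the
        # VM-visible time to the simulation's actual time, then apply the
        # newly predicted time dilation factor until the next sync event.
        self.vm_time = actual_sim_time
        self.rate = predicted_rate

    def tick(self, wall_dt):
        # Advance VM-visible time at the dilated rate.
        self.vm_time += self.rate * wall_dt
        return self.vm_time
```

For instance, if the sampler observed the simulation covering 0.6 s of simulated time in 1.0 s of wall-clock time, the controller would run the VMs at 0.6x speed for the next interval, keeping their clocks aligned with the simulator's.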

Using the TimeSync concept, one can model a much larger network at high fidelity than would be possible if all components had to run in real time. For example, for the CERDEC CRUSHPROOF program, we conducted cyber experiments at the scale of 548 nodes running SRW, NCW, 802.16 (LAW), and Ethernet. The simulation of the 548-node network ran on the JNE (JTRS Network Emulator) in QualNet and was ~1.5 to ~1.8 times slower than real time due to the large network scale. We used TimeSync to slow down execution of the VMs representing the 548 nodes so that their advancement of time matched that of the QualNet simulator.

Use of Real Devices Within the VAN Testbed

Although a simulated network usually provides the required fidelity for experimentation, our experience with real radios has shown the importance of using real devices for uncovering issues that arise from insufficiently detailed models of mobile devices or sensors. Examples of such issues include inaccurate time-response behavior in software-defined radio models and the lack of low-level hardware models for evaluating smartphone Root-of-Trust solutions for tactical cellular networks. CyberVAN's PhyToVir Gateway function currently supports the insertion of a heterogeneous ecosystem of devices into the CyberVAN testbed, including disadvantaged assets such as Android and Nett Warrior handheld devices, USRP2 GNU radios, and sensors, and lets them communicate seamlessly with other devices or VMs over the simulated network. Using our PhyToVir Gateway, we demonstrated 274 nodes running the Army's JBC-P application on CyberVAN, including two companies and Nett Warrior handhelds; the network and background traffic replicated an NIE 12.1 scenario. The 274 nodes and the Nett Warrior devices communicated over a high-fidelity network simulation running SRW and WNW models (provided by the JTRS Program Office) on CyberVAN. We have also demonstrated integration of CyberVAN with the USRP2 testbed at CERDEC.

Network Agility and Situational Awareness

A critical capability for a testbed that combines hybrid emulation with a simulated network is the ability to dynamically reconfigure the simulated network and the host nodes. Here we use the term "reconfiguration" to encompass any type of agility maneuver, such as modification of network, service, and software configuration. Many existing testbeds do not allow the emulated or simulated network to be reconfigured during an experiment, which makes it impossible to experiment with agility capabilities. CyberVAN supports dynamic reconfiguration of the testbed nodes and network: using a standard SNMP interface, an agility application can modify the configuration of the network and hosts.

In addition to reconfiguration, there is also a need to monitor the elements of the experiment for situational awareness, which is critical for cyber detection technologies. We have developed capabilities to instrument the simulated network used with CyberVAN so that it can be monitored using standard SNMP commands, a feature typically not supported in simulated or emulated network environments. We have used these capabilities extensively on programs where we experimented with policy-based management capabilities that monitored the network and dynamically reconfigured it based on policies.
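As a sketch of what SNMP-driven reconfiguration and monitoring might look like, the following builds standard net-snmp CLI invocations (snmpset/snmpget). The agent name and community strings are placeholders and the OIDs come from the standard IF-MIB, not from any CyberVAN-specific MIB:

```python
def snmp_set_cmd(agent, oid, type_code, value, community="private"):
    """Build an snmpset invocation that reconfigures one managed object,
    e.g. an interface's admin status on a simulated node's SNMP agent."""
    return ["snmpset", "-v2c", "-c", community, agent,
            oid, type_code, str(value)]

def snmp_get_cmd(agent, oid, community="public"):
    """Build an snmpget invocation for monitoring (situational awareness)."""
    return ["snmpget", "-v2c", "-c", community, agent, oid]

# Example agility maneuver: administratively disable interface 2 on a
# simulated node (IF-MIB ifAdminStatus.2 = down(2)), then poll its
# operational status. "sim-node-17" is a placeholder agent name.
set_cmd = snmp_set_cmd("sim-node-17", "IF-MIB::ifAdminStatus.2", "i", 2)
get_cmd = snmp_get_cmd("sim-node-17", "IF-MIB::ifOperStatus.2")
```

In practice these commands would be executed (e.g., via `subprocess.run`) against the SNMP agents instrumenting the simulated nodes.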

Benefits of Hybrid Emulation

A critical benefit of hybrid emulation with discrete event simulation is the ability to leverage existing waveform models of military and cellular radios developed for commercial simulators like OPNET and QualNet, which is important for realism: most reference implementations of military waveform models are available only in OPNET or QualNet. CyberVAN currently works in conjunction with OPNET, QualNet, ns-2, or ns-3. Another benefit is that CyberVAN supports high-fidelity hosts with different operating systems running on VMs. This capability is important for cyber experiments, which often need to simulate attacks on specific targets, such as hosts running a specific OS version or specific Windows applications. Further, CyberVAN provides experiment repeatability through the ability to run simulations with the same seeds, thereby also providing a high degree of experiment predictability.
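The repeatability property rests on deterministic seeding of the simulator's random number generators. A minimal, simulator-agnostic illustration in Python (`run_experiment` is a hypothetical stand-in for a stochastic simulation run):

```python
import random

def run_experiment(seed, n_events=5):
    """Hypothetical stochastic experiment: draw a sequence of random
    values (e.g. per-packet jitter or loss draws in a simulated link)
    from a generator initialized with the given seed."""
    rng = random.Random(seed)
    return [rng.random() for _ in range(n_events)]

run_a = run_experiment(seed=42)
run_b = run_experiment(seed=42)  # same seed: identical event sequence
run_c = run_experiment(seed=7)   # different seed: different sequence
```

Two runs with the same seed reproduce the same event sequence exactly, which is what allows an experiment to be replayed for debugging or for comparing defenses under identical conditions.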

References

  1. ACS VAN testbed team, “The 274-node VAN Testbed Demonstration Operating Guide”, Technical Report delivered to U.S. Army CERDEC, December 2012.
  2. ACS VAN testbed team, “The VAN Testbed User Guide”, Technical Report delivered to U.S. Army CERDEC, December 2012.
  3. ACS VAN testbed team, “The VAN Testbed Administration Instructions for Installation”, Technical Report delivered to U.S. Army CERDEC, December 2012.
  4. ACS VAN testbed team, “The VAN Testbed Study Use Cases”, Technical Report delivered to U.S. Army CERDEC, December 2012.
  5. ACS VAN testbed team, “The VAN Testbed Software Design Document”, Technical Report delivered to U.S. Army CERDEC, December 2012.
  6. ACS VAN testbed team, “The VAN Testbed Architecture Document”, Technical Report delivered to U.S. Army CERDEC, December 2012.
  7. Pratik K. Biswas, Constantin Serban, Alex Poylisher, John Lee, Siun-Chuon Mau, Ritu Chadha, Cho-Yu J. Chiang, “An Integrated Testbed for Mobile Ad hoc Networks”, in Proceedings of the 5th International ICST Conference on Testbeds and Research Infrastructures for the Development of Networks and Communities (TRIDENTCOM 2009), Fairfax, Virginia, April 2009.
  8. C. Jason Chiang, Alex Poylisher, Yitzchak Gottlieb and Constantin Serban, “Cyber Testing Tools and Methodologies”, 30th Annual International Test and Evaluation Symposium, 2013, Washington, DC.
  9. C. Jason Chiang and Constantin Serban, “TimeFlex: Time Alignment for Improved Accuracy in Distributed Testing”, 30th Annual International Test and Evaluation Symposium, 2013, Washington, DC.
  10. Chang-Han Jong, Taichuan Lu and Cho-Yu Jason Chiang, “Storage Deduplication and Management for Application Testing over a Virtual Network Testbed”, in Proceedings of the 7th International ICST Conference on Testbeds and Research Infrastructures for the Development of Networks and Communities (TRIDENTCOM 2011), Shanghai, China, March 2011.
  11. Alexander Poylisher, Yitzchak M. Gottlieb, Constantin Serban, Keith Whittaker, James Nguyen, Chris Scilla, John Lee, Florin Sultan, Ritu Chadha, Cho-Yu Jason Chiang. “Building an Operation Support System for a Fast Reconfigurable Network Experimentation Testbed”. Proceedings of the IEEE Conference on Military Communications, MILCOM 2012.
  12. Alexander Poylisher, Taichuan Lu, Constantin Serban, John Lee, Ritu Chadha, Cho-Yu Jason Chiang, Kimberly Jakubowski, Keith Whittaker and Rocio Bauer, “Realistic Modeling of Tactical Networks with Multi-Level Security in VAN Testbeds”, in Proceedings of MILCOM 2010, San Jose, USA, October 2010.
  13. Alexander Poylisher, Constantin Serban, John Lee, Ted Lu, Ritu Chadha, Cho-Yu Jason Chiang, “A Virtual Ad Hoc Network Testbed”, International Journal of Communication Networks and Distributed Systems, 2010.
  14. Alexander Poylisher, Constantin Serban, John Lee, Ted Lu, Ritu Chadha, Cho-Yu Jason Chiang, “Virtual Ad hoc Network Testbeds for High Fidelity Testing of Tactical Network Applications”, MILCOM 2009, Boston, USA, October 2009.
  15. Constantin Serban and Alexander Poylisher, “Increasing the Reliability and Productivity of Cyber Testing via Experiment Speedup”, 30th Annual International Test and Evaluation Symposium, 2013, Washington, DC.
  16. Constantin Serban, Alexander Poylisher, Cho-Yu Jason Chiang. “Virtual ad hoc network testbeds for network-aware applications”. Proceedings of the IEEE/IFIP Network Operations and Management Symposium, NOMS 2010.
  17. F. Sultan, A. Poylisher, J. Lee, C. Serban, C.J. Chiang, R. Chadha, K. Whittaker, C. Scilla, S. Ali, “TimeSync: Enabling Scalable, High-Fidelity Hybrid Network Emulation”, Proceedings of the 15th ACM International Conference on Modeling, Analysis and Simulation of Wireless and Mobile Systems (MSWiM ’12), Paphos, Cyprus, October 2012, pp. 185–194.
  18. Florin Sultan, Alex Poylisher, John Lee, Constantin Serban, C. Jason Chiang, Ritu Chadha, Keith Whittaker, Chris Scilla and Syeed Ali, “TimeSync – Virtual Time for Scalable, High-Fidelity Hybrid Network Emulation”, Proceedings of MILCOM 2012, Orlando, Florida, November 2012.