Case Study: ANVEL (Autonomous Navigation and Virtual Environment Laboratory)

Interactive, Real-time Simulation to Build Smarter Vehicles

ANVEL is one of QS’s flagship products, and a phenomenal example of what the team is capable of. A collaboration between QS and the US Army Engineer Research and Development Center (ERDC) that started back in 2007, ANVEL is an interactive, real-time modeling and simulation (M&S) software tool that supports the conceptualization, development, testing, validation, and verification of intelligent manned and unmanned ground vehicles and capabilities. Some highlights of ANVEL include:

  • It can be run in real time, facilitating interactive experiments with technologies such as teleoperation, shared control, autonomous behaviors, and more. It can also be run faster than real time to facilitate larger-scale experimentation, and slower than real time to facilitate higher-fidelity, more computationally intensive simulation components.
  • It is multi-fidelity, meaning that one can use higher-fidelity simulation components in areas of deeper interest and lower-fidelity components in areas of less importance, yielding a complete system-level simulation. The user can choose the fidelity throughout, giving ANVEL the flexibility to address many challenges.
  • It enables one to quickly and easily create virtual vehicle models of different designs and architectures, using easy-to-edit XML files to define details (wheelbase, number of axles, CG position, etc.) and graphical models from common digital modeling tools; ANVEL includes a number of different pre-configured vehicles that can be used directly or “tweaked” to define a new vehicle.
  • It has the ability to quickly and easily import, edit, and configure environments, including the modification of terrain heights, changing of terrain composition (grass, gravel, asphalt, etc.), and manual and procedural insertion/placement of objects. ANVEL can support almost any environment (rural, urban, etc.) and ships with a set of environments to facilitate a quick start.
  • It has the ability to model and use virtual sensors and related subsystems, attach them to a virtual vehicle with point-and-click tools, and use them as part of the overall simulation. ANVEL comes with a set of sensor models, including LIDARs, GPS, IMUs, cameras, and more, and is specifically geared to facilitate creation and incorporation of new ones. Sensor models can support wire-accurate representations of sensor protocols.
  • It has the ability to incorporate sensing, perception, and control algorithms/code and autonomy codebases, integrate them with the virtual vehicle/sensors/systems, and perform testing and experiments.
  • It has the ability to instrument variables and systems within the simulation, capturing performance data from any component, plotting it in real time, and exporting it as desired.
  • It has the ability to support virtual “actors” that interact with the system under test. These include other vehicles, humans, animals, and the like, which can be spawned in real time, follow paths or scripts, respond to triggering events, change behaviors, etc.
  • It has the ability to interface with and support interactions with other external systems via the external API. The API allows one to programmatically query, configure, or control anything within the ANVEL simulation, which enables a world of possibilities in terms of testing and validation. ANVEL’s external API supports common languages such as MATLAB, C, C++, Python, and more!
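To give a feel for the XML-based vehicle definitions described above, the fragment below sketches what such a file might look like. The element and attribute names here are illustrative assumptions for this case study, not ANVEL’s actual schema:

```xml
<!-- Hypothetical vehicle definition; element and attribute names are
     illustrative, not ANVEL's actual schema. -->
<Vehicle name="ExampleUGV">
  <Chassis wheelbase="2.8" trackWidth="1.6" mass="1500"/>
  <CenterOfGravity x="0.0" y="0.0" z="0.55"/>
  <Axles count="2">
    <Axle position="front" steerable="true"/>
    <Axle position="rear" steerable="false"/>
  </Axles>
  <GraphicsModel file="example_ugv.mesh"/>
</Vehicle>
```

Because such details live in plain XML rather than compiled code, a pre-configured vehicle can be “tweaked” into a new design by editing a handful of attribute values.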
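The query/configure/control pattern that an external simulation API typically exposes can be sketched in Python. Since ANVEL’s actual bindings are not shown in this case study, a stub class stands in for a real connection, and every name below (`get`, `set`, `step`, the property keys) is a hypothetical illustration rather than ANVEL’s real API:

```python
# Illustrative sketch of the query/configure/control pattern an external
# simulation API commonly provides. SimulationStub stands in for a real
# ANVEL connection; all method and property names here are hypothetical.

class SimulationStub:
    """Minimal stand-in for a remote simulation API connection."""

    def __init__(self):
        # Fake simulation state keyed by property path.
        self._properties = {"vehicle.speed": 0.0, "sim.time_scale": 1.0}

    def get(self, key):
        # Query a value from the "running" simulation.
        return self._properties[key]

    def set(self, key, value):
        # Configure a simulation parameter.
        self._properties[key] = value

    def step(self, seconds):
        # Advance the simulation; here we simply fake a speed change
        # of 1.5 m/s per simulated second.
        self._properties["vehicle.speed"] += 1.5 * seconds
        return self._properties["vehicle.speed"]


if __name__ == "__main__":
    sim = SimulationStub()
    sim.set("sim.time_scale", 2.0)   # e.g., run faster than real time
    speed = sim.step(4.0)            # advance, then read back state
    print(f"time_scale={sim.get('sim.time_scale')}, speed={speed:.1f}")
```

The same three verbs (query, configure, control) are what make automated testing possible: a MATLAB or Python test harness can set up a scenario, drive the vehicle, and read back performance data without a human in the loop.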

ANVEL is used by QS and hundreds of individuals and groups developing intelligent and autonomous ground vehicle capabilities, and is an integral part of all key current US Army ground robotics programs. QS continues to develop and grow ANVEL, and is proud to be the creator of such a well-used and appreciated tool. Extensive detail about ANVEL (including licensing information, documentation, and support) can be found at