CosmoSim is an open-source simulator for modeling satellite network capacity, released as part of the IMC 2025 paper "Assessing LEO Satellite Networks for National Emergency Failover". The code bundled here is curated for public release: configuration generators, graph builders, traffic-engineering runners, spectrum-management tooling, and plotting scripts are organised around a shared data directory so you can reproduce the pipeline end-to-end without hauling large intermediate artefacts.
The paper can be found here.
The web app visualizing the output of this simulator can be found here.
If you use this codebase, please cite:
@inproceedings{bhosale2025leo,
  title     = {Assessing LEO Satellite Networks for National Emergency Failover},
  author    = {Vaibhav Bhosale and Ying Zhang and Sameer Kapoor and Robin Kim and Miguel Schlicht and Muskaan Gupta and Ekaterina Tumanova and Zachary Bischof and Fabián E. Bustamante and Alberto Dainotti and Ahmed Saeed},
  booktitle = {Proceedings of the 2025 ACM on Internet Measurement Conference (IMC~2025)},
  year      = {2025}
}
The bundled dataset covers the six study countries from the paper (Great Britain,
Ghana, Haiti, Lithuania, South Africa, and Tonga). For any additional geography
you must first obtain the appropriate shapefiles and population rasters,
generate an inputs/cells/<country>.txt file, and register the country in
plotting_scripts/common.py. Likewise, new constellations must be described
under constellation_configurations/ (see the existing starlink_*.yaml
files) before they can be referenced by the workflows.
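As a sketch of the registration step for a new geography: the real structure of plotting_scripts/common.py may differ, so the COUNTRIES dict and every name below are illustrative assumptions, not the actual API.

```python
# Hypothetical sketch of registering a new country for the plotting scripts.
# The real plotting_scripts/common.py may use a different structure; the
# COUNTRIES dict and its fields here are illustrative assumptions only.

COUNTRIES = {
    # slug     : (display name,    cells file under inputs/cells/)
    "britain":   ("Great Britain", "inputs/cells/britain.txt"),
    "ghana":     ("Ghana",         "inputs/cells/ghana.txt"),
}

def register_country(slug, display_name):
    """Add a new geography once its shapefiles, population raster,
    and inputs/cells/<slug>.txt file have been prepared."""
    COUNTRIES[slug] = (display_name, f"inputs/cells/{slug}.txt")

register_country("kenya", "Kenya")  # hypothetical new geography
```

Whatever the real registry looks like, the invariant is the same: the slug used in the cells filename must match the slug the workflows and plotting scripts reference.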
Install the required native libraries (no ns-3 toolchain needed):
sudo apt-get update && sudo apt-get install -y libproj-dev proj-data proj-bin libgeos-dev

Then install the Python dependencies:
python -m pip install -r requirements.txt

Run the following stages from the repository root using the dedicated helper
scripts. Each script spawns the required workflow jobs and logs output under
data/command_logs/.
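The stages can also be chained from a small driver; this is a minimal sketch using the helper-script paths from the stages below (running it obviously requires a checkout of the repository and its data directory):

```python
import subprocess

# Ordered helper scripts; each stage must finish before the next starts,
# since later stages read the files the earlier ones write.
STAGES = [
    "terminal_deployment/script_cell_allocation.py",
    "scripts/run_generate_graphs.py",
    "scripts/run_generate_flows.py",
    "scripts/run_generate_capacities.py",
    "scripts/run_generate_capacities_competing_traffic.py",
]

def run_pipeline():
    for script in STAGES:
        # check=True aborts the pipeline if a stage fails, so later
        # stages never run against missing or partial outputs.
        subprocess.run(["python", script], check=True)

# run_pipeline()  # uncomment inside a checkout of the repository
```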
- Generate terminal distributions

  python terminal_deployment/script_cell_allocation.py

  Adjust the country/population/distribution lists near the top of the script (or edit
  terminal_deployment/generate_cell_allocations.py for bespoke runs). This stage writes
  cells.txt plus extended scenario assets inside data/scenarios/<scenario_id>/ (where
  scenario_id is a slug such as
  starlink_5shells_ground_stations_starlink_cells_britain_0_10000_uniform) and populates
  data/<scenario_id>_<beam_policy>/demands.txt with the baseline demand snapshots (one
  directory per beam policy).

- Generate graphs

  python scripts/run_generate_graphs.py

  Graph snapshots for each constellation/country/time combination are produced under
  graph_generation/graphs/<constellation>/<country>/1000ms/.

- Generate flows (demand snapshots)

  python scripts/run_generate_flows.py

  This invokes workflows/generate_flows.py for every scenario, storing demands.txt files
  directly under data/<scenario>_<beam_policy>/demands.txt.

- Generate capacities (routing policies)

  python scripts/run_generate_capacities.py

  workflows/generate_capacities.py converts those demands into routed capacities, writing
  {routing}_{t}.txt and flow_dict_{routing}_{t}.json alongside demands.txt in
  data/<scenario_id>_<beam_policy>/.

- Generate capacities with competing traffic

  python scripts/run_generate_capacities_competing_traffic.py

  This step calls workflows/generate_capacities_competing_traffic.py to evaluate
  emergency/incumbent demand priorities. Outputs (max-flow summaries, fulfillment logs,
  first/second-pass flow dictionaries) are stored under
  data/capacities_competing/<scenario>/t<t>/<beam>/<routing>/<priority>/inc_<demand>/.
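To make the directory layout concrete, here is a small sketch that assembles the per-scenario output paths described in the stages above. The path patterns follow the README text; the beam policy, routing name, and timestep values are placeholders:

```python
from pathlib import Path

def scenario_paths(scenario_id, beam_policy, routing, t):
    """Sketch of where the pipeline places its per-scenario outputs.
    Path patterns follow the stage descriptions; values are illustrative."""
    base = Path("data") / f"{scenario_id}_{beam_policy}"
    return {
        "demands":   base / "demands.txt",
        "capacity":  base / f"{routing}_{t}.txt",
        "flow_dict": base / f"flow_dict_{routing}_{t}.json",
    }

# Example using the slug quoted in the first stage; "my_beam_policy",
# "my_routing", and t=0 are placeholders.
paths = scenario_paths(
    "starlink_5shells_ground_stations_starlink_cells_britain_0_10000_uniform",
    "my_beam_policy", "my_routing", 0)
```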
Each workflow shares the same positional arguments (output directory, graph directory, constellation, ground stations, terminal file, country, flow time, beam policy, Ku-band capacity), so you can invoke a workflow directly if you only need a single scenario. Let each stage finish before starting the next so that all dependencies are in place.
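For a single-scenario run, the shared positional arguments can be passed directly. This sketch follows the argument order listed above; every concrete value is a placeholder to substitute with your own scenario's paths and parameters, and the capacity unit is an assumption:

```python
import subprocess

# Sketch of invoking one workflow directly with the shared positional
# arguments. All concrete values below are placeholders.
args = [
    "python", "workflows/generate_capacities.py",
    "data/my_scenario_output",                # output directory
    "graph_generation/graphs/starlink_5shells/britain/1000ms",  # graph directory
    "starlink_5shells",                       # constellation
    "ground_stations_starlink",               # ground stations
    "inputs/cells/britain.txt",               # terminal file
    "britain",                                # country
    "0",                                      # flow time
    "my_beam_policy",                         # beam policy
    "1000",                                   # Ku-band capacity (unit assumed)
]
# subprocess.run(args, check=True)  # run from the repository root
```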
The figures from the paper, along with any custom visualizations, are generated
via the scripts under plotting_scripts/. Each entry reads the data products
from the workflow stages above and drops rendered assets into
plotting_scripts/out/.