
Followup Calculations

For large-scale studies with many sources, or when exploring a large parameter space of observatory configurations and delay times, computing sensitivity curves and observation times individually can be computationally expensive. The followup module provides fast exposure-time estimates by interpolating pre-computed lookup tables.

A lookup table contains pre-simulated observation times for a grid of:

  • Sources (from catalogs or population studies)
  • Delay times (time from event to observation start)
  • Observatory configurations (site, zenith, azimuth, IRF version)
  • EBL models

The followup module filters the lookup table by event, observation conditions, and any other metadata columns you provide, then near-instantly interpolates the required observation time from exposure calculations you have already performed.

These tables can be stored in any format readable by pandas, such as Parquet or CSV files.

import pandas as pd
from sensipy.util import get_data_path
# Load lookup table (sample data included with sensipy)
lookup_path = get_data_path("mock_data/sample_lookup_table.parquet")
df = pd.read_parquet(lookup_path)
print(df.columns)
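Parquet is used in the examples below, but any pandas-readable format works the same way. A minimal sketch (with made-up values) of writing and re-reading the same table as CSV:

```python
import pandas as pd
from tempfile import TemporaryDirectory
from pathlib import Path

# A minimal table with the required columns (values are illustrative only)
table = pd.DataFrame({
    "obs_delay": [10.0, 100.0, 1000.0],
    "obs_time": [3600.0, 900.0, 300.0],
    "event_id": [1, 1, 1],
})

with TemporaryDirectory() as tmp:
    csv_path = Path(tmp) / "lookup_table.csv"
    table.to_csv(csv_path, index=False)  # write as CSV instead of Parquet
    restored = pd.read_csv(csv_path)     # read it back with the CSV reader

print(restored.columns.tolist())
```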

Two columns are required: obs_delay and obs_time, which hold the observation delay and the 5σ observation time for each event, respectively. The column names do not have to match exactly; alternative names can be passed via the delay_column and obs_time_column parameters.

| Column | Type | Required | Description |
| --- | --- | --- | --- |
| `obs_delay` | float | Yes | Observation delay time (seconds) |
| `obs_time` | float | Yes | 5σ observation time (seconds) |
| `event_id` | int | Yes | Event identifier |
| `irf_site` | str | No | Observatory site: 'north' or 'south' |
| `irf_zenith` | float | No | Zenith angle (degrees) |
| `irf_ebl` | bool | No | Boolean indicating if EBL is used |
| `irf_ebl_model` | str | No | EBL model name (e.g., 'franceschini', 'dominguez11') |
| `irf_config` | str | No | Telescope configuration |
| `irf_duration` | int | No | IRF duration (seconds) |
| `long` | float | No | Source longitude (radians) |
| `lat` | float | No | Source latitude (radians) |
| `dist` | float | No | Source distance (kpc) |

The main function for followup calculations is sensipy.followup.get_exposure().

from sensipy import followup
from sensipy.util import get_data_path
import astropy.units as u
import pandas as pd
# Load lookup table (sample data included with sensipy)
lookup_path = get_data_path("mock_data/sample_lookup_table.parquet")
lookup_df = pd.read_parquet(lookup_path)
# Get exposure for a specific event
# Use filters as keyword arguments to specify event and observation configuration
result = followup.get_exposure(
    delay=10 * u.s,
    lookup_df=lookup_df,
    event_id=1,
    irf_site="north",
    irf_zenith=60,
    irf_ebl_model="franceschini",
)
print(result)

| Parameter | Type | Description |
| --- | --- | --- |
| `delay` | `u.Quantity` | Observation delay from event onset |
| `lookup_df` | `pd.DataFrame` or `str` | Lookup table (DataFrame or filepath). If provided, uses lookup mode |
| `source_filepath` | `Path` or `str` | Path to source file. Required if `lookup_df` is None; sensitivity is re-calculated from scratch |
| `other_info` | `list[str]` | Column names to include in the returned dictionary in lookup mode, extracted from the lookup DataFrame |
| `delay_column` | `str` | Name of the column containing observation delays (default: `"obs_delay"`) |
| `obs_time_column` | `str` | Name of the column containing observation times (default: `"obs_time"`) |
| `**filters` | `str \| float \| int \| bool` | Column-value pairs to filter the DataFrame. Common filters: `event_id` (event identifier), `irf_site` (observatory site, e.g. `"north"`), `irf_zenith` (zenith angle in degrees), `irf_ebl_model` (EBL model name, e.g. `"franceschini"`, `"dominguez11"`) |

Returns a dictionary similar to Source.observe():

{
    'obs_time': <Quantity>,    # Interpolated observation time
    'seen': bool,              # Detection possible
    'start_time': <Quantity>,  # Observation start (= delay)
    'end_time': <Quantity>,    # start_time + obs_time
    'ebl_model': str,          # EBL model used
    'min_energy': <Quantity>,  # Energy range
    'max_energy': <Quantity>,
    # Any metadata columns from the lookup table requested via `other_info`, for example:
    'long': <Quantity>,
    'lat': <Quantity>,
    'dist': <Quantity>,
    'id': int,
    'error_message': str,      # If any issues
}

The get_exposure function uses logarithmic interpolation to estimate observation times at arbitrary delays:

  1. Filter the lookup table using the keyword filters on its columns (e.g., event_id, irf_site, irf_ebl_model).

  2. Extract delay-observation time pairs from the filtered rows.

  3. Perform log-log interpolation:

    log(obs_time) = f(log(delay))

  4. Extrapolate if necessary using a linear trend in log-space.

  5. Return the result with the interpolated observation time.
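The scheme can be sketched with plain NumPy. The grid values below are invented for illustration (the real implementation lives inside sensipy.followup and works on the filtered lookup rows):

```python
import numpy as np

# Hypothetical pre-computed (delay, obs_time) pairs for one event, in seconds
delays = np.array([10.0, 30.0, 100.0, 300.0, 1000.0])
obs_times = np.array([5000.0, 1800.0, 600.0, 250.0, 120.0])

def interp_obs_time(delay):
    """Log-log interpolation, with linear extrapolation in log-space."""
    log_d = np.log10(delays)
    log_t = np.log10(obs_times)
    x = np.log10(delay)
    if x < log_d[0]:
        # Extrapolate below the grid using the first segment's slope
        slope = (log_t[1] - log_t[0]) / (log_d[1] - log_d[0])
        y = log_t[0] + slope * (x - log_d[0])
    elif x > log_d[-1]:
        # Extrapolate above the grid using the last segment's slope
        slope = (log_t[-1] - log_t[-2]) / (log_d[-1] - log_d[-2])
        y = log_t[-1] + slope * (x - log_d[-1])
    else:
        # Interpolate inside the grid
        y = np.interp(x, log_d, log_t)
    return 10 ** y

# A delay that sits on the grid returns the tabulated value
print(interp_obs_time(100.0))  # → 600.0 (up to floating-point rounding)
```

Working in log-space keeps the interpolation well-behaved when both delays and observation times span several orders of magnitude.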

from sensipy import followup
from sensipy.util import get_data_path
import astropy.units as u
import pandas as pd
# Load lookup table once (cache for multiple queries)
lookup_path = get_data_path("mock_data/sample_lookup_table.parquet")
lookup_df = pd.read_parquet(lookup_path)
# Query a single event
result = followup.get_exposure(
    delay=30 * u.min,
    lookup_df=lookup_df,
    event_id=1,
    irf_site="south",
    irf_zenith=20,
    irf_ebl_model="franceschini",
)
if result['seen']:
    print(f"Event 1 detectable in {result['obs_time']}")
    print(f"  Start: {result['start_time']}")
    print(f"  End: {result['end_time']}")
else:
    print("Event 1 not detectable")

Use followup calculations to explore how detectability varies across parameter space:

from sensipy import followup
from sensipy.util import get_data_path
import astropy.units as u
import pandas as pd
import numpy as np
lookup_path = get_data_path("mock_data/sample_lookup_table.parquet")
lookup_df = pd.read_parquet(lookup_path)
# Get all unique event IDs
event_ids = lookup_df['event_id'].unique()[:100] # First 100 events
# Fixed observing parameters
delay = 60 * u.s
sites = ["north", "south"]
zeniths = [20, 40, 60]
detection_matrix = np.zeros((len(sites), len(zeniths)))
for i, site in enumerate(sites):
    for j, zenith in enumerate(zeniths):
        detections = 0
        for event_id in event_ids:
            result = followup.get_exposure(
                delay=delay,
                lookup_df=lookup_df,
                event_id=event_id,
                irf_site=site,
                irf_zenith=zenith,
                irf_ebl_model="franceschini",
            )
            if result['seen']:
                detections += 1
        detection_matrix[i, j] = detections / len(event_ids)

# detection_matrix now contains detection fractions for each configuration
print("Detection fractions:")
print("         Z=20   Z=40   Z=60")
for i, site in enumerate(sites):
    print(f"{site:6s}: {detection_matrix[i, 0]:.3f}  {detection_matrix[i, 1]:.3f}  {detection_matrix[i, 2]:.3f}")
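If your lookup table already stores a detection outcome per row, configuration-level detection fractions can also be computed directly with pandas. A sketch with toy data, where `seen` is a hypothetical boolean column, not part of the documented table schema:

```python
import pandas as pd

# Toy stand-in rows; in practice this would be your lookup DataFrame
df = pd.DataFrame({
    "irf_site": ["north", "north", "south", "south"],
    "irf_zenith": [20, 20, 20, 20],
    "seen": [True, False, True, True],
})

# Mean of a boolean column per group gives the detection fraction
fractions = df.groupby(["irf_site", "irf_zenith"])["seen"].mean()
print(fractions)
```

This avoids per-event `get_exposure` calls when you only need aggregate statistics at delays already on the grid.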

Directly retrieve rows from the lookup table using keyword arguments to filter on any columns:

from sensipy.followup import get_row
from sensipy.util import get_data_path
import pandas as pd
lookup_path = get_data_path("mock_data/sample_lookup_table.parquet")
lookup_df = pd.read_parquet(lookup_path)
# Get specific configuration for an event
# Use column names as keyword arguments
row = get_row(
    lookup_df=lookup_df,
    event_id=1,
    irf_site="south",
    irf_zenith=20,
    irf_ebl_model="franceschini",  # EBL model name
    irf_config="alpha",
    irf_duration=1800,
)
print(row)

# You can filter on any columns in your dataframe
row = get_row(
    lookup_df=lookup_df,
    event_id=1,
    irf_zenith=20,  # Any column name works
)

extrapolate_obs_time: Low-level Interpolation

Perform interpolation directly (rarely needed by users):

from sensipy.followup import extrapolate_obs_time
from sensipy.util import get_data_path
import astropy.units as u
import pandas as pd
lookup_path = get_data_path("mock_data/sample_lookup_table.parquet")
lookup_df = pd.read_parquet(lookup_path)
# Manually perform interpolation
result = extrapolate_obs_time(
    delay=100 * u.s,
    lookup_df=lookup_df,
    filters={
        'event_id': 1,
        'irf_site': 'south',
        'irf_zenith': 20,
        'irf_ebl_model': 'franceschini',
    },
    other_info=['long', 'lat', 'dist'],  # Additional fields to include
)

get_sensitivity: Create Sensitivity from Lookup Table

If you have pre-computed sensitivity curves in your lookup table, you can create Sensitivity objects:

from sensipy.followup import get_sensitivity
from sensipy.util import get_data_path
from sensipy.sensitivity import Sensitivity
import astropy.units as u
import pandas as pd
lookup_path = get_data_path("mock_data/sample_lookup_table.parquet")
lookup_df = pd.read_parquet(lookup_path)
# Note: get_sensitivity requires sensitivity_curve and photon_flux_curve columns
# The sample lookup table is designed for get_exposure, which uses obs_delay/obs_time
# For get_sensitivity examples, you would need a different lookup table structure
# Or provide curves directly
sens = get_sensitivity(
    sensitivity_curve=[1e-10, 1e-11, 1e-12],
    photon_flux_curve=[1e-9, 1e-10, 1e-11],
    observatory="ctao_south",
)
# Now use this sensitivity for custom calculations
# (though usually get_exposure is sufficient)

Followup calculations are designed for speed:

| Operation | Typical Time | Use Case |
| --- | --- | --- |
| Single query | ~1-10 ms | Real-time alerts |
| 100 events | ~0.1-1 s | Quick scans |
| 1000 events | ~1-10 s | Catalog analysis |
| 10,000 events | ~10-100 s | Full O5 catalog |

from sensipy import followup
from sensipy.util import get_data_path
import pandas as pd

# Load once at the start of your script/notebook
lookup_path = get_data_path("mock_data/sample_lookup_table.parquet")
LOOKUP_TABLE = pd.read_parquet(lookup_path)

# Reuse for all queries
def quick_query(event_id, delay):
    return followup.get_exposure(
        delay=delay,
        lookup_df=LOOKUP_TABLE,  # Reuse cached table
        event_id=event_id,
        irf_site="south",
        irf_zenith=20,
        irf_ebl_model="franceschini",
    )

| Aspect | `get_exposure()` (Followup) | `Source.observe()` (Full Sim) |
| --- | --- | --- |
| Speed | Very fast (ms) | Slow (seconds to minutes) |
| Accuracy | Interpolated | Exact |
| Flexibility | Limited to table params | Fully customizable |
| Requirements | Pre-computed table | IRF files, spectral models |
| Use case | Quick lookups, real-time | Detailed studies, new configs |

To create a lookup table for your own catalog:

from sensipy.ctaoirf import IRFHouse
from sensipy.sensitivity import Sensitivity
from sensipy.source import Source
import astropy.units as u
import pandas as pd
from pathlib import Path

# Load IRFs
house = IRFHouse(base_directory="./IRFs/CTAO")

# Configuration grid
sites = ["north", "south"]
zeniths = [20, 40, 60]
delays = [10, 30, 100, 300, 1000, 3000] * u.s
ebl_models = [None, "franceschini"]

# Catalog of events
event_files = list(Path("./catalog/").glob("*.fits"))

results = []
for event_file in event_files:
    for site in sites:
        for zenith in zeniths:
            for ebl in ebl_models:
                # Load IRF
                irf = house.get_irf(
                    site=site,
                    configuration="alpha",
                    zenith=zenith,
                    duration=1800,
                    azimuth="average",
                    version="prod5-v0.1",
                )
                # Load source
                source = Source(
                    filepath=str(event_file),
                    min_energy=20 * u.GeV,
                    max_energy=10 * u.TeV,
                    ebl=ebl,
                )
                # Calculate sensitivity
                sens = Sensitivity(
                    irf=irf,
                    observatory=f"ctao_{site}",
                    min_energy=20 * u.GeV,
                    max_energy=10 * u.TeV,
                    radius=3.0 * u.deg,
                )
                sens.get_sensitivity_curve(source=source)
                # Simulate at each delay
                for delay in delays:
                    result = source.observe(
                        sensitivity=sens,
                        start_time=delay,
                        min_energy=20 * u.GeV,
                        max_energy=10 * u.TeV,
                    )
                    # Add configuration info
                    result['site'] = site
                    result['zenith'] = zenith
                    result['delay'] = delay.to(u.s).value
                    result['ebl'] = ebl if ebl else "none"
                    results.append(result)

# Save to Parquet
df = pd.DataFrame(results)
df.to_parquet("my_lookup_table.parquet")