hendrics package

Submodules

hendrics.base module

A miscellaneous collection of basic functions.

hendrics.base.adjust_dt_for_power_of_two(dt, length, strict=False)[source]

Examples

>>> length, dt = 10, 0.1
>>> # There are 100 bins there. I want them to be 128.
>>> new_dt = adjust_dt_for_power_of_two(dt, length)
INFO: ...
INFO: ...
>>> assert np.isclose(new_dt, 0.078125)
>>> assert length / new_dt == 128
>>> length, dt = 6.5, 0.1
>>> # There are 65 bins there. 128 would be too many, so a smaller good number (72) is used.
>>> new_dt = adjust_dt_for_power_of_two(dt, length)
INFO: ...
INFO: Too many ...
INFO: ...
>>> assert length / new_dt == 72
hendrics.base.adjust_dt_for_small_power(dt, length)[source]

Examples

>>> length, dt = 9.9, 0.1
>>> # There are 99 bins there. I want them to be 100 (2**2 * 5**2).
>>> new_dt = adjust_dt_for_small_power(dt, length)
INFO:...
>>> assert np.isclose(new_dt, 0.099)
>>> assert np.isclose(length / new_dt, 100)
hendrics.base.array_take(arr, indices)[source]

Adapt np.take to arrays.
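
A minimal usage sketch (not part of the original docstring), assuming array_take mirrors the behavior of np.take for array inputs:

>>> arr = np.array([10, 20, 30, 40])
>>> assert np.all(array_take(arr, np.array([1, 3])) == np.take(arr, [1, 3]))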

hendrics.base.check_negative_numbers_in_args(args)[source]

If there are negative numbers in args, prepend a space.

Examples

>>> args = ['events.nc', '-f', '103', '--fdot', '-2e-10']
>>> newargs = check_negative_numbers_in_args(args)
>>> assert args[:4] == newargs[:4]
>>> assert newargs[4] == ' -2e-10'
hendrics.base.common_name(str1, str2, default='common')[source]

Strip two strings of the letters not in common.

Filenames must be of same length and only differ by a few letters.

Parameters:
str1 : str
str2 : str
Returns:
common_str : str

A string containing the parts of the two names in common

Other Parameters:
default : str

The string to return if common_str is empty

Examples

>>> common_name('strAfpma', 'strBfpmb')
'strfpm'
>>> common_name('strAfpma', 'strBfpmba')
'common'
>>> common_name('asdfg', 'qwerr')
'common'
>>> common_name('A_3-50_A.nc', 'B_3-50_B.nc')
'3-50'
hendrics.base.compute_bin(x, bin_edges)[source]

Examples

>>> bin_edges = np.array([0, 5, 10])
>>> assert compute_bin(1, bin_edges) == 0
>>> assert compute_bin(5, bin_edges) == 1
>>> assert compute_bin(10, bin_edges) == 1
hendrics.base.deorbit_events(events, parameter_file=None, invert=False, ephem=None)[source]

Refer arrival times to the center of mass of the binary system.

Parameters:
events : stingray.events.EventList object

The event list

parameter_file : str

The parameter file, in Tempo-compatible format, containing the orbital solution (e.g. a BT model)

hendrics.base.fold_detection_level(nbin, epsilon=0.01, ntrial=1)[source]

Return the detection level for a folded profile.

See Leahy et al. (1983).

Parameters:
nbin : int

The number of bins in the profile

epsilon : float, default 0.01

The fractional probability that the signal has been produced by noise

Returns:
detlev : float

The epoch folding statistics corresponding to a probability epsilon * 100 % that the signal has been produced by noise

Other Parameters:
ntrial : int

The number of trials executed to find this profile

hendrics.base.fold_profile_probability(stat, nbin, ntrial=1)[source]

Calculate the probability of a certain folded profile, due to noise.

Parameters:
stat : float

The epoch folding statistics

nbin : int

The number of bins in the profile

Returns:
p : float

The probability that the profile has been produced by noise

Other Parameters:
ntrial : int

The number of trials executed to find this profile
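
The two functions above are meant to be each other's inverse (an assumption used for this illustrative check, not stated in the docstrings): the detection level computed for a false-alarm probability epsilon should map back to approximately epsilon.

>>> detlev = fold_detection_level(nbin=16, epsilon=0.01)
>>> assert np.isclose(fold_profile_probability(detlev, nbin=16), 0.01)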

hendrics.base.get_bin_edges(a, bins)[source]

Examples

>>> array = np.array([0., 10.])
>>> bins = 2
>>> assert np.allclose(get_bin_edges(array, bins), [0, 5, 10])
hendrics.base.get_list_of_small_powers(maxno=100000000000)[source]
hendrics.base.gti_len(gti)[source]

Return the total good time from a list of GTIs.

Examples

>>> assert gti_len([[0, 1], [2, 4]]) == 3
hendrics.base.hen_root(filename)[source]

Return the root file name (without _ev, _lc, etc.).

Parameters:
filename : str

Examples

>>> fname = "blabla_ev_calib.nc"
>>> hen_root(fname)
'blabla'
>>> fname = "blablu_ev_bli.fits.gz"
>>> hen_root(fname)
'blablu_ev_bli'
>>> fname = "blablu_ev_lc.nc"
>>> hen_root(fname)
'blablu'
>>> fname = "blablu_lc_asrd_ev_lc.nc"
>>> hen_root(fname)
'blablu_lc_asrd'
hendrics.base.hist1d_numba_seq(a, bins, ranges, use_memmap=False, tmp=None)[source]

Examples

>>> if os.path.exists('out.npy'): os.unlink('out.npy')
>>> x = np.random.uniform(0., 1., 100)
>>> H, xedges = np.histogram(x, bins=5, range=[0., 1.])
>>> Hn = hist1d_numba_seq(x, bins=5, ranges=[0., 1.], tmp='out.npy',
...                       use_memmap=True)
>>> assert np.all(H == Hn)
>>> # The number of bins is small, memory map was not used!
>>> assert not os.path.exists('out.npy')
>>> H, xedges = np.histogram(x, bins=10**8, range=[0., 1.])
>>> Hn = hist1d_numba_seq(x, bins=10**8, ranges=[0., 1.], tmp='out.npy',
...                       use_memmap=True)
>>> assert np.all(H == Hn)
>>> assert os.path.exists('out.npy')
>>> # Now use memmap but do not specify a tmp file
>>> Hn = hist1d_numba_seq(x, bins=10**8, ranges=[0., 1.],
...                       use_memmap=True)
>>> assert np.all(H == Hn)
hendrics.base.hist2d_numba_seq(x, y, bins, ranges)[source]

Examples

>>> x = np.random.uniform(0., 1., 100)
>>> y = np.random.uniform(2., 3., 100)
>>> H, xedges, yedges = np.histogram2d(x, y, bins=(5, 5),
...                                    range=[(0., 1.), (2., 3.)])
>>> Hn = hist2d_numba_seq(x, y, bins=(5, 5),
...                       ranges=[[0., 1.], [2., 3.]])
>>> assert np.all(H == Hn)
hendrics.base.hist2d_numba_seq_weight(x, y, weights, bins, ranges)[source]

Examples

>>> x = np.random.uniform(0., 1., 100)
>>> y = np.random.uniform(2., 3., 100)
>>> weight = np.random.uniform(0, 1, 100)
>>> H, xedges, yedges = np.histogram2d(x, y, bins=(5, 5),
...                                    range=[(0., 1.), (2., 3.)],
...                                    weights=weight)
>>> Hn = hist2d_numba_seq_weight(x, y, bins=(5, 5),
...                              ranges=[[0., 1.], [2., 3.]],
...                              weights=weight)
>>> assert np.all(H == Hn)
hendrics.base.hist3d_numba_seq(tracks, bins, ranges)[source]

Examples

>>> x = np.random.uniform(0., 1., 100)
>>> y = np.random.uniform(2., 3., 100)
>>> z = np.random.uniform(4., 5., 100)
>>> H, _ = np.histogramdd((x, y, z), bins=(5, 6, 7),
...                       range=[(0., 1.), (2., 3.), (4., 5)])
>>> Hn = hist3d_numba_seq((x, y, z), bins=(5, 6, 7),
...                       ranges=[[0., 1.], [2., 3.], [4., 5.]])
>>> assert np.all(H == Hn)
hendrics.base.hist3d_numba_seq_weight(tracks, weights, bins, ranges)[source]

Examples

>>> x = np.random.uniform(0., 1., 100)
>>> y = np.random.uniform(2., 3., 100)
>>> z = np.random.uniform(4., 5., 100)
>>> weights = np.random.uniform(0, 1., 100)
>>> H, _ = np.histogramdd((x, y, z), bins=(5, 6, 7),
...                       range=[(0., 1.), (2., 3.), (4., 5)],
...                       weights=weights)
>>> Hn = hist3d_numba_seq_weight(
...    (x, y, z), weights, bins=(5, 6, 7),
...    ranges=[[0., 1.], [2., 3.], [4., 5.]])
>>> assert np.all(H == Hn)
hendrics.base.histnd_numba_seq(tracks, bins, ranges)[source]

Examples

>>> x = np.random.uniform(0., 1., 100)
>>> y = np.random.uniform(2., 3., 100)
>>> z = np.random.uniform(4., 5., 100)
>>> # 2d example
>>> H, _, _ = np.histogram2d(x, y, bins=np.array((5, 5)),
...                          range=[(0., 1.), (2., 3.)])
>>> alldata = np.array([x, y])
>>> Hn = histnd_numba_seq(alldata, bins=np.array([5, 5]),
...                       ranges=np.array([[0., 1.], [2., 3.]]))
>>> assert np.all(H == Hn)
>>> # 3d example
>>> H, _ = np.histogramdd((x, y, z), bins=np.array((5, 6, 7)),
...                       range=[(0., 1.), (2., 3.), (4., 5)])
>>> alldata = np.array([x, y, z])
>>> Hn = hist3d_numba_seq(alldata, bins=np.array((5, 6, 7)),
...                       ranges=np.array([[0., 1.], [2., 3.], [4., 5.]]))
>>> assert np.all(H == Hn)
hendrics.base.histogram(*args, **kwargs)[source]
hendrics.base.histogram2d(*args, **kwargs)[source]
hendrics.base.index_arr(a, ix_arr)[source]
hendrics.base.index_set_arr(a, ix_arr, val)[source]
hendrics.base.interpret_bintime(bintime)[source]

If the bin time is negative, interpret it as the exponent of a power of two (e.g. -2 means 2^-2 = 0.25 s).

Examples

>>> assert interpret_bintime(2) == 2
>>> assert interpret_bintime(-2) == 0.25
>>> interpret_bintime(0)
Traceback (most recent call last):
    ...
ValueError: Bin time cannot be = 0
hendrics.base.is_string(s)[source]

Portable function to check whether s is a string.

hendrics.base.log_x(a, base)[source]
hendrics.base.memmapped_arange(i0, i1, istep, fname=None, nbin_threshold=10000000, dtype=<class 'float'>)[source]

Arange plus memory mapping.

Examples

>>> i0, i1, istep = 0, 10, 1e-2
>>> assert np.allclose(np.arange(i0, i1, istep), memmapped_arange(i0, i1, istep))
>>> i0, i1, istep = 0, 10, 1e-7
>>> assert np.allclose(np.arange(i0, i1, istep), memmapped_arange(i0, i1, istep))
hendrics.base.mkdir_p(path)[source]

Safe mkdir function.

hendrics.base.nchars_in_int_value(value)[source]

Number of characters needed to write an integer number.

Examples

>>> assert nchars_in_int_value(2) == 1
>>> assert nchars_in_int_value(1356) == 4
>>> assert nchars_in_int_value(9999) == 4
>>> assert nchars_in_int_value(10000) == 5
hendrics.base.njit(**kwargs)[source]

Dummy decorator in case jit cannot be imported.

hendrics.base.optimal_bin_time(fftlen, tbin)[source]

Slightly vary the bin time to obtain a power-of-two number of bins.

Given an FFT length and a proposed bin time, return a bin time slightly shorter than the original, that will produce a power-of-two number of FFT bins.

Examples

>>> assert optimal_bin_time(512, 1.1) == 1.0
hendrics.base.pds_detection_level(epsilon=0.01, ntrial=1, n_summed_spectra=1, n_rebin=1)[source]

Detection level for a PDS.

Return the detection level (with probability 1 - epsilon) for a Power Density Spectrum of nbins bins, normalized a la Leahy (1983), based on the 2-dof \({\chi}^2\) statistics, corrected for rebinning (n_rebin) and multiple PDS averaging (n_summed_spectra)

Parameters:
epsilon : float

The single-trial probability value(s)

Other Parameters:
ntrial : int

The number of independent trials (the independent bins of the PDS)

n_summed_spectra : int

The number of power density spectra that have been averaged to obtain this power level

n_rebin : int

The number of power density bins that have been averaged to obtain this power level

Examples

>>> assert np.isclose(pds_detection_level(0.1), 4.6, atol=0.1)
>>> assert np.allclose(pds_detection_level(0.1, n_rebin=[1]), [4.6], atol=0.1)
hendrics.base.pds_probability(level, ntrial=1, n_summed_spectra=1, n_rebin=1)[source]

Give the probability of a given power level in PDS.

Return the probability of a certain power level in a Power Density Spectrum of nbins bins, normalized a la Leahy (1983), based on the 2-dof \({\chi}^2\) statistics, corrected for rebinning (n_rebin) and multiple PDS averaging (n_summed_spectra)

Parameters:
level : float or array of floats

The power level for which we are calculating the probability

Returns:
epsilon : float

The probability value(s)

Other Parameters:
ntrial : int

The number of independent trials (the independent bins of the PDS)

n_summed_spectra : int

The number of power density spectra that have been averaged to obtain this power level

n_rebin : int

The number of power density bins that have been averaged to obtain this power level
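
As an illustrative check (assuming, as for the folding statistics above, that pds_probability inverts pds_detection_level for the default ntrial, n_summed_spectra and n_rebin):

>>> level = pds_detection_level(epsilon=0.015)
>>> assert np.isclose(pds_probability(level), 0.015)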

hendrics.base.prange(*args)[source]

Dummy replacement for numba's prange in case numba cannot be imported.

hendrics.base.r_det(td, r_i)[source]

Calculate detected countrate given dead time and incident countrate.

hendrics.base.r_in(td, r_0)[source]

Calculate incident countrate given dead time and detected countrate.
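
A hedged round-trip sketch for the two dead-time functions above, assuming they are mutual inverses: for a positive dead time the detected rate is lower than the incident rate, and converting back recovers the input.

>>> td, r_0 = 2.5e-3, 100.0
>>> r = r_det(td, r_0)
>>> assert r < r_0
>>> assert np.isclose(r_in(td, r), r_0)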

hendrics.base.show_progress

alias of tqdm

hendrics.base.touch(fname)[source]

Mimic the touch shell command.

Examples

>>> touch('bububu')
>>> assert os.path.exists('bububu')
>>> os.unlink('bububu')
hendrics.base.z2_n_detection_level(n=2, epsilon=0.01, ntrial=1, n_summed_spectra=1)[source]

Return the detection level for the Z^2_n statistics.

See Buccheri et al. (1983), Bendat and Piersol (1971).

Parameters:
n : int, default 2

The n in $Z^2_n$ (number of harmonics, including the fundamental)

epsilon : float, default 0.01

The fractional probability that the signal has been produced by noise

Returns:
detlev : float

The Z^2_n statistics corresponding to a probability epsilon * 100 % that the signal has been produced by noise

Other Parameters:
ntrial : int

The number of trials executed to find this profile

n_summed_spectra : int

Number of Z_2^n periodograms that are being averaged

hendrics.base.z2_n_probability(z2, n, ntrial=1, n_summed_spectra=1)[source]

Calculate the probability of a certain folded profile, due to noise.

Parameters:
z2 : float

A Z^2_n statistics value

n : int, default 2

The n in $Z^2_n$ (number of harmonics, including the fundamental)

Returns:
p : float

The probability that the Z^2_n value has been produced by noise

Other Parameters:
ntrial : int

The number of trials executed to find this profile

n_summed_spectra : int

Number of Z_2^n periodograms that were averaged to obtain z2
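
Illustrative round trip between the two Z^2_n functions above (assuming they are inverses for the default ntrial and n_summed_spectra):

>>> detlev = z2_n_detection_level(n=2, epsilon=0.01)
>>> assert np.isclose(z2_n_probability(detlev, n=2), 0.01)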

hendrics.binary module

Save different input files in PRESTO-readable format.

hendrics.binary.get_header_info(obj)[source]

Get header info from a Stingray object.

hendrics.binary.main_presto(args=None)[source]
hendrics.binary.save_events_to_binary(events, filename, bin_time, tstart=None, emin=None, emax=None)[source]

Save an event list to binary format.

Parameters:
events : :class:`stingray.events.EventList`

Input event list

filename : str

Output file name

bin_time : float

Bin time of the output light curve

Returns:
lcinfo : object

light curve info

Other Parameters:
tstart : float

Starting time

emin : float

Minimum energy of the photons

emax : float

Maximum energy of the photons
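
A minimal, hypothetical call (the file name is illustrative; the exact content of the returned lcinfo is not documented here):

>>> from stingray.events import EventList
>>> ev = EventList(time=np.sort(np.random.uniform(0, 100, 1000)), gti=[[0, 100]])
>>> lcinfo = save_events_to_binary(ev, 'events.dat', bin_time=0.01, tstart=0)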

hendrics.binary.save_inf(lcinfo, info, filename)[source]

Save information file.

hendrics.binary.save_lc_to_binary(lc, filename)[source]

Save a light curve to binary format.

Parameters:
lc : :class:`stingray.Lightcurve`

Input light curve

filename : str

Output file name

Returns:
lcinfo : object

light curve info

hendrics.calibrate module

Calibrate event lists by looking in rmf files.

hendrics.calibrate.calibrate(fname, outname, rmf_file=None, rough=False)[source]

Do calibration of an event list.

Parameters:
fnamestr

The HENDRICS file containing the events

outname : str

The output file

Other Parameters:
rmf_file : str

The rmf file used to read the calibration. If None or not specified, the one given by default_nustar_rmf() is used.

hendrics.calibrate.default_nustar_rmf()[source]

Look for the default rmf file in the CALDB.

The CALDB environment variable has to point to the correct location of the NuSTAR CALDB

Note

The calibration might change in the future. The hardcoded file name will be eventually replaced with a smarter choice based on observing time

hendrics.calibrate.main(args=None)[source]

Main function called by the HENcalibrate command line script.

hendrics.calibrate.read_calibration(pis, rmf_file=None)[source]

Read the energy channels corresponding to the given PI channels.

Parameters:
pis : array-like

The channels to lookup in the rmf

Other Parameters:
rmf_file : str

The rmf file used to read the calibration. If None or not specified, the one given by default_nustar_rmf() is used.

hendrics.calibrate.read_rmf(rmf_file=None)[source]

Load RMF info.

Note

Preliminary: only EBOUNDS are read.

Parameters:
rmf_file : str

The rmf file used to read the calibration. If None or not specified, the one given by default_nustar_rmf() is used.

Returns:
pis : array-like

the PI channels

e_mins : array-like

the lower energy bound of each PI channel

e_maxs : array-like

the upper energy bound of each PI channel

hendrics.calibrate.rough_calibration(pis, mission)[source]
Parameters:
pis: float or array of floats

PI channels in data

mission: str

Mission name

Returns:
energies : float or array of floats

Energy values

Examples

>>> assert rough_calibration(0, 'nustar') == 1.6
>>> assert rough_calibration(0.0, 'ixpe') == 0.0
>>> # It's case-insensitive
>>> assert rough_calibration(1200, 'XMm') == 1.2
>>> rough_calibration(10, 'asDf')
Traceback (most recent call last):
    ...
ValueError: Mission asdf not recognized
>>> assert rough_calibration(100, 'nicer') == 1.0

hendrics.colors module

Functions to calculate colors and hardness.

hendrics.colors.colors()[source]
hendrics.colors.main(args=None)[source]

Main function called by the HENcolors command line script.

hendrics.create_gti module

Functions to create and apply GTIs.

hendrics.create_gti.apply_gti(fname, gti, outname=None, minimum_length=0)[source]

Apply a GTI list to the data contained in a file.

File MUST have a GTI extension already, and an extension called time.

hendrics.create_gti.create_gti(fname, filter_expr, safe_interval=[0, 0], outfile=None, minimum_length=0)[source]

Create a GTI list by using boolean operations on file data.

Parameters:
fname : str

File name. The file must be in HENDRICS format.

filter_expr : str

A boolean condition on one or more of the arrays contained in the data. E.g. ‘(lc > 10) & (lc < 20)’

Returns:
gti : [[gti0_0, gti0_1], [gti1_0, gti1_1], …]

The newly created GTIs

Other Parameters:
safe_interval : float or [float, float]

A safe interval to exclude at both ends (if single float) or the start and the end (if pair of values) of GTIs.

outfile : str

The output file name. If None, use a default root + ‘_gti’ combination

hendrics.create_gti.filter_gti_by_length(gti, minimum_length)[source]

Filter a list of GTIs: keep those longer than minimum_length.
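
A minimal sketch, assuming the filtered GTIs are returned as an array-like of [start, stop] pairs:

>>> assert np.allclose(filter_gti_by_length(np.array([[0, 3], [5, 20]]), 10), [[5, 20]])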

hendrics.create_gti.main(args=None)[source]

Main function called by the HENcreategti command line script.

hendrics.efsearch module

Search for pulsars.

hendrics.efsearch.calculate_shifts(nprof: int, nbin: int, nshift: int, order: int = 1) array[source]
hendrics.efsearch.check_phase_error_after_casting_to_double(tref, f, fdot=0)[source]

Check the maximum error expected in the phase when casting to double.

hendrics.efsearch.decide_binary_parameters(length, freq_range, porb_range, asini_range, fdot_range=[0, 0], NMAX=10, csv_file='db.csv', reset=False)[source]
hendrics.efsearch.fit(frequencies, stats, center_freq, width=None, obs_length=None, baseline=0)[source]
hendrics.efsearch.h_test(norm, nmax=20)[source]

H statistics, à la de Jager+89, A&A, 221, 180, eq. 11.

Examples

>>> phase = 2 * np.pi * np.arange(0, 1, 0.01)
>>> norm = np.sin(phase) + 1
>>> h, m = h_test(norm, nmax=4)
>>> assert np.isclose(h, 50)
>>> assert m == 1
hendrics.efsearch.main_accelsearch(args=None)[source]
hendrics.efsearch.main_efsearch(args=None)[source]

Main function called by the HENefsearch command line script.

hendrics.efsearch.main_z2vspf(args=None)[source]
hendrics.efsearch.main_zsearch(args=None)[source]

Main function called by the HENzsearch command line script.

hendrics.efsearch.mod(num, n2)[source]
hendrics.efsearch.search_with_ffa(times, f0, f1, nbin=16, n=1, t0=None, t1=None)[source]

‘Quite fast folding’ algorithm.

Parameters:
times : array of floats

Arrival times of photons

f0 : float

Minimum frequency to search

f1 : float

Maximum frequency to search

Other Parameters:
nbin : int

Number of bins to divide the profile into

nprof : int, default None

number of slices of the dataset to use. If None, we use 8 times nbin. Motivation in the comments.

npfact : int, default 2

maximum “sliding” of the dataset, in phase.

oversample : int, default 8

Oversampling wrt the standard FFT delta f = 1/T

search_fdot : bool, default False

Switch fdot search on or off

t0 : float, default min(times)

starting time

t1 : float, default max(times)

stop time

hendrics.efsearch.search_with_qffa(times, f0, f1, fdot=0, fddot=0, nbin=16, nprof=None, npfact=2, oversample=8, n=1, search_fdot=True, t0=None, t1=None, silent=False)[source]

‘Quite fast folding’ algorithm.

Parameters:
times : array of floats

Arrival times of photons

f0 : float

Minimum frequency to search

f1 : float

Maximum frequency to search

Other Parameters:
nbin : int

Number of bins to divide the profile into

nprof : int, default None

number of slices of the dataset to use. If None, we use 8 times nbin. Motivation in the comments.

npfact : int, default 2

maximum “sliding” of the dataset, in phase.

oversample : int, default 8

Oversampling wrt the standard FFT delta f = 1/T

search_fdot : bool, default False

Switch fdot search on or off

t0 : float, default min(times)

starting time

t1 : float, default max(times)

stop time
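
A minimal sketch of a call (values are illustrative; the structure of the returned results is not documented here, so it is left unpacked):

>>> times = np.sort(np.random.uniform(0, 100, 1000))
>>> results = search_with_qffa(times, 1.15, 1.25, nbin=16, search_fdot=False, silent=True)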

hendrics.efsearch.search_with_qffa_step(times: float64, mean_f: float64, mean_fdot=0, mean_fddot=0, nbin=16, nprof=64, npfact=2, oversample=8, n=1, search_fdot=True) tuple[array, array, array][source]

Single step of quasi-fast folding algorithm.

hendrics.efsearch.shift_and_sum(repeated_profiles, lshift, qshift, splat_prof, base_shift, quadbaseshift)[source]

Search for transient pulsations.

Parameters:
times : array of floats

Arrival times of photons

f0 : float

Minimum frequency to search

f1 : float

Maximum frequency to search

Other Parameters:
nbin : int

Number of bins to divide the profile into

nprof : int, default None

number of slices of the dataset to use. If None, we use 8 times nbin. Motivation in the comments.

npfact : int, default 2

maximum “sliding” of the dataset, in phase.

oversample : int, default 8

Oversampling wrt the standard FFT delta f = 1/T

search_fdot : bool, default False

Switch fdot search on or off

t0 : float, default min(times)

starting time

t1 : float, default max(times)

stop time

hendrics.efsearch.z2_vs_pf(event_list, deadtime=0.0, ntrials=100, outfile=None, N=2)[source]
hendrics.efsearch.z_n_fast(phase, norm, n=2)[source]

Z^2_n statistics, à la Buccheri+83, A&A, 128, 245, eq. 2.

Here in a fast implementation based on numba. Assumes that nbin != 0 and norm is an array.

Parameters:
phase : array of floats

The phases of the events, in terms of 2PI

norm : float or array of floats

A normalization factor that gets multiplied as a weight.

n : int, default 2

The n in $Z^2_n$.

Returns:
z2_n : float

The Z^2_n statistics of the events.

Examples

>>> phase = 2 * np.pi * np.arange(0, 1, 0.01)
>>> norm = np.sin(phase) + 1
>>> assert np.isclose(z_n_fast(phase, norm, n=4), 50)
>>> assert np.isclose(z_n_fast(phase, norm, n=2), 50)

hendrics.exvar module

Created on Thu Aug 17 08:55:47 2017.

@author: marta

hendrics.exvar.excvar_none(lc)[source]
hendrics.exvar.excvar_norm(lc)[source]
hendrics.exvar.fvar(lc)[source]
hendrics.exvar.main(args=None)[source]

hendrics.exposure module

Calculate the exposure correction for light curves.

Only works for data taken in specific data modes of NuSTAR, where all events are telemetered.

hendrics.exposure.correct_lightcurve(lc_file, uf_file, outname=None, expo_limit=1e-07)[source]

Apply exposure correction to light curve.

Parameters:
lc_file : str

The light curve file, in HENDRICS format

uf_file : str

The unfiltered event file, in FITS format

Returns:
outdata : str

Output data structure

Other Parameters:
outname : str

Output file name

hendrics.exposure.get_exposure_from_uf(time, uf_file, dt=None, gti=None)[source]

Get livetime from unfiltered event file.

Parameters:
time : array-like

The time bins of the light curve

uf_file : str

Unfiltered event file (the one in the event_cl directory with the _uf suffix)

Returns:
expo : array-like

Exposure (livetime) values corresponding to time bins

Other Parameters:
dt : float

If time array is not sampled uniformly, dt can be specified here.

hendrics.exposure.get_livetime_per_bin(times, events, priors, dt=None, gti=None)[source]

Get the livetime in a series of time intervals.

Parameters:
times : array-like

The array of times to look at

events : array-like

A list of events, producing dead time

priors : array-like

The livetime before each event (as in the PRIOR column of unfiltered NuSTAR event files)

Returns:
livetime_array : array-like

An array of the same length as times, containing the live time values

Other Parameters:
dt : float or array-like

The width of the time bins of the time array. Can be a single float or an array of the same length as times

gti : [[g0_0, g0_1], [g1_0, g1_1], …]

Good time intervals. Defaults to [[time[0] - dt[0]/2, time[-1] + dt[-1]/2]]
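
A minimal synthetic sketch (values are illustrative; the only assertion used is the documented one, namely that the output has the same length as times):

>>> times = np.arange(0.5, 10, 1)
>>> events = np.arange(0.1, 10, 0.1)
>>> priors = np.full(events.size, 0.09)
>>> livetime = get_livetime_per_bin(times, events, priors, dt=1)
>>> assert livetime.size == times.size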

hendrics.exposure.main(args=None)[source]

Main function called by the HENexposure command line script.

hendrics.fake module

Functions to simulate data and produce a fake event file.

hendrics.fake.acceptance_rejection(dt, counts_per_bin, t0=0.0, poissonize_n_events=False, deadtime=0.0)[source]

Examples

>>> counts_per_bin = [10, 5, 5]
>>> dt = 0.1
>>> ev = acceptance_rejection(dt, counts_per_bin)
>>> assert ev.size == 20
>>> assert ev.max() < 0.3
>>> assert ev.min() > 0
>>> assert np.all(np.diff(ev) >= 0)
hendrics.fake.generate_fake_fits_observation(event_list=None, filename=None, instr=None, gti=None, tstart=None, tstop=None, mission=None, mjdref=55197.00076601852, livetime=None, additional_columns={})[source]

Generate fake X-ray data.

Takes an event list (as a list of floats). All inputs are None by default, and can be set during the call.

Parameters:
event_list : list-like

stingray.events.Eventlist object. If left None, 1000 random events will be generated, for a total length of 1025 s or the difference between tstop and tstart.

filename : str

Output file name

Returns:
hdulist : FITS hdu list

FITS hdu list of the output file

Other Parameters:
mjdref : float

Reference MJD. Default is 55197.00076601852 (NuSTAR)

pi : list-like

The PI channel of each event

tstart : float

Start of the observation (s from mjdref)

tstop : float

End of the observation (s from mjdref)

instr : str

Name of the instrument. Default is ‘FPMA’

livetime : float

Total livetime. Default is tstop - tstart

hendrics.fake.main(args=None)[source]

Main function called by the HENfake command line script.

hendrics.fake.main_scramble(args=None)[source]

Main function called by the HENscramble command line script.

hendrics.fake.make_counts_pulsed(nevents, t_start, t_stop, pulsed_fraction=0.0)[source]

Examples

>>> nevents = 10
>>> dt, counts = make_counts_pulsed(nevents, 0, 100)
>>> assert np.isclose(np.sum(counts), nevents)
>>> dt, counts = make_counts_pulsed(nevents, 0, 100, pulsed_fraction=1)
>>> assert np.isclose(np.sum(counts), nevents)
hendrics.fake.scramble(event_list, smooth_kind='flat', dt=None, pulsed_fraction=0.0, deadtime=0.0, orbit_par=None, frequency=1)[source]

Scramble event list, GTI by GTI.

Parameters:
event_list: :class:`stingray.events.Eventlist` object

Input event list

Returns:
new_event_list: stingray.events.Eventlist object

“Scrambled” event list

Other Parameters:
smooth_kind: str in [‘flat’, ‘smooth’, ‘pulsed’]

if ‘flat’, count the events GTI by GTI without caring about long-term variability; if ‘smooth’, try to calculate smooth light curve first

dt: float

If smooth_kind is ‘smooth’, bin the light curve with this bin time. Ignored for other values of smooth_kind

pulsed_fraction: float

If smooth_kind is ‘pulsed’, use this pulse fraction, defined as 2 A / B, where A is the amplitude of the sinusoid and B the maximum flux. Ignored for other values of smooth_kind

deadtime: float

Dead time in the data.

orbit_par: str

Parameter file for orbital modulation

Examples

>>> times = np.array([0.5, 134, 246, 344, 867])
>>> event_list = EventList(
...     times, gti=np.array([[0, 0.9], [111, 123.2], [125.123, 1000]]))
>>> new_event_list = scramble(event_list, 'smooth')
>>> assert new_event_list.time.size == times.size
>>> assert np.all(new_event_list.gti == event_list.gti)
>>> new_event_list = scramble(event_list, 'flat')
>>> assert new_event_list.time.size == times.size
>>> assert np.all(new_event_list.gti == event_list.gti)

hendrics.fspec module

Functions to calculate frequency spectra.

hendrics.fspec.average_periodograms(fspec_iterable, total=None)[source]

Sum a list (or iterable) of power density spectra.

Examples

>>> pds = AveragedPowerspectrum()
>>> pds.freq = np.asarray([1, 2, 3])
>>> pds.power = np.asarray([3, 3, 3])
>>> pds.power_err = np.asarray([0.1, 0.1, 0.1])
>>> pds.m = 1
>>> pds.fftlen = 128
>>> pds1 = copy.deepcopy(pds)
>>> pds1.m = 2
>>> tot_pds = average_periodograms([pds, pds1])
>>> assert np.allclose(tot_pds.power, pds.power)
>>> assert np.allclose(tot_pds.power_err, pds.power_err / np.sqrt(3))
>>> assert tot_pds.m == 3
hendrics.fspec.calc_cpds(lcfile1, lcfile2, fftlen, bintime=1, pdsrebin=1, outname=None, normalization='leahy', back_ctrate=0.0, noclobber=False, save_all=False, save_dyn=False, save_lcs=False, no_auxil=False, test=False, emin=None, emax=None, ignore_gti=False, lombscargle=False, fill_short_btis=None)[source]

Calculate the CPDS from a pair of input light curve files.

Parameters:
lcfile1 : str

The first light curve file

lcfile2 : str

The second light curve file

fftlen : float

The length of the chunks over which FFTs will be calculated, in seconds

Other Parameters:
save_dyn : bool

If True, save the dynamical power spectrum

bintime : float

The bin time. If different from that of the light curve, a rebinning is performed

pdsrebin : int

Rebin the PDS of this factor.

normalization : str

‘Leahy’, ‘frac’, ‘rms’, or any normalization accepted by stingray. Default ‘Leahy’

back_ctrate : float

The non-source count rate

noclobber : bool

If True, do not overwrite existing files

outname : str

Output file name for the cpds. Default: cpds.[nc|p]

emin : float, default None

Minimum energy of the photons

emax : float, default None

Maximum energy of the photons

lombscargle : bool

Use the Lomb-Scargle periodogram instead of AveragedPowerspectrum

hendrics.fspec.calc_fspec(files, fftlen, do_calc_pds=True, do_calc_cpds=True, do_calc_cospectrum=True, do_calc_lags=True, save_dyn=False, no_auxil=False, save_lcs=False, bintime=1, pdsrebin=1, outroot=None, normalization='leahy', nproc=1, back_ctrate=0.0, noclobber=False, ignore_instr=False, save_all=False, test=False, emin=None, emax=None, ignore_gti=False, lombscargle=False, fill_short_btis=None)[source]

Calculate the frequency spectra: the PDS, the cospectrum, …

Parameters:
files : list of str

List of input file names

fftlen : float

length of chunks to perform the FFT on.

Other Parameters:
save_dyn : bool

If True, save the dynamical power spectrum

bintime : float

The bin time. If different from that of the light curve, a rebinning is performed

pdsrebin : int

Rebin the PDS of this factor.

normalization : str

‘Leahy’ [3] or ‘rms’ [4] [5]. Default ‘Leahy’.

back_ctrate : float

The non-source count rate

noclobber : bool

If True, do not overwrite existing files

outroot : str

Output file name root

nproc : int

Number of processors to use to parallelize the processing of multiple files

ignore_instr : bool

Ignore instruments; files are alternated in the two channels

emin : float, default None

Minimum energy of the photons

emax : float, default None

Maximum energy of the photons

lombscargle : bool

Use the Lomb-Scargle periodogram instead of AveragedPowerspectrum

References

[3] Leahy et al. 1983, ApJ, 266, 160.

[4] Belloni & Hasinger 1990, A&A, 230, 103

[5] Miyamoto et al. 1991, ApJ, 383, 784

hendrics.fspec.calc_pds(lcfile, fftlen, bintime=1, pdsrebin=1, normalization='leahy', back_ctrate=0.0, noclobber=False, outname=None, save_all=False, save_dyn=False, save_lcs=False, no_auxil=False, test=False, emin=None, emax=None, ignore_gti=False, lombscargle=False, fill_short_btis=None)[source]

Calculate the PDS from an input light curve file.

Parameters:
lcfile : str

The light curve file

fftlen : float

The length of the chunks over which FFTs will be calculated, in seconds

Other Parameters:
save_dyn : bool

If True, save the dynamical power spectrum

bintime : float

The bin time. If different from that of the light curve, a rebinning is performed

pdsrebin : int

Rebin the PDS of this factor.

normalization: str

‘Leahy’, ‘frac’, ‘rms’, or any normalization accepted by stingray. Default ‘Leahy’

back_ctrate : float

The non-source count rate

noclobber : bool

If True, do not overwrite existing files

outname : str

If specified, output file name. If not specified or None, the new file will have the same root as the input light curve and the ‘_pds’ suffix

emin : float, default None

Minimum energy of the photons

emax : float, default None

Maximum energy of the photons

lombscargle : bool

Use the Lomb-Scargle periodogram instead of AveragedPowerspectrum
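
A hedged end-to-end sketch: save a synthetic light curve in HENDRICS format with save_lcurve, then compute its PDS. It assumes HEN_FILE_EXTENSION is importable from hendrics.io; file names are illustrative.

>>> from stingray.lightcurve import Lightcurve
>>> from hendrics.io import save_lcurve, HEN_FILE_EXTENSION
>>> times = np.arange(0, 512, 0.5) + 0.25
>>> lc = Lightcurve(times, np.random.poisson(10, times.size), dt=0.5, gti=[[0, 512]])
>>> fname = 'test_lc' + HEN_FILE_EXTENSION
>>> save_lcurve(lc, fname)
>>> calc_pds(fname, fftlen=64, bintime=0.5, outname='test_pds' + HEN_FILE_EXTENSION)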

hendrics.fspec.dumpdyn(fname, plot=False)[source]
hendrics.fspec.dumpdyn_main(args=None)[source]

Main function called by the HENdumpdyn command line script.

hendrics.fspec.main(args=None)[source]

Main function called by the HENfspec command line script.

hendrics.fspec.sync_gtis(lc1, lc2)[source]

Sync gtis between light curves or event lists.

Has to work with new and old versions of stingray.

Examples

>>> from stingray.events import EventList
>>> from stingray.lightcurve import Lightcurve
>>> ev1 = EventList(
...     time=np.sort(np.random.uniform(1, 10, 3)), gti=[[1, 10]])
>>> ev2 = EventList(time=np.sort(np.random.uniform(0, 9, 4)), gti=[[0, 9]])
>>> e1, e2 = sync_gtis(ev1, ev2)
>>> assert np.allclose(e1.gti, [[1, 9]])
>>> assert np.allclose(e2.gti, [[1, 9]])
>>> lc1 = Lightcurve(
...     time=[0.5, 1.5, 2.5], counts=[2, 2, 3], dt=1, gti=[[0, 3]])
>>> lc2 = Lightcurve(
...     time=[1.5, 2.5, 3.5, 4.5], counts=[2, 2, 3, 3], dt=1, gti=[[1, 5]])
>>> lc1._apply_gtis = lc1.apply_gtis
>>> lc2._apply_gtis = lc2.apply_gtis
>>> l1, l2 = sync_gtis(lc1, lc2)
>>> assert np.allclose(l1.gti, [[1, 3]])
>>> assert np.allclose(l2.gti, [[1, 3]])

hendrics.io module

Functions to perform input/output operations.

class hendrics.io.EFPeriodogram(freq=None, stat=None, kind=None, nbin=None, N=None, oversample=None, M=None, pepoch=None, mjdref=None, peaks=None, peak_stat=None, best_fits=None, fdots=0, fddots=0, segment_size=1e+32, filename='', parfile=None, emin=None, emax=None, ncounts=None, upperlim=None)[source]

Bases: object

find_peaks(conflevel=99.0)[source]
hendrics.io.filter_energy(ev: EventList, emin: float, emax: float) tuple[EventList, str][source]

Filter event list by energy (or PI).

If an energy attribute is present, it is used. Otherwise, the function switches automatically to pi.

Examples

>>> import doctest
>>> from contextlib import redirect_stderr
>>> import sys
>>> time = np.arange(5)
>>> energy = np.array([0, 0, 30, 4, 1])
>>> events = EventList(time=time, energy=energy)
>>> ev_out, elabel = filter_energy(events, 3, None)
>>> assert np.all(ev_out.time == [2, 3])
>>> assert elabel == 'Energy'
>>> events = EventList(time=time, pi=energy)
>>> with warnings.catch_warnings(record=True) as w:
...     ev_out, elabel = filter_energy(events, None, 20)  
>>> assert "No energy information in event list" in str(w[-1].message)
>>> assert np.all(ev_out.time == [0, 1, 3, 4])
>>> assert elabel == 'PI'
>>> events = EventList(time=time, pi=energy)
>>> ev_out, elabel = filter_energy(events, None, None)  
>>> assert np.all(ev_out.time == time)
>>> assert elabel == 'PI'
>>> events = EventList(time=time)
>>> with redirect_stderr(sys.stdout):
...     ev_out, elabel = filter_energy(events, 3, None)  
ERROR:...No Energy or PI...
>>> assert np.all(ev_out.time == time)
>>> assert elabel == ''
hendrics.io.find_file_in_allowed_paths(fname, other_paths=None)[source]

Check if file exists at its own relative/absolute path, or elsewhere.

Parameters:
fname : str

The name of the file, with or without a path.

Other Parameters:
other_paths : list of str

list of other possible paths

hendrics.io.get_energy_from_events(ev)[source]
hendrics.io.get_file_type(fname, raw_data=False)[source]

Return the file type and its contents.

Only works for hendrics-format pickle or netcdf files, or stingray outputs.

hendrics.io.high_precision_keyword_read(hdr, keyword)[source]

Read FITS header keywords, also if split in two.

In the case where the keyword is split in two, like

MJDREF = MJDREFI + MJDREFF

in some missions, this function returns the summed value. Otherwise, it returns the content of the single keyword.

Parameters:
hdr : dict_like

The header structure, or a dictionary

keyword : str

The key to read in the header

Returns:
value : long double

The value of the key, or None if keyword not present

Examples

>>> hdr = dict(keywordS=1.25)
>>> assert high_precision_keyword_read(hdr, 'keywordS') == 1.25
>>> hdr = dict(keywordI=1, keywordF=0.25)
>>> assert high_precision_keyword_read(hdr, 'keywordS') == 1.25
hendrics.io.load_data(fname)[source]

Load generic data in hendrics format.

hendrics.io.load_events(fname)[source]

Load events from a file.

hendrics.io.load_folding(fname)[source]

Load PDS from a file.

hendrics.io.load_lcurve(fname)[source]

Load light curve from a file.

hendrics.io.load_model(modelstring)[source]
hendrics.io.load_pds(fname, nosub=False)[source]

Load PDS from a file.

hendrics.io.load_timeseries(fname)[source]

Load events from a file.

hendrics.io.main(args=None)[source]

Main function called by the HENreadfile command line script.

hendrics.io.main_filter_events(args=None)[source]
hendrics.io.print_fits_info(fits_file, hdu=1)[source]

Print general info about an observation.

hendrics.io.read_from_netcdf(fname)[source]

Read from a netCDF4 file.

hendrics.io.read_header_key(fits_file, key, hdu=1)[source]

Read the header key key from HDU hdu of a fits file.

Parameters:
fits_file: str
key: str

The keyword to be read

Other Parameters:
hdu : int
hendrics.io.recognize_stingray_table(obj)[source]

Examples

>>> obj = AveragedCrossspectrum()
>>> obj.freq = np.arange(10)
>>> obj.power = np.random.random(10)
>>> recognize_stingray_table(obj.to_astropy_table())
'AveragedPowerspectrum'
>>> obj.pds1 = obj.power
>>> recognize_stingray_table(obj.to_astropy_table())
'AveragedCrossspectrum'
>>> obj = EventList(np.arange(10))
>>> recognize_stingray_table(obj.to_astropy_table())
'EventList'
>>> obj = Lightcurve(np.arange(10), np.arange(10))
>>> recognize_stingray_table(obj.to_astropy_table())
'Lightcurve'
>>> obj = Table()
>>> recognize_stingray_table(obj)
Traceback (most recent call last):
...
ValueError: Object not recognized...
hendrics.io.ref_mjd(fits_file, hdu=1)[source]

Read MJDREFF + MJDREFI or, if that fails, MJDREF, from the FITS header.

Parameters:
fits_file : str
Returns:
mjdref : numpy.longdouble

the reference MJD

Other Parameters:
hdu : int
hendrics.io.remove_pds(fname)[source]

Remove the pds file and the directory with auxiliary information.

hendrics.io.save_as_ascii(cols, filename='out.txt', colnames=None, append=False)[source]

Save arrays as TXT file with respective errors.

hendrics.io.save_as_netcdf(vars, varnames, formats, fname)[source]

Save variables in a NetCDF4 file.

hendrics.io.save_as_qdp(arrays, errors=None, filename='out.qdp', mode='w')[source]

Save arrays in a QDP file.

Saves an array of variables, and possibly their errors, to a QDP file.

Parameters:
arrays: [array1, array2]

List of variables. All variables must be arrays and of the same length.

errors: [array1, array2]

List of errors. The order has to be the same as in arrays; each value can be: None, if no error is assigned; an array of the same length as the variable, for symmetric errors; or an array of len-2 lists for non-symmetric errors (e.g. [[errm1, errp1], [errm2, errp2], [errm3, errp3], …])

Other Parameters:
mode : str

the file access mode, to be passed to the open() function. Can be ‘w’ or ‘a’
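
A minimal usage sketch following the documented signature (file name is illustrative):

>>> x = np.arange(5)
>>> y = x ** 2.0
>>> yerr = np.full(5, 0.1)
>>> save_as_qdp([x, y], errors=[None, yerr], filename='example.qdp')
>>> assert os.path.exists('example.qdp')
>>> os.unlink('example.qdp')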

hendrics.io.save_data(struct, fname, ftype='data')[source]

Save generic data in hendrics format.

hendrics.io.save_events(eventlist, fname)[source]

Save events in a file.

Parameters:
eventlist: :class:`stingray.EventList` object

Event list to be saved

fname: str

Name of output file

hendrics.io.save_folding(efperiodogram, fname)[source]

Save PDS in a file.

hendrics.io.save_lcurve(lcurve, fname, lctype='Lightcurve')[source]

Save Light curve to file.

Parameters:
lcurve: :class:`stingray.Lightcurve` object

Event list to be saved

fname: str

Name of output file

hendrics.io.save_model(model, fname='model.p', constraints=None)[source]

Save best-fit models to data.

Parameters:
model : func or astropy.modeling.core.Model object

The model to be saved

fname : str, default ‘model.p’

The output file name

Other Parameters:
constraints: dict

Additional model constraints. Ignored for astropy models.
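
A minimal sketch using an astropy model (file name is illustrative; constraints are ignored for astropy models, as noted above):

>>> from astropy.modeling import models
>>> save_model(models.Const1D(amplitude=2.0), fname='example_model.p')
>>> assert os.path.exists('example_model.p')
>>> os.unlink('example_model.p')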

hendrics.io.save_pds(cpds, fname, save_all=False, save_dyn=False, no_auxil=False, save_lcs=False)[source]

Save PDS in a file.

hendrics.io.save_timeseries(timeseries, fname)[source]

Save a time series in a file.

Parameters:
timeseries: :class:`stingray.EventList` object

Event list to be saved

fname: str

Name of output file

hendrics.io.sort_files(files)[source]

Sort a list of HENDRICS files, looking at Tstart in each.

hendrics.lcurve module

Light curve-related functions.

hendrics.lcurve.baseline_main(args=None)[source]

Main function called by the HENbaselinesub command line script.

hendrics.lcurve.filter_lc_gtis(lc, safe_interval=None, delete=False, min_length=0, return_borders=False)[source]

Filter a light curve for GTIs.

Parameters:
lc : Lightcurve object

The input light curve

Returns:
newlc : Lightcurve object

The output light curve

borders : [[i0_0, i0_1], [i1_0, i1_1], …], optional

The indexes of the light curve corresponding to the borders of the GTIs. Returned if return_borders is set to True

Other Parameters:
safe_interval : float or [float, float]

Seconds to filter out at the start and end of each GTI. If single float, these safe windows are equal, otherwise the two numbers refer to the start and end of the GTI respectively

delete : bool

If delete is True, the intervals outside of GTIs are filtered out from the light curve. Otherwise, they are set to zero.

min_length : float

Minimum length of GTI. GTIs below this length will be removed.

return_borders : bool

If True, return also the indexes of the light curve corresponding to the borders of the GTIs

hendrics.lcurve.join_lightcurve_objs(lclist)[source]

Join light curves.

Light curves from different instruments are put in different channels. Light curves from the same time interval and instrument raise a ValueError.

Parameters:
lclist : list of Lightcurve objects

The list of light curves to join

Returns:
lcoutlist : joint light curves, one per instrument

See also

scrunch_lightcurves

Create a single light curve from input light curves.

Examples

>>> lcA = Lightcurve(np.arange(4), np.zeros(4))
>>> lcA.instr='BU'  # Test also case sensitivity
>>> lcB = Lightcurve(np.arange(4) + 4, [1, 3, 4, 5])
>>> lcB.instr='bu'
>>> lcC = join_lightcurve_objs((lcA, lcB))
>>> assert np.all(lcC['bu'].time == np.arange(8))
>>> assert np.all(lcC['bu'].counts == [0, 0, 0, 0, 1, 3, 4, 5])
hendrics.lcurve.join_lightcurves(lcfilelist, outfile='out_lc.p')[source]

Join light curves from different files.

Light curves from different instruments are put in different channels.

Parameters:
lcfilelist : list of str

List of input file names

outfile

Output light curve

See also

scrunch_lightcurves

Create a single light curve from input light curves.

hendrics.lcurve.lcurve_from_events(f, safe_interval=0, pi_interval=None, e_interval=None, min_length=0, gti_split=False, ignore_gtis=False, bintime=1.0, outdir=None, outfile=None, noclobber=False, deorbit_par=None, weight_on=None)[source]

Bin an event list in a light curve.

Parameters:
f : str

Input event file name

bintime : float

The bin time of the output light curve

Returns:
outfiles : list

List of output light curves

Other Parameters:
safe_interval : float or [float, float]

Seconds to filter out at the start and end of each GTI. If single float, these safe windows are equal, otherwise the two numbers refer to the start and end of the GTI respectively

pi_interval : [int, int]

PI channel interval to select. Default None, meaning that all PI channels are used

e_interval : [float, float]

Energy interval to select (only works if event list is calibrated with calibrate). Default None

min_length : float

GTIs below this length will be filtered out

gti_split : bool

If True, create one light curve for each good time interval

ignore_gtis : bool

Ignore good time intervals, and get a single light curve that includes possible gaps

outdir : str

Output directory

outfile : str

Output file

noclobber : bool

If True, do not overwrite existing files

hendrics.lcurve.lcurve_from_fits(fits_file, gtistring='GTI', timecolumn='TIME', ratecolumn=None, ratehdu=1, fracexp_limit=0.9, outfile=None, noclobber=False, outdir=None)[source]

Load a lightcurve from a fits file and save it in HENDRICS format.

Note

FITS light curve handling is still under testing. Absolute times might be incorrect depending on the light curve format.

Parameters:
fits_file : str

File name of the input light curve in FITS format

Returns:
outfile : [str]

Returned as a list with a single element for consistency with lcurve_from_events

Other Parameters:
gtistring : str

Name of the GTI extension in the FITS file

timecolumn : str

Name of the column containing times in the FITS file

ratecolumn : str

Name of the column containing rates in the FITS file

ratehdu : str or int

Name or index of the FITS extension containing the light curve

fracexp_limit : float

Minimum exposure fraction allowed

outfile : str

Output file name

noclobber : bool

If True, do not overwrite existing files

hendrics.lcurve.lcurve_from_txt(txt_file, outfile=None, noclobber=False, outdir=None, mjdref=None, gti=None)[source]

Load a lightcurve from a text file.

Parameters:
txt_file : str

File name of the input light curve in text format. Assumes two columns: time, counts. Times are seconds from MJDREF 55197.00076601852 (NuSTAR) if not otherwise specified.

Returns:
outfile : [str]

Returned as a list with a single element for consistency with lcurve_from_events

Other Parameters:
outfile : str

Output file name

noclobber : bool

If True, do not overwrite existing files

mjdref : float, default 55197.00076601852

the MJD time reference

gti : [[gti0_0, gti0_1], [gti1_0, gti1_1], …]

Good Time Intervals

hendrics.lcurve.main(args=None)[source]

Main function called by the HENlcurve command line script.

hendrics.lcurve.scrunch_lightcurve_objs(lclist)[source]

Create a single light curve from input light curves.

Light curves are appended when they cover different times, and summed when they fall in the same time range. This is done regardless of the channel or the instrument.

Parameters:
lcfilelist : list of stingray.lightcurve.Lightcurve objects

The list of light curves to scrunch

Returns:
lc : scrunched light curve

See also

join_lightcurves

Join light curves from different files

Examples

>>> lcA = Lightcurve(np.arange(4), np.ones(4))
>>> lcA.instr='bu1'
>>> lcB = Lightcurve(np.arange(4), [1, 3, 4, 5])
>>> lcB.instr='bu2'
>>> lcC = scrunch_lightcurve_objs((lcA, lcB))
>>> assert np.all(lcC.time == np.arange(4))
>>> assert np.all(lcC.counts == [2, 4, 5, 6])
>>> assert np.all(lcC.instr == 'bu1,bu2')
hendrics.lcurve.scrunch_lightcurves(lcfilelist, outfile='out_scrlc.p', save_joint=False)[source]

Create a single light curve from input light curves.

Light curves are appended when they cover different times, and summed when they fall in the same time range. This is done regardless of the channel or the instrument.

Parameters:
lcfilelist : list of str

The list of light curve files to scrunch

Returns:
time : array-like

The time array

lc

The new light curve

gti : [[gti0_0, gti0_1], [gti1_0, gti1_1], …]

Good Time Intervals

Other Parameters:
outfile : str

The output file name

save_joint : bool

If True, save the per-channel joint light curves

See also

join_lightcurves

Join light curves from different files

hendrics.lcurve.scrunch_main(args=None)[source]

Main function called by the HENscrunchlc command line script.

hendrics.modeling module

hendrics.modeling.main_model(args=None)[source]

Main function called by the HENfspec command line script.

hendrics.plot module

Quicklook plots.

hendrics.plot.main(args=None)[source]

Main function called by the HENplot command line script.

hendrics.plot.plot_color(file0, file1, xlog=None, ylog=None, figname=None, output_data_file=None)[source]
hendrics.plot.plot_cospectrum(fnames, figname=None, xlog=None, ylog=None, output_data_file=None)[source]

Plot the cospectra from a list of CPDSs, or a single one.

hendrics.plot.plot_folding(fnames, figname=None, xlog=None, ylog=None, output_data_file=None)[source]
hendrics.plot.plot_generic(fnames, vars, errs=None, figname=None, xlog=None, ylog=None, output_data_file=None)[source]

Generic plotting function.

hendrics.plot.plot_lc(lcfiles, figname=None, fromstart=False, xlog=None, ylog=None, output_data_file=None)[source]

Plot a list of light curve files, or a single one.

hendrics.plot.plot_pds(fnames, figname=None, xlog=None, ylog=None, output_data_file=None, white_sub=False)[source]

Plot a list of PDSs, or a single one.

hendrics.plot.plot_powercolors(fnames)[source]
hendrics.plot.rescale_plot_units(values)[source]

Rescale the values to an order of magnitude that allows better plotting.

Subtracts the mean (mean) from the values, then rescales the residuals to a comfortable order of magnitude (oom). If out are the rescaled values, the following should always hold:

out * 10**oom + mean == values
Parameters:
values: array-like

Input values to be rescaled

Returns:
mean: float

The mean of the input values, rounded to the order of magnitude of the data span

oom : int

The order of magnitude of the data span

values : array-like

The rescaled values

Examples

>>> values = np.arange(-0.003, 0.0032, 0.0002) + 5.0001
>>> mean, oom, rescaled = rescale_plot_units(values)
>>> assert mean == 5.0
>>> oom
-3
>>> assert np.allclose(rescaled * 10**oom + mean, values)
>>> values = np.arange(-3, 3.2, 0.2) + 5.0001
>>> mean, oom, rescaled = rescale_plot_units(values)
>>> assert oom == 0
>>> assert mean == 0.0
>>> assert np.allclose(rescaled, values)

hendrics.read_events module

Read and save event lists from FITS files.

hendrics.read_events.join_eventlists(event_file1, event_file2, new_event_file=None, ignore_instr=False)[source]

Join two event files.

Parameters:
event_file1 : str

First event file

event_file2 : str

Second event file

Returns:
new_event_file : str

Output event file

Other Parameters:
new_event_file : str, default None

Output event file. If not specified, hendrics.base.common_name is used to figure out a good name from the two input names.

ignore_instr : bool

Ignore errors about different instruments

hendrics.read_events.join_many_eventlists(eventfiles, new_event_file=None, ignore_instr=False)[source]

Join multiple event files.

Parameters:
event_files : list of str

List of event files

Returns:
new_event_file : str

Output event file

Other Parameters:
new_event_file : str, default None

Output event file. If not specified, ‘joint_ev’ + HEN_FILE_EXTENSION is used.

ignore_instr : bool

Ignore errors about different instruments

hendrics.read_events.main(args=None)[source]

Main function called by the HENreadevents command line script.

hendrics.read_events.main_join(args=None)[source]

Main function called by the HENjoinevents command line script.

hendrics.read_events.main_splitevents(args=None)[source]

Main function called by the HENsplitevents command line script.

hendrics.read_events.multiple_event_concatenate(event_lists)[source]

Join multiple EventList objects into one.

If all the input event lists are empty, an empty EventList is returned.

GTIs are crossed if the event lists are over a common time interval, and appended otherwise.

pi and pha remain None if they are None in both. Otherwise, 0 is used as a default value for the EventList where they were None.

Parameters:
event_lists : list of EventList object

EventList objects that we are joining

Returns:
ev_new : EventList object

The resulting EventList object.

Examples

>>> gti = [[0, 5]]
>>> ev1 = EventList(time=[0.3, 1], gti=gti, energy=[3, 4], pi=None)
>>> ev2 = EventList(time=[2, 3], gti=gti, energy=[3, 4])
>>> ev3 = EventList(time=[4, 4.5], gti=gti, energy=[3, 4])
>>> ev_new = multiple_event_concatenate([ev1, ev2, ev3])
>>> assert np.allclose(ev_new.time, [0.3, 1, 2, 3, 4, 4.5])
>>> assert np.allclose(ev_new.energy, [3, 4, 3, 4, 3, 4])
>>> assert ev_new.pi is None
hendrics.read_events.split_eventlist(fname, max_length, overlap=None)[source]
hendrics.read_events.split_eventlist_at_mjd(fname, mjd)[source]
hendrics.read_events.treat_event_file(filename, noclobber=False, gti_split=False, min_length=4, gtistring=None, length_split=None, randomize_by=None, discard_calibration=False, split_by_detector=True, additional_columns=None, fill_small_gaps=None, bin_time_for_occultations=None, safe_interval=None)[source]

Read data from an event file, with no external GTI information.

Parameters:
filename : str
Other Parameters:
noclobber: bool

if a file is present, do not overwrite it

gtistring: str

comma-separated set of GTI strings to consider

gti_split: bool

split the file in multiple chunks, containing one GTI each

length_split: float, default None

split the file in multiple chunks, with approximately this length

min_length: float

minimum length of GTIs accepted (only if gti_split is True or length_split is not None)

discard_calibration: bool

discard the automatic calibration done by Stingray (if any)

split_by_detector: bool, default True

split data from different detectors

fill_small_gaps: float

fill gaps between GTIs with random data, if shorter than this amount

bin_time_for_occultations: float

Create a light curve with this bin time and infer occultations not recorded in GTIs. The rule is that the flux drops to zero and the average counts per bin are significantly above 25 ct/s.

safe_interval : float or [float, float]

A safe interval to exclude at both ends (if single float) or the start and the end (if pair of values) of GTIs.

hendrics.rebin module

Functions to rebin light curves and frequency spectra.

hendrics.rebin.main(args=None)[source]

Main function called by the HENrebin command line script.

hendrics.rebin.rebin_file(filename, rebin)[source]

Rebin the contents of a file, be it a light curve or a spectrum.

hendrics.save_as_xspec module

Functions to save data in a Xspec-readable format.

hendrics.save_as_xspec.main(args=None)[source]

Main function called by the HEN2xspec command line script.

hendrics.save_as_xspec.save_as_xspec(fname, direct_save=False, save_lags=True)[source]

Save frequency spectra in a format readable to FTOOLS and Xspec.

Parameters:
fname : str

Input HENDRICS frequency spectrum file name

direct_save : bool

If True, also call flx2xsp to produce the output .pha and .rsp files. If False (default), flx2xsp has to be called from the user

Notes

Uses the method described by Ingram and Done in Appendix A of this paper: https://arxiv.org/pdf/1108.0789

hendrics.sum_fspec module

Function to sum frequency spectra.

hendrics.sum_fspec.main(args=None)[source]

Main function called by the HENsumfspec command line script.

hendrics.sum_fspec.sum_fspec(files, outname=None)[source]

Take a bunch of (C)PDSs and sum them.

hendrics.timelags module

hendrics.timelags.main(args=None)[source]

Module contents

This is proposed as an Astropy affiliated package.