Synopsis
Run CIAO tools from Python as if they were functions (CIAO contributed package).
Syntax
from ciao_contrib.runtool import *

Run the dmcopy tool:

dmcopy("evt2.fits[bin sky=::8]", "img.fits")

Run dmstat:

dmstat.punlearn()
dmstat.median = True
dmstat("src.dat[cols z,fx]", verbose=0)
print("The medians are {0}".format(dmstat.out_median))
Description
The ciao_contrib.runtool module allows CIAO tools to be run as if they were Python functions, and supports a pset-like mode where tool parameters can be set before the tool is run, named parameters when calling the tool, and easy access to the parameter values after the tool has completed. Parameter values are given using native Python types - e.g. True and False for boolean parameters.
The ciao_contrib.runtool module is provided as part of the CIAO contributed scripts package.
Loading the routines
The module can be loaded by saying:
from ciao_contrib.runtool import *
Note that it is not provided by the ciao_contrib.all module, unlike the other ciao_contrib modules.
Tools and parameter files
The runtool module provides a function for each major tool in the CIAO bin and contrib/bin directories that has a parameter file. It also provides functions to set and get settings for those parameter files without an associated tool, such as ardlib, lut and colors.
Extra routines
The following routines are also provided:
- get_pfiles() - return the user or system portion of the PFILES environment variable;
- set_pfiles() - easily change the user portion of the PFILES environment variable;
- make_tool() - create a routine that can be used to call a CIAO tool;
- list_tools() - returns a list of the tools and parameter files that are covered by this module;
- new_tmpdir() - a context manager that creates a temporary directory and ensures that the directory is removed once the block has finished;
- new_pfiles_environment() - a context manager that creates a temporary directory and uses it as the user path of the PFILES environment variable for the duration of the block; on exit the PFILES variable is restored to its original value and the temporary directory is removed;
- add_tool_history() - adds a CIAO history block to one or more files to indicate that a tool has been run with a given set of parameters (this block can be read by the dmhistory tool).
The new_pfiles_environment context manager should be used if you are running multiple copies of the same CIAO tool simultaneously (either using Python's multiprocessing module or by running several copies of the same script). See the "Running tools in parallel" section below for more information.
The new_tmpdir context manager is useful when you need access to a unique temporary directory, as shown below in the "Changing the temporary directory used by a tool" example.
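The core behavior of new_tmpdir - create a directory, hand it to the block, and remove it whatever happens - can be sketched in pure Python (this is an illustrative sketch, not the module's implementation; the real routine also supports choosing the parent directory, e.g. via ASCDS_WORK_PATH):

```python
import contextlib
import os
import shutil
import tempfile


@contextlib.contextmanager
def tmpdir_sketch(prefix="tmp."):
    """Create a temporary directory and remove it on exit.

    A minimal sketch of a new_tmpdir-style context manager: the
    directory is deleted even if the block raises an error.
    """
    dname = tempfile.mkdtemp(prefix=prefix)
    try:
        yield dname
    finally:
        shutil.rmtree(dname)
```

Used as `with tmpdir_sketch() as tmpdir: ...`, the directory exists only for the duration of the block.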
For more information on these tools please use the Python help system - e.g.
>>> import ciao_contrib.runtool as rt
>>> help(rt.new_pfiles_environment)
Running a tool
A tool can be run by calling the routine with all the arguments it needs; required arguments are given either by positional order or as named arguments, whereas optional arguments must always be given as named arguments. As an example:
>>> dmcopy("in.fits[sky=region(src.reg)][bin sky=::1]", "img.fits")
>>> dmstat(infile="img.fits", median=True, centroid=False)
Parameters can also be set using a parameter-style interface of <toolname>.<parname> before calling the tool as shown below:
>>> dmcopy.outfile = "img.fits"
>>> dmcopy.infile = "in.fits[sky=region(src.reg)][bin sky=::1]"
>>> dmcopy()

>>> dmstat.median = True
>>> dmstat.centroid = False
>>> dmstat(infile="img.fits")
You can mix both approaches, as shown above in the last dmstat call.
Running tools in parallel
The Python multiprocessing module makes it possible to run tools in parallel, and is used to provide the multiprocessing feature in scripts like fluximage and merge_obs. Alternatively, you can run several copies of your script at the same time. In either case, you may see errors or problems because the same parameter file is being used for each run of the tool (this can lead to invalid results, as inconsistent parameter settings are used, or the tool can fail). To avoid this, use the new_pfiles_environment context manager to create a new copy of your PFILES environment. An example of its use is shown below (this uses Sherpa's parallel_map routine, which provides a simple way to use the multiprocessing module):
import ciao_contrib.runtool as rt
import sherpa.utils

# The asphist/mkinstmap/mkexpmap calls below will use their own
# PFILES directory and so will not collide with other runs of
# these tools (or changes to the default ardlib parameter file,
# since this has been copied over to the new directory).
#
def getemap(args):
    """Create an ACIS exposure map for the ccd and grid that can
    be run in parallel.

    The routine can only accept a single argument as it is to be
    called via Sherpa's parallel_map routine. As we need multiple
    values, we therefore use a tuple.

    This routine is only for didactic purposes since it assumes a
    number of things, such as file names, that will not be true
    in general.
    """

    (ccd, gridstr) = args

    # We copy over the ardlib so that if the bad-pixel file
    # has been set, its value is used in the new environment.
    #
    with rt.new_pfiles_environment(ardlib=True):

        afile = "asphist{0}.fits".format(ccd)
        ah = rt.make_tool("asphist")
        ah(infile="asol1.fits",
           outfile=afile,
           evtfile="evt2.fits[ccd_id={0}]".format(ccd),
           clobber=True)

        ifile = "imap{0}.fits".format(ccd)
        mki = rt.make_tool("mkinstmap")
        mki(outfile=ifile,
            monoenergy=1.7,
            pixelgrid="1:1024:1,1:1024:1",
            obsfile=afile + "[asphist]",
            detsubsys="ACIS-{0}".format(ccd),
            maskfile="msk1.fits",
            clobber=True)

        efile = "emap{0}.fits".format(ccd)
        mke = rt.make_tool("mkexpmap")
        mke(asphistfile=afile,
            outfile=efile,
            instmapfile=ifile,
            xygrid=gridstr,
            clobber=True)

# Create the arguments for CCDs 0-3 and 6
grid = "2824.5:6552.5:#932,2868.5:5328.5:#615"
ccds = [0, 1, 2, 3, 6]
args = [(ccd, grid) for ccd in ccds]

# Run getemap on all available CPUs
sherpa.utils.parallel_map(getemap, args)
Note that new_pfiles_environment automatically creates the temporary parameter directory, changes the PFILES environment variable, runs your code, and then restores the PFILES variable and deletes the temporary directory for you.
The location of the temporary directory is determined by the ASCDS_WORK_PATH environment variable.
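The save-and-restore behavior that new_pfiles_environment applies to PFILES can be sketched in pure Python (an illustrative sketch only; the real routine also creates the temporary directory and optionally copies parameter files such as ardlib into it):

```python
import contextlib
import os


@contextlib.contextmanager
def pfiles_sketch(userdir, varname="PFILES"):
    """Temporarily make userdir the user portion of PFILES.

    PFILES has the form "userpath;systempath"; only the user part
    is replaced, and the original value is restored on exit even
    if the block raises an error.
    """
    orig = os.environ.get(varname)
    syspath = orig.split(";", 1)[1] if orig and ";" in orig else ""
    os.environ[varname] = "{0};{1}".format(userdir, syspath)
    try:
        yield
    finally:
        if orig is None:
            del os.environ[varname]
        else:
            os.environ[varname] = orig
```

Any tool run inside the `with` block then reads and writes its parameter file in `userdir` rather than the shared user directory.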
Clearing, setting, and viewing parameter settings
Each tool has a punlearn() method which will reset the values to the CIAO defaults. As shown above, parameter values can be set either when calling the tool or via the <toolname>.<parname> approach, and the current settings can be displayed using print. As an example
>>> dmjoin.punlearn()
>>> dmjoin.interpolate = "closest"
>>> print(dmjoin)
Parameters for dmjoin:

Required parameters:
              infile =                  Input file
            joinfile =                  Input join file
             outfile =                  Output file
                join =                  Join col

Optional parameters:
         interpolate = closest          Interpolate mode
             verbose = 0                Debug Level(0-5)
             clobber = False            Clobber
There are also write_params() and read_params() methods that can be used to create a .par file from the current settings or read values in from a given .par file. See "help <toolname>.read_params" or "help <toolname>.write_params" for more information on these methods.
Support for unique parameter prefixes
As with the command line, parameter names can be shortened to any unique prefix. This means that dmjoin.inf can be used to read or write the infile parameter of dmjoin, but dmjoin.joi will fail since it matches both the join and joinfile parameters.
This support also holds when calling a tool, so
>>> dmcopy(infile, outfile, cl=True)
calls dmcopy with the clobber parameter set to yes.
An error will be raised if the prefix is not unique; for instance
>>> dmjoin.joi
AttributeError: Multiple matches for dmjoin parameter 'joi', choose from: joinfile join
A SyntaxError will occur if a Python reserved word - such as "in" or "for" - is used as the parameter name.
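The lookup described above can be sketched in pure Python (an illustrative sketch, not the module's implementation; for instance the real code may resolve exact matches differently):

```python
def match_prefix(prefix, parnames):
    """Return the single parameter name matching the prefix.

    An exact name always wins; otherwise the prefix must match
    exactly one parameter, mirroring the behavior described above.
    """
    if prefix in parnames:
        return prefix
    matches = [p for p in parnames if p.startswith(prefix)]
    if len(matches) == 1:
        return matches[0]
    if not matches:
        raise AttributeError("No match for parameter '{0}'".format(prefix))
    raise AttributeError(
        "Multiple matches for parameter '{0}', choose from: {1}".format(
            prefix, " ".join(matches)))
```

With the dmjoin parameter list, "inf" resolves to infile while "joi" raises an error, as in the transcript above.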
Stack support
There is limited support for stacks in this module: if a string or file parameter is given a list then the elements are converted into a comma-separated string, and this is used as the parameter value.
The following example averages all the files that match img&lt;n&gt;.fits, with &lt;n&gt; going from 1 to 9, to create avg.fits:
>>> import glob
>>> from ciao_contrib.runtool import dmimgcalc
>>> dmimgcalc.punlearn()
>>> infiles = glob.glob("img[1-9].fits")
>>> nf = len(infiles)
>>> inames = ["img{0}".format(i) for i in range(1, nf + 1)]
>>> opstr = "imgout=({0})/{1}.0".format("+".join(inames), nf)
>>> dmimgcalc(infiles, out="avg.fits", op=opstr)
The build() routine from the stk module can be used to convert a parameter value to an array of values, and the make_stackfile() routine from the ciao_contrib.stacklib module can be used to create a temporary stack file from an array.
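The list-to-string conversion described at the start of this section is essentially a comma join; a minimal sketch (illustrative only - the module itself handles further cases, such as writing very long stacks out to a temporary stack file):

```python
def as_stack(value):
    """Convert a list into the comma-separated form used for stacks.

    Strings are passed through unchanged, so explicit stack syntax
    such as "@files.lis" still works.
    """
    if isinstance(value, str):
        return value
    return ",".join(str(v) for v in value)
```

So a list like `["img1.fits", "img2.fits"]` becomes the single parameter value `"img1.fits,img2.fits"`.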
Parameter constraints
If a parameter can only be one of a set of values then the parameter value can be set to a unique substring, just as on the command line. As an example, the interpolate parameter of dmjoin can be one of "linear", "first", "last", "closest", "furthest", "minimum", or "maximum", so the following are all valid:
>>> dmjoin.interp = "min"
>>> dmjoin("a.dat", "b.dat", "c.dat", "x", interp="lin")
An error occurs if a parameter is set to an invalid value - for example, a number outside the valid range or a string that does not match one of the allowed choices:
>>> dmjoin.verbose = 10
ValueError: dmjoin.verbose must be <= 5 but set to 10

>>> dmjoin.interp = "foo"
ValueError: The parameter interpolate was set to foo when it must be one of:
  linear first last closest furthest minimum maximum
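The two checks shown above - a numeric range and a unique-substring match against a fixed option set - can be sketched as follows (an illustrative sketch; the real module's error messages and matching rules may differ in detail):

```python
def check_range(name, value, vmin, vmax):
    """Reject values outside the [vmin, vmax] range."""
    if value < vmin or value > vmax:
        raise ValueError(
            "{0} must be {1} <= value <= {2} but set to {3}".format(
                name, vmin, vmax, value))
    return value


def check_option(name, value, options):
    """Resolve a unique substring to one of the allowed options."""
    matches = [o for o in options if value in o]
    if len(matches) != 1:
        raise ValueError(
            "The parameter {0} was set to {1} when it must be one of: "
            "{2}".format(name, value, " ".join(options)))
    return matches[0]
```

For instance "min" resolves to "minimum" for the dmjoin interpolate parameter, while "foo" and verbose=10 are rejected.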
The return value
If there is no screen output from the tool then the tool returns None, otherwise it returns a single string containing both the STDOUT and STDERR channels. This output can either be stored in a variable or - if used from an interactive environment such as IPython or Sherpa - it will be automatically displayed. In the following, the dmcopy call creates no screen output, so it returns None and nothing is displayed, whereas the dmstat output is shown:
>>> dmcopy(infile, outfile)
>>> dmstat(outfile)
y
    min:   10      @: 1
    max:   87.3    @: 5
    mean:  35.3
    sigma: 28.55548984
    sum:   176.5
    good:  5
    null:  0

>>> out = dmstat(outfile)
>>> lines = out.split("\n")
>>> print(lines[1])
    min:   10      @: 1
What happens if the tool fails?
An IOError is raised if the tool returns a non-zero exit status. The error contains the screen output of the tool, as shown below where an invalid file has been used:
>>> dmstat("simple.dat[cols z]")
IOError: An error occurred while running 'dmstat':
  # DMSTAT (CIAO 4.11): could not open infile simple.dat[cols z].
The return value can be retrieved using the get_runtime_details() method of the tool, which is described below.
Unfortunately not all CIAO tools set the exit status correctly on error, so these will appear to have run successfully even on failure.
How long did the tool run for?
The get_runtime_details() method returns a dictionary listing information on the last time the tool was run via this interface. This information is cleared when the punlearn() method is used. The keywords and values are listed below. Note that no keyword is guaranteed to be present, and that the timings are not intended for precise benchmarking, so should not be considered accurate to better than about one second.
Keyword | Value |
---|---|
code | The return value for the tool (0 indicates success, otherwise there was a failure). |
start | The start time, as returned by the time.localtime() routine. |
end | The end time, as returned by the time.localtime() routine. |
delta | A string representing the run time. It is useful for a simple display, but use the start and end fields for a more accurate representation. |
args | An array of the command-line arguments used to run the tool, stored as (name,value) pairs. |
output | The screen output for the tool (includes both STDOUT and STDERR in the same string). This is not cleaned up to remove trailing white space as is done for the return value when calling the tool. |
See the documentation for the Python time module for more information on how to use the start and end values.
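Since the start and end values are time.struct_time objects, time.mktime can convert them to seconds for a simple elapsed-time calculation; a small sketch (the dictionary layout follows the table above):

```python
import time


def elapsed_seconds(details):
    """Return the run time in seconds from a get_runtime_details dict.

    Both the "start" and "end" entries are time.struct_time values,
    so convert each to seconds since the epoch and subtract.
    """
    return time.mktime(details["end"]) - time.mktime(details["start"])
```

For example, `elapsed_seconds(dmstat.get_runtime_details())` would give the run time of the last dmstat call.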
Examples
Example 1
Providing all the arguments in the call
In this example we run the dmcopy tool and provide all the necessary arguments when the routine is called. The dmcopy.punlearn routine is used to ensure the settings are at their default values.
>>> dmcopy.punlearn()
>>> dmcopy("evt2.fits[ccd_id=7]", "ccd7.fits", clobber=True)
We adapt this to loop through the ccd_id values 0, 1, 2, 3, and 6 and create the files "ccd<n>.fits". We also take advantage of the support for using parameter prefixes to shorten "clobber" to "cl".
>>> for ccd in [0, 1, 2, 3, 6]:
...     infile = "evt2.fits[ccd_id={0}]".format(ccd)
...     outfile = "ccd{0}.fits".format(ccd)
...     dmcopy(infile, outfile, cl=True)
...     print("Created: {0}".format(outfile))
...
Created: ccd0.fits
Created: ccd1.fits
Created: ccd2.fits
Created: ccd3.fits
Created: ccd6.fits
Example 2
Setting the arguments before the call
The "pset" approach can also be used to set the arguments before the call. The original example above can be re-written as:
>>> dmcopy.punlearn()
>>> dmcopy.infile = "evt2.fits[ccd_id=7]"
>>> dmcopy.outfile = "ccd7.fits"
>>> dmcopy.cl = True
>>> dmcopy()
Example 3
Mixing both approaches
Both approaches can be combined. For instance, if you wish to call a tool multiple times with some parameters fixed you may wish to use the pset approach for those parameters and call the tool with those arguments that change.
Here we set the clobber argument and then call dmcopy with the infile and outfile arguments.
>>> dmcopy.punlearn()
>>> dmcopy.cl = True
>>> dmcopy(infile, outfile)
Example 4
Accessing parameter values after a tool runs
Some tools set parameter values when run. These values can be inspected and used as shown below, which includes setting a parameter of the mkexpmap tool based on the output of get_sky_limits.
>>> get_sky_limits("img.fits", verbose=0)
>>> dmf = get_sky_limits.dmfilter
>>> print("Filter: {0}".format(dmf))
Filter: X=4047.5:4098.5:#51,Y=3920.5:3971.5:#51
>>> mkexpmap.xygrid = get_sky_limits.xygrid
The whole set of parameters can also be displayed:
>>> print(get_sky_limits)
Parameters for get_sky_limits:

Required parameters:
               image = img.fits   Image for which you want to know the binning

Optional parameters:
           precision = 1          Precision [# decimal points] for output numbers
            dmfilter = X=4047.5:4098.5:#51,Y=3920.5:3971.5:#51
                                  DM filter syntax to match image
              xygrid = 4047.5:4098.5:#51,3920.5:3971.5:#51
                                  xygrid parameter for mkexpmap to match image
             verbose = 0          Debug Level (0-5)
Example 5
Multiple copies of a tool
Multiple versions of a tool can be created, each with its own set of parameters. As a simple case, if you want to run dmstat with the median parameter set to True or False you could try the following:
>>> dms1 = make_tool("dmstat")
>>> dms1.median = False
>>> dms2 = make_tool("dmstat")
>>> dms2.median = True
Then using dms1 or dms2 will run dmstat with the median parameter set to False and True respectively:
>>> dms1("lc.fits[cols count_rate]")
>>> dms2("lc.fits[cols count_rate]")
Example 6
Changing the temporary directory used by a tool
Several tools allow you to change the location of the temporary directory used to store intermediate products. This parameter should be changed if multiple copies of the tool are being run simultaneously, to avoid clashes and potential tool failure. One way to do this is to use the new_tmpdir context manager, which creates a temporary directory and guarantees its removal. As an example:
with new_tmpdir() as tmpdir:
    # tmpdir has been created and can be used in this block
    bmap = make_tool("create_bkg_map")
    bmap.tmpdir = tmpdir
    ...
    bmap(...)

# at this point the temporary directory is guaranteed to have been
# deleted, even if there was an error in the block of code above
#
...
Controlling the location of temporary files and directories
The ASCDS_WORK_PATH environment variable (not TMPDIR) determines the location used for temporary files and directories. This location is used when running CIAO tools - those that are instances of the CIAOToolParFile class - since a copy of the parameter file is created to ensure that tools can be run in parallel without conflict. It is also used by the new_tmpdir and new_pfiles_environment context managers.
Routines with no corresponding tool
There are a number of CIAO parameter files which have no corresponding tool - for instance ardlib, geom, colors, and lut. These are provided in this module for reading and setting parameters, but they can not be called. Also, as discussed below in the "Parameter files" section, setting these parameters does not change the on-disk parameter file unless you call the write_params method.
Here is an example of accessing the lut parameter file:
>>> print(lut)
Parameters for lut:

                   a = ${ASCDS_CALIB}/a.lut           Color lookup table
                aips = ${ASCDS_CALIB}/aips.lut        Color lookup table
                   b = ${ASCDS_CALIB}/b.lut           Color lookup table
                  bb = ${ASCDS_CALIB}/bb.lut          Color lookup table
                blue = ${ASCDS_CALIB}/blue.lut        Color lookup table
               color = ${ASCDS_CALIB}/color.lut       Color lookup table
                cool = ${ASCDS_CALIB}/cool.lut        Color lookup table
               green = ${ASCDS_CALIB}/green.lut       Color lookup table
                grey = ${ASCDS_CALIB}/grey.lut        Color lookup table
              halley = ${ASCDS_CALIB}/halley.lut      Color lookup table
               heaob = ${ASCDS_CALIB}/heaob.lut       Color lookup table
                heat = ${ASCDS_CALIB}/heat.lut        Color lookup table
                 hsv = ${ASCDS_CALIB}/hsv.lut         Color lookup table
                  i8 = ${ASCDS_CALIB}/i8.lut          Color lookup table
            rainbow1 = ${ASCDS_CALIB}/rainbow1.lut    Color lookup table
            rainbow2 = ${ASCDS_CALIB}/rainbow2.lut    Color lookup table
                ramp = ${ASCDS_CALIB}/ramp.lut        Color lookup table
                 red = ${ASCDS_CALIB}/red.lut         Color lookup table
                 sls = ${ASCDS_CALIB}/sls.lut         Color lookup table
           staircase = ${ASCDS_CALIB}/staircase.lut   Color lookup table
            standard = ${ASCDS_CALIB}/standard.lut    Color lookup table

>>> hl = lut.halley
>>> print(os.path.expandvars(hl))
/soft/ciao-4.11/data/halley.lut
(assuming CIAO is installed in /soft/ciao-4.11/), but it can not be called:
>>> lut(b="tst.lut")
TypeError: 'CIAOParameter' object is not callable
Parameter files
Using these routines does not change any parameter file on disk, unless you use one of the routines listed below. This means it is safe to use this module to run tools from a script whilst using CIAO tools from the command line; the parameter files used by the command-line tools will not be over-written or changed by the Python script.
Any routine that is an instance of the CIAOToolParFile class - which, as of CIAO 4.9, is all tools except for axbary, dmgti, evalpos, fullgarf, mean_energy_map, pileup_map, tgdetect, and wavdetect - will not change the on-disk parameter file.
Routines that do change the on-disk parameter file
Running instances of the CIAOToolDirect class - so axbary, dmgti, evalpos, fullgarf, mean_energy_map, pileup_map, tgdetect, and wavdetect - will change the on-disk parameter file.
How to check whether a parameter file will be changed?
The class of the routine - which can be returned by the type() call or checked using isinstance() - determines whether a parameter file will be changed ("CIAOToolDirect") or not ("CIAOToolParFile"). For instance,
>>> type(dmgti)
<class 'ciao_contrib.runtool.CIAOToolDirect'>
>>> type(dmstat)
<class 'ciao_contrib.runtool.CIAOToolParFile'>
>>> type(wavdetect)
<class 'ciao_contrib.runtool.CIAOToolDirect'>
These class names are rather confusing and may be changed in a later release.
WARNING: use of ardlib and other such parameter files
Many instrument-specific tools use parameter files like ardlib and geom to find certain files. When the tool is run, they still use the on-disk version of the file and not the values stored in the Python object of the same name. So the setting
>>> ardlib.AXAF_ACIS0_BADPIX = "bpix.fits[ccd_id=0]"
will not be picked up when a tool such as mkinstmap is run. One solution is to use the write_params method, which is described further below, to write out the ardlib parameter file to disk after making changes.
>>> ardlib.AXAF_ACIS0_BADPIX = "bpix.fits[ccd_id=0]"
>>> ardlib.write_params()
Updating a parameter file
If you want to write the current set of parameters to disk, you can use the "write_params" method. As an example, where e1, e2 and e3 are numbers that have been calculated elsewhere
>>> expr = "imgout=(img1*{0}+img2*{1})+{2}".format(e1, e2, e3)
>>> dmimgcalc.op = expr
>>> dmimgcalc.write_params()
will write the expression out to the dmimgcalc parameter file (e.g. try "!plist dmimgcalc" to see the result).
To write to a specific file, give a file name as the argument to write_params. A ".par" suffix will be appended if the filename does not contain one.
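The suffix rule described above amounts to a one-line check; a sketch (illustrative only, not the module's code):

```python
def par_filename(fname):
    """Append the ".par" suffix unless it is already present."""
    return fname if fname.endswith(".par") else fname + ".par"
```

So passing "myardlib" to write_params would write to "myardlib.par", while "myardlib.par" is used as given.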
Updating to match a parameter file
The default values - and those used when the punlearn() method is used - are taken from the CIAO defaults. To update the values to match those from an on-disk file use the "read_params" method:
>>> dmimgcalc.read_params()
To read from a specific file, give a file name as the argument to read_params. A ".par" suffix will be appended if the filename does not contain one.
Changing the PFILES environment variable
The set_pfiles() routine changes the user portion of the PFILES environment variable. This controls where the tools look for ancillary parameter files (e.g. ardlib) and where the read_params and write_params methods default to when no explicit filename is given. See 'help set_pfiles' for more information.
The get_pfiles() routine can be used to get the current user or system portions of the PFILES environment variable. See 'help get_pfiles' for more information.
The new_pfiles_environment() context manager can be used to automatically handle the creation and clean up of a directory, as well as changing the PFILES environment variable. For instance:
def getemap(ccd, gridstr):
    """Create an ACIS exposure map for the given CCD and using
    gridstr as the xygrid parameter of mkexpmap.

    The input files are hard coded to simple names for simplicity.
    """

    with new_pfiles_environment(ardlib=True):

        afile = "asphist{0}.fits".format(ccd)
        ah = make_tool("asphist")
        ah(infile="asol1.fits",
           outfile=afile,
           evtfile="evt2.fits[ccd_id={0}]".format(ccd),
           clobber=True)

        ifile = "imap{0}.fits".format(ccd)
        mki = make_tool("mkinstmap")
        mki(outfile=ifile,
            monoenergy=1.7,
            pixelgrid="1:1024:1,1:1024:1",
            obsfile=afile + "[asphist]",
            detsubsys="ACIS-{0}".format(ccd),
            maskfile="msk1.fits",
            clobber=True)

        efile = "emap{0}.fits".format(ccd)
        mke = make_tool("mkexpmap")
        mke(asphistfile=afile,
            outfile=efile,
            instmapfile=ifile,
            xygrid=gridstr,
            clobber=True)
defines a routine, getemap(), which will run the asphist, mkinstmap, and mkexpmap tools using a temporary directory and PFILES setting. This directory will be automatically removed, and the PFILES environment variable restored, at the end of the routine, even if one of the tools fails.
Differences to CIAO tools
No mode parameter
All tools are run with the mode parameter set to "hl". This avoids the tool trying to prompt you for missing parameter values, which would cause it to appear to hang, since the prompt would never be seen by the user.
Missing parameters
As well as the mode parameter, there are a few tools which have a parameter that can only be set to a single value. These parameters are excluded from the interface.
Parameter values
Parameters should be set using the appropriate Python type, namely booleans, integers, floats, or strings. For floating-point values, None is used to indicate the "INDEF" setting; None can also be used for filenames and strings to indicate the value "". File names are given as strings, and you do not need to add extra quotes if the name contains a space, so the following is valid:
>>> dmstat("evt2.fits[sky=region(src.reg)][cols energy]")
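The mapping from native Python types to the strings a parameter library expects can be sketched as follows (an illustrative sketch only; the real module handles more cases, such as using "INDEF" rather than "" for floating-point parameters set to None):

```python
def to_parvalue(value):
    """Convert a native Python value to a parameter-style string.

    Booleans become "yes"/"no" and None becomes "" (for a
    floating-point parameter the real interface uses "INDEF").
    """
    if value is None:
        return ""
    if isinstance(value, bool):   # check bool before int: True is an int
        return "yes" if value else "no"
    return str(value)
```

For example, clobber=True is sent to the tool as clobber=yes.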
Redirected values
There is limited support for the use of parameter redirects - the use of ")<parname>" to use the value from another parameter - but it only supports redirects to the same parameter file.
Parameter names
When required, parameter names are adjusted to make sure they are valid Python. At present the only change used is to convert any "-" characters to "_"; this is only required for several parameters in ardlib. Purely numeric parameter names are ignored; this only occurs for the "helper" parameters of dmtabfilt.
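The adjustment described above can be sketched in a couple of lines (illustrative only; the parameter names in the example are hypothetical):

```python
def python_parnames(parnames):
    """Adjust parameter names so they are valid Python attributes.

    "-" becomes "_", and purely numeric names are dropped, as
    described above.
    """
    return [p.replace("-", "_") for p in parnames if not p.isdigit()]
```

So a hypothetical parameter list of `["det-subsys", "infile", "3"]` would be exposed as the attributes det_subsys and infile.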
Changes in the scripts 4.15.1 (January 2023) release
The module has been updated to include the new color_color script.
Changes in the scripts 4.11.1 (December 2018) release
Updated for CIAO 4.11
The module has been updated to reflect the parameters for the tools included in CIAO 4.11.
Changes in the scripts 4.10.1 (April 2018) release
Updated for CIAO 4.10
The module has been updated to reflect the parameters for the tools included in CIAO 4.10.
Improved Python 3 support for INDEF parameters
Fixed an issue which prevented the use of INDEF as a parameter value when run on a Python 3.5 version of CIAO.
Changes in the scripts 4.9.4 (July 2017) release
Running scripts
The following tools have been changed to use the CIAOToolDirect class, which means that care should be taken if multiple copies of these tools are run in parallel: axbary, evalpos, mean_energy_map, pileup_map, tgdetect, and wavdetect. It is suggested that the new_pfiles_environment context manager is used to run these tools, as it automates the creation and deletion of a temporary parameter directory for use by the tools.
Changes in the scripts 4.9.2 (April 2017) release
Changed tool
The blanksky routine supports the new 'random' seed parameter in order to create reproducible background files.
Changes in the scripts 4.8.4 (October 2016) release
New tools
The module was updated to support the new blanksky, blanksky_image, and correct_periscope_drift scripts added in this release.
Changed tool
The specextract routine supports the new 'binarfcorr' parameter to explicitly specify detector pixel size used by arfcorr PSF correction. The 'srcflux' script has the 'psfmethod=marx' option enabled to simulate a monochromatic PSF for aperture correction if MARX is installed.
Changes in the scripts 4.8.3 (April 2016) release
New tools
The module was updated to support the new simulate_psf script added in this release.
Changes in the script 4.8.2 (January 2016) release
Changed tools
Support for the add_grating_orders and add_grating_spectra scripts has been removed; they are superseded by the combine_grating_spectra script, which combines grating orders and spectra.
Changes in the script 4.8.1 (December 2015) release
Changed tools
The fluximage, merge_obs, flux_obs, and readout_bkg routines support the 'random' seed parameter in order to create reproducible results.
Changes in the scripts 4.7.4 (September 2015) release
New tools
The module was updated to support the new readout_bkg, install_marx, detilt, dewiggle, symmetrize, and download_obsid_caldb scripts added in this release.
Changes in the scripts 4.7.2 (April 2015) release
New tools
The module was updated to support the new splitobs, gti_align, and multi_chip_gti scripts added in this release.
Changed tool
The mktgresp routine supports the new 'orders' parameter to allow the creation of responses for an arbitrary set of orders. The responses for different orders can now be created in parallel, controlled by the new 'parallel' and 'nproc' parameters.
Changes in the scripts 4.6.7 (October 2014) release
New tools
The module was updated to support the new combine_grating_spectra and tgsplit scripts added in this release.
Changes in the scripts 4.6.6 (September 2014) release
Using $ASCDS_WORK_PATH for temporary files
The new_pfiles_environment() and new_tmpdir() context managers have a new parameter tmpdir, and default to using the $ASCDS_WORK_PATH environment variable to set the location of the temporary directories they create; previously they used /tmp or the $TMPDIR environment variable. The tmpdir argument has been added to the add_tool_history() function.
Changes in the scripts 4.6.3 (March 2014) release
New tool
The module was updated to support the new ecf_calc script added in this release.
Changes in the scripts 4.6.1 (December 2013) release
The module was updated to support the new tools and scripts in the CIAO 4.6 release (e.g. srcflux, tgdetect2, and tg_choose_method), the removed tools and scripts (acis_classify_hotpix, acis_find_hotpix, acis_run_hotpix, and merge_all), and changes to the parameters of existing tools (such as mkinstmap, acis_process_events, and specextract).
Automatic stack handling
Long parameter values are automatically written out as a stack file, to avoid problems with long line lengths. Prior to this release the stack files were accessed using '@filename', which has now been changed to '@-filename' to avoid the path to the stack file being added to the entries. See 'ahelp stack' for more information.
Improved support for modelflux
The modelflux routine now no longer changes the on-disk parameter file (i.e. it is now an instance of the CIAOToolParFile rather than CIAOToolDirect class). See the "Parameter files" section above for more information.
Changes in the scripts 4.5.4 (August 2013) release
New tools
The module supports the new scripts in this release: search_csc and obsid_search_csc.
Changed tools
The specextract and find_chandra_obsid commands have been updated to reflect the changes in the parameter files of these tools.
Running tools in parallel
A new section to the ahelp file has been added: Running tools in parallel.
Changes in the scripts 4.5.2 (April 2013) release
New tools
The module supports the new scripts in this release: mktgresp, tgmask2reg, and reg2tgmask.
Changed tools
The chandra_repro command supports the new recreate_tg_mask parameter.
Renamed parameter files
The list of look-up tables available via the ximage and imagej parameter files are now accessed through their new names: ximage_lut and imagej_lut.
Changes in the scripts 4.5.1 (December 2012) release
The module was updated to support the new tools in the CIAO 4.5 release (tg_findzo), the removed tools (mkpsf), and for changes in parameters of existing tools (e.g. modelflux). Support for INDEF with numeric parameters has been improved.
Bugs
Use of parameter redirects to other parameter files
Parameter values of the form ")toolname.parname", which use the parameter value from another parameter file, are not supported. So, for example tgdetect, which has its thresh parameter set to ")celldetect.thresh", will not run unless you manually set these parameters to a valid value. For example:
>>> from ciao_contrib.runtool import *
>>> print(tgdetect.thresh)
)celldetect.thresh
>>> tgdetect.thresh = celldetect.thresh
>>> print(tgdetect.thresh)
3.0
Refer to the CIAO bug pages for an up-to-date listing of known issues.