Writing JSON BLOCK Files¶
This page contains instructions and an example of how to write JSON BLOCK files for use with the Scheduler. Fundamentally, JSON BLOCKS are collections of observing scripts and their associated configurations serialized into JSON format. They provide users with a way to develop complex observing sequences using our standard SAL Script library, without having to develop a new SAL Script.
If this is the first time you are contributing to the observatory control software, it is highly recommended that you take a look at the development guidelines.
Getting started with a BLOCK JIRA Ticket¶
All JSON BLOCKS should have an associated BLOCK Ticket in JIRA. This allows us to track progress and provides traceability for JSON BLOCKS added to night plans. Users should start by generating a new JIRA ticket. The JIRA ticket number provides a BLOCK-ID that can be used to uniquely identify data and individual executions.
The BLOCK Repository in ts config ocs¶
The first step in creating a new JSON BLOCK is to determine which Scheduler will execute the BLOCK. Currently, there are three Scheduler instances available for executing BLOCKS: the MTScheduler, the ATScheduler, and the OCS Scheduler. Each Scheduler instance maintains its own repository of JSON BLOCKS and can only execute valid JSON BLOCKS within its own repository.
To get started, clone this repository and open a ticket branch to develop your JSON BLOCK:
git clone https://github.com/lsst-ts/ts_config_ocs.git
Once the JSON BLOCK file is ready, open a new pull request and request a review from one of the Commissioning Scientists to get your JSON BLOCK merged into ts config ocs and deployed at the summit for execution.
JSON BLOCK Requirements¶
A JSON BLOCK requires three fields:
{
"name": str,
"program": str,
"scripts": list[ObservingScript]
}
name
A human-readable identifier for the BLOCK. It is not required to be unique and is often set to be the same as program.
program
Used internally by the Scheduler as the identifier, or BLOCK-ID. It must be unique (i.e. two JSON BLOCKS cannot have the same program), and it will be used to load the BLOCK into the Scheduler. A good choice for the program field is the BLOCK JIRA ticket name.
Note
There is also a program keyword specified in most scripts that corresponds to the science_program keyword written to the Butler. Unless specified as a script configuration, the program used in the metadata of the JSON BLOCK will NOT translate to the science_program. This is why each image-taking script in the full example below also includes program in its own configuration.
scripts
The list of SAL Scripts and their configurations that make up the BLOCK.
Some additional, optional fields that can be passed include:
constraints
A set of observing constraints describing the conditions under which the JSON BLOCK should be scheduled for observation. Currently ignored by the Scheduler.
id
A unique UUID that can be used for tracing the JSON BLOCK. If not provided, this field is automatically generated by the Scheduler CSC.
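To make the structure concrete, here is a minimal sketch that builds a BLOCK containing only the required fields, using the ObservingBlock and ObservingScript convenience classes from ts observing described in the notebook section below (the BLOCK-ID shown is hypothetical; use your own JIRA ticket name):
from lsst.ts.observing import ObservingBlock, ObservingScript

# Minimal BLOCK: only name, program, and scripts are required.
# "BLOCK-123" is a hypothetical BLOCK-ID.
minimal_block = ObservingBlock(
    name="BLOCK-123",
    program="BLOCK-123",
    scripts=[
        ObservingScript(
            name="auxtel/stop_tracking.py",
            standard=True,
            parameters={},
        )
    ],
)
print(minimal_block.model_dump_json(indent=2))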
A Full Example JSON BLOCK¶
Here we will build a JSON BLOCK that takes a series of images with the Auxiliary Telescope to determine the focus offset of a new optical element. This full example is based on the test described in https://rubinobs.atlassian.net/browse/SITCOM-1012 and corresponds to BLOCK-91 (https://rubinobs.atlassian.net/browse/BLOCK-91), observed in September 2023.
{
"name": "BLOCK-91",
"program": "BLOCK-91",
"constraints": [],
"scripts": [
{
"name": "auxtel/track_target.py",
"standard": true,
"parameters": {
"target_name": "gam Gru"
}
},
{
"name": "auxtel/latiss_acquire.py",
"standard": false,
"parameters": {
"program": "BLOCK-91",
"reason": "SITCOM-1012",
"acq_grating": "holo4_003",
"do_reacquire": true,
"acq_exposure_time": 0.2,
"acq_filter": "SDSSr_65mm"
}
},
{
"name": "auxtel/offset_ataos.py",
"standard": true,
"parameters": {
"z": -0.1
}
},
{
"name": "auxtel/take_image_latiss.py",
"standard": true,
"parameters": {
"program": "BLOCK-91",
"reason": "SITCOM-1012",
"nimages": 2,
"exp_times": 10,
"image_type": "FOCUS",
"filter": "cyl_lens",
"grating": "holo4_003"
}
},
{
"name": "auxtel/offset_ataos.py",
"standard": true,
"parameters": {
"z": 0.025
}
},
{
"name": "auxtel/take_image_latiss.py",
"standard": true,
"parameters": {
"program": "BLOCK-91",
"reason": "SITCOM-1012",
"nimages": 2,
"exp_times": 10,
"image_type": "FOCUS",
"filter": "cyl_lens",
"grating": "holo4_003"
}
},
{
"name": "auxtel/offset_ataos.py",
"standard": true,
"parameters": {
"z": 0.025
}
},
{
"name": "auxtel/take_image_latiss.py",
"standard": true,
"parameters": {
"program": "BLOCK-91",
"reason": "SITCOM-1012",
"nimages": 2,
"exp_times": 10,
"image_type": "FOCUS",
"filter": "cyl_lens",
"grating": "holo4_003"
}
},
{
"name": "auxtel/offset_ataos.py",
"standard": true,
"parameters": {
"z": 0.025
}
},
{
"name": "auxtel/take_image_latiss.py",
"standard": true,
"parameters": {
"program": "BLOCK-91",
"reason": "SITCOM-1012",
"nimages": 2,
"exp_times": 10,
"image_type": "FOCUS",
"filter": "cyl_lens",
"grating": "holo4_003"
}
},
{
"name": "auxtel/offset_ataos.py",
"standard": true,
"parameters": {
"z": 0.025
}
},
{
"name": "auxtel/take_image_latiss.py",
"standard": true,
"parameters": {
"program": "BLOCK-91",
"reason": "SITCOM-1012",
"nimages": 2,
"exp_times": 10,
"image_type": "FOCUS",
"filter": "cyl_lens",
"grating": "holo4_003"
}
},
{
"name": "auxtel/offset_ataos.py",
"standard": true,
"parameters": {
"z": 0.025
}
},
{
"name": "auxtel/take_image_latiss.py",
"standard": true,
"parameters": {
"program": "BLOCK-91",
"reason": "SITCOM-1012",
"nimages": 2,
"exp_times": 10,
"image_type": "FOCUS",
"filter": "cyl_lens",
"grating": "holo4_003"
}
},
{
"name": "auxtel/offset_ataos.py",
"standard": true,
"parameters": {
"z": 0.025
}
},
{
"name": "auxtel/take_image_latiss.py",
"standard": true,
"parameters": {
"program": "BLOCK-91",
"reason": "SITCOM-1012",
"nimages": 2,
"exp_times": 10,
"image_type": "FOCUS",
"filter": "cyl_lens",
"grating": "holo4_003"
}
},
{
"name": "auxtel/offset_ataos.py",
"standard": true,
"parameters": {
"z": 0.025
}
},
{
"name": "auxtel/take_image_latiss.py",
"standard": true,
"parameters": {
"program": "BLOCK-91",
"reason": "SITCOM-1012",
"nimages": 2,
"exp_times": 10,
"image_type": "FOCUS",
"filter": "cyl_lens",
"grating": "holo4_003"
}
},
{
"name": "auxtel/offset_ataos.py",
"standard": true,
"parameters": {
"z": 0.025
}
},
{
"name": "auxtel/take_image_latiss.py",
"standard": true,
"parameters": {
"program": "BLOCK-91",
"reason": "SITCOM-1012",
"nimages": 2,
"exp_times": 10,
"image_type": "FOCUS",
"filter": "cyl_lens",
"grating": "holo4_003"
}
},
{
"name": "auxtel/offset_ataos.py",
"standard": true,
"parameters": {
"z": -0.1
}
},
{
"name": "auxtel/stop_tracking.py",
"standard": true,
"parameters": {}
}
]
}
Generating this BLOCK in a Notebook¶
To generate this JSON BLOCK, we will make use of a set of convenience classes defined in the ts observing repository. The ts observing repository is also used by the Scheduler CSC when loading JSON BLOCKS, so by using it to generate our JSON file we can ensure proper formatting.
The ts observing repository can be installed in your local environment by running
conda install -c lsstts -y ts-observing
and restarting your notebook kernel.
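After the kernel restart, a quick import check confirms the package is available in your environment:
# Confirm the ts observing package can be imported and show where it was installed.
import lsst.ts.observing
print(lsst.ts.observing.__file__)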
To set up your notebook, import the ObservingBlock and ObservingScript classes from ts observing:
from lsst.ts.observing import ObservingBlock, ObservingScript
Next, we define some parameters to be used throughout the notebook and create an empty list to hold our scripts.
name = "BLOCK-91"
program = "BLOCK-91"
reason = "SITCOM-1012"
constraints = []
scripts = []
filter_to_use = 'cyl_lens'
Here we set up our first script, which slews to and tracks a target, using the ObservingScript class. We pass the script configuration to ObservingScript as a parameters dictionary and then append the script to our list.
track_target_script = ObservingScript(
name = "auxtel/track_target.py",
standard = True,
parameters = dict(
target_name="gam Gru"
)
)
scripts.append(track_target_script)
Next, add a script to acquire and re-center the target on the grating hotspot:
acquire_script = ObservingScript(
    name="auxtel/latiss_acquire.py",
    standard=False,
    parameters = dict(
        program = program,
        reason = reason,
        acq_grating = "holo4_003",
        do_reacquire = True,
        acq_exposure_time = 0.2,
        acq_filter = "SDSSr_65mm"
    )
)
scripts.append(acquire_script)
Next we set up the data-taking sequence, using a loop to successively apply ATAOS focus offsets and take images.
z_offset_start = -0.1 # mm
z_offset_step = 0.025 # mm
z_offset_end = 0.1 # mm
offset_script = ObservingScript(
name = "auxtel/offset_ataos.py",
standard = True,
parameters = dict(
z = z_offset_start
)
)
take_images_script = ObservingScript(
    name = "auxtel/take_image_latiss.py",
    standard = True,
    parameters = dict(
        program = program,
        reason = reason,
        nimages = 2,
        exp_times = 10,
        image_type = "FOCUS",
        filter = filter_to_use,
        grating = 'holo4_003',
    )
)
scripts.append(offset_script)
scripts.append(take_images_script)
z_offset_total = z_offset_start
while z_offset_total < z_offset_end:
    offset_script = ObservingScript(
        name = "auxtel/offset_ataos.py",
        standard = True,
        parameters = dict(
            z = z_offset_step
        )
    )
    scripts.append(offset_script)
    scripts.append(take_images_script)
    z_offset_total += z_offset_step
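Note that the loop above accumulates floating-point offsets, so the comparison against z_offset_end can be sensitive to rounding. An equivalent alternative sketch sidesteps this by computing the number of steps up front:
# Alternative to the while loop above: iterate over an explicit step count
# so floating-point accumulation cannot add or drop an iteration.
n_steps = round((z_offset_end - z_offset_start) / z_offset_step)  # 8 steps here
for _ in range(n_steps):
    offset_script = ObservingScript(
        name = "auxtel/offset_ataos.py",
        standard = True,
        parameters = dict(
            z = z_offset_step
        )
    )
    scripts.append(offset_script)
    scripts.append(take_images_script)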
To finish our block, we return the ATAOS to its nominal position and stop tracking the target.
offset_script = ObservingScript(
name = "auxtel/offset_ataos.py",
standard = True,
parameters = dict(
z = -z_offset_end
)
)
scripts.append(offset_script)
stop_tracking_script = ObservingScript(
name = "auxtel/stop_tracking.py",
standard = True,
parameters = dict()
)
scripts.append(stop_tracking_script)
Now, we assemble the block using the ObservingBlock class and write it out to a JSON file.
block = ObservingBlock(
    name = name,
    program = program,
    scripts = scripts,
    constraints = constraints,
)
output_file_path = name+'.json'
with open(output_file_path, 'w') as fp:
fp.write(block.model_dump_json(indent=2))
This will produce the file BLOCK-91.json in your local directory. The file is now ready to be added to your development branch in ts config ocs.
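As a quick sanity check before opening a pull request, you can round-trip the file back through ObservingBlock. This sketch assumes ObservingBlock is a Pydantic model (consistent with the model_dump_json call above), so model_validate_json is available:
# Re-read the file we just wrote and validate it against the ObservingBlock model.
with open(output_file_path) as fp:
    reloaded = ObservingBlock.model_validate_json(fp.read())

assert reloaded.program == program
print(f"{output_file_path} contains {len(reloaded.scripts)} scripts")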
Best Practices for JSON BLOCK Writing¶
Here we provide a few tips for effective JSON BLOCK writing.
- Evaluate the full SAL Script library before deciding which scripts to use. It may be that you can combine two or more steps by choosing a different, more appropriate SAL Script.
- Keep JSON BLOCKS short. Consider both the total execution time and the total number of scripts. In general, a single BLOCK should not take more than 1 hour to execute, and you should avoid BLOCKS with more than 100 individual scripts; at the time of writing, there is no automatic method to recover and execute a BLOCK from an individual step in the case of a failure.
- For use cases where execution time is paramount, consider developing a new SAL Script. JSON BLOCKS execute SAL Scripts in sequence, so there is some added system latency compared to a single SAL Script executing the same commands.
- Recall that JSON BLOCKS are not configurable, so if your test requires changing multiple parameters, consider using individual SAL Scripts or a different method.
Testing and Validating JSON BLOCKS¶
When the Scheduler is enabled, the BLOCK library is validated: each file is checked for proper JSON format, and each individual script configuration is checked against its schema. If there is an error, the JSON BLOCK will not be available to add to the Scheduler for observation. Therefore, all JSON BLOCKS must be validated using one of the test stands described in the Operational Environments before deployment at the summit. Separate documentation will be developed describing this process.
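Ahead of that formal validation, a simple local check can catch malformed JSON and obvious structural problems early. The sketch below assumes a local checkout of ts config ocs and a hypothetical directory containing your BLOCK files; it validates each file against the ObservingBlock model but does not check individual script configurations against their schemas:
from pathlib import Path

from lsst.ts.observing import ObservingBlock

# Hypothetical location of your BLOCK files inside a local ts_config_ocs checkout.
block_dir = Path("ts_config_ocs/Scheduler/observing_blocks")

for block_file in sorted(block_dir.glob("*.json")):
    try:
        # Raises if the file is not valid JSON or does not match the ObservingBlock model.
        ObservingBlock.model_validate_json(block_file.read_text())
    except Exception as exc:
        print(f"{block_file.name}: FAILED -- {exc}")
    else:
        print(f"{block_file.name}: OK")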