DAQ conf Split #386

Merged (10 commits) on Sep 21, 2023
docs/ConfigurationsForCasualUsers.md (5 additions & 5 deletions)

…First, a reminder to set up your working software environment and download the f…

Next we generate some sample system configurations and use _[nanorc](https://dune-daq-sw.readthedocs.io/en/latest/packages/nanorc/)_ to run a demo system with them.

The tools to generate these configurations consist of a single Python script that generates DAQ system configurations with different characteristics based on the configuration file given to the script. This script is `daqconf/scripts/fddaqconf_gen`. It uses `daqconf/schema/daqconf/confgen.jsonnet` to define the format for configuration JSON files.
The configuration generation files under the `daqconf/python/daqconf/apps` directory were developed to work with the _nanorc_ package, which itself can be seen as a basic Finite State Machine that sends commands and drives the DAQ system.

Here is an example command line which uses the provided JSON file that has all of the default values populated (so it is equivalent to running without any options at all!). Note for the reader that it scrolls horizontally. The command below assumes you also have a hardware map file available, e.g. [this basic example](https://raw.githubusercontent.com/DUNE-DAQ/daq-systemtest/develop/config/default_system/default_system_HardwareMap.txt). Further details on hardware map files can be found at the bottom of this page.
```
fddaqconf_gen --hardware-map-file <your hardware map file> --config daqconf/config/daqconf_full_config.json daq_fake00
```
The created configurations will be called `daq_fake<NN>` and there will be a `daq_fake<NN>` directory created containing the produced configuration to be used with _nanorc_.
The configurations can be run interactively with `nanorc daq_fake<NN> <partition_name>` from the `<work_dir>`.

In the following sections, we will use "dot" notation to indicate JSON paths, so that `readout.data_file $PWD/frames.bin` is equivalent to `"readout": { "data_file": "$PWD/frames.bin" }`.
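This dot-notation shorthand can be expanded mechanically into nested JSON. The following small Python sketch (purely illustrative, not part of daqconf) shows the mapping:

```python
import json

def dot_to_json(path, value):
    """Expand a dotted path such as 'readout.data_file' into nested dicts."""
    result = current = {}
    keys = path.split(".")
    for key in keys[:-1]:
        # Descend one level per dotted component, creating dicts as needed
        current = current.setdefault(key, {})
    current[keys[-1]] = value
    return result

# 'readout.data_file $PWD/frames.bin' expands to:
print(json.dumps(dot_to_json("readout.data_file", "$PWD/frames.bin")))
# {"readout": {"data_file": "$PWD/frames.bin"}}
```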

1) To get the full set of configuration options that can be overridden from the command line, along with their `help` text, run:
`fddaqconf_gen -h`
Command-line options override any options set in the configuration file, which in turn override any default values.

2) The data `Input` and `Output` system configuration options allow the user to change the input data file location and the output data directory path as needed. To specify an input `frames.bin` file from the current directory, a user would use `readout.data_file $PWD/frames.bin`. This file contains data frames that are replayed by fake cards in the current system, and as mentioned above, this file can be downloaded with "`curl -o frames.bin -O https://cernbox.cern.ch/index.php/s/7qNnuxD8igDOVJT/download`". The output data path option `dataflow.apps[n].output_dirs` can be used to specify the directory where output data files are written. If more than one path is provided in the array, the DataWriter will use those directories in rotation for writing TriggerRecords.
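Put together, a configuration file exercising both options might look like the following (a sketch: the option names are those quoted above, and the paths are placeholders):

```JSON
{
  "readout": {
    "data_file": "$PWD/frames.bin"
  },
  "dataflow": {
    "apps": [
      {
        "output_dirs": [ "/data0", "/data1" ]
      }
    ]
  }
}
```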
slowdown_conf.json (Applying the data slowdown factor from point 4):
```
{
  …
  }
}
```
`fddaqconf_gen --config slowdown_conf.json daq_fake01`
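The `slowdown_conf.json` example above is truncated in the diff view; a plausible complete version, assuming the slowdown option is named `readout.data_rate_slowdown_factor` as in earlier daqconf releases, would be:

```JSON
{
  "readout": {
    "data_rate_slowdown_factor": 10
  }
}
```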

multi_df.json (Running two dataflow apps):
```JSON
{
  …
}
}
```
`fddaqconf_gen --config multi_df.json daq_fake02`
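The `multi_df.json` example above is likewise truncated; a plausible complete version (a sketch: the per-app entries and the `app_name` field are assumptions based on the `dataflow.apps[n]` notation used earlier on this page) would be:

```JSON
{
  "dataflow": {
    "apps": [
      { "app_name": "dataflow0", "output_dirs": [ "." ] },
      { "app_name": "dataflow1", "output_dirs": [ "." ] }
    ]
  }
}
```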

9) One of the key options is the `--detector-readout-map-file`, or `readout.detector_readout_map_file`, which points the configuration generators to a JSON file describing the readout links in the current configuration. It can be generated using [this utility from daqconf](https://github.com/DUNE-DAQ/daqconf/blob/develop/test/scripts/daqconf_dromap_gen) and edited using [this utility](https://github.com/DUNE-DAQ/daqconf/blob/develop/scripts/dromap_editor).
docs/InstructionsForCasualUsers.md (1 addition & 1 deletion)

…As of Oct-4-2022, here are the steps that should be used when you first create y…
"`curl -o frames.bin -O https://cernbox.cern.ch/index.php/s/0XzhExSIMQJUsp0/download`"
or clicking on the [CERNBox link](https://cernbox.cern.ch/index.php/s/0XzhExSIMQJUsp0/download)) and put it into `<work_dir>`
10. `git clone https://github.com/DUNE-DAQ/daq-systemtest`
11. `fddaqconf_gen --hardware-map-file daq-systemtest/config/default_system_HardwareMap.txt daq_fake`
12. `nanorc daq_fake ${USER}-test boot conf start_run 101 wait 60 stop_run shutdown`
13. Examine the contents of the HDF5 file with commands like the following:
* `h5dump-shared -H -A swtest_run000101_0000_*.hdf5`
docs/MigratingTov3_2_0Confgen.md (6 additions & 6 deletions)
# Migrating to v3.2.0 _daqconf_ Configuration Generation

## Introduction
v3.2.0 introduced several changes to the configuration generation scripts (e.g. _fddaqconf_gen_), most notably introducing the use of configuration files as a replacement for most of the command-line options. This document will serve as a guide for updating a pre-v3.2.0 confgen command line to a v3.2.0 configuration file.

## Notable Changes in v3.2.0
1. Most of the command-line options have been removed from _fddaqconf_gen_
2. Readout apps are configured through the HardwareMapService from _detchannelmaps_, using a "Hardware Map File"

### Creating a configuration file
A default configuration can be generated via `echo '{}' > daqconf.json; fddaqconf_gen -c daqconf.json`. This is the basic starting point; any non-default options can then be specified, using `fddaqconf_gen -h` as the guide for constructing the JSON file (or use the [schema](https://github.com/DUNE-DAQ/daqconf/blob/develop/schema/daqconf/confgen.jsonnet)). For this release, option names have been kept the same as much as possible, so it should be a fairly direct translation.

Example:

v3.1.0 Command line
```
fddaqconf_gen -e -n 10 -a 62144 -b 200000 -f --tpc-region-name-prefix=gio --host-ru np04-srv-030 --host-df np04-srv-004 -o /data0/test --frontend-type wib2 --clock-speed-hz 62500000 --ers-impl cern --opmon-impl cern felix_wib2_10links
```

Let's first reformat that command line so we can see what options we're dealing with:
…For our configuration example, we see that we have one readout app on np04-srv-0…

The first column is the Source ID for a given link. Multiple links may have the same Source ID, depending on the detector configuration. The next four fields specify the physical location of the link in hardware coordinates and the [DetID](https://github.com/DUNE-DAQ/detdataformats/blob/develop/include/detdataformats/DetID.hpp). Next comes the FELIX card location, as host/card #, and the final two numbers give the link location within the FELIX (Super Logic Region and Link #). For physical links (e.g. FELIX), there are up to 2 SLRs, each supporting up to 5 links. For the HD_TPC detector, "DetCrate" corresponds to "APA".
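As an illustration of the column layout described above, a single-link hardware map file might look like this (a sketch: the header names and values are assumptions based on the column descriptions, not a verbatim copy of the daq-systemtest example):

```
# DRO_SourceID DetLink DetSlot DetCrate DetID DRO_Host  DRO_Card DRO_SLR DRO_Link
100            0       0       0        3     localhost 0        0       0
```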

_fddaqconf_gen_ will create a readout app for each unique host/card pair in the given hardware map file. The hardware map can be passed using the `--hardware-map-file` command-line option or via `readout.hardware_map_file`. Note that detector type HD_TPC can be used for both ProtoWIB and DUNEWIB configurations; the two are distinguished by `readout.clock_speed_hz`.

### Bonus Example (Coldbox Config)
Command line version:
```
fddaqconf_gen --host-ru np04-srv-028 --host-df np04-srv-001 --host-dfo np04-srv-001 --host-hsi np04-srv-001 --host-trigger np04-srv-001 --op-env np04_coldbox -o /data1 --opmon-impl cern --ers-impl cern -n 10 -b 260000 -a 2144 --clock-speed-hz 62500000 -f --region-id 0 --frontend-type wib2 --thread-pinning-file /nfs/sw/dunedaq/dunedaq-v3.1.0/configurations/thread_pinning_files/cpupin-np04-srv-028.json --hsi-trigger-type-passthrough --enable-dqm --host-dqm np04-srv-001 --dqm-cmap HDCB --dqm-impl cern np04_coldbox_daq_4ms
```

Options
docs/README.md (1 addition & 1 deletion)
# daqconf

This repository contains tools for generating common DAQ system configurations, most notably the [`fddaqconf_gen` script](https://github.com/DUNE-DAQ/fddaqconf/blob/develop/scripts/fddaqconf_gen). It generates DAQ system configurations with different characteristics based on the configuration file and command-line parameters given to it.

The focus of this documentation is on providing instructions for using the tools and running sample DAQ systems. If you're starting out, take a look at:

docs/UpdatedIOManagerAPI.md (1 addition & 1 deletion)

…The `connect_modules`, `add_endpoint` methods and the `Queue` constructor from `…

When updating, it is useful to generate configurations and check the "queues" and "connections" lists in the generated *_init.json files for correct names and data types. Running a configuration with incorrect data types will produce error messages from IOManager which should include enough information to track down the connection causing the error. (Code issues such as undeclared data types can also cause IOManager issues, though that commonly results in one specific message: "Connection named "uid" of type Unknown not found".)
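For reference when checking generated files, an entry in an *_init.json "connections" list has roughly this shape (a sketch: the field names are inferred from the "uid"/data-type error message quoted above, not taken from the authoritative IOManager schema):

```JSON
{
  "connections": [
    {
      "id": { "uid": "trigger_decisions", "data_type": "TriggerDecision" },
      "uri": "tcp://dataflow-host:12345"
    }
  ]
}
```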

## Updating Configuration Generation Scripts (e.g. listrev_gen, fddaqconf_gen or nddaqconf_gen)

The main change for configuration generation scripts is the addition of the "boot.use_connectivity_service" parameter to the main daqconf schema, which should be passed to make_app_command_data to enable/disable features of the configuration generation logic related to the ConnectivityService.
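In the configuration file itself, this parameter would be set with a fragment like the following (a minimal sketch, assuming it lives under the standard `boot` block named in the text):

```JSON
{
  "boot": {
    "use_connectivity_service": true
  }
}
```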
