Description
One of the critical points in performing phenomenological analyses of new physics at colliders is that, very often, large parameter scans are necessary: intensive and often redundant MC simulations have to be performed to cover the relevant regions of signal parameter space and to achieve sufficient accuracy in the determination of signal features.
At the same time, disk space and computing time are often limited, and the environmental impact of performing such computations is almost never taken into account.
There is a growing need to devise strategies to optimise data production and share resources in the HEP community, in both theory and experiment.
I will describe a framework which enables such an approach, in which simulated signal samples are deconstructed into complete sets of basic elements to be combined a posteriori to perform different analyses.
The framework is modular, collaborative, flexible and resource-friendly.
I will describe it through a concrete example: the analysis of BSM contributions in Higgs pair production at the LHC, and indicate possible short- and long-term developments and applications.
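The abstract does not spell out the combination procedure, but a minimal sketch of the general idea is given below for the Higgs pair production example, assuming the standard LO decomposition of gg → HH into box, triangle, and interference components whose coupling dependence factorises (σ(κt, κλ) = κt⁴ σ_box + κt² κλ² σ_tri + κt³ κλ σ_int). All function and variable names, as well as the bin contents, are illustrative and not taken from the speaker's framework.

```python
# Illustrative sketch: pre-simulated, coupling-independent basis components
# are combined a posteriori for any point of a (kt, kl) parameter scan,
# avoiding a dedicated MC production per point.
import numpy as np

def combine_hh(box, triangle, interference, kt=1.0, kl=1.0):
    """Combine basis histograms (arrays of bin contents) into the
    distribution predicted for top-Yukawa modifier kt and trilinear
    self-coupling modifier kl, using the LO gg -> HH decomposition."""
    return (kt**4 * box
            + kt**2 * kl**2 * triangle
            + kt**3 * kl * interference)

# Placeholder basis histograms (each simulated once, then reused):
box = np.array([10.0, 8.0, 5.0])            # box-squared component
triangle = np.array([2.0, 1.5, 1.0])        # triangle-squared component
interference = np.array([-6.0, -4.0, -2.0]) # box-triangle interference

for kl in (-1.0, 1.0, 2.4, 5.0):
    print(f"kl = {kl}:", combine_hh(box, triangle, interference, kl=kl))
```

In this picture, the resource saving comes from simulating the small set of basis elements once and sharing them, while each new coupling hypothesis or analysis only requires the cheap linear combination step.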