The type II seesaw model remains a popular and viable explanation of neutrino masses and mixing angles. By hypothesizing the existence of a scalar that is a triplet under the weak gauge interaction, the model predicts strong correlations among neutrino oscillation parameters, signals at lepton flavour experiments, and collider observables at high energies. We investigate reports that the type...
The starting point to our discussion is the "$B\to K\pi$ puzzle".
We show that, although the "puzzle" can be resolved by a more detailed analysis, there is a more fundamental question that needs to be addressed:
Is New Physics necessary to describe the experimentally observed asymmetries and branching fractions of the $B\to PP$ decays?
We perform a phenomenological analysis based on fits of...
An update on recent developments and results, current activities, and future plans for the Contur (Constraints On New Theories Using Rivet) toolkit.
I will present new features of CheckMATE, in particular the implementation of multibin fits in a number of ATLAS and CMS searches. I will show an application of the method to electroweakino scenarios and discuss notable improvements in exclusion reach due to the CMS multijet search.
We report recent developments in Analysis Description Language (ADL) and the runtime interpreter/framework CutLang targeting (re)interpretation studies, including integration of machine learning models and a Jupyter-based interface for plotting. We present validation efforts and results from various LHC analyses along with studies with CMS open data. We also highlight several core...
Searches for new physics at the Large Hadron Collider have constrained many models of physics beyond the Standard Model. Many searches also provide resources that allow them to be reinterpreted in the context of other models. We describe a reinterpretation pipeline that examines previously untested models of new physics using supplementary information from ATLAS Supersymmetry (SUSY) searches...
We present a collection of tools automating the efficient computation of large sets of theory predictions for high-energy physics. Calculating predictions for different processes often requires dedicated programs. These programs, however, accept inputs and produce outputs that are usually very different from each other. The industrialization of theory predictions is achieved by a framework...
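As a rough illustration of this design (a sketch only, not the actual interface of the tools presented here; all class and key names below are hypothetical), such a framework typically hides the heterogeneous inputs and outputs of the dedicated programs behind one common wrapper interface:

from abc import ABC, abstractmethod

class Predictor(ABC):
    """Common interface hiding the heterogeneous I/O of dedicated codes."""

    @abstractmethod
    def predict(self, runcard: dict) -> dict:
        """Map a uniform runcard to a uniform {observable: value} result."""

class DrellYanCode(Predictor):      # hypothetical wrapper around one program
    def predict(self, runcard: dict) -> dict:
        # translate the runcard to this program's native input, run it,
        # and parse its native output back into a dict (placeholder value here)
        return {"sigma_DY_pb": 1234.0}

class TopPairCode(Predictor):       # hypothetical wrapper around another program
    def predict(self, runcard: dict) -> dict:
        return {"sigma_ttbar_pb": 830.0}

def compute_all(predictors: list[Predictor], runcard: dict) -> dict:
    """'Industrialized' loop: one runcard, many codes, one merged result."""
    results: dict = {}
    for p in predictors:
        results.update(p.predict(runcard))
    return results

print(compute_all([DrellYanCode(), TopPairCode()], {"sqrt_s_TeV": 13.6}))

The point of such a common interface is that adding a new prediction code only requires writing one more wrapper, while the scan or fit driver stays unchanged.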
One of the critical points in performing phenomenological analyses of new physics at colliders is that, very often, large parameter scans are necessary: intensive and often redundant MC simulations have to be performed to cover the relevant regions of signal parameter space and achieve sufficient accuracy in the determination of signal features.
On the other hand, disk space and computing time are...
I discuss how current LHC measurements are employed in the determination of the Parton Distribution Functions (PDFs) of the proton. I focus on the current practice and outline a few challenges in the release and usage of the experimental data. I finally discuss prospects for improving best practices, in light of LHC Run III and beyond.
Analysing statistical models is at the heart of any empirical study for hypothesis testing. We present a new cross-platform Python-based package which employs different likelihood prescriptions through a plug-in system, enabling the statistical inference of hypotheses. This framework empowers users to propose, examine, and publish new likelihood prescriptions without the need for developing a...
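A minimal sketch of the kind of plug-in mechanism described above (hypothetical names only, not the package's actual API): each likelihood prescription registers itself under a string key and exposes the same interface to the inference code.

LIKELIHOOD_REGISTRY = {}

def register(name):
    """Decorator registering a likelihood prescription under a string key."""
    def wrap(cls):
        LIKELIHOOD_REGISTRY[name] = cls
        return cls
    return wrap

@register("gaussian")
class GaussianLikelihood:
    def __init__(self, observed, background, sigma):
        self.obs, self.bkg, self.sigma = observed, background, sigma

    def nll(self, mu, signal):
        """Negative log-likelihood for signal strength mu."""
        expected = self.bkg + mu * signal
        return 0.5 * ((self.obs - expected) / self.sigma) ** 2

# The inference code only talks to the common interface, so new prescriptions
# (Poisson, simplified likelihoods, ...) can be added without changing it.
llhd = LIKELIHOOD_REGISTRY["gaussian"](observed=12.0, background=10.0, sigma=3.0)
print(llhd.nll(mu=1.0, signal=5.0))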
An increasingly frequent challenge faced in HEP data analysis is to characterize the agreement between a prediction that depends on a dozen or more model parameters, such as predictions coming from an effective field theory (EFT) framework, and the observed data. Traditionally, such characterizations take the form of a negative log likelihood (NLL) distribution, which can only be evaluated...
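As a schematic illustration of such a parameter-dependent NLL (illustrative numbers only, assuming a Gaussian likelihood and a quadratic EFT dependence of each predicted bin), the cost comes from having to re-evaluate the prediction at every point of a high-dimensional parameter space:

import numpy as np

rng = np.random.default_rng(0)
n_bins, n_wc = 20, 12            # illustrative: 20 bins, 12 Wilson coefficients

sm   = rng.uniform(50, 100, n_bins)              # SM prediction per bin
lin  = rng.normal(0, 5, (n_bins, n_wc))          # linear EFT dependence
quad = rng.normal(0, 1, (n_bins, n_wc, n_wc))    # quadratic EFT dependence
data = sm + rng.normal(0, 5, n_bins)             # pseudo-data
cov_inv = np.linalg.inv(np.diag(np.full(n_bins, 25.0)))  # inverse covariance

def nll(c):
    """Gaussian NLL at a point c in the 12-dimensional EFT parameter space."""
    pred = sm + lin @ c + np.einsum("ijk,j,k->i", quad, c, c)
    r = pred - data
    return 0.5 * r @ cov_inv @ r

print(nll(np.zeros(n_wc)))       # SM point
print(nll(0.1 * np.ones(n_wc)))  # one of the many points a scan must evaluate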
Rare decays like $B^+ \to K^+ \nu \bar{\nu}$, searched for by the Belle II collaboration, are important in particle physics research as they offer a window onto physics beyond the Standard Model. However, the experimental challenges induced by the two final-state neutrinos require assumptions about the kinematic distribution of this decay....
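Schematically (not the Belle II procedure; the numbers below are invented), the dependence on the assumed kinematics can be made explicit by folding published per-bin efficiencies in $q^2$ with the normalized $q^2$ spectrum of whichever model is being reinterpreted:

import numpy as np

# hypothetical published per-bin signal efficiencies in q^2 (illustrative values)
eff = np.array([0.08, 0.10, 0.12, 0.11, 0.07])

# normalized q^2 spectra: the SM-like assumption vs. some alternative model
f_sm  = np.array([0.15, 0.25, 0.30, 0.20, 0.10])
f_alt = np.array([0.05, 0.10, 0.25, 0.35, 0.25])

# effective efficiency = sum_i eff_i * f_i, which changes with the assumed spectrum
print("SM-like model:", float(eff @ f_sm))
print("alternative model:", float(eff @ f_alt))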
Most Beyond Standard Model (BSM) physics theories are characterized by multiple BSM parameters. These encompass properties like new particle masses, coupling constants, decay widths, and effective field theory parameters.
When testing such theories against data, analysts might choose to consider only a subset of the relevant BSM physics parameters in order to work within the limits of computational...
Machine learning tools have enabled a new type of differential cross section measurement that is unbinned and high-dimensional (see e.g. 2109.13243). This talk will discuss the challenges and prospects of (re)using such measurements in the context of new physics.
The HEP Statistics Serialization Standard is an emerging standard for serializing statistical models in High Energy Physics as JSON (or JSON-like) files. Publishing likelihoods has been on the bucket list of the HEP community for 20 years. With this new standard, which is already implemented in ROOT and has gathered support from all major HEP statistics frameworks, the community will...
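For illustration only, a serialized model is simply a nested JSON structure that any framework can read back; the keys below are a plausible sketch, not the normative schema of the standard, written out here with Python's standard json module.

import json

# illustrative, non-normative sketch of a JSON-serialized statistical model
model = {
    "metadata": {"description": "toy counting experiment"},
    "distributions": [
        {"name": "signal_region", "type": "poisson_dist",
         "mean": "expected_events", "x": "observed_events"}
    ],
    "parameter_points": [
        {"name": "nominal",
         "parameters": [{"name": "expected_events", "value": 12.0}]}
    ],
    "data": [{"name": "obs", "observed_events": 10}],
}

with open("model.json", "w") as f:
    json.dump(model, f, indent=2)

# any framework supporting the standard can deserialize the same file
print(json.loads(open("model.json").read())["metadata"]["description"])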
Full statistical models encapsulate the complete information of an experimental result, including the likelihood function given observed data. A few years ago ATLAS started publishing statistical models that can be reused via the pyhf framework, a major step towards fully publishing LHC results. In the case of fast Simplified Model Spectra based reinterpretation we are often only...
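As an example of such reuse, a published JSON workspace can be loaded and evaluated with pyhf in a few lines (the file name below is a placeholder for a workspace obtained e.g. from HEPData):

import json
import pyhf

# load a published full statistical model (placeholder file name)
with open("workspace.json") as f:
    spec = json.load(f)

workspace = pyhf.Workspace(spec)
model = workspace.model()        # full probability model with all nuisance parameters
data = workspace.data(model)     # observed data plus auxiliary measurements

# observed CLs for signal strength mu = 1
cls_obs = pyhf.infer.hypotest(1.0, data, model, test_stat="qtilde")
print(f"CLs(mu=1) = {float(cls_obs):.3f}")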
I will discuss how SMEFT fits incorporate constraints from quark flavour physics, using two recent examples. I will also highlight the technical complexity of this endeavour, due to the large number of low-energy hadronic parameters involved. I propose an approach that "preserves" the flavourful results of such low-energy EFT fits and provides them to subsequent SMEFT fits as "off...
Global fits to particle physics data outside experimental collaborations require the combination and (re)interpretation of a vast range of LHC datasets. This is a non-trivial exercise and requires state-of-the-art Monte Carlo simulations at NLO QCD in the Standard Model Effective Field Theory (SMEFT). In this talk, I will present the latest ongoing efforts from the SMEFiT collaboration,...
The talk includes an overview of the latest EFT studies in the top sector from the CMS experiment. Various physics processes and experimental approaches are reviewed, with an additional discussion of prospects for a potential ATLAS-CMS EFT combination for top-related processes.
With detailed precision measurements of physics at the scale of electroweak symmetry breaking (EWSB), the experimental study of the Standard Model at the Large Hadron Collider is rapidly shifting towards an effective field theory (EFT) approach. An effective field theory prescription allows one to consistently account for all possible deformations of the Standard Model arising from decoupled New...
I present a tool for the simultaneous determination of Parton Distribution Functions and EFT coefficients. I discuss the importance of this simultaneous fit, as well as challenges in obtaining proper constraints. I present the most comprehensive analysis of both PDF and EFT effects in the top sector, which will be reproducible with our tool.
I will give an overview of the LHC EFT working group and summarise ongoing activities.
In this talk, I will address the distinction between two different kinds of Effective Field Theories. I will specifically discuss the differences between the Standard Model Effective Field Theory (SMEFT) and the Higgs Effective Field Theory and motivate the importance of both while interpreting data, providing an example for the scalar sector. I will then focus on the importance of global fits...