Automatise numerical FONLL #153

Merged: 25 commits, Mar 14, 2024

Commits
b6f9621
init speeding up nfonll
giacomomagni Feb 15, 2024
e7dbfb1
add combine Fk tables
giacomomagni Feb 15, 2024
29312c9
restore commands
giacomomagni Feb 15, 2024
4f3c94d
improve logging
giacomomagni Feb 15, 2024
e930699
remove creation of num fonll theories from eko command
giacomomagni Feb 15, 2024
6300c9d
init fonll docs
giacomomagni Feb 15, 2024
739f7ca
init fonll docs
giacomomagni Feb 15, 2024
c74f940
more on docs
giacomomagni Feb 15, 2024
28b8fe8
Fix FONLL-B bug
andreab1997 Feb 20, 2024
89a066a
Update docs/source/overview/running.rst
giacomomagni Feb 20, 2024
01cc3a2
Update docs/source/overview/running.rst
giacomomagni Feb 20, 2024
09a0686
Update docs/source/overview/running.rst
giacomomagni Feb 20, 2024
42d22d4
Update docs/source/overview/running.rst
giacomomagni Mar 6, 2024
78b3fb8
Update docs/source/overview/running.rst
giacomomagni Mar 6, 2024
efe91fb
Update docs/source/overview/running.rst
giacomomagni Mar 6, 2024
676cf60
Update docs/source/overview/running.rst
giacomomagni Mar 6, 2024
05ea71e
Update docs/source/overview/running.rst
giacomomagni Mar 6, 2024
1673333
Update docs/source/overview/running.rst
giacomomagni Mar 6, 2024
415f0f9
Update docs/source/overview/running.rst
giacomomagni Mar 6, 2024
f38eec4
clarify "usual" in docs
giacomomagni Mar 6, 2024
fbb4c55
Split docs
felixhekhorn Mar 7, 2024
3365f2e
reorganize fonll commands
giacomomagni Mar 8, 2024
2c755c6
fix pineko help
giacomomagni Mar 8, 2024
1b88551
fix issue #138
giacomomagni Mar 8, 2024
78c2138
fix test
giacomomagni Mar 8, 2024
5 changes: 3 additions & 2 deletions docs/Makefile
@@ -8,8 +8,9 @@ SPHINXBUILD = sphinx-build
SOURCEDIR = source
BUILDDIR = build

PINEKODIR = ../src/eko
PINEKODIR = ../src/pineko
PINEKOOUT = $(SOURCEDIR)/modules/pineko
TODOOUTFILE = ./source/code_todos.rst

# Put it first so that "make" without argument is like "make help".
help:
@@ -41,4 +42,4 @@ todos:
python generate_code_todos.py "$(PINEKODIR)" "$(TODOOUTFILE)"

clean-todos:
rm "$(TODOOUTFILE)"
rm -f "$(TODOOUTFILE)"
2 changes: 2 additions & 0 deletions docs/source/index.rst
@@ -23,7 +23,9 @@
:hidden:

theory/fktables
theory/fonll
theory/scalevar
theory/kfactors


zzz-refs
27 changes: 8 additions & 19 deletions docs/source/overview/prerequisites.rst
@@ -58,9 +58,7 @@ Default Operator Card
---------------------

You need to provide a default operator card for |EKO| for each theory you want to use.
An example is the following:

::
An example is the following::

configs:
evolution_method: truncated
@@ -99,7 +97,6 @@ An example is the following:
skip_non_singlet: false
skip_singlet: false

::

For more details about what is needed inside an operator card please refer to https://eko.readthedocs.io/en/latest/code/IO.html
under the section **Operator Runcard**. Note that the actual operator cards for each FK table will be
@@ -109,23 +106,17 @@ Grids
-----

*pineko* does **NOT** compute grids, which are instead expected input to *pineko*.
There are typically two ways to obtain grids: computing them from scratch with `runcards <https://github.com/NNPDF/pinecards/>`_
or reusing existing ones.
There are typically two ways to obtain grids:

Generate new Grids with *rr*
""""""""""""""""""""""""""""
1. computing them from scratch with
`pinefarm <https://github.com/NNPDF/pinefarm/>`_ (and `pinecards <https://github.com/NNPDF/pinecards/>`_).

You need to run *rr* with a given theory runcard and put the generated grid file with the same name
inside the *paths.grids/theory_id* folder. The name has to match the *ymldb* which is the case by default.
2. reusing the grids from a different theory by running::

Inherit Grids from Existing Theory
""""""""""""""""""""""""""""""""""
pineko theory inherit-grids SOURCE_THEORY_ID TARGET_THEORY_ID DATASET1 DATASET2 ...

You can reuse the grids from a different theory by running::

pineko theory inherit-grids SOURCE_THEORY_ID TARGET_THEORY_ID DATASET1 DATASET2 ...

The relation between the source theory and the target theory is non-trivial [4]_.
The relation between the source theory and the target theory is non-trivial
(e.g. they may differ by scale variations, different DIS settings, etc.).


Notes
Expand All @@ -134,5 +125,3 @@ Notes
.. [2] this is to be replaced by the new CommonData format implemented by NNPDF

.. [3] this is to be replaced by a binding to the NNPDF theory objects

.. [4] examples being scale variations, different evolution settings, etc.
72 changes: 22 additions & 50 deletions docs/source/overview/running.rst
@@ -1,4 +1,3 @@
################
Running `pineko`
################

@@ -27,17 +26,19 @@ This is a two step process:

pineko theory ekos THEORY_ID DATASET1 DATASET2 ...



Inherit |EKO| from Existing Theory
"""""""""""""""""""""""""""""""""""
Inherit |EKO| or grids from Existing Theory
"""""""""""""""""""""""""""""""""""""""""""

You can reuse the |EKO| from a different theory by running::

pineko theory inherit-ekos SOURCE_THEORY_ID TARGET_THEORY_ID DATASET1 DATASET2 ...

You can reuse the grids from a different theory by running::

pineko theory inherit-grids SOURCE_THEORY_ID TARGET_THEORY_ID DATASET1 DATASET2 ...

The relation between the source theory and the target theory is non-trivial [5]_.
The relation between the source theory and the target theory is non-trivial
(e.g. they may differ by scale variations, different DIS settings, etc.).
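
For instance, to reuse both the ekos and the grids of a hypothetical source theory ``400`` for a new theory ``401`` (the theory IDs and the dataset name are purely illustrative), one could run::

pineko theory inherit-ekos 400 401 HERA_NC_318GEV_EP_SIGMARED
pineko theory inherit-grids 400 401 HERA_NC_318GEV_EP_SIGMARED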

Generating the FK Table
-----------------------
@@ -58,68 +59,39 @@ Other functionalities
---------------------

Other than the fundamental functions that have been described so far, *pineko* has a few
handy utility functions.
handy utility functions:

- applying the :doc:`FONLL prescription</theory/fonll>`
- applying the :doc:`scale variation prescription</theory/scalevar>`
- burning the :doc:`K-factor</theory/kfactors>` into grids


Checking the grids
""""""""""""""""""

Under the subcommand ``pineko check`` you can find two useful checks:

1. **compatibility**. This is used to check if a *grid* and an *eko* are compatible and ready to generate an Fk table. In order for a grid and an eko to be compatible, they must have the same x and Q2 grid (eventually including the factorization scale variations). The check is used as
::
1. **compatibility**. This is used to check if a *grid* and an *eko* are compatible and ready to generate an |FK| table.
In order for a grid and an eko to be compatible, they must have the same :math:`x` and :math:`Q^2` grid (possibly including the
factorization scale variations). The check is used as

pineko check compatibility GRID EKO
pineko check compatibility GRID EKO

eventually specifying the value of the factorization scale variation with the option ``--xif``.
optionally specifying the value of the factorization scale variation with the option ``--xif``.
2. **scvar**. This is used to check if the provided grid contains the requested scale variations. The syntax is the following
::

pineko check scvar GRID SCALE AS_ORDER AL_ORDER
pineko check scvar GRID SCALE AS_ORDER AL_ORDER

where ``SCALE`` can be one between "ren" and "fact" (respectively for *renormalization* and
*factorization* scale variations).
where ``SCALE`` can be either "ren" or "fact" (for *renormalization* and
*factorization* scale variations, respectively). A concrete sketch of both checks is given below.
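
As a concrete sketch of both checks, with purely illustrative paths, scale-variation factor, and perturbative orders::

pineko check compatibility data/grids/400/DATASET.pineappl.lz4 data/ekos/400/DATASET.tar --xif 2.0
pineko check scvar data/grids/400/DATASET.pineappl.lz4 fact 2 0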

Comparing grids and FK tables
"""""""""""""""""""""""""""""

With the command ``pineko compare`` it is possible to compare the predictions as provided by the grid
(convoluted with a |PDF|) with the predictions as provided by the FK table. This is done like::
(convolved with a |PDF|) with the predictions as provided by the |FK| table. This is done as

pineko compare GRID FKTABLE MAX_AS MAX_AL PDF

again optionally specifying the values of the *renormalization* and *factorization* scales with the
appropriate options.
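
A hypothetical invocation (paths, orders and the |PDF| set name are purely illustrative) could look like::

pineko compare data/grids/400/DATASET.pineappl.lz4 data/fktables/400/DATASET.pineappl.lz4 3 0 NNPDF40_nnlo_as_01180

where ``3`` and ``0`` play the role of ``MAX_AS`` and ``MAX_AL``.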

Scale variations
""""""""""""""""

Since it is possible to compute scale variations terms at a certain perturbative order N+1 just from
the knowledge of the central N order (see https://pineko.readthedocs.io/en/latest/theory/scalevar.html),
`pineko` includes a tool to add the required scale variations order to a grid which contain the
necessary central orders. The command to run it is::

pineko ren_sv_grid GRID_PATH OUTPUT_FOLDER_PATH MAX_AS NF ORDER_EXISTS

where ``GRID_PATH`` is the path of the original grid, ``OUTPUT_FOLDER_PATH`` is the folder where the
updated grid will be dumped, ``MAX_AS`` is the requested perturbative order of the QCD coupling and
``NF`` is the number of active flavors one wants to consider when computing the scale variations terms.
If the original grid has already all the scale variations terms for the requested perturbative order,
`pineko` will do nothing. If one want to force `pineko` to overwrite the already existing orders, it is
enough to set ``ORDER_EXISTS`` to `True`.

KFactors
""""""""

Another useful tool that `pineko` includes is ``pineko kfactor`` which allows the embedding of a kfactor
as a proper order in a grid. The usage is the following::

pineko kfactor GRIDS_FOLDER KFACTOR_FOLDER YAMLDB_PATH TARGET_FOLDER MAX_AS ORDER_EXISTS

where ``GRIDS_FOLDER`` is the folder containing the grids to update, ``KFACTOR_FOLDER`` is the folder
containing the kfactor files and ``YAMLDB_PATH`` is the path to the yamldb file of the requested dataset.
The other inputs have already been described in the previous section.

Notes
-----

.. [5] examples being scale variations, different DIS settings, etc.
7 changes: 7 additions & 0 deletions docs/source/shared/abbreviations.rst
@@ -11,9 +11,16 @@
.. |FFNS| replace::
:abbr:`FFNS (Fixed Flavor Number Scheme)`

.. |FFN0| replace::
:abbr:`FFN0 (Asymptotic Fixed Flavor Number Scheme)`

.. |VFNS| replace::
:abbr:`VFNS (Variable Flavor Number Scheme)`

.. |FONLL| replace::
:abbr:`FONLL (Fixed Order Next-to-Leading Log)`



.. perturbative orders
.. |LO| replace::
4 changes: 2 additions & 2 deletions docs/source/theory/fktables.rst
@@ -1,6 +1,6 @@
============================================================

Fast Kernel (FK) tables
============================================================
=======================

The direct calculation of observables during a |PDF| fit is not very practical
since it requires first solving the |DGLAP| evolution equation for each new boundary
74 changes: 74 additions & 0 deletions docs/source/theory/fonll.rst
@@ -0,0 +1,74 @@
FONLL
=====

In order to generate |FK| tables with |FONLL|, the different flavor schemes
need to be evolved separately and joined together into a single |FK| table only
in the final step.

Two workflows are possible: one in which all the steps are performed individually,
and one that is more automated.

The automated workflow always assumes that |FONLL| is performed for both
charm and bottom effects.

Manual procedure
----------------

1. Generate 7 theories for all the different flavor patches with the command::

pineko fonll_tcards THEORY_ID

The different flavor patches are named following the convention:

* ``<THEORY_ID>00`` : |FFNS| :math:`n_f=3`
* ``<THEORY_ID>01`` : |FFN0| :math:`n_f=3`
* ``<THEORY_ID>02`` : massless component of |FFNS| :math:`n_f=4`
* ``<THEORY_ID>03`` : massive component of |FFNS| :math:`n_f=4`
* ``<THEORY_ID>04`` : |FFN0| :math:`n_f=4`
* ``<THEORY_ID>05`` : massless component of |FFNS| :math:`n_f=5`
* ``<THEORY_ID>06`` : massive component of |FFNS| :math:`n_f=5`

where for |FFNS| :math:`n_f=4,5` massive and massless parts are split to
allow for a damping option.

2. Generate the grids corresponding to all 7 theories with the external program.

3. Generate the operator cards for each theory with the normal command listed above.
Note that, in principle, only 3 ekos are needed, as there are only 3 different :math:`n_f` patches,
so you might speed up the procedure by taking advantage of ``inherit-ekos``.

4. Generate the ekos for each theory with the normal command listed above.

5. Generate the |FK| tables for each of the 7 theories with the normal command listed above.

6. Combine the various |FK| tables into a single file, using the command::

pineko combine_fonll THEORY_ID DATASET --FFNS3 THEORY_ID00 --FFN03 THEORY_ID01 --FFNS4til THEORY_ID02 --FFNS4bar THEORY_ID03 --FFN04 THEORY_ID04 --FFNS5til THEORY_ID05 --FFNS5bar THEORY_ID06

where the first 3 theories are needed to perform |FONLL| on charm effects,
while the last 4 are needed to also include bottom effects.
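
As a worked example, for a hypothetical base theory ``400`` (the theory ID and the dataset name are purely illustrative) the seven intermediate theories would be ``40000`` to ``40006`` and the combination step would read::

pineko combine_fonll 400 DATASET --FFNS3 40000 --FFN03 40001 --FFNS4til 40002 --FFNS4bar 40003 --FFN04 40004 --FFNS5til 40005 --FFNS5bar 40006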

Automatic procedure
-------------------

This workflow can be faster, but it might be less flexible:

1. Generate 7 theories for all the different flavor patches with the command::

pineko fonll_tcards THEORY_ID

See above for the naming convention of the intermediate theories.

2. Generate the grids corresponding to all 7 theories with the external program.

3. Generate the three ekos, one for each :math:`n_f`, and inherit the others by running::

pineko fonll_ekos THEORY_ID DATASET1 DATASET2 ...

Note: this is usually an expensive operation, as multiple ekos are run sequentially.
Depending on the resources you have available, it might be more convenient
to call the command separately for each DATASET.

4. Generate the final |FONLL| |FK| table directly by running::

pineko fonll_fks THEORY_ID DATASET1 DATASET2 ...
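
Putting the four steps together for the same hypothetical theory ``400`` and a single illustrative dataset, the whole automatic chain would be::

pineko fonll_tcards 400
# generate the grids for theories 40000-40006 with the external program
pineko fonll_ekos 400 DATASET
pineko fonll_fks 400 DATASET
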
11 changes: 11 additions & 0 deletions docs/source/theory/kfactors.rst
@@ -0,0 +1,11 @@
K-Factors
=========

Another useful tool that `pineko` includes is ``pineko kfactor``, which allows embedding a k-factor
as a proper order in a grid. The usage is the following::

pineko kfactor GRIDS_FOLDER KFACTOR_FOLDER YAMLDB_PATH TARGET_FOLDER MAX_AS ORDER_EXISTS

where ``GRIDS_FOLDER`` is the folder containing the grids to update, ``KFACTOR_FOLDER`` is the folder
containing the k-factor files, and ``YAMLDB_PATH`` is the path to the yamldb file of the requested dataset.
The other inputs have already been described in the :doc:`scale variations section</theory/scalevar>`.
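
A hypothetical invocation (all paths and values are purely illustrative) might look like::

pineko kfactor data/grids/400 data/kfactors data/yamldb/DATASET.yaml data/grids/400_kfactor 3 False

where ``3`` is ``MAX_AS`` and the final ``False`` is the ``ORDER_EXISTS`` flag.
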
19 changes: 18 additions & 1 deletion docs/source/theory/scalevar.rst
@@ -117,7 +117,7 @@ where this time the perturbative expansion of :math:`\overline{H}(\alpha_{s}(t+\kappa))` starts at a higher order
than :math:`\mathcal{O}(\alpha_{s}^{0})`.

Scale variation for |PDF| evolution
###########################################
###################################

A completely independent source of |MHOU| arises from the truncation of the perturbative expansion of the anomalous dimensions governing the evolution
of the |PDF|. Again, these uncertainties can be estimated through scale variation but, in this case, there are three equivalent ways in which it can be
@@ -222,3 +222,20 @@ Note that, even if these schemes are formally equivalent, they can differ by subleading terms.
In fact, in **scheme A** some higher order terms of the anomalous dimensions expansion can be retained according to the kind of solution adopted for the evolution equation.
In **scheme B** the exponential has been expanded so that it corresponds to a linearized solution of the evolution equations and in **scheme C** some terms coming from the
cross-expansion of the coefficient functions and the linearized solution of the evolution equations have been dropped.


Adding scale variations to a grid
#################################

Since it is possible to compute the scale variation terms at a given perturbative order N+1 just from
the knowledge of the central order N, `pineko` includes a tool to add the required scale variation orders to a grid which contains the
necessary central orders. The command to run it is::

pineko ren_sv_grid GRID_PATH OUTPUT_FOLDER_PATH MAX_AS NF ORDER_EXISTS

where ``GRID_PATH`` is the path of the original grid, ``OUTPUT_FOLDER_PATH`` is the folder where the
updated grid will be dumped, ``MAX_AS`` is the requested perturbative order of the QCD coupling and
``NF`` is the number of active flavors one wants to consider when computing the scale variation terms.
If the original grid already contains all the scale variation terms for the requested perturbative order,
`pineko` will do nothing. If one wants to force `pineko` to overwrite the already existing orders, it is
enough to set ``ORDER_EXISTS`` to `True`.
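
A minimal sketch of an invocation (paths and values are purely illustrative) could be::

pineko ren_sv_grid data/grids/400/DATASET.pineappl.lz4 data/grids/400_ren_sv 3 5 False

which would add the renormalization scale-variation terms up to ``MAX_AS = 3`` with :math:`n_f = 5` active flavors, without overwriting any orders already present in the grid since ``ORDER_EXISTS`` is `False`.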