                        LHCb Z 13 TeV dimuon 2022
This PR implements the new LHCb measurements of Z production at 13 TeV in the new commondata format. Please note the following:
- The keys in the metadata relative to the FK tables will be modified once the FK tables are produced.
- I'm not sure about the keys in "nnpdf_metadata", so please check that they are correct.
- Recall that the double distribution in the boson transverse momentum and rapidity suffers from a correlation matrix that is not positive-definite. Hence, you won't find a yaml file for that particular distribution. (A minimal positive-definiteness check is sketched below.)
 
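For reference, here is a minimal, self-contained way to check whether a correlation matrix is positive-definite; the matrix values are purely illustrative and not the LHCb data.

```python
# Illustrative check of positive-definiteness for a correlation matrix.
# The numbers below are made up; they are not the LHCb covariance data.
import numpy as np

corr = np.array(
    [
        [1.00, 0.85, 0.40],
        [0.85, 1.00, 0.30],
        [0.40, 0.30, 1.00],
    ]
)

# A symmetric matrix is positive-definite iff all eigenvalues are > 0.
print("smallest eigenvalue:", np.linalg.eigvalsh(corr).min())

# Equivalently, the Cholesky decomposition only succeeds for a
# positive-definite matrix.
try:
    np.linalg.cholesky(corr)
    print("matrix is positive-definite")
except np.linalg.LinAlgError:
    print("matrix is NOT positive-definite")
```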
Hi @achiefa, could you cherry-pick or rebase the last few commits? (So that the branch contains only your changes.)
Hi @scarlehoff, done. The branch history should be clean now. I apologise for that: last night I didn't pay much attention to the rebase procedure and messed it up.
Should this be on top of new_commondata_utils or master? Currently it contains commits that are in master but not in new_commondata_utils.
My understanding from the last pheno meeting was to rebase on top of master, but I may be wrong. Should I rebase on top of new_commondata_utils?
I wasn't there so I don't know what was discussed, but rebasing on master indeed seems reasonable to me.
Is there a report (ideally comparefits) that includes this dataset?
No, there isn't. How can I produce such a report?
At the moment you won't be able to produce a report using this dataset since the theory is missing.
Ah, I assumed that came with the PR. I should have read the first message more carefully.
Could you please move this from buildmaster to validphys/datafiles/new_commondata?
Hi @achiefa, could you check the error:
FAILED tests/test_loader.py::test_kitable_with_cuts - validphys.loader.DataNotFoundError: No .dat file found for LHCB_Z0J_13TEV_2022_DIMUON and no new data translation found
Maybe there's some name problem? Or some mismatch for the ZJ dataset?
Also, could you fix the conflict? I think you can just use the NC PT for your dataset as well as the process type.
I'm quite confused by process_type.py in the master branch. It seems there are two instantiations of the class _Process that look the same (DY_NC_PT in particular), at least to me.
Also, DY_ZJ_PT (the one that I implemented) and DY_NC_PT (the one that Mark implemented) use different maps. Which one is better?
If they are the same, just make sure that all datasets are using the same one and remove the redundant one.
Since you are editing this dataset maybe it makes sense to use DY_NC, but it doesn't really matter.
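For context, a leading-order DY (x, Q2) map for a rapidity/mass distribution boils down to the sketch below. The function and variable names are illustrative, not the actual validphys implementation, and the pT-differential map would additionally fold the boson transverse momentum into the scale.

```python
# Minimal sketch of a leading-order Drell-Yan (x, Q^2) map for a
# distribution binned in dilepton rapidity y and invariant mass m_ll.
# Names are illustrative; this is not the nnpdf implementation.
import numpy as np

def dy_xq2_map(y, m_ll, sqrts):
    """Return (x1, x2, Q2) for a DY point using LO kinematics."""
    ratio = m_ll / sqrts
    x1 = ratio * np.exp(y)   # momentum fraction of the parton going forward
    x2 = ratio * np.exp(-y)  # momentum fraction of the parton going backward
    q2 = m_ll**2             # scale taken as the dilepton invariant mass squared
    return x1, x2, q2

# Example: a Z-peak point at forward (LHCb-like) rapidity, sqrt(s) = 13 TeV.
print(dy_xq2_map(y=3.0, m_ll=91.2, sqrts=13000.0))
```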
Hi @scarlehoff, I've resolved the conflict and now all checks have passed. I just want to mention that I removed the jet data from dataset_names.yaml since we're not going to use them.
Is there anything else that needs to be fixed or modified?
> Hi @scarlehoff, I've resolved the conflict and now all checks have passed. I just want to mention that I removed the jet data from dataset_names.yaml since we're not going to use them.
You should actually not add any new dataset to dataset_names.yaml. That's only a mapping for the legacy sets.
Ok, I've just removed it.
I think the PR is now complete. A few comments before merging:
- I resolved the conflict with the master branch in process_options. Specifically:
  - There is now a single implementation for each of _dyboson_xq2map and _dybosonpt_xq2map. I kept my implementation.
  - process_options now contains two DY-process classes: DY_PT and DY_2L. The specification of a particular process type is done using dataclasses.replace in the PROCESS dictionary. For instance, DY_W_ETA is obtained from DY_2L, and the dataset implemented by @comane is safe. (A minimal sketch of this pattern is given after this list.)
- I made the report on the kinematic coverage for both LHCb and CMS (the latter was implemented by Mark). You can find the report here: report
- I've attached the data-theory comparison at NNLO for the Z0 data set.
 
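For readers unfamiliar with the pattern above, here is a minimal sketch of specialising a frozen process dataclass with dataclasses.replace. The field names and the dictionary comprehension are assumptions for illustration, not the actual process_options definitions.

```python
# Illustrative sketch of deriving specific process types from a generic one
# via dataclasses.replace; field names are made up for this example.
import dataclasses

@dataclasses.dataclass(frozen=True)
class _Process:
    name: str
    description: str
    accepted_variables: tuple

DY_2L = _Process(
    name="DY_2L",
    description="Drell-Yan two-lepton final state",
    accepted_variables=("y", "m_ll", "sqrts"),
)

# A more specific process type overrides only the fields that differ,
# so the shared behaviour is defined in a single place.
DY_W_ETA = dataclasses.replace(
    DY_2L,
    name="DY_W_ETA",
    description="Drell-Yan W production, differential in lepton eta",
)

PROCESSES = {p.name: p for p in (DY_2L, DY_W_ETA)}
print(PROCESSES["DY_W_ETA"])
```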
Thanks! I think this can be merged now.
> I made the report on the kinematic coverage for both LHCb and CMS (the latter was implemented by Mark). You can find the report here: report
This looks good (in that it looks like what I would expect: CMS is central while LHCB accesses the extremes).