
Resolving combinatorial ambiguities in dilepton tt̄ event topologies with neural networks

Title
Resolving combinatorial ambiguities in dilepton tt̄ event topologies with neural networks
Author(s)
Haider Alhazmi; Zhongtian Dong; Li Huang; Jeong Han Kim; Kyoungchul Kong; David Shih
Publication Date
2022-06
Journal
Physical Review D, v.105, no.11
Publisher
American Physical Society
Abstract
© 2022 authors. Published by the American Physical Society. We study the potential of deep learning to resolve the combinatorial problem in supersymmetry-like events with two invisible particles at the LHC. As a concrete example, we focus on dileptonic tt̄ events, where the combinatorial problem becomes an issue of binary classification: pairing the correct lepton with each b quark coming from the decays of the tops. We investigate the performance of a number of machine learning algorithms, including attention-based networks, which have been used for a similar problem in the fully hadronic channel of tt̄ production, and the Lorentz Boost Network, which is motivated by physics principles. We then consider the general case when the underlying mass spectrum is unknown, and hence no kinematic end-point information is available. Compared against existing methods based on kinematic variables, we demonstrate that the efficiency for selecting the correct pairing is greatly improved by utilizing deep learning techniques.
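The abstract frames the pairing ambiguity as binary classification: each dileptonic tt̄ event admits two ways to assign the two leptons to the two b quarks, and a network scores which assignment is correct. The sketch below only illustrates that framing and is not the attention-based or Lorentz Boost networks studied in the paper; the invariant-mass features, the network size, and the helper names (pairing_features, train_step) are assumptions chosen for a minimal runnable example.

```python
import numpy as np
import torch
import torch.nn as nn

# Minimal sketch (not the paper's architecture): a small feedforward
# classifier that scores the two possible lepton/b-quark pairings in a
# dileptonic ttbar event. Features and network size are illustrative.

def invariant_mass(p1, p2):
    """Invariant mass of the sum of two four-momenta (E, px, py, pz)."""
    p = p1 + p2
    return np.sqrt(np.maximum(p[0]**2 - p[1]**2 - p[2]**2 - p[3]**2, 0.0))

def pairing_features(lep_plus, lep_minus, b, bbar):
    """Features sensitive to the pairing, e.g. the four m(lepton, b) values."""
    return np.array([
        invariant_mass(lep_plus, b),
        invariant_mass(lep_plus, bbar),
        invariant_mass(lep_minus, b),
        invariant_mass(lep_minus, bbar),
    ])

# Hypothetical binary classifier: output near 1 means (l+, b) / (l-, bbar)
# is taken as the correct assignment, near 0 means the swapped pairing.
model = nn.Sequential(
    nn.Linear(4, 64), nn.ReLU(),
    nn.Linear(64, 64), nn.ReLU(),
    nn.Linear(64, 1), nn.Sigmoid(),
)
loss_fn = nn.BCELoss()
optimizer = torch.optim.Adam(model.parameters(), lr=1e-3)

def train_step(features, labels):
    """One gradient step on a batch of (features, correct-pairing labels)."""
    x = torch.as_tensor(features, dtype=torch.float32)
    y = torch.as_tensor(labels, dtype=torch.float32).unsqueeze(1)
    optimizer.zero_grad()
    loss = loss_fn(model(x), y)
    loss.backward()
    optimizer.step()
    return loss.item()
```

In such a setup the labels would come from Monte Carlo truth matching, and the trained score replaces kinematic-variable criteria (such as end-point cuts) for choosing the pairing.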
URI
https://pr.ibs.re.kr/handle/8788114/12033
DOI
10.1103/PhysRevD.105.115011
ISSN
2470-0010
Appears in Collections:
Center for Fundamental Theory (순수물리이론 연구단) > 1. Journal Papers (저널논문)
Files in This Item:
There are no files associated with this item.

Items in DSpace are protected by copyright, with all rights reserved, unless otherwise indicated.
