
k-Space deep learning for reference-free EPI ghost correction

DC Field                              Value
dc.contributor.author                 Juyoung Lee
dc.contributor.author                 Yoseob Han
dc.contributor.author                 Jae-Kyun Ryu
dc.contributor.author                 Jang-Yeon Park
dc.contributor.author                 Jong Chul Ye
dc.date.available                     2019-10-11T08:05:57Z
dc.date.created                       2019-08-20
dc.date.issued                        2019-12
dc.identifier.issn                    0740-3194
dc.identifier.uri                     https://pr.ibs.re.kr/handle/8788114/6257
dc.description.abstract               © 2019 International Society for Magnetic Resonance in Medicine.
    Purpose: Nyquist ghost artifacts in echo planar imaging (EPI) originate from a phase mismatch between the even and odd echoes. However, conventional correction methods using reference scans often produce erroneous results, especially in high-field MRI, due to nonlinear and time-varying local magnetic field changes. It was recently shown that ghost correction can be reformulated as a k-space interpolation problem solvable with structured low-rank Hankel matrix approaches, and that data-driven Hankel matrix decomposition can be recast in a form structurally similar to a deep convolutional neural network. By synergistically combining these findings, we propose a k-space deep learning approach that immediately corrects the phase mismatch without a reference scan in both accelerated and non-accelerated EPI acquisitions.
    Theory and Methods: To exploit the redundancy between the even and odd phase-encoding directions, the k-space data are divided into two channels holding the even and odd phase encodings. The redundancy between coils is also exploited by stacking the multi-coil k-space data into additional input channels. Our k-space ghost-correction network is then trained to learn the interpolation kernel that estimates the missing virtual k-space data. For accelerated EPI data, the same network is trained to directly estimate the interpolation kernels for the k-space data missing from both ghosting and subsampling.
    Results: Reconstructions of 3T and 7T in vivo data showed that the proposed method yields better image quality than existing methods, with much faster computation.
    Conclusions: The proposed k-space deep learning for EPI ghost correction is highly robust and fast, and can be combined with acceleration, making it a promising correction tool for high-field MRI that requires no change to the current acquisition protocol.
dc.description.uri                    1
dc.language                           English
dc.publisher                          WILEY-BLACKWELL
dc.subject                            deep convolutional framelet
dc.subject                            deep learning k-space learning
dc.subject                            EPI
dc.subject                            MRI
dc.subject                            Nyquist ghost artifact
dc.title                              k-Space deep learning for reference-free EPI ghost correction
dc.type                               Article
dc.type.rims                          ART
dc.identifier.wosid                   000478151700001
dc.identifier.scopusid                2-s2.0-85069906163
dc.identifier.rimsid                  69382
dc.contributor.affiliatedAuthor       Jae-Kyun Ryu
dc.contributor.affiliatedAuthor       Jang-Yeon Park
dc.identifier.doi                     10.1002/mrm.27896
dc.identifier.bibliographicCitation   MAGNETIC RESONANCE IN MEDICINE, v.82, no.6, pp.2299 - 2313
dc.citation.title                     MAGNETIC RESONANCE IN MEDICINE
dc.citation.volume                    82
dc.citation.number                    6
dc.citation.startPage                 2299
dc.citation.endPage                   2313
dc.description.journalClass           1
dc.description.journalRegisteredClass scie
dc.description.journalRegisteredClass scopus
dc.subject.keywordAuthor              deep convolutional framelet
dc.subject.keywordAuthor              deep learning k-space learning
dc.subject.keywordAuthor              EPI
dc.subject.keywordAuthor              MRI
dc.subject.keywordAuthor              Nyquist ghost artifact
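
The Theory and Methods part of the abstract describes the input formatting for the network: even and odd phase encodings are separated into two channels, and multi-coil k-space data are stacked as additional channels, so the network can learn an interpolation kernel for the missing virtual k-space data. As a rough illustration only, here is a minimal NumPy sketch of such channel splitting; the function name, the zero-filling of the complementary lines, and the real/imaginary channel stacking are assumptions made for this sketch, not the authors' published implementation.

    import numpy as np

    # Hypothetical sketch of the even/odd channel splitting described in the
    # abstract. kspace: complex multi-coil EPI k-space of shape
    # (n_coil, n_pe, n_ro), where n_pe counts phase-encoding lines.
    def split_even_odd_channels(kspace: np.ndarray) -> np.ndarray:
        n_coil, n_pe, n_ro = kspace.shape
        even = np.zeros_like(kspace)
        odd = np.zeros_like(kspace)
        even[:, 0::2, :] = kspace[:, 0::2, :]  # keep even phase encodings only
        odd[:, 1::2, :] = kspace[:, 1::2, :]   # keep odd phase encodings only
        # Stack real/imaginary parts as separate channels; feeding complex
        # data to a real-valued CNN this way is a common convention, assumed here.
        channels = np.concatenate([even.real, even.imag, odd.real, odd.imag], axis=0)
        return channels.astype(np.float32)

    if __name__ == "__main__":
        demo = np.random.randn(4, 128, 128) + 1j * np.random.randn(4, 128, 128)
        print(split_even_odd_channels(demo).shape)  # (16, 128, 128)

The zero-filled complementary lines in each channel play the role of the missing "virtual" k-space samples that the trained network is asked to interpolate; for accelerated EPI, additionally subsampled lines would be zero-filled in the same way.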
Appears in Collections:
Center for Neuroscience Imaging Research (뇌과학 이미징 연구단) > 1. Journal Papers (저널논문)
Files in This Item:
36 박장연_k‐Space deep learning for reference‐free EPI ghost correction.pdf


Items in DSpace are protected by copyright, with all rights reserved, unless otherwise indicated.
