Congress: ECR25
Poster Number: C-28044
Type: Poster: EPOS Radiologist (scientific)
Authorblock: S. Sauranen, T. Mäkelä, T. Kaasalainen, M. Kortesniemi; Helsinki/FI
Disclosures:
Sara Sauranen: Nothing to disclose
Teemu Mäkelä: Nothing to disclose
Touko Kaasalainen: Nothing to disclose
Mika Kortesniemi: Nothing to disclose
Keywords: Artificial Intelligence, Computer applications, Radiation physics, CT, Physics, Technology assessment, Quality assurance
Purpose

Image noise is a distractor in radiological image reading and may obscure subtle details, especially at low contrast levels. Thus, image noise magnitude is a key descriptor of image quality. In computed tomography (CT), the magnitude of image noise is typically measured as the standard deviation of CT numbers in a uniform image area. Reducing noise is one of the general goals when optimizing image quality. In addition to well-established noise reduction methods for conventional and dual-energy CT (DECT), many denoising technologies based on artificial intelligence (AI), especially deep learning, have recently been introduced for both research and clinical use. AI-based noise reduction methods have been shown to improve image quality in both conventional CT and DECT in various studies [1,2,3]. Optimizing the image quality of DECT maps often requires particular attention in the image acquisition, reconstruction, and post-processing domains, because the formation of DECT maps, such as effective atomic number maps and virtual monoenergetic images, differs from that of conventional CT images. In this study, AI noise reduction was applied to DECT maps at different stages of image post-processing. DECT maps with and without AI noise reduction were compared to evaluate whether the application of AI noise reduction, and its timing during post-processing, affects noise and contrast across different dose levels.
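
As a concrete illustration of the noise metric mentioned above, the following minimal Python sketch estimates noise as the standard deviation of CT numbers within a uniform region of interest. It is not part of the poster's methods; the synthetic phantom slice, ROI placement, and noise level are illustrative assumptions only.

```python
import numpy as np

def roi_noise(ct_slice: np.ndarray, center: tuple, half_size: int) -> float:
    """Estimate image noise as the standard deviation of CT numbers (HU)
    inside a square ROI placed in a uniform region of the slice."""
    r, c = center
    roi = ct_slice[r - half_size:r + half_size, c - half_size:c + half_size]
    return float(np.std(roi, ddof=1))  # sample standard deviation

# Illustrative usage: a synthetic uniform water-equivalent slice (0 HU)
# with additive Gaussian noise of about 12 HU.
rng = np.random.default_rng(0)
phantom = rng.normal(loc=0.0, scale=12.0, size=(512, 512))
print(f"Measured noise: {roi_noise(phantom, center=(256, 256), half_size=20):.1f} HU")
```

In practice, the same ROI-based measurement can be repeated on reconstructions with and without AI noise reduction at matched dose levels to quantify the change in noise magnitude.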
