Please use this identifier to cite or link to this item: https://hdl.handle.net/10356/65377
Full metadata record
DC Field | Value | Language
dc.contributor.author | Bahrami, Khosro | en
dc.date.accessioned | 2015-09-07T05:16:21Z | en
dc.date.available | 2015-09-07T05:16:21Z | en
dc.date.copyright | 2014 | en
dc.date.issued | 2014 | en
dc.identifier.citation | Bahrami, K. (2014). Image tampering detection based on level or type of blurriness. Doctoral thesis, Nanyang Technological University, Singapore. | en
dc.identifier.uri | https://hdl.handle.net/10356/65377 | en
dc.description.abstract | With the development of sophisticated photo-editing tools, images can be manipulated and forged easily, and tampered images are difficult to detect by eye. Since images are used in journalism, medical diagnosis, police investigation, and as court evidence, image tampering can be a threat to individuals and to society. Detecting image forgery is therefore an urgent issue, and developing reliable methods for image integrity examination and forgery detection is important. Image splicing is one of the most common types of image tampering. If the original image and the spliced region are inconsistent in blur type or blur level, that inconsistency can serve as evidence of splicing. In addition, splicing leaves traces along the splicing boundary in the form of sharp edges that differ from the normal edges in the image. However, a forger may apply post-processing operations, such as resizing the tampered image to a smaller size or artificially blurring the splicing boundary, to remove these traces or visual anomalies; in such cases, existing tampering detection methods are less reliable. In this thesis, we address splicing detection and localization by proposing three methods: 1) splicing localization by exposing blur type inconsistency between the spliced region and the original image; 2) splicing localization based on inconsistency between blur and depth in the spliced region; and 3) splicing detection based on splicing boundary artifacts. To locate the spliced region based on blur type inconsistency, we propose a blur type detection feature that classifies image blocks by blur type; this feature is used in a classification framework to separate spliced from authentic regions. To locate the splicing based on the inconsistency between blur and depth, we estimate two depth maps, one from the defocus blur cue and one from image content cues, and use the inconsistency between them for splicing localization. To detect splicing based on splicing boundary artifacts, we propose two sharpness features, Maximum Local Variation (MLV) and Content Aware Total Variation (CATV), which measure the local sharpness of the image and are incorporated in a machine learning framework to classify an image as authentic or spliced. Unlike previous splicing detection methods, the first two methods remain reliable when the splicing boundary is artificially blurred, and all three proposed methods are robust to image resizing. | en
dc.format.extent | 118 | en
dc.language.iso | en | en
dc.subject | DRNTU::Engineering::Computer science and engineering::Computing methodologies::Image processing and computer vision | en
dc.title | Image tampering detection based on level or type of blurriness | en
dc.type | Thesis | en
dc.contributor.supervisor | Kot Chichung, Alex | en
dc.contributor.school | School of Electrical and Electronic Engineering | en
dc.description.degree | DOCTOR OF PHILOSOPHY (EEE) | en
dc.identifier.doi | 10.32657/10356/65377 | en
item.grantfulltext | open | -
item.fulltext | With Fulltext | -
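The Maximum Local Variation (MLV) sharpness feature named in the abstract can be sketched as follows. The idea, as described, is a per-pixel measure of local sharpness: the largest absolute intensity difference between a pixel and its 8 neighbours. This is a minimal illustrative sketch of that idea only; the function name and toy input are my own, and the thesis's full feature (any weighting, pooling, or classification stages) is not reproduced here.

```python
import numpy as np

def maximum_local_variation(image):
    """Per-pixel MLV sketch: the largest absolute intensity difference
    between each pixel and its 8 neighbours (edge-padded at borders).
    Illustrative only; weighting/pooling details follow the thesis,
    not this snippet."""
    img = np.asarray(image, dtype=np.float64)
    # Pad with edge values so border pixels also have 8 neighbours.
    padded = np.pad(img, 1, mode="edge")
    mlv = np.zeros_like(img)
    for dy in (-1, 0, 1):
        for dx in (-1, 0, 1):
            if dy == 0 and dx == 0:
                continue  # skip the centre pixel itself
            shifted = padded[1 + dy : 1 + dy + img.shape[0],
                             1 + dx : 1 + dx + img.shape[1]]
            mlv = np.maximum(mlv, np.abs(img - shifted))
    return mlv

# A sharp vertical edge yields high MLV; flat regions yield zero.
block = np.array([[0, 0, 255, 255],
                  [0, 0, 255, 255],
                  [0, 0, 255, 255],
                  [0, 0, 255, 255]], dtype=float)
print(maximum_local_variation(block).max())  # 255.0, at the edge columns
```

Intuitively, artificially blurred splicing boundaries suppress exactly this kind of local variation, which is why a sharpness map of this family can expose boundary artifacts.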
Appears in Collections:EEE Theses
Files in This Item:
File | Description | Size | Format
PhdThesis-KhosroBahrami.pdf | Main article (PhD Thesis) | 35.55 MB | Adobe PDF

Items in DR-NTU are protected by copyright, with all rights reserved, unless otherwise indicated.