
Application of deep learning for detection of toxic images in social media

Mac Dermott, AM, Motylinski, M, Iqbal, F, Stamp, K, Hussain, M and Marrington, A (2022) Application of deep learning for detection of toxic images in social media. In: Forensic Science International: Digital Investigation, 43 (301446). (DFRWS APAC 2022, Adelaide, Australia).

Using deep learning to detect social media ‘trolls'.pdf - Published Version
Available under License Creative Commons Attribution Non-commercial No Derivatives.

Download (1MB)

Abstract

Detecting criminal activity online is not a new concept, but how it occurs is changing. Technology and the influx of social media applications and platforms have a vital part to play in this changing landscape. As a result, we observe a growing problem with cyber abuse and ‘trolling’/toxicity on social media platforms where users share stories, posts, and memes. In this paper we present our work on the application of deep learning techniques to the detection of ‘trolls’ and toxic content shared on social media platforms. We propose a machine learning solution for the detection of toxic images based on their embedded text content. The project utilizes GloVe word embeddings with data augmentation to improve prediction capability. Our methodology details the implementation of Long Short-Term Memory (LSTM) and Gated Recurrent Unit (GRU) models and their bidirectional variants, compares our approach to related work, and highlights the resulting improvements. Our experiments revealed that the best performing model, the Bidirectional LSTM, achieved 0.92 testing accuracy and 0.88 inference accuracy, with F1-scores of 0.92 and 0.88 respectively.
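The pipeline the abstract describes — map each token of the extracted text to a GloVe-style dense vector, run an LSTM over the sequence in both directions, and concatenate the two final hidden states into a feature vector for classification — can be sketched in plain NumPy. This is not the authors' implementation: the embedding matrix here is a random stand-in for pretrained GloVe vectors, and the weight shapes, hidden size, and helper names (`lstm_step`, `bidirectional_lstm`) are illustrative assumptions, shown only to make the BiLSTM data flow concrete.

```python
import numpy as np

def sigmoid(x):
    return 1.0 / (1.0 + np.exp(-x))

def lstm_step(x, h, c, W, U, b):
    # One LSTM cell update; input, forget, candidate and output
    # gate weights are stacked row-wise in W, U and b.
    H = h.shape[0]
    z = W @ x + U @ h + b
    i = sigmoid(z[0:H])          # input gate
    f = sigmoid(z[H:2 * H])      # forget gate
    g = np.tanh(z[2 * H:3 * H])  # candidate cell state
    o = sigmoid(z[3 * H:4 * H])  # output gate
    c_new = f * c + i * g
    h_new = o * np.tanh(c_new)
    return h_new, c_new

def run_lstm(seq, W, U, b, H):
    # Fold the whole token sequence into a final hidden state.
    h, c = np.zeros(H), np.zeros(H)
    for x in seq:
        h, c = lstm_step(x, h, c, W, U, b)
    return h

def bidirectional_lstm(seq, params_fwd, params_bwd, H):
    # Forward pass over the sequence and a second pass over the
    # reversed sequence; the concatenation is the BiLSTM summary
    # that a classifier head would consume.
    h_fwd = run_lstm(seq, *params_fwd, H)
    h_bwd = run_lstm(seq[::-1], *params_bwd, H)
    return np.concatenate([h_fwd, h_bwd])

rng = np.random.default_rng(0)
V, D, H = 50, 8, 4  # toy vocabulary size, embedding dim, hidden dim
E = rng.normal(size=(V, D))  # random stand-in for GloVe embeddings

def new_params():
    return (rng.normal(size=(4 * H, D)),  # input-to-gate weights
            rng.normal(size=(4 * H, H)),  # hidden-to-gate weights
            np.zeros(4 * H))              # gate biases

tokens = [3, 17, 42]               # token ids from the embedded text
seq = [E[t] for t in tokens]       # embedding lookup
features = bidirectional_lstm(seq, new_params(), new_params(), H)
print(features.shape)  # (8,): forward and backward states, 2 * H
```

In the paper's setting the final feature vector would feed a dense sigmoid layer producing the toxic/non-toxic probability; frameworks such as Keras or PyTorch provide the same bidirectional wrapping as a built-in layer.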

Item Type: Conference or Workshop Item (Paper)
Subjects: Q Science > QA Mathematics > QA76 Computer software
Divisions: Computer Science & Mathematics
Publisher: Elsevier
SWORD Depositor: A Symplectic
Date Deposited: 30 Jun 2022 12:36
Last Modified: 02 Nov 2023 12:06
DOI or ID number: 10.1016/j.fsidi.2022.301446
URI: https://researchonline.ljmu.ac.uk/id/eprint/17187