A deep facial recognition system using computational intelligent algorithms.

  • Additional Information
    • Source:
      Publisher: Public Library of Science
      Country of Publication: United States
      NLM ID: 101285081
      Publication Model: eCollection
      Cited Medium: Internet
      ISSN: 1932-6203 (Electronic)
      Linking ISSN: 19326203
      NLM ISO Abbreviation: PLoS One
      Subsets: MEDLINE
    • Publication Information:
      Original Publication: San Francisco, CA : Public Library of Science
    • Subject Terms:
    • Abstract:
      The development of biometric applications such as facial recognition (FR) has recently become important in smart cities. Many scientists and engineers around the world have focused on establishing increasingly robust and accurate algorithms and methods for these systems and their applications in everyday life. FR is a developing technology with multiple real-time applications. The goal of this paper is to develop a complete FR system using transfer learning in fog computing and cloud computing. The developed system uses deep convolutional neural networks (DCNN) because of their dominant representational power, although conditions such as occlusion, expression, illumination, and pose can affect deep FR performance. The DCNN is used to extract relevant facial features, which allow faces to be compared efficiently. The system can be trained to recognize a set of people and to learn online, integrating the new people it processes and improving its predictions on those it already knows. The proposed recognition method was tested with three standard machine learning algorithms: Decision Tree (DT), K-Nearest Neighbor (KNN), and Support Vector Machine (SVM). The proposed system was evaluated on three face-image datasets (SDUMLA-HMT, 113, and CASIA) using the performance metrics accuracy, precision, sensitivity, specificity, and time. The experimental results show that the proposed method outperforms the comparison algorithms on all metrics, achieving higher accuracy (99.06%), precision (99.12%), recall (99.07%), and specificity (99.10%). (A minimal code sketch of this pipeline follows the record below.)
      Competing Interests: The authors have declared that no competing interests exist.
    • Publication Date:
      Date Created: 20201203; Date Completed: 20210115; Latest Revision: 20210115
    • Publication Date:
      20240104
    • PMCID:
      PMC7714107
    • DOI:
      10.1371/journal.pone.0242269
    • PMID:
      33270670
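
The abstract above describes a pipeline in which a pretrained DCNN (transfer learning) serves as a fixed feature extractor, and the resulting face embeddings are classified with DT, KNN, and SVM, then scored on accuracy, precision, recall/sensitivity, and specificity. The sketch below illustrates such a pipeline under stated assumptions: the backbone (torchvision ResNet-50), the 224×224 preprocessing, and the `embed`/`evaluate`/`specificity` helper names are illustrative choices, not the paper's exact architecture or code.

```python
# Illustrative sketch, not the paper's implementation: a pretrained CNN
# (assumed here to be torchvision's ResNet-50) extracts face embeddings,
# which are classified with the three algorithms named in the abstract.
import numpy as np
import torch
from torchvision import models, transforms
from PIL import Image
from sklearn.tree import DecisionTreeClassifier
from sklearn.neighbors import KNeighborsClassifier
from sklearn.svm import SVC
from sklearn.metrics import (accuracy_score, precision_score,
                             recall_score, confusion_matrix)

# Pretrained backbone with the classification head removed -> 2048-d embeddings.
backbone = models.resnet50(weights=models.ResNet50_Weights.DEFAULT)
backbone.fc = torch.nn.Identity()
backbone.eval()

preprocess = transforms.Compose([
    transforms.Resize((224, 224)),
    transforms.ToTensor(),
    transforms.Normalize(mean=[0.485, 0.456, 0.406],
                         std=[0.229, 0.224, 0.225]),
])

def embed(image_path: str) -> np.ndarray:
    """Map one face image to a fixed-length feature vector."""
    img = preprocess(Image.open(image_path).convert("RGB")).unsqueeze(0)
    with torch.no_grad():
        return backbone(img).squeeze(0).numpy()

def specificity(y_true, y_pred) -> float:
    """Macro-averaged specificity TN / (TN + FP) over all classes."""
    cm = confusion_matrix(y_true, y_pred)
    specs = []
    for k in range(cm.shape[0]):
        tn = cm.sum() - cm[k, :].sum() - cm[:, k].sum() + cm[k, k]
        fp = cm[:, k].sum() - cm[k, k]
        specs.append(tn / (tn + fp) if (tn + fp) else 0.0)
    return float(np.mean(specs))

def evaluate(train_paths, y_train, test_paths, y_test):
    """Compare DT, KNN, and SVM on the CNN embeddings."""
    X_train = np.stack([embed(p) for p in train_paths])
    X_test = np.stack([embed(p) for p in test_paths])
    for name, clf in [("DT", DecisionTreeClassifier()),
                      ("KNN", KNeighborsClassifier(n_neighbors=5)),
                      ("SVM", SVC(kernel="rbf"))]:
        clf.fit(X_train, y_train)
        y_pred = clf.predict(X_test)
        print(name,
              "acc=%.4f" % accuracy_score(y_test, y_pred),
              "prec=%.4f" % precision_score(y_test, y_pred,
                                            average="macro", zero_division=0),
              "rec=%.4f" % recall_score(y_test, y_pred,
                                        average="macro", zero_division=0),
              "spec=%.4f" % specificity(y_test, y_pred))
```

Given lists of labeled training and test image paths, `evaluate(train_paths, y_train, test_paths, y_test)` prints the four metrics for each classifier; the macro-averaged specificity helper is included because scikit-learn does not provide a specificity score directly.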