FutureCISO

Study warns against malicious use cases of AI in 2024

By FutureCISO Editors
March 27, 2024
Photo by Anna Shvets: https://www.pexels.com/photo/people-on-a-video-call-4226261/

Malicious use cases of artificial intelligence (AI) will most likely emerge from targeted deepfakes and influence operations, according to Adversarial Intelligence: Red Teaming Malicious Use Cases for AI, a report by Recorded Future.

Malicious use cases:

  • Deepfakes for impersonation: Threat actors can use publicly available short clips to generate deepfakes and live voice clones.
  • Influence operations impersonating legitimate websites: Malicious actors can leverage AI to generate disinformation and automatically curate content based on generated text, cutting the cost of content production to roughly one-hundredth of that of traditional troll farms staffed by human content writers.
  • Self-augmenting malware evading YARA: Malicious actors can use generative AI to evade string-based YARA rules by altering the source code of malware variants and scripts, lowering detection rates.
  • ICS and aerial imagery reconnaissance: Threat actors can leverage multimodal AI to process public images and videos, geolocating facilities and identifying industrial control system (ICS) equipment.
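To illustrate the YARA-evasion point: string-based rules match fixed byte sequences, so a variant whose identifiers and artefacts have been rewritten can keep its behaviour while defeating every match. The sketch below is a minimal simulation of that brittleness; the signature strings and sample payloads are hypothetical and not drawn from the report.

```python
# Minimal sketch (assumption: not the report's methodology) of why purely
# string-based detection is brittle against source-rewriting malware.
SIGNATURE_STRINGS = [b"StealBrowserCookies", b"loot.db"]  # hypothetical IOCs

def string_rule_matches(sample: bytes) -> bool:
    """Flag the sample if any known signature string appears verbatim."""
    return any(sig in sample for sig in SIGNATURE_STRINGS)

original = b"call StealBrowserCookies(); write('loot.db')"
# An AI-rewritten variant renames identifiers but preserves the behaviour.
mutated = b"call GrabSessionData(); write('cache.bin')"

print(string_rule_matches(original))  # True: signature strings present
print(string_rule_matches(mutated))   # False: same behaviour, no match
```

This is why the report's scenario matters: renaming strings is cheap for an attacker with code-generation tools, whereas behavioural detection keys on what the program does rather than what it is called.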

Recommendations

As the voices, videos, and photos of executives have become part of an organisation's attack surface, Recorded Future analysts recommend organisations invest in multi-layered and behavioural malware detection capabilities to prepare for threat actors developing AI-assisted polymorphic malware.

Moreover, organisations need to assess the risk of impersonation in targeted attacks. Recorded Future analysts suggest organisations use various alternate methods of communication and verification for sensitive transactions. 

To protect sensitive data, publicly available images and videos of critical infrastructure and sensitive sectors such as defence, government, energy, manufacturing, and transportation should be scrutinised and scrubbed. 



Copyright © 2024 Cxociety Pte Ltd | Designed by Pixl
