Cybercriminals Use Fake “Gemini” AI Chatbot to Push Fraudulent Crypto Scam

What happened

Cybercriminals launched a fraudulent “Google Coin” presale website that embeds a custom AI chatbot impersonating Google’s Gemini assistant. The bot engages visitors conversationally and persuades them to send cryptocurrency payments for a non-existent token.

Who is affected

Online crypto investors and the broader public visiting third-party crypto presale or investment sites are at risk, particularly those encountering AI-driven sales bots on unaffiliated platforms.

Why CISOs should care

This incident highlights a significant evolution in threat actor tactics: using AI to automate social engineering at scale. Traditional human-led scams, which engage victims one at a time through messaging or calls, are being augmented or replaced by AI chatbots that respond convincingly to individual queries and push victims toward irreversible payments. This raises both the number of potential victims a campaign can reach and the speed at which it can operate.

3 practical actions

  1. Monitor and educate: Alert employees and stakeholders about AI-assisted scams, emphasizing that reputable companies don’t embed third-party AI chatbots for financial transactions on unknown sites.
  2. Enhance detection: Integrate content scanning tools and threat intelligence feeds that flag sites mimicking trusted brands or using unauthorized AI branding.
  3. Review policies: Update security governance and acceptable use policies to address AI impersonation risks, including guidance on identifying fake investment platforms or bots.
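To illustrate the detection idea in action 2, the sketch below flags domains that embed a protected brand name without belonging to an official property. This is a minimal, illustrative heuristic only; the brand keywords and allowlist are assumptions, and a production control would combine such checks with threat intelligence feeds and fuzzy-matching for typosquats.

```python
# Minimal sketch of a brand-impersonation check (action 2): flag domains
# that embed a protected brand name but are not on an allowlist of
# official domains. Brand list and allowlist here are illustrative.
PROTECTED_BRANDS = ["google", "gemini"]
OFFICIAL_DOMAINS = {"google.com", "gemini.google.com"}

def is_suspicious(domain: str) -> bool:
    """Return True if the domain uses a protected brand name
    without being an official domain or a subdomain of one."""
    domain = domain.lower().strip(".")
    if domain in OFFICIAL_DOMAINS or any(
        domain.endswith("." + official) for official in OFFICIAL_DOMAINS
    ):
        return False  # official property
    return any(brand in domain for brand in PROTECTED_BRANDS)

# A presale site embedding the brand name is flagged; the real one is not.
print(is_suspicious("google-coin-presale.example"))  # True
print(is_suspicious("mail.google.com"))              # False
```

Substring matching like this produces false positives (e.g., unrelated words containing a brand string), so in practice flagged domains should feed a review queue rather than an automated block.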