Safer Internet: Online Scams

From phishing emails and fake websites to AI-generated frauds, online scams are becoming increasingly sophisticated. These threats often exploit trust, urgency, or lack of awareness to deceive people into sharing personal or financial information.

Common Scams to Watch Out For

  • Phishing Attacks: Fraudulent emails or messages that appear to come from trusted sources.
  • Impersonation Scams: Fake profiles or accounts pretending to be someone you know.
  • Too-Good-To-Be-True Offers: Unrealistic deals designed to lure you into providing payment details.
  • AI-Driven Scams: Deepfake videos or voices imitating real individuals to manipulate victims.

How to Protect Yourself

Phishing attacks:

  • Verify the source: Always double-check the sender’s email address or phone number. Scammers often use addresses or numbers that closely mimic legitimate ones.
  • Think before you click: Avoid clicking on links or downloading attachments from unknown or suspicious emails.
  • Use multi-factor authentication (MFA): Add an extra layer of security to your accounts. Even if scammers obtain your password, MFA can block unauthorised access.
  • Be wary of urgent language: Scammers use urgency to make you act without thinking. Always take a moment to assess the message’s legitimacy.

Impersonation scams:

  • Verify accounts: Before engaging with someone online, confirm their identity through an alternate trusted source.
  • Scrutinise social media requests: Be cautious of friend or connection requests from people you don’t know or duplicate requests from those you already know.
  • Limit public information: Avoid sharing personal details, such as your full name, address, or employer, on public platforms. Scammers can use this information to build convincing fake accounts.
  • Enable privacy settings: Adjust your social media privacy settings to restrict who can view your profile or contact you.

Too-good-to-be-true offers:

  • Research the offer: Google the deal, company, or person involved. Scams are often reported online.
  • Avoid upfront payments: Be sceptical if you’re asked to pay in advance for a “guaranteed” return.
  • Use trusted platforms: Only shop or conduct financial transactions on reputable websites with secure payment systems (look for “https://” in the URL and a padlock symbol in the browser’s address bar).
  • Trust your instincts: If something feels off or sounds too good to be true, it likely is.

AI-driven scams (deepfake videos or voices):

  • Pause and analyse: If you receive a suspicious voice or video message, question its authenticity. Deepfakes often have subtle imperfections, such as unnatural speech patterns or visual glitches.
  • Verify through secondary channels: Confirm the message by directly contacting the individual through a separate, trusted means.
  • Stay informed about AI tools: Familiarise yourself with the capabilities of deepfake technology to recognise its potential misuse.
  • Use reliable news sources: Avoid acting on AI-generated “news” or misinformation until you’ve verified the story from multiple trustworthy outlets.

More generally, you should:

  • Educate yourself regularly: Scams evolve quickly. Stay updated on common tactics by following cybersecurity blogs or news outlets.
  • Use security software: Install reliable antivirus and anti-phishing software on all devices.
  • Report scams: If you encounter a scam, report it to relevant authorities (e.g., Action Fraud in the UK) to help prevent others from falling victim.
  • Trust your instincts: If something seems suspicious, take a step back and reassess.

More information

Action Fraud (UK) website 👉 https://www.actionfraud.police.uk/

Our Cybersecurity Awareness course 👉 https://hsqe.co.uk/courses/cyber-security-awareness/

Make sure you are following our social media channels and sign up to our monthly newsletter so you don’t miss out:

LinkedIn 👉 https://www.linkedin.com/company/hsqe-limited/

Facebook 👉 https://facebook.com/hsqe.co.uk

Newsletters 👉 https://www.hsqe.co.uk/downloads/#newsletters

—

Authors: Alex Nightingale & John Constable

© HSQE Ltd 31/01/25