Premium Invest Hub


    FTC Updates Rules to Address AI Deepfake Threats to Consumer Safety

    • February 17, 2024

    On February 16, the US Federal Trade Commission (FTC) proposed updates to a rule targeting artificial intelligence (AI) deepfakes. The agency said the proposed changes would protect consumers from AI-driven impersonation.

    According to the ‘Rule on Impersonation of Government and Businesses’ document, those who use AI deepfakes to impersonate businesses or government agencies could face legal action.

    No AI Deepfakes Allowed for Businesses and Government Agencies


    The FTC said the changes are necessary due to the prevalence of impersonations of businesses, government officials, and government-affiliated organizations.

    The goal is to protect consumers from harm enabled by generative AI platforms.

    The updated rule will come into effect 30 days following its publication in the Federal Register.

    For now, public comments are welcome for the next 60 days. Once the rule is enacted, the FTC will be empowered to go after scammers who defraud users by impersonating legitimate businesses or government agencies.

    The AI industry has come a long way since OpenAI launched ChatGPT in November 2022. The company, led by Sam Altman, recently unveiled a new product called Sora.

    Sora uses AI prompts to generate realistic videos with highly detailed scenes, complex camera motions, and vibrant emotions.

    Introducing Sora, our text-to-video model.

    Sora can create videos of up to 60 seconds featuring highly detailed scenes, complex camera motion, and multiple characters with vibrant emotions. https://t.co/7j2JN27M3W

    Prompt: “Beautiful, snowy… pic.twitter.com/ruTEWn87vf

    — OpenAI (@OpenAI) February 15, 2024

    Powerful AI tools like those offered by OpenAI and Google have increased productivity for many people and businesses.

    However, they have also become effective tools in the hands of cybercriminals, who can easily alter a person’s appearance or voice to deceive a target audience.

    The FTC rule change will come down hard on these criminals to ensure they face the full weight of the law.

    While there is no concrete rule that makes AI-generated recreations illegal, US Senators Chris Coons, Marsha Blackburn, and Thom Tillis have taken steps to address the issue.

    Impersonator Scams Stole $2.7 Billion in 2023


    Impersonator scams, though they rarely make headlines, pose a major threat to US consumers.

    Speaking on the issue, FTC Chair Lina Khan noted that voice cloning and AI-driven scams were on the rise.

    Khan proposed that updating the rules would strengthen the agency’s ability to address AI-enabled scams that impersonate individuals.

    Putting a figure on the damage impersonator scams cause, Khan noted that US citizens lost upwards of $2.7 billion to them in 2023.

    2. Scams where fraudsters pose as the government are highly common. Last year Americans lost $2.7 billion to impersonator scams.

    The rule @FTC just finalized will let us levy penalties on these scammers and get back money for those defrauded. https://t.co/8ON0G63ZjL

    — Lina Khan (@linakhanFTC) February 15, 2024

    The new rules would also enable the agency to return the stolen funds to the affected victims.

    Meanwhile, the head of the Federal Communications Commission (FCC), Jessica Rosenworcel, has proposed categorizing all calls with AI-generated voices as illegal.

    Today we announced a proposal to make AI-voice generated robocalls illegal – giving State AGs new tools to crack down on voice cloning scams and protect consumers. https://t.co/OfJUZR0HrG

    — The FCC (@FCC) January 31, 2024

    The announcement came after reports surfaced that US citizens were getting robocalls imitating President Joe Biden.

    NH voters are getting robocalls from Biden telling them not to vote tomorrow.

    Except it’s not Biden. It’s a deepfake of his voice.

    This is what happens when AI’s power goes unchecked.

    If we don’t regulate it, our democracy is doomed. pic.twitter.com/8wlrT63Mfr

    — Public Citizen (@Public_Citizen) January 22, 2024

    In the call, New Hampshire voters were advised not to vote in the state’s presidential primary.

    Meanwhile, in the crypto industry, AI deepfakes are a menace.

    According to Michael Saylor, about 80 deepfake videos of himself are removed daily. Most videos show him asking users to send their Bitcoin to a posted wallet address.

    Michael Saylor, Chairman of MicroStrategy and one of the largest Bitcoin holders, has issued a warning to the Bitcoin community about the risk of scams using deep-fake videos created by artificial intelligence (AI). He revealed that his security team had to remove about 80…

    — manaury ezequiel (@manauryezequiel) January 15, 2024

    New ones emerge daily, however. Saylor, who serves as Chairman of MicroStrategy, has warned crypto investors about the trend.

    The post FTC Updates Rules to Address AI Deepfake Threats to Consumer Safety appeared first on Cryptonews.

