This week, Meta was in the spotlight for its Threads platform and its advertising tactics in Europe. The Federal Trade Commission (FTC) added new requirements for financial institutions when reporting data breaches. President Biden signed an executive order to regulate artificial intelligence (AI) platforms. At the same time, artists lost their class action lawsuit against image generation AI providers.
Here are the highlights.
1. President Biden signs a major executive order to govern AI
U.S. President Joe Biden signed an executive order this week, the first concrete federal action toward regulating artificial intelligence (AI). It holds AI platforms accountable to law enforcement and to citizens. Unlike ongoing lawsuits and warnings from legislators, which take a long time to yield results, the order is expected to have a more immediate impact on tech companies.
The executive order includes the following aspects:
- Safety disclosures for companies that build AI-powered tools
- Privacy protections for consumers who interact with AI
- Tackling biases and discrimination in AI algorithms
- Safety practices for healthcare practitioners and educators using AI tools
- Support for workers impacted by labor disruptions caused by AI
The executive order highlights the need to watermark AI-generated content. It also requires greater transparency about how large language models (LLMs) collect and use training data. Some of these actions must be implemented within 90 days, while others allow about a year, which means tech giants must prioritize the new rules going forward.
2. FTC updates the Safeguards Rule for non-banking firms
The Federal Trade Commission (FTC) amended the Standards for Safeguarding Customer Information (Safeguards Rule) this week. Originally, the rule did not specify what would happen if non-banking institutions experienced a data breach. The amendment requires businesses such as mortgage brokers and payday lenders to report a data breach to the FTC within 30 days of discovery, provided the breach affects the sensitive data of 500 or more people.
However, there is one exception to the new amendment: if publicly announcing a data breach could threaten national security or impede an ongoing investigation, businesses can request a 60-day delay.
The amendment was first proposed in 2021 as a way of protecting consumer data. It requires businesses to maintain a comprehensive data security program, including staff training and incident response strategies. The new rules will apply to all businesses entrusted with sensitive financial data, including credit unions, tax firms, and financial advisors.
3. U.K. software company sues Meta over Threads
A company based in the United Kingdom (U.K.) called Threads Software Limited sent a warning to Meta this week. The company wants Meta to stop using the name “Threads” on the Instagram platform within 30 days. Otherwise, Threads Software Limited says it will pursue legal action in U.K. courts.
The U.K. Threads was licensed and trademarked in 2012 by JPY Limited. It is the name of email aggregation software for businesses that unifies different email inboxes into one platform to improve workflows. The company’s owners stated that, since April 2023, Meta made several offers to purchase the “Threads.app” domain from them, all of which they declined. Meta later announced its own Threads platform and removed the U.K. Threads company from Facebook. Meta has yet to respond to these claims.
This comes after Twitter was sued in early October over its rebranding to “X.” X SocialMedia, a digital marketing agency based in Florida and launched in 2016, claims that the rebranding has caused brand confusion and lost revenue. The case is still ongoing.
4. Artists lose the first case against AI
In January this year, three artists sued AI image generation companies for copyright infringement. The lawsuit, Andersen v. Stability AI, claimed that their artwork was used as training data for AI content generation. In April, the defendants, which include Midjourney and DeviantArt, asked the court to dismiss the case. This week, the court ruled that the copyright infringement claim was not plausible.
The artists involved are Sarah Andersen, Kelly McKernan, and Karla Ortiz. Judge William H. Orrick of the Northern District of California noted that only one artist, Andersen, had registered 16 of her hundreds of comics with the U.S. Copyright Office. She could opt to sue the AI companies based on those registered works only. McKernan and Ortiz, however, had not registered copyrights for their art and therefore could not claim that AI tools used their content for training.
The ruling may set a precedent for other artists seeking to sue AI companies for copyright infringement. It suggests that as long as they officially register their work with the Copyright Office, they have a chance of fighting back against AI platforms. This is one of several cases against AI companies. Software developers, stock photo companies, and authors have sued OpenAI since the launch of ChatGPT in November 2022.
5. Meta may be banned across the EU for behavioral advertising
Behavioral advertising is a method of targeting individuals with ads based on their browsing habits and location data. It relies on personal data harvesting techniques, such as cookies in a web browser (a simplified sketch of the mechanism appears at the end of this section). Meta came under scrutiny for behavioral advertising in Norway in August this year. Norway won the case in September, banning Meta from harvesting user data for targeted ads there. This week, the European Data Protection Board (EDPB) extended that ban to cover all European Union (EU) and European Economic Area (EEA) countries.
This means that in 30 European countries, Meta will be breaking the law if its platforms harvest user data for targeted ads. Facebook and Instagram have an estimated 250 million users in Europe. Meta had argued that users consented to targeted ads when they signed up for its platforms, but this defense did not hold up against the General Data Protection Regulation (GDPR) framework.
Meta was already facing an ongoing fine of $90,000 per day for violating the ban in Norway. Noncompliance with the multinational ban could cost Meta up to 4% of its annual turnover in fines.
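For readers unfamiliar with how cookie-based behavioral profiling works, the sketch below shows one way a browser cookie might be used to build up an interest profile for ad targeting. It is a simplified, hypothetical example: the cookie name, profile fields, and page-visit logic are assumptions made for illustration and do not describe Meta's actual systems.

```typescript
// Hypothetical sketch of cookie-based behavioral ad profiling.
// The cookie name ("ad_profile") and its fields are illustrative only.

interface AdProfile {
  userId: string;          // pseudonymous identifier stored in the cookie
  interests: string[];     // topics inferred from pages the browser visits
  lastSeenRegion: string;  // coarse location, e.g. derived from IP address
}

// Read the tracking cookie from document.cookie, if present.
function readProfile(): AdProfile | null {
  const match = document.cookie.match(/(?:^|;\s*)ad_profile=([^;]+)/);
  return match ? (JSON.parse(decodeURIComponent(match[1])) as AdProfile) : null;
}

// Record a page visit by adding an inferred interest and re-saving the cookie.
function recordVisit(topic: string, region: string): void {
  const profile = readProfile() ?? {
    userId: crypto.randomUUID(),
    interests: [],
    lastSeenRegion: region,
  };
  if (!profile.interests.includes(topic)) {
    profile.interests.push(topic);
  }
  profile.lastSeenRegion = region;
  document.cookie =
    "ad_profile=" + encodeURIComponent(JSON.stringify(profile)) +
    "; path=/; max-age=31536000"; // persists for one year
}

// An ad server could later match the stored interests against campaigns.
recordVisit("running_shoes", "NO");
```

The regulators' objection is not to advertising itself but to assembling and using this kind of profile without meaningful consent, which is what the GDPR-based ban targets.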