Week of 2025-05-05

Ontario’s Greenbelt email-gate scandal nabs national lack of transparency award

Centre for Free Expression

Ontario’s Ford government has received the 2024 Code of Silence Award for Outstanding Achievement in Government Secrecy for its handling of the Greenbelt land development scandal. The award highlights the use of personal email accounts by senior staff to conduct official business, violating Ontario’s Freedom of Information laws. Auditor General Bonnie Lysyk’s 2023 report criticized the practice, which led to the resignation of a central staffer. The Centre for Free Expression called the actions a serious threat to the public’s right to government transparency. The award, issued by Canadian journalism and civil society groups, aims to draw attention to institutions that obstruct public access to information.

Updated guidance on public records disclosure in British Columbia

Jasmine Samra | Nicole Sapieha | Ariell Sie-Mah | Gowling WLG

In February 2025, British Columbia's Office of the Information and Privacy Commissioner (OIPC) released updated guidance to enhance proactive disclosure practices among public bodies under the Freedom of Information and Protection of Privacy Act (FIPPA). The guidance recommends that public institutions establish and publish clear categories of records routinely disclosed without formal FOI requests, aiming to improve transparency and reduce administrative burdens. Key steps include defining record categories, publishing them prominently online, training staff across departments, and regularly updating the disclosed information. This initiative follows an OIPC investigation highlighting the need for improved access-to-information processes in municipalities. Commissioner Michael Harvey emphasized that proactive disclosure empowers citizens to participate more fully in democratic processes by understanding decisions that impact their lives.

CAJ hands dubious honour to federal government over Residential School records

Marc Lalonde | Prince Albert Daily Herald

The Canadian Association of Journalists (CAJ) has awarded the federal government the 2024 Code of Silence Award for Outstanding Achievement in Government Secrecy, citing its persistent obstruction of access to residential school records. Despite the Truth and Reconciliation Commission's 2015 report urging full disclosure, Ottawa continues to resist releasing documents that could help Indigenous communities uncover the fates of missing children. The CAJ criticized the government's ongoing legal battles and lack of proactive transparency, stating that such actions hinder truth, justice, and reconciliation efforts. This award underscores the federal government's failure to honour its commitments to Indigenous peoples and the broader Canadian public.

‘The Classified Catalog’ launches to track secrecy news

Lauren Harper | Freedom of the Press Foundation

The Freedom of the Press Foundation has launched "The Classified Catalog," a new database designed to track and document instances of government secrecy under the Trump administration. Initiated in January 2025, the project aims to monitor actions such as the deletion of thousands of datasets from agency websites, closure of Freedom of Information Act (FOIA) offices, use of disappearing messaging apps without preserving government records, and the firing of inspectors general. The catalog serves as a resource for journalists, researchers, FOIA requesters, litigators, and members of Congress to stay informed about these developments and to advocate for transparency. Regular updates to the database will provide ongoing insights into the administration's approach to information management and public access.

Talking to a Brick Wall: The US Government’s Response to Public Comments on AI

Susan Ariel Aaronson | CIGI

A recent report by the Centre for International Governance Innovation (CIGI), titled "Talking to a Brick Wall," critiques the U.S. government's approach to public consultation on AI policy. Despite soliciting feedback from approximately 300 individuals, the National Telecommunications and Information Administration (NTIA) failed to engage a representative cross-section of the public, with most respondents being stakeholders already involved in AI governance. The NTIA's final report in July 2024 did not meaningfully incorporate or acknowledge the public comments received, leading to perceptions of a tokenistic consultation process. The author, Susan Ariel Aaronson, emphasizes that such superficial engagement undermines trust in AI policy-making and calls for more inclusive and responsive public consultation methods. The report advocates for alternative approaches, such as citizen assemblies, to ensure diverse perspectives are considered in AI governance.

UAE to use AI to create, regulate laws

Middle East Monitor

The United Arab Emirates (UAE) has announced plans to integrate artificial intelligence (AI) into its legislative process, marking a global first in AI-driven lawmaking. The newly established Regulatory Intelligence Office will oversee the use of AI to draft, amend, and regulate laws, aiming to accelerate the legislative process by up to 70%. This initiative involves creating a comprehensive database of federal and local laws, court judgments, and public service data, enabling AI to suggest legislative updates and assess the impact of laws on society and the economy. While the move is lauded for its potential to enhance efficiency and reduce costs, experts caution about the risks, including AI's potential to generate inaccurate or biased outputs and the challenges of ensuring transparency and accountability in AI-driven legislation.

Hackers hobbled Ontario’s broken system for tracking money laundering at online casinos

Claire Brownell | The Logic

A cyberattack in March 2024 left Ontario's online casinos unable to file suspicious transaction reports with Canada's anti-money laundering watchdog, FINTRAC, for an entire year. This disruption hindered the province's capacity to monitor and report potential money laundering activities within its digital gambling platforms. The incident underscores significant vulnerabilities in the cybersecurity infrastructure of Ontario's online gambling sector and raises concerns about the effectiveness of existing anti-money laundering measures. The prolonged reporting blackout may have allowed illicit financial activity to go undetected during this period.

Privacy Commissioner signals draft age assurance guidance is coming

Jaime Cardy | Janice Philteos | Dentons Data

The Office of the Privacy Commissioner of Canada (OPC) has concluded its exploratory consultation on age assurance and is preparing to release draft guidance on the design and use of age-assurance systems. The consultation, conducted from June to September 2024, gathered over 40 responses from stakeholders across industry, civil society, academia, and international data protection authorities. Key themes emerged, including the need to differentiate between various age assurance methods, the importance of considering the potential impacts of these systems, and the necessity of a risk-based, proportional approach to their implementation. The OPC plans to issue formal guidance elaborating on its expectations for privacy-protective age assurance techniques, potentially engaging in further consultations before finalization.

Age assurance shouldn’t lead to harvesting of kids’ data: Irish privacy watchdog

Joel R. McConvey | Biometric Update

The Irish Data Protection Commission (DPC) has cautioned that age assurance measures should not lead to excessive collection of children's personal data. With Ireland's Online Safety Code, effective July 2025, mandating age verification for platforms hosting adult or violent content, the DPC emphasizes the need for privacy-preserving solutions. Deputy Commissioner Dale Sunderland highlighted that companies should avoid gathering large amounts of user data for age verification, advocating for methods that require minimal personal information. The DPC is collaborating with the European Commission on developing an EU-wide age assurance app and contributed to the European Data Protection Board's guidelines promoting data minimization in such processes. This stance aligns with broader efforts to balance child protection online with the safeguarding of individual privacy rights.

What are ‘nudification’ apps and how would a ban in the UK work?

Rachel Hall | The Guardian

The UK’s Children’s Commissioner, Dame Rachel de Souza, is calling for a ban on AI-powered “nudification” apps that generate deepfake nude images, often targeting women and girls. These tools raise serious privacy and safety concerns, especially for minors, and contribute to growing fears that drive young people—particularly girls—away from online spaces. While creating or possessing explicit images of children is already illegal, the apps themselves remain accessible and largely unregulated. Proposed measures include mandating risk assessments for AI tools to prevent misuse. However, experts warn that enforcing such a ban may be difficult due to the rapid pace of AI development and concerns over digital freedom.

Ofcom publishes statement on the protection of children online

Jane Pinho | Shona O’Donovan | Covington

On April 24, 2025, Ofcom released its final Children's Safety Codes of Practice and accompanying Risk Assessment Guidance under the UK's Online Safety Act 2023. These measures require online platforms likely to be accessed by children to complete a Children's Risk Assessment by July 24, 2025, evaluating potential harms such as exposure to pornography, self-harm, and eating disorder content. From July 25, 2025, subject to parliamentary approval, platforms must implement safety measures outlined in the Codes or demonstrate alternative approaches that effectively mitigate identified risks. Non-compliance could result in significant penalties, including fines up to £18 million or 10% of global turnover. While Ofcom describes these steps as a "reset" for online child safety, some critics argue that the measures may not go far enough to address all potential online harms.

Over 90 percent of consumers worry about protection of information from cyber criminals: KPMG survey

Bernese Carolino | Lexpert

A recent KPMG survey reveals that 91% of Canadian consumers are concerned about retailers' ability to protect their personal and financial information from cybercriminals. Nearly half of the respondents expressed discomfort with retailers sharing data about their shopping habits. The survey, which included 1,522 participants, also found that 57% of consumers find online shopping frustrating, citing issues like product mismatches and inconvenient return processes. Despite these concerns, 61% of consumers still prefer shopping in physical stores, valuing the ability to test products and receive them immediately. The findings underscore the importance for retailers to enhance cybersecurity measures and transparency to maintain consumer trust.

Lack of artificial intelligence policies puts GN at risk, privacy commissioner says

Jorge Antunes | Nunatsiaq News

Nunavut’s Information and Privacy Commissioner, Graham Steele, has expressed concern over the Government of Nunavut's (GN) lack of policies governing the use of artificial intelligence (AI), warning that employees may be unknowingly using AI tools in unethical ways. He highlighted the risk of government workers inadvertently exposing confidential information due to insufficient understanding of how AI systems operate. Steele emphasized the urgent need for clear guidelines to ensure AI is used properly and ethically within the GN. In response, Mark Witzaney, director of access to information and protection of privacy, stated that the GN is actively developing AI policies, with official guidelines expected to be completed later this year or early next year.

‘It feels dystopian:’ AI-generated content about election flooded online news void

Fakiha Baig | CTV News

The 2025 Canadian federal election witnessed an unprecedented surge of AI-generated content flooding social media platforms, exploiting the news vacuum created by Meta's ban on Canadian news due to the Online News Act. This environment facilitated the spread of deepfake videos and fraudulent ads impersonating reputable news outlets, often promoting cryptocurrency scams featuring fabricated endorsements from political figures like Prime Minister Mark Carney. Despite the volume of disinformation, experts from the Media Ecosystem Observatory observed limited evidence of widespread voter manipulation, attributing this to increased public awareness and skepticism towards online content. Nevertheless, the proliferation of such AI-driven misinformation underscores significant challenges in maintaining electoral integrity and highlights the need for robust regulatory frameworks to address the evolving digital landscape.

ICO calls for protections for 23andMe customer data

UK Information Commissioner’s Office

The UK Information Commissioner's Office (ICO) and Canada's Office of the Privacy Commissioner (OPC) have jointly called for stringent protections of 23andMe's customer data amid the company's bankruptcy proceedings. In a letter to the U.S. Trustee, they emphasized that both 23andMe and any potential buyers must comply with UK GDPR and Canada's PIPEDA, especially concerning sensitive genetic and health information. A U.S. bankruptcy judge has appointed a Consumer Privacy Ombudsman to oversee data handling during the process, a move welcomed by both regulators. The ICO and OPC are also conducting a joint investigation into a 2023 data breach at 23andMe, with the ICO issuing a provisional intent to fine the company £4.59 million. They have warned that failure to adhere to data protection laws could result in enforcement actions against 23andMe or any future data holders.

Bankrupt Genetic Data: Minimizing and Privacy-Protecting Data from the Start

Justin Sherman | EPIC

The Electronic Privacy Information Center (EPIC) has raised significant concerns over the potential sale of 23andMe’s extensive genetic data repository amid the company's bankruptcy proceedings. With over 15 million customers' sensitive genetic, health, and ancestry information potentially up for acquisition, EPIC emphasizes the urgent need for robust data privacy and security measures. They advocate for stringent data minimization practices, urging companies to collect only essential information and to establish clear data deletion and expiration policies to mitigate risks associated with unforeseen events like bankruptcy. EPIC also highlights the unique and immutable nature of genetic data, underscoring that once such information is compromised, individuals cannot simply change their DNA as they might a password or email address. This situation underscores the pressing necessity for comprehensive federal data privacy legislation to protect individuals' most personal information.

Navigating medical malpractice litigation: How PHIPA protects plaintiff privacy and rights

Carlo Panaro | Canadian Lawyer

Ontario's Personal Health Information Protection Act (PHIPA) plays a pivotal role in medical malpractice litigation by safeguarding patient privacy while delineating access to health records. PHIPA grants individuals the right to access their personal health information and mandates that health information custodians—such as hospitals and physicians—protect this data from unauthorized disclosure. In malpractice cases, plaintiffs must formally request medical records, but these may be redacted or incomplete, necessitating legal challenges to ensure full disclosure. Complexities arise when the patient is deceased or incapable; for instance, in the absence of an estate trustee, PHIPA allows a person managing the deceased's estate to consent to record release, though custodians often erroneously demand formal certificates. Additionally, audit trails, which log access to electronic health records, can be instrumental in establishing negligence but are sometimes improperly redacted, contravening PHIPA's transparency requirements. While PHIPA doesn't provide a private right of action for privacy breaches, Ontario courts have recognized common law claims for serious infringements, underscoring the act's significance in balancing privacy with the pursuit of justice.

Irish Data Protection Commission fines TikTok €530 million and orders corrective measures following Inquiry into transfers of EEA User Data to China

Irish Data Protection Commission

The Irish Data Protection Commission (DPC) has fined TikTok €530 million for violating the EU's General Data Protection Regulation (GDPR) by transferring European user data to China without ensuring adequate protection. The DPC found that TikTok failed to verify and demonstrate that personal data accessed by staff in China was afforded a level of protection equivalent to that guaranteed within the EU. Additionally, TikTok provided inaccurate information during the inquiry, initially denying that European user data was stored on Chinese servers, only to later admit that some data had been stored there. The company has been given six months to bring its data processing into compliance with EU standards or face suspension of data transfers to China. TikTok plans to appeal the decision, asserting that it has never provided European user data to Chinese authorities and highlighting its Project Clover initiative aimed at enhancing data security.

BC Court of Appeal says it lacks jurisdiction over appeal of arbitral award on worker privacy

Bernise Carolino | Canadian Lawyer Magazine

The British Columbia Court of Appeal has ruled it lacks jurisdiction to hear an appeal concerning an arbitral award related to employee privacy and workplace surveillance. The case involved Rehn Enterprises Ltd., a tree-falling contractor, which installed forward- and rear-facing dash cams with audio recording in company trucks used by employees commuting to worksites. The United Steelworkers union filed a grievance, and an arbitrator determined the surveillance was unreasonable, citing a high expectation of privacy for workers during transit and insufficient justification from the employer. Rehn Enterprises sought to appeal under Section 100 of the BC Labour Relations Code, arguing the arbitrator's decision involved a matter of general law. However, the Court of Appeal concluded that the arbitrator applied established legal principles to specific facts within the labour relations context, and thus, the matter fell outside its jurisdiction. The court emphasized that challenges to such arbitral decisions should be directed to the Labour Relations Board, which had already reviewed and upheld the arbitrator's ruling.

Co-op cyber attack: Staff told to keep cameras on in meetings

Joe Tidy | BBC

The Co-op has instructed its 70,000 employees to keep cameras on during virtual meetings and verify attendees amid an ongoing cyber attack, aimed at preventing unauthorized access. Internal systems requiring VPN access have been disabled, and staff are being urged not to share sensitive information on Teams or record calls. While Co-op describes its response as “proactive” and claims minimal impact, internal communications reveal stricter controls, including a halt to remote access and a push for in-person work. The attack coincides with a major ransomware incident at Marks & Spencer (M&S), believed to involve the DragonForce cyber crime service and potentially linked to the Scattered Spider group. UK authorities, including the Metropolitan Police and the National Cyber Security Centre, are now investigating and urging retailers to remain alert.

Previous

Week of 2025-05-12

Next

Week of 2025-04-21