A new report paints a worrying picture of how artificial intelligence (AI) systems are using Canadian journalism. The analysis found that 50 per cent of AI-generated responses included at least one Canadian link, yet those sources were attributed just 28 per cent of the time. The findings raise serious and urgent questions about copyright and market-based licensing, and about how AI will affect the journalism industry itself.
The federal government presented the report during a national summit in Banff, a day-long conference exploring the intersection of artificial intelligence and cultural policy. Welcoming the findings, Culture Minister Mark Miller spoke to the need for urgency while acknowledging the “legitimate questions” that copyright and market dynamics pose for AI-generated content. He stressed that these issues remain important even as AI rapidly enters the marketplace.
In a pointed critique, the researchers warned that AI’s reliance on journalism could “accelerate the economic decline of the journalism it relies on,” suggesting a detrimental cycle for media outlets. While outgoing links can, and often do, lead users back to original sources, consumers rarely encounter any visible marker of the journalism they are consuming. As the report put it, “Links provide a pathway back to the source, but consumers reading the response itself rarely see an indication of whose journalism they are consuming.”
The report’s recommendations have inspired swift action from Canadian media organizations. A coalition that includes some of the country’s largest news organizations, among them The Canadian Press, Torstar, The Globe and Mail, Postmedia and CBC/Radio-Canada, is currently suing OpenAI in an Ontario court. The suit seeks to address the problems created by AI systems that copy and reuse their work without attribution or compensation.
In his keynote address at the summit, Artificial Intelligence Minister Evan Solomon said creators should feel assured that AI development will proceed with proper guardrails. He noted that the government heard from thousands of creators during this summer’s consultation process, a sign of its readiness to engage with these emerging, and no less important, issues.
Minister Solomon remarked, “creators need assurance that AI will be developed with guardrails,” reinforcing the government’s commitment to fostering a responsible AI landscape that respects the rights of content creators.
As discussions deepen on how AI and journalism can work together, both ministers remain “open to these conversations.” These dialogues have the potential to shape more effective regulations and practices for the new technology, and they will influence the ways AI intersects with institutional media.