National Campus and Community Radio Association Statement on use of AI

Below is a statement produced by the NCRA/ANREC Board of Directors on the use of Artificial Intelligence in our sector. It is posted here because it will inform our eventual approach to developing policies on the use of AI at our organization and station. You can find more information on our current practices regarding AI at our station here.

[FOR IMMEDIATE RELEASE]

Ottawa, May 22nd, 2024

National Campus and Community Radio Association Statement on use of AI

At the Station Manager Summit (SMS) hosted by Radio Sidney in Sidney, BC, attendees engaged in a lively discussion on the use of Artificial Intelligence (AI) and large language model tools such as ChatGPT. The discussion followed a presentation by Futuri Media on its product Futuri Audio AI, which creates AI-powered synthetic DJ voices. The National Campus and Community Radio Association/Association nationale des radios étudiantes et communautaires (NCRA/ANREC) was asked for its stance on AI. The following statement outlines our perspective and explores how our members may adopt these tools and navigate the ethical debates surrounding their use. While this statement serves as a general overview, each case should be evaluated individually.

The NCRA/ANREC actively embraces new tools and technology to promote and improve its work, and strongly supports the organization's staff in exploring AI tools in their own work, such as using AI to summarize meetings. We believe that AI and AI-powered large language models like OpenAI's ChatGPT can bolster productivity and accessibility. We have been excited to see how our members have used these tools to strengthen their audio storytelling and social media content.

The introduction of AI synthetic voices in commercial radio has raised questions about the soul of the sector. The NCRA/ANREC Board aims to address the ethical considerations surrounding the use of AI tools by our members. For example, some of our smaller stations with limited capacity could use synthetic voices and AI tools to create hourly local news or weather updates. Additionally, the text-to-voice capabilities of AI could bring a voice to members of our community who are unable to speak themselves. We encourage our members to engage their audiences to understand their demand for, or tolerance of, synthetic voices on radio.

In the Broadcasting Act, the community broadcasting element is the part of the Canadian broadcasting system in which members of a community participate in governance and programming. The Act also states that the Canadian broadcasting system should: 

(ii) encourage the development of Canadian expression by providing a wide range of programming that reflects Canadian attitudes, opinions, ideas, values and artistic creativity, by displaying Canadian talent in entertainment programming… 

As you weigh the use of AI tools, we strongly encourage members to keep in mind that this is a central tenet of our mandate, as is our sector's goal of providing space for local voices. We also know that we need to continually evolve, and we are aware that for some of our campus stations and former instructional school stations, giving students knowledge of and experience with the technology used by the commercial sector is a key part of their educational efforts. We are also aware of the disproportionate impact these tools and automation will have on younger generations, especially as AI threatens the employment prospects of young people by eliminating the entry-level jobs that would give them their career starts. Maintaining access and opportunities for students and young people in our community is key to our role. Additionally, the use of AI can be both intimidating and a barrier for those unfamiliar with the tools. Any use of these tools should include informed, accessible and compassionate training.

Another area we know concerns our members relates to the copyright abuses alleged against large language model AI systems like OpenAI's. For example, the New York Times recently took OpenAI to court on this issue. Our members are contributors and amplifiers of local cultural expression. There have been significant concerns that large language model AIs have been trained on extensive works (including art and music) used without permission from, or compensation to, the creators of that content. We have seen several stations adopt AI art generation to help create website and social media graphics, or even AI-generated sounds to assist with advertising production, radio plays and the like. The NCRA/ANREC encourages our members to have an internal discussion on the ethical use of tools created from art or other cultural content without the express permission of, or payment to, those creators.

Spoken word programming is where we predict AI content will first be noticed in our sector. For news and current affairs programming, we would like to draw members' attention to some of the limitations of these tools. ChatGPT and others have been found to hallucinate, which refers to the system stating as fact something that is not true. We want to remind our members of their obligation under the Radio Regulations, which prohibit the broadcast of "(d) any false or misleading news". We strongly recommend that our members review any content created using AI tools like ChatGPT for factual accuracy. Please also consider that many of these large language models were trained on data found on the internet, data largely from the West and primarily in English. Advocates have appropriately raised concerns that these tools often repeat the biases and prejudices common to the West; specifically, the data sets they were trained on can replicate racism, sexism and cis-centrism. We recommend that stations using these tools undertake meaningful and ongoing equity reviews, with community involvement and participation, to identify the ways in which AI adoption may negatively impact the marginalized communities they serve. We also suggest disclosing AI use.

We are aware that some of our stations are unionized, and those stations are looking specifically at how to incorporate safeguards for these tools. We recommend that stations consider introducing policies for the effective and fair use of these tools to protect their staff, volunteers and community. The NCRA/ANREC is exploring a session at the National Community Radio Conference to develop such guidance for stations to use.

We know our members have used AI tools in multiple ways, ranging from time and calendar management tools such as Reclaim.AI to Canva and its magic resize tool, which turns one piece of content into the many sizes and shapes needed across social media platforms. Some stations have also taken advantage of new AI translation tools to produce more third-language programming. We are excited to see that audio transcription software has improved and now enables volunteers to make their interviews accessible in written formats. There are also new tools that make audio editing much easier and more accessible for volunteers learning a new skill. If you want to see what your colleagues across the sector are using, or have a great suggestion, go to www.AI.NCRA.ca and share your suggestions with Barry.

The NCRA/ANREC Board is elected by you, the members of the association. Board members are staff and volunteers at NCRA/ANREC member stations. Our individual opinions on AI use differ, just as members' own opinions are likely to differ. For example, some of our Directors hold the opinion that AI tools should be banned in principle because of these ethical concerns. On the whole, however, we agree that the responsible and measured use of AI and large language models will help our sector, and we encourage our members to independently explore these tools as they arrive. They could boost accessibility, elevate productivity and support stations with limited capacity. As discussed above, these tools are not without risks and may raise ethical issues. But considering ethical issues, working to elevate community members, and making occasional grammatical errors are what set us apart from robots.

This was not written by a robot. 

Signed: 

The NCRA/ANREC Board of Directors

Are you a volunteer who has thoughts regarding the potential use of AI at CHLY? Shoot me a message.