As you know, we have been evaluating the use of Otter.ai and AI bots generally in Open edX community meetings, and we have come to the following decision. Given how variable and unpredictable this technology still is, and the differing laws and regulations around the world on recording and transcribing people’s image and voice, we are prohibiting the use of AI bots in Open edX working groups and any other community-wide, public meetings.
That said, recordings and transcripts are vital for the flow of information across our community. To help make sure this is done in an effective and compliant way, Axim will provide Google Meet and Zoom links, and recordings and transcripts will be produced via the native functionality those tools provide. Meeting owners will be responsible for making recordings and transcripts public. Moving forward, all community working groups and any ad-hoc public meetings will have their meeting link provided by the Axim team; we are currently investigating the best way of doing this and will be in touch with affected working groups soon.
Community members are asked to remove their AI bots from all public Open edX meetings. Additionally, meeting owners are responsible for denying AI bots access to their meetings, or removing them.
That makes perfect sense to me.
As you know, one problem with technology like this is that you would be sharing the information from the meeting with a third party; another is that the results it produces can be misleading.
I was only wondering whether you had considered a more indirect approach: running an LLM on a private server, where we know the data will never be shared with a third party, and then simply letting users paste a transcript into it for a summary, at their own risk of the result being inaccurate.
I ask because I feel that being able to summarize a meeting, or generate some content or analysis from it, can be genuinely useful.
Assuming you don’t want to do this community-wide, would it be permitted for us to do it in a 2U context: taking the transcript from a meeting afterwards and running it through an LLM in such a way that we are guaranteed not to share it with a third party? I see a lot of benefits in that. Since we already have the transcript, I would assume we are probably allowed to do what we want with it internally, as long as we don’t share the information with a third party?
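To make the idea concrete, here is a rough sketch of the kind of workflow I have in mind, assuming a model self-hosted with Ollama on the same machine; the endpoint, the model name, and the prompt are illustrative assumptions on my part, not a recommendation:

```python
# Rough sketch: summarize a saved meeting transcript with a locally hosted LLM.
# Assumes an Ollama server on localhost and a locally pulled model named
# "llama3" -- both are assumptions. The transcript never leaves the host.
import json
import sys
import urllib.request

def summarize_transcript(path: str, model: str = "llama3") -> str:
    with open(path, encoding="utf-8") as f:
        transcript = f.read()

    prompt = (
        "Summarize the following meeting transcript as a short list of "
        "decisions and action items. Note that the summary may be "
        "inaccurate and should be checked against the transcript.\n\n"
        + transcript
    )

    # Call the local Ollama HTTP API; nothing is sent to a third party.
    request = urllib.request.Request(
        "http://localhost:11434/api/generate",
        data=json.dumps({"model": model, "prompt": prompt, "stream": False}).encode(),
        headers={"Content-Type": "application/json"},
    )
    with urllib.request.urlopen(request) as response:
        return json.loads(response.read())["response"]

if __name__ == "__main__":
    # Usage: python summarize_transcript.py path/to/transcript.txt
    print(summarize_transcript(sys.argv[1]))
```

Everything here stays on the host, and the resulting summary would of course still carry the usual accuracy caveat.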
Thank you for the great question! Our primary concern is that an AI bot could be recording a person’s words without their knowledge; the native Meet/Zoom recording tools clearly indicate to meeting attendees that they are being recorded, so they can take action as appropriate, unlike the AI bots we’ve seen, which can hide in a long list of meeting attendees. Our plan is to post recordings and transcripts on working group wiki pages; we, or others, may choose to run those through an LLM to provide summaries. Your point about sharing data with a third party is well taken; we are looking into ways we could give attendees clarity about how we manage the recordings and transcripts.