
Posted by Tom Morgan on Apr 16, 2024 in Microsoft Teams

Microsoft updates Microsoft Teams App Store validation guidance for the AI era

Microsoft has updated the Teams Store validation guidelines for anyone wishing to publish their apps to the Microsoft Teams App Store.

The validation guidelines help developers understand why their app might be rejected, and what they can do about it. AI-generated content is a new area of concern for Microsoft that will now be scrutinised.

There are four main areas for developers to be aware of:

  • You need to tell users that you are using AI-generated content. This needs to happen both before the user adds the app (so, probably in the description) and during usage of the app. Although Copilot Chat has a nice way of showing users that content is AI-generated, developers of Teams apps can’t opt in to that message (yet?), so I think the only way this can be realised today is by adding something to the end of each message, or maybe to the introductory welcome message.
  • Your app must not “generate, contain, or provide access to inappropriate, harmful, or offensive AI generated content”. Some ways to prevent this that the validation team will likely be looking for include:
    • Using the Teams AI Library which includes an implementation of Microsoft’s Responsible AI engine. This is likely going to be the quickest way to satisfy the validation team.
    • Using moderation hooks.
    • Adding conversation sweeping capabilities so you can monitor and intervene with humans if needed.
  • Your app needs to include a way for users to report “inappropriate, harmful or offensive content” to you. This can be using an in-app feature (a “Report this message” button for instance), but could be as simple as making sure the app description has a mail link or URL that can be used for this purpose.
  • If concerns are reported, you must “take timely action”. There is no guidance as to what this means in practice, but to satisfy the validation team it might be a good idea to think about this now and prepare a process document outlining what steps would be taken in the event of a complaint.
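To make the list above concrete, here is a minimal sketch in TypeScript of what a Teams app might do today: append a disclaimer to each AI-generated message, run a moderation hook before sending, and capture user reports for human follow-up. This is not the official Teams AI Library API; every name here (`prepareAiMessage`, `ModerationHook`, `reportMessage`, the disclaimer text) is an illustrative assumption, and a real app would call a service such as Azure AI Content Safety rather than a naive blocklist.

```typescript
// Disclaimer appended to every AI-generated message, since Teams apps
// can't currently opt in to Copilot Chat's built-in AI label.
const DISCLAIMER = "\n\n_AI-generated content. Verify important information._";

// A moderation hook returns true when the text is safe to send.
type ModerationHook = (text: string) => boolean;

// Naive blocklist moderator, purely for illustration. A production app
// would call a moderation service (e.g. Azure AI Content Safety) here.
const blocklist = ["offensive-term-1", "offensive-term-2"];
const blocklistModerator: ModerationHook = (text) =>
  !blocklist.some((term) => text.toLowerCase().includes(term));

// Wraps model output: moderate first, then label it as AI-generated.
// Returns null when blocked so the caller can send a fallback instead.
function prepareAiMessage(
  modelOutput: string,
  moderators: ModerationHook[],
): string | null {
  for (const moderate of moderators) {
    if (!moderate(modelOutput)) {
      return null;
    }
  }
  return modelOutput + DISCLAIMER;
}

// In-app "Report this message" handler sketch: store reports so a human
// can review them and "take timely action".
const reports: { messageId: string; reason: string }[] = [];
function reportMessage(messageId: string, reason: string): void {
  reports.push({ messageId, reason });
}

const ok = prepareAiMessage("Here is your meeting summary.", [
  blocklistModerator,
]);
const blocked = prepareAiMessage("offensive-term-1 ...", [blocklistModerator]);
reportMessage("msg-123", "Inappropriate content");
```

Keeping moderation as an array of hooks means you can layer checks (blocklist, hosted moderation API, conversation sweeping) without changing the send path.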

I think it’s good that Microsoft have responded in this way and updated the guidelines so that developers are aware of new things that might prevent their Teams apps from being accepted into the Store. I do think, though, that there is scope for Microsoft to make things even easier for developers, such as by allowing them to signify, when sending a message, whether it contains AI-generated content, and then rendering a disclaimer message in the UI automatically. This would mean ISV-built applications would look similar to Copilot Chat, and users would quickly learn to check in the same place for a consistent message about AI.
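That suggestion could be as small as one extra field on the outgoing message. The sketch below is entirely hypothetical: Teams does not support an `aiGenerated` flag today, and `OutgoingActivity` here is a made-up shape, not the real Bot Framework Activity schema.

```typescript
// Hypothetical design only: a flag on the outgoing message that the
// Teams client could use to render a consistent, Copilot-style AI
// disclaimer automatically, instead of each app appending its own text.
interface OutgoingActivity {
  type: "message";
  text: string;
  aiGenerated?: boolean; // hypothetical field, not in today's schema
}

const activity: OutgoingActivity = {
  type: "message",
  text: "Here is a summary of the last three messages.",
  aiGenerated: true, // client, not the app, would render the disclaimer
};
```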

Written by Tom Morgan

Tom is a Microsoft Teams Platform developer and Microsoft MVP who has been blogging for over a decade. Find out more.
Buy the book: Building and Developing Apps & Bots for Microsoft Teams. Now available to purchase online with free updates.
