Microsoft’s new AI development guidelines seek to create standards


There is no doubt that AI will play a vital role in our future, becoming ever more integrated into our lives. Of course, anyone who has seen The Matrix or the Terminator films will know that giving too much power to AI results in the end of civilization as we know it. No one wants that to happen, which is why Microsoft has released new guidelines to prevent it.

Okay, that is a flippant introduction, but there are genuine concerns about AI. Sci-fi scenarios aside, in the wrong hands the technology could cause real harm, and we still don't know where its development will ultimately lead as systems become more capable and autonomous.

Having clear standards for developing AI solutions is one way to ensure that potential risks are minimized, and that is what Microsoft's guidelines aim to achieve. Created in collaboration with Boston Consulting Group (BCG), the new document sets out development guidelines for AI technology.


Called the “Ten Guidelines for Product Leaders to Implement AI Responsibly,” the document aims to establish core standards that help us avoid a machine takeover. Or, more realistically, prevent AI from being used for nefarious activities such as cybercrime.


In a blog post, Alysa Taylor, Corporate Vice President of Industry, Apps, and Data Marketing at Microsoft, explains why Microsoft wrote the document:

“As AI becomes more entrenched in our daily lives, it is incumbent on all of us to be thoughtful and responsible in how we apply it for the benefit of people and society. A reasoned approach to responsible AI will be essential for every organization as this technology matures. As technical and product leaders seek to adopt responsible AI practices and tools, several challenges arise, including identifying the approach best suited to their organizations, products, and market.”

There are ten guidelines in total, which Microsoft groups into the following three categories:

  1. Evaluate and prepare: Assess the product's benefits, the technology, the potential risks, and the team.
  2. Design, build, and document: Review the impacts, unique considerations, and documentation practices.
  3. Validate and support: Select testing procedures and support mechanisms to ensure products work as intended.

You can read and download the document from Microsoft Azure here.

Tip of the day: Because of the various problems that can arise with microphones, it is often necessary to run a microphone test. Microsoft's operating system does not make it particularly intuitive to listen to microphone playback or route microphone input through your speakers. In our tutorial, we show you how to hear yourself on the microphone in just a few clicks.



Sara H. Byrd