Tuesday, March 14, 2023

An unknowable something ...

Photo: A Microsoft office building (JeanLucIchard / Shutterstock)

As people are now finding out, we may never know how AI works: for the technology to evolve in real time, the software has to write software in real time, at a pace no human coder can match. The mission to instill ethics into an unknowable something therefore becomes little more than a wish list, and it is made all the more worrisome by the news that Microsoft has just disbanded the group responsible for doing exactly that for Bing, the rapidly evolving Microsoft version of ChatGPT, the bot transforming society around the clock.

Microsoft Scraps Entire Ethical AI Team Amid AI Boom

As part of the tech giant's ongoing layoffs, the company has cut its Ethics and Society team, which had focused on aligning AI products with responsible policy.

Microsoft is currently in the process of shoehorning text-generating artificial intelligence into every single product that it can. And starting this month, the company will be continuing on its AI rampage without a team dedicated to internally ensuring those AI features meet Microsoft’s ethical standards, according to a Monday night report from Platformer.

What could possibly go wrong?

Microsoft has scrapped its whole Ethics and Society team within the company’s AI sector, as part of ongoing layoffs set to impact 10,000 total employees, per Platformer. The company maintains its Office of Responsible AI, which creates the broad, Microsoft-wide principles to govern corporate AI decision making. But the ethics and society taskforce, which bridged the gap between policy and products, is reportedly no more.

Corporatespeak ...

Microsoft remains committed to developing and designing AI products and experiences safely and responsibly. As the technology has evolved and strengthened, so has our investment, which at times has meant adjusting team structures to be more effective. For example, over the past six years we have increased the number of people within our product teams who are dedicated to ensuring we adhere to our AI principles. We have also increased the scale and scope of our Office of Responsible AI, which provides cross-company support for things like reviewing sensitive use cases and advocating for policies that protect customers.

Whistling past the graveyard applies, right?

Note: Bing's chatbot runs on GPT-4, OpenAI's successor to the GPT-3.5 model behind the original ChatGPT, and a significantly more capable update.
