As AI booms, reducing risks of algorithmic systems is a must, says new ACM brief

January 25, 2023

AI might be booming, but a new brief from the Association for Computing Machinery (ACM)’s global Technology Policy Council, publishing tomorrow, notes that the ubiquity of algorithmic systems “creates serious risks that are not being adequately addressed.”

According to the ACM brief, which the organization says is the first in a series on systems and trust, perfectly safe algorithmic systems are not possible. However, achievable steps can be taken to make them safer and should be a high research and policy priority of governments and all stakeholders.

The brief’s key conclusions:

  • To promote safer algorithmic systems, research is needed on both human-centered and technical software development methods, improved testing, audit trails, and monitoring mechanisms, as well as training and governance.
  • Building organizational safety cultures requires management leadership, focus in hiring and training, adoption of safety-related practices, and continuous attention.
  • Internal and independent human-centered oversight mechanisms, both within government and organizations, are necessary to promote safer algorithmic systems.
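The brief calls for audit trails and monitoring mechanisms but does not prescribe an implementation. As a minimal sketch of what an audit trail for algorithmic decisions can look like, the decorator, file name, and loan-decision rule below are purely illustrative assumptions, not part of the ACM proposal:

```python
import functools
import json
import time

def audited(log_path):
    """Append an audit record (timestamp, inputs, output) for every call."""
    def decorator(fn):
        @functools.wraps(fn)
        def wrapper(*args, **kwargs):
            result = fn(*args, **kwargs)
            record = {
                "ts": time.time(),
                "fn": fn.__name__,
                "args": repr(args),
                "kwargs": repr(kwargs),
                "result": repr(result),
            }
            # One JSON object per line, so auditors can replay decisions later.
            with open(log_path, "a") as f:
                f.write(json.dumps(record) + "\n")
            return result
        return wrapper
    return decorator

@audited("decisions.jsonl")
def approve_loan(credit_score, income):
    # Hypothetical decision rule, for illustration only.
    return credit_score > 650 and income > 30000
```

Every consequential decision then leaves a durable record that internal or independent overseers can inspect, which is the kind of human-centered oversight the conclusions describe.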

AI systems need safeguards and rigorous review

Computer scientist Ben Shneiderman, Professor Emeritus at the University of Maryland and author of Human-Centered AI, was the lead author on the brief, which is the latest in a series of short technical bulletins on the impact and policy implications of specific tech developments. 

While algorithmic systems — which go beyond AI and ML technology and involve people, organizations and management structures — have improved an immense number of products and processes, he noted, unsafe systems can cause profound harm (think self-driving cars or facial recognition).

Governments and stakeholders, he explained, need to prioritize and implement safeguards in the same way a new food product or pharmaceutical must go through a rigorous review process before it is made available to the public.

Comparing AI to the civil aviation model

Shneiderman compared creating safer algorithmic systems to civil aviation — which still has risks but is generally acknowledged to be safe.

“That’s what we want for AI,” he explained in an interview with VentureBeat. “It’s hard to do. It takes a while to get there. It takes resources, effort and focus, but that’s what’s going to make people’s companies competitive and make them durable. Otherwise, they will succumb to a failure that will potentially threaten their existence.”

The effort towards safer algorithmic systems is a shift from focusing on AI ethics, he added.

“Ethics are fine, we all want them as a good foundation, but the shift is towards what do we do?” he said. “How do we make these things practical?”

That is particularly important when dealing with applications of AI that are not lightweight — that is, consequential decisions such as financial trading, legal issues and hiring and firing, as well as life-critical medical, transportation or military applications.

“We want to avoid the Chernobyl of AI, or the Three Mile Island of AI,” Shneiderman said. “The degree of effort we put into safety has to rise as the risks grow.”

Developing an organizational safety culture

According to the ACM brief, organizations need to develop a “safety culture that embraces human factors engineering” — that is, how systems work in actual practice, with human beings at the controls — which must be “woven” into algorithmic system design.

The brief also noted that methods that have proven effective in cybersecurity — including adversarial “red team” tests, in which expert users try to break the system, and “bug bounties” offered to users who report omissions and errors capable of leading to major failures — could be useful in making algorithmic systems safer.
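The brief leaves red-team testing at the level of practice; one lightweight way to operationalize it in code is a robustness probe that perturbs inputs near a decision boundary and collects any decision flips for human review. The toy classifier and thresholds below are assumptions for illustration, not taken from the brief:

```python
import random

def toy_classifier(features):
    """Stand-in for a deployed model: scores a two-feature input."""
    score = 0.6 * features[0] + 0.4 * features[1]
    return "approve" if score >= 0.5 else "deny"

def red_team(model, base_input, noise=0.05, trials=200, seed=0):
    """Perturb each feature slightly and collect inputs that flip the decision."""
    rng = random.Random(seed)  # fixed seed so a found failure is reproducible
    baseline = model(base_input)
    flips = []
    for _ in range(trials):
        perturbed = [x + rng.uniform(-noise, noise) for x in base_input]
        if model(perturbed) != baseline:
            flips.append(perturbed)
    return baseline, flips

# An input sitting on the decision boundary is exactly where flips appear.
baseline, flips = red_team(toy_classifier, [0.5, 0.5])
```

A non-empty `flips` list is the red team’s report: concrete inputs where a tiny, plausible change reverses the outcome, which is the sort of failure a bug bounty would reward for surfacing.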

Many governments are already at work on these issues, such as the U.S.’s Blueprint for an AI Bill of Rights and the EU AI Act. But for enterprise businesses, these efforts could offer a competitive advantage, Shneiderman emphasized.

“This is not just good guy stuff,” he said. “This is a good business decision for you to make and a good decision for you to invest in the notion of safety and the larger notion of a safety culture.”

The post As AI booms, reducing risks of algorithmic systems is a must, says new ACM brief appeared first on VentureBeat.
