
Diversity key as AI increasingly integrated into safety systems

Bias in AI can surface in a few different ways. It is often born from a lack of understanding of the type of data needed to solve the problem at hand, or from failing to supply the system with a sufficiently diverse set of data and scenarios.

“If you do not have data that accurately represents the real world, let’s say, in terms of weather conditions, in terms of different types of highway structures, in terms of different types of urban intersections, then that means that the vehicle will not be properly prepared to react in those situations,” Cvijetic said. “If your system has not been trained on this type of data, then you’re introducing ambiguity into the situation that the vehicle is not trained for, and that needs to be addressed.”
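Cvijetic's point about scenario coverage can be made concrete with a simple data audit. The sketch below is not from the article; the scenario attributes, field names and the 1 percent threshold are illustrative assumptions. It counts how often each weather, road-type and intersection category appears in a labeled driving dataset and flags the categories that are barely represented, the kind of gap that would leave a vehicle "not trained for" those situations.

    # Minimal sketch (illustrative only): auditing a driving dataset for
    # scenario coverage before training. Field names, categories and the
    # 1% threshold are assumptions, not any supplier's actual schema.
    from collections import Counter

    def coverage_report(samples, keys=("weather", "road_type", "intersection")):
        """Count how often each scenario attribute value appears in the dataset."""
        return {key: Counter(sample.get(key, "unknown") for sample in samples)
                for key in keys}

    def flag_underrepresented(report, total, min_share=0.01):
        """Flag attribute values that fall below a minimum share of the data."""
        flags = []
        for key, counts in report.items():
            for value, count in counts.items():
                if count / total < min_share:
                    flags.append((key, value, count))
        return flags

    if __name__ == "__main__":
        # Toy stand-in for labeled driving scenes.
        dataset = (
            [{"weather": "clear", "road_type": "highway", "intersection": "none"}] * 950
            + [{"weather": "rain", "road_type": "urban", "intersection": "four-way"}] * 45
            + [{"weather": "snow", "road_type": "urban", "intersection": "roundabout"}] * 5
        )
        report = coverage_report(dataset)
        for key, value, count in flag_underrepresented(report, len(dataset)):
            print(f"Underrepresented: {key}={value} ({count} samples)")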

Bias is also born from a lack of diversity in the development team.

It could be as simple as a younger engineering team that might not consider the needs of a 100-year-old end user, or a team in San Francisco not considering that the technology also needs to work in China. Or it could be as safety-critical as a room full of male engineers not considering the need for differently shaped crash-test dummies.

“It’s about who’s looking at this data, who’s annotating the data,” AEye’s Vijayan said. “It’s so important that the design happens in a way that it is adapted for different kinds of people.”

Reducing bias requires diverse engineering teams, frequent training on how bias can creep into AI and, to some extent, regulatory measures.

“The more diversified your team is, the better,” Vijayan said. “As people, we need to be aware: Every person is biased in his or her own way. Knowing that, acknowledging that and being conscious about it also enables these [biases] to be removed.”

German megasupplier Bosch, for instance, conducts frequent “lunch and learns” with key stakeholders across the company to educate its associates. Recently, the supplier addressed artificial intelligence and inclusion.

“Once we understand our own selves and our own self-perspectives, we can truly try to be conscious enough to mitigate that,” said Carmalita Yeizman, chief diversity, equity and inclusion officer for Bosch in North America.

The Center for Automotive Diversity, Inclusion & Advancement encourages “trying to build diversity into the team so that you don’t have that groupthink,” Thompson said, “but also building diversity in that design team so that you’re getting as much representation as possible to avoid blind spots.”

It’s a combination of “if you don’t have diversity on that team, you’re not even going to be aware of what those blind spots are,” she said, and “being aware of all of the different conditions [or use cases] that can come up.”

Ongoing efforts in the European Union would create regulatory frameworks to assess the risk of bias in artificial intelligence and propose best practices to ensure that the AI implemented in systems, including those in vehicles, is comprehensive.

“This is so important to the core business that we do, and to doing it the right way, and to the success of the product, to aligning with regulation, to making our end customers comfortable and empowered to use these products,” Cvijetic said. “I think it underpins a lot of the reasons why we do this in the first place.”
