INDY AUTONOMOUS CHALLENGE

It’s more than three years since AVI reported on the then-new Indy Autonomous Challenge (IAC) (see Circuit Training, October 2020, p16). Since then, the competition has begun to make good on its mission to prove the safety and performance of AV technologies in a demanding race environment, solve edge-case scenarios, grow new student engineering talent and build some much-needed goodwill for autonomous technologies among the general public. Races have expanded from no-right-turn US ovals to the more demanding road course of Monza in Italy and, new this year, the famous, partially tree-covered hillclimb at the Goodwood Festival of Speed. Along the way, maximum speeds have increased to 290km/h, overtaking moves are no longer confined to low-risk straightaways, and the competing university teams have advanced their AI drivers to the point where IAC needed to build a new car.

Enter the AV-24

Work on the new AV-24 machine began last summer. The AV-21’s Dallara carbon-fiber chassis and Honda-derived engine have received new brake- and steer-by-wire systems to better handle the demands of road course racing. Another major area of improvement relates to reducing the quantity, and improving the reliability, of the copious onboard wiring and connectors. “That continues to be an area that needs to evolve even further, because automotive-grade connectors for some of the equipment that you need – network switches, wireless modems, etc – are still hard to source,” explains IAC president Paul Mitchell.

These measures have also cut the car’s weight, as has removing as many as possible of the control units that went with individual sensors. “The feeds from sensors or other systems are not getting processed by a black box from the supplier,” Mitchell says. “Rather, we have a direct feed into the central computer and let the teams use the raw data coming from the lidar or radar, for example. They can pick and choose how they want to process it and use it.”

The sensor stack is heavily revised. Out go the AV-21’s three Luminar Hydras, to be replaced by four Volvo EX90-style Luminar Irises – including a rear-facing unit that will help with perception during overtaking moves on road courses. The two 4D radars are also new, with Continental ARS 548 RDIs now on board.

Pictured: the previous AV-21 sensor stack and the new AV-24 sensor stack

“The new point cloud radar sensor will enable us to do things like visual odometry with the radar point clouds, thanks to the embedded speed information,” says C K Wolfe, a technical program manager at the University of California, Berkeley, who leads the simulation and vehicle dynamics subteam for AI Racing Tech (ART). “The different configuration of the lidars, with a larger field of view and a rear lidar sensor, allows for more interesting algorithms on the sensor fusion side in terms of detecting and classifying opponents, or developing your vision stack to navigate with agents overtaking from behind. The upgrades have incorporated a lot of feedback from the different teams for the things that we want to see, to get the edge-case performance that we want out of the vehicles.”
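The “embedded speed information” Wolfe refers to is the per-return Doppler (range-rate) measurement that a 4D point-cloud radar provides. As a rough illustration of how that can be turned into odometry – a sketch of the general technique, not ART’s actual pipeline – the Python below fits an ego-velocity vector to the radial speeds of returns assumed to come from static objects, with a small RANSAC loop (an assumed design choice) to reject returns from other moving cars. The function name, sample size and threshold are illustrative placeholders.

```python
import numpy as np

def estimate_ego_velocity(detections, n_iters=200, inlier_thresh=0.25, rng=None):
    """Estimate sensor-frame ego velocity from a single 4D-radar scan.

    detections: (N, 4) array of [x, y, z, v_radial] per return, where
    v_radial is the Doppler (range-rate) measurement in m/s.
    Assumes most returns come from static objects (walls, fences, kerbs),
    for which v_radial is approximately -unit_dir . v_ego.
    """
    rng = np.random.default_rng() if rng is None else rng
    xyz = detections[:, :3]
    v_r = detections[:, 3]
    dirs = xyz / np.linalg.norm(xyz, axis=1, keepdims=True)  # unit line-of-sight vectors

    best_inliers = None
    for _ in range(n_iters):
        idx = rng.choice(len(v_r), size=3, replace=False)       # minimal sample: 3 returns
        v_ego, *_ = np.linalg.lstsq(-dirs[idx], v_r[idx], rcond=None)
        residuals = np.abs(-dirs @ v_ego - v_r)
        inliers = residuals < inlier_thresh                      # static-world returns
        if best_inliers is None or inliers.sum() > best_inliers.sum():
            best_inliers = inliers

    # Refit on all inlier returns for the final estimate
    v_ego, *_ = np.linalg.lstsq(-dirs[best_inliers], v_r[best_inliers], rcond=None)
    return v_ego, best_inliers
```

In a race stack, an estimate like this would typically be fused with GNSS, IMU and wheel-speed data rather than used on its own.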
Aside from Berkeley, ART incorporates students from California’s UC San Diego, Carnegie Mellon in Pennsylvania and the University of Hawai’i Maui College. It was the first team to volunteer an AI test driver for the AV-24, beginning with shakedown sessions at Lucas Oil Indianapolis Raceway Park in November 2023.

With six Allied Vision Mako G-319C cameras and four VectorNav VN-310 GNSS antennas alongside its lidars and radars, the new sensor stack provides a wealth of information to the AI drivers. Wolfe says that the exact algorithm used in competition will differ depending on the track and the conditions on race day. “Having redundancy in these systems is helpful because this is an expensive, completely custom-built race car,” she notes. “From a research perspective, you can try out different approaches in terms of the computer vision stack and choose what you want to use. You can also validate the accuracy of the different systems against each other. What we run in the race may be a leaned-down version of that stack. We may just do a camera-lidar projection for the depth and vision piece, or run segmentation and then just sample the lidar points within certain areas where we know the other agent is.

“On a road course, the features are different from what you would see at an oval,” she continues. “In the past, when we had the side radar sensors, we ran a wall-detection algorithm. The environment is repetitive on an oval and you know where the wall is going to be. But a road course such as Monza or Putnam Park [in Indiana] has varied features, so you have to take the environment into account when you’re dealing with the perception stack. The way I’m determining where the chicanes are, or how I find those track edges, may differ depending on the context of the problem or how I’m trying to approach vision for that track.”

Simulation boost

IAC’s nine teams have been hard at work adapting their AI drivers to the possibilities of the new sensor stack and increased on-track competition, with IAC aiming to get three or more cars together on track as soon as possible.
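The “camera-lidar projection for the depth and vision piece” that Wolfe mentions is a standard fusion step: lidar returns are transformed into the camera frame and projected through the camera intrinsics so that image pixels – or a detection box around another car – can be assigned a range. The sketch below illustrates only that geometry; the transform, intrinsic matrix and function name are placeholder assumptions, not the team’s calibration or code.

```python
import numpy as np

def project_lidar_to_image(points_lidar, T_cam_from_lidar, K, image_size):
    """Project lidar points into a camera image to attach depth to pixels.

    points_lidar:      (N, 3) xyz points in the lidar frame
    T_cam_from_lidar:  (4, 4) extrinsic transform, lidar frame -> camera frame
    K:                 (3, 3) camera intrinsic matrix
    image_size:        (width, height) in pixels
    """
    # Transform lidar points into the camera frame
    pts_h = np.hstack([points_lidar, np.ones((len(points_lidar), 1))])
    pts_cam = (T_cam_from_lidar @ pts_h.T).T[:, :3]

    # Keep only points in front of the camera
    pts_cam = pts_cam[pts_cam[:, 2] > 0.1]

    # Pinhole projection: u = fx*x/z + cx, v = fy*y/z + cy
    uv_h = (K @ pts_cam.T).T
    uv = uv_h[:, :2] / uv_h[:, 2:3]
    depth = pts_cam[:, 2]

    # Keep only points that fall inside the image bounds
    w, h = image_size
    in_img = (uv[:, 0] >= 0) & (uv[:, 0] < w) & (uv[:, 1] >= 0) & (uv[:, 1] < h)
    return uv[in_img], depth[in_img]
```

Sampling the projected depths that fall inside a segmentation mask or detection box then yields the kind of per-agent range estimate Wolfe describes for the leaned-down race stack.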