
Self-driving cars are coming. Chemical makers are racing to keep up


Credit: Nuro

Nuro’s driverless R2 vehicle is in testing in California. Its predecessor, the R1 (shown), delivered groceries in Scottsdale, Arizona.

In brief

A convergence of new technologies will finally bring self-driving cars to the mass market in the next decade. Experts think the robot vehicles will start by delivering packages within limited areas. Then, bit by bit, they’ll make it out onto the open road with humans inside. This future hinges on sensors that let computers see the road. The sensors need to become more robust and sensitive while coming down in cost. An increasingly electric fleet will also demand lighter-weight materials and designs alongside improvements in cooling and lubrication. It’s a big set of changes that will challenge the chemical industry—and create opportunities.

Talk with businesspeople and policy makers involved in mobility, and it becomes clear that self-driving cars are not a matter of if, but when. These vehicles will be front and center in a future where cars are connected, autonomous, shared, and electric—what advocates call CASE. Any one of those features would upend the auto industry, and they’re all coming at more or less the same time, perhaps as soon as the end of the decade.

Then again, we’ve been talking about self-driving, electric cars for a long time. What’s the holdup? “What’s been happening over the past couple of years has been the growing realization that this is actually a much more difficult problem than everybody thought,” says Sam Abuelsamid, a principal research analyst at Guidehouse Insights. “The technology is simply not mature enough. It’s making progress, but it’s not as good as human drivers yet.”

And well before the Ubers and Lyfts of the world replace their controversial gig workers with artificial intelligence—and before we give up our personal vehicles—autonomous package-delivery vehicles will hit the roads, Abuelsamid predicts. That’s because package routes can stay on slow, calm thoroughfares and minimize challenging tasks such as left turns. “Packages don’t care if it takes an extra 5 or 10 min to get there,” he says.

“If you don’t have to worry about carrying passengers . . . that actually lowers the bar for deploying these vehicles,” Abuelsamid adds. “People expect to be able to summon a ride and go from any arbitrary point to any other arbitrary point in the quickest time possible.” Insurance companies and government regulators are also less worried when there isn’t a human life in the car.

Indeed, a company called Nuro has self-driving delivery vehicles on the road now. The firm started with a grocery-delivery pilot program in Scottsdale, Arizona, in 2018. At first, it used self-driving Toyota Priuses with a backup driver. In December of that year, the driverless, golf-cart-sized R1 joined the pilot. The R1 and its successor, the R2, have four gull-wing doors that open toward the sidewalk to reveal configurable parcel compartments.

Nuro has since moved on to a larger pilot in Houston, where automated Priuses are delivering groceries from Kroger in six zip codes and prescriptions from CVS in three zip codes. The pilot will expand to include Domino’s Pizza and Walmart soon, the firm says. R2s are distributing medical supplies in a COVID-19 field hospital in Sacramento, California.

Still, self-driving passenger vehicles are what most capture the public imagination. Several car companies and technology firms are testing self-driving taxis, though most have a staffer sitting in the driver’s seat. Tesla, Nissan, and others have varying levels of automation that drivers can turn on under certain conditions.

The management consulting firm McKinsey & Company predicts that robotaxis could handle 9% of all miles traveled in the US by 2030 and as much as half by 2040. “It is probably, in our view, the next big thing,” McKinsey partner Philipp Kampshoff says in a 2017 video on self-driving cars.

In a report released earlier this month, the firm says electrification and interiors present the strongest growth opportunities for the chemical industry. McKinsey also predicts declines in materials and chemicals related to engines, such as high-temperature plastics and rubbers, lubricants, and fuel additives.

This is actually a much more difficult problem than everybody thought.

Sam Abuelsamid, principal research analyst, Guidehouse Insights

Whether it’s moving packages or people, getting to this point has taken a lot of science and engineering.

High-tech computers and batteries get the attention, but the chemical industry is quietly bringing solutions to many of the less-obvious problems of autonomous driving. The demands of computer vision require a lot of chemistry support. A design shift in car interiors that emphasizes usable space will create a need for structural materials that are strong, lightweight, and attractive to the eye and touch. And the electric motors that are expected to move most self-driving cars need different fluids than internal combustion engines.

It’s a boom time for R&D. Most major carmakers are working on CASE vehicles, and new firms like Nuro that lack anchors in internal combustion are offering serious competition. For the most part, each is working in isolation, developing its own systems and confronting its own challenges. That dynamic means companies’ suppliers, including specialty chemical firms, need to customize offerings for each client. And they need to be ready to iterate along with a quickly evolving field.


Credit: Waymo

Waymo opened its fully driverless taxi service to the public in Phoenix earlier this month.

Paul Perrone says robotic driving systems are ready to do real, meaningful work now. His firm, Perrone Robotics, specializes in bounded-zone self-driving vehicles, such as its pilot shuttle service around the Fort Carson US Army base in Colorado. He expects autonomous driving to penetrate deep into the mobility market in the next few years. “In just the subset we operate in, the opportunity is enormous,” Perrone says.

He envisions mobile autonomous robots in widespread service on roads, on sidewalks, in the air, and in the water. However, “I think there are a lot of titanic failures to come,” Perrone warns. He expects some high-profile firms to run out of money before they have a product ready to deploy at scale.

Perrone likens the state of vehicle autonomy to the dawn of powered, heavier-than-air aviation. A number of groups were working on bold, bioinspired flying machines. The first design to fly, though, was based on established technology from kites and boats. “The Wright brothers succeeded where others failed,” he says, “by building something they understood how to control.”

Aiding the sensors

To drive, a computer needs to see. In March 2018 in Arizona, a Volvo in a self-driving test by Uber struck and killed Elaine Herzberg, who was walking her bicycle across a road at night. Two years earlier, a Tesla car in self-driving mode killed its owner, Joshua Brown, when it failed to recognize the white side of a tractor trailer against a bright daytime sky. In both cases, the computer’s vision systems didn’t recognize the hazard in time for the artificial intelligence to apply the brakes. The human copilots didn’t either.

Levels of autonomy

Self-driving vehicles are ranked by how much control the computer has over driving.

Level 0: No automation.

Level 1: Cruise control and parking-assistant systems help the driver control the vehicle.

Level 2: Partial automation can control speed and steering under certain conditions.

Level 3: Conditional automation drives the car most of the time, but the human must be ready to take over at any moment.

Level 4: High automation drives the car almost all the time, but the human can take over if desired.

Level 5: Full automation; the car can handle all driving conditions. Humans are only passengers or absent entirely.
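For readers who think in code, the ladder above maps neatly onto a simple enumeration. The sketch below is purely illustrative; the names are hypothetical and are not drawn from any standard or vendor API. It assumes only what the list says: through Level 3, a person still has to be ready to drive.

```python
from enum import IntEnum

class AutomationLevel(IntEnum):
    """The 0-5 driving-automation ladder summarized in the list above (names are illustrative)."""
    NO_AUTOMATION = 0           # human does all the driving
    DRIVER_ASSISTANCE = 1       # cruise control, parking assist
    PARTIAL_AUTOMATION = 2      # speed and steering under certain conditions
    CONDITIONAL_AUTOMATION = 3  # computer drives most of the time; human must be ready
    HIGH_AUTOMATION = 4         # computer drives almost always; human may take over
    FULL_AUTOMATION = 5         # all conditions; humans are passengers or absent

def human_must_stay_ready(level: AutomationLevel) -> bool:
    """Hypothetical helper: at Level 3 and below, a person still has to watch the road."""
    return level <= AutomationLevel.CONDITIONAL_AUTOMATION
```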

The autonomous cars currently being tested generally watch the road in three ways. They build a detailed depth map by scanning the area with near-infrared (NIR) lasers and measuring how they reflect, a technology called laser imaging, detection, and ranging, or lidar. Overlaid on that map is information from radar, which uses radio or microwave radiation to detect objects and determine their speed and direction. Cameras use visible light to add detail and identify objects.
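To make that division of labor concrete, here is a minimal, illustrative sketch of how the three streams might be overlaid in software. Every class, field, and threshold below is a hypothetical placeholder rather than any carmaker’s or sensor supplier’s actual interface; it assumes, for instance, a convention in which a positive closing speed means an object is approaching.

```python
from dataclasses import dataclass, field
from typing import List

@dataclass
class LidarPoint:
    """One near-infrared laser return: a point in the depth map, in meters."""
    x: float
    y: float
    z: float

@dataclass
class RadarTrack:
    """Radar contributes range, closing speed, and direction of an object."""
    range_m: float
    closing_speed_mps: float   # positive = object approaching (assumed convention)
    bearing_deg: float

@dataclass
class CameraDetection:
    """The camera contributes identity: 'pedestrian', 'truck', and so on."""
    label: str
    confidence: float
    bearing_deg: float

@dataclass
class PerceptionFrame:
    """One snapshot of the road with the three sensor streams overlaid."""
    lidar: List[LidarPoint] = field(default_factory=list)
    radar: List[RadarTrack] = field(default_factory=list)
    camera: List[CameraDetection] = field(default_factory=list)

    def hazards(self, max_range_m: float = 30.0) -> List[str]:
        """Flag labeled objects that radar also sees approaching within max_range_m."""
        flagged = []
        for det in self.camera:
            for track in self.radar:
                close = track.range_m < max_range_m and track.closing_speed_mps > 0
                same_bearing = abs(track.bearing_deg - det.bearing_deg) < 5.0
                if close and same_bearing:
                    flagged.append(det.label)
        return flagged
```

A real perception stack does far more, including time synchronization, probabilistic tracking, and machine-learned object detection, but the sketch captures why each modality matters: lidar supplies geometry, radar supplies motion, and cameras supply identity.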

All three imaging methods share a prosaic need: the sensors need to stay reasonably clean, though radar is less touchy on that point. Two ways to maintain clean sensors are nonstick and self-cleaning coatings. And for both, manufacturers can draw on existing expertise.

Coatings already enable nonstick cookware in several ways, including grafted siloxanes, fluorinated polymers, and micropatterned textures. Automotive surfaces can use some of the same science. “If you think about the proteins and carbohydrates you may cook with and the way they interact with that surface, it’s similar to a road environment and the insect that may hit your car,” says Peter Votruba-Drzal, senior global technical director in PPG Industries’ automotive division.

Self-cleaning coatings may also reduce maintenance and downtime. Substances such as titanium dioxide can catalytically break down gunk using solar energy, a trick already used in architectural glass to make office building exteriors stay cleaner.

The challenge with vehicle coatings is making them last, Votruba-Drzal says. “The frontier here is getting to the durability that’s required for its service life.”

Clean lenses on good sensors in working order get you only part of the way. The objects they’re looking at need to stand out in a way that registers with a computer’s vision. Road signs and markings rely on retroreflection, an optical trick that sends light more or less back to where it came from.

Parts and labor

Autonomous and electric cars bring a new set of challenges to automotive chemistry. Here are a few technologies where new materials are making an impact.