REGENSBURG, Germany—BMW’s assembly plant here has become the first automotive factory in the world to use an end-to-end digitalized and automated process for inspection, processing and marking of painted vehicle surfaces in standard production.
Robots governed by artificial intelligence process each vehicle individually to meet objective quality standards. This ensures more stable operations, shorter lead times and a consistently high level of vehicle surface quality. Data stored in the cloud also supports detailed analysis of the root causes of surface finish flaws.
On the paint line, four robots stand in the processing booth, surrounding a freshly painted body. As if on command, the robots begin working on the surface of the body. They sand it, apply the polishing compound, polish, change the attachments and switch out the sandpaper. Cameras track the action.
“What is unique here is that the robots work on each body exactly where needed, because the tiny specks and bumps that can appear after the topcoat is applied and that we want to remove are in different spots on each vehicle,” explains Stefan Auflitsch, head of production paint application and finish at the plant. “Robots are normally programmed to follow the same pattern until they are reprogrammed. Using artificial intelligence allows them to work in a more tailored manner. With up to 1,000 vehicles going through the finishing process every working day, that adds up to 1,000 unique processes.”
BMW has been using automated surface processing in series production at the Regensburg factory since March 2022. Now, the plant is the first in the world to use AI-based processing at scale. To ensure everything runs smoothly, this step is preceded by another automated process that has been considered state of the art in the automotive industry for some time: automated surface inspection. This process identifies and records features that require processing after the topcoat has been applied.
During automated surface inspection, the system first uses deflectometry to identify deviating characteristics. While large monitors project black-and-white striped patterns onto the vehicle’s surface, cameras scan it and detect even the slightest variation in the reflective paintwork through the change in the striped pattern. Like a perfectly trained eye, the camera registers areas that deviate from the ideal and transmits this data directly to a computer system. The computer saves the exact position, shape and size of each deviation, creates a digital 3D image from the data and classifies the deviations based on objective criteria. In this way, all vehicle surfaces are inspected for customer quality assurance purposes and treated as needed.
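The detection step can be pictured with a greatly simplified stand-in for deflectometry: compare the reflected stripe image against the projected reference pattern and flag spots where the pattern is disturbed. The Python sketch below is illustrative only and not BMW’s system; the image shapes, the deviation threshold and the simple connected-component grouping are assumptions chosen to keep the example short.

```python
import numpy as np
from scipy import ndimage

def find_deviations(reference: np.ndarray, captured: np.ndarray, threshold: float = 0.15):
    """Compare a captured stripe image against the projected reference pattern.

    Both images are 2-D float arrays in [0, 1]. Pixels where the reflected
    stripe pattern differs strongly from the reference are treated as
    candidate surface deviations and grouped into connected regions.
    """
    # Absolute deviation between what was projected and what the camera sees.
    diff = np.abs(captured - reference)

    # Binary mask of pixels whose deviation exceeds the (assumed) threshold.
    mask = diff > threshold

    # Group neighbouring flagged pixels into individual defect candidates.
    labeled, num_defects = ndimage.label(mask)

    defects = []
    for region in ndimage.find_objects(labeled):
        ys, xs = region
        defects.append({
            "center_px": ((ys.start + ys.stop) // 2, (xs.start + xs.stop) // 2),
            "size_px": (ys.stop - ys.start, xs.stop - xs.start),
        })
    return defects

# Tiny synthetic example: a stripe pattern with one disturbed spot.
reference = np.tile(np.linspace(0, 1, 64), (64, 1))
captured = reference.copy()
captured[30:34, 40:44] += 0.5          # simulated speck in the clearcoat
print(find_deviations(reference, captured))
```

In the real installation the comparison is of course far more involved, but the principle is the same: a local disturbance in the reflected pattern is recorded with its position and extent.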
“The system already knows as much today as our best employees combined. We used the knowledge of our entire team to finalize the system; the functioning of the equipment relies on our associates’ unique expertise. We channeled their experience into the programming. On this basis, the algorithm now recognizes and objectively decides which features need postprocessing,” explains project manager Daniel Poggensee.
From the data collected, the system creates a separate profile for each body that then serves as the basis for custom surface processing. This means no bump, no matter how small, can escape detection.
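Conceptually, such a profile is simply a per-body record of every detected deviation with its position, size and type. The sketch below shows one plausible way to represent it; the field names, the classification labels and the size-based rework rule are illustrative assumptions, not the plant’s actual data model.

```python
from dataclasses import dataclass, field

@dataclass
class Deviation:
    """One recorded surface feature, as the inspection step might store it."""
    x_mm: float          # position on the body (assumed body coordinate system)
    y_mm: float
    z_mm: float
    size_mm: float       # characteristic diameter of the speck or bump
    kind: str            # e.g. "speck", "crater", "run" -- illustrative labels

@dataclass
class BodyProfile:
    """Per-vehicle profile that drives the custom processing step."""
    body_id: str
    deviations: list[Deviation] = field(default_factory=list)

    def needs_processing(self, min_size_mm: float = 0.2) -> list[Deviation]:
        # Assumed rule: only features above a size threshold are reworked.
        return [d for d in self.deviations if d.size_mm >= min_size_mm]

profile = BodyProfile("example-body-001")
profile.deviations.append(Deviation(1250.0, 310.5, 890.0, 0.4, "speck"))
print(len(profile.needs_processing()))   # -> 1
```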
The new method offers more advantages than just reliable detection of characteristics and a shorter process lead time: Automated surface processing handles all recorded deviations in the optimal order, and does so quickly, repeatably and with a consistently high level of quality.
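The article does not spell out how that order is chosen, but one simple way to picture it is minimizing the robot’s travel between recorded spots, for example with a greedy nearest-neighbour pass over their positions. The sketch below is an assumption for illustration, not the plant’s actual sequencing logic.

```python
import math

def nearest_neighbour_order(points, start=(0.0, 0.0)):
    """Order 2-D points so that each step moves to the closest remaining one.

    A simple greedy heuristic for sequencing defect positions so the robot's
    travel between them stays short; it approximates, not guarantees, the
    shortest overall path.
    """
    remaining = list(points)
    current = start
    ordered = []
    while remaining:
        nxt = min(remaining, key=lambda p: math.dist(current, p))
        remaining.remove(nxt)
        ordered.append(nxt)
        current = nxt
    return ordered

# Example: four recorded spots on a panel, in millimetres.
spots = [(120.0, 40.0), (15.0, 10.0), (130.0, 45.0), (20.0, 80.0)]
print(nearest_neighbour_order(spots))
```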
However, there are limits to the use of robots. For example, they cannot process the edges of the body or the final millimeters next to the door and other joints. The fuel filler flap is also too fragile. For this reason, it is ultimately trained employees who add the finishing touches and conduct the final inspection of the body. Even so, the AI data is just as helpful for human workers as it is for the robots. A laser projector digitally marks the relevant areas of the body surface to ensure nothing is overlooked.
Based on the success of the project, Poggensee has future plans for the technology.
“Thanks to the data in the cloud, we expect to soon be able to intervene in the process even earlier if there are any inconsistencies, which will enable us to prevent faults from occurring in the first place,” he says.
The equipment could also be used to automatically record operations performed by people, so they do not have to go back and forth between the body and the computer for documentation.