Ferrari integrates AWS generative AI into various customer experiences

Ferrari has detailed how it is using AWS generative AI services across a broad range of use cases aimed at improving experiences for its customers and dealers.

The luxury OEM said it has deployed AI across different stages of the vehicle lifecycle, from accelerating the vehicle design process to enabling personalized services for customers. It also reaffirmed its commitment to generative AI, which it sees as an opportunity to improve both the vehicle and the customer journey.

Within vehicle development, for example, Ferrari adopted fully managed services, including AWS Fargate, a serverless compute engine for containers, so that its teams no longer need to manage the infrastructure behind its vehicle applications. Since integrating the service, the OEM says it has seen improvements in both the reliability and scalability of those applications.
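As an illustration only, the snippet below sketches how a containerized workload might be launched on Fargate with the AWS SDK for Python; the cluster, task definition, and network settings are hypothetical placeholders rather than details Ferrari has disclosed.

```python
import boto3

ecs = boto3.client("ecs")

# Run a containerized vehicle-application workload on Fargate.
# All names and IDs below are illustrative placeholders.
ecs.run_task(
    cluster="vehicle-apps",
    launchType="FARGATE",
    taskDefinition="design-simulation:1",
    networkConfiguration={
        "awsvpcConfiguration": {
            "subnets": ["subnet-0abc123"],
            "assignPublicIp": "DISABLED",
        }
    },
)
```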

Across vehicle production, Ferrari is using AI and machine learning to optimize and refine various processes. Amazon Lookout for Vision, for example, applies computer vision to automate quality inspections and spot product defects. The same approach helps detect missing or defective parts on the assembly line before a vehicle proceeds to the testing stage, reducing costs as a result.
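Lookout for Vision exposes this kind of inspection through a simple API. The sketch below, using a hypothetical project name and image file, shows roughly how an automated defect check can be invoked from the AWS SDK for Python.

```python
import boto3

client = boto3.client("lookoutvision")

# Send an image of a part to a trained Lookout for Vision model.
# Project name, model version, and image path are hypothetical.
with open("part_image.jpg", "rb") as image:
    response = client.detect_anomalies(
        ProjectName="assembly-line-inspection",
        ModelVersion="1",
        Body=image.read(),
        ContentType="image/jpeg",
    )

result = response["DetectAnomalyResult"]
if result["IsAnomalous"]:
    print(f"Defect detected (confidence {result['Confidence']:.2f})")
```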

At the forefront of Ferrari’s customer-facing use of AWS generative AI is its car configurator, developed on AWS, which lets customers personalize their vehicle with options for wheel design, paint colors, interior trim, and more. The configurator is powered by a range of large language models (LLMs) accessed through Amazon Bedrock, a fully managed service offering a choice of high-performing foundation models, and by Amazon Personalize, a machine learning service that delivers personalized recommendations to further tailor the experience.
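Ferrari has not described its implementation, but a minimal sketch of how Bedrock and Personalize can be combined in a configurator-style flow might look like the following; the campaign ARN, user ID, and model choice are illustrative assumptions.

```python
import boto3

bedrock = boto3.client("bedrock-runtime")
personalize = boto3.client("personalize-runtime")

# Fetch configuration options ranked for this customer
# (campaign ARN and user ID are hypothetical placeholders).
recommendations = personalize.get_recommendations(
    campaignArn="arn:aws:personalize:eu-west-1:123456789012:campaign/configurator",
    userId="customer-42",
    numResults=5,
)
options = [item["itemId"] for item in recommendations["itemList"]]

# Ask a Bedrock-hosted model to describe the suggested configuration.
response = bedrock.converse(
    modelId="anthropic.claude-3-sonnet-20240229-v1:0",
    messages=[{
        "role": "user",
        "content": [{"text": f"Describe a configuration using: {', '.join(options)}"}],
    }],
)
print(response["output"]["message"]["content"][0]["text"])
```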

Further along the customer journey, Ferrari’s use of generative AI extends to after-sales, where the OEM has adopted a chatbot. Using Amazon Bedrock, it fine-tuned a range of LLMs, including Amazon’s own Titan models, Anthropic’s Claude 3, and Meta’s Llama, on its documentation to support its sales professionals and technicians.
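Fine-tuning on Bedrock is submitted as a model customization job. The sketch below shows the general shape of such a job using the AWS SDK for Python, with a Titan base model and hypothetical names, roles, and S3 locations standing in for Ferrari’s actual setup.

```python
import boto3

bedrock = boto3.client("bedrock")

# Submit a fine-tuning job on documentation stored in S3.
# Job name, role ARN, bucket paths, and hyperparameters are hypothetical.
bedrock.create_model_customization_job(
    jobName="aftersales-docs-finetune",
    customModelName="aftersales-assistant",
    roleArn="arn:aws:iam::123456789012:role/BedrockCustomizationRole",
    baseModelIdentifier="amazon.titan-text-express-v1",
    customizationType="FINE_TUNING",
    trainingDataConfig={"s3Uri": "s3://example-docs-bucket/training/train.jsonl"},
    outputDataConfig={"s3Uri": "s3://example-docs-bucket/output/"},
    hyperParameters={"epochCount": "2", "batchSize": "1", "learningRate": "0.00001"},
)
```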

For the chatbot, Ferrari paired Bedrock with Amazon SageMaker JumpStart, a machine learning hub offering foundation models, built-in algorithms, and prebuilt ML solutions, training the assistant to classify and summarize customer care tickets and to answer commonly asked questions. Through the AWS-powered chatbot, the OEM says it is ultimately looking to minimize human error while boosting productivity.
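SageMaker JumpStart lets a team deploy a prebuilt foundation model as an endpoint and prompt it for tasks such as ticket classification and summarization. The sketch below illustrates that pattern with an openly available model; the model ID, instance type, and prompt are assumptions, not Ferrari’s configuration.

```python
from sagemaker.jumpstart.model import JumpStartModel

# Deploy a JumpStart foundation model as a real-time endpoint
# (model ID and instance type are illustrative choices).
model = JumpStartModel(model_id="huggingface-llm-falcon-7b-instruct-bf16")
predictor = model.deploy(instance_type="ml.g5.2xlarge")

# Prompt the endpoint to classify and summarize a sample ticket.
ticket = "Customer reports a warning light after the latest software update."
prompt = f"Classify this customer care ticket and summarize it in one sentence:\n{ticket}"

response = predictor.predict({"inputs": prompt, "parameters": {"max_new_tokens": 128}})
print(response)
```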

In addition to these use cases, Ferrari confirmed that it plans to expand its use of generative AI further to better serve its customers and dealers. It is also exploring cloud services that will help it accelerate progress toward its commitment to carbon neutrality by 2030.
