Tower International to Announce Third Quarter 2018 Financial Results

LIVONIA, Mich., Oct. 12, 2018 /PRNewswire/ — Tower International, Inc. (NYSE: TOWR), a leading global manufacturer of engineered automotive structural metal components and assemblies, will report third quarter 2018 financial results before the market opens on Monday, October 29, 2018, via PR Newswire. At 11:00 a.m. EDT on that date, a conference call is scheduled to discuss the results in further detail, as well as other related matters.

To participate in the conference call:

Domestic calls: (866) 393-4576

International calls: (706) 679-1462

Tower will provide a broadcast of the conference call for the general public via a live audio webcast. The conference call, along with the financial results release, presentation material and other supplemental information, can be accessed through Tower's Web site at www.towerinternational.com.

The audio replay will be available two hours following the call at:

Domestic calls: (855) 859-2056

International calls: (404) 537-3406

The audio replay will be available until November 29, 2018 (Conference I.D. 4576237).

Investor & Media Contact:
Derek Fiebig
(248) 675-6457
fiebig.derek@towerinternational.com

View original content:http://www.prnewswire.com/news-releases/tower-international-to-announce-third-quarter-2018-financial-results-300730171.html

SOURCE Tower International, Inc.

The Future of Autonomous Vehicles: Part I – Think Like a Robot, Perceive Like a Human

By James R. Doty, MD and Blair LaCorte

For over three decades, I’ve studied and performed surgery on the human brain. I have always been fascinated by the power, plasticity and adaptability of the brain, and by how much of this amazing capacity is dedicated to processing and interpreting data we receive from our senses. With the rapid ascension of Artificial Intelligence (AI), I began to wonder how developers would integrate the complex, multiple layers of human perception to enhance AI’s capabilities. I have been especially interested in how this integration would be applied to robots and autonomous vehicles. It became clear that the artificial intelligence needed to drive these vehicles will require artificial perception modeled after the greatest perception engine on the planet: the human visual cortex. These vehicles will need to think like a robot, but perceive like a human.

To learn more and to better understand how this level of artificial perception will be created, I recently became an advisor to AEye, a company developing cutting-edge artificial perception and self-driving technologies, to help it use knowledge of the human brain to better inform its systems. This is known as biomimicry: the concept of learning from, and then replicating, natural strategies from living systems and beings (plants, animals, humans, etc.) to better adapt design and engineering. Essentially, biomimicry allows us to fit into our existing environment and evolve in the way life has successfully done for the past few billion years. But why is incorporating biomimicry and aspects of human perception integral to the development and success of autonomous vehicles?

Because nothing can take in more information and process it faster and more accurately than the human perception system. Humans classify complex objects at speeds up to 27 Hz, with the brain processing 580 megapixels of data in as little as 13 milliseconds. If we continue using conventional sensor data collection methods, we are more than 25 years away from having AI achieve the capabilities of the human brain in robots and autonomous vehicles. Therefore, to enable self-driving cars to move safely and independently through crowded urban environments or at highway speeds, we must develop new approaches and technologies that meet or exceed the performance of the human brain. The next question is: how?

Orthogonal data matters
(Creating an advanced, multi-dimensional data type)
Orthogonal data refers to complementary data sets that ultimately give you more quality information about an object or situation than each would alone, allowing us to make efficient judgments about what in our world is important and what is not. Orthogonality concepts for high information quality are well understood and rooted in disciplines such as quantum physics, where linear algebra is employed and orthogonal basis sets are the minimum pieces of information needed to represent more complex states without redundancy. When it comes to perception of moving objects, two critical types of orthogonal data sets are often required: spatial and temporal. Spatial data specifies where an object exists in the world, while temporal data specifies where it exists in time. By integrating these data sets along with other complementary data sets such as color, temperature, sound, and smell, our brains generate a real-time model of the world around us, defining how we experience it.

The human brain takes in all kinds of orthogonal data naturally, decoupling and reassembling information instantaneously, without us even realizing it. For example, if you see that a baseball is flying through the air towards you, your brain is gathering all types of information about it, such as spatial data (the direction in which the ball is headed) and temporal data (how fast it’s moving). While this data is being processed by your visual cortex “in the background,” all you’re ultimately aware of is the action you need to take, which might be to duck. The AI perception technology that successfully adopts the manner in which the human brain captures and processes these types of data sets will dominate the market.
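The baseball example can be sketched in a few lines of code: spatial data (distance) and temporal data (closing speed) are each insufficient alone, but combined they yield an actionable decision. This is an illustrative sketch only; the function names, the 0.25-second reaction window, and the numbers are my assumptions, not anything from the article.

```python
# Illustrative sketch (not AEye code): fusing spatial data (where the ball is)
# with temporal data (how fast it is closing) produces a decision that neither
# data set supports on its own.

def time_to_impact(distance_m: float, speed_mps: float) -> float:
    """Seconds until an object closing at constant speed reaches the observer."""
    return distance_m / speed_mps

def should_duck(distance_m: float, speed_mps: float,
                reaction_s: float = 0.25) -> bool:
    """Duck if the ball arrives before an assumed human reaction window closes."""
    return time_to_impact(distance_m, speed_mps) < reaction_s

print(should_duck(distance_m=3.0, speed_mps=30.0))  # ball 3 m away at 30 m/s -> True
```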

Existing robotic sensory data acquisition systems have focused on single sensor modalities (camera, LiDAR, radar), each with fixed scan patterns and intensity. Unlike humans, these systems have not learned, nor do they have the ability, to efficiently process and optimize 2D and 3D data in real time while both the sensor and the detected objects are in motion. Therefore, they cannot use real-time orthogonal data to learn, prioritize, and focus. Effectively replicating the multi-dimensional sensory processing power of the human visual cortex will require a new approach to capturing and processing sensory data.

AEye is pioneering one such approach. AEye calls its unique biomimetic system iDAR (Intelligent Detection and Ranging). AEye’s iDAR is an intelligent artificial perception system that physically fuses a unique, agile LiDAR with a hi-res camera to create a new data type the company calls Dynamic Vixels. These Dynamic Vixels are one of the ways in which AEye acquires orthogonal data. By capturing x, y, z, r, g, b data (along with SWIR intensity), these patented Dynamic Vixels are uniquely created to biomimic the data structure of the human visual cortex. Like the human visual cortex, the intelligence of the Dynamic Vixels is then integrated into the central perception engine and motion planning system, which is the functional brain of the vehicle. They are dynamic because they actively interrogate a scene and adjust to changing conditions, such as increasing the power level of the sensor to cut through rain or revisiting suspect objects in the same frame to identify obstacles. Better data drives more actionable information.
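As a rough illustration of the kind of fused record described above, here is a hypothetical sketch in Python. The field names and types are my own assumptions for exposition; AEye’s actual Dynamic Vixel format is proprietary and is not specified in this article beyond the x, y, z, r, g, b and SWIR-intensity channels.

```python
# Hypothetical sketch of a camera/LiDAR-fused sample; all names are
# illustrative, not AEye's actual data structure.
from dataclasses import dataclass

@dataclass
class FusedSample:
    # Spatial data from the LiDAR return (meters, sensor frame)
    x: float
    y: float
    z: float
    # Color data from the co-registered camera pixel
    r: int
    g: int
    b: int
    # Short-wave infrared return intensity (normalized)
    swir_intensity: float
    # Temporal data: capture time lets downstream code estimate motion
    timestamp_s: float

sample = FusedSample(x=12.4, y=-0.8, z=1.1, r=200, g=40, b=30,
                     swir_intensity=0.62, timestamp_s=0.016)
print(sample.x, sample.r)
```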

Not all objects are created equal
(See everything, and focus on what is important)
Humans continuously analyze their environment, always scanning for new objects, then, in parallel and as appropriate, focusing in on elements that are interesting, engaging, or potentially threatening. The visual cortex processes this quickly, with incredible accuracy, and with very little of the brain’s immense processing power. If a human brain functioned as autonomous vehicles do today, we would not have survived as a species.

In his book The Power of Fifty Bits, Bob Nease writes of the ten million bits of information the human brain processes each second, but how only fifty bits are devoted to conscious thought. This is due to multiple evolutionary factors, including our adaptation to ignore autonomic processes like our heart beating, or our visual cortex screening out less relevant information in our surroundings (like the sky) to survive. It is an intelligent system design.

This is the nature of our intelligent vision. So, while our eyes are always scanning and searching to identify new objects entering a scene, we focus our attention on objects that matter as they move into areas of concern, allowing us to track them over time. In short, we search a scene, consciously acquire the objects that matter, and track them as required.

As discussed, current autonomous vehicle sensor configurations utilize a combination of LiDAR, cameras, ultrasonics, and radar as their “senses.” These sensors collect data serially (one way) and are limited to fixed search patterns. They collect as much data as possible, which is then aligned, processed, and analyzed long after the fact. This post-processing is slow and does not allow for situational changes to how sensory data is captured in real time. Because these sensors don’t intelligently interrogate the scene, up to 95% of the sensory data currently being collected is thrown out as either irrelevant or redundant at the time it is processed. This act of triage itself also comes with a latency penalty. At highway speeds, this latency results in a car moving more than 20 feet before the sensor data has been fully processed. Throwing away data you don’t need with the goal of being efficient is inefficient. A better approach exists.
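The 20-foot figure is easy to sanity-check. Assuming a 65 mph highway speed and roughly 210 ms of end-to-end processing latency (the speed and latency values are my illustrative assumptions; the article states only the resulting distance):

```python
# Back-of-the-envelope check: distance traveled during sensor-processing
# latency. The 65 mph speed and 210 ms latency are assumed for illustration.
MPH_TO_FTPS = 5280 / 3600  # feet per second, per mph

speed_ftps = 65 * MPH_TO_FTPS        # ~95.3 ft/s at highway speed
latency_s = 0.21                     # assumed end-to-end processing latency
distance_ft = speed_ftps * latency_s # distance covered before data is usable

print(round(distance_ft, 1))  # ~20 ft, matching the article's claim
```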

The overwhelming task of sifting through this data — every tree, curb, parked vehicle, the sky, the road, leaves on trees, and other static objects — also requires immense power and data processing resources, which slows down the entire system significantly, and introduces risk. These systems’ goal is to focus on everything and then try to analyze each item in their environment, after the fact, at the expense of timely action. This is the exact opposite of how humans process spatial and temporal data in situations that we associate with driving.

AEye’s iDAR teaches autonomous vehicles to “search, acquire, and track” objects as we do, by defining new data and sensor types that more efficiently communicate actionable information while maintaining the intelligence to analyze this data as quickly and accurately as possible. AEye’s iDAR enables this through its unique foundational solid-state agile LiDAR. Unlike standard LiDAR, AEye’s agile LiDAR is situationally adaptive, so it can modify scan patterns and trade resources such as update rate, resolution, and maximum detection range, among others. This enables iDAR to dynamically adjust as it optimally searches a scene, conserve power and apply it to efficiently identify and acquire critical objects, and track these objects over time. iDAR’s unique ability to intelligently use power to search, acquire, and track scenes helps identify whether an object is a child walking into the street or a car entering the intersection and accelerating to high speed. Doing this in real time is the difference between a safe journey and an avoidable tragedy.

Humans Learn Intuitively
(Feedback loops enable intelligence)
As we have discussed, the human visual cortex can scan at 27 Hz (much faster than current sensors on autonomous vehicles, which average around 10 Hz). The brain naturally gathers…

AEye’s iDAR Shatters Both Range and Scan Rate Performance Records for Automotive Grade LiDAR

Company Simultaneously Closes $40 Million Series B Funding to Fuel Global Expansion
“The test conducted by AEye delivered impressive results…This is an outstanding achievement that demonstrates the true potential of perception systems to reliably and accurately detect and track objects at great range.”

Pleasanton, CA – November 19, 2018 – AEye, a world leader in artificial perception systems and the developer of iDAR™, today announced a major breakthrough in long-range threat detection and safety. In performance specification tests monitored and validated by VSI Labs, one of the nation’s leading automated vehicle technology advisors, AEye’s iDAR system detected and tracked a truck at 1,000 meters, or one kilometer – four to five times the distance current LiDAR systems are able to detect. AEye’s test sets a new benchmark for solid-state LiDAR range, and comes one month after AEye announced a 100Hz scan rate – setting a new speed record for the industry.

The company simultaneously announced $40M in Series B funding, led by Taiwania Capital. The round was significantly oversubscribed and includes multiple global automotive OEMs, Tier 1s, and Tier 2s to be formally announced at CES in January. In addition to Taiwania Capital, existing investors Kleiner Perkins, Intel Capital, Airbus Ventures and Tychee Partners also participated.

New Range and Scan Rate Records Key to Autonomous Automotive and Trucking Safety
Using AEye’s standard iDAR sensor, the company set up a formal test, monitored by VSI Labs, the leading research and development resource for active safety and automated vehicle technologies. The test was structured to establish and verify the range and scan rates of the iDAR system.

The test was conducted on the runway of an airport in Byron, California, in order to isolate targets and better measure and calibrate iDAR’s performance. To test range, a standard 20-foot moving truck was tracked and continuously scanned down the length of the 914-meter runway. At the end of the runway, the iDAR system was able to continuously detect and track the movements of the vehicle, as well as detect runway signs and markers en route. The AEye sensor vehicle was then taken off the runway to extend the available test range to over 1,000 meters, where iDAR continued to track the truck without difficulty.

“The test conducted by AEye delivered impressive results,” said Sara Sargent, senior engineer at VSI Labs. “We monitored the performance and the truck was clearly identifiable and visible at 1 kilometer. We were also able to verify that AEye’s iDAR system achieves scan rates of 100Hz and that the fusion of the camera and LiDAR in the iDAR sensor produces accurate true color real-time point clouds in the form of Dynamic Vixels. This is an outstanding achievement that demonstrates the true potential of perception systems to reliably and accurately detect and track objects at great range.”

iDAR and Biomimicry
AEye’s iDAR is an intelligent artificial perception system that physically fuses an agile, solid-state LiDAR with a hi-res camera to create a new data type called Dynamic Vixels. These Dynamic Vixels are the result of real-time integration of iDAR’s agile LiDAR and a low-light camera in the iDAR sensor, not post-scan fusion of a separate camera and LiDAR system. By capturing x, y, z, r, g, b data, Dynamic Vixels are uniquely created to “biomimic” the data structure of the human visual cortex. Better data drives vastly superior performance and delivers more accurate information. AEye’s use of biomimicry is more fully explored by Dr. James Doty, world-renowned neurosurgeon and clinical professor in the Department of Neurosurgery at Stanford University, in an article he recently published on Medium.

“After establishing a new standard for LiDAR scan speed, we set out to see just how far we could accurately search, acquire and track an object such as a truck,” said Blair LaCorte, Chief of Staff at AEye. “The iDAR system performed as we expected. We detected the truck with plenty of signal to identify it as an object of interest, and then easily tracked it as it moved over 1,000 meters away. We now believe that with small adaptations, we can achieve range performance of 5 km to 10 km or more. These results have significant implications for the autonomous trucking and Unmanned Aircraft Systems (UAS) markets, where sensing distance needs to be as far as possible and potential threats identified as early as possible to achieve safe, reliable vehicle autonomy.”

New Funds Fuel Company’s Global Expansion
In addition, AEye announced the close of its Series B round, bringing the company’s total funding to over $61 million. The funds will be used to scale AEye’s operations to meet global demand for the company’s artificial perception systems for autonomous vehicles. AEye is uniquely structured to scale effectively through partnerships with contract manufacturers and Tier 1s on a global basis. This has allowed the company to focus on its core design and innovation competencies, avoiding the costs of building manufacturing capacity while concentrating investment dollars on higher-value activities. AEye’s growth has been fueled by its ability, as a software-driven platform, to provide artificial perception systems that address both ADAS and Mobility solutions, and by engagements with customers and partners in Europe, North America, and Asia.

“This funding marks an inflection point for AEye, as we scale our staff, partnerships and investments to align with our customers’ roadmap to commercialization,” said Luis Dussan, AEye founder and CEO. “Our strategic relationship with Taiwania will serve as a gateway to Asia, with valuable manufacturing, logistics and technology resources that will accelerate our ability to address the needs of a global market. We intend to launch our next generation product at CES, which we believe will help OEMs and Tier 1s accelerate their products and services by delivering market leading performance at the lowest cost.”

“We see AEye as the foremost innovator in this space, whose systems deliver highly precise, actionable information at speeds and distances never seen in commercially available LiDAR sensors,” said Huang Lee, Managing Partner at Taiwania. “We look forward to working closely with AEye’s team to explore and pursue growth opportunities in this burgeoning space.”

AEye takes a disruptive approach to vehicle perception by putting intelligence at the sensor layer and making it extensible and controllable via a software-driven architecture. The company’s iDAR system is an intelligent artificial perception system that physically fuses an agile, solid-state LiDAR with a hi-res camera to create a new data type called Dynamic Vixels, with integrated software-definable feedback control loops. This enables the iDAR sensor to dynamically assess and prioritize what’s most relevant in a scene, then process this data at the edge. This goes well beyond the function of legacy fixed-pattern LiDAR systems and standalone cameras with 2D computer vision algorithms. This unique approach enables rapid, dynamic perception and path planning for drastically improved autonomous vehicle safety and performance.

About AEye
AEye is an artificial perception pioneer and creator of iDAR™, a perception system that acts as the eyes and visual cortex of autonomous vehicles. Since demonstrating its solid-state LiDAR scanner in 2013, AEye has pioneered breakthroughs in intelligent sensing. The company is based in the San Francisco Bay Area and backed by world-renowned investors including Kleiner Perkins Caufield & Byers, Taiwania Capital, Intel Capital, and Airbus Ventures.

About Taiwania Capital
Taiwania Capital is a venture capital firm sponsored by the Taiwan government and large private enterprises. Founded in 2017, Taiwania Capital is focused on ICT-related sectors and startups in fields including: enterprise IT infrastructure and software, AI, IoT, network security, industrial automation, drones and robotics, next-gen semiconductors, autonomous vehicle technology, and digital devices. With offices in both Taiwan and Silicon Valley, Taiwania Capital exclusively backs startups that will turn the promises of technological advancement into scalable applications.

Media Contact:

AEye, Inc.
Jennifer Deitsch
[email protected]

925-400-4366


Tower Signs a Memorandum of Understanding to Sell its European Operations at an Accretive Value

LIVONIA, Mich., Nov. 20, 2018 /PRNewswire/ — Tower International, Inc. (NYSE: TOWR), a leading global manufacturer of engineered automotive structural metal components and assemblies, today announced it has signed a Memorandum of Understanding relating to the sale of all of its European Operations to Financière SNOP Dunois S.A. (“FSD”), a privately owned French automotive supplier.

Tower's European Operations include manufacturing facilities in Belgium, the Czech Republic, Germany, Italy, Poland and Slovakia, and offices in Germany and Italy. Financial results for full year 2018 are projected at revenue of $650 million and Adjusted EBITDA of $55 million. Before fees and other customary adjustments, the anticipated sale price represents an Enterprise Value of €255 million ($298 million at $1.17/Euro), an enterprise multiple of 5.4 times Adjusted EBITDA. This transaction multiple is significantly higher than the present multiple for Tower's common stock, which Tower estimates was approximately 4.5 times based on yesterday's closing price.
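The release's valuation arithmetic can be reproduced directly from the stated figures (a simple consistency check, not financial guidance):

```python
# Reproducing the press release's arithmetic: convert the euro enterprise
# value at the stated $1.17/EUR rate, then divide by projected Adjusted EBITDA.
ev_eur_m = 255            # Enterprise Value, EUR millions (stated)
usd_per_eur = 1.17        # exchange rate (stated)
adj_ebitda_usd_m = 55     # projected 2018 Adjusted EBITDA, USD millions (stated)

ev_usd_m = ev_eur_m * usd_per_eur        # ~$298 million
multiple = ev_usd_m / adj_ebitda_usd_m   # ~5.4x Adjusted EBITDA

print(round(ev_usd_m), round(multiple, 1))  # 298 5.4
```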

“This accretive transaction with FSD allows Tower to focus on a North American business with strong organic growth, profit margins and cash flow. It further strengthens Tower's balance sheet, enhances Tower's financial flexibility and accelerates Tower's ability to invest in additional accretive growth, reduce leverage and/or return capital to Tower shareholders,” said CEO Jim Gouin. “FSD's and Tower Europe's operations are very complementary in terms of both customer base and geographic footprint. This combination will allow Tower's assets and colleagues to be better utilized as part of this Pan-European entity.”

The memorandum of understanding signed by the parties, together with an unsigned stock purchase agreement, will be the basis on which the parties pursue the signing of a definitive agreement in the next few weeks, once works council consultation has taken place. Completion of the divestiture is expected to take place during the first quarter of 2019 and is subject to approval of the applicable antitrust authorities and other customary conditions. Tower expects to recognize a book loss of approximately $60 million related to the sale of the European operations. This one-time charge will include the reclassification of currency translation into earnings, other fair value adjustments and selling costs.

For this transaction, Rothschild & Co. served as Tower's M&A advisor, Freshfields Bruckhaus Deringer LLP was Tower's legal advisor, and De Brauw Blackstone Westbroek advised Tower on country specific legal matters. Tower also received advisory services from J.P. Morgan Securities LLC.

Tower to Host Conference Call Today at 2 p.m. EST

Tower will discuss this transaction and other related matters in a conference call at 2 p.m. EST today. Participants may listen to the audio portion of the conference call either through a live audio webcast on the Company's website or by telephone. The slide presentation and webcast can be accessed via the investor relations portion of Tower's website www.towerinternational.com. To dial into the conference call, domestic callers should dial (866) 393-4576, international callers should dial (706) 679-1462. An audio recording of the call will be available approximately two hours after the completion of the call. To access this recording, please dial (855) 859-2056 (domestic) or (404) 537-3406 (international) and reference Conference I.D. #1976027. A webcast replay will also be available and may be accessed via Tower's website.

Non-GAAP Financial Measures

This press release includes the following non-GAAP financial measures: “Adjusted EBITDA”, “Adjusted EBITDA Margin”, “Free Cash Flow”, and “Net Debt.” We define Adjusted EBITDA as net income / (loss) before interest, taxes, depreciation, amortization, restructuring items and other adjustments described in the reconciliations provided in this presentation. Adjusted EBITDA margin represents Adjusted EBITDA divided by revenues. Free Cash Flow is defined as cash provided by operating activities less cash disbursed for purchases of property, plant and equipment. Net Debt represents total debt less cash and cash equivalents. We use Adjusted EBITDA, Adjusted EBITDA margin, Free Cash Flow, and Net Debt as supplements to information provided in accordance with generally accepted accounting principles (“GAAP”) in evaluating our business and they are included in this presentation because they are principal factors upon which our management assesses performance. The non-GAAP measures presented above are not measures of performance under GAAP. These measures should not be considered as alternatives for the most directly comparable financial measures calculated in accordance with GAAP. Other companies in our industry may define these non-GAAP measures differently than we do and, as a result, these non-GAAP measures may not be comparable to similarly titled measures used by other companies in our industry; and certain of our non-GAAP financial measures exclude financial information that some may consider important in evaluating our performance. 
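The non-GAAP definitions above translate directly into formulas. The following sketch encodes them with made-up illustrative figures in $ millions; the numbers are not Tower's reported results:

```python
# Encoding the press release's non-GAAP definitions; all inputs are
# illustrative placeholder figures, not Tower's actual financials.

def adjusted_ebitda(net_income, interest, taxes, depreciation,
                    amortization, restructuring, other_adjustments):
    """Net income/(loss) before interest, taxes, D&A, restructuring, and other items."""
    return (net_income + interest + taxes + depreciation
            + amortization + restructuring + other_adjustments)

def adjusted_ebitda_margin(adj_ebitda, revenues):
    """Adjusted EBITDA divided by revenues."""
    return adj_ebitda / revenues

def free_cash_flow(cash_from_operations, capex):
    """Cash provided by operating activities less purchases of PP&E."""
    return cash_from_operations - capex

def net_debt(total_debt, cash_and_equivalents):
    """Total debt less cash and cash equivalents."""
    return total_debt - cash_and_equivalents

ebitda = adjusted_ebitda(20, 10, 5, 12, 3, 4, 1)
# Adjusted EBITDA of 55 on 650 of revenue gives an ~8.5% margin
print(ebitda, adjusted_ebitda_margin(ebitda, 650))
```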
Given the inherent uncertainty regarding mark to market adjustments of financial instruments, fair value adjustments to our pension plan, potential gain or loss on our discontinued operations, potential restructuring expenses, and expenses related to our long-term incentive compensation programs in any future period, a quantitative reconciliation of forward-looking financial measures to the most directly comparable financial measures calculated and presented in accordance with GAAP is not feasible. Consequently, any attempt to disclose such reconciliations would imply a degree of precision that could be confusing or misleading to investors. The magnitude of these items, however, may be significant.

Forward-Looking Statements and Risk Factors

This press release contains statements which constitute forward-looking statements, within the meaning of the Private Securities Litigation Reform Act of 1995, including but not limited to statements regarding the completion of the pending transactions in this presentation, the consequences of that transaction, projected enterprise value, anticipated stock valuation, positioning, projected truck revenues and the outlook for revenue, Adjusted EBITDA, Adjusted EBITDA Margin, Free Cash Flow, net new business, net debt and leverage. The forward-looking statements can be identified by words such as “anticipate,” “believe,” “plan,” “estimate,” “expect,” “intend,” “project,” “target,” and other similar expressions. Forward-looking statements are made as of the date of this presentation and are based upon management's current expectations and beliefs concerning future developments and their potential effects on us. Such forward-looking statements are not guarantees of future performance. The following important factors, as well as risk factors described in our reports filed with the SEC, could cause our actual results to differ materially from estimates or expectations reflected in such forward-looking statements:

global automobile production volumes;

the financial condition of our customers and suppliers;

our ability to make scheduled payments of principal or interest on our indebtedness and comply with the covenants and restrictions contained in the instruments governing our indebtedness;

our ability to refinance our indebtedness;

risks associated with our non-U.S. operations, including foreign exchange risks and economic uncertainty in some regions;

any increase in the expense and funding requirements of our pension and other postretirement benefits;

our customers' ability to obtain equity and debt financing for their businesses;

our dependence on our largest customers;

pricing pressure from our customers;

changes to U.S. trade and tariff policies and the reaction of other countries thereto;

work stoppages or other labor issues affecting us or our customers or suppliers;

our ability to integrate acquired businesses;

our ability to take advantage of emerging secular trends;

risks associated with business divestitures;

costs or liabilities relating to environmental and safety regulations;

our ability to close the pending transaction in accordance with anticipated terms; and

regulatory and other conditions that must be satisfied or, in certain circumstances, waived in order to consummate the pending transaction.

We do not assume any obligation to update or revise the forward-looking statements contained in this press release.

Contact:
Derek Fiebig
Executive Director, Investor & External Relations
(248) 675-6457
fiebig.derek@towerinternational.com

View original content:http://www.prnewswire.com/news-releases/tower-signs-a-memorandum-of-understanding-to-sell-its-european-operations-at-an-accretive-value-300753352.html

SOURCE Tower International, Inc.

AEye Sets New Benchmark for LiDAR Range

Introducing 1KM
AEye has set a new benchmark for LiDAR range. In performance specification tests monitored and validated by VSI Labs, AEye’s iDAR system acquired and tracked a truck at 1,000 meters – or one kilometer – five times the distance current LiDAR systems are able to detect!


Montupet UK Limited

Montupet UK Limited, a subsidiary of Montupet SA has received an Invest NI Grant for Research and Development, supporting company innovation in services, products and processes.
Part financed by the Investment for Growth and Jobs Programme for Northern Ireland, co-financed by the European Regional Development Fund.
The project funding during 2014 and 2015, relating to Panther Cylinder Head development, assisted the industrialisation and introduction of a new product using pioneering manufacturing techniques.

MONTUPET
202, quai de Clichy
BP77 – 92112 Clichy cedex France
Telephone:+33 (0)1 47 56 47 56
Fax: +33 (0)1 47 39 77 93


Bridgestone to Transfer “PureBeta” Ultra High Purity Fine Ceramics SiC Component Operations


2018/10/17

Tokyo (October 17, 2018) — Bridgestone Corporation announced on October 11 that it has reached an agreement with MARUWA CO., LTD., to transfer its “PureBeta” ultra high purity fine ceramics*1 silicon carbide (SiC) component operations to this company.

Since the launch of “PureBeta” in 2003, products from this line have been adopted by numerous semiconductor production equipment and semiconductor device manufacturers around the world, primarily for use as components in semiconductor production equipment. However, it was decided to transfer this business as part of the reorganization of the diversified products business*2, which is centered on solutions businesses, with the aim of achieving profitable and sustainable growth.

Financial information pertaining to “PureBeta” operations is not disclosed.

The Bridgestone Group is seeking to improve corporate value through higher capital efficiency while moving ahead with management reforms to make progress toward its ultimate management goals of becoming a truly global company and “Dan-totsu*3,” or the clear leader, in its industries.

Overview of MARUWA CO., LTD.

1. Company name: MARUWA CO., LTD.

2. Location: 3-83, Minamihonjigahara-cho, Owariasahi-city, Aichi, 488-0044, Japan

3. Establishment: April 1973

4. Capital: 8,646,720,000 JPY

5. Representative: Sei Kanbe

6. Business activities: Development, production, and sale of ceramics for electronics and industrial applications, and electronic parts

7. Number of employees: 1,815 (consolidated, as of March 31, 2018)

*1. High-performance ceramics that differ from standard ceramics in that they are made through optimal refining and mixing of raw material powders

*2. Business entity in Bridgestone Group offering products other than tires and sports equipment, namely conveyor belts, water drainage systems, and urethane foam products for vehicles

*3. The Japanese term for “the absolute and clear leader”

About Bridgestone Corporation:
Bridgestone Corporation, headquartered in Tokyo, is the world’s largest tire and rubber company. In addition to tires for use in a wide variety of applications, it also manufactures a broad range of diversified products, which include industrial rubber and chemical products and sporting goods. Its products are sold in over 150 nations and territories around the world.


New research and development center for electronics: a milestone for vehicle electrification at MAHLE


Stuttgart/Germany and Valencia/Spain, November 23, 2018 – MAHLE inaugurated a new research and development center yesterday at its location in Valencia/Spain. In the future, around 250 employees will work on developing new products and solutions for sustainable mobility at this competence center for vehicle electronics.
Press release (English version) [PDF; 36 KB]
Press release (Spanish version) [PDF; 54 KB]

New research and development center for electronics opened in Valencia
Location expanded to become a global competence center for electronics
MAHLE strengthens its competence as systems supplier for e-mobility

Electronic systems are becoming increasingly important in modern vehicles. The powertrain of the future is a system consisting of interconnected hardware and software that communicates and interacts intelligently within the vehicle. As a pioneer of future mobility solutions, MAHLE is therefore continuously expanding its activities in the area of electronics, with the company’s new research and development center in Valencia representing a significant element of this approach.
“As a creator of new and climate-friendly mobility solutions, we believe that the ongoing development of e-mobility is crucial. With our new research and development center, we’re strengthening our competence in the area of power electronics and consistently working toward our goal of becoming a holistic systems supplier in this field,” explains Dr. Jörg Stratmann, Chairman of the Management Board and CEO of the MAHLE Group.
In the future, power electronics and software solutions—e.g., for products such as electric drive systems and auxiliary components, charge management systems, or heating and cooling systems—will be developed in Valencia. Another area of focus will be the validation of systems in accordance with the standards of the automotive industry.
To effectively advance the electrification of vehicles, MAHLE has combined its activities relating to electric drives, actuators and auxiliaries, as well as control and power electronics in the Mechatronics division. These products are used in passenger cars, commercial vehicles, and off-highway vehicles.
“With its excellent universities, Valencia is an ideal hub for modern research and development. I’m convinced that our new research and development center will be an asset to the city and become a magnet for electronics development, attracting the next generation of engineers,” says Wilhelm Emperhoff, Member of the Management Board of the MAHLE Group and responsible for the Filtration and Engine Peripherals business unit as well as the Mechatronics division.
MAHLE’s goal is to make individual mobility more climate-friendly and sustainable. To achieve this, the company is pursuing a dual strategy. On the one hand, MAHLE is working intensively on the further optimization of the combustion engine. On the other, the company is developing solutions for the widespread adoption of e-mobility. As a key player in the automotive industry, MAHLE is therefore instrumental in shaping the future of mobility.
About MAHLE
MAHLE is a leading international development partner and supplier to the automotive industry as well as a pioneer for the mobility of the future. The MAHLE Group is committed to making transportation more efficient, more environmentally friendly, and more comfortable by continuously optimizing the combustion engine, driving forward the use of alternative fuels, and laying the foundation for the worldwide introduction of e-mobility. The group’s product portfolio addresses all the crucial issues relating to the powertrain and air conditioning technology—both for drives with combustion engines and for e-mobility. MAHLE products are fitted in at least every second vehicle worldwide. Components and systems from MAHLE are also used off the road—in stationary applications, for mobile machinery, rail transport, as well as marine applications.
In 2017, the group generated sales of approximately EUR 12.8 billion with about 78,000 employees and is represented in more than 30 countries with 170 production locations. At 16 major research and development centers in Germany, Great Britain, Luxembourg, Spain, Slovenia, the USA, Brazil, Japan, China, and India, around 6,100 development engineers and technicians are working on innovative solutions for the mobility of the future.
For further information, contact:
MAHLE GmbH
Margarete Dinger
Corporate Communications/Public Relations
Pragstraße 26–46
70376 Stuttgart/Germany
Phone: +49 711 501-12369
margarete.dinger@mahle.com


ZF Digital Convention: ZF Promotes Digital Transformation with Internal Congress

ZF’s ambitious future vision is clean and safe mobility that is automated, comfortable, affordable and accessible to everyone, everywhere, around the globe. Digitalization plays a central role: “It enables us to offer our products on a networked basis, comprehensively across diverse customer sectors,” said Mamatha Chamarthi, Chief Digital Officer at ZF Friedrichshafen AG. “To retain…