FOR IMMEDIATE RELEASE
Nidec Corporation
Tokyo Stock Exchange code: 6594
Contact:
Masahiro Nagayasu
General Manager
Investor Relations
+81-75-935-6140
ir@nidec.com
Released on January 28, 2019, in Kyoto, Japan
Nidec Announces the Status of Share Repurchases and
the Conclusion of the Share Repurchase Plan
(Repurchases of Shares Pursuant to Article 459, Paragraph 1, Item 1 of
the Company Law of Japan)
Nidec Corporation (TSE: 6594; OTC US: NJDCY) (the “Company”) today announces the status of the Company’s share repurchases under its repurchase plan in accordance with the Articles of Incorporation pursuant to Article 459, Paragraph 1, Item 1 of the Company Law of Japan.
The Company’s share repurchase plan authorized by the Board of Directors on January 24, 2018 was concluded as of January 28, 2019. Neither the number nor the yen amount of the shares repurchased reached the upper limit of this repurchase plan resolved by the Board of Directors, reflecting the stock market…
Cummins Implementing New Technologies With Potential to Revolutionize Manufacturing
COLUMBUS, Ind.–(BUSINESS WIRE)–Cummins Inc. (NYSE: CMI) is adding to its additive manufacturing capabilities by investing in a new, high-precision 3D metal printing technology called binder jet. This investment is the next step in Cummins’ plan to revolutionize its manufacturing processes and accelerate the company’s trajectory toward scaled production in additive technologies. Binder jetting is…
On April 11, 2019, AEye’s Technical Product Manager, Indu Vijayan, will speak on “AI & Machine Learning” at SAE World Congress in Detroit, Michigan.
Indu Vijayan is a specialist in systems, software, algorithms and perception for self-driving cars. As the Technical Product Manager at AEye, she leads software development for the company’s leading-edge artificial perception system for autonomous vehicles. Prior to AEye, Indu spent five years at Delphi/Aptiv, where, as a senior software engineer on the Autonomous Driving team, she played a major role in bridging ADAS sensors and algorithms, and extending them for mobility. She holds a Bachelor of Technology in Computer Science from India’s Amrita University and an MS in Computer Engineering from Stony Brook University.
We sat down with Indu to learn more about why the advancement of edge computing and AI is so critical to the rollout of safe and efficient autonomous vehicles…
Q: What does it mean to implement Artificial Intelligence “at the sensor level”?
AEye’s iDAR is the only artificial perception system that pushes data capture and processing to the edge of the network. We achieve this by fusing LiDAR and camera at the sensor to create the highest-quality data collection. Traditional LiDAR scanning methods give every part of a given scene the same importance. However, as we know from our own experience driving, not all objects are perceived with equal priority: when driving, we pay much more attention to a pedestrian standing near a crosswalk than to a tree. In the same sense, cars must be able to perceive as a human would in order to drive safely and efficiently. That means enabling the sensor to treat different regions or objects with varying degrees of priority and to collect only the most situationally relevant information.
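To make the idea of region-dependent priority concrete, here is a minimal sketch of allocating a fixed per-frame scan budget by priority. It is an illustrative assumption, not AEye's implementation; the region names, priority values, and `allocate_scan_budget` helper are invented for this example.

```python
from dataclasses import dataclass

@dataclass
class Region:
    name: str
    priority: int       # higher = more situationally relevant
    points_budget: int  # how densely this region gets scanned

def allocate_scan_budget(regions, total_points):
    """Split a fixed per-frame point budget across regions in proportion to priority."""
    total_priority = sum(r.priority for r in regions)
    for r in regions:
        r.points_budget = int(total_points * r.priority / total_priority)
    return regions

# Illustrative scene: a pedestrian near a crosswalk matters more than a roadside tree.
scene = [
    Region("pedestrian_near_crosswalk", priority=10, points_budget=0),
    Region("oncoming_lane", priority=5, points_budget=0),
    Region("roadside_tree", priority=1, points_budget=0),
]

for region in allocate_scan_budget(scene, total_points=100_000):
    print(region.name, region.points_budget)
```

Under these assumed numbers, the pedestrian region receives roughly ten times the point density of the tree, which is the kind of behavior the interview describes.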
Q: Why is this favorable to the development of advanced artificial perception systems?
Since iDAR is intelligent, it can efficiently cycle and prioritize sensory information, meaning it sends only the most relevant data to the vehicle’s path-planning system. In a conventional sensor system, layers upon layers of algorithms are needed to extract relevant, actionable data, which creates too much latency for the vehicle to navigate safely at highway speeds. Say you are driving 60 mph along a highway when, suddenly, you hear the siren of an ambulance coming from behind you, quickly closing in. In this instance, you are left with two choices: either stay in your lane and maintain your speed, or safely slow down and/or pull over to the side of the road. Whichever decision you make is determined by the auditory and visual cues you are receiving from the environment, such as the speed of the ambulance or the density of the traffic around you.
Just like human perception, our iDAR system creates feedback loops that efficiently cycle and prioritize sensory information. When humans gather information through the visual cortex, a feedback loop helps make each step of visual perception more efficient. Because we mimic this process in our system, we enable similar behavior to be learned and trained in autonomous vehicles so that they can make better, more accurate decisions, faster. As a result, the system continually learns and adapts, so that, over time, it becomes even better at identifying and tracking potential hazards.
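A minimal sketch of the kind of perception feedback loop described here, in which tracking results feed back into how aggressively each object is re-scanned. The field names, thresholds, and priority scale are assumptions for illustration only, not AEye's software.

```python
def update_priorities(tracks):
    """Raise scan priority for uncertain or fast-approaching objects,
    lower it for objects already tracked with high confidence."""
    for track in tracks:
        if track["confidence"] < 0.5 or track["closing_speed_mps"] > 10:
            track["scan_priority"] = min(track["scan_priority"] + 1, 10)
        else:
            track["scan_priority"] = max(track["scan_priority"] - 1, 1)
    return tracks

# One iteration of the loop: perceive -> track -> re-prioritize -> perceive again.
tracks = [
    {"id": "pedestrian", "confidence": 0.4, "closing_speed_mps": 1.2, "scan_priority": 5},
    {"id": "parked_car", "confidence": 0.95, "closing_speed_mps": 0.0, "scan_priority": 5},
]
print(update_priorities(tracks))
```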
And because we only scan for and retrieve the most relevant information in a scene, we can optimize cost and power. For instance, by emphasizing data quality over data quantity, we reduce the need for a high-end, power-hungry processor hiding in the trunk of the car to run our AI algorithms. This not only makes us more cost-effective, but could also allow the power budget inside an electric vehicle to be redistributed to enable longer-range performance, as an example. Most importantly, it allows us to build systems that are scalable and optimized for the full value chain.
Q: You will be speaking at SAE World Congress in Detroit, one of the largest gatherings of automotive industry engineers. Why is it so important for advanced automotive systems developers to regularly meet and discuss new ideas and innovations in the industry?
Ultimately, autonomous vehicles will spark a radical shift in our society. Not only will they make safer and more efficient public transportation accessible to the masses, they will give us back time for meaningful tasks that would otherwise be lost to a long commute. Engineers are the leaders in bringing about this societal change. The dream of safe, fully automated vehicles is a herculean task to take on, but it’s one that is desperately needed to move society forward. Opportunities like SAE World Congress allow engineers to brainstorm and put the foundational stones in place for a safer tomorrow.
Stuttgart/Munich, April 17, 2019
Combined fuel consumption 6.7-4.4 l/100 km; combined CO2 emissions 153-117 g/km*
The new Mercedes-Benz CLA Coupé: The main points at a glance
Stuttgart/Munich, Apr 17, 2019
The new Mercedes-Benz CLA Coupé: Cool, laid-back, exciting
Stuttgart/Munich, Apr 17, 2019
The exterior design: Pure emotion
Stuttgart/Munich, Apr 17, 2019
Interior design: The cool cocoon for digital natives
Stuttgart/Munich, Apr 17, 2019
Aerodynamics: Well-developed in the virtual wind tunnel
Stuttgart/Munich, Apr 17, 2019
Interview with Dr Teddy Woll, Head of Aerodynamics and Wind Tunnel: “Aerodynamics is the most efficient way to even more efficiency”
Stuttgart/Munich, Apr 17, 2019
MBUX Mercedes-Benz User Experience: MBUX keeps on learning
Stuttgart/Munich, Apr 17, 2019
ENERGIZING comfort control: Wellness on wheels
Stuttgart/Munich, Apr 17, 2019
Chassis: Agilely stylish
Stuttgart/Munich, Apr 17, 2019
Powertrain: Clean, strong and comfortable
Stuttgart/Munich, Apr 17, 2019
Under the microscope: RDE – On-road measurement of exhaust emissions: Realistic verification of lab values
Stuttgart/Munich, Apr 17, 2019
Technical data
Stuttgart/Munich, Apr 17, 2019
The Driver Assistance Systems and MULTIBEAM LED light: Watchdog
Stuttgart/Munich, Apr 17, 2019
Henk de Bruin — Sustainability Advisor
Building on our mission to provide clean mobility for everyone, and to further develop and maintain the Lightyear culture in which sustainability is one of the five company values, we have analyzed what is material for Lightyear and its stakeholders. Three main subjects were selected to focus on: Carbon Footprint, Circular Economy and Supplier Sustainability.
Carbon Footprint
In our efforts to support the reduction of global warming, the benefit of an electric vehicle with no tailpipe emissions is clear. That benefit is enlarged substantially if such a car is solar powered, because additional charging from the grid can be reduced to little or none, depending on geography.
Indirect CO2 emissions of electric power plants supplying grid charging can be low or even zero when coming from renewable sources. Carbon emissions, however, also occur during the making of a car and at the end-of-life of a vehicle. At Lightyear, we are building up o..
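As a rough, back-of-the-envelope illustration of the grid-charging point above: per-kilometer charging emissions are simply energy use multiplied by grid carbon intensity. The consumption and grid-intensity figures below are assumptions chosen for illustration, not Lightyear data.

```python
# Illustrative per-km CO2 from grid charging: energy use x grid carbon intensity.
energy_use_kwh_per_km = 0.10  # assumed EV consumption (10 kWh per 100 km)

for grid_gco2_per_kwh in (0, 50, 400):  # renewable, low-carbon, and fossil-heavy grids
    gco2_per_km = energy_use_kwh_per_km * grid_gco2_per_kwh
    print(f"grid {grid_gco2_per_kwh} gCO2/kWh -> {gco2_per_km:.0f} gCO2/km from charging")
```

On a fully renewable grid the charging term drops to zero, which is why manufacturing and end-of-life emissions become the remaining focus.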
Gestamp announces at Auto Shanghai 2019 a deeper cooperation with BHAP (April 16, 2019)
Gestamp, the multinational company specialized in the design, development and manufacture of highly engineered metal components for the automotive industry, today signed a strategic Memorandum of Understanding (MoU) with Beijing Hainachuan Automotive Parts Co. Ltd. (BHAP). The agreement with BHAP aims to increase cooperation in areas such as electric vehicles, as well as to…
Lidar Makes it Easy to Build Accurate 3D Models of Any Environment
April 16, 2019 | In Media Coverage | By Albie Jarvis
The Puck by Velodyne Lidar used on a backpack by Green Valley International
Mobile mapping and surveying are seeing a growing number of systems adding lidar technology to drones, backpacks, and all-terrain vehicles (ATVs) to build new applications. Frank Bertini, UAV and Robotics Business Manager at Velodyne Lidar, was interviewed by LidarSurvey.net and discussed how lidar can be used for mobile mapping and aerial surveying. In the interview, Bertini outlined which Velodyne products are most suitable for these applications:
“Velodyne’s original HDL-32 is still the most accurate 3D lidar sensor available on the market today. With a range of 100m and accuracies down to 1.2cm on each of its 32 laser channels, the sensor is a great choice for the professional surveyor. The sensor takes in a huge swath of data, close to 1 million points per second, which reduces the time to complete large acreage surveying jobs. Velodyne also offers the economical VLP-16-LITE. At just 590 grams, it is a great choice for lighter weight and lower cost drones and UAVs. The extra-long range VLP-32 is also an option.”
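As a quick sanity check on the quoted figures, the per-channel data rate follows directly from the total point rate and channel count. This is rough arithmetic from the numbers in the quote, not a Velodyne specification; details such as dual-return modes are ignored.

```python
# Rough arithmetic from the quoted HDL-32 figures.
points_per_second = 1_000_000  # "close to 1 million points per second"
channels = 32                  # 32 laser channels
range_m = 100                  # quoted range
accuracy_cm = 1.2              # quoted per-channel accuracy

points_per_channel = points_per_second / channels
print(f"~{points_per_channel:,.0f} points per second per channel")
print(f"range {range_m} m with ~{accuracy_cm} cm accuracy per channel")
```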
Expect to see lidar technology deployed in an increasing array of applications such as city asset management and power line management.
Scaling Lidar Production to Address Growing Global Demand
April 16, 2019
The Alpha Puck, Ultra Puck, and Puck by Velodyne Lidar
Advanced manufacturing techniques are needed to build automotive-grade lidar sensors at mass production levels. An Automotive News story by Jack Walsworth called “Velodyne rethinks Alpha Puck factory” explored how Velodyne developed an innovative manufacturing system in order to offer the Alpha Puck™…
Published April 16, 2019 2:00 pm, Via NYC
Via and King County Metro Deploy Microtransit Service Connecting Seattle and Tukwila Residents to Public Transit
The new first- and last-mile service will support five major Sound Transit Link light rail stations
April 16, 2019 (Seattle, WA) — Via, the world’s leading provider and developer of on-demand shared mobility solutions, announced today a new microtransit deployment in Seattle, Washington aiming to connect more residents to public transportation. In partnership with King County Metro, Sound Transit and the City of Seattle, the new service brings on-demand connections to five transit hubs, offering first- and last-mile service in southeast Seattle and Tukwila at no additional charge.
Starting April 16, Via will offer service to five Sound Transit Link light rail stations: Mount Baker, Columbia City, Othello, Rainier Beach, and Tukwila International Boulevard. Customers also have the choice to hop on board one of the many Metro bus routes that connect to the Link light rail stations, further connecting the residents of King County.
“Via’s technology is redefining mobility across the globe, and we are thrilled to partner with King County Metro, an innovation-forward agency, to provide residents with a convenient, affordable, and congestion-reducing dynamic transportation alternative,” said Daniel Ramot, Co-founder and CEO of Via. “Via’s powerful passenger matching and vehicle routing algorithm is the solution to the first- and last-mile challenge, seamlessly integrating into the existing public transit infrastructure to connect residents to transit hubs in their communities.”
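To give a sense of what dynamic passenger-to-vehicle matching can look like, here is a highly simplified greedy nearest-vehicle assignment. It is a hypothetical sketch for illustration only; Via's proprietary matching and routing algorithm is not described in this release, and the data structures and seat limits below are assumptions.

```python
import math

def euclid(a, b):
    """Straight-line distance between two (x, y) points."""
    return math.hypot(a[0] - b[0], a[1] - b[1])

def match_riders_to_vehicles(riders, vehicles, seats_per_vehicle=4):
    """Greedy assignment: each rider joins the nearest vehicle with a free seat."""
    assignments = {v["id"]: [] for v in vehicles}
    for rider in riders:
        candidates = [v for v in vehicles if len(assignments[v["id"]]) < seats_per_vehicle]
        if not candidates:
            break  # no capacity left
        nearest = min(candidates, key=lambda v: euclid(v["pos"], rider["pickup"]))
        assignments[nearest["id"]].append(rider["id"])
    return assignments

# Two riders headed to a transit hub, two shuttle vans in the area.
riders = [{"id": "r1", "pickup": (0.0, 1.0)}, {"id": "r2", "pickup": (5.0, 5.0)}]
vehicles = [{"id": "van_a", "pos": (0.0, 0.0)}, {"id": "van_b", "pos": (6.0, 6.0)}]
print(match_riders_to_vehicles(riders, vehicles))
```

A production system would additionally consider detours for riders already on board, traffic, and arrival-time windows at the transit hub, which is where the real routing complexity lies.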
The year-long pilot project is partly funded by $2.7 million from the voter-approved Seattle Transportation Benefit District. Sound Transit also successfully applied for a Mobility on Demand Sandbox grant from the Federal Transit Administration to test the effectiveness of providing on-demand ride-share connections to transit stations.
“We are making it more convenient than ever to hop on board our high-capacity regional transit system,” said King County Executive Dow Constantine. “The on-demand service we are bringing to southeast Seattle and Tukwila reflects our commitment to outstanding customer service, making it easy to take transit to work, school, or play, and back home again.”
Commuters, students, and visitors can download the Via app or call 206-258-7739 to book a ride. All ORCA public transportation passes are accepted upon boarding Via vehicles and will automatically apply as a transfer toward a Metro bus or Link light rail trip. Standard Metro fares apply.
Via to Transit will make it more convenient for customers to connect with the region’s growing transit system. On-demand services like this make it easier to take transit for commuters who do not own a car or prefer not to drive and park, who live a long walk from a transit hub, or who can’t find open spaces at park-and-rides.
Via has been tapped by cities and transportation players around the world to help re-engineer public transit from a regulated system of rigid routes and schedules to a fully dynamic, on-demand network. Via now has more than 60 launched and pending deployments in more than 15 countries. To learn more about Via, visit www.platform.ridewithvia.com.
About Via
Via is re-engineering public transit, from a regulated system of rigid routes and schedules to a fully dynamic, on-demand network. Via’s mobile app connects multiple passengers who are headed the same way, allowing riders to seamlessly share a premium vehicle. First launched in New York City in September 2013, the Via platform operates in the United States and in Europe through its joint venture with Mercedes-Benz Vans, ViaVan. Via’s technology is also deployed worldwide through dozens of partner projects with public transportation agencies, private transit operators, taxi fleets, private companies, and universities, seamlessly integrating with public transit infrastructure to power cutting-edge on-demand mobility. For more information, visit www.platform.ridewithvia.com.