Rethinking the Four “Rs” of LiDAR: Rate, Resolution, Returns and Range

Extending Conventional LiDAR Metrics to Better Evaluate Advanced Sensor Systems

By Blair LaCorte, Luis Dussan, Allan Steinhardt, and Barry Behnken
Executive Summary

As the autonomous vehicle market matures, sensor and perception engineers have become increasingly sophisticated in how they evaluate system efficiency, reliability, and performance. Many industry leaders have recognized that conventional metrics for LiDAR data collection (such as frame rate, full frame resolution, points per second, and detection range) no longer adequately measure the effectiveness of sensors to solve real-world use cases that underlie autonomous driving.
First generation LiDAR sensors passively search a scene and detect objects using background patterns that are fixed in both time (no ability to enhance with a faster revisit) and in space (no ability to apply extra resolution to high interest areas like the road surface or pedestrians). A new class of solid-state, high-performance, active LiDAR sensors enable intelligent information capture that expands their capabilities — moving from “passive search” or detection of objects, to “active search,” and in many cases, to the actual acquisition of classification attributes of objects in real time.
Because early generation LiDARs use passive fixed raster scans, the industry adopted very simplistic performance metrics that don’t capture all the nuances of the sensor requirements needed to enable AVs. In response, AEye is proposing the consideration of four new corresponding metrics for extending LiDAR evaluation. Specifically: extending the metric of frame rate to include object revisit rate; extending the metric of resolution to capture instantaneous resolution; extending points per second to signify the overall more useful quality returns per second; and extending detection range to reflect the more critically important object classification range.
We are proposing that these new metrics be used in conjunction with existing measurements of basic camera, radar, and passive LiDAR performance. These extended metrics measure a sensor’s ability to intelligently enhance perception and create a more complete evaluation of a sensor system’s efficacy in improving the safety and performance of autonomous vehicles in real-world scenarios.
Introduction

Our industry has leveraged proven frameworks from advanced robotic vision research and applied them to LiDAR-specific product architectures. One framework, “Search, Acquire [or classify], and Act,” has proven to be both versatile and instructive relative to object identification.
Search is the ability to detect any and all objects without the risk of missing anything.

Acquire is defined as the ability to take a search detection and enhance the understanding of an object’s attributes to accelerate classification and determine possible intent (this could be done by classifying object type or by calculating velocity).

Act defines an appropriate sensor response as trained, or as recommended, by the vehicle’s perception system or domain controller. Responses can largely fall into four categories:

1. Continue scan for new objects with no enhanced information required;
2. Continue scan and interrogate the object further, gathering more information on an acquired object’s attributes to enable classification;
3. Continue scan and track an object classified as non-threatening;
4. Continue scan and instruct the control system to take evasive action.

Within this framework, performance specifications and system effectiveness need to be assessed with an “eye” firmly on the ultimate objective: completely safe operation of the vehicle. However, as most LiDAR systems today are passive, they are only capable of basic search. Therefore, conventional metrics used for evaluating these systems’ performance relate to basic object detection capabilities – frame rate, resolution, points per second, and detection range. If safety is the ultimate goal, then search needs to be more intelligent, and acquisition (and classification) done more quickly and accurately so that the sensor or the vehicle can determine how to act immediately.
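The four response categories above can be sketched as a minimal decision routine. This is an illustrative toy, not AEye’s API; the function and parameter names are hypothetical.

```python
from enum import Enum, auto

class Response(Enum):
    """The four 'Act' categories described in the framework."""
    CONTINUE_SCAN = auto()    # no enhanced information required
    INTERROGATE = auto()      # gather more attributes to enable classification
    TRACK = auto()            # object classified as non-threatening
    EVASIVE_ACTION = auto()   # instruct the control system to act

def choose_response(detected: bool, classified: bool, threatening: bool) -> Response:
    """Toy mapping from an object's detection state to an Act category."""
    if not detected:
        return Response.CONTINUE_SCAN
    if not classified:
        return Response.INTERROGATE   # acquire classification attributes first
    return Response.EVASIVE_ACTION if threatening else Response.TRACK
```

A low-confidence detection (detected but not yet classified) maps to further interrogation, mirroring the accelerate-classification step the framework describes.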
Rethinking the Metrics

Makers of automotive LiDAR systems are frequently asked about their frame rate, and whether or not their technology has the ability to detect objects with 10% reflectivity at some range (often 230 meters). We believe these benchmarks are required, but insufficient as they don’t capture critical details, such as the size of the target, the speed at which it needs to be detected and recognized, or the cost of collecting that information.
We believe it would be productive for the industry to adopt a more holistic approach when it comes to assessing LiDAR systems for automotive use. We argue that we must look at metrics as they relate to a perception system in general, rather than as an individual point sensor, and ask ourselves: “What information would enable a perception system to make better, faster decisions?” In this white paper, we outline the four conventional LiDAR metrics with recommendations on how to extend them.
Conventional Metric #1: Frame Rate of 10Hz – 20Hz
Extended Metric: Object Revisit Rate
The time between two shots at the same point or set of points.

Defining single point detection range alone is insufficient because a single interrogation point (shot) rarely delivers sufficient confidence – it is only suggestive. Therefore, passive LiDAR systems need either multiple interrogations/detects at the same location or multiple interrogations/detects on the same object to validate an object or scene. In passive LiDAR systems, the time it takes to detect an object is dependent on many variables, such as distance, interrogation pattern, resolution, reflectivity, the shape of the object, and the scan rate.
A key factor missing from the conventional metric is a finer definition of time. Thus, we propose that object revisit rate become a new, more refined metric for automotive LiDAR because a high-performance, active LiDAR, such as AEye’s iDAR™, has the ability to revisit an object within the same frame. The time between the first and second measurement of an object is critical, as shorter object revisit times keep processing times low for advanced algorithms that correlate multiple moving objects in a scene. The best algorithms used to associate/correlate multiple moving objects can be confused when time elapsed between samples is high. This lengthy combined processing time, or latency, is a primary issue for the industry.
The active iDAR platform accelerates revisit rate by allowing for intelligent shot scheduling within a frame. Not only can iDAR interrogate a position or object multiple times within a conventional frame, it can maintain a background search pattern while simultaneously overlaying additional intelligent shots. For example, an iDAR sensor can schedule two repeated shots on an object of interest in quick succession (30μsec). These multiple interrogations can be contextually integrated with the needs of the user (either human or computer) to increase confidence, reduce latency, or extend ranging performance.
These additional interrogations can also be data dependent. For example, an object can be revisited if a low confidence detection occurs, and it is desirable to quickly validate or reject it, enabled with secondary data and measurement, as seen in Figure 1. A typical frame rate for conventional passive sensors is 10Hz. For conventional passive sensors, this is the object revisit rate. With AEye’s active iDAR technology, the object revisit rate is now different from the frame rate, and it can be as low as tens of microseconds between revisits to key points/objects – easily 100x to 1000x faster than conventional passive sensors.
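The gap between a frame-based revisit and an intelligent intra-frame revisit can be made concrete with the numbers cited above (10Hz frame rate, 30μsec repeated shots). This is a back-of-the-envelope sketch; the helper function is illustrative only.

```python
def revisit_interval_s(frame_rate_hz: float) -> float:
    """For a passive fixed-scan LiDAR, the object revisit interval
    is simply one full frame period."""
    return 1.0 / frame_rate_hz

# Passive sensor: 10 Hz frame rate -> 100 ms between revisits of any point.
passive_dt = revisit_interval_s(10.0)

# Active sensor: intelligent shot scheduling can revisit a point of
# interest within the same frame, e.g. 30 microseconds apart.
active_dt = 30e-6

speedup = passive_dt / active_dt
print(f"passive: {passive_dt * 1e3:.0f} ms, "
      f"active: {active_dt * 1e6:.0f} us, ~{speedup:.0f}x faster")
```

With these example numbers the active revisit is over three orders of magnitude faster, comfortably within the “easily 100x to 1000x” range claimed for key points and objects.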
What this means is that a perception engineering team using dynamic object revisit capabilities can create a perception system that is at least an order of magnitude faster than what can be delivered by conventional passive LiDAR without disrupting the background scan patterns. We believe this capability is invaluable for delivering level 4/5 autonomy as the vehicle will need to handle complex edge cases, such as identifying a pedestrian in front of oncoming headlights or a flatbed semi-trailer laterally crossing the path of the vehicle.

Figure 1. Advanced active LiDAR sensors utilize intelligent scan patterns that enable an Object Revisit Interval, such as the random scan pattern of AEye’s iDAR (B). This is compared to the Revisit Interval on a passive, fixed pattern LiDAR (A). For example, in this instance, iDAR is able to get eight detects on a vehicle, while passive, fixed pattern LiDAR can only achieve one.
Within the “Search, Acquire, and Act” framework, an accelerated object revisit rate, therefore, allows for faster acquisition because it can identify and automatically revisit an object, painting a more complete picture of it within the context of the scene. Ultimately, this allows for collection of object classification attributes in the sensor, as well as efficient and effective interrogation and tracking of a potential threat.
Real-World Applications

Use Case: Head-On Detection

When you’re driving, the world can change dramatically in a tenth of a second. In fact, two cars traveling towards each other at 100 kph are 5.5 meters closer after 0.1 seconds. By having an accelerated revisit rate, we increase the likelihood of hitting the same target with a subsequent shot due to the decreased likelihood that the target has moved significantly in the time between shots. This helps the user solve the “Correspondence Problem,” determining which parts of one “snapshot” of a dynamic scene correspond to which parts of another snapshot of the same scene. It does this while simultaneously enabling the user to quickly build statistical measures of confidence and generate aggregate information for downstream processing.
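The head-on arithmetic generalizes directly: relative displacement between shots is just closing speed times revisit interval. A small sketch (function name is ours, for illustration):

```python
def closing_displacement_m(speed_a_kph: float, speed_b_kph: float,
                           dt_s: float) -> float:
    """Distance by which two head-on vehicles close in dt_s seconds."""
    closing_speed_ms = (speed_a_kph + speed_b_kph) * 1000.0 / 3600.0
    return closing_speed_ms * dt_s

# Two cars at 100 kph each, one 10 Hz frame apart (0.1 s): ~5.5 m closer.
frame_gap = closing_displacement_m(100, 100, 0.1)

# Same cars, one 30-microsecond revisit apart: under 2 mm of relative
# motion, which makes shot-to-shot correspondence nearly trivial.
revisit_gap = closing_displacement_m(100, 100, 30e-6)

print(f"{frame_gap:.2f} m vs {revisit_gap * 1000:.2f} mm")
```

The smaller the displacement between samples, the easier it is to associate the two detections with the same object, which is exactly why a short revisit interval simplifies the Correspondence Problem.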

Coffee Talk: Bob Brown

Each week, we sit down with a different member of AEye’s leadership team to discuss their role, their view of challenges and opportunities in the industry, and their take on what lies ahead.
This week, we talk with AEye CFO, Bob Brown.

1. You have a great deal of experience in the LiDAR industry. Tell us a little bit about that.
I was at a couple of LiDAR companies before coming to AEye, so I tell people I saved the best one for last. I’m a big believer in LiDAR as a market opportunity. There’s a huge need for this technology, and LiDAR will be able to go everywhere, ultimately.
My background has been a mix of public and private companies, and a lot of hardware-oriented companies with software mixed in. I spent a number of years at HP, and my longest stretch was about 14 years at the semiconductor company, LSI Logic. I also worked at Cadence, and have done some things in the startup world. This is my fifth private company.
While I’ve done a variety of things, my depth of experience is highest in hardware-oriented companies, and you have to have a system knowledge when you’re doing hardware. That system capability is critical, and very helpful to understanding how LiDAR works, because it’s a combination of great hardware and great software that really creates a disruptive solution.
2. Let’s talk a little bit about that. You’ve worked with large companies, small companies, LiDAR, a lot in the hardware space. What attracted you to come to AEye?
I was introduced to the company by someone I know and was just very impressed with four major elements.
One is the team. This team is absolutely phenomenal – great experience across the board – just enormous capabilities in terms of technology and business expertise. Everybody at each position is extremely strong. I would say the team, coupled with the culture of the company. Those two elements are critical, and I was convinced AEye would be a great fit for me personally, and that the company could succeed with the team that we’ve got in place.
Second, the technology is absolutely best in class. I’ve seen a lot of LiDAR solutions over the years, and I’m convinced this is by far the best one in the market, and really leapfrogs the competition pretty dramatically.
Third is customers and partners. We’ve really established an “A” list of customers and partners, including Hella, Continental and a number of other companies that are leaders in their spaces. The fact that we won some of those companies is a testament to what this team and this technology can do.
Finally, we’ve got a great business model in terms of how we’re approaching the market and how we’re working with Tier 1s and how we’re working with all of our customers in a unique way.
It’s a combination of those four elements that convinced me that AEye would be a winner and that it would also be a great place to work.
3. This role clearly requires you to be in lock-step with the CEO. Can you talk about how you and AEye’s CEO, Blair LaCorte, collaborate, and what that looks like during a pandemic?
You really need a strong partnership between the CEO and CFO to have a successful company long term, and Blair and I quickly established that kind of partnership. I think it was pretty clear to both of us when we met that we would work very well together, and that was key to making the decision to come to AEye as well. You’ve got to have chemistry between the executive staff, and the CEO and CFO relationship is a critical one.
We spend a lot of time together, lockdown or otherwise. We talk multiple times a day, usually starting early in the morning, and we’re on calls with each other and texting throughout the day. It’s just a constant communication stream between Blair and me all day long, seven days a week.
4. You’ve held finance roles at large global corporations like Cadence, LSI and HP, and also several startups. How has your experience at the former informed how you guide the latter?
I think it was a great development path for me to start with some of the larger companies, because I learned the processes and structures needed to run a very large corporation successfully. Those learnings can be mapped to a startup, but in a very different way. By that, I mean there’s the nimbleness and aggressiveness that you need as part of a smaller company in a high growth environment like AEye’s, but you must marry that with some of the best practices that you develop from these larger companies. It’s a combination of keeping the speed and quick decision-making that you need as a startup, but embracing some of the process and structure critical to successful growth. You can’t cause things to slow down too much or you disrupt what you’re trying to do. It’s a real balancing act to get it right so that you’re getting the best of both worlds effectively. That’s the objective that I’m always striving for as CFO of AEye.
5. The world has changed in every way due to the pandemic, including how teams operate and deals are done. Can you shed light on that?
Certainly, everybody’s adapting to a remote work environment, which can be challenging, but I’ve noticed AEye is very adept at that. Many companies seem to be struggling with being remote. I think part of making it work is having a great, collaborative team environment, which AEye does. I think that has enabled AEye to transition into this remote work environment better than some.
Finance people are usually used to working in close proximity to each other and being able to go down the hall and chat with people and ask questions, so we’ve got to adapt to that like everybody else and use tools and processes that you wouldn’t have used quite as much in the past. You have to do the accounting and finance jobs very efficiently on a remote basis and take advantage of some of the software and communication tools available to help enable that.
In the world of finance and capital raising, things are now being done through Zoom calls instead of flying all over the country or the world for personal meetings. Certainly, I’ve done that before, raising money, flying all over the world to meet with investors. This environment is one where it’s a lot more efficient. In some ways, it’s better because you can see a lot more people a lot more quickly than you could otherwise, and it’s easier to organize meetings on the spur of the moment. However, you lose that personal connection from meeting people face to face. Either way, it will be a theme that’s going to continue into 2021 until we get broadly distributed vaccines, and even then, we’ll still be cautious probably for a while.
6. You are a Michigan native, so, having grown up in the Motor City, I have to ask, did you grow up with an affinity for cars?
I love cars. I grew up around cars, being from Michigan. My dad worked for Ford and my grandfather worked for Chrysler, and all of our friends were from the auto industry, so I definitely grew up around that. It’s always been a big interest for me. I still subscribe to all the car magazines, Motor Trend, Car and Driver, and Road and Track, so I’m always interested in what exciting new cars are coming out.
7. And just for kicks, what’s your favorite mode of transportation, and why?
I have to say cars, definitely cars. I enjoy other things as well. I like biking. I grew up riding dirt bikes as well. I haven’t done that in many years, but at some point, maybe when I retire, I’ll get a motorcycle again. We’ll see.

SAE – ADAS to Automated Driving Digital Summit – December 8-9, 2020

December 8-9, 2020 | ADAS to Automated Driving Digital Summit
Tuesday, December 8th, 1PM EST / 10AM PST | Panel Discussion: Enabling of ADAS and AVs through AI/ML Learning
Speaker: Abhijit Thatte – SVP of Software Engineering and AI, AEye

AEye Insights: 2021 Transportation Trends

In this installment of the AEye Insights series, AEye Founder and VP of Corporate Development, Jordan Greene, interviews Reilly Brennan, General Partner at Trucks Venture Capital and author of the popular newsletter, Future of Transportation (FoT). In this discussion, Reilly and Jordan pull out their crystal ball to talk about COVID’s lasting effects on logistics and delivery, the challenges in store for “structured” autonomy, what’s behind the flurry of AV and sensor SPACs, and why AV engineers are like chefs.
JG: Welcome to AEye Insights, where we talk industry trends with proven business leaders. Our guest today is Reilly Brennan, General Partner at Trucks Venture Capital and author of the popular newsletter, Future of Transportation. Reilly, welcome and thanks for joining us.
RB: Thanks, Jordan, it’s great to talk to you. Usually our conversations are texts in the middle of the night about something big that’s going on, so to put it on Zoom here I feel is a little funny, but I’m happy to be here with you.
JG: This is true. It’s usually very candid off-the-cuff discussions, so I’m excited to riff about where the transportation market’s headed and I think that you’ll have some really interesting insights given the very holistic approach to the market that you take in the newsletter that you put out. I think the obvious way to start, and one of the most common subjects that’s discussed, is what is the impact of COVID on all of these different areas of the market or segments of the market? How do you think it’s going to play out with autonomous, ADAS and all the supporting functions in the automotive value chain?
RB: Sure, well, first of all, thank you to the team for having me on. Before I get into your question, AEye was one of the first investments we made at Trucks Venture Capital back in 2016, right when our fund was getting off the ground, so it’s been great to be a partner of AEye for now over four years. My background coming into this discussion is: we run a small venture capital fund focused on transportation that’s very early stage. So, usually we’re giving the first check to a company as they’re getting off the ground. And it’s interesting because COVID has accelerated a lot of things that people think about, like software and things related to delivery, while also probably tamping down on some of the expectations for things that shared an asset like shared scooters and things like that, which had, in the beginning, a more difficult time. I think the big change with COVID, though, is all the behavioral changes that are probably going to be closer to permanent. There’s that old saying that it takes 28 days to form a habit. We’re well past the 28-day point with COVID, so all those things that people have been doing around delivery, particularly, I’m really fascinated about how those stick after a vaccine.
I think if you look at the way that most of those services work — for example, we have a Shipt subscription, which you buy for 12 months, or Instacart would be the same way — so even if the vaccine gets delivered and everybody is “cured tomorrow”, there’s going to be a lot of people who have already paid in advance for a lot of the stuff. And I think a lot of those behavioral changes will remain, and that’s going to be interesting to watch over the next few months.
JG: So it’s on the consumer buying side that you see those behavioral changes having the biggest impact, and I’m curious about what that means from the technology side when you start looking up the value chain from there: what are the implications that has on what people start to invest in to make those more streamlined logistics and everything else?
RB: Well, I guess one of the maybe big opportunities is, are we going to continue to operate in this universe where Amazon becomes the primary mechanism in North America to get you the goods that you need? A lot of people now think Amazon first and Amazon, of course, is building out a really interesting logistics network, potentially, with their Zoox acquisition, and potentially more with commercial vehicles down the road. Maybe the bigger question is, are there other ecosystems out there that are going to drive as much enthusiasm, and where you think of them first? And in that regard, I think about Shopify and other platforms that are out there in the ether that are knitting together a lot of parts of e-commerce. And ultimately, although it sounds kind of insane right now, would Shopify eventually have a logistics network? Would Shopify have a need for vehicles for trucking, for commercial delivery? I think you could probably imagine that’s not too far away for a Walmart in the United States or potentially a Loblaw in Canada or Costco or any of these other big retail entities to support systems like that. And so, I think about first the software that knits together the commerce and ultimately getting that to your door. Those are some of the things I’m thinking about to your question.
JG: And there are, I would assume, multiple levels of vertical integration within that logistics model, because I’ve seen everything from the companies that do just the logistics side of things and are trying to set up the ways to track items from the ports through to the consumer to the actual first-mile last-mile delivery vehicles and everything in between. I’m very curious what you see as the greatest opportunity there, because there are people like Amazon, who arguably are going to want to control everything end-to-end, and they started from the touch point of the consumer. I’m curious how that will impact companies and how they want to fit into the chain. What are your thoughts on that? And have you seen anything really cool?
RB: Yeah, well, there’s a couple of macro trends to think about. One is, so much of commercial logistics is moving to short haul: things under 500 miles. So, whereas historically, you might move a bunch of goods from a big assembly facility or distribution center thousands of miles to get it to a local distribution center and ultimately put it in a retail store, now there are so many more moves around a retail location. A company might have multiple distribution centers, and that just as the background of this discussion is really fascinating from a real estate perspective. But it also means all the really different points along the journey that different entities or different vehicles might move a good. Instead of one long, sort of like linehaul trip to a distribution center (DC), you break that up into many other different trips, so the value of real estate, and the people who are holding these goods is really fascinating. There’s a company I’m sure you’ve heard of called Prologis. I’m always kind of keen on following what Prologis is doing because many times they’re Amazon’s landlord.
They’re the people who will actually go and break ground, put in a warehouse, and then invite customers like Amazon, etc., in to use their warehouse, and they’ve been putting those facilities in some really interesting places. Then think about all the systems needed to service the moves between the distribution centers. We have a portfolio company called Gatik, which does exactly that. They move middle mile trips between DCs. Then we did an investment in a Canadian company called Swift. They basically will allow a retailer to do same day delivery by having a courier go and pick up things directly from a warehouse. There’s a ton of opportunity here, because if you look at the movement of how commerce is changing, coupled with consumer preferences and all these new trends around ecommerce and delivery, there’s so much movement under the hood of getting a package to you that’s actually way more interesting than Robotaxi. I know that when people got into autonomous vehicles four or five years ago, the dream was always around people. But for many years, many people considered the commercial delivery part of AVs to be the Sancho Panza of autonomous vehicle investment. Then in 2020, everyone was like, oh, wait, that’s actually going to be more important. So, you’ve seen a rush of things into delivery, commercial goods, and that’s not going to change.
JG: I’m curious to hear your thoughts. First, I’ll start with Gatik, which has a big partnership that you can talk to that I believe is with Walmart, and they are doing some of the logistics side of that, but I know that a big part of their effort also starts to dovetail into the automation of the vehicles themselves as well, and how to create these logistics systems that are automated. Our prediction at AEye was there was going to be an emergence of all of these distilled down or more constrained versions of autonomy that are more limited in function, rather than automated vehicles that we were promised and people are still somewhat pursuing that go everywhere all the time. But there’s derivatives of it. There’s long-haul trucking, there’s middle-mile, there’s first and last mile. There’re all these different ways. And usually it’s about a technology reduction in scope. It’s like if you’re automating a truck from the hub, it’s a lot easier of a problem than trying to automate something that goes anywhere all the time. And similarly, these applications start to work themselves out because there’s business opportunities that exist with considerable demand and with a much more down scope technology problem. What are some of the applications that you’re seeing in that regard that have piqued your interest? What are some of the trends you’ve seen in those areas? Gatik and Swift are obviously examples of that, but I’m sure you have similar predictions in the space.
RB: We came up with this term, I think it was in 2016, of “structured autonomy” and the thought was, look at all of these huge vertical markets. Everybody’s focused on Robotaxi, but what about ag and mining and construction and trucking — and actuall..

Coffee Talk: Stephen Lambright

Each week, we sit down with a different member of AEye’s leadership team to discuss their role, their view of challenges and opportunities in the industry, and their take on what lies ahead.
This week, we talk with CMO, Stephen Lambright.

1. Tell us about your role.
My role is to manage the overall marketing, communications, and branding for AEye. Initially, when we were pre-product, it was more of an evangelical role promoting the core basis of the technology, fostering partnerships, and helping to create an ecosystem that would enable us to be successful. Now that we are delivering products and can do so at scale, the job has evolved from evangelism to creating a sales funnel and managing against that funnel. It’s really a standard evolution of technology marketing, where you go from an early-phase startup into a startup at scale.
2. As you scale, what do you see as the most important part of your role?
I see three components to it. The first is to drive effective communication and messaging for the company externally, which is really about the ruthless consistency that’s required of branding. It’s making sure that we’re all using the same terms and the same messaging and the same language wherever we communicate – personally, through social media, over email, etc.
The second is promoting and projecting the brand, the company, and the culture – who we are and why we’re different – not only from a technology perspective, but from a company and culture perspective. It’s putting the mix together that enables that. Marketing plays a role in humanizing the company. I believe that the culture that you create within the company, and supporting employees and partners in an effective way, is as important as how it’s reflected externally: that’s the yin and yang of marketing.
The third is ensuring that we emotionally engage with people through effective storytelling, through narrative. I’m a big proponent of content and content marketing because I believe that’s how you effectively communicate with people – through stories, not through textbooks and data sheets. That’s why we build in narratives, and work to deliver them effectively.
3. How has COVID impacted marketing at AEye in 2020, and how have you and your team adapted to the new reality?
First and foremost is the lack of travel and the inability to be in front of people and interact with them. COVID hit during a product launch, which drove us to come up with a new way of doing demos and engaging customers. We looked at gaming platforms and adopted the Discord platform for doing interactive demos with people around the world, and it’s been extremely effective. We’ve customized it to fit our needs with multiple servers and a very sophisticated scheduling system. We’ve connected our demo vehicles to Discord so that we can do live, interactive Discord demos from Interstate 580 in the USA or from anywhere in the world. People can engage with our engineers in real time to really get a sense of what the product is and what our product platform is capable of. It’s unique and has been a real boon for us because it’s opened up an entirely new way to engage people without having to travel.
Discord has truly transformed how we think about product marketing. It’s not just data sheets and brochures and demos: we’ve created a very interactive way of engaging customers that’s much more fluid and continuous than it has been in the past.
The other big change has been the absence of physical events and shows. Virtual events have been a challenge for all marketers because, while everyone’s trying, it’s very difficult to replicate the energy and interchanges that occur in a physical event. There are a lot of attempts at it, but I’m just not sure it’s possible to replicate. There are some advantages to virtual events: you can be more time efficient in picking and choosing content that you want to see, and some platforms have done a great job creating focused, engaged events with a broader audience in a subject matter-focused way. But it’s not the same.
I’d be remiss if I didn’t talk about internal communications. With COVID forcing remote work, we’ve put a renewed focus on how we communicate internally. We established weekly Zoom lunches, where we share information. We try to bring in people from different departments to talk about what they are working on, and to introduce the rest of the company to what their world looks like. This has been a good opportunity to cross-pollinate, and I think it’s good for everyone to take a few minutes every week to step back from their world and hear someone else’s perspective. It’s been very helpful from a communication and community standpoint, as well.
We’ve tried very hard to increase inter- and intra-department communication, and because managers have been encouraged to do a daily outreach and to communicate with their teams on a more regular basis, I think we are communicating better as a company now that we are apart than we did when we were together. We miss the water cooler or break room conversations, but we also seem to be more effective at communicating because we’re more apt to pick up the phone, send an email, or jump on Slack to touch base with people.
4. You are going to market with 150-year-old global suppliers and working directly or indirectly with some of the largest OEMs in the world. How has Silicon Valley melded with traditional business, and what advice can you share about how to approach these relationships?

Silicon Valley is built on the idea of being disruptive: disrupting technologies, disrupting markets, disrupting established companies and business models, and finding new ways of doing things. Sometimes disruption is appropriate. Sometimes it’s better to leverage something that already exists in order to gain greater market momentum, faster. It’s important for a technology company, whether it’s based in Silicon Valley or Austin or Boston or anywhere else, to understand the value chain in the market, to understand where it adds value and how customers perceive that value, to understand how customers want to buy and engage, and to find the leverage points that are most appropriate. Sometimes there are none and you need to establish something entirely new. Other times you gain a lot more momentum, far more effectively, by leveraging existing business models and ecosystems.
I learned a lot about this from my first job in the technology world, as a product manager at Autodesk. Autodesk, in my mind, established the ultimate partnership strategy, where they built a platform, AutoCAD, that was open and cost-effective relative to the existing systems that were out there. They then went to market with thousands of systems integrator partners who were experts in dozens of very specific vertical markets – everything from architectural design to civil engineering to forestry mapping to graveyard management. These systems integrators took AutoCAD and customized it to meet the very specific needs of each of these very specific markets. I find this model very compelling because it delivers better value for the end customer. It’s similar to what AEye is doing today. We have built an active, intelligent, software-definable sensing platform – which we call iDAR – that can be used in a variety of different markets, and we’re enabling our systems integrator and Tier 1 partners to take iDAR and customize it to the unique needs of each of these markets. These partnerships are a win-win for everybody: for the customer, the partner, and for us because everyone gets something better out of the equation.
5. You’ve said your personality and interests have tended to find you in positions where innovation is the driving catalyst and sustainable growth is the primary business objective. How does marketing support innovation and growth at AEye?

A great example of this is our basketball demo at CES in 2020. A traditional marketing group probably wouldn’t have said, “Let’s do real-time motion forecasting of a basketball shot at CES.” But in a brainstorming session in the summer of 2019, one of our brilliant engineers said, “I think we should predict whether a basketball shot will go in the hoop,” and at that point I said, “Yes, we’re going to do that!” Marketing then pushed to make this demo happen over the last half of 2019. It turns out there was a huge, virtuous cycle in this process where, because we were determined to do this demo, the engineering team learned a tremendous amount about forecasting, not just from a perception perspective, but about how the sensor needed to operate in order to get the best quality information. Marketing’s big audacious goal of doing a basketball motion-forecasting demo at CES actually drove product innovation.
6. Certainly the world has changed this year. What macro trends do you see on the horizon for marketing in 2021?

There’s going to be a huge amount of pent-up demand when things start opening up – when everyone’s gotten the vaccine and they’re able to go out and see and meet and greet people again – to be with other human beings. I think marketing has an opportunity in the last half of 2021 to find ways of leveraging this desire for human interaction.
That being said, I think virtual components of events are here to stay: most events will be hybrid, with a real-time, personal portion, as well as a virtual connection. To me, this is one of the most interesting things about where marketing goes in 2021: Once we can resume events, what happens? What will they look like? Events will be different in the future, and I think there’s going to be a very interesting opportunity for events to differentiate themselves based on how they manage that hybrid environment.
7. I understand in your free time you are a theater buff. What’s your favorite genre/favorite show of all time?

The best play I have ever seen performed live was Arcadia by Tom Stoppard. What I like about Tom Stoppard in general is that he weaves storylines …

4Sight for Trucking

Overview

When it comes to partially automated and autonomous trucking, the perception system must not only provide long-range detection in the forward view at highway speed; it must also perform in low-speed, highly complex, and dynamic scenarios, such as maneuvering through logistics parks and urban environments. The trouble with conventional LiDAR systems for trucking is that they neither achieve a sufficient detection range nor have the agility to adjust their scanning capabilities to optimize for both highway and urban driving.
At a foundational level, trucking applications require: long-range capabilities in forward view, flexible sensor placement, and long term, solid-state reliability.
4Sight’s Unique Capabilities

No other sensor system can take on all autonomous trucking requirements like AEye’s 4Sight, which detects from 1cm to 1,000 meters, provides cost-effective 360° coverage around the truck, and boasts the industry’s most ruggedized and robust MEMS.
Built on AEye’s award-winning iDAR™ platform, 4Sight delivers proven, industry-leading, long-range, high-performance perception for trucking applications. 4Sight is trucking’s only true, cost-effective, customizable perception solution with the ability to optimize for various driving conditions. Its agile, software-definable architecture enables the creation of a library of responsive scan patterns that adapt to driving conditions, environment, speed, and vehicle location. For example, a set of unique scan patterns can be created for highway driving (with a long-range, high-density, narrow FOV), and for environments that are more dynamic and unstructured, such as urban areas and logistics parks. 4Sight’s agile architecture also enables the deployment of ultra-high resolution Regions of Interest (ROIs) to achieve the highest levels of safety for both highway and urban driving.
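The idea of a library of deterministic scan patterns, with one pattern selected at runtime based on driving conditions, can be sketched in a few lines of code. This is a hypothetical illustration only, not AEye’s actual API: the pattern names, the `select_pattern` function, and all parameter values except the 0.025° instantaneous resolution are invented for the example.

```python
# Hypothetical sketch of a scan-pattern library keyed by driving context.
# None of these names or values reflect AEye's real interfaces.
from dataclasses import dataclass

@dataclass(frozen=True)
class ScanPattern:
    name: str
    fov_deg: float          # horizontal field of view
    resolution_deg: float   # instantaneous angular resolution
    roi_revisit_hz: float   # revisit rate for regions of interest

# A fixed, deterministic library defined ahead of time.
PATTERN_LIBRARY = {
    "highway": ScanPattern("highway", fov_deg=30.0,  resolution_deg=0.025, roi_revisit_hz=100.0),
    "urban":   ScanPattern("urban",   fov_deg=120.0, resolution_deg=0.1,   roi_revisit_hz=30.0),
    "rain":    ScanPattern("rain",    fov_deg=60.0,  resolution_deg=0.05,  roi_revisit_hz=60.0),
}

def select_pattern(speed_kph: float, rain_detected: bool) -> ScanPattern:
    """Pick a pattern from external triggers: weather first, then speed."""
    if rain_detected:
        return PATTERN_LIBRARY["rain"]
    if speed_kph > 80.0:
        return PATTERN_LIBRARY["highway"]
    return PATTERN_LIBRARY["urban"]
```

The point of the sketch is that the patterns themselves stay deterministic and pre-validated; only the selection among them responds to the environment, which is what makes such a system both adaptive and auditable.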
With software-definable range optimization from 1cm to 1,000 meters and a triggerable instantaneous resolution of 0.025°, 4Sight achieves the industry’s longest detection range for accurate high-speed highway detection and classification.

4Sight is configurable for high-volume trucking applications, achieving 10x greater performance at 10% of the cost of other LiDAR sensors. 4Sight’s unique system design enables seamless, cost-effective 360° coverage around the truck. Its flexible placement options include the grill, behind the windshield, or the roof of the truck.

AEye’s innovative and patented approach to MEMS makes 4Sight the most robust and ruggedized sensor for any application that experiences a lifetime of shock and vibration, surpassing even the most stringent automotive standards. 4Sight delivers solid-state performance and reliability, proven to sustain mechanical shock of over 50G, random vibration over 12Grms (5-2000Hz), and sustained vibration of over 3G.

AEye provides high-end, robust manufacturing through Tier 1 partners. All units are updated by our Tier 1 partners for the warrantied lifecycle.

4Sight for ITS Applications

Overview

From tolling automation and intersection traffic management to smart mobility infrastructure and autonomous parking, ITS applications are diverse, yet they demand similar requirements of their perception sensors. The trouble with traditional ITS perception systems is that cameras alone fail to accurately detect in adverse weather and poor lighting conditions, radar captures only 2D data and can be spoofed, and other sensing alternatives are now simply out of date.
At a foundational level, all ITS applications require: high accuracy object detection and classification (vehicle, truck, cyclist, pedestrian, etc.), solid-state, long-term reliability, and the ability to perform in all weather conditions.
4Sight’s Unique Capabilities

No other sensor system can take on all ITS application requirements like AEye’s 4Sight, which can be used to obtain high quality traffic data, detect near-miss incidents, and provide safety solutions to help prevent accidents for a safer, more efficient world.
4Sight is built on AEye’s award-winning iDAR™ platform, which fuses solid-state agile LiDAR, an optional camera, and integrated AI to create a smart, software-definable sensor that extracts only the data that matters—enabling fast, accurate perception. AEye’s intelligent, solid-state LiDAR allows for the customization of scanning capabilities for any ITS application. With 4Sight, users can optimize scanning configuration and parameters for their unique application to achieve the highest level of detection and classification possible. For example, intersection traffic management applications require flawless pedestrian detection. With 4Sight, it is possible to generate unique ultra-high resolution Regions of Interest (ROIs) at pedestrian crosswalks and along the sides of the road to better detect and classify pedestrians and cyclists.
With 4Sight, basic perception can now be distributed to the edge of the sensor network, enabling the collection of data in real time and enhancing existing centralized perception software platforms by reducing latency, lowering costs, and achieving classification at range.

Built on 1550nm LiDAR, 4Sight delivers superior performance in adverse weather. 4Sight enables the configuration of a library of deterministic scan patterns that can be customized and fixed, or triggered to adjust to changing environments and external input. For example, a weather-specific scan pattern can be triggered when the system detects rain.

AEye’s 4Sight sensors were initially developed for the automotive market and carry an Ingress Protection rating of IP69K. AEye’s innovative and patented approach to MEMS makes them the most robust and ruggedized for ITS or any application that experiences a lifetime of shock and vibration, surpassing even the most stringent automotive standards.