If you have ever wondered how cars have become so advanced, whether they will eventually be fully self-driving, or how driver-assisting and self-driving technology works, Sara Sargent has some answers for you.
Sargent works at VSI Labs, where she is the engineering project manager. VSI Labs is a local company that researches and tests autonomous vehicle (AV) technologies. Over the summer, Sargent had the opportunity to test one of its self-driving cars by taking it all the way to California.
Self-driving technology has been working its way into the auto industry, and into the cars people already drive, for years.
"We are not all of a sudden driving a self-driving car," Sargent explained. "We are already using the technologies that are pushing the industry forward in our current vehicles now. I drive a 2015 and it has adaptive cruise control, lane departure, and blind spot warnings. It is not going to be a jump one day; it is a transition. Every automaker in the world is working on it now. People don't realize how big of an industry it is. Major players have a huge amount of money at stake."
Retail management to electrical engineering
Growing up, Sargent had no plans of becoming an engineer because she didn't enjoy science and math. After graduating from high school, she worked in retail business management for six years.
At the time, she had some friends who were in school for engineering. "I would hang out with them and they would tell me all about what they were working on. They convinced me I should do engineering," she said.
And that she did. After attending Century College for three years, she transferred to St. Thomas. "The engineering program really attracted me here because there are more opportunities to customize your study and work more closely with your adviser. It is such a beautiful school too."
While at St. Thomas, Sargent was the president of the IEEE (Institute of Electrical and Electronics Engineers) student chapter and a member of the Society of Women Engineers. Today, she is on the External Advisory Board for the School of Engineering. The External Advisory Board is made up of several engineers who work in industry and meet three times a year with students and faculty.
As Anita Hall, the chair of the External Advisory Board, explained, "One of my goals is to ensure that working engineers who represent our student body and who have an interest in the success of St. Thomas' Society of Women Engineers students have a seat on our board. Sara has been a great addition to our board as she brings many unique perspectives to the table. She started college as a working adult, transferring to St. Thomas after obtaining an associate degree at a technical college. She works with autonomous vehicle technology, certainly an area poised for growth. Sara is enthusiastic about giving back to the Tommie community. I am thrilled to have her on our board."
The External Advisory Board also helps to review senior design projects.
"Last year, VSI Labs sponsored a senior design team for the engineering yearlong project where they work with a company," Sargent said. "They built a driver monitoring system for our car. They did a super great job."
Life at VSI Labs
With her interest in AV technology and her background working for start-ups and other small companies, Sargent found VSI Labs to be the perfect fit. The company consists of two sides, industry research and applied research; she works on the applied research side.
While the industry research side works on learning more about the technology used to perform different driving applications, the applied research side goes out on the road and tests those technologies, which help the vehicle understand its surroundings through LiDAR (Light Detection and Ranging), radar, cameras and thermal cameras.
"We have a couple of vehicles in the lab where we take the technologies we are learning about in our research and apply them to the vehicle, integrating different sensors, components and software," Sargent said. "Our team of software developers and computer engineers write code to train neural networks to understand how a human looks with a thermal camera. We then apply those applications to the car and we go out and test drive to see how these driving applications perform."
Some of the applications VSI Labs has tested are ideas that no one has tried before. One of these is called HD map-based lane keeping. This application is essentially a map that knows what lane you are in when driving on the highway. The driver would still navigate the car manually, but if the driver needed to make a lane change, they could flip the turn signal and the car would get over to the next lane on its own. Once the driver exits the highway, they would drive completely manually, unless there is a premade path for the car to follow.
A big problem in the industry is performing lane keeping without relying on vision. "If we know where the lane lines and our car are within a couple of centimeters, then we can do lane keeping without having to see the lane lines. A lot of times with a camera, you cannot rely on it, like when it is dark and the pavement is difficult to see or if it is covered by snow or dust," Sargent said.
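The appeal of the approach is how simple the core math becomes once the map and the localization are trusted. The sketch below, a hypothetical Python example rather than VSI Labs' controller, shows lane keeping reduced to measuring the car's lateral offset from a mapped lane centerline and steering against it; the centerline points and gain are invented for illustration.

```python
# Minimal sketch (assumed, not VSI Labs' controller): if localization gives the
# car's position to a few centimeters and the HD map gives the lane centerline,
# lane keeping reduces to steering against the lateral offset from that line.
import math

# Hypothetical HD-map lane centerline as (x, y) points in meters.
centerline = [(0.0, 0.0), (5.0, 0.1), (10.0, 0.3), (15.0, 0.6), (20.0, 1.0)]

def lateral_offset(car_x: float, car_y: float) -> float:
    """Signed distance from the car to the nearest centerline segment."""
    best = float("inf")
    for (x1, y1), (x2, y2) in zip(centerline, centerline[1:]):
        dx, dy = x2 - x1, y2 - y1
        t = max(0.0, min(1.0, ((car_x - x1) * dx + (car_y - y1) * dy) / (dx * dx + dy * dy)))
        px, py = x1 + t * dx, y1 + t * dy          # closest point on this segment
        dist = math.hypot(car_x - px, car_y - py)
        if dist < abs(best):
            # The sign records which side of the lane center the car drifted to.
            best = math.copysign(dist, (car_x - x1) * dy - (car_y - y1) * dx)
    return best

def steering_correction(offset_m: float, gain: float = 0.4) -> float:
    """Simple proportional correction (radians); real controllers are far richer."""
    return -gain * offset_m

offset = lateral_offset(7.0, 0.5)                  # car is 7 m along the lane, drifted off center
print(f"offset {offset:+.2f} m -> steer {steering_correction(offset):+.3f} rad")
```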
In addition to HD map-based lane keeping, VSI Labs also uses point cloud localization with LiDAR, which creates a 3D, 360-degree view in laser points all around the car. This technology creates a path synced with a GPS that the car can use to steer itself on a pre-mapped route. In this instance, the driver could be in control of the throttle and brake.
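Following such a pre-mapped route is commonly done with a pure-pursuit style controller. The short Python sketch below is an assumed illustration, not VSI Labs' software: the route waypoints, lookahead distance and wheelbase are placeholder values.

```python
# Minimal sketch (assumed values): once localization pins the car to a
# pre-mapped route, a pure-pursuit follower can steer toward a lookahead point
# on that path while the driver keeps the throttle and brake.
import math

route = [(0.0, 0.0), (4.0, 0.5), (8.0, 1.5), (12.0, 3.0), (16.0, 5.0)]  # mapped path (m)
WHEELBASE = 2.8        # assumed sedan-sized wheelbase, meters
LOOKAHEAD = 5.0        # how far ahead on the path to aim, meters

def pure_pursuit_steer(x: float, y: float, heading: float) -> float:
    """Steering angle (radians) that arcs the car toward a lookahead waypoint."""
    # Pick the first mapped point at least LOOKAHEAD away from the car.
    target = next((p for p in route if math.hypot(p[0] - x, p[1] - y) >= LOOKAHEAD),
                  route[-1])
    # Transform the target into the car's frame (rotate by -heading).
    dx, dy = target[0] - x, target[1] - y
    local_x = math.cos(heading) * dx + math.sin(heading) * dy
    local_y = -math.sin(heading) * dx + math.cos(heading) * dy
    dist = math.hypot(local_x, local_y)
    # Pure-pursuit curvature, converted to a front-wheel angle.
    curvature = 2.0 * local_y / (dist ** 2) if dist > 0 else 0.0
    return math.atan(WHEELBASE * curvature)

print(f"steer: {pure_pursuit_steer(1.0, -0.5, 0.0):+.3f} rad")
```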
"If we use both adaptive cruise control and HD map-based lane keeping on the highway, then we don't have to worry about the human being in control of any of it. The human is just a safety driver monitoring to take over if we do not have enough data for the system to make a decision," she said.
The drive west
Sargent tested this HD map-based lane-keeping idea on the drive to California this past summer, when she and two other drivers took a Ford Fusion research vehicle across the country.
Sargent and her team came across some interesting obstacles on the four-day drive. At one point they were passing a semitruck on the interstate at 80 miles per hour when the truck's draft began pulling their car toward it. "We were going at such a high speed that the force was much greater and the driver could feel it more. We hadn't taught the system to react to that, so we had to take over so we wouldn't collide into the semi."
Other times, they observed that the infrastructure wouldn't fit the needs of their other vision-based autonomous driving system, like when there was orange dirt covering the pavement markings. Once in California, they presented their findings and participated in a panel on autonomous vehicle regulations.
The future of AV
Sargent predicts two trajectories for the future of self-driving vehicles. First, she believes consumer vehicles will become more and more advanced. Second, mobility as a service will become more prevalent.

There are different levels of AV technology. Level 2 means the driver is fully responsible for staying engaged while driving, but the car has some advanced features to help; many cars already have this technology now. Sargent predicts that everyone will have a level 2 vehicle in the next five to 10 years. Level 3 would allow the driver to stop paying attention to the road while still sitting in the driver's seat; the car would alert the driver to take over whenever it doesn't have the data it needs to handle a situation. Lastly, levels 4 and 5 would allow for no driver in the front seat at all.
While level 2 technology is available in most consumer vehicles now, level 3 technology is up in the air. "Insurance companies and regulators are not sure what to think. Neither is the industry," Sargent explained. There are a lot of concerns involved with this level. For example, if there were an accident, who would be responsible: the manufacturer, the driver or the sensor company?
"Companies are pursuing this, but I wouldn't be surprised if we don't see it take off for consumer vehicles. The sensors are too expensive now, so that is a big problem. Once they go into production, the prices will come down and be more affordable for consumers, which would allow them to have higher levels of technology. But the consumer trend in the next few years will receive better and more robust level 2 driving systems. They are also what is affordable," she said.
Companies such as Uber, Lyft and Waymo are investing in level 4 technologies. These companies are fleet operators that have thousands of vehicles all running the same system and collecting lots of data. They need this data to train the neural networks to know what a person, dog or street sign is.
"For us to move from highway autopilot features, which are level 2, to an urban city, now we have to recognize all these different signs. The system would need to know what these signs mean and be able to react to pedestrians, bicycles, scooters, strollers ... all these scenarios," Sargent said. "These companies are investing millions in this technology to make it work because they benefit by replacing human drivers with no driver."
Tesla, on the other hand, builds its vehicles from the ground up to be automated. They are constantly collecting mass amounts of vision and radar data to make the system smarter, and a Tesla is always comparing its own decisions to the driver's so it can learn what to do in any situation.
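A hypothetical Python sketch of that comparison logic, not Tesla's implementation, might look like the snippet below; the steering threshold and the log format are invented purely for illustration.

```python
# Minimal sketch (assumed logic): keep only the moments where the human driver
# did something the automated system would not have, so those disagreements can
# later serve as training examples.
def log_disagreements(frames, threshold=0.1):
    """Return frames where system and driver steering differ beyond a threshold."""
    return [f for f in frames
            if abs(f["system_steer"] - f["driver_steer"]) > threshold]

frames = [
    {"t": 0.0, "system_steer": 0.02, "driver_steer": 0.03},   # agreement, discard
    {"t": 0.1, "system_steer": 0.00, "driver_steer": 0.35},   # driver swerved, worth learning from
]
print(f"{len(log_disagreements(frames))} disagreement(s) flagged for training")
```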
Advancing automated driving technology, like lane-keeping assist, blind spot warning, adaptive cruise control, and automatic emergency braking, is all for the safety of citizens on and off the road, Sargent said.
"There were 40,000 deaths last year in the United States alone because of drivers doing something wrong on the roadway," she added. "The more safety features we add, the more automated and safer a car becomes."