{"id":10582,"date":"2021-04-05T13:47:24","date_gmt":"2021-04-05T11:47:24","guid":{"rendered":"https:\/\/advancedfleetmanagementconsulting.com\/eng\/?p=10582"},"modified":"2021-04-05T13:48:24","modified_gmt":"2021-04-05T11:48:24","slug":"autonomous-truck-driver","status":"publish","type":"post","link":"https:\/\/advancedfleetmanagementconsulting.com\/eng\/2021\/04\/05\/autonomous-truck-driver\/","title":{"rendered":"Building an Autonomous Truck Driver"},"content":{"rendered":"<p style=\"text-align: justify;\"><span style=\"color: #0000ff;\"><i>Photo: Jack Roberts<\/i><\/span><\/p>\n<p style=\"text-align: justify;\"><span style=\"color: #0000ff;\"><b>Daimler Trucks North America, which is working with autonomous-tech companies such as Torc Robotics and Waymo, recently showed off its second-generation autonomous Freightliner Cascadia. Note the sensor clusters on the front bumper and above the windshield and doors.<\/b><\/span><\/p>\n<p style=\"text-align: justify;\"><span style=\"color: #0000ff;\">Autonomous trucks behave in a surprisingly similar manner to human drivers. They observe, they measure, they attempt to predict the behavior of vehicles around them, and they are constantly planning alternative strategies, all while maintaining their present course.<\/span><\/p>\n<p style=\"text-align: justify;\"><span style=\"color: #0000ff;\">The better human drivers do that, too. However, lesser drivers, or those distracted or fatigued, might miss one of those steps. The result can be a crash. That\u2019s why supporters say robotic trucks, because they never tire or get distracted, can reduce crash rates significantly.<\/span><\/p>\n<p style=\"text-align: justify;\"><span style=\"color: #0000ff;\">To drive as well as humans, robotic trucks must be able to see as well as humans and make the right decisions all the time. 
How does that work?<\/span><\/p>\n<p style=\"text-align: justify;\"><span style=\"color: #0000ff;\">Situational awareness begins with the truck\u2019s ability to assess its surroundings.<\/span><\/p>\n<p style=\"text-align: justify;\"><span style=\"color: #0000ff;\">Robotic truck companies have developed perception systems that enable the truck to \u201csee\u201d its surroundings. Baked-in high-resolution maps, GPS location data and inertial sensors tell the truck where it is and how fast it\u2019s going. That awareness of itself is complemented by perception technology that informs the artificial-intelligence-based \u201cdriver\u201d on what\u2019s going on around it: where other vehicles are, how fast they are travelling relative to the truck, what physical obstacles exist, and what possible threats may be present.<\/span><\/p>\n<div style=\"text-align: justify;\">\n<figure class=\"article-img\"><span style=\"color: #0000ff;\"><img class=\"wrapImageCMS aligncenter\" src=\"https:\/\/fleetimages.bobitstudios.com\/upload\/trucking-info\/content\/article\/2021-03\/auto_hdtmar21-at-5-park-__-720x516-s.jpg\" alt=\"Waymo\u2019s autonomous truck project uses cameras and radar as well as a lidar system of its own design.\u00a0 - Photo: Jim Park\" \/><\/span><figcaption class=\"caption-description\">\n<p style=\"text-align: center;\"><span style=\"color: #0000ff;\">Waymo\u2019s autonomous truck project uses cameras and radar as well as a lidar system of its own design. Photo: Jim Park<\/span><\/p>\n<\/figcaption><\/figure>\n<\/div>\n<div><\/div>\n<div>\n<p style=\"text-align: justify;\"><span style=\"color: #0000ff;\">\u201cWe have various types of sensors that can see the world in different ways, and they complement each other in a variety of ways to get a sense of what\u2019s around us,\u201d says Andrew Stein, a staff software engineer at Waymo and one of the lead engineers on the company\u2019s autonomous trucking project. 
\u201cWe also have the ability to decide what all those things we\u2019re sensing are going to do next. Our next job is to figure out \u2014 based on where I am, what I\u2019m seeing, and what everybody\u2019s about to do \u2014 what should I do?\u201d\u00a0<\/span><\/p>\n<p style=\"text-align: justify;\"><span style=\"color: #0000ff;\">First, the trucks need to \u201csee\u201d the world around them. The keys here are diversity and redundancy, and an overlap in the data collected, so the AI driver can compare one input against another to verify that what the truck is seeing is real and correct.<\/span><\/p>\n<h2 style=\"text-align: center;\"><span style=\"color: #0000ff;\">Three sets of eyes<\/span><\/h2>\n<p style=\"text-align: justify;\"><span style=\"color: #0000ff;\">To do this, autonomous trucks use cameras, radar, and lidar. Together they provide a \u201cspatial\u201d representation of the truck\u2019s surroundings and detect objects near and far from it. This fusion of three types of sensor input enables the truck to determine the speed and distance of objects and vehicles, down to a centimeter in some cases. Trained machine learning algorithms then help the truck to determine what it is seeing, such as other cars, trucks, pedestrians, animals, guard rails, signs, etc.<\/span><\/p>\n<figure class=\"article-img\" style=\"text-align: justify;\"><span style=\"color: #0000ff;\"><img class=\"wrapImageCMS aligncenter\" src=\"https:\/\/fleetimages.bobitstudios.com\/upload\/trucking-info\/content\/article\/2021-03\/auto_hdtmar21-at-1-luminar-technologiesscreengrab-__-720x516-s.jpg\" alt=\"A visual representation of what lidar sees. Lidar can detect large and small, moving or stationary objects. The bands of color represent distance from the device.\u00a0 - Photo:\u00a0\u00a0Luminar Technologies\" \/><\/span><figcaption class=\"caption-description\">\n<p style=\"text-align: center;\"><span style=\"color: #0000ff;\">A visual representation of what lidar sees. 
Lidar can detect large and small, moving or stationary objects. The bands of color represent distance from the device.\u00a0Photo:\u00a0\u00a0Luminar Technologies<\/span><\/p>\n<\/figcaption><\/figure>\n<p style=\"text-align: justify;\"><span style=\"color: #0000ff;\">\u201cWe use three different categories of sensors to support the perception systems,\u201d says Ben Hastings, chief technology officer at Torc Robotics. \u201cCameras see the world similarly to how people perceive the world. They can see visible light and different colors and things like traffic signs and lane lines. And again, similar information to how you see the roads when you\u2019re driving down the highway.\u201d<\/span><\/p>\n<p style=\"text-align: justify;\"><span style=\"color: #0000ff;\">The ability to recognize color is important for determining if a traffic light is red, yellow or green, for example.<\/span><\/p>\n<p style=\"text-align: justify;\"><span style=\"color: #0000ff;\">Cameras are placed all around the truck, often with overlapping fields of view. Some are wide-angle to get the big picture. Others are more narrowly focused for longer-range visuals. Super-high-resolution cameras make object detection and differentiation possible at a range of distances, but they cannot provide accurate distance information. They have other limitations too, such as low-light conditions or darkness, fog, rain, snow, or if the lens is otherwise obscured. If the camera is blocked, it can\u2019t produce an image.<\/span><\/p>\n<p style=\"text-align: justify;\"><span style=\"color: #0000ff;\">\u201cWe also use radars and lidars,\u201d says Hastings. \u201cThese are active sensors, meaning they don\u2019t rely just on ambient light. They emit their own energy. 
Radar and lidar are similar in principle, but they produce very different returns.\u201d<\/span><\/p>\n<p style=\"text-align: justify;\"><span style=\"color: #0000ff;\">Lidar, which is a sort-of acronym for \u201clight detection and ranging,\u201d uses one or multiple laser beams to sweep across the environment. These laser beams reflect off other vehicles, pavement, surrounding buildings, people, etc., and a distance measurement can be obtained based on the amount of time it takes from when the laser is fired to when the reflection comes back.<\/span><\/p>\n<p style=\"text-align: justify;\"><span style=\"color: #0000ff;\">The reflected beams produce a high-definition, three-dimensional \u201cpoint cloud\u201d from which the AI can deduce objects and, most importantly, distance and speed. These point clouds are often represented visually as images in shades of red, green, yellow and blue, but those colors indicate varying degrees of reflectivity; the lidar does not see the world in color, only in reflectivity.<\/span><\/p>\n<p style=\"text-align: justify;\"><span style=\"color: #0000ff;\">\u201cLidar is very high resolution, which produces a very accurate representation of the shapes surrounding the truck,\u201d Hastings says.<\/span><\/p>\n<p style=\"text-align: justify;\"><span style=\"color: #0000ff;\">But like cameras, lidar uses light, in the form of a laser beam. It\u2019s outside the visual spectrum, but it\u2019s still light. 
Therefore it\u2019s limited by line-of-sight and to some extent inclement weather or hazy conditions where the laser beams can be scattered by particles suspended in air, as in fog or smoke.<\/span><\/p>\n<figure class=\"article-img\" style=\"text-align: justify;\"><span style=\"color: #0000ff;\"><img class=\"wrapImageCMS aligncenter\" src=\"https:\/\/fleetimages.bobitstudios.com\/upload\/trucking-info\/content\/article\/2021-03\/auto_hdtmar21-at-2-park-__-720x516-s.jpg\" alt=\"This image represents a radar point cloud, showing return from a radar device. Each point is a metallic surface. Radar can very accurately measure distance and directionality, but it doesn't produce a visually recognizable image. Radar can tell you in a single measurement how fast an object is moving relative to the sending unit. - Photo: Jim Park\" \/><\/span><figcaption class=\"caption-description\">\n<p style=\"text-align: center;\"><span style=\"color: #0000ff;\">This image represents a radar point cloud, showing return from a radar device. Each point is a metallic surface. Radar can very accurately measure distance and directionality, but it doesn&#8217;t produce a visually recognizable image. Radar can tell you in a single measurement how fast an object is moving relative to the sending unit. <\/span><span style=\"color: #0000ff;\">Photo: Jim Park<\/span><\/p>\n<\/figcaption><\/figure>\n<p style=\"text-align: justify;\"><span style=\"color: #0000ff;\">Radar, on the other hand, is not affected by visual barriers. The third leg of the autonomous diversity and redundancy stool, radar produces an electromagnetic signal which, like lidar, is emitted and reflected back to the sensor. 
It can very accurately measure distance and directionality, but it doesn\u2019t produce a visually recognizable image.<\/span><\/p>\n<p style=\"text-align: justify;\"><span style=\"color: #0000ff;\">\u201cSince we\u2019re not bats, we\u2019re not used to thinking about this kind of reasoning,\u201d says Waymo\u2019s Stein. \u201cBut radar is good at measuring distance as well as velocity. Two of the really key things about radar are its ability to measure speed or velocity directly. You don\u2019t have to watch a thing over time and figure out how it\u2019s moving. Radar can tell you in one measurement how fast the thing is going.\u201d<\/span><\/p>\n<p style=\"text-align: justify;\"><span style=\"color: #0000ff;\">The thing to remember is the complementary nature of these three sensors, he adds. \u201cThey\u2019re good at different things and they overlap in some ways so there\u2019s a form of redundancy, but also an ability to really cover the various challenges that we have in capturing everything we can.\u201d<\/span><\/p>\n<h2 style=\"text-align: justify;\"><span style=\"color: #0000ff;\">Artificial intelligence<\/span><\/h2>\n<p style=\"text-align: justify;\"><span style=\"color: #0000ff;\">The truck wouldn\u2019t have a clue what to do with all this input without a set of instructions and something to guide it on its journey. Companies such as Waymo use terms such as \u201cthe Waymo Driver\u201d to describe the artificial intelligence that is making the mile-by-mile, second-by-second driving decisions. 
We\u2019ll call it the AI driver.<\/span><\/p>\n<p style=\"text-align: justify;\"><span style=\"color: #0000ff;\">The robotic truck companies have been mapping various routes for several years, producing highly detailed maps in visual, infrared (lidar) and electromagnetic (radar) formats for the AI driver to use as a reference.<\/span><\/p>\n<p style=\"text-align: justify;\"><span style=\"color: #0000ff;\">In addition to the high-resolution maps, the AI driver has been loaded with thousands of actual images: cars and trucks, people and animals, signs and roadway infrastructure, and just about anything one can imagine seeing along a highway. Like human drivers, the AI driver learns what these things are through the imagery and other inputs, so when it \u201csees\u201d one of them along a road it will have an idea what to expect from it. The AI driver will know that stop signs aren\u2019t likely to dart out into the street, whereas small humans (children) are likely to behave less predictably.<\/span><\/p>\n<p style=\"text-align: justify;\"><span style=\"color: #0000ff;\">In other words, the AI driver learns, like human drivers do. One of the advantages AI has over humans is that it can learn faster. Engineers can simulate situations that can run in fractions of a second in AI time but might take several minutes in real time. 
Through repeated simulations and analysis of the outcome, the engineers teach the AI how to \u201cdrive.\u201d This includes everything from discerning how much torque to apply to the steering shaft to get the truck to change lanes, to preparing to stop if it sees a ball bounce out onto a road or the door on a parked car open.<\/span><\/p>\n<h2 style=\"text-align: justify;\"><span style=\"color: #0000ff;\">Predicting the future<\/span><\/h2>\n<p style=\"text-align: justify;\"><span style=\"color: #0000ff;\">\u201cWe try to detect what\u2019s out there, where it is around us, and what we think it is, but that only gives you a snapshot of either what\u2019s just happened or what is happening at a moment,\u201d says Waymo\u2019s Stein. \u201cWhat you want to know is what\u2019s about to happen. As a driver, you\u2019re always trying to predict what\u2019s going to happen, even if you\u2019re not explicitly doing so in your head. You are looking and thinking ahead. That\u2019s what AI does, too.\u201d<\/span><\/p>\n<p style=\"text-align: justify;\"><span style=\"color: #0000ff;\">And it\u2019s not just observation. Reasoning is part of the equation, too.<\/span><\/p>\n<p style=\"text-align: justify;\"><span style=\"color: #0000ff;\">\u201cPart of the advances of AI machine learning over the last couple of decades is being able to learn, not only what\u2019s present in an image, or perceptually, but behavior of the world and to predict what might happen next,\u201d Stein says. \u201cThe system is constantly predicting, and not just one thing. It doesn\u2019t think, \u2018Okay, here\u2019s what\u2019s going to happen,\u2019 because it doesn\u2019t know. 
It\u2019s asking, \u2018What are all the things that could happen,\u2019 and then it starts its reasoning about a variety of possible scenarios and how likely it thinks they all are.\u201d<\/span><\/p>\n<figure class=\"article-img\" style=\"text-align: justify;\"><span style=\"color: #0000ff;\"><img class=\"wrapImageCMS aligncenter\" src=\"https:\/\/fleetimages.bobitstudios.com\/upload\/trucking-info\/content\/article\/2021-03\/auto_hdtmar21-at-3-park-__-720x516-s.jpg\" alt=\"Robotic truck company TuSimple displayed this truck at the CES electronics show in Las Vegas in 2019. Cameras are located above the cab and on the side of the sleeper. The lidar devices are on the hood and the radar device is on the bumper. Together, the array provides a complete view of the area around the tractor with object-detection and tracking capabilities.\u00a0 \u00a0 - Photo: Jim Park\" \/><\/span><figcaption class=\"caption-description\">\n<p style=\"text-align: center;\"><span style=\"color: #0000ff;\">Robotic truck company TuSimple displayed this truck at the CES electronics show in Las Vegas in 2019. Cameras are located above the cab and on the side of the sleeper. The lidar devices are on the hood and the radar device is on the bumper. 
Together, the array provides a complete view of the area around the tractor with object-detection and tracking capabilities.\u00a0 <\/span><span style=\"color: #0000ff;\">Photo: Jim Park<\/span><\/p>\n<\/figcaption><\/figure>\n<p style=\"text-align: justify;\"><span style=\"color: #0000ff;\">A rubber ball alone poses no threat to a truck, but the reasoning that a child could dart out unseen from behind a parked car in pursuit of the ball might cause the truck to ease up on the throttle in preparation for a sudden stop.\u00a0\u00a0<\/span><\/p>\n<h2 style=\"text-align: justify;\"><span style=\"color: #0000ff;\">Hazard recognition\u00a0\u00a0<\/span><\/h2>\n<p style=\"text-align: justify;\"><span style=\"color: #0000ff;\">One of the reasons autonomous heavy trucks in freeway operations are likely to reach commercial scale before passenger cars is the \u201crelatively predictable\u201d environment in which they operate. Compared to a city street, limited-access freeways are quite tame, especially in rural settings. Not much happens beside or behind the truck \u2014 usually.<\/span><\/p>\n<p style=\"text-align: justify;\"><span style=\"color: #0000ff;\">But consider what Daimler and Torc call the \u201clost cargo scenario.\u201d An object suddenly appears in the middle of the road, something that may have fallen off a truck, or a chunk of peeled-off tire tread, or an animal struck by a previous vehicle. At night, such an object would be more difficult to detect. The lidar may or may not catch it, depending on the reflectivity of the object. Unless it was metallic, the radar might not detect it either, and cameras would be of limited use in the dark. 
It might be easier to see in daylight, but if the AI truck is following another truck, it would have limited forward visibility even with lidar.<\/span><\/p>\n<p style=\"text-align: justify;\"><span style=\"color: #0000ff;\">\u201cWe call this a lost-cargo scenario in our development phase, and there is no easy answer to this, because the number of permutations that can result from that is huge,\u201d says Suman Narayanan, director of engineering in the Autonomous Technology Group at Daimler Trucks North America. \u201cThis scenario \u2014 and of course there are lots of variables \u2014 is one we are training the AI on. The best way we can do that is get out there, put more test trucks with safety drivers, and learn from how much we train the system.\u201d<\/span><\/p>\n<p style=\"text-align: justify;\"><span style=\"color: #0000ff;\">For every imaginable situation, there might be several correct responses leading to a successful outcome, or perhaps a very few, or even none. Narayanan notes that human drivers with all their experience take action based on the circumstances.<\/span><\/p>\n<p style=\"text-align: justify;\"><span style=\"color: #0000ff;\">\u201cIt might be safer to stay on course than to swerve or change lanes,\u201d he says. \u201cThe alternatives could pose more danger than staying the course. In most cases, slowing down gives us time to assess the situation, and like humans, the computer also benefits from time.\u201d<\/span><\/p>\n<figure class=\"article-img\" style=\"text-align: justify;\"><span style=\"color: #0000ff;\"><img class=\"wrapImageCMS aligncenter\" src=\"https:\/\/fleetimages.bobitstudios.com\/upload\/trucking-info\/content\/article\/2021-03\/locomation-testing-__-720x516-s.jpg\" alt=\"Locomation is taking a slightly different approach to automation: human-guided automation. 
The company believes it can bring automation to the market sooner if it\u2019s less reliant on technology that requires AI and machine learning and a vast array of sensors. A human driver operates the lead truck, while a lighter degree of automation keeps the rear truck in a safe following position.\u00a0 - Photo: Locomation\" \/><\/span><figcaption class=\"caption-description\">\n<p style=\"text-align: center;\"><span style=\"color: #0000ff;\">Locomation is taking a slightly different approach to automation: human-guided automation. The company believes it can bring automation to the market sooner if it\u2019s less reliant on technology that requires AI and machine learning and a vast array of sensors. A human driver operates the lead truck, while a lighter degree of automation keeps the rear truck in a safe following position.\u00a0Photo: Locomation<\/span><\/p>\n<\/figcaption><\/figure>\n<p style=\"text-align: justify;\"><span style=\"color: #0000ff;\">In such a scenario, the AI driver doesn\u2019t cycle through a list of pre-programmed options, such as \u201crefrigerator in the middle of the road: plan A, change lanes to the right, plan B, change lanes to the left.\u201d Instead, it is trained to reason the way a human does, weighing the options against the present situation (traffic in the left lane, can\u2019t go that way) and, ideally, choosing the safest one.<\/span><\/p>\n<p style=\"text-align: justify;\"><span style=\"color: #0000ff;\">While it might take a human driver a second or two to even realize what\u2019s happening and respond, usually by hitting the brakes, the AI driver can recognize the hazard within hundredths of a second and take action, possibly applying the brakes. 
The difference in that time interval can mean the difference between a catastrophe and a bruise.<\/span><\/p>\n<p style=\"text-align: justify;\"><span style=\"color: #0000ff;\"><em>This article originally appeared in the March print edition of Heavy Duty Trucking.<\/em><\/span><\/p>\n<\/div>\n<div><\/div>\n<div><\/div>\n<p class=\"p-16-gray\">by <a href=\"https:\/\/www.truckinginfo.com\/authors\/3299\/jim-park\">Jim Park<\/a><\/p>\n<div class=\"widget-see-also\">\n<div class=\"byline\">\n<p><span class=\"posted-by\">Source: <a href=\"https:\/\/www.truckinginfo.com\" target=\"_blank\" rel=\"noopener noreferrer\">https:\/\/www.truckinginfo.com<\/a><\/span><\/p>\n<\/div>\n<\/div>\n","protected":false},"excerpt":{"rendered":"<p>Photo: Jack Roberts Daimler Trucks North America, which is working with autonomous-tech companies such as Torc Robotics and Waymo, recently showed off its second-generation autonomous Freightliner Cascadia. Note the sensor clusters on the front bumper and above the windshield and doors. Autonomous trucks behave in a surprisingly similar manner to human drivers. They observe, they&#8230;<\/p>\n","protected":false},"author":3,"featured_media":10584,"comment_status":"open","ping_status":"open","sticky":false,"template":"","format":"standard","meta":[],"categories":[18],"tags":[8],"_links":{"self":[{"href":"https:\/\/advancedfleetmanagementconsulting.com\/eng\/wp-json\/wp\/v2\/posts\/10582"}],"collection":[{"href":"https:\/\/advancedfleetmanagementconsulting.com\/eng\/wp-json\/wp\/v2\/posts"}],"about":[{"href":"https:\/\/advancedfleetmanagementconsulting.com\/eng\/wp-json\/wp\/v2\/types\/post"}],"author":[{"embeddable":true,"href":"https:\/\/advancedfleetmanagementconsulting.com\/eng\/wp-json\/wp\/v2\/users\/3"}],"replies":[{"embeddable":true,"href":"https:\/\/advancedfleetmanagementconsulting.com\/eng\/wp-json\/wp\/v2\/comments?post=10582"}],"version-history":[{"count":2,"href":"https:\/\/advancedfleetmanagementconsulting.com\/eng\/wp-json\/wp\/v2\/posts\/10582\/revisions"}],"predecessor-version":[{"id":10586,"href":"https:\/\/advancedfleetmanagementconsulting.com\/eng\/wp-json\/wp\/v2\/posts\/10582\/revisions\/10586"}],"
wp:featuredmedia":[{"embeddable":true,"href":"https:\/\/advancedfleetmanagementconsulting.com\/eng\/wp-json\/wp\/v2\/media\/10584"}],"wp:attachment":[{"href":"https:\/\/advancedfleetmanagementconsulting.com\/eng\/wp-json\/wp\/v2\/media?parent=10582"}],"wp:term":[{"taxonomy":"category","embeddable":true,"href":"https:\/\/advancedfleetmanagementconsulting.com\/eng\/wp-json\/wp\/v2\/categories?post=10582"},{"taxonomy":"post_tag","embeddable":true,"href":"https:\/\/advancedfleetmanagementconsulting.com\/eng\/wp-json\/wp\/v2\/tags?post=10582"}],"curies":[{"name":"wp","href":"https:\/\/api.w.org\/{rel}","templated":true}]}}