Advancing AI Emotion Sensors in the Transportation and Logistics Industry Series: Blog # 3

In the first blog in the AI Transportation and Logistics series, I featured innovations in AI transformation at Purolator; the second blog focused on accelerating a smarter AI telematics infrastructure in fleet management. This third blog explores AI emotion sensors and the impact the affective computing market will have on the transportation and logistics industry.

We are now entering a space where intelligence is being embedded in everything around us.

We can already monitor a driver’s vehicle movements for safety hazards, from sharp turns and speeding to the intensity of braking. Sensors embedded in steering wheels can gauge the tension a driver may be feeling from their grip on the wheel (too tight can be classified as tension rather than relaxation) and track whether that tension persists throughout the day or eases later on. Going further, research into gait (body posture) and the way humans walk to and from their vehicles can also determine whether a person’s posture is upright with the chin facing forward (which can be classified as confident) or head down (which can be classified as deep in thought, or perhaps sad).
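
To make that concrete, here is a minimal sketch of how such telematics readings could be flagged in software. The thresholds and field names are my own illustrative assumptions, not any vendor's actual rules.

```python
# Sketch: classifying simple driving events from telematics readings.
# Thresholds and field names (speed_kmh, accel_ms2, grip_pressure) are
# illustrative assumptions, not values from any production system.

from dataclasses import dataclass

@dataclass
class TelematicsSample:
    speed_kmh: float      # vehicle speed
    accel_ms2: float      # longitudinal acceleration (negative = braking)
    lateral_g: float      # lateral acceleration in g (turning force)
    grip_pressure: float  # normalized steering-wheel grip, 0.0 (loose) to 1.0 (tight)

def classify_events(sample: TelematicsSample, speed_limit_kmh: float = 100.0) -> list:
    """Return a list of flagged safety/tension events for one sample."""
    events = []
    if sample.speed_kmh > speed_limit_kmh:
        events.append("speeding")
    if sample.accel_ms2 < -4.0:          # hard braking threshold (assumed)
        events.append("harsh_braking")
    if abs(sample.lateral_g) > 0.4:      # sharp turn threshold (assumed)
        events.append("sharp_turn")
    if sample.grip_pressure > 0.8:       # very tight grip read as driver tension
        events.append("possible_tension")
    return events

# Example: one reading flagged for hard braking and a tense grip.
print(classify_events(TelematicsSample(speed_kmh=62, accel_ms2=-5.1,
                                       lateral_g=0.1, grip_pressure=0.85)))
```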

It seems like everyone in the auto industry is thinking outside the box to get a comprehensive view of human behavior 24/7 and bring man and machine closer together.

How far these innovations will go, how they will be absorbed into societal norms, and what ethical and privacy implications they will carry is still unexplored territory. What is clear is that the changes are well underway.

You can already see how quickly Samsung is bringing more vehicles into its branded SmartThings ecosystem. Although the company is currently focused on automotive conveniences, such as starting or stopping the engine of compatible vehicles from the SmartThings mobile app and turning on the heating or air conditioning before you get into your car, these early projects hint at what is still to come. Samsung has already partnered with Google to bring a SmartThings dashboard to Android Auto, so you can control your smart home products or open your garage door from your dashboard after connecting it to a Samsung phone.

In October, Samsung announced that Mercedes-Benz’s MBUX voice assistant would support SmartThings for hands-free control of smart home devices.

Understanding Affective Computing (Emotional AI Segment)

The affective computing, or emotional AI, segment is a market expected to grow from $28.6 billion in 2020 to $140.0 billion by 2025, at a CAGR of 37.4% over the forecast period. Emotional AI, or emotional artificial intelligence, enables computer systems and algorithms to recognize and interpret human emotions by tracking facial expressions, body language, and voice or speech.

Emotion AI strives to bring man and machine closer together.

Facial recognition in vehicles, for example, lets cars or trucks know who the authorized driver is, so the vehicle can automatically recognize you and follow your voice commands. Computer vision algorithms are now precise enough to identify an individual’s face and break it down into landmarks: eyes, tip of the nose, eyebrows, corners of the lips, and so on, and then track a person’s movements to identify their emotions as well.

This is currently done by comparing against large databases of facial expressions that map facial gestures to types of emotion (joy, sadness, anger, contempt, disgust, fear and surprise). Additional software can extend emotion classification to include face identification and verification, age and gender detection, ethnicity detection, multiple-face detection, and much more.
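
As a rough illustration of that pipeline, here is a hedged sketch in Python: face detection uses OpenCV's bundled Haar cascade, while the emotion scorer is a hypothetical placeholder standing in for whatever trained model a vendor would supply.

```python
# Sketch: locating a face and scoring it against the seven basic emotion
# categories mentioned above. Face detection uses OpenCV's bundled Haar
# cascade; score_emotions is a hypothetical placeholder for a trained model.

import cv2

EMOTIONS = ["joy", "sadness", "anger", "contempt", "disgust", "fear", "surprise"]

def detect_faces(gray_image):
    """Return bounding boxes (x, y, w, h) for faces in a grayscale frame."""
    cascade = cv2.CascadeClassifier(
        cv2.data.haarcascades + "haarcascade_frontalface_default.xml"
    )
    return cascade.detectMultiScale(gray_image, scaleFactor=1.1, minNeighbors=5)

def score_emotions(face_crop):
    """Placeholder: a real system would run a trained classifier on the crop
    and return a probability per emotion; here we return a flat distribution."""
    return {emotion: 1.0 / len(EMOTIONS) for emotion in EMOTIONS}

def analyze_frame(frame_bgr):
    """Detect each face in a BGR frame and report its top-scoring emotion."""
    gray = cv2.cvtColor(frame_bgr, cv2.COLOR_BGR2GRAY)
    results = []
    for (x, y, w, h) in detect_faces(gray):
        scores = score_emotions(gray[y:y + h, x:x + w])
        results.append({"box": (x, y, w, h),
                        "top_emotion": max(scores, key=scores.get)})
    return results
```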

Then we add speech recognition software to the mix, complementing and correlating with the facial recognition software, to recognize emotional states from acoustic speech and measure with high levels of accuracy whether the speaker is happy, sad, surprised, angry, or in a neutral state.
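
A simplified sketch of the acoustic side might look like the following; the MFCC feature extraction uses the real librosa library, while the emotion model itself is a hypothetical stand-in.

```python
# Sketch: extracting simple acoustic features from a speech clip for an
# emotion classifier. librosa's MFCC extraction is real; the emotion model
# referenced in the comments is a hypothetical stand-in for a trained model.

import librosa
import numpy as np

def acoustic_features(wav_path: str) -> np.ndarray:
    """Summarize a clip as the mean and variance of its MFCCs."""
    signal, sample_rate = librosa.load(wav_path, sr=16000)
    mfcc = librosa.feature.mfcc(y=signal, sr=sample_rate, n_mfcc=13)
    return np.concatenate([mfcc.mean(axis=1), mfcc.var(axis=1)])

# A trained classifier (not shown) would map the feature vector to one of
# the emotional states named above: happy, sad, surprised, angry, or neutral.
# features = acoustic_features("driver_checkin.wav")
# emotion = speech_emotion_model.predict([features])[0]  # hypothetical model
```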

According to the 7-38-55 rule of personal communication, words influence only 7% of our perception of emotional state, tone of voice accounts for 38%, and body language accounts for the remaining 55%.
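
If each modality produces its own emotion scores, a system could fuse them with weights borrowed from the 7-38-55 rule. The sketch below uses made-up example scores purely for illustration.

```python
# Sketch: fusing per-modality emotion scores with weights from the 7-38-55
# rule (words 7%, tone of voice 38%, body language 55%). The example
# confidence scores are made-up numbers for illustration only.

WEIGHTS = {"words": 0.07, "voice": 0.38, "body": 0.55}

def fuse(scores_by_modality: dict) -> dict:
    """Weighted average of per-emotion scores across the three modalities."""
    fused = {}
    for modality, weight in WEIGHTS.items():
        for emotion, score in scores_by_modality[modality].items():
            fused[emotion] = fused.get(emotion, 0.0) + weight * score
    return fused

example = {
    "words": {"happy": 0.6, "neutral": 0.4},
    "voice": {"happy": 0.3, "neutral": 0.7},
    "body":  {"happy": 0.8, "neutral": 0.2},
}
print(fuse(example))  # body language dominates: happy ~0.60, neutral ~0.40
```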

Emotional AI developers are largely unanimous that the main goal of multimodal emotion recognition is to make human-machine communication more natural. However, as these fields are still relatively new, questions arise about the ethical and privacy boundaries that affect what it means to be human.

Another example of AI-based technology, developed by the MIT Media Lab, can track the wearer’s cardio-respiratory signals and release different scent combinations as needed to help manage certain psychological states, such as stress or anxiety.
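
In the same spirit, a wearable like this could be driven by something as simple as a rule-based trigger. The sketch below is purely illustrative; its signals, thresholds and scent choices are my own assumptions, not the MIT Media Lab design.

```python
# Sketch: a rule-based scent trigger in the spirit of the wearable described
# above. Signal names, thresholds, and scent choices are assumptions made
# for illustration only.

from typing import Optional

def choose_scent(heart_rate_bpm: float, breaths_per_min: float) -> Optional[str]:
    """Pick a calming scent when cardio-respiratory readings suggest stress."""
    if heart_rate_bpm > 100 and breaths_per_min > 20:
        return "lavender"    # strong stress response: calming scent
    if heart_rate_bpm > 90:
        return "chamomile"   # mild elevation: lighter calming scent
    return None              # readings look relaxed; release nothing

print(choose_scent(heart_rate_bpm=108, breaths_per_min=22))  # -> "lavender"
```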

So below you will find a scenario that is not yet common, but that gives you an idea of what is possible today as different AI-enabling technologies come together, and of the impact they will have on the transport and logistics industry.

Scenario

Alexia Bolt is the fictional character of The AI Dilemma, where she personifies a day in the life of each industry from both a positive and a negative perspective, challenging leaders to think wisely about the future world order we want to create in an AI-driven world.

Below is a new storyline for Alexia Bolt to experience, written from a positive perspective.

Alexia Bolt is a courier driver, and a very successful one, with no safety incidents or accidents on her record. Her typical day starts when she gets up at six in the morning and drives to the New York terminal to pick up her assigned truck for the day. She checks her cell phone and is informed that today she has been assigned Truck 3. As she enters the terminal parking lot, Truck 3 has already recognized her gait; it flashes its lights toward her, and as she gets closer, the door automatically unlocks and the truck says “hello”.

Alexia also wears a driving uniform rich in smart sensors. Her morning temperature is checked for any symptoms of Covid-19, and her readings are recorded in her vehicle’s telematics software. Her route for the day has been pre-programmed, and her truck has already been loaded with the packages she will deliver, preloaded earlier that morning by a robot named Robbie.

When she checks the route summary on her dashboard, she can see that there are construction zones on her route, so she activates her rerouting system to minimize traffic and congestion. She asks Alexa what the weather will be like today; Alexa tells her it’s a sunny day with everything clear ahead and wishes her a wonderful day of safe driving. Alexia says thank you with a generous smile on her face and is automatically classified as a happy person with a good attitude.

A refreshing scent of lavender is briefly released to continue the calming vibe as her truck automatically begins to exit the parking lot and start its route for the day. At her first customer destination, she brings the package to the door, rings the doorbell, and meets a pleasant elderly woman who thanks her and wishes her a nice day.

Her sensors automatically register that the customer she is interacting with is friendly and polite. She continues to make over twenty stops over a six-hour period, pausing for lunch at a drive-through where robotic arms hand her a lunch assembled from the phone order she submitted while driving.

As she returns to the parking lot, all voice communications from her customer interactions have been recorded, and all vehicle driving behaviors feed a risk score calculated for different profiles: customer, vehicle and driver. All of this information is aggregated at the device level to create a stronger health and safety footprint while keeping individual behavior confidential. Management can easily see the emotional signals across employees’ driving behaviors, and employees can just as easily see the behaviors of their team.
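
A minimal sketch of how per-trip events could roll up into that kind of aggregate risk summary, reporting only fleet-level statistics so no individual driver's raw scores are exposed, might look like this; the event weights and field names are illustrative assumptions.

```python
# Sketch: rolling per-trip telematics events into an aggregate risk summary.
# Only aggregates leave the summary function, keeping individual behavior
# confidential. Event weights are assumptions made for illustration.

from statistics import mean

EVENT_WEIGHTS = {"speeding": 3, "harsh_braking": 2, "sharp_turn": 1, "possible_tension": 1}

def trip_risk_score(events):
    """Additive risk score for one trip's flagged events."""
    return sum(EVENT_WEIGHTS.get(event, 0) for event in events)

def fleet_risk_summary(trips):
    """Report only aggregate statistics; per-driver scores never leave this function."""
    scores = [trip_risk_score(trip["events"]) for trip in trips]
    return {"trips": len(scores),
            "mean_risk": round(mean(scores), 2),
            "max_risk": max(scores)}

day = [
    {"driver": "A", "events": ["harsh_braking", "possible_tension"]},
    {"driver": "A", "events": []},
    {"driver": "B", "events": ["speeding"]},
]
print(fleet_risk_summary(day))  # {'trips': 3, 'mean_risk': 2.0, 'max_risk': 3}
```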

Open, transparent, collaborative communication is at the heart of the smart sensors found throughout this modern T&L terminal, bringing people and machines together to create a smarter, connected world.

Conclusion

If you are the CEO or a director of a small, medium or large transport and logistics company, have you thought about your affective computing strategy for managing your fleet? Will you be one of the first innovators to pioneer wearable computing, with affective computing innovations interconnected with smarter vehicles?

In short, the transport and logistics industry is changing. Is your organization ready for emotional AI and affective computing? We’ve seen these methods work well in call centers, guiding humans to deal with customer emotions more effectively; will we start to see these innovations transfer into the logistics and transportation industry?

Absolutely we will.

We also know that China is ahead in these innovations, and the U.S. government is already voicing concerns about being left behind in the AI race.

Scenarios like this can help us advance our social context and recognize that only from experience can we design effective emotional AI solutions.

Alexia Bolt is a believable character, and she represents a world that is almost here. It’s just that some of us can’t see it yet.

We have the opportunity to design affective computing approaches that advance our health and safety strategies.

So much of being human resides in our facial expressions and voices, so why not harness the full potential of what humans can do, and recognize that well-designed AI can advance our safety and health in ways that improve our well-being.

