• Home
  • News
  • Project
    • PRYSTINE About
    • PRYSTINE Demonstrators
    • PRYSTINE Publications
    • PRYSTINE DISSEMINATION MATERIAL
    • PRYSTINE Final Event
  • Consortium
  • Contact
  • Impressum
  • Search

 

Programmable Systems for Intelligence in Automobiles

 


About the PRYSTINE Project

The Project

Overview, Goals, Objectives

Demonstrators

Descriptions and Videos

Publications

Overviews and links

Dissemination

Posters, Presentations, etc.

Demonstrators

PRYSTINE Demonstrators

 

1.1 LiDAR + AURIX

 

This demonstrator showcases the 2D MEMS-based LiDAR.

 

PRYSTINE - 1.1 Murata LIDAR demo - person walking away

 

PRYSTINE - 1.1 Murata LIDAR demo-car passing near

 

PRYSTINE - 1.1 Murata LIDAR demo - person walking away and back

 

 

 

1.2 RADAR + AURIX

 

This demonstrator showcases the clustering of radar components: when a fault occurs in one cluster, the system continues to operate on the non-affected cluster, which provides higher availability.
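The failover idea above can be sketched in a few lines. This is a purely illustrative stand-in (function and cluster names are hypothetical, not from the demonstrator itself): a supervisor picks any healthy cluster so the sensing function survives a single-cluster fault.

```python
# Hypothetical sketch of the cluster-failover idea: when a fault is reported
# in one radar cluster, operation continues on a non-affected cluster
# instead of losing the whole sensing function.

def select_active_cluster(cluster_status):
    """Return the name of a healthy cluster, or None if all have failed."""
    healthy = [name for name, ok in cluster_status.items() if ok]
    return healthy[0] if healthy else None

# Both clusters healthy: the primary one is used.
print(select_active_cluster({"cluster_a": True, "cluster_b": True}))   # cluster_a
# Fault in cluster_a: operation fails over to cluster_b.
print(select_active_cluster({"cluster_a": False, "cluster_b": True}))  # cluster_b
```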

 

 

 

 

1.3 Radar

 

NXPNL’s novel radar-to-radar interference detection technique reduces the number of false negatives compared with the state of the art by lowering the noise floor when interference is present. Lab measurements have shown a 5 to 10 dB noise-floor reduction with respect to the state of the art. On-the-road measurements with a radar sensor prototype built from off-the-shelf automotive components confirm this and have shown that, in the presence of a nearby interferer, the detection range can be doubled using this technique, restoring most of the radar’s original performance.
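As a back-of-the-envelope check of how a noise-floor reduction translates into detection range (not taken from the PRYSTINE deliverables): received radar power falls off roughly as 1/R⁴, so lowering the noise floor by Δ dB scales the maximum detection range by 10^(Δ/40). Under that standard radar-equation assumption, a reduction on the order of 12 dB corresponds to a doubled range.

```python
# Illustrative radar-equation estimate: received power ~ 1/R^4, so a
# noise-floor drop of delta_db decibels scales the maximum detection
# range by 10**(delta_db / 40).

def range_gain(delta_db: float) -> float:
    """Factor by which max detection range grows for a noise-floor drop of delta_db dB."""
    return 10 ** (delta_db / 40.0)

print(round(range_gain(5.0), 2))   # 1.33
print(round(range_gain(10.0), 2))  # 1.78
print(round(range_gain(12.0), 2))  # 2.0 -- roughly the "doubled range" regime
```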

PRYSTINE - 1.3 NXPNL vehicle-level health monitoring in Toyota Prius v3

 

 

 

1.4 Radar

 

IMEC‘s demonstration platform comprises the PRYSTINE scalable 60 GHz radar designed in 28 nm CMOS. This demonstrator is described in D6.1 and D6.10. No video demonstration is foreseen.

 

 

 

1.5 IC-, vehicle-level health monitoring

 

NXPNL‘s health monitors analyze the safety of in-vehicle ICs and software components by detecting five fault models: wrong communication timing, corrupt packet data, implausible message streams, and OS and hardware anomalies. The prototyped health monitors were integrated in an autonomous vehicle and demonstrated to detect diverse malfunctions. Based on this monitoring information, the redundant automated-driving ECUs in the vehicle can respond in time to various faults and realize fail-operational behavior of the system.
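One of the fault models listed above, wrong communication timing, can be illustrated with a minimal sketch: flag messages whose inter-arrival time falls outside a configured window. All names and thresholds here are hypothetical, not NXPNL's actual monitor.

```python
# Hypothetical sketch of a timing health monitor: a periodic message stream
# is expected every ~10 ms; messages arriving too early or too late are
# flagged by index. Parameters are illustrative only.

def check_timing(timestamps_ms, min_gap_ms=8.0, max_gap_ms=12.0):
    """Return indices of messages whose inter-arrival gap is out of bounds."""
    faults = []
    for i in range(1, len(timestamps_ms)):
        gap = timestamps_ms[i] - timestamps_ms[i - 1]
        if not (min_gap_ms <= gap <= max_gap_ms):
            faults.append(i)
    return faults

# A 10 ms periodic stream with one late message (index 3).
print(check_timing([0.0, 10.0, 20.0, 45.0, 55.0]))  # [3]
```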

PRYSTINE - 1.5 NXPNL vehicle-level health monitoring in Toyota Prius v3

 

 

 

2.1 Fail-operational autonomous driving platform

 

This demonstrator illustrates the possibility of bringing hardware architectures to the next level of safety for highly automated driving. The sensor fusion failover mechanism developed by TTTech Auto in the project enables the implementation of embedded control to advance safe technologies, thus contributing valuably to the mobility of the future. The benefits of this modularity concept, combining COTS elements such as the SoCs, Infineon‘s AURIX™ automotive microcontroller, the power supply, a deterministic backbone network for low-latency data exchange, multiple cameras, etc., allow for the flexibility of the developed solution and advance the automotive market.

More Information

 

 

 

2.2 Drive-by-wire car

 

EDI‘s demonstrator presents a novel approach to software component integration: the developed COMPAGE framework (a fail-operational system component management framework) and AI-based algorithms are capable of identifying faulty sensors by analyzing different types of data, e.g. from LiDAR, radar, and cameras. The system is equipped with an AURIX microcontroller providing an additional safety integrity level and redundancy, acting as a fallback system for the LiDAR and radar perception subsystems to facilitate successful Automatic Emergency Braking execution.
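A much simpler stand-in for the idea of identifying a faulty sensor by cross-checking redundant readings: the project uses AI-based methods, but a median-deviation vote over range estimates conveys the principle. Everything below (sensor names, values, tolerance) is illustrative.

```python
# Illustrative cross-sensor plausibility check: flag any sensor whose range
# estimate for the same object deviates from the median by more than a
# tolerance. This is NOT the COMPAGE algorithm, only a teaching sketch.

def flag_outlier_sensors(readings: dict, tolerance: float) -> list:
    """Flag sensors whose reading deviates from the median by more than tolerance."""
    values = sorted(readings.values())
    median = values[len(values) // 2]
    return [name for name, v in readings.items() if abs(v - median) > tolerance]

# Three range estimates for the same object; the camera estimate is off.
print(flag_outlier_sensors({"lidar": 24.8, "radar": 25.1, "camera": 31.0},
                           tolerance=2.0))  # ['camera']
```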

See the Video here

 

 

 

2.3 Data Fusion and Fall-back

 

In this demonstrator, the University of Turku, in collaboration with TTS, presents a fully integrated security engineering process for realizing secure autonomous driving, as well as a trust model for evaluating the trustworthiness of sensor data, with a data fusion module for improving the accuracy of object detection and tracking.

The main building blocks of this novel approach are: fail-operational middleware, secure data communication, and the sensors’ reliability-aware data fusion for assisting automated heavy-duty vehicle driving.

See the Video here

 

 

 

2.4 Passenger vehicle for low speed autonomy

 

The main objective of this demonstrator is to show an autonomous parking solution utilizing the newly developed FUSION algorithms. The developed perception algorithms also provide a working basis for the Ford heavy-duty truck demonstrator in SC5. The proposed solution is related to Automated Valet Parking systems and provides fail-operational behavior and robustness through the utilization and fusion of multiple sensor sources, including LiDAR, cameras, and radar.

Watch the results: Watch

Path planning for parking: Watch

Multi Object Tracking: Watch

3D Object Detection: Watch

Occupancy grid filtering: Watch

Semantic segmentation: Watch

 

 

 

2.5 Fail-operational AI Inference Processing

 

The fail-operational multi-processor demonstrates run-time fault detection in a multi-processor system at lower hardware overhead than the full duplication required for lockstep. Videantis GmbH is developing a fail-operational multiprocessor system with flexible redundancy at reduced silicon overhead for an AI algorithm in the project. Results are being finalized and will be available later.

 

 

 

3.1 - 3.3 Demonstrators

 

See an overview of the demonstrators here: Watch

 

 

 

3.1 E/E architecture demonstrator for automotive electronics enabling AD

 

Optimized E/E architecture enabling FUSION-based connected vehicles with autonomous functionality.

PRYSTINE - 3.1 E/E architecture demonstrator for automotive electronics enabling AD

 

 

 

3.2 Simulation, development and validation framework for fail-operational sensor-fusion E/E architecture

 

Optimized E/E architecture enabling FUSION-based connected vehicles with autonomous functionality.

PRYSTINE - 3.2 Simulation, development and validation framework for fail-operational sensor-fusion E/E architecture

 

 

 

3.3 Dynamically shaped, reliable mobile communication

 

* LiDAR / RADAR sensor compound demonstrator
* Enhanced reliability and performance of V2N data connections
* Dependable embedded control by co-integration of cellular connections and network-level connection management
* Fail-operational V2N communication for urban and rural environments based on FUSION

PRYSTINE - 3.3 Dynamically shaped, reliable mobile communication

 

 

 

4.1 Hardware In the Loop (HIL) for lidar sensor data processing

 

 

 

 

4.2 Hardware In the Loop (HIL) for back-maneuver assist

 

Results are being finalized and will be available later.

 

 

 

4.3 Hardware In the Loop (HIL) for data fusion based VRU detection

 

PRYSTINE - 4.3 Hardware In the Loop (HIL) for data fusion-based VRU detection

 

 

 

4.4 Hardware In the Loop (HIL) for back-maneuver assist

 

Results are being finalized and will be available later.

 

 

 

4.5 CiThruS field test

 

PRYSTINE - 4.5 CiThruS field test

 

 

 

4.6 Trajectory planning and vehicle dynamics control

 

PRYSTINE - 4.6 Trajectory planning and vehicle dynamics control

 

 

 

4.7 Fusion of real and virtual sensor data for chassis control

 

PRYSTINE - 4.7 Fusion of real and virtual sensor data for chassis control

 

 

 

4.9 Lab demo for Programmable Accelerator Architecture for multi-sensor data fusion and perception

 

Results are being finalized and will be available later.

 

 

 

5.1 Heavy Duty Truck

 

PRYSTINE - 5.1 Heavy Duty Truck - Use Case 1

 

 

PRYSTINE - 5.1 Heavy Duty Truck - Use Case 2

 

 

PRYSTINE - 5.1 Heavy Duty Truck - Use Case 3

 

 

 

5.2 Truck (3-axle lorry with full-size trailer)

 

PRYSTINE - 5.2 Truck (3-axle lorry with full-size trailer)

 

 

 

6.1 "Traffic light time-to-green"

 

Based on the received traffic light phase schedule, the system calculates the approaching speed at which the vehicle reaches the green wave, according to the actual traffic condition (vehicles ahead).
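The speed calculation described above can be sketched very simply: given the distance to the stop line and the time at which the light turns green, pick an approach speed that makes the vehicle arrive as the green phase starts, capped at the speed limit. All names and numbers below are illustrative, not the demonstrator's actual algorithm (which also accounts for vehicles ahead).

```python
# Illustrative "time-to-green" speed calculation: arrive at the light just
# as it turns green, without exceeding the speed limit.

def green_wave_speed(distance_m: float, time_to_green_s: float,
                     speed_limit_ms: float) -> float:
    """Approach speed (m/s) to reach the light at the start of green."""
    if time_to_green_s <= 0:          # already green: drive at the limit
        return speed_limit_ms
    return min(distance_m / time_to_green_s, speed_limit_ms)

# 300 m from the light, green in 25 s, 50 km/h limit (~13.9 m/s):
print(round(green_wave_speed(300.0, 25.0, 13.9), 1))  # 12.0
```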

PRYSTINE - 6.1 "Traffic light time-to-green"

 

 

 

6.2 "Vulnerable Road User (VRU) detection and Trajectory recognition"

 

The vehicle, with the support of the infrastructure (I2V), recognizes the type of obstacles and predicts the Vulnerable Road Users (VRUs) trajectory, in order to avoid potential collisions.

 

 

 

6.3 "Driver monitoring and emergency maneuver"

 

The Driver Monitoring System (DMS) detects the driver's cognitive status to decide whether s/he is still capable of controlling the vehicle or, alternatively, whether s/he is able to get back into the control loop in case of a "take-over request" (TOR) from the system.

PRYSTINE - 6.3 "Driver monitoring and emergency maneuver"

 

 

 

7.1 Shared control and arbitration (Level 2-3), studying driver-automation interaction and methods for vehicle authority transition in a Driver-in-the-Loop (DiL) simulator

 

a) Demo shows how automation assists the driver under two different conditions (driver distraction and possible collision).

PRYSTINE - 7.1 Shared control and arbitration (Level 2-3)

 

 

b) Demo on the visualization of the HMI.

PRYSTINE - 7.1 Demo on the visualization of the HMI

 

 

c) Demo on the Driver monitoring system (Driver Activity).

PRYSTINE - 7.1 Demo on the Driver monitoring system (Driver Activity)

 
 
 

d) Demo on the Driver monitoring system (Drowsiness detection)

PRYSTINE - 7.1 Demo on the Driver monitoring system (Drowsiness detection)

 

 

 

7.1 - 7.3 Demonstrators

 

See an overview of the demonstrators here: Watch

 

 

 

7.2 Layered Control (Level 2-3-4), studying cooperation between a passenger car and a bus, and the driver's role in supervising or controlling the vehicle when requested.

 

a) Demo shows how automation and the driver operate under different levels of scenario complexity.

PRYSTINE - 7.2 Layered Control (Level 2-3-4)

 

 

b) Demo shows the performance of the Driver Monitoring System.

PRYSTINE - 7.2 Demo shows the performance of the Driver monitoring system

 
 
 
 

7.3 Highly automated vehicle (Level 3-4), studying AI-based decision algorithms for urban and highway scenarios.

 

Acknowledgement

PRYSTINE has received funding within the Electronic Components and Systems for European Leadership Joint Undertaking (ECSEL JU) in collaboration with the European Union's H2020 Framework Programme and National Authorities, under grant agreement n° 783190.

Imprint Privacy Policy

 

ECSEL Joint Undertaking

EU