FXB-Z04005 BYD Dolphin Autonomous Driving Teaching and Training Platform
i. About the Product
The platform is built on a production BYD Dolphin. Using the vehicle control system's communication protocol together with an independently developed vehicle control unit (VCU), the platform obtains top-level control authority over the vehicle, enabling by-wire control of the lighting, steering, braking, and drive systems. Data collected by the autonomous driving system's LiDAR, vehicle cameras, millimeter-wave radar, ultrasonic radar, and inertial navigation system is processed by perception, decision-making, planning, and control algorithms and converted into CAN protocol messages sent to the vehicle, realizing autonomous driving under specific road conditions. The platform is suited to undergraduate teaching and practical training in autonomous driving.
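As an illustration of the control path described above (algorithm outputs converted into CAN messages for the vehicle), the sketch below packs a steering/throttle command into an 8-byte CAN data field using only the Python standard library. The signal layout (0.1°/bit steering, 1%/bit throttle) is invented for illustration; the real VCU's CAN matrix is defined by the platform's protocol documentation.

```python
import struct

def pack_drive_command(steering_deg: float, throttle_pct: float) -> bytes:
    """Pack a steering/throttle command into an 8-byte CAN data field.

    Hypothetical layout: int16 steering angle in 0.1-degree units
    (little-endian), uint8 throttle percent, five reserved zero bytes.
    """
    steering_raw = int(round(steering_deg * 10))  # 0.1 deg per bit
    throttle_raw = int(round(throttle_pct))       # 1 % per bit
    return struct.pack("<hB5x", steering_raw, throttle_raw)

frame_data = pack_drive_command(-12.5, 30)
assert len(frame_data) == 8  # a classic CAN data field is at most 8 bytes
```

The byte-packing step is the essence of by-wire control: each planner output must be scaled and placed into the exact bit positions the VCU expects.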

ii. Features

1. Developed on the BYD Dolphin, the platform supports manual driving, L2 assisted driving, L3 autonomous driving, and OTA intelligent remote upgrades. L3 autonomous driving mode can be engaged with one press of the autonomous driving button on the dashboard.

2. The vehicle carries the core components of an autonomous driving system: three 32-line LiDAR units, a forward-view camera, high-precision integrated navigation, millimeter-wave radar, ultrasonic radar, and an autonomous driving computing platform. These support high-precision map creation, high-precision positioning, drivable-area detection, lane marking detection, obstacle detection, dynamic object tracking, and obstacle classification and recognition. The LiDAR's maximum detection range of 150 meters gives a vehicle driving autonomously at 100 km/h up to about 5 seconds of observation and response time, helping ensure the safety of high-speed autonomous driving. The autonomous driving system's built-in algorithms are open to secondary development.

3. The vehicle's autonomous driving system is developed on the open-source Autoware stack. Control instructions are written in C, and a complete human-machine interaction interface is provided, making the system simple and convenient to use.

4. The cab is fitted with a high-definition display that shows LiDAR and camera imaging and recognition results in real time, giving students a clearer picture of the software's internal algorithm logic.

5. A reserved OBU interface allows vehicle-road cooperation functions to be added, enabling communication between the vehicle and the outside world.

6. An automotive-grade computing platform meets the computing demands of driverless scenarios. A built-in multi-sensor clock synchronization function reduces inter-sensor timing differences during fusion and improves the accuracy of autonomous driving data.

7. Supports installation and debugging training for the millimeter-wave radar.

8. Supports installation and debugging training for the ultrasonic radar.

9. Supports installation and debugging training for the 360° panoramic camera and monocular camera.

10. Supports installation and debugging training for the LiDAR.

11. Supports installation and debugging training for the training platform.

12. Supports installation and debugging training for the brake-by-wire system.

13. Supports debugging training for the drive-by-wire system.

14. Supports installation and debugging training for the steer-by-wire system.

15. Supports installation and debugging training for the combined inertial navigation system.

16. Supports debugging training for the lighting-by-wire system.

17. Supports training in high-precision map collection and production.


18. Supports test and verification of the whole vehicle's autonomous driving functions.

19. Supports test and verification of the whole vehicle's by-wire control strategy.

20. Supports installation and test verification of the ROS system.

21. Supports installation and test verification of the Autoware autonomous driving system.

22. Supports installation and test verification of the LiDAR ROS driver.

23. Supports installation and test verification of the camera ROS driver.

24. Supports installation and test verification of the millimeter-wave radar ROS driver.

25. Supports installation and test verification of the ultrasonic radar ROS driver.

26. Supports installation and test verification of the integrated navigation ROS driver.

27. Supports installation and test verification of the by-wire chassis ROS driver.

28. Supports practical application of the lane recognition algorithm.

29. Supports practical application of visual recognition algorithms for pedestrians, vehicles, and traffic lights.

30. Supports practical application of the 360° panoramic stitching algorithm.

31. Supports practical application of visual recognition algorithms for cardboard boxes and mineral water bottles.

32. Supports practical application of the LiDAR obstacle recognition algorithm.

33. Supports practical application of camera and LiDAR fusion target detection algorithms.

34. Supports practical application of camera and LiDAR fusion target tracking algorithms.

35. Supports practical application of LiDAR point cloud recording and point cloud map generation algorithms.

36. Supports practical application of LiDAR- and navigation-based high-precision map positioning algorithms.

37. Supports practical application of the A* global path planning algorithm.

38. Supports practical application of the lattice planner local path planning algorithm.

39. Supports practical application of the PID control algorithm.
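The PID control practice mentioned above can be sketched as a textbook discrete PID loop. The gains and the toy integrator plant below are illustrative only, not the platform's actual tuning.

```python
class PID:
    """Textbook discrete PID controller (illustrative gains)."""

    def __init__(self, kp: float, ki: float, kd: float, dt: float):
        self.kp, self.ki, self.kd, self.dt = kp, ki, kd, dt
        self.integral = 0.0
        self.prev_error = 0.0

    def update(self, setpoint: float, measurement: float) -> float:
        error = setpoint - measurement
        self.integral += error * self.dt
        derivative = (error - self.prev_error) / self.dt
        self.prev_error = error
        return self.kp * error + self.ki * self.integral + self.kd * derivative

# Track a 10 m/s speed setpoint against a toy plant in which the speed
# changes by (commanded acceleration * dt) each step.
pid = PID(kp=0.8, ki=0.2, kd=0.05, dt=0.1)
speed = 0.0
for _ in range(200):                       # 20 s of simulated time
    speed += pid.update(10.0, speed) * 0.1
assert abs(speed - 10.0) < 0.5             # converged near the setpoint
```

In the real vehicle the same loop would read speed from the chassis CAN bus and write an acceleration or torque command back through the VCU.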

iii. Technical Parameters

1. Complete vehicle

Overall dimensions (L × W × H): ≥ 4150 mm × 1770 mm × 1600 mm

Braking method: front and rear disc brakes

Drive layout: front-wheel drive

Suspension: MacPherson strut front, torsion beam rear

Wheelbase: 2700 mm

Track width: 1530 mm

Minimum turning radius: 5.20 m

Drive motor type: AC permanent magnet synchronous motor

Drive motor power: 70 kW

2. Power battery system:

Battery capacity: 44.9 kWh

3. LiDAR:

Number of lines: 32

Laser wavelength: 905nm

Laser safety class: Class 1 (eye-safe)

Ranging capability: 150 m (80 m @ 10% NIST reflectivity)

Accuracy (typical value): ± 3cm

Horizontal field of view angle: 360°

Vertical field of view angle: -16° to +15°

Frame rate: 5 Hz / 10 Hz / 20 Hz

Rotation speed: 300/600/1200 rpm (at 5/10/20 Hz)

Output point rate: ~300,000 pts/s (single echo mode), ~600,000 pts/s (dual echo mode)

UDP packet content: timestamp, three-dimensional spatial coordinates, and reflection intensity

Ethernet output: 100 Mbps

Output data protocol: UDP packets over Ethernet

Working voltage: 9V-32V

Working temperature: -30 ℃~+60 ℃

Product power: 12W

Storage temperature: -40 ℃~+85 ℃

Protection level: IP67

Time synchronization: $GPRMC with 1PPS

Dimensions: Ø120 mm × H110 mm

Weight (excluding data cable): ~0.87 kg
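The LiDAR spec above says each UDP packet carries a timestamp, three-dimensional coordinates, and reflection intensity. The sketch below parses such a payload under an invented fixed-size record layout; the sensor's real packet format must be taken from its protocol manual.

```python
import struct

# Hypothetical per-point record; the actual sensor uses its own layout.
POINT_FMT = "<dfffB"   # float64 timestamp, float32 x/y/z (m), uint8 intensity
POINT_SIZE = struct.calcsize(POINT_FMT)  # 21 bytes, no padding with "<"

def parse_points(payload: bytes):
    """Yield (timestamp, x, y, z, intensity) tuples from a UDP payload."""
    for off in range(0, len(payload) - POINT_SIZE + 1, POINT_SIZE):
        yield struct.unpack_from(POINT_FMT, payload, off)

# Round-trip two synthetic points through the parser.
pkt = struct.pack(POINT_FMT, 1.0, 1.5, -0.2, 0.8, 200)
pkt += struct.pack(POINT_FMT, 1.1, 1.6, -0.1, 0.9, 180)
points = list(parse_points(pkt))
assert len(points) == 2 and points[0][4] == 200
```

In practice a ROS driver performs exactly this step, turning raw UDP payloads into point cloud messages for the perception stack.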

4. Millimeter wave radar:

Detection range:

±9°: 0.2 m~250 m (far range)

±45°: 0.2 m~70 m / 100 m (near range / far range)

±60°: 0.2 m~20 m (near range)

Range resolution:

1.79 m (far range)

0.39 m (0.2 m when static) (near range)

Operating bandwidth: 76 GHz to 77 GHz.

Range detection accuracy: ≤ ±0.1 m (near range), ±0.4 m (far range).

Speed detection accuracy: ≤ ±0.1 km/h.

Azimuth detection accuracy: ≤ ±0.3° @ 0°, ±1° @ ±45°, ±5° @ ±60° (near range); ±0.1° (far range).

5. Surround view camera

Sensor: AR0147 (RGGB); ISP: AP0101

Optical format: 1/4-inch CMOS

Output resolution: 1280 (H) × 720 (V)

Pixel size: 3 μm

Back-side illumination (BSI)

Frame rate: 1280 × 720 @ 30 fps; HDR dynamic range: 140 dB (120 dB with LFM)

Output data: parallel / YUV422 8-bit; serializer: Maxim MAX96705

Camera interface: coaxial power over cable (PoC), 5~16 V

Current: < 200 mA

Connector: Amphenol (Z-type FAKRA)

Operating temperature range: -40~+85°C

Waterproof; reference lens dimensions: W 30 mm, L 30 mm, H 22.5 mm

Weight: < 50 g

6. Ultrasonic radar:

Working power supply: +12 V~24 V

Working current: < 200 mA (+12 V supply)

Working temperature range: -40°C to +85°C

Stable ranging range: 200 mm~3500 mm; extreme range: 130 mm~5000 mm (wall as reflecting surface)

Accuracy: 0.5% of detection distance

Resolution: 5 mm

Communication interface: compatible with CAN 2.0A and CAN 2.0B

Sampling rate and CAN transmission cycle: 100 ms

Probe protection level: IP67

Probe beam angle: 60°

Box dimensions: 161 mm (190 mm with mounting ears) × 72 mm × 46 mm

7. Forward camera:

Processor: FPGA, Dual-Core ARM

Memory: 1 GB

Flash: 8 GB

Lens focal length: 8mm

Distance measurement range: 3m-100m

Distance measurement error: within 5%

Baseline: 12cm

Dynamic range: 120dB

Resolution: 1280*720

Field of view: HFOV 38° / VFOV 21°

Pitch angle: 70°~90°

Operating voltage: 9~36V

Power consumption: < 6 W

Storage temperature: -30℃~85℃

Operating temperature: -20℃~70℃

Image frame rate: 10fps

8. Autonomous driving computing platform:

Basic computing unit parameters:

AI computing power: 8 TOPS

DSP: dual-core C66x (1.35 GHz) + C7x (1 GHz)

CAN modules: ≥16 (support CAN-FD)

Ethernet: 8 channels (1Gbps)

Memory: 4GB LPDDR4x

Internal storage: 32GB eMMC

Core computing unit parameters:

CPU: 8 cores 64-bit

AI computing power: 32 TOPS

Memory: 32GB LPDDR4x

Internal storage: 32GB eMMC

SSD interface: supports M.2 NVMe (512 GB by default)

9. Combined inertial navigation system:

Performance: 432 channels; cold start < 30 s; signals: BDS B1/B2, GPS L1/L2, GLONASS G1/G2, GALILEO E1/E5b; initialization time < 5 s; update rate: 20 Hz; differential data: RTCM 2.x/3.x

Reliability: > 99.9%; data formats: NMEA-0183, Unicore

Accuracy: single-point plane: 13 m, elevation: 3.0 m; RTK plane: 1 cm + 1 ppm, elevation: 15 cm + 1 ppm; DGPS plane: 0.4 m, elevation: 0.8 m; time accuracy: 20 ns; velocity accuracy: 0.03 m/s; heading accuracy: 0.2° per 1 m baseline
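The "1 cm + 1 ppm" RTK figure quoted above combines a constant term with a baseline-dependent term; the helper below makes the arithmetic explicit (illustrative only, horizontal component).

```python
def rtk_plane_error_cm(baseline_km: float) -> float:
    """Horizontal RTK error for a '1 cm + 1 ppm' spec: a 1 cm constant
    plus one part-per-million of the baseline length (1 mm per km)."""
    return 1.0 + 0.1 * baseline_km  # 1 ppm of 1 km = 1 mm = 0.1 cm

# At the base station the ppm term vanishes; at a 10 km baseline it
# contributes a second centimetre.
assert rtk_plane_error_cm(0.0) == 1.0
assert abs(rtk_plane_error_cm(10.0) - 2.0) < 1e-9
```

This is why RTK corrections degrade gracefully with distance from the base station rather than cutting off sharply.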

10. Hardware system:

Hardware: 32-bit communication processor; FLASH: 512 KB; DDR2: 256 KB

Communication interface: GX16-8 × 1, supports RS232

Radio interface: GX12-4 × 1, supports RS232; indicator lights: 3; 4G interface: SMA × 1; GNSS interface: TNC × 2; SIM/UIM slots: 1

Physical environment: standard power supply 12 V / 1.5 A

Power supply range: 9~36V

Power consumption: standby 220~250 mA; communicating: 80~330 mA

Size: 100x119x38 mm

Weight: 280g

Protection level: IP30

Working temperature: -35~+75°C

Storage temperature: -40~+85°C

Relative humidity: 95% (no condensation)

iv. Basic Configuration

One BYD Dolphin autonomous driving teaching and training platform, one 360-degree panoramic calibration cloth, one corner radar reflector, one camera calibration board, one infrared rangefinder, and one vertical measuring device.