What is Bird’s Eye View ADAS Application and How to Develop It Using the Zynq® UltraScale+™ MPSoC FPGA?

Bird’s eye view definition, HW/SW setup and implementation algorithms

Farhad Fallahlalehzari, Applications Engineer

Will the world be a better place to live in with autonomous cars driving around us? Or would it be unsafe and scary? Perhaps someone was asking the same question when the first steam-powered automobile capable of human transportation was built in 1769 [1]! As a person who likes driving, I wouldn’t want a ‘fully’ autonomous car, but I would like some assistance from the car to avoid accidents. Advanced Driver Assistance Systems (ADAS) can help the driver in various ways, such as providing a 360-degree surround view of the car, a bird’s eye view, forward collision detection, smart rear view, driver drowsiness detection, pedestrian detection, blind spot detection and lane departure detection.

Since the responsiveness of a driver assistance system is vital, the processing unit must deliver high performance to support the above features. Moreover, with the emergence of electric cars, a low-power yet high-performance processing system covers most of the main requirements. FPGAs are known to be low power and to provide high performance, since computationally heavy algorithms can be accelerated on them. I have covered why FPGAs are a better choice for ADAS applications than CPUs, GPUs and ASICs in another blog titled “How to develop an FPGA-based Embedded Vision application for ADAS”. The development process of a driver drowsiness detection application using FPGAs is also covered in that blog. In this blog, you will learn about bird’s eye view as a subset of ADAS and how to implement it using FPGAs.

What is Bird’s Eye View?

The bird’s eye view is a vision monitoring system used in automotive ADAS technology that provides a 360-degree, top-down view of the car’s surroundings.
The main benefit of this system is to assist the driver in parking the vehicle safely; it can also be used for lane departure warning and obstacle detection. Such a system normally includes between four and six fish-eye cameras mounted around the car to provide right, left, front and rear views of the car’s surroundings.

Figure 1: Bird's eye view camera setup

The HW and SW required to run such a system are explained in the following sections.

Hardware setup

As mentioned, FPGAs have shown compelling results over GPUs, ASICs and CPUs as the ADAS processing unit, in both power consumption and performance. You can read more about it here. Since the main competitors of FPGAs are GPUs, especially in parallelism, I have dedicated another blog to comparing FPGAs vs GPUs, which I highly recommend you read.

Aldec has designed a bird’s eye view application to help ADAS designers. The application is implemented using the TySOM-3-ZU7EV embedded development board, an FMC-ADAS daughter card and four Blue Eagle cameras, each with a 192-degree wide-angle lens, running at 30fps. The following image shows the HW setup for this demo.

Figure 2: Aldec bird's eye view demo setup

Some of the main features of the TySOM-3-ZU7EV board are as follows:

- Quad-core ARM Cortex-A53 Application Processing Unit
- Dual-core ARM Cortex-R5 Real-Time Processing Unit
- 504K logic cells, 157K flip-flops, 38Mb RAM, 1728 DSP slices
- 4GB DDR4 RAM for the Programmable Logic
- DDR4 SODIMM memory for the Processing System
- Video Codec Unit
- HDMI 2.0, USB 3.0, Wi-Fi/BT, Ethernet, SATA, CAN, QSFP+, Pmod, DisplayPort

Some of the main features of the FMC-ADAS card are as follows:

- 5x FPD-Link III deserializers terminated with HSD camera connectors
- URM37 V4 ultrasonic module support
- LIDAR-Lite V2 module interface

One of the main advantages of using the TySOM-3-ZU7EV development board is that it contains a Xilinx Zynq US+ MPSoC device and all the peripherals necessary to implement an ADAS application.
The Zynq device integrates an ARM processor and an FPGA into a single chip, which gives you much more flexibility. You can read more about the structure and advantages of the Zynq device here.

Software setup

The main software tools used to create this design are the Xilinx SDSoC tool and the Vivado Design Suite. SDSoC provides an easy-to-use Eclipse IDE and a complete development environment for the Zynq MPSoC device. The main advantage of this tool is that it automates software acceleration in programmable logic; i.e. you can write your algorithms in C/C++ and have the tool convert them to HDL. The Vivado tool is used for hardware creation and bitstream generation for the Xilinx Zynq US+ MPSoC device.

Bird’s eye view algorithm implementation

Let’s jump into the details of how we implemented the bird’s eye view demo using the Zynq device. To use this implementation, or to customize the application using the SDSoC tool, Aldec provides an SDSoC platform for the TySOM-ADAS kit. This platform gives you ready-to-use HW covering all the peripherals on the TySOM-3-ZU7EV board, includes the HW setup for the FMC-ADAS card and the camera IPs, and contains the Linux binary files to run on the board. Basically, the platform provides everything you need to start designing an ADAS application. You can see an overview of the prebuilt HW in the following image.

Figure 3: Designed HW for bird's eye view using Vivado

The development of Aldec’s bird’s eye view application has six main steps. To reach 30fps, we needed to accelerate some of the algorithms in the FPGA. Figure 4 shows the details of the bird’s eye view implementation on the Zynq device.

Figure 4: Bird's eye view implementation steps

Frame capture: In this step, we read the frames from the four cameras, which were connected to the FPGA through the high-speed connectors on the FMC-ADAS card.

Frame resizing: The captured images were bigger than what we needed, so we resized them.
This step was accelerated inside the FPGA. By instantiating four resizing functions, we could resize all four images in parallel.

Removing lens distortion: As mentioned, the automotive cameras used for this project have 192-degree wide-angle lenses. This wide angle causes distortion, most visibly for straight lines near the edge of the lens, which no longer appear completely straight. To remove these distortions we had to apply correction algorithms, which were again accelerated inside the FPGA.

Perspective transformation: The surround-view cameras introduce a major perspective effect. This effect complicates the subsequent image processing and also makes it harder for the driver to accurately gauge distances. As shown in the following image, the raw images must be transformed to the bird’s eye view.

Figure 5: Perspective transformation

Creating the final image: After adjusting the perspective, we had the images of each view (right, left, front and back) ready. To make a 360-degree bird’s eye view image, they needed to be stitched together. Before stitching, the overlapping parts of the images were removed. This step was done on the ARM processor of the Zynq device, as it did not need acceleration.

Displaying the results: With all the images ready, we just needed to show them on a screen. To do so, we used the DisplayPort on the TySOM-3-ZU7EV board, which provided better performance and can handle multiple resolutions.

Tip: if you are doing this exercise yourself, to test whether the design is working well we recommend placing a chessboard pattern under the HW setup to help you assess the level of distortion.

The following image shows the output on the screen. On the left side of the screen, you can see the bird’s eye view results of the car. In this demo, a Plexiglas model simulates a real car.
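To make the perspective-transformation and stitching steps above more concrete, here is a minimal CPU-only Python sketch. It is purely illustrative: the homography values are made up, nearest-neighbour sampling is used for brevity, and images are plain nested lists of grey values. In the actual demo these stages run as C/C++ functions accelerated into the FPGA fabric via SDSoC, not on a CPU.

```python
def apply_homography(H, x, y):
    """Map pixel (x, y) through a 3x3 homography H (row-major nested lists)."""
    xh = H[0][0] * x + H[0][1] * y + H[0][2]
    yh = H[1][0] * x + H[1][1] * y + H[1][2]
    w  = H[2][0] * x + H[2][1] * y + H[2][2]
    return xh / w, yh / w

def warp_to_birds_eye(src, H_inv, out_w, out_h):
    """Inverse-map every output pixel back into the source image using the
    inverse homography H_inv (nearest-neighbour sampling); pixels that fall
    outside the source stay 0."""
    h, w = len(src), len(src[0])
    out = [[0] * out_w for _ in range(out_h)]
    for y in range(out_h):
        for x in range(out_w):
            sx, sy = apply_homography(H_inv, x, y)
            sx, sy = int(round(sx)), int(round(sy))
            if 0 <= sx < w and 0 <= sy < h:
                out[y][x] = src[sy][sx]
    return out

def stitch_vertical(top, bottom, overlap):
    """Stitch two equally wide top-down views vertically, dropping the
    `overlap` duplicated rows from the second view before concatenating
    (the demo removes overlapping regions before stitching)."""
    return top + bottom[overlap:]

if __name__ == "__main__":
    src = [[1, 2, 3],
           [4, 5, 6]]
    # Inverse homography that shifts sampling one pixel to the right,
    # i.e. the output appears shifted one pixel to the left.
    T = [[1, 0, 1], [0, 1, 0], [0, 0, 1]]
    print(warp_to_birds_eye(src, T, 3, 2))
```

A real homography for each camera would be obtained by calibrating against known ground markings (e.g. the chessboard pattern mentioned in the tip above) and would be applied after lens-distortion removal.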
Figure 6: Aldec bird's eye view demonstration

The above demo unit has been displayed at a number of trade shows, including Embedded Vision Summit 2018 and DAC 2018, and has received considerable interest from ADAS designers. You can watch a video of the demo, taken at the Embedded Vision Summit 2018 conference in Santa Clara, CA, here.

In this blog, the Aldec bird’s eye view reference design was explained step by step. In future blogs, we will cover more ADAS algorithms, such as forward collision detection and object detection.

References

[1] Eckermann, Erik (2001). World History of the Automobile. SAE Press. p. 14. ISBN 9780768008005.
[2] Appia, Vikram, et al. "Surround view camera system for ADAS on TI’s TDAx SoCs." Texas Instruments, Oct. 2015.
[3] Luo, Linbo, et al. "Look-up Table-Based Hardware Implementation of Bird's-Eye View System for Camera Mounted on Vehicle." International Journal of Digital Content Technology and its Applications 6.15 (2012).
[4] Liu, Yu-Chih, Kai-Ying Lin, and Yong-Sheng Chen. "Bird’s-eye view vision system for vehicle surrounding monitoring." International Workshop on Robot Vision. Springer, Berlin, Heidelberg, 2008.
[5] Luo, Lin-Bo, et al. "Low-cost implementation of bird's-eye view system for camera-on-vehicle." IEEE International Conference on Consumer Electronics (ICCE), 2010.