Posts

Demonstration Videos

In this short blog post, there are some demonstration videos showing how the car works.

Week 4

Achievements of the project as the bench inspection approaches

Following week 4 and the final lab day, the car is in a hardware-complete form. All connections are soldered and everything is mounted in place on the chassis. The AI camera successfully detects objects and outputs the results to the terminal. The sensors detect obstacles and control the wheels via the program that was created. There is the occasional collision with objects such as walls, but some software tweaking may reduce how often this happens.

Summary of the fourth week in the lab

The circuit design was finalised and implemented using soldered connections on a Veroboard, and the Battery HAT now allows the system to move around without any wired connections. The camera is also connected via its ribbon cable within the same system. The image below shows the completed car.

Image showing the completed car. Image showing ...
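The software tweaking mentioned above could take the form of a stop threshold with hysteresis, so the car commits to stopping earlier and only resumes once it is clearly past the obstacle. This is a minimal hypothetical sketch; the names and distances are assumptions, not the project's actual code:

```python
# Illustrative only: a stop/resume rule with hysteresis to reduce
# wall collisions. Thresholds are made-up values for the sketch.
STOP_CM = 20.0    # stop when the front reading drops below this
RESUME_CM = 30.0  # a larger resume distance avoids stop/start oscillation

def should_move(front_cm: float, currently_moving: bool) -> bool:
    """Return True if the car should be moving given the front distance."""
    if currently_moving:
        return front_cm > STOP_CM     # keep going until too close
    return front_cm > RESUME_CM       # only resume once clearly clear

# At 25 cm the car keeps moving if already moving, but will not
# resume from a stop, which damps oscillation near the threshold.
```

Raising `STOP_CM` trades a wider safety margin for earlier, more conservative stops; the gap between the two thresholds is what prevents rapid stop/start flicker at the boundary.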

Week 3

Goals of the project as decided so far

Following discoveries made in the lab session described in this week's blog, and due to issues with the AI camera, we have decided to run the project on two Raspberry Pis in order to develop both systems as far as possible in the time frame. Other than this, the goals remain the same as set out in the first briefing.

Summary of the third week in the lab

Training a custom AI model on the dataset made last week, using YOLOv8n, was very time consuming: finding a method that converts the model into the IMX500 format, and completing the required post-training quantization, proved difficult. After all the time spent creating a custom dataset and training the AI on it, for this lab session we decided to focus on getting YOLOv8 to run using pre-trained models, as the custom model was not working for an unknown reason. Nearly all of the focus during this lab session was on the AI ca...
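Post-training quantization, which the IMX500 conversion step requires, maps a trained model's float weights onto small integers after training, trading a little precision for much smaller, faster inference. The toy sketch below shows the core idea of symmetric int8 quantization only; the real IMX500 toolchain and Ultralytics export pipeline do considerably more than this:

```python
def quantize_int8(weights):
    """Symmetric post-training quantization: map floats to int8 codes.

    The largest absolute weight is scaled to 127, and every weight is
    rounded to the nearest integer multiple of that scale.
    """
    scale = max(abs(w) for w in weights) / 127.0
    codes = [round(w / scale) for w in weights]
    return codes, scale

def dequantize(codes, scale):
    """Recover approximate float weights from int8 codes."""
    return [c * scale for c in codes]

w = [0.5, -1.27, 0.03]
codes, scale = quantize_int8(w)
# codes are small integers; dequantize(codes, scale) approximates w
```

The rounding step is where accuracy is lost, which is why quantized custom models often need calibration data and careful validation before deployment.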

Week 2

Goals of the project as decided so far

The goals for the project have remained the same as set out last week. However, during this week's project session, one of the ultrasonic sensors was found to be broken, so the project will now continue with 3 ultrasonic sensors: 1 on the front and 1 on each side, for left and right detection.

Summary of the second week in the lab

For the duration of this lab session, the primary focus was connecting the ultrasonic sensors to the Raspberry Pi, along with the motor controllers and the 4 motors, 1 for each wheel. Firstly, we focused on connecting the ultrasonic sensors to the Raspberry Pi as per the circuit design at the end of last week's blog. We received mixed results once we started trying to get 2 sensors to work at the same time; this led to multiple hours of debugging within the lab session, where it was found that one of the sensors was bro...
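An ultrasonic sensor measures distance indirectly: the Raspberry Pi times how long the echo pulse stays high and converts that duration using the speed of sound, halving it because the pulse covers the round trip. A minimal sketch of that conversion (the GPIO timing itself is hardware-specific and omitted here):

```python
SPEED_OF_SOUND_CM_PER_S = 34300  # approximate, in air at around 20 °C

def pulse_to_distance_cm(echo_pulse_s: float) -> float:
    """Convert an echo pulse duration (seconds) to distance (cm).

    The sound travels to the obstacle and back, so the one-way
    distance is half of duration x speed.
    """
    return echo_pulse_s * SPEED_OF_SOUND_CM_PER_S / 2

# A 1 ms echo pulse corresponds to roughly 17 cm.
d = pulse_to_distance_cm(0.001)
```

When two sensors fire at once their pulses can interfere, which is one reason readings go unreliable; staggering the trigger pulses in software is a common mitigation.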

Week 1

Aim of the project

The main aim of our project is to create an object-avoidance vehicle that uses both ultrasonic sensors and camera detection to identify any objects in the vehicle's path, as well as in the surrounding area on all sides, and then adjust its speed and direction accordingly. To start off, we set out each component of the project:

- The main drivetrain, consisting of 4 motors and 2 dual motor controllers.
- 4 ultrasonic sensors, 1 on each side, to provide an estimation of the area surrounding the car.
- A Raspberry Pi AI Camera for camera detection and image processing.

All of the above components will be controlled and powered by a Raspberry Pi 4 Model B.

Goals of the project as decided so far

- Use the Raspberry Pi AI Camera, along with specific image processing models, to detect objects and signs relating to road networks in the UK, for example traffic lights and stop signs.
- Use the ultrasonic sensors to provide estimations of distance betw...
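The plan above, four distance readings feeding speed and direction decisions, can be sketched as a small decision function. This is a hypothetical illustration of the control idea, not the project's actual code; the function name and the 25 cm threshold are assumptions:

```python
def choose_action(front_cm: float, left_cm: float,
                  right_cm: float, rear_cm: float,
                  threshold: float = 25.0) -> str:
    """Pick a drive action from the four ultrasonic distance readings.

    Prefer going forward; otherwise turn toward the clearer side;
    reverse if only the rear is clear; stop if boxed in.
    """
    if front_cm > threshold:
        return "forward"
    if left_cm >= right_cm and left_cm > threshold:
        return "turn_left"
    if right_cm > threshold:
        return "turn_right"
    if rear_cm > threshold:
        return "reverse"
    return "stop"
```

In a real control loop this function would run on every sensor poll, with its result mapped to motor controller commands; smoothing the readings first (e.g. a median of the last few samples) helps against noisy ultrasonic returns.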