Hardware-Software Integration, Real-Time Control & Intelligent Systems
A robotics system integrating computer vision and deep learning for real-time object detection, tracking, and autonomous navigation. The robot perceives its environment through a camera pipeline feeding a lightweight CNN, which drives navigation decisions with sub-100ms latency.
Bridging the gap between AI research and embedded hardware — the same techniques used in large-scale autonomous vehicles, miniaturized and optimized for microcontroller constraints.
From obstacle-avoiding rovers to IoT smart systems — each project is an exercise in translating algorithmic thinking into physical, real-world behaviour.
A remote-controlled robot inspired by Wall-E, featuring a three-wheel chassis, obstacle detection, and interactive LEDs and buzzers. This project demonstrates wireless control, sensor integration, and robotics design.
An automated medicine dispenser that opens compartments based on time and proximity, with notifications sent via GSM. This project demonstrates automation, IoT, and assistive technology.
Designed an autonomous obstacle avoidance system using ultrasonic and IR sensors. Implemented decision trees and real-time path correction algorithms directly in embedded C++ for responsive navigation.
An IoT-based system that detects early signs of forest fires using smoke and flame sensors and sends real-time alerts via GSM. This project demonstrates environmental monitoring and IoT integration.
Built a PID-controlled line-following robot using IR sensor arrays. Tuned proportional-integral-derivative parameters to achieve smooth, high-speed track navigation with minimal oscillation.
Developed a soil moisture-sensing smart irrigation controller. The system reads sensor data, makes automated watering decisions, and logs readings — integrating microcontroller logic with IoT cloud connectivity.
Engineered IoT-based smart parking and water level monitoring systems. Ultrasonic sensors detect occupancy and water level; data is relayed via wireless modules for real-time remote monitoring and alerts.
Designed sound-controlled and RF/Bluetooth remote-controlled robotic prototypes. Built full hardware-software pipelines: command decoding, motor control logic, and real-time wireless communication interfaces.
The hardware and software that power the builds.
The progression from first circuit to AI-integrated systems.
Began exploring embedded systems through Arduino. Built first LED control circuits, learned how PWM works, picked up C++ for microcontrollers, and soldered first prototype boards.
Designed and built obstacle-avoiding and line-following robots. Implemented sensor integration, motor driver circuits, and real-time control logic. First introduction to PID tuning.
Extended projects into wireless domains: sound-activated and RF/Bluetooth remote-controlled robotic systems with custom command decoding and real-time motor control.
Built IoT-based smart irrigation, parking, and water monitoring systems. Integrated ESP modules for wireless data, cloud logging, and automated decision-making in physical environments.
Merged AI research with robotics hardware. Developed vision-based autonomous systems using OpenCV and TensorFlow Lite. Research presentation at ICT-CEEL 2023. Ongoing work in AI-driven robotics.