Projects

Rocket flying through the skies

Exo Bronco Project Manager

Exo Bronco is a project team building high-powered two-stage rockets with the goal of reaching the Kármán line. Currently, we're attempting the first successful high-powered rocket launch at Cal Poly Pomona. I joined Exo Bronco as a sophomore, designing an interstage and conducting structural analysis to evaluate its performance under estimated loading conditions. In 2024, I was appointed a lead, where I drove several design iterations and thorough testing of the interstage to qualify it for a high-powered launch. After an unsuccessful launch, I was appointed project manager in 2025. Since then, I've led subteams through analysis, failure investigations, and optimization of various rocket components, including holding a 0.003" tolerance on a new aluminum interstage. Aerodynamic analysis revealed a sub-optimal stability margin, which we promptly corrected by shifting the rocket's center of mass forward. I also assisted with the rocket's manufacturing and assembly. While awaiting this rocket's launch on February 7, I have led the team in optimizing the next iteration, minimizing dead weight and maximizing usable mass to improve performance.
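The stability fix above comes down to the static margin: the distance from the center of gravity to the center of pressure, expressed in body diameters ("calibers"). Moving the center of mass forward increases this margin. A minimal sketch of the calculation, with hypothetical numbers rather than the actual rocket's geometry:

```python
# Static stability margin in calibers: (CP - CG) / body diameter.
# All dimensions below are hypothetical, not the actual rocket's values.

def static_margin_calibers(cp_m: float, cg_m: float, diameter_m: float) -> float:
    """CG-to-CP distance in body calibers. Positions are measured from the
    nose tip; for a stable rocket the CP sits aft of the CG (positive margin)."""
    return (cp_m - cg_m) / diameter_m

# Example: CP at 2.40 m and CG at 2.10 m from the nose, 0.15 m airframe diameter
margin = static_margin_calibers(2.40, 2.10, 0.15)
print(round(margin, 2))  # 2.0 calibers

# Shifting mass toward the nose moves the CG forward and increases the margin:
print(round(static_margin_calibers(2.40, 2.00, 0.15), 2))
```

A common rule of thumb for high-powered rockets is to keep the margin in roughly the 1.5 to 2.5 caliber range: too low risks instability, too high makes the rocket overly weathercock into the wind.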

Microcontroller board

Glider-Recovery Capsule Software Lead

A team on campus was developing an autonomous glider recovery capsule. The goal was to control not just that a rocket lands safely, but precisely where it lands. Tests were conducted by dropping the glider from a building. I developed firmware that met these constraints by logging IMU and GPS data to onboard SPIFFS with an ESP32. I then programmed the ESP32 to create its own access point, serve a simple HTML test page, and send the data over HTTP once a test finished. Anyone with a laptop could connect to the ESP32's AP, start and stop data recording from the page, and download the results as a CSV file, which made the data easy to open and significantly accelerated testing.
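The actual firmware ran on the ESP32 (Arduino core, logging to SPIFFS), but the data path can be sketched as a desktop analogue in stdlib Python: format the logged samples as CSV and serve them as a download over HTTP. The sample values and endpoint behavior here are illustrative, not the original code:

```python
import csv
import io
from http.server import BaseHTTPRequestHandler, HTTPServer

# Hypothetical logged samples: (time_ms, ax, ay, az, lat, lon)
SAMPLES = [
    (0,   0.01, -0.02, 9.81, 34.0575, -117.8210),
    (100, 0.03, -0.01, 9.79, 34.0575, -117.8210),
]

def samples_to_csv(samples) -> str:
    """Render logged IMU/GPS samples as a CSV string, mirroring what the
    ESP32 streamed back after each drop test."""
    buf = io.StringIO()
    writer = csv.writer(buf)
    writer.writerow(["time_ms", "ax", "ay", "az", "lat", "lon"])
    writer.writerows(samples)
    return buf.getvalue()

class DataHandler(BaseHTTPRequestHandler):
    """Serves the log as a downloadable CSV, analogous to the ESP32 endpoint."""
    def do_GET(self):
        body = samples_to_csv(SAMPLES).encode()
        self.send_response(200)
        self.send_header("Content-Type", "text/csv")
        self.send_header("Content-Disposition", "attachment; filename=test_log.csv")
        self.send_header("Content-Length", str(len(body)))
        self.end_headers()
        self.wfile.write(body)

# To serve on the local network (blocks until interrupted):
#     HTTPServer(("0.0.0.0", 8080), DataHandler).serve_forever()
```

Delivering results as CSV meant any spreadsheet or plotting tool could open a drop test immediately, with no custom tooling on the laptop side.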

Large multirotor drone being tuned

Autonomous Delivery Drone

The goal was to develop an autonomous delivery drone. I designed the airframe and selected the brushless motors, ESCs, and all necessary onboard sensors. I sized Li-ion battery packs for 30 minutes of flight with a factor of safety. A Raspberry Pi 4 served as the flight controller. I integrated GPS and a YOLO-based detection pipeline for perception, and developed a Kalman filter to fuse GPS and IMU measurements into an accurate position estimate. To tune the control loop, we gimbaled the drone between two tables on a pool cue stick from our dorm. We manually tuned the PID gains and achieved stable initial flights (though with significant positional drift), then paused the project due to regulatory restrictions. I may revisit this project to explore the potential of nonlinear controllers!
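The GPS/IMU fusion idea can be shown with a minimal 1-D Kalman filter: the IMU acceleration drives the prediction step, and each noisy GPS fix corrects it. The real filter ran in full 3-D on the Raspberry Pi; the state size, noise values, and demo below are purely illustrative:

```python
import random

# 1-D constant-velocity Kalman filter fusing IMU acceleration (prediction)
# with noisy GPS position fixes (correction). Illustrative values only.

class Kalman1D:
    def __init__(self, q=0.05, r=4.0):
        self.x, self.v = 0.0, 0.0        # position (m), velocity (m/s)
        self.p00, self.p01 = 1.0, 0.0    # covariance matrix P
        self.p10, self.p11 = 0.0, 1.0
        self.q, self.r = q, r            # process noise, GPS noise variance

    def predict(self, accel, dt):
        """Propagate the state using IMU acceleration as the control input."""
        self.x += self.v * dt + 0.5 * accel * dt * dt
        self.v += accel * dt
        # P <- F P F^T + Q, with F = [[1, dt], [0, 1]]
        p00 = self.p00 + dt * (self.p01 + self.p10) + dt * dt * self.p11 + self.q
        p01 = self.p01 + dt * self.p11
        p10 = self.p10 + dt * self.p11
        p11 = self.p11 + self.q
        self.p00, self.p01, self.p10, self.p11 = p00, p01, p10, p11

    def update(self, gps_pos):
        """Correct the prediction with a GPS position fix (H = [1, 0])."""
        s = self.p00 + self.r                  # innovation covariance
        k0, k1 = self.p00 / s, self.p10 / s    # Kalman gain
        innovation = gps_pos - self.x
        self.x += k0 * innovation
        self.v += k1 * innovation
        # P <- (I - K H) P
        p00 = (1 - k0) * self.p00
        p01 = (1 - k0) * self.p01
        p10 = self.p10 - k1 * self.p00
        p11 = self.p11 - k1 * self.p01
        self.p00, self.p01, self.p10, self.p11 = p00, p01, p10, p11

# Demo: drone cruising at 1 m/s with noisy GPS fixes (sigma = 2 m) at 10 Hz
random.seed(0)
kf, true_pos = Kalman1D(), 0.0
for _ in range(200):
    kf.predict(accel=0.0, dt=0.1)
    true_pos += 1.0 * 0.1
    kf.update(true_pos + random.gauss(0, 2.0))
print(f"position error: {abs(kf.x - true_pos):.2f} m, est. velocity: {kf.v:.2f} m/s")
```

The filter ends up far more accurate than any single GPS fix because it blends the smooth IMU-driven prediction with the unbiased but noisy GPS measurements.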

Audio Spectrogram generated by NN

Audio Generator Neural Network

When Sora 1.0 (OpenAI's video generation model) came out, its videos had no audio. I set out to generate plausible audio from short video clips to fill that gap. I implemented a CNN-to-latent-to-spectrogram architecture (essentially a VAE). For training data, I used 10,000 two-second YouTube clips from a Google-provided dataset that already included text embeddings describing each sound. I fed these into the VAE and reconstructed waveforms via inverse STFT while iterating on loss functions. While the model was unsuccessful, it clarified the need for more training data and a new model architecture, potentially transformer-based.
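The final reconstruction step, going from the frequency domain back to a waveform, is the inverse transform. As a toy stand-in for the overlapping-window inverse STFT the model used, here is a single-frame DFT/inverse-DFT roundtrip in pure Python (naive O(N²), illustration only):

```python
import cmath
import math

def dft(frame):
    """Naive discrete Fourier transform of one analysis frame (toy scale)."""
    n = len(frame)
    return [sum(frame[t] * cmath.exp(-2j * math.pi * k * t / n) for t in range(n))
            for k in range(n)]

def idft(spectrum):
    """Inverse DFT: rebuild the time-domain frame from its spectrum."""
    n = len(spectrum)
    return [sum(spectrum[k] * cmath.exp(2j * math.pi * k * t / n) for k in range(n)).real / n
            for t in range(n)]

# A short test tone: the roundtrip through the frequency domain is lossless
frame = [math.sin(2 * math.pi * 5 * t / 64) for t in range(64)]
rebuilt = idft(dft(frame))
print(max(abs(a - b) for a, b in zip(frame, rebuilt)))  # near zero (float error only)
```

The catch in the real pipeline is that the model predicts magnitude spectrograms without phase, so the inverse STFT needs phase estimation (e.g., Griffin-Lim) before the waveform sounds right, one of the factors that made the problem hard.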

Laser Tracker

Laser Tracker with Stereo Vision

For fun, I wanted to build a tracking system that points a high-power laser at a target (inspired by a laser mosquito killer I saw on the internet). I built a stereo vision camera rig with computing handled by a Raspberry Pi 4, where I computed disparity and triangulated objects. Using a YOLO model for object detection, I could detect people and localize them in 3D space. A 2-axis servo gimbal then pointed the laser: I calculated the angles the servos needed to reach to aim at the detected object. And it worked! It was a lot of fun to make.
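The geometry behind the tracker is compact: disparity between the two cameras gives depth via Z = f·B/d, and the 3-D point then gives the gimbal's pan/tilt angles. A sketch with a hypothetical calibration (the focal length, baseline, and pixel coordinates below are made up, not the actual rig's):

```python
import math

# Pinhole stereo triangulation and pan/tilt angles for the laser gimbal.
# Camera parameters are hypothetical, not the actual rig's calibration.

FOCAL_PX = 700.0       # focal length in pixels
BASELINE_M = 0.12      # distance between the two camera centers
CX, CY = 320.0, 240.0  # principal point (image center)

def triangulate(u_left, u_right, v):
    """Disparity -> 3-D point in the left camera frame (X right, Y down, Z forward)."""
    disparity = u_left - u_right
    z = FOCAL_PX * BASELINE_M / disparity   # depth from disparity: Z = f*B/d
    x = (u_left - CX) * z / FOCAL_PX
    y = (v - CY) * z / FOCAL_PX
    return x, y, z

def gimbal_angles(x, y, z):
    """Pan/tilt (degrees) the servos must reach to point the laser at the target."""
    pan = math.degrees(math.atan2(x, z))
    tilt = math.degrees(math.atan2(-y, math.hypot(x, z)))
    return pan, tilt

# Target detected at pixel (390, 240) left / (355, 240) right -> 35 px disparity
x, y, z = triangulate(390, 355, 240)
print(round(z, 2))  # 2.4 (meters away)
print(tuple(round(a, 1) for a in gimbal_angles(x, y, z)))
```

In practice the servo angles also need an offset for the laser's mounting position relative to the cameras, plus calibration of each servo's pulse-width-to-angle mapping.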

AI Home Assistant White Board Peripheral

AI Home Assistant

With the growing use of LLMs, I wanted to build a home assistant with a more conversational LLM agent integrated, so that you don't just have an assistant, but a fun AI to talk to! Using OpenAI's realtime model API, I developed a full-duplex audio pipeline with minimal latency. I developed a neural network for wake-word detection, manually collecting training data over the course of a week to reach 99% accuracy. I used an ESP32-C6 as a modem to issue Zigbee commands (a protocol popular with smart-home devices). In short, the assistant could discover and control lights without the need for any companion apps: it could self-discover your smart-home devices, connect to them, determine their capabilities, and control them on command. I also developed a variant that mounts a camera over your whiteboard and streams the view into ChatGPT, so it can see what you're writing, reason about it, and assist you. Soon after, Amazon released a software update bringing similar LLM integration to all their Alexas!
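The wake-word detector was a neural network trained on a week of hand-collected recordings; as a toy stand-in for that training loop, here is a logistic-regression classifier on a tiny synthetic feature vector (think per-band audio energies). Every feature pattern and number below is fabricated for illustration:

```python
import math
import random

# Toy wake-word classifier: logistic regression on 4 synthetic "band energy"
# features. The real detector was a neural network on real recordings; this
# only illustrates the shape of the training loop.

random.seed(1)

def make_sample(is_wake):
    # Hypothetical pattern: wake-word frames concentrate energy in mid bands.
    base = [0.8, 1.5, 1.2, 0.3] if is_wake else [0.2, 0.4, 0.3, 0.9]
    return [v + random.gauss(0, 0.1) for v in base], 1.0 if is_wake else 0.0

data = [make_sample(i % 2 == 0) for i in range(200)]

w, b = [0.0] * 4, 0.0

def predict(x):
    """Sigmoid of the linear score: probability the frame is the wake word."""
    z = sum(wi * xi for wi, xi in zip(w, x)) + b
    return 1.0 / (1.0 + math.exp(-z))

# Plain gradient descent on binary cross-entropy loss
for _ in range(300):
    for x, label in data:
        err = predict(x) - label
        for i in range(4):
            w[i] -= 0.05 * err * x[i]
        b -= 0.05 * err

accuracy = sum((predict(x) > 0.5) == (label == 1.0) for x, label in data) / len(data)
print(accuracy)
```

On real audio the hard part is exactly what the week of data collection addressed: negatives that sound similar to the wake word, varied speakers, and room noise, which is where a small neural network earns its keep over a linear model.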

Biomechatronic Foot CAD

Biomechatronic Foot

As my senior project, I worked with a few other students to develop a dexterous but high-power-output biomechatronic foot. We used several clever mechanisms to make this happen. A linkage mechanism achieves co-actuation, which improved our average power efficiency by 50%! We selected plastic bushings to save on cost while reducing friction and minimizing backlash. Four brushless motors paired with ODrive controllers co-actuate inversion, eversion, plantarflexion, dorsiflexion, and toe actuation. Abduction and adduction were achieved with a micro servo and a tendon drive paired with a return spring: the toes naturally spread when the assembly stands on its toes and are restored by the spring when it's not, while the servo still allows active spreading on demand. Resistive pressure sensors line the bottom of the foot, and potentiometers at key joints interpolate the current state of the system. An IMU will be mounted on the foot, and a Teensy 4.1 will handle onboard computing. Soon, we will assemble the foot and model it dynamically to develop a nonlinear control algorithm and get it to stand on its toes!
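One thing the pressure sensor array enables is estimating the center of pressure, where the net ground reaction force acts along the foot, which is the key feedback signal for balancing on the toes. A sketch of that estimate as a force-weighted average (sensor positions and readings are hypothetical):

```python
# Estimating the foot's center of pressure from the resistive pressure sensors.
# Sensor positions (meters from the heel) and force readings are hypothetical.

SENSOR_POS = [0.00, 0.09, 0.18, 0.24]  # heel, midfoot, ball, toes

def center_of_pressure(forces):
    """Force-weighted average of sensor positions: where the net ground
    reaction force acts along the foot. Returns None when unloaded."""
    total = sum(forces)
    if total == 0:
        return None
    return sum(p * f for p, f in zip(SENSOR_POS, forces)) / total

# Standing flat: load shared between the heel and the ball of the foot
print(round(center_of_pressure([40, 10, 45, 5]), 3))   # 0.102 (m from heel)

# Standing on toes: pressure shifts forward, so the CoP moves toward the toes
print(round(center_of_pressure([0, 0, 60, 40]), 3))    # 0.204
```

A balance controller can compare this center of pressure against the body's projected center of mass (from the IMU and joint potentiometers) and command ankle torque to keep the two aligned.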

Publications

Contact

Open to internships, full-time positions, and research collaborations. The quickest way to reach me is by email.