Authors
Derek Li¹, Austin Amakye Ansah²
¹USA, ²California State Polytechnic University, USA
Abstract
Traditional basketball training relies on feedback from coaches and video playback, which can be delayed and is inconvenient for solo practice. This paper introduces HoopLab, a cross-platform mobile application designed to provide immediate, comprehensive analysis of basketball shots. The app, built with Flutter, uses a custom-trained YOLO object detection model to analyze user-submitted videos. It features two distinct analysis modes: a backboard view that confirms whether a shot is successful by tracking the ball's path through the rim, and a side view that evaluates a shot's quality by comparing its actual path to a computationally generated optimal trajectory. The underlying YOLOv11 model was trained on a dataset of over 8,200 images, achieving a mean average precision (mAP@0.5) of 94.72% and demonstrating high performance in identifying key objects such as the ball, rim, and player [10]. HoopLab offers players a powerful tool for instant, data-driven feedback to accelerate skill improvement.
Keywords
Computer vision, AI, mobile, basketball, Flutter
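To make the detection step described in the abstract concrete, the sketch below shows how a custom-trained YOLO model could be run on a single frame of a user-submitted shot video to locate the ball, rim, and player. It is a minimal illustration assuming the Ultralytics YOLO Python API; the weight file name "hooplab_yolo11.pt" and video path "shot_clip.mp4" are hypothetical and not taken from the paper.

```python
# Minimal sketch (assumption): single-frame detection with a custom YOLO model,
# illustrating the kind of inference HoopLab's analysis modes rely on.
from ultralytics import YOLO
import cv2

# Hypothetical file names used only for illustration.
model = YOLO("hooplab_yolo11.pt")          # custom-trained weights (ball, rim, player)
cap = cv2.VideoCapture("shot_clip.mp4")    # user-submitted shot video

ok, frame = cap.read()
if ok:
    results = model(frame)                 # run detection on one frame
    for box in results[0].boxes:
        cls_name = model.names[int(box.cls)]   # e.g. "ball", "rim", "player"
        conf = float(box.conf)                 # detection confidence
        print(cls_name, round(conf, 2), box.xyxy.tolist())

cap.release()
```

In practice, the per-frame ball and rim positions would be collected across the clip and passed to the backboard-view make/miss check or the side-view trajectory comparison described above.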