I built an augmented reality iPhone game that lets users trigger 3D effects on surfaces in their physical environment (e.g. a coffee table) by performing the correct hand gestures, which are recognized from the camera feed in real time. Built in Swift with ARKit and Core ML, using a custom gesture-classification model trained with Microsoft Custom Vision.
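
The real-time recognition pipeline can be sketched roughly as follows: ARKit delivers camera frames to a session delegate, and the Vision framework runs the Core ML classifier on each frame. The model class name `HandGestureClassifier`, the confidence threshold, and the `handleGesture` hook are all hypothetical placeholders, not the actual project code.

```swift
import ARKit
import Vision

// Minimal sketch of the gesture-recognition loop. Assumes a Core ML model
// exported from Microsoft Custom Vision and added to the Xcode project as
// "HandGestureClassifier" (hypothetical name).
final class GestureRecognizer: NSObject, ARSessionDelegate {
    // Wrap the Core ML model in a Vision request once, lazily.
    private lazy var request: VNCoreMLRequest = {
        let model = try! VNCoreMLModel(for: HandGestureClassifier().model)
        return VNCoreMLRequest(model: model) { [weak self] request, _ in
            // Only act on a confident top classification (threshold is illustrative).
            guard let top = (request.results as? [VNClassificationObservation])?.first,
                  top.confidence > 0.9 else { return }
            self?.handleGesture(label: top.identifier)
        }
    }()

    // ARKit calls this for every camera frame in real time.
    func session(_ session: ARSession, didUpdate frame: ARFrame) {
        let handler = VNImageRequestHandler(cvPixelBuffer: frame.capturedImage,
                                            orientation: .right)
        try? handler.perform([request])
    }

    // Map a recognized gesture label to a 3D effect anchored in the scene,
    // e.g. spawning a SceneKit node on a detected horizontal plane.
    private func handleGesture(label: String) {
        // Effect-spawning logic goes here.
    }
}
```

In practice the Vision request would be dispatched off the main thread, and frames are typically throttled (classifying every frame at 60 fps is usually unnecessary and costly on-device).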