Senior Flutter/Mobile Engineer – Real-Time Motion Intelligence, Agentic AI, AR Stabilization & On-Device Privacy
Location: Remote / Hybrid / On-Site (Silicon Valley)
Our client enables anyone to earn from anywhere by putting their real-world presence to work through their smartphone. The app's intelligent in-app execution engine makes this possible, turning remote actions into precise, safe, real-time physical execution.
Role Overview
We are hiring a Senior Mobile Engineer with deep expertise in Flutter, real-time sensor fusion, AR-style 3D object stabilization, privacy-preserving computer vision, and agentic offline execution. This role sits at the heart of our real-time execution engine for millions of remote helpers.
Key Responsibilities
- Real-Time Command Interpretation:
- Build ultra-low-latency pipelines to receive consumer commands.
- Generate synchronized directional animations and voice cues.
- Implement deterministic state machines for safety-critical execution.
- Sensor Fusion & Intelligent Motion Guidance
- Use accelerometer, gyroscope, and magnetometer data to detect:
  - Speed
  - Direction and rotation angle
  - Jerks, drift, or shaky motion
  - Incorrect movement (too slow, too fast, or wrong angle)
- Trigger auto-corrections via:
  - Visual cues
  - Professional voice instructions
  - Temporary blurring as a warning
- Maintain <50 ms end-to-end latency.
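To give a flavor of the motion-guidance math involved, here is a minimal sketch in Python (for illustration only; production code would be Dart or native via platform channels). The function names `complementary_filter` and `jerk_magnitudes` are hypothetical:

```python
import math

def complementary_filter(pitch_prev, gyro_rate, accel_y, accel_z, dt, alpha=0.98):
    """Fuse gyro integration (smooth but drifts) with accelerometer tilt
    (noisy but drift-free) into one stable pitch estimate, in radians."""
    gyro_pitch = pitch_prev + gyro_rate * dt      # integrate angular velocity
    accel_pitch = math.atan2(accel_y, accel_z)    # gravity-based tilt estimate
    return alpha * gyro_pitch + (1 - alpha) * accel_pitch

def jerk_magnitudes(accel, dt):
    """Discrete jerk (rate of change of acceleration between samples);
    spikes above a tuned threshold indicate shaky or jerky motion."""
    return [abs(a2 - a1) / dt for a1, a2 in zip(accel, accel[1:])]
```

The same blend generalizes to roll and yaw; a Kalman filter replaces the fixed `alpha` with a noise-adaptive gain when more accuracy is needed.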
- Agentic AI for Offline Task Execution: Build lightweight on-device agentic systems capable of:
  - Understanding intent sequences ("move 3m then stop")
  - Continuing execution when offline
  - Predicting incorrect motion and auto-correcting without requiring new consumer input
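The deterministic, offline-capable execution described above can be sketched as a simple state machine. This is an illustrative Python sketch (the real system would be Dart/native); the `IntentExecutor` and `MoveIntent` names are hypothetical:

```python
from dataclasses import dataclass
from enum import Enum, auto

class State(Enum):
    IDLE = auto()
    MOVING = auto()
    STOPPED = auto()

@dataclass
class MoveIntent:
    distance_m: float

class IntentExecutor:
    """Deterministic executor: each (state, input) pair has exactly one
    outcome, so execution can continue offline and be replayed for audit."""
    def __init__(self, intents):
        self.queue = list(intents)   # e.g. parsed from "move 3m then stop"
        self.state = State.IDLE
        self.progress_m = 0.0

    def tick(self, displacement_m):
        """Advance one sensor tick by the measured displacement."""
        if self.state is State.IDLE and self.queue:
            self.state = State.MOVING
        if self.state is State.MOVING:
            self.progress_m += displacement_m
            if self.progress_m >= self.queue[0].distance_m:
                self.queue.pop(0)
                self.progress_m = 0.0
                self.state = State.MOVING if self.queue else State.STOPPED
        return self.state
```

Because transitions depend only on the current state and measured displacement, the same intent sequence always produces the same behavior, with or without connectivity.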
- Real-Time Object Detection & Privacy Masking: Develop edge AI systems that:
- Detect faces, number plates, and sensitive objects.
- Apply blur before streaming, ensuring true privacy at source.
- Run efficiently across mid-range devices.
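The blur-before-streaming requirement boils down to redacting detected regions on-device, before any frame is encoded. A naive box blur illustrates the idea (Python for brevity; a real pipeline would use a GPU shader or an optimized native kernel):

```python
def box_blur(img, k=1):
    """Naive box blur over a 2D grayscale grid. Applied to detected
    face/plate regions before a frame leaves the device, so raw
    identifying pixels are never streamed."""
    h, w = len(img), len(img[0])
    out = [[0.0] * w for _ in range(h)]
    for y in range(h):
        for x in range(w):
            vals = [img[yy][xx]
                    for yy in range(max(0, y - k), min(h, y + k + 1))
                    for xx in range(max(0, x - k), min(w, x + k + 1))]
            out[y][x] = sum(vals) / len(vals)  # average the neighborhood
    return out
```

On mid-range hardware the kernel would run only over detector-reported bounding boxes, not the full frame, to stay within the real-time budget.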
- AR-Style Object Stabilization & Persistent Blurring (Pokémon GO–like)
- A major requirement: once a face or plate is blurred, the blur must stay locked to that object in 3D space even as:
- The camera moves
- Angles shift
- Lighting changes
- Use techniques including:
- Optical flow tracking
- Feature-point detection
- ARKit/ARCore anchoring concepts
- Multi-frame object persistence
- Blur should not flicker or jump.
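One common anti-flicker ingredient is smoothing the tracked region across frames and holding it briefly through missed detections. A minimal Python sketch of that idea (illustrative only; the class name `PersistentBlurRegion` is hypothetical, and production tracking would combine this with optical flow or AR anchors):

```python
class PersistentBlurRegion:
    """Keeps a blur box locked to a tracked object across frames.
    Exponential smoothing damps per-frame detector jitter (no flicker/jump),
    and the last known box is held for a few frames when detection drops out."""
    def __init__(self, alpha=0.3, max_missed=5):
        self.alpha = alpha            # weight of the newest detection
        self.max_missed = max_missed  # frames to hold the blur without detection
        self.box = None               # (x, y, w, h) or None
        self.missed = 0

    def update(self, detection):
        if detection is None:
            self.missed += 1
            if self.missed > self.max_missed:
                self.box = None       # object genuinely gone; release the blur
            return self.box
        self.missed = 0
        if self.box is None:
            self.box = detection      # first sighting: adopt the box directly
        else:
            a = self.alpha            # blend new detection into the smoothed box
            self.box = tuple(a * d + (1 - a) * b
                             for d, b in zip(detection, self.box))
        return self.box
```

Holding the last box through short detector dropouts is what keeps the blur from flickering; anchoring it to ARKit/ARCore world coordinates is what keeps it locked as the camera moves.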
- Provider ↔ Consumer Feedback Loop:
  - Send structured telemetry for motion deviation and error states.
  - Drive consumer-side safety screens ("billing paused due to difficulty").
  - Integrate deeply with backend telemetry and billing.
- Cross-Platform Architecture: Build the entire experience in Flutter, with advanced native integrations for:
- Motion sensors
- CV + blurring
- GPU acceleration
- AR stabilization primitives
Required Skills & Experience
Mandatory Technical Requirements
- 5+ years of hands-on Flutter experience (required)
- 5–12 years total mobile engineering experience
- Strong, demonstrable experience in:
- IMU sensor fusion (Kalman filters, quaternions, complementary filters)
- Real-time computer vision (TFLite, MediaPipe, CoreML, ONNX)
- On-device blurring, masking, data redaction
- AR tracking, optical flow, feature locking
- Deterministic state machine design
- High-performance animation + real-time audio sync
- Flutter + native platform channels (Swift/Kotlin)
- Strong