
iOS AR-powered language learning app that identifies objects, anchors labels in 3D space, provides translations, and teaches pronunciation. Winner of Apple Swift Student Challenge 2025.


Lingo lens (🏆 Winner - Apple Swift Student Challenge 2025)

Lingo lens logo

See. Translate. Learn.

  • Lingo lens is an augmented reality (AR) language learning app that transforms your surroundings into an interactive vocabulary builder.

  • Using your device’s camera, Lingo lens identifies everyday objects and lets you anchor labels in 3D space; tapping a label reveals the translation and plays the correct pronunciation.

Achievement

  • This app, in an early state at commit e163259a2cf234c037cc77b1eff6b222212c42e3, was submitted to the Apple Swift Student Challenge 2025 and won. 🎉

  • Received a certificate signed by Susan Prescott (Apple’s VP of Worldwide Developer Relations), a pair of AirPods Max, and a free Apple Developer account.

Certificate by Apple

Demo

Watch the demo video showcasing Lingo lens in action.

Demo-Lingo-lens.mp4

Note: The demo was recorded before the iOS 26 update. For the updated UI, check the User Interface section.

How It Works

  1. Select the language you want to learn.
  2. Point the camera at an object.
  3. Adjust the detection box and anchor a label in 3D space.
  4. Tap the anchored label to reveal its translation and hear the pronunciation.
  5. Long press a label to delete it.
  6. Save the word to your personal collection.
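The anchoring flow in steps 2–4 is built on ARKit raycasting. A minimal sketch, assuming an `ARSCNView` named `sceneView` that is already running a world-tracking session; the function name is illustrative and not taken from the project:

```swift
import ARKit

// Hedged sketch: anchor a label at the first real-world surface hit by a
// raycast from a screen point (e.g. the center of the detection box).
func anchorLabel(named text: String, at screenPoint: CGPoint, in sceneView: ARSCNView) {
    guard let query = sceneView.raycastQuery(from: screenPoint,
                                             allowing: .estimatedPlane,
                                             alignment: .any),
          let result = sceneView.session.raycast(query).first else { return }

    // The anchor keeps the label fixed in world space as the camera moves.
    let anchor = ARAnchor(name: text, transform: result.worldTransform)
    sceneView.session.add(anchor: anchor)
}
```

Rendering the visible label (for example an `SCNText` node) would then happen in the `ARSCNViewDelegate` callback for the newly added anchor.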

Key Technologies

| Framework / Component | Purpose |
| --- | --- |
| ARKit | Spatial tracking and anchoring labels in the real world |
| Vision + CoreML | FastViT image classifier for real-time object recognition |
| Apple Translation Framework | Accurate translations for detected objects |
| AVFoundation | Speech synthesis for pronunciation playback |
| CoreData | Local persistence for saved words and settings |

Note: All processing happens on-device for privacy and offline usability.
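The Vision + CoreML row can be sketched as a single classification request. This is a minimal sketch, assuming `FastViT` is the Xcode-generated Core ML model class; the actual class name in the project may differ:

```swift
import CoreML
import Vision

// Hedged sketch: classify one camera frame with the bundled image classifier
// and return the highest-confidence label (e.g. "coffee mug").
func classify(_ pixelBuffer: CVPixelBuffer,
              completion: @escaping (String?) -> Void) {
    guard let coreMLModel = try? FastViT(configuration: MLModelConfiguration()).model,
          let visionModel = try? VNCoreMLModel(for: coreMLModel) else {
        completion(nil)
        return
    }
    let request = VNCoreMLRequest(model: visionModel) { request, _ in
        let best = (request.results as? [VNClassificationObservation])?.first
        completion(best?.identifier)
    }
    // Runs entirely on-device; no frame data leaves the phone.
    let handler = VNImageRequestHandler(cvPixelBuffer: pixelBuffer)
    try? handler.perform([request])
}
```

In practice the returned identifier would feed the translation and speech steps; everything here stays on-device, consistent with the note above.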

User Interface

| Section | Screenshot | Description |
| --- | --- | --- |
| Translate Tab | ar-config-stage-ui-update-to-user | AR initializes and communicates its status to the user. |
| | detection-mode | Shows the app in Detection Mode. |
| | anchor-labels | Shows the anchored labels visible in 3D space. |
| | translation-label-unsaved | Translation pop-up for "coffee mug" with options to Listen or Save (orange, unsaved). |
| | translation-label-saved | Translation pop-up for "laptop"; shows it has been Saved (green checkmark) and allows Listen. |
| | lang-not-installed-check | Triggered when detection mode starts but the selected language isn’t installed (edge case). |
| | camera-permission-check | Shown on the Translate tab when camera access is not granted, prompting the user to enable permissions. |
| Saved Words Tab | saved-words | Vocabulary list saved by the user. |
| | saved-words-filtering | Vocabulary list filtered by language. |
| | saved-words-sorting | Vocabulary list sorted by date added. |
| | saved-word-detail-view | Saved Word Detail View. |
| Settings Tab | settings-tab | Settings tab for selecting language and color scheme. |
| | lang-selection | Language selection sheet. |
| | color-scheme | Color Scheme settings. |
| Onboarding Screen | onboarding-1 | Shown on first launch after the app is downloaded (Onboarding 1/4). |
| | onboarding-2 | Onboarding 2/4. |
| | onboarding-3 | Onboarding 3/4. |
| | onboarding-4 | Onboarding 4/4. |
| Instructions Sheet | ins(1:3) | Shown the first time the Translate tab is opened after download, and whenever the info button on the Translate tab is tapped (Instruction Sheet 1/3). |
| | ins(2:3) | Instruction Sheet 2/3. |
| | ins(3:3) | Instruction Sheet 3/3. |

Future Development

Planned improvements:

  • Enhanced object recognition accuracy.
  • iCloud sync for saved vocabulary.
  • Gamified progress tracking and achievements.

Note: Lingo lens works best on Pro iPhone/iPad models with a LiDAR sensor. On other devices, placing anchors on objects may take a few retries due to hardware limitations; I'm actively working to improve this experience.

Project Context

Lingo lens was developed as the final project for the course MPCS 51030 iOS Application Development (Winter 2025) at the University of Chicago.

Author

Developed by: Abhyas Mall
Project: Lingo lens
Contact: [email protected]

License

Lingo lens is released under the MIT License.
