A Swift-based visual search library built on Apple's Vision and Core ML frameworks that you can add to your iOS app to detect objects in images in a simple way.
For more information about the underlying design see: Chimango·AI Library Design.
This is a work in progress.
- iOS 13.0+
- Swift 5.0+
- Xcode 12.0+
NOTE: These instructions are based on Xcode 12.4
1. Open your project and add a package dependency by selecting: File -> Swift Packages -> Add Package Dependency
2. When asked for the repository, paste one of the following:
HTTPS URL:
https://github.com/Dario-Gasquez/chimango-ai.git
or SSH URL:
[email protected]:Dario-Gasquez/chimango-ai.git
3. Follow the remaining steps until the ChimangoAI package is added to the project.
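Alternatively, if you are adding the library to another Swift package, you can declare the dependency in its `Package.swift` manifest instead of through Xcode. The package name, version requirement, and target names below are assumptions; adjust them to match your project and the release you intend to use:

```swift
// swift-tools-version:5.3
import PackageDescription

let package = Package(
    name: "MyApp", // placeholder name
    platforms: [.iOS(.v13)],
    dependencies: [
        // Branch requirement is an assumption; pin to a tag once releases exist.
        .package(url: "https://github.com/Dario-Gasquez/chimango-ai.git", .branch("main"))
    ],
    targets: [
        .target(name: "MyApp", dependencies: ["ChimangoAI"])
    ]
)
```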
Once the package has been successfully added, you should be able to access it by importing the module:

```swift
import ChimangoAI
```
Before anything else, you will need to set up the library with the desired detection mode (the following example uses the COCO dataset):
```swift
func application(_ application: UIApplication, didFinishLaunchingWithOptions launchOptions: [UIApplication.LaunchOptionsKey: Any]?) -> Bool {
    // Chimango Visual Search Library setup
    do {
        try VisualSearchManager.setupWithMode(.coco)
    } catch {
        print("error during Visual Search Library initialization: \(error)")
    }

    return true
}
```
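If your app uses the SwiftUI lifecycle instead of an AppDelegate, the same setup call can be made from the `App` initializer. This is a sketch, assuming the library has no other dependency on the UIKit app lifecycle; `MyApp` and `ContentView` are placeholder names:

```swift
import SwiftUI
import ChimangoAI

@main
struct MyApp: App {
    init() {
        // Same setup call as in the AppDelegate example above.
        do {
            try VisualSearchManager.setupWithMode(.coco)
        } catch {
            print("error during Visual Search Library initialization: \(error)")
        }
    }

    var body: some Scene {
        WindowGroup {
            ContentView()
        }
    }
}
```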
NOTE: For the time being only the `.coco` mode is implemented.
The simplest way to obtain a list of object detections is by passing an image to the `VisualSearchManager` like this:
```swift
import ChimangoAI

let sampleImage = UIImage(named: "sampleImage.png")!

VisualSearchManager.detectObjectsIn(image: sampleImage) { result in
    switch result {
    case .failure(let error):
        print("detectObjectsIn error: \(error.localizedDescription)")
    case .success(let detections):
        print("OBJECT DETECTION RESULT: ====================")
        print(detections.debugDescription)
    }
}
```
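On Swift 5.5+ the completion-handler API above can be bridged to async/await with a checked continuation. The success type of the result is not documented here, so `[Detection]` is a hypothetical placeholder for whatever type `detectObjectsIn` actually delivers:

```swift
import UIKit
import ChimangoAI

// A sketch: bridging the callback-based API to async/await.
// Assumes the completion receives a Swift.Result, which can be
// forwarded to the continuation unchanged. "[Detection]" is a
// hypothetical stand-in for the library's real success type.
func detectObjects(in image: UIImage) async throws -> [Detection] {
    try await withCheckedThrowingContinuation { continuation in
        VisualSearchManager.detectObjectsIn(image: image) { result in
            continuation.resume(with: result)
        }
    }
}
```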
The DemoApp directory contains a sample application. It allows the user to capture an image (either from the device's photo library or by taking a picture with the camera) and then sends that image to the `VisualSearchManager` for object detection using the COCO dataset.
If objects are detected, an alert view with statistics is shown, and circular buttons are drawn on top of the detected objects (tapping one prints debug information to the Xcode console).
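Drawing overlays like the demo's buttons typically requires converting Vision's normalized, bottom-left-origin bounding boxes into UIKit view coordinates. A minimal sketch, assuming the detection exposes a normalized `CGRect` as Vision observations do:

```swift
import UIKit
import Vision

// Convert a normalized Vision bounding box (origin at bottom-left)
// into a UIKit rect (origin at top-left) for a view of the given size.
func overlayRect(for boundingBox: CGRect, in viewSize: CGSize) -> CGRect {
    // Scale the normalized rect up to point coordinates.
    let scaled = VNImageRectForNormalizedRect(boundingBox,
                                              Int(viewSize.width),
                                              Int(viewSize.height))
    // Flip the y-axis: Vision's origin is bottom-left, UIKit's is top-left.
    return CGRect(x: scaled.origin.x,
                  y: viewSize.height - scaled.origin.y - scaled.height,
                  width: scaled.width,
                  height: scaled.height)
}
```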