In Lesson 2, you laid the groundwork for the MoodTracker app by implementing the basic UI for emotion detection. In this demo, you’ll go through the process of integrating the Core ML model into your MoodTracker app to perform emotion detection on images. This includes setting up a view model, configuring the classifier, and updating the user interface to display the results.
In the starter folder, you’ll find the MoodTracker app as you left it in the previous lesson, and you’ll find the Create ML project containing the three classifiers you created earlier. First, you’ll extract the .mlmodel file from the Create ML project. The Create ML project contains model sources rated on accuracy tests, which affects which one you drag and drop. To work with the best model, you’ll need to review the tests in Create ML and confirm which of your models rates highest.
Open the EmotionsImageClassifier project and, in the Model Sources section, choose the classifier you configured that got the best accuracy among the model sources. Then, open the Output tab. Next, click the Get button to export the model. When saving the file, name it EmotionsImageClassifier to ensure it matches the instructions. Now, you have the model, with its Core ML extension, ready to use in your project.
Now, open the MoodTracker app. In this demo, you’ll introduce some functionality to the EmotionDetectionView. That’s why you’ll create a view model for this view to handle its logic and functionality. Create a new folder named ViewModel, then add a new Swift file named EmotionDetectionViewModel. This view model will hold only a property for the image and a reset method to clear the image.
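To see where this is heading, here’s a minimal sketch of what that first version of the view model could look like. It assumes the property is named image and the method is named reset, matching the description above; follow the lesson’s own naming if it differs.

import SwiftUI
import UIKit

// Minimal sketch: the view model only stores the selected image
// and knows how to clear it.
class EmotionDetectionViewModel: ObservableObject {
  // The image the user picked or captured
  @Published var image: UIImage?

  // Clear the current image so the user can pick another one
  func reset() {
    image = nil
  }
}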
Now, open the EmotionDetectionView and replace the image property with a viewModel property of our newly created EmotionDetectionViewModel type. Then, replace each occurrence of $image with $viewModel.image. Also, change reset to viewModel.reset.
@StateObject private var viewModel = EmotionDetectionViewModel()
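As a rough, simplified sketch (not the lesson’s exact view code), the relevant parts of EmotionDetectionView now read the image from the view model and call its reset method. The layout below is stripped down to just those pieces; anywhere the view previously passed $image into a picker, it now passes $viewModel.image.

import SwiftUI

// Simplified stand-in for the real EmotionDetectionView, showing only the
// parts that change after introducing the view model.
struct EmotionDetectionView: View {
  @StateObject private var viewModel = EmotionDetectionViewModel()

  var body: some View {
    VStack {
      // Wherever the view used to bind to $image, it now binds to $viewModel.image
      if let image = viewModel.image {
        Image(uiImage: image)
          .resizable()
          .scaledToFit()
      }
      Button("Select Another Image") {
        viewModel.reset() // previously this cleared the view's own image property
      }
    }
  }
}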
Build and run your app. Click the Start Emotion Detection button, then select an image and ensure that the image appears as it did previously. Then, click the Select Another Image button to verify that the reset functionality works as expected. Now, you’re going to start integrating your model into the app.
Drag and drop the EmotionsImageClassifier.mlmodel file into the Models folder. Next, create a Swift file in the same folder and name it EmotionClassifier. In this file, you’ll create the classifier as you learned in the Instruction section of this lesson.
You’ll load the Core ML model in the initializer. Then, the classify method will convert the UIImage to a CIImage. Next, you’ll create a VNCoreMLRequest with the model. After that, you’ll handle the classification results and pick the top one. Finally, you’ll create a handler and perform the request on a background thread with a high priority, as you learned in the previous section. If you need more details about any of these steps in the classifier, you can refer back to the Instruction section to review them.
import SwiftUI
import Vision
import CoreML
class EmotionClassifier {
  private let model: VNCoreMLModel

  init() {
    // 1. Load the Core ML model
    let configuration = MLModelConfiguration()
    guard let mlModel = try? EmotionsImageClassifier(configuration: configuration).model else {
      fatalError("Failed to load model")
    }
    self.model = try! VNCoreMLModel(for: mlModel)
  }

  func classify(image: UIImage, completion: @escaping (String?, Float?) -> Void) {
    // 2. Convert UIImage to CIImage
    guard let ciImage = CIImage(image: image) else {
      completion(nil, nil)
      return
    }

    // 3. Create a VNCoreMLRequest with the model
    let request = VNCoreMLRequest(model: model) { request, error in
      if let error = error {
        print("Error during classification: \(error.localizedDescription)")
        completion(nil, nil)
        return
      }

      // 4. Handle the classification results
      guard let results = request.results as? [VNClassificationObservation] else {
        print("No results found")
        completion(nil, nil)
        return
      }

      // 5. Find the top result based on confidence
      let topResult = results.max(by: { a, b in a.confidence < b.confidence })
      guard let bestResult = topResult else {
        print("No top result found")
        completion(nil, nil)
        return
      }

      // 6. Pass the top result to the completion handler
      completion(bestResult.identifier, bestResult.confidence)
    }

    // 7. Create a VNImageRequestHandler
    let handler = VNImageRequestHandler(ciImage: ciImage)

    // 8. Perform the request on a background thread
    DispatchQueue.global(qos: .userInteractive).async {
      do {
        try handler.perform([request])
      } catch {
        print("Failed to perform classification: \(error.localizedDescription)")
        completion(nil, nil)
      }
    }
  }
}
Next, open the EmotionDetectionViewModel. Add the classifier property and two published properties to hold the detected emotion after classification and the accuracy of that emotion.
@Published var emotion: String?
@Published var accuracy: String?
private let classifier = EmotionClassifier()
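The demo hooks this classifier up to a Detect Emotion button next. As a hedged sketch (the detectEmotion() name and the percentage formatting are assumptions, not the lesson’s exact code), the view model’s classification method could look like this, publishing the result back on the main thread:

// Hedged sketch: add something like this to EmotionDetectionViewModel.
func detectEmotion() {
  // Only classify when the user has already picked an image
  guard let image = image else { return }
  classifier.classify(image: image) { [weak self] emotion, confidence in
    // The classifier may call back on a background thread,
    // so publish UI-facing values on the main queue.
    DispatchQueue.main.async {
      self?.emotion = emotion
      if let confidence = confidence {
        self?.accuracy = String(format: "%.1f%%", confidence * 100)
      } else {
        self?.accuracy = nil
      }
    }
  }
}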
Build and run the app on a real device to either import an image from your gallery or take a picture of a happy or sad face. Choose an image as you did previously. Notice that you now have a new button named Detect Emotion to classify this image. Click it and notice the results view that appears, showing whether that emotion is happy or sad, along with the accuracy. If you try this project on the simulator, you’ll get incorrect results. That’s why it’s essential to test it on a real device to obtain accurate data. Try different images with different emotions and notice that the model might make some mistakes in classification, which is acceptable.
Congratulations! You did a great job implementing the MoodTracker app to detect the dominant emotion from an image. Now, you have a fully working app ready to use.