MoodTracker: Creating and Evaluating Emotion Classification Models
In this demo, you’ll work on building emotion classification models for the MoodTracker app. The objective is to create three image classifiers: one with default settings using a dataset with two labels, one with all augmentations enabled for the same two-label dataset, and a third using a three-label dataset with all augmentations. This approach will help clarify the effects of augmentations and additional labels on model accuracy.
In the starter folder, you’ll find two folders: one containing training and testing datasets for two emotions (happy and sad), and the other for three emotions (happy, sad, and neutral). You can use these datasets for your project or choose to use your own images.
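Create ML infers class labels from the names of the folders that hold the images, so each dataset needs one subfolder per emotion (for example, a happy folder and a sad folder inside the training data). As a rough sketch, a helper like the hypothetical `labelCounts(in:)` function below can verify a dataset’s layout and reveal unbalanced classes before you train — the function and folder names are illustrative, not part of the course materials:

```swift
import Foundation

// Counts images per label folder in a Create ML-style dataset:
//   dataset/happy/*.jpg, dataset/sad/*.jpg, ...
// Illustrative helper -- not part of the course materials.
func labelCounts(in datasetURL: URL) throws -> [String: Int] {
    let fm = FileManager.default
    var counts: [String: Int] = [:]
    for name in try fm.contentsOfDirectory(atPath: datasetURL.path) {
        let labelDir = datasetURL.appendingPathComponent(name)
        var isDir: ObjCBool = false
        // Each label must be a directory; skip stray files.
        guard fm.fileExists(atPath: labelDir.path, isDirectory: &isDir),
              isDir.boolValue else { continue }
        let imageExtensions = ["jpg", "jpeg", "png", "heic"]
        let images = try fm.contentsOfDirectory(atPath: labelDir.path)
            .filter { imageExtensions.contains(URL(fileURLWithPath: $0).pathExtension.lowercased()) }
        counts[name] = images.count
    }
    return counts
}
```

If one emotion has far fewer images than the others, the classifier tends to under-predict it, so it’s worth evening out the counts before training.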
Building the Image Classifiers
Open Xcode and then select “Xcode” > “Open Developer Tool” > “Create ML” to access the Create ML app. Click “New Document” or navigate to “File” > “New” > “Project”. Select the “Image Classification” template and proceed by clicking “Next”. Name it EmotionsImageClassifier and choose the save location. The Create ML app will now display the three main parts as seen previously.
You might observe that this third classifier, which includes the additional neutral label, shows a decrease in accuracy, with training, validation, and testing accuracy all lower than the two-label classifiers achieved (your exact percentages will vary between runs). This reduction in accuracy likely results from the model’s increased complexity with the additional label. To improve accuracy, consider adding more relevant data to the training set. Achieving near 100 percent accuracy and real-world precision is a long and challenging process that often requires significant time and effort.
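The gap between training accuracy and validation/testing accuracy is the quickest overfitting check: accuracy is just correct predictions divided by total predictions, and a model that scores much higher on data it has seen than on data it hasn’t has memorized rather than generalized. A minimal sketch of that arithmetic — the numbers are invented for illustration, not results from the MoodTracker classifiers:

```swift
import Foundation

// Accuracy = correct predictions / total predictions.
func accuracy(correct: Int, total: Int) -> Double {
    precondition(total > 0, "total must be positive")
    return Double(correct) / Double(total)
}

// Invented example numbers -- not results from the MoodTracker models.
let trainingAccuracy = accuracy(correct: 98, total: 100)
let testingAccuracy  = accuracy(correct: 80, total: 100)

// A large train/test gap suggests overfitting; more varied training
// data (or augmentation) usually narrows it.
let generalizationGap = trainingAccuracy - testingAccuracy
```

When the gap is small but both numbers are low, the fix is usually more (or better-labeled) data rather than less augmentation.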
Note: It’s important to mention that each time you train the model, you might observe variations in accuracy. These differences occur due to the random nature of the training process, especially when dealing with non-deterministic models or when data augmentation is applied. Even a freshly created model that achieved 100 percent accuracy once can produce noticeably different results across subsequent training attempts. Keep this in mind as you experiment with different configurations and datasets.
Evaluating Model Performance
After training the models, evaluate their performance by checking metrics and testing with real data in the preview section. Open the “Evaluation” tab of the second classifier and select “Testing” to review the results for the testing data. You can see the test accuracy and other statistics, including which label has the lowest precision.
On the “Evaluation” tab, there are several key metrics used to assess the performance of your model. False Negatives occur when the model incorrectly labels a positive instance as negative, while False Positives happen when the model takes a negative instance and labels it as positive instead. To evaluate the model’s accuracy, calculate Precision by dividing the number of true positives by the sum of true positives and false positives. Recall is determined by dividing the number of true positives by the sum of true positives and false negatives. Finally, the F1 Score provides a balanced measure of Precision and Recall by calculating their harmonic mean, offering a single metric to assess the model’s accuracy.
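These formulas are easy to mirror in code. Here’s a small, self-contained Swift sketch of the same arithmetic — the `EvaluationMetrics` type is illustrative only; Create ML computes these values for you in the Evaluation tab:

```swift
import Foundation

// Evaluation metrics from raw prediction counts.
// Assumes at least one predicted positive and one actual positive,
// so the denominators below are nonzero.
struct EvaluationMetrics {
    let truePositives: Int
    let falsePositives: Int
    let falseNegatives: Int

    // Precision: of everything predicted positive, how much was right?
    var precision: Double {
        Double(truePositives) / Double(truePositives + falsePositives)
    }

    // Recall: of everything actually positive, how much did we find?
    var recall: Double {
        Double(truePositives) / Double(truePositives + falseNegatives)
    }

    // F1 Score: harmonic mean of precision and recall.
    var f1Score: Double {
        2 * precision * recall / (precision + recall)
    }
}
```

For example, 8 true positives, 2 false positives, and 2 false negatives give a precision of 0.8 and a recall of 0.8, and therefore an F1 score of 0.8 as well.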
Click the “Incorrect” button to review the images. Create ML displays the incorrect images along with the classifier’s predictions and the correct answers. Review these images to identify areas for improvement in your data. For instance, if many incorrect samples are black-and-white images and your training data doesn’t include any, adding such images might enhance accuracy. If you find a certain type of error occurring consistently, it might also be worth confirming that the images are correctly categorized.
Next, open the “Preview” tab. Drag and drop a few images for both emotions to see a live evaluation of the model with percentage values. This is a good place to test specific images and identify weak points in your data.
With your model created and evaluated, you’re now ready to export and integrate it into the MoodTracker app. You’ll cover the export and integration process in the next lesson.
This content was released on Oct 8 2025. The official support period is six months from this date.