MoodTracker: Creating and Evaluating Emotion Classification Models
In this demo, you’ll work on building emotion classification models for the MoodTracker app. The objective is to create three image classifiers: one with default settings using a dataset with two labels, one with all augmentations enabled for the same two-label dataset, and a third using a three-label dataset with all augmentations. This approach will help clarify the effects of augmentations and additional labels on model accuracy.
In the project folder, you’ll find two folders: one containing training and testing datasets for two emotions (happy and sad), and the other for three emotions (happy, sad, and neutral). You can use these datasets for your project or choose to use your own images.
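Create ML’s image classifier template infers labels from folder names: each dataset folder holds one subfolder per emotion, and every image inside a subfolder counts as an example of that label. The layout below is a rough sketch of the three-label dataset (the folder names in the downloaded materials may differ slightly); the two-label dataset follows the same pattern without the neutral folder.

```
Emotions Data (3 Labels)/
├── Training/
│   ├── happy/
│   ├── sad/
│   └── neutral/
└── Testing/
    ├── happy/
    ├── sad/
    └── neutral/
```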
Building the Image Classifiers
Open Xcode and then select “Xcode” > “Open Developer Tool” > “Create ML” to access the Create ML app. Click “New Document” or navigate to “File” > “New” > “Project”. Select the “Image Classification” template and proceed by clicking “Next”. Name it EmotionsImageClassifier and choose the save location. The Create ML app will now display the three main parts as seen previously.
In the main screen, select the “Settings” tab, then drag the Training (2 Labels) folder containing the training dataset from the Emotions Data (2 Labels) folder into the Training Data area. Observe that the “Testing Data” area remains active to accept input. Also, note that the training model will appear under the “Data Sources” area on the left side.
Once training is complete, the app navigates to the evaluation stage, showing the accuracy results for the training, validation, and testing data at the top of the tab. For this first classifier (default settings with two labels), you might see a training accuracy of 100 percent, with somewhat lower validation and testing accuracies. Before getting deeper into the evaluation, train another classifier with a different configuration to observe how the accuracy changes.
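The Create ML app is all you need for this demo, but the same training can also be scripted with the CreateML framework in a macOS Playground. Here’s a minimal sketch of the first classifier under those assumptions; the dataset paths are placeholders, and it’s worth confirming the exact API against Apple’s CreateML documentation rather than treating this as part of the demo’s required steps.

```swift
import CreateML
import Foundation

// Placeholder paths -- point these at the two-label dataset on your machine.
let trainingDir = URL(fileURLWithPath: "/path/to/Emotions Data (2 Labels)/Training")
let testingDir = URL(fileURLWithPath: "/path/to/Emotions Data (2 Labels)/Testing")

// Train with default settings, like the first classifier in the demo.
// Each subfolder name (happy, sad) becomes a label.
let classifier = try MLImageClassifier(
  trainingData: .labeledDirectories(at: trainingDir)
)

// Evaluate on the held-out testing data and print the overall accuracy.
let metrics = classifier.evaluation(on: .labeledDirectories(at: testingDir))
print("Testing accuracy: \((1.0 - metrics.classificationError) * 100)%")
```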
Click the “+” button next to “Model Sources” in the left pane to create a new classifier. Drag in the same training and testing data as used in the first classifier. This time, select “Image Feature Print V2” and enable all augmentation options to diversify the dataset. Click the “Train” button to start training the second model. Monitor the progress and compare the accuracy charts with those of the first classifier. You might notice that this second model has improved validation and test accuracy compared with the first one. This demonstrates one way to improve the training process.
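For reference, the augmentation checkboxes you just enabled map to MLImageClassifier.ImageAugmentationOptions when training in code. This is a hedged sketch in the style of the previous snippet, with the same placeholder path; other settings, including the feature extractor (“Image Feature Print V2” in the UI), are left at their defaults here.

```swift
import CreateML
import Foundation

// Placeholder path -- the same two-label training folder as before.
let trainingDir = URL(fileURLWithPath: "/path/to/Emotions Data (2 Labels)/Training")

// Enable every augmentation option, mirroring the checkboxes in the Settings tab.
let allAugmentations: MLImageClassifier.ImageAugmentationOptions = [
  .crop, .rotation, .blur, .exposure, .noise, .flip
]

// Remaining parameters (feature extractor, iteration count) keep their defaults.
let parameters = MLImageClassifier.ModelParameters(
  augmentationOptions: allAugmentations
)

// Train the second classifier on the same data, this time with augmentations.
let augmentedClassifier = try MLImageClassifier(
  trainingData: .labeledDirectories(at: trainingDir),
  parameters: parameters
)
```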
Next, repeat the process with the Emotions Data (3 Labels) folder. Create a new classifier using the same steps, but this time drag the training and testing datasets from the Emotions Data (3 Labels) folder into the respective areas. Select “Image Feature Print V2” and enable all augmentation options again to train this third model.
You might observe that this third classifier, which includes the additional neutral label, shows a decrease in accuracy, with lower validation and testing accuracy than the two-label classifiers. This reduction in accuracy results from the model’s increased complexity with the additional label. To improve accuracy, consider adding more diverse data to the training set. Achieving near 100 percent accuracy in real-world scenarios is a long and challenging process that often requires significant time and effort.
Note: It’s important to mention that each time you train the model, you might observe variations in accuracy. These differences occur due to the random nature of the training process, especially when dealing with non-deterministic models or when data augmentation is applied. Even in cases where a model reached 100 percent accuracy in one run, the results varied noticeably across different training attempts. Keep this in mind as you experiment with different configurations and datasets.
Evaluating Model Performance
After training the models, evaluate their performance by checking metrics and testing with real data in the preview section. Open the “Evaluation” tab of the second classifier and press “Testing” to review the results for the testing data. You can see the test accuracy and other statistics, including which label has the lowest precision.
In the “Evaluation” tab, there are several key metrics used to assess the performance of your model. False Positives occur when the model incorrectly labels a negative instance as positive, while False Negatives happen when the model misses a positive instance, labeling it as negative instead. To evaluate the model’s accuracy, calculate Precision by dividing the number of true positives by the sum of true positives and false positives. Recall is determined by dividing the number of true positives by the sum of true positives and false negatives. Finally, the F1 Score provides a balanced measure of Precision and Recall by calculating their harmonic mean, offering a single metric to assess the model’s accuracy.
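If it helps to see the arithmetic, here’s how those three metrics fall out of the raw counts. The counts in the example are made-up values for illustration, not numbers from the demo’s evaluation run.

```swift
// Precision: of everything the model called positive, how much really was positive?
func precision(truePositives: Double, falsePositives: Double) -> Double {
  truePositives / (truePositives + falsePositives)
}

// Recall: of everything that really was positive, how much did the model catch?
func recall(truePositives: Double, falseNegatives: Double) -> Double {
  truePositives / (truePositives + falseNegatives)
}

// F1 Score: the harmonic mean of precision and recall.
func f1Score(precision: Double, recall: Double) -> Double {
  2 * precision * recall / (precision + recall)
}

// Hypothetical counts for the "happy" label: 45 happy photos classified correctly,
// 5 sad photos mislabeled as happy, and 3 happy photos mislabeled as sad.
let p = precision(truePositives: 45, falsePositives: 5) // 0.9
let r = recall(truePositives: 45, falseNegatives: 3)    // 0.9375
let f1 = f1Score(precision: p, recall: r)               // ~0.918
```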
Click the “Incorrect” button to filter the images. Create ML displays the misclassified images along with the classifier’s predictions and the correct answers. Review these images to identify areas for improvement in your data. For instance, if most incorrect samples are black-and-white images and your training data does not include any, adding such images might enhance accuracy. If you find a certain type of error occurring consistently, it might be worth checking that the images are correctly categorized.
Next, open the “Preview” tab. Drag and drop a few images for both emotions to see a live evaluation of the model with confidence levels. This is a good place to test specific images and identify weak points in your data.
With your model trained and evaluated, you’re now ready to export it and integrate it into the MoodTracker app. You’ll cover the export and integration process in the next lesson.