Now that you have a model, you can integrate it into your iOS app. Find and open the starter project for this lesson; you'll find the materials for this project in the Walkthrough folder. Run the app, and you'll see a basic app that lets you select a photo using the photo picker and then displays it in the view. To help test the app, add the sample image from the starter project to the Photos app in the simulator if you didn't already do so in the previous lesson: open the folder with the sample image in Finder and drag the sample-image.jpg file onto the iOS simulator.
Now activate the conda environment you used in the last lesson and change to the directory where you worked in that lesson. Start the Python interpreter and enter the following commands one line at a time:
from ultralytics import YOLO
import os
# Download the extra-large YOLOv8 Open Images V7 model and export it
# to Core ML with the weights quantized to Int8
model = YOLO("yolov8x-oiv7")
model.export(format="coreml", nms=True, int8=True)
# Rename the quantized package so the next export doesn't overwrite it
os.rename("yolov8x-oiv7.mlpackage", "yolov8x-oiv7-int.mlpackage")
# Export the same model again at the default Float16 precision
model.export(format="coreml", nms=True)
# Download and export the nano and medium versions of the model
model = YOLO("yolov8n-oiv7")
model.export(format="coreml", nms=True)
model = YOLO("yolov8m-oiv7")
model.export(format="coreml", nms=True)
You’ll find these commands in the materials as download-models.py.
You start by downloading the same Ultralytics model from the last lesson and converting it to Core ML while quantizing the weights to Int8. You then rename the resulting .mlpackage file before exporting the model again at the full Float16 precision. Finally, you download and convert two smaller variants of the YOLOv8 model: the nano (yolov8n) and medium (yolov8m) versions. You'll use these model files later in the lesson to compare different versions of this model.
Return to the starter project in Xcode. In Finder, find the following four model files: yolov8x-oiv7.mlpackage, yolov8x-oiv7-int.mlpackage, yolov8m-oiv7.mlpackage, and yolov8n-oiv7.mlpackage, and drag them into the Models group of the Xcode project. Make sure to set the Action to Copy files to destination and check the ImageDetection target for the copied files. Then click Finish.
Using a Model with the Vision Framework
Since you're dealing with image-related models, you can use the Vision framework to simplify your interaction with the model. The Vision framework provides features to perform computer vision tasks in your app and is a natural fit for any task that analyzes images or video. It also abstracts away some basic work you'd otherwise need to handle manually. For example, the model expects a 640 x 640 image, meaning you'd need to resize each image before the model could process it. The Vision framework takes care of that for you.
To use the framework, open ContentView.swift and add the following import below the existing import at the top of the file:
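import Vision

The next few paragraphs describe a new function that loads the model and prepares it for Vision. Here's a minimal sketch of that setup, not necessarily the lesson's exact code: the function name detectObjects(in:orientation:) and its parameters are placeholder assumptions, yolov8x_oiv7 is the class Xcode generates from the model package, and detector is the constant the code later in this section uses:

func detectObjects(in cgImage: CGImage, orientation: CGImagePropertyOrientation) {
  // Hypothetical function signature for illustration.
  // Load the generated model class with its default configuration and
  // wrap the underlying MLModel in a VNCoreMLModel for Vision.
  guard let model = try? yolov8x_oiv7(configuration: .init()).model,
        let detector = try? VNCoreMLModel(for: model) else {
    print("Unable to load the model.")
    return
  }

  // The code you add in the rest of this section goes here,
  // at the end of this function.
}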
You attempt to load one of the models you added earlier in this section. The yolov8x_oiv7 class name comes from the model package you added; Xcode generates a Swift class with that name for the model. You create an instance of the class with the default configuration specified by .init(). For the Vision framework, you need access to the model object, which is available through the model property of the class. Since this can throw an exception, you use the try? keyword, which returns nil if the call throws an error.
With the model loaded, you attempt to create a VNCoreMLModel using that model. This class encapsulates the information needed from the model file and acts as the interface between Swift and the model.
If any of these steps fail, the guard/else block takes over, printing a debugging message and returning. In a real app, you'd want to provide more information and assistance to the user.
Now add the following code to the end of your new function:
// 1
let visionRequest = VNCoreMLRequest(model: detector) { request, error in
  if let error = error {
    print(error.localizedDescription)
    return
  }
  // 2
  if let results = request.results as? [VNRecognizedObjectObservation] {
    // Insert result processing code here
  }
}
This code builds the Vision request to run the model against an image, but it doesn't execute it yet. It also defines a closure that will execute when the detection completes. Here's what each step does:
1. A VNCoreMLRequest creates the actual request for the Vision framework. You pass the VNCoreMLModel you instantiated earlier as a parameter. The results of the request are delivered to the closure as a VNRequest named request, and any errors arrive through the error parameter passed to the closure. The request can fail if the model is somehow incompatible with Vision, such as a model that doesn't accept an image as input. When it fails, you print the error and return.
2. If no error occurs, you attempt to extract the results from the VNRequest as VNRecognizedObjectObservation objects. These objects contain the results of object detection, which is what you're doing in this app with this model. You'll come back here in the next section to use this information.
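To give a sense of what those observations hold, each VNRecognizedObjectObservation carries a normalized boundingBox plus a labels array of VNClassificationObservation values ranked by confidence. Here's a brief illustrative peek, not one of the lesson's steps:

for observation in results {
  // The highest-confidence label for this detected object.
  let topLabel = observation.labels.first
  print(observation.boundingBox,
        topLabel?.identifier ?? "unknown",
        topLabel?.confidence ?? 0)
}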
This code finishes the setup of the Vision framework request and executes the request to run the model against the image:
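Here's a sketch of this step, assuming the cgImage and orientation values come from the function sketched earlier (both names are assumptions for illustration):

// Scale the whole image to fill the model's 640 x 640 input.
visionRequest.imageCropAndScaleOption = .scaleFill

// The handler performs Vision requests against this single image.
let requestHandler = VNImageRequestHandler(
  cgImage: cgImage,
  orientation: orientation
)

do {
  // Run the model. Results arrive in the closure attached to visionRequest.
  try requestHandler.perform([visionRequest])
} catch {
  print(error.localizedDescription)
}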
As mentioned earlier, the Vision framework resizes the image to the size the model expects, in this case, 640 x 640. The imageCropAndScaleOption property of the visionRequest tells Vision how to resize images of other sizes. In this app, the scaleFill option tells the framework to scale the image so that none of it is lost during resizing and no area of the image gets cut out.
The VNImageRequestHandler class processes image-analysis requests on a single image. Here, you use the cgImage version of the image the user selected. You also pass the image's orientation so the pixel data Vision analyzes matches the image's original capture orientation. In other words, the image isn't rotated or cropped.
At last, you run the model against the image. This can throw errors, so you wrap the call inside a do/catch statement to handle them. Here, you just print the error to the debug console. Notice that the call to perform(_:) doesn't return results directly; when it succeeds, the function simply ends. The results of the processing are delivered to the closure you passed to the VNCoreMLRequest, which you'll work with in the next section.
Your method now contains the code needed to connect the image and the model with the Vision framework and run the image through the model. Now that you have the Vision framework in place, it's time to do something with the results in the next section.