Here Is One I Made Earlier: Machine Learning Deployment
Published in Jesús Rogel-Salazar, Advanced Data Science and Analytics with Python, 2020
Core ML supports a variety of machine learning models, from generalised linear models (GLMs for short) to neural networks. A summary of the models that can be used with Core ML is shown in Table 5.1. It is possible for you to develop your own custom conversion tool in case your model is not currently supported. Core ML also helps with the task of adding the trained machine learning model to your application by automatically creating a custom programmatic interface that supplies an API (Application Programming Interface) to your model. All this is within the comfort of Apple's own IDE: Xcode. Check the Apple Developer documentation for further models supported in the future.
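To make the generated interface concrete, here is a minimal, hypothetical Swift sketch: assume Xcode has compiled a model file named SleepClassifier.mlmodel, for which it auto-generates a class of the same name. The class name and feature names below are illustrative, not from any real model.

import CoreML

// Hypothetical: Xcode compiled "SleepClassifier.mlmodel" and generated a
// Swift class with typed inputs and outputs. All names here are invented.
func predictSleepQuality() throws {
    let model = try SleepClassifier(configuration: MLModelConfiguration())

    // The generated prediction method exposes the model's named inputs
    // directly, so no manual MLMultiArray handling is needed.
    let output = try model.prediction(heartRate: 62.0, hoursAsleep: 7.5)

    // Outputs are likewise exposed as typed properties.
    print("Predicted quality: \(output.qualityLabel)")
}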
Mobile Programming
Published in Jithesh Sathyan, Anoop Narayanan, Navin Narayan, K V Shibu, A Comprehensive Guide to Enterprise Mobility, 2016
Similar to other application development processes, an iOS application development process generally follows the major steps given here:
1. Create your project: Use a project template to create a project, choosing the template for the type of application to be developed.
2. Design the UI: Design the application's UI graphically and save the designs as resource files that the application loads at runtime.
3. Write code: Use Xcode features such as code completion, class and data modeling, refactoring, and direct access to documentation while writing the code (a small sketch follows this list).
4. Build and run your application: Build the application on the computer and run it in a simulator or on the device.
5. Measure and tune application performance: After running the application, analyze its performance to ensure that it uses the device's resources efficiently and responds satisfactorily to a user's gestures.
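As a minimal illustration of the "write code" step, the Swift sketch below (class and text names are purely illustrative) shows a view controller that responds to a user's tap gesture, the kind of interaction whose responsiveness the final tuning step is meant to verify.

import UIKit

// Illustrative only: a view controller that updates a label when tapped.
class GreetingViewController: UIViewController {
    private let label = UILabel()

    override func viewDidLoad() {
        super.viewDidLoad()
        view.backgroundColor = .systemBackground

        // A centred label filling the view.
        label.text = "Tap anywhere"
        label.textAlignment = .center
        label.frame = view.bounds
        label.autoresizingMask = [.flexibleWidth, .flexibleHeight]
        view.addSubview(label)

        // Respond to the user's tap gesture.
        let tap = UITapGestureRecognizer(target: self, action: #selector(handleTap))
        view.addGestureRecognizer(tap)
    }

    @objc private func handleTap() {
        label.text = "Hello, iOS!"
    }
}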
Game.UP: Gamified Urban Planning Participation Enhancing Exploration, Motivation, and Interactions
Published in International Journal of Human–Computer Interaction, 2023
Sarah L. Muehlhaus, Chloe Eghtebas, Nils Seifert, Gerhard Schubert, Frank Petzold, Gudrun Klinker
The prototype was built in Swift for iOS 13.1 using the Xcode development environment. All user studies were conducted on an iPad 11, as it provided a larger screen for accessibility and visibility. An initial selection screen at the start of the application allowed switching between application versions (gamification, gamified-a, control). The application presented surveys at the beginning, middle, and end of the user study through a web-view controller accessing instances of the survey on SurveyMonkey, so a reliable data connection was required to record the user session on location. GPS and camera access were required for the Apple map, from which the AR view (see Figure 1) is accessed. There were some technical difficulties with the on-site AR concerning scene detection due to the high number of moving variables (pedestrians, cyclists, cars), as well as environmental issues hindering the reliable placement of a QR code. As a result, the AR view in the application, implemented using the ARKit framework, only displayed a scene of the bridge upon manually tapping the screen.
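A tap-to-place fallback of the kind described can be sketched as follows; this is an assumed reconstruction rather than the study's code, and the scene file name is a placeholder.

import UIKit
import ARKit

// Assumed sketch: anchor the bridge model where the user taps, instead of
// relying on QR-code or scene detection. "bridge.scn" is a placeholder.
class BridgeARViewController: UIViewController {
    let sceneView = ARSCNView()

    override func viewDidLoad() {
        super.viewDidLoad()
        sceneView.frame = view.bounds
        view.addSubview(sceneView)

        let tap = UITapGestureRecognizer(target: self, action: #selector(placeBridge(_:)))
        sceneView.addGestureRecognizer(tap)
    }

    override func viewWillAppear(_ animated: Bool) {
        super.viewWillAppear(animated)
        // World tracking with horizontal plane detection.
        let config = ARWorldTrackingConfiguration()
        config.planeDetection = [.horizontal]
        sceneView.session.run(config)
    }

    @objc func placeBridge(_ gesture: UITapGestureRecognizer) {
        let point = gesture.location(in: sceneView)
        // Raycast from the tap onto an estimated horizontal plane, then
        // place the bridge node at the hit location.
        guard let query = sceneView.raycastQuery(from: point,
                                                 allowing: .estimatedPlane,
                                                 alignment: .horizontal),
              let result = sceneView.session.raycast(query).first,
              let scene = SCNScene(named: "bridge.scn"),
              let node = scene.rootNode.childNodes.first else { return }
        node.simdTransform = result.worldTransform
        sceneView.scene.rootNode.addChildNode(node)
    }
}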
Validity and reliability of a computer-vision-based smartphone app for measuring barbell trajectory during the snatch
Published in Journal of Sports Sciences, 2020
Carlos Balsalobre-Fernández, Gretchen Geiser, John Krzyszkowski, Kristof Kipp
The vertical and horizontal positions recorded with the motion capture system were exported as .csv files for further processing. For the app, an update to the previously validated My Lift app (Balsalobre-Fernández et al., 2018) was developed specifically for this study using Xcode 10 for macOS High Sierra 10.14 and the Swift 4 programming language with the iOS 12 SDK (Apple Inc., USA). The update included a set of custom computer-vision algorithms using Apple's Vision framework (Apple Inc., USA) designed to automatically detect the barbell trajectory during weightlifting movements. To calibrate the app, a scalable circle was drawn around the barbell plate closest to the camera. The computer-vision algorithms then automatically tracked the motion of the selected plate during the whole movement. A video tutorial showing how to use the app to measure the trajectory of the barbell in the snatch exercise can be found at the following URL: https://youtu.be/WGU4VR8efzQ, and as a supplementary file. Once the barbell trajectory was tracked in the app, the vertical and horizontal positions were exported as .csv files for further processing.
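Vision's object-tracking API makes this kind of plate tracking straightforward to set up. The sketch below is an assumption about the general approach, not the authors' algorithms; the class name is invented, and the calibration circle is reduced to its bounding box.

import Vision
import CoreVideo
import CoreGraphics

// Assumed sketch: the user-drawn circle around the plate becomes an initial
// bounding box, and VNTrackObjectRequest follows it frame by frame.
final class PlateTracker {
    private let handler = VNSequenceRequestHandler()
    private var lastObservation: VNDetectedObjectObservation

    // initialBox: normalised bounding box of the calibration circle.
    init(initialBox: CGRect) {
        lastObservation = VNDetectedObjectObservation(boundingBox: initialBox)
    }

    // Returns the plate centre (normalised coordinates) for one video frame.
    func track(pixelBuffer: CVPixelBuffer) throws -> CGPoint? {
        let request = VNTrackObjectRequest(detectedObjectObservation: lastObservation)
        request.trackingLevel = .accurate
        try handler.perform([request], on: pixelBuffer)
        guard let result = request.results?.first as? VNDetectedObjectObservation else {
            return nil
        }
        lastObservation = result
        return CGPoint(x: result.boundingBox.midX, y: result.boundingBox.midY)
    }
}

Collecting each frame's centre point over the whole lift would then yield the horizontal and vertical trajectory of the kind the app exports as .csv.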
The validity and reliability of a novel app for the measurement of change of direction performance
Published in Journal of Sports Sciences, 2019
Carlos Balsalobre-Fernández, Chris Bishop, José Vicente Beltrán-Garrido, Pau Cecilia-Gallego, Aleix Cuenca-Amigó, Daniel Romero-Rodríguez, Marc Madruga-Parera
The CODTimer app was developed specifically for this study using Xcode 10.2.1 for macOS High Sierra 10.14.4 and the Swift 5 programming language with the iOS 12 SDK (Apple Inc., USA). The AVFoundation and AVKit frameworks (Apple Inc., USA) were used for capturing, importing, and manipulating high-speed videos. The app (version 1.0) was then installed on an iPhone X running iOS 12.2 (Apple Inc., USA), which records at a frequency of 240 frames per second (fps) at FullHD quality (1920 × 1080 pixels). The app's user interface was designed to record high-speed videos and to allow frame-by-frame inspection of them. The app then calculates the total time in the 5 + 5 change of direction test (5 + 5) as the difference between two time events manually selected by an independent user as follows: the beginning of the 5 + 5 was taken as the first frame in which the participant crossed the timing gate on the starting/end line of the test, and the end as the first frame in which the participant crossed that gate again. A video tutorial showing the complete procedure can be found at the following URL: https://youtu.be/_Y2xZjMA7fc
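The timing arithmetic here is simply the frame difference divided by the recording frequency. The Swift sketch below (type names invented, not the CODTimer source) illustrates the calculation, together with AVFoundation's frame-stepping call that supports frame-by-frame inspection.

import AVFoundation

// Assumed sketch: at 240 fps, total time is the frame difference between
// the two manually selected gate crossings divided by the frame rate.
struct CODTiming {
    let framesPerSecond: Double   // 240 fps on the iPhone X used here

    func totalTime(startFrame: Int, endFrame: Int) -> TimeInterval {
        Double(endFrame - startFrame) / framesPerSecond
    }
}

// Frame-by-frame inspection can use AVPlayerItem's stepping API.
func stepOneFrameForward(in player: AVPlayer) {
    player.pause()
    player.currentItem?.step(byCount: 1)   // +1 frame; -1 steps backwards
}

// Example: gate crossings at frames 312 and 1752 of a 240 fps video.
let timing = CODTiming(framesPerSecond: 240)
print(timing.totalTime(startFrame: 312, endFrame: 1752))   // 6.0 s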