Big Data in Computational Health Informatics
Published in Ayman El-Baz, Jasjit S. Suri, Big Data in Multimodal Medical Imaging, 2019
Ruogu Fang, Yao Xiao, Jianqiao Tian, Samira Pouyanfar, Yimin Yang, Shu-Ching Chen, S. S. Iyengar
Machine learning is an area of computer science that explores the creation of algorithms that can automatically learn from data and improve through experience. Machine learning techniques are now applied in a wide variety of domains, including audio processing, autonomous vehicles, and the detection of fraudulent credit card activity, to name a few [64]. Computerization in healthcare is growing steadily, leading to large and complex medical databases. Machine learning algorithms can manage such large databases automatically. A general overview of machine learning techniques in healthcare informatics is presented below.
Industry 4.0 and real-time synchronization of operation and maintenance
Published in Stein Haugen, Anne Barros, Coen van Gulijk, Trond Kongsvik, Jan Erik Vinnem, Safety and Reliability – Safe Societies in a Changing World, 2018
Machine learning is a field of computer science that gives computers the ability to learn without being explicitly programmed. Machine learning is quite different from the model-based approach advocated here. A strength of machine learning is its efficiency in producing a huge amount of results without the explicit need to do all the “hard work”. From a model-based perspective, most of us are reluctant to simply “let the computer work out the answers”. However, for sub-problems such as establishing a failure model, looking into machine learning approaches is more acceptable.
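As a rough illustration of such a sub-problem, the sketch below fits a simple data-driven failure model with scikit-learn; the feature names, the toy data, and the choice of logistic regression are assumptions made for illustration and do not come from the paper.

```python
# Minimal sketch (illustrative assumptions only): learning a simple failure
# model from historical operating data, the kind of sub-problem where a
# machine-learning approach may be acceptable alongside a model-based one.
import numpy as np
from sklearn.linear_model import LogisticRegression

# Hypothetical history: [operating hours in thousands, vibration level] and
# whether a failure was observed before the next inspection.
X = np.array([[1.0, 0.2], [4.0, 0.4], [8.0, 0.9],
              [12.0, 1.3], [2.0, 0.3], [9.0, 1.1]])
y = np.array([0, 0, 1, 1, 0, 1])

model = LogisticRegression().fit(X, y)

# Estimated failure probability for a new operating condition.
print(model.predict_proba([[7.0, 0.8]])[0, 1])
```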
Prologue: Why data science?
Published in Benjamin S. Baumer, Daniel T. Kaplan, Nicholas J. Horton, Texts in Statistical Science, 2017
Benjamin S. Baumer, Daniel T. Kaplan, Nicholas J. Horton
Computer science is more than just programming; it is the creation of appropriate abstractions to express computational structures and the development of algorithms that operate on those abstractions. Similarly, statistics is more than just collections of estimators and tests; it is the interplay of general notions of sampling, models, distributions and decision-making. [Data science] is based on the idea that these styles of thinking support each other [159].
Comparative Investigation of Learning Algorithms for Image Classification with Small Dataset
Published in Applied Artificial Intelligence, 2021
Imran Iqbal, Gbenga Abiodun Odesanmi, Jianxiang Wang, Li Liu
Machine learning is a subfield of computer science that gives electronic devices the ability to learn automatically and improve their performance without being programmed with task-specific rules. In 1952, Arthur Samuel wrote the first learning program, which played the game of checkers (Samuel 1959). Deep learning is a subclass of machine learning: a network is typically called "deep" when it has more than two layers; otherwise the approach is referred to as "shallow" learning. The study of deep learning started in the 1940s (McCulloch and Pitts 1943), when McCulloch and Pitts used a logical calculus to describe the ideas immanent in nervous activity. Deep networks trained on massive datasets have achieved better performance in image recognition (LeCun et al. 1998), video classification (Karpathy et al. 2014), natural language processing (Collobert and Weston 2008), and speech recognition (Hinton et al. 2012) tasks.
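To illustrate the "deep" versus "shallow" distinction mentioned above, here is a minimal NumPy sketch that builds two randomly initialised feed-forward networks differing only in their number of layers; the layer sizes and the ReLU activation are illustrative assumptions, not values from the article.

```python
# Minimal sketch: "shallow" vs "deep" feed-forward networks in plain NumPy.
import numpy as np

rng = np.random.default_rng(0)

def relu(x):
    return np.maximum(0.0, x)

def forward(x, layer_sizes):
    """Pass x through randomly initialised dense layers with ReLU activations."""
    for n_in, n_out in zip(layer_sizes[:-1], layer_sizes[1:]):
        W = rng.normal(scale=1.0 / np.sqrt(n_in), size=(n_in, n_out))
        b = np.zeros(n_out)
        x = relu(x @ W + b)
    return x

x = rng.normal(size=(1, 16))                 # one 16-dimensional input
shallow = forward(x, [16, 8, 2])             # two layers: "shallow" learning
deep = forward(x, [16, 32, 32, 32, 2])       # several hidden layers: "deep" learning
print(shallow.shape, deep.shape)             # both map the input to 2 outputs
```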
Image compression in resource-constrained eye tracking devices*
Published in Journal of Information and Telecommunication, 2019
Pavel Morozkin, Marc Swynghedauw, Maria Trocan
Machine Learning (ML) (Cortes & Vapnik, 1995; Samuel, 1959) is a field of computer science that gives computers the ability to learn without being explicitly programmed. In this definition, the term ‘learning’ refers to the task of inferring a function from labelled training data. The training data consist of a set of training examples. Machine learning approaches are divided into two main types:
Supervised learning (Møller, 1993) – each training sample is a pair consisting of an input object (typically a vector) and a desired output value (also called the supervisory signal). Such a sample is also called a ‘labelled’ sample.
Unsupervised learning (Hastie, Tibshirani, & Friedman, 2009) – each training sample is only an input object (typically a vector), i.e. without a desired output value. Such a sample is also called an ‘unlabelled’ sample.
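As a concrete illustration of the two types, the following short sketch uses scikit-learn on toy data (the data and the choice of estimators are assumptions for illustration, not taken from the article): the supervised model is fitted on (input, label) pairs, while the clustering model sees only the unlabelled inputs.

```python
# Minimal sketch: the same feature matrix used in a supervised setting
# (input/label pairs) and an unsupervised one (inputs only).
import numpy as np
from sklearn.linear_model import LogisticRegression
from sklearn.cluster import KMeans

X = np.array([[0.1, 0.2], [0.2, 0.1], [0.9, 1.0], [1.0, 0.8]])  # input vectors
y = np.array([0, 0, 1, 1])                                      # supervisory signal

# Supervised learning: each training sample is a (vector, label) pair.
clf = LogisticRegression().fit(X, y)
print(clf.predict([[0.15, 0.15], [0.95, 0.9]]))  # predicted labels for new inputs

# Unsupervised learning: only the input vectors, no desired output values.
km = KMeans(n_clusters=2, n_init=10).fit(X)
print(km.labels_)                                # cluster assignments found from X alone
```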
The STG-framework: a pattern-based algorithmic framework for developing generative models of parametric architectural design at the conceptual design stage
Published in Computer-Aided Design and Applications, 2018
In the domain of computer science, an algorithm is a process for solving a problem in a finite number of steps. Algorithmic modeling tools such as Grasshopper were developed to automate and accelerate 3D modeling tasks by applying generative algorithms. However, cognitive research has revealed that designers typically apply algorithms only as a means of exploring geometric intentions, and prefer to apply known solutions and design patterns for other, non-geometric issues [16]. When designers’ intentions go beyond geometry, regardless of the type of design objective [4], designers need to find or develop appropriate algorithms before they can implement generative or evaluative scripts. Algorithmic modeling is gradually being applied to the generation of complex forms, multi-objective optimization, and the control and evaluation of building performance. One of the reasons for this is that the relevant algorithms, including mathematical formulas for complex geometries, metaheuristic algorithms from artificial intelligence [15], structural analysis methods, and energy consumption formulas, have been validated in their respective domains. The task of algorithmic design has thus become the implementation of algorithms in modeling tools, rather than the interpretation of architectural design problems and the derivation of solving algorithms. Since there is insufficient guidance and assistance for converting architectural knowledge into algorithmic scripts, it is not surprising that designers prefer to apply known solutions rather than develop or implement algorithmic scripts on their own.
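As a plain-Python analogue of such a generative script (not a Grasshopper definition, and not taken from the paper), the following sketch expresses a simple geometric intention as a finite, parameter-driven sequence of steps; the parameter names and values are illustrative assumptions.

```python
# Minimal sketch: a tiny generative routine that regenerates a geometric model
# from a handful of parameters, analogous in spirit to an algorithmic-modeling
# definition.
import math

def facade_grid(columns, rows, spacing, amplitude):
    """Return (x, y, z) points for a grid whose depth follows a sine wave."""
    points = []
    for i in range(columns):
        for j in range(rows):
            z = amplitude * math.sin(2 * math.pi * i / columns)  # geometric rule
            points.append((i * spacing, j * spacing, z))
    return points

# Changing the parameters regenerates the whole model, which is what
# distinguishes algorithmic (generative) modeling from manual 3D modeling.
for p in facade_grid(columns=4, rows=2, spacing=3.0, amplitude=0.5)[:4]:
    print(p)
```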