Human-driven machine-automation of engineering research
Published in António S. Cardoso, José L. Borges, Pedro A. Costa, António T. Gomes, José C. Marques, Castorina S. Vieira, Numerical Methods in Geotechnical Engineering IX, 2018
M.D.L. Millen, A. Viana Da Fonseca, X. Romão
This has happened before. Mathematical research in the 19th century still relied heavily on human intuition and, in some cases, major branches of mathematics rested on inconsistent results and lacked formal proofs. A movement known as “Hilbert’s program” set out to rigorously rebuild mathematics from its foundations, using a set of axioms to re-prove and formalise old theorems (Hilbert 1902). This rebuilding and formalisation allowed mathematicians to collaborate more easily and to develop more advanced and consistent theories. Another case is the work of René Descartes (1596–1650), who wanted to remove all doubt from science and philosophy by rebuilding them entirely from a single first truth: “I think, therefore I am”. Currently, many fields of science suffer from poorly organised global research and non-reproducible results, which has prompted new initiatives such as the ‘Reproducibility Project’ in Psychology (Poldrack and Poline 2015). Other fields, meanwhile, have fully formalised the research process and use robots and machine learning to automate scientific discovery for narrow problems such as drug development (Sparkes et al. 2010). Engineering research does not need to be as extreme as Descartes or move entirely to robotics, but a greater focus on consistency is required now so that research can happen efficiently and effectively at a global level.
Foundations of mathematics under neuroscience conditions of lateral inhibition and lateral activation
Published in International Journal of Parallel, Emergent and Distributed Systems, 2018
Andrew Schumann, Alexander V. Kuznetsov
The Principia Mathematica, a three-volume work written jointly by Alfred North Whitehead and Bertrand Russell and published in 1910, 1912, and 1913, was one of the first books devoted to the ‘foundations of mathematics’ in the strict sense: it was one of the first attempts to make mathematics explicit from the point of view of symbolic logic, that is, to treat mathematical theorems as logical statements that are inferred automatically from axioms by logical inference rules. The work was directly influenced by Gottlob Frege’s ideas as presented in his fundamental book Die Grundlagen der Arithmetik (The Foundations of Arithmetic), published in 1884. To continue and extend the approach established by Whitehead and Russell, the German mathematician David Hilbert (1862–1943) put forward a new proposal for the foundations of mathematics called the Finitist Program (or Hilbert’s Program). Under this proposal, all of mathematics was to be formalized in axiomatic form, together with a proof, by the ‘finitary’ methods Hilbert proposed, that this axiomatization is consistent. One of the attempts to axiomatize all of mathematics was made by ‘Nicolas Bourbaki’, a group of 20th-century mathematicians who jointly wrote the many-volume work Éléments de mathématique (The Elements of Mathematics).