The Business Case for Augmented Intelligence
Published in Augmented Intelligence, 2019
Judith Hurwitz, Henry Morris, Candace Sidner, Daniel Kirsch
Every era sees the advent of new technologies that disrupt the way we live, buy, and manage our lives. The steam engine transformed commerce beginning in 1698; the telegraph, invented in 1837, changed communications forever. Alexander Graham Bell’s first US patent for the telephone in 1876 was the most transformative technology of all, because it changed the pace of business as never before experienced. The invention of the automobile changed how individuals conducted their daily lives and how businesses conducted commerce. And, of course, the commercialization of the Internet in the early 1990s led to dramatic business changes. The Internet, and then the advent of cloud computing and innovations in distributed computing, have made the world of AI and machine learning commercially viable, as we discussed in Chapter 2. The bottom line is that a technology that initially seems focused on a single purpose will often lead to dramatic changes in the way businesses must operate.
Natural language processing (NLP) in management research: A literature review
Published in Journal of Management Analytics, 2020
Yue Kang, Zhao Cai, Chee-Wee Tan, Qian Huang, Hefu Liu
After the mid-1990s, two events fundamentally promoted the recovery and development of NLP research. The first was the rapid increase in computing speed and storage, which improved the material foundation for NLP and made the commercial development of speech and language processing possible. The second was the commercialization of the Internet in 1994. Overall, the development of network technology during this period made the demand for natural language-based information retrieval and information extraction more prominent. In 2001, Yoshua Bengio proposed the first neural language model, based on a feed-forward neural network. In 2008, Ronan Collobert was the first to apply multitask learning to neural networks for NLP. In 2013, Tomas Mikolov, working at Google, developed Word2Vec, a neural-network-based statistical method that can efficiently learn standalone word embeddings from a text corpus. In 2014, Ilya Sutskever proposed the sequence-to-sequence learning model, a general framework for mapping one sequence to another using neural networks. Building on these statistical models, machines have become better able to understand and produce human language.
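To make the Word2Vec step concrete, here is a minimal sketch of learning word embeddings from a small text corpus with the open-source gensim library. The toy sentences, the vector_size and window values, and the query word are illustrative assumptions, not details taken from the review.

```python
# Minimal Word2Vec sketch (assumes gensim >= 4.0, where the dimensionality
# parameter is named vector_size). All data below is hypothetical.
from gensim.models import Word2Vec

# Toy corpus: each document is pre-tokenized into a list of words.
corpus = [
    ["natural", "language", "processing", "enables", "information", "retrieval"],
    ["word", "embeddings", "map", "words", "to", "dense", "vectors"],
    ["neural", "networks", "learn", "embeddings", "from", "a", "text", "corpus"],
]

# Train a skip-gram Word2Vec model: sg=1 selects skip-gram,
# window is the context size, min_count=1 keeps all words in this tiny corpus.
model = Word2Vec(corpus, vector_size=50, window=3, min_count=1, sg=1, epochs=20)

# Look up the learned embedding for a word and its nearest neighbors in vector space.
vector = model.wv["embeddings"]                      # a 50-dimensional vector
neighbors = model.wv.most_similar("embeddings", topn=3)
print(vector.shape, neighbors)
```

On a corpus this small the neighbors are not meaningful; the point is only that the model maps each word to a dense vector learned from its contexts, which is what made downstream retrieval and extraction tasks easier to build.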