“Alexa, What about LGPD?”:
Published in Syed Rameem Zahra, Mohammad Ahsan Chishti, Security and Privacy in the Internet of Things, 2020
According to Hepp et al. (2018), quoting Karanasios et al. (2013), the term ‘datafication’ refers to the increasing digitalization of media through software-based technology. On this basis, the authors update media studies, proposing to investigate the “digital tracks” of data that can be aggregated and processed automatically by algorithms. Big data technologies situate datafication in all social spheres, including interactions with voice assistants such as Alexa and Google Assistant. In the two sections that follow, the reader will find evidence of new actors who contribute legislatively to the datafication of voice-assistant devices such as Alexa products, and in particular of how this intersects with Brazilian and European data protection regulations.
Ethical Issues and Considerations of Big Data
Published in Kuan-Ching Li, Beniamino DiMartino, Laurence T. Yang, Qingchen Zhang, Smart Data, 2019
Datafication is “a set of collective tools, technologies, and processes used to transform an organization into a data-driven enterprise”; the corresponding verb is “to datafy.” A firm that has carried out datafication is said to be datafied.
Digital earth: yesterday, today, and tomorrow
Published in International Journal of Digital Earth, 2023
Alessandro Annoni, Stefano Nativi, Arzu Çöltekin, Cheryl Desha, Eugene Eremchenko, Caroline M. Gevaert, Gregory Giuliani, Min Chen, Luis Perez-Mora, Joseph Strobl, Stephanie Tumampos
As introduced above, the current and forthcoming digital transformation of society has led to the emergence of a new paradigm sometimes known as datafication (Mayer-Schönberger and Cukier 2014). According to this paradigm, all aspects of our life are converted into quantified data, which can be analysed to generate actionable intelligence. When a user interacts with DE, the large volume of data now available requires a new paradigm for processing and extracting knowledge. DE must embrace the datafication paradigm because it fits beautifully with its vision and supports the expected services. In the DE application domain, the datafication model should largely be based on three digital processes (Nativi, Mazzetti, and Craglia 2021; Guo et al. 2020):

(Big) Data collection: the collection, aggregation, and contextualization of digital artefacts and digital footprints constantly generated by humans, machines, and real objects connected to the network. The next generations of IoT (IoT 2.0), social sensing platforms, remote sensing instruments, and global broadband communication systems will further increase the volume, diversity, and speed of data to the point where we can talk about big data.

Generation of deep insights: the recognition of valuable insights by analysing the collected big data. This is commonly achieved by using big data analytics techniques, i.e. advanced (visual) analytic techniques applied to very large and diverse data sets that include structured, semi-structured, and unstructured data from different sources, at sizes on the order of terabytes to zettabytes. Today, these practices largely make use of advanced data management systems and data-driven AI technologies, and scientific methods in remote sensing are changing because of the insights these techniques can generate. In the near future, responding to the evolving and growing challenges posed by big data will require ever greater analytical capacity with ever shorter response times.

Interpretation of insights and actionable intelligence generation: the interpretation of the generated insights to develop profiled intelligence based on user needs. This is achieved through specialized online platforms that interact with users, as well as with data analytics and AI stakeholders, to provide personalized services. This approach offers a rich user experience by applying the principles of the platform economy. In the near future, application tools and services will work more and more with analytical insights and less with observational data, and new systems and approaches will increasingly be needed, for example by applying the Digital Twin paradigm.
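The three digital processes above can be illustrated with a minimal, purely hypothetical Python sketch. All function names, data streams, and the toy "user profile" below are invented for illustration; real DE platforms would use large-scale analytics and AI systems rather than in-memory lists.

```python
# Hypothetical sketch (all names invented) of the three-process
# datafication model: (1) big data collection, (2) insight
# generation, (3) interpretation into actionable intelligence.
from statistics import mean

def collect_big_data():
    # (1) Data collection: stand-in for digital footprints aggregated
    # from IoT sensors, social sensing, and remote sensing sources.
    return {
        "iot_temperature_c": [21.5, 22.1, 23.8, 25.0],
        "social_mentions_heatwave": [3, 5, 9, 14],
    }

def generate_insights(data):
    # (2) Insight generation: a toy substitute for big data
    # analytics -- per-stream averages and a simple trend flag.
    return {
        name: {"mean": mean(series), "rising": series[-1] > series[0]}
        for name, series in data.items()
    }

def actionable_intelligence(insights, user_profile):
    # (3) Interpretation: turn insights into personalized advice
    # profiled on user needs.
    actions = []
    if insights["iot_temperature_c"]["rising"] and user_profile["outdoor_worker"]:
        actions.append("Schedule outdoor tasks for early morning.")
    return actions

data = collect_big_data()
insights = generate_insights(data)
actions = actionable_intelligence(insights, {"outdoor_worker": True})
print(actions)
```

The point of the sketch is the separation of stages: observational data enters only stage (1), while downstream services consume analytical insights, mirroring the expected shift toward insight-driven tools and services.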