Wind Energy Resources
Published in Radian Belu, Fundamentals and Source Characteristics of Renewable Energy Systems, 2019
After the field data on wind speed are collected and transferred to the computing environment, the next steps are to validate and process the data and to generate reports. Data validation is defined as the inspection of all collected data for completeness and reasonableness, and the elimination of erroneous values; it transforms raw data into validated data. The validated data are then processed to produce the summary reports required for analysis. This step is also crucial for maintaining high rates of data completeness during the course of the monitoring program, so data must be validated as soon as possible after they are transferred. The sooner the site operator is notified of a potential measurement problem, the lower the risk of data loss. Data can be validated either manually or with computer-based techniques; the latter is preferred because it exploits the power and speed of computers, although some manual review will always be required. Validation software may be purchased from some data-logger vendors, created in-house using popular spreadsheet programs (e.g., Microsoft Excel), or adapted from other utility environmental monitoring projects. An advantage of spreadsheet programs is that they can also be used to process data and generate reports. Data validation involves visual inspection of the data, interpolation of missing values, rejection of outliers and questionable readings, and saving the data in an appropriate file format for further processing.
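As an illustration only, a minimal validation routine along these lines might look like the following Python sketch; the column names, thresholds, and file names are hypothetical and not taken from the chapter.

```python
import numpy as np
import pandas as pd

def validate_wind_data(path: str, v_min: float = 0.0, v_max: float = 40.0,
                       max_gap: int = 3) -> pd.DataFrame:
    """Range-check, outlier-screen, and gap-fill raw wind-speed records."""
    df = pd.read_csv(path, parse_dates=["timestamp"]).set_index("timestamp")

    # Range test: flag physically unreasonable readings as missing.
    out_of_range = (df["wind_speed"] < v_min) | (df["wind_speed"] > v_max)
    df.loc[out_of_range, "wind_speed"] = np.nan

    # Outlier test: reject values far from a centred rolling median.
    rolling_median = df["wind_speed"].rolling(window=11, center=True,
                                              min_periods=1).median()
    outlier = (df["wind_speed"] - rolling_median).abs() > 10.0
    df.loc[outlier, "wind_speed"] = np.nan

    # Interpolate only short gaps (at most `max_gap` consecutive records);
    # longer gaps stay missing and are left for manual review.
    df["wind_speed"] = df["wind_speed"].interpolate(limit=max_gap)

    # Completeness figure for the summary report.
    completeness = 100 * df["wind_speed"].notna().mean()
    print(f"Data completeness after validation: {completeness:.1f}%")
    return df

# validated = validate_wind_data("site_tower_2019.csv")
# validated.to_csv("site_tower_2019_validated.csv")
```

The same checks (range test, comparison against a rolling median, limited gap filling) can equally be implemented as spreadsheet formulas if Excel is preferred.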
From Analysis to Communication
Published in Nathalie Henry Riche, Christophe Hurter, Nicholas Diakopoulos, Sheelagh Carpendale, Data-Driven Storytelling, 2018
Fanny Chevalier, Melanie Tory, Bongshin Lee, Jarke van Wijk, Giuseppe Santucci, Marian Dörk, Jessica Hullman
The top-down approach to tooling for exploration is based on end-user applications that each aim to support one (smaller or larger) task. Examples of such tools are those for cleaning and conversion (such as OpenRefine and Trifacta Wrangler), statistical analysis (like SAS), and interactive visual analysis (including Tableau and Spotfire). The lines are blurry here, as statistical analysis tools often provide visualizations and visualization tools provide statistics. To address users' needs, more and more functionality is added, either built in (leading to many options and feature creep) or by providing programming options. The most popular tool for data exploration is probably the spreadsheet (Microsoft Excel, Google Sheets), where this pattern is visible. Spreadsheets start from a simple metaphor but provide substantial additional functionality, such as options for business graphics and scripting; as a result, they can be employed for surprisingly complex tasks. But for many tasks and data types, spreadsheets fall short and more specialized tools are needed, for instance, to analyze and visualize networks.
Waste Generation
Published in Charles R. Rhyner, Leander J. Schwartz, Robert B. Wenger, Mary G. Kohrell, Waste Management and Resource Recovery, 2017
Charles R. Rhyner, Leander J. Schwartz, Robert B. Wenger, Mary G. Kohrell
A computer spreadsheet is extremely useful for doing calculations such as those described above. Construct a table, such as Table 2.14, with column 1 listing the components and column 2 the weight of each component. Enter the landfilled density of each component in column 3. The volumes of the components (column 4) are found by dividing the weight (column 2 entry) by the corresponding landfilled density (column 3 entry). The total volume is found by summing the values in column 4. The percentage of the total volume contributed by the volume of each component is calculated and displayed in column 5. It is instructive to add another column: the ratio of the volume percent to the weight percent. A ratio of 1.0, which is the ratio for paper, means that the material occupies the same proportion of a landfill by volume as it does by weight. Plastics, rubber and leather, textiles, and aluminum have ratios of 2.0 or more, indicating that these materials occupy more volume in a landfill than indicated by their percentage by weight. Yard trimmings, food, and glass have ratios less than 1.0 and occupy proportionately less space than their percentages by weight indicate.
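The same bookkeeping can be reproduced outside a spreadsheet. The short Python sketch below mirrors the column structure described above; the component names, weights, and densities are placeholder values rather than those of Table 2.14.

```python
# Placeholder data: name -> (weight in kg, landfilled density in kg/m^3).
components = {
    "Component A": (400.0, 400.0),
    "Component B": (100.0, 100.0),
    "Component C": (50.0, 1000.0),
}

total_weight = sum(w for w, _ in components.values())
volumes = {name: w / rho for name, (w, rho) in components.items()}  # column 4
total_volume = sum(volumes.values())

print(f"{'Component':<14}{'Vol (m^3)':>10}{'Vol %':>8}{'Wt %':>8}{'Ratio':>8}")
for name, (w, rho) in components.items():
    vol_pct = 100 * volumes[name] / total_volume   # column 5
    wt_pct = 100 * w / total_weight
    ratio = vol_pct / wt_pct                       # extra column: vol% / wt%
    print(f"{name:<14}{volumes[name]:>10.2f}{vol_pct:>8.1f}{wt_pct:>8.1f}{ratio:>8.2f}")
```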
Solving Last-Mile Deliveries for Dairy Products Using a Biased Randomization-Based Spreadsheet. A Case Study
Published in American Journal of Mathematical and Management Sciences, 2022
José G. Sanabria-Rey, Elyn L. Solano-Charris, Carlos Alberto Vega-Mejía, Carlos L. Quintero-Araújo
The proposed solution method was developed as a spreadsheet application using MS Excel. According to Erdoğan (2017), a spreadsheet-based solution has several advantages, such as interface familiarity, ease of use, flexibility, and accessibility. Furthermore, since MS Excel is widely known around the world, using it as the engine for the spreadsheet provides additional benefits, such as integration with software packages that offer built-in functionality to obtain data from, and send data to, MS Excel, and the possibility of customizing code in Visual Basic for Applications (Erdoğan, 2017). It is also worth mentioning that spreadsheet-based solutions tend to be low-cost and may yield significant savings for enterprises. Two examples related to routing problems are available in the studies by Erdoğan (2017), who developed a VRP spreadsheet solver tool for the healthcare and tourism sectors, and by Manikas et al. (2016), who proposed a spreadsheet-based system with a built-in Genetic Algorithm to plan deliveries for the Meals on Wheels Program in the US.
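The excerpt does not spell out the biased-randomization step itself. As a generic illustration of the idea as it commonly appears in this literature (not the authors' VBA implementation), a skewed probability distribution such as a geometric can be used to pick from a best-first ranked list of candidate moves; the candidate list and the beta parameter below are hypothetical.

```python
import math
import random

def biased_pick(candidates, beta=0.3):
    """Pick one element from a best-first sorted list, biased toward the top.

    Uses a truncated geometric distribution: larger beta behaves more greedily
    (the first candidate is chosen more often); smaller beta diversifies more.
    `candidates` is assumed to be sorted from most to least attractive
    (e.g., by savings value).
    """
    u = random.random() or 1e-12                      # avoid log(0)
    idx = int(math.log(u) / math.log(1.0 - beta))
    return candidates[min(idx, len(candidates) - 1)]

# Example: repeatedly drawing from a ranked list of hypothetical merge moves.
moves = ["merge A-B", "merge C-D", "merge E-F", "merge G-H"]
print([biased_pick(moves, beta=0.4) for _ in range(5)])
```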
Make saving crucial again: building energy efficiency awareness of people living in urban areas
Published in Advances in Building Energy Research, 2022
Abdullah Emre Keleş, Ecem Önen, Jarosław Górecki
Microsoft Excel is a well-known application for creating and editing spreadsheets and one of the core components of the Microsoft Office suite. The software allows users to design and edit spreadsheets using many built-in functions that enable automatic calculations.
A systematic video analysis of 21 anterior cruciate ligament injuries in elite netball players during games
Published in Sports Biomechanics, 2022
Suzanne Belcher, Chris Whatman, Matt Brughelli
The agreed critical features for analysis were determined by the group of experts following (1) a review of previous variables used in similar studies in netball and other sports (Krosshaug et al., 2007b; Montgomery et al., 2018; Olsen et al., 2004; Stuelcken et al., 2016; Waldén et al., 2015), (2) a review of performance analysis studies in netball (Davidson & Trewartha, 2008; Fox et al., 2013), and (3) consultation with high-performance NNZ staff. The critical features included information on the game and environmental external situation (e.g., contact or not with other players) and the player's movement during the 10-s index time interval (e.g., running on to a ball, landing). They also included the player's movement behaviour (e.g., head position or where they moved the ball). A full list of all critical features assessed is provided in Table 2. Visual inspection of knee joint flexion angles was in line with previous video analysis protocols (Cochrane et al., 2007; Krosshaug et al., 2007b; Montgomery et al., 2018) and determined through consensus by the expert panel. Joint flexion angles were used to indicate whether the knee was in a relatively extended position (≤ 30° flexion) or flexed (> 30° flexion) during the IF moment. These parameters for knee flexion were based on previous research indicating that <30° flexion increases injury risk, and on agreement amongst the expert observational group (Hewett et al., 2010; Koga et al., 2010; Krosshaug et al., 2007b). Apparent deceleration or braking force against the player's momentum could not be quantifiably measured through the video analysis but was determined by the consensus of the expert group, through identifying shaking of the limb, impact sound on court, and relevant body position when videos were slowed down frame by frame. All five members of the expert observation group viewed the videos individually and sent their results to a single reviewer to collate. If a consensus (defined as four of the five experts agreeing) was not reached for a particular critical feature, the video was re-analysed by the full expert group and discussed until a consensus was reached. Microsoft Excel spreadsheets (Microsoft Excel 2016, Microsoft, Redmond, Washington, USA) were used to store and analyse data.
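As a sketch of the collation step only (the study itself stored and analysed these data in Excel), the following Python fragment applies the 30° flexion threshold and the four-of-five consensus rule described above; the feature codes and example values are hypothetical.

```python
from collections import Counter

CONSENSUS_THRESHOLD = 4  # at least four of the five experts must agree

def knee_position(flexion_deg):
    """Classify the knee at the index frame by the 30-degree threshold."""
    return "extended" if flexion_deg <= 30 else "flexed"

def collate(expert_codes):
    """Return the consensus code, or flag the feature for group re-analysis."""
    code, votes = Counter(expert_codes).most_common(1)[0]
    return code if votes >= CONSENSUS_THRESHOLD else "re-analyse with full panel"

# Example: five experts coding one critical feature for one injury video.
print(collate(["contact", "contact", "contact", "contact", "no contact"]))
print(collate(["contact", "contact", "contact", "no contact", "no contact"]))
print(knee_position(25.0))  # -> "extended"
```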