Variability in Library Evolution
Published in Ivan Mistrik, Matthias Galster, Bruce R. Maxim, Software Engineering for Variability Intensive Systems, 2019
Hussein Alrubaye, Mohamed Wiem Mkaouer, Anthony Peruma
We notice in Figure 13.13 that the target libraries have relatively higher VD values than the source libraries. This holds for the majority of migrations, such as log4j to slf4j and junit to testng. For two migration rules, the VD values of the source and target libraries were similar. This is explained by the fact that the source and target libraries are historically identical; they are seen as different libraries only because they are identified by a different GroupID and ArtifactID, as discussed in Section 5.3. Another outlying rule was the replacement of jersey-server by pax-logging-api and pax-logging-service. We applied the Mann-Whitney U test, at a significance level of 0.05, to the migration rules we identified, and we could not find the difference between the two sets of values to be statistically significant. Thus, we plan to qualitatively analyze the detected migrations and filter out the false positives, since they negatively impact the statistical analysis by introducing identical values in both groups under comparison.
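For readers who want to reproduce this kind of significance check, the sketch below runs a Mann-Whitney U test on two sets of VD values using Apache Commons Math. The sample arrays are placeholders standing in for the chapter's actual measurements, which are not reproduced here.

```java
import org.apache.commons.math3.stat.inference.MannWhitneyUTest;

public class VdSignificanceCheck {
    public static void main(String[] args) {
        // Hypothetical VD values for the source and target libraries of the
        // detected migration rules (placeholder data, not the chapter's).
        double[] sourceVd = {0.12, 0.18, 0.25, 0.31, 0.22, 0.15};
        double[] targetVd = {0.20, 0.27, 0.33, 0.35, 0.29, 0.24};

        MannWhitneyUTest test = new MannWhitneyUTest();
        double pValue = test.mannWhitneyUTest(sourceVd, targetVd);

        // Compare against the 0.05 significance level used in the chapter.
        System.out.printf("p-value = %.4f -> %s%n", pValue,
                pValue < 0.05 ? "statistically significant"
                              : "not statistically significant");
    }
}
```

Because the VD distributions are not assumed to be normal, a non-parametric rank-based test such as Mann-Whitney is a natural fit for this comparison.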
Path forward
Published in Nikki Robinson, Mind the Tech Gap, 2023
Without an appreciation for each other's technical skills, and even gaps, teams will continue to be misaligned and lag behind other organizations. Similarly, without an understanding of the moving targets that security incidents present, IT and security teams will have difficulty adapting and reacting to malicious events. Consider, for example, the SolarWinds and Log4j incidents: information about those zero-day events changed daily and even hourly over many months. Security teams worked closely with operations and development teams to provide the most up-to-date and accurate remediation information. Although these were tense situations, if both groups worked in tandem, they could resolve the concerns quickly and efficiently.
Using a Notification, Recommendation and Monitoring System to Improve Interaction in an Automated Assessment Tool: An Analysis of Students’ Perceptions
Published in International Journal of Human–Computer Interaction, 2022
Joan Manuel Marquès, Laura Calvet, Marta Arguedas, Thanasis Daradoumis, Enric Mor
A major issue when assessing an assignment transparently on a distributed set of computers is that students do not know what occurs during the assessment. To address this, DSLab includes a logging service that stores the execution logs generated by each instance of the distributed algorithm under assessment. Like log4j (https://logging.apache.org/log4j/2.x/), this service has six log levels (in decreasing order of severity): FATAL, ERROR, WARN, INFO, DEBUG, and TRACE. Each distributed computer builds its own log from all entries generated on that computer, timestamps each entry, and sends the log to DSLab. Students can download the logs of each execution they have submitted (as a .zip containing one log per distributed computer) or consult them in the DSLab web interface, where they choose which computers' logs to visualize; the selected logs are merged and presented on a web page that shows, for each entry, the computer where it was generated, the timestamp, and the message.
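As a rough illustration of the merged view described above, the sketch below sorts per-computer log entries into a single timeline by timestamp while keeping the originating computer visible on each entry. The LogEntry record and its fields are hypothetical stand-ins; DSLab's internal representation is not documented in the excerpt.

```java
import java.time.Instant;
import java.util.ArrayList;
import java.util.Comparator;
import java.util.List;

// Hypothetical representation of one log entry: originating computer,
// timestamp, severity level, and message (assumed fields, not DSLab's API).
record LogEntry(String computer, Instant timestamp, String level, String message) {}

public class LogMerger {
    // Merge the per-computer logs into one timeline ordered by timestamp.
    static List<LogEntry> merge(List<List<LogEntry>> perComputerLogs) {
        List<LogEntry> merged = new ArrayList<>();
        perComputerLogs.forEach(merged::addAll);
        merged.sort(Comparator.comparing(LogEntry::timestamp));
        return merged;
    }

    public static void main(String[] args) {
        List<LogEntry> nodeA = List.of(
                new LogEntry("node-A", Instant.parse("2022-03-01T10:00:01Z"), "INFO", "election started"),
                new LogEntry("node-A", Instant.parse("2022-03-01T10:00:03Z"), "DEBUG", "vote sent"));
        List<LogEntry> nodeB = List.of(
                new LogEntry("node-B", Instant.parse("2022-03-01T10:00:02Z"), "WARN", "timeout, retrying"));

        // Print the merged timeline: timestamp, computer, level, message.
        for (LogEntry e : merge(List.of(nodeA, nodeB))) {
            System.out.printf("%s %s [%s] %s%n",
                    e.timestamp(), e.computer(), e.level(), e.message());
        }
    }
}
```

Sorting by timestamp only yields a best-effort interleaving, since clocks on independent computers are not perfectly synchronized; the excerpt does not say how DSLab handles this.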