ARTICLE | doi:10.20944/preprints202007.0537.v1
Online: 23 July 2020 (08:10:38 CEST)
Energy use is of crucial importance for the global challenge of climate change but also an essential part of daily life. Hence, research on energy needs to be robust and valid. Other scientific disciplines have experienced a reproducibility crisis, that is, existing findings could not be reproduced in new studies, and energy research might be impacted as well. In this paper, we suggest the ‘TReQ’ approach to improve the research practices in the energy field and arrive at greater Transparency, Reproducibility, and Quality. We acknowledge the specific challenges of energy research and suggest a highly adaptable suite of tools that can be applied to research approaches across this multi-disciplinary and fast-changing field. In particular, we introduce preregistration of studies, making data and code publicly available, using preprints, and employing reporting guidelines to heighten the standard of research practices within the energy field. We argue that through wider adoption of these tools, we will be able to have greater trust in the findings of research used to inform evidence-based policy and practice in the energy field.
ARTICLE | doi:10.20944/preprints201909.0122.v1
Subject: Medicine & Pharmacology, Other Keywords: open health; simple rules; ethics; reproducibility; research significance; open science
Online: 11 September 2019 (13:27:26 CEST)
We are witnessing a dramatic transformation in the way we do science. In recent years, significant flaws in existing scientific methods have come to light, including lack of transparency, insufficient involvement of stakeholders, disconnection from the public, and limited reproducibility of research findings. These concerns have sparked a global movement to revolutionize scientific practice and the emergence of Open Science. This new approach to science extends principles of openness to the entire research cycle, from hypothesis generation to data collection, analysis, replication, and translation from research to practice. Open Science seeks to remove all barriers to conducting high quality, rigorous, and impactful scientific research by ensuring that the data, methods, and opportunities for collaboration are open to all. Emerging digital technologies and "big data" (see "Ten simple rules for responsible big data research") have further accelerated the Open Science movement by affording new approaches to data sharing, connecting researcher networks, and facilitating the dissemination of research findings. Open scientific practices are also having a profound impact on the health sciences and medical research, and specifically on how we conduct clinical research with human participants. Human health research necessitates careful consideration of how to practice science in an ethical manner. There is also a particular urgency to human health research: since the goal is to help people, doing good science takes on a different meaning than simply doing science well. It also compels the scientist to reassess the conventional view of human health research as a pursuit conducted by scientists on human subjects, and places greater emphasis on inclusive and ethical practices to ensure that the research takes into account the interests of those who would be most impacted by it.
Openness in the context of human health research also raises greater concerns about privacy and security, and presents more opportunities for people, including participants of research studies, to contribute in every capacity. At the core of open health research, scientific discoveries are not only the product of collaboration across disciplines but must also be owned by a community that includes researchers, health workers, and patients and their families. To guide successful open health research practices, it is essential to carefully consider and delineate its guiding principles. This editorial is aimed at individuals participating in health science in any capacity, including but not limited to people living with medical conditions, health professionals, study participants, and researchers spanning all types of disciplines. We present ten simple rules that, while not comprehensive, offer guidance for conducting health research with human participants in an open, ethical, and rigorous manner. These rules can be difficult and resource-intensive to follow, and they can conflict with one another. They are aspirational and are intended to accelerate and improve the quality of human health research. Work that fails to follow these rules is not necessarily of poor quality, especially if the reasons for breaking the rules are considered and articulated (see rule 6: document everything). While most of the responsibility for following these rules falls on researchers, anyone involved in human health research in any capacity can apply them.
ARTICLE | doi:10.20944/preprints202011.0282.v1
Subject: Social Sciences, Accounting Keywords: Open Research Data; Open Peer Review; medicine; health sciences; Open Science; Open Access; health scientists; FAIR
Online: 9 November 2020 (16:02:24 CET)
During recent years, significant initiatives have been launched to disseminate Open Access as part of the Open Science movement. Nevertheless, the other major pillars of Open Science, such as Open Research Data (ORD) and Open Peer Review (OPR), are still at an early stage of development among communities of researchers and stakeholders. The present study sought to unveil the perceptions of a medical and health sciences community about these issues. Investigating researchers' attitudes can yield valuable conclusions, especially in medicine and the health sciences, where scientific publishing is growing explosively. A quantitative survey was conducted based on a structured questionnaire, with a 51.8% response rate (215 responses out of 415 electronic invitations). The participants agreed with the ORD principles; however, they were unfamiliar with basic terms such as FAIR (Findable, Accessible, Interoperable and Reusable) and appeared reluctant to permit the exploitation of their data. Regarding OPR, participants expressed their agreement, implying an interest in a trustworthy evaluation system. In conclusion, researchers need proper training in both ORD principles and OPR processes, which, combined with a reformed evaluation system, will enable them to take full advantage of the opportunities arising from the new scholarly publishing and communication landscape.
ARTICLE | doi:10.20944/preprints201806.0243.v1
Online: 15 June 2018 (05:19:00 CEST)
This paper explores whether preprints can better support open science by providing links to other early-stage research outputs. This has potential benefits for the transparency and discoverability of research projects. By looking at preprint submission systems and online preprints, and by surveying those who run preprint servers, I examined to what extent this is currently possible. No preprint server provided a complete service; however, many allowed several open science elements to be linked from the abstract page. I looked at variation based on the subject, age, and size of the preprint server. In conclusion, authors posting preprints should consider the options provided by different preprint servers. It appears that open science is just one focus of preprint servers, and further improvements will depend on preprint server policies and priorities rather than on overcoming any technical difficulties.
REVIEW | doi:10.20944/preprints201805.0418.v1
Subject: Mathematics & Computer Science, Other Keywords: big data training and learning; company and business requirements; ethics; impact; decision support; data engineering; open data; smart homes; smart cities; IoT
Online: 29 May 2018 (08:45:52 CEST)
In Data Science we are concerned with the integration of relevant sciences in observed and empirical contexts. This results in the unification of analytical methodologies and of observed and empirical data contexts. Given the dynamic nature of this convergence, we describe the origins and many evolutions of the Data Science theme. This article covers: the rapidly growing post-graduate university course provision for Data Science; a preliminary study of employability requirements; and how past eminent work in the social sciences and other areas, certainly mathematics, can be of immediate and direct relevance and benefit for innovative methodology, and for facing and addressing the ethical aspects of Big Data analytics relating to data aggregation and scale effects. Also associated with Data Science is how its direct and indirect outcomes and consequences include decision support and policy making, with both qualitative and quantitative outcomes. For these reasons, we note the importance of how Data Science builds collaboratively on other domains, potentially with innovative methodologies and practice. Later sections point towards some of the major current research issues.
REVIEW | doi:10.20944/preprints201905.0302.v1
Subject: Social Sciences, Economics Keywords: open science; open access; open data; economic impacts
Online: 27 May 2019 (11:19:59 CEST)
A common motivation for increasing open access to research findings and data is the potential to create economic benefits, but the evidence is patchy and diverse. This study systematically reviewed the evidence on what kinds of economic impacts (positive and negative) open science can have, how these come about, and how benefits could be maximized. Use of open science outputs often leaves no obvious trace, so most evidence of impacts is based on interviews, surveys, inference from existing costs, and modelling approaches. There is indicative evidence that open access to findings/data can lead to savings in access costs, labour costs, and transaction costs. There are examples of open science enabling new products, services, companies, research, and collaborations. Modelling studies suggest higher returns to R&D if open access permits greater accessibility and more efficient use of findings. Barriers include a lack of skills capacity in search, interpretation and text mining, and a lack of clarity around where benefits accrue. There are also contextual considerations around who benefits most from open science (e.g. sectors, small vs larger companies, types of dataset). Recommendations captured in the review include more research, monitoring and evaluation (including developing metrics), promoting benefits, capacity building, and making outputs more audience-friendly.
Subject: Social Sciences, Library & Information Science Keywords: bioeconomy; open science; open access
Online: 30 October 2020 (14:45:27 CET)
The purpose of this paper is to assess the degree of openness of scientific articles on bioeconomy. Based on a WoS corpus of 2,489 articles published between 2015 and 2019, we calculated bibliometric indicators, explored the openness of each paper, and assessed the share of journals, countries and research areas of these articles. The results show a sharp increase and diversification of articles in the field of bioeconomy, with an emerging long-tail distribution. 45.6% of the articles are freely available, and the share of OA papers is steadily increasing, from 31% in 2015 to 52% in 2019. Gold is the most important variant of OA. Open access is low in the applied research areas of chemical, agricultural and environmental engineering, but higher in the domains of energy and fuels, forestry, and green and sustainable science and technology. The UK and the Netherlands have the highest rates of OA papers, followed by Spain and Germany. The funding rate of OA papers is higher than that of non-OA papers. This is the first bibliometric study on open access to articles on bioeconomy. The results can be useful for the further development of OA editorial and funding criteria in the field of bioeconomy.
CASE REPORT | doi:10.20944/preprints201905.0166.v1
Subject: Social Sciences, Library & Information Science Keywords: Open Annotation; Monographs; Open Access; Higher Education; Open Peer Review
Online: 14 May 2019 (10:03:41 CEST)
The digital format opens up new possibilities for interaction with monographic publications. In particular, annotation tools make it possible to broaden the discussion of a book's content, to suggest new ideas, to report errors or inaccuracies, and to conduct open peer reviews. However, this requires support from users who might not yet be familiar with annotating digital documents. This paper gives concrete examples and recommendations for exploiting the potential of annotation in academic research and teaching. After presenting the annotation tool Hypothesis, the article focuses on its use in the context of HIRMEOS (High Integration of Research Monographs in the European Open Science Infrastructure), a project aimed at improving the Open Access digital monograph. The general approach and aims of a post-peer-review experiment with the annotation tool, as well as its use in didactic activities concerning monographic publications, are presented and proposed as potential best practices for similar annotation activities.
CONCEPT PAPER | doi:10.20944/preprints201702.0053.v1
Subject: Social Sciences, Finance Keywords: Systemic risk; Systemically Important Firms (SIFs); Stock Price; Stock Price (Close); Stock Price (Open); Stock Price: Bid; Stock Price: Ask; Stock Price: Spread; Joint Probability of Distress (JPoD); Banking Stability Index (BSI); Co-Risk Model
Online: 15 February 2017 (10:40:33 CET)
The estimation of systemic risk in India is still in its infancy. Several methods are available, but none is fully suited to forecasting systemic risk, since the factors responsible for the risk differ under different circumstances. In this paper, systemic risk in India is estimated based on the spread in daily stock market prices (the difference between the bid and ask price of a share) of the top 100 firms in India by market capitalization for the period July 2007 to March 2016. The results were compared with the Financial Stability Report published by the Reserve Bank of India for the period March 2010 to June 2016. The results clearly indicate that a relationship exists between market illiquidity, represented by the spread, and risks associated with the financial system. In most cases the Z score of the spread (deviation from the mean divided by the standard deviation) became negative, identifying spreads far from their mean, which is also a good indicator of market volatility and of risk to the financial system. The Systemic Risk Survey conducted by the Reserve Bank of India, which started in October 2011, also supports these results.
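The Z score described in this abstract is straightforward to compute; a minimal Python sketch follows, in which the bid/ask quotes are invented placeholders, not the study's data:

```python
# Z score of the bid-ask spread: (spread - mean(spread)) / std(spread).
from statistics import mean, pstdev

# Illustrative daily quotes (placeholders, not the study's data)
bid = [99.5, 101.0, 100.2, 98.8, 100.6]
ask = [100.0, 101.8, 100.9, 99.9, 101.1]

spreads = [a - b for a, b in zip(ask, bid)]      # daily bid-ask spreads
mu, sigma = mean(spreads), pstdev(spreads)

z_scores = [(s - mu) / sigma for s in spreads]   # deviation from mean / SD
below_mean_days = sum(1 for z in z_scores if z < 0)
print([round(z, 2) for z in z_scores], below_mean_days)
```

A negative Z score flags a day whose spread sits below the sample mean; the paper's indicator tracks how far and how often the spread deviates from its mean.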
EDITORIAL | doi:10.20944/preprints201605.0001.v1
Subject: Keywords: Preprints, Open Science
Online: 3 May 2016 (14:43:02 CEST)
Preprints is a multidisciplinary preprint platform that makes scientific manuscripts from all fields of research immediately available at www.preprints.org. Preprints is a free (not-for-profit) open access service supported by MDPI in Basel, Switzerland.
ARTICLE | doi:10.20944/preprints201905.0098.v2
Subject: Mathematics & Computer Science, Other Keywords: open review; open science; zero-blind review; peer review; methodology
Online: 16 August 2019 (05:27:55 CEST)
We present a discussion and analysis regarding the benefits and limitations of open and non-anonymized peer review based on literature results and responses to a survey on the reviewing process of alt.chi, a more or less open-review track within the CHI conference, the predominant conference in the field of human-computer interaction (HCI). This track currently is the only implementation of an open-peer-review process in the field of HCI while, with the recent increase in interest in open science practices, open review is now being considered and used in other fields. We collected 30 responses from alt.chi authors and reviewers and found that, while the benefits are quite clear and the system is generally well liked by alt.chi participants, they are reluctant to see it used in other venues. This concurs with a number of recent studies that suggest a divergence between support for a more open review process and its practical implementation. The data and scripts are available on https://osf.io/vuw7h/, and the figures and follow-up work on http://tiny.cc/OpenReviews.
REVIEW | doi:10.20944/preprints202004.0054.v1
Subject: Engineering, Industrial & Manufacturing Engineering Keywords: pandemic; influenza pandemic; open source; open hardware; COVID-19; COVID-19 pandemic; medical hardware; open source medicine
Online: 6 April 2020 (12:38:59 CEST)
Distributed digital manufacturing offers a solution to medical supply and technology shortages during pandemics. To prepare for the next pandemic, this study reviews the state of the art in open hardware designs needed in a COVID-19-like pandemic. It evaluates the readiness of the top twenty technologies requested by the Government of India. The results show that the majority of the actual medical products have seen some open source development; however, only 15% of the supporting technologies that make the open source devices possible are freely available. Considerable work is thus still needed to provide open source paths for the development of all the medical hardware needed during pandemics. Five core areas of future work are discussed: i) technical development of a wide range of open source solutions for all medical supplies and devices; ii) policies that protect the productivity of laboratories, makerspaces and fabrication facilities during a pandemic; iii) streamlining of the regulatory process; iv) development of Good-Samaritan laws to protect makers and designers of open medical hardware, and to compel those with knowledge that will save lives to share it; and v) a requirement that all citizen-funded research be released with free and open source licenses.
ARTICLE | doi:10.20944/preprints202004.0472.v1
Subject: Chemistry, Analytical Chemistry Keywords: 3-D printing; additive manufacturing; distributed manufacturing; laboratory equipment; open hardware; open source; open source hardware; scale; balance; mass
Online: 27 April 2020 (02:59:34 CEST)
This study provides designs for a low-cost, easily replicable open source lab-grade digital scale that can be used as a precision balance. The design can be manufactured for use in most labs throughout the world with open source RepRap-class material extrusion-based 3-D printers for the mechanical components and readily available open source electronics, including the Arduino Nano. Several versions of the design were fabricated and tested for precision and accuracy over a range of load cells. The open source scale was found to be repeatable within 0.1 g with multiple load cells, with even better precision (0.01 g) depending on load cell range and style. The scale tracks linearly with proprietary lab-grade scales, meeting the performance specified in the load cell data sheets and indicating that it is accurate across the range of the installed load cell. The smallest load cell tested (100 g) offers precision on the order of a commercial digital mass balance. The scale can be produced at significant cost savings compared to scales of comparable range and precision when serial capability is present; the savings increase significantly with the range of the scale. The designs are particularly well-suited for resource-constrained medical and scientific facilities.
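The repeatability figure quoted above amounts to checking that the spread of repeated readings of a known mass stays within a 0.1 g band; a minimal sketch, assuming invented readings rather than the study's measurements:

```python
# Repeatability check: the spread of repeated readings of the same load
# should stay within the claimed 0.1 g band. Readings are invented.
readings_g = [100.02, 100.05, 99.98, 100.01, 100.04]  # repeated 100 g loads

spread_g = max(readings_g) - min(readings_g)  # worst-case spread of readings
within_spec = spread_g <= 0.1                 # repeatable within 0.1 g?
print(round(spread_g, 2), within_spec)
```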
ARTICLE | doi:10.20944/preprints202202.0088.v1
Subject: Medicine & Pharmacology, Sport Sciences & Therapy Keywords: Public open spaces; Open streets; Built environment; Leisure-time physical activity; Epidemiology
Online: 7 February 2022 (13:02:09 CET)
Leisure-time physical activity (LTPA) is associated with access to and use of public open spaces. The "President João Goulart Elevated Avenue", currently known as "Minhocão", is a facility for leisure activities that is open to people during nights and weekends. The aim of this study was to examine whether the prevalence of LTPA among individuals living in the surroundings of Minhocão differs according to proximity to, and use of, the facility. We conducted a cross-sectional study with cluster sampling of people aged ≥18 years who lived in households within 500 m, or between 501 m and 1500 m, of Minhocão. The survey was conducted between December 2017 and March 2019 using a self-administered electronic questionnaire. We conducted bivariate analyses and Poisson regression to examine possible differences in LTPA according to proximity of residence and use of Minhocão. The analysis used post-stratification weights. A total of 12,030 telephone numbers were drawn (≤500 m = 6,942; >500 m to ≤1500 m = 5,088). The final sample comprised 235 residents who returned the questionnaire. The prevalence of individuals engaging in at least 150 minutes per week of LTPA was higher among users than non-users (Prevalence Ratio = 2.23, 95%CI 1.72-2.90). People who used the park had a higher prevalence of all types of LTPA than non-users. The results can inform government decision-making on the future of Minhocão.
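The prevalence ratio reported above is, in its crude form, the proportion of users meeting the 150-minute guideline divided by the same proportion among non-users; a minimal sketch with invented counts (the study itself estimated the ratio via Poisson regression with post-stratification weights, not this simple calculation):

```python
# Crude (unadjusted) prevalence ratio: P(active | users) / P(active | non-users).
# Counts are invented for illustration only.
users_active, users_total = 60, 100        # users meeting >=150 min/week of LTPA
nonusers_active, nonusers_total = 27, 100  # non-users meeting the guideline

pr = (users_active / users_total) / (nonusers_active / nonusers_total)
print(round(pr, 2))  # crude prevalence ratio
```

Poisson regression with a robust variance is the standard way to adjust such a ratio for covariates and sampling weights, which is why the crude calculation above will generally differ from a reported adjusted estimate.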
ARTICLE | doi:10.20944/preprints202001.0240.v1
Subject: Social Sciences, Library & Information Science Keywords: openness under neoliberalism; open-access licensing in capitalism; the politics of open-licensing
Online: 21 January 2020 (11:00:41 CET)
The terms 'open' and 'openness' are widely used across the current higher education environment particularly in the areas of repository services and scholarly communications. Open-access licensing and open-source licensing are two prevalent manifestations of open culture within higher education research environments. As theoretical ideals, open-licensing models aim at openness and academic freedom. But operating as they do within the context of global neoliberalism, to what extent are these models constructed by, sustained by, and co-opted by neoliberalism? In this paper, we interrogate the use of open-licensing within scholarly communications and within the larger societal context of neoliberalism. Through synthesis of various sources, we will examine how open access licensing models have been constrained by neoliberal or otherwise corporate agendas, how open access and open scholarship have been reframed within discourses of compliance, how open-source software models and software are co-opted by politico-economic forces, and how the language of 'openness' is widely misused in higher education and repository services circles to drive agendas that run counter to actually increasing openness. We will finish by suggesting ways to resist this trend and use open-licensing models to resist neoliberal agendas in open scholarship.
ARTICLE | doi:10.20944/preprints201708.0069.v1
Subject: Mathematics & Computer Science, Other Keywords: energy system analysis; model challenges; open science; open source; energy modelling framework; oemof
Online: 21 August 2017 (03:02:34 CEST)
The research field of energy system analysis is dealing with increasingly complex energy systems and their respective challenges. Moreover, the requirement for open science has become a focal point of public interest. Both drivers have triggered the development of a broad range of (open) energy models and frameworks in recent years. However, there are hardly any approaches for evaluating these tools in terms of their capabilities to tackle energy system modelling challenges. This paper describes a first step towards a flexible evaluation of software for modelling energy systems. We propose a qualitative approach as a useful supplement to existing model fact sheets and transparency checklists. We demonstrate its applicability by evaluating the newly developed “Open Energy Modelling Framework” with respect to existing challenges in energy system modelling. The case study results highlight that challenges related to complexity and scientific standards can be tackled to a large extent, while the challenges of model utilization and interdisciplinary modelling are only partially addressed. The challenge of uncertainty remains for the most part unaddressed at present. Advantages of the evaluation approach lie in its simplicity, flexibility, and transferability to other tools; disadvantages mostly stem from its qualitative nature. Our analysis reveals that some challenges in the field of energy system modelling, such as model result communication and interdisciplinary modelling, cannot be addressed by software because they exist at a meta level.
ARTICLE | doi:10.20944/preprints202112.0099.v1
Online: 7 December 2021 (11:30:56 CET)
Forest recreation can be used successfully for psychological relaxation and as a remedy for common stress-related problems. A special form of forest recreation intended for restoration is forest bathing. These activities can be disturbed by factors that interrupt psychological relaxation, such as seeing buildings in the forest or using a computer in nature. One such factor might be encountering an open dump in the forest during an outdoor experience. To test the hypothesis that an open dump might decrease psychological relaxation, a case study was planned using a randomized, controlled crossover design. Two groups of healthy young adults viewed a control forest or a forest with an open dump, in reverse order, and filled in psychological questionnaires after each stimulus. A pretest was used. Participants wore opaque eye patches to block visual stimulation before the experimental stimulation, and the physical environment was monitored. The results were analyzed using two-way repeated-measures ANOVA. The measured negative psychological indicators significantly increased after viewing the forest with waste: five indicators of the Profile of Mood States increased (Tension-Anxiety, Depression-Dejection, Anger-Hostility, Fatigue, and Confusion), and the negative aspect of the Positive and Negative Affect Schedule increased in comparison to the control and pretest. The measured positive indicators significantly decreased after viewing the forest with waste: the positive aspect of the Positive and Negative Affect Schedule decreased, and the Restorative Outcome Scale and Subjective Vitality scores decreased in comparison to the control and pretest. The occurrence of an open dump in the forest might thus interrupt a normal restorative experience by reducing psychological relaxation.
Nevertheless, the mechanism behind these effects is not known and will be investigated further. In addition, a future study should investigate the size of the impact of open dumps on normal everyday experiences. Different mechanisms might be responsible for these reactions; however, the aim of this manuscript is only to measure the reaction. The psychological reasons behind these mechanisms can be assessed in further studies.
ARTICLE | doi:10.20944/preprints201901.0238.v1
Online: 23 January 2019 (10:15:00 CET)
In December 2012, DOAJ's parent company, IS4OA, announced that it would introduce new criteria for inclusion in DOAJ, that DOAJ would collect vastly more information from journals as part of the accreditation process, and that journals already included would need to reapply in order to remain in the registry. My hypothesis was that the journals removed from DOAJ on May 9th 2016 would chiefly be journals from small publishers (mostly single-journal publishers) and that DOAJ journal metadata would reveal them to be journals with a lower level of publishing competence than those remaining in the DOAJ. Indicators of publishing competence include the use of APCs, permanent article identifiers, journal licenses, article-level metadata deposited with DOAJ, archiving policies/solutions, and/or having a policy in SHERPA/RoMEO. The analysis shows my concerns to be correct.
CONCEPT PAPER | doi:10.20944/preprints201707.0095.v1
Online: 31 July 2017 (16:05:17 CEST)
Chemistry is the last natural science discipline to embrace prepublishing, namely the publication of non-peer-reviewed scientific articles on the internet. After a brief look at the origins and purpose of prepublishing in science, we analyze the current situation, aiming to answer several questions. Why has the chemistry community been late in embracing prepublishing? Is this related to the slow acceptance of open access publishing by the same community? Will prepublishing become a common habit for chemistry scholars too?
ARTICLE | doi:10.20944/preprints202005.0479.v1
Subject: Engineering, Mechanical Engineering Keywords: open source; open hardware; COVID-19; medical hardware; RepRap; 3-D printing; open source medical hardware; high temperature 3-D printing; additive manufacturing; ULTEM; polycarbonate
Online: 31 May 2020 (16:18:20 CEST)
Thermal sterilization is generally avoided for 3-D printed components because of the relatively low deformation temperatures of common thermoplastics used for material extrusion-based additive manufacturing. Printing the materials required for high-temperature heat-sterilizable components for COVID-19 and other applications demands 3-D printers with heated beds, hot ends that can reach higher temperatures than polytetrafluoroethylene (PTFE) hot ends, and heated chambers to avoid part warping and delamination. Several high-temperature printers are on the market, but their high costs make them inaccessible for the fully home-based distributed manufacturing required during pandemic lockdowns. To meet all these requirements for under $1,000, the Cerberus, an open source three-headed self-replicating rapid prototyper (RepRap), was designed and tested with the following capabilities: i) a 200 °C-capable heated bed, ii) a 500 °C-capable hot end, iii) an isolated heated chamber with a 1 kW space heater core, and iv) mains-voltage chamber and bed heating for rapid start-up. The Cerberus successfully prints polyetherketoneketone (PEKK) and polyetherimide (PEI, ULTEM) with tensile strengths of 77.5 and 80.5 MPa, respectively. As a case study, open source face masks were 3-D printed in PEKK and shown not to warp during widely home-accessible oven-based sterilization.
ARTICLE | doi:10.20944/preprints202002.0165.v1
Subject: Social Sciences, Library & Information Science Keywords: open access; api; self-archiving; automation
Online: 13 February 2020 (10:34:30 CET)
This paper describes the design and development of an interoperable application that supports green open access with long-term sustainability and an improved user experience for article deposit. Introduction: The lack of library resources and unfriendly repository user interfaces are two significant barriers that hinder green open access. Tasked with implementing the open access mandate, librarians at an American research university developed a comprehensive system called Easy Deposit 2 to automate the support workflow of green open access. Implementation: Easy Deposit 2 is a web application that can harvest newly published articles, reach out for manuscripts on behalf of the library, and facilitate self-archiving to the institutional repository (IR). It is developed and maintained by the library and integrated with the IR. Results and Discussion: The article deposit rate with Easy Deposit 2 is about 25%, a significant increase over the previous period. The system also serves as a local database of faculty publications with open access status. A lesson learned is that a library cannot rely on a single commercial provider for publication data, owing to mismatched priorities. Conclusion: Recent IT developments provide new opportunities for innovations like Easy Deposit 2 in supporting open access. Academic librarians are vital in promoting "openness" in scholarly communication, such as transparency and diversity in the sharing of publication data.
ARTICLE | doi:10.20944/preprints201905.0029.v1
Subject: Mathematics & Computer Science, Information Technology & Data Management Keywords: multilingual; open information extraction; parallel corpus
Online: 6 May 2019 (06:14:07 CEST)
The number of documents published on the Web in languages other than English grows every year. As a consequence, the need to extract useful information from different languages increases, underlining the importance of research on Open Information Extraction (OIE) techniques. Most OIE methods deal with features from a single language; few approaches tackle multilingual aspects, and those that do treat multilinguality only as an extraction method, which results in low precision due to the use of general rules. Multilingual methods have been applied to a vast range of problems in Natural Language Processing with satisfactory results, demonstrating that knowledge acquired for one language can be transferred to other languages to improve the quality of the extracted facts. We argue that a multilingual approach can enhance OIE methods and is ideal for evaluating and comparing OIE systems and, as a consequence, the facts they collect. In this work, we discuss how transferring knowledge between languages can improve acquisition in multilingual approaches. We provide a roadmap of the Multilingual Open IE area with respect to state-of-the-art studies. Additionally, we evaluate the transfer of knowledge to improve the quality of the facts extracted in each language. Moreover, we discuss the importance of a parallel corpus for evaluating and comparing multilingual systems.
ARTICLE | doi:10.20944/preprints201809.0017.v1
Online: 3 September 2018 (09:39:24 CEST)
Universities, like cities, have embraced novel technologies and data-based solutions to improve their campuses, with ‘smart’ becoming a welcomed concept. Campuses are in many ways small-scale cities: they increasingly seek to address similar challenges and to deliver improved experiences to their users. How can data be used in making this vision a reality? What can we learn from smart campuses that can be scaled up to smart cities? A short research study was conducted over a three-month period at a public university in the United Kingdom, employing stakeholder interviews and user surveys to gain insight into these questions. Based on the study, the authors suggest that making data publicly available could bring many benefits to different groups of stakeholders and campus users. These benefits come with risks and challenges, such as data privacy and protection and infrastructure hurdles. However, if these challenges can be overcome, open data could contribute significantly to improving campuses and user experiences, and potentially set an example for smart cities.
CASE REPORT | doi:10.20944/preprints201801.0066.v1
Online: 8 January 2018 (11:11:47 CET)
The implementation of the European Cohesion Policy, which aims at fostering regions' competitiveness, economic growth, and the creation of new jobs, is documented for the period 2014–2020 in the publicly available Open Data Portal for the European Structural and Investment Funds. On the basis of this source, this paper describes the process of data mining and visualization for producing information on the performance of regional programmes in achieving effective expenditure of resources.
ARTICLE | doi:10.20944/preprints201904.0207.v1
Subject: Engineering, Biomedical & Chemical Engineering Keywords: 3-D printing; additive manufacturing; biomedical equipment; biomedical engineering; centrifuge; design; distributed manufacturing; laboratory equipment; open hardware; open source; open source hardware; medical equipment; medical instrumentation; scientific instrumentation
Online: 18 April 2019 (08:03:58 CEST)
Centrifuges are commonly required devices in medical diagnostics facilities as well as scientific laboratories. Although there are commercial and open source centrifuges, the costs of the former and the electricity required to operate the latter limit accessibility in resource-constrained settings. There is a need for a low-cost, human-powered, verified, and reliable lab-scale centrifuge. This study provides the designs for a low-cost, 100% 3-D printed centrifuge, which can be fabricated on any low-cost RepRap-class fused filament fabrication (FFF) or fused particle fabrication (FPF)-based 3-D printer. In addition, validation procedures are provided using a web camera and free and open source software. This paper provides the complete open source plans, including instructions for fabrication and operation, for a hand-powered centrifuge. This study successfully tested and validated the instrument, which can be operated anywhere in the world with no electricity inputs, obtaining a rotational velocity of over 1,750 rpm and a relative centrifugal force of over 50 N. Using commercial filament, the instrument costs about US$25, which is less than half the cost of all commercially available systems; the costs can be dropped further by using recycled plastics on open source systems, for over 99% savings. The results are discussed in the context of resource-constrained medical and scientific facilities.
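The reported spin rate can be related to relative centrifugal force, conventionally expressed in multiples of g, via the standard formula RCF = 1.118 × 10⁻⁵ · r · rpm² (r in cm). A minimal sketch of the conversion; the 10 cm rotor radius is an assumed illustrative value, not a figure from the paper:

```python
def rcf(rpm: float, radius_cm: float) -> float:
    """Relative centrifugal force, in multiples of g, from spin speed
    (rpm) and rotor radius (cm), using RCF = 1.118e-5 * r * rpm^2."""
    return 1.118e-5 * radius_cm * rpm ** 2

# A hypothetical 10 cm rotor radius at the reported >1,750 rpm:
print(round(rcf(1750, 10), 1))  # → 342.4
```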
REVIEW | doi:10.20944/preprints202110.0390.v1
Subject: Life Sciences, Biotechnology Keywords: Biological contaminants; grazers; microalgae; open cultivation; biopesticides
Online: 26 October 2021 (14:36:19 CEST)
Microalgal biomass is an emerging raw material for the production of food, fuel, and other value-added products. However, bulk production of microalgal biomass at a commercial level remains a herculean task for current mass-production technologies because of undesirable contamination by biological pollutants. These contaminants hamper the production of microalgal biomass by debilitating the growth of cultures and degrading the quality of the biomass, and may sometimes crash the whole culture. The best utilization of microalgal biomass at an industrial level can be attained by avoiding the various possible biological contaminations in mass cultivation systems and by understanding the contamination mechanisms and the complex interactions of algae with other microorganisms. This review explores the various types of biological pollutants, their possible modes of infection and mechanisms, and the different control methods for maintaining the desired microalgae culture.
ARTICLE | doi:10.20944/preprints202106.0648.v1
Subject: Social Sciences, Accounting Keywords: open access; article processing charges; monitoring systems
Online: 28 June 2021 (12:33:05 CEST)
The Open Access (OA) publishing model based on article processing charges (APC) is often associated with the potential for more transparency regarding expenditures for publications. However, the extent to which transparency can be achieved depends not least on the completeness of data in APC monitoring systems. This article investigates two blind spots of the largest collection of APC payment information, OpenAPC. It aims to identify likely APC-liable publications for German universities that contribute to this system and for those that do not provide data to it. The calculation combines data from the Web of Science, the ISSN-Gold-OA list, and OpenAPC. The results show that, for the group of universities contributing to the monitoring system, more than half of the APC payments are not covered by it, and the average payment for non-covered APCs is higher than for APCs covered by the system. In addition, the group of universities that do not contribute to OpenAPC accounts for two thirds of the number of APC-liable publications recorded for contributing universities. Given the size of these blind spots, the value of the monitoring system is limited at present.
ARTICLE | doi:10.20944/preprints202101.0082.v2
Subject: Earth Sciences, Atmospheric Science Keywords: Shoreline Evolution; Open-Source Software; GIS; Modeling
Online: 19 February 2021 (09:46:48 CET)
This paper presents the validation of the End Point Rate (EPR) tool for QGIS (EPR4Q), a tool built in the QGIS Graphical Modeler to calculate shoreline change by the End Point Rate method. The EPR4Q tries to fill the gap left by the absence of a user-friendly, free, and open-source tool for shoreline analysis in a Geographic Information System environment: the most widely used software, the Digital Shoreline Analysis System (DSAS), although a free extension, is suited to commercial software, while the best free and open-source option to calculate EPR, Analyzing Moving Boundaries Using R (AMBUR), is a robust and powerful tool whose complexity and heavy processing can restrict accessibility and simple usage. The validation methodology consists of applying EPR4Q, DSAS, and AMBUR to different examples of shorelines found in nature, extracted from a U.S. Geological Survey Open-File report. The results obtained with each tool were compared using the Pearson correlation coefficient. The validation results indicate that the EPR4Q tool achieves high correlation values with DSAS and AMBUR, reaching coefficients of 0.98 to 1.00 on linear, extensive, and non-extensive shorelines, guaranteeing that the EPR4Q tool is ready to be freely used by the academic, scientific, engineering, and coastal-management communities worldwide.
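The End Point Rate itself is simply the net shoreline movement between the oldest and youngest shorelines divided by the time elapsed between them. A minimal sketch of the calculation; the transect dates and displacement are hypothetical, not taken from the study:

```python
from datetime import date

def end_point_rate(d_old: date, d_new: date, displacement_m: float) -> float:
    """End Point Rate (m/yr): net shoreline movement divided by the
    number of years between the oldest and youngest shoreline surveys."""
    years = (d_new - d_old).days / 365.25
    return displacement_m / years

# Hypothetical transect: 30 m of retreat (negative movement) over 20 years.
rate = end_point_rate(date(2000, 1, 1), date(2020, 1, 1), -30.0)
print(round(rate, 2))  # → -1.5
```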
ARTICLE | doi:10.20944/preprints202003.0443.v2
Subject: Social Sciences, Library & Information Science Keywords: COVID-19; open science; data; bibliometric; pandemic
Online: 22 April 2020 (06:15:34 CEST)
Introduction: The COVID-19 pandemic, caused by SARS-CoV-2, motivated the scientific community to work together to gather, organize, process, and distribute data on the novel biomedical hazard. Here, we analyzed how the scientific community responded to this challenge by quantifying the distribution and availability patterns of academic information related to COVID-19. The aim of our study was to assess the quality of the information flow and scientific collaboration, two factors we believe to be critical for finding new solutions for the ongoing pandemic. Materials and methods: The RISmed R package and a custom Python script were used to fetch metadata on articles indexed in PubMed and published on the Rxiv preprint server. Scopus was searched manually and the metadata were exported as a BibTeX file. Publication rate and publication status, affiliation and author count per article, and submission-to-publication time were analyzed in R. The Biblioshiny application was used to create a world collaboration map. Results: Our preliminary data suggest that the COVID-19 pandemic resulted in the generation of a large amount of scientific data and demonstrate potential problems regarding information velocity, availability, and scientific collaboration in the early stages of the pandemic. More specifically, our results indicate a precarious overload of the standard publication systems, significant problems with data availability, and apparently deficient collaboration. Conclusion: We believe the scientific community could have used the data more efficiently in order to create proper foundations for finding new solutions for the COVID-19 pandemic. Moreover, we believe we can learn from this on the go and adopt open science principles and a more mindful approach to COVID-19-related data to accelerate the discovery of more efficient solutions.
We take this opportunity to invite our colleagues to contribute to this global scientific collaboration by publishing their findings with maximal transparency.
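One of the metrics the study analyzes, submission-to-publication time, reduces to a date difference once article metadata has been fetched. A minimal sketch with a hypothetical date pair, not data from the study:

```python
from datetime import date

def submission_to_publication_days(received: date, published: date) -> int:
    """Days elapsed between a manuscript's submission date and its
    publication date, as recorded in article metadata."""
    return (published - received).days

# Hypothetical PubMed-style date pair for one COVID-19 article:
lag = submission_to_publication_days(date(2020, 2, 10), date(2020, 3, 1))
print(lag)  # → 20
```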
ARTICLE | doi:10.20944/preprints201912.0312.v1
Subject: Engineering, Civil Engineering Keywords: thermal comfort; draught; cooling period; open office
Online: 24 December 2019 (08:42:03 CET)
Local thermal comfort (TC) and draught rate (DR) have been studied widely, though more meaningful research has been performed under controlled boundary conditions than in actual work environments with occupants. TC conditions in office buildings in Estonia have barely been investigated in the past. In this paper, the results of TC and DR assessments in five office buildings in Tallinn are presented and discussed. The studied office landscapes vary in heating, ventilation and cooling (HVAC) system parameters, room units, and elements. All sample buildings were less than six years old and equipped with a dedicated outdoor air ventilation system and room conditioning units. The on-site measurements consisted of TC and DR assessment together with an indoor climate questionnaire (ICQ). The purpose of the survey is to assess the correspondence between the HVAC design and the actual situation. The results show whether, and to what extent, the standard-based criteria for TC suit the actual usage by occupants. Preferring one room conditioning unit type or system may not guarantee a better thermal environment without draught. Although some HVAC systems observed in this study should create the prerequisites for ensuring more comfort, the results show that this is not the case for all buildings in this study.
ARTICLE | doi:10.20944/preprints201808.0326.v2
Subject: Social Sciences, Library & Information Science Keywords: business plan; publishing; academic libraries; open access
Online: 27 September 2018 (04:27:08 CEST)
Over the last twenty years, library publishing has emerged in higher education as a new class of publisher. Conceived as a response to commercial publishing practices that have strained library budgets and prevented scholars from openly licensing and sharing their works, library publishing is both a local service program and a broader movement to disrupt the current scholarly publishing arena. It is growing both in numbers of publishers and numbers of works produced. The commercial publishing framework which determines the viability of monetizing a product is not necessarily applicable for library publishers who exist as a common good to address the needs of their academic communities. Like any business venture, however, library publishers must develop a clear service model and business plan in order to create shared expectations for funding streams, quality markers, as well as technical and staff capacity. As the field is maturing from experimental projects to full programs, library publishers are formalizing their offerings and limitations. The anatomy of a library publishing business plan is presented and includes the principles of the program, scope of services, and staffing and governance requirements. Other aspects include production policies, financial structures, and measures of success.
ARTICLE | doi:10.20944/preprints201808.0492.v1
Subject: Social Sciences, Library & Information Science Keywords: bibliometrics; publication statistics; open access; citation impact
Online: 29 August 2018 (10:32:21 CEST)
Based on the total scholarly article output of Norway, we investigated the coverage and degree of openness according to three bibliographic services: 1) Google Scholar, 2) oaDOI by Impactstory, and 3) 1findr by 1science. According to Google Scholar, we find that more than 70% of all Norwegian articles are openly available. However, the degrees are profoundly lower according to oaDOI and 1findr, at 31% and 52% respectively. The varying degrees are mainly caused by different interpretations of openness, with oaDOI being the most restrictive. Furthermore, open shares vary considerably by discipline, with Medicine and Health Sciences at the upper end and the Humanities at the lower end. We also determined citation frequencies using Cited-by values from Google Scholar, applying year and subject normalization. We find a significant citation advantage for open articles. However, this is not the case for all types of openness. In fact, the category of open access journals was by far the lowest cited, indicating that young journals with a declared open access policy still lack recognition.
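Year and subject normalization of Cited-by values amounts to dividing an article's citation count by the mean count of its year/subject cohort, so that scores are comparable across fields and publication years. A minimal sketch with hypothetical citation counts:

```python
from statistics import mean

def normalized_citation_score(cites: int, cohort_cites: list[int]) -> float:
    """Field- and year-normalized citation score: the article's citation
    count divided by the mean count of all articles published in the
    same year and subject category (its cohort)."""
    return cites / mean(cohort_cites)

# Hypothetical cohort averaging 6 citations; an article with 12 scores 2.0:
print(normalized_citation_score(12, [2, 4, 6, 8, 10]))  # → 2.0
```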
ARTICLE | doi:10.20944/preprints201803.0067.v1
Subject: Engineering, Industrial & Manufacturing Engineering Keywords: open microcontrolled platform; data acquisition; remote measurement
Online: 8 March 2018 (15:21:13 CET)
Commercial equipment for temperature measurement has a high cost. This article therefore describes the development of low-cost temperature measurement equipment based on a microcontrolled platform, which manages the data from the collected temperature signals and makes the acquired information available for verification in real time, either at the measurement site or remotely. The equipment was built using open platform hardware and software, and performance tests were carried out with the objective of developing temperature measurement equipment that combines measurement quality with low cost.
ARTICLE | doi:10.20944/preprints201705.0045.v1
Online: 5 May 2017 (05:29:10 CEST)
Dump design and scheduling are critical elements of effective mine planning, especially when several dumps are required in large-scale open pit mines. Infrastructure capital and transportation costs are considerable from an early stage of a mining project and throughout the life-of-mine, as these dumps gradually become immense structures. Delivered mining rates, together with certain spatial and physical constraints, provide a set of parameters whose mathematical and economic relationships create opportunities for modelling, facilitating the measurement and optimization of the ultimate dump design using programming and empirical techniques while achieving economic objectives. This paper presents a methodology to model and optimize the design of a mine dump by minimizing the total haulage costs. The proposed methodology consists of: (i) formulating a dump model based on a system of equations relying on multiple relevant parameters; (ii) solving it by minimizing the total cost using linear programming to determine a ‘preliminary’ dump design; (iii) through a series of iterations, modifying the ‘preliminary’ footprint by projecting it onto the topography to create the ultimate dump design. Finally, an example application for a waste rock dump illustrates this methodology.
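The cost-minimization step can be illustrated with a toy single-source allocation: waste tonnage is assigned to dumps, cheapest unit haul cost first, within each dump's capacity (for this simple capacity-constrained formulation the greedy assignment matches the linear-programming optimum). The dump names, capacities, and unit costs below are hypothetical, not values from the paper:

```python
def min_haulage_cost(tonnes: float, dumps: list[tuple[str, float, float]]):
    """Allocate tonnage to dumps (name, capacity in t, haul cost in $/t),
    filling the cheapest-haul dump first, and return the plan and total cost."""
    plan, total = {}, 0.0
    for name, capacity, cost in sorted(dumps, key=lambda d: d[2]):
        take = min(tonnes, capacity)
        if take > 0:
            plan[name] = take
            total += take * cost
            tonnes -= take
    if tonnes > 1e-9:
        raise ValueError("insufficient dump capacity")
    return plan, total

# Hypothetical example: 150 kt of waste, a near dump ($1.2/t, 100 kt cap)
# and a far dump ($2.0/t, 200 kt cap):
plan, cost = min_haulage_cost(150.0, [("near", 100.0, 1.2), ("far", 200.0, 2.0)])
print(plan, cost)  # → {'near': 100.0, 'far': 50.0} 220.0
```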
ARTICLE | doi:10.20944/preprints201906.0294.v1
Subject: Mathematics & Computer Science, Geometry & Topology Keywords: is*-open set; is*-continuous; is*-open; is*-irresolute; is*-totally continuous; is-contra-continuous mappings; is*-separation
Online: 28 June 2019 (11:45:03 CEST)
In this paper, we introduce a new class of open sets called is*-open sets. We also present the notions of is*-continuous, is*-open, is*-irresolute, is*-totally continuous, and is-contra-continuous mappings, and investigate some properties of these mappings. Furthermore, we introduce some is*-separation axioms and relate is*-mappings to the is*-separation axioms.
REVIEW | doi:10.20944/preprints202203.0217.v1
Subject: Engineering, Energy & Fuel Technology Keywords: energy policy; energy conservation; climate change; global safety; open hardware; open source; photovoltaic; renewable energy; solar energy; national security
Online: 15 March 2022 (14:27:35 CET)
Free and open source hardware (FOSH) development has been shown to increase innovation and reduce economic costs. This article reviews the opportunity to use FOSH as a sanction to undercut imports and exports from a target criminal country. A formal methodology is presented for selecting strategic national investments in FOSH development to improve both national security and global safety. In this methodology, the target country that is threatening national security or safety is identified first. Next, the top imports from the target country, as well as those of potentially other importing countries (allies), are quantified. Hardware that could undercut imports/exports from the target country is then identified. Finally, methods to support FOSH development are enumerated to support production in a commons-based peer production strategy. To demonstrate how this theoretical method works in practice, it is applied as a case study to the current criminal military aggressor nation, which is also a fossil fuel exporter. The results show that there are numerous existing FOSH, and opportunities to develop new FOSH, for energy conservation and renewable energy to reduce fossil fuel energy demand. Widespread deployment would reduce the concomitant pollution, human health impacts, and environmental desecration, as well as cut the financing of military operations.
Subject: Engineering, Other Keywords: drying; materials processing; vacuum oven; small-scale; lab equipment; air-powered; open hardware; open source; digital manufacturing; dehydration
Online: 22 April 2021 (09:16:02 CEST)
Vacuum drying can dehydrate materials further than dry heat methods while protecting sensitive materials from thermal degradation. Many industries have shifted to vacuum drying as a cost- or time-saving measure. Small-scale vacuum drying, however, has been limited by the high costs of specialty scientific tools. To make vacuum drying more accessible, this study provides design and performance information for a small-scale open source vacuum oven, which can be fabricated from off-the-shelf and 3-D printed components. The oven is tested for drying speed and effectiveness on both waste plastic polyethylene terephthalate (PET) and a consortium of bacteria developed for bioprocessing of terephthalate wastes, to assist in distributed recycling of PET for both additive manufacturing and potential food. Both materials can be damaged when exposed to high temperatures, making vacuum drying a desirable solution. The results showed the open source vacuum oven was effective at drying both plastic and biomaterials, drying at a higher rate than a hot-air dryer for small samples or low volumes of water. The system can be constructed for less than 20% of commercial vacuum dryer costs for several laboratory-scale applications, including dehydration of bio-organisms, drying plastic for distributed recycling and additive manufacturing, and chemical processing.
ARTICLE | doi:10.20944/preprints201905.0060.v1
Subject: Keywords: open source; 3D printing; Drosophila; laser cutter; lab equipment; open labware; fly-pushing; fly pad; fly plate; CO2 anesthesia
Online: 6 May 2019 (11:51:52 CEST)
One of the most important pieces of equipment used in labs culturing populations of fruit flies (Drosophila sp.) is the “CO2 gas plate”, which is used to anesthetize individuals during “fly-pushing”. This piece of equipment consists of a box with a porous top into which carbon dioxide is pumped. Flies placed on its surface are left immobilized, permitting the sorting, categorizing, and/or counting of flies during population culturing and experimental assays. Unfortunately, commercially available gas plates are typically expensive. Here, we describe a new design for a gas plate that can be easily produced using a 3D printer and a laser cutter, which we are making freely available to the fly community.
REVIEW | doi:10.20944/preprints202105.0352.v1
Subject: Life Sciences, Biochemistry Keywords: 3d printing; microscopy; open-source; optics; super-resolution
Online: 14 May 2021 (16:10:24 CEST)
The maker movement has reached the optics labs, empowering researchers to actively create and modify microscope designs and imaging accessories. 3D printing in particular has had a disruptive impact on the field, as it brings an accessible new fabrication technology, additive manufacturing, that makes prototyping in the lab available at low cost. Examples of this trend take advantage of the ready availability of 3D printing technology: inexpensive microscopes for education, such as the FlyPi, have been designed, and the highly complex robotic microscope OpenFlexure represents a clear desire for the democratisation of this technology. 3D printing facilitates new and powerful approaches to science and promotes collaboration between researchers, as 3D designs are easily shared. This opens the unique possibility of extending the open-access concept from knowledge to technology, allowing researchers from everywhere to use and extend model structures. Here we present a review of additive manufacturing applications in microscopy, guiding the user through this new and exciting technology and providing a starting point for anyone willing to employ this versatile and powerful new tool.
ARTICLE | doi:10.20944/preprints202102.0513.v1
Subject: Earth Sciences, Atmospheric Science Keywords: Sea-Level Rise; GIS; Open-Source Software; Modeling
Online: 23 February 2021 (12:39:09 CET)
Sea-level rise is a problem increasingly affecting coastal areas worldwide. The existence of free and open-source models to estimate the sea-level impact can contribute to better coastal management. This study aims to develop and validate two different models to predict the impact of sea-level rise, supported by Google Earth Engine (GEE), a cloud-based platform for planetary-scale environmental data analysis. The first model is a Bathtub Model based on the uncertainty of projections of the Sea-level Rise Impact Module of the TerrSet Geospatial Monitoring and Modeling System software. The validation process, performed in the Rio Grande do Sul coastal plain (S Brazil), resulted in correlations from 0.75 to 1.00. The second model implements the Bruun Rule formula in GEE and can determine the coastline retreat of a profile through the creation of a simple vector line from topo-bathymetric data. The model shows a very high correlation (0.97) with a classical Bruun Rule study performed on the Aveiro coast (NW Portugal). The GEE platform seems to be an important tool for coastal management. The models developed have been openly shared, enabling the continuous improvement of the code by the scientific community.
ARTICLE | doi:10.20944/preprints202102.0421.v1
Subject: Earth Sciences, Atmospheric Science Keywords: Sea-Level Rise; GIS; Open-Source Software; Modeling
Online: 18 February 2021 (13:52:49 CET)
Sea-level rise is a problem increasingly affecting coastal areas worldwide. The existence of free and open-source models to estimate the sea-level impact can contribute to better coastal management. This study aims to develop and validate two different models to predict the impact of sea-level rise, supported by Google Earth Engine (GEE), a cloud-based platform for planetary-scale environmental data analysis. The first model is a Bathtub Model based on the uncertainty of projections of the Sea-level Rise Impact Module of the TerrSet Geospatial Monitoring and Modeling System software. The validation process, performed in the Rio Grande do Sul coastal plain (S Brazil), resulted in correlations from 0.75 to 1.00. The second model uses the Bruun Rule formula implemented in GEE and can determine the coastline retreat of a profile through the creation of a simple vector line from topo-bathymetric data. The model shows a very high correlation (0.97) with a classical Bruun Rule study performed on the Aveiro coast (NW Portugal). The GEE platform seems to be an important tool for coastal management. The models developed have been openly shared, enabling the continuous improvement of the code by the scientific community.
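The classical Bruun Rule referenced here states that shoreline retreat R = S·L/(B + h), for sea-level rise S, cross-shore active profile length L, berm height B, and closure depth h. A minimal sketch of the formula; the profile values are hypothetical, not taken from either study:

```python
def bruun_retreat(slr_m: float, profile_length_m: float,
                  berm_height_m: float, closure_depth_m: float) -> float:
    """Classical Bruun Rule: shoreline retreat R = S * L / (B + h)."""
    return slr_m * profile_length_m / (berm_height_m + closure_depth_m)

# Hypothetical profile: 0.5 m of sea-level rise over a 1000 m active
# profile with a 2 m berm and 8 m closure depth gives 50 m of retreat.
print(bruun_retreat(0.5, 1000.0, 2.0, 8.0))  # → 50.0
```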
ARTICLE | doi:10.20944/preprints202012.0016.v1
Subject: Physical Sciences, Acoustics Keywords: open quantum systems; tensor networks; non-equilibrium dynamics
Online: 1 December 2020 (12:20:25 CET)
Simulating the non-perturbative and non-Markovian dynamics of open quantum systems is a very challenging many-body problem, due to the need to evolve both the system and its environments on an equal footing. Tensor networks and matrix product states (MPS) have emerged as powerful tools for open system models, but the numerical resources required to treat finite-temperature environments grow extremely rapidly and limit their applications. In this study we use time-dependent variational evolution of MPS to explore the striking theory of Tamascelli et al., which shows how finite-temperature open dynamics can be obtained from zero-temperature, i.e. pure wave function, simulations. Using this approach, we produce a benchmark data set for the dynamics of the Ohmic spin-boson model across a wide range of couplings and temperatures, and also present a detailed analysis of the numerical costs of simulating non-equilibrium steady states, such as those emerging from the non-perturbative coupling of a qubit to baths at different temperatures. Despite ever-growing resource requirements, we find that converged non-perturbative results can be obtained, and we discuss a number of recent ideas and numerical techniques that should allow wide application of MPS to complex open quantum systems.
ARTICLE | doi:10.20944/preprints202009.0350.v1
Subject: Medicine & Pharmacology, Sport Sciences & Therapy Keywords: anterior cruciate ligament; open kinetic chain; laxity; isokinetic
Online: 16 September 2020 (05:55:45 CEST)
Rehabilitation following anterior cruciate ligament reconstruction with a hamstring graft allows patients to regain their functional capacities and supports them in the resumption of sports activities. Rehabilitation also aims to minimize the risk of recurrence, which is why it ensures that the patient's muscular capacities develop properly until the return to sport. Isokinetics helps strengthen and assess the strength of the muscle groups of the thigh, but controversy exists as to its use, because resistance to open kinetic chain knee extension could cause the graft to distend. The objective of this study is to determine the influence of isokinetic muscle strengthening on possible laxity of the anterior cruciate ligament and to identify risk factors. The study concerns a population that underwent anterior cruciate ligament reconstruction with a hamstring graft, from 3 to 6 months after surgery. Two groups are distinguished: one group exposed to isokinetics during rehabilitation, and an unexposed group that underwent rehabilitation without the use of isokinetics. An anterior knee laxity test was performed 6 months postoperatively using the GNRB® machine for all subjects according to the same protocol. The test results were statistically analyzed to determine a relative risk of graft distension for each group in the study. Comparison of the results of each group by univariate analysis did not reveal any significant result. Multivariate analysis showed interactions in the two strata of the study. The use of isokinetics appears to have no effect on the risk of developing distension for the majority of subjects in the exposed group. A tendency toward graft protection was perceived for each variable except age under 25 years (RRa = 1.07).
The use of isokinetics does not appear to cause graft distension in patients undergoing anterior cruciate ligament reconstruction when this method is introduced 3 months postoperatively.
ARTICLE | doi:10.20944/preprints202008.0081.v1
Subject: Arts & Humanities, Architecture And Design Keywords: air pollution; particulate; PM2.5; open market; pedestrian traffic
Online: 4 August 2020 (08:20:44 CEST)
Market air quality is very important to people's economic lives but is rarely researched; market activities, particularly pedestrian traffic, release particulates that are detrimental to the health of users and stakeholders. A Thermo Scientific MIE pDR-1500 particulate monitor was used to monitor air quality within the market for eight (8) weeks, with PM2.5 as the air pollutant of concern. Ten (10) sample points for pedestrian traffic were located in the market to represent the entire market environment spectrum. The analysis of PM2.5 measured daily during the dry and wet seasons shows a clear seasonal variation of this pollutant, with elevated concentrations measured during the dry season rather than the wet season. The assessment of PM2.5 concentrations shows exceedances of the WHO and NAAQS standards during the dry season, ranging from 47.9 μg/m3 to 231.88 μg/m3 in the morning and 65.17 μg/m3 to 1806.33 μg/m3 in the afternoon. The findings show that pedestrian traffic contributes immensely to air pollution in an open market; at these elevated concentrations, prolonged exposure is highly detrimental to health. This study creates awareness among pedestrians in open markets about air pollution and informs policy changes.
Subject: Social Sciences, Library & Information Science Keywords: scientific publishing; scientific journals; scholarly publishing; scientific papers; open science; scientific articles
Online: 20 August 2020 (09:48:21 CEST)
In the digital era, in which over 4 billion people regularly access the internet, the conventional process of publishing scientific articles in academic journals following peer review is undergoing profound changes. Following physics and mathematics scholars, who started to publish their work on the freely accessible arXiv server in the early 1990s, researchers of all disciplines increasingly publish scientific articles in the form of freely accessible and fully citable preprints, before or in parallel to conventional submission to academic journals for peer review. The full transition to open science, I argue in this study, requires expanding the education of students and young researchers to include scholarly communication in the digital era.
REVIEW | doi:10.20944/preprints202003.0362.v1
Online: 24 March 2020 (14:46:29 CET)
With the current rapid spread of COVID-19, global health systems are increasingly overburdened by the sheer number of people who need diagnosis, isolation and treatment. Shortcomings are evident across the board, from staffing and facilities for rapid and reliable testing to the availability of hospital beds and key medical-grade equipment. The scale and breadth of the problem call for an equally substantive response, not only from frontline workers such as medical staff and scientists, but also from skilled members of the public who have the time, facilities and knowledge to meaningfully contribute to a consolidated global response. Here, we summarise community-driven approaches based on Free and Open Source scientific and medical Hardware (FOSH) currently being developed and deployed to bolster access to personal protective equipment (PPE), patient treatment and diagnostics.
ARTICLE | doi:10.20944/preprints201908.0070.v1
Subject: Social Sciences, Geography Keywords: city; large urban regions; Russia; globalization; open database
Online: 6 August 2019 (08:33:24 CEST)
This study explores how to delineate Russian cities in order to make them comparable on the world scale. In doing so we introduce the concept of large urban regions (LUR) applicable to the Russian urban context. This research is motivated by a principal research question: how to construct a statistical urban delineation that allows us, first, to demonstrate the integration of cities into globalization and, second, to conduct global comparative urban research. Previous studies on urban delineation in Russia have focused almost exclusively on functional urban areas, which have substantial limitations and are not suitable for global urban comparisons. Addressing this research gap, we propose a new definition of Large Urban Regions (LUR). First, we introduce the context of Russian cities (2), then we discuss existing Russian urban concepts (3) and justify the need for a new urban delineation (4). Afterwards, we present a general method to delineate Large Urban Regions in the Russian context (5.1) and illustrate it in two case studies, St. Petersburg (a polycentric region) and Samara (a monocentric region) (5.2). In the last part (6), we discuss the 10 largest urban regions in Russia and describe the constructed database including all Russian LURs.
ARTICLE | doi:10.20944/preprints201810.0245.v1
Subject: Mathematics & Computer Science, Information Technology & Data Management Keywords: R; Open data; API; Statistics; DSS; Web service.
Online: 11 October 2018 (17:17:15 CEST)
We present a methodology that enables users to interact with statistical information owned by an institution and stored in a cloud infrastructure. The approach is mainly based on R and was developed following the open-data philosophy; since we use R, the implementation relies largely on open-source software. R offers several advantages for data management and acquisition, as it provides a common framework that can structure the processes involved in any statistical operation. This simplifies access to the data and makes the full power of R available for the information held in the cloud. The methodology was applied successfully to develop a tool to manage the data of the Centre d’Estudis d’Opinió, but it can be adopted by other institutions to enable open access to their data. The infrastructure was also deployed to the cloud to ensure scalability and 24/7 access.
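The open-data pattern described, a thin client composing web-service queries against an institutional data store, can be sketched as follows. The endpoint and parameter names are invented for illustration; the actual Centre d’Estudis d’Opinió service and its R implementation will differ:

```python
# Hypothetical sketch of a client that builds a REST query for one
# statistical table. BASE and the parameter names are assumptions,
# not the real service's API.
from urllib.parse import urlencode

BASE = "https://example.org/api/v1/tables"  # hypothetical endpoint

def build_query(table_id, **filters):
    """Compose the URL for one statistical table with optional filters."""
    qs = urlencode(sorted(filters.items()))
    return f"{BASE}/{table_id}?{qs}" if qs else f"{BASE}/{table_id}"

print(build_query("survey42", year=2018, region="BCN"))
```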
ARTICLE | doi:10.20944/preprints201809.0543.v1
Subject: Medicine & Pharmacology, Behavioral Neuroscience Keywords: Open Science, Data Sharing, Neuroimaging, Reproducibility, Transparency, Reform
Online: 27 September 2018 (11:50:12 CEST)
Ongoing debates regarding the virtues and challenges of implementing open science for brain imaging research mirror those of the larger scientific community. The present commentary acknowledges the merits of arguments on both sides, as well as the underlying realities that have forced so many to feel the need to resist the implementation of an ideal. Potential sources of top-down reform are discussed, along with the factors that threaten to slow their progress. The potential roles of generational change and the individual are discussed, and a starter list of actionable steps that any researcher can take, big or small, is provided.
ARTICLE | doi:10.20944/preprints201805.0470.v1
Subject: Earth Sciences, Environmental Sciences Keywords: remote sensing; python; data management; landsat; open-source
Online: 31 May 2018 (11:12:27 CEST)
Many remote sensing analytical data products are most useful when they are in an appropriate regional or national projection, rather than globally based projections like Universal Transverse Mercator (UTM) or geographic coordinates, i.e., latitude and longitude. Furthermore, leaving data in the global systems can create problems, either due to misprojection of imagery because of UTM zone boundaries, or because said projections are not optimised for local use. We developed the open-source Irish Earth Observation (IEO) Python module to maintain a local remote sensing data library for Ireland. This pure Python module, in conjunction with the IEOtools Python scripts, utilises the Geospatial Data Abstraction Library (GDAL) for its geoprocessing functionality. At present, the module supports only Landsat TM/ETM+/OLI/TIRS data that have been corrected to surface reflectance using the USGS/ESPA LEDAPS/LaSRC Collection 1 architecture. This module and the IEOtools catalogue available Landsat data from the USGS/EROS archive, and include functions for the importation of imagery into a defined local projection and calculation of cloud-free vegetation indices. While this module is distributed with default values and data for Ireland, it can be adapted for other regions with simple modifications to the configuration files and geospatial data sets.
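The vegetation-index step mentioned above reduces, per pixel, to the standard NDVI formula over red and near-infrared surface reflectance. A minimal sketch on scalar pixel values (IEO itself operates on full GDAL rasters, and the band values here are illustrative):

```python
# NDVI from red and near-infrared surface reflectance for one pixel.
def ndvi(red, nir):
    """Normalised Difference Vegetation Index: (NIR - Red) / (NIR + Red)."""
    if red + nir == 0:
        return 0.0  # avoid division by zero over nodata pixels
    return (nir - red) / (nir + red)

print(round(ndvi(0.05, 0.45), 3))  # dense vegetation gives values near 1
```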
TECHNICAL NOTE | doi:10.20944/preprints201804.0047.v1
Online: 4 April 2018 (06:00:40 CEST)
The exceptional increase in molecular DNA sequence data in open repositories is mirrored by an ever-growing interest among evolutionary biologists in harvesting and using those data for phylogenetic inference. Many quality issues, however, are known, and the sheer amount and complexity of the data available can pose considerable barriers to their usefulness. A key issue in this domain is the high frequency of sequence mislabelling encountered when searching for suitable sequences for phylogenetic analysis. These issues include the incorrect identification of sequenced species, non-standardised and ambiguous sequence annotation, and the inadvertent addition of paralogous sequences by users, among others. Taken together, these issues likely add considerable noise, error or bias to phylogenetic inference, a risk that is likely to increase with the size of phylogenies or of the molecular datasets used to generate them. Here we present a software package, phylotaR, that bypasses the above issues by instead using an alignment search tool to identify orthologous sequences. Our package builds on the framework of its predecessor, PhyLoTa, by providing a modular pipeline for identifying overlapping sequence clusters using up-to-date GenBank data and providing new features, improvements and tools. We demonstrate our pipeline’s effectiveness by presenting trees generated from phylotaR clusters for two large taxonomic clades: palms and primates. Given the versatility of this package, we hope that it will become a standard tool for any research aiming to use GenBank data for phylogenetic analysis.
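The clustering idea behind the pipeline, grouping sequences whose pairwise similarity exceeds a cutoff, can be illustrated with a toy single-linkage sketch. Note that phylotaR is an R package and scores similarity with an alignment search tool (BLAST); the position-matching metric and union-find below are simplified stand-ins:

```python
# Toy single-linkage clustering of sequences by pairwise similarity.
# `identity` is a crude stand-in for an alignment-search score.
def identity(a, b):
    """Fraction of matching positions over the shorter sequence (toy metric)."""
    n = min(len(a), len(b))
    return sum(x == y for x, y in zip(a, b)) / n

def cluster(seqs, cutoff=0.8):
    """Group sequences whose pairwise identity reaches the cutoff."""
    parent = list(range(len(seqs)))
    def find(i):
        while parent[i] != i:
            i = parent[i]
        return i
    for i in range(len(seqs)):
        for j in range(i + 1, len(seqs)):
            if identity(seqs[i], seqs[j]) >= cutoff:
                parent[find(j)] = find(i)  # merge the two clusters
    groups = {}
    for i in range(len(seqs)):
        groups.setdefault(find(i), []).append(seqs[i])
    return list(groups.values())

print(cluster(["ACGTACGT", "ACGTACGA", "TTTTTTTT"]))
```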
ARTICLE | doi:10.20944/preprints201711.0181.v3
Subject: Engineering, Industrial & Manufacturing Engineering Keywords: 3D printing; open source; RepRap; calibration; bed levelling
Online: 12 January 2018 (07:35:31 CET)
Inexpensive piezoelectric diaphragms can be used as sensors to facilitate both nozzle height setting and build platform leveling in FFF (Fused Filament Fabrication) 3D printers. Tests simulating nozzle contact are conducted to establish the available output, and an output of greater than 8 V is found at 20 °C, a value which is readily detectable by simple electronic circuits. Tests are also conducted at a temperature of 80 °C and, despite a reduction of greater than 80% in output voltage, the signal is still detectable. The reliability of piezoelectric diaphragms is investigated by mechanically stressing samples over 100,000 cycles at both 20 °C and 80 °C, and little loss of output over the test duration is found. The development of a nozzle contact sensor using a single piezoelectric diaphragm is described.
ARTICLE | doi:10.20944/preprints201702.0055.v1
Online: 15 February 2017 (11:20:31 CET)
The process of modelling energy systems is accompanied by challenges inherently connected with mathematical modelling. However, in the realities of the 21st century, existing challenges are gaining in magnitude and are supplemented by new ones. Modellers are confronted with the rising complexity of energy systems and high uncertainties on different levels. In addition, interdisciplinary modelling is necessary for gaining insight into the mechanisms of an integrated world. At the same time, models need to meet scientific standards, as public acceptance becomes increasingly important. In this intricate environment, applying models and communicating and interpreting their results are also becoming more difficult. In this paper we present the open energy modelling framework (oemof) as a novel approach to energy system modelling and derive its contribution to existing challenges. Based on a literature review, we outline challenges for energy system modelling as well as existing and emerging approaches. Following a description of the philosophy and elementary structural elements of oemof, a qualitative analysis of the framework with regard to these challenges is undertaken. Inherent features of oemof such as its open-source, open-data, non-proprietary and collaborative modelling approach are preconditions for meeting the modern realities of energy modelling. Additionally, a generic basis with an object-oriented implementation makes it possible to tackle challenges related to the complexity of highly integrated future energy systems and sets the foundation for addressing uncertainty in the future. Experiences from the collaborative modelling approach can enrich interdisciplinary modelling activities. Our analysis concludes that there are remaining challenges that can be tackled neither by a model nor by a modelling framework; among these are problems connected to result communication and interpretation.
ARTICLE | doi:10.20944/preprints202012.0612.v1
Subject: Social Sciences, Accounting Keywords: Academic journals; Growth of knowledge; Non-peer review; Open access; Open science; Paradigm change; Peer review; Scholarly communication; Science communication; Simplicity
Online: 24 December 2020 (09:31:35 CET)
This article challenges the assumption that journals and peer review are essential for developing, evaluating and disseminating scientific and other academic knowledge. It suggests a more flexible ecosystem and examines some of the possibilities this might facilitate. The market for academic outputs should be opened up by encouraging the separation of the dissemination service from the evaluation service. Publishing research in subject-specific journals encourages compartmentalizing research into rigid categories. The dissemination of knowledge would be better served by an open-access, web-based repository system encompassing all disciplines. Reviews from peers should be supplemented by reviews from non-peers offering a variety of different perspectives: user reviews, statistical reviews, reviews from the standpoint of different disciplines, and so on. This should reduce the inevitably conservative influence of relying on two or three peers, and make the evaluation system more critical, multi-dimensional and responsive to the requirements of different audience groups, changing circumstances, and new ideas. Non-peer review might make it easier to challenge dominant paradigms, and expanding the potential audience beyond a narrow group of peers might encourage the criterion of simplicity to be taken more seriously, which is essential if knowledge is to continue to progress.
ARTICLE | doi:10.20944/preprints202003.0073.v1
Subject: Mathematics & Computer Science, Information Technology & Data Management Keywords: digital object; data infrastructure; research infrastructure; data management; data science; FAIR data; open science; European Open Science Cloud; EOSC; persistent identifier
Online: 5 March 2020 (02:30:06 CET)
Data science is facing the following major challenges: (1) developing scalable cross-disciplinary capabilities, (2) dealing with the increasing data volumes and their inherent complexity, (3) building tools that help to build trust, (4) creating mechanisms to efficiently operate in the domain of scientific assertions, (5) turning data into actionable knowledge units and (6) promoting data interoperability. As a way to overcome these challenges, we further develop the proposals by early Internet pioneers for Digital Objects as encapsulations of data and metadata made accessible by persistent identifiers. In the past decade, this concept was revisited by various groups within the Research Data Alliance and put in the context of the FAIR Guiding Principles for findable, accessible, interoperable and reusable data. The basic components of a FAIR Digital Object (FDO) as a self-contained, typed, machine-actionable data package are explained. A survey of use cases has indicated the growing interest of research communities in FDO solutions. We conclude that the FDO concept has the potential to act as the interoperable federative core of a hyperinfrastructure initiative such as the European Open Science Cloud (EOSC).
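The FDO concept described above, a typed, self-contained package binding a bit sequence, metadata and a persistent identifier, can be sketched as a small data structure. The field names and the toy PID check are illustrative assumptions, not part of any FDO specification:

```python
# Sketch of a FAIR Digital Object as the article characterises it:
# a typed package of data reference, metadata and a persistent
# identifier (PID). Field names here are invented for illustration.
from dataclasses import dataclass, field

@dataclass
class FairDigitalObject:
    pid: str                      # persistent identifier, e.g. a Handle
    fdo_type: str                 # registered type enabling machine actionability
    data_ref: str                 # pointer to the bit sequence (location, checksum)
    metadata: dict = field(default_factory=dict)

    def is_resolvable(self):
        """Toy check: a Handle-style PID has a prefix/suffix structure."""
        return "/" in self.pid

fdo = FairDigitalObject("21.T11148/abc123", "dataset",
                        "https://example.org/data.bin",
                        {"license": "CC-BY-4.0"})
print(fdo.is_resolvable())
```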
ARTICLE | doi:10.20944/preprints201907.0263.v1
Subject: Chemistry, Analytical Chemistry Keywords: ambient ionization; mass spectrometry; high-throughput sampling; imaging; modular robot; open hardware; lab automation; peer production; open software; low-temperature plasma
Online: 23 July 2019 (15:20:38 CEST)
Mass spectrometry research laboratories have reported multiple probes for ambient ionization in recent years. Combining them with a mechanical moving stage enables automated sampling and imaging applications. We developed a robotic platform that is based on RepRap 3D-printer components and is therefore easy to reproduce and to adapt for custom prototypes. The minimal step width of the Open LabBot is 12.5 μm, and the sampling dimensions (x, y, z) are 18 × 15 × 20 cm. Adjustable rails in an aluminium frame construction facilitate the mounting of additional parts such as sensors, probes, or optical components. The Open LabBot uses industry-standard G-code for its control, whose simple syntax facilitates the programming of movement. We developed two programs: 1) LABI-Imaging, for direct control via a USB connection and synchronization with MS data acquisition; 2) RmsiGUI, which integrates all steps of mass spectrometry imaging: the creation of G-code for robot control, the assembly of imzML files from raw data, and the analysis of imzML files. We demonstrated the functionality of the system by the automated sampling and classification of essential oils with a PlasmaChip probe. Further, we performed an ambient ionization mass spectrometry imaging (AIMSI) experiment on a lime slice with laser desorption low-temperature plasma (LD-LTP) ionization, demonstrating the integration of the complete workflow in RmsiGUI. The design of the Open LabBot and the software are released under open licenses to promote their use and adoption in the instrument developers’ community.
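Since the platform is driven by industry-standard G-code, the imaging-scan generation step can be illustrated with a few lines. This is a minimal sketch of a serpentine raster over a rectangle; the pattern, feed rate and header commands are assumptions for illustration, not RmsiGUI's actual output:

```python
# Generate a serpentine raster scan as standard G-code moves
# (G21 = millimetre units, G90 = absolute positioning, G1 = linear move).
def raster_gcode(width_mm, height_mm, step_mm, feed=600):
    """Return G-code lines covering a rectangle in a back-and-forth raster."""
    lines = ["G21 ; millimetre units", "G90 ; absolute positioning"]
    y, direction = 0.0, 1
    while y <= height_mm:
        x_target = width_mm if direction > 0 else 0.0
        lines.append(f"G1 X{x_target:.2f} Y{y:.2f} F{feed}")
        y += step_mm
        direction *= -1  # reverse travel on each row
    return lines

for line in raster_gcode(10, 0.1, 0.05):
    print(line)
```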
REVIEW | doi:10.20944/preprints201712.0182.v4
Subject: Mathematics & Computer Science, Information Technology & Data Management Keywords: open access initiative; challenges of data sharing; data management; open government data; human-computer interaction; documentation; human factors; standardization; information policy
Online: 17 January 2018 (11:09:52 CET)
The release of government datasets for public use can potentially strengthen the relationship between the government and its constituents. However, research shows that there are several challenges to open data effectiveness. This paper reviews current determinants of and issues associated with open government data (OGD) procedures. The review concentrates on two ends of the spectrum: first, the perspective of preparation by the government, focusing on the organization of traditional governmental datasets and how the recording of the data is administered; second, the perspective of the users, focusing on the way in which the data are released to the general public and on human-computer interaction (HCI) issues between end-users and data-consumption interfaces. Following a thorough analysis of these two opposing sets of challenges, the paper proposes approaches to mitigate them. This review and its recommendations contribute to and expand the current understanding of open government data effectiveness and can lead to public policy changes, the development of new procedures and strategies, and ultimately improvements at both ends of the federal open data endeavor.
ARTICLE | doi:10.20944/preprints202006.0318.v1
Subject: Medicine & Pharmacology, General Medical Research Keywords: ventilator; pandemic; ventilation; influenza pandemic; coronavirus; coronavirus pandemic; pandemic ventilator; single-limb; open source; open hardware; COVID-19; medical hardware; RepRap; 3-D printing; open source medical hardware; embedded systems; real-time operating system
Online: 26 June 2020 (17:25:16 CEST)
This study describes the development of an automated bag valve mask (BVM) compression system which, during acute shortages and supply chain disruptions, can serve as a temporary emergency ventilator. The resuscitation system is based on an Arduino controller with a real-time operating system, installed on a largely RepRap 3-D-printable, parametric, component-based structure. The cost of the system is under $170, which makes it affordable for replication by makers around the world. The device provides a controlled breathing mode with tidal volumes from 100 to 800 milliliters, breathing rates from 5 to 40 breaths/minute, and an inspiratory-to-expiratory ratio from 1:1 to 1:4. The system is designed for reliability and scalability of the measurement circuits through the use of the serial peripheral interface, and additional hardware can be connected thanks to the object-oriented algorithmic approach. Experimental results demonstrate repeatability and accuracy exceeding human capabilities in BVM-based manual ventilation. Future work is necessary to further develop and test the system to make it acceptable for deployment outside of emergencies in clinical environments; however, the nature of the design is such that desired features are relatively easy to add and test using the protocols and parametric design files provided.
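The stated operating ranges imply straightforward timing arithmetic for the controlled mode: the breathing rate fixes the cycle length, and the inspiratory-to-expiratory (I:E) ratio splits it. A back-of-envelope sketch (not the device's control code):

```python
# Derive inspiration and expiration times per breath cycle from the
# breathing rate and the I:E ratio.
def breath_timing(rate_bpm, ie_ratio):
    """Return (inspiration_s, expiration_s) for one breath cycle.

    ie_ratio is E relative to I, e.g. 2.0 means I:E = 1:2.
    """
    cycle = 60.0 / rate_bpm          # seconds per breath
    insp = cycle / (1.0 + ie_ratio)  # inspiratory share of the cycle
    return insp, cycle - insp

insp, exp = breath_timing(20, 2.0)   # 20 breaths/min at I:E = 1:2
print(insp, exp)                     # 1 s inspiration, 2 s expiration
```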
ARTICLE | doi:10.20944/preprints202106.0729.v1
Subject: Behavioral Sciences, Applied Psychology Keywords: age; behaviour; open field; physical activity; anxiety; Wistar rat.
Online: 30 June 2021 (10:57:00 CEST)
The aim of this work was to study age-related changes in the behaviour of adult Wistar rats using the open field (OF) and elevated plus maze (EPM) tests. Behavioural changes related to motor activity and anxiety were of particular interest. Results showed that as male and female rats progressed from 2 to 5 months of age there was a decrease in the level of motor and exploratory activity and an increase in the level of anxiety. Age-related changes depended on the initial individual characteristics of behaviour. For example, animals that demonstrated high motor activity at 2 months became significantly less active by 5 months, and animals that showed a low level of anxiety at 2 months became more anxious by 5 months. Low-activity and high-anxiety rats did not show any significant age-related changes in the OF and EPM tests from 2 to 5 months of age, except for a decrease in the number of rearings in the EPM. Significant individual differences in the behaviour of rats in the OF and EPM tests observed at 2 months were no longer apparent by 5 months.
ARTICLE | doi:10.20944/preprints202010.0107.v1
Subject: Life Sciences, Biochemistry Keywords: Bioprinting; microextrusion; tissue engineering; bioink; open-source; stem cells
Online: 6 October 2020 (08:24:54 CEST)
Three-dimensional (3D) bioprinting promises to be essential in tissue engineering for meeting the rising demand for organs and tissues. Some bioprinters are commercially available, but their impact on the field is still limited by their cost or the difficulty of tuning them. Herein, we present a low-cost, easy-to-build printhead for microextrusion-based bioprinting (MEBB) that can be installed in many desktop 3D printers to transform them into 3D bioprinters. We can extrude bioinks with precise control of the print temperature between 2 and 60 °C. We validated the versatility of the printhead by assembling it in three low-cost, open-source desktop 3D printers. Multiple units of the printhead can also easily be mounted together in a single printer carriage to build a multi-material 3D bioprinter. Print resolution was evaluated by creating representative calibration models at different temperatures using natural hydrogels such as gelatin and alginate, as well as synthetic ones like poloxamer. Using one of the three modified low-cost 3D printers, we successfully printed cell-laden lattice constructs with cell viabilities higher than 90% at 24 h post-printing. Controlling temperature and pressure according to the rheological properties of the bioinks was essential for achieving optimal printability and high cell viability. The cost per unit of our device, which can be used with syringes of different volumes, is lower than that of any comparable commercially available product. These data demonstrate an affordable, open-source printhead with the potential to become a reliable alternative to commercial bioprinters for any laboratory.
REVIEW | doi:10.20944/preprints202007.0153.v1
Online: 8 July 2020 (11:53:33 CEST)
Large datasets that enable researchers to perform investigations with unprecedented rigor are growing increasingly common in neuroimaging. Due to the simultaneous increasing popularity of open science, these state-of-the-art datasets are more accessible than ever to researchers around the world. While analysis of these samples has pushed the field forward, they pose a new set of challenges that might cause difficulties for novice users. Here, we offer practical tips for working with large datasets from the end-user’s perspective. We cover all aspects of the data life cycle: from what to consider when downloading and storing the data, to tips on how to become acquainted with a dataset one did not collect, to what to share when communicating results. This manuscript serves as a practical guide one can use when working with large neuroimaging datasets, thus dissolving barriers to scientific discovery.
Subject: Physical Sciences, Condensed Matter Physics Keywords: open topological systems; Kitaev chains and ladders; Majorana fermions
Online: 14 June 2019 (09:41:01 CEST)
In this work the general problem of characterizing the topological phase of an open quantum system is addressed. In particular, we study the topological properties of Kitaev chains and ladders under the perturbing effect of a current flux injected into the system through an external normal lead and extracted from it via a superconducting electrode. After discussing the topological phase diagram of the isolated systems, using a scattering technique within the Bogoliubov-de Gennes formulation, we analyze the differential conductance properties of these topological devices as a function of all relevant model parameters. The important problem of implementing local spectroscopic measurements to characterize topological systems is also addressed by studying the system's electrical response as a function of the position and distance of the normal electrode (tip). The results show how the signatures of topological order affect the electrical response of the analyzed systems, a subset of these signatures also being robust against a moderate amount of disorder. The analysis of the internal modes of the nanodevices demonstrates that topological protection can be lost when the quantum states of an initially isolated topological system hybridize with those of the external reservoirs. The conclusions of this work could be useful for understanding the topological phases of nanowire-based mesoscopic devices.
ARTICLE | doi:10.20944/preprints201811.0099.v1
Subject: Social Sciences, Other Keywords: smart city; smart citizen; participation; smartmentality; open data; metaphor
Online: 5 November 2018 (10:19:03 CET)
The goal of the paper is to investigate the expected participation and mentality of smart citizens in smart cities. The key question is the role of the human factor in smart environments, studied globally through a research corpus of mainstream summaries, trend reports, white papers and visions produced by business, governmental and university research cooperations. First, a short review of the changing scholarly trends is presented as a theoretical framework. With respect to its key ideas, the corpus-based findings are recapped and analysed through content networks and the most-referenced city strategies. In addition, a critical approach reveals further factors and risks that require investigation. The ultimate goal is to understand how the smart city landscape is shaped by citizen-based strategies, open data, empowerment and responsibility. Accordingly, the paper closes with theoretical, practical and metaphor-based recommendations to support business and political decision making, as well as the emerging scholarly trends, in the context of upcoming technological-structural changes.
ARTICLE | doi:10.20944/preprints201810.0354.v1
Subject: Earth Sciences, Geoinformatics Keywords: open LiDAR; terrestrial images; building reconstruction; point cloud registration
Online: 16 October 2018 (11:20:43 CEST)
Recent advances in open data initiatives allow us to freely access a vast amount of open LiDAR data in many cities. However, most of these open LiDAR data over cities are acquired by airborne scanning, where the points on façades are sparse or even completely missing due to the viewpoint and object occlusions in the urban environment. Integrating other sources of data, such as ground images, to complete the missing parts is an effective and practical solution. This paper presents an approach for improving the coverage of open LiDAR data on building façades by using point clouds generated from ground images. A coarse-to-fine strategy is proposed to fuse these two different sources of data. Firstly, the façade point cloud generated from terrestrial images is initially geolocated by matching the SfM camera positions to their GPS meta-information. Next, an improved Coherent Point Drift algorithm with normal consistency is proposed to accurately align building façades to the open LiDAR data. The significance of the work resides in the use of 2D overlapping points on the outlines of buildings instead of the limited 3D overlap between the two point clouds, and in the achievement of reliable and precise registration under possibly incomplete coverage and ambiguous correspondence. Experiments show that the proposed approach can significantly improve the façade details of buildings in open LiDAR data, improving registration accuracy from up to 10 meters to less than half a meter compared with classic registration methods.
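The coarse geolocation stage amounts to translating the image-derived cloud so that it agrees with the GPS camera positions. A toy 2D sketch of that translation step only (the paper's fine stage, Coherent Point Drift with normal consistency, is far more involved and is not shown):

```python
# Coarse alignment sketch: shift a point cloud so its centroid matches
# the centroid of the GPS positions. Points here are illustrative tuples.
def centroid(points):
    """Component-wise mean of a list of equal-length tuples."""
    n = len(points)
    return tuple(sum(p[i] for p in points) / n for i in range(len(points[0])))

def coarse_align(cloud, gps_positions):
    """Translate `cloud` so its centroid coincides with the GPS centroid."""
    c_cloud = centroid(cloud)
    c_gps = centroid(gps_positions)
    shift = tuple(g - c for g, c in zip(c_gps, c_cloud))
    return [tuple(p[i] + shift[i] for i in range(len(p))) for p in cloud]

aligned = coarse_align([(0, 0), (2, 0)], [(10, 5), (12, 5)])
print(aligned)  # [(10.0, 5.0), (12.0, 5.0)]
```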
ARTICLE | doi:10.20944/preprints201802.0149.v2
Subject: Social Sciences, Library & Information Science Keywords: CERN; Journal Flipping; Gold Open Access; Particle Physics; SCOAP3
Online: 19 March 2018 (14:15:52 CET)
Gigantic particle accelerators, incredibly complex detectors, an antimatter factory and the discovery of the Higgs boson – this is part of what makes CERN famous. Only a few know that CERN also hosts the world's largest Open Access initiative: SCOAP3. The Sponsoring Consortium for Open Access Publishing in Particle Physics (SCOAP3) started operation in 2014 and has since supported the publication of 19,000 Open Access articles in the field of particle physics, at no direct cost or burden to individual authors worldwide. SCOAP3 is made possible by a partnership of 3,000 institutes, in which libraries redirect funds previously used for subscriptions to 'flip' articles to 'gold Open Access'. With its recent expansion, the initiative now covers about 90% of the journal literature of the field. This article describes the economic principles of SCOAP3 and the collaborative approach of the partnership, and summarizes financial results after four years of successful operation.
ARTICLE | doi:10.20944/preprints202107.0651.v1
Subject: Behavioral Sciences, Applied Psychology Keywords: multiple measures synchronization; automatic device integration; open-source; PsychoPy; Unity
Online: 29 July 2021 (11:48:02 CEST)
Background: The human mind is multimodal, yet most behavioral studies rely on century-old measures such as task accuracy and latency. To create a better understanding of human behavior and brain functionality, we should introduce other measures and analyze behavior from various aspects. However, it is technically complex and costly to design and implement experiments that record multiple measures. To address this issue, a platform is needed that can synchronize multiple measures of human behavior. Method: This paper introduces an open-source platform named OpenSync, which can be used to synchronize multiple measures in neuroscience experiments. The platform automatically integrates, synchronizes and records physiological measures (e.g., electroencephalogram (EEG), galvanic skin response (GSR), eye-tracking, body motion), user input responses (e.g., from mouse, keyboard or joystick), and task-related information (stimulus markers). We explain the structure and details of OpenSync and provide two case studies, in PsychoPy and Unity. Comparison with existing tools: Unlike proprietary systems (e.g., iMotions), OpenSync is free and can be used inside any open-source experiment design software (e.g., PsychoPy, OpenSesame, Unity; https://pypi.org/project/OpenSync/ and https://github.com/moeinrazavi/OpenSync_Unity). Results: Our experimental results show that the OpenSync platform is able to synchronize multiple measures with microsecond resolution.
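The core synchronization problem OpenSync addresses can be sketched as mapping events recorded on different device clocks onto one common clock via measured offsets, then merging the streams in time order. The offsets and event labels below are illustrative, not OpenSync's actual API:

```python
# Sketch of multi-stream synchronization: shift each stream by its
# device-to-common clock offset, then merge in time order.
def to_common_clock(events, offset_s):
    """Shift (timestamp, label) pairs by a measured clock offset."""
    return [(t + offset_s, label) for t, label in events]

def merge_streams(*streams):
    """Merge already-shifted streams into one time-ordered record."""
    return sorted(s for stream in streams for s in stream)

eeg = to_common_clock([(0.010, "eeg"), (0.020, "eeg")], offset_s=0.001)
markers = to_common_clock([(0.012, "stim")], offset_s=0.0)
print(merge_streams(eeg, markers))
```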
ARTICLE | doi:10.20944/preprints202104.0770.v1
Subject: Social Sciences, Library & Information Science Keywords: Wikipedia, knowledge equity, Wikimedia, open culture, visual arts, cultural bias
Online: 29 April 2021 (09:16:07 CEST)
We explore gaps in Wikipedia's coverage of the visual arts by comparing the representation of 100 artists and 100 artworks from the Western canon against corresponding sets of notable artists and artworks from non-Western cultures. We measure the coverage of these two sets of topics across Wikipedia as a whole and for its individual language versions. We also compare the coverage for Wikimedia Commons and Wikidata, sister-projects of Wikipedia that host digital media and structured data. We show that all these platforms strongly favour the Western canon, giving many times more coverage to Western art. We highlight specific examples of differing coverage of visual art inside and outside the Western canon. We find that European language versions of Wikipedia are generally more "Western" in their coverage and Asian languages more "global", with interesting exceptions. We suggest how both Wikipedia and the wider cultural sector can address this gap in content and thus give Wikipedia a truly global perspective on the visual arts.
ARTICLE | doi:10.20944/preprints202104.0448.v1
Subject: Social Sciences, Accounting Keywords: preprint; citations; scholarly communication; open science; peer review; impact factor
Online: 16 April 2021 (16:45:49 CEST)
Preprints are regularly cited in peer-reviewed journal articles, books and conference papers. Are preprint citations somehow less important than citations to peer-reviewed research papers? This study investigates citation patterns between 2017 and 2020 for preprints published in three preprint servers, one specializing in biology (bioRxiv), one in chemistry (ChemRxiv), and another hosting preprints in all disciplines (Research Square). As evaluation of scholarship continues to rely largely on citation-based metrics, this analysis and its outcomes will be useful to inform new research-based education in today’s scholarly communication.
TECHNICAL NOTE | doi:10.20944/preprints202103.0194.v1
Subject: Mathematics & Computer Science, Artificial Intelligence & Robotics Keywords: Active Learning, Classification, Machine Learning, Python, Github, Repository, Open Source
Online: 5 March 2021 (21:14:20 CET)
Machine learning applications often need large amounts of training data to perform well. Whereas unlabeled data can be easily gathered, the labeling process is difficult, time-consuming, or expensive in most applications. Active learning can help solve this problem by querying labels for those data points that will improve the performance the most. The goal is for the learning algorithm to perform sufficiently well with fewer labels. We provide a library called scikit-activeml that covers the most relevant query strategies and implements tools to work with partially labeled data. It is programmed in Python and builds on top of scikit-learn.
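The pool-based query loop described above can be sketched with least-confident uncertainty sampling: among the unlabeled pool, query the point whose predicted class is least certain. This is a generic illustration with a toy distance-based classifier, not scikit-activeml's API; all names and the classifier are assumptions.

```python
import math

def predict_proba(labeled, x):
    """Toy probabilistic classifier: softmax over negative distances
    to the nearest labeled example of each class."""
    by_class = {}
    for xi, yi in labeled:
        d = abs(xi - x)
        by_class[yi] = min(d, by_class.get(yi, float("inf")))
    scores = {c: math.exp(-d) for c, d in by_class.items()}
    z = sum(scores.values())
    return {c: s / z for c, s in scores.items()}

def least_confident(labeled, pool):
    """Query the unlabeled point whose top predicted probability is lowest."""
    def confidence(x):
        return max(predict_proba(labeled, x).values())
    return min(pool, key=confidence)

labeled = [(0.0, "a"), (10.0, "b")]
pool = [1.0, 5.0, 9.0]
print(least_confident(labeled, pool))  # the most ambiguous point
```

The queried point would then be labeled by an oracle, added to the labeled set, and the loop repeated until the label budget is exhausted.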
ARTICLE | doi:10.20944/preprints202102.0198.v1
Subject: Engineering, Automotive Engineering Keywords: 180-degree bend; Sediment transport; Clear water; Open-channels; Scour
Online: 8 February 2021 (12:12:09 CET)
As 180-degree meanders are observed in abundance in nature, a meandering channel with two consecutive 180-degree bends was designed and constructed to investigate bed topography variations. These two 180-degree mild bends are located between two upstream and downstream straight paths. In this study, different mean velocity to critical velocity ratios were tested at the upstream straight path to determine the meander's incipient motion. To this end, bed topography variations along the meander and the downstream straight path were recorded for different mean velocity to critical velocity ratios. In addition, the upstream bend's effect on the downstream bend was investigated. Results indicated that the maximum scour depth at the downstream bend increased by factors of 1.5, 2.5, 5, 10, 12, and 26 as the mean velocity to critical velocity ratio was raised from 0.8 to 0.84, 0.86, 0.89, 0.92, 0.95, and 0.98, respectively. Moreover, increasing the ratio increased the maximum sedimentary height by factors of 3, 10, 23, 48, 49, and 56. Incipient motion was observed at a mean velocity to critical velocity ratio of 0.89 for the upstream bend and 0.78 for the downstream bend.
REVIEW | doi:10.20944/preprints202010.0243.v1
Subject: Social Sciences, Accounting Keywords: distance education; open and distance education; student retention; survival analysis
Online: 12 October 2020 (13:22:55 CEST)
Student retention is one indicator of accountability in the implementation of educational programs. Achievement of student retention rates indicates the performance of the quality objectives of an institution or college. To get an accurate picture of the factors related to retention, modeling is required. The retention variable is a time-to-event response measured in semester units. One of the statistical analyses that can be used for time-to-event response data is survival analysis. Selecting an accurate analytical method for modeling will produce valid conclusions and support policies that are right and on target. This paper presents alternative modeling of student retention in distance education using survival analysis. The method used is a literature review. This paper also briefly describes distance education, open and distance education, distance education students' characteristics, distance education student retention, and survival models for modeling student retention in distance education.
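The survival-analysis approach proposed above can be illustrated with the standard non-parametric Kaplan-Meier estimator, sketched here in plain Python. The data and variable names are hypothetical.

```python
from collections import Counter

def kaplan_meier(durations, observed):
    """Kaplan-Meier survival estimate. `durations` are times in
    semesters; `observed[i]` is True if the event (dropout) occurred,
    False if the student was censored (e.g. graduated or still enrolled).
    Returns [(event_time, survival_probability), ...]."""
    n = len(durations)
    events = Counter(t for t, e in zip(durations, observed) if e)
    removals = Counter(durations)  # events + censorings leave the risk set
    surv, curve, at_risk = 1.0, [], n
    for t in sorted(removals):
        d = events.get(t, 0)
        if d:
            surv *= 1 - d / at_risk  # product-limit update
            curve.append((t, surv))
        at_risk -= removals[t]
    return curve

# Semesters until dropout (True) or censoring (False)
data = [(2, True), (3, False), (4, True), (4, True), (6, False)]
print(kaplan_meier([t for t, _ in data], [e for _, e in data]))
```

In practice a library such as lifelines would be used, and covariates (age, employment, program type) would enter through a regression model such as Cox proportional hazards.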
SHORT NOTE | doi:10.20944/preprints202001.0196.v1
Subject: Biology, Entomology Keywords: reproducibility; open access; data curation; data management; pre-print servers
Online: 18 January 2020 (09:05:49 CET)
The ability to replicate scientific experiments is a cornerstone of the scientific method. Sharing ideas, workflows, data, and protocols facilitates testing the generalizability of results, increases the speed at which science progresses, and enhances quality control of published work. Fields of science such as medicine, the social sciences, and the physical sciences have embraced practices designed to increase replicability. Granting agencies, for example, may require data management plans and journals may require data and code availability statements along with the deposition of data and code in publicly available repositories. While many tools commonly used in replicable workflows such as distributed version control systems (e.g. “git”) or scripted programming languages for data cleaning and analysis may have a steep learning curve, their adoption can increase individual efficiency and facilitate collaborations both within entomology and across disciplines. The open science movement is developing within the discipline of entomology, but practitioners of these concepts or those desiring to work more collaboratively across disciplines may be unsure where or how to embrace these initiatives. This article is meant to introduce some of the tools entomologists can incorporate into their workflows to increase the replicability and openness of their work. We describe these tools and others, recommend additional resources for learning more about these tools, and discuss the benefits to both individuals and the scientific community and potential drawbacks associated with implementing a replicable workflow.
ARTICLE | doi:10.20944/preprints202001.0080.v1
Subject: Engineering, Energy & Fuel Technology Keywords: shale gas; MRST; embedded discrete fracture model; open-source implementation
Online: 9 January 2020 (09:59:37 CET)
We present a generic and open-source framework for the numerical modeling of the expected transport and storage mechanisms in unconventional gas reservoirs. These unconventional reservoirs typically contain natural fractures at multiple scales. Considering the importance of these fractures in shale gas production, we perform a rigorous study on the accuracy of different fracture models. The framework is validated against an industrial simulator and is used to perform a history-matching study on the Barnett shale. This work presents an open-source code that leverages cutting-edge numerical modeling capabilities like automatic differentiation, stochastic fracture modeling, multi-continuum modeling and other explicit and discrete fracture models. We modified the conventional mass balance equation to account for the physical mechanisms that are unique to organic-rich source rocks. Some of these include the use of an adsorption isotherm, a dynamic permeability-correction function, and an embedded discrete fracture model (EDFM) with fracture-well connectivity. We explore the accuracy of the EDFM for modeling hydraulically-fractured shale-gas wells, which could be connected to natural fractures of finite or infinite conductivity, and could deform during production. Simulation results indicate that although the EDFM provides a computationally efficient model for describing flow in natural and hydraulic fractures, it could be inaccurate under three conditions: (1) when the fracture conductivity is very low; (2) when the fractures are not orthogonal to the underlying Cartesian grid blocks; and (3) when sharp pressure drops occur in large grid blocks with insufficient mesh refinement. Each of these results is very significant considering that most of the fluids in these ultra-low matrix permeability reservoirs get produced through the interconnected natural fractures, which are expected to have very low fracture conductivities.
We also expect sharp pressure drops near the fractures in these shale gas reservoirs, and it is very unrealistic to expect the hydraulic fractures or complex fracture networks to be orthogonal to any structured grid. In conclusion, this paper presents an open-source numerical framework to facilitate the modeling of the expected physical mechanisms in shale-gas reservoirs. The code was validated against published results and a commercial simulator. We also performed a history-matching study on a naturally-fractured Barnett shale-gas well considering adsorption, gas slippage & diffusion and fracture closure as well as proppant embedment, using the framework presented. This work provides the first open-source code that can be used to facilitate the modeling and optimization of fractured shale-gas reservoirs. To provide the numerical flexibility to accurately model stochastic natural fractures that are connected to hydraulically-fractured wells, it is built atop other related open-source codes. We also present the first rigorous study on the accuracy of using EDFM to model both hydraulic fractures and natural fractures that may or may not be interconnected.
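One of the shale-gas-specific mechanisms named above is the adsorption isotherm. As a minimal sketch, the classic Langmuir isotherm commonly used for adsorbed gas in organic-rich shales is shown below; the parameter values are illustrative assumptions, not the paper's calibrated Barnett values.

```python
def langmuir_adsorbed_gas(p, v_l, p_l):
    """Langmuir isotherm: adsorbed gas volume per unit rock mass at
    pressure p, with Langmuir volume v_l (maximum adsorbed volume) and
    Langmuir pressure p_l (pressure at which half of v_l is adsorbed)."""
    return v_l * p / (p_l + p)

# Illustrative shale-like parameters (assumptions, not history-matched):
v_l = 96.0   # scf/ton
p_l = 650.0  # psia
for p in (650.0, 1300.0, 2600.0):
    print(p, round(langmuir_adsorbed_gas(p, v_l, p_l), 1))
```

In a reservoir simulator this term enters the mass balance as an extra storage contribution whose derivative with respect to pressure governs how desorption supports late-time production.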
ARTICLE | doi:10.20944/preprints201909.0162.v1
Subject: Engineering, Mechanical Engineering Keywords: thermal actuator; compliant architecture; open and closed operating cycles; mesoscale
Online: 16 September 2019 (10:56:57 CEST)
Thermal-based actuators are known for generating large force and displacement strokes at mesoscale (millimeter) regime. In particular, two-phase thermal actuators are found to benefit from the scaling laws of physics at mesoscale to offer large force and displacement strokes; but they have low thermal efficiencies. As an alternative, a combustion-based thermal actuator is proposed and its performance is studied in both open and closed cycle operations. Through a physics-based lumped-parameter model, we investigate the behavior and performance of the actuator using a spring-mass-damper analogy and taking an air standard cycle approach. Three observations are reported: (1) the mesoscale actuator can generate peak forces of up to 400 N and displacement strokes of about 16 cm suitable for practical applications; (2) an increase in heat input to the actuator results in increasing the thermal efficiency of the actuator for both open and closed cycles; and (3) for a specific heat input, the open and closed cycle operations respond differently: different stroke lengths, peak pressures, and thermal efficiencies.
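The spring-mass-damper analogy mentioned above can be sketched numerically: integrate m x'' + c x' + k x = F(t) with a semi-implicit Euler step. All parameter values below are illustrative assumptions, not the paper's calibrated actuator model.

```python
def simulate(m, c, k, force, dt, steps):
    """Semi-implicit Euler integration of m*x'' + c*x' + k*x = F(t),
    the lumped spring-mass-damper analogy for the actuator piston."""
    x, v, out = 0.0, 0.0, []
    for i in range(steps):
        a = (force(i * dt) - c * v - k * x) / m
        v += a * dt          # update velocity first (semi-implicit)
        x += v * dt          # then position, for better stability
        out.append(x)
    return out

# Illustrative numbers only: 50 g piston, light damping,
# 200 N/m spring, 10 N step force from the combustion pressure.
traj = simulate(m=0.05, c=0.8, k=200.0, force=lambda t: 10.0,
                dt=1e-4, steps=20000)
print(max(traj), traj[-1])  # underdamped overshoot, then settle near F/k
```

With these values the system is underdamped, so the stroke overshoots before settling at the static deflection F/k = 0.05 m.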
ARTICLE | doi:10.20944/preprints201906.0154.v1
Subject: Social Sciences, Library & Information Science Keywords: Open Access; institutional repositories; institutional mandates; self-archiving; Estudo Geral
Online: 17 June 2019 (07:08:09 CEST)
Changes brought about by the Internet to scholarly communication and the spread of the Open Access movement have made it possible to dramatically increase the number of potential readers of published research. This two-phase study aims, at first, to assess how far the potential for increased open access to articles published by authors at the University of Coimbra was fulfilled, in a context when there was no stimulus for the openness of published science other than an institutional mandate set by the University policy on Open Access (“Acesso Livre”). The fulfilment of this potential was measured by observing the actual archiving behavior of researchers (either directly or through their agents). We started by selecting the top journal titles used to publish the STEM research of the University of Coimbra (2004-2013) by using Thomson Reuters’ Science Citation Index (SCI). These titles were available at the University libraries or through online subscriptions, some of them in open access (21%). By checking the journals' policy at the time regarding self-archiving at the SHERPA/RoMEO service, we found that the percentage of articles in Open Access (OA) could rise to 80% if deposited at Estudo Geral, the Institutional Repository of the University of Coimbra, as prescribed by the Open Access Policy of the University. As we concluded by verifying the deposit status of every single paper of researchers of the University that published in those journals, this potential was far from being fulfilled, despite the existence of the institutional mandate and favorable editorial conditions. We concluded, therefore, that an institutional mandate was not sufficient by itself to fully implement an open access policy and to close the gap between publication and access.
The second phase of the study, to follow, will rescan the status of published papers in a context where the Portuguese public funding agency, the Fundação para a Ciência e a Tecnologia, introduced in 2014 a new significant stimulus for open access in science. The FCT Open Access Policy stipulates that publicly funded published research must be available as soon as possible in a repository of the Portuguese network of scientific repositories, RCAAP, which integrates the Estudo Geral.
ARTICLE | doi:10.20944/preprints201905.0015.v1
Subject: Medicine & Pharmacology, General Medical Research Keywords: laparoscopic; open surgery; non-metastatic colorectal cancer; single surgeon experience
Online: 5 May 2019 (11:25:43 CEST)
The oncologic merits of the laparoscopic technique for colorectal cancer surgery remain debatable. Eligible patients with non-metastatic colorectal cancer who were scheduled for an elective resection by one surgeon in a medical institution were randomized to either laparoscopic or open treatment. During this period, a total of 188 patients received laparoscopic surgery and 163 the open approach. The primary endpoint was cancer-free 5-year survival after operative treatment, and the secondary endpoint was the tumor recurrence incidence. We found no statistically significant difference between the open and laparoscopic groups regarding the average number of lymph nodes dissected, overall mortality rate, cancer recurrence rate or cancer-free 5-year survival. Nevertheless, the laparoscopic approach was more effective for colorectal cancer treatment, with shorter hospital stay and less blood loss, although the operation time was significantly longer. Meanwhile, significantly fewer patients receiving the laparoscopic approach developed postoperative urinary tract infection, wound infection, pneumonia or anastomosis leakage. For non-metastatic colorectal cancer patients, laparoscopic surgery resulted in better short-term outcomes in terms of both total complications and intra-operative blood loss. Although there was no statistically significant difference in cancer-free 5-year survival or tumor recurrence, we favor laparoscopic surgery for patients if not contraindicated.
ARTICLE | doi:10.20944/preprints201904.0008.v1
Subject: Earth Sciences, Geoinformatics Keywords: GRASS GIS; g.citation; software citation; open science; OSGeo; credit; rewards
Online: 1 April 2019 (10:19:53 CEST)
The authors introduce the GRASS GIS add-on module g.citation as an initial implementation of a fine-grained software citation concept. The module extends the existing citation capabilities of GRASS GIS, which until now only provided for automated citation of the software project as a whole, authored by the GRASS Development Team, without reference to individual persons. The functionalities of the new module enable individual code citation for each of the over 500 implemented functionalities, including add-on modules. Three different classes of citation output are provided in a variety of human- and machine-readable formats. The implications of this reference implementation of scientific software citation for both the GRASS GIS project and the OSGeo foundation are outlined.
ARTICLE | doi:10.20944/preprints201901.0165.v3
Subject: Social Sciences, Library & Information Science Keywords: Plan S; open access journals; APC; technical requirements; publisher size
Online: 22 January 2019 (11:39:01 CET)
Much of the debate on Plan S seems to concentrate on how to make toll access journals open access, taking for granted that existing open access journals are Plan S compliant. We suspected this was not so, and set out to explore this using DOAJ's journal metadata. We conclude that an overwhelmingly large majority of open access journals are not Plan S compliant, and that it is small HSS publishers not charging APCs that are least compliant and will face major challenges with becoming compliant. Plan S needs to give special consideration to smaller publishers and/or non-APC-based journals.
ARTICLE | doi:10.20944/preprints201706.0093.v2
Subject: Keywords: decision support; energy system modelling; optimization; collaborative development; open science
Online: 27 March 2018 (05:34:38 CEST)
Energy system models have become indispensable for shaping future energy systems by providing insights into different trajectories. However, sustainable systems with high shares of renewable energy are characterized by growing cross-sectoral interdependencies and decentralized structures. To capture important properties of increasingly complex energy systems, sophisticated and flexible modelling tools are needed. At the same time, open science is becoming increasingly important in energy system modelling. This paper presents the Open Energy Modelling Framework (oemof) as a novel approach to energy system modelling, representation and analysis. The framework forms a toolbox to construct comprehensive energy system models and has been published open source under a free license. With a collaborative development based on open processes, the framework seeks a maximum level of participation and transparency to facilitate open science principles in energy system modelling. Based on a generic graph-based description of energy systems, it is well suited to flexibly model complex cross-sectoral systems and incorporate various modelling approaches. This makes the framework a multi-purpose modelling environment for modelling and analyzing different systems ranging from an urban to a transnational scale.
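The generic graph-based description mentioned above can be illustrated schematically: buses and components are nodes, directed edges are energy flows, and sector coupling appears as paths crossing between carriers. This is a plain-Python sketch, not oemof's actual API; all component names are hypothetical.

```python
# Nodes are buses or components; directed edges are energy flows.
energy_system = {
    "wind_turbine":    ["electricity_bus"],
    "gas_import":      ["gas_bus"],
    "gas_bus":         ["gas_power_plant", "gas_boiler"],
    "gas_power_plant": ["electricity_bus"],
    "gas_boiler":      ["heat_bus"],
    "electricity_bus": ["demand_el", "heat_pump"],
    "heat_pump":       ["heat_bus"],
    "heat_bus":        ["demand_heat"],
}

def paths_to(system, source, sink, seen=()):
    """All directed energy-flow paths from `source` to `sink`."""
    if source == sink:
        return [[sink]]
    out = []
    for nxt in system.get(source, []):
        if nxt not in seen:
            for tail in paths_to(system, nxt, sink, seen + (source,)):
                out.append([source] + tail)
    return out

# Cross-sectoral coupling: gas reaches the heat sector two ways,
# directly via the boiler and indirectly via power plant + heat pump.
print(paths_to(energy_system, "gas_import", "demand_heat"))
```

In an optimization framework each edge would additionally carry variables (flow per time step) and constraints (capacities, efficiencies), with an objective such as minimum system cost.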
ARTICLE | doi:10.20944/preprints201708.0022.v1
Subject: Mathematics & Computer Science, Other Keywords: real‐time reconstruction; SLAM; kinect sensors; depth cameras; open source
Online: 7 August 2017 (11:03:23 CEST)
Given a stream of depth images with a known cuboid reference object present in the scene, we propose a novel approach for accurate camera tracking and volumetric surface reconstruction in real-time. Our contribution in this paper is threefold: (a) utilizing a priori knowledge of the cuboid reference object, we keep drift-free camera tracking without explicit global optimization; (b) we improve the fineness of the volumetric surface representation by proposing a prediction-corrected data fusion strategy rather than a simple moving average, which enables accurate reconstruction of high-frequency details such as sharp edges of objects and geometries of high curvature; (c) we introduce a benchmark dataset CU3D containing both synthetic and real-world scanning sequences with ground-truth camera trajectories and surface models for quantitative evaluation of 3D reconstruction algorithms. We test our algorithm on our dataset and demonstrate its accuracy compared with other state-of-the-art algorithms. We release both our dataset and code as open source for other researchers to reproduce and verify our results.
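The contrast between moving-average fusion and a prediction-corrected update can be sketched for a single voxel's depth estimate. The example below is schematic only; the gating rule and gains are assumptions for illustration, not the paper's actual fusion strategy.

```python
def moving_average_fuse(samples):
    """Plain running average, as in classic volumetric fusion: every
    sample has equal weight, so a single outlier shifts the estimate."""
    est, w = 0.0, 0
    for d in samples:
        w += 1
        est += (d - est) / w
    return est

def predicted_corrected_fuse(samples, gate=0.05):
    """Sketch of a prediction-corrected update: the current estimate
    predicts the next sample, and the correction is down-weighted when
    the innovation (prediction error) is large."""
    est = samples[0]
    for d in samples[1:]:
        innovation = d - est
        gain = 0.5 if abs(innovation) < gate else 0.1  # distrust outliers
        est += gain * innovation
    return est

depth = [1.00, 1.01, 0.99, 1.40, 1.00]  # one spurious reading at 1.40 m
print(round(moving_average_fuse(depth), 3))
print(round(predicted_corrected_fuse(depth), 3))
```

The moving average is pulled well away from the true surface by the outlier, while the gated update stays close to 1.00 m, which is the intuition behind preserving sharp edges during fusion.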
Subject: Engineering, Industrial & Manufacturing Engineering Keywords: 3-D printing; additive manufacturing; distributed manufacturing; distributed recycling; granulator; shredder; open hardware; fab lab; open-source; polymers; recycling; waste plastic; extruder; upcycle; circular economy
Online: 1 September 2019 (08:25:03 CEST)
Abstract: In order to accelerate deployment of distributed recycling by providing low-cost feedstocks of granulated post-consumer waste plastic, this study analyzes an open source waste plastic granulator system. It is designed, built and tested for its ability to convert post-consumer waste, 3-D printed products and waste into polymer feedstock for recyclebots or fused particle/granule printers. The technical specifications of the device are quantified in terms of power consumption (380 to 404 W for PET and PLA, respectively) and particle size distribution. The open source device can be fabricated for less than USD 2,000 in materials. The experimentally-measured power use is only a minor contribution to the overall embodied energy of distributed recycling of waste plastic. The resultant plastic particle size distributions were found to be appropriate for use in both recyclebots and direct material extrusion 3-D printers. Simple retrofits are shown to reduce sound levels during operation by 4 to 5 dB for the vacuum. These results indicate that the open source waste plastic granulator is an appropriate technology for community, library, makerspace, fab lab or small business-based distributed recycling.
ARTICLE | doi:10.20944/preprints202209.0023.v1
Subject: Mathematics & Computer Science, Applied Mathematics Keywords: Pancreatic cancer; cancer evolution; tumour microenvironment; mathematical model; open quasispecies model
Online: 1 September 2022 (10:52:10 CEST)
Pancreatic cancer represents one of the difficult problems of contemporary medicine. The illness develops very slowly, takes place in a special setting (the stroma) and manifests clinically close to a final stage. Another feature of this pathology is a coexistence (symbiotic) effect between cancer cells and normal cells inside the stroma. All these aspects make it difficult to understand the pathogenesis of pancreatic cancer and develop a proper therapy. The emergence of pancreatic pre-cancer and cancer cells represents a branching stochastic process engaging populations of 64 cell types differing in the number of acquired mutations. In this study we formulate and calibrate a mathematical model of pancreatic cancer using the quasispecies framework. The mathematical model incorporates the mutation matrix, the fitness landscape matrix and the death rates. Each element of the mutation matrix represents the probability of a specific mutation appearing in the branching sequence of cells representing the accumulation of mutations. The model incorporates cancer cell elimination by effector CD8 T cells (CTLs). The down-regulation of the effector function of CTLs and their exhaustion are parameterized. The symbiotic effect of coexistence of normal and cancer cells is considered. The computational predictions obtained with the model are consistent with empirical data. The modelling approach can be used to investigate other types of cancers and examine various treatment procedures.
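The quasispecies framework named above can be illustrated with a minimal discrete-generation replicator-mutator update. The two-type example below, with an assumed mutation probability and fitness values (not the paper's calibrated 64-type model), shows how a fitter mutant clone comes to dominate the population.

```python
def quasispecies_step(x, fitness, Q):
    """One discrete-generation quasispecies update: type j replicates
    at rate fitness[j] and mutates into type i with probability Q[i][j];
    the population vector x is then renormalized to frequencies."""
    n = len(x)
    new = [sum(Q[i][j] * fitness[j] * x[j] for j in range(n))
           for i in range(n)]
    total = sum(new)
    return [v / total for v in new]

# Two types: "normal" and a fitter "mutant", with a small
# per-generation mutation probability mu (no back-mutation here).
mu = 0.01
Q = [[1 - mu, 0.0],   # column j: where offspring of type j end up
     [mu, 1.0]]
fitness = [1.0, 1.5]
x = [1.0, 0.0]        # start with normal cells only
for _ in range(200):
    x = quasispecies_step(x, fitness, Q)
print([round(v, 3) for v in x])  # the fitter mutant takes over
```

The full model replaces this 2x2 system with a 64x64 mutation matrix over mutation-accumulation states, plus death terms for CTL-mediated elimination.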
ARTICLE | doi:10.20944/preprints202205.0344.v1
Subject: Mathematics & Computer Science, Information Technology & Data Management Keywords: Linked (open) Data; Semantic Interoperability; Data Mapping; Governmental Data; SPARQL; Ontologies
Online: 25 May 2022 (08:18:46 CEST)
In this paper, we present a method to map information regarding service activity provision residing in governmental portals across the European Commission. To perform this, we used as a basis the enriched Greek e-GIF ontology, modeling concepts and relations in one of the two data portals (i.e., Points of Single Contact) examined, since relevant information on the second was not provided. Mapping consisted of transforming information appearing in governmental portals into RDF format (i.e., as Linked Data), in order for it to be easily exchangeable. Mapping proved a tedious task, since a description of how information is modeled in the second Point of Single Contact is not provided and had to be extracted manually.
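The RDF transformation step can be sketched in a few lines of Python that emit N-Triples syntax. The namespace URIs and property names below are placeholders for illustration, not the actual Greek e-GIF vocabulary.

```python
def to_ntriples(record, base, ontology):
    """Emit a service-provision record as RDF triples in N-Triples
    syntax. `base` and `ontology` are placeholder namespace URIs."""
    subj = f"<{base}{record['id']}>"
    lines = [f"{subj} <{ontology}serviceName> \"{record['name']}\" ."]
    for activity in record["activities"]:
        lines.append(f"{subj} <{ontology}providesActivity> \"{activity}\" .")
    return "\n".join(lines)

record = {"id": "svc42", "name": "Business registration",
          "activities": ["application submission", "fee payment"]}
print(to_ntriples(record,
                  base="http://example.org/services/",
                  ontology="http://example.org/egif#"))
```

In practice a library such as rdflib would be used so the same graph can be serialized to Turtle or RDF/XML and queried with SPARQL; the point of the sketch is only that mapping means rewriting portal records as subject-predicate-object triples.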
REVIEW | doi:10.20944/preprints202104.0714.v1
Subject: Keywords: atomtronics; open quantum systems; Bose Einstein condensates; quantum simulation; quantum sensing
Online: 27 April 2021 (12:35:20 CEST)
Atomtronics is a relatively new subfield of atomic physics that aims to realize the device behavior of electronic components in ultracold atom-optical systems. The fact that these systems are coherent makes them particularly interesting since, in addition to current, one can impart quantum states onto the current carriers themselves or perhaps perform quantum computational operations on them. After reviewing the fundamental ideas of this subfield, we report on the theoretical and experimental progress made towards developing externally-driven and closed loop devices. The functionality and potential applications of these atom analogs to electronic and spintronic systems are also discussed.
Subject: Medicine & Pharmacology, Ophthalmology Keywords: Ascorbate; Pars Plana Vitrectomy; Open Angle Glaucoma; Oxidative Stress; Ocular Hypertension
Online: 29 October 2020 (10:58:20 CET)
The purpose is to review the pathogenic mechanisms of ocular hypertension and glaucoma development after pars plana vitrectomy. Both acute and chronic causes are considered, and special attention is paid to the theories and clinical evidence on the risk of developing Open Angle Glaucoma (OAG) after Pars Plana Vitrectomy (PPV). Most existing scientific literature on the issue agrees on the role of ascorbate as an oxygen scavenger within the vitreous chamber. Oxygen tension in the vitreous and anterior chamber is maximal in proximity to the retinal surface and the endothelium, respectively, and steeply decreases toward the lens on both sides and toward the trabecular meshwork. Vitreous removal and, to a lesser extent, liquefaction greatly reduce the oxygen tension gradient in the vitreous chamber, while cataract extraction has similar effects on anterior chamber oxygen gradients. Oxygen derivatives originating from the cornea and retina are actively reduced by the vitreous gel and/or the crystalline lens. Vitreous removal and cataract extraction drastically reduce this function. Most reported clinical series confirm this hypothesis, although protocol differences and follow-up length greatly impact the reliability of results.
Subject: Keywords: amendment; corrigendum; erratum; errors; open science; peer review; preprint; replacement; retractions
Online: 25 February 2020 (11:45:37 CET)
Academic publishing is undergoing a highly transformative process, and many rules and value systems that were in place for years are being challenged in unprecedented forms, leading to the evolution of novel ways of dealing with new pressures. One of the most important aspects of an integrated and valid academic literature is the ability to screen publications for errors during peer review to weed out mistakes, fraud and inconsistencies, such that the final published product has value, intellectually and otherwise. It is difficult to claim the existence of perfect manuscripts. The level of errors that exist in a manuscript will depend on the rigor of the research group and of the peer review process that was used to screen that paper. When errors slip into a final published paper, either through honest error or misconduct, and are not detected during peer review and editorial screening, but are spotted during post-publication peer review, an opportunity is created to set the record straight and to correct the literature. To date, the most common forms of correcting the literature have been errata, corrigenda, expressions of concern, and retractions. Despite this range of corrective measures, which represent artificially created corrals around pockets of imperfect literature, certain cases do not quite fit this mold, and new measures for correcting the literature have been proposed, including manuscript versioning, amendments, partial retractions, and retract-and-replace. In this commentary, a discussion of the evolving correction of the literature is provided, as are perspectives on the risks and benefits of such new measures to improve it.
ARTICLE | doi:10.20944/preprints201910.0232.v1
Subject: Earth Sciences, Geophysics Keywords: solar radiation; diffuse; LSA SAF; aerosols; MSG SEVIRI; open source code
Online: 20 October 2019 (02:24:07 CEST)
Several studies have shown that changes in incoming solar radiation and variations of the diffuse fraction can significantly modify the vegetation carbon uptake. Hence, monitoring the incoming solar radiation at large scale and with high temporal frequency is crucial for this and many other reasons. The EUMETSAT Satellite Application Facility for Land Surface Analysis (LSA SAF) has operationally disseminated near-real-time estimates of the downwelling shortwave radiation at the surface since 2005. This product is derived from observations provided by the SEVIRI instrument onboard the Meteosat Second Generation series of geostationary satellites, which covers Europe, Africa, the Middle East, and part of South America. However, near-real-time generation of the diffuse fraction at the surface level has only recently been initiated. The main difficulty towards achieving this goal was the general lack of accurate information on the aerosol particles in the atmosphere. This limitation is nowadays less important thanks to the improvements in atmospheric numerical models. This study presents an upgrade of the LSA SAF operational retrieval method, which provides the simultaneous estimation of the incoming solar radiation and its diffuse fraction from satellite every 15 minutes. The upgrade includes a comprehensive representation of the influence of aerosols based on physical approximations of the radiative transfer within an atmosphere-surface associated medium. This article explains the retrieval method, discusses its limitations and differences with respect to the previous method, and details the characteristics of the output products. A companion article will focus on the evaluation of the products against independent measurements of solar radiation. Finally, access to the source code is provided through an open access platform in order to share with the community the expertise on the satellite retrieval of this variable.
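The satellite retrieval itself is not reproduced here, but the notion of a diffuse fraction can be illustrated with the classic Erbs et al. (1982) empirical correlation, which estimates the diffuse share of global horizontal irradiance from the clearness index kt. This is a well-known textbook model, not the LSA SAF algorithm.

```python
def erbs_diffuse_fraction(kt):
    """Erbs et al. (1982) correlation: diffuse fraction of global
    horizontal irradiance as a function of the clearness index kt
    (ratio of surface to extraterrestrial irradiance)."""
    if kt <= 0.22:           # overcast: almost all radiation is diffuse
        return 1.0 - 0.09 * kt
    if kt <= 0.80:           # intermediate skies: quartic fit
        return (0.9511 - 0.1604 * kt + 4.388 * kt**2
                - 16.638 * kt**3 + 12.336 * kt**4)
    return 0.165             # clear sky: small residual diffuse share

for kt in (0.1, 0.5, 0.9):   # overcast, mixed, clear sky
    print(kt, round(erbs_diffuse_fraction(kt), 3))
```

The diffuse fraction falls from near 1 under overcast skies to a small residual under clear skies; aerosol loading, which the LSA SAF upgrade represents explicitly, shifts this partition for a given kt.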
ARTICLE | doi:10.20944/preprints201806.0138.v1
Subject: Mathematics & Computer Science, Information Technology & Data Management Keywords: controller; industry network; open flow; software defined networking; programmable logic controller
Online: 8 June 2018 (13:35:22 CEST)
Trends such as the Industrial Internet of Things (IIoT) and Industry 4.0 have increased the need to use powerful network technologies in industrial automation. The growing communication in industrial automation is enhancing the productivity and efficiency of manufacturing and process automation with minimum human intervention. Due to the ongoing evolution of industrial networks from Fieldbus technologies to Ethernet, a new opportunity has emerged to integrate the Software Defined Networking (SDN) technique. In this paper, we provide a brief overview of SDN in the domain of industrial automation. We propose a network architecture called Software Defined Industrial Automation Network (SDIAN), with the objective of improving network scalability and efficiency. To match the specific considerations and requirements of having a deterministic system in an industrial network, we propose two solutions for flow creation: the Pro-active Flow Installation Scheme (PFIS) and the Hybrid Flow Installation Scheme (HFIS). We analytically quantify the proposed solutions in alleviating the overhead incurred from the flow setup cost. The analytical model is verified through Monte Carlo simulations. We also evaluate the SDIAN architecture and analyze the network performance of the modified topology using an emulator called Mininet. We further list and motivate SDIAN features and in particular report on an experimental food processing plant demonstration featuring Raspberry Pis (RPis) instead of traditional Programmable Logic Controllers (PLCs). Our demonstration exemplifies the characteristics of SDIAN.
ARTICLE | doi:10.20944/preprints201804.0310.v1
Subject: Medicine & Pharmacology, Ophthalmology Keywords: primary open angle glaucoma; dorzolamide/timolol fixed combination; drug efficacy; safety
Online: 24 April 2018 (08:36:58 CEST)
Purpose: To compare the therapeutic efficacy and safety of the dorzolamide/timolol fixed combination in newly diagnosed primary open angle glaucoma patients. Methods: In this prospective, interventional case series, newly diagnosed primary open angle glaucoma (POAG) patients who had not been treated for glaucoma were included. Patients were started on Cosopt twice a day (BID) for 1 month and then switched to three times a day (TDS) for an additional month. Patients underwent comprehensive ophthalmic examination, diurnal intraocular pressure (IOP), blood pressure (BP) and 24-hour heart rate (HR) measurements at baseline, month 1 (BID), and month 2 (TDS). IOP and systolic and diastolic pressures were measured at 8:00 AM, 12:00 PM, 4:00 PM, 8:00 PM and 12:00 AM. Throughout the study, all adverse events were recorded and monitored by the investigators. Results: In the 31 POAG patients who completed the study, mean baseline IOP was 23.1 ± 3.15 mmHg. IOP decreased significantly to 16.5 ± 2.21 mmHg at month 1 and 13.9 ± 2.23 mmHg at month 2 (both P < 0.0001), and was significantly lower at month 2 than at month 1 (P = 0.0004). While Cosopt BID significantly reduced the mean 24-hour systolic BP and mean 24-hour HR from baseline (P < 0.0001), the mean 24-hour systolic BP and HR remained unchanged with Cosopt TDS compared to BID (P = 0.62). Conclusions: Cosopt TDS has a superior IOP-lowering effect compared with Cosopt BID in POAG patients, with a comparable safety profile.
ARTICLE | doi:10.20944/preprints201607.0088.v2
Subject: Social Sciences, Organizational Economics & Management Keywords: joint patent application; the structure of collaboration; open innovation; long tail
Online: 29 July 2016 (06:48:53 CEST)
The way people innovate, create new ideas, and bring them to the market is undergoing a fundamental change from closed innovation to open innovation. Why and how do firms perform open innovation? We measure firms' open innovation through the level of their joint patent applications. Next, we analyze the network structures and characteristics of firms' joint patent applications, such as betweenness and degree centrality, structural holes, and closure. From this research, we draw the following conclusions. First, the structure of collaboration networks has both direct and indirect effects on firms' innovative performance. Second, in the process of joint patent application, there is a long-tail phenomenon in networks of joint patent applications. Third, the number of patents and the number of International Patent Classification (IPC) subclasses together constitute a meaningful measure of the innovation performance of firms.
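Degree centrality on a joint-application network reduces to counting distinct co-applicants per firm and normalizing by the maximum possible degree. The sketch below uses a small hypothetical set of co-applicant firms, not the paper's patent data.

```python
from collections import defaultdict
from itertools import combinations

# Each record lists the firms named on one joint patent application
# (hypothetical data for illustration).
joint_applications = [
    {"FirmA", "FirmB"},
    {"FirmA", "FirmC"},
    {"FirmA", "FirmB", "FirmD"},
    {"FirmC", "FirmE"},
]

# Build an undirected co-application network: one edge per firm pair.
edges = set()
for firms in joint_applications:
    for u, v in combinations(sorted(firms), 2):
        edges.add((u, v))

neighbors = defaultdict(set)
for u, v in edges:
    neighbors[u].add(v)
    neighbors[v].add(u)

# Degree centrality: number of partners, normalized by (n - 1),
# the maximum possible number of partners in an n-firm network.
n = len(neighbors)
centrality = {firm: len(adj) / (n - 1) for firm, adj in neighbors.items()}
```

A long-tail structure of the kind the paper reports would show up here as a few firms with high centrality and many firms with a single partner.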
REVIEW | doi:10.20944/preprints202208.0434.v1
Subject: Engineering, Mechanical Engineering Keywords: COVID-19; 3D Printing; Additive Manufacturing; Medical Applications; Open-source files; Innovation
Online: 25 August 2022 (10:24:14 CEST)
The Coronavirus disease 2019 (COVID-19) rapidly spread to over 180 countries and abruptly disrupted production rates and supply chains worldwide. Since then, 3D printing, also known as additive manufacturing (AM), a technique that uses layer-by-layer deposition of material to produce intricate 3D geometries, has been engaged in reducing the distress caused by the outbreak. During the early stages of the pandemic, shortages of Personal Protective Equipment (PPE), including facemasks, shields, respirators, and other medical gear, were in large part answered by 3D printing these items remotely. Amid growing testing requirements, 3D printing emerged as a fast and viable manufacturing process to meet production needs, owing to its flexibility, reliability, and rapid response capabilities. Other medical applications that have recently gained prominence in the scientific community include 3D-printed ventilator splitters, device components, and patient-specific products. Regarding non-medical applications, researchers have successfully developed contact-free devices to address the sanitary crisis in public places. This work systematically reviews the applications of 3D printing and AM techniques involved in producing the critical products essential to limiting this deadly pandemic's progression.
ARTICLE | doi:10.20944/preprints202104.0486.v1
Subject: Earth Sciences, Atmospheric Science Keywords: Land-surface modelling system; hydrology; carbon; surface energy balance; open water; snow
Online: 19 April 2021 (13:23:53 CEST)
The land-surface developments of the European Centre for Medium-Range Weather Forecasts (ECMWF) are based on the Carbon-Hydrology Tiled Scheme for Surface Exchanges over Land (CHTESSEL) and form an integral part of the Integrated Forecasting System (IFS), supporting a wide range of global weather, climate and environmental applications. In order to structure, coordinate and focus future developments, and to benefit from international collaboration in new areas, a flexible system named ECLand is presented, which facilitates modular extensions to support numerical weather prediction (NWP) and society-relevant operational services, e.g. Copernicus. This paper introduces recent examples of novel ECLand developments on (i) vegetation, (ii) snow, (iii) soil, (iv) open water/lakes, (v) rivers/inundation, and (vi) urban areas. The developments are evaluated separately with long-range, atmosphere-forced offline surface simulations and with coupled land-atmosphere-ocean experiments. This illustrates the benchmark criteria for assessing, on the one hand, process fidelity with regard to the land-surface fluxes and reservoirs of the water-energy-carbon exchange and, on the other hand, the requirements of ECMWF's NWP, climate and atmospheric composition monitoring services using an Earth system assimilation and prediction framework.
ARTICLE | doi:10.20944/preprints202103.0531.v2
Subject: Engineering, Energy & Fuel Technology Keywords: Sector coupling; 100% renewable; Sub-national energy model; Energy transition; Open science.
Online: 24 March 2021 (13:32:30 CET)
The energy transition requires the integration of different energy carriers, including the electricity, heat, and transport sectors. Energy modeling methods and tools are essential to provide clear insight into the energy transition; however, existing methodologies often overlook the details of small-scale energy systems. This study presents an innovative approach to modeling sub-national energy systems with 100% renewable penetration and sectoral integration. An optimization model, OSeEM-SN, is developed under the Oemof framework and validated using the case study of Schleswig-Holstein. The study assumes three scenarios, representing 25%, 50%, and 100% of the total available biomass potential. OSeEM-SN reaches feasible solutions without additional offshore wind investment, indicating that offshore capacities can be reserved for supplying other states' energy demand. The annual investment cost varies between 1.02 and 1.44 bn €/yr across the three scenarios. Electricity generation decreases by 17%, indicating that with a high share of biomass-based combined heat and power plants, curtailment from other renewable plants can be reduced. Ground-source heat pumps dominate the heat mix; however, their installation decreases by 28% as biomass penetrates fully into the energy mix. The validation confirms OSeEM-SN as a useful tool for examining different scenarios for sub-national energy systems.
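Annual investment costs of the kind reported above are conventionally obtained by annualising capital expenditure with a capital recovery factor (CRF), as is common in Oemof-style energy system models. The sketch below shows the standard annuity formula with illustrative inputs; the lifetime and discount rate are assumptions, not the OSeEM-SN parameter set.

```python
def annualised_cost(capex_eur, lifetime_yr, wacc=0.05):
    """Annualise an upfront investment via the capital recovery factor.

    CRF = r(1+r)^n / ((1+r)^n - 1): the constant annual payment that
    repays `capex_eur` over `lifetime_yr` years at discount rate `wacc`.
    """
    growth = (1 + wacc) ** lifetime_yr
    crf = wacc * growth / (growth - 1)
    return capex_eur * crf

# Illustrative: 1000 EUR of capacity, 20-year lifetime, 5% discount rate
# gives an annual payment of about 80 EUR/yr.
annual = annualised_cost(1000, 20, 0.05)
```

Summing such annuities over every technology the optimizer builds yields the model-wide bn €/yr figures quoted in the abstract.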
ARTICLE | doi:10.20944/preprints202012.0064.v1
Subject: Engineering, Other Keywords: Smart City, Urban ICT, Open Urban Platforms, Sustainable Cities, Resiliency, Quality Assurance
Online: 2 December 2020 (14:08:40 CET)
Information and Communication Technology (ICT) is at the heart of the Smart City approach, which constitutes the next level of development for cities and communities across the globe. ICT serves as the gluing component that enables different domains to interact with each other, and it facilitates the management and processing of vast amounts of data and information towards intelligently steering city infrastructure and processes, engaging citizens, and enabling new services and applications across various aspects of urban life, e.g. supply chains, mobility, transportation, energy, citizens' participation, public safety, interactions between citizens and the public administration, water management, parking, and many other use cases and domains. Given the fundamental role of ICT in cities in the near future, it is therefore of paramount importance to lay the ground for a sustainable and reliable ICT infrastructure that enables a city/community to respond resiliently to upcoming challenges whilst increasing the quality of life of its citizens. This paper continues a series of research documents and standardization activities relating to the concept of Open Urban Platforms (OUP) and the way they serve as a blueprint for each city/community towards the establishment of an ICT backbone. The current paper emphasizes the aspects of sustainability and resilient ICT, whilst reporting on our latest activities and related developments in this research area.
Subject: Life Sciences, Other Keywords: data science; reuse; sequencing data; genomics; bioinformatics; databases; computational biology; open science
Online: 16 July 2020 (12:39:43 CEST)
The 'big data revolution' has enabled novel types of analyses in the life sciences, facilitated by public sharing and reuse of datasets. Here, we review the prodigious potential of reusing publicly available datasets and the challenges, limitations and risks associated with it. Possible solutions to these issues, as well as research integrity considerations, are also discussed. Due to the prominence, abundance and wide distribution of sequencing data, we focus on the reuse of publicly available sequence datasets. We define ‘successful reuse’ as the use of previously published data to enable novel scientific findings, and we use selected examples of such reuse from different disciplines to illustrate the enormous potential of the practice, while acknowledging their respective limitations and risks. A checklist to determine the reuse value and potential of a particular dataset is also provided. The open discussion of data reuse and the establishment of the practice as a norm has the potential to benefit all stakeholders in the life sciences.
REVIEW | doi:10.20944/preprints202003.0220.v1
Subject: Biology, Ecology Keywords: 3D printing; 3D scanning; customized ecological objects; methods; stereolithography; open-source lab
Online: 12 March 2020 (14:46:07 CET)
3D printing is described as the third industrial revolution: its impact on industry is global and it advances every day in society. It presents huge potential for ecology and evolution, sciences with a long tradition of inventing and creating objects for research, education and outreach. Its general principle as an additive manufacturing technique is relatively easy to understand: objects are created by adding material layers on top of each other. Although this may seem very straightforward on paper, it is much harder in the real world. Specific knowledge is needed to successfully turn an idea into a real object, because of technical choices and limitations at each step of the implementation. This article aims to help scientists jump into the 3D printing revolution by offering a hands-on guide to current 3D printing technology. We first give a brief overview of the uses of 3D printing in ecology and evolution, then review the whole process of object creation, split into three steps: (1) obtaining the digital 3D model of the object of interest, (2) choosing the 3D printing technology and material best adapted to the requirements of its intended use, and (3) pre- and post-processing the 3D object. We compare the main technologies available and their pros and cons according to the features and intended use of the object to be printed. We give specific and key details in appendices, based on examples in ecology and evolution.
Subject: Keywords: 3D object reconstruction, depth cameras, Kinect sensors; open source, signal denoising, SLAM
Online: 9 April 2019 (12:24:34 CEST)
3D object reconstruction from depth image streams using Kinect-style depth cameras has been extensively studied. In this paper, we propose an approach for accurate camera tracking and volumetric dense surface reconstruction, assuming a known cuboid reference object is present in the scene. Our contribution is three-fold. (a) We maintain drift-free camera pose tracking by incorporating the 3D geometric constraints of the cuboid reference object into the image registration process. (b) We reformulate the problem of depth stream fusion as a binary classification problem, enabling high-fidelity surface reconstruction, especially in the concave zones of objects. (c) We further present a surface denoising strategy to mitigate topological inconsistencies (e.g., holes and dangling triangles), which facilitates the generation of a noise-free triangle mesh. We extend our public dataset CU3D with several new image sequences, test our algorithm on these sequences, and quantitatively compare it with other state-of-the-art algorithms. Both our dataset and our algorithm are available as open-source content at https://github.com/zhangxaochen/CuFusion for other researchers to reproduce and verify our results.
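For context on depth-stream fusion: the classic KinectFusion-style baseline accumulates a weighted-average truncated signed distance function (TSDF) per voxel, which is the formulation the paper's binary-classification approach replaces. The 1D sketch below, with made-up depth measurements along a single camera ray, shows that baseline mechanism.

```python
# Minimal 1D TSDF fusion: each depth measurement updates voxels near the
# observed surface with a truncated signed distance and a weight.
# This is the weighted-average baseline, NOT the paper's binary-
# classification reformulation.

TRUNC = 0.05          # truncation distance (m)
VOXEL = 0.01          # voxel size (m)
N = 100               # voxels along the camera ray

tsdf = [1.0] * N      # signed distance, clamped to [-1, 1]
weight = [0.0] * N

def integrate(depth_m):
    """Fuse one depth measurement along the ray into the voxel grid."""
    for i in range(N):
        dist = depth_m - i * VOXEL          # signed distance to the surface
        if dist < -TRUNC:
            continue                        # far behind surface: unobserved
        d = max(-1.0, min(1.0, dist / TRUNC))
        w = 1.0                             # constant weight per observation
        tsdf[i] = (tsdf[i] * weight[i] + d * w) / (weight[i] + w)
        weight[i] += w

for depth in (0.50, 0.51, 0.50):            # three noisy measurements (m)
    integrate(depth)

# The reconstructed surface lies at the TSDF zero crossing.
surface = next(i for i in range(N - 1) if tsdf[i] > 0 >= tsdf[i + 1])
```

Averaging noisy signed distances smooths the surface estimate, but it also blurs thin and concave structures, which motivates reformulations such as the one proposed above.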
ARTICLE | doi:10.20944/preprints201902.0087.v1
Subject: Social Sciences, Geography Keywords: Noise mapping; END directive; GIS; open source; standards, road traffic; population exposure
Online: 11 February 2019 (09:48:53 CET)
The urbanisation phenomenon and the related expansion of cities and transport networks make it necessary to prevent an increase in the population exposed to environmental pollution. Regarding noise exposure, the Environmental Noise Directive requires major metropolitan areas to produce noise maps. While based on standard methods, these maps are usually generated by proprietary software and require numerous input data concerning, for example, buildings, land use, the transportation network and traffic. The present work describes an open-source implementation of a noise mapping tool fully integrated into a Geographic Information System compliant with the Open Geospatial Consortium standards. This integration streamlines the formatting and harvesting of noise model input data, the cartographic rendering, and the linkage of output data with population data. An application is given for a French city, consisting of estimating the impact of road traffic scenarios on population exposure to noise levels, both relative to a threshold value and by level classes.
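Once per-building noise levels have been computed, linking them with population data reduces to a simple aggregation. The sketch below uses hypothetical Lden values and an assumed threshold for illustration, not the study's data or the legal limit values.

```python
# Hypothetical output of a noise map: per-building Lden level (dB)
# and resident count (illustrative numbers only).
buildings = [
    {"lden_db": 52.0, "population": 120},
    {"lden_db": 61.5, "population": 340},
    {"lden_db": 66.8, "population": 95},
    {"lden_db": 71.2, "population": 60},
]

THRESHOLD_DB = 68.0   # assumed limit value for this example

# Population exposed above the threshold value.
above = sum(b["population"] for b in buildings
            if b["lden_db"] > THRESHOLD_DB)

# Population per 5 dB level class, as END noise maps report exposure.
classes = {}
for b in buildings:
    lo = 5 * int(b["lden_db"] // 5)        # e.g. 61.5 dB -> "60-65 dB"
    key = f"{lo}-{lo + 5} dB"
    classes[key] = classes.get(key, 0) + b["population"]
```

In the GIS-integrated tool described above, the same join is performed spatially between the noise raster/isophones and census population layers.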
ARTICLE | doi:10.20944/preprints201812.0049.v1
Subject: Physical Sciences, Particle & Field Physics Keywords: NA61/SHINE, upgrade, open charm, silicon detector, TPC, CERN long shutdown 2
Online: 4 December 2018 (09:24:42 CET)
The NA61/SHINE experiment studies hadron production in hadron-hadron, hadron-nucleus and nucleus-nucleus collisions. The physics programme includes the study of the onset of deconfinement and search for the critical point as well as reference measurements for neutrino and cosmic ray experiments. For strong interactions, future plans are to extend the programme of study of the onset of deconfinement by measurements of open-charm and possibly other short-lived, exotic particle production in nucleus-nucleus collisions. This new programme is planned to start after 2020 and requires upgrades to the present NA61/SHINE detector setup. Besides the construction of a large acceptance silicon detector, a 10-fold increase of the event recording rate is foreseen, which will necessitate a general upgrade of most detectors.
ARTICLE | doi:10.20944/preprints201801.0268.v1
Online: 29 January 2018 (05:29:38 CET)