We present a pipeline for analyzing long-term video monitoring data from a mouse model of epilepsy. Mice were continuously monitored for five consecutive days, generating large volumes of high-resolution video data. We then apply markerless pose estimation of user-defined body parts using deep neural networks (DNNs) to extract precise 2D motion data. To automatically identify behavioral...
Since its inception, computer graphics research has focused on modeling and creating computer-generated content that captures the appearance of the real world. For several decades, substantial effort has gone into reconstructing objects and scenes from photographs and other capturing methods. This includes not only geometry acquisition but also material and appearance reconstruction. In more...
At the Institute of Structural Biology, we aim to unravel structure-function relationships in biological macromolecules. For this purpose, we use state-of-the-art methods such as cryo-EM and X-ray crystallography to determine experimental structures of macromolecules. We also provide various in silico labelling tools (http://www.mtsslsuite.isb.ukbonn.de/) for scientists in the EPR- and FRET...
School teachers are disproportionately more likely to experience voice problems than the general population. One understudied component of this elevated risk is the role of stress in motor control for voice and speech production. The goal of this study is to determine the functional and clinical relevance of stress to the motor control of voice and speech production in the brain and...
In geophysics, physical methods are applied to the Earth's subsurface and related objects of investigation at both laboratory and field scales.
The data resulting from measurements and synthetic modeling are mostly acquired digitally, or are at least managed in digital data structures, so that they can be analysed both quantitatively and qualitatively.
Therefore, the digital infrastructure for...
LHCb and other experiments at CERN are committed to making the collected data
available to the public. In December 2022, LHCb released its first data set,
containing 200 terabytes of data from proton-proton collisions collected in
2011 and 2012 (Run 1 of the LHC). These data are now available through the CERN Open Data Portal.
The aim of the open data release is to...
High Performance Computing (HPC) leverages the power of multiple compute nodes and architectures to solve complex problems, often with large data sets. Typical applications range from large scale simulations to machine learning and data analysis. The University of Bonn maintains several HPC clusters for specialized and general purposes, such as a massively parallel computing system with GPU...
A key to defining the organizational structure of simulation data is the
computational mesh. It encodes where in space, and possibly in time,
simulation data points are located. This is essential for the inner
functioning of the simulation on the one hand and for in-situ processing,
storing, and post-processing the data on the other. The primary danger is
losing the parallel, highly...
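The mesh's role as an organizational structure can be made concrete with a minimal sketch. The following toy example (illustrative only, not taken from the abstract) encodes an unstructured triangle mesh as vertex coordinates plus cell connectivity, and shows where cell-centered simulation data points are located in space; all names are hypothetical:

```python
# Toy unstructured triangle mesh: vertices hold coordinates,
# cells hold vertex indices (the connectivity).
vertices = [
    (0.0, 0.0),  # vertex 0
    (1.0, 0.0),  # vertex 1
    (1.0, 1.0),  # vertex 2
    (0.0, 1.0),  # vertex 3
]

cells = [
    (0, 1, 2),  # triangle 0
    (0, 2, 3),  # triangle 1
]

def centroid(cell):
    """Spatial location of a cell-centered data point."""
    xs = [vertices[i][0] for i in cell]
    ys = [vertices[i][1] for i in cell]
    return (sum(xs) / len(cell), sum(ys) / len(cell))

# One simulation value per cell, located at the cell centroid.
cell_data = {c: 0.0 for c in range(len(cells))}
locations = [centroid(c) for c in cells]
print(locations)
```

In a real parallel simulation this connectivity would be distributed across ranks, which is exactly the structure that in-situ processing and storage must preserve.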
Our poster presents the data management strategy of the Cluster of Excellence PhenoRob at the University of Bonn. PhenoRob is an interdisciplinary project that combines research from robotics and phenotyping and aims at sustainable crop production. Due to this interdisciplinarity, we deal with highly heterogeneous data from different research fields, in multiple file formats, and with varying...
Research in Lattice Quantum Field Theory (LQFT) is performed by international collaborations in both overlapping and disjoint research projects studying a wide range of non-perturbative physical problems. The generated data, which can roughly be classified into three tiers, span the whole spectrum of storage, metadata and lifetime requirements.
LQFT simulations are some of the most...
Within the Collaborative Research Centre 1502, DETECT, large amounts of research data from various sources are being produced and shared among the CRC partners and with the outside world. These sources comprise model input and output data, observational data from satellites and large networks, as well as economic and statistical information affecting land use and land cover developments....
Efficient, extensible knowledge representation and reasoning that enable data mining and data visualization are becoming increasingly important in systems biology, owing to the constant growth, accumulation, and availability of vast multivariate and multimodal data sets made possible by past and recent advances in, e.g., high-throughput experimental measurement techniques, i.e., omics and...
The Collaborative Research Centre 1502, DETECT, deals with the various anthropogenic changes affecting energy and water redistribution in the atmosphere and subsurface. For this, a considerable amount of data is being used. Experts from DETECT, including hydrologists, meteorologists, land use modelers, geodesists, remote sensing experts, agricultural economists, and social scientists, will use...
Digitization is changing day-to-day research practices across all fields of academic inquiry. Research data of every kind are collected, processed, analyzed, published and archived in digital systems. The term Research Data Management (RDM) refers to a range of activities and topics relating to the handling of digital research data, including technical, methodological, organizational and legal...
We have developed VAMPIRA, software capable of automatically generating provenance for data-intensive scientific workflows. The provenance generated by VAMPIRA records the data processing steps, metadata, infrastructure, and user data involved in a workflow, as well as the interactions between them. Armed with this extra information, scientists will be able to make more informed...
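To illustrate the kinds of information such a provenance record ties together, here is a minimal toy sketch of a record covering processing step, inputs/outputs, infrastructure, and metadata. This is an assumption-laden illustration, not VAMPIRA's actual data model; all field names and values are hypothetical:

```python
# Toy provenance record (NOT VAMPIRA's real format) capturing the
# categories of information named in the abstract.
from dataclasses import dataclass, field

@dataclass
class ProvenanceRecord:
    step: str                                     # name of the workflow step
    inputs: list = field(default_factory=list)    # input data identifiers
    outputs: list = field(default_factory=list)   # output data identifiers
    host: str = ""                                # infrastructure the step ran on
    metadata: dict = field(default_factory=dict)  # free-form step metadata

# Hypothetical example: one calibration step in a workflow.
record = ProvenanceRecord(
    step="calibrate",
    inputs=["raw.dat"],
    outputs=["calibrated.dat"],
    host="hpc-node-01",
    metadata={"tool": "calib-v2"},
)
print(record.step, record.inputs, record.outputs)
```

Chaining such records (the outputs of one step matching the inputs of the next) is what lets a workflow's full data lineage be reconstructed after the fact.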