Serial crystallography (SX) combines observations from a very large number of diffraction patterns collected from crystals in random orientations. To compile a complete data set, these patterns must be indexed (i.e. their orientations determined), integrated and merged. Introduced here is a robust and adaptable indexing algorithm based on robust optimization.
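The indexing step described above — recovering a crystal's orientation from observed spot positions — can be sketched as a robust scoring problem: a candidate orientation is good if it maps the observed reciprocal-space vectors close to integer Miller indices. The function below is a minimal illustration, with a Gaussian soft count standing in for the paper's robust-optimization objective; the function name, parameters and loss are assumptions, not the published algorithm.

```python
import numpy as np

def indexing_score(rotation, basis, spots, sigma=0.05):
    """Robust score for a candidate crystal orientation (illustrative sketch).

    rotation : 3x3 orthogonal matrix, the candidate orientation.
    basis    : 3x3 reciprocal-space basis (columns are a*, b*, c*).
    spots    : (N, 3) observed reciprocal-lattice vectors, one per row.
    """
    # Oriented reciprocal basis under the candidate rotation.
    A = rotation @ basis
    # Fractional Miller indices of each observed spot.
    hkl = spots @ np.linalg.inv(A).T
    # Distance from the nearest integer lattice point.
    residual = hkl - np.round(hkl)
    # Soft count of well-explained spots: the Gaussian weight downweights
    # outlier spots instead of letting them dominate a least-squares fit.
    return float(np.exp(-(residual**2).sum(axis=1) / (2 * sigma**2)).sum())
```

A correct orientation scores close to the number of observed spots, while a misoriented candidate scores near zero, so the score can drive a search over candidate rotations.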
The web-based IceBear software is a versatile tool to monitor the results of crystallization experiments and is designed to facilitate supervisor-student communication. It also records and tracks all relevant information, from crystallization setup to PDB deposition, in protein crystallography projects. Fully automated data collection is now possible at several synchrotrons, which means that the number of samples tested at the synchrotron is increasing rapidly.
Strategies for collecting X-ray diffraction data have evolved alongside beamline hardware and detector developments. The traditional approaches for diffraction data collection have emphasised collecting data from noisy integrating detectors (i.e.
Macromolecular crystallography (MX) has been a motor for biology for over half a century and this continues apace. A series of revolutions, including the production of recombinant proteins and cryo-crystallography, have meant that MX has repeatedly reinvented itself to dramatically increase its reach. Over the last 30 years synchrotron radiation has nucleated a succession of advances, ranging from detectors to optics and automation.
Segmentation is the process of isolating specific regions or objects within an imaged volume, so that further study can be undertaken on these areas of interest. When considering the analysis of complex biological systems, the segmentation of three-dimensional image data is a time-consuming and labor-intensive step. With the increased availability of many imaging modalities and with automated data collection schemes, this poses an increased challenge for the modern experimental biologist to move from data to knowledge.
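As a concrete, deliberately simple example of the segmentation step, the classic baseline for volume data is intensity thresholding followed by connected-component labelling. Real biological volumes usually demand far more sophisticated methods, so the function below is only an illustrative sketch; the function and parameter names are assumptions.

```python
import numpy as np
from scipy import ndimage

def segment_volume(volume, threshold, min_voxels=1):
    """Label connected foreground regions of a 3D volume (baseline sketch).

    Voxels above `threshold` are foreground, connected components become
    separate labelled regions, and regions smaller than `min_voxels` are
    discarded as noise.
    """
    mask = volume > threshold
    # Default 3D structuring element gives 6-connectivity.
    labels, n_regions = ndimage.label(mask)
    # Voxel count per region (labels are numbered from 1).
    sizes = ndimage.sum(mask, labels, index=range(1, n_regions + 1))
    keep = np.flatnonzero(np.asarray(sizes) >= min_voxels) + 1
    segmented = np.where(np.isin(labels, keep), labels, 0)
    return segmented, len(keep)
```

The size filter is the part a biologist most often tunes: it trades sensitivity to small features against robustness to speckle noise.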
This paper provides an overview of the discussion and presentations from the Workshop on the Management of Large CryoEM Facilities held at the New York Structural Biology Center, New York, NY on February 6-7, 2017. A major objective of the workshop was to discuss best practices for managing cryoEM facilities. The discussions were largely focused on supporting single-particle methods for cryoEM and topics included: user access, assessing projects, workflow, sample handling, microscopy, data management and processing, and user training.
The recent resolution revolution in cryo-EM has led to a massive increase in demand for both time on high-end cryo-electron microscopes and access to cryo-electron microscopy expertise. In anticipation of this demand, eBIC was set up at Diamond Light Source in collaboration with Birkbeck College London and the University of Oxford, and funded by the Wellcome Trust, the UK Medical Research Council (MRC) and the Biotechnology and Biological Sciences Research Council (BBSRC) to provide access to high-end equipment through peer review. eBIC is currently in its start-up phase and began by offering time on a single FEI Titan Krios microscope equipped with the latest generation of direct electron detectors from two manufacturers.
Segmentation of biological volumes is a crucial step needed to fully analyse their scientific content. Not having access to convenient tools with which to segment or annotate the data means many biological volumes remain under-utilised. Automatic segmentation of biological volumes is still a very challenging research field, and current methods usually require a large amount of manually produced training data to deliver a high-quality segmentation.
With the development of fourth-generation high-brightness synchrotrons on the horizon, the already large volume of data collected on imaging and mapping beamlines is set to increase by orders of magnitude. As such, an easy and accessible way of dealing with such large datasets as quickly as possible is required in order to be able to address the core scientific problems during the experimental data collection. Savu is an accessible and flexible big data processing framework that is able to deal with both the variety and the volume of multimodal and multidimensional scientific datasets, such as those output by chemical tomography experiments on the I18 microfocus scanning beamline at Diamond Light Source.
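The plugin-chain idea behind processing frameworks of this kind can be sketched in a few lines: data flows through an ordered list of processing steps, each step ignorant of the others, so chains can be reconfigured per experiment. The `Pipeline` class below is an illustrative toy, not Savu's actual API.

```python
import numpy as np

class Pipeline:
    """Toy plugin chain: each plugin is a function ndarray -> ndarray."""

    def __init__(self):
        self._plugins = []

    def add(self, plugin):
        """Append a processing step; returns self to allow chaining."""
        self._plugins.append(plugin)
        return self

    def run(self, data):
        """Pass the data through every plugin in order."""
        for plugin in self._plugins:
            data = plugin(data)
        return data

# Example chain: a (hypothetical) dark-current subtraction followed by
# normalization to the maximum value.
pipe = Pipeline().add(lambda d: d - 1.0).add(lambda d: d / d.max())
```

Because each plugin only sees an array in and an array out, the same chain can in principle be applied slice-by-slice to datasets far larger than memory, which is the property that matters at high-brightness sources.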
Synchrotron light source facilities worldwide generate terabytes of data in numerous incompatible data formats from a wide range of experiment types. The Data Analysis WorkbeNch (DAWN) was developed to address the challenge of providing a single visualization and analysis platform for data from any synchrotron experiment (including single-crystal and powder diffraction, tomography and spectroscopy), whilst also being sufficiently extensible for new specific use case analysis environments to be incorporated (e.g.
View Article and Find Full Text PDFActa Crystallogr D Biol Crystallogr
January 2015
Logging experiments with the laboratory-information management system ISPyB (Information System for Protein crystallography Beamlines) enhances the automation of small-angle X-ray scattering of biological macromolecules in solution (BioSAXS) experiments. The ISPyB interface provides immediate user-oriented online feedback and enables data cross-checking and downstream analysis. To optimize data quality and completeness, ISPyBB (ISPyB for BioSAXS) makes it simple for users to compare the results from new measurements with previous acquisitions from the same day or earlier experiments in order to maximize the ability to collect all data required in a single synchrotron visit.
Macromolecular crystallography (MX) is the most powerful technique available to structural biologists to visualize in atomic detail the macromolecular machinery of the cell. Since the emergence of structural genomics initiatives, significant advances have been made in all key steps of the structure determination process. In particular, third-generation synchrotron sources and the application of highly automated approaches to data acquisition and analysis at these facilities have been the major factors in the rate of increase of macromolecular structures determined annually.
View Article and Find Full Text PDFJ Appl Crystallogr
October 2014
The macromolecular crystallography (MX) user experience at synchrotron radiation facilities continues to evolve, with the impact of developments in X-ray detectors, computer hardware and automation methods making it possible for complete data sets to be collected on timescales of tens of seconds. Data can be reduced in a couple of minutes and in favourable cases structures solved and refined shortly after. The information-rich database ISPyB, automatically populated by data acquisition software, data processing and structure solution pipelines at the Diamond Light Source beamlines, allows users to automatically track MX experiments in real time.
We report the outcomes of the discussion initiated at the workshop entitled A 3D Cellular Context for the Macromolecular World and propose how data from emerging three-dimensional (3D) cellular imaging techniques, such as electron tomography, 3D scanning electron microscopy and soft X-ray tomography, should be archived, curated, validated and disseminated, to enable their interpretation and reuse by the biomedical community.
Data formats for recording X-ray diffraction data continue to evolve rapidly to accommodate new detector technologies developed in response to more intense light sources. Processing the data from single-crystal X-ray diffraction experiments therefore requires the ability to read, and correctly interpret, image data and metadata from a variety of instruments employing different experimental representations. Tools that have previously been developed to address this problem have been limited either by a lack of extensibility or by inconsistent treatment of image metadata.
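The extensibility problem described above is commonly solved with a format registry: each reader class declares whether it understands a given image header, and new detector formats are added without touching the dispatch code. The sketch below illustrates the pattern only; the class and method names are assumptions, not the API of any real library.

```python
class FormatRegistry:
    """Dispatch an image to the first reader that claims to understand it."""
    _formats = []

    @classmethod
    def register(cls, fmt):
        """Add a reader class; usable as a class decorator."""
        cls._formats.append(fmt)
        return fmt

    @classmethod
    def open(cls, header: bytes):
        """Try each registered reader in turn until one understands the data."""
        for fmt in cls._formats:
            if fmt.understands(header):
                return fmt(header)
        raise ValueError("no registered format understands this image")


@FormatRegistry.register
class FormatCBF:
    """Hypothetical reader for a CBF-style header (magic bytes assumed)."""

    @staticmethod
    def understands(header: bytes) -> bool:
        return header.startswith(b"###CBF")

    def __init__(self, header: bytes):
        self.header = header
```

Supporting a new detector then means registering one more class with its own `understands` test, which is what keeps the dispatch logic stable as formats proliferate.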
Motivation: Individual research groups now analyze thousands of samples per year at synchrotron macromolecular crystallography (MX) resources. The efficient management of experimental data is thus essential if the best possible experiments are to be performed and the best possible data used in downstream processes in structure determination pipelines. Information System for Protein crystallography Beamlines (ISPyB), a Laboratory Information Management System (LIMS) with an underlying data model that allows the integration of analyses downstream of the data collection experiment, was developed to facilitate such data management.
View Article and Find Full Text PDFActa Crystallogr D Biol Crystallogr
September 2010
A novel raster-scanning method combining continuous sample translation with the fast readout of a Pilatus P6M detector has been developed on microfocus beamline I24 at Diamond Light Source. This fast grid-scan tool allows the rapid evaluation of large sample volumes without the need to increase the beam size at the sample through changes in beamline hardware. A slow version is available for slow-readout detectors.
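The key constraint in a continuous-translation raster scan is that the stage speed and the detector frame rate must be matched so that exactly one frame is read out per grid cell. The arithmetic can be sketched in a few lines; the function and parameter names are illustrative, not the beamline's actual interface.

```python
def grid_scan_row(n_cells, step_um, frame_rate_hz):
    """Stage speed and frame start times for one row of a continuous scan.

    With one detector frame per grid cell, the stage must cover one cell
    (`step_um`) per frame period, so speed = step * frame rate.
    Returns the speed in um/s and the start time of each frame in seconds.
    """
    speed_um_s = step_um * frame_rate_hz
    period_s = 1.0 / frame_rate_hz
    frame_times = [i * period_s for i in range(n_cells)]
    return speed_um_s, frame_times
```

This coupling is why a fast-readout detector enables fast grid scans: a higher frame rate permits a proportionally higher stage speed for the same cell size, while a slow-readout detector forces the slower variant mentioned above.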
Data management has emerged as one of the central issues in the high-throughput processes of taking a protein target sequence through to a protein sample. To simplify this task, and following extensive consultation with the international structural genomics community, we describe here a model of the data related to protein production. The model is suitable for both large and small facilities for use in tracking samples, experiments, and results through the many procedures involved.
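A tracking model of this kind reduces, at its core, to entities for samples and the experiments performed on them, with results linked back for provenance. The dataclass sketch below illustrates that shape only; the entity and field names are assumptions, not the published schema.

```python
from dataclasses import dataclass, field
from typing import List, Optional


@dataclass
class Experiment:
    """One procedure applied to a sample, e.g. expression or purification."""
    protocol: str
    outcome: Optional[str] = None  # free-text result, filled in later


@dataclass
class Sample:
    """A protein sample tracked from target sequence through procedures."""
    target_sequence: str
    experiments: List[Experiment] = field(default_factory=list)

    def record(self, protocol: str, outcome: str) -> None:
        """Append an experiment and its result to the sample's history."""
        self.experiments.append(Experiment(protocol, outcome))

    def history(self):
        """Return the (protocol, outcome) trail for reporting or audit."""
        return [(e.protocol, e.outcome) for e in self.experiments]
```

In a real LIMS these entities would live in a relational database with users, projects and instruments attached, but the sample-to-experiment-to-result chain is the backbone that makes tracking across facilities possible.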
MOLE (mining, organizing, and logging experiments) has been developed to meet the growing data management and target tracking needs of molecular biologists and protein crystallographers. The prototype reported here will become a Laboratory Information Management System (LIMS) to help protein scientists manage the large amounts of laboratory data being generated due to the acceleration in proteome research and will furthermore facilitate collaborations between groups based at different sites. To achieve this, MOLE is based on the data model for protein production devised at the European Bioinformatics Institute (Pajon A, et al.